diff --git a/.dockerignore b/.dockerignore
new file mode 100644
index 0000000..ab6c2dd
--- /dev/null
+++ b/.dockerignore
@@ -0,0 +1,11 @@
+# Exclude everything
+*
+
+# Except files we explicitly need, grouped by copy command
+!.git
+!src/pysparkplug
+src/**/__pycache__/
+src/**/*.egg-info/
+!pyproject.toml
+!README.md
+!LICENSE
diff --git a/.github/CODEOWNERS b/.github/CODEOWNERS
new file mode 100644
index 0000000..bece3de
--- /dev/null
+++ b/.github/CODEOWNERS
@@ -0,0 +1,2 @@
+# Require reviews by this Github Team, for any change
+* @matteosox
diff --git a/.github/CONTRIBUTING.md b/.github/CONTRIBUTING.md
new file mode 100644
index 0000000..e9b6d67
--- /dev/null
+++ b/.github/CONTRIBUTING.md
@@ -0,0 +1,159 @@
+# Contributor Guide
+
+## Getting started
+
+We use Docker as a clean, reproducible development environment within which to build, test, generate docs, and so on. As long as you have a modern version of Docker, you should be able to run all developer workflows. That's it! Running the workflows natively, outside of Docker, isn't supported or maintained.
+
+## Tests
+
+_TL;DR: Run `test/test.sh` to run the full suite of tests._
+
+### Black Code Formatting
+
+_TL;DR: Run `test/test.sh -s black` to test your code's formatting._
+
+We use [Black](https://black.readthedocs.io/en/stable/index.html) for code formatting. To format your code, run `test/test.sh -s fix` to get all your spaces in a row. Black configuration can be found in the `pyproject.toml` file at the root of the repo.
+
+### isort Import Ordering
+
+_TL;DR: Run `test/test.sh -s isort` to test your code's imports._
+
+For import ordering, we use [isort](https://pycqa.github.io/isort/). To get imports ordered correctly, run `test/test.sh -s fix`. isort configuration can be found in the `pyproject.toml` file at the root of the repo.
+
+### Pylint Code Linting
+
+_TL;DR: Run `test/test.sh -s pylint` to lint your code._
+
+We use [Pylint](https://pylint.pycqa.org/en/latest/) for Python linting (h/t Itamar Turner-Trauring from his site [pythonspeed](https://pythonspeed.com/articles/pylint/) for inspiration). To lint your code, run `test/test.sh -s pylint`. In addition to showing any linting errors, it will also print out a report. Pylint configuration can be found in the `pylintrc` file at the root of the repo.
+
+Pylint is set up to lint the `src`, `test/unit_tests`, and `docs` directories, along with `noxfile.py`. To add more modules or packages for linting, edit the `pylint` test found in `noxfile.py`.
+
+### Mypy Static Type Checking
+
+_TL;DR: Run `test/test.sh -s mypy` to type check your code._
+
+We use [Mypy](https://mypy.readthedocs.io/en/stable/) for static type checking. To type check your code, run `test/test.sh -s mypy`. Mypy configuration can be found in the `pyproject.toml` file at the root of the repo.
+
+Mypy is set up to run on the `src` and `test/unit_tests` directories, along with `noxfile.py` and `docs/linkcode.py`. To add more modules or packages for type checking, edit the `mypy` test found in `noxfile.py`.
+
+### Unit Tests
+
+_TL;DR: Run `test/test.sh -s unit_tests-3.10 -- fast` to unit test your code quickly._
+
+While we use [`unittest`](https://docs.python.org/3/library/unittest.html) to write unit tests, we use [`pytest`](https://docs.pytest.org/) for running them. To unit test your code, run `test/test.sh -s unit_tests-3.10 -- fast`. This will run unit tests in Python 3.10 only, without any coverage reporting overhead.
To run the tests across all supported versions of Python, run `test/test.sh -s unit_tests`, which will also generate coverage reports which can be aggregated using `test/test.sh -s coverage`. + +`pytest` is setup to discover tests in the `test/unit_tests` directory. All test files must match the pattern `test*.py`. `pytest` configuration can be found in the `pyproject.toml` file at the root of the repo. To add more directories for unit test discovery, edit the `testpaths` configuration option. + +### Test Coverage + +_TL;DR: Run `test/test.sh -s coverage` after running the unit tests with coverage to test the coverage of the unit test suite._ + +We use [Coverage.py](https://coverage.readthedocs.io/en/coverage-5.5/) to test the coverage of the unit test suite. This will print any coverage gaps from the full test suite. Coverage.py configuration can be found in the `pyproject.toml` file at the root of the repo. + +### Documentation Tests + +_TL;DR: Run `test/test.sh -s docs` to build and test the documentation._ + +See [below](#documentation) for more info on the documentation build process. In addition to building the documentation, the `test/docs.sh` shell script uses Sphinx's [`doctest`](https://www.sphinx-doc.org/en/master/usage/extensions/doctest.html) builder to ensure the documented output of usage examples is accurate. Note that the `README.md` file's ` ```python` code sections are transformed into `{doctest}` directives by `docs/conf.py` during the documentation build process. This allows the `README.md` to render code with syntax highlighting on Github & [PyPI](https://pypi.org) while still ensuring accuracy using `doctest`. + +### Packaging Tests + +_TL;DR: Run `test/test.sh -s packaging` to build and test the package._ + +We use [`build`](https://pypa-build.readthedocs.io/en/latest/) to build source distributions and wheels. We then use [`check-wheel-contents`](https://github.com/jwodder/check-wheel-contents) to test for common errors and mistakes found when building Python wheels. Finally, we use [`twine check`](https://twine.readthedocs.io/en/latest/#twine-check) to check whether or not `pysparkplug`'s long description will render correctly on [PyPI](https://pypi.org). To test the package build, run `test/test.sh -s packaging`. While there is no configuration for `build` or `twine`, the configuration for `check-wheel-contents` can be found in the `pyproject.toml` file at the root of the repo. + +## Documentation + +_TL;DR: To build and test the documentation, run `test/test.sh -s docs`._ + +We use [Sphinx](https://www.sphinx-doc.org/en/master/index.html) for documentation site generation. To build the documentation, run `test/test.sh -s docs`. To view it, open `docs/build/html/index.html` in your browser. + +Sphinx configuration can be found in `docs/conf.py`. It is setup to generate pages based on what it finds in the `toctree` directive in `docs/index.md`. To add new pages, add them to the table of contents with that directive. + +### API Reference + +The "API Reference" page is mostly auto-generated using the [`autodoc`](https://www.sphinx-doc.org/en/master/usage/extensions/autodoc.html), [`autosummary`](https://www.sphinx-doc.org/en/master/usage/extensions/autosummary.html), [`intersphinx`](https://www.sphinx-doc.org/en/master/usage/extensions/intersphinx.html), and [`linkcode`](https://www.sphinx-doc.org/en/master/usage/extensions/viewcode.html) Sphinx extensions. 
Classes, functions, decorators, and so on need to be added manually to the `docs/api.rst` file, but once included, the entries are auto-generated using type annotations and docstrings. + +### Docstring Formatting + +We use the [`napoleon`](https://www.sphinx-doc.org/en/master/usage/extensions/napoleon.html) Sphinx extension to enable docstring formats other than Sphinx's default, rather unreadable format. Instead, we use [Google's docstring standard](https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings). Types and defaults should not be referenced in the docstring, instead included in annotations. + +### Auto-generated Github Links + +We use the [`linkcode`](https://www.sphinx-doc.org/en/master/usage/extensions/linkcode.html) Sphinx extension to add links to Github on the API Reference page. The code for mapping Python objects to links can be found in the `docs/linkcode.py` Python module. + +### Changelog + +We document changes in the `CHANGELOG.md` file. This project adheres to the [keep a changelog](https://keepachangelog.com/en/1.0.0/) standard. Before committing changes that impact users, make sure to document features added, changed, deprecated, removed, fixed, or security-related changes to the "## Unreleased" section. + +### Publishing Documentation + +We use [Read the Docs](https://docs.readthedocs.io/en/stable/index.html) for building and publishing `pysparkplug`'s documentation. Its Github integration makes this process seamless. Read the Docs configuration can be found in the `.readthedocs.yaml` file at the root of the repo. + +While documentation for the `pysparkplug` package is generated and hosted by Read the Docs, the documentation can be found at a custom domain: [pysparkplug.mattefay.com](https://pysparkplug.mattefay.com). You can read more about this [here](https://docs.readthedocs.io/en/stable/custom_domains.html). + +## Releasing + +### Release Process + +Every push to the `main` branch on Github generates a draft release on Github. To publish a release, one should: + +1.) If creating a final release (i.e. not a pre-release), create and merge a pull request that updates the `CHANGELOG.md` such that the released changes section is renamed from "## Unreleased" to "## {MAJOR.MINOR.MICRO} (YYYY-MM-DD)" + +2.) Review the draft release. Update the tag for the draft release to the version you want to release with a prefixed v, i.e. "v{MAJOR.MINOR.MICRO}", and add any additional notes as you see fit. Publish it. This will trigger the `release` Github Action, which will publish to [PyPI](https://pypi.org). + +3.) After confirming that the release on Github look good, as does the package on [PyPI](https://pypi.org), if this was a final release (i.e. you updated the `CHANGELOG.md`) create and merge a new pull request that creates a new "## Unreleased" section at the top of the `CHANGELOG.md`. This should have new, empty sections for Added, Changed, Deprecated, Removed, Fixed, and Security. + +### Determining the Version + +`pysparkplug` is versioned according to [PEP 440](https://www.python.org/dev/peps/pep-0440/). The type of final release (major, minor, or micro) should be determined by the types of unreleased changes in the changelog. Any "Removed" changes call for a major release (increment the major digit, minor and micro reset to 0). "Added" changes call for a minor release (increment the minor digit, micro set to 0). Otherwise, a "micro" release is called for (increment the micro digit only). 
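To make the bump rules concrete, here is a minimal sketch, not part of the repo's tooling, of how the populated "Unreleased" changelog sections map to the next final version. The `next_version` helper is hypothetical and exists only for illustration:

```python
# Hypothetical helper illustrating the version bump rules described above.
from packaging.version import Version


def next_version(current: str, unreleased_sections: set[str]) -> str:
    """Return the next final release version, given the non-empty
    "Unreleased" changelog sections, per the rules described above."""
    version = Version(current)
    if "Removed" in unreleased_sections:
        # Breaking change: bump major, reset minor and micro
        return f"{version.major + 1}.0.0"
    if "Added" in unreleased_sections:
        # New feature: bump minor, reset micro
        return f"{version.major}.{version.minor + 1}.0"
    # Everything else (Changed, Deprecated, Fixed, Security): bump micro
    return f"{version.major}.{version.minor}.{version.micro + 1}"


assert next_version("1.2.3", {"Removed", "Fixed"}) == "2.0.0"
assert next_version("1.2.3", {"Added"}) == "1.3.0"
assert next_version("1.2.3", {"Fixed"}) == "1.2.4"
```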
+ +Intermediate versions between releases are incremented with `dev` and taken care of by [`hatch-vcs`](https://github.com/ofek/hatch-vcs). + +## Continuous Integration & Continuous Deployment + +We use Github actions to run our CI/CD pipeline on every pull request. The configuration can be found in `.github/workflows/cicd.yaml`. That said, every step of every job can also be run locally. + +### Main + +This is the "main" job, which consists of running the test suite, creating a draft release, and publishing the package to [TestPyPI](https://test.pypi.org). + +### OS Compatibility + +Using Github Actions' [build matrix feature](https://docs.github.com/en/actions/learn-github-actions/managing-complex-workflows#using-a-build-matrix), we're able to run unit tests on MacOS, Windows, & Linux, for each supported version of Python. + +### Publish + +A separate `publish` workflow is configured in `.github/workflows/publish.yaml`. This workflow publishes the package to [PyPI](https://pypi.org), and is triggered by a Github release being published. + +## Pull Requests + +The `main` branch has [branch protections](https://help.github.com/en/github/administering-a-repository/about-protected-branches) turned on in Github, requiring one reviewer to approve a PR before merging. We also use the code owners feature to specify who can approve certain PRs. As well, merging a PR requires status checks (Read the Docs along with both CI/CD jobs) to complete successfully. + +When naming a branch, please use the syntax `username/branch-name-here`. If you plan to collaborate with others on that branch, use `team/branch-name-here`. + +## Future Work + +- Better unit testing +- 100% test coverage +- Doctest +- Improve README.md +- Integration testing +- Primary host usecase +- Edge Node + - Aliases + - Rebirth + - Multiple MQTT server support + - Drop datatypes + - Report by exception logic + - Device metric polling +- data types + - Template types + - Metadata + - Properties + - DataSet types + - Array types +- MQTT v5 +- Historian/analytics (just listens) +- Refactor all of `_payload.py`. +- Refactor `_datatype.py` for better type annotation. diff --git a/.github/ISSUE_TEMPLATE/bug_report.md b/.github/ISSUE_TEMPLATE/bug_report.md new file mode 100644 index 0000000..e4d9379 --- /dev/null +++ b/.github/ISSUE_TEMPLATE/bug_report.md @@ -0,0 +1,21 @@ +--- +name: Bug report +about: Create a report to help us fix the issue +title: '[BUG]' +labels: '' +assignees: matteosox + +--- + +## Describe the bug 🐛 +A clear and concise description of what the bug is, i.e. what you did, what you expected to happen, and what happened instead. + +## To Reproduce 🔁 +Preferably a code snippet. + +## Environment 🏔 +Copy and paste what you get when you run the code below in the Python interpreter you're using: +`import sys, platform; print(f"{sys.version}\n{platform.platform()}")` + +## Additional context ✨ +Add any other context about the problem here. diff --git a/.github/ISSUE_TEMPLATE/feature_request.md b/.github/ISSUE_TEMPLATE/feature_request.md new file mode 100644 index 0000000..f93265f --- /dev/null +++ b/.github/ISSUE_TEMPLATE/feature_request.md @@ -0,0 +1,20 @@ +--- +name: Feature request +about: Suggest a new idea +title: '[FEATURE]' +labels: '' +assignees: matteosox + +--- + +## What seems to be the problem here? 🐛 +A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] + +## Proposed solution 👩‍💻 +A clear and concise description of what you want to happen. 
+
+## Alternatives 🚧
+A clear and concise description of any alternative solutions or features you've considered.
+
+## Additional context ✨
+Add any other context or screenshots about the feature request here.
diff --git a/.github/PULL_REQUEST_TEMPLATE.md b/.github/PULL_REQUEST_TEMPLATE.md
new file mode 100644
index 0000000..dbe2498
--- /dev/null
+++ b/.github/PULL_REQUEST_TEMPLATE.md
@@ -0,0 +1,27 @@
+## Summary
+
+What's the [new hotness](https://youtu.be/ha-uagjJQ9k?t=17)?
+
+### Why?
+
+What is my [purpose](https://youtu.be/X7HmltUWXgs?t=52)?
+
+### How?
+
+But how?!
+
+## Checklist
+
+Most checks are automated, but a few aren't, so make sure to go through and tick them off, even if they don't apply. This checklist is here to help, not deter you. Remember, "Slow is smooth, and smooth is fast".
+
+- [ ] **Unit tests**
+  - Every input should have a test for it.
+  - Every potential raised exception should have a test ensuring it is raised.
+- [ ] **Documentation**
+  - New functions/classes/etc. must be added to `docs/api.rst`.
+  - Changed/added classes/methods/functions have appropriate `versionadded`, `versionchanged`, or `deprecated` [directives](http://www.sphinx-doc.org/en/stable/markup/para.html#directive-versionadded).
+  - The appropriate entry in `CHANGELOG.md` has been included in the "Unreleased" section, i.e. "Added", "Changed", "Deprecated", "Removed", "Fixed", or "Security".
+- [ ] **Future work**
+  - Future work should be documented in the contributor guide, i.e. `.github/CONTRIBUTING.md`.
+
+If you have any questions not answered by a quick read-through of the [contributor guide](https://pysparkplug.mattefay.com/en/latest/contributor_guide.html), add them to this PR and submit it.
diff --git a/.github/workflows/cicd.yaml b/.github/workflows/cicd.yaml
new file mode 100644
index 0000000..9c7852e
--- /dev/null
+++ b/.github/workflows/cicd.yaml
@@ -0,0 +1,75 @@
+name: CI/CD
+
+on:
+  push:
+    branches:
+      - main
+  pull_request:
+    branches:
+      - main
+  workflow_dispatch:
+
+jobs:
+  main:
+    runs-on: ubuntu-latest
+    steps:
+      - name: Checkout
+        uses: actions/checkout@v2
+        with:
+          fetch-depth: 0
+
+      - name: Test
+        run: test/test.sh
+
+      - name: Create draft Github release
+        if: ${{ github.ref == 'refs/heads/main' }}
+        run: test/test.sh -s draft_release
+        env:
+          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+
+      - name: Publish package to testpypi
+        if: ${{ github.ref == 'refs/heads/main' }}
+        run: test/test.sh -s publish -- testpypi
+        env:
+          TWINE_USERNAME: __token__
+          TWINE_PASSWORD: ${{ secrets.TESTPYPI_TOKEN }}
+
+  os_compatibility:
+    runs-on: ${{ matrix.os }}
+    name: "OS: ${{ matrix.os }} Python: ${{ matrix.python-version }}"
+    strategy:
+      matrix:
+        os: ["ubuntu-latest", "windows-latest", "macos-latest"]
+        python-version: ["3.8", "3.9", "3.10", "3.11"]
+    env:
+      OS: ${{ matrix.os }}
+      PYTHON: ${{ matrix.python-version }}
+
+    steps:
+      - name: Checkout
+        uses: actions/checkout@v2
+
+      - name: Set up Python ${{ matrix.python-version }}
+        uses: actions/setup-python@v2
+        with:
+          python-version: ${{ matrix.python-version }}
+
+      - name: Install dependencies
+        run: |
+          pip install --upgrade pip wheel setuptools
+          pip install coverage[toml] nox==2023.04.22
+
+      - name: Run unit tests
+        run: nox --session unit_tests-${{ matrix.python-version }}
+
+      - name: Combine coverage reports
+        run: |
+          coverage combine
+          coverage xml --fail-under 0
+
+      - name: Upload to Codecov
+        uses: codecov/codecov-action@v2
+        with:
+          env_vars: OS,PYTHON
+          fail_ci_if_error: true
+          verbose: true
diff --git
a/.github/workflows/publish.yaml b/.github/workflows/publish.yaml new file mode 100644 index 0000000..804a7a0 --- /dev/null +++ b/.github/workflows/publish.yaml @@ -0,0 +1,20 @@ +name: Release + +on: + release: + types: + - published + workflow_dispatch: + +jobs: + publish: + runs-on: ubuntu-latest + steps: + - name: Checkout + uses: actions/checkout@v2 + + - name: Publish package to pypi + run: test/test.sh -s publish -- pypi + env: + TWINE_USERNAME: __token__ + TWINE_PASSWORD: ${{ secrets.PYPI_TOKEN }} diff --git a/.gitignore b/.gitignore new file mode 100644 index 0000000..dc75973 --- /dev/null +++ b/.gitignore @@ -0,0 +1,19 @@ +# MacOS +.DS_Store + +# Python +__pycache__/ +*.pyc +*.egg-info +.ipynb_checkpoints +src/pysparkplug/_version.py + +# VS Code +/.vscode/ +/*.code-workspace + +# Cache from Docker container +/.cache + +# Docs +/docs/build diff --git a/.readthedocs.yaml b/.readthedocs.yaml new file mode 100644 index 0000000..b07a009 --- /dev/null +++ b/.readthedocs.yaml @@ -0,0 +1,29 @@ +# Read the Docs configuration file +# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details + +version: 2 + +# Set the version of Python and other tools you might need +build: + os: ubuntu-22.04 + tools: + python: "3.10" + +# Declare the Python requirements required to build your docs +python: + install: + - method: pip + path: . + - requirements: + - furo + - myst-parser + - packaging + - sphinx + - sphinx-copybutton + - sphinx-notfound-page + - sphinxext-opengraph + +# Build documentation in the docs/source directory with Sphinx +sphinx: + fail_on_warning: true + configuration: docs/conf.py diff --git a/CHANGELOG.md b/CHANGELOG.md new file mode 100644 index 0000000..6487857 --- /dev/null +++ b/CHANGELOG.md @@ -0,0 +1,46 @@ +# Changelog + +All notable changes for `pysparkplug` will be documented in this file. +This project adheres to [Semantic Versioning](http://semver.org/) and [Keep a Changelog](http://keepachangelog.com/). 
+ +## Unreleased + +### Added +- `Client` low-level MQTT client +- `ClientOptions` class of optional settings for an MQTT client +- `ConnackCode` MQTT Connection Acknowledgement codes +- `DBirth` class representing a DBirth payload +- `DCmd` class representing a DCmd payload +- `DData` class representing a DData payload +- `DDeath` class representing a DDeath payload +- `DataType` enumeration of Sparkplug B datatypes +- `Device` class representing a Device in Sparkplug B +- `EdgeNode` class representing an EdgeNode in Sparkplug B +- `ErrorCode` MQTT error codes +- `MQTTError` Error from MQTT client +- `MQTTProtocol` MQTT protocol enum +- `Message` class representing a Sparkplug B message +- `MessageType` Sparkplug B message type enum +- `Metric` class representing a Sparkplug B metric +- `MetricValue` type alias for the Python type of the value of a Sparkplug B metric +- `NBirth` class representing a NBirth payload +- `NCmd` class representing a NCmd payload +- `NData` class representing a NData payload +- `NDeath` class representing a NDeath payload +- `QoS` MQTT quality of service enum +- `State` class representing a State payload +- `TLSConfig` TLS configuration class +- `Topic` class representing a Sparkplug B topic +- `Transport` MQTT transort enum +- `WSConfig` Websockets configuration class +- `get_current_timestamp` returns current time in a Sparkplug B compliant format + +### Changed + +### Deprecated + +### Removed + +### Fixed + +### Security diff --git a/Dockerfile b/Dockerfile new file mode 100644 index 0000000..92a338f --- /dev/null +++ b/Dockerfile @@ -0,0 +1,45 @@ +# syntax=docker/dockerfile:1.4 +FROM ubuntu:22.04 + +# Install OS-level packages +RUN --mount=type=cache,target=/var/cache/apt \ + --mount=type=cache,target=/var/lib/apt \ + # Ubuntu image is configured to delete cached files. + # We're using a cache mount, so we remove that config. + rm --force /etc/apt/apt.conf.d/docker-clean && \ + echo 'Binary::apt::APT::Keep-Downloaded-Packages "true";' > /etc/apt/apt.conf.d/keep-cache && \ + # Tell apt-get we're never going to be able to give manual feedback + export DEBIAN_FRONTEND=noninteractive && \ + # Update the package listing, so we know what packages exist + apt-get update && \ + # Install security updates + apt-get --yes upgrade && \ + # Add deadsnakes ppa to install other versions of Python + apt-get --yes install --no-install-recommends software-properties-common gpg-agent && \ + add-apt-repository ppa:deadsnakes/ppa && \ + apt-get update && \ + # Install packages, without unnecessary recommended packages + apt-get --yes install --no-install-recommends \ + python3.8 python3.8-distutils \ + python3.9 python3.9-distutils python3.10 python3.10-venv \ + python3.11 git tini + +# Create and activate virtual environment +ENV VIRTUAL_ENV="/root/.venv" +RUN python3.10 -m venv "$VIRTUAL_ENV" +ENV PATH="$VIRTUAL_ENV/bin:$PATH" + +# Setup root home directory +WORKDIR /root/pysparkplug + +# Install nox +RUN --mount=type=cache,target=/root/.cache/pip \ + pip install --upgrade pip setuptools wheel && \ + pip install nox==2023.04.22 + +# Trust repo directory +RUN git config --global --add safe.directory /root/pysparkplug + +ENV XDG_CACHE_HOME="/root/pysparkplug/.cache" + +ENTRYPOINT ["tini", "-v", "--"] diff --git a/LICENSE b/LICENSE new file mode 100644 index 0000000..aed7439 --- /dev/null +++ b/LICENSE @@ -0,0 +1,201 @@ + Apache License + Version 2.0, January 2004 + http://www.apache.org/licenses/ + + TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION + + 1. 
Definitions. + + "License" shall mean the terms and conditions for use, reproduction, + and distribution as defined by Sections 1 through 9 of this document. + + "Licensor" shall mean the copyright owner or entity authorized by + the copyright owner that is granting the License. + + "Legal Entity" shall mean the union of the acting entity and all + other entities that control, are controlled by, or are under common + control with that entity. For the purposes of this definition, + "control" means (i) the power, direct or indirect, to cause the + direction or management of such entity, whether by contract or + otherwise, or (ii) ownership of fifty percent (50%) or more of the + outstanding shares, or (iii) beneficial ownership of such entity. + + "You" (or "Your") shall mean an individual or Legal Entity + exercising permissions granted by this License. + + "Source" form shall mean the preferred form for making modifications, + including but not limited to software source code, documentation + source, and configuration files. + + "Object" form shall mean any form resulting from mechanical + transformation or translation of a Source form, including but + not limited to compiled object code, generated documentation, + and conversions to other media types. + + "Work" shall mean the work of authorship, whether in Source or + Object form, made available under the License, as indicated by a + copyright notice that is included in or attached to the work + (an example is provided in the Appendix below). + + "Derivative Works" shall mean any work, whether in Source or Object + form, that is based on (or derived from) the Work and for which the + editorial revisions, annotations, elaborations, or other modifications + represent, as a whole, an original work of authorship. For the purposes + of this License, Derivative Works shall not include works that remain + separable from, or merely link (or bind by name) to the interfaces of, + the Work and Derivative Works thereof. + + "Contribution" shall mean any work of authorship, including + the original version of the Work and any modifications or additions + to that Work or Derivative Works thereof, that is intentionally + submitted to Licensor for inclusion in the Work by the copyright owner + or by an individual or Legal Entity authorized to submit on behalf of + the copyright owner. For the purposes of this definition, "submitted" + means any form of electronic, verbal, or written communication sent + to the Licensor or its representatives, including but not limited to + communication on electronic mailing lists, source code control systems, + and issue tracking systems that are managed by, or on behalf of, the + Licensor for the purpose of discussing and improving the Work, but + excluding communication that is conspicuously marked or otherwise + designated in writing by the copyright owner as "Not a Contribution." + + "Contributor" shall mean Licensor and any individual or Legal Entity + on behalf of whom a Contribution has been received by Licensor and + subsequently incorporated within the Work. + + 2. Grant of Copyright License. Subject to the terms and conditions of + this License, each Contributor hereby grants to You a perpetual, + worldwide, non-exclusive, no-charge, royalty-free, irrevocable + copyright license to reproduce, prepare Derivative Works of, + publicly display, publicly perform, sublicense, and distribute the + Work and such Derivative Works in Source or Object form. + + 3. Grant of Patent License. 
Subject to the terms and conditions of + this License, each Contributor hereby grants to You a perpetual, + worldwide, non-exclusive, no-charge, royalty-free, irrevocable + (except as stated in this section) patent license to make, have made, + use, offer to sell, sell, import, and otherwise transfer the Work, + where such license applies only to those patent claims licensable + by such Contributor that are necessarily infringed by their + Contribution(s) alone or by combination of their Contribution(s) + with the Work to which such Contribution(s) was submitted. If You + institute patent litigation against any entity (including a + cross-claim or counterclaim in a lawsuit) alleging that the Work + or a Contribution incorporated within the Work constitutes direct + or contributory patent infringement, then any patent licenses + granted to You under this License for that Work shall terminate + as of the date such litigation is filed. + + 4. Redistribution. You may reproduce and distribute copies of the + Work or Derivative Works thereof in any medium, with or without + modifications, and in Source or Object form, provided that You + meet the following conditions: + + (a) You must give any other recipients of the Work or + Derivative Works a copy of this License; and + + (b) You must cause any modified files to carry prominent notices + stating that You changed the files; and + + (c) You must retain, in the Source form of any Derivative Works + that You distribute, all copyright, patent, trademark, and + attribution notices from the Source form of the Work, + excluding those notices that do not pertain to any part of + the Derivative Works; and + + (d) If the Work includes a "NOTICE" text file as part of its + distribution, then any Derivative Works that You distribute must + include a readable copy of the attribution notices contained + within such NOTICE file, excluding those notices that do not + pertain to any part of the Derivative Works, in at least one + of the following places: within a NOTICE text file distributed + as part of the Derivative Works; within the Source form or + documentation, if provided along with the Derivative Works; or, + within a display generated by the Derivative Works, if and + wherever such third-party notices normally appear. The contents + of the NOTICE file are for informational purposes only and + do not modify the License. You may add Your own attribution + notices within Derivative Works that You distribute, alongside + or as an addendum to the NOTICE text from the Work, provided + that such additional attribution notices cannot be construed + as modifying the License. + + You may add Your own copyright statement to Your modifications and + may provide additional or different license terms and conditions + for use, reproduction, or distribution of Your modifications, or + for any such Derivative Works as a whole, provided Your use, + reproduction, and distribution of the Work otherwise complies with + the conditions stated in this License. + + 5. Submission of Contributions. Unless You explicitly state otherwise, + any Contribution intentionally submitted for inclusion in the Work + by You to the Licensor shall be under the terms and conditions of + this License, without any additional terms or conditions. + Notwithstanding the above, nothing herein shall supersede or modify + the terms of any separate license agreement you may have executed + with Licensor regarding such Contributions. + + 6. Trademarks. 
This License does not grant permission to use the trade + names, trademarks, service marks, or product names of the Licensor, + except as required for reasonable and customary use in describing the + origin of the Work and reproducing the content of the NOTICE file. + + 7. Disclaimer of Warranty. Unless required by applicable law or + agreed to in writing, Licensor provides the Work (and each + Contributor provides its Contributions) on an "AS IS" BASIS, + WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or + implied, including, without limitation, any warranties or conditions + of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A + PARTICULAR PURPOSE. You are solely responsible for determining the + appropriateness of using or redistributing the Work and assume any + risks associated with Your exercise of permissions under this License. + + 8. Limitation of Liability. In no event and under no legal theory, + whether in tort (including negligence), contract, or otherwise, + unless required by applicable law (such as deliberate and grossly + negligent acts) or agreed to in writing, shall any Contributor be + liable to You for damages, including any direct, indirect, special, + incidental, or consequential damages of any character arising as a + result of this License or out of the use or inability to use the + Work (including but not limited to damages for loss of goodwill, + work stoppage, computer failure or malfunction, or any and all + other commercial damages or losses), even if such Contributor + has been advised of the possibility of such damages. + + 9. Accepting Warranty or Additional Liability. While redistributing + the Work or Derivative Works thereof, You may choose to offer, + and charge a fee for, acceptance of support, warranty, indemnity, + or other liability obligations and/or rights consistent with this + License. However, in accepting such obligations, You may act only + on Your own behalf and on Your sole responsibility, not on behalf + of any other Contributor, and only if You agree to indemnify, + defend, and hold each Contributor harmless for any liability + incurred by, or claims asserted against, such Contributor by reason + of your accepting any such warranty or additional liability. + + END OF TERMS AND CONDITIONS + + APPENDIX: How to apply the Apache License to your work. + + To apply the Apache License to your work, attach the following + boilerplate notice, with the fields enclosed by brackets "[]" + replaced with your own identifying information. (Don't include + the brackets!) The text should be enclosed in the appropriate + comment syntax for the file format. We also recommend that a + file or class name and description of purpose be included on the + same "printed page" as the copyright notice for easier + identification within third-party archives. + + Copyright (c) 2021 Matt Fay + + Licensed under the Apache License, Version 2.0 (the "License"); + you may not use this file except in compliance with the License. + You may obtain a copy of the License at + + http://www.apache.org/licenses/LICENSE-2.0 + + Unless required by applicable law or agreed to in writing, software + distributed under the License is distributed on an "AS IS" BASIS, + WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + See the License for the specific language governing permissions and + limitations under the License. 
diff --git a/README.md b/README.md new file mode 100644 index 0000000..d15c712 --- /dev/null +++ b/README.md @@ -0,0 +1,29 @@ +# **PySparkplug**: Sparkplug B for Python + +[![CI/CD: n/a](https://github.com/matteosox/pysparkplug/actions/workflows/cicd.yaml/badge.svg)](https://github.com/matteosox/pysparkplug/actions/workflows/cicd.yaml) +[![Docs: n/a](https://readthedocs.org/projects/pysparkplug/badge/?version=stable)](https://pysparkplug.mattefay.com) +[![Downloads: n/a](https://static.pepy.tech/personalized-badge/pysparkplug?period=total&units=none&left_color=grey&right_color=blue&left_text=Downloads)](https://pepy.tech/project/pysparkplug) +[![PyPI: n/a](https://img.shields.io/badge/dynamic/json?color=blueviolet&label=PyPI&query=%24.info.version&url=https%3A%2F%2Fpypi.org%2Fpypi%pysparkplug%2Fjson)](https://pypi.org/project/pysparkplug/) +[![codecov: n/a](https://codecov.io/gh/matteosox/pysparkplug/branch/main/graph/badge.svg?token=8VKKDG9SMZ)](https://codecov.io/gh/matteosox/pysparkplug) + +## Getting Started + +### Installation + +`pysparkplug` is a pip-installable package [hosted on PyPI](https://pypi.org/project/pysparkplug/). Getting started is as easy as: + +```console +$ pip install pysparkplug +``` + +`pysparkplug` uses the Eclipse Paho™ MQTT Python Client, i.e. [`paho-mqtt`](https://github.com/eclipse/paho.mqtt.python), for low-level MQTT communication. + +### Usage + +More documentation to come later, but for now, you can find some example usage notebooks in the `notebooks` directory. + +## Features + +### Fully type annotated + +`pysparkplug`'s various interfaces are fully type annotated, passing [Mypy](https://mypy.readthedocs.io/en/stable/)'s static type checker. diff --git a/compose.yaml b/compose.yaml new file mode 100644 index 0000000..ef29130 --- /dev/null +++ b/compose.yaml @@ -0,0 +1,33 @@ +services: + cicd: + build: . + volumes: + - ./:/root/pysparkplug + command: ["exit", "0"] + environment: + - GH_TOKEN + - TWINE_USERNAME + - TWINE_PASSWORD + emqx: + image: emqx/emqx:5.0.9 + ports: + - "18083:18083" # Dashboard + environment: + EMQX_DASHBOARD__DEFAULT_PASSWORD: admin + notebook: + build: + context: . + dockerfile: notebook.Dockerfile + command: + - "start.sh" + - "jupyter" + - "lab" + - "--ServerApp.token='bokchoy'" + - "--ServerApp.allow_origin='*'" + - "--ServerApp.disable_check_xsrf=True" + - "--ServerApp.ip='0.0.0.0'" + - "--notebook-dir=/home/jovyan/pysparkplug/notebooks" + ports: + - "8888:8888" # Notebook environment + volumes: + - ./:/home/jovyan/pysparkplug diff --git a/docs/api.rst b/docs/api.rst new file mode 100644 index 0000000..56e46dd --- /dev/null +++ b/docs/api.rst @@ -0,0 +1,139 @@ +API Reference +============= + +.. currentmodule:: pysparkplug + +Summary +------- + +.. autosummary:: + + Client + ClientOptions + ConnackCode + DBirth + DCmd + DData + DDeath + DataType + Device + EdgeNode + ErrorCode + MQTTError + MQTTProtocol + Message + MessageType + Metric + MetricValue + NBirth + NCmd + NData + NDeath + QoS + State + TLSConfig + Topic + Transport + WSConfig + get_current_timestamp + +Interfaces +---------- + +.. autoclass:: Client + :members: + +.. autoclass:: EdgeNode + :members: + +.. autoclass:: Device + :members: + + +Nuts & Bolts +------------ + +.. autoclass:: Message + :members: + +.. autoclass:: Topic + :members: + +.. autoclass:: Metric + :members: + +Payloads +-------- + +.. autoclass:: NBirth + :members: + :inherited-members: + +.. autoclass:: DBirth + :members: + :inherited-members: + +.. 
autoclass:: NData + :members: + :inherited-members: + +.. autoclass:: DData + :members: + :inherited-members: + +.. autoclass:: NCmd + :members: + :inherited-members: + +.. autoclass:: DCmd + :members: + :inherited-members: + +.. autoclass:: NDeath + :members: + :inherited-members: + +.. autoclass:: DDeath + :members: + :inherited-members: + +.. autoclass:: State + :members: + :inherited-members: + +Config Classes +-------------- + +.. autoclass:: ClientOptions + +.. autoclass:: TLSConfig + +.. autoclass:: WSConfig + +Enums +----- + +.. autoclass:: ConnackCode + +.. autoclass:: DataType + :members: + +.. autoclass:: ErrorCode + +.. autoclass:: MessageType + :members: + +.. autoclass:: MQTTProtocol + +.. autoclass:: QoS + +.. autoclass:: Transport + +Odds & Ends +----------- + +.. autofunction:: get_current_timestamp + +.. autoclass:: MetricValue + +.. autoexception:: MQTTError diff --git a/docs/changelog.md b/docs/changelog.md new file mode 100644 index 0000000..66efc0f --- /dev/null +++ b/docs/changelog.md @@ -0,0 +1,2 @@ +```{include} ../CHANGELOG.md +``` diff --git a/docs/conf.py b/docs/conf.py new file mode 100644 index 0000000..0e56847 --- /dev/null +++ b/docs/conf.py @@ -0,0 +1,156 @@ +""" +Configuration file for the Sphinx documentation builder. + +For a full list of confiuration options, see the documentation: +https://www.sphinx-doc.org/en/master/usage/configuration.html +""" +# pylint: disable=invalid-name + +import datetime +import doctest +import importlib.metadata +import os +import sys + +from packaging.version import Version + +REPO_ROOT = os.path.dirname(os.path.dirname(os.path.realpath(__file__))) + +# Setup sys.path so we can import other modules +sys.path.append(REPO_ROOT) + +from docs import linkcode + +# -- Project information ----------------------------------------------------- + +project = "PySparkplug" +author = "Matt Fay" +copyright = f"2023-{datetime.datetime.now().year}, {author}" # pylint: disable=redefined-builtin + +# The full version, including alpha/beta/rc tags +release = importlib.metadata.version("pysparkplug") +_version = Version(release) +version = f"{_version.major}.{_version.minor}.{_version.micro}" + +# -- General configuration --------------------------------------------------- + +# Add any paths that contain templates here, relative to this directory. +templates_path = ["templates"] + +# List of patterns, relative to source directory, that match files and +# directories to ignore when looking for source files. +# This pattern also affects html_static_path and html_extra_path. +exclude_patterns = [] + +# Pygments style to use for highlighting when the CSS media query +# (prefers-color-scheme: dark) evaluates to true. +pygments_dark_style = "monokai" + +# Sphinx warns about all references where the target cannot be found, except +# those explicitly ignored. +nitpicky = True +nitpick_ignore = [ + ("py:class", "pysparkplug._payload.Payload"), + ("py:class", "paho.mqtt.client.MQTTMessage"), + ("py:class", "pysparkplug._payload.Birth"), + ("py:class", "sparkplug_b_pb2.Metric"), + ("py:class", "ssl._SSLMethod"), + ("py:class", "typing_extensions.Self"), + ("py:class", "Self"), +] + +# -- Extension configuration ------------------------------------------------- + +# Add any Sphinx extension module names here, as strings. They can be +# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom +# ones. 
+extensions = [ + "myst_parser", + "notfound.extension", + "sphinx.ext.autodoc", + "sphinx.ext.autosummary", + "sphinx.ext.doctest", + "sphinx.ext.intersphinx", + "sphinx.ext.linkcode", + "sphinx.ext.napoleon", + "sphinx_copybutton", + "sphinxext.opengraph", +] + +# Show typehints as content of the function or method The typehints of +# overloaded functions or methods will still be represented in the +# signature. +autodoc_typehints = "description" + +# A dictionary for users defined type aliases that maps a type name to +# the full-qualified object name. +autodoc_type_aliases = { + "Self": "typing.Self", + "typing_extensions.Self": "typing.Self", +} + +# Add links to modules and objects in the Python standard library documentation +intersphinx_mapping = {"python": ("https://docs.python.org/3", None)} + +# Default flags for testing `doctest` directives used by the +# `sphinx.ext.doctest` Sphinx extension +doctest_default_flags = doctest.DONT_ACCEPT_TRUE_FOR_1 | doctest.ELLIPSIS + +# Auto-generate Header Anchors +myst_heading_anchors = 4 + +# We don't need warnings about non-consecutive header level +suppress_warnings = ["myst.header"] + +# The `sphinx.ext.linkcode` extension returns the URL to source code +# corresponding to the object referenced. +linkcode_resolve = linkcode.linkcode_resolve + +# sphinxext-opengraph settings +ogp_site_url = "https://pysparkplug.mattefay.com" +ogp_site_name = f"PySparkplug {release}" +ogp_image = "https://pysparkplug.mattefay.com/en/stable/static/logo.png" +ogp_image_alt = False + +# -- Options for HTML output ------------------------------------------------- + +# The theme to use for HTML and HTML Help pages. See the documentation for +# a list of builtin themes. +html_theme = "furo" +html_theme_options = { + "sidebar_hide_name": True, +} + +# Path to the logo placed at the top of the sidebar +html_logo = "static/logo.png" + +html_title = f"PySparkplug {release}" + +# Add any paths that contain custom static files (such as style sheets) here, +# relative to this directory. They are copied after the builtin static files, +# so a file named "default.css" will overwrite the builtin "default.css". +html_static_path = ["static"] + +# Hide link to each page's source file in the footer. 
+html_show_sourcelink = False + +# -- Build the readme -------------------------------------------------------- + + +def build_readme() -> None: + """Copy README.md over, in the process adding doctests""" + name = "README.md" + with open(os.path.join(REPO_ROOT, name), encoding="utf-8") as source: + readme = source.read() + + dest_dir = os.path.join(REPO_ROOT, "docs", "build") + try: + os.mkdir(dest_dir) + except FileExistsError: + pass + + with open(os.path.join(dest_dir, name), "w", encoding="utf-8") as dest: + dest.write(readme.replace("```python\n>>> ", "```{doctest}\n>>> ")) + + +build_readme() diff --git a/docs/contributor_guide.md b/docs/contributor_guide.md new file mode 100644 index 0000000..2b93d8f --- /dev/null +++ b/docs/contributor_guide.md @@ -0,0 +1,2 @@ +```{include} ../.github/CONTRIBUTING.md +``` diff --git a/docs/index.md b/docs/index.md new file mode 100644 index 0000000..39f20a3 --- /dev/null +++ b/docs/index.md @@ -0,0 +1,14 @@ +```{include} build/README.md +``` + +```{toctree} +:hidden: + +Overview +api.rst +changelog.md +contributor_guide.md +Report an Issue +GitHub +PyPI +``` diff --git a/docs/linkcode.py b/docs/linkcode.py new file mode 100644 index 0000000..d3363a0 --- /dev/null +++ b/docs/linkcode.py @@ -0,0 +1,83 @@ +"""Python module with single public linkcode_resolve function""" + +import importlib.metadata +import inspect +import os +import shlex +import subprocess +import sys + +from packaging.version import Version + +import pysparkplug + + +def linkcode_resolve(domain: str, info: dict[str, str]) -> str: + """ + linkcode Sphinx extension uses this function to map objects to be + documented to external URLs where the code is kept, in our case + github. Read more at: + https://www.sphinx-doc.org/en/master/usage/extensions/linkcode.html + """ + if domain != "py": + raise ValueError(f"Not currently documenting {domain}, only Python") + + modname = info["module"] + fullname = info["fullname"] + rel_url = _get_rel_url(modname, fullname) + blob = _get_blob() + + return f"https://github.com/matteosox/pysparkplug/blob/{blob}/src/pysparkplug/{rel_url}" + + +def _get_blob() -> str: + version_str = importlib.metadata.version("pysparkplug") + version = Version(version_str) + if version.is_devrelease or version.is_postrelease: + return _get_git_sha() + return version_str + + +def _get_git_sha() -> str: + completed_process = subprocess.run( + shlex.split("git rev-parse --short HEAD"), + check=True, + text=True, + capture_output=True, + ) + return completed_process.stdout.strip() + + +def _get_rel_url(modname: str, fullname: str) -> str: + """Get the relative url given the module name and fullname""" + obj = sys.modules[modname] + for part in fullname.split("."): + obj = getattr(obj, part) + + # strip decorators, which would resolve to the source of the decorator + # possibly an upstream bug in getsourcefile, bpo-1764286 + obj = inspect.unwrap(obj) # type: ignore[arg-type] + + # Can only get source files for some Python objects + source_file = None + try: + source_file = inspect.getsourcefile(obj) + except TypeError: + source_file = sys.modules[modname].__file__ + finally: + if source_file is None: + rel_path = "" + else: + rel_path = os.path.relpath( + source_file, start=os.path.dirname(pysparkplug.__file__) + ) + + # Can only get source lines for some Python objects + try: + source, lineno = inspect.getsourcelines(obj) + except TypeError: + linespec = "" + else: + linespec = f"#L{lineno}-L{lineno + len(source) - 1}" + + return f"{rel_path}{linespec}" diff --git 
a/docs/static/logo.png b/docs/static/logo.png new file mode 100644 index 0000000..612fde9 Binary files /dev/null and b/docs/static/logo.png differ diff --git a/docs/templates/sidebar/brand.html b/docs/templates/sidebar/brand.html new file mode 100644 index 0000000..a3dc603 --- /dev/null +++ b/docs/templates/sidebar/brand.html @@ -0,0 +1,5 @@ +{% extends "!sidebar/brand.html" %} +{% block brand_content %} + {{ super() }} + {{ release }} +{% endblock %} diff --git a/notebook.Dockerfile b/notebook.Dockerfile new file mode 100644 index 0000000..e198d2e --- /dev/null +++ b/notebook.Dockerfile @@ -0,0 +1,11 @@ +# Start from a core stack version +FROM jupyter/scipy-notebook:2023-05-15 + +# Move to directory where repo will be mounted in home directory +WORKDIR /home/jovyan/pysparkplug + +# Install requirements +COPY --chown=${NB_UID}:${NB_GID} . . +RUN pip install --quiet --no-cache-dir --editable . && \ + fix-permissions "${CONDA_DIR}" && \ + fix-permissions "/home/${NB_USER}" diff --git a/notebooks/dcmd_demo.ipynb b/notebooks/dcmd_demo.ipynb new file mode 100644 index 0000000..c74f6dc --- /dev/null +++ b/notebooks/dcmd_demo.ipynb @@ -0,0 +1,102 @@ +{ + "cells": [ + { + "cell_type": "code", + "execution_count": 1, + "id": "808f4138-328a-4bac-839f-bec4becf1edd", + "metadata": { + "tags": [] + }, + "outputs": [], + "source": [ + "import pysparkplug as psp\n", + "\n", + "\n", + "host = \"emqx\"\n", + "port = 1883\n", + "username = \"\"\n", + "password = \"\"\n", + "\n", + "client = psp.Client(\n", + " username=username,\n", + " password=password,\n", + ")\n", + "\n", + "client.connect(\n", + " host,\n", + " port=port,\n", + " blocking=False,\n", + ")" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "id": "fa6b1285-8462-479c-a535-7c31e76dfed1", + "metadata": { + "tags": [] + }, + "outputs": [], + "source": [ + "group_id = \"Fab\"\n", + "edge_node_id = \"Opto3 Edge Node\"\n", + "device_id = \"LIQUID\"\n", + "name = \"CM_FEED/Auto_FlowSP\"\n", + "value = 1.1\n", + "\n", + "metrics = (\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=name, datatype=psp.DataType.DOUBLE, value=value),\n", + ")\n", + "payload = psp.DCmd(timestamp=psp.get_current_timestamp(), metrics=metrics)\n", + "\n", + "topic = psp.Topic(\n", + " message_type=psp.MessageType.DCMD,\n", + " group_id=group_id,\n", + " edge_node_id=edge_node_id,\n", + " device_id=device_id,\n", + ")" + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "id": "9ef61b7d-c46c-4ff4-99c4-9ec2493a08e2", + "metadata": { + "tags": [] + }, + "outputs": [], + "source": [ + "client.publish(\n", + " psp.Message(\n", + " topic=topic,\n", + " payload=payload,\n", + " qos=psp.QoS.AT_MOST_ONCE,\n", + " retain=False,\n", + " ),\n", + " include_dtypes=True,\n", + ")" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3 (ipykernel)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.10.11" + } + }, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/notebooks/device_demo.ipynb b/notebooks/device_demo.ipynb new file mode 100644 index 0000000..55a8408 --- /dev/null +++ b/notebooks/device_demo.ipynb @@ -0,0 +1,89 @@ +{ + "cells": [ + { + "cell_type": "code", + "execution_count": 1, + "id": "808f4138-328a-4bac-839f-bec4becf1edd", + "metadata": { + "tags": [] + }, + "outputs": 
[], + "source": [ + "import time\n", + "import datetime\n", + "\n", + "import pysparkplug as psp\n", + "\n", + "host = \"emqx\"\n", + "group_id = \"my_group\"\n", + "edge_node_id = \"my_edge_node\"\n", + "device_id = \"my_device\"\n", + "metrics = (\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=\"uint8\", datatype=psp.DataType.UINT8, value=1),\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=\"uint16\", datatype=psp.DataType.UINT16, value=2),\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=\"uint32\", datatype=psp.DataType.UINT32, value=3),\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=\"uint64\", datatype=psp.DataType.UINT64, value=4),\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=\"int8\", datatype=psp.DataType.INT8, value=-1),\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=\"int16\", datatype=psp.DataType.INT16, value=-2),\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=\"int32\", datatype=psp.DataType.INT32, value=-3),\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=\"int64\", datatype=psp.DataType.INT64, value=-4),\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=\"float\", datatype=psp.DataType.FLOAT, value=1.1),\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=\"double\", datatype=psp.DataType.DOUBLE, value=2.2),\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=\"boolean\", datatype=psp.DataType.BOOLEAN, value=True),\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=\"string\", datatype=psp.DataType.STRING, value=\"hello world\"),\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=\"datetime\", datatype=psp.DataType.DATETIME, value=datetime.datetime(1990, 9, 3, 5, 4, 3)),\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=\"text\", datatype=psp.DataType.TEXT, value=\"iamatext\"),\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=\"uuid\", datatype=psp.DataType.UUID, value=\"iamauuid\"),\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=\"bytes\", datatype=psp.DataType.BYTES, value=b\"iamabytes\"),\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=\"file\", datatype=psp.DataType.FILE, value=b\"iamafile\"),\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=\"null_uint8\", datatype=psp.DataType.UINT8),\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=\"historical_uint8\", datatype=psp.DataType.UINT8, value=1, is_historical=True),\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=\"transient_uint8\", datatype=psp.DataType.UINT8, value=1, is_transient=True),\n", + ")\n", + "\n", + "edge_node = psp.EdgeNode(group_id, edge_node_id, metrics)\n", + "device = psp.Device(device_id, metrics)\n", + "edge_node.register(device)\n", + "\n", + "edge_node.connect(host)\n", + "time.sleep(1)\n", + "edge_node.update(metrics)\n", + "edge_node.update_device(device_id, metrics)\n", + "edge_node.deregister(device_id)" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "id": "af1372c1-0fb2-4f39-a8b2-b01f6d27ea5f", + "metadata": { + "tags": [] + }, + "outputs": [], + "source": [ + "edge_node.disconnect()" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3 (ipykernel)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + 
"nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.10.11" + } + }, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/notebooks/edge_node_demo.ipynb b/notebooks/edge_node_demo.ipynb new file mode 100644 index 0000000..bf91b4c --- /dev/null +++ b/notebooks/edge_node_demo.ipynb @@ -0,0 +1,72 @@ +{ + "cells": [ + { + "cell_type": "code", + "execution_count": 1, + "id": "808f4138-328a-4bac-839f-bec4becf1edd", + "metadata": { + "tags": [] + }, + "outputs": [], + "source": [ + "import time\n", + "import datetime\n", + "\n", + "import pysparkplug as psp\n", + "\n", + "host = \"emqx\"\n", + "group_id = \"my_group\"\n", + "edge_node_id = \"my_edge_node\"\n", + "metrics = (\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=\"uint8\", datatype=psp.DataType.UINT8, value=1),\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=\"uint16\", datatype=psp.DataType.UINT16, value=2),\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=\"uint32\", datatype=psp.DataType.UINT32, value=3),\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=\"uint64\", datatype=psp.DataType.UINT64, value=4),\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=\"int8\", datatype=psp.DataType.INT8, value=-1),\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=\"int16\", datatype=psp.DataType.INT16, value=-2),\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=\"int32\", datatype=psp.DataType.INT32, value=-3),\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=\"int64\", datatype=psp.DataType.INT64, value=-4),\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=\"float\", datatype=psp.DataType.FLOAT, value=1.1),\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=\"double\", datatype=psp.DataType.DOUBLE, value=2.2),\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=\"boolean\", datatype=psp.DataType.BOOLEAN, value=True),\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=\"string\", datatype=psp.DataType.STRING, value=\"hello world\"),\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=\"datetime\", datatype=psp.DataType.DATETIME, value=datetime.datetime(1990, 9, 3, 5, 4, 3)),\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=\"text\", datatype=psp.DataType.TEXT, value=\"iamatext\"),\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=\"uuid\", datatype=psp.DataType.UUID, value=\"iamauuid\"),\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=\"bytes\", datatype=psp.DataType.BYTES, value=b\"iamabytes\"),\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=\"file\", datatype=psp.DataType.FILE, value=b\"iamafile\"),\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=\"null_uint8\", datatype=psp.DataType.UINT8),\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=\"historical_uint8\", datatype=psp.DataType.UINT8, value=1, is_historical=True),\n", + " psp.Metric(timestamp=psp.get_current_timestamp(), name=\"transient_uint8\", datatype=psp.DataType.UINT8, value=1, is_transient=True),\n", + ")\n", + "\n", + "edge_node = psp.EdgeNode(group_id, edge_node_id, metrics)\n", + "\n", + "edge_node.connect(host)\n", + "time.sleep(1)\n", + "edge_node.update(metrics)" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3 (ipykernel)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + 
"file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.10.11" + } + }, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/notebooks/inspect_mqtt.ipynb b/notebooks/inspect_mqtt.ipynb new file mode 100644 index 0000000..b160b6d --- /dev/null +++ b/notebooks/inspect_mqtt.ipynb @@ -0,0 +1,62 @@ +{ + "cells": [ + { + "cell_type": "code", + "execution_count": null, + "id": "b3267489-0544-4620-b8eb-c1eb8a91c98b", + "metadata": { + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "From spBv1.0/my_group/DCMD/my_edge_node/my_device (QoS=0, retain=0):\n", + " DCmd(timestamp=1690378628005, metrics=(Metric(timestamp=1690378628004, name='my_metric', datatype=, value=1.1, alias=None, is_historical=False, is_transient=False, is_null=False),))\n" + ] + } + ], + "source": [ + "import textwrap\n", + "\n", + "import pysparkplug as psp\n", + "\n", + "\n", + "def callback(client: psp.Client, message: psp.Message) -> None:\n", + " print(f\"From {message.topic} (QoS={message.qos}, retain={message.retain}):\")\n", + " if isinstance(message.payload, (psp.DBirth, psp.NBirth, psp.NData, psp.DData)):\n", + " for metric in message.payload.metrics:\n", + " print(textwrap.indent(str(metric), \" \"))\n", + " else:\n", + " print(textwrap.indent(str(message.payload), \" \"))\n", + "\n", + "\n", + "client = psp.Client(client_id=\"Listening client\")\n", + "client.subscribe(psp.Topic(\"#\"), psp.QoS.AT_LEAST_ONCE, callback)\n", + "client.connect(\"emqx\", blocking=True)" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3 (ipykernel)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.10.11" + } + }, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/notebooks/run.sh b/notebooks/run.sh new file mode 100755 index 0000000..a7d87ad --- /dev/null +++ b/notebooks/run.sh @@ -0,0 +1,10 @@ +#! /usr/bin/env bash +set -o errexit -o nounset -o pipefail +IFS=$'\n\t' + +REPO_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"/.. 
+cd "$REPO_DIR" + +echo "Opening notebook environment" + +docker compose run --rm -it --service-ports notebook diff --git a/noxfile.py b/noxfile.py new file mode 100644 index 0000000..ec66c0d --- /dev/null +++ b/noxfile.py @@ -0,0 +1,259 @@ +"""Test/developer workflow automation""" + +from typing import cast + +import nox +from packaging.version import Version + +nox.options.error_on_external_run = True +nox.options.reuse_existing_virtualenvs = True +nox.options.error_on_missing_interpreters = True +nox.options.envdir = ".cache/nox" +nox.options.sessions = [ + "black", + "isort", + "pylint", + "mypy", + "unit_tests", + "coverage", + "docs", + "packaging", +] + + +@nox.session(python=False) +def fix(session: nox.Session) -> None: + """Simple workflow to run black and isort in fix mode""" + session.notify("black", ["fix"]) + session.notify("isort", ["fix"]) + + +@nox.session +def black(session: nox.Session) -> None: + """Black Python formatting tool""" + session.install("black") + if session.posargs and session.posargs[0] == "fix": + session.run("black", ".") + else: + session.run("black", "--diff", "--check", ".") + + +@nox.session +def isort(session: nox.Session) -> None: + """ISort Python import formatting tool""" + session.install("isort") + if session.posargs and session.posargs[0] == "fix": + session.run("isort", ".") + else: + session.run("isort", "--check-only", ".") + + +@nox.session +def pylint(session: nox.Session) -> None: + """Pylint Python linting tool""" + session.install("pylint", ".", "nox", "packaging") + session.run( + "pylint", + "src", + "test/unit_tests", + "noxfile.py", + "docs", + ) + + +@nox.session +def mypy(session: nox.Session) -> None: + """Mypy Python static type checker""" + session.install( + "mypy", + ".", + "nox", + "packaging", + "types-protobuf", + "types-paho-mqtt", + ) + session.run( + "mypy", + "--install-types", + "--non-interactive", + "src", + "noxfile.py", + "test/unit_tests", + "docs/linkcode.py", + ) + + +@nox.session(python=["3.8", "3.9", "3.10", "3.11"]) +def unit_tests(session: nox.Session) -> None: + """Unit test suite run with coverage tracking""" + session.install( + ".", + "coverage[toml]", + "pytest", + "packaging", + ) + if session.posargs and session.posargs[0] == "fast": + session.run("python", "-m", "pytest") + else: + session.run("coverage", "run", "-m", "pytest") + + +@nox.session() +def coverage(session: nox.Session) -> None: + """Report on coverage tracking""" + session.install("coverage[toml]") + try: + session.run("coverage", "combine") + session.run("coverage", "report") + finally: + session.run("rm", "-f", ".coverage", external=True) + + +@nox.session() +def docs(session: nox.Session) -> None: + """Generate and test documentation""" + session.install( + ".", + "furo", + "myst-parser", + "packaging", + "sphinx", + "sphinx-copybutton", + "sphinx-notfound-page", + "sphinxext-opengraph", + ) + session.run( + "sphinx-build", + "-T", + "-W", + "-E", + "--keep-going", + "--color", + "-b", + "html", + "docs", + "docs/build/html", + ) + session.run( + "sphinx-build", + "-T", + "-W", + "-E", + "--keep-going", + "--color", + "-b", + "doctest", + "docs", + "docs/build/html", + ) + + +@nox.session +def packaging(session: nox.Session) -> None: + """Build and test packaging""" + session.install("check-wheel-contents", "twine", "build") + try: + session.run("python", "-m", "build") + session.run("check-wheel-contents", "dist") + session.run("twine", "check", "dist/*") + finally: + session.run("rm", "-rf", "dist", external=True) + + 
+@nox.session +def draft_release(session: nox.Session) -> None: + """Create a draft Github Release""" + session.install(".") + version_str = _version(session) + version_obj = Version(version_str) + if not version_obj.is_devrelease: + raise ValueError(f"Package version {version_str} should be a dev release") + cmd = [ + "gh", + "release", + "create", + f"v{version_str}", + "--draft", + "--title", + version_str, + "--notes", + _get_notes(), + ] + if version_obj.pre is not None: + cmd.append("--prerelease") + session.run(*cmd, external=True) + + +@nox.session +def publish(session: nox.Session) -> None: + """Publish package to PyPI and upload build artifacts to Github Release""" + session.install(".", "twine", "build") + version_str = _version(session) + version_obj = Version(version_str) + if ( + version_obj.is_devrelease + or version_obj.is_postrelease + or version_obj.local is not None + ): + raise ValueError( + f"Package version {version_str} should not be a post or dev release" + ) + repository = session.posargs[0] + if repository == "pypi": + if ( + version_obj.is_devrelease + or version_obj.is_postrelease + or version_obj.local is not None + ): + raise ValueError( + f"Package version {version_str} should not be a post or dev release" + ) + elif repository == "testpypi": + if not version_obj.is_devrelease: + raise ValueError(f"Package version {version_str} should be a dev release") + else: + raise ValueError(f"Unrecognized repository {repository}") + try: + session.run("python", "-m", "build") + session.run( + "bash", "-c", f"gh release upload v{version_str} dist/*", external=True + ) + session.run( + "bash", + "-c", + f"twine upload --verbose --repository {repository} dist/*", + external=True, + ) + finally: + session.run("rm", "-rf", "dist", external=True) + + +def _version(session: nox.Session) -> str: + output = cast( + str, + session.run( + "python", + "-c", + "import pysparkplug; print(pysparkplug.__version__)", + silent=True, + ), + ) + return output.strip() + + +def _get_notes() -> str: + search_pattern = "## " + changes_lines = [] + with open("CHANGELOG.md", encoding="utf-8") as file_obj: + for line in file_obj: + if line.startswith(search_pattern): + break + else: + raise LookupError(f"Could not find {search_pattern} in CHANGELOG.md") + + for line in file_obj: + if line.startswith(search_pattern): + break + changes_lines.append(line) + + return "".join(changes_lines) diff --git a/pylintrc b/pylintrc new file mode 100644 index 0000000..588a452 --- /dev/null +++ b/pylintrc @@ -0,0 +1,626 @@ +[MAIN] + +# Analyse import fallback blocks. This can be used to support both Python 2 and +# 3 compatible code, which means that the block might have code that exists +# only in one or another interpreter, leading to false positives when analysed. +analyse-fallback-blocks=no + +# Clear in-memory caches upon conclusion of linting. Useful if running pylint +# in a server-like mode. +clear-cache-post-run=no + +# Load and enable all available extensions. Use --list-extensions to see a list +# all available extensions. +#enable-all-extensions= + +# In error mode, messages with a category besides ERROR or FATAL are +# suppressed, and no reports are done by default. Error mode is compatible with +# disabling specific errors. +#errors-only= + +# Always return a 0 (non-error) status code, even if lint errors are found. +# This is primarily useful in continuous integration scripts. +#exit-zero= + +# A comma-separated list of package or module names from where C extensions may +# be loaded. 
Extensions are loading into the active Python interpreter and may +# run arbitrary code. +extension-pkg-allow-list= + +# A comma-separated list of package or module names from where C extensions may +# be loaded. Extensions are loading into the active Python interpreter and may +# run arbitrary code. (This is an alternative name to extension-pkg-allow-list +# for backward compatibility.) +extension-pkg-whitelist= + +# Return non-zero exit code if any of these messages/categories are detected, +# even if score is above --fail-under value. Syntax same as enable. Messages +# specified are enabled, while categories only check already-enabled messages. +fail-on= + +# Specify a score threshold under which the program will exit with error. +fail-under=10 + +# Interpret the stdin as a python script, whose filename needs to be passed as +# the module_or_package argument. +#from-stdin= + +# Files or directories to be skipped. They should be base names, not paths. +ignore=CVS + +# Add files or directories matching the regular expressions patterns to the +# ignore-list. The regex matches against paths and can be in Posix or Windows +# format. Because '\\' represents the directory delimiter on Windows systems, +# it can't be used as an escape character. +ignore-paths=src/pysparkplug/_version.py + +# Files or directories matching the regular expression patterns are skipped. +# The regex matches against base names, not paths. The default value ignores +# Emacs file locks +ignore-patterns=^\.#, .*_pb2.py + +# List of module names for which member attributes should not be checked +# (useful for modules/projects where namespaces are manipulated during runtime +# and thus existing member attributes cannot be deduced by static analysis). It +# supports qualified module names, as well as Unix pattern matching. +ignored-modules= + +# Python code to execute, usually for sys.path manipulation such as +# pygtk.require(). +#init-hook= + +# Use multiple processes to speed up Pylint. Specifying 0 will auto-detect the +# number of processors available to use, and will cap the count on Windows to +# avoid hangs. +jobs=1 + +# Control the amount of potential inferred values when inferring a single +# object. This can help the performance when dealing with large functions or +# complex, nested conditions. +limit-inference-results=100 + +# List of plugins (as comma separated values of python module names) to load, +# usually to register additional checkers. +load-plugins= + +# Pickle collected data for later comparisons. +persistent=yes + +# Minimum Python version to use for version dependent checks. Will default to +# the version used to run pylint. +py-version=3.10 + +# Discover python modules and packages in the file system subtree. +recursive=no + +# When enabled, pylint would attempt to guess common misconfiguration and emit +# user-friendly hints instead of false-positive error messages. +suggestion-mode=yes + +# Allow loading of arbitrary C extensions. Extensions are imported into the +# active Python interpreter and may run arbitrary code. +unsafe-load-any-extension=no + +# In verbose mode, extra non-checker-related info will be displayed. +#verbose= + + +[BASIC] + +# Naming style matching correct argument names. +argument-naming-style=snake_case + +# Regular expression matching correct argument names. Overrides argument- +# naming-style. If left empty, argument names will be checked with the set +# naming style. +#argument-rgx= + +# Naming style matching correct attribute names. 
+attr-naming-style=snake_case + +# Regular expression matching correct attribute names. Overrides attr-naming- +# style. If left empty, attribute names will be checked with the set naming +# style. +#attr-rgx= + +# Bad variable names which should always be refused, separated by a comma. +bad-names=foo, + bar, + baz, + toto, + tutu, + tata + +# Bad variable names regexes, separated by a comma. If names match any regex, +# they will always be refused +bad-names-rgxs= + +# Naming style matching correct class attribute names. +class-attribute-naming-style=any + +# Regular expression matching correct class attribute names. Overrides class- +# attribute-naming-style. If left empty, class attribute names will be checked +# with the set naming style. +#class-attribute-rgx= + +# Naming style matching correct class constant names. +class-const-naming-style=UPPER_CASE + +# Regular expression matching correct class constant names. Overrides class- +# const-naming-style. If left empty, class constant names will be checked with +# the set naming style. +#class-const-rgx= + +# Naming style matching correct class names. +class-naming-style=PascalCase + +# Regular expression matching correct class names. Overrides class-naming- +# style. If left empty, class names will be checked with the set naming style. +#class-rgx= + +# Naming style matching correct constant names. +const-naming-style=UPPER_CASE + +# Regular expression matching correct constant names. Overrides const-naming- +# style. If left empty, constant names will be checked with the set naming +# style. +#const-rgx= + +# Minimum line length for functions/classes that require docstrings, shorter +# ones are exempt. +docstring-min-length=-1 + +# Naming style matching correct function names. +function-naming-style=snake_case + +# Regular expression matching correct function names. Overrides function- +# naming-style. If left empty, function names will be checked with the set +# naming style. +#function-rgx= + +# Good variable names which should always be accepted, separated by a comma. +good-names=_, + cb, + rc + +# Good variable names regexes, separated by a comma. If names match any regex, +# they will always be accepted +good-names-rgxs= + +# Include a hint for the correct naming format with invalid-name. +include-naming-hint=no + +# Naming style matching correct inline iteration names. +inlinevar-naming-style=any + +# Regular expression matching correct inline iteration names. Overrides +# inlinevar-naming-style. If left empty, inline iteration names will be checked +# with the set naming style. +#inlinevar-rgx= + +# Naming style matching correct method names. +method-naming-style=snake_case + +# Regular expression matching correct method names. Overrides method-naming- +# style. If left empty, method names will be checked with the set naming style. +#method-rgx= + +# Naming style matching correct module names. +module-naming-style=snake_case + +# Regular expression matching correct module names. Overrides module-naming- +# style. If left empty, module names will be checked with the set naming style. +#module-rgx= + +# Colon-delimited sets of names that determine each other's naming style when +# the name regexes allow several styles. +name-group= + +# Regular expression which should only match function or class names that do +# not require a docstring. +no-docstring-rgx=^_ + +# List of decorators that produce properties, such as abc.abstractproperty. Add +# to this list to register other decorators that produce valid properties. 
+# These decorators are taken in consideration only for invalid-name. +property-classes=abc.abstractproperty + +# Regular expression matching correct type variable names. If left empty, type +# variable names will be checked with the set naming style. +#typevar-rgx= + +# Naming style matching correct variable names. +variable-naming-style=snake_case + +# Regular expression matching correct variable names. Overrides variable- +# naming-style. If left empty, variable names will be checked with the set +# naming style. +#variable-rgx= + + +[CLASSES] + +# Warn about protected attribute access inside special methods +check-protected-access-in-special-methods=no + +# List of method names used to declare (i.e. assign) instance attributes. +defining-attr-methods=__init__, + __new__, + setUp, + __post_init__ + +# List of member names, which should be excluded from the protected access +# warning. +exclude-protected=_asdict, + _fields, + _replace, + _source, + _make + +# List of valid names for the first argument in a class method. +valid-classmethod-first-arg=cls + +# List of valid names for the first argument in a metaclass class method. +valid-metaclass-classmethod-first-arg=mcs + + +[DESIGN] + +# List of regular expressions of class ancestor names to ignore when counting +# public methods (see R0903) +exclude-too-few-public-methods= + +# List of qualified class names to ignore when counting class parents (see +# R0901) +ignored-parents= + +# Maximum number of arguments for function / method. +max-args=7 + +# Maximum number of attributes for a class (see R0902). +max-attributes=8 + +# Maximum number of boolean expressions in an if statement (see R0916). +max-bool-expr=5 + +# Maximum number of branch for function / method body. +max-branches=12 + +# Maximum number of locals for function / method body. +max-locals=15 + +# Maximum number of parents for a class (see R0901). +max-parents=7 + +# Maximum number of public methods for a class (see R0904). +max-public-methods=20 + +# Maximum number of return / yield for function / method body. +max-returns=6 + +# Maximum number of statements in function / method body. +max-statements=50 + +# Minimum number of public methods for a class (see R0903). +min-public-methods=2 + + +[EXCEPTIONS] + +# Exceptions that will emit a warning when caught. +overgeneral-exceptions=builtins.BaseException,builtins.Exception + + +[FORMAT] + +# Expected format of line ending, e.g. empty (any line ending), LF or CRLF. +expected-line-ending-format= + +# Regexp for a line that is allowed to be longer than the limit. +ignore-long-lines=^\s*(# )?<?https?://\S+>?$ + +# Number of spaces of indent required inside a hanging or continued line. +indent-after-paren=4 + +# String used as indentation unit. This is usually " " (4 spaces) or "\t" (1 +# tab). +indent-string=' ' + +# Maximum number of characters on a single line. +max-line-length=100 + +# Maximum number of lines in a module. +max-module-lines=1000 + +# Allow the body of a class to be on the same line as the declaration if body +# contains single statement. +single-line-class-stmt=no + +# Allow the body of an if to be on the same line as the test if there is no +# else. +single-line-if-stmt=no + + +[IMPORTS] + +# List of modules that can be imported at any level, not just the top level +# one. +allow-any-import-level= + +# Allow explicit reexports by alias from a package __init__. +allow-reexport-from-package=no + +# Allow wildcard imports from modules that define __all__.
+allow-wildcard-with-all=no + +# Deprecated modules which should not be used, separated by a comma. +deprecated-modules= + +# Output a graph (.gv or any supported image format) of external dependencies +# to the given file (report RP0402 must not be disabled). +ext-import-graph= + +# Output a graph (.gv or any supported image format) of all (i.e. internal and +# external) dependencies to the given file (report RP0402 must not be +# disabled). +import-graph= + +# Output a graph (.gv or any supported image format) of internal dependencies +# to the given file (report RP0402 must not be disabled). +int-import-graph= + +# Force import order to recognize a module as part of the standard +# compatibility libraries. +known-standard-library= + +# Force import order to recognize a module as part of a third party library. +known-third-party=enchant + +# Couples of modules and preferred modules, separated by a comma. +preferred-modules= + + +[LOGGING] + +# The type of string formatting that logging methods do. `old` means using % +# formatting, `new` is for `{}` formatting. +logging-format-style=old + +# Logging modules to check that the string format arguments are in logging +# function parameter format. +logging-modules=logging + + +[MESSAGES CONTROL] + +# Only show warnings with the listed confidence levels. Leave empty to show +# all. Valid levels: HIGH, CONTROL_FLOW, INFERENCE, INFERENCE_FAILURE, +# UNDEFINED. +confidence=HIGH, + CONTROL_FLOW, + INFERENCE, + INFERENCE_FAILURE, + UNDEFINED + +# Disable the message, report, category or checker with the given id(s). You +# can either give multiple identifiers separated by comma (,) or put this +# option multiple times (only on the command line, not in the configuration +# file where it should appear only once). You can also use "--disable=all" to +# disable everything first and then re-enable specific checks. For example, if +# you want to run only the similarities checker, you can use "--disable=all +# --enable=similarities". If you want to run only the classes checker, but have +# no Warning level messages displayed, use "--disable=all --enable=classes +# --disable=W". +disable=raw-checker-failed, + bad-inline-option, + locally-disabled, + file-ignored, + suppressed-message, + useless-suppression, + deprecated-pragma, + use-symbolic-message-instead, + line-too-long, + wrong-import-order, + wrong-import-position, + logging-fstring-interpolation, + useless-import-alias + +# Enable the message, report, category or checker with the given id(s). You can +# either give multiple identifier separated by comma (,) or put this option +# multiple time (only on the command line, not in the configuration file where +# it should appear only once). See also the "--disable" option for examples. +enable=c-extension-no-member + + +[METHOD_ARGS] + +# List of qualified names (i.e., library.method) which require a timeout +# parameter e.g. 'requests.api.get,requests.api.post' +timeout-methods=requests.api.delete,requests.api.get,requests.api.head,requests.api.options,requests.api.patch,requests.api.post,requests.api.put,requests.api.request + + +[MISCELLANEOUS] + +# List of note tags to take in consideration, separated by a comma. +notes=FIXME, + XXX, + TODO + +# Regular expression of note tags to take in consideration. +notes-rgx= + + +[REFACTORING] + +# Maximum number of nested blocks for function / method body +max-nested-blocks=5 + +# Complete name of functions that never returns. 
When checking for +# inconsistent-return-statements if a never returning function is called then +# it will be considered as an explicit return statement and no message will be +# printed. +never-returning-functions=sys.exit,argparse.parse_error + + +[REPORTS] + +# Python expression which should return a score less than or equal to 10. You +# have access to the variables 'fatal', 'error', 'warning', 'refactor', +# 'convention', and 'info' which contain the number of messages in each +# category, as well as 'statement' which is the total number of statements +# analyzed. This score is used by the global evaluation report (RP0004). +evaluation=max(0, 0 if fatal else 10.0 - ((float(5 * error + warning + refactor + convention) / statement) * 10)) + +# Template used to display messages. This is a python new-style format string +# used to format the message information. See doc for all details. +msg-template= + +# Set the output format. Available formats are text, parseable, colorized, json +# and msvs (visual studio). You can also give a reporter class, e.g. +# mypackage.mymodule.MyReporterClass. +output-format=colorized + +# Tells whether to display a full report or only the messages. +reports=yes + +# Activate the evaluation score. +score=yes + + +[SIMILARITIES] + +# Comments are removed from the similarity computation +ignore-comments=yes + +# Docstrings are removed from the similarity computation +ignore-docstrings=yes + +# Imports are removed from the similarity computation +ignore-imports=yes + +# Signatures are removed from the similarity computation +ignore-signatures=yes + +# Minimum lines number of a similarity. +min-similarity-lines=4 + + +[SPELLING] + +# Limits count of emitted suggestions for spelling mistakes. +max-spelling-suggestions=4 + +# Spelling dictionary name. Available dictionaries: none. To make it work, +# install the 'python-enchant' package. +spelling-dict= + +# List of comma separated words that should be considered directives if they +# appear at the beginning of a comment and should not be checked. +spelling-ignore-comment-directives=fmt: on,fmt: off,noqa:,noqa,nosec,isort:skip,mypy: + +# List of comma separated words that should not be checked. +spelling-ignore-words= + +# A path to a file that contains the private dictionary; one word per line. +spelling-private-dict-file= + +# Tells whether to store unknown words to the private dictionary (see the +# --spelling-private-dict-file option) instead of raising a message. +spelling-store-unknown-words=no + + +[STRING] + +# This flag controls whether inconsistent-quotes generates a warning when the +# character used as a quote delimiter is used inconsistently within a module. +check-quote-consistency=no + +# This flag controls whether the implicit-str-concat should generate a warning +# on implicit string concatenation in sequences defined over several lines. +check-str-concat-over-line-jumps=no + + +[TYPECHECK] + +# List of decorators that produce context managers, such as +# contextlib.contextmanager. Add to this list to register other decorators that +# produce valid context managers. +contextmanager-decorators=contextlib.contextmanager + +# List of members which are set dynamically and missed by pylint inference +# system, and so shouldn't trigger E1101 when accessed. Python regular +# expressions are accepted. +generated-members= + +# Tells whether to warn about missing members when the owner of the attribute +# is inferred to be None. 
+ignore-none=yes + +# This flag controls whether pylint should warn about no-member and similar +# checks whenever an opaque object is returned when inferring. The inference +# can return multiple potential results while evaluating a Python object, but +# some branches might not be evaluated, which results in partial inference. In +# that case, it might be useful to still emit no-member and other checks for +# the rest of the inferred objects. +ignore-on-opaque-inference=yes + +# List of symbolic message names to ignore for Mixin members. +ignored-checks-for-mixins=no-member, + not-async-context-manager, + not-context-manager, + attribute-defined-outside-init + +# List of class names for which member attributes should not be checked (useful +# for classes with dynamically set attributes). This supports the use of +# qualified names. +ignored-classes=optparse.Values,thread._local,_thread._local,argparse.Namespace + +# Show a hint with possible names when a member name was not found. The aspect +# of finding the hint is based on edit distance. +missing-member-hint=yes + +# The minimum edit distance a name should have in order to be considered a +# similar match for a missing member name. +missing-member-hint-distance=1 + +# The total number of similar names that should be taken in consideration when +# showing a hint for a missing member. +missing-member-max-choices=1 + +# Regex pattern to define which classes are considered mixins. +mixin-class-rgx=.*[Mm]ixin + +# List of decorators that change the signature of a decorated function. +signature-mutators= + + +[VARIABLES] + +# List of additional names supposed to be defined in builtins. Remember that +# you should avoid defining new builtins when possible. +additional-builtins= + +# Tells whether unused global variables should be treated as a violation. +allow-global-unused-variables=yes + +# List of names allowed to shadow builtins +allowed-redefined-builtins= + +# List of strings which can identify a callback function by name. A callback +# name must start or end with one of those strings. +callbacks=cb_, + _cb + +# A regular expression matching the name of dummy variables (i.e. expected to +# not be used). +dummy-variables-rgx=_+$|(_[a-zA-Z0-9_]*[a-zA-Z0-9]+?$)|dummy|^ignored_|^unused_ + +# Argument names that match this expression will be ignored. +ignored-argument-names=_.*|^ignored_|^unused_ + +# Tells whether we should check for unused import in __init__ files. +init-import=no + +# List of qualified module names which can have objects that can redefine +# builtins. 
+redefining-builtins-modules=six.moves,past.builtins,future.builtins,builtins,io diff --git a/pyproject.toml b/pyproject.toml new file mode 100644 index 0000000..0110db0 --- /dev/null +++ b/pyproject.toml @@ -0,0 +1,105 @@ +[project] +name = "pysparkplug" +description = "An open-source, Python implementation of Sparkplug B, an MQTT topic and payload definition standard" +readme = "README.md" +requires-python = ">=3.8" +license = {text = "Apache License, Version 2.0"} +keywords = ["mqtt", "sparkplug", "manufacturing", "automation"] +authors = [{ name = "Matt Fay", email = "matt.e.fay@gmail.com" }] +classifiers = [ + "Development Status :: 2 - Pre-Alpha", + "Intended Audience :: Manufacturing", + "License :: OSI Approved :: Apache Software License", + "Natural Language :: English", + "Operating System :: OS Independent", + "Programming Language :: Python", + "Programming Language :: Python :: 3", + "Programming Language :: Python :: 3.8", + "Programming Language :: Python :: 3.9", + "Programming Language :: Python :: 3.10", + "Programming Language :: Python :: 3.11", + "Programming Language :: Python :: 3 :: Only", + "Programming Language :: Python :: Implementation :: CPython", + "Typing :: Typed", +] +dependencies = [ + "paho-mqtt", + "protobuf", + "typing-extensions; python_version < '3.11'", +] +dynamic = ["version"] + +[project.urls] +Documentation = "https://pysparkplug.mattefay.com" +Changelog = "https://pysparkplug.mattefay.com/en/stable/changelog.html" +Source = "https://github.com/matteosox/pysparkplug" +"Bug Tracker" = "https://github.com/matteosox/pysparkplug/issues" + +[build-system] +requires = ["hatchling", "hatch-vcs"] +build-backend = "hatchling.build" + +[tool.hatch.version] +source = "vcs" +raw-options = { local_scheme = "no-local-version" } + +[tool.hatch.build.hooks.vcs] +version-file = "src/pysparkplug/_version.py" + +[tool.black] +verbose = true +color = true +target_version = ["py38", "py39", "py310", "py311"] +line_length = 88 +extend-exclude = """ +( + .*_pb2.py$ +) +""" + +[tool.isort] +verbose = true +profile = "black" +skip_gitignore = true +line_length = 88 +extend_skip = [".cache"] +extend_skip_glob = ["*_pb2.py"] + +[tool.mypy] +cache_dir = ".cache/mypy" +color_output = true +strict = true + +[[tool.mypy.overrides]] +module = "pysparkplug._protobuf.sparkplug_b_pb2" +ignore_errors = true + +[[tool.mypy.overrides]] +module = "noxfile" +disallow_untyped_decorators = false + +[tool.pytest.ini_options] +cache_dir = ".cache/pytest" +addopts = "-ra --verbose --color=yes" +testpaths = ["test/unit_tests"] + +[tool.coverage.paths] +source = [ + ".cache/nox/unit_tests-3-8/lib/python3.8/site-packages/", + ".cache/nox/unit_tests-3-9/lib/python3.9/site-packages/", + ".cache/nox/unit_tests-3-10/lib/python3.10/site-packages/", + ".cache/nox/unit_tests-3-11/lib/python3.11/site-packages/", +] + +[tool.coverage.run] +branch = true +parallel = true +source = ["pysparkplug"] + +[tool.coverage.report] +show_missing = true +fail_under = 46 + +[tool.check-wheel-contents] +toplevel = "pysparkplug" +package = "src/pysparkplug" diff --git a/sparkplug_b.proto b/sparkplug_b.proto new file mode 100644 index 0000000..bf72ab5 --- /dev/null +++ b/sparkplug_b.proto @@ -0,0 +1,224 @@ +// * Copyright (c) 2015, 2018 Cirrus Link Solutions and others +// * +// * This program and the accompanying materials are made available under the +// * terms of the Eclipse Public License 2.0 which is available at +// * http://www.eclipse.org/legal/epl-2.0. 
+// * +// * SPDX-License-Identifier: EPL-2.0 +// * +// * Contributors: +// * Cirrus Link Solutions - initial implementation + +// +// To compile: +// cd client_libraries/java +// protoc --proto_path=../../ --java_out=src/main/java ../../sparkplug_b.proto +// + +syntax = "proto2"; + +package org.eclipse.tahu.protobuf; + +option java_package = "org.eclipse.tahu.protobuf"; +option java_outer_classname = "SparkplugBProto"; + +enum DataType { + // Indexes of Data Types + + // Unknown placeholder for future expansion. + Unknown = 0; + + // Basic Types + Int8 = 1; + Int16 = 2; + Int32 = 3; + Int64 = 4; + UInt8 = 5; + UInt16 = 6; + UInt32 = 7; + UInt64 = 8; + Float = 9; + Double = 10; + Boolean = 11; + String = 12; + DateTime = 13; + Text = 14; + + // Additional Metric Types + UUID = 15; + DataSet = 16; + Bytes = 17; + File = 18; + Template = 19; + + // Additional PropertyValue Types + PropertySet = 20; + PropertySetList = 21; + + // Array Types + Int8Array = 22; + Int16Array = 23; + Int32Array = 24; + Int64Array = 25; + UInt8Array = 26; + UInt16Array = 27; + UInt32Array = 28; + UInt64Array = 29; + FloatArray = 30; + DoubleArray = 31; + BooleanArray = 32; + StringArray = 33; + DateTimeArray = 34; +} + +message Payload { + + message Template { + + message Parameter { + optional string name = 1; + optional uint32 type = 2; + + oneof value { + uint32 int_value = 3; + uint64 long_value = 4; + float float_value = 5; + double double_value = 6; + bool boolean_value = 7; + string string_value = 8; + ParameterValueExtension extension_value = 9; + } + + message ParameterValueExtension { + extensions 1 to max; + } + } + + optional string version = 1; // The version of the Template to prevent mismatches + repeated Metric metrics = 2; // Each metric includes a name, datatype, and optionally a value + repeated Parameter parameters = 3; + optional string template_ref = 4; // MUST be a reference to a template definition if this is an instance (i.e. 
the name of the template definition) - MUST be omitted for template definitions + optional bool is_definition = 5; + extensions 6 to max; + } + + message DataSet { + + message DataSetValue { + + oneof value { + uint32 int_value = 1; + uint64 long_value = 2; + float float_value = 3; + double double_value = 4; + bool boolean_value = 5; + string string_value = 6; + DataSetValueExtension extension_value = 7; + } + + message DataSetValueExtension { + extensions 1 to max; + } + } + + message Row { + repeated DataSetValue elements = 1; + extensions 2 to max; // For third party extensions + } + + optional uint64 num_of_columns = 1; + repeated string columns = 2; + repeated uint32 types = 3; + repeated Row rows = 4; + extensions 5 to max; // For third party extensions + } + + message PropertyValue { + + optional uint32 type = 1; + optional bool is_null = 2; + + oneof value { + uint32 int_value = 3; + uint64 long_value = 4; + float float_value = 5; + double double_value = 6; + bool boolean_value = 7; + string string_value = 8; + PropertySet propertyset_value = 9; + PropertySetList propertysets_value = 10; // List of Property Values + PropertyValueExtension extension_value = 11; + } + + message PropertyValueExtension { + extensions 1 to max; + } + } + + message PropertySet { + repeated string keys = 1; // Names of the properties + repeated PropertyValue values = 2; + extensions 3 to max; + } + + message PropertySetList { + repeated PropertySet propertyset = 1; + extensions 2 to max; + } + + message MetaData { + // Bytes specific metadata + optional bool is_multi_part = 1; + + // General metadata + optional string content_type = 2; // Content/Media type + optional uint64 size = 3; // File size, String size, Multi-part size, etc + optional uint64 seq = 4; // Sequence number for multi-part messages + + // File metadata + optional string file_name = 5; // File name + optional string file_type = 6; // File type (i.e. xml, json, txt, cpp, etc) + optional string md5 = 7; // md5 of data + + // Catchalls and future expansion + optional string description = 8; // Could be anything such as json or xml of custom properties + extensions 9 to max; + } + + message Metric { + + optional string name = 1; // Metric name - should only be included on birth + optional uint64 alias = 2; // Metric alias - tied to name on birth and included in all later DATA messages + optional uint64 timestamp = 3; // Timestamp associated with data acquisition time + optional uint32 datatype = 4; // DataType of the metric/tag value + optional bool is_historical = 5; // If this is historical data and should not update real time tag + optional bool is_transient = 6; // Tells consuming clients such as MQTT Engine to not store this as a tag + optional bool is_null = 7; // If this is null - explicitly say so rather than using -1, false, etc for some datatypes. 
+ optional MetaData metadata = 8; // Metadata for the payload + optional PropertySet properties = 9; + + oneof value { + uint32 int_value = 10; + uint64 long_value = 11; + float float_value = 12; + double double_value = 13; + bool boolean_value = 14; + string string_value = 15; + bytes bytes_value = 16; // Bytes, File + DataSet dataset_value = 17; + Template template_value = 18; + MetricValueExtension extension_value = 19; + } + + message MetricValueExtension { + extensions 1 to max; + } + } + + optional uint64 timestamp = 1; // Timestamp at message sending time + repeated Metric metrics = 2; // Repeated forever - no limit in Google Protobufs + optional uint64 seq = 3; // Sequence number + optional string uuid = 4; // UUID to track message type in terms of schema definitions + optional bytes body = 5; // To optionally bypass the whole definition above + extensions 6 to max; // For third party extensions +} diff --git a/src/pysparkplug/__init__.py b/src/pysparkplug/__init__.py new file mode 100644 index 0000000..bd1987f --- /dev/null +++ b/src/pysparkplug/__init__.py @@ -0,0 +1,15 @@ +"""Initialization code for Sparkplug B package""" + +from pysparkplug._client import * +from pysparkplug._config import * +from pysparkplug._datatype import * +from pysparkplug._edge_node import * +from pysparkplug._enums import * +from pysparkplug._error import * +from pysparkplug._message import * +from pysparkplug._metric import * +from pysparkplug._payload import * +from pysparkplug._time import * +from pysparkplug._topic import * +from pysparkplug._types import MetricValue +from pysparkplug._version import __version__ as __version__ diff --git a/src/pysparkplug/_client.py b/src/pysparkplug/_client.py new file mode 100644 index 0000000..4e99018 --- /dev/null +++ b/src/pysparkplug/_client.py @@ -0,0 +1,248 @@ +"""Module containing the low-level Sparkplug B client""" + +import logging +from typing import Any, Callable, Dict, Optional, Tuple, Union + +from paho.mqtt import client as paho_mqtt + +from pysparkplug._config import ClientOptions, TLSConfig, WSConfig +from pysparkplug._enums import ErrorCode, MQTTProtocol, QoS, Transport +from pysparkplug._error import check_connack_code, check_error_code +from pysparkplug._message import Message +from pysparkplug._payload import Birth +from pysparkplug._topic import Topic +from pysparkplug._types import Self + +__all__ = ["Client"] +logger = logging.getLogger(__name__) + + +class Client: + """Low-level MQTT client + + Args: + client_id: + the unique client id string used when connecting to the broker + protocol: + the version of the MQTT protocol to use for this client + username: + the username used for broker authentication + password: + the password used for broker authentication + transport_config: + a config object defining the transport layer protocol the + client will use to connect to the broker + client_options: + a config object defining various options for the client + """ + + _client: paho_mqtt.Client + _subscriptions: Dict[Topic, QoS] + _births: Dict[Tuple[Optional[str], Optional[str], Optional[str]], Birth] + + def __init__( + self, + client_id: Optional[str] = None, + protocol: MQTTProtocol = MQTTProtocol.MQTT_V311, + username: Optional[str] = None, + password: Optional[str] = None, + transport_config: Optional[Union[TLSConfig, WSConfig]] = None, + client_options: ClientOptions = ClientOptions(), + ) -> None: + self._client = paho_mqtt.Client( + client_id=client_id, + clean_session=True, + protocol=protocol, + transport=Transport.WS + if 
isinstance(transport_config, WSConfig) + else Transport.TCP, + reconnect_on_failure=client_options.reconnect_on_failure, + ) + self._client.enable_logger(logger) + if username is not None: + self._client.username_pw_set(username=username, password=password) + if isinstance(transport_config, TLSConfig): + self._client.tls_set( + ca_certs=transport_config.ca_certs, + certfile=transport_config.certfile, + keyfile=transport_config.keyfile, + cert_reqs=transport_config.cert_reqs, + tls_version=transport_config.tls_version, + ciphers=transport_config.ciphers, + ) + elif isinstance(transport_config, WSConfig): + self._client.ws_set_options( + path=transport_config.path, + headers=transport_config.headers, + ) + elif transport_config is not None: + raise TypeError(f"Unrecognized transport_config type {transport_config}") + self._client.max_inflight_messages_set( + inflight=client_options.max_inflight_messages + ) + self._client.max_queued_messages_set( + queue_size=client_options.max_queued_messages + ) + self._client.message_retry_set(retry=client_options.message_retry_timeout) + self._client.reconnect_delay_set( + min_delay=client_options.reconnection_delay_min, + max_delay=client_options.reconnection_delay_max, + ) + self._births = {} + self._subscriptions = {} + + def set_will(self, message: Optional[Message]) -> None: + """Set the last will & testament for the specified message + + Args: + message: + the message to be registered with the broker, or None, which clears the will + """ + if message is None: + self._client.will_clear() + else: + self._client.will_set( + topic=message.topic.to_str(), + payload=message.payload.encode(), + qos=message.qos, + retain=message.retain, + ) + + def connect( + self, + host: str, + *, + port: int = 1883, + keepalive: int = 60, + bind_address: str = "", + blocking: bool = False, + callback: Optional[Callable[[Self], None]] = None, + ) -> None: + """Connect client to the broker + + Args: + host: + the hostname or IP address of the remote broker + port: + the port of the broker + keepalive: + maximum period in seconds allowed between communications with the broker + bind_address: + the IP address of a local network interface to bind this client to, assuming multiple interfaces exist + blocking: + whether or not to connect in a blocking way, or connect with a separate thread + callback: + a custom callback to be called each time the client successfully connects + """ + + def cb( + _client: paho_mqtt.Client, + _userdata: Dict[Any, Any], + _flags: Dict[Any, Any], + rc: int, + ) -> None: + self._on_connect(rc) + if callback is not None: + callback(self) + + self._client.on_connect = cb + self._client.connect( + host=host, + port=port, + keepalive=keepalive, + bind_address=bind_address, + ) + if blocking: + self._client.loop_forever() + else: + self._client.loop_start() + + def _on_connect(self, rc: int) -> None: + check_connack_code(rc) + self._births.clear() + for topic, qos in self._subscriptions.items(): + self._subscribe(topic=topic, qos=qos) + + def disconnect(self) -> None: + """Disconnect from the broker cleanly, i.e. results in no + will message being sent by the broker. 
+ """ + self._client.disconnect() + self._client.loop_stop() # stop loop, even if we were running in blocking mode + + def publish( + self, + message: Message, + *, + include_dtypes: bool = False, + ) -> None: + """Publish a message to the broker + + Args: + message: + the message to be published + include_dtypes: + whether or not to include the dtypes of the message + """ + result = self._client.publish( + topic=message.topic.to_str(), + payload=message.payload.encode(include_dtypes=include_dtypes), + qos=message.qos, + retain=message.retain, + ) + check_error_code(result.rc) + + def subscribe( + self, + topic: Topic, + qos: QoS, + callback: Callable[[Self, Message], None], + ) -> None: + """Subscribe to the specified topic + + Args: + topic: + the topic to be subscribed to + qos: + the qos of the subscription + callback: + the callback to run when messages are received for this subscription + """ + + def cb( + _client: paho_mqtt.Client, + _userdata: Dict[Any, Any], + mqtt_message: paho_mqtt.MQTTMessage, + ) -> None: + message = self._handle_message(mqtt_message) + callback(self, message) + + self._client.message_callback_add(topic.to_str(), cb) + self._subscriptions[topic] = qos + self._subscribe(topic, qos) + + def _handle_message(self, mqtt_message: paho_mqtt.MQTTMessage) -> Message: + topic = Topic.from_str(mqtt_message.topic) + key = (topic.group_id, topic.edge_node_id, topic.device_id) + birth = self._births.get(key) + message = Message.from_mqtt_message(mqtt_message, birth=birth) + if isinstance(message.payload, Birth): + self._births[key] = message.payload + + return message + + def _subscribe(self, topic: Topic, qos: QoS) -> None: + result, _ = self._client.subscribe(topic.to_str(), qos) + check_error_code(result, ignore_codes={ErrorCode.NO_CONN}) + + def unsubscribe(self, topic: Topic) -> None: + """Unsubscribe from the specified topic + + Args: + topic: + the topic to be subscribed to + """ + result, _ = self._client.unsubscribe(topic.to_str()) + check_error_code(result) + del self._subscriptions[topic] + self._client.message_callback_remove(topic.to_str()) diff --git a/src/pysparkplug/_config.py b/src/pysparkplug/_config.py new file mode 100644 index 0000000..dab54c9 --- /dev/null +++ b/src/pysparkplug/_config.py @@ -0,0 +1,133 @@ +"""Configuration classes for MQTT and Sparkplug B""" + +import dataclasses +import ssl +from typing import Any, Callable, Dict, Optional, Union + +__all__ = [ + "ClientOptions", + "TLSConfig", + "WSConfig", +] + + +@dataclasses.dataclass(frozen=True) +class TLSConfig: + """TLS configuration class + + Args: + ca_certs: + a string path to the Certificate Authority certificate files that + are to be treated as trusted by this client. If this is the only + option given then the client will operate in a similar manner to + a web browser. That is to say it will require the broker to have + a certificate signed by the Certificate Authorities in ca_certs + and will communicate using TLS v1.2, but will not attempt any + form of authentication. This provides basic network encryption + but may not be sufficient depending on how the broker is + configured. + certfile: + string pointing to the PEM encoded client certificate. If this + argument is not None then it will be used as client + information for TLS based authentication. Support for this + feature is broker dependent. Note that if this file is + encrypted and needs a password to decrypt it, Python will ask + for the password at the command line. 
It is not currently possible + to define a callback to provide the password. + keyfile: + string pointing to the PEM encoded private keys. If this + argument is not None then it will be used as client + information for TLS based authentication. Support for this + feature is broker dependent. Note that if this file is + encrypted and needs a password to decrypt it, Python will ask + for the password at the command line. It is not currently possible + to define a callback to provide the password. + cert_reqs: + defines the certificate requirements that the client imposes on the + broker. By default this is `ssl.CERT_REQUIRED`, which means that + the broker must provide a certificate. See the ssl pydoc for more + information on this parameter. + tls_version: + specifies the version of the SSL/TLS protocol to be used. By default + (if the python version supports it) the highest TLS version is + detected. If unavailable, TLS v1.2 is used. Previous versions + (all versions beginning with SSL) are possible but not recommended + due to possible security problems. + ciphers: + a string specifying which encryption ciphers are allowable for this + connection, or `None` to use the defaults. See the ssl pydoc for more + information. + + Returns: + a TLSConfig object + """ + + ca_certs: Optional[str] = None + certfile: Optional[str] = None + keyfile: Optional[str] = None + cert_reqs: ssl.VerifyMode = ssl.VerifyMode.CERT_REQUIRED + tls_version: ssl._SSLMethod = ssl.PROTOCOL_TLS # pylint: disable=no-member + ciphers: Optional[str] = None + + +@dataclasses.dataclass(frozen=True) +class WSConfig: + """Websockets configuration class + + Args: + path: + the mqtt path to use on the broker. + headers: + either a dictionary specifying a list of extra headers which should + be appended to the standard websocket headers, or a callable that + takes the normal websocket headers and returns a new dictionary + with a set of headers to connect to the broker. + + Returns: + a WSConfig object + """ + + path: str = "/mqtt" + headers: Optional[ + Union[Dict[str, Any], Callable[[Dict[str, Any]], Dict[str, Any]]] + ] = None + + +@dataclasses.dataclass(frozen=True) +class ClientOptions: + """Class of optional settings for an MQTT client + + Args: + max_inflight_messages: + maximum number of messages with QoS>0 that can be part way through + their network flow at once. Increasing this value will consume + more memory but can increase throughput. + max_queued_messages: + maximum number of outgoing messages with QoS>0 that can be pending + in the outgoing message queue. 0 means unlimited, but due to + implementation currently limited to 65555 (65535 messages in queue + + 20 in flight). When the queue is full, any further outgoing + messages would be dropped. + message_retry_timeout: + time in seconds before a message with QoS>0 is retried, if the + broker does not respond. This is set to 5 seconds by default and + should not normally need changing. + reconnection_delay_min: + when the connection is lost, the client will automatically retry + connection. Initially, the attempt is delayed of min_delay seconds. + It's doubled between subsequent attempts up to reconnection_delay_max. + reconnection_delay_max: + see `reconnection_delay_min`. 
+ reconnect_on_failure: + whether or not to reconnect the client on failure + + Returns: + a ClientOptions object + """ + + max_inflight_messages: int = 20 + max_queued_messages: int = 0 + message_retry_timeout: int = 5 + reconnection_delay_min: int = 1 + reconnection_delay_max: int = 120 + reconnect_on_failure: bool = True diff --git a/src/pysparkplug/_datatype.py b/src/pysparkplug/_datatype.py new file mode 100644 index 0000000..eeafb59 --- /dev/null +++ b/src/pysparkplug/_datatype.py @@ -0,0 +1,141 @@ +"""Module defining the DataType enum""" + +import datetime +import enum +from typing import Union + +from pysparkplug._types import MetricValue + +__all__ = ["DataType"] + + +class DataType(enum.IntEnum): + """Enumeration of Sparkplug B datatypes""" + + UNKNOWN = 0 + INT8 = 1 + INT16 = 2 + INT32 = 3 + INT64 = 4 + UINT8 = 5 + UINT16 = 6 + UINT32 = 7 + UINT64 = 8 + FLOAT = 9 + DOUBLE = 10 + BOOLEAN = 11 + STRING = 12 + DATETIME = 13 + TEXT = 14 + UUID = 15 + BYTES = 17 + FILE = 18 + + @property + def field(self) -> str: + """The Protobuf field the data is encoded in""" + try: + return _fields[self] + except KeyError as exc: + raise ValueError(f"{self} has no field name") from exc + + def encode(self, value: MetricValue) -> Union[int, float, bool, str, bytes]: + """Encode a value into the form it should take in a Sparkplug B Protobuf object""" + try: + encoder = _encoders[self] + except KeyError as exc: + raise ValueError(f"{self} cannot be encoded") from exc + return encoder(value) # type: ignore[no-any-return,no-untyped-call] + + def decode(self, value: Union[int, float, bool, str, bytes]) -> MetricValue: + """Decode a value from the form it takes in a Sparkplug B Protobuf object""" + try: + decoder = _decoders[self] + except KeyError as exc: + raise ValueError(f"{self} cannot be decoded") from exc + return decoder(value) # type: ignore[no-any-return,no-untyped-call] + + +_fields = { + DataType.UINT8: "int_value", + DataType.UINT16: "int_value", + DataType.UINT32: "int_value", + DataType.UINT64: "long_value", + DataType.INT8: "int_value", + DataType.INT16: "int_value", + DataType.INT32: "int_value", + DataType.INT64: "long_value", + DataType.FLOAT: "float_value", + DataType.DOUBLE: "double_value", + DataType.BOOLEAN: "boolean_value", + DataType.STRING: "string_value", + DataType.DATETIME: "long_value", + DataType.TEXT: "string_value", + DataType.UUID: "string_value", + DataType.BYTES: "bytes_value", + DataType.FILE: "bytes_value", +} + +_encoders = { + DataType.UINT8: lambda val: _uint_coder(val, 8), + DataType.UINT16: lambda val: _uint_coder(val, 16), + DataType.UINT32: lambda val: _uint_coder(val, 32), + DataType.UINT64: lambda val: _uint_coder(val, 64), + DataType.INT8: lambda val: _int_encoder(val, 8), + DataType.INT16: lambda val: _int_encoder(val, 16), + DataType.INT32: lambda val: _int_encoder(val, 32), + DataType.INT64: lambda val: _int_encoder(val, 64), + DataType.FLOAT: lambda val: val, + DataType.DOUBLE: lambda val: val, + DataType.BOOLEAN: lambda val: val, + DataType.STRING: lambda val: val, + DataType.DATETIME: lambda val: int( + val.replace(tzinfo=datetime.timezone.utc).timestamp() * 1e3 + ), + DataType.TEXT: lambda val: val, + DataType.UUID: lambda val: val, + DataType.BYTES: lambda val: val, + DataType.FILE: lambda val: val, +} + +_decoders = { + DataType.UINT8: lambda val: _uint_coder(val, 8), + DataType.UINT16: lambda val: _uint_coder(val, 16), + DataType.UINT32: lambda val: _uint_coder(val, 32), + DataType.UINT64: lambda val: _uint_coder(val, 64), + DataType.INT8: lambda 
val: _int_decoder(val, 8), + DataType.INT16: lambda val: _int_decoder(val, 16), + DataType.INT32: lambda val: _int_decoder(val, 32), + DataType.INT64: lambda val: _int_decoder(val, 64), + DataType.FLOAT: lambda val: val, + DataType.DOUBLE: lambda val: val, + DataType.BOOLEAN: lambda val: val, + DataType.STRING: lambda val: val, + DataType.DATETIME: lambda val: datetime.datetime.fromtimestamp( + val * 1e-3, tz=datetime.timezone.utc + ), + DataType.TEXT: lambda val: val, + DataType.UUID: lambda val: val, + DataType.BYTES: lambda val: val, + DataType.FILE: lambda val: val, +} + + +def _uint_coder(value: int, bits: int) -> int: + if not 0 <= value < 2**bits: + raise OverflowError(f"UInt{bits} overflow with value {value}") + return value + + +def _int_encoder(value: int, bits: int) -> int: + max_val: int = 2 ** (bits - 1) + if not -max_val <= value < max_val: + raise OverflowError(f"Int{bits} overflow with value {value}") + return value + (max_val * 2 if value < 0 else 0) + + +def _int_decoder(value: int, bits: int) -> int: + max_val: int = 2**bits + if not 0 <= value < max_val: + raise OverflowError(f"Int{bits} overflow with value {value}") + return value - (max_val if value >= 2 ** (bits - 1) else 0) diff --git a/src/pysparkplug/_edge_node.py b/src/pysparkplug/_edge_node.py new file mode 100644 index 0000000..8b0c187 --- /dev/null +++ b/src/pysparkplug/_edge_node.py @@ -0,0 +1,494 @@ +"""Module defining the EdgeNode & Device classes""" + +from __future__ import annotations + +import itertools +import logging +from typing import Callable, Dict, Iterable, Optional + +from pysparkplug._client import Client +from pysparkplug._datatype import DataType +from pysparkplug._enums import MessageType, QoS +from pysparkplug._message import Message +from pysparkplug._metric import Metric +from pysparkplug._payload import DBirth, DData, DDeath, NBirth, NData, NDeath +from pysparkplug._time import get_current_timestamp +from pysparkplug._topic import Topic +from pysparkplug._types import Self + +__all__ = ["Device", "EdgeNode"] +logger = logging.getLogger(__name__) +BD_SEQ = "bdSeq" +SEQ_LIMIT = 256 + + +def _default_cmd_callback(_: EdgeNode, message: Message) -> None: + logger.info(f"Received command {message}") + + +class EdgeNode: + """Class representing an EdgeNode in Sparkplug B + + Args: + group_id: + the Group ID element of the topic namespace provides for a logical + grouping of Sparkplug Edge Nodes into the MQTT Server and back out + to the consuming Sparkplug Host Applications + edge_node_id: + the edge_node_id element of the Sparkplug topic namespace uniquely + identifies the Sparkplug Edge Node within the infrastructure + metrics: + the metrics associated with this edge node + client: + the low-level MQTT client used by this edge node for connecting to + the broker + cmd_callback: + the callback function to execute when an NCMD payload is received + """ + + group_id: str + edge_node_id: str + _metrics: Dict[str, Metric] + _devices: Dict[str, Device] + _client: Client + + _bd_seq_metric: Metric + __seq_cycler: itertools.cycle[int] = itertools.cycle(range(SEQ_LIMIT)) + __bd_seq_cycler: itertools.cycle[int] = itertools.cycle(range(SEQ_LIMIT)) + _connected: bool = False + + def __init__( + self, + group_id: str, + edge_node_id: str, + metrics: Iterable[Metric], + client: Optional[Client] = None, + cmd_callback: Callable[[Self, Message], None] = _default_cmd_callback, + ): + self.group_id = group_id + self.edge_node_id = edge_node_id + self._setup_metrics(metrics) + self._devices = {} + self._client 
= client if client is not None else Client() + + self._setup_will() + + # Subscribe to NCMD + n_cmd_topic = Topic( + message_type=MessageType.NCMD, + group_id=self.group_id, + edge_node_id=self.edge_node_id, + ) + self.subscribe( + topic=n_cmd_topic, + qos=QoS.AT_LEAST_ONCE, + callback=lambda _, message: cmd_callback(self, message), + ) + + def _setup_metrics(self, metrics: Iterable[Metric]) -> None: + self._metrics = {} + for metric in metrics: + if metric.name is None: + raise ValueError( + f"Metric {metric} must have a defined name when provided to an Edge Node" + ) + if metric.datatype == DataType.UNKNOWN: + raise ValueError( + f"Metric {metric} must have a defined datatype when provided to an Edge Node" + ) + self._metrics[metric.name] = metric + + def _setup_will(self) -> None: + """Set the bdSeq metric and set the will with an NDEATH message with + that metric for the next time we connect. + """ + self._bd_seq_metric = Metric( + timestamp=get_current_timestamp(), + name=BD_SEQ, + datatype=DataType.INT64, + value=self._bd_seq, + ) + will_topic = Topic( + message_type=MessageType.NDEATH, + group_id=self.group_id, + edge_node_id=self.edge_node_id, + ) + will_payload = NDeath( + timestamp=get_current_timestamp(), bd_seq_metric=self._bd_seq_metric + ) + will_message = Message( + topic=will_topic, payload=will_payload, qos=QoS.AT_MOST_ONCE, retain=False + ) + self._client.set_will(will_message) + + def connect( + self, + host: str, + *, + port: int = 1883, + keepalive: int = 60, + bind_address: str = "", + blocking: bool = False, + ) -> None: + """Connect edge node to the broker + + Args: + host: + the hostname or IP address of the remote broker + port: + the port of the broker + keepalive: + maximum period in seconds allowed between communications with the broker + bind_address: + the IP address of a local network interface to bind this client to, assuming multiple interfaces exist + blocking: + whether or not to connect in a blocking way, or connect with a separate thread + """ + + def callback(client: Client) -> None: + self._connected = True + # Reset seq cycler + self.__seq_cycler = itertools.cycle(range(SEQ_LIMIT)) + + # Publish NBIRTH + n_birth_topic = Topic( + message_type=MessageType.NBIRTH, + group_id=self.group_id, + edge_node_id=self.edge_node_id, + ) + metrics = tuple(self._metrics.values()) + (self._bd_seq_metric,) + n_birth = NBirth( + timestamp=get_current_timestamp(), seq=self._seq, metrics=metrics + ) + client.publish( + Message( + topic=n_birth_topic, + payload=n_birth, + qos=QoS.AT_MOST_ONCE, + retain=False, + ), + include_dtypes=True, + ) + + # Publish DBIRTHs + for device in self._devices.values(): + d_birth_topic = Topic( + message_type=MessageType.DBIRTH, + group_id=self.group_id, + edge_node_id=self.edge_node_id, + device_id=device.device_id, + ) + d_birth = DBirth( + timestamp=get_current_timestamp(), + seq=self._seq, + metrics=device.metrics.values(), + ) + client.publish( + Message( + topic=d_birth_topic, + payload=d_birth, + qos=QoS.AT_MOST_ONCE, + retain=False, + ), + include_dtypes=True, + ) + + # Setup will for next connection + self._setup_will() + + self._client.connect( + host, + port=port, + keepalive=keepalive, + bind_address=bind_address, + blocking=blocking, + callback=callback, + ) + + def disconnect(self) -> None: + """Disconnect from the broker cleanly, i.e. results in no + will message being sent by the broker. 
+ """ + self._client.disconnect() + self._connected = False + + def register(self, device: Device) -> None: + """Register a device to the edge node, can be run while edge node is connected + + Args: + device: + the device to register to the edge node + """ + if device.device_id in self._devices: + raise ValueError( + f"Cannot register device with id {device.device_id} as another device " + "is already registered with this edge node with that id" + ) + self._devices[device.device_id] = device + d_cmd_topic = Topic( + message_type=MessageType.DCMD, + group_id=self.group_id, + edge_node_id=self.edge_node_id, + device_id=device.device_id, + ) + self.subscribe( + topic=d_cmd_topic, + qos=QoS.AT_LEAST_ONCE, + callback=lambda _, message: device.cmd_callback(self, message), + ) + if self._connected: + d_birth_topic = Topic( + message_type=MessageType.DBIRTH, + group_id=self.group_id, + edge_node_id=self.edge_node_id, + device_id=device.device_id, + ) + d_birth = DBirth( + timestamp=get_current_timestamp(), + seq=self._seq, + metrics=device.metrics.values(), + ) + self._client.publish( + Message( + topic=d_birth_topic, + payload=d_birth, + qos=QoS.AT_MOST_ONCE, + retain=False, + ), + include_dtypes=True, + ) + + def deregister(self, device_id: str) -> None: + """Remove a device from the edge node, sending a DDeath if the edge node is online. + + Args: + device_id: the id of the device to be deregistered + """ + try: + del self._devices[device_id] + except KeyError as exc: + raise ValueError( + f"Cannot deregister device with id {device_id} as no device with that " + "id is registered with this edge node" + ) from exc + + if self._connected: + d_death_topic = Topic( + message_type=MessageType.DDEATH, + group_id=self.group_id, + edge_node_id=self.edge_node_id, + device_id=device_id, + ) + d_death = DDeath( + timestamp=get_current_timestamp(), + seq=self._seq, + ) + self._client.publish( + Message( + topic=d_death_topic, + payload=d_death, + qos=QoS.AT_MOST_ONCE, + retain=False, + ), + include_dtypes=True, + ) + + d_cmd_topic = Topic( + message_type=MessageType.DCMD, + group_id=self.group_id, + edge_node_id=self.edge_node_id, + device_id=device_id, + ) + self.unsubscribe(d_cmd_topic) + + def subscribe( + self, + topic: Topic, + qos: QoS, + callback: Callable[[Self, Message], None], + ) -> None: + """Subscribe to the specified topic + + Args: + topic: + the topic to be subscribed to + qos: + the qos of the subscription + callback: + the callback to run when messages are received for this subscription + """ + + def cb( + _client: Client, + message: Message, + ) -> None: + callback(self, message) + + self._client.subscribe(topic, qos, cb) + + def unsubscribe(self, topic: Topic) -> None: + """Unsubscribe from the specified topic + + Args: + topic: + the topic to be subscribed to + """ + self._client.unsubscribe(topic) + + @property + def metrics(self) -> Dict[str, Metric]: + """Returns a copy of the metrics for this edge node in a dictionary""" + return self._metrics.copy() + + @property + def devices(self) -> Dict[str, Device]: + """Returns a copy of the devices for this edge node in a dictionary""" + return self._devices.copy() + + def update(self, metrics: Iterable[Metric]) -> None: + """Update some (or all) of the edge node's metrics + + Args: + metrics: + an iterable of metrics to be updated + """ + for metric in metrics: + if metric.name is None: + raise ValueError( + f"Metric {metric} must have a defined name when provided to an Edge Node" + ) + try: + curr_metric = self._metrics[metric.name] + 
except KeyError as exc: + raise ValueError( + f"Unrecognized metric {metric.name} cannot be updated" + ) from exc + if curr_metric.datatype != metric.datatype: + raise ValueError( + f"Metric datatype provided {metric.datatype} " + f"doesn't match {curr_metric.datatype}" + ) + self._metrics[metric.name] = metric + + topic = Topic( + message_type=MessageType.NDATA, + group_id=self.group_id, + edge_node_id=self.edge_node_id, + ) + n_data = NData( + timestamp=get_current_timestamp(), seq=self._seq, metrics=metrics + ) + self._client.publish( + Message(topic=topic, payload=n_data, qos=QoS.AT_MOST_ONCE, retain=False), + include_dtypes=True, + ) + + def update_device(self, device_id: str, metrics: Iterable[Metric]) -> None: + """Update some (or all) of the metrics associated with the provided device_id + + Args: + device_id: + the id of the device to be updated + metrics: + an iterable of metrics to be updated + """ + try: + device = self._devices[device_id] + except KeyError as exc: + raise ValueError( + f"Unable to update device {device_id} as no device with that id is " + "registered to this edge node" + ) from exc + device.update(metrics) + d_data_topic = Topic( + message_type=MessageType.DDATA, + group_id=self.group_id, + edge_node_id=self.edge_node_id, + ) + d_data = DData(get_current_timestamp(), seq=self._seq, metrics=metrics) + self._client.publish( + Message( + topic=d_data_topic, payload=d_data, qos=QoS.AT_MOST_ONCE, retain=False + ), + include_dtypes=True, + ) + + @property + def _seq(self) -> int: + return next(self.__seq_cycler) + + @property + def _bd_seq(self) -> int: + return next(self.__bd_seq_cycler) + + +class Device: + """Class representing a Device in Sparkplug B + + Args: + device_id: + the device_id element of the Sparkplug topic namespace identifies + a device attached (physically or logically) to the Sparkplug Edge + Node + metrics: + the metrics associated with this device + cmd_callback: + the callback function to execute when a DCMD payload is received + """ + + device_id: str + _metrics: Dict[str, Metric] + cmd_callback: Callable[[EdgeNode, Message], None] + + def __init__( + self, + device_id: str, + metrics: Iterable[Metric], + cmd_callback: Callable[[EdgeNode, Message], None] = _default_cmd_callback, + ): + self.device_id = device_id + self._setup_metrics(metrics) + self.cmd_callback = cmd_callback + + def _setup_metrics(self, metrics: Iterable[Metric]) -> None: + self._metrics = {} + for metric in metrics: + if metric.name is None: + raise ValueError( + f"Metric {metric} must have a defined name when provided to an Edge Node" + ) + if metric.datatype == DataType.UNKNOWN: + raise ValueError( + f"Metric {metric} must have a defined datatype when provided to an Edge Node" + ) + self._metrics[metric.name] = metric + + @property + def metrics(self) -> Dict[str, Metric]: + """Returns a copy of the metrics for this edge node in a dictionary""" + return self._metrics.copy() + + def update(self, metrics: Iterable[Metric]) -> None: + """Update some (or all) of the device's metrics + + Args: + metrics: + an iterable of metrics to be updated + """ + for metric in metrics: + if metric.name is None: + raise ValueError( + f"Metric {metric} must have a defined name when provided to a Device" + ) + try: + curr_metric = self._metrics[metric.name] + except KeyError as exc: + raise ValueError( + f"Unrecognized metric {metric.name} cannot be updated" + ) from exc + if curr_metric.datatype != metric.datatype: + raise ValueError( + f"Metric datatype provided {metric.datatype} " + 
f"doesn't match {curr_metric.datatype}" + ) + self._metrics[metric.name] = metric diff --git a/src/pysparkplug/_enums.py b/src/pysparkplug/_enums.py new file mode 100644 index 0000000..d0f9d43 --- /dev/null +++ b/src/pysparkplug/_enums.py @@ -0,0 +1,144 @@ +"""Enumerations for MQTT and Sparkplug B""" + +import enum + +from pysparkplug import _payload as payload + +__all__ = [ + "ErrorCode", + "ConnackCode", + "MQTTProtocol", + "QoS", + "MessageType", + "Transport", +] + + +class _StrEnum(str, enum.Enum): + pass + + +class ErrorCode(enum.IntEnum): + """MQTT error codes""" + + AGAIN = -1 + SUCCESS = 0 + NOMEM = 1 + PROTOCOL = 2 + INVAL = 3 + NO_CONN = 4 + CONN_REFUSED = 5 + NOT_FOUND = 6 + CONN_LOST = 7 + TLS = 8 + PAYLOAD_SIZE = 9 + NOT_SUPPORTED = 10 + AUTH = 11 + ACL_DENIED = 12 + UNKNOWN = 13 + ERRNO = 14 + QUEUE_SIZE = 15 + KEEPALIVE = 16 + + def __str__(self) -> str: + return _error_strings.get(self, "Unkown error") + + +_error_strings = { + ErrorCode.SUCCESS: "No error", + ErrorCode.NOMEM: "Out of memory", + ErrorCode.PROTOCOL: "A network protocol error occurred when communicating with the broker", + ErrorCode.INVAL: "Invalid function arguments provided", + ErrorCode.NO_CONN: "The client is not currently connected", + ErrorCode.CONN_REFUSED: "The connection was refused", + ErrorCode.NOT_FOUND: "Message not found (internal error)", + ErrorCode.CONN_LOST: "The connection was lost", + ErrorCode.TLS: "A TLS error occurred", + ErrorCode.PAYLOAD_SIZE: "Payload too large", + ErrorCode.NOT_SUPPORTED: "This feature is not supported", + ErrorCode.AUTH: "Authorisation failed", + ErrorCode.ACL_DENIED: "Access denied by ACL", + ErrorCode.UNKNOWN: "Unknown error", + ErrorCode.ERRNO: "Error defined by errno", + ErrorCode.QUEUE_SIZE: "Message queue full", + ErrorCode.KEEPALIVE: "Client or broker did not communicate in the keepalive interval", +} + + +class ConnackCode(enum.IntEnum): + """MQTT Connection Acknowledgement codes""" + + CONNACK_ACCEPTED = 0 + CONNACK_REFUSED_PROTOCOL_VERSION = 1 + CONNACK_REFUSED_IDENTIFIER_REJECTED = 2 + CONNACK_REFUSED_SERVER_UNAVAILABLE = 3 + CONNACK_REFUSED_BAD_USERNAME_PASSWORD = 4 + CONNACK_REFUSED_NOT_AUTHORIZED = 5 + + def __str__(self) -> str: + return _connack_strings.get(self, "Connection Refused: unknown reason") + + +_connack_strings = { + ConnackCode.CONNACK_ACCEPTED: "Connection accepted", + ConnackCode.CONNACK_REFUSED_PROTOCOL_VERSION: "Connection refused: unacceptable protocol version", + ConnackCode.CONNACK_REFUSED_IDENTIFIER_REJECTED: "Connection refused: identifier rejected", + ConnackCode.CONNACK_REFUSED_SERVER_UNAVAILABLE: "Connection refused: broker unavailable", + ConnackCode.CONNACK_REFUSED_BAD_USERNAME_PASSWORD: "Connection refused: bad user name or password", + ConnackCode.CONNACK_REFUSED_NOT_AUTHORIZED: "Connection refused: not authorised", +} + + +class QoS(enum.IntEnum): + """MQTT quality of service enum""" + + AT_MOST_ONCE = 0 + AT_LEAST_ONCE = 1 + EXACTLY_ONCE = 2 + + +class MQTTProtocol(enum.IntEnum): + """MQTT protocol enum""" + + MQTT_V31 = 3 + MQTT_V311 = 4 + MQTT_V5 = 5 + + +class Transport(_StrEnum): + """MQTT transport enum""" + + WS = "websockets" + TCP = "tcp" + + +class MessageType(_StrEnum): + """Sparkplug B message type enum""" + + STATE = "STATE" + NBIRTH = "NBIRTH" + NDATA = "NDATA" + NCMD = "NCMD" + NDEATH = "NDEATH" + DBIRTH = "DBIRTH" + DDATA = "DDATA" + DCMD = "DCMD" + DDEATH = "DDEATH" + + @property + def payload(self) -> type: + """Returns the payload class for this message type""" + return _payloads[self] + + +_payloads 
= { + MessageType.STATE: payload.State, + MessageType.NBIRTH: payload.NBirth, + MessageType.DBIRTH: payload.DBirth, + MessageType.NDATA: payload.NData, + MessageType.DDATA: payload.DData, + MessageType.NCMD: payload.NCmd, + MessageType.DCMD: payload.DCmd, + MessageType.NDEATH: payload.NDeath, + MessageType.DDEATH: payload.DDeath, +} diff --git a/src/pysparkplug/_error.py b/src/pysparkplug/_error.py new file mode 100644 index 0000000..e7c3834 --- /dev/null +++ b/src/pysparkplug/_error.py @@ -0,0 +1,31 @@ +"""Module containing errors and functions to handle them""" + +from typing import Optional, Set + +from pysparkplug._enums import ConnackCode, ErrorCode + +__all__ = ["MQTTError"] + + +class MQTTError(Exception): + """Error from MQTT client""" + + +def check_error_code( + error_int: int, *, ignore_codes: Optional[Set[ErrorCode]] = None +) -> None: + """Validate error code""" + if error_int > 0: + error_code = ErrorCode(error_int) + if ignore_codes is None or error_code not in ignore_codes: + raise MQTTError(error_code) + + +def check_connack_code( + connack_int: int, *, ignore_codes: Optional[Set[ConnackCode]] = None +) -> None: + """Validate connack code""" + if connack_int > 0: + connack_code = ConnackCode(connack_int) + if ignore_codes is None or connack_code not in ignore_codes: + raise ConnectionError(connack_code) diff --git a/src/pysparkplug/_message.py b/src/pysparkplug/_message.py new file mode 100644 index 0000000..ebd2841 --- /dev/null +++ b/src/pysparkplug/_message.py @@ -0,0 +1,60 @@ +"""Module defining the Message class""" + +import dataclasses +from typing import Optional + +from paho.mqtt import client as paho_mqtt + +from pysparkplug._enums import QoS +from pysparkplug._payload import Birth, Payload +from pysparkplug._topic import Topic +from pysparkplug._types import Self + +__all__ = ["Message"] + + +@dataclasses.dataclass(frozen=True) +class Message: + """Class representing a Sparkplug B message + + Args: + topic: + the Sparkplug B topic associated with this message + payload: + the Sparkplug B payload associated with this message + qos: + the MQTT quality of service associated with this message + retain: + if set to True, the message will be set as the + "last known good"/retained message for the topic + """ + + topic: Topic + payload: Payload + qos: QoS + retain: bool + + @classmethod + def from_mqtt_message( + cls, mqtt_message: paho_mqtt.MQTTMessage, *, birth: Optional[Birth] = None + ) -> Self: + """Constructs a Message object from a Paho MQTTMessage object + + Args: + mqtt_message: + the Paho MQTTMessage object to construct from + birth: + the Birth object associated with this message, + for decoding aliases and dropped dtypes + """ + topic = Topic.from_str(mqtt_message.topic) + # We have to ignore some mypy here since we know that mqtt gives us a + # fully defined topic, i.e. no wildcards. 
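+        # When provided, the birth payload is used by decode() to resolve metric
+        # aliases back to names and to restore datatypes omitted from the wire.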
+ return cls( + topic=topic, + payload=topic.message_type.payload.decode( # type: ignore[union-attr] + mqtt_message.payload, birth=birth + ), + qos=QoS(mqtt_message.qos), + retain=mqtt_message.retain, + ) diff --git a/src/pysparkplug/_metric.py b/src/pysparkplug/_metric.py new file mode 100644 index 0000000..2820bf4 --- /dev/null +++ b/src/pysparkplug/_metric.py @@ -0,0 +1,94 @@ +"""Module defining the Metric dataclass""" + +import dataclasses +from typing import Optional + +from pysparkplug._datatype import DataType +from pysparkplug._protobuf import Metric as PB_Metric +from pysparkplug._types import MetricValue, Self + +__all__ = ["Metric"] + + +@dataclasses.dataclass(frozen=True) +class Metric: + """Class representing a Sparkplug B metric + + Args: + timestamp: + timestamp associated with data acquisition time + name: + name associated with this metric + datatype: + datatype associated with this metric + value: + the value of the metric + alias: + an integer used to map to the metric's name + is_historical: + if this is historical data and should not update real time tag + is_transient: + tells consuming clients such as MQTT Engine to not store this as a tag + is_null: + if this is null - explicitly say so rather than using -1, false, etc + """ + + timestamp: Optional[int] + name: Optional[str] + datatype: DataType + value: Optional[MetricValue] = None + alias: Optional[int] = None + is_historical: bool = False + is_transient: bool = False + is_null: bool = False + + def to_pb(self, include_dtype: bool) -> PB_Metric: + """Returns a Protobuf metric + + Args: + include_dtype: + whether or not to include dtypes in the Protobuf metric + """ + metric = PB_Metric() + if self.timestamp is not None: + metric.timestamp = self.timestamp + if self.name is not None: + metric.name = self.name + if include_dtype: + metric.datatype = self.datatype + if self.alias is not None: + metric.alias = self.alias + if self.is_historical: + metric.is_historical = self.is_historical + if self.is_transient: + metric.is_transient = self.is_transient + if self.is_null or self.value is None: + metric.is_null = True + else: + setattr(metric, self.datatype.field, self.datatype.encode(self.value)) + return metric + + @classmethod + def from_pb(cls, metric: PB_Metric) -> Self: + """Constructs a Metric object from a Protobuf metric + + Args: + metric: the Protobuf metric to construct from + + Returns: + a Metric object + """ + datatype = DataType(metric.datatype) + value_field = metric.WhichOneof("value") + return cls( + timestamp=metric.timestamp if metric.HasField("timestamp") else None, + name=metric.name if metric.HasField("name") else None, + datatype=datatype, + value=datatype.decode(getattr(metric, value_field)) + if value_field is not None + else None, + alias=metric.alias if metric.HasField("alias") else None, + is_historical=metric.is_historical, + is_transient=metric.is_transient, + is_null=metric.is_null, + ) diff --git a/src/pysparkplug/_payload.py b/src/pysparkplug/_payload.py new file mode 100644 index 0000000..27c63c3 --- /dev/null +++ b/src/pysparkplug/_payload.py @@ -0,0 +1,451 @@ +"""Module defining the Payload class and its subclasses?""" + +from __future__ import annotations + +import dataclasses +import json +from collections.abc import Iterable +from typing import Dict, Optional, cast + +from pysparkplug import _protobuf as protobuf +from pysparkplug._datatype import DataType +from pysparkplug._metric import Metric +from pysparkplug._types import Protocol, Self + +__all__ = [ + "NBirth", + 
"DBirth", + "NData", + "DData", + "NCmd", + "DCmd", + "NDeath", + "DDeath", + "State", +] + + +class Payload(Protocol): + """Protocol defining the methods a payload should have""" + + @classmethod + def decode(cls, raw: bytes, *, birth: Optional[Birth] = None) -> Self: + """Construct a Payload object from bytes + + Args: + raw: + bytes to decode into a Payload object + birth: + the Birth object associated with this message, + for decoding aliases and dropped dtypes + + Returns: + Payload object + """ + + def encode(self, *, include_dtypes: bool = False) -> bytes: + """Encode Payload object into bytes + + Args: + include_dtypes: + whether or not to include dtypes + + Returns: + encoded payload in bytes + """ + + +class _PBPayload: + @classmethod + def decode(cls, raw: bytes, *, birth: Optional[Birth] = None) -> Self: + """Construct a Payload object from bytes + + Args: + raw: + bytes to decode into a Payload object + birth: + the Birth object associated with this message, + for decoding aliases and dropped dtypes + + Returns: + Payload object + """ + payload = protobuf.Payload.FromString(raw) + if birth is not None: + for metric in payload.metrics: + if not metric.name: + metric.name = birth.get_name(metric.alias) + if metric.datatype == DataType.UNKNOWN: + metric.datatype = birth.get_dtype(metric.name) + if not payload.HasField("timestamp"): + raise ValueError("Sparkplug payload missing required timestamp field") + kwargs = { + "timestamp": payload.timestamp, + "metrics": tuple(Metric.from_pb(metric) for metric in payload.metrics), + } + if payload.HasField("seq"): + kwargs["seq"] = payload.seq + return cls(**kwargs) + + def encode(self, *, include_dtypes: bool = False) -> bytes: + """Encode Payload object into bytes + + Args: + include_dtypes: + whether or not to include dtypes + + Returns: + encoded payload in bytes + """ + payload = protobuf.Payload() + payload.timestamp = self.timestamp # type: ignore[attr-defined] # pylint: disable=no-member + if hasattr(self, "seq"): + payload.seq = self.seq + payload.metrics.extend( + metric.to_pb(include_dtype=include_dtypes) + for metric in self.metrics # type: ignore[attr-defined] # pylint: disable=no-member + ) + return cast(bytes, payload.SerializeToString()) + + +@dataclasses.dataclass(frozen=True) +class Birth(_PBPayload): + """Class representing a Birth payload + + Args: + timestamp: + timestamp at message sending time + seq: + sequence number + metrics: + metrics associated with this payload + """ + + timestamp: int + seq: int + metrics: Iterable[Metric] + _names_mapping: Dict[int, str] = dataclasses.field( + init=False, default_factory=dict, repr=False + ) + _dtypes_mapping: Dict[str, DataType] = dataclasses.field( + init=False, default_factory=dict, repr=False + ) + + def __post_init__(self) -> None: + for metric in self.metrics: + if metric.name is None: + raise ValueError( + f"Metric {metric} must have a defined name when provided to a Birth payload" + ) + if metric.datatype == DataType.UNKNOWN: + raise ValueError( + f"Metric {metric} must have a defined datatype when provided to a Birth payload" + ) + if metric.alias is not None: + self._names_mapping[metric.alias] = metric.name + self._dtypes_mapping[metric.name] = metric.datatype + + @classmethod + def decode(cls, raw: bytes, *, birth: Optional[Birth] = None) -> Self: + """Construct a Birth object from bytes + + Args: + raw: + bytes to decode into a Birth object + birth: + unused input since Births payloads are self-contained + + Returns: + Birth object + """ + birth = None # don't 
use previous birth to determine name/datatypes + return super().decode(raw, birth=birth) + + def encode(self, *, include_dtypes: bool = False) -> bytes: + """Encode Birth object into bytes + + Args: + include_dtypes: + whether or not to include dtypes + + Returns: + encoded payload in bytes + """ + include_dtypes = True # always include datatypes + return super().encode(include_dtypes=include_dtypes) + + def get_name(self, alias: int) -> str: + """Get the name of the metric with the requested alias + + Args: + alias: + the alias of the metric we want the name of + + Returns: + the name of the metric + """ + return self._names_mapping[alias] + + def get_dtype(self, name: str) -> DataType: + """Get the dtype of the metric with the requested name + + Args: + name: + the name of the metric we want the dtype of + + Returns: + the dtype of the metric + """ + return self._dtypes_mapping[name] + + +class NBirth(Birth): + """Class representing an NBirth payload + + Args: + timestamp: + timestamp at message sending time + seq: + sequence number + metrics: + metrics associated with this payload + """ + + +class DBirth(Birth): + """Class representing a DBirth payload + + Args: + timestamp: + timestamp at message sending time + seq: + sequence number + metrics: + metrics associated with this payload + """ + + +@dataclasses.dataclass(frozen=True) +class _Data(_PBPayload): + timestamp: int + seq: int + metrics: Iterable[Metric] + + +class NData(_Data): + """Class representing an NData payload + + Args: + timestamp: + timestamp at message sending time + seq: + sequence number + metrics: + metrics associated with this payload + """ + + +class DData(_Data): + """Class representing a DData payload + + Args: + timestamp: + timestamp at message sending time + seq: + sequence number + metrics: + metrics associated with this payload + """ + + +@dataclasses.dataclass(frozen=True) +class _Cmd(_PBPayload): + timestamp: int + metrics: Iterable[Metric] + + +class NCmd(_Cmd): + """Class representing an NCmd payload + + Args: + timestamp: + timestamp at message sending time + metrics: + metrics associated with this payload + """ + + +class DCmd(_Cmd): + """Class representing a DCmd payload + + Args: + timestamp: + timestamp at message sending time + metrics: + metrics associated with this payload + """ + + +@dataclasses.dataclass(frozen=True) +class NDeath: + """Class representing an NDeath payload + + Args: + timestamp: + timestamp at message sending time + bd_seq_metric: + birth death sequence number metric + """ + + timestamp: Optional[int] + bd_seq_metric: Metric + + @classmethod + def decode( + cls, + raw: bytes, + *, + birth: Optional[Birth] = None, # pylint: disable=unused-argument + ) -> Self: + """Construct an NDeath object from bytes + + Args: + raw: + bytes to decode into a NDeath object + birth: + unused input since NDeaths don't have any metrics with aliases or dropped dtypes + + Returns: + NDeath object + """ + payload = protobuf.Payload.FromString(raw) + return cls( + timestamp=payload.timestamp if not payload.HasField("timestamp") else None, + bd_seq_metric=Metric.from_pb(payload.metrics[0]), + ) + + def encode(self, *, include_dtypes: bool = False) -> bytes: + """Encode NDeath object into bytes + + Args: + include_dtypes: + whether or not to include dtypes + + Returns: + encoded payload in bytes + """ + include_dtypes = True # always include datatypes + payload = protobuf.Payload() + if self.timestamp is not None: + payload.timestamp = self.timestamp + 
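+        # The NDEATH payload carries just the bdSeq metric, which lets host
+        # applications correlate this death certificate with the matching NBIRTH.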
payload.metrics.append(self.bd_seq_metric.to_pb(include_dtype=include_dtypes)) + return cast(bytes, payload.SerializeToString()) + + +@dataclasses.dataclass(frozen=True) +class DDeath: + """Class representing a DDeath payload + + Args: + timestamp: + timestamp at message sending time + seq: + sequence number + """ + + timestamp: int + seq: int + + @classmethod + def decode( + cls, + raw: bytes, + *, + birth: Optional[Birth] = None, # pylint: disable=unused-argument + ) -> Self: + """Construct a DDeath object from bytes + + Args: + raw: + bytes to decode into a DDeath object + birth: + unused input since DDeaths don't have any metrics + + Returns: + DDeath object + """ + payload = protobuf.Payload.FromString(raw) + return cls( + timestamp=payload.timestamp, + seq=payload.seq, + ) + + def encode( + self, *, include_dtypes: bool = False # pylint: disable=unused-argument + ) -> bytes: + """Encode DDeath object into bytes + + Args: + include_dtypes: + unused input since DDeaths have no metrics + + Returns: + encoded payload in bytes + """ + payload = protobuf.Payload() + payload.timestamp = self.timestamp + payload.seq = self.seq + return cast(bytes, payload.SerializeToString()) + + +@dataclasses.dataclass(frozen=True) +class State: + """Class representing a State payload + + Args: + timestamp: + timestamp at message sending time + online: + whether or not the primary host application is online + """ + + timestamp: int + online: bool + + @classmethod + def decode( + cls, + raw: bytes, + *, + birth: Optional[Birth] = None, # pylint: disable=unused-argument + ) -> Self: + """Construct a State object from bytes + + Args: + raw: + bytes to decode into a Payload object + birth: + unused input since States don't have any metrics + + Returns: + State object + """ + state = json.loads(raw) + return cls( + timestamp=state["timestamp"], + online=state["online"], + ) + + def encode( + self, *, include_dtypes: bool = False # pylint: disable=unused-argument + ) -> bytes: + """Encode State object into bytes + + Args: + include_dtypes: + unused input since States have no metrics + + Returns: + encoded payload in bytes + """ + return json.dumps({"timestamp": self.timestamp, "online": self.online}).encode() diff --git a/src/pysparkplug/_protobuf/__init__.py b/src/pysparkplug/_protobuf/__init__.py new file mode 100644 index 0000000..2ae16d7 --- /dev/null +++ b/src/pysparkplug/_protobuf/__init__.py @@ -0,0 +1,20 @@ +"""Module pulling out the relevant types from the protobuf autogenerated code""" + +from pysparkplug._protobuf import sparkplug_b_pb2 + +Payload = sparkplug_b_pb2.Payload # type: ignore[attr-defined] # pylint: disable=no-member +DataSet = Payload.DataSet +DataSetValue = DataSet.DataSetValue +DataSetValueExtension = DataSetValue.DataSetValueExtension +Row = DataSet.Row +MetaData = Payload.MetaData +Metric = Payload.Metric +MetricValueExtension = Metric.MetricValueExtension +PropertySet = Payload.PropertySet +PropertySetList = Payload.PropertySetList +PropertyValue = Payload.PropertyValue +PropertyValueExtension = PropertyValue.PropertyValueExtension +Template = Payload.Template +Parameter = Template.Parameter +ParameterValueExtension = Parameter.ParameterValueExtension +DataType = sparkplug_b_pb2.DataType # type: ignore[attr-defined] # pylint: disable=no-member diff --git a/src/pysparkplug/_protobuf/sparkplug_b_pb2.py b/src/pysparkplug/_protobuf/sparkplug_b_pb2.py new file mode 100644 index 0000000..14d4794 --- /dev/null +++ b/src/pysparkplug/_protobuf/sparkplug_b_pb2.py @@ -0,0 +1,56 @@ +# -*- coding: 
utf-8 -*- +# Generated by the protocol buffer compiler. DO NOT EDIT! +# source: sparkplug_b.proto +"""Generated protocol buffer code.""" +from google.protobuf.internal import builder as _builder +from google.protobuf import descriptor as _descriptor +from google.protobuf import descriptor_pool as _descriptor_pool +from google.protobuf import symbol_database as _symbol_database +# @@protoc_insertion_point(imports) + +_sym_db = _symbol_database.Default() + + + + +DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(b'\n\x11sparkplug_b.proto\x12\x19org.eclipse.tahu.protobuf\"\xee\x15\n\x07Payload\x12\x11\n\ttimestamp\x18\x01 \x01(\x04\x12:\n\x07metrics\x18\x02 \x03(\x0b\x32).org.eclipse.tahu.protobuf.Payload.Metric\x12\x0b\n\x03seq\x18\x03 \x01(\x04\x12\x0c\n\x04uuid\x18\x04 \x01(\t\x12\x0c\n\x04\x62ody\x18\x05 \x01(\x0c\x1a\xa6\x04\n\x08Template\x12\x0f\n\x07version\x18\x01 \x01(\t\x12:\n\x07metrics\x18\x02 \x03(\x0b\x32).org.eclipse.tahu.protobuf.Payload.Metric\x12I\n\nparameters\x18\x03 \x03(\x0b\x32\x35.org.eclipse.tahu.protobuf.Payload.Template.Parameter\x12\x14\n\x0ctemplate_ref\x18\x04 \x01(\t\x12\x15\n\ris_definition\x18\x05 \x01(\x08\x1a\xca\x02\n\tParameter\x12\x0c\n\x04name\x18\x01 \x01(\t\x12\x0c\n\x04type\x18\x02 \x01(\r\x12\x13\n\tint_value\x18\x03 \x01(\rH\x00\x12\x14\n\nlong_value\x18\x04 \x01(\x04H\x00\x12\x15\n\x0b\x66loat_value\x18\x05 \x01(\x02H\x00\x12\x16\n\x0c\x64ouble_value\x18\x06 \x01(\x01H\x00\x12\x17\n\rboolean_value\x18\x07 \x01(\x08H\x00\x12\x16\n\x0cstring_value\x18\x08 \x01(\tH\x00\x12h\n\x0f\x65xtension_value\x18\t \x01(\x0b\x32M.org.eclipse.tahu.protobuf.Payload.Template.Parameter.ParameterValueExtensionH\x00\x1a#\n\x17ParameterValueExtension*\x08\x08\x01\x10\x80\x80\x80\x80\x02\x42\x07\n\x05value*\x08\x08\x06\x10\x80\x80\x80\x80\x02\x1a\x97\x04\n\x07\x44\x61taSet\x12\x16\n\x0enum_of_columns\x18\x01 \x01(\x04\x12\x0f\n\x07\x63olumns\x18\x02 \x03(\t\x12\r\n\x05types\x18\x03 \x03(\r\x12<\n\x04rows\x18\x04 \x03(\x0b\x32..org.eclipse.tahu.protobuf.Payload.DataSet.Row\x1a\xaf\x02\n\x0c\x44\x61taSetValue\x12\x13\n\tint_value\x18\x01 \x01(\rH\x00\x12\x14\n\nlong_value\x18\x02 \x01(\x04H\x00\x12\x15\n\x0b\x66loat_value\x18\x03 \x01(\x02H\x00\x12\x16\n\x0c\x64ouble_value\x18\x04 \x01(\x01H\x00\x12\x17\n\rboolean_value\x18\x05 \x01(\x08H\x00\x12\x16\n\x0cstring_value\x18\x06 \x01(\tH\x00\x12h\n\x0f\x65xtension_value\x18\x07 \x01(\x0b\x32M.org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue.DataSetValueExtensionH\x00\x1a!\n\x15\x44\x61taSetValueExtension*\x08\x08\x01\x10\x80\x80\x80\x80\x02\x42\x07\n\x05value\x1aZ\n\x03Row\x12I\n\x08\x65lements\x18\x01 \x03(\x0b\x32\x37.org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue*\x08\x08\x02\x10\x80\x80\x80\x80\x02*\x08\x08\x05\x10\x80\x80\x80\x80\x02\x1a\xe9\x03\n\rPropertyValue\x12\x0c\n\x04type\x18\x01 \x01(\r\x12\x0f\n\x07is_null\x18\x02 \x01(\x08\x12\x13\n\tint_value\x18\x03 \x01(\rH\x00\x12\x14\n\nlong_value\x18\x04 \x01(\x04H\x00\x12\x15\n\x0b\x66loat_value\x18\x05 \x01(\x02H\x00\x12\x16\n\x0c\x64ouble_value\x18\x06 \x01(\x01H\x00\x12\x17\n\rboolean_value\x18\x07 \x01(\x08H\x00\x12\x16\n\x0cstring_value\x18\x08 \x01(\tH\x00\x12K\n\x11propertyset_value\x18\t \x01(\x0b\x32..org.eclipse.tahu.protobuf.Payload.PropertySetH\x00\x12P\n\x12propertysets_value\x18\n \x01(\x0b\x32\x32.org.eclipse.tahu.protobuf.Payload.PropertySetListH\x00\x12\x62\n\x0f\x65xtension_value\x18\x0b 
\x01(\x0b\x32G.org.eclipse.tahu.protobuf.Payload.PropertyValue.PropertyValueExtensionH\x00\x1a\"\n\x16PropertyValueExtension*\x08\x08\x01\x10\x80\x80\x80\x80\x02\x42\x07\n\x05value\x1ag\n\x0bPropertySet\x12\x0c\n\x04keys\x18\x01 \x03(\t\x12@\n\x06values\x18\x02 \x03(\x0b\x32\x30.org.eclipse.tahu.protobuf.Payload.PropertyValue*\x08\x08\x03\x10\x80\x80\x80\x80\x02\x1a`\n\x0fPropertySetList\x12\x43\n\x0bpropertyset\x18\x01 \x03(\x0b\x32..org.eclipse.tahu.protobuf.Payload.PropertySet*\x08\x08\x02\x10\x80\x80\x80\x80\x02\x1a\xa4\x01\n\x08MetaData\x12\x15\n\ris_multi_part\x18\x01 \x01(\x08\x12\x14\n\x0c\x63ontent_type\x18\x02 \x01(\t\x12\x0c\n\x04size\x18\x03 \x01(\x04\x12\x0b\n\x03seq\x18\x04 \x01(\x04\x12\x11\n\tfile_name\x18\x05 \x01(\t\x12\x11\n\tfile_type\x18\x06 \x01(\t\x12\x0b\n\x03md5\x18\x07 \x01(\t\x12\x13\n\x0b\x64\x65scription\x18\x08 \x01(\t*\x08\x08\t\x10\x80\x80\x80\x80\x02\x1a\xbf\x05\n\x06Metric\x12\x0c\n\x04name\x18\x01 \x01(\t\x12\r\n\x05\x61lias\x18\x02 \x01(\x04\x12\x11\n\ttimestamp\x18\x03 \x01(\x04\x12\x10\n\x08\x64\x61tatype\x18\x04 \x01(\r\x12\x15\n\ris_historical\x18\x05 \x01(\x08\x12\x14\n\x0cis_transient\x18\x06 \x01(\x08\x12\x0f\n\x07is_null\x18\x07 \x01(\x08\x12=\n\x08metadata\x18\x08 \x01(\x0b\x32+.org.eclipse.tahu.protobuf.Payload.MetaData\x12\x42\n\nproperties\x18\t \x01(\x0b\x32..org.eclipse.tahu.protobuf.Payload.PropertySet\x12\x13\n\tint_value\x18\n \x01(\rH\x00\x12\x14\n\nlong_value\x18\x0b \x01(\x04H\x00\x12\x15\n\x0b\x66loat_value\x18\x0c \x01(\x02H\x00\x12\x16\n\x0c\x64ouble_value\x18\r \x01(\x01H\x00\x12\x17\n\rboolean_value\x18\x0e \x01(\x08H\x00\x12\x16\n\x0cstring_value\x18\x0f \x01(\tH\x00\x12\x15\n\x0b\x62ytes_value\x18\x10 \x01(\x0cH\x00\x12\x43\n\rdataset_value\x18\x11 \x01(\x0b\x32*.org.eclipse.tahu.protobuf.Payload.DataSetH\x00\x12\x45\n\x0etemplate_value\x18\x12 \x01(\x0b\x32+.org.eclipse.tahu.protobuf.Payload.TemplateH\x00\x12Y\n\x0f\x65xtension_value\x18\x13 \x01(\x0b\x32>.org.eclipse.tahu.protobuf.Payload.Metric.MetricValueExtensionH\x00\x1a \n\x14MetricValueExtension*\x08\x08\x01\x10\x80\x80\x80\x80\x02\x42\x07\n\x05value*\x08\x08\x06\x10\x80\x80\x80\x80\x02*\xf2\x03\n\x08\x44\x61taType\x12\x0b\n\x07Unknown\x10\x00\x12\x08\n\x04Int8\x10\x01\x12\t\n\x05Int16\x10\x02\x12\t\n\x05Int32\x10\x03\x12\t\n\x05Int64\x10\x04\x12\t\n\x05UInt8\x10\x05\x12\n\n\x06UInt16\x10\x06\x12\n\n\x06UInt32\x10\x07\x12\n\n\x06UInt64\x10\x08\x12\t\n\x05\x46loat\x10\t\x12\n\n\x06\x44ouble\x10\n\x12\x0b\n\x07\x42oolean\x10\x0b\x12\n\n\x06String\x10\x0c\x12\x0c\n\x08\x44\x61teTime\x10\r\x12\x08\n\x04Text\x10\x0e\x12\x08\n\x04UUID\x10\x0f\x12\x0b\n\x07\x44\x61taSet\x10\x10\x12\t\n\x05\x42ytes\x10\x11\x12\x08\n\x04\x46ile\x10\x12\x12\x0c\n\x08Template\x10\x13\x12\x0f\n\x0bPropertySet\x10\x14\x12\x13\n\x0fPropertySetList\x10\x15\x12\r\n\tInt8Array\x10\x16\x12\x0e\n\nInt16Array\x10\x17\x12\x0e\n\nInt32Array\x10\x18\x12\x0e\n\nInt64Array\x10\x19\x12\x0e\n\nUInt8Array\x10\x1a\x12\x0f\n\x0bUInt16Array\x10\x1b\x12\x0f\n\x0bUInt32Array\x10\x1c\x12\x0f\n\x0bUInt64Array\x10\x1d\x12\x0e\n\nFloatArray\x10\x1e\x12\x0f\n\x0b\x44oubleArray\x10\x1f\x12\x10\n\x0c\x42ooleanArray\x10 \x12\x0f\n\x0bStringArray\x10!\x12\x11\n\rDateTimeArray\x10\"B,\n\x19org.eclipse.tahu.protobufB\x0fSparkplugBProto') + +_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, globals()) +_builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, 'sparkplug_b_pb2', globals()) +if _descriptor._USE_C_DESCRIPTORS == False: + + DESCRIPTOR._options = None + DESCRIPTOR._serialized_options = 
b'\n\031org.eclipse.tahu.protobufB\017SparkplugBProto' + _DATATYPE._serialized_start=2850 + _DATATYPE._serialized_end=3348 + _PAYLOAD._serialized_start=49 + _PAYLOAD._serialized_end=2847 + _PAYLOAD_TEMPLATE._serialized_start=181 + _PAYLOAD_TEMPLATE._serialized_end=731 + _PAYLOAD_TEMPLATE_PARAMETER._serialized_start=391 + _PAYLOAD_TEMPLATE_PARAMETER._serialized_end=721 + _PAYLOAD_TEMPLATE_PARAMETER_PARAMETERVALUEEXTENSION._serialized_start=677 + _PAYLOAD_TEMPLATE_PARAMETER_PARAMETERVALUEEXTENSION._serialized_end=712 + _PAYLOAD_DATASET._serialized_start=734 + _PAYLOAD_DATASET._serialized_end=1269 + _PAYLOAD_DATASET_DATASETVALUE._serialized_start=864 + _PAYLOAD_DATASET_DATASETVALUE._serialized_end=1167 + _PAYLOAD_DATASET_DATASETVALUE_DATASETVALUEEXTENSION._serialized_start=1125 + _PAYLOAD_DATASET_DATASETVALUE_DATASETVALUEEXTENSION._serialized_end=1158 + _PAYLOAD_DATASET_ROW._serialized_start=1169 + _PAYLOAD_DATASET_ROW._serialized_end=1259 + _PAYLOAD_PROPERTYVALUE._serialized_start=1272 + _PAYLOAD_PROPERTYVALUE._serialized_end=1761 + _PAYLOAD_PROPERTYVALUE_PROPERTYVALUEEXTENSION._serialized_start=1718 + _PAYLOAD_PROPERTYVALUE_PROPERTYVALUEEXTENSION._serialized_end=1752 + _PAYLOAD_PROPERTYSET._serialized_start=1763 + _PAYLOAD_PROPERTYSET._serialized_end=1866 + _PAYLOAD_PROPERTYSETLIST._serialized_start=1868 + _PAYLOAD_PROPERTYSETLIST._serialized_end=1964 + _PAYLOAD_METADATA._serialized_start=1967 + _PAYLOAD_METADATA._serialized_end=2131 + _PAYLOAD_METRIC._serialized_start=2134 + _PAYLOAD_METRIC._serialized_end=2837 + _PAYLOAD_METRIC_METRICVALUEEXTENSION._serialized_start=2796 + _PAYLOAD_METRIC_METRICVALUEEXTENSION._serialized_end=2828 +# @@protoc_insertion_point(module_scope) diff --git a/src/pysparkplug/_time.py b/src/pysparkplug/_time.py new file mode 100644 index 0000000..fbb82d3 --- /dev/null +++ b/src/pysparkplug/_time.py @@ -0,0 +1,10 @@ +"""Module of time utilties""" + +import time + +__all__ = ["get_current_timestamp"] + + +def get_current_timestamp() -> int: + """Returns current time in a Sparkplug B compliant format""" + return int(time.time() * 1e3) diff --git a/src/pysparkplug/_topic.py b/src/pysparkplug/_topic.py new file mode 100644 index 0000000..88274a5 --- /dev/null +++ b/src/pysparkplug/_topic.py @@ -0,0 +1,140 @@ +"""Module defining the Topic class""" + +import dataclasses +import re +from typing import Optional, Union, cast + +from pysparkplug._enums import MessageType +from pysparkplug._types import Literal, Self, TypeAlias + +__all__ = ["Topic"] +Wildcard: TypeAlias = Literal["#", "*"] + + +@dataclasses.dataclass(frozen=True) +class Topic: + """Class representing a Sparkplug B topic + + Args: + group_id: + the Group ID element of the topic namespace provides for a logical + grouping of Sparkplug Edge Nodes into the MQTT Server and back out + to the consuming Sparkplug Host Applications + message_type: + the message_type element of the topic namespace provides an + indication as to how to handle the MQTT payload of the message + edge_node_id: + the edge_node_id element of the Sparkplug topic namespace uniquely + identifies the Sparkplug Edge Node within the infrastructure + device_id: + the device_id element of the Sparkplug topic namespace identifies + a device attached (physically or logically) to the Sparkplug Edge + Node + sparkplug_host_id: + the unique identifier of the Sparkplug Host Application + """ + + namespace = "spBv1.0" + group_id: Optional[str] = None + message_type: Optional[Union[MessageType, Wildcard]] = None + edge_node_id: Optional[str] = None + 
device_id: Optional[str] = None + sparkplug_host_id: Optional[str] = None + + _validator = re.compile("[/+#]") + + def __post_init__(self) -> None: + if ( + self.message_type is not None + and self.message_type not in "#*" + and self._validator.search(self.message_type) is not None + ): + raise ValueError( + f"message_type {self.message_type} cannot contain /, +, or # characters" + ) + if ( + self.group_id is not None + and self.group_id not in "#*" + and self._validator.search(self.group_id) is not None + ): + raise ValueError( + f"group_id {self.group_id} cannot contain /, +, or # characters" + ) + if ( + self.edge_node_id is not None + and self.edge_node_id not in "#*" + and self._validator.search(self.edge_node_id) is not None + ): + raise ValueError( + f"edge_node_id {self.edge_node_id} cannot contain /, +, or # characters" + ) + if ( + self.device_id is not None + and self.device_id not in "#*" + and self._validator.search(self.device_id) is not None + ): + raise ValueError( + f"device_id {self.device_id} cannot contain /, +, or # characters" + ) + if ( + self.sparkplug_host_id is not None + and self.sparkplug_host_id not in "#*" + and self._validator.search(self.sparkplug_host_id) is not None + ): + raise ValueError( + f"sparkplug_host_id {self.sparkplug_host_id} cannot contain /, +, or # characters" + ) + + @classmethod + def from_str(cls, topic: str) -> Self: + """Construct a Topic object from a topic string + + Args: + topic: the Sparkplug B topic in raw string form + + Returns: + a Topic object + """ + parts = topic.split("/") + namespace = parts[0] + if namespace != cls.namespace: + raise ValueError(f"Topic with invalid namespace {namespace}") + if parts[1] == MessageType.STATE: + return cls(MessageType.STATE, sparkplug_host_id=parts[2]) + + group_id = None + message_type = None + edge_node_id = None + device_id = None + + try: + group_id = parts[1] + message_type = cast( + Union[MessageType, Wildcard], + parts[2] if parts[2] in "#*" else MessageType(parts[2]), + ) + edge_node_id = parts[3] + device_id = parts[4] + except IndexError: + pass + return cls( + group_id=group_id, + message_type=message_type, + edge_node_id=edge_node_id, + device_id=device_id, + ) + + def to_str(self) -> str: + """Encode a Topic object as a string""" + if self.message_type == MessageType.STATE: + return f"{self.namespace}/{self.message_type}/{self.sparkplug_host_id}" + if self.device_id is not None: + return f"{self.namespace}/{self.group_id}/{self.message_type}/{self.edge_node_id}/{self.device_id}" + if self.edge_node_id is not None: + return f"{self.namespace}/{self.group_id}/{self.message_type}/{self.edge_node_id}" + if self.message_type is not None: + return f"{self.namespace}/{self.group_id}/{self.message_type}" + return f"{self.namespace}/{self.group_id}" + + def __str__(self) -> str: + return self.to_str() diff --git a/src/pysparkplug/_types.py b/src/pysparkplug/_types.py new file mode 100644 index 0000000..8991ac4 --- /dev/null +++ b/src/pysparkplug/_types.py @@ -0,0 +1,14 @@ +"""Module defining common types""" + +import datetime +import sys +from typing import Union + +if sys.version_info < (3, 11): + from typing_extensions import Literal, Protocol, Self, TypeAlias +else: + from typing import Literal, Protocol, Self, TypeAlias + +__all__ = ["Literal", "MetricValue", "Protocol", "Self", "TypeAlias"] + +MetricValue: TypeAlias = Union[int, float, bool, str, bytes, datetime.datetime] diff --git a/src/pysparkplug/py.typed b/src/pysparkplug/py.typed new file mode 100644 index 0000000..e69de29 diff 
--git a/test/test.sh b/test/test.sh new file mode 100755 index 0000000..2d1ce9a --- /dev/null +++ b/test/test.sh @@ -0,0 +1,12 @@ +#! /usr/bin/env bash +set -o errexit -o nounset -o pipefail +IFS=$'\n\t' + +REPO_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"/.. +cd "$REPO_DIR" + +echo "Running tests" + +docker compose run --rm cicd nox "$@" + +echo "$0 completed successfully!" diff --git a/test/unit_tests/__init__.py b/test/unit_tests/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/test/unit_tests/test_version.py b/test/unit_tests/test_version.py new file mode 100644 index 0000000..e8e560f --- /dev/null +++ b/test/unit_tests/test_version.py @@ -0,0 +1,24 @@ +"""Test suite for version information""" + +import importlib.metadata +import unittest + +import packaging.version + +import pysparkplug as psp + + +class TestVersion(unittest.TestCase): + """Test the package version is correct""" + + def test_version_metadata(self) -> None: + """Confirm the pysparkplug package has a valid version in its metadata""" + packaging.version.Version(importlib.metadata.version("pysparkplug")) + + def test_version(self) -> None: + """Test the pysparkplug package has a valid version""" + packaging.version.Version(psp.__version__) + + def test_version_match(self) -> None: + """Test the pysparkplug package version matches its metadata""" + self.assertEqual(psp.__version__, importlib.metadata.version("pysparkplug"))
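For reference, here is a minimal usage sketch of the edge node API added in this change. It assumes `EdgeNode`, `Device`, `Metric`, `DataType`, and `get_current_timestamp` are re-exported at the top level of the `pysparkplug` package (the unit tests import it as `psp`), and that an MQTT broker is reachable at `localhost:1883`; the group, node, device, and metric names below are illustrative only, not part of this change.

```python
import time

import pysparkplug as psp

# Metrics handed to an edge node or device must have a name and a datatype.
node_metrics = [
    psp.Metric(
        timestamp=psp.get_current_timestamp(),
        name="temperature",
        datatype=psp.DataType.FLOAT,
        value=21.5,
    )
]
device_metrics = [
    psp.Metric(
        timestamp=psp.get_current_timestamp(),
        name="counter",
        datatype=psp.DataType.INT32,
        value=0,
    )
]

edge_node = psp.EdgeNode(
    group_id="my_group",
    edge_node_id="my_edge_node",
    metrics=node_metrics,
)
device = psp.Device(device_id="my_device", metrics=device_metrics)
edge_node.register(device)

# Connecting arms the NDEATH will, then publishes the NBIRTH followed by a
# DBIRTH for each registered device.
edge_node.connect("localhost", port=1883, blocking=False)
time.sleep(1)  # crude wait for the connection; real code would coordinate properly

# Updates must reuse the names and datatypes declared at birth; they are
# published as NDATA (node) and DDATA (device) messages.
edge_node.update(
    [
        psp.Metric(
            timestamp=psp.get_current_timestamp(),
            name="temperature",
            datatype=psp.DataType.FLOAT,
            value=22.0,
        )
    ]
)
edge_node.update_device(
    "my_device",
    [
        psp.Metric(
            timestamp=psp.get_current_timestamp(),
            name="counter",
            datatype=psp.DataType.INT32,
            value=1,
        )
    ],
)

# A clean disconnect; the broker will not send the NDEATH will.
edge_node.disconnect()
```

Note that `connect(..., blocking=False)` returns immediately and runs the MQTT network loop on a separate thread, per the `blocking` argument's documentation above.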