From 07e88ab38fbe1784b7fda3c7198213178bec5e3d Mon Sep 17 00:00:00 2001
From: Rachel Yang
Date: Wed, 9 Oct 2024 14:12:14 -0400
Subject: [PATCH 01/11] (chore): updating and cleaning parametric.md

---
 docs/scenarios/parametric.md | 163 +++++++++++++++++------------------
 1 file changed, 81 insertions(+), 82 deletions(-)

diff --git a/docs/scenarios/parametric.md b/docs/scenarios/parametric.md
index 70794d5de7..f851926bfb 100644
--- a/docs/scenarios/parametric.md
+++ b/docs/scenarios/parametric.md
@@ -68,15 +68,10 @@ The following dependencies are required to run the tests locally:
- Docker
- Python 3.12

-then, run the following command, which will create a Python virtual environment and install the Python dependencies from the root directory:
-
-```sh
-./build.sh -i runner
-```
-
-
### Running the tests

+The build happens automatically at the beginning of each run command.
+
Run all the tests for a particular tracer library:

```sh
TEST_LIBRARY=dotnet ./run.sh PARAMETRIC
```

@@ -97,9 +92,36 @@ TEST_LIBRARY=dotnet ./run.sh PARAMETRIC -k test_metrics_
```

Tests can be aborted using CTRL-C, but note that containers may still be running and will have to be shut down.

-#### Go
+### Using Pytest
+
+The tests are executed using pytest. Below are some common command-line options you can use to control and customize your test runs.
+- `-k EXPRESSION`: Run tests that match the given expression (substring or pattern). Useful for running specific tests or groups of tests.
+
+```sh
+TEST_LIBRARY=dotnet ./run.sh PARAMETRIC -k test_metrics_msgpack_serialization_TS001
+```
+
+- `-v`: Increase verbosity. Shows each test name and its result (pass/fail) as they are run.
+
+```sh
+TEST_LIBRARY=dotnet ./run.sh PARAMETRIC -v
+```
+
+- `-vv`: Even more verbose output. Provides detailed information including setup and teardown for each test.
+
+```sh
+TEST_LIBRARY=dotnet ./run.sh PARAMETRIC -vv -k test_metrics_
+```
+
+- `-s`: Disable output capture. Allows you to see print statements and logs directly in the console.
+
+```sh
+TEST_LIBRARY=dotnet ./run.sh PARAMETRIC -s
+```
+
+### Running the tests for a custom tracer

-For running the Go tests, see the README in apps/golang.
+#### Go

To test unmerged PRs locally, run the following in the utils/build/docker/golang/parametric directory:

```sh
go get -u gopkg.in/DataDog/dd-trace-go.v1@<commit_hash>
go mod tidy
```

@@ -110,14 +132,14 @@ go mod tidy

#### dotnet

-Add a file datadog-dotnet-apm-<version>.tar.gz in binaries/. <version> must be a valid version number.
+- Add a file `datadog-dotnet-apm-<version>.tar.gz` in `binaries/`. `<version>` must be a valid version number.
+  - One way to get that file is from an Azure pipeline (either a recent one from master if the changes you want to test were merged recently, or the one from your PR if it's open).

#### Java

-##### Run Parametric tests with a custom Java Tracer version
+Follow these steps to run Parametric tests with a custom Java Tracer version:

-1. Clone the repo and checkout to the branch you'd like to test
-Clone the repo:
+1. Clone the repo and check out the branch you'd like to test:
```bash
git clone git@github.com:DataDog/dd-trace-java.git
cd dd-trace-java
```
By default you will be on the `master` branch, but if you'd like to run system-tests on the changes you made to your local branch, `git checkout` to that branch before proceeding.

2. Build Java Tracer artifacts
```
./gradlew :dd-java-agent:shadowJar :dd-trace-api:jar
```

3. Copy both artifacts into the `system-tests/binaries/` folder:
   * The Java tracer agent artifact `dd-java-agent-*.jar` from `dd-java-agent/build/libs/`
   * Its public API `dd-trace-api-*.jar` from `dd-trace-api/build/libs/`

Note: you should have only TWO jar files in `system-tests/binaries`. Do NOT copy sources or javadoc jars.

4. Run Parametric tests from the `system-tests/parametric` folder:

```bash
TEST_LIBRARY=java ./run.sh test_span_sampling.py::test_single_rule_match_span_sampling_sss001
```

@@ -168,18 +190,16 @@ From the repo root folder:

#### Python

-To run the Python tests "locally" push your code to a branch and then specify ``PYTHON_DDTRACE_PACKAGE``.
-
-
-```sh
-TEST_LIBRARY=python PYTHON_DDTRACE_PACKAGE=git+https://github.com/Datadog/dd-trace-py@2.x ./run.sh PARAMETRIC [-k ...]
-```
+To run the Python tests against a custom tracer:
+```bash
+echo "ddtrace @ git+https://github.com/DataDog/dd-trace-py.git@<branch>" > binaries/python-load-from-pip
+```
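+
+The file content is a standard pip requirement string (a PEP 508 direct reference). If the run fails at image build time, one quick sanity check is to confirm the string parses; a minimal sketch, assuming the `packaging` package is installed and using a hypothetical branch name:
+
+```python
+from packaging.requirements import Requirement
+
+# Hypothetical pin; replace "main" with the branch or commit you want to test.
+req = Requirement("ddtrace @ git+https://github.com/DataDog/dd-trace-py.git@main")
+print(req.name)  # ddtrace
+print(req.url)   # git+https://github.com/DataDog/dd-trace-py.git@main
+```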
#### NodeJS

There is three ways for running the NodeJS tests with a custom tracer:
1. Create a file `nodejs-load-from-npm` in `binaries/`, the content will be installed by `npm install`. Content example:
-   * `DataDog/dd-trace-js#master`
+   - `DataDog/dd-trace-js#master`
2. Clone the dd-trace-js repo inside `binaries`
3. Create a file `nodejs-load-from-local` in `binaries/`, this will disable installing with `npm install dd-trace` and
   will instead get the content of the file, and use it as a location of the `dd-trace-js` repo and then mount it as a
   volume and `npm link` to it. For instance, if this repo is at the location, you can set the content of this file to
   `../dd-trace-js`. This also removes the need to rebuild the weblog image since the code is mounted at runtime.

#### Ruby

There are two ways for running the Ruby tests with a custom tracer:

1. Create a file ruby-load-from-bundle-add in binaries/, the content will be installed by bundle add. Content example:
gem 'datadog', git: "https://github.com/Datadog/dd-trace-rb", branch: "master", require: 'datadog/auto_instrument'
2. Clone the dd-trace-rb repo inside binaries

#### C++

There are two ways for running the C++ library tests with a custom tracer:
1. Create a file `cpp-load-from-git` in `binaries/`. Content examples:
   * `https://github.com/DataDog/dd-trace-cpp@main`
   * `https://github.com/DataDog/dd-trace-cpp@<commit_hash>`
2. Clone the dd-trace-cpp repo inside `binaries`

-The parametric shared tests can be run against the C++ library,
-[dd-trace-cpp][1], this way:
-```console
-$ TEST_LIBRARY=cpp ./run.sh PARAMETRIC
-```
-
-Use the `-k` command line argument, which is forwarded to [pytest][2], to
-specify a substring within a particular test file, class, or method. Then only
-matching tests will run, e.g.
-```console
-$ TEST_LIBRARY=cpp ./run.sh PARAMETRIC -k test_headers
-```
-
-It's convenient to have a pretty printer for the tests' XML output. I use
-[xunit-viewer][3].
-```console
-$ npm install junit-viewer -g
-```
-
-My development iterations then involve running the following at the top of the
-repository:
-```console
-$ TEST_LIBRARY=cpp ./run.sh PARAMETRIC -k test_headers; xunit-viewer -r logs_parametric/reportJunit.xml
-```
-
-This will create a file `index.html` at the top of the repository, which I then
-inspect with a web browser.
-
-The C++ build can be made to point to a different GitHub branch by modifying the
-`FetchContent_Declare` command's `GIT_TAG` argument in [CMakeLists.txt][4].
-
-In order to coerce Docker to rebuild the C++ gRPC server image, one of the build
-inputs must change, and so whenever I push changes to the target branch, I also
-modify a scratch comment in `CMakeLists.txt` to trigger a rebuild on the next
-test run.
+#### When you are done testing against a custom tracer:
+```bash
+rm -rf binaries/python-load-from-pip
+```

+### Understanding the test outcomes
+Please refer to this chart:
+
| Declaration | Test is executed | Test actual outcome | System test output | Comment
| - | - | - | - | -
| (none) | Yes | ✅ Pass | 🟢 Success | All good :sunglasses:
| Missing feature or bug | Yes | ❌ Fail | 🟢 Success | Expected failure
| Missing feature or bug | Yes | ✅ Pass | 🟠 Success | XPASS: The feature has been implemented, bug has been fixed -> easy win
| Flaky | No | N.A. | N.A. | A flaky test doesn't provide any useful information, and thus, is not executed.
| Irrelevant | No | N.A. | N.A. | There is no purpose in running such a test
| (none) | Yes | ❌ Fail | 🔴 Fail | Only use case where system test fails: the test should have been ok, and is not
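+
+In practice, these declarations are decorators applied to test classes or methods. A rough sketch of what they can look like (the decorator names follow the `utils` helpers used throughout this repo, but the exact signatures may differ, and the conditions and reasons here are made up):
+
+```python
+from utils import context, flaky, missing_feature
+
+
+@missing_feature(context.library == "dotnet", reason="feature not implemented yet")
+@flaky(context.library == "java", reason="example flaky-ticket reference")
+def test_some_behavior():
+    ...
+```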
### Debugging

@@ -296,37 +297,6 @@ docker image rm -test-library
The Python implementation of the interface, `app/python`, provides a specification of the API when run.
See the steps below in the HTTP section to run the Python server and view the specification.

-## Updating protos
-
-In order to update the `parametric/protos`, these steps must be followed.
-
-1. Create a virtual environment and activate it:
-```bash
-python3.12 -m venv .venv && source .venv/bin/activate
-```
-
-2. Install the required dependencies:
-```bash
-pip install -r requirements.txt
-```
-
-3. Install `grpcio-tools` (make sure grpcaio is the same version):
-```bash
-pip install grpcio-tools==1.60.1
-```
-
-4. Change directory to `utils/parametric`:
-```console
-cd utils/parametric
-```
-
-5. Run the script to generate the proto files:
-```bash
-./generate_protos.sh
-```
-
-Then you should have updated proto files. This script will generate weird files, you can ignore/delete these.
-
## Implementation

### Shared Interface

#### HTTP

@@ -362,7 +332,36 @@ service APMClient {
  rpc StartSpan(StartSpanArgs) returns (StartSpanReturn) {}
  rpc FinishSpan(FinishSpanArgs) returns (FinishSpanReturn) {}
  rpc SpanSetMeta(SpanSetMetaArgs) returns (SpanSetMetaReturn) {}
  rpc SpanSetMetric(SpanSetMetricArgs) returns (SpanSetMetricReturn) {}
  rpc SpanSetError(SpanSetErrorArgs) returns (SpanSetErrorReturn) {}
  rpc InjectHeaders(InjectHeadersArgs) returns (InjectHeadersReturn) {}
  rpc FlushSpans(FlushSpansArgs) returns (FlushSpansReturn) {}
  rpc FlushTraceStats(FlushTraceStatsArgs) returns (FlushTraceStatsReturn) {}
  rpc StopTracer(StopTracerArgs) returns (StopTracerReturn) {}
}
```
+#### Updating protos for GRPC
+
+In order to update the `parametric/protos`, these steps must be followed.
+
+1. Create a virtual environment and activate it:
+```bash
+python3.12 -m venv .venv && source .venv/bin/activate
+```
+
+2. Install the required dependencies:
+```bash
+pip install -r requirements.txt
+```
+
+3. Install `grpcio-tools` (make sure grpcio is the same version):
+```bash
+pip install grpcio-tools==1.60.1
+```
+
+4. Change directory to `utils/parametric`:
+```console
+cd utils/parametric
+```
+
+5. Run the script to generate the proto files:
+```bash
+./generate_protos.sh
+```
+
+Then you should have updated proto files. This script will generate weird files, you can ignore/delete these.
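+
+Under the hood, generation relies on the `grpcio-tools` package installed above. If you ever need to invoke it manually, a rough equivalent looks like this (the proto file name and paths are assumptions; check `generate_protos.sh` for the real invocation):
+
+```python
+from grpc_tools import protoc
+
+# Writes *_pb2.py and *_pb2_grpc.py next to the .proto file.
+protoc.main([
+    "grpc_tools.protoc",
+    "-I.",
+    "--python_out=.",
+    "--grpc_python_out=.",
+    "apm_test_client.proto",  # hypothetical file name
+])
+```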

### Architecture

From b37061d10ad1ad2b223a6af8e0fa02289d634bc2 Mon Sep 17 00:00:00 2001
From: Rachel Yang
Date: Wed, 9 Oct 2024 14:15:48 -0400
Subject: [PATCH 02/11] typo

---
 docs/scenarios/parametric.md | 2 --
 1 file changed, 2 deletions(-)

diff --git a/docs/scenarios/parametric.md b/docs/scenarios/parametric.md
index f851926bfb..a5715ba637 100644
--- a/docs/scenarios/parametric.md
+++ b/docs/scenarios/parametric.md
@@ -297,8 +297,6 @@ docker image rm -test-library
The Python implementation of the interface, `app/python`, provides a specification of the API when run.
See the steps below in the HTTP section to run the Python server and view the specification.

-## Implementation
-
### Shared Interface

#### HTTP

From 6b5374ea7dc42a2e80645a6011400205e3148537 Mon Sep 17 00:00:00 2001
From: Rachel Yang
Date: Wed, 9 Oct 2024 14:19:58 -0400
Subject: [PATCH 03/11] adding logs location

---
 docs/scenarios/parametric.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/docs/scenarios/parametric.md b/docs/scenarios/parametric.md
index a5715ba637..5b92eb4e75 100644
--- a/docs/scenarios/parametric.md
+++ b/docs/scenarios/parametric.md
@@ -247,6 +247,7 @@ These can be used to debug the test case. The output also contains the commands
used to build and run the containers which can be run manually to debug the issue further.

+The logs are contained in this folder: `./logs_parametric`

## Troubleshooting

From 62db4a97b4c4d01e81d31d25f8a4a35a8d502b07 Mon Sep 17 00:00:00 2001
From: Rachel Yang
Date: Wed, 9 Oct 2024 14:23:54 -0400
Subject: [PATCH 04/11] adding docker system prune situation

---
 docs/scenarios/parametric.md | 9 +++++++++
 1 file changed, 9 insertions(+)

diff --git a/docs/scenarios/parametric.md b/docs/scenarios/parametric.md
index 5b92eb4e75..23709a6af1 100644
--- a/docs/scenarios/parametric.md
+++ b/docs/scenarios/parametric.md
@@ -290,6 +290,15 @@ library. Deleting the image will force a rebuild which will resolve the issue.
docker image rm <library>-test-library
```

+### Docker Cleanup
+If you encounter an excessive number of errors during your workflow, one potential solution is to perform a cleanup of Docker resources. This can help resolve issues related to corrupted containers, dangling images, or unused volumes that might be causing conflicts.
+
+```sh
+docker system prune
+```
+
+**⚠️ Warning:**
+Executing `docker system prune` will remove all stopped containers, unused networks, dangling images, and build caches. This action is **irreversible** and may result in the loss of important data. Ensure that you **do not** need any of these resources before proceeding.

## Developing the tests

From d2e87a4f598c50491f64aeb5fb1485e85665 Mon Sep 17 00:00:00 2001
From: Rachel Yang
Date: Wed, 9 Oct 2024 14:25:42 -0400
Subject: [PATCH 05/11] more cleaning

---
 docs/scenarios/parametric.md | 18 ++++++++----------
 1 file changed, 8 insertions(+), 10 deletions(-)

diff --git a/docs/scenarios/parametric.md b/docs/scenarios/parametric.md
index 23709a6af1..9832f867f1 100644
--- a/docs/scenarios/parametric.md
+++ b/docs/scenarios/parametric.md
@@ -256,6 +256,14 @@ The logs are contained in this folder: `./logs_parametric`

- Exiting the tests abruptly may leave some docker containers running. Use `docker ps` to find and `docker kill` any
  containers that may still be running.

+### Tests failing locally but not in CI
+
+A cause for this can be that the Docker image containing the APM library is cached locally with an older version of the
+library. Deleting the image will force a rebuild which will resolve the issue.
+
+```sh
+docker image rm <library>-test-library
+```

### Port conflict on 50052

@@ -280,16 +288,6 @@ are being produced then likely build kit has to be disabled. To do that open the
Docker UI > Docker Engine. Change `buildkit: true` to `buildkit: false` and restart Docker.

-### Tests failing locally but not in CI
-
-A cause for this can be that the Docker image containing the APM library is cached locally with an older version of the
-library. Deleting the image will force a rebuild which will resolve the issue.
-
-```sh
-docker image rm <library>-test-library
-```
-

From d05370bb5de499fa16f7a326b88f27f9f5fd8474 Mon Sep 17 00:00:00 2001
From: Rachel Yang
Date: Tue, 15 Oct 2024 14:15:56 -0400
Subject: [PATCH 06/11] linting + architecture

---
 docs/scenarios/parametric.md | 46 +++++++++++++-----------------------
 1 file changed, 16 insertions(+), 30 deletions(-)

diff --git a/docs/scenarios/parametric.md b/docs/scenarios/parametric.md
index 9832f867f1..cc21236c62 100644
--- a/docs/scenarios/parametric.md
+++ b/docs/scenarios/parametric.md
@@ -70,7 +70,7 @@ The following dependencies are required to run the tests locally:

### Running the tests

-The build happens automatically at the beginning of each run command. 
+The build happens automatically at the beginning of each run command.

Run all the tests for a particular tracer library:

@@ -190,7 +190,7 @@ From the repo root folder:

#### Python

-To run the Python tests against a custom tracer: 
+To run the Python tests against a custom tracer:
```bash
echo "ddtrace @ git+https://github.com/DataDog/dd-trace-py.git@<branch>" > binaries/python-load-from-pip
```

@@ -309,36 +309,22 @@ See the steps below in the HTTP section to run the Python server and view the sp

#### HTTP

-An HTTP interface can be used instead of the GRPC. To view the interface run
+We have transitioned to using an HTTP interface, replacing the legacy GRPC interface. To view the HTTP interface, follow these steps:

-```
+1. ```
./utils/scripts/parametric/run_reference_http.sh
```
-and navigate to http://localhost:8000/docs. The OpenAPI schema can be downloaded at
-http://localhost:8000/openapi.json. The schema can be imported
-into [Postman](https://learning.postman.com/docs/integrations/available-integrations/working-with-openAPI/) or
-other tooling to assist in development.
+2. Navigate to http://localhost:8000/docs in your web browser to access the documentation.
+3. You can download the OpenAPI schema from http://localhost:8000/openapi.json. This schema can be imported into tools like [Postman](https://learning.postman.com/docs/integrations/available-integrations/working-with-openAPI/) or other API clients to facilitate development and testing.
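+
+For example, once the reference server is running, you can pull the schema programmatically; a small sketch, assuming the `requests` package is installed and the server is on the default port shown above:
+
+```python
+import requests
+
+schema = requests.get("http://localhost:8000/openapi.json").json()
+# List the endpoint paths that make up the shared interface.
+print(sorted(schema["paths"]))
+```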

#### Legacy GRPC
+**Important:** The legacy GRPC interface will be **deprecated** and no longer in use. All client interactions and tests will be migrated to use the HTTP interface.

-In order to achieve shared tests, we introduce a shared GRPC interface to the clients. Thus, each client need only implement the GRPC interface server and then these shared tests can be run against the library. The GRPC interface implements common APIs across the clients which provide the building blocks for test cases.
-
-```proto
-service APMClient {
-  rpc StartSpan(StartSpanArgs) returns (StartSpanReturn) {}
-  rpc FinishSpan(FinishSpanArgs) returns (FinishSpanReturn) {}
-  rpc SpanSetMeta(SpanSetMetaArgs) returns (SpanSetMetaReturn) {}
-  rpc SpanSetMetric(SpanSetMetricArgs) returns (SpanSetMetricReturn) {}
-  rpc SpanSetError(SpanSetErrorArgs) returns (SpanSetErrorReturn) {}
-  rpc InjectHeaders(InjectHeadersArgs) returns (InjectHeadersReturn) {}
-  rpc FlushSpans(FlushSpansArgs) returns (FlushSpansReturn) {}
-  rpc FlushTraceStats(FlushTraceStatsArgs) returns (FlushTraceStatsReturn) {}
-  rpc StopTracer(StopTracerArgs) returns (StopTracerReturn) {}
-}
-```
-#### Updating protos for GRPC
+Previously, we used a shared GRPC interface to enable shared testing across different clients. Each client would implement the GRPC interface server, allowing shared tests to be run against the client libraries. The GRPC service definition included methods like StartSpan, FinishSpan, SpanSetMeta, and others, which facilitated span and trace operations.
+
+#### Updating protos for GRPC (will be deprecated)

In order to update the `parametric/protos`, these steps must be followed.

1. Create a virtual environment and activate it:
```bash
python3.12 -m venv .venv && source .venv/bin/activate
```

2. Install the required dependencies:
```bash
pip install -r requirements.txt
```

3. Install `grpcio-tools` (make sure grpcio is the same version):
```bash
pip install grpcio-tools==1.60.1
```

4. Change directory to `utils/parametric`:
```console
cd utils/parametric
```

5. Run the script to generate the proto files:
```bash
./generate_protos.sh
```

Then you should have updated proto files. This script will generate weird files, you can ignore/delete these.

### Architecture
+Below is an overview of how the testing architecture is structured:
+- Shared Tests in Python: We write shared test cases using Python's pytest framework. These tests are designed to be generic and interact with clients through the HTTP interface.
+- HTTP Servers in Docker: For each language client, we build and run an HTTP server within a Docker container. These servers expose the required endpoints defined in the OpenAPI schema and handle the client-specific logic.
+- [Test Agent](https://github.com/DataDog/dd-apm-test-agent/) in Docker: We start a test agent in a separate Docker container. This agent collects data (such as spans and traces) submitted by the HTTP servers. It serves as a centralized point for aggregating and accessing test data.
+- Test Execution: The Python test cases use an HTTP client to communicate with the servers. The servers generate data based on the interactions, which is then sent to the test agent. The tests can query the test agent to retrieve data and perform assertions to verify correct behavior.

-- Shared tests are written in Python (pytest).
-- GRPC/HTTP servers for each language are built and run in docker containers.
-- [test agent](https://github.com/DataDog/dd-apm-test-agent/) is started in a container to collect the data from the GRPC servers.
-
-Test cases are written in Python and target the shared GRPC interface. The tests use a GRPC client to query the servers and the servers generate the data which is submitted to the test agent. Test cases can then query the data from the test agent to perform assertions.
-
+This architecture allows us to ensure that all clients conform to the same interface and behavior, making it easier to maintain consistency across different languages and implementations.
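+
+Since the test agent itself speaks HTTP, you can also inspect collected data by hand while debugging; a sketch, assuming a dd-apm-test-agent reachable on localhost:8126 and its `/test/traces` endpoint (both the port and the path are assumptions, check the test agent's documentation):
+
+```python
+import requests
+
+# Each element is a trace: a list of span dicts collected by the test agent.
+traces = requests.get("http://localhost:8126/test/traces").json()
+print(f"collected {len(traces)} traces")
+```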

*(architecture diagram)*

From 13000f14f0914a7b11d255fd823748886da9b3a9 Mon Sep 17 00:00:00 2001
From: Rachel Yang
Date: Tue, 15 Oct 2024 16:29:02 -0400
Subject: [PATCH 07/11] Update docs/scenarios/parametric.md

Co-authored-by: Zachary Groves <32471391+ZStriker19@users.noreply.github.com>
---
 docs/scenarios/parametric.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/scenarios/parametric.md b/docs/scenarios/parametric.md
index cc21236c62..66ddfadc4d 100644
--- a/docs/scenarios/parametric.md
+++ b/docs/scenarios/parametric.md
@@ -309,7 +309,7 @@ See the steps below in the HTTP section to run the Python server and view the sp

#### HTTP

-We have transitioned to using an HTTP interface, replacing the legacy GRPC interface. To view the HTTP interface, follow these steps:
+We have transitioned to using an HTTP interface, replacing the legacy GRPC interface. To view the available HTTP endpoints, follow these steps:

From b840096154f6668a760126fbd77476674972fef1 Mon Sep 17 00:00:00 2001
From: Rachel Yang
Date: Tue, 15 Oct 2024 16:29:15 -0400
Subject: [PATCH 08/11] Update docs/scenarios/parametric.md

Co-authored-by: Zachary Groves <32471391+ZStriker19@users.noreply.github.com>
---
 docs/scenarios/parametric.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/scenarios/parametric.md b/docs/scenarios/parametric.md
index 66ddfadc4d..5be90c66f1 100644
--- a/docs/scenarios/parametric.md
+++ b/docs/scenarios/parametric.md
@@ -197,7 +197,7 @@ echo "ddtrace @ git+https://github.com/DataDog/dd-trace-py.git@<branch>" > binaries/python-load-from-pip

#### NodeJS

-There is three ways for running the NodeJS tests with a custom tracer:
+There are three ways for running the NodeJS tests with a custom tracer:

From d05370bb5de499fa16f7a326b88f27f9f5fd8474 Mon Sep 17 00:00:00 2001
From: Rachel Yang
Date: Tue, 15 Oct 2024 16:30:09 -0400
Subject: [PATCH 09/11] Update docs/scenarios/parametric.md

Co-authored-by: Zachary Groves <32471391+ZStriker19@users.noreply.github.com>
---
 docs/scenarios/parametric.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/scenarios/parametric.md b/docs/scenarios/parametric.md
index 5be90c66f1..b23eb046a7 100644
--- a/docs/scenarios/parametric.md
+++ b/docs/scenarios/parametric.md
@@ -355,7 +355,7 @@ cd utils/parametric

-### Architecture
+### Architecture/How System-tests work
Below is an overview of how the testing architecture is structured:
- Shared Tests in Python: We write shared test cases using Python's pytest framework. These tests are designed to be generic and interact with clients through the HTTP interface.
- HTTP Servers in Docker: For each language client, we build and run an HTTP server within a Docker container.
These servers expose the required endpoints defined in the OpenAPI schema and handle the client-specific logic.
- [Test Agent](https://github.com/DataDog/dd-apm-test-agent/) in Docker: We start a test agent in a separate Docker container. This agent collects data (such as spans and traces) submitted by the HTTP servers. It serves as a centralized point for aggregating and accessing test data.
- Test Execution: The Python test cases use an HTTP client to communicate with the servers. The servers generate data based on the interactions, which is then sent to the test agent. The tests can query the test agent to retrieve data and perform assertions to verify correct behavior.

From 718b4d9046c82b4d3e92c42f6a94b3097f38a743 Mon Sep 17 00:00:00 2001
From: Rachel Yang
Date: Wed, 16 Oct 2024 11:50:15 -0400
Subject: [PATCH 10/11] fixes

---
 docs/execute/binaries.md     |  75 ++++++++++++++++++++--
 docs/scenarios/parametric.md | 120 +++--------------------------------
 2 files changed, 79 insertions(+), 116 deletions(-)

diff --git a/docs/execute/binaries.md b/docs/execute/binaries.md
index a2a5ced6bc..86bc858cd2 100644
--- a/docs/execute/binaries.md
+++ b/docs/execute/binaries.md
@@ -9,7 +9,13 @@ But, obviously, testing validated versions of components is not really interesti

## C++ library

-* Tracer: TODO
+* Tracer:
+There are two ways for running the C++ library tests with a custom tracer:
+1. Create a file `cpp-load-from-git` in `binaries/`. Content examples:
+   * `https://github.com/DataDog/dd-trace-cpp@main`
+   * `https://github.com/DataDog/dd-trace-cpp@<commit_hash>`
+2. Clone the dd-trace-cpp repo inside `binaries`
+
* Profiling: add a ddprof release tar to the binaries folder. Call the `install_ddprof`.

## .Net library

@@ -22,13 +28,42 @@

## Golang library

-1. Add a file `golang-load-from-go-get`, the content will be installed by `go get`. Content example:
-   * `gopkg.in/DataDog/dd-trace-go.v1@master`
+1. To test unmerged PRs locally, run the following in the utils/build/docker/golang/parametric directory:
+
+```sh
+go get -u gopkg.in/DataDog/dd-trace-go.v1@<commit_hash>
+go mod tidy
+```
+
2. Clone the dd-trace-go repo inside `binaries`

## Java library

-1. Add a valid `dd-java-agent-<version>.jar` file in `binaries`. `<version>` must be a valid version number.
+Follow these steps to run Parametric tests with a custom Java Tracer version:
+
+1. Clone the repo and check out the branch you'd like to test:
+```bash
+git clone git@github.com:DataDog/dd-trace-java.git
+cd dd-trace-java
+```
+By default you will be on the `master` branch, but if you'd like to run system-tests on the changes you made to your local branch, `git checkout` to that branch before proceeding.
+
+2. Build Java Tracer artifacts
+```
+./gradlew :dd-java-agent:shadowJar :dd-trace-api:jar
+```
+
+3. Copy both artifacts into the `system-tests/binaries/` folder:
+   * The Java tracer agent artifact `dd-java-agent-*.jar` from `dd-java-agent/build/libs/`
+   * Its public API `dd-trace-api-*.jar` from `dd-trace-api/build/libs/`
+
+Note: you should have only TWO jar files in `system-tests/binaries`. Do NOT copy sources or javadoc jars.
+
+4. Run Parametric tests from the `system-tests/parametric` folder:
+
+```bash
+TEST_LIBRARY=java ./run.sh test_span_sampling.py::test_single_rule_match_span_sampling_sss001
+```

## NodeJS library

## PHP library

-1. Add a valid `.apk` file in `binaries`.
+- Place `datadog-setup.php` and `dd-library-php-[X.Y.Z+commitsha]-aarch64-linux-gnu.tar.gz` (or the `x86_64` if you're not on ARM) in `/binaries` folder
+  - You can download those from the `build_packages/package extension` job artifacts, from a CI run of your branch.
+- Copy them into the `binaries/` folder
+
+##### Then run the tests
+
+From the repo root folder:
+
+- `./build.sh -i runner`
+- `TEST_LIBRARY=php ./run.sh PARAMETRIC` or `TEST_LIBRARY=php ./run.sh PARAMETRIC -k <test>`
+
+> :warning: **If you are seeing DNS resolution issues when running the tests locally**, add the following config to the Docker daemon:
+
+```json
+  "dns-opts": [
+    "single-request"
+  ],
+```

## Python library

2. Add a `.tar.gz` or a `.whl` file in `binaries`, pip will install it
3. Clone the dd-trace-py repo inside `binaries`

+You can also run:
+```bash
+echo "ddtrace @ git+https://github.com/DataDog/dd-trace-py.git@<branch>" > binaries/python-load-from-pip
+```
+
## Ruby library

* Create a file `ruby-load-from-bundle-add` in `binaries/`, the content will be installed by `bundle add`. Content example:
  * `gem 'datadog', git: "https://github.com/Datadog/dd-trace-rb", branch: "master", require: 'datadog/auto_instrument'`

## WAF rule set

* copy a file `waf_rule_set` in `binaries/`
+
+#### After Testing with a Custom Tracer:
+Modifying the binaries will alter the tracer version used across your local computer. Once you're done testing with the custom tracer, ensure you **remove** it by running:
+
+```bash
+rm -rf binaries/python-load-from-pip
+```
+
----

Hint for components that allow having the repo in `binaries`: use the command `mount --bind src dst` to mount your local repo => any build of system-tests will use it.

diff --git a/docs/scenarios/parametric.md b/docs/scenarios/parametric.md
index b23eb046a7..7a924da412 100644
--- a/docs/scenarios/parametric.md
+++ b/docs/scenarios/parametric.md
@@ -101,6 +101,11 @@ The tests are executed using pytest. Below are some common command-line options
TEST_LIBRARY=dotnet ./run.sh PARAMETRIC -k test_metrics_msgpack_serialization_TS001
```

+- `-L`: To specify the library under test via a command-line argument instead of the `TEST_LIBRARY` environment variable:
+```sh
+./run.sh PARAMETRIC -L dotnet -k test_metrics_msgpack_serialization_TS001
+```
+
- `-v`: Increase verbosity. Shows each test name and its result (pass/fail) as they are run.

### Running the tests for a custom tracer

+To run tests against custom tracers, refer to the [Binaries Documentation](../execute/binaries.md).

-#### Go
-
-To test unmerged PRs locally, run the following in the utils/build/docker/golang/parametric directory:
-
-```sh
-go get -u gopkg.in/DataDog/dd-trace-go.v1@<commit_hash>
-go mod tidy
-```
-
-#### dotnet
-
-- Add a file `datadog-dotnet-apm-<version>.tar.gz` in `binaries/`. `<version>` must be a valid version number.
-  - One way to get that file is from an Azure pipeline (either a recent one from master if the changes you want to test were merged recently, or the one from your PR if it's open).
-
-#### Java
-
-Follow these steps to run Parametric tests with a custom Java Tracer version:
-
-1. Clone the repo and check out the branch you'd like to test:
-```bash
-git clone git@github.com:DataDog/dd-trace-java.git
-cd dd-trace-java
-```
-By default you will be on the `master` branch, but if you'd like to run system-tests on the changes you made to your local branch, `git checkout` to that branch before proceeding.
-
-2. Build Java Tracer artifacts
-```
-./gradlew :dd-java-agent:shadowJar :dd-trace-api:jar
-```
-
-3. Copy both artifacts into the `system-tests/binaries/` folder:
-   * The Java tracer agent artifact `dd-java-agent-*.jar` from `dd-java-agent/build/libs/`
-   * Its public API `dd-trace-api-*.jar` from `dd-trace-api/build/libs/`
-
-Note: you should have only TWO jar files in `system-tests/binaries`. Do NOT copy sources or javadoc jars.
-
-4. Run Parametric tests from the `system-tests/parametric` folder:
-
-```bash
-TEST_LIBRARY=java ./run.sh test_span_sampling.py::test_single_rule_match_span_sampling_sss001
-```
-
-#### PHP
-
-##### To run with a custom build
-
-- Place `datadog-setup.php` and `dd-library-php-[X.Y.Z+commitsha]-aarch64-linux-gnu.tar.gz` (or the `x86_64` if you're not on ARM) in `/binaries` folder
-  - You can download those from the `build_packages/package extension` job artifacts, from a CI run of your branch.
-- Copy them into the `binaries/` folder
-
-##### Then run the tests
-
-From the repo root folder:
-
-- `./build.sh -i runner`
-- `TEST_LIBRARY=php ./run.sh PARAMETRIC` or `TEST_LIBRARY=php ./run.sh PARAMETRIC -k <test>`
-
-> :warning: **If you are seeing DNS resolution issues when running the tests locally**, add the following config to the Docker daemon:
-
-```json
-  "dns-opts": [
-    "single-request"
-  ],
-```
-
-#### Python
-
-To run the Python tests against a custom tracer:
-```bash
-echo "ddtrace @ git+https://github.com/DataDog/dd-trace-py.git@<branch>" > binaries/python-load-from-pip
-```
-
-#### NodeJS
-
-There are three ways for running the NodeJS tests with a custom tracer:
-1. Create a file `nodejs-load-from-npm` in `binaries/`, the content will be installed by `npm install`. Content example:
-   - `DataDog/dd-trace-js#master`
-2. Clone the dd-trace-js repo inside `binaries`
-3. Create a file `nodejs-load-from-local` in `binaries/`, this will disable installing with `npm install dd-trace` and
-   will instead get the content of the file, and use it as a location of the `dd-trace-js` repo and then mount it as a
-   volume and `npm link` to it. For instance, if this repo is at the location, you can set the content of this file to
-   `../dd-trace-js`. This also removes the need to rebuild the weblog image since the code is mounted at runtime.
-
-#### Ruby
-
-There are two ways for running the Ruby tests with a custom tracer:
-
-1. Create a file ruby-load-from-bundle-add in binaries/, the content will be installed by bundle add. Content example:
-gem 'datadog', git: "https://github.com/Datadog/dd-trace-rb", branch: "master", require: 'datadog/auto_instrument'
-2. Clone the dd-trace-rb repo inside binaries
-
-#### C++
-
-There are two ways for running the C++ library tests with a custom tracer:
-1. Create a file `cpp-load-from-git` in `binaries/`. Content examples:
-   * `https://github.com/DataDog/dd-trace-cpp@main`
-   * `https://github.com/DataDog/dd-trace-cpp@<commit_hash>`
-2. Clone the dd-trace-cpp repo inside `binaries`
+
+#### After Testing with a Custom Tracer:
+Modifying the binaries will alter the tracer version used across your local computer.
Once you're done testing with the custom tracer, ensure you **remove** it by running:
+
+```bash
+rm -rf binaries/python-load-from-pip
+```

### Understanding the test outcomes
Please refer to this [chart](../execute/test-outcomes.md)

### Debugging

From 1c491d6657b58a8f26b2b19fa4702b68f94549df Mon Sep 17 00:00:00 2001
From: Rachel Yang
Date: Wed, 16 Oct 2024 15:28:12 -0400
Subject: [PATCH 11/11] architecture

---
 docs/scenarios/parametric.md | 17 +++++++++++------
 1 file changed, 11 insertions(+), 6 deletions(-)

diff --git a/docs/scenarios/parametric.md b/docs/scenarios/parametric.md
index 7a924da412..2a4ff76dc0 100644
--- a/docs/scenarios/parametric.md
+++ b/docs/scenarios/parametric.md
@@ -218,9 +218,9 @@ We have transitioned to using an HTTP interface, replacing the legacy GRPC inter

#### Legacy GRPC
-**Important:** The legacy GRPC interface will be **deprecated** and no longer in use. All client interactions and tests will be migrated to use the HTTP interface.
+**Important:** The legacy GRPC interface will be **deprecated** and no longer in use. All tests will be migrated to use the HTTP interface.

-Previously, we used a shared GRPC interface to enable shared testing across different clients. Each client would implement the GRPC interface server, allowing shared tests to be run against the client libraries. The GRPC service definition included methods like StartSpan, FinishSpan, SpanSetMeta, and others, which facilitated span and trace operations.
+Previously, we used a shared GRPC interface to enable shared testing across different tracers. Each tracer would implement the GRPC interface server, allowing shared tests to be run against the libraries. The GRPC service definition included methods like StartSpan, FinishSpan, SpanSetMeta, and others, which facilitated span and trace operations.

@@ -253,14 +253,19 @@ cd utils/parametric

-### Architecture/How System-tests work
+### Architecture: How System-tests work
Below is an overview of how the testing architecture is structured:
-- Shared Tests in Python: We write shared test cases using Python's pytest framework. These tests are designed to be generic and interact with clients through the HTTP interface.
+- Shared Tests in Python: We write shared test cases using Python's pytest framework. These tests are designed to be generic and interact with the tracers through an HTTP interface.
-- HTTP Servers in Docker: For each language client, we build and run an HTTP server within a Docker container. These servers expose the required endpoints defined in the OpenAPI schema and handle the client-specific logic.
+- HTTP Servers in Docker: For each language tracer, we build and run an HTTP server within a Docker container.
These servers expose the required endpoints defined in the OpenAPI schema and handle the tracer-specific logic.
- [Test Agent](https://github.com/DataDog/dd-apm-test-agent/) in Docker: We start a test agent in a separate Docker container. This agent collects data (such as spans and traces) submitted by the HTTP servers. It serves as a centralized point for aggregating and accessing test data.
- Test Execution: The Python test cases use an HTTP client to communicate with the servers. The servers generate data based on the interactions, which is then sent to the test agent. The tests can query the test agent to retrieve data and perform assertions to verify correct behavior.

+An example of how to get a span from the test agent:
+```python
+span = find_only_span(test_agent.wait_for_num_traces(1))
+```
+
This architecture allows us to ensure that all tracers conform to the same interface and behavior, making it easier to maintain consistency across different languages and implementations.

*(architecture diagram)*
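+
+In context, a complete shared test might look roughly like this. This is a sketch only: the fixture and helper names (`test_agent`, `test_library`, `dd_start_span`, `find_only_span`) follow the patterns used in this repo, but the exact import paths and signatures may differ:
+
+```python
+from utils.parametric.spec.trace import find_only_span  # assumed import path
+
+
+def test_span_has_expected_name(test_agent, test_library):
+    # Drive the language-specific HTTP server to create and finish a span.
+    with test_library.dd_start_span(name="web.request", service="svc") as span:
+        span.set_meta("key", "value")
+
+    # Ask the test agent for the single trace the server should have flushed,
+    # then assert on the span it contains.
+    root = find_only_span(test_agent.wait_for_num_traces(1))
+    assert root["name"] == "web.request"
+    assert root["service"] == "svc"
+```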