Test Python 3.12 support for release version 0.5.0 #48

Closed
wants to merge 11 commits
5 changes: 3 additions & 2 deletions .github/workflows/build-test.yml
@@ -5,9 +5,9 @@ name: Build and Test

on:
push:
branches: [ "main" ]
branches: [ "main", "release/*" ]
pull_request:
branches: [ "main" ]
branches: [ "main", "release/*" ]

permissions:
contents: read
@@ -24,6 +24,7 @@ jobs:
- "3.9"
- "3.10"
- "3.11"
- "3.12"

steps:
- uses: actions/checkout@v3
38 changes: 38 additions & 0 deletions .github/workflows/check-codegen.yml
@@ -0,0 +1,38 @@
# This workflow will delete and regenerate the opentelemetry marshaling code using scripts/proto_codegen.sh.
# If generating the code produces any changes from what is currently checked in, the workflow will fail and prompt the user to regenerate the code.
# For more information see: https://help.github.com/actions/language-and-framework-guides/using-python-with-github-actions

name: Check Codegen

on:
push:
branches: [ "main" ]
paths:
- "scripts/**"
- "src/snowflake/telemetry/_internal/opentelemetry/proto/**"
- "src/snowflake/telemetry/serialize/**"
- ".github/workflows/check-codegen.yml"
pull_request:
branches: [ "main" ]
paths:
- "scripts/**"
- "src/snowflake/telemetry/_internal/opentelemetry/proto/**"
- "src/snowflake/telemetry/serialize/**"
- ".github/workflows/check-codegen.yml"

jobs:
check-codegen:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Set up Python
uses: actions/setup-python@v3
with:
python-version: "3.11"
- name: Run codegen script
run: |
rm -rf src/snowflake/telemetry/_internal/opentelemetry/proto/
./scripts/proto_codegen.sh
- name: Check for changes
run: |
git diff --exit-code || { echo "Code generation produced changes! Regenerate the code using ./scripts/proto_codegen.sh"; exit 1; }
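The drift check above hinges on `git diff --exit-code` exiting non-zero when the working tree differs from the last commit. A minimal, self-contained sketch of that pattern in a throwaway repository (the file name `gen.py` is a hypothetical stand-in for generated code, not a path from this repo):

```shell
# Sketch of the workflow's drift check: commit a "generated" file, then
# observe git diff --exit-code succeed while clean and fail after a change.
set -eu
tmp=$(mktemp -d)
cd "$tmp"
git init -q
echo "generated v1" > gen.py
git add gen.py
git -c user.email=ci@example.com -c user.name=ci commit -qm "check in generated code"
git diff --exit-code >/dev/null && echo "clean"            # exit 0: nothing drifted
echo "generated v2" > gen.py                               # simulate codegen changing output
git diff --exit-code >/dev/null || echo "drift detected"   # exit 1: this is what fails the job
```

Note that `git diff` only reports changes to tracked files, which is why the workflow deletes and regenerates a directory that is already checked in rather than generating into a fresh location.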
32 changes: 32 additions & 0 deletions .github/workflows/check-vendor.yml
@@ -0,0 +1,32 @@
# This workflow will delete and regenerate the opentelemetry-exporter-otlp-proto-common code using scripts/vendor_otlp_proto_common.sh.
# If generating the code produces any changes from what is currently checked in, the workflow will fail and prompt the user to regenerate the code.
# For more information see: https://help.github.com/actions/language-and-framework-guides/using-python-with-github-actions

name: Check OTLP Proto Common Vendored Code

on:
push:
branches: [ "main" ]
paths:
- "scripts/vendor_otlp_proto_common.sh"
- "src/snowflake/telemetry/_internal/opentelemetry/exporter/**"
- ".github/workflows/check-vendor.yml"
pull_request:
branches: [ "main" ]
paths:
- "scripts/vendor_otlp_proto_common.sh"
- "src/snowflake/telemetry/_internal/opentelemetry/exporter/**"
- ".github/workflows/check-vendor.yml"

jobs:
check-vendor:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Run vendor script
run: |
rm -rf src/snowflake/telemetry/_internal/opentelemetry/exporter/
./scripts/vendor_otlp_proto_common.sh
- name: Check for changes
run: |
git diff --exit-code || { echo "Vendoring produced changes! Regenerate the code using ./scripts/vendor_otlp_proto_common.sh"; exit 1; }
5 changes: 5 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,10 @@
# Release History

## Unreleased

* Upgrade OpenTelemetry Python dependencies to version 1.26.0
* Vendored in the adapter code from the opentelemetry-exporter-otlp-proto-common package and replaced the protobuf dependency with custom vanilla Python serialization

## 0.5.0 (2024-07-23)

* Set empty resource for Python OpenTelemetry config.
420 changes: 420 additions & 0 deletions NOTICE.txt

Large diffs are not rendered by default.

8 changes: 8 additions & 0 deletions README.md
@@ -22,6 +22,8 @@ pip install --upgrade pip
pip install .
```

## Development

To develop this package, run

```bash
@@ -33,3 +35,9 @@ source .venv/bin/activate
pip install --upgrade pip
pip install . ./tests/snowflake-telemetry-test-utils
```

### Code generation

To regenerate the code under `src/snowflake/_internal/opentelemetry/proto/`, execute the script `./scripts/proto_codegen.sh`. The script expects the `src/snowflake/_internal/opentelemetry/proto/` directory to exist, and will delete all .py files in it before regenerating the code.

The commit/branch/tag of [opentelemetry-proto](https://github.com/open-telemetry/opentelemetry-proto) that the code is generated from is pinned to PROTO_REPO_BRANCH_OR_COMMIT, which can be configured in the script. It is currently pinned to the same tag as [opentelemetry-python](https://github.com/open-telemetry/opentelemetry-python/blob/main/scripts/proto_codegen.sh#L15).
7 changes: 3 additions & 4 deletions anaconda/meta.yaml
@@ -1,6 +1,6 @@
package:
name: snowflake_telemetry_python
version: "0.5.0"
version: "0.6.0.dev"

source:
path: {{ environ.get('SNOWFLAKE_TELEMETRY_DIR') }}
@@ -11,9 +11,8 @@ requirements:
- setuptools >=40.0.0
run:
- python
- opentelemetry-api ==1.23.0
- opentelemetry-exporter-otlp-proto-common ==1.23.0
- opentelemetry-sdk ==1.23.0
- opentelemetry-api ==1.26.0
- opentelemetry-sdk ==1.26.0

about:
home: https://www.snowflake.com/
86 changes: 86 additions & 0 deletions benchmark/benchmark_serialize.py
@@ -0,0 +1,86 @@
import google_benchmark as benchmark

from util import get_logs_data, get_metrics_data, get_traces_data, get_logs_data_4MB

from snowflake.telemetry._internal.opentelemetry.exporter.otlp.proto.common._log_encoder import encode_logs
from snowflake.telemetry._internal.opentelemetry.exporter.otlp.proto.common.metrics_encoder import encode_metrics
from snowflake.telemetry._internal.opentelemetry.exporter.otlp.proto.common.trace_encoder import encode_spans

from opentelemetry.exporter.otlp.proto.common._log_encoder import encode_logs as pb2_encode_logs
from opentelemetry.exporter.otlp.proto.common.metrics_encoder import encode_metrics as pb2_encode_metrics
from opentelemetry.exporter.otlp.proto.common.trace_encoder import encode_spans as pb2_encode_spans

"""
------------------------------------------------------------------------------
Benchmark Time CPU Iterations
------------------------------------------------------------------------------
test_bm_serialize_logs_data_4MB 730591536 ns 730562298 ns 1
test_bm_pb2_serialize_logs_data_4MB 702522039 ns 702490893 ns 1
test_bm_serialize_logs_data 100882 ns 100878 ns 6930
test_bm_pb2_serialize_logs_data 97112 ns 97109 ns 7195
test_bm_serialize_metrics_data 114938 ns 114934 ns 6096
test_bm_pb2_serialize_metrics_data 161849 ns 161845 ns 4324
test_bm_serialize_traces_data 123977 ns 123973 ns 5633
test_bm_pb2_serialize_traces_data 131016 ns 131011 ns 5314
"""

def sanity_check():
logs_data = get_logs_data()
metrics_data = get_metrics_data()
traces_data = get_traces_data()

assert encode_logs(logs_data).SerializeToString() == pb2_encode_logs(logs_data).SerializeToString()
assert encode_metrics(metrics_data).SerializeToString() == pb2_encode_metrics(metrics_data).SerializeToString()
assert encode_spans(traces_data).SerializeToString() == pb2_encode_spans(traces_data).SerializeToString()

@benchmark.register
def test_bm_serialize_logs_data_4MB(state):
logs_data = get_logs_data_4MB()
while state:
encode_logs(logs_data).SerializeToString()

@benchmark.register
def test_bm_pb2_serialize_logs_data_4MB(state):
logs_data = get_logs_data_4MB()
while state:
pb2_encode_logs(logs_data).SerializeToString()

@benchmark.register
def test_bm_serialize_logs_data(state):
logs_data = get_logs_data()
while state:
encode_logs(logs_data).SerializeToString()

@benchmark.register
def test_bm_pb2_serialize_logs_data(state):
logs_data = get_logs_data()
while state:
pb2_encode_logs(logs_data).SerializeToString()

@benchmark.register
def test_bm_serialize_metrics_data(state):
metrics_data = get_metrics_data()
while state:
encode_metrics(metrics_data).SerializeToString()

@benchmark.register
def test_bm_pb2_serialize_metrics_data(state):
metrics_data = get_metrics_data()
while state:
pb2_encode_metrics(metrics_data).SerializeToString()

@benchmark.register
def test_bm_serialize_traces_data(state):
traces_data = get_traces_data()
while state:
encode_spans(traces_data).SerializeToString()

@benchmark.register
def test_bm_pb2_serialize_traces_data(state):
traces_data = get_traces_data()
while state:
pb2_encode_spans(traces_data).SerializeToString()

if __name__ == "__main__":
sanity_check()
benchmark.main()
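The custom serializer benchmarked here hand-encodes the protobuf wire format in pure Python (the "vanilla Python serialization" from the changelog). As a hypothetical illustration of the technique — not the package's actual implementation — a minimal varint and length-delimited field encoder:

```python
def encode_varint(value: int) -> bytes:
    """Encode a non-negative int as a protobuf base-128 varint."""
    out = bytearray()
    while True:
        byte = value & 0x7F
        value >>= 7
        if value:
            out.append(byte | 0x80)  # set continuation bit: more bytes follow
        else:
            out.append(byte)
            return bytes(out)

def encode_string_field(field_number: int, text: str) -> bytes:
    """Encode a string field: key varint (field number + wire type), length, payload."""
    payload = text.encode("utf-8")
    key = (field_number << 3) | 2  # wire type 2 = length-delimited
    return encode_varint(key) + encode_varint(len(payload)) + payload

# 300 encodes as 0xAC 0x02; field 1 holding "a" encodes as 0x0A 0x01 0x61
print(encode_varint(300).hex(), encode_string_field(1, "a").hex())
```

Composing encoders like these per message field avoids the protobuf runtime dependency entirely, at the cost of maintaining the generated encoding code — which is what the Check Codegen workflow in this PR guards.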