feat(trace): implement otlp http backend #7

Merged 4 commits on Dec 11, 2024 (changes shown from 1 commit)
3 changes: 2 additions & 1 deletion .dockerignore
@@ -11,7 +11,8 @@ temp
Dockerfile
coverage
.husky
.env.testing
.env.testing.local
.env.testing.docker
.env
.travis.yml
scripts
7 changes: 4 additions & 3 deletions .env.testing → .env.testing.docker
@@ -1,17 +1,18 @@
PORT=3000
PORT=4318
AUTH_KEY=valid-api-key
FASTIFY_BODY_LIMIT=10485760

REDIS_URL=redis://redis:6379/0
MONGODB_URL=mongodb://mongo:mongo@mongo:27017
DATA_EXPIRATION_IN_DAYS=7

POST_TEST=4002

MLFLOW_API_URL=http://mlflow:8080/
MLFLOW_AUTHORIZATION=BASE_AUTH
MLFLOW_USERNAME=admin
MLFLOW_PASSWORD=password
MLFLOW_DEFAULT_EXPERIMENT_ID=0
MLFLOW_TRACE_DELETE_IN_BATCHES_CRON_PATTERN="0 */1 * * * *"
MLFLOW_TRACE_DELETE_IN_BATCHES_BATCH_SIZE=100

BEE_FRAMEWORK_INSTRUMENTATION_METRICS_ENABLED=false
BEE_FRAMEWORK_INSTRUMENTATION_ENABLED=true
4 changes: 4 additions & 0 deletions .env.testing.local
@@ -0,0 +1,4 @@
BEE_FRAMEWORK_INSTRUMENTATION_METRICS_ENABLED=false
BEE_FRAMEWORK_INSTRUMENTATION_ENABLED=true
PORT=4318
AUTH_KEY=valid-api-key
38 changes: 38 additions & 0 deletions .github/workflows/main.yml
@@ -0,0 +1,38 @@
name: Lint, Build, Test

on:
  push:
    branches: ['main']
  pull_request:
    branches: ['main']

concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true

jobs:
  main:
    timeout-minutes: 20
    name: Lint & Build & Test
Contributor: nitpick: I don't see any test :D

    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0
      - name: Enable Corepack
        run: corepack enable
      - name: Use Node.js
        uses: actions/setup-node@v4
        with:
          node-version-file: '.nvmrc'
          cache: 'yarn'
      - name: Install dependencies
        run: yarn install
      - name: Code Lint
        run: yarn lint
      - name: Code Format
        run: yarn format
      - name: Commits Lint
        run: yarn commitlint --verbose --from "${{ github.event.pull_request.base.sha || github.event.commits[0].id }}" --to "${{ github.event.pull_request.head.sha || github.event.head_commit.id }}"
      - name: Build
        run: yarn build
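
Following up on the contributor nitpick above, the job currently has no test step. A hedged sketch of what one could look like once tests are wired into CI; the `yarn test` script name is an assumption, not something this PR defines:

```
      # Hypothetical step: run the test suite once it is available in CI.
      # The integration tests described in the README need a running Observe
      # instance, so a real step may require `docker compose` setup first.
      - name: Test
        run: yarn test
```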
3 changes: 2 additions & 1 deletion CONTRIBUTING.md
@@ -44,7 +44,8 @@ yarn install
5. **Set up environment variables:** You should set the following variables in your `.env` file in the repository’s root.

- copy [.env.example](./.env.example) and fill in the right values.
- The [.env.testing](./.env.testing) is used for integration API testing in docker
- The [.env.testing.local](./.env.testing.local) is used directly in the vitest runtime for all tests running locally.
- The [.env.testing.docker](./.env.testing.docker) is used for integration API testing in Docker.

6. **Run migrations** You should run database migrations to prepare the necessary data for a valid application run.

16 changes: 11 additions & 5 deletions README.md
@@ -22,11 +22,12 @@
- [Running the server via Docker](#running-the-server-via-docker)
- [Running the Server via Node.js](#running-the-server-via-nodejs)
- [Development Mode](#development-mode)
4. [🧪 Run tests](#-run-tests)
5. [📣 Publishing](#-publishing)
6. [Code of conduct](#code-of-conduct)
7. [Legal notice](#legal-notice)
8. [📖 Docs](#-docs)
4. [🪑 Setup with OpenTelemetry](#-setup-with-opentelemetry)
5. [🧪 Run tests](#-run-tests)
6. [📣 Publishing](#-publishing)
7. [Code of conduct](#code-of-conduct)
8. [Legal notice](#legal-notice)
9. [📖 Docs](#-docs)

## 👩‍💻 Get started with Observe

@@ -131,6 +132,10 @@ For development mode with hot-reloading, use:
yarn dev
```

### 🪑 Setup with OpenTelemetry

See the [Using Bee Observe OTLP Backend with OpenTelemetry](./docs/using-with-opentelemetry.md) section for instructions on how to use the Observe backend with the OpenTelemetry stack.

### 🧪 Run tests

> This project uses integration API tests that require a running observe instance to be executed.
@@ -160,3 +165,4 @@ Read all related document pages carefully to understand the Observer API architecture.
- [Overview](./docs/overview.md)
- [API](./docs/api.md)
- [Data persistence](./docs/data-persistence.md)
- [Using Bee Observe OTLP Backend with OpenTelemetry](./docs/using-with-opentelemetry.md)
8 changes: 4 additions & 4 deletions compose.yml
@@ -7,11 +7,11 @@ services:
GIT_TAG: ${TAG:-testing}
BUILD_DATETIME: ${BUILD_DATETIME:-}
ports:
- '${OBSERVE_API_EXPOSED_PORT:-4002}:3000'
- '${OBSERVE_API_EXPOSED_PORT:-4318}:4318'
env_file:
- .env.testing
- .env.testing.docker
healthcheck:
test: wget --no-verbose --tries=1 --spider http://0.0.0.0:3000/health || exit 1
test: wget --no-verbose --tries=1 --spider http://0.0.0.0:4318/health || exit 1
interval: 10s
timeout: 5s
retries: 5
@@ -24,5 +24,5 @@ services:
[
'sh',
'-c',
"while ! curl --silent --fail http://observe_api:3000/health; do echo 'Waiting for API...'; sleep 5; done"
"while ! curl --silent --fail http://observe_api:4318/health; do echo 'Waiting for API...'; sleep 5; done"
]
106 changes: 106 additions & 0 deletions docs/using-with-opentelemetry.md
@@ -0,0 +1,106 @@
# Using Bee Observe OTLP Backend with OpenTelemetry

This documentation explains how to configure the OpenTelemetry Node.js SDK and the OpenTelemetry Collector to send telemetry data to your custom OTLP backend.

## Limitations

> We support only the OpenTelemetry exporter with HTTP/protobuf.

For JavaScript, you can use the `@opentelemetry/exporter-trace-otlp-proto` package.

> We support only the `v1/traces` endpoint.

## 1. Using the Custom Backend with the OpenTelemetry Node.js SDK

The OpenTelemetry Node.js SDK can send traces directly to your custom OTLP backend.

Install the necessary OpenTelemetry dependencies:

```
npm install @opentelemetry/sdk-node @opentelemetry/exporter-trace-otlp-proto @opentelemetry/semantic-conventions
```

Configure the SDK to export telemetry data:

```
import '@opentelemetry/instrumentation/hook.mjs';
import { NodeSDK, resources } from '@opentelemetry/sdk-node';
import { ATTR_SERVICE_NAME, ATTR_SERVICE_VERSION } from '@opentelemetry/semantic-conventions';
import { OTLPTraceExporter } from '@opentelemetry/exporter-trace-otlp-proto';

export const sdk = new NodeSDK({
  resource: new resources.Resource({
    [ATTR_SERVICE_NAME]: 'your-service-name',
    [ATTR_SERVICE_VERSION]: '0.0.1' // Load the version of your service
  }),
  traceExporter: new OTLPTraceExporter({
    url: 'http://<custom-backend-host>:4318/v1/traces', // Custom backend endpoint for traces
    headers: {
      'x-bee-authorization': '<auth-key>' // Custom auth key
    },
    timeoutMillis: 120_000
  })
});

// Start the SDK (NodeSDK.start() is synchronous in recent SDK versions)
try {
  sdk.start();
  console.log('OpenTelemetry SDK started');
} catch (error) {
  console.error('Error starting OpenTelemetry SDK', error);
}
```

> See the full example [here](https://opentelemetry.io/docs/languages/js/exporters/#otlp).
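
If the process can exit while spans are still buffered, it also helps to shut the SDK down gracefully; a minimal sketch (the signal handling below is illustrative, not part of this repository):

```
// Flush buffered spans before the process exits (illustrative sketch).
process.on('SIGTERM', async () => {
  try {
    await sdk.shutdown();
    console.log('OpenTelemetry SDK shut down');
  } catch (error) {
    console.error('Error shutting down OpenTelemetry SDK', error);
  } finally {
    process.exit(0);
  }
});
```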

### Key Points

- Replace `<custom-backend-host>` with your custom OTLP backend's hostname or IP address.
- Replace `<auth-key>` with the value defined by the `AUTH_KEY` environment variable.
- The exporter uses the OTLP protocol over HTTP/protobuf.
- Ensure your application can connect to the backend on port 4318.
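
A quick way to verify that the backend is reachable on that port (the `/health` route below is the same one probed by this repository's `compose.yml`):

```
curl --silent --fail http://<custom-backend-host>:4318/health
```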

## 2. Using the Custom Backend with the OpenTelemetry Collector

The OpenTelemetry Collector acts as an intermediary, allowing more flexibility in processing and routing telemetry data to your custom backend. See the [Collector documentation](https://opentelemetry.io/docs/collector/).

### Collector Configuration

Create a configuration file (`otel-collector-config.yaml`) with the following content:

```
receivers:
  otlp:
    protocols:
      http: # Listening for OTLP data over HTTP
        endpoint: "0.0.0.0:4318"

exporters:
  otlphttp:
    endpoint: "http://<custom-backend-host>:4318" # Custom backend endpoint
    compression: none
    headers:
      content-type: "application/x-protobuf"
      x-bee-authorization: '<auth-key>' # Custom auth key

processors:
  batch:
    timeout: 5s
    send_batch_size: 512

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [otlphttp]
```

### Steps to Configure

- Replace `<custom-backend-host>` with the hostname or IP of your custom backend.
- Replace `<auth-key>` with the value defined by the `AUTH_KEY` environment variable.
- Ensure the `otlphttp` exporter points to `http://<custom-backend-host>:4318`.

### Start the Collector

Run the collector with this configuration file to route telemetry data to the custom backend.
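
One common way to do that is with the official Collector container image; a minimal sketch, assuming the `otel/opentelemetry-collector-contrib` image and its default config path (neither is pinned by this project):

```
docker run --rm \
  -v "$(pwd)/otel-collector-config.yaml:/etc/otelcol-contrib/config.yaml" \
  -p 4318:4318 \
  otel/opentelemetry-collector-contrib:latest
```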
17 changes: 14 additions & 3 deletions package.json
@@ -6,7 +6,8 @@
"description": "Observability API server for bee-agent-framework",
"type": "module",
"scripts": {
"build": "rm -rf dist && tsc",
"build": "rm -rf dist && tsc && cp -R src/protos dist/protos",
"proto:generate": "proto-loader-gen-types --defaults --oneofs --longs=Number --enums=String --grpcLib=@grpc/grpc-js --outDir=./src/types/generated ./src/protos/*.proto && node scripts/add_js_extensions.js",
Contributor: This is hacky, we already have a proper setup for generating proto types in the framework using tsup, please copy it from there: https://github.com/i-am-bee/bee-agent-framework/blob/main/scripts/ibm_vllm_generate_protos

"start:infra": "docker compose -f compose-before.yml up -d mongo redis mlflow",
"start:dev": "tsx watch ./src/index.ts | pino-pretty --singleLine",
"stop:infra": "docker compose -f compose-before.yml down",
@@ -30,34 +31,42 @@
"@commitlint/config-conventional": "^19.5.0",
"@commitlint/types": "^19.5.0",
"@fastify/type-provider-json-schema-to-ts": "^3.0.0",
"@opentelemetry/exporter-trace-otlp-proto": "^0.55.0",
"@opentelemetry/instrumentation": "^0.55.0",
"@opentelemetry/sdk-node": "^0.55.0",
"@opentelemetry/semantic-conventions": "^1.28.0",
"@release-it/conventional-changelog": "^8.0.1",
"@types/dotenv-safe": "^8.1.6",
"@types/node": "^20.16.5",
"@typescript-eslint/eslint-plugin": "^7.13.0",
"@typescript-eslint/parser": "^7.13.0",
"@vitest/coverage-v8": "^1.6.0",
"bee-agent-framework": "^0.0.45",
"eslint": "^8.56.0",
"eslint-config-prettier": "^9.1.0",
"eslint-plugin-import": "^2.29.1",
"eslint-plugin-prettier": "^5.1.3",
"globals": "^15.5.0",
"husky": "^9.0.11",
"lint-staged": "^15.2.7",
"ollama": "^0.5.10",
"pino-pretty": "^11.1.0",
"prettier": "^3.3.2",
"release-it": "^17.6.0",
"ts-node": "^10.9.2",
"tsx": "^4.11.0",
"typescript": "^5.4.5",
"typescript-eslint": "^7.13.0",
"vitest": "^1.6.0"
"vitest": "^2.1.3"
},
"dependencies": {
"@fastify/auth": "^4.6.1",
"@fastify/request-context": "^5.1.0",
"@fastify/swagger": "^8.14.0",
"@fastify/swagger-ui": "^3.0.0",
"@godaddy/terminus": "^4.12.1",
"@grpc/grpc-js": "^1.12.2",
"@grpc/proto-loader": "^0.7.13",
"@mikro-orm/cli": "6.2.9",
"@mikro-orm/core": "6.2.9",
"@mikro-orm/migrations-mongodb": "6.2.9",
@@ -71,7 +80,9 @@
"http-status-codes": "^2.3.0",
"ioredis": "^5.4.1",
"json-schema-to-ts": "^3.1.0",
"pino": "^9.2.0"
"pino": "^9.2.0",
"protobufjs": "^7.4.0",
"typescript-json-schema": "^0.65.1"
},
"resolutions": {
"@types/node": "20.16.5"
43 changes: 43 additions & 0 deletions scripts/add_js_extensions.js
@@ -0,0 +1,43 @@
/**
 * Copyright 2024 IBM Corp.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

import fs from 'fs';
import path from 'path';
import { fileURLToPath } from 'url';

const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);

const directory = path.resolve(__dirname, '../src/types/generated');

async function addJsExtensions(dir) {
Contributor: not necessary with a proper tsup setup (see framework)

  const files = await fs.promises.readdir(dir);
  for (const file of files) {
    const filePath = path.join(dir, file);
    const stat = await fs.promises.lstat(filePath);
    if (stat.isDirectory()) {
      await addJsExtensions(filePath);
    } else if (file.endsWith('.ts')) {
      let content = await fs.promises.readFile(filePath, 'utf8');
      content = content.replace(/(from\s+['"](\.?\.\/[^'"]+))(?<!\.js)(?=['"])/g, '$1.js');
      await fs.promises.writeFile(filePath, content, 'utf8');
    }
  }
}

addJsExtensions(directory)
  .then(() => console.log('Added .js extensions to imports in generated TypeScript files.'))
  .catch(console.error);
4 changes: 2 additions & 2 deletions scripts/copyright.sh
@@ -24,9 +24,9 @@ if ! command -v nwa &> /dev/null && command -v go &> /dev/null; then
fi

if command -v nwa &> /dev/null; then
nwa add -l apache -c "IBM Corp." scripts executor test src examples
nwa add -l apache -c "IBM Corp." scripts/**/*.{js,sh} src/**/*.{ts,js}
elif command -v docker &> /dev/null; then
docker run -it -v "${PWD}:/src" ghcr.io/b1nary-gr0up/nwa:main add -l apache -c "IBM Corp." scripts executor test src examples
docker run -it -v "${PWD}:/src" ghcr.io/b1nary-gr0up/nwa:main add -l apache -c "IBM Corp." scripts src
else
echo "Error: 'nwa' is not available. Either install it manually or install go/docker."
exit 1
2 changes: 1 addition & 1 deletion scripts/publish.sh
@@ -1,4 +1,3 @@
#!/bin/bash
# Copyright 2024 IBM Corp.
#
# Licensed under the Apache License, Version 2.0 (the "License");
@@ -12,6 +11,7 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

set -e

REPOSITORY="docker.io"