
feat(profiles): Add a dataset to store profile chunks #5895

Merged — 20 commits merged into master on May 15, 2024

Conversation

@phacops (Contributor) commented May 12, 2024

This will add a new dataset for profile chunks in order to support our continuous profiling feature (receiving profiles chunk by chunk rather than one profile per transaction).

The SDK will start a profiler session identified by a profiler_id, profile, and send chunks containing this profiler ID. It will also tag the spans it collects with this profiler ID.

Later on, to fetch the profile for a span, we'll receive a profiler ID plus the span's start and stop timestamps and, using this dataset, query the chunk IDs necessary to assemble the profile for that span, with a query looking like this:

SELECT
    DISTINCT chunk_id
FROM profile_chunks
WHERE
    project_id = <project_id>
    AND profiler_id = <profiler_id>
    AND start_timestamp <= <span.start_timestamp>
    AND end_timestamp >= <span.end_timestamp>
ORDER BY start_timestamp;

Since all our queries will contain a date range on both start_timestamp and end_timestamp, I think it's useful to have both in the sort key. chunk_id appears there to make the schema work with a ReplacingMergeTree, since it guarantees row uniqueness (even though having 2 chunks for the same profiler session and timestamps would be a bug).
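The lookup above can be sketched in plain Python. The `ProfileChunk` type and the sample rows are hypothetical, just to illustrate the filter, the `DISTINCT`, and the `ORDER BY start_timestamp`:

```python
from dataclasses import dataclass
from datetime import datetime


@dataclass(frozen=True)
class ProfileChunk:
    project_id: int
    profiler_id: str
    chunk_id: str
    start_timestamp: datetime
    end_timestamp: datetime


def chunks_for_span(
    chunks: list[ProfileChunk],
    project_id: int,
    profiler_id: str,
    span_start: datetime,
    span_end: datetime,
) -> list[str]:
    """Mirror the SQL: distinct chunk IDs whose time range covers the
    span, ordered by start_timestamp."""
    matching = sorted(
        (
            c
            for c in chunks
            if c.project_id == project_id
            and c.profiler_id == profiler_id
            and c.start_timestamp <= span_start
            and c.end_timestamp >= span_end
        ),
        key=lambda c: c.start_timestamp,
    )
    seen: set[str] = set()
    result: list[str] = []
    for c in matching:  # emulate SELECT DISTINCT, preserving sort order
        if c.chunk_id not in seen:
            seen.add(c.chunk_id)
            result.append(c.chunk_id)
    return result
```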

We're using the DateTime64 type to be able to store sub-millisecond precision, and we now have 2 different timestamp fields. I add support for that column type in #5896.
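For illustration, here is the precision difference in plain Python: DateTime64(6) keeps microseconds, while ClickHouse's plain DateTime truncates to whole seconds, which would collapse chunk boundaries that land within the same second:

```python
from datetime import datetime

a = datetime(2024, 5, 12, 14, 57, 3, 120000)  # 3.120 s past the minute
b = datetime(2024, 5, 12, 14, 57, 3, 870000)  # 3.870 s past the minute

# With second precision (plain DateTime), both collapse to the same value,
# so the two boundaries become indistinguishable:
assert a.replace(microsecond=0) == b.replace(microsecond=0)

# With DateTime64(6), microseconds survive and ordering is preserved:
assert a < b
```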

This PR is related to https://github.com/getsentry/ops/pull/10526, which adds the necessary config for StorageSetKey.PROFILE_CHUNKS to the profiling cluster in every environment.

github-actions bot commented May 12, 2024

This PR has a migration; here is the generated SQL

-- start migrations

-- forward migration profile_chunks : 0001_create_profile_chunks_table
Local op: CREATE TABLE IF NOT EXISTS profile_chunks_local (project_id UInt64, profiler_id UUID, chunk_id UUID, start_timestamp DateTime64(6) CODEC (DoubleDelta), end_timestamp DateTime64(6) CODEC (DoubleDelta), retention_days UInt16, partition UInt16, offset UInt64) ENGINE ReplicatedReplacingMergeTree('/clickhouse/tables/profile_chunks/{shard}/default/profile_chunks_local', '{replica}') ORDER BY (project_id, profiler_id, start_timestamp, cityHash64(chunk_id)) PARTITION BY (retention_days, toStartOfDay(start_timestamp)) SAMPLE BY cityHash64(chunk_id) TTL toDateTime(end_timestamp) + toIntervalDay(retention_days) SETTINGS index_granularity=8192;
Distributed op: CREATE TABLE IF NOT EXISTS profile_chunks_dist (project_id UInt64, profiler_id UUID, chunk_id UUID, start_timestamp DateTime64(6) CODEC (DoubleDelta), end_timestamp DateTime64(6) CODEC (DoubleDelta), retention_days UInt16, partition UInt16, offset UInt64) ENGINE Distributed(`cluster_one_sh`, default, profile_chunks_local, cityHash64(profiler_id));
-- end forward migration profile_chunks : 0001_create_profile_chunks_table




-- backward migration profile_chunks : 0001_create_profile_chunks_table
Distributed op: DROP TABLE IF EXISTS profile_chunks_dist;
Local op: DROP TABLE IF EXISTS profile_chunks_local;
-- end backward migration profile_chunks : 0001_create_profile_chunks_table
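The PARTITION BY and TTL expressions in the generated DDL can be sketched in plain Python (the helper names are hypothetical, mirroring ClickHouse's toStartOfDay and toIntervalDay):

```python
from datetime import datetime, timedelta


def to_start_of_day(ts: datetime) -> datetime:
    """Mirror ClickHouse toStartOfDay: truncate a timestamp to midnight."""
    return ts.replace(hour=0, minute=0, second=0, microsecond=0)


def ttl_expiry(end_timestamp: datetime, retention_days: int) -> datetime:
    """Mirror TTL toDateTime(end_timestamp) + toIntervalDay(retention_days)."""
    return end_timestamp + timedelta(days=retention_days)


end = datetime(2024, 5, 12, 14, 57, 3, 250000)
print(to_start_of_day(end))  # the row's partition day boundary
print(ttl_expiry(end, 30))   # the row becomes eligible for deletion after this
```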

@phacops force-pushed the pierre/continuous-profiling-dataset-migration branch from 34e88bb to 57c232b (May 12, 2024 14:55)
@phacops changed the base branch from master to pierre/add-datetime64-column-type (May 12, 2024 14:55)
@phacops force-pushed the pierre/continuous-profiling-dataset-migration branch from 57c232b to 81a4671 (May 12, 2024 14:57)
@phacops force-pushed the pierre/continuous-profiling-dataset-migration branch from 0dc7a81 to 45b9c76 (May 12, 2024 19:28)
@phacops force-pushed the pierre/continuous-profiling-dataset-migration branch from 45b9c76 to 8d2643b (May 12, 2024 19:30)
@phacops force-pushed the pierre/continuous-profiling-dataset-migration branch from 8d2643b to b9b425c (May 12, 2024 20:37)
@Zylphrex (Member) left a comment:

This looks good to me 👍

@volokluev (Member) left a comment:

A few questions

Base automatically changed from pierre/add-datetime64-column-type to master May 13, 2024 19:32
codecov bot commented May 14, 2024

Test Failures Detected: Due to failing tests, we cannot provide coverage reports at this time.

❌ Failed Test Results:

Completed 286 tests with 1 failed, 284 passed and 1 skipped.

Failed test:
- Testsuite: pytest
- Test name: tests.test_metrics_summaries_api.TestMetricsSummariesApi::test_basic_query
- Envs: default
Traceback (most recent call last):
File ".../snuba/tests/test_metrics_summaries_api.py", line 125, in test_basic_query
assert len(data["data"]) == 1, data
AssertionError: {'data': [], 'experiments': {}, 'meta': [{'name': 'project_id', 'type': 'UInt64'}, {'name': 'metric_mri', 'type': 'Str...rray(String)'}], 'profile': {'blocks': 0, 'bytes': 0, 'elapsed': 0.0032491683959960938, 'progress_bytes': 0, ...}, ...}
assert 0 == 1
+ where 0 = len([])

@phacops requested a review from viglia (May 14, 2024 14:57)
@enochtangg (Member) left a comment:

MigrationGroup.PROFILE_CHUNKS: _MigrationGroup(
loader=ProfileChunksLoader(),
storage_sets_keys={StorageSetKey.PROFILE_CHUNKS},
readiness_state=ReadinessState.PARTIAL,
Member commented:
Does the ClickHouse cluster already exist in SaaS (US + DE) and S4S? If the table is supposed to live on the profiles cluster, has the storage_set been added there? If this isn't done yet, we should set the readiness state to ReadinessState.LIMITED for now so that GoCD doesn't try to run these migrations on a cluster that doesn't exist.
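For reference, the suggested change is the registration quoted above with only the readiness state swapped (a config sketch, not a tested diff):

```
MigrationGroup.PROFILE_CHUNKS: _MigrationGroup(
    loader=ProfileChunksLoader(),
    storage_sets_keys={StorageSetKey.PROFILE_CHUNKS},
    readiness_state=ReadinessState.LIMITED,  # until the cluster config lands in every environment
),
```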

@phacops (Contributor, Author) commented May 14, 2024:

The ClickHouse cluster we'd use exists in all environments (us, de, s4s, all STs).

I also have https://github.com/getsentry/ops/pull/10526 to configure it for all environments.

Member: +1

Member: which cluster are you using @phacops?

@phacops (Contributor, Author): The cluster dedicated to profiling.

@phacops requested a review from enochtangg (May 14, 2024 17:25)
@phacops merged commit 6dfa3a7 into master on May 15, 2024
30 checks passed
@phacops deleted the pierre/continuous-profiling-dataset-migration branch (May 15, 2024 20:44)
@getsentry-bot (Contributor): PR reverted: 64c044e

getsentry-bot added a commit that referenced this pull request May 15, 2024
This reverts commit 6dfa3a7.

Co-authored-by: volokluev <3169433+volokluev@users.noreply.github.com>
6 participants