diff --git a/src/docs/product/session-replay/performance-overhead.mdx b/src/docs/product/session-replay/performance-overhead.mdx
index bbe8811df5f38..efffcd87db841 100644
--- a/src/docs/product/session-replay/performance-overhead.mdx
+++ b/src/docs/product/session-replay/performance-overhead.mdx
@@ -10,7 +10,7 @@ Session Replay works by observing and recording changes to your web application'
 
 **For most web applications, the performance overhead of our client SDK is imperceptible to end-users.**
 
-Performance overhead depends on the complexity of your application. Applications with a large DOM and numerous DOM node mutations will have a higher overhead than a more simpler, mostly static site. The only way to get accurate overhead metrics is to measure it yourself. We have written a blog post, [Measuring Session Replay Overhead](https://sentry.engineering/blog/measuring-session-replay-overhead), that outlines how you can get started measuring overhead of Replay on your applications without deploying to production. We measured the overhead of the Replay SDK on Sentry's web UI using the methodology from the blog. Here are the results (median values are shown):
+Performance overhead depends on the complexity of your application. Applications with a large DOM and numerous DOM node mutations will have a higher overhead than a simpler, mostly static site. The only way to get accurate overhead metrics is to measure them yourself. We have written a blog post, [Measuring Session Replay Overhead](https://sentry.engineering/blog/measuring-session-replay-overhead), that outlines how you can get started measuring the overhead of Replay on your applications without deploying to production. We measured the overhead of the Replay SDK on Sentry's web UI using the methodology from the blog post. Here are the results (median values are shown):
 
 | metric | Without Sentry | Sentry SDK only | Sentry + Replay SDK |
 | -------------------------------- | -------------- | --------------- | ------------------- |
@@ -23,13 +23,14 @@ Performance overhead depends on the complexity of your applications
 | Network Upload | 21 B | 3.79 KB | 392.98 KB |
 | Network Download | 7.11 MB | 6.93 MB | 6.87 MB |
 
-
-\*: The standard deviation for LCP was 386, 511, 354 ms respectively, meaning
+*: The standard deviation for LCP was 386, 511, and 354 ms, respectively, meaning
 that the LCP values are quite spread out and explains why the only-Sentry LCP
 value is higher than Sentry with Replay.
+
+The benchmarks were run on an Apple M1 MacBook Pro against a remote preview server with a remote API backend, using 100 iterations. The scenario can be summarized as: load Sentry, navigate to Discover, add 4 columns, wait for results, add another column, and wait for results a second time. This benchmark exercises a rather strenuous recording scenario, as the Discover data table is one of our most complex views in terms of DOM nodes and mutations. A simpler scenario, consisting of navigating to four different "Settings" pages, produced an increase of ~100 ms in total JS blocking time.
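
The paragraph added in the first hunk points readers at measuring Replay overhead on their own application. As a minimal sketch of how the two configurations in the table (with and without the Replay SDK) can be produced from the same build, the Replay integration can be toggled behind a flag, assuming a recent `@sentry/browser` release that exposes the integration as `Sentry.replayIntegration()`; `ENABLE_REPLAY` is a hypothetical, bundler-injected flag, not a Sentry option.

```javascript
import * as Sentry from "@sentry/browser";

// Minimal sketch: toggle Session Replay behind a flag so the same page can be
// benchmarked once with and once without the Replay integration.
// ENABLE_REPLAY is a hypothetical flag (for example injected by your bundler);
// it is not a Sentry SDK option.
const ENABLE_REPLAY = process.env.ENABLE_REPLAY === "true";

Sentry.init({
  dsn: "___PUBLIC_DSN___",
  integrations: ENABLE_REPLAY ? [Sentry.replayIntegration()] : [],
  // Record every session while benchmarking so the "with Replay" run is
  // always capturing, rather than relying on sampling.
  replaysSessionSampleRate: ENABLE_REPLAY ? 1.0 : 0,
  replaysOnErrorSampleRate: 0,
});
```

Running the benchmark scenario once per configuration and comparing metrics such as LCP and total JS blocking time gives a like-for-like view of the overhead.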
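
The benchmark results above are reported as total JS blocking time. A rough sketch of one way to approximate that number in the page itself, using the Long Tasks API, is shown below; it is an illustrative stand-in, not the harness from the blog post.

```javascript
// Rough sketch: approximate total JS blocking time by summing the portion of
// every long task that exceeds the 50 ms threshold.
const BLOCKING_THRESHOLD_MS = 50;
let totalBlockingTimeMs = 0;

const observer = new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    totalBlockingTimeMs += Math.max(0, entry.duration - BLOCKING_THRESHOLD_MS);
  }
});

// `buffered: true` also delivers long tasks that finished before the observer
// was created, so early page-load work is counted as well.
observer.observe({ type: "longtask", buffered: true });

// A benchmark harness can read the running total at the end of a scenario,
// for example via page.evaluate() in Playwright or Puppeteer.
window.getTotalBlockingTime = () => Math.round(totalBlockingTimeMs);
```

A harness along the lines of the 100-iteration run described above would call `getTotalBlockingTime()` at the end of each scenario and take the median across runs.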