
Enhance Server-Timing implementation to support new Chrome DevTools Performance Panel integration #1578

Draft · wants to merge 4 commits into base branch trunk

Conversation

felixarntz (Member)

Summary

Fixes #

Relevant technical choices

  • Allows accessing the $start_value property (formerly called $before_value) of Server-Timing metrics via new methods.
  • Modifies that $start_value property to now store values in milliseconds instead of seconds. This is fine because the value could never be set from outside the class, so we have full control over it.
  • Introduces new metric methods measure_start() and measure_end(), replacing the now-deprecated measure_before() and measure_after() (mostly in accordance with the better naming encouraged by the new Chrome integration).
  • Updates relevant default metrics to use the new methods.
  • Automatically outputs non-standard metrics response-start and response-end, which are used to provide timing context to the browser so that the browser can display Server-Timing metrics correctly in relation to the other client-side metrics. This is relevant as the server-side clock may be off in comparison to the browser clock.
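For illustration, the headers produced with output buffering enabled might look roughly like this (the wp-total name and all timestamp values are invented for this sketch; per the approach described above, the two bracket metrics carry only the non-standard start parameter, expressed in milliseconds):

```
Server-Timing: response-start;start=1727000000000.0
Server-Timing: wp-total;dur=52.0
Server-Timing: response-end;start=1727000000123.0
```

Because the server-side clock can drift relative to the browser clock, DevTools uses the two bracket metrics to place the server-side entries correctly among the client-side timeline events.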

Testing

  • Use Chrome 129+ Beta.
  • Go to DevTools > Settings > Experiments > Performance panel and enable server timings in the timeline.
  • Open the DevTools Performance panel and record a session including the request where the Server-Timing header with the metrics is added.

felixarntz added the labels [Type] Enhancement, [Plugin] Performance Lab, and no milestone on Oct 4, 2024.
@swissspidy (Member)

@felixarntz we already have #1487 for this

@felixarntz (Member, Author) commented Oct 4, 2024

@swissspidy Oops, I must have missed that.

Looks like we did a few things differently though, so worth comparing the two approaches. We can either align in this PR or the other one, that doesn't really matter.

Biggest differences I see:

Let me know your thoughts.

@swissspidy (Member)

> My rationale for the latter was that they don't have the wp- prefix and that they only have the start parameter. Since start is a non-standard parameter, I think we shouldn't allow to have only start for regular metrics.

That's something that can be addressed with a little validation. And even without validation, nothing breaks and a developer can fix this easily once they realize their Server-Timing header is not as expected.

> Continuing to enforce the wp- prefix for all registered metrics also means nobody can (intentionally or accidentally) override response-start and response-end.

I'm actually not a huge fan of this enforcement. I think there shouldn't be any prefix by default, so that core can use wp-, and plugins can use their own prefix like woo- if they want to.

> Related to that, #1487 accounts for the way these two special metrics work in the handling for registered metrics, while this PR only adds support for optionally including the start parameter for each metric, but doesn't change anything else.

Both PRs have special handling for response-start and response-end in one form or another.

If we drop the prefix requirement, then the special handling in my POC goes away entirely.

> #1487 always registers response-start and response-end, while this PR only does it when output buffering is enabled.
> I went with the latter because without using output buffering, response-end is not the actual end of the response. I'm thinking we shouldn't send this in that scenario, as it would be inaccurate.

That's an easy fix. My PR was a quick POC, I simply didn't test with output buffering disabled.

> #1487 adds support for the desc parameter, which this PR doesn't.
> I'm happy to include this, but didn't include it here because it's an unrelated change, as desc was already a standard parameter and isn't needed to support the new DevTools feature.

It's not required, but it is used. Actually, after my recommendation, the team added support for desc. The description will be shown in the bottom drawer when the metric in question is selected in the timeline.

So the change is very much related.

Of course it can also be added in a separate PR and merged independently of the rest, as it is a useful change by itself.
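For illustration, a metric carrying the standard desc parameter might look like this (the metric name and values are invented for the sketch); DevTools shows the description in the bottom drawer when the entry is selected in the timeline:

```
Server-Timing: wp-template;dur=12.3;desc="Template rendering"
```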

> This PR introduces measure_start() and measure_end() to better align with the DevTools naming and deprecates the existing "before" and "after" methods.

This one I see as an unrelated change. This can be done in a separate PR.

> This PR does more validation in the metric class.

Nothing that can't be solved in the other PR. Though personally I'm wary of adding too much validation and _doing_it_wrong noise.

@felixarntz (Member, Author) commented Oct 11, 2024

We probably need to discuss the path forward further. Some of your replies look reasonable to me, while I disagree with others.

A few notes:

  • It's a fair point that we should reconsider enforcement of the wp- prefix. I'm not sure we should just allow passing through any name; we should probably still encourage names to have a unique prefix. For example, the prefix could be a separate parameter, so that if you don't pass one, you get wp-, but you could pass another prefix, at which point it would be clear that you explicitly considered this. Just thinking out loud.
    • Some context: The main reason I initially implemented the wp- prefix enforcement is to avoid conflicts with other Server-Timing metrics that could be added independently of WordPress. I don't mean e.g. by other plugins, but by other server-side infrastructure, part of the server stack itself. My rationale was that of course plugins should also use their own prefix, but that it would then still be e.g. wp-my-plugin-my-metric, to keep all WordPress created Server-Timing metrics separate from Server-Timing metrics that come from outside WordPress. Still worth considering for this conversation.
  • I disagree that we should allow metrics to be used that only have a start value, because this is against the specification of Server-Timing. The only exceptions to this are the non-standard response-start and response-end metrics, but as they are clearly non-standard they should not influence what is possible for regular metrics. That's why I think hard-coding these two metrics outside the registration system is the right path.
  • I think the other points are more minor technicalities that we can figure out when aligning in the code. But those two points above we may want to discuss before proceeding.
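To sketch the prefix idea above (all metric names here are hypothetical): with a separate prefix parameter, a core metric and a WooCommerce metric could be emitted as

```
Server-Timing: wp-total;dur=52.0
Server-Timing: woo-checkout;dur=12.4
```

whereas under the current enforcement the plugin metric would come out as wp-woo-checkout, keeping every WordPress-generated metric separate from metrics added elsewhere in the server stack.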

@swissspidy (Member)

> I disagree that we should allow metrics to be used that only have a start value,

That's why I said it's something that can be addressed with a little validation. My PR was a quick POC, I didn't spend much time on edge cases like this.

> because this is against the specification of Server-Timing.

Technically this is not part of the Server-Timing specification. It's part of a non-standard extension that isn't in any specification. The spec only knows dur, not start or end.

@felixarntz (Member, Author)

>> I disagree that we should allow metrics to be used that only have a start value,

> That's why I said it's something that can be addressed with a little validation. My PR was a quick POC, I didn't spend much time on edge cases like this.

👍

>> because this is against the specification of Server-Timing.

> Technically this is not part of the Server-Timing specification. It's part of a non-standard extension that isn't in any specification. The spec only knows dur, not start or end.

Exactly. I think we're saying the same thing.
