
low-latency-playback-over-gaps__t2.html fails in all 8 TVs in London plugfest #152

Open
jpiesing opened this issue Mar 15, 2024 · 27 comments
Labels
Blocker (Issue must be resolved for V1 launch), London 2024 Plugfest (Issues from London plugfest in Feb/Mar 2024), Release V2 (Deferred to Release V2)


@jpiesing

jpiesing commented Mar 15, 2024

In the London plugfest, low-latency-playback-over-gaps__t2.html failed the following 2 observations on 5 or 6 of the 8 TV sets and failed to execute to completion on two more.

  • [OF] Every video frame S[k,s] shall be rendered and the video frames shall be rendered in increasing presentation time order.
  • [OF] Video: The presented sample shall match the one reported by the currentTime value within the tolerance of +/-(2/framerate + 20ms)
@jpiesing jpiesing added the Release V2 (Deferred to Release V2) and London 2024 Plugfest (Issues from London plugfest in Feb/Mar 2024) labels Mar 15, 2024
@jpiesing jpiesing changed the title from "low-latency-playback-over-gaps__t2.html fails in 6 of 8 TVs in London plugfest and failed in 2 more" to "low-latency-playback-over-gaps__t2.html fails in all 8 TVs in London plugfest" Mar 15, 2024
@jpiesing
Author

Not resuming playback after the gap should be reported as failure rather than TIMEOUT.

@FritzHeiden
Collaborator

Not resuming playback after the gap should be reported as failure rather than TIMEOUT.

How should we detect this error? As this is unwanted behavior rather than an actual error thrown by the application, we would have to throw the error ourselves so it appears correctly in the results. Here is what comes to mind: when the play position is close enough to the beginning of the gap and playback remains in the waiting state for a certain time, throw an error.
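
A rough sketch of what I have in mind (video is the media element under test; gapStartSec, TOLERANCE_SEC, STALL_TIMEOUT_MS and failTest are placeholders, not existing runner APIs):

```js
// Sketch only: the constants and failTest() stand in for whatever the
// runner already provides.
const gapStartSec = 5.0;        // presentation time where the gap begins
const TOLERANCE_SEC = 0.2;      // "close enough" to the start of the gap
const STALL_TIMEOUT_MS = 10000; // how long the waiting state may last

let stallTimer = null;

video.addEventListener("waiting", () => {
  if (Math.abs(video.currentTime - gapStartSec) > TOLERANCE_SEC) return;
  // Playback stalled at the gap: if it has not resumed after the timeout,
  // report an explicit failure instead of letting the harness time out.
  stallTimer = setTimeout(() => {
    failTest(new Error("Playback did not resume after the gap"));
  }, STALL_TIMEOUT_MS);
});

video.addEventListener("playing", () => {
  clearTimeout(stallTimer);
  stallTimer = null;
});
```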

What do you think?

@jpiesing
Author

The failure of currentTime is fixed by cta-wave/device-playback-task-force#123.

The failure of 'Every video frame' has the following error on TV 2.

FAIL: First frame found is 7, expected to start from 1. First frame number tolerance is 0. Mid frame number tolerance is 10. Frames out of order 368, 251. Last frame detected before gap 120 is within the tolerance of 'stall_tolerance_margin'=7.5 frames of expected frame 125. Total of missing frames is 6.

@yanj-github What is "Frames out of order"? How should that be interpreted?

@yanj-github
Contributor

@jpiesing It means frame 251 was detected after frame 368. I would go back to the recording, navigate to frame 368 and go frame by frame. If you have debug logs or a recording that you can share with me, I can have a look.
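
To illustrate how I read the result (this is not the actual Observation Framework code): given the frame numbers recognised from the recording in capture order, a "frame out of order" is a frame number lower than one already seen.

```js
// Illustration only: detectedFrames is the list of frame numbers recognised
// from the recording, in capture order.
function findOutOfOrderFrames(detectedFrames) {
  const outOfOrder = [];
  for (let i = 1; i < detectedFrames.length; i++) {
    if (detectedFrames[i] < detectedFrames[i - 1]) {
      // e.g. [..., 368, 251, ...] reports the pair (368, 251)
      outOfOrder.push([detectedFrames[i - 1], detectedFrames[i]]);
    }
  }
  return outOfOrder;
}
```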

@jpiesing
Author

@louaybassbouss @FritzHeiden Please see my email about sharing recordings from London with Resillion. Both TVs 2 and 4 have this issue.

@louaybassbouss
Collaborator

@louaybassbouss @FritzHeiden Please see my email about sharing recordings from London with Resillion. Both TVs 2 and 4 have this issue.

Upload in progress. Will send details to @yanj-github.

@jpiesing jpiesing added the Blocker (Issue must be resolved for V1 launch) label Apr 18, 2024
@jpiesing
Author

I've labelled this as "Blocker" on the basis that we want to include this test in the release.
If this turns out to be more complex than expected then we can drop the test from the release and remove "Blocker".

@yanj-github
Contributor

cta-wave/device-playback-task-force#125
I need help on this please.

@jpiesing
Author

cta-wave/device-playback-task-force#126 also applies here

@yanj-github
Contributor

yanj-github commented Apr 30, 2024

cta-wave/device-playback-task-force#125

The failure of 'Every video frame' has the following error on TV 2.

FAIL: First frame found is 7, expected to start from 1. First frame number tolerance is 0. Mid frame number tolerance is 10. Frames out of order 368, 251. Last frame detected before gap 120 is within the tolerance of 'stall_tolerance_margin'=7.5 frames of expected frame 125. Total of missing frames is 6.

@yanj-github What is "Frames out of order"? How should that be interpreted?

Frames out of order caused by duplication:
What I observed from the recording is that playback started from the beginning and stopped at frame 120, then jumped to frame 251 and played until 368, then jumped back to frame 251 and played until the end. The duration from frame 251 to 368 is repeated.

Will this be a test runner or a device issue? The playback contains a duplicated duration, from frame 251 to frame 368 in this example.

@FritzHeiden
Collaborator

Frames out of order caused by duplication: What I observed from the recording is that playback started from the beginning and stopped at frame 120, then jumped to frame 251 and played until 368, then jumped back to frame 251 and played until the end. The duration from frame 251 to 368 is repeated.

Will this be a test runner or a device issue? The playback contains a duplicated duration, from frame 251 to frame 368 in this example.

This is also what I noticed when looking at the recordings. As this test runs properly in the browser and on other TVs, I don't believe it's a test runner issue. Here is my guess at what is happening: when reaching the beginning of the gap, the test should wait for a specific time before skipping to the end of the gap. Some devices skip over this gap automatically and continue playing the video. As soon as the waiting time is over, the test skips to the end of the gap, which in this case results in a seek backwards.
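
In pseudo-code, the sequence I suspect looks roughly like this (placeholder names, not the actual test code):

```js
// Sketch of the suspected sequence; gapStartSec, gapEndSec and waitMs are
// placeholders for the values the test actually uses.
async function handleGap(video, gapStartSec, gapEndSec, waitMs) {
  // The test expects the device to stall at gapStartSec, because no data
  // has been appended for the gap yet.
  await new Promise((resolve) => setTimeout(resolve, waitMs));

  // On a conforming device currentTime is still ~gapStartSec here. A device
  // that skipped the gap on its own has already played past gapEndSec, so
  // this assignment seeks *backwards* and the range between gapEndSec and
  // the current position (frames 251..368 in the recording) is played twice.
  video.currentTime = gapEndSec;
}
```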

@jpiesing
Author

jpiesing commented May 6, 2024

Frames out of order caused by duplication: What I observed from the recording is that playback started from the beginning and stopped at frame 120, then jumped to frame 251 and played until 368, then jumped back to frame 251 and played until the end. The duration from frame 251 to 368 is repeated.
Will this be a test runner or a device issue? The playback contains a duplicated duration, from frame 251 to frame 368 in this example.

This is also what I noticed when looking at the recordings. As this test runs properly in the browser and on other TVs, I don't believe it's a test runner issue. Here is my guess at what is happening: when reaching the beginning of the gap, the test should wait for a specific time before skipping to the end of the gap. Some devices skip over this gap automatically and continue playing the video. As soon as the waiting time is over, the test skips to the end of the gap, which in this case results in a seek backwards.

Is it reasonable to make the test JS code detect devices that skip over the gap automatically and report a FAIL at that point?

@FritzHeiden
Collaborator

Is it reasonable to make the test JS code detect devices that skip over the gap automatically and report a FAIL at that point?

Implemented with #176
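
For reference, a check along these lines could look like the sketch below (hypothetical names, not the code from #176):

```js
// Hypothetical sketch: while the test is waiting at the gap, currentTime
// must not advance past it.
function assertDeviceWaitsAtGap(video, gapStartSec, toleranceSec, failTest) {
  const poll = setInterval(() => {
    if (video.currentTime > gapStartSec + toleranceSec) {
      clearInterval(poll);
      failTest(new Error("Device skipped over the gap instead of waiting"));
    }
  }, 100);
  return () => clearInterval(poll); // call this once the test seeks past the gap
}
```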

@jpiesing
Author

jpiesing commented May 7, 2024

Both TVs 2 and 4 have the 'frames out of order' issue, which we believe is due to an incorrect implementation.
TV 7 had a TIMEOUT.
How do we move this test forwards?

@yanj-github
Contributor

@jpiesing if the fixes are done on the test runner, I think we need a new recording and a new run with the OF. Maybe we cannot release this if we are unable to re-take the recording?

@FritzHeiden
Collaborator

We may run this test on our selection of TVs in the lab and see if the frames out of order error still occurs, or at least is replaced by the new error from the test runner

@jpiesing
Author

jpiesing commented May 8, 2024

We may run this test on our selection of TVs in the lab and see if the frames out of order error still occurs, or at least is replaced by the new error from the test runner

If the frames out of order error does not occur, then we can consider this resolved.

@jpiesing
Author

2024-05-14: Wait to see the results from the Fraunhofer zoo check and close as discussed in the two previous comments.

@louaybassbouss
Collaborator

These are the results on 3 different TV sets after adding a new assertion to the test to make sure that the video remains in the waiting state. @yanj-github I will share the 3 recordings with you.

[Three screenshots of the test results, one per TV set]

@yanj-github
Contributor

@louaybassbouss thanks. I will check the recordings once I have got them.
From the results, it seems to me that all the tests ended at frame 251 and the playback only lasted around 10 minutes. I don't think this is what we expected. I'd expect the playback to start from frame 1, play to around frame 125, and then play from 251 to 750.

@louaybassbouss
Collaborator

@yanj-github I just sent the recordings

@yanj-github
Contributor

@louaybassbouss from looking at the recordings, the issue is caused by early reporting of status=finished.
The OF uses status=finished to determine the end of the test and make its observations. I noticed the status changed to finished at frame 251. Can the test runner keep playing until the test actually finishes, please?
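
Something along these lines is what I mean (sketch only; sendStatus is a stand-in for however the runner reports its state to the OF):

```js
// Sketch only: report status=finished once the media element has actually
// finished playing, not when the last append/seek has been issued.
function reportFinishedWhenPlaybackEnds(video, sendStatus) {
  video.addEventListener(
    "ended",
    () => sendStatus("finished"),
    { once: true }
  );
}
```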

@louaybassbouss
Collaborator

Thanks @yanj-github, we will check. @FritzHeiden Please check the status reporting of the finished state.

@yanj-github
Contributor

@jpiesing and @louaybassbouss The spec states two different modes, live and VOD. I can see that we only have live mode testing; the VOD mode test is missing.

@jpiesing
Author

I think there are two cases which are not properly reflected in the DPCTF spec.

  • Simulated VOD, where the app carries on appending data after the gap without waiting, i.e. when the player gets to the gap, there is already data appended after the gap, and
  • Simulated live, where the app delays appending data after the gap until the duration of the gap is over, i.e. when the player gets to the gap, there is no data after the gap.

I'm not sure whether what we currently have is one of these or something in between.
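
A rough MSE sketch of the two cases, just to pin down the difference (all names and timings are placeholders, not the current test code):

```js
// Sketch only: segments is the list of media segments with the gap omitted,
// gapIndex is the index of the first segment after the gap.
async function appendWithGap(sourceBuffer, segments, gapIndex, gapDurationMs, mode) {
  for (let i = 0; i < segments.length; i++) {
    if (i === gapIndex && mode === "simulated-live") {
      // Simulated live: hold back the post-gap data until the gap duration
      // has elapsed, so the player reaches the gap with nothing appended
      // beyond it. Simulated VOD appends immediately instead.
      await new Promise((resolve) => setTimeout(resolve, gapDurationMs));
    }
    sourceBuffer.appendBuffer(segments[i]);
    await new Promise((resolve) =>
      sourceBuffer.addEventListener("updateend", resolve, { once: true })
    );
  }
}
```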

@FritzHeiden
Collaborator

@louaybassbouss from looking at the recordings, the issue is caused by early reporting of status=finished. The OF uses status=finished to determine the end of the test and make its observations. I noticed the status changed to finished at frame 251. Can the test runner keep playing until the test actually finishes, please?

This is fixed with #178

@jpiesing
Author

2024-05-28: This remains open as the DPCTF spec needs updating.
