How to determine the intended frame rate of content playing in an HTML video element


This article looks at how to determine the intended frame rate of content playing in an HTML video element. It should be a useful reference if you are facing the same problem.

Problem description




Is there a way to determine the intended frame rate of content playing in the HTML video element?

Does the video element even know the intended FPS or frame count, or does it simply "guess" (maybe 24fps) and play at the guessed speed?

Here are my unsuccessful attempts:

  • Look for an FPS or FrameCount property on the video element itself -- Nope!

  • Look for cross-video-format header info about FPS or FrameCount--Nothing consistent!

  • Look for an event that is triggered upon frame changing--Nope!

My next attempt is more complicated: Sample the video by capturing frames to a canvas element and count frames by determining when the pixels change.

Does anyone have a simpler answer before I do the complicated attempt?
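
For illustration, here is a rough sketch of that sampling idea, assuming a playing <video id="v"> element; the 32x32 thumbnail size and the 5-second measurement window are arbitrary choices for the example, and what it estimates is how often the displayed pixels change, which may differ from the encoded frame rate:

```javascript
// Hypothetical sketch of the "count frames by pixel changes" idea.
// Assumes a playing <video id="v"> element; id and sizes are example values.
const video = document.getElementById('v');
const canvas = document.createElement('canvas');
canvas.width = 32;           // downscale: comparing a tiny thumbnail is much cheaper
canvas.height = 32;
const ctx = canvas.getContext('2d', { willReadFrequently: true });

let previous = null;
let changes = 0;
const startTime = performance.now();

function sample(now) {
  ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
  const current = ctx.getImageData(0, 0, canvas.width, canvas.height).data;
  if (previous) {
    // Count a "new frame" whenever any pixel differs from the last sample.
    for (let i = 0; i < current.length; i++) {
      if (current[i] !== previous[i]) { changes++; break; }
    }
  }
  previous = current;
  if (now - startTime < 5000) {
    requestAnimationFrame(sample);
  } else {
    // Rough estimate: distinct frame changes seen divided by elapsed seconds.
    console.log('estimated displayed fps ~', changes / ((now - startTime) / 1000));
  }
}
requestAnimationFrame(sample);
```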

Solution

Knowing the frame rate of the video wouldn't be as useful as you might think.
Browsers use some tricks to match the frame rate of the movie to the refresh rate of the screen, so if you look at the currentTime property, you'll see that the actual frame duration (== currentTime - previous currentTime) is not constant; it varies from frame to frame.

On this sample video: http://jsfiddle.net/3qs46n4z/3/ the pattern is:
4-1-5-1:
4 frames at 21.3 ms + 1 frame at 32 ms + 5 frames at 21.3 ms + 1 frame at 32 ms.
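
A minimal sketch that makes this visible, assuming a playing <video id="v"> element (the id is just an example): it polls currentTime on every rAF and logs each frame duration in milliseconds.

```javascript
// Log the duration of each displayed frame by polling currentTime on rAF.
const video = document.getElementById('v');
let lastTime = -1;

function probe() {
  const t = video.currentTime;
  if (t !== lastTime) {
    if (lastTime >= 0) {
      // Duration of the frame that just ended, in milliseconds.
      console.log('frame duration:', ((t - lastTime) * 1000).toFixed(1), 'ms');
    }
    lastTime = t;
  }
  requestAnimationFrame(probe);
}
requestAnimationFrame(probe);
```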

So if you want to always display the latest frame on a canvas while avoiding overdraw, the solution might be to:
- On each rAF, look at the current time of the video:
  • Same? -> do nothing.
  • Changed? -> update the frame.

And whatever you want to do, comparing two currentTime values (== two numbers) will likely be faster than comparing two imageDatas ;-)
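
A sketch of that polling approach, assuming a playing <video id="v"> and a <canvas id="c"> element (both ids are assumptions): the canvas is redrawn only when currentTime has moved on.

```javascript
// Poll with rAF and only redraw the canvas when currentTime has changed.
const video = document.getElementById('v');
const canvas = document.getElementById('c');
const ctx = canvas.getContext('2d');

let lastDrawnTime = -1;

function tick() {
  const t = video.currentTime;
  if (t !== lastDrawnTime) {         // changed? -> a new frame is being shown
    ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
    lastDrawnTime = t;               // same next time? -> do nothing
  }
  requestAnimationFrame(tick);
}
requestAnimationFrame(tick);
```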

Edit: looking into the specification to find evidence for my claim, I found a nuance in this note:

 Which frame in a video stream corresponds to a particular playback position is defined by the video stream's format.

(Note in section 4.8.6 of http://www.w3.org/TR/2011/WD-html5-20110113/video.html)

So strictly speaking, we can only say that (the current time is the same) implies (the frame is the same).
I can only bet that the implication also runs the other way, i.e. a different time means a different frame.
In the example above, Chrome tries to match the 24 Hz movie to my 60 Hz monitor by aiming for 45 Hz (= 60/2 + 60/4), the closest it can get to 48 = 2 * 24. For the 21 extra frames created each second, I don't know whether it interpolates or merely duplicates frames. It surely varies with the browser/device (GPU especially). I bet any recent desktop or powerful smartphone does interpolate.

Anyway, given the high cost of checking with imageData, you'd be much better off drawing twice than checking.

Rq1: I wonder to what extent using XOR + testing against 0, 32 bits at a time, could speed up the comparison. (getImageData is slow.)
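
A possible sketch of that idea: view two same-sized ImageData buffers as 32-bit words (one word per RGBA pixel) and bail out at the first non-zero XOR. The function name framesDiffer is made up for the example.

```javascript
// Compare two same-sized ImageData objects 32 bits (one RGBA pixel) at a time.
function framesDiffer(frameA, frameB) {
  const a = new Uint32Array(frameA.data.buffer);
  const b = new Uint32Array(frameB.data.buffer);
  for (let i = 0; i < a.length; i++) {
    if ((a[i] ^ b[i]) !== 0) return true;  // any non-zero XOR means a changed pixel
  }
  return false;
}
```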

Rq2: I'm sure there's a hacky way to use the playback rate to 'sync' the video and the display, and to know which frames are genuine (== not interpolated) frames. (So two passes here: 1) sync, 2) rewind and retrieve the frames.)

Rq3: If your purpose is to get each and every video frame, and only the video's frames, a browser is not the way to go. As explained above, (desktop) browsers interpolate to match the display frame rate as closely as possible, and those frames were not in the original stream. There are even some high-end '3D' (2D + time) interpolation devices where the initial frames are not even meant to be displayed (!). On the other hand, if you are okay with the (interpolated) output stream, polling on rAF will provide every frame that you see (you can't miss a frame, except obviously when your app is busy doing something else).

Rq4: Interpolation (== no duplicated frames) is 99.99% likely on a recent/decent GPU-powered desktop.

Rq5: Be sure to warm up your functions (call them 100 times at startup) and to create no garbage, to avoid JIT/GC pauses.
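
A sketch of what that might look like, with made-up names (grabFrame, a 320x180 scratch canvas, a <video id="v"> element): the canvas, context and capture function are created once and reused, and the hot path is run 100 times once the video can play.

```javascript
// Reuse one canvas and one context instead of allocating inside the rAF loop.
const scratchCanvas = document.createElement('canvas');
scratchCanvas.width = 320;
scratchCanvas.height = 180;
const scratchCtx = scratchCanvas.getContext('2d', { willReadFrequently: true });

function grabFrame(video) {
  // No per-call allocations apart from the ImageData that getImageData returns.
  scratchCtx.drawImage(video, 0, 0, scratchCanvas.width, scratchCanvas.height);
  return scratchCtx.getImageData(0, 0, scratchCanvas.width, scratchCanvas.height);
}

// Warm-up: run the hot path many times at startup so the JIT has optimized it
// before the measurements you actually care about.
const video = document.getElementById('v');
video.addEventListener('canplay', () => {
  for (let i = 0; i < 100; i++) grabFrame(video);
}, { once: true });
```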

That concludes this article on determining the intended frame rate of content playing in an HTML video element. We hope the answer above is helpful.
