4th May 2013, 15:08 | #1 | Link |
Practising Schemer
Join Date: Feb 2008
Location: Newcastle, Australia
Posts: 791
ORBX.js HD Codec
Saw this Brendan Eich article https://brendaneich.com/2013/05/today-i-saw-the-future/
It claims that this JS library has:
Quote:
25% better compression than H.264
5th May 2013, 00:54 | #2 | Link |
Registered User
Join Date: Mar 2004
Posts: 1,142
Nobody knows until they release the encoder for us to test out and compare with x264. I highly doubt it will be 25% better than x264; possibly it's an H.264 codec from a commercial company, who knows. We also have no info yet as to whether it violates any MPEG-LA patents. It is an interesting development, though, and one that could turn out to be important.
5th May 2013, 04:56 | #3 | Link |
Practising Schemer
Join Date: Feb 2008
Location: Newcastle, Australia
Posts: 791
Brendan Eich doesn't put much FUD out there, though he gets things wrong like the rest of us. Obviously he's looking at this from a Mozilla and free-web perspective, wanting to make the browser the whole platform.
5th May 2013, 07:29 | #4 | Link |
Registered Developer
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,366
Personally, if someone claims not only to have developed a platform that runs a video decoder inside the browser, but also a codec which beats H.264 hands down, the only way they can convince me is by giving me stuff to play with.
So, until we have independent testing of these claims, I'll consider it hype. You know how such comparisons always use the H.264 reference encoder, so theirs looks better? Real-world comparisons against real-world encoders, and we'll talk!
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
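For reference, the kind of real-world x264 invocation such a comparison calls for looks something like this (a hedged illustration; the bitrate and filenames are placeholders, not settings from any published test):
Code:
# Two-pass, rate-matched x264 encode for an apples-to-apples comparison
x264 --preset slow --tune psnr --bitrate 2000 --pass 1 -o /dev/null source.y4m
x264 --preset slow --tune psnr --bitrate 2000 --pass 2 -o out.264 source.y4m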
5th May 2013, 15:38 | #5 | Link |
Registered User
Join Date: Apr 2002
Posts: 756
On the issue of patents (correct me if I am wrong), I think this JavaScript codec avoids the patent problem by not providing binaries, much like how x264 provides source code but not compiled binaries, since US courts ruled some time ago that distributing code counts as free speech.
And it would be great if that were the future, but the problem is still battery: until there is special hardware that can run JavaScript fast enough to decode this while retaining ultra-low power consumption, it won't work out. Note: even with asm.js running within roughly 1.2x of native-code speed, it would still be far from the power efficiency of a dedicated H.264 decoder. I truly hope that someday the H.265 patent holders will collect their payments only through hardware and leave the software side completely free.
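To illustrate the asm.js point: the typed-array heap and |0 coercions are what give the JIT enough type information to compile such code close to native speed. A minimal sketch (the module and function names are made up; this is not ORBX.js code):
Code:
// Illustrative asm.js-style module, not actual ORBX.js code.
function ClampModule(stdlib, foreign, heap) {
    "use asm";
    var HEAPU8 = new stdlib.Uint8Array(heap); // shared scratch buffer

    // Clamp len bytes starting at off to the 16..235 video range.
    function clampRange(off, len) {
        off = off | 0;              // |0 annotates an int32 for the compiler
        len = len | 0;
        var i = 0, v = 0;
        for (i = 0; (i | 0) < (len | 0); i = (i + 1) | 0) {
            v = HEAPU8[(off + i) >> 0] | 0;
            if ((v | 0) < 16) v = 16;
            if ((v | 0) > 235) v = 235;
            HEAPU8[(off + i) >> 0] = v;
        }
    }
    return { clampRange: clampRange };
}

// Usage (the heap must be a power-of-two-sized ArrayBuffer):
// var mod = ClampModule(window, null, new ArrayBuffer(1 << 20));
// mod.clampRange(0, 1 << 20);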
6th May 2013, 17:16 | #6 | Link |
Moderator
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,852
Ars Technica had a good, reasonably skeptical article: http://arstechnica.com/information-technology/2013/05/are-video-codecs-written-in-javascript-really-the-future/
This seems of unlikely value to me. According to the article, it won't even support P or B frames on most browsers right now. And I'm not really sure that something which requires WebGL to be interesting really counts as "JavaScript." I'd think that most devices with hardware good enough for JS+WebGL have a hardware H.264 decoder ASIC, and probably hardware encode as well. Unless they are doing something CRAZY, there's no way they're getting 25% more efficiency at similar wattage! I expect that "25%" example is probably a case where they didn't configure the H.264 encode correctly. I can't tell you at how many NAB booths I've tweaked x264 command lines to get "apples-to-apples" comparisons.
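On the WebGL point: ORBX.js internals aren't public, but the usual division of labor in JS video decoders is to do the bitstream decoding in JavaScript and hand the YUV planes to a WebGL fragment shader for color conversion and display. A generic sketch of that half, assuming 4:2:0 planar frames with even dimensions (all names here are illustrative):
Code:
// Generic JS+WebGL YUV renderer sketch; not actual ORBX.js code.
function createYuvRenderer(canvas) {
    var gl = canvas.getContext('webgl');
    var vsSrc =
        'attribute vec2 pos; varying vec2 uv;' +
        // Flip Y because image rows arrive top-down.
        'void main() { uv = vec2(pos.x, -pos.y) * 0.5 + 0.5;' +
        '  gl_Position = vec4(pos, 0.0, 1.0); }';
    var fsSrc =
        'precision mediump float; varying vec2 uv;' +
        'uniform sampler2D texY, texU, texV;' +
        'void main() {' +
        // BT.601 limited-range YUV -> RGB
        '  float y = 1.164 * (texture2D(texY, uv).r - 0.0625);' +
        '  float u = texture2D(texU, uv).r - 0.5;' +
        '  float v = texture2D(texV, uv).r - 0.5;' +
        '  gl_FragColor = vec4(y + 1.596 * v,' +
        '                      y - 0.391 * u - 0.813 * v,' +
        '                      y + 2.018 * u, 1.0); }';
    function compile(type, src) {
        var s = gl.createShader(type);
        gl.shaderSource(s, src);
        gl.compileShader(s);
        return s;
    }
    var prog = gl.createProgram();
    gl.attachShader(prog, compile(gl.VERTEX_SHADER, vsSrc));
    gl.attachShader(prog, compile(gl.FRAGMENT_SHADER, fsSrc));
    gl.linkProgram(prog);
    gl.useProgram(prog);
    // Full-screen quad
    gl.bindBuffer(gl.ARRAY_BUFFER, gl.createBuffer());
    gl.bufferData(gl.ARRAY_BUFFER,
        new Float32Array([-1, -1, 1, -1, -1, 1, 1, 1]), gl.STATIC_DRAW);
    var loc = gl.getAttribLocation(prog, 'pos');
    gl.enableVertexAttribArray(loc);
    gl.vertexAttribPointer(loc, 2, gl.FLOAT, false, 0, 0);
    gl.pixelStorei(gl.UNPACK_ALIGNMENT, 1); // planes are tightly packed
    // One single-channel texture per plane (U/V are quarter size for 4:2:0).
    ['texY', 'texU', 'texV'].forEach(function (name, i) {
        gl.activeTexture(gl.TEXTURE0 + i);
        gl.bindTexture(gl.TEXTURE_2D, gl.createTexture());
        gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
        gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
        gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
        gl.uniform1i(gl.getUniformLocation(prog, name), i);
    });
    // Upload one decoded frame (three Uint8Array planes) and draw it.
    return function drawFrame(y, u, v, w, h) {
        [[y, w, h], [u, w / 2, h / 2], [v, w / 2, h / 2]]
            .forEach(function (plane, i) {
                gl.activeTexture(gl.TEXTURE0 + i);
                gl.texImage2D(gl.TEXTURE_2D, 0, gl.LUMINANCE,
                    plane[1], plane[2], 0,
                    gl.LUMINANCE, gl.UNSIGNED_BYTE, plane[0]);
            });
        gl.drawArrays(gl.TRIANGLE_STRIP, 0, 4);
    };
}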
8th May 2013, 23:20 | #8 | Link |
Angel of Night
Join Date: Nov 2004
Location: Tangled in the silks
Posts: 9,559
There was a great post in Ars' comments (aside from endless OT bickering about the strengths and weaknesses of JS in general):
Quote:
9th May 2013, 10:05 | #9 | Link |
Registered User
Join Date: Aug 2009
Posts: 202
|
Given Adobe products' long history of security issues, I can see a very valid role for PDF.js when reading random files off the internet. Mozilla have also stated that ambitious projects like that help them push the whole browser platform to the next level.
For codecs, the big thing for Mozilla is punting the patent issue, but I think he may have been quite literal in his "I have seen the future", i.e. not "we should all use this right now" but "we'll be using something like this in the future". It's easy to see that over time codecs (image/audio/video) have trended from single-purpose hardware to software. Following on from that, a continually updated software codec for use with disposable content (e.g. video chat) makes more sense in that world than something nailed down in a specification 10 years ago. (I had kind of thought this might be where Google was aiming with the VP* codecs, given their culture of beta releases and incremental improvements via online updates.)

Doing it "in javascript" rather than as part of the browser puts that vision further back again, but even before this announcement there had been a progression of work from all sorts of people to open modern hardware up to JavaScript. How many years do you have to go back to find "you'll never get Spotify/Photoshop/Unreal Engine etc. working in a browser"?

Like most crazy things it'll start in niches. This ORBX thing appears to be a family of codecs: one for screen sharing, another for 3D video games. Perhaps that specialisation is giving them an edge in those domains, but I think it's definitely where things are headed generally, even if it takes a while to get there.
9th May 2013, 15:41 | #10 | Link |
Registered User
Join Date: Jan 2007
Posts: 729
|
Except that with the performance losses from JavaScript, your format will likely have to be compromised enough that you'll wish you could use a standard codec designed 10 years ago. (Hey, H.264 is almost that old, and it rules!)
Bonus points: the standard codec will have decent encoders with good quality-per-bit results, while your JavaScript monster might not, especially if it is an in-house effort with a proprietary design and development model (VP8, this JavaScript vapourware).
12th August 2015, 10:47 | #11 | Link |
/人 ◕ ‿‿ ◕ 人\
Join Date: May 2011
Location: Russia
Posts: 643
|
Today I remembered this vapourware. It seems that they actually made their comparison public in October: http://aws.otoy.com/docs/ORBX2_Whitepaper.pdf
It says they compared against x264 --preset slow --bframes 0 --tune ssim / --tune psnr (which is quite sane). I've been trying to reproduce it and, well, the SSIM values are obviously wildly different (--crf 18 gives SSIM-Y 0.9903910 = 20.173 dB @ 6289.24 kb/s), so either there are different ways to calculate SSIM in dB, or... I don't know. For PSNR they used "PSNR-HVS-Y", which I have no idea how to compute. UPD: oh wait, they do have a normal PSNR-Y graph. Real x264 results are:
43.207 dB @ 2016 kb/s
45.057 dB @ 2968 kb/s
46.303 dB @ 3990 kb/s
47.102 dB @ 5036 kb/s
49.347 dB @ 10068 kb/s
These might not be perfectly accurate (who knows how they performed the RGB->YV12 conversion), but they are still much higher than the numbers in the whitepaper (which could be fake as well).

Last edited by vivan; 12th August 2015 at 14:10.
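For what it's worth, x264 converts SSIM to decibels as -10 * log10(1 - SSIM), and that formula does reproduce the 20.173 dB figure quoted above, so the conversion itself is probably not the source of the discrepancy. A quick check (the function name is made up):
Code:
// x264-style SSIM-to-dB conversion: dB = -10 * log10(1 - SSIM)
function ssimToDb(ssim) {
    return -10 * Math.log10(1 - ssim);
}
console.log(ssimToDb(0.9903910).toFixed(3)); // "20.173"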
Tags: codec, h264, orbx.js