Google and Ittiam Systems have taken a step forward in the world of advanced technology to make life easier, increase viewership and drive mass consumption, with the introduction of VP9, an open and royalty-free video coding format developed by Google. Development of VP9 started in the second half of 2011 under the names Next Gen Open Video (NGOV) and VP-Next.
A select media roundtable on VP9, Google’s royalty-free, open-source codec, was held at the Taj Santacruz, Mumbai in October 2017, and AnimationXpress was present to take in the discussion.
The meeting opened with an introduction by Ittiam Systems chairman and CEO Srini Rajan; the company’s name is an acronym for ‘I think, therefore I am’. Google Chrome Media head of strategy and partnerships Matt Frost and senior software engineer Adrian Grange were the main speakers at this formal press meeting on VP9. The topic for discussion was “Future of Video with VP9”, emphasising “the next wave of cutting-edge technologies redefining the video content ecosystem.”
For our readers’ benefit, Ittiam Systems is a global technology company focused on catering to the specialised demands of the online video segment. Founded in 2001, the company recently marked the fourth anniversary of its work with Frost’s team on VP9 and its benefits.
Here is how the discussion at the media roundtable unfolded.
How does VP9 work?
Matt Frost: The project was launched by Google in 2010 with VP8, under the banner of the WebM project. VP9, which improves playback efficiency, has been around since 2013-2014 and is used for video viewing on mobile phones, tablets and PCs. With VP9 on YouTube, a video buffers less and users also see more high-resolution content; compared with H.264, VP9 viewers are upgraded to higher resolutions such as 720p. The focus has been on smartphones and HD televisions, as they can support 4K technology and HD-quality viewing.
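To make Frost’s point concrete, here is a minimal, purely illustrative sketch (our own, not something presented at the roundtable) of how a web player might check whether a browser can decode VP9 before requesting VP9 streams; the manifest URLs are hypothetical placeholders.

// Illustrative sketch: probe VP9 decode support before requesting VP9 streams,
// falling back to H.264 otherwise. URLs below are hypothetical placeholders.
function supportsVp9(): boolean {
  // MediaSource.isTypeSupported covers MSE-based players such as YouTube's.
  if (typeof MediaSource !== "undefined" &&
      MediaSource.isTypeSupported('video/webm; codecs="vp9"')) {
    return true;
  }
  // canPlayType is the fallback check for plain <video> playback.
  const probe = document.createElement("video");
  return probe.canPlayType('video/webm; codecs="vp9"') !== "";
}

const manifestUrl = supportsVp9()
  ? "https://example.com/stream_vp9.mpd"   // hypothetical VP9 manifest
  : "https://example.com/stream_h264.mpd"; // hypothetical H.264 fallback
console.log("Selected manifest:", manifestUrl);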
The transition from VP8 to VP9
Matt Frost: The initial launch was about four years ago, but most viewers are still on H.264, and the format is now being integrated into gaming as well. A codec is a very simple technology: it basically takes digital video and represents it with a code that makes it efficient. The challenge that both Google and Ittiam face is finding ways to bring this technology to games and cameras, and making sure the targeted devices can support the format, which is why perfecting this launch has taken a long time.
Currently, VP8 is the most used format for video streaming and video conferencing. VP9 is supported in Google Chrome and Firefox and is the natural successor to VP8. Since the launch of VP8 and VP9, far more HD video has emerged than under previously used formats such as MPEG and Flash, and the codecs are supported on many browsers for free.
Application of VP9 in the AVGC industry
Matt Frost: VP9 is used in applications like Twitch, YouTube and gaming, because older formats such as MPEG do not compress gaming and animation content well and display it as very shaky. Google has also launched a media format called WebP, which compresses still images faster, and is focusing on better VR experiences and on making sure the user receives better audio through new extensions it is working on. For further improvements in the gaming industry, it has an open-source technology called Draco. We want to make sure that all next-gen AR, VR, gaming and animation experiences are better; we have set up new formats for that and will continue to work on better compression technologies to improve all video viewing.
Is there a loss of video quality during compression?
Adrian Grange: During the compression of a video, some of the quality will sometimes be lost, and the key is to lose the information that people do not notice. To better achieve its objective of improving compression, Google has built different tools around the codecs to make sure that they work properly, and you can migrate from VP9 to AV1.
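As a rough illustration of Grange’s point (our own simplified example, not Google’s code), lossy codecs such as VP9 quantise image data, keeping the coarse detail the eye notices and discarding the fine detail it usually does not:

// Simplified illustration of lossy quantisation: values snap to multiples of a
// quantisation step, so small differences the eye rarely notices are discarded.
// A larger step means a smaller file but more visible loss.
function quantise(values: number[], step: number): number[] {
  return values.map((v) => Math.round(v / step) * step);
}

const originalPixels = [153, 148, 151, 149, 150, 152, 147, 150];
console.log(quantise(originalPixels, 8));  // nearby values collapse to multiples of 8
console.log(quantise(originalPixels, 32)); // coarser step, heavier loss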
Matt Frost: The right tools are developed for software to be used in either real-time or offline situations, and difficult decisions need to be made where required. VP9 started off on PCs and then on Android devices, but later on H.264 was used instead. A distinction was drawn between video for the living room, which relies on hardware decoding in a high-definition television, and video quality in an application on a PC or smartphone, which can deliver better playback up to a certain point. A software decoder requires much more power than a hardware decoder, but because the format is more efficient you are not relying as much on the modem; you spend more on the software decoder, and it can be just as sufficient as using a hardware decoder. 70 to 80 per cent of data on the web is video related. Based on test results, a lot of people spend time on Google Chrome viewing videos, either on YouTube or on Twitch. At the moment we are collaborating with Netflix as well.
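Frost’s point about hardware versus software decoding can be illustrated with another small sketch (again our own assumption of how a player might behave, not something shown at the event): the browser’s Media Capabilities API lets a player ask whether VP9 decoding at a given resolution would be power-efficient (typically hardware) or merely supported (software decode, at a higher battery cost) before choosing a rendition. The resolution, bitrate and frame rate below are example values.

// Illustrative sketch: ask the Media Capabilities API whether 720p VP9 decode
// would be power-efficient (usually hardware) or only supported (software).
async function pickVp9Rendition(): Promise<string> {
  const result = await navigator.mediaCapabilities.decodingInfo({
    type: "media-source",
    video: {
      contentType: 'video/webm; codecs="vp9"',
      width: 1280,
      height: 720,
      bitrate: 2_000_000, // example ~2 Mbps bitrate for 720p
      framerate: 30,
    },
  });
  if (result.supported && result.powerEfficient) {
    return "720p VP9 (hardware decode)";
  }
  if (result.supported) {
    return "720p VP9 (software decode)";
  }
  return "720p H.264 fallback";
}

pickVp9Rendition().then((choice) => console.log(choice));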
VP9 has been rolled out across all resolutions. Frost also discussed what videos looked like when higher resolutions were not available to everyone; the aim is not to chase only the high end, but to make sure everyone is able to access resolutions such as 720p.
With the introduction of VP9, can 144p be eliminated?
Adrian Grange: Unfortunately, there is still a spike of users on that resolution. VP6 was used in Flash and VP7 in Skype; there is a progression, and the evolution of formats takes time.
Matt Frost: A lot of the people who design these formats have worked on them since 1998 and are very familiar with codecs, and the objective is to make life easier.
What are the future applications of VP9?
Matt Frost: It depends on what happens with VR and AR. The general consensus in the market is that AR has a better chance than VR. The constraining factor is the moment you put a headset on: it may be good for gaming, but it may not be suitable for other avenues. With AR, though, you can imagine all sorts of applications overlaying information on the real world you are viewing, whether on a computer screen or glasses, and that seems to be the area people are gearing up for. With AV1, codecs can be developed that move in that direction, including support for spherical video, which will let users move around in a virtual environment. People will be able to see high-quality video, and one of the problems to solve is how to make natural or synthetic content look as if it is natural.
Adrian Grange: If you look at a scene now, it is usually around the edges that you see a difference in lighting, and that is where the Draco project and the video projects with VP9 or AV1 come in; that appears to be the exciting area going forward. These codecs have paved the way for a new age in the broadcast industries, which they have helped move forward since the early ’80s, and bearing this in mind we have taken a step beyond broadcast to web-based applications. The main focus is on making these codecs for better live experiences, because for streaming or broadcast content it never mattered how quickly the content was captured. With VR experiences, we need to make sure we have high frame rates and low latency; if not, people will get nauseous or have a bad user experience. These are some of the challenges we have, and there are tools available: with each new tool, your encoder has to decide whether or not that tool should be used.