Google is expanding its real-time caption feature, Live Captions, from Pixel phones to Chrome users. Live Captions uses machine learning to instantly generate captions for video and audio, making the web more accessible for people who are deaf or hard of hearing.
When the feature is enabled, Live Captions appear in a box at the bottom of the browser while you listen to or watch content. Words display after a slight delay, and the captions occasionally stutter or make mistakes.
Still, the feature is just as impressive as it was when it debuted on Pixel phones two years ago. The captions even make it possible to “read” podcasts and videos without disturbing the people around you.
Chrome’s Live Captions work on podcast players, YouTube videos, and Twitch streams, though users have observed that they currently work only in English. Live Captions can be enabled in the most recent version of Chrome.
To enable them, open Chrome’s settings, expand the “Advanced” section, and click “Accessibility.” (If you can’t find the feature in your settings, update your browser manually and restart it.)
Once the feature is turned on, some speech recognition files will be downloaded automatically. After that, captions will be displayed the next time you play audio in your browser.
Live Captions first appeared in the Android Q beta and remained exclusive to select Pixel and Samsung phones until recently. Now that they are enabled in Chrome, they will reach a much wider audience.