Broadcom and a company called CAMB.AI are teaming up to bring on-device audio translation to a chipset. This will allow devices that use the SoC to handle translation, dubbing and audio description tasks without having to reach out to the cloud. In other words, it could greatly improve accessibility for consumers.
The companies promise ultra-low latency and enhanced privacy, since all processing stays local to the user's device. Wireless bandwidth use should also be drastically reduced.
The announcement includes a demo video showing the tool providing audio description for a clip from the film Ratatouille. The AI can be heard describing scenes in different languages, with a written translation appearing on-screen. This looks incredibly useful, especially for people with vision problems.
There is one major caveat: this is a tightly controlled, heavily edited clip, so we have no idea how the technology will perform in a real-world scenario, or how accurate its output will be. That said, CAMB.AI already has a voice model in use by organizations such as NASCAR, Comcast and Eurovision.
The companies boast that the chipset will enable “on-device translation in over 150 languages”. We don’t know when these chips will start showing up in TVs and other gadgets; the technology is currently in the testing phase, so it’s likely to be a while. Broadcom also recently teamed up with OpenAI to help that company develop its own chips.
