“The Babel fish is small, yellow, leech-like, and probably the oddest thing in the universe. It feeds on brain wave energy, absorbing all unconscious frequencies and then excreting telepathically a matrix formed from the conscious frequencies and nerve signals picked up from the speech centres of the brain, the practical upshot of which is that if you stick one in your ear, you can instantly understand anything said to you in any form of language: the speech you hear decodes the brain wave matrix.”

Echoing the Babel fish from Douglas Adams's The Hitchhiker's Guide to the Galaxy, Google released a line of new products on Wednesday, including its first pair of premium wireless headphones, which support live translation between more than 40 languages.

When the Google Pixel Buds are paired with Google's new handset, the Pixel 2, the earbuds can tap into Google Assistant, the company's AI-powered, voice-activated assistant.

The search company, which now makes a slew of its own hardware products, announced the Pixel Buds at a San Francisco event on October 4. The live-translation feature works only when the headphones are paired with a Pixel 2 phone, a further sign of how directly Google intends to compete with Apple in mobile hardware.

Google has been ramping up its translation services for years. Late last year it released a new version of its simultaneous translation service powered entirely by artificial intelligence. Quartz tested the service after it launched and concluded it still had some work to do on its Chinese.

As Google CEO Sundar Pichai told investors, “We have improved our translation ability more in one single year than all our improvements over the last 10 years combined.”

The translation itself is currently processed in Google's AI-focused data centers, because it takes a lot of processing power: audio must be converted to text, translated into the target language, and then turned back into speech for the listener.
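At a high level, that pipeline could be sketched as below. This is a minimal illustration under invented names, not Google's implementation: `transcribe`, `translate_text`, and `synthesize_speech` are hypothetical stand-ins for the speech-recognition, translation, and synthesis models involved.

```python
# A minimal sketch of the three-stage pipeline described above. Every
# function name here is a hypothetical placeholder, not Google's actual API.

def transcribe(audio: bytes, source_lang: str) -> str:
    """Speech-to-text: convert the captured audio into a transcript."""
    raise NotImplementedError  # stands in for a speech-recognition model

def translate_text(text: str, source_lang: str, target_lang: str) -> str:
    """Machine translation: map the transcript into the target language."""
    raise NotImplementedError  # stands in for a neural translation model

def synthesize_speech(text: str, target_lang: str) -> bytes:
    """Text-to-speech: render the translated text as audio."""
    raise NotImplementedError  # stands in for a speech-synthesis model

def live_translate(audio: bytes, source_lang: str, target_lang: str) -> bytes:
    transcript = transcribe(audio, source_lang)                        # audio -> text
    translated = translate_text(transcript, source_lang, target_lang) # text -> text
    return synthesize_speech(translated, target_lang)                  # text -> audio
```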

The last part of that process is traditionally done by stitching together pre-recorded words or word fragments. However, DeepMind, Alphabet's AI research lab, wrote in a blog post today that WaveNet, the system it developed to generate human-sounding voices, is now part of Google Assistant. That means the voice speaking the translations is generated in real time and, according to DeepMind, sounds more natural as a result.
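For contrast, a toy version of the traditional concatenative approach might look like the sketch below. The clip table and function are invented purely for illustration; real systems store sub-word fragments and smooth the joins.

```python
# Toy illustration of concatenative synthesis: stitch pre-recorded clips
# together. The word-level clip table is invented for illustration.

PRERECORDED_CLIPS = {
    "hello": b"<audio for 'hello'>",
    "world": b"<audio for 'world'>",
}

def concatenative_tts(text: str) -> bytes:
    """Join stored clips word by word; anything never recorded fails."""
    return b"".join(PRERECORDED_CLIPS[word] for word in text.lower().split())

# A generative model like WaveNet instead predicts the waveform one sample
# at a time, conditioned on the text, so it is not tied to a fixed
# inventory of recordings.
print(concatenative_tts("Hello world"))
```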

Mobile technology keeps working its way into new corners of our lives; some accessories can even turn a phone into a portable lab. Pairing machine learning with these devices will only continue to expand what they can do.

The earbuds will cost $159 when they become available in November. Preorders have just begun on the Google Store website.
