“If Meta's collecting our clicks and likes is like them knowing what you eat for breakfast, then mind-reading tech would be them knowing why you chose cereal over eggs.”
Meta has just pulled the curtains back on a revolutionary artificial intelligence (AI) system.
In mere milliseconds, this AI reads your brain's activity and turns it into vivid pictures.
It is a bit like having a high-speed artist sketching the daydreams and thoughts swirling inside your noggin.
A research paper spills the beans.
The study reveals:
"Overall, these results provide an important step towards the decoding-in real time-of the visual processes continuously unfolding within the human brain."
It relies on a technique called magnetoencephalography, or MEG for short.
Imagine MEG as a high-tech stethoscope that eavesdrops on your brain's chit-chats, capturing the magnetic fields your neurons produce as they gossip away.
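To give a rough, hypothetical feel for what that eavesdropping produces: an MEG recording is essentially a big grid of numbers, one row per magnetic sensor on the helmet and one column per moment in time. The sensor count and sampling rate below are illustrative assumptions for the sake of the example, not figures from Meta's study.

```python
import numpy as np

n_sensors = 270        # assumed number of magnetic sensors on the MEG helmet
sampling_rate = 1000   # assumed samples recorded per second
duration_s = 0.5       # half a second of brain activity

# Each row is one sensor's magnetic-field reading over time (simulated here with noise).
meg_recording = np.random.randn(n_sensors, int(sampling_rate * duration_s))
print(meg_recording.shape)  # (270, 500): 270 sensors x 500 time points
```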
First up, the Image Encoder. Think of it as a translator that helps the AI understand pictures. It takes an image and chops it into bite-sized pieces the AI can munch on.
Then comes the Brain Encoder. This chap is the middleman who syncs these translated image bits with your brain's MEG signals. Picture it as a bridge between your brain's buzzing and the image's digital code.
Last but not least, the Image Decoder. This component takes the mashed-up info and recreates an image that mimics your original thought. It is like a chef taking all the ingredients and serving you a dish that tastes like your grandma's secret recipe.
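To make those three roles a little more concrete, here is a minimal, hypothetical sketch of the pipeline in PyTorch. Every detail in it, the module names, layer sizes, sensor counts and the simple alignment loss, is an illustrative assumption rather than Meta's actual architecture; the real system is reported to build on much larger pretrained vision and image-generation models.

```python
import torch
import torch.nn as nn

class ImageEncoder(nn.Module):
    """Stands in for a pretrained vision model that turns a picture
    into a compact embedding (the 'bite-sized pieces')."""
    def __init__(self, embed_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(32, embed_dim),
        )

    def forward(self, image):           # image: (batch, 3, H, W)
        return self.net(image)          # -> (batch, embed_dim)

class BrainEncoder(nn.Module):
    """Maps MEG sensor data onto the same embedding space as the images,
    so brain activity and pictures can be compared directly."""
    def __init__(self, n_sensors=270, n_times=200, embed_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(n_sensors * n_times, 512),
            nn.ReLU(),
            nn.Linear(512, embed_dim),
        )

    def forward(self, meg):             # meg: (batch, n_sensors, n_times)
        return self.net(meg)            # -> (batch, embed_dim)

class ImageDecoder(nn.Module):
    """Stands in for a generative model that turns an embedding back
    into pixels resembling the image the person was looking at."""
    def __init__(self, embed_dim=256, out_size=64):
        super().__init__()
        self.out_size = out_size
        self.net = nn.Linear(embed_dim, 3 * out_size * out_size)

    def forward(self, embedding):
        flat = self.net(embedding)
        return flat.view(-1, 3, self.out_size, self.out_size)

# Toy forward pass: align a brain recording with the matching image embedding,
# then reconstruct a picture from the brain-derived embedding alone.
image_enc, brain_enc, image_dec = ImageEncoder(), BrainEncoder(), ImageDecoder()
image = torch.randn(1, 3, 64, 64)           # the picture the person saw
meg = torch.randn(1, 270, 200)              # the simultaneous MEG recording
alignment_loss = nn.functional.mse_loss(brain_enc(meg), image_enc(image))
reconstruction = image_dec(brain_enc(meg))  # (1, 3, 64, 64) candidate image
```

In a training loop, the alignment loss would teach the Brain Encoder to land near the Image Encoder's output, so that at test time the Image Decoder can work from brain signals alone.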
Image: Meta
But hold on, Meta's not alone in pushing the envelope in mind-reading tech.
Other researchers are also jamming to this tune.
For instance, an AI was able to reconstruct a recognisable version of Pink Floyd's "Another Brick in the Wall" using only data pulled from human brain activity.
This technology is not just a shiny new toy.
It is making strides in healthcare too.
One study highlighted how AI has enabled a quadriplegic man to regain sensation and movement, thanks to microchips implanted in his brain.
These advances show that we are not just drawing pretty pictures; we are potentially rewriting the script for medical rehabilitation.
Unbabel, a digital translation juggernaut, has spent four years developing a "brain-to-communications interface," aiming to facilitate deep connections between businesses and their multilingual customer base.
At the core of this venture is the Language Operations platform, a harmonious blend of AI and human finesse.
The platform improves through high-quality interactions, with the aim of weaving AI tools and human work together more seamlessly over time.
Indeed, the jump from collecting data to collecting thoughts raises thornier ethical questions than any privacy debate we have had so far.
If misused, this technology could serve as a tool for cybercriminals to exploit sensitive mental data, influence high-stakes situations, or violate personal privacy.
New laws protecting personal privacy may well surface as these advancements progress.
As we marvel at the technology's capabilities, experts urge caution, emphasising the necessity to safeguard mental privacy.
The balance between innovation and ethical responsibility remains a precarious tightrope we are still learning to walk.