Scientists in the United States have developed an algorithm able to read human thoughts, deciphering brain activity to infer what a person is thinking without the need to speak. This constitutes a new development in the treatment of many diseases and in exploring other aspects of human cognition.
According to a report published by the British newspaper “Daily Mail”, seen by “Al Arabiya.net”, the new technology achieved by American scientists relies on artificial intelligence and constitutes a non-surgical breakthrough.
The device and algorithm were developed at the University of Texas, USA, with the aim of helping those who cannot speak or write to communicate for the first time, as the method decodes language in real time.
The method works by feeding functional magnetic resonance imaging (fMRI) scans to the algorithm, which reconstructs, in natural language, the arbitrary stimuli a person perceives or thinks about.
Participants in the study listened to stories being told while the scientists scanned areas of the brain associated with natural language, then fed the scans into an AI-powered decoder that returned a summary of what the individual was listening to.
Until now, this had only been accomplished by implanting electrodes in the brain.
The new model produces an idea, or summary, of a patient’s thoughts by analyzing the scans, but it cannot literally decode what they are thinking, the researchers say.
The brain breaks complex thoughts down into smaller pieces, each corresponding to a different aspect of the whole thought. Ideas can be as simple as a single word, like “dog”, or as complex as “I have to walk the dog”.
The brain also has its own alphabet of 42 different elements, each relating to a specific concept such as size, color or position, and all of these combine to form our complex thoughts.
Each “letter” is handled by a different part of the brain, so by combining all the different parts, it is possible to read a person’s mind.
The team did so by recording fMRI data from three parts of the brain linked to natural language while a small group of people listened to 16 hours of audio broadcasts.
The three brain regions analyzed were the frontal lobe network, the classical language network, and the parietal, temporal and occipital association network, the researchers said.
The algorithm was then given the scans and compared the sound patterns with the recorded patterns of brain activity. The system proved able to turn a scan into a story based on its content, which the team found matched the gist of the stories being told.
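The matching step described above can be illustrated with a toy sketch: a decoder proposes candidate phrases, predicts the brain activity each would evoke, and keeps the candidate whose prediction best correlates with the recorded scan. Everything here is a stand-in assumption for illustration (the random linear encoding model, the candidate phrases, and the embeddings are invented, not the researchers' actual model or data).

```python
import numpy as np

rng = np.random.default_rng(0)
N_VOXELS = 50   # number of fMRI voxels in the scanned regions (assumed)
EMBED_DIM = 8   # size of a toy semantic embedding (assumed)

# Toy semantic embeddings for candidate phrases (invented for illustration).
candidates = {
    "I have to walk the dog": rng.normal(size=EMBED_DIM),
    "the weather was sunny": rng.normal(size=EMBED_DIM),
    "she opened the door": rng.normal(size=EMBED_DIM),
}

# Stand-in encoding model: a fixed linear map from a semantic embedding
# to predicted voxel activity (the real model is learned from 16 hours
# of listening data; this random matrix is only a placeholder).
W = rng.normal(size=(N_VOXELS, EMBED_DIM))

def predict_activity(embedding):
    """Predicted voxel activity for one candidate phrase."""
    return W @ embedding

def decode(recorded, candidates):
    """Return the candidate whose predicted activity correlates best
    with the recorded activity pattern."""
    best, best_score = None, -np.inf
    for text, emb in candidates.items():
        score = np.corrcoef(predict_activity(emb), recorded)[0, 1]
        if score > best_score:
            best, best_score = text, score
    return best

# Simulate a recording: the listener heard "I have to walk the dog",
# so the recorded activity is that phrase's prediction plus small noise.
true_phrase = "I have to walk the dog"
recorded = predict_activity(candidates[true_phrase]) + 0.1 * rng.normal(size=N_VOXELS)

print(decode(recorded, candidates))
```

With noise this small the toy decoder recovers the phrase the simulated listener heard, which mirrors the article's point: the system identifies the gist of what was heard rather than transcribing thoughts word for word.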
Although the algorithm is not able to analyze every “word” in an individual’s thoughts, it is able to decode the story each person has heard.