Monday, March 9, 2015

Brain-To-Brain Interfaces And The Science Of Telepathy by Kristyn Bates

Recent advances in brain-computer interfaces are turning the science fantasy of transmitting thoughts directly from one brain to another into reality.
Have you ever wondered what it would be like to walk a mile (or 1.6 kilometres) in somebody else's shoes? Or have you ever tried to send a telepathic message to a partner in transit to "pick up milk on your way home"?
Studies published in the last two years have reported direct transmission of brain activity between two animals, between two humans and even between a human and a rat. These "brain-to-brain interfaces" (BBIs) allow for direct transmission of brain activity in real time by coupling the brains of two individuals.
So what is the science behind this?

Reading the Brainwaves

Brain-to-brain interfaces are made possible by the way brain cells communicate with each other. Cell-to-cell communication occurs via a process known as synaptic transmission, in which chemical signals passed between cells produce electrical spikes in the receiving cell.
Synaptic transmission forms the basis of all brain activity, including motor control, memory, perception and emotion. Because cells are connected in a network, brain activity produces a synchronised pulse of electrical activity, which is called a "brain wave".
Brain waves change according to the cognitive processes that the brain is currently working through and are characterised by the time-frequency pattern of the up and down states (oscillations).
For example, there are brainwaves that are characteristic of the different phases of sleep, and patterns characteristic of various states of awareness and consciousness.
An example of brainwaves that appear during one of the stages of sleep.

Brainwaves are detected using a technique known as electroencephalography (EEG), in which a swimming-cap-like device is worn over the scalp and electrical activity is detected via electrodes. The pattern of activity is then recorded and interpreted using computer software.
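To give a flavour of what that software does, here is a minimal Python sketch of brainwave frequency analysis. The band names and boundaries are the standard clinical ones; the simulated signal and all variable names are purely illustrative, not from any particular EEG package:

```python
import numpy as np

np.random.seed(0)  # deterministic noise for this illustration

fs = 256  # sampling rate in Hz
t = np.arange(fs) / fs

# Simulate one second of EEG: a 10 Hz alpha rhythm buried in noise.
signal = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(fs)

# Estimate the power spectrum, as EEG analysis software does.
spectrum = np.abs(np.fft.rfft(signal)) ** 2
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)

# The classic clinical frequency bands (Hz).
bands = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
power = {name: spectrum[(freqs >= lo) & (freqs < hi)].sum()
         for name, (lo, hi) in bands.items()}

dominant = max(power, key=power.get)  # "alpha" for this 10 Hz source
```

A relaxed, eyes-closed subject would show exactly this kind of alpha dominance, while deep sleep would shift the peak down into the delta band.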
This kind of brain-machine interface forms the basis of neural prosthesis technology and is used to restore brain function. This may sound far-fetched, but neural prostheses are actually commonplace, just think of the Cochlear implant!

Technical Telepathy

The electrical nature of the brain allows not only for the sending of signals, but also for the receiving of electrical pulses. These can be delivered in a non-invasive way using a technique called transcranial magnetic stimulation (TMS).
A TMS device creates a magnetic field over the scalp, which then causes an electrical current in the brain. When a TMS coil is placed over the motor cortex, the motor pathways can be activated, resulting in movement of a limb, hand or foot, or even a finger or toe.
Scientists are now working on ways to sort through all the noise in brainwaves to uncover specific signals that can then be used to create an artificial communication channel between animals.
The first demonstration of this was in a 2013 study where a pair of rats were connected through a BBI to perform a behavioural task. The connection was reinforced by giving both rats a reward when the receiver rat performed the task correctly.
Hot on the heels of this study was a demonstration that a human could control the tail movements of a rat via BBI.
We now know that BBIs can work between humans too. By combining EEG and TMS, scientists have transmitted the thought of moving a hand from one person to a separate individual, who actually moved their hand. The BBI works best when both participants are conscious cooperators in the experiment. In this case, the subjects were engaged in a computer game.

Thinking at You

The latest advance in human BBIs represents another leap forward: in August last year, transmission of conscious thought was achieved between two human beings.
Using a combination of technologies – including EEG, the Internet and TMS – the team of researchers was able to transmit a thought all the way from India to France.
Words were first coded into binary notation (i.e. 1 = "hola"; 0 = "ciao"). Then the resulting EEG signal from the person thinking the 1 or the 0 was transmitted to a robot-driven TMS device positioned over the visual cortex of the receiver's brain.
In this case, the TMS pulses resulted in the perception of flashes of light for the receiver, who was then able to decode this information into the original words (hola or ciao).
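The coding scheme described above can be sketched in a few lines of Python. The bit-to-word mapping is the one from the article; the function names and the True/False representation of perceived flashes are illustrative, not from the actual study:

```python
# The article's one-bit code: 1 stands for "hola", 0 for "ciao".
# On the receiving end, a perceived flash of light means 1, its absence 0.
CODE = {1: "hola", 0: "ciao"}

def encode(words):
    """Sender side: turn a list of words into the bits to transmit."""
    lookup = {word: bit for bit, word in CODE.items()}
    return [lookup[w] for w in words]

def decode(flashes):
    """Receiver side: turn perceived flashes (True/False) back into words."""
    return [CODE[1 if flash else 0] for flash in flashes]

bits = encode(["hola", "ciao", "hola"])   # [1, 0, 1]
words = decode([True, False, True])       # ["hola", "ciao", "hola"]
```

The hard part of the experiment, of course, was not the code but the channel: the bits travelled as EEG signals on the sending end and TMS-induced flashes on the receiving end.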
Now that these BBI technologies are becoming a reality, they have a huge potential to impact the way we interact with other humans. And maybe even the way we communicate with animals through direct transmission of thought.
Such technologies have obvious ethical and legal implications, however. So it is important to note that the success of BBIs depends upon the conscious coupling of the subjects.
In this respect, there is a terrific potential for BBIs to one day be integrated into psychotherapies, including cognitive behavioural therapy, learning of motor skills, or even more fantastical situations akin to remote control of robots on distant planets or Vulcan-like mind melds a la Star Trek.
Soon, it might well be possible to really experience walking a mile (or 1.6 kilometres) in another person's shoes.
This article originally appeared at The Conversation and is reprinted here under a creative commons license.

Thursday, November 6, 2014

Quantum of Solace: Computer Table and Walls

Hi Damian -- 

this is Ben. Thank you very much for your compliments on our work in the film -- it's much appreciated!

With regards to your question, there actually was a lot of R&D done on the user interface we designed for the smart wall & table, from the shape & placement of nodes to their color & purpose. The nodes themselves serve both as 'information centers' as well as a visual / tactile utility for navigating about. Much like the visual thesaurus project (which I'm sure you're aware of), each node is connected to other nodes of relevant information via tendrils, the proximity and size of which indicate the most relevant information. 

So, for example -- if you recall the scene where M is on the phone with Bond and they're searching for Greene on the big smart wall in her office -- the Greene node is the most prominent, as it's the feature of that session. From that, the computer connects additional nodes to that central one as it finds out more information about Greene, essentially creating a nonlinear cluster of information that's both intuitive for the user and unique to that session.

The color of each node indicates its function (i.e. red for real-time action, green for location, blue for people, etc.), and they're meant to be bright & saturated -- at times to the point of annoyance -- for quick recognition. The idea there is that MI6 agents need to be able to find and react to information quickly, and the human brain responds to color much faster than to any other stimulus. 

The foundation of the operating system is based on the principles of radial thinking & mind mapping, which both theorize that the best way to absorb and process information is by organizing it in such a way that it's aligned with how the brain itself processes information, i.e. by forming relationships between previously unrelated material to create a 'web of understanding'. 

As I mentioned, these nodes can then also become utilities, where the user can directly manipulate one to access more data relating to that node (as in using it to scale up a scan of a dollar bill, for example).

Hope that helps!
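For readers curious how the node cluster Ben describes might be modelled in software, here is a minimal Python sketch. Every name and value in it is hypothetical; nothing here comes from the actual production software:

```python
# Color-coded node types, per the description above (red = real-time
# action, green = location, blue = people).
NODE_COLORS = {"action": "red", "location": "green", "person": "blue"}

class Node:
    def __init__(self, label, kind):
        self.label = label
        self.kind = kind       # determines the node's color on screen
        self.tendrils = []     # (linked node, relevance) pairs

    @property
    def color(self):
        return NODE_COLORS[self.kind]

    def connect(self, other, relevance):
        """Link a related node; higher relevance = closer and larger."""
        self.tendrils.append((other, relevance))

# A session like the Greene search: a central node, with related nodes
# attached as the system discovers more information.
greene = Node("Greene", "person")
greene.connect(Node("Haiti", "location"), relevance=0.9)
greene.connect(Node("Wire transfer", "action"), relevance=0.6)

most_relevant = max(greene.tendrils, key=lambda t: t[1])[0].label  # "Haiti"
```

The relevance weight is what drives the "proximity and size" of tendrils in the on-screen layout: the renderer would simply draw high-relevance neighbours nearer and larger.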

Wednesday, October 22, 2014

Paralyzed Man Walks Again After Brain Cells Are Injected into His Spine

Back in 2010, Darek Fidyka became paralyzed from the waist down after suffering stab wounds to his back. Now, after 19 months of treatment in which cells from his brain were transplanted into his spinal column, he can walk with a frame. Researchers are calling it a "historic breakthrough."
The new technique, the details of which appear in the latest edition of Cell Transplantation, involves olfactory ensheathing cells (OECs), which come from a part of the brain that deals with the sense of smell. By transplanting them into Fidyka's spinal column, the neurologists were able to construct a "nerve bridge" between two stumps of the damaged spinal column.
"We believe... this procedure is the breakthrough which, as it is further developed, will result in a historic change in the currently hopeless outlook for people disabled by spinal cord injury," noted the study's lead author Geoffrey Raisman in a Reuters article. He's currently a professor at University College London's (UCL) Institute of Neurology.
Fidyka, who's 38 years old, has recovered some voluntary movement and some sensation in his legs. He's continuing to improve more than predicted, and he's now able to drive and live more independently.

Tuesday, September 9, 2014

Boy Gets Prosthetic Iron Man Hand

Three-year-old Rayven "Bubba" Kahae was born with amniotic band syndrome (ABS), which means one of his hands isn't fully formed. But now he has a prosthetic Iron Man hand.

Tuesday, November 19, 2013

The World’s First Mind-controlled Exoskeleton by George Dvorsky

Wow, is this a taste of the future, or what? Check out MindWalker — an exoskeleton that will soon enable paralysed and locked-in people to walk using only their mind. Ah, who are we kidding — we're ALL going to eventually want this for ourselves!

The groundbreaking device, which is currently under review by the European Commission, consists of three main elements: The exoskeleton itself, a virtual-reality user interface, and the mind-reading component. It was developed by a consortium of several major universities and companies.

Users control the MindWalker using an EEG cap that measures electrical activity at various points across the scalp. There are a number of different ways to control the exoskeleton in this way, but the best model involves wearing a pair of glasses with flickering diodes attached to each lens.
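Those flickering diodes exploit what's known as the steady-state visually evoked potential: each diode flickers at a distinct frequency, and attending to one boosts EEG power at that frequency, which the controller can pick out. Here's a minimal Python sketch of the idea, assuming two hypothetical command frequencies (the command names, frequencies, and simulated signal are all illustrative, not from the MindWalker project):

```python
import numpy as np

np.random.seed(0)  # deterministic noise for this illustration

# Two hypothetical diodes: one flickers at 13 Hz ("walk"), one at 17 Hz
# ("stop"). Attending to a diode evokes EEG power at its frequency.
COMMANDS = {13: "walk", 17: "stop"}
fs = 256  # sampling rate in Hz

def classify(eeg):
    """Pick the command whose flicker frequency dominates the spectrum."""
    spectrum = np.abs(np.fft.rfft(eeg))
    freqs = np.fft.rfftfreq(len(eeg), d=1 / fs)
    best = max(COMMANDS, key=lambda f: spectrum[np.argmin(np.abs(freqs - f))])
    return COMMANDS[best]

# Simulate two seconds of EEG from a user watching the 13 Hz diode.
t = np.arange(2 * fs) / fs
eeg = np.sin(2 * np.pi * 13 * t) + 0.3 * np.random.randn(t.size)
command = classify(eeg)  # "walk"
```

Because each command lives at its own frequency, the system needs no training on the user's particular brain patterns: it just looks for the strongest flicker-locked peak.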

Wednesday, November 6, 2013

Robots Can Learn To Hold Knives — and Not Stab Humans

aurtherdent2000 writes  

"We humans enjoy not having knives inside of us. Robots don't know this (Three Laws be damned). Therefore, it's important for humans to explain this information to robots using careful training. Researchers at Cornell University are developing a co-active learning method, where humans can correct a robot's motions, showing it how to properly use objects such as knives. They use it for a robot performing grocery checkout tasks."