Thursday, November 17, 2022

“Decades to centuries is still my guess.”

 

In 2019, Edward Chang, a neurosurgeon at the University of California, San Francisco, opened the skull of a 36-year-old man, nicknamed “Pancho,” and placed a thin sheet of electrodes on the surface of his brain.1 The electrodes gather electrical signals from the motor neurons that control the movement of the mouth, larynx, and other body parts to produce speech. A small port, implanted on top of Pancho’s head, relayed the brain signals to a computer. This “brain-computer interface,” or BCI, solved an intractable medical problem.

In 2003, Pancho, a field worker in California’s vineyards, was involved in a car crash. Days after undergoing surgery, he suffered a brainstem stroke, reported the New York Times Magazine.2 The stroke robbed Pancho of the power of speech. He could communicate only by laboriously spelling out words one letter at a time with a pointing device. After training with the computer, which was outfitted with deep-learning algorithms that interpreted his brain activity, Pancho could think the words that he wanted to say, and they would appear on the computer screen. Scientists called the results “groundbreaking”; Pancho called them “life-changing.”
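To make the idea concrete, here is a minimal, purely illustrative sketch of a decoding pipeline of this general kind. Every detail is an assumption chosen for illustration: the channel count, the crude signal-power features, and the simple classifier stand in for the deep neural networks and months of training behind the real UCSF system.

```python
# Illustrative sketch of a speech-decoding pipeline: classify windows of
# multichannel cortical signals into words from a small vocabulary.
# All dimensions and the classifier choice are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
N_CHANNELS, N_SAMPLES, VOCAB = 128, 200, 50   # assumed figures

def band_power_features(window):
    """Crude per-channel signal-power feature for one time window."""
    return np.log(np.mean(window ** 2, axis=-1))

# Stand-in training data: 500 labeled windows of neural activity.
windows = rng.standard_normal((500, N_CHANNELS, N_SAMPLES))
labels = rng.integers(0, VOCAB, size=500)      # word IDs
X = np.array([band_power_features(w) for w in windows])

decoder = LogisticRegression(max_iter=1000).fit(X, labels)

# At run time: decode each new window into the most likely word ID.
new_window = rng.standard_normal((N_CHANNELS, N_SAMPLES))
word_id = decoder.predict(band_power_features(new_window)[None, :])[0]
print(f"decoded word id: {word_id}")
```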


The clinical success of BCIs (there are other stories to go along with Pancho’s) appears to vindicate the futurists who claim that BCIs may soon enhance the brains of healthy people. Most famously, Ray Kurzweil, author of The Singularity Is Near, has asserted that exponentially rapid developments in neuroscience, bioscience, nanotechnology, and computation will coalesce and allow us to transcend the limitations of our bodies and brains. A major part of this huge shift will be the rise of artificial intelligences that are far more capable than human brains. It is an inevitability of human evolution, Kurzweil thinks, that the two kinds of intelligence will merge to form powerful hybrid brains, which will define the future of humanity. This, he predicted, would happen by 2045.

While futuristic scenarios like Kurzweil’s are exciting to ponder, they are brought back down to Earth by the technological capabilities of brain-computer hybrids as they exist today. BCIs are impressive, but the path from helping stroke victims to giving people superpowers is neither direct nor inevitable.

One of the first great steps in BCIs came in 1998, when neuroscientist Phil Kennedy inserted a single electrode into the brain of Johnny Ray, a paralyzed stroke survivor, and produced the first example of human mind control of an external device via an implant. This enabled the “locked-in” Ray to communicate by mentally moving a cursor to select letters on a computer screen and earned Kennedy international acclaim.

Implanted BCIs can also work in the opposite direction, directing external electrical signals to trigger specific neurons. In 2021, a team at the University of Pittsburgh put electrodes into the motor cortex of a paralyzed man to allow him to control a robotic arm, and into his somatosensory cortex, where incoming sensory impulses activate neurons.3 As he grasped an object with the arm, he felt that he was contacting and holding the object through signals sent by sensors in the robotic hand. This substantially improved his control of the artificial limb.

In another example, Columbia University biomedical engineer Ken Shepard has used advanced nanotechnology to construct a tiny chip a half-inch square with 65,000 microelectrodes.4 The idea is to place the chip on the surface of the brain’s visual cortex and wirelessly send in data from a camera to restore sight to the blind. If this device passes human trials, it will represent a big advance over an earlier effort with fewer electrodes, which limited the quality of the image a camera could send to the brain.
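To get a feel for the numbers, note that 65,000 electrodes is roughly a 255-by-255 grid. The sketch below, with an invented grid layout and arbitrary amplitude units, shows the kind of mapping such a device implies: downsample each camera frame so that every electrode receives one stimulation value.

```python
# Illustrative sketch: map a camera frame onto a hypothetical grid of
# stimulating electrodes (65,000 is about 255 x 255). The grid layout
# and amplitude range are assumptions for illustration only.
import numpy as np

GRID = 255                      # 255 * 255 = 65,025 electrodes
rng = np.random.default_rng(1)
frame = rng.random((480, 640))  # stand-in for a grayscale camera frame

# Nearest-neighbor downsample of the frame to the electrode grid.
rows = np.linspace(0, frame.shape[0] - 1, GRID).astype(int)
cols = np.linspace(0, frame.shape[1] - 1, GRID).astype(int)
pattern = frame[np.ix_(rows, cols)]

# Scale brightness to a per-electrode stimulation amplitude within an
# assumed safe range (arbitrary units here).
MAX_AMP = 1.0
amplitudes = np.clip(pattern, 0.0, 1.0) * MAX_AMP
print(amplitudes.shape)         # (255, 255), one value per electrode
```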

THE SINGULARITY IS NOT NEAR: We are still a long way, “decades to centuries,” says Princeton University neuroscientist Michael Graziano, from augmenting the whole brain, or achieving that science-fiction dream of uploading its contents to a computer. Image by lassedesignen / Shutterstock.

Along with triggering sensory responses, electrical or other input to the brain can alter its functions in a process known as neuromodulation. In deep brain stimulation (DBS), a small “brain pacemaker” is embedded under the skin in the upper chest and sends electrical impulses to electrodes placed in specific brain regions. DBS was approved by the FDA to treat Parkinson’s disease and manage epileptic seizures, and has been used to treat other conditions such as chronic pain.
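As a rough illustration of what a “brain pacemaker” delivers, the sketch below generates a charge-balanced pulse train at 130 pulses per second, a rate commonly cited for DBS; the pulse width and amplitude here are illustrative placeholders, not clinical settings.

```python
# Illustrative DBS-style pulse train: brief charge-balanced (biphasic)
# pulses repeated at ~130 Hz. All parameters are placeholders.
import numpy as np

FS = 100_000                    # samples per second
RATE_HZ = 130                   # pulse repetition rate
PULSE_US = 90                   # width of each pulse phase, microseconds
AMP = 1.0                       # amplitude, arbitrary units

signal = np.zeros(FS)           # one second of output
phase_len = int(FS * PULSE_US / 1_000_000)
for start in range(0, FS, FS // RATE_HZ):
    signal[start:start + phase_len] = AMP                    # first phase
    signal[start + phase_len:start + 2 * phase_len] = -AMP   # reversed phase

# Net charge per pulse is ~zero, which helps protect tissue.
print(f"{RATE_HZ} pulses/s, {PULSE_US} us per phase")
```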

Some neuromodulation methods work without invasive surgery. In transcranial direct current stimulation (tDCS), electrodes placed on the scalp and connected to a battery produce a weak electric current that influences brain activity. Any electronics hobbyist can build this simple device, and commercial models can be found for as little as $125. tDCS is not FDA-approved and there are concerns about its unregulated use, but tests show promise in relieving certain conditions and improving brain function. In 2020 and 2022 the FDA approved full clinical trials to test the efficacy of tDCS to treat depression.
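The electronics really are simple, which is part of the concern. A back-of-envelope Ohm’s-law calculation, using an assumed scalp-plus-electrode resistance, shows why a safe device must actively regulate its current rather than rely on a bare battery; published tDCS protocols typically deliver one to two milliamperes.

```python
# Back-of-envelope Ohm's-law arithmetic for a battery-powered tDCS
# device. The resistance value is a rough illustrative assumption;
# real scalp/electrode resistance varies widely between people.
V_BATTERY = 9.0      # volts, a common battery choice
R_SCALP = 5_000.0    # ohms, assumed electrode-plus-scalp resistance
I_TARGET = 0.002     # amperes: protocols typically use 1-2 mA

# Current if the battery drove the scalp directly, I = V / R:
i_raw = V_BATTERY / R_SCALP
print(f"unregulated current: {i_raw * 1000:.1f} mA")
# ~1.8 mA here, but it would swing with resistance, which is why
# real devices use a current regulator rather than a bare battery.
```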

These examples show how the capability to record and influence brain activity can benefit body and mind for those who have lost function in either. The new pathways to the brain also suggest ways to enhance the bodies of healthy people; for instance, through a neurally controlled exoskeleton that provides greater-than-human power or speed. But can these technologies augment human cognition? Can human and machine intelligences merge into a greater whole?

In 2011, Paul Allen, cofounder of Microsoft and founder of an institute to study the brain, and AI expert Mark Greaves declared that the singularity was not near, calling Kurzweil’s prediction of a major realignment in 2045 “far-fetched,” notably because it is unlikely we will understand the human brain so soon. In 2022, we remain at the beginning of understanding the brain.

We have, though, made incredible progress in understanding the brain, progress that only highlights how much is left to do. Kurzweil projected that swarms of nanobots would explore the human brain in unprecedented detail. We’re nowhere near that technology. Rather, the National Institutes of Health’s BRAIN Initiative has committed $500 million to bring together hundreds of scientists to map and catalog the brain’s 86 billion neurons with existing methods such as staining them to reveal their shapes. And rather than the software models of human intelligence Kurzweil predicted, a €1 billion European project to simulate the brain on a supercomputer has, after 10 years, managed to simulate only the mouse brain, a thousand times smaller.

The state of BCIs today presents another stumbling block on the road to the singularity. Surgically implanted electrodes and non-invasive methods like tDCS have serious drawbacks. Inserting wires and silicon chips requires skilled brain surgery and risks infection or collateral damage. Implants can deteriorate within the brain’s wet environment, and the recipient is awkwardly tethered to a computer by the connecting wires. Electrodes are implanted only in clinically monitored patients like Pancho. They are not implanted for human experimentation, nor is their use in healthy people likely to earn regulatory approval anytime soon.


Technology companies have announced improved and less invasive surgically implanted BCIs. Neuralink, founded by Elon Musk and redolent of his grand ambitions, promises that its BCIs will help clinicians treat people with paralysis and “could expand our abilities, our community, and our world.” After several years of development, though, Neuralink has yet to begin human trials. Synchron, another start-up, dedicated to treating people with neurological diseases, has passed human trials abroad and has just started an FDA-approved trial of its method, which places electrodes inside the brain’s natural blood vessels without major surgery. Both efforts would use Bluetooth technology to eliminate wires from the brain and increase portability.

The other option is to augment brains with non-invasive methods. Electroencephalography and tDCS can record and stimulate brains with electrodes placed on the scalp, and other contactless means use magnetic fields, light, or ultrasound. They too, however, present problems. Compared to electrode implants, some non-invasive approaches offer lower spatial resolution and noisier data. And although they carry fewer risks than brain surgery, their side effects, such as unanticipated long-term changes in the brain, need further scrutiny.

A 2019 summary review by Davide Valeriani at Harvard Medical School, and Caterina Cinel and Riccardo Poli at the Brain Computer Interfaces and Neural Engineering Laboratory at the University of Essex in England, looks at the ongoing research into BCIs designed not only for people with severe disabilities but for human cognitive augmentation in general.5 The authors show that researchers and clinicians today can choose from among 10 different methods to record or affect brain activity and enhance it.

One such brain function is perception. Non-invasive BCIs have improved performance in discriminating among different shapes, tracking multiple objects, and, in a more complex task, viewing a video clip and determining whether a possible threat is present. Decision-making, another important brain function, draws on several mental abilities and has been extensively studied. But using non-invasive BCIs to improve decision-making has been unimpressive; the data they yield is too noisy unless it is averaged over repeated measurements or across several users.
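The statistics behind that averaging are simple: the noise on the mean of N independent measurements shrinks by a factor of the square root of N. The toy simulation below, with made-up numbers, shows the expected ten-fold improvement from averaging 100 trials.

```python
# Why averaging helps noisy BCI data: the standard deviation of the
# mean of N independent trials shrinks by sqrt(N). Purely illustrative.
import numpy as np

rng = np.random.default_rng(2)
TRUE_SIGNAL = 1.0                 # the quantity being measured
NOISE_SD, N_TRIALS = 5.0, 100

trials = TRUE_SIGNAL + NOISE_SD * rng.standard_normal(N_TRIALS)
single = trials[0]
averaged = trials.mean()

print(f"single trial:   {single:.2f}")
print(f"average of {N_TRIALS}: {averaged:.2f}")
# Expected noise on the mean: NOISE_SD / sqrt(N_TRIALS) = 0.5 here,
# a ten-fold improvement over any single trial.
```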

Augmenting memory and learning grows more important as the population ages and memory loss becomes more common. Studies show that sessions of non-invasive stimulation can temporarily improve spatial memory and the working memory that briefly holds information. One set of experiments gives clues to a memory prosthesis, although it would require invasive surgery. Researchers at Wake Forest Baptist Medical Center, the University of Southern California, and elsewhere showed that electrical stimulation through electrodes placed in the part of the brain called the hippocampus enhanced memory in animals and in human subjects, who showed an average improvement of 36 percent in short- and long-term memory.6 This work met ethical standards because the subjects were epileptics who already had implants that controlled their seizures.

Valeriani, Cinel, and Poli predict that by 2040, most forms of non-invasive brain augmentation will be in field-testing for general use, or perhaps even as wearable neurotechnology.

These achievements by 2040 would represent astonishing technological progress but are less grandiose than the vision of human brains merging with AIs by 2045 to reach unprecedented capability. Instead of an imagined total meshing of brain and machine, current methods affect only portions of the brain and enhance only aspects of cognitive ability such as perception, not the entire brain. We are still a long way from augmenting the whole brain, or even achieving that science-fiction dream of uploading its contents to a computer.

In 2019, Princeton University neuroscientist Michael Graziano explained why.7 He believes mind uploading will happen, but only after we simulate the 86 billion neurons in our brains and reproduce how they are connected through 100 trillion synapses, the “connectome” that shapes whole-brain functions. “The most wildly optimistic predictions place mind uploading within a few decades, but I would not be surprised if it took centuries,” Graziano wrote. Since neuroscience is rapidly developing, I recently asked Graziano if, three years later, he had seen any progress that would alter his original assessment. His response: “Decades to centuries is still my guess.”
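The raw numbers hint at why. Even a cartoonishly minimal representation of the connectome, say four bytes per synapse to hold a single connection weight (an assumption chosen only for illustration), runs to hundreds of terabytes before any dynamics are modeled:

```python
# Back-of-envelope arithmetic on the connectome numbers in the text.
# The bytes-per-synapse figure is a loose illustrative assumption;
# real models would need far more state per synapse and neuron.
NEURONS = 86e9                 # neurons in a human brain
SYNAPSES = 100e12              # synapses connecting them
BYTES_PER_SYNAPSE = 4          # assumed: just one weight, nothing else

storage_tb = SYNAPSES * BYTES_PER_SYNAPSE / 1e12
print(f"~{storage_tb:.0f} TB just to store one weight per synapse")
# ~400 TB before modeling any dynamics, plasticity, or chemistry --
# one hint at why "decades to centuries" is a plausible guess.
```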

Neurotechnology is evolving, but not explosively enough to bring humanity to a new stage by 2045. Future projections of technology often depend on two assumptions: the technological imperative—new technology will always come, and once available, people will develop and exploit it to the fullest; and exponential growth, exemplified by Moore’s Law, which states that the number of transistors on a computer chip doubles roughly every two years.
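Moore’s Law is easy to state as arithmetic: if the count doubles every two years, then N(t) = N0 × 2^(t/2). A few lines make the compounding vivid, starting from an arbitrary round figure of a billion transistors:

```python
# Moore's Law as arithmetic: transistor count doubling every ~2 years.
# The starting count is an arbitrary round number for illustration.
N0 = 1e9                        # transistors on a chip at year zero
for years in (2, 10, 20):
    n = N0 * 2 ** (years / 2)
    print(f"after {years:2d} years: {n:.1e} transistors")
# Exponentials like this look unstoppable on paper; the article's
# point is that physical limits can flatten the curve.
```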

Neither assumption is inviolable, nor necessarily appropriate for neurotechnology. Exponential growth can reach a plateau: We may already be at a limit of chip technology where Moore’s Law no longer applies. And the technological imperative should not be our sole guide to how humanity can help itself evolve beyond its biological heritage. Unlike chip technology, neurotechnology inherently affects people, from the ill, injured, and disabled, to citizens who may or may not want their brains to be accessed. Here the technological imperative needs to be tempered by an ethical imperative, worked out by society, which would, and should, slow the evolution of brain and machine until we know it benefits humanity.

Sidney Perkowitz is the Candler Professor of Physics Emeritus at Emory University. His latest books are Physics: A Very Short Introduction (2019, audiobook forthcoming 2022) and Science Sketches: The Universe from Different Angles (2022).
