
Brain technology companies in Silicon Valley seek to enhance human minds

At the age of 21, Bryan Johnson arrived at a decision: He wanted to make a lot of money. And he did. In 2007, at age 30, he founded Braintree, a web payment system for e-commerce companies. In 2012, Braintree acquired Venmo, and in 2013, Johnson sold his company to eBay for $800 million.

Before he sold Braintree, Johnson was already thinking about his next step. If he could throw millions at one thing, what would it be? He imagined humans evolving into an “entirely novel form” by enhancing cognition.

This is how Johnson entered the world of neurotechnology, a field that uses technology to better understand, and even repair or enhance, the brain. More specifically, he’s interested in brain-computer interface (BCI) technology that links communication between the brain and a device.

In 2016, Johnson founded his own neurotechnology company, Kernel. While his ultimate dream of cognitive evolution sounds pie in the sky, Kernel’s current goal is more contained: to explore and measure the brain by making the tools to do so more accessible and capable.

After a few years of development and a pivot or two, the company developed helmets that gather information about the brain’s blood flow and electrical impulses. In July, Kernel started shipping out its $50,000 Flow helmet to partners, including research organizations, a pharmaceutical company developing psychedelic mental health treatments and an esports training platform.

Kernel is part of a decadeslong history of people trying to get closer to the human brain through technology. Dozens of people around the world already live with implanted technology in their brains — allowing people with epilepsy to predict seizures, or people with paralysis of facial muscles to communicate through a computer.

Meanwhile, a growing batch of wellness-oriented consumer products makes dubious claims of reading brain waves or stimulating the brain directly through the scalp. These claims leave consumers with a question: Is it all just hype, or are there real repercussions worth fearing?

If Silicon Valley hype men like Elon Musk are to be believed, technology being developed today will soon be able to cure diseases and allow us to control our intelligence, emotions and more. But even if that’s a pretty big exaggeration (and most experts believe it’s a very big exaggeration), it’s true that neurotechnology is in a particularly interesting place in 2021.

As more investment than ever goes toward BCI technology and it becomes available to more people, we may soon be confronting questions of how much we want to understand about our own brains — and the lengths we’d be willing to go to change our own minds.


Your reaction to the idea of measuring and eventually enhancing the brain might depend on how familiar you are with neurotechnology. Those who study it will be the first to tell you that mainstream tinkering with the brain is likely a long way off.

“I always say we’re a little bit like the Wright brothers at Kitty Hawk, saying, ‘Look! It can fly … for 200 yards,’” Florian Solzbacher, co-founder of BCI company Blackrock Neurotech and a professor at the University of Utah, says. “We’re not in the golden age of jetliners or going into space.”

Much of the neurotechnology that exists today is not as newfangled as you may think. You’ve probably heard of one of the earliest BCIs authorized for use in the United States, in 1984: the cochlear implant, a device that picks up external sound and stimulates the auditory nerve to restore a sense of sound to people with hearing loss. It falls under the category of neuroprosthesis, or instruments that can stand in for or enhance what parts of the nervous system do — including sensory systems like vision and hearing, or motor systems like speaking and moving.

In the 1990s, Solzbacher’s colleague at the University of Utah, Richard A. Normann, developed a small device called the Utah Electrode Array that could be implanted in the brain to receive signals from sensory and motor neurons. The Utah Array is now one of the most common implanted neural interface devices. It’s often used in the types of BCIs you’ve likely seen in the news, like prosthetic hands that can be controlled mentally.

It’s estimated there are about 30 people in the world with this kind of BCI implant, most of them using devices developed at Blackrock Neurotech. In fact, the first patient to ever use a BCI implant to restore functionality after paralysis, Matthew Nagle, used a Utah Array by Blackrock. Nagle, whose spinal cord was severed in a stabbing in Massachusetts in 2001, was able to use his thoughts to control basic computer functions like moving a cursor, pressing buttons, typing and drawing. He used the BCI from when it was implanted in 2004 until his death in 2007.

Now, some prosthetic arms can even relay sensory information to the user so they can feel again, and BCIs exist to treat a range of illnesses. The pace of technological advances and first-in-human demonstrations (as opposed to testing in animals) has accelerated over the past five or so years. “It’s quicker and quicker almost every month,” Solzbacher says.

As neurotechnology improves, BCI devices (especially the noninvasive kind that don’t require surgery, like Kernel’s helmet) will become less expensive and more compact. Whether they’ll make the leap from specific research and medical uses to more widely available consumer products is an ethical question that will depend on societal consensus, Solzbacher says, adding that “Blackrock has created an ethics supervisory board to make sure that we are consciously addressing these questions early.”

Anders Sandberg, a senior research fellow at the University of Oxford’s Future of Humanity Institute, likens this moment less to the tech-frenzied dystopia of the “Black Mirror” TV show and more to the advent of desktop computers that people could purchase, instead of the early mainframe computers that filled rooms.

“Maybe the 2020s are going to be like the 1980s,” he says. “There was this enormous explosion of computers, programming, hacking and computer games.”

As Sandberg and other ethicists see it, neurotechnology is likely to be like any other technology in one major way: The people who create it probably have no idea how people will actually use it. Much like with the internet, he says: “it’s curious; once you reached that point of affordable computation, they did the cat pictures. Nobody was envisioning that cat pictures had anything to do with it.”


Bryan Johnson grew up in a Latter-day Saint family in Springville, Utah, playing in open fields and reservoirs or weaving in and out of traffic on a bike, racing friends to school.

“I guess that’s maybe only comfortably done in a small community,” he says. “We just had a blast.”

His first time leaving Utah was a two-year mission trip to Ecuador; he remembers coming back with a desire to be useful and improve other people’s lives. He pursued entrepreneurship because he “wasn’t really skilled at any one thing,” but knew how to be resourceful and tenacious.

By the time Johnson sold his company, he’d had three kids, been married for 13 years and struggled with chronic depression for 10 years. “I was questioning everything,” he says. “Literally everything I’d ever learned was now fair game to be analyzed with a different set of eyes.”

Johnson’s lifestyle is now optimized to Silicon Valley standards; as Ashlee Vance wrote in a Bloomberg profile of Johnson in June, he eats one 2,250-calorie meal at 4 a.m. and regularly undergoes tests in case recalibration is needed. He blogs often about treating his thoughts like an erratic co-worker whose impulses cannot be trusted.

“I fired my conscious mind from deciding what foods to eat. I asked my liver and my heart and my DNA methylation, ‘What does everybody need?’ And then they generate the shopping list,” he tells me of his diet. “Now I’ve never experienced more pleasure eating, but a lot of people make the wrong conclusion that I’m somehow depriving myself or that I’ve lost all joy.”

The assumption that his mind is probably irrational or wrong about everything, and he just has to figure out how to correct it, is central to Johnson’s life. He has a process of second-guessing his own thoughts so that he can be hyperaware of his own biases. But the idea seems to have much deeper roots than a run-of-the-mill preoccupation with optimization and rationality. During his worst years of chronic depression, “my brain was telling me things every day, all day, that weren’t correct,” he says. “It was filling my head full of things like, ‘Life is not worth living,’ and ‘Everything is awful’ and ‘You’ll never feel joy again.’”

It makes sense, then, that the driving obsession at Kernel is getting to know the mind in concrete terms. “We get tricked into thinking that our self-awareness is a complete sentence for what is going on in our brain or our minds,” Johnson says. But he compares that to a doctor asking a patient how their heart feels. “You’d say, ‘Thanks, doc, for asking, but please could you do an EKG, a blood panel?’ … You’d want measurement.” He imagines finding out, in concrete terms, how social media or sugar affect the brain. We already have baseline measurements, like blood alcohol levels, that help us know when someone is unsafe to drive; why, Johnson reasons, wouldn’t we want similar measurements for the brain?

The Flow helmet works by observing blood flow in the brain. The brain responds to stimuli and changes like physical activity or temperature by adjusting blood flow, so observing it over time can show which regions the brain sends blood to when it experiences certain changes. This information reveals just a portion of how the brain works — but the usual method of observing blood flow in the brain takes much longer and requires the subject to lie perfectly still in an fMRI machine rather than wear a helmet.

Kernel’s other product, Flux, is meant to complement Flow’s data by sensing and tracking the activity of brain cells that communicate with other cells through electric currents. Flux’s magnetometers sense magnetic fields produced by those currents; this, Johnson says, is like seeing activity from the brain’s outer area, or the cortex, in 1080p resolution. (Again, this technology already exists, but in the form of bulky machines.) For now, that product will remain in-house; chosen partners will participate in studies to gather data through Flux, which requires wearers to sit in a room with a large helmet on their head.

The value of Flow lies in how many more people’s brain activity could be tracked, for longer stretches and during a wider range of activities that involve movement. Johnson himself has participated in a study where he slept in the Flow helmet; he says he learned that deep sleep positively affected his self-discipline. He believes it would take a relatively small number of volunteer participants to start generating interesting insights and establishing baselines. For now, the only people using Flow will be organizations and companies doing brain-related work.

“I built the technology because I believe that it needs to exist, not because there’s an active market out there,” he says.


The growing number of people getting in on neurotechnology presents conflicting problems: Companies — attempting to appeal to consumers — are exaggerating the capabilities of existing technology, while consumers are left to wonder if scary brain-enhancing and brain-zapping implements are nigh.

“It is way too easy to transition from sort of overhyping of technology to fear on the other hand,” says Blackrock’s Solzbacher.

Implanted BCIs introduce a whole set of questions around patient experience: Researchers have found that patients can start to see their devices as part of themselves, or feel that the device changes their personality. Things get even trickier when it comes to noninvasive BCIs — after all, most people who are willing to undergo surgery for an implant really need it to make their lives easier, so they weigh their options differently and more carefully.

As neurotechnology becomes more of a consumer product, it will likely be harder for customers to know what they can actually expect from a BCI. And loopholes are being created: Claiming that a device is “nonmedical” has been used to avoid closer scrutiny. Veljko Dubljević, an assistant professor of philosophy, science, technology and society at North Carolina State University, calls the world of noninvasive devices “the Wild West.”

Further out, it’s worth considering what it would mean to be able to augment our intelligence. If we could offload certain thinking activities, like doing math, to a device that interacts directly with our brain, we’d need to think hard about why we value education. “I’m pretty sure that currently, as things stand, we still want people to learn,” Dubljević says dryly.

The technology has yet to catch up with worries about how BCIs could promote inequality and change our identity, but ethicists like Dubljević say guardrails like stronger regulations and privacy policies should be put up now. “The idea is that we should have the rules of the game clear,” he says. “Make it so that this technology is for the public good, and not just for one set of actors to make a lot of money quickly.”

Johnson says that Kernel has drawn up acceptable use guidelines and privacy standards for researchers using the Flow helmets, and will continue building on that as the products develop. But he doesn’t believe that Kernel’s products in their current form raise many of these speculative red flags because they don’t fit into the same categories as many of the BCIs we imagine.

On the other side of the spectrum from medical BCIs, he says, “people are familiar with ‘The Matrix’ scene of downloading Kung Fu, or ‘Black Mirror,’ imagining these dystopic outcomes. There aren’t really a lot of other imagination sets in between.”

Because Kernel’s products are currently only used for research and measurement, Johnson takes a largely agnostic stance when it comes to speculation.

“The inventor of a technology is always wrong on how the technology will be used,” he says. On this, ethicists agree.


While it may be an easy answer for people like Johnson, not everyone will want to know everything about their brain. (Have you really never felt that ignorance was bliss? I asked him. “No, I’ve had universally positive experiences with measurement,” he said.) And plenty of people will balk at the idea of becoming part machine.

But to some degree, Sandberg points out, we’re cyborgs already — we all use technology and rely on it almost as if it were part of our own bodies, whether it’s eyeglasses or our phones. “Even though most of the time technology doesn’t go inside our bodies, it affects us.”

At the end of the day, the ironic thing about neurotechnology is that it will inevitably come to be influenced by the quirks of humanity. We’ll want to manage neurotechnology so that people don’t mess around with it in ethically questionable ways. “But as if that ever stopped people,” Sandberg says. “I think we are going to do a lot of very stupid things. We’re going to hype a lot of technologies to death. But I do think we can learn quite a lot from it.”

Johnson’s view, as he sees the first version of his concept make its way into the world, is perhaps necessarily vague — and hopeful. He’d like people to push back against the most common reaction to neurotechnology: a fear of the unknown. He sees a need for what he calls future literacy, the ability to deal with emerging technologies and developments without having all the answers.

“It requires us to be able to deal with uncertainty in novel ways,” he says, “and to start finding ourselves being motivated by that which we can’t describe.”

One of the greatest uncertainties may be just how many people will be motivated by the same desire to change our relationship with our minds.

This story appears in the September issue of Deseret Magazine.