Ted Simons: Good evening, and welcome to "Horizon." I'm Ted Simons. Researchers at Arizona State University, in conjunction with Phoenix Children's Hospital, are working on an interface that will allow the use of brain signals to control electronic devices and artificial limbs. The study is funded by a grant from the Arizona Biomedical Research Commission. I'll talk to the lead researcher, but first, Mike Sauceda tells us more in this edition of Arizona Technology and Innovation, "Horizon's" multimedia effort that focuses on people, ideas, businesses, and technologies that are shaping Arizona's future.
Narrator: This ball is being controlled directly by the brain of a monkey with a brain implant. It's part of research being conducted by Arizona State University's School of Biological and Health Systems Engineering and Phoenix Children's Hospital. Research on measuring brain waves is taking place at ASU's Biodesign Institute. ASU student Remy Wahnoun is working on the research. He explains how researchers can figure out how to read brain signals.
Remy Wahnoun: Basically, in the part of the brain that controls the limbs, you have neurons, and those have a preferred direction of movement. When you go in one direction, they will be really excited; when you go in the other one, they will be less excited. So if you record from them, you can actually find out where the person is trying to go, and using this we can actually recreate a movement.
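The directional tuning Wahnoun describes is often modeled with a cosine tuning curve, and the classic way to read out intended direction from many such neurons is a population vector. Here is a minimal sketch of that idea; the neuron count, baseline, and gain values are invented for illustration and are not the lab's actual decoder:

```python
import math

# Each simulated neuron fires most when movement goes in its preferred
# direction and least in the opposite one (cosine tuning). All parameters
# here are illustrative assumptions, not measured values.

def firing_rate(movement_angle, preferred_angle, baseline=10.0, gain=8.0):
    """Expected firing rate (spikes/s) of one neuron for a movement direction."""
    return baseline + gain * math.cos(movement_angle - preferred_angle)

def population_vector(preferred_angles, rates, baseline=10.0):
    """Estimate movement direction by summing each neuron's preferred
    direction weighted by how far its rate sits above baseline."""
    x = sum((r - baseline) * math.cos(p) for p, r in zip(preferred_angles, rates))
    y = sum((r - baseline) * math.sin(p) for p, r in zip(preferred_angles, rates))
    return math.atan2(y, x)

# Eight simulated neurons with evenly spaced preferred directions.
prefs = [2 * math.pi * i / 8 for i in range(8)]
true_direction = math.pi / 3          # the movement the "brain" intends
rates = [firing_rate(true_direction, p) for p in prefs]
decoded = population_vector(prefs, rates)
print(decoded, true_direction)        # decoded direction ≈ intended direction
```

With ideal cosine tuning and evenly spaced preferred directions, the population vector recovers the intended direction exactly; real recordings are noisy, which is why many cells are recorded at once.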
Narrator: Wahnoun says there are two types of brain implants: one that penetrates 1 or 2 millimeters into the brain, and another that lies on top of the brain.
Remy Wahnoun: The first one implants in the brain, so we're trying to avoid this because the brain will react to it. The second way is to use surface implantation -- they're on the brain, so they don't penetrate it. This is pretty safe; they can have it up to a few weeks without any problem. So given this, you could use the implants to control your limbs, or you could actually use that as a remote control -- someone who is completely paralyzed would be able to do all of this, and hopefully eventually get full feedback from it: control your arms and get back feeling from them.
Narrator: The implants are being placed on epileptic children who are having brain surgery because of their disease. He says the kids learn quickly how to control devices directly with their brains.
Remy Wahnoun: We already have three patients, and they got pretty good control of the interface, so we're working on making things better. With the interface, the primary goal is to actually help people who have lost mobility, and the secondary goal is to better understand how the brain is working.
Ted Simons: Here now to talk about efforts to create a brain computer interface is Stephen Helms-Tillery, an assistant professor at ASU's School of Biological and Health Systems Engineering. Good to have you here. Thanks for joining us.
Stephen Helms-Tillery: Thank you, my pleasure.
Ted Simons: Let's define terms here. Brain computer interface. What are we talking about here?
Stephen Helms-Tillery: We're talking about a way to interface the brain, obviously, to an electronic system like a computer. We already have an interface to our body: our brain controls our limbs through the spinal cord, which projects to our muscles. If you can't control your limb, maybe you need to control something else, like a computer game or a robotic arm. To do that you need an interface to a computer, which interfaces with the technology.
Ted Simons: Talk about how this was developed. I imagine the first step would be to control -- you have the interface to the electronic device, and the next step would be electronic devices that could actually be used.
Stephen Helms-Tillery: Right. It's actually a very long development. This comes from work in the basic science of how the brain controls movement, which has expanded over 50 years. In that time we've learned what Remy discussed about how specific neurons relate to movements, and so we learned in the 1980s or so that if we had enough cells at one time -- enough signals from the brain at once -- we could use that to predict, or read out, what the brain is trying to do with the movement. Once we knew that, we could take that readout and use it to control any other kind of system. So in the laboratory, in the early 2000s, we used those signals to control first a video game in virtual reality, and then a robotic arm to pick up pieces of food -- to feed yourself with, for example. There's a whole chain of processing and several computers involved in processing those signals, but once we had fast enough computers, we knew enough about the brain that we could actually take those signals and do something with them.
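The "chain of processing" Helms-Tillery mentions can be sketched in miniature: spikes are binned into firing rates, a linear decoder maps the rates to a cursor velocity, and the cursor state is updated each time step. The channel counts, decoder weights, and spike times below are invented for illustration; real decoder weights would be fitted from recorded data:

```python
# Illustrative sketch of a spike-to-cursor processing chain. All numbers
# are made up; a real system fits the decoder weights from data.

def bin_spikes(spike_times, t_start, t_end):
    """Count spikes per channel in one time bin -> firing rates (Hz)."""
    width = t_end - t_start
    return [sum(t_start <= t < t_end for t in ch) / width for ch in spike_times]

def decode_velocity(rates, weights):
    """Linear readout: each output dimension is a weighted sum of rates."""
    return [sum(w * r for w, r in zip(row, rates)) for row in weights]

def step_cursor(pos, vel, dt=0.05):
    """Integrate decoded velocity into a new cursor position."""
    return [p + v * dt for p, v in zip(pos, vel)]

# Two recording channels; hypothetical decoder weights.
weights = [[0.2, -0.1],   # x-velocity weights
           [0.05, 0.3]]   # y-velocity weights
spikes = [[0.01, 0.04, 0.07], [0.02, 0.09]]   # spike times (s) per channel
rates = bin_spikes(spikes, 0.0, 0.1)          # ≈ [30, 20] Hz
vel = decode_velocity(rates, weights)          # ≈ [4.0, 7.5]
pos = step_cursor([0.0, 0.0], vel)             # cursor moves up and right
print(rates, vel, pos)
```

In a working system this loop runs continuously, tens of times per second, which is why the fast computers he mentions matter.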
Ted Simons: Talk about what's going on in conjunction with Phoenix Children's Hospital, using epileptic children -- these are kids that will have to have some sort of brain surgery anyway, and the experiments are precursors? How does that work?
Stephen Helms-Tillery: This is a very unique opportunity. These are children who have intractable epilepsy, which is normally treated with medications. They give the kids medications, and at some point it doesn't stop the seizures enough. What is done now is a neurosurgical procedure: often epilepsy is caused by sort of an irritation in a very localized part of the brain, so the procedure would be to go in and separate that part of the brain from the rest of the brain, or actually even remove it. But before you do that, you can imagine if you're going to have a piece of your brain separated from the rest, you want to make sure that it's the right piece of brain that gets separated, and that you don't separate other things, like the part of your brain that controls language, or the part that controls movement. So you want to limit collateral damage. What they do is place these grids, either on the surface or sometimes deep inside the brain as well, of what's called ECoG, or electrocorticography -- they are little discs that sit on the surface of the brain and record the signals. They watch the children, and when the children have seizures, they can localize where in that grid the seizures originated from, which sort of helps them guide the treatment surgery. But during that time, the kids sit in a clinical room for a week or 10 days, and they're stuck in this room, because they have to be monitored continuously -- they're being videotaped, because seizures can have all kinds of characteristics and they want to catch the seizures. So we have an opportunity to go into that room and say, look, we have a very interesting research problem, which is: if we have an interface to the brain, what are the characteristics of the signal processing we can do to take signals out of the brain and use them to control a prosthetic system? We talk to the parents and the kids about it, and give them the opportunity to play video games just by thinking about it.
So the kids learn, for example, to play a soccer game where they're controlling the motion of the ball to make it go left and right, but instead of using a joystick, they just use signals inside their brain.
Ted Simons: And it works, doesn't it? The kids must be fascinated by this.
Stephen Helms-Tillery: The kids seem to love it so far. We've had three patients, and it's a mix because the kids are not in the best of physical conditions, so -- but they've been very engaged and really enjoyed having this sort of new challenge to occupy their time.
Ted Simons: These sensors now -- what do they control? Can they control touch, can they control gradients, can they control temperature? Is the sky the limit as far as what these things can do?
Stephen Helms-Tillery: The sky is the limit. Right now what we're doing is taking signals out and using them to control something else, but if you want to use your hand to pick up a glass of water, you need to feel the glass. You need to know how your hand touches it, how hard you're touching it, what the temperature is -- all those things. So the next step in the development of these prosthetic systems is going to be to put information back into the brain. We're also using this environment now to study that, and we're developing tasks where, in addition to having the kids control video games, we can maybe stimulate their fingers and look at what the signals look like in the brain.
Ted Simons: This is big news, or at least big possibilities, for victims of stroke, ALS, those things. Can you also use this in terms of communication?
Stephen Helms-Tillery: Right. I think the first patients where these technologies have been applied have been patients with locked-in syndrome. These are really sad cases: this is a person who has had, for example, a brain stem stroke -- there's a really fascinating French movie called "The Diving Bell and the Butterfly" about this. What happens is people become profoundly paralyzed. The only thing they can control is their eye movements, but otherwise, cognitively they're normal. They have normal aspirations, and dreams, and thoughts, but they're locked in this body they can't control. Those are the first people where implants were actually put into the brain so they could control things. And one of the first things that people in that condition want to be able to do is communicate. So you give them access to something like a virtual keyboard on a screen, so they can move the cursor and peck keys on the screen and write emails. That's sort of the first line of technology development here.
Ted Simons: OK. So we've gotten this far, you've gotten this far -- what's next? How much do we still not know about brain signaling, interfaces, and these sorts of things?
Stephen Helms-Tillery: There's a lot that we don't know about how adaptive they are -- how many different kinds of things can you control? If we have an interface to your brain, you can control something like an arm, but could you control an airplane? Could you control a wheelchair? What are the limits of what you can learn to do? Could you control a whole bunch of things at once? And then the big open question, the thing that's really the hard challenge now, is: how can we put information back in? What can we use that for? So we're beginning to develop technologies where we're interfacing multiple brains to computers at once, to see if one brain can maybe coach another brain in using these sorts of implants.
Ted Simons: Wow. The collaboration between ASU and Phoenix Children's Hospital, talk to us about that.
Stephen Helms-Tillery: This is really a super pleasure. I've been working on this sort of work in the lab for 10 years at ASU, and when Dr. Addleson came, one of the things he wanted to do very quickly was get involved in research, and especially this kind of research, brain interfaces. So it's been gratifying to take the technologies we developed in the lab, move them directly into the clinical environment, and begin to do that development. Dr. Addleson has been amazing in putting these pieces together and allowing these interactions. It's typically difficult with clinicians because clinicians are very busy; they have a lot of work that they have to do. And he's done a very nice job of structuring things so we actually have time to do these interactions and make these experiments happen.
Ted Simons: It's fascinating stuff. And we're so glad to have you on to talk about it. We'll try to get you back on when the next phase comes through. Great work, thanks for joining us, we appreciate it.
Stephen Helms-Tillery: Thank you very much.