
A Dive into the Future of Brain Computer Interfaces

BSides Perth · 2023 · 33:04 · 743 views · Published 2023-08 · Watch on YouTube ↗
About this talk
Kai Frost explores brain-computer interfaces (BCIs) as an emerging authentication mechanism. The talk covers current BCI technology—from eye-tracking to EEG headsets to implanted electrodes—and demonstrates a practical proof-of-concept using machine learning to recognize specific thought patterns for Linux system authentication. Frost discusses security implications, the challenge of compelled thoughts, and emerging implantable BCI platforms like Neuralink and Synchron.
Transcript [en]

[Music] All right, quiet please and take your seats. We're back on track with the schedule now. Over to Kai.

Thank you. Hello everybody, thank you for coming to my talk. My name is Kai Frost, and I'll be doing a bit of a dive into the future and current technology around brain-computer interfaces and their potential for authentication. Now, I see you all looking at me and I can tell what you're all thinking: you're wondering where I got this wonderful pen, and I will get to that. But first I want to talk about this wacky device on my head. So, there is a bit of a heads-up (pardon the pun): a trigger warning. This talk will contain discussion of the surgical implantation of electrodes into living human brains. I have not included any medical imagery in this talk at all, but if this is the kind of thing that squicks you out, this may not be the talk for you. I just wanted to put that up front in case anyone finds discussion of surgical techniques disquieting.

So, who am I? I was forced to put this slide in. My work history is actually terrifyingly close to Drew's, so we're bookending today. I've worked 20 years in the ISP industry, across server administration, network and routing, and

programming. I switched over to cybersecurity about five years ago, and for the last two years I've had the very great pleasure of working for CrowdStrike in a variety of roles; I currently work as one of their principal security analysts. I've spoken at BSides previously: I talked about circumnavigating full-disk encryption on Linux hosts and extracting AES keys from memory, and I also spoke about phone hacking networks for fun and profit (mostly profit). All of this has led to me being a bit of a massive nerd with an unhealthy fascination with authentication and futurism, so we'll see where that takes us. What am I going to talk about

today? So, today I'm going to talk about: what are brain-computer interfaces; what was the inspiration for this talk; what I set out to achieve, and how I achieved it. We'll do a bit of a demo, and if anyone's been to any of my previous BSides talks, they know demos are a cursed proposition for me, so I have pre-recorded my demo this time. Assuming my laptop doesn't catch fire in the middle of it, which is a real possibility, we should see what happens. Then I'll talk about the security considerations of using your brain as an authentication method for getting into systems, and a bit about the future of BCIs and what is on the horizon. This actually follows on remarkably well from the previous talk: I'm going to talk a little about machine-learning models and their proper application, as opposed to the horrible things they're used for today, and hopefully I'll have some time for questions. BCI: brain-computer interface. At its simplest, a BCI is a method of taking signals from your mind and translating them into something a computer can understand and react to. Technically, your hands on a keyboard are a BCI: they take the signals from your brain, translated through your fingers on a keyboard, into signals a computer can understand. We obviously don't think of that as a BCI; we've had those for years and we're

very used to them. Traditionally, there are three areas thought of as modern-day BCIs. The first is electrooculography, or EOG: eye tracking. That's a series of devices, set up either as a headset or around your eyes, which track where your eye is looking. These are prominent in modern VR headsets, and can be used by people who are disabled, quadriplegic or paraplegic, to select letters and keys on a computer screen. Then there are devices like this one, an electroencephalograph, or EEG: a brainwave reader. It has a series of electrodes and sensors all through it (this one has

14), and it reads the electrical impulses of my brain through my skull. Now, skulls are pretty thick, some of us more than others, and that signal is pretty low fidelity. It's also very generalized: nerve cells are tiny and these are big fat electrodes, so they're picking up the general firing of what's happening in different parts of my brain. We've got the motor cortex; we've got the prefrontal cortex, where we actually live our conscious lives, our thoughts. So we're getting general signals out of the different areas of the brain. The high-fidelity stuff is taking an EEG and actually putting it inside a person's head, and this is used a lot

in the medical realm these days. They implant electrodes for direct reading of brain signals into a computer, and those are very high fidelity, because you can get a sensor very close to a particular nerve cluster and pull very clean signals out of it. The creepy thing about putting electrodes into the brain is that it can become a two-way interface: you can actually stimulate parts of the brain using that same electrode. That's very useful, but it also has more nefarious possibilities, which we'll talk about later in the talk, and which I'm sure are pretty obvious from the outset.

So what are BCIs used for today? Probably the most common example, the one you'll be most exposed to, is cochlear implants. They sink a single electrode into the auditory-processing part of the human brain (some of the higher-fidelity cochlear implants have multiple wires), and use it to stimulate the hearing part of the brain, allowing people who have lost their hearing to regain it. A really good use. Deep brain stimulation is another area where they're used increasingly: they've found that by sinking electrodes into certain parts of the brain and stimulating them with electrical pulses, you can do things like remove the

tremors from people with Parkinson's. The tremors stop: turn on the BCI and their hand goes stable. They can also be used to decrease the severity and occurrence of epileptic fits in some people. One of the more advanced areas these days is quadriplegic computer control: again, electrodes sunk into the brain, reading signals from the motor cortex. This can be used to control a cursor on a screen, or, in the more advanced implementations like we're seeing here, to control an electronic arm, with much training. So we can take people who have locked-in syndrome, people who are quadriplegic or paraplegic, and give them

back the use of a limb. In some areas they're talking about bypassing the nervous system and directly stimulating the muscles, giving people back control of their native body via a BCI implanted in the brain. Then obviously there's VR eye tracking, which we've already talked about. Some of the more nefarious uses for eye tracking are attention tracking in advertising: your webcam can track where your eye is looking, and advertisers can test the effectiveness of their advertising in a browser. What do you look at? What draws your eye? What's fascinating to you? Then they improve those algorithms based on your attention to make you really want your waifu. And with headsets like the one I'm

wearing, some of the more interesting applications: there's a Twitch streamer who recently used one of these exact headsets to beat the final boss in Dark Souls. She trained the model over time to understand forward, backwards, left, right, attack and roll (roll being the most important in Dark Souls, as we all know), and then defeated the final boss hands off keyboard, using only the power of thought to control the character in game. That sounds impressive, but considering that the year before someone beat a Dark Souls boss using a pair of bongos, it's probably less impressive. Still impressive nonetheless. So what was the inspiration for this project? I've been interested in brain-

computer interfaces for years. I'm a bit of a futurist, a technologist, a transhumanist, and I'm on Emotiv's mailing list; they sent me a "check out our new cool headset" email, and I thought, I really want one of those, and I've got no use case for it whatsoever. And my wife, who's a massive Whovian, said, "You know, you could do that." And I said, "What?" And she played me a clip on YouTube from a Doctor Who episode. Any Whovians in the audience may be familiar with it; I'm going to play a little clip, which hopefully works with the audio. In this scene, a couple of the Doctor's companions are running away from a

monster, as is their wont in almost every Doctor Who episode, and they're trying to get into the control room of the TARDIS, which is protected with a password, as all good secure areas in a spaceship should be. The password is transmitted to them, and they use it in the following manner.

[Clip plays. "Where is this place?" ... "She just sent me the password." The companions concentrate on the password, and the TARDIS door unlocks.]

She successfully authenticated into the TARDIS using just a memory: her concept of crimson, her 11th birthday, her marriage, and the smell of petrichor. And I thought: you know what? I reckon I can do that. So I set out to actually do it for real. And I wanted it to be not me thinking the words "11th birthday", but my actual memory of my 11th birthday. These headsets have enough sensors picking up on different parts of your brain, including your sensory regions, your skin, that if you can vividly imagine the scene well enough, you can capture those signals and train an ML model to tease them out, so that the whole experience of that 11th

birthday can be your password. The neat thing about that is that every human brain develops differently; they wire up differently. As your neural networks grow, they become as unique as your fingerprint. So I can tell you what my password is: it's a vivid memory of my 11th birthday. Doesn't help you in the slightest. If you put this headset on and thought about my 11th birthday, it wouldn't unlock the system, because you don't have my experience of my 11th birthday, and you cannot have it. I can describe it to you in infinite detail: little Bobby was crying because he lost musical chairs for the 11th time, the smell of the cake, the taste of

the lemonade. You would have to have all of that from my brain in order for this to work, and you cannot replicate it. So my primary goal was to see if I could do it with the hardware available now, and actually achieve that kind of outcome. The headset I ended up getting is a 14-channel EEG headset with excellent fidelity; it gets really good readings. This is from Emotiv. There were a few options at the time, as they do a couple of other headsets. There's the Emotiv Insight, which is much, much cheaper (you don't want to know how much this thing costs), but it's a four-channel one, and I didn't feel that was enough fidelity

to really give me a good signal for a secure password, so I didn't go with it. Then they do the Emotiv Flex, which is much more of a research-grade headset: a pull-on skull cap where you can put the sensors wherever you want and move them around depending on what you're researching. It's very close in price to this one, and I thought about getting one of those, but I was planning to wear it during this talk and I didn't really want to look like Louis Tully from Ghostbusters, so I decided to go with the much svelter, cooler version. And if you go to Emotiv's site, you'll notice that all their marketing for these headsets has really

attractive people wearing them, posing and looking thoughtful, so clearly they don't know the target market. The data comes out in a format like this: 14 channels of analog signals coming from the various electrodes, varying with what I think about, what I'm experiencing, what I do. Those can then go into a model which I can train. So let's see if we can get a signal out of my brain; this is as close as I'm going to get to a live demo. I've got the headset on and, in theory, connected... there we go, that's my live brain right now. If you look just down there at the purple, you can see the panic I'm

experiencing by talking in front of a room with this many people. But I can actually do things like pinch my cheek (that hurt): there's my pain. I can stroke the back of my arm, and you can see that causes signal, again in the purple line, which is right where my motor cortex is. If I stop and relax a bit, the signal goes flat. Just by talking, you can see that I'm thinking; my very words are being translated into signals. If I think about a bitter lemon, you get a spike there, coming from the taste part of my brain. So I can use this data to generate a model of a particular thought.
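The per-channel readout described above can be sketched as simple anomaly detection: keep a rolling baseline for each of the 14 channels and flag samples that deviate sharply from it. This is an illustrative reconstruction, not Emotiv's actual processing; the channel layout, window length, and threshold here are all assumptions.

```python
from statistics import mean, stdev

# Sketch: flag per-channel "spikes" in a 14-channel EEG stream by
# z-scoring each new sample against a rolling baseline window.
N_CHANNELS = 14
Z_THRESHOLD = 3.0  # how many standard deviations counts as a spike

def detect_spikes(baseline, sample, z=Z_THRESHOLD):
    """Return indices of channels whose new value deviates from baseline."""
    spikes = []
    for ch, value in enumerate(sample):
        window = baseline[ch]
        mu, sigma = mean(window), stdev(window)
        if sigma > 0 and abs(value - mu) / sigma > z:
            spikes.append(ch)
    return spikes

# Toy usage: a near-flat baseline on every channel, then a jump on channel 3
# (imagine the "bitter lemon" spike on a taste-related channel).
baseline = [[10.0, 10.2, 9.8, 10.1, 9.9, 10.0, 10.3, 9.7]
            for _ in range(N_CHANNELS)]
quiet = [10.0] * N_CHANNELS
excited = list(quiet)
excited[3] = 25.0

print(detect_spikes(baseline, quiet))    # → []
print(detect_spikes(baseline, excited))  # → [3]
```

A real pipeline would of course work on feature vectors over time windows rather than single samples, but the idea of "signal against personal baseline" is the same.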

Skip back to the slides... honestly, that is the correct one... nope. Oh no, dropped out of presentation mode, sorry, just one second... there we go, excellent. So, in order to get any sense out of that noise, we have to train the ML model. If I think about my 11th birthday sitting at my desk in my nice quiet office, you get a certain set of signals. If I then do it while smelling some incense, I get a different set of signals, but my thought is still in there somewhere in the noise. If I hop on one foot while doing it, I get a different set of signals; likewise if I drink a

lemonade, or if I'm presenting in front of 500 people. So what you have to do is train the model: you say "I'm thinking the thought", you hit record, and it records that stream, and then you do it over and over again. I had to train it about 15 times to get it to work well for this project. What it does is look for the commonalities among all the spikes and dips in the signal, the things that are mostly the same between recordings. It's quite fuzzy logic, so it allows for deviations: if I don't quite remember the sound of Bobby's crying as

loud as it was, it allows for certain deviations in those areas. So yeah, it took about 14 or 15 runs for me to capture that and find the statistical similarities. Once you've got the model trained, you can think the thought and it goes "you've thought the thought", which by itself isn't very helpful. You want to get that data out of the Emotiv software, which is a black box that brainwaves go into and "the thought happened" comes out of. They have a bunch of ways to interact with the machine-learning model, two main interfaces. One is OSC; for some reason they chose the Linux Open

Sound Control format. I get it, because it's a high-fidelity, high-speed interface for transmitting data; you can basically take that and feed it over any kind of secure channel you want, over SSH to another box, to get the output. They also offer the Cortex API, which they charge for; it's a web API with no HTTPS, so I decided not to go with that. So what I've used is the OSC format, which I can then pipe to any device I want over some kind of secure tunnel. Let's look at what the OSC stream looks like. I need to go over here... this is a VM running on my Windows box, which

can subscribe to the OSC stream being put out by the host's machine-learning model. "Neutral" means it's not detecting the signal of my memory. I don't know if this is going to work, because I have not trained the machine-learning model in front of a bunch of people, but we'll give it a go... no, I don't think it's going to work; again, this is too far outside what the model was trained on. If I trained it three or four times standing in front of... oh, it came through! That was me thinking the thought. So it does come through eventually. This is the problem: now try to stop thinking about the thought.
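For anyone curious what that stream looks like on the wire: OSC messages are a simple binary format, a null-padded address string, a type-tag string, then big-endian arguments. Here is a minimal decoder for that subset; the `/com/trained/thought` address is a made-up example, not necessarily what Emotiv's software actually emits.

```python
import struct

# Minimal OSC (Open Sound Control) message decoder for int/float/string args.
def _read_padded_string(data, offset):
    end = data.index(b"\x00", offset)
    s = data[offset:end].decode("ascii")
    # Strings are null-terminated, then padded out to a 4-byte boundary.
    length = (end - offset) + 1
    offset += (length + 3) // 4 * 4
    return s, offset

def parse_osc(data):
    address, offset = _read_padded_string(data, 0)
    tags, offset = _read_padded_string(data, offset)
    args = []
    for tag in tags.lstrip(","):
        if tag == "i":
            args.append(struct.unpack_from(">i", data, offset)[0]); offset += 4
        elif tag == "f":
            args.append(struct.unpack_from(">f", data, offset)[0]); offset += 4
        elif tag == "s":
            s, offset = _read_padded_string(data, offset)
            args.append(s)
    return address, args

# Build a packet by hand the same way, then round-trip it.
def pad(s):
    b = s.encode("ascii") + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

packet = (pad("/com/trained/thought") + pad(",sf")
          + pad("birthday11") + struct.pack(">f", 0.5))
print(parse_osc(packet))  # → ('/com/trained/thought', ['birthday11', 0.5])
```

Once decoded, a listener only needs to match the address (and perhaps a confidence argument) to decide that the trained thought occurred.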

I'll talk about that in a second as well. (Why does it do this to me every time... cool.) So, using that OSC stream, I now have a way of getting my thought out of the machine-learning model, and I want to get it into my system for authentication. Unix/Linux authentication works via a subsystem called PAM (not the kind of Pam we were talking about yesterday): Pluggable Authentication Modules. They made it very generic, as the Linux people like to do, so you can create your own authenticators, which you can then tell, say, sudo, or your desktop when you log in, to go to and work through. You have criteria that are "required" or "sufficient", so you can say things like: I want a smart key plugged in, I want the password to check out against /etc/passwd. Once all those criteria are

satisfied, PAM hands "authentication successful" back to the requesting service. I used a module called python-pam, a handy little module that lets you write a PAM module in Python, nice and easy, and then you can do anything programmatically that you want, such as listening to an OSC stream and, on finding the right line in the stream, handing an authentication success back to

the requesting service. So, starting from the demonstrator code that came with it, I created a PAM module that subscribes to the OSC stream and returns "authentication OK" when the thought goes past in the stream. Then I modified the PAM configuration for both `sudo -i` and for GDM, the desktop login manager, so that a success from my Python script is sufficient. When the thought goes past, that's sufficient to authenticate; it doesn't matter whether it matches any other criteria. The minute that thought happens, it authenticates.
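A sketch of what such a module could look like. This follows the pam_python convention of a module-level `pam_sm_authenticate()` entry point, but the event address, timeout, return codes, and the OSC subscription are illustrative stand-ins, not the speaker's actual code.

```python
import time

# Illustrative thought-based PAM module in the pam_python style.
THOUGHT_EVENT = "/com/trained/thought"   # hypothetical trained-thought address
TIMEOUT_SECS = 30

def wait_for_thought(events, deadline=None, now=time.monotonic):
    """Scan (timestamp, address) events until the trained thought appears."""
    for ts, address in events:
        if deadline is not None and now() > deadline:
            return False                 # the thought never arrived in time
        if address == THOUGHT_EVENT:
            return True
    return False

def subscribe_to_osc_stream():
    # In the real module, this would yield decoded events from the
    # (secured) OSC stream; left abstract in this sketch.
    raise NotImplementedError("wire this up to the OSC listener")

# The entry-point name is fixed by pam_python; the codes are placeholders.
PAM_SUCCESS, PAM_AUTH_ERR = 0, 7

def pam_sm_authenticate(pamh, flags, argv):
    deadline = time.monotonic() + TIMEOUT_SECS
    ok = wait_for_thought(subscribe_to_osc_stream(), deadline)
    return PAM_SUCCESS if ok else PAM_AUTH_ERR
```

Hooking it in would then be one `sufficient` line in the PAM stack, along the lines of `auth sufficient pam_python.so /usr/local/lib/security/pam_thought.py` (paths illustrative).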

The demos here are recorded, I'm afraid, because I didn't want to risk standing in front of you all, visibly straining to think on cue. So, what we have here: I've tied into sudo, and over here we've got the OSC stream so we can watch it. I run `sudo -i`, and it sits there and waits for me to think the thought. So let's try to authenticate... a boring amount of time passes... I think the thought... and now I'm authenticated as root on the Linux box. [Applause] And this ties into authentication for the full desktop as well. I modified the

GDM login manager to authenticate against that PAM module too. What I'm going to do here is click on a login. Normally you'd get a password prompt here, but I've disabled the password PAM module and replaced it with my Python module. What you'll see is I put my hand down, then move my hand away and start thinking the thought, just to give a bit of a signal that it's working. So I try to log in, and it sits there. Now, the laptop I'm running this on is a bit slow, so even once it starts to authenticate it takes a while to draw

the desktop. Now I start thinking the thought... it's now authenticating... and if you give it a second, because this machine is the slowest thing in the picture... I'm logged in, just like that. [Applause] So, this is a security conference; let's talk about the security considerations. The potential attack vectors: you're only as secure as your OSC stream, obviously. If anybody can get into that OSC stream, they can just inject "yeah, sure, you thought that thought" and then they're authenticated. That's why I'd recommend running it over some kind of secure medium: SSH, TLS encapsulation, HTTPS or similar. This is a well-solved problem; we know how to create a secure

tunnel. If it's on the same machine, it doesn't matter, because it's just talking locally. But if you want to authenticate remote devices via thoughts, where you basically have a sort of Active Directory for thoughts holding all the machine-learning models, with signals sent to it, processed, and sent back as authentication responses, you just want to secure those tunnels. You could, in theory (this is a USB receiver), record and replay the raw data coming from my headset. Interesting, right? I think the thought, you record that raw data, and then you feed it back into the machine-learning model at a later date. Also, this headset does not authenticate

itself to the USB receiver; that would have to be resolved going forward for this to be considered a truly secure authentication method. Is PAM an attack surface for this? Well, in order to modify the PAM configuration files, you need to be root on the system; if you're already root, who cares. The problem I discovered early on is the "don't think of a pink elephant" problem, which I discussed with one of the other attendees earlier. One of the interesting things legally about this as an authentication method is that in many countries you can be compelled to hand over biometric data: you can be compelled to give

your fingerprints, you can be legally compelled to log into your phone if it's got a thumbprint reader. But in some places you cannot be compelled to hand over something that's in your head, such as your password; that requires a court order, a warrant, or the like. Now, what happens here? I can tell you my password and it doesn't help you. "Give us the password to your laptop." "My 11th birthday. Good luck." They can sit and think about my 11th birthday all they like and it's not going to help them at all. So can I be compelled, since it's in my head, to hand over my thought? Can they sit me down,

put a headset on my head, and force me to think the thought, to use it against me? Can I be court-ordered to authenticate against my device by thinking the thought? And how could they tell if I'm doing it wrong or right? "I'm sorry officer, I'm panicking right now, I can't think of it properly, that's why it's not working." You're all welcome. So this leads to the pink elephant problem. I'd put the headset on while I was setting up those videos, thinking, okay, don't think the thought, I've got time to get things arranged... and I'm logged in. All right, cool, log out.

Okay, I'll get the recording set up, don't think... and I'm logged in again. It's actually remarkably difficult not to think a thought when you don't want to think it. Initially, when I recorded the videos, I just sang nursery rhymes in my head until I was ready to authenticate, and then thought the thought. But the solution, funnily enough, was sold to me by that same Doctor Who episode: you can train the model so it doesn't have to be one thought; it can be a sequence of thoughts. I can think about my 11th birthday, a red flag, the smell of petrichor, and you're very unlikely to randomly think about those three things in a row. Even when you're

trying not to think about them, it's much harder to accidentally think three things in a row, so I solved the problem by making it a sequence of thoughts. Something you can't actually tell from the video: it's not just the memory of my 11th birthday, because I found that was too easy. I'm also clicking my tongue in my mouth when I think the thought, adding a physical element to the password as well. So I can tell you it's the memory of my 11th birthday, but you'd also have to know to click your tongue. The pros of this: you cannot replicate the neuronal growth pattern of my brain,

so it's incredibly hard to crack. Thoughts can be strung together and paired with physical sensations, as I just described. The brute-force surface is humongous: it's 14 channels, but that's not 14 letters of an alphabet; it's 14 analog signals, and depending on the fidelity of the headset, each could read anything between zero and a thousand. So with a thousand possible signal levels across 14 channels, there are more possible combinations than there are atoms in the universe to brute-force in the hope of matching the machine-learning model I've created. And, as I said before, there's the "can you be compelled to think a thought" issue, which I think is a big pro.
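As a rough sanity check on that claim (the sample rate below is an assumption for illustration, not a measured figure): a single 14-channel reading quantized to a thousand levels already has about 10^42 possible states, and a one-second capture of such readings blows far past the roughly 10^80 atoms estimated in the observable universe.

```python
# Back-of-the-envelope size of the brute-force space for a thought password.
channels, levels, sample_rate = 14, 1000, 128  # sample rate assumed

per_sample = levels ** channels         # distinct single 14-channel readings
one_second = per_sample ** sample_rate  # distinct one-second captures

def order(n):
    """Order of magnitude: digit count minus one, i.e. floor(log10(n))."""
    return len(str(n)) - 1

print(order(per_sample))   # → 42
print(order(one_second))   # → 5376
```

So even under very generous quantization assumptions, replaying a matching raw signal by guessing is hopeless; the realistic attacks are the stream-injection and record-and-replay ones discussed above.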

It really is one of those things that I don't think has been explored in law yet, and it's coming; we need to think about it. Because when these become more common, and when we start talking about implanted brain-computer interfaces, which will make our lives easier in a multitude of ways, I think authentication via a thought, a memory, will be the next logical step. If you can secure the transmission path, which we know is a solvable problem, it's a very secure methodology for authentication, arguably better than anything else. It's something you have and a biometric in one; it's not a token, you don't need to carry anything, and no one else can duplicate it.

It's an excellent authentication methodology. So: the future. (How am I doing for time? Fine, good.) Where we go from here is obviously implantable BCIs, and there are a few options now for that kind of thing. If you're physically disabled and enrolled in one of the medical studies, they're implanting BCIs and training hands and machine-learning models for them. But commercially, we obviously have Neuralink and this totally-not-scary machine they've created that rams electrodes into your brain: basically a giant sewing machine that can push a thousand needles into

your brain in under a minute, through a small hole drilled in the top of your skull. They've done this in pigs already, and they can make the pig move its back leg. And this is where we start to get terrified, because once they're inside your head, they're two-way signals. Do you like your waifu, or does Elon Musk just want you to like your waifu? Do you really want a Coke right now? Are you really thirsty, or is he just stimulating the thirst portion of your brain? That's a consideration we need to take seriously. Their current chip has, I think, 14 or 15

channels, but a thousand is what they want to get to. It's a little disc that sits just under your skull and on top of your brain, with wires fed into different parts of your neural cortex. It's much like this headset, just much higher fidelity: you're getting thousands of signals, very high-definition readings of the surrounding areas, and it's two-way, which is a bit concerning. Look, I love what he's doing with SpaceX and the like, but I'm not sure I want an Elon chip in my head, is all I'm saying. And I'm a futurist who's keen on having a chip in my head; I

really want one, but I'm not sure about that one. Fortunately, there are other options. Also, we don't know very well what damage is caused by having wires rammed through the meat of the brain; it's going to destroy cells on the way. In the pigs, they say, there's no brain damage, it's fine, and the implants can be removed and re-inserted, but repeated damage and scarring is a real concern. Fortunately, there's an Australian company doing a much better job, and which is, I think, ahead of Neuralink: a company called Synchron. They've had a brilliant idea: instead of ramming wires into the brain, use what already permeates

the brain. Your blood vessels reach all parts of your brain, and a problem we've already solved is the stent. If you know anyone who's had a stroke or a heart attack, you know what a stent is: a little mesh tube, like this, inserted via a vessel in the armpit or the inner thigh, navigated to the vessel that's closed up, and then expanded to open it back up a little. Very safe; they've been doing them since the 70s. You can remove them, you can put them back in, they don't cause damage. So instead of just placing a

stent, put an electrode at each of the junction points of the stent, feed it in through the jugular vein, and route it up into the brain via the blood vessels that are already there. No damage, no punching holes in the brain, and they can be removed if you need an upgrade. That's how Synchron does it: they just feed the electrodes in along the highway that's already there. Those are also two-way, so you can stimulate the brain around them, and they can be really tiny; the picture looks quite large, but it's actually much smaller than it looks. So you feed those through the brain and have two-way

communication from that. They have moved into human trials (I think they've got one human trial now), and they're based in Melbourne, so go Australia: not punching holes in people's brains. So yeah, that's the future we have right now, and it'll only get better, smaller, and faster, so be prepared. Are there any questions? I think I saw your hand first. [Audience question, partly inaudible: could you train a thought along the lines of "I am not being compelled to give this password", so that actually being compelled would change the signal?] Yeah... I don't think... I

don't know that you could truly do that. I mean, you could obviously try bashing your head against a wall and erasing the memory of your childhood, but no, I don't know that you could do it, because it's just a further thought, really. You could think "I'm not being compelled to think this thought", but whether or not that's true, the interface can't tell; it can only tell what you're thinking. Sorry, you had a question here? [Audience: have you tried this to log in on multiple different days? As an extra physical stimulus, say, if you're a bit colder in that room, or there's something bright in

your eyes, does that change it?] It does, and that's why you train the model over and over and over again. I cheaped out: it takes 60 seconds for each training session, I did it 15 times, and I got bored. But ideally you want to train the model in as many different situations as you can. That training period is slow at this point in time; with the implanted ones, training can be accelerated and made much faster because the signal is high fidelity. If I were being a good boy, I'd have trained it a couple of times here too; each new experience adjusts the model so it has a

wider range of tolerances. Like, say you're on fire and you're thinking about your 11th birthday because you really need to unlock your fire extinguisher right now: you want the model well trained so it can understand you in a variety of situations. It starts to understand what's relevant signal and what isn't, so you don't have to train in every situation; the more you train it, the more it knows how to discard the irrelevant. Yes? [Audience: for these machine-learning models, how big does the training set need to be?] They recommend 30-plus recordings for a decent

training period. If you want it really good... I think the lady who did the Dark Souls run trained it 60 times per movement to get really high fidelity. More is better, basically. And for adoption, at the moment, like I said, it's slow. Sorry, yes, you had a question there?

[Audience question, inaudible: about restoring lost neural pathways?] I'd say no, I don't think so. Most of the research on neurological damage is about preventing the damage in the first place. I don't think you can rebuild neural pathways; we don't have the technology to rebuild lost pathways at this point in time. One more question... yes? [Audience: when you're doing authentication, could you begin thinking about the password before you're prompted, to reduce the delay?] Yeah, you could; you could have an authentication in the pipeline, I guess. But as I