When I first heard about eye-gaze technology (nearly three years ago, at the first IRSF conference I went to), I was (a) blown away, and (b) convinced that this was the most promising communication solution for Amy.
The brilliant thing about eye-gaze is that it makes no physical demands on her, and it makes use of her best skill - all she has to do is focus on a computer screen. She doesn't have to press a switch, or do anything at all with her uncooperative hands. She looks at the bit of the screen she wants, and holds her gaze for a second or two. And by doing that, she can make things happen, entertain herself, interact with people around her and be a little bit independent.
Amy now has a Tobii device at school, on loan from CENMAC until the end of July. It's a new thing for everyone who works with her: no child at her school has ever used eye-gaze technology, and hopes are high. But it's been quite a process getting to this point. We had to introduce the idea to school and Amy's speech and language therapist, then get an AAC (Augmentative and Alternative Communication) referral, talk up Amy's cognitive abilities, share research evidence, help her do her best with switches, yes/no cards and PODD, and then wait for it to be Amy's turn for an extended trial with one of the few devices available in our part of London.
We don't yet have the full range of software. But even with the software she's got, and even with all that's going on at the moment - the seizures and general concerns - she is focused and interested, and seems pretty gratified that this is all for her.
So how does it work? I wish everyone could see it for themselves. I'll try to film Amy using it sometime, but for now I'll do my best to describe it. The screen looks just like a regular computer monitor, and she has to sit just the right distance from it, with her eyes positioned right in the middle. (There's a little screen that pops up to help us keep track of where her eyes are.) The fact that she can't get up and go anywhere is a positive advantage!
The aim, both short and long-term, is teaching rather than 'therapy'. At this stage - given both her age and her newness to all this - teaching happens through games and fun. So although the aim is to move to functional communication, we can't expect her to jump straight into that. Instead she has to get used to making things happen on the screen with her eyes. And that happened pretty much instantly - looking at bubbles in a corner of the screen and using her eyes to fill the whole screen with bubbles (look at the bubbles and they multiply, look away and they disappear). Or playing a game of 'splat', where the screen fills with faces and she can 'splat' them one by one by looking at them - it's the same kind of fun her sister might have playing a game on my phone.
Then moving on to things like playing with animals in a farmyard, dressing up a doll, making cars race. And using each of these things as the basis for a conversation and more input on her part. I've watched her looking all over the screen, pausing over particular pictures or symbols, looking away and going back, then deciding - holding her gaze for long enough for the computer to 'do' something. And when she chooses something - say, a pink hat - the computer 'talks': 'pink hat'. (In American English, which makes us laugh.)
This is all pretty basic, but she's moving on quickly. And the reason we're so optimistic about it - apart from the fact that she clearly likes it, tries hard and gets a kick from it - is that she shows every sign of knowing just what she needs to do. She quickly progressed to being able to cope with 32 different symbol-boxes on the screen, and being able to find the thing she wants. And the best thing: right away she was responding to verbal requests. She's doing what I always hoped: showing what she knows. Someone asks, what colour car would you like, Amy? And she chooses. Then, what do you want the car to do? And she makes it crash, or toots the horn (and giggles). It's fun, it's interactive and best of all she's in control.