Autism Speaks Science Board member John Elder Robison is the author of Look Me in the Eye: My Life with Asperger’s and Be Different: Adventures of a Free-Range Aspergian. You can find out more about his IMFAR experience here, here, and here.
To find out more about ‘Innovative Technology for Autism’ visit here.
Welcome to this installment of ‘Topic of the Week.’ These topics stem from submissions from our community. If there is anything in particular that you would like to see featured, please contact us!
Have you, or someone you know on the spectrum, used assistive technology to help communicate? Are there any applications you favor? What are some pros and cons of using assistive technology for those on the spectrum?
In this coming week’s ‘Community Connections,’ Family Services will devote a newsletter to technology and autism. Sign up to receive it here!
A recent study reports that a quick brain scan could be used to screen for autism. The study, from senior author Declan Murphy, Ph.D., of King’s College London, has garnered considerable attention from the media for its potential to change the way we identify autism spectrum disorders (ASD). There is, however, another interesting aspect to this story. The investigators borrowed methods from a field of computer science and engineering called machine learning. These tools are especially effective at finding patterns useful for classification in large, heterogeneous sets of data. Using a set of five measurements based on structural features of the human brain, the authors found that different patterns emerged for adults with autism when compared with typically developing adults and adults with ADHD. Importantly, no single brain region or feature alone was able to discriminate between the groups. When considered together, however, these features correctly distinguished the groups approximately 90% of the time.
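The study itself applied a support vector machine to cortical measurements, and its data are not public. As a rough illustration of the underlying idea, that several individually weak features can classify well when combined, here is a hedged Python sketch using synthetic data and a simple nearest-centroid rule; the group sizes, effect sizes, and all numbers below are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 5  # synthetic subjects per group, five brain-structure features

# Each single feature overlaps heavily between groups, but together
# the five features separate the groups far better.
group_a = rng.normal(1.0, 1.0, size=(n, d))  # stand-in "ASD" group
group_b = rng.normal(0.0, 1.0, size=(n, d))  # stand-in comparison group
X = np.vstack([group_a, group_b])
y = np.array([1] * n + [0] * n)

# Random train/test split
idx = rng.permutation(2 * n)
train, test = idx[:n], idx[n:]

# Nearest-centroid rule (a toy stand-in for the study's classifier)
c1 = X[train][y[train] == 1].mean(axis=0)
c0 = X[train][y[train] == 0].mean(axis=0)

def predict(x):
    return 1 if np.linalg.norm(x - c1) < np.linalg.norm(x - c0) else 0

multi_acc = np.mean([predict(x) == t for x, t in zip(X[test], y[test])])

# Accuracy of each feature on its own, using the same rule
single_accs = [
    np.mean([(1 if abs(x[j] - c1[j]) < abs(x[j] - c0[j]) else 0) == t
             for x, t in zip(X[test], y[test])])
    for j in range(d)
]

print(f"all five features together: {multi_acc:.2f}")
print(f"best single feature alone:  {max(single_accs):.2f}")
```

On this synthetic data the combined classifier lands well above any single feature, mirroring the qualitative finding described above; the ~90% figure belongs to the real study, not to this sketch.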
Machine learning techniques are also being used to classify symptoms in the hope of identifying meaningful subtypes of autism that can lead to tailored, effective treatments. Curtis Jensen, a computer science engineer in San Diego, has applied these techniques to the ARI database of symptoms from over 40,000 parent surveys to identify symptom clusters that suggest possible relationships between symptoms, which may be useful for identifying subtypes of autism. According to Jensen, the clusters “make sense.” For example, subjects who score high in the fear or anxiety clusters tend to have lower rates of intellectual disability. Similarly, although challenges with language and communication are a defining feature of ASD, the obsessive-compulsive cluster seems to experience the least language difficulty.
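The ARI survey data and Jensen’s actual pipeline are not detailed here, so as a hedged sketch of what symptom clustering looks like in practice, the following Python example runs a small hand-rolled k-means over invented survey-style severity scores (0 to 3); the item names, subtype profiles, and all numbers are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented parent-survey scores (severity 0-3) for five symptom items,
# constructed so that two latent "subtypes" exist: one high on
# fear/anxiety items, one high on obsessive-compulsive items.
items = ["fear", "anxiety", "ocd", "language", "social"]
type_a = np.clip(rng.normal([2.5, 2.5, 0.5, 1.0, 2.0], 0.5, (300, 5)), 0, 3)
type_b = np.clip(rng.normal([0.5, 0.5, 2.5, 0.5, 2.0], 0.5, (300, 5)), 0, 3)
X = np.vstack([type_a, type_b])

def kmeans(X, k, iters=20, seed=0):
    """Plain Lloyd's algorithm: assign points to the nearest center,
    then move each center to the mean of its assigned points."""
    r = np.random.default_rng(seed)
    centers = X[r.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = dists.argmin(axis=1)
        centers = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centers[j]
            for j in range(k)
        ])
    return labels, centers

labels, centers = kmeans(X, k=2)
for j, c in enumerate(centers):
    print(f"cluster {j}: " + ", ".join(f"{name}={v:.1f}" for name, v in zip(items, c)))
```

Printing the cluster centers recovers the two built-in profiles, which is the sense in which such clusters “make sense”: each center is a readable symptom profile a clinician can inspect.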
Machine learning methods are not alone among the computer science tools used to benefit autism. For many years, the Innovative Technology for Autism (ITA) initiative from Autism Speaks has brought together researchers with expertise in computer science and engineering to seek solutions to problems faced in autism. Now, through a $10 million initiative from the National Science Foundation, researchers will combine computer vision, speech analysis, and wireless physiological measurements to assist with early diagnosis and behavioral shaping. Collaborators at Georgia Tech, Carnegie Mellon University, the University of Illinois at Urbana-Champaign, the University of Southern California, and the Massachusetts Institute of Technology (MIT) will be aiming these powerful tools at social engagement and other behaviors. By analyzing video collected in clinic visits, at schools, and at home, the group hopes to develop tools for screening autism and evaluating the effects of therapy.
Several of the principal investigators involved in the recently awarded NSF grant are long-standing members of the ITA steering committee. According to the ITA co-chair and Associate Director of the NSF grant, “Organizations like Autism Speaks play a vital role in funding pilot investigations needed to demonstrate scientific feasibility of innovative approaches that lead to larger-scale, federally-sponsored research programs.” Stay tuned as we learn more from the new field of Computational Behavioral Science.
This guest post is by Susan Schober. Susan is a fourth-year Ph.D. student in Electrical Engineering-Electrophysics at the University of Southern California (USC) Viterbi School of Engineering and the mother of a young daughter with autism.
Eva and I
I was searching for answers to my questions. Will she ever speak? Will she have a normal life? What can I do to help? What caused this thing called autism? What about her future? I read tons of books and searched the internet for some kind of direction. I felt totally lost. Helpless. Confused. Sad. I was even embarrassed to tell people. In fact, only people I absolutely trusted knew my secret: my three-and-a-half-year-old daughter, Eva, was diagnosed with non-verbal autism.
After Eva’s first birthday, which was filled with presents, laughter, and friends, she came down with a fever that lasted for two weeks. Her words and eye contact left at this time, never to return. Her big beautiful brown eyes developed a glassed-over look. Where was the little girl with the rosy cheeks that smiled and giggled constantly? All that remained was an unresponsive child that stared at our ceiling fans or at the leaves blowing in the trees. She acquired weird habits like her love of collecting anything plastic, especially gift and credit cards. More recently, she became obsessed with computers and anything electronic.
Her current fascination is fine with me though, as I myself am a Ph.D. student in Electrical Engineering (EE) at the University of Southern California (USC). At USC, I am completing my doctorate in Ultra-Low Power Radio Frequency/Analog Integrated Circuit Design.
One of the first challenges occurred when Eva was one and a half years old. She was referred by the Regional Center of Orange County to OCKids for a diagnosis. It was pure luck that Eva was to see Dr. Pauline Filipek, who is a specialist in autism spectrum disorders (ASD). Dr. Filipek’s nurse, Teri Book, who would eventually become a great friend, was in charge of scheduling the barrage of tests that followed – including blood work, EEGs, EKGs, hearing, vision, ultrasound for gastrointestinal issues, and genetics – to get a more accurate picture of what was going on. The official diagnosis came in a 40-page report a few months later. I read it over and over with tears in my eyes.
Eva’s Early Start program began soon after. Her therapies included physical therapy, speech/language therapy, Occupational Therapy (OT), and Applied Behavior Analysis (ABA). My mom would always joke that Eva had a full-time job, as her work schedule averaged 25-30 hours a week. It was hard seeing her frustrated, but we stuck with the program. She slowly learned basic sign language and worked with the PECS (Picture Exchange Communication System) to organize her daily activities.
On one of her follow-up appointments with Dr. Filipek, the doctor tried to get Eva to look in her eyes. This was no easy task. However, Filipek would not give up and finally Eva gave in. Eva looked in Dr. Filipek’s eyes for a brief second, and cracked a big smile—the first smile in a year. I almost fell out of my chair. Dr. Filipek whipped around and looked me square in the eyes and said, “There IS a little girl in there wanting to get out. It is OUR job to help her.” That was all the fuel I needed to start my quest to find a way to help Eva overcome autism.
It was by chance that I met Professor Olga Solomon and found that USC had a wide variety of research interests in helping those with ASD. That chance came in September 2009 in the form of an email forwarded to the Electrical Engineering Department at USC’s Viterbi School of Engineering, where I study. The email was titled: “SEMINAR: Enhancing and Accelerating the Pace of Autism Research and Treatment: The Promise of Developing Innovative Technology by Matthew Goodwin.” When I received that email, I did a double take. It was addressed to my USC account and it said the word “autism.” I thought one of my many autism-related newsletters or therapists’ emails had somehow landed in the wrong account. But when I read it for the third time, I realized that yes, a scientist was coming to USC to speak about integrating engineering techniques into research on autism. I thought it so strange and beautiful. I had to go.
At the end of this eye-opening seminar, Dr. Solomon announced that she would teach a class in the Spring 2010 semester titled “Innovative Technology for Autism Spectrum Disorders,” funded by Autism Speaks. The course would unite the fields of engineering, occupational science, neuroscience, psychology, and anthropology to give a full view of the technological advances in the world of ASD. Every week, the students would read articles about ASD science and technology, blog about the readings, and invite the authors to present their research in the class. The course seemed too good to be true. I believe I was the first person to sign up.
The students came from a mix of backgrounds, including engineering, computer science, and occupational therapy. I struggled with being open about the fact that I was the mom of a daughter with autism. When it was my turn, I blurted it out. This was the first time I had ever told people I did not know about Eva’s autism, and it was therapeutic. This small action opened the door for me to use my engineering background coupled with the knowledge that comes with being the parent of a child with ASD. I was so happy; I was not embarrassed anymore. I was there because of my unique experience and my desire to help and to find answers and solutions.
The first few weeks were dedicated to making sure the students had a strong foothold in what ASD is and what current methods exist to aid those with autism. The first speaker was Portia Iversen, and we read about her experiences raising her son with autism through an excerpt from her book “Strange Son.” I was so touched by the passage that I wrote in my blog that I was going to buy the book and finish reading it. The class day came and I received the most touching gift: Dr. Solomon had obtained a copy of the book and had Portia sign it for me personally. I read the book in two days.
Each week following the first, the class had wonderful speakers. These included my favorites: Shri Narayanan, a well-known electrical engineer who works on speech and signal processing techniques; Skip Rizzo, a virtual reality (VR) guru; and Gillian Hayes, who works in pervasive computing for ASD. After each talk, I made every effort to speak with the lecturers to ask questions and broaden my knowledge. Most importantly, I wanted to say “thank you” and shake their hands. I had an overwhelming feeling that in order to solve the puzzle of autism, every approach, story, and effort was an important piece of the autism equation.
At the end of the semester we worked in teams with mixed backgrounds to develop an innovative idea to apply to the field of autism. My group’s project was to develop an interactive VR and pervasive computing program to help diagnose children with autism living in rural areas where there are not enough resources or doctors on-site to make a diagnosis. We collectively wrote a grant proposal which, if accepted and funded, could be applied to disaster areas like that of Hurricane Katrina or Haiti. Using technology such as video and wireless sensors to gather data (including heart rate, sound, and body movement), the VR system could be set up in a remote area and used by a doctor or trained therapist at another location to make an initial assessment for a child suspected of having autism. This, in turn, would allow that child to receive an accurate diagnosis, including a recommendation for therapy or medical attention as needed. Not all families are as lucky as I was to live in an area with access to top doctors, therapists, and research facilities dedicated to autism. Hopefully, with a portable system like the one proposed, costs, such as travel expenses and doctor fees, can be greatly reduced and children suspected of having ASD can receive effective treatment quickly.
Now that the class is over, I can look back and confidently say I am grateful for the experience and connections I have made through the semester. The autism technology course has opened a whole new world for me. I signed up for the class because it intrigued me for the obvious reasons: I wanted to know more about autism and what was out there that could possibly help heal my daughter. What Dr. Solomon’s course gave me was a basic yet solid understanding of autism and a way in which I could personally contribute my engineering skills and unique background to forming innovative technologies that improve the lives of individuals with ASD. Looking forward, I would love to continue my research in ASD technologies using my insight as both an engineer and the mom of a child with autism.
This guest post is by Geri Dawson, Chief Science Officer, Autism Speaks.
The other day, I was talking to a mother who was telling me about her teenage son. The story was a familiar one to any mother who has raised a boy through the adolescent years. He wants more control over his own life, has a short fuse, gets easily frustrated, and feels no one really understands him. His mom tries to communicate with him, but much of the time, she’s at a loss figuring out what he really wants. It’s been like this for a while now.
Most fourteen-year-old boys are struggling with these same issues. But there is one way in which her son is different: he has autism. His spoken language is limited to one- or two-word phrases. His mom described how he would repeatedly say “Rajaneba,” hoping that his mother would understand what he wanted. This went on for two years. She had no clue. Finally, in desperation and frustration, he would start to hit himself.
Then, his mom heard about something that any 14-year-old boy would want: an iPhone with a special app. This particular application is really special, though. It allows users to tap words, phrases, or sentences on the screen to create messages that are read aloud by the software. Samuel Sennott, a graduate student in special education at Penn State, collaborated with a developer from Amsterdam to create Proloquo2Go, an iPhone application that can be used by people with language impairments to communicate.
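Proloquo2Go’s real interface is far richer, but the core interaction described above, tapping items to build a message that is then spoken, can be sketched in a few lines of Python. The class and method names here are hypothetical, and a real app would hand the finished text to a text-to-speech engine rather than return it.

```python
class MessageBuilder:
    """Toy model of an AAC (augmentative and alternative
    communication) tap-to-speak screen."""

    def __init__(self):
        self.taps = []

    def tap(self, item):
        # Each screen tap appends a word or stored phrase to the message.
        self.taps.append(item)
        return self  # allow chaining, like successive taps

    def speak(self):
        # A real app would send this string to a speech synthesizer.
        message = " ".join(self.taps)
        self.taps = []  # clear the message bar for the next utterance
        return message

screen = MessageBuilder()
print(screen.tap("I").tap("want").tap("that DVD").speak())  # prints "I want that DVD"
```

The key design point is that the user composes from a fixed, pictorial vocabulary rather than spelling, which keeps communication fast for someone with limited spoken language.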
It didn’t take long for her son to tap out “Mister Rogers’ Neighborhood,” referring to a DVD he had watched over two years ago and wanted to watch again. Raja – neba! She happily found the DVD and handed it to him.
Over the next several weeks, the world changed for this mom and her son. Her son had a lot to say, as it turns out. She found out that he has a strong interest in math and international events. He’s doing a lot better in his classes at school. He thinks it’s really cool to carry around an iPhone. And, like most 14-year-old boys, he’s spending a lot of time texting.
TechDemo 2010: Innovative Technologies for Understanding and Supporting Persons with Autism Spectrum Disorders
The second day of IMFAR brought the second autism Technology Demo sponsored by Autism Speaks’ Innovative Technologies for Autism (ITA) initiative. One of the newest features of the conference, this unique event consisted of live demonstrations of 30 technologies being developed around the world to benefit a number of critical areas affecting individuals with ASD, their families, and the professionals who strive to better support them. This year, while throngs of scientists listened to oral presentations and discussed research posters, TechDemo 2010 provided community members with an opportunity to interact with some of the most recent advancements in the areas of robotics, virtual reality, assistive communication devices, video and audio capture technology, on-body sensors and much more. Taking advantage of the fun, several local families came to explore the technologies with their children and provide critical feedback to the researchers.
The primary mission of Autism Speaks’ ITA initiative is to stimulate creative design that can provide more immediate and tangible solutions to the challenges faced by individuals living with autism today, and the session illustrated the many ways that technology can enhance and accelerate the pace of autism research and treatment.
Several presentations used interactive puzzles and computer avatars to teach children conversational skills or how to read expressions. For instance, one popular project involved the Disney animated character Crush from “Finding Nemo” in an interactive animated show at Disney’s Epcot called “Turtle Talk.” In this show, an actor operates the Crush character remotely, using hidden cameras to view his audience. This allows the actor to modulate his interactions in response to specific situational information. The professional Disney actors are also trained to modulate their voice and elicit social interactions such as imitation through engaging games, vocal play, and banter. Researchers discovered that young children with autism and minimal verbal skills responded to the turtle with both spontaneous and delayed imitation of vocalizations and gestures. Some children even initiated spontaneous questions, something their parents reported they had never done. These early pilot results are promising for an alternative, engaging therapy that raises arousal levels for greater sustained attention and learning.
Emma Brightman attended the Tech Demo with her father Jay and her babysitter Nicole Jacobs. “The interactive computer games were definitely her favorite,” they said afterwards, as a smiling Emma emerged from the room.
A collection of projects also explored the opportunities for using novel sensing devices on the body and in the environment to help parents and therapists understand life from the perspective of an individual with an ASD. For instance, one demonstration showed wearable wireless physiological sensors that record internal arousal states in naturalistic settings, such as at home and school. The technology is being developed to document and understand stress and arousal in persons with autism during engagement with a variety of social, communicative, and learning activities. These physiological measures may guide an occupational therapist as to whether therapies they are using are effective, and what time of day offers optimal arousal for the purpose of treatment. The child may also be able to use the technology to analyze their own internal states. Moreover, in some cases these same technologies can be used as biofeedback to engage individuals more effectively in therapeutic games and everyday activities.
Mobile applications were also very popular this year. Several at the Demo were shown to improve social skills and awareness in high-functioning students with ASD. One product, called Symtrend, has been developed for the iPod/iPad and allows users to log their ‘stress’ responses at various points of the day; it can be customized for a variety of uses and cueing purposes. YouthCare has now adapted it for a summer camp in which students are taught mindfulness and cognitive behavioral training to raise awareness of anxiety and stress and to learn strategies for dealing effectively with overwhelming feelings. The advantage of such a program is that students can compare their own impression of their affective state with that provided by their therapist, as all responses can be tracked and synched via a web-based data graphing and output program. Using this method, researchers report a change in physiological response to stress and anxiety, such as reduced heart rate, as well as an improved ability to talk about and cope with these feelings.
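Symtrend’s actual data model is not public; as a minimal hypothetical sketch of the comparison described above, student self-ratings logged alongside therapist ratings and then scored for agreement, consider:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class StressEntry:
    when: datetime
    self_rating: int       # student's own stress rating, 1 (calm) to 5 (overwhelmed)
    therapist_rating: int  # therapist's rating of the same moment

def agreement(log):
    """Fraction of logged moments where the student's impression is
    within one point of the therapist's."""
    close = sum(1 for e in log if abs(e.self_rating - e.therapist_rating) <= 1)
    return close / len(log)

# Hypothetical day of camp check-ins
log = [
    StressEntry(datetime(2010, 7, 1, 9, 0), self_rating=2, therapist_rating=2),
    StressEntry(datetime(2010, 7, 1, 12, 0), self_rating=4, therapist_rating=2),
    StressEntry(datetime(2010, 7, 1, 15, 0), self_rating=3, therapist_rating=3),
]
print(f"self/therapist agreement: {agreement(log):.0%}")  # 2 of 3 entries agree
```

Tracking this agreement score over weeks is one simple way a program could show whether a student’s self-awareness of anxiety is improving.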
A second program developed for the iPod acts as a mobile social compass. Based on social skills interventions that include pictorial representations of social rules, this GPS-enabled compass aids the student by giving pictorial cues to distinguish strangers from friends and prompts on how to interact differently with each in terms of proximal space and conversation skills. For example, the device can store personal interest information about friends and cue the user with a topic sentence for initiating a sustainable conversation. In addition, if the user terminates the conversation and starts to walk away, the device can cue for appropriate ‘closure’ of the conversation. This device was found to successfully augment an existing social curriculum and encouraged students to use the system regularly.
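The compass app itself is not described in enough detail to reproduce, but its cueing logic, checking whether a known person is nearby and surfacing a stored conversation starter, can be sketched; every name, coordinate, message, and threshold below is invented for illustration.

```python
import math

def distance_m(lat1, lon1, lat2, lon2):
    """Approximate ground distance in meters (equirectangular
    approximation, adequate for distances under ~1 km)."""
    dx = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    dy = math.radians(lat2 - lat1)
    return 6_371_000 * math.hypot(dx, dy)

# Hypothetical stored profile of a known friend
FRIENDS = {
    "Alex": {"lat": 34.0205, "lon": -118.2856, "interest": "trains"},
}

def cue(user_lat, user_lon, threshold_m=20):
    """Return a conversation cue if a known friend is within range,
    otherwise a reminder about strangers."""
    for name, info in FRIENDS.items():
        if distance_m(user_lat, user_lon, info["lat"], info["lon"]) < threshold_m:
            return (f"{name} is a friend. Try asking about "
                    f"{info['interest']} to start a conversation.")
    return "No known friends nearby. Keep a polite distance from strangers."

print(cue(34.0205, -118.2857))  # a few meters from Alex
```

A real device would replace the text with pictorial cues and would also need the closure prompt described above, triggered when the distance to the conversation partner starts increasing.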
“As a parent you want your child to be able to communicate and socialize,” said Gail Walsh, who felt her son Dillon could benefit greatly from these technologies. “We’re working on functional skills at home, and seeing all this research really gives me hope.”
The families in attendance were also eager to share their passion for technology. Doug Fischer attended with his son Ben and Ben’s teacher Dave Mendell, who in honor of the special event had together made a poster describing the many ways they use technology in their own classroom. Upon entering the demo, the young scientist-to-be promptly set the poster up amidst all the other presentations and soon found himself explaining to the scientists what he thinks needs to be done!
For more information on the Tech Demo and to find the full set of research abstracts, see http://www.autism-insar.org/index.php?option=com_content&task=view&id=187&Itemid=164.
The conference continues through Saturday. To read complete coverage from IMFAR, please visit http://www.autismspeaks.org/science/science_news/imfar_2010.php