The Autism Speaks’ Innovative Technology for Autism (ITA) Initiative has awarded more than $400,000 in new research grants to develop innovative assistive, educational, therapeutic, and diagnostic technologies for persons with autism.
Young and old, technology is never far from us. It enables our communication and helps us grow and maintain social relationships. For years Autism Speaks has promoted the research of technologies to support children and adults with ASD, whether that is through the awarding of grants or by supporting research-networking events.
2011 saw a new approach for Autism Speaks’ Innovative Technology for Autism (ITA) initiative with the running of a student design competition called Autism Connects. The design brief was pretty straightforward: to create technology design ideas for individuals with autism to better connect with the world around them, and to allow individuals who do not have autism to better understand and connect with those who do.
Autism Connects was a partnership between Autism Speaks, Core 77 and jovoto. In total there were 126 design ideas submitted from over 30 countries. The popularity of the competition really shows the passion and interest there is for autism around the world and how we can engage young professionals to use their burgeoning skills to make a difference in the lives of people with ASD and their families.
The submitted ideas were judged by a panel of international experts on ASD, including Temple Grandin and John Robison. The jury awarded first prize for best design to Gobug, created by Greg Katz and Tom Rim of the Industrial Design program at the University of Illinois College of Fine and Applied Arts.
“Gobug is designed to move around on a ground surface at the control of the users. Up to two or three children can play with the toy simultaneously. Each user takes ownership of one controller. These controllers work in conjunction; each user points his/her remote in a direction, and the Gobug moves in the combined direction of the active controllers” said Greg Katz and Tom Rim.
When asked how the team came upon this original idea, Greg Katz said, “We took this on from a user-centered design perspective. The focus was 100% on the person we were designing for. We designed through an iterative process, constantly sketching ideas and fine-tuning them into workable concepts. The outcome was Gobug.”
In second place was weSYNC, which was designed by Noel Cunningham from Maryland Institute College of Art (MICA) in Baltimore, Md.
“weSYNC is an application for the iPad, iPhone, and Web that creates a specialized profile for the autistic individual by gathering knowledge from each caregiver and establishing a centralized location where it can be accessed and edited by everyone. Establishing a dialogue among doctors, therapists, teachers and parents allows them to share information and reinforce one another’s efforts.”
In third place was another idea from Maryland Institute College of Art (MICA) in Baltimore designed by Cameron Zotter whose idea is called Visual Watch. The watch is both a time management and picture exchange communication system (PECS) tool designed specifically for people with ASD.
The three prize winners were invited to this week’s International Meeting for Autism Research to present their designs in person at Friday’s technology demonstration, where Autism Speaks’ Chief Science Officer Geraldine Dawson, Ph.D., will announce the winners and celebrate their innovations.
The breadth and wealth of these students’ ideas reflects the technology and autism field in general. All of the designs that were submitted had considered and detailed ways of using technology to aid the lives of people with ASD or those who love and support them. The potential of these ideas to make a difference for families is vast. Our next challenge is how we get these concepts and ideas out into the real world and we’d be interested to hear your ideas on how to achieve that.
Through this competition Autism Speaks has encouraged a new community of young people to think about ASD. Our hope is that Greg, Tom, Noel, and Cameron will take this experience into their working lives and have autism close to their thoughts when they are planning their future projects.
Lastly, none of this could have been possible without our fantastic ITA committee, chaired by Drs. Katharina Boser and Matthew Goodwin. Also, enormous thanks to our judges and the community experts who guided the students’ design ideas to help make them as good as they turned out to be.
You can find out more about the three Jury Prize winners and the six Community Prize winners here.
A recent study reports that a quick brain scan could be used to screen for autism. The study, from senior author Declan Murphy, Ph.D., of King’s College London, has garnered considerable attention from the media for its potential to change the way we identify autism spectrum disorders (ASD). There is, however, another interesting aspect to this story. The investigators borrowed methods from a field of computer science and engineering called machine learning. These tools excel at finding patterns in large, heterogeneous data sets and using those patterns for classification. Using a set of five measurements based on structural features of the human brain, the authors found that different patterns emerged for adults with autism when compared with typically developing adults and also adults with ADHD. Importantly, no single brain region or feature alone was able to discriminate between the groups. When considered together, however, these features classified individuals correctly approximately 90% of the time.
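To make the idea concrete, here is a minimal sketch of multi-feature pattern classification — not the study’s actual pipeline, and using entirely synthetic data. The point it illustrates is the one above: a weak group difference spread across several features can support good classification even when no single feature does.

```python
# Illustrative sketch only: a simple nearest-centroid classifier over five
# per-subject "brain features". All data are synthetic; the feature values
# and group offsets are made up for demonstration.
import numpy as np

rng = np.random.default_rng(0)
n = 40  # subjects per group

# Five structural features; the group difference is spread weakly across
# all of them, so no single feature separates the groups well on its own.
group_a = rng.normal(0.0, 1.0, size=(n, 5))
group_b = rng.normal(1.0, 1.0, size=(n, 5))

def nearest_centroid_accuracy(a, b):
    """Leave-one-out accuracy of a nearest-centroid classifier."""
    X = np.vstack([a, b])
    y = np.array([0] * len(a) + [1] * len(b))
    correct = 0
    for i in range(len(X)):
        mask = np.arange(len(X)) != i          # hold out subject i
        c0 = X[mask & (y == 0)].mean(axis=0)   # centroid of group 0
        c1 = X[mask & (y == 1)].mean(axis=0)   # centroid of group 1
        pred = 0 if np.linalg.norm(X[i] - c0) < np.linalg.norm(X[i] - c1) else 1
        correct += int(pred == y[i])
    return correct / len(X)

print(f"all five features: {nearest_centroid_accuracy(group_a, group_b):.2f}")
print(f"one feature only:  {nearest_centroid_accuracy(group_a[:, :1], group_b[:, :1]):.2f}")
```

Combining the five features typically yields markedly higher accuracy than any single feature alone, mirroring the study’s qualitative finding.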
Machine learning techniques are also being used to classify symptoms in the hope of identifying meaningful subtypes of autism that can lead to tailored, effective treatments. Curtis Jensen, a computer science engineer in San Diego, has applied these techniques to the ARI database of symptoms from over 40,000 parent surveys to identify symptom clusters that suggest possible relationships between symptoms, which may be useful for identifying subtypes of autism. According to Jensen, the clusters “make sense.” For example, subjects who score high in the fear or anxiety clusters tend to have less intellectual disability. Similarly, although challenges with language and communication are a defining feature of ASD, the obsessive-compulsive cluster seems to experience the least language difficulty.
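The clustering step can be sketched in a few lines. This is not the actual ARI analysis; the symptom names, score profiles, and choice of k-means are all illustrative assumptions.

```python
# Illustrative sketch of symptom clustering: group synthetic parent-survey
# score vectors with a minimal k-means and report which symptom dominates
# each cluster. Symptom names and profiles are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
symptoms = ["anxiety", "language difficulty",
            "repetitive behavior", "sensory sensitivity"]

# Two synthetic subgroups with distinct symptom profiles (scores 0-5).
profile_a = rng.normal([4.0, 1.0, 3.0, 1.5], 0.4, size=(50, 4))
profile_b = rng.normal([1.0, 4.0, 1.0, 2.5], 0.4, size=(50, 4))
X = np.vstack([profile_a, profile_b])

def kmeans(X, k, iters=20, seed=0):
    """Minimal Lloyd's algorithm: returns (centers, labels)."""
    r = np.random.default_rng(seed)
    centers = X[r.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        centers = np.array([X[labels == j].mean(axis=0)
                            if np.any(labels == j) else centers[j]
                            for j in range(k)])
    return centers, labels

centers, labels = kmeans(X, k=2)
for j, center in enumerate(centers):
    print(f"cluster {j}: highest mean score on '{symptoms[int(center.argmax())]}'")
```

On real survey data the cluster profiles, rather than being planted in advance, are what the analysis hopes to discover.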
Machine learning methods are not alone among the computer science tools used to benefit autism. For many years, the Innovative Technology for Autism (ITA) initiative from Autism Speaks has brought together researchers with expertise in computer science and engineering to seek solutions to problems faced in autism. Now, through a $10 million initiative from the National Science Foundation, researchers will combine computer vision, speech analysis, and wireless physiological measurements to assist with early diagnosis and behavioral shaping. Collaborators at Georgia Tech, Carnegie Mellon University, the University of Illinois at Urbana-Champaign, the University of Southern California, and the Massachusetts Institute of Technology (MIT) will be aiming these powerful tools at social engagement and other behaviors. By analyzing video collected in clinic visits, at schools, and at home, the group hopes to develop tools for screening for autism and evaluating the effects of therapy.
Several of the principal investigators involved in the recently awarded NSF grant are long-standing members of the ITA steering committee. According to the ITA co-chair and Associate Director of the NSF grant, “Organizations like Autism Speaks play a vital role in funding pilot investigations needed to demonstrate scientific feasibility of innovative approaches that lead to larger-scale, federally-sponsored research programs.” Stay tuned as we learn more from the new field of Computational Behavioral Science.
This post is by Autism Speaks’ staffer, Leanne Chukoskie, Ph.D., Asst. Director for Science Communication and Special Projects.
Imagine having your child’s favorite therapist available to you at any time you need. With her uncanny way of drawing out your child, together they practice speech, social interactions and some physical gross and fine motor control drills to improve his coordination skills. Now imagine that all this doesn’t cost millions and you don’t even need to build on to your house for the therapist’s new abode. All she needs is a dose of 110V every few days to keep going.
Such fantastic scenarios aren’t necessarily the realm of science fiction any longer. Social robots are making appearances in classrooms and even serving as teacher’s aides. An article in this week’s New York Times makes the case for the unique advantages that these machines bring to the learning environment. For example, RUBI, a robot developed at UCSD, has been deployed in preschools where it has been teaching children Finnish words. RUBI modifies her interactions on the basis of the children’s approach and retreat behaviors as well as a real-time analysis of children’s facial expressions. Is this child looking frustrated or engaged and content? Using this interactive technology, RUBI and robots like her can individualize interactions to optimize the engagement with each child.
It is precisely this ability to optimize that makes robotics so useful for working with individuals with autism. Katharina Boser, Ph.D., co-chair of Autism Speaks’ Innovative Technology for Autism (ITA) Initiative, underscores the utility of programmed real-time interactions for autism. People are unpredictable. For an individual with autism, this aspect of behavior can make interactions with people less desirable than interactions with a robot that reliably matches expectations. With the phenomenal advances in computer technology, a robotic social partner can work on lags in, for example, gestural communication by keeping to a very predictable set of gestures, while adding a bit more variety with speech and intonation if the learner is ready. Other ITA research underscores the point that the social partner does not need to be human-like. Researchers noticed that some young children with autism really engaged with Crush, the sea turtle made famous in Finding Nemo, and have built a therapeutic strategy around this character.
In addition to robots, other sorts of technology are offering therapy support for autism. By integrating eye-tracking into a game interface, Felicia Hurewitz, Ph.D. (Drexel University) has developed an environment for children that requires gaze interaction with the characters to reveal clues and advance in the game. Nilanjan Sarkar, Ph.D. (Vanderbilt University) is using feedback from an individual’s autonomic responses (such as heart rate and sweating) to find the right level of difficulty for a game—not too frustrating and not too easy. SIMmersion, a technology company in Maryland, puts computer interfaces and simulated human interactions to work so that users can practice social skills in a safe environment. The ability to analyze, review, and evaluate interactions is a great strength of this technology.
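The closed loop behind that difficulty adjustment can be sketched very simply. This is a toy illustration, not Dr. Sarkar’s actual system: the heart-rate thresholds, step size, and the mapping from arousal to frustration are all hypothetical.

```python
# Toy sketch of physiology-driven difficulty adaptation: nudge game
# difficulty down when arousal (here, heart rate) suggests frustration,
# and up when the player seems under-challenged. Thresholds are made up.

def adapt_difficulty(difficulty: int, heart_rate: float,
                     low: float = 80.0, high: float = 110.0) -> int:
    """Return a new difficulty level (clamped to 1-10) for the next round."""
    if heart_rate > high:      # likely frustrated -> ease off
        difficulty -= 1
    elif heart_rate < low:     # likely under-challenged -> ramp up
        difficulty += 1
    return max(1, min(10, difficulty))

# Simulated session: heart rate rises as the game gets harder, then settles.
level = 5
for hr in [75, 78, 95, 118, 122, 104]:
    level = adapt_difficulty(level, hr)
print(f"final difficulty: {level}")
```

A real system would of course use calibrated, per-individual baselines and richer signals than a single threshold pair, but the control loop has this shape.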
These great new tools will undoubtedly change the landscape of autism therapy, but they will never replace a human therapist. Our models of the behaviors being simulated in silico are human. We also need humans to serve as final arbiters of the therapy—did it really work? Are the data being collected valid in other settings? After all, the point of this training is to interact with other people. As of this writing, we still need humans for that!
This guest post is by Susan Schober. Susan is a 4th year Ph.D. Electrical Engineering-Electrophysics student at the University of Southern California (USC) Viterbi School of Engineering and a mother to a young daughter with autism.
Eva and I
I was searching for answers to my questions. Will she ever speak? Will she have a normal life? What can I do to help? What caused this thing called autism? What about her future? I read tons of books and searched the internet for some kind of direction. I felt totally lost. Helpless. Confused. Sad. I was even embarrassed to tell people. In fact, only people I absolutely trusted knew my secret: my three-and-a-half-year-old daughter, Eva, was diagnosed with non-verbal autism.
After Eva’s first birthday, which was filled with presents, laughter, and friends, she came down with a fever that lasted for two weeks. Her words and eye contact left at this time, never to return. Her big beautiful brown eyes developed a glassed-over look. Where was the little girl with the rosy cheeks that smiled and giggled constantly? All that remained was an unresponsive child that stared at our ceiling fans or at the leaves blowing in the trees. She acquired weird habits like her love of collecting anything plastic, especially gift and credit cards. More recently, she became obsessed with computers and anything electronic.
Her current fascination is fine with me though, as I myself am a Ph.D. Student in Electrical Engineering (EE) at the University of Southern California (USC). At USC, I am completing my doctorate in Ultra-Low Power Radio Frequency/Analog Integrated Circuit Design.
One of the first challenges occurred when Eva was one and a half years old. She was referred by the Regional Center of Orange County to OCKids for a diagnosis. It was pure luck that Eva was to see Dr. Pauline Filipek, a specialist in autism spectrum disorders (ASD). Dr. Filipek’s nurse, Teri Book, who would eventually become a great friend, was in charge of scheduling the barrage of tests that followed – blood work, EEGs, EKGs, hearing, vision, ultrasound for gastrointestinal issues, and genetics – to get a more accurate picture of what was going on. The official diagnosis came in a 40-page report a few months later. I read it over and over with tears in my eyes.
Eva’s Early Start program started soon after. Her therapies included physical, speech/language, Occupational Therapy (OT), and Applied Behavioral Analysis (ABA). My mom would always joke that Eva had a full-time job as her work schedule would last 25-30 hours a week, on average. It was hard seeing her frustrated, but we stuck with the program. She slowly learned basic sign language and worked with the PECS (Picture Exchange Communication System) to organize her daily activities.
On one of her follow-up appointments with Dr. Filipek, the doctor tried to get Eva to look in her eyes. This was no easy task. However, Filipek would not give up and finally Eva gave in. Eva looked in Dr. Filipek’s eyes for a brief second, and cracked a big smile—the first smile in a year. I almost fell out of my chair. Dr. Filipek whipped around and looked me square in the eyes and said, “There IS a little girl in there wanting to get out. It is OUR job to help her.” That was all the fuel I needed to start my quest to find a way to help Eva overcome autism.
It was by chance that I met Professor Olga Solomon and found that USC had a wide variety of research interests in helping those with ASD. That chance came in September 2009 in the form of an email forwarded to the Electrical Engineering Department at USC’s Viterbi School of Engineering where I study. That email was titled: “SEMINAR: Enhancing and Accelerating the Pace of Autism Research and Treatment: The Promise of Developing Innovative Technology by Matthew Goodwin.” When I received that email, I did a double take. It was addressed to my USC account and it said the word “autism.” I thought I had by accident gotten one of my many autism-related newsletters or therapists’ emails in the wrong account. But when I read it for the third time, I realized that yes, there was a scientist coming to USC to speak about integrating engineering techniques into research on autism. I thought it so strange and beautiful. I had to go.
At the end of this eye-opening seminar, Dr. Solomon announced that she would teach a class in the Spring 2010 semester titled “Innovative Technology for Autism Spectrum Disorders,” funded by Autism Speaks. The course would unite the fields of engineering, occupational science, neuroscience, psychology, and anthropology to give a full view of the technological advances in the world of ASD. Every week, the students would read articles about ASD science and technology, blog about the readings, and invite the authors to present their research in the class. The course was too good to be true. I believe I was the first person to sign up.
The students came from a mix of backgrounds, including engineering, computer science, and occupational therapy. I struggled with being open about the fact that I was a mom of a daughter with autism. When it was my turn, I blurted it out. This was the first time I had ever told people I did not know about Eva’s autism, and it was therapeutic. This small action opened the door for me to use my engineering background coupled with the knowledge that comes with being a parent of a child with ASD. I was so happy; I was not embarrassed anymore. I was here because of my unique experience and my desire to help and to find answers and solutions.
The first few weeks were dedicated to making sure the students had a strong foothold in what ASD was and what current methods exist to aid those with autism. The first speaker was Portia Iversen, and we read about her experiences raising her son with autism through an excerpt from her book “Strange Son.” I was so touched by the passage that I wrote in my blog that I was going to buy the book and finish reading it. The class day came and I received the most touching gift: Dr. Solomon obtained a copy of the book and had Portia sign it for me personally. I read the book in two days.
Each week following the first, the class had wonderful speakers; these included my favorites: Shri Narayanan – a well-known electrical engineer who works on speech and signal processing techniques, Skip Rizzo – a virtual reality (VR) guru, and Gillian Hayes, who works in pervasive computing for ASD. After each talk, I made every effort to speak with the lecturers in order to ask questions and broaden my knowledge. Most importantly, I wanted to say “thank you” and shake their hands. I had such an overwhelming feeling that in order to solve the puzzle of autism, every approach, story, and effort was an important piece to be considered in the autism equation.
At the end of the semester we worked in teams with mixed backgrounds to develop an innovative idea to apply to the field of autism. My group’s project was to develop an interactive VR and pervasive computing program to help diagnose children with autism living in rural areas where there are not enough resources or doctors on-site to make a diagnosis. We collectively wrote a grant proposal which, if accepted and funded, could be applied to disaster areas like those affected by Hurricane Katrina or the earthquake in Haiti. Using technology such as video and wireless sensors to gather data (including heart rate, sound, and body movement), the VR system could be set up in a remote area and used by a doctor or trained therapist at another location to make an initial assessment for a child suspected of having autism. This, in turn, would allow that child to receive an accurate diagnosis, including a recommendation for therapy or medical attention as needed. Not all families are as lucky as I was to live in an area with access to top doctors, therapists, and research facilities dedicated to autism. Hopefully, with a portable system like the one proposed, costs, such as travel expenses and doctor fees, can be greatly reduced and children suspected of having ASD can receive effective treatment quickly.
Now that the class is over, I can look back and confidently say I am so grateful for the experience and connections I have made through the semester. The autism technology course has opened a whole new world for me. I signed up for the class because it intrigued me for the obvious reasons. I wanted to know more about autism and what was out there that could possibly help heal my daughter. What Dr. Solomon’s course gave me was a basic, yet solid understanding of autism and a way in which I could personally contribute my engineering skills and unique background to forming innovative technologies to improve the lives of individuals with ASD. Looking forward, I would love to continue to further my research in ASD technologies using both my insight as an engineer and a mom of a child with autism.