A History of AAC Devices – Part 1: The Pioneers
The use of alternative methods of communication can be traced back to ancient times, when individuals who were deaf or could not speak developed manual languages. Manual languages were also used by First Nations to communicate with members of other groups (Glennen).
Mountain Chief demonstrates the Plains Indian Sign and Gesture language.
By the early 1950s, manual sign language had taken its place as a legitimate form of communication for people with special needs. In the years that followed, the clinical and educational basis for AAC was established but little actual research was done.
In 1952, Goldstein and Cameron first published information about the use of communication boards.
1960s – Emergence of AAC
The 1960s were a decade of social and political change. One result was the growing recognition of American Sign Language (ASL) as a legitimate language, which helped raise public awareness of people’s communication needs.
Most of the early technologies emerged in Europe, often as special systems designed to control a typewriter or other common items in the home such as bells, lights, radios, telephones and televisions.
The POSM (Patient Operated Selector Mechanism), a sip-and-puff typewriter controller prototyped by Reg Maling in 1960, was an early electric communication device. Clearly, portability was an issue!
In the early 1960s Orest Z. Roy at the National Research Council of Canada designed and built a communication device called the Comhandi – an electronic letter board that allowed people to select letters and build up words. It was one of the first devices of its kind in the world.
In 1967, the Patient Initiated Light Operated Telecontrol (PILOT) allowed people to control typewriters by simply pointing a beam of light.
Morse code was also used for typewriter control through voiced or sip-and-puff Morse code in the VOTEM system (1969), as well as by oral muscular control (1972).
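The idea behind these systems can be sketched in a few lines: each sip or puff is treated as a dot or a dash, and a completed sequence selects a letter to send to the typewriter. This is only an illustration of the principle; the event names and the dot/dash mapping below are assumptions, not details of VOTEM itself.

```python
# A partial International Morse table; a real system would cover the full alphabet.
MORSE = {".-": "A", "-...": "B", "-.-.": "C", "-..": "D", ".": "E",
         "..-.": "F", "--.": "G", "....": "H", "..": "I", ".---": "J"}

def decode(events):
    """Translate a sequence of 'sip'/'puff' events into a single letter.

    Here a sip is read as a dot and a puff as a dash (an arbitrary choice
    for this sketch). Unknown sequences return '?'.
    """
    code = "".join("." if event == "sip" else "-" for event in events)
    return MORSE.get(code, "?")

print(decode(["sip", "puff"]))         # .-   → A
print(decode(["puff", "sip", "sip"]))  # -..  → D
```

In a real device the pause between inputs, rather than an explicit end marker, would signal that a letter is complete.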
To learn more about these pioneering communication systems, see the 1974 book Aids for the Severely Handicapped, edited by Keith Copeland.
1970s – Transistors & Symbols
In the 1970s early mechanical systems began to be replaced by transistorized devices.
Some of these systems used electromyography (the electrical activity generated by skeletal muscles) to control typewriters and other devices.
Richard Foulds of Tufts University led the development of the Tufts Interactive Communicator (TIC), a scanning communication aid, and later the ANTIC, the first communication aid to change its scanning order in anticipation of the next most probable letter to be typed.
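Anticipatory scanning of this kind can be sketched with a simple letter-frequency model: the scanner offers first the letters most likely to follow the one just typed, cutting the average number of scan steps. The bigram counts and function names here are illustrative assumptions, not ANTIC's actual language model.

```python
from collections import defaultdict

def build_bigram_counts(corpus):
    """Count how often each character follows each other character."""
    counts = defaultdict(lambda: defaultdict(int))
    for prev, nxt in zip(corpus, corpus[1:]):
        counts[prev][nxt] += 1
    return counts

def scan_order(prev_letter, counts, alphabet="abcdefghijklmnopqrstuvwxyz"):
    """Most probable successors of prev_letter first; ties stay alphabetical."""
    return sorted(alphabet, key=lambda c: -counts[prev_letter][c])

# A toy corpus stands in for a real sample of the user's language.
counts = build_bigram_counts("the quick brown fox jumps over the lazy dog " * 3)
print(scan_order("t", counts)[:3])  # 'h' is offered first after 't'
```

A user typing "t" would thus see "h" offered almost immediately instead of waiting for the scanner to step through the alphabet.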
In 1973, the Talking Brooch, a pocket display operated via a keyboard, became the first wearable communication device and, along with the Lightwriter created by Toby Churchill, one of the first portable communication aids. The Talking Brooch was also the first to allow face-to-face communication with near eye contact.
A series of US government acts helped spur further research and development of AAC devices, including the Rehabilitation Act of 1973, which prohibited discrimination against individuals with disabilities, and the Education for All Handicapped Children Act of 1975, which mandated free special education services for all school-aged children. Initially, therapists tended to avoid AAC devices until they were sure that a person could not develop speech naturally. Later research showed that early interventions that included an AAC device actually encouraged children to use more speech to supplement the devices.
The SCRP “100 + 100” display, developed by the Ontario Crippled Children’s Centre, was designed specifically to be used with Blissymbols.
In the early 1970s, picture symbols were introduced for individuals who could not use the alphabet. The first graphic symbol system, Blissymbols, was created by Charles Bliss. The original system included 1,400 black-and-white symbols paired with written words. The system is still in use and is updated with new symbols yearly. The Blissymbolics Communication Institute was formed in Toronto in the early 1970s, and professionals traveled there to be trained in the symbol system.
Another early symbol-based system was developed by the Fairchild Space and Electronics Company.
The next big step in AAC devices for people with autism and expressive language delay (ELD) was speech synthesis. Initial experiments involved stationary computers, and the first use of a synthetic voice by someone who could not speak probably took place at the Artificial Language Laboratory at Michigan State University. It was used in the most normal of ways – to order a pizza.
In the mid-1970s, the Trace Center created a “portable” voice synthesizer based on the commercial DECtalk synthesizer from Digital Equipment Corporation (DEC). It weighed about 20 pounds and was mounted on the back of a wheelchair.
The first commercial mass-marketed communication aid with speech synthesis was probably the Handivoice from Federal Screw Works.
1980s – Period of Growth and Development
By the 1980s, AAC had become an area of professional specialization, and the American Speech-Language-Hearing Association recognized it as an area of practice. In 1978, Purdue University had become one of the first universities to offer a course on AAC.
Around the world, countries began implementing programs to fund the provision and development of AAC devices. Many of the companies that would become dominant players in the AAC market were also being formed.
Although Ed Prentke and Bill Romich had created their first AAC device – based on a discarded Teletype machine – in 1969, it wasn’t until the 1980s that the Prentke Romich Company became a leading player in AAC devices, with the release of the Minspeak system in 1982 and the Touch Talker and Light Talker in 1984. As technology has evolved, Prentke Romich has continually refined its dedicated devices, and it produced its first tablet app in 2012.
The original Light Talker from the 1980s, along with a more modern version.
The era of dedicated devices also prominently featured the companies that would become today’s DynaVox Mayer-Johnson. In the early 1980s, DynaVox’s predecessor, Sentient Systems Technology, developed the EyeTyper, which allowed people to type using eye movements. In the early 1990s, the company, by then known as DynaVox, produced the first AAC products to feature touch screens with dynamic displays. Word and grammar prediction was introduced to allow people to compose messages more quickly.
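The core of word prediction can be illustrated with a simple prefix lookup: as the user types, the device offers the most frequent vocabulary words that start with what has been typed so far. The vocabulary and frequency counts below are made up for illustration; they do not describe DynaVox's actual method.

```python
def predict(prefix, vocabulary, max_suggestions=3):
    """Return the most frequent vocabulary words starting with prefix.

    vocabulary maps words to usage counts; higher counts are offered first.
    """
    matches = [(word, freq) for word, freq in vocabulary.items()
               if word.startswith(prefix)]
    matches.sort(key=lambda pair: -pair[1])
    return [word for word, _ in matches[:max_suggestions]]

# Toy vocabulary with invented frequencies.
vocabulary = {"the": 500, "there": 120, "they": 200, "think": 80, "thanks": 60}
print(predict("th", vocabulary))  # → ['the', 'they', 'there']
```

Selecting a suggestion replaces several keystrokes with one, which is exactly why prediction sped up message composition so dramatically on these devices.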
In 1996, DynaVox introduced the first AAC devices with universal remote controls. In 2001, DynaVox was also instrumental in the lobbying effort that resulted in Medicare coverage for AAC devices. As computers became more powerful, so did AAC devices, and DynaVox continued to add features to its products.
In 2004, DynaVox acquired Mayer-Johnson, makers of the Boardmaker software.
These and other mainstream dedicated-device manufacturers continued to develop new products and new features throughout the early 2000s. In 2010, the first iPad was introduced – and that changed everything. Suddenly AAC devices would become more affordable and more accessible for millions of children with autism and expressive language delay worldwide.