HCI Assistive Technology Devices for the Specially-Abled

Abstract—Human-computer interaction (HCI) is the study of how people design, implement, and use interactive computer systems, and of how computers affect individuals, organizations, and society. This encompasses not only ease of use but also new interaction techniques for supporting user tasks, providing better access to information, and creating more powerful forms of communication.

It involves input and output devices, including assistive devices, and the interaction techniques that use them; how information is presented and requested; how the computer's actions are controlled and monitored; all forms of help, documentation, and training; the tools used to design, build, test, and evaluate user interfaces; and the processes that developers follow when creating interfaces. HCI in the large is an interdisciplinary area.


It is emerging as a specialty concern within several disciplines, each with different emphases: computer science (application design and engineering of human interfaces), psychology (the application of theories of cognitive processes and the empirical analysis of user behavior), sociology and anthropology (interactions between technology, work, and organization), and industrial design (interactive products).

Keywords—HCI, anthropology, assistive technology devices, interactive products.

I. INTRODUCTION

The concept of human-computer interaction (HCI) was first presented by a group of professionals at the Association for Computing Machinery's Special Interest Group on Computer-Human Interaction Conference in 1992. HCI is the study of how people interact with computers and of the extent to which computers are or are not developed for successful interaction with human beings. Computer vision has been an active area of research for more than three decades. A quick review of the field reveals that image processing and pattern recognition have been tremendously successful in delivering operational systems.

This paper describes a project that applies human-computer interaction (HCI) knowledge and techniques, such as accessibility and usability, to help people with cerebral palsy or other severe disabilities carry out specific tasks with a computer. The goals of HCI are to produce usable, safe, and functional systems. In order to produce computer systems with good usability, developers must attempt to: understand the factors that determine how people use technology; develop tools and techniques to enable building suitable systems; achieve efficient, effective, and safe interaction; and put people first.

The organization of a computer vision system is highly application dependent. These systems open new paradigms in HCI and allow us to innovate ways of interaction that can benefit people with severe disabilities such as cerebral palsy. According to United Cerebral Palsy (UCP), cerebral palsy describes a group of disorders of the development of movement and posture, causing activity limitation, that are attributed to non-progressive disturbances that occurred in the developing fetal or infant brain.

The motor disorders of cerebral palsy are often accompanied by disturbances of sensation, cognition, communication, perception and/or behavior, and/or by a seizure disorder.

II. HUMAN COMPUTER INTERACTION

It is an indisputable fact that the computer, once operated only by experts and produced in small numbers when it was first introduced, is now an indispensable part of human existence. It is therefore increasingly important that computers and the software running on them, i.e., their interfaces, can be used effectively and efficiently by everyone, and that they be further improved to meet that need. Human-computer interaction (HCI) is the study, planning, and design of the interaction between people (users) and computers. The Association for Computing Machinery defines human-computer interaction as "a discipline concerned with the design, evaluation and implementation of interactive computing systems for human use and with the study of major phenomena surrounding them," as illustrated in Fig 1.

An important facet of HCI is securing user satisfaction. Due to the multidisciplinary nature of HCI, people with different backgrounds contribute to its success. HCI is also sometimes referred to as man-machine interaction (MMI) or computer-human interaction (CHI). Usability is one of the key concepts in HCI. It is concerned with making systems easy to learn and use. A usable system is easy to learn, easy to remember how to use, effective to use, efficient to use, safe to use, and enjoyable to use. Accessibility means that people with disabilities can perceive, understand, navigate, and interact with an interactive system.

Accessibility also addresses the needs of a wider range of users, including people with changing abilities due to aging. Accessibility problems can also be caused by specific environmental or social conditions. Both of the above-mentioned attributes are treated and tested at length in this paper.

Fig 1. HCI inter-disciplinary interactions

III. DESIGN FOR ALL AND ASSISTIVE TECHNOLOGY

Design for all means that products and environments should be designed so that they are usable by all people, to the greatest extent possible, without the need for adaptation or specialized design.

This will only come about as a result of designing mainstream products and services to be accessible by as broad a range of users as possible (van Dusseldorp, Paul, & Ballon, 1998). An architectural analogy would be the ramp or the elevator (as opposed to the stairs), which makes movement within the environment possible for most people. Design for all involves the principles of accessibility and usability. Since accessibility means removing barriers, an accessible system allows a blind person to "view" a web site or a paralyzed person to move the screen cursor without a mouse.

Since usability means minimizing the overload imposed by the use of computers in terms of motor and cognitive load, a usable system makes manipulation of the system easier. Therefore, design for all also involves assistive technologies to overcome any impairments of a user. In the context of this paper, assistive technology refers to the (special) computer hardware and/or software used to increase, maintain, or improve the functional capabilities of individuals with disabilities (Blaise, 2003). Examples of computer assistive technology devices are Braille readers, screen magnification software, eye-tracking devices, and so forth.

In terms of spatial accessibility, a wheelchair is an assistive technology device.

A. User Interface Accessibility

From the user's viewpoint, the interface means "the whole system." The barriers that disabled and elderly people encounter when accessing interactive systems are mainly related to the user interface. These barriers reflect both the physical difficulties users might have in managing the devices and the cognitive barriers users may have in understanding the operating procedures for interacting with the interface.

Studies made with users demonstrate the necessity of adaptable interfaces that allow the control of devices and services through integrated interoperable systems in intelligent surroundings.

1) Physical accessibility

Standard interfaces are based on the most common interaction devices: the keyboard and mouse for data input, and the screen (and sometimes speakers) for data output. Input via these devices demands precision and motor coordination; visual-motor coordination is also needed to use the pointing device.

Output requires visual and, sometimes, auditory abilities. People present disabilities in diverse ways. A significant percentage of the general population does not possess the minimum physical ability necessary to use standard input/output devices. This occurs for various reasons, for example aging, physical or cognitive disability, or the inability to execute multiple tasks simultaneously (browsing the address book of a mobile phone while driving, for example).

2) Cognitive accessibility

The cognitive abilities and disabilities of users are diverse (Canas & Waern, 2001).

Besides aging and cognitive disabilities, the use of a foreign language or the reduction in attention when doing simultaneous tasks may influence the cognitive ability of the user. It is therefore necessary to take this diversity into account when designing interaction methods. Despite the fact that cognitive disabilities affect a large number of people, many of whom are not considered disabled, cognitive accessibility studies are less developed than those for physical accessibility.

IV. ASSISTIVE TECHNOLOGY DEVICES

Assistive technology (AT) can be defined as a device or service that benefits people with disabilities. Any piece of equipment, product system, or other device that can be used to improve, increase, or maintain a disabled person's functional capabilities is defined as an assistive technology device. According to the AT Act of 2004, an assistive technology service is any service that helps a disabled individual acquire, select, or use an assistive technology device.

These technological tools benefit people with intellectual disabilities by enabling an individual to reach a level of accomplishment or fluency that could not be achieved without such devices or services. They can reduce the labor required, or increase endurance, when completing routine tasks; enhance an individual's opportunities for learning or employment; and support normal social interactions with the environment, whether at work or at social events.

A. Saifu

Designer Jonathan Lucas has designed a unique concept device that will enable visually impaired users to feel everything on their computer screen.

The device, Saifu, uses a unique material called Magneclay that turns the computer's display into a Braille display or even a morphing relief of pictures. The Saifu (named after a blind African ant) can be used as a nine-key Braille keyboard with Braille readout, or like a book, displaying words in Braille. The unique Magneclay material, along with the image conversion technology, makes this possible. What is more, the Saifu can also read the words on screen aloud if the user wishes, as shown in Fig 2.
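As an illustrative sketch of the text-to-Braille conversion such a display presupposes (this is not the Saifu's actual pipeline, which is not documented here), letters can be mapped to cells in the Unicode Braille Patterns block, where dot n of a cell corresponds to bit (n - 1) of the code point's offset from U+2800:

```python
# Sketch: render lowercase text as Unicode Braille cells (U+2800 block).
# Standard literary Braille letter patterns: a-j use dots from {1,2,4,5};
# k-t repeat a-j with dot 3 added; u, v, x, y, z repeat k-o with dot 6
# added; w (a later addition to the alphabet) is j plus dot 6.

AJ = {
    'a': {1}, 'b': {1, 2}, 'c': {1, 4}, 'd': {1, 4, 5}, 'e': {1, 5},
    'f': {1, 2, 4}, 'g': {1, 2, 4, 5}, 'h': {1, 2, 5}, 'i': {2, 4},
    'j': {2, 4, 5},
}

def letter_dots(ch):
    """Return the set of raised dots (1-6) for one lowercase letter a-z."""
    if ch in AJ:
        return AJ[ch]
    if ch == 'w':
        return AJ['j'] | {6}
    if 'k' <= ch <= 't':
        return letter_dots(chr(ord(ch) - 10)) | {3}   # k-t = a-j plus dot 3
    base = {'u': 'k', 'v': 'l', 'x': 'm', 'y': 'n', 'z': 'o'}[ch]
    return letter_dots(base) | {6}                    # u,v,x,y,z = k-o plus dot 6

def to_braille(text):
    """Map letters a-z to Braille cells; pass other characters through."""
    return ''.join(
        chr(0x2800 + sum(1 << (d - 1) for d in letter_dots(c)))
        if 'a' <= c <= 'z' else c
        for c in text.lower()
    )
```

A tactile surface such as the proposed Magneclay relief would then raise the physical dots corresponding to each cell's pattern, one cell per character.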

The machine is also equipped with voice recognition software that translates words spoken by the user into typed words. This unique device, featuring the Magneclay material, will benefit visually impaired people everywhere.

Fig 2. Saifu

B. Visual Assistance Card

Kyle Lechtenberg, an Industrial Design graduate from Auburn University, designed this innovative product for the visually impaired shopper: a debit/credit card for the blind that keeps private information, such as PIN numbers, private.

Fig. 3 Visual assistance card

The Visual Assistance Card has Braille relief imprinted on it, ensuring that the user keeps personal information private and increasing independence when shopping, with no need for assistance from the counter cashier. The Visual Assistance Card is lightweight and can be easily stored out of the way until future use, as shown in Fig 3.

C. Eye Mouse

The 'Eye Mouse' is a new technology that helps severely disabled people get online at very low cost.

The software and webcam system was developed by two 18-year-olds so that a friend of theirs who suffers from Spinal Muscular Atrophy (SMA) could use the computer. Eye movements are translated by the Eye Mouse, using a standard webcam, into on-screen action. This means that people who suffer from SMA, like Nicolas Rossi, can control the computer with their eyes, as shown in Fig 4.

Fig 4 Eye mouse

D. Head Pointer

The head pointer is a pointing device controlled with the user's head and is useful for people with good cephalic control, as shown in Fig 5. There are different kinds of head pointers:
• Head sticks. These are head-worn pointers that can be used in a number of ways: for signaling pictures, words, or communication board icons; as a keyboard aid; or as a pencil holder, for turning pages or drawing, for example. The only requirement is the user's ability to move their head with a certain precision. Users of these devices depend heavily on third-party assistance to place the device on the head.
• Webcam-based devices. These devices usually track facial features to capture the user's motion and do not need any headset or dots mounted on the user's head.

As a result, third-party assistance is minimal. Moreover, webcam-based head pointers are usually software-only solutions based on standard hardware, and are thus cheaper than electronic ones.

Fig 5. Head pointer

V. VISION BASED INTERACTION SYSTEM

Tracking is a well-established research field which may be addressed from various viewpoints. Tracking may be seen as a separate process, as a means to prepare data for pose estimation, or as a means to prepare data for recognition, as in Fig 6. If considered a separate process, the subject is typically tracked as a single object (without any limbs) and no high-level knowledge is used.
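The software side of such a webcam-based pointer can be sketched as follows. The face tracker itself is assumed to exist and deliver one face-centroid position per frame; the class name, gain, and smoothing values below are illustrative assumptions, not taken from any product described in this paper. What the sketch shows is the core technique: mapping tracked head displacement to cursor motion, with smoothing to suppress tracking jitter.

```python
# Sketch: map tracked face-centroid displacement to cursor motion.
class HeadPointer:
    def __init__(self, width, height, gain=4.0, smoothing=0.5):
        self.w, self.h = width, height
        self.gain = gain            # cursor pixels per pixel of face travel
        self.alpha = smoothing      # 0 = none, closer to 1 = heavier smoothing
        self.x, self.y = width / 2, height / 2   # cursor starts at screen center
        self.prev = None            # last observed face centroid
        self.vx = self.vy = 0.0     # smoothed per-frame displacement

    def update(self, centroid):
        """Feed one (x, y) face centroid per frame; returns cursor position."""
        if self.prev is not None:
            dx = centroid[0] - self.prev[0]
            dy = centroid[1] - self.prev[1]
            # Exponentially smooth the raw displacement to damp jitter.
            self.vx = self.alpha * self.vx + (1 - self.alpha) * dx
            self.vy = self.alpha * self.vy + (1 - self.alpha) * dy
            # Apply gain and clamp the cursor to the screen.
            self.x = min(max(self.x + self.gain * self.vx, 0), self.w - 1)
            self.y = min(max(self.y + self.gain * self.vy, 0), self.h - 1)
        self.prev = centroid
        return round(self.x), round(self.y)
```

A relative (displacement-based) mapping like this, rather than an absolute one, is what lets users with a small range of head motion still reach the whole screen: the gain amplifies small movements, and clamping keeps the cursor on screen.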

Some alternatives to the standard mouse have been available for a number of years, for example trackballs, which do not necessarily have to be used with the hands. While such devices provide a method of moving the mouse pointer around the computer screen with head movements, they still require some means of providing the 'button click'. Most systems can be used in combination with a dedicated switch, perhaps operated by the elbow or foot. Others have software that assumes that a length of time resting on a particular icon should be taken as a 'double click'.
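The dwell-based click just described (resting the pointer on a target for a set time) can be sketched as a small state machine. The class name and threshold values below are illustrative assumptions: a click fires once the pointer has stayed within a small radius for the configured dwell time, and only once per rest.

```python
# Sketch: dwell click -- fire a click after the pointer rests within a
# small radius for a configured time. Thresholds are illustrative only.
import math

class DwellClicker:
    def __init__(self, dwell_seconds=1.0, radius_px=8):
        self.dwell = dwell_seconds
        self.radius = radius_px
        self.anchor = None      # position where the current rest started
        self.start = None       # timestamp when the current rest started
        self.fired = False      # one click per rest, not one per frame

    def update(self, pos, t):
        """Feed (x, y) and a timestamp each frame; True when a click fires."""
        if self.anchor is None or math.dist(pos, self.anchor) > self.radius:
            # Pointer moved away: restart the dwell from here.
            self.anchor, self.start, self.fired = pos, t, False
            return False
        if not self.fired and t - self.start >= self.dwell:
            self.fired = True
            return True
        return False
```

The radius tolerance matters for the users discussed here: without it, the tracking jitter inherent in head- or eye-based pointing would keep resetting the timer and make dwell selection impossible.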

Many of these head-activated devices can be combined with specialist on-screen keyboard software and predictive software, providing a total hands-free solution. A general structure for systems analyzing human body motion is shown in Fig. 6.

Fig 6. Motion analyzer structure

The Technology. Developments in hands-free access to computers can perhaps be split into two categories:
1. Mouse and keyboard alternatives
2. Speech recognition

1) Mouse alternatives

Smart-Nav hands-free mouse.

An infrared transmitter and receiver are placed on top of the computer monitor. A silver dot is attached to the forehead or to the centre of a pair of spectacles. The Smart-Nav provides control of the pointer through movements of the head.

HeadMouse. The HeadMouse sensor replaces the standard desktop computer mouse for people who cannot use their hands. The HeadMouse translates the movements of a user's head into directly proportional movements of the computer mouse pointer. It is a wireless optical sensor that tracks a tiny, disposable target placed on the user's forehead or glasses.

Facial Mouse. The Facial Mouse is a mouse emulator system based on the facial movement of the user. A webcam is placed in front of the user, focusing on the user's face, as shown in Fig 7. A motion extraction algorithm, which is user independent, extracts the facial motion from the video. This motion is used to move the mouse pointer, which is controlled in a fashion relatively similar to standard mouse devices. This system can be used with great accuracy even when the user has exiguous cephalic motion control. The click can be generated through several built-in mechanisms:
• Dwell click. The click is automatically generated after the pointer has been stopped for a certain amount of time.
• Sound click. The click is generated when the user emits a sound whose input level is greater than a configured threshold.

Fig 7. Facial mouse

2) Speech recognition

Speech recognition is one of the more desired assistive technology systems. People believe that if someone can speak, then speech recognition is a logical and easy method of accessing the computer. People with a physical disability want to use speech recognition for:
• Dictation. Translation of the spoken word into written text.
• Computer control. Operation of the computer, and of other software applications, simply by speaking commands.

VI.
TESTING

Testing should be carried out by testers who are trained and competent in the use of these technologies, so that they use them in a similar way to the people who use those technologies all of the time. The level of testing and the technologies used (i.e., assistive technology, browser, or operating system) should be documented and made available in the product accessibility statement. Where testers feel that problems are due to issues with the browser (e.g., non-conformance with UAAG), work-arounds should be considered and information about them documented in the accessibility statement. Testing should cover every combination of browser/OS and assistive technology defined in the product's accessibility policy, as well as the keyboard alone. This helps to ensure that the web product is accessible to people who have multiple impairments, or who have a particular browsing preference. Expert reviews are particularly useful in early design stages, where prototypes may present difficulties for disabled users and users of assistive technologies (e.g., where a product is not fully developed and does not yet support the features that a user may need to test accurately). If an organization chooses to use user testing, the test group should represent a range of users with disabilities and older people, in accordance with the product's accessibility test plan.

VII. CONCLUSION

Computer vision-based interaction is an emerging technology that is becoming more useful, effective, and affordable. However, it raises new questions from the HCI viewpoint, for example, which environments are most suitable for interaction by users with disabilities.

This paper emphasizes the accessibility and usability aspects of such interaction devices in meeting the special needs of people with disabilities, and specifically people with CP. Computer vision interaction systems are very useful; in some cases, they are the only way some people can interact with a computer. Computer vision-based interaction systems also offer advantages, such as flexibility and lower cost, over other traditional assistive technologies. The latest systems use more advanced methods based on comprehensive probabilistic models and advanced training.

Nevertheless, some assumptions are still required, and we are far from a general solution to the human motion capture problem. Some of the general key issues needing to be addressed are initialization, recovery from failure, and robustness. The improvements this paper highlights depend on advances in HCI that meet the needs of the disabled, enabling them to use the computer as a mode of communication through assistive devices. Furthermore, the accessibility and availability of these assistive devices should guide their adoption by disabled users.

Hence, this paper brings out that the developer of a particular device should first become familiar with what the common user wishes to attain by using such devices, in order to meet the challenge of fulfilling those needs.

REFERENCES

[1] Abascal, J. (2003). Accesibilidad a Interfaces Moviles para Computacion Ubicua Relativa al Contexto [Accessibility to mobile interfaces for context-aware ubiquitous computing]. In I. Ramos, A. Fernandez, & M. D. Lozano (Eds.), Tendencias actuales en la Interaccion Persona-Ordenador: Accesibilidad, adaptabilidad y nuevos paradigmas (pp. 7-75). Albacete, Spain: Castilla-La Mancha University.
[2] Bax, M., Goldstein, M., Rosenbaum, P., Leviton, A., & Paneth, N. (2005). Proposed definition and classification of cerebral palsy. Journal of Developmental Medicine and Child Neurology, 47(8), 574.
[3] Canas, J. J., & Waern, Y. (2001). Ergonomia cognitiva: Aspectos psicologicos de la interaccion de las personas con la tecnologia de la informacion [Psychological aspects of the interaction of people with information technology].

Madrid, Spain: Editorial Medica Panamericana.
[4] European Assistive Technology Information Network [EASTIN]. (n.d.). Retrieved February 2, 2006 from http://www.eastin.info
[5] Gips, J., Betke, M., & Fleming, P. (2000). The Camera Mouse: Preliminary investigation of automated visual tracking for computer access. In Proceedings of the Rehabilitation Engineering and Assistive Technology Society of North America 2000 (pp. 98-100). Arlington, VA: RESNA.
