Robotic arm for quadriplegics

To develop a controller for a robotic arm that can be operated by people with varying levels of motor impairment. The controller will receive simple verbal or non-verbal instructions from the patient and perform a predefined set of tasks with the arm.

The main goal of this project is the development of a controller for a mechanical arm that uses a human-machine interface (HMI) based on the latest speech, video, sensory and brain-computer interface (BCI) technologies. The controller will provide a modular and multimodal interface for natural control of robotic arms and, at a later stage, of other mechanical devices such as wheelchairs, medical beds and autonomous side tables. Our solution will allow the selection of a cost-effective interface between the user and the assisting robot based on the specific user's limitations (e.g. a combination of voice and touch input for those able to speak vs. a brain-computer interface for fully paralyzed patients). The control system will allow quadriplegic users to control the movements of a given mechanical arm through gaze, touch, blow and/or voice.

The controller will include a vision system that identifies a defined set of objects such as drinking glasses, doors and handles, as well as other common objects specified by the end user. Using the mechanical arm, the vision system and other sensors, the controller will then manipulate this predefined set of objects, containers and devices according to the patient's instructions. Within the scope of the current project, both participating partners will contribute their previously developed and tested control technologies: the Lithuanian partner has developed a technology that controls a wheelchair through gaze, while the Israeli partner has developed a controller that manages an industrial robotic arm through hand gestures.

The final controller to be developed will have the following technological characteristics (a minimal sketch of the resulting control architecture is given after the list):

Safety: The controller will keep constant track of the location of the patient and of objects in the environment, and will reject any instruction that could result in a collision with the patient or with surrounding obstacles.

Controller-to-controller communication: Controllers of mechanical arms assigned to different patients will be interconnected and will share performance information with each other.

Adaptability: The controller will keep track of the tasks requested by the patient and will use this information to adapt itself to the particular patient's needs. It will also attempt to predict some of the actions to be requested based on the user's past behavior.

Self and collaborative improvement: Specifically designed operational data will be logged continuously in real time and shared between controllers performing similar tasks. Each controller will later use this data to optimize its performance in the next execution of a given task. Operational data is shared both locally and remotely.

Visual identification of objects: A set of functions and libraries will be developed to identify the specific set of objects to be manipulated in the initial scope of this project. Additional objects will be supported in the future by inserting new software modules into the system.

Platform oriented: Embedded support for high-level programming in the most popular programming languages and environments (C/C++/Java…) will be provided, allowing third parties to develop applications for this platform in a familiar environment.
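To make the intended control flow more concrete, the following C++ sketch shows one possible way the modular input layer and the safety gate described above could fit together. All names (InputModality, VoiceInput, SafetyMonitor, ArmController, Intent) and the confidence threshold are illustrative assumptions, not part of the project's actual design or API.

// Minimal C++ sketch of a modular, safety-gated multimodal controller.
// All class and function names are hypothetical, for illustration only.

#include <iostream>
#include <memory>
#include <string>
#include <vector>

// A recognized user instruction, independent of the input channel.
struct Intent {
    std::string task;    // e.g. "bring_glass", "open_door"
    double confidence;   // recognizer confidence in [0, 1]
};

// Common interface for all input channels (gaze, touch, blow, voice, BCI).
class InputModality {
public:
    virtual ~InputModality() = default;
    virtual std::string name() const = 0;
    virtual bool poll(Intent& out) = 0;  // true if a new instruction is available
};

// Example modality: a voice-recognition stub.
class VoiceInput : public InputModality {
public:
    std::string name() const override { return "voice"; }
    bool poll(Intent& out) override {
        // A real implementation would wrap a speech-recognition engine.
        out = {"bring_glass", 0.92};
        return true;
    }
};

// Safety layer: every command is checked against the tracked positions of the
// patient and nearby obstacles before the arm is allowed to move.
class SafetyMonitor {
public:
    bool motionIsSafe(const Intent& intent) const {
        (void)intent;  // placeholder for collision checking against the environment model
        return true;
    }
};

// Core controller: holds the modalities selected for a given user, validates
// each instruction, and dispatches predefined arm tasks.
class ArmController {
public:
    void addModality(std::unique_ptr<InputModality> m) {
        modalities_.push_back(std::move(m));
    }

    void step() {
        Intent intent;
        for (auto& m : modalities_) {
            if (m->poll(intent) && intent.confidence > 0.8) {
                if (safety_.motionIsSafe(intent)) {
                    execute(intent);
                } else {
                    std::cout << "Rejected unsafe command: " << intent.task << "\n";
                }
            }
        }
    }

private:
    void execute(const Intent& intent) {
        // A real controller would plan and drive the arm here, and would log
        // the task for the adaptability and shared-learning features.
        std::cout << "Executing task: " << intent.task << "\n";
    }

    std::vector<std::unique_ptr<InputModality>> modalities_;
    SafetyMonitor safety_;
};

int main() {
    ArmController controller;
    controller.addModality(std::make_unique<VoiceInput>());  // chosen per user's abilities
    controller.step();
    return 0;
}

In this sketch, additional input channels (gaze, blow, BCI) would be added as further InputModality implementations and registered per user, mirroring the modality-selection idea described above.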
Acronym: 
QUADRIBOT
Project ID: 
8766
Start date: 
01-04-2013
Project Duration: 
24 months
Project costs: 
550 000.00 €
Technological Area: 
Apparatus Engineering
Market Area: 
Handicap aids