This paper presents a method to guide and control a wheelchair for disabled people based on eye movement. The concept is intended for people with locomotor disabilities. The proposed system involves three stages: image detection, image processing, and sending control signals to the wheelchair. Eye movement is detected using a head-mounted camera. Images of the eye are sent to a laptop, where they are processed in Python. The corresponding output signals are then sent to the motor-driver circuit, which controls the motors.
A wheelchair is a chair with wheels, invented in the early 5th century. The device comes in variations where it is propelled by motors or by the seated occupant turning the rear wheels by hand; often there are handles behind the seat for someone else to do the pushing. Wheelchairs are used by people for whom walking is difficult or impossible due to illness, injury, or disability. People who have difficulty both sitting and walking often need to use a wheel bench instead. A basic manual wheelchair incorporates a seat, footrests, and four wheels: two caster wheels at the front and two large wheels at the back. Other varieties of wheelchair are often variations on this basic design, but can be highly customised for the user's needs. Such customisations may encompass the seat dimensions, height, seat angle (also called seat dump or squeeze), footrests, leg rests, front caster outriggers, adjustable backrests, and controls.

An electric-powered wheelchair is a wheelchair moved by an electric motor and navigational controls, usually a small joystick mounted on the armrest, rather than by manual power. For users who cannot manage a manual joystick, head switches, chin-operated joysticks, sip-and-puff devices, or other specialist controls may allow independent operation of the wheelchair.
The purpose of this project is to develop a wheelchair that is controlled by the eyes of the person seated in it. This will give people without full use of their limbs the freedom to move about, providing a level of autonomy. The project consists of three main parts. The eye-tracking module consists of a camera that captures an image of the eyeball. The setup is designed to cause minimum stress to the user: a webcam is fixed onto a spectacle-like frame and positioned to capture the movement of one eye, leaving the other eye with clear vision. The captured images are sent to the laptop, where they are processed. Once an image has been processed, the resulting command is passed to the second part, the microcontroller.
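The paper does not specify the image-processing algorithm, but the gaze-classification step it describes can be sketched in Python. The following is a minimal illustration, not the authors' implementation: it assumes the pupil is the darkest region of a grayscale eye frame, thresholds the image, takes the centroid of the dark pixels, and maps its horizontal position to a direction command. The threshold value, dead-zone width, and function name `classify_gaze` are all hypothetical choices for this sketch; a real system would likely use a library such as OpenCV for capture and more robust pupil detection.

```python
import numpy as np

def classify_gaze(gray, pupil_thresh=60, dead_zone=0.15):
    """Classify gaze direction from a grayscale eye image.

    Assumes the pupil is the darkest region: threshold the frame,
    take the centroid of the dark pixels, and compare its horizontal
    position against the frame centre.
    """
    mask = gray < pupil_thresh            # dark pixels = candidate pupil
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return "no_pupil"                 # e.g. a blink: the chair should stop
    cx = xs.mean() / gray.shape[1]        # normalised horizontal centroid, 0..1
    if cx < 0.5 - dead_zone:
        return "left"
    if cx > 0.5 + dead_zone:
        return "right"
    return "forward"

# Synthetic 100x100 "eye": bright sclera with a dark pupil left of centre
frame = np.full((100, 100), 200, dtype=np.uint8)
frame[40:60, 10:30] = 20                  # dark blob well left of centre
print(classify_gaze(frame))               # prints "left"
```

The dead zone around the centre keeps the chair moving forward for small involuntary eye movements; only a deliberate glance past the threshold triggers a turn command, which would then be sent to the motor-driver circuit via the microcontroller.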