
Gesture-Controlled Robot
This project uses Kinect technology to capture, process and interpret human gestures for controlling the motion of a robot. -- By Samarth Shah, Devanshee Shah and Yagnik Suchak
February 15, 2015

Life would be wonderful if everything around us could be controlled by simple gestures. Gesture recognition technology helps us interact with machines naturally, without any additional device. Gestures are interpreted via mathematical algorithms and the corresponding actions are initiated. Although this technology is still in its infancy, applications are beginning to appear; Kinect is one such application.

Fig. 1: Kinect sensor (Courtesy: Wikipedia)

Fig. 2: Block diagram of gesture-controlled robot

Though initially invented for gaming, Kinect is now being used for different purposes. Kinect is a motion-sensing and speech-recognition device developed by Microsoft for the Xbox 360 video game console. The main idea was to be able to use a gaming console without any kind of controller.

This project uses Kinect technology to capture, process and interpret human gestures for controlling the motion of a robot.

Circuit and working


Fig. 2 shows the block diagram of the gesture-controlled robot. It comprises a Kinect sensor interfaced with a computer through a USB port, and a simple robot connected to the computer through a USB-to-serial converter.
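
As a rough illustration of the computer-to-robot link, the listing below opens the port created by the USB-to-serial converter and sends a single command byte. It is only a sketch: it assumes a Linux host where the converter appears as /dev/ttyUSB0, a baud rate of 9600 and a single-character command ('F'), none of which are specified in this article.

/* Sketch of the PC side of the serial link (assumptions: Linux host,
 * /dev/ttyUSB0 as the converter's device node, 9600 baud, one-byte commands). */
#include <fcntl.h>
#include <stdio.h>
#include <termios.h>
#include <unistd.h>

int main(void)
{
    int fd = open("/dev/ttyUSB0", O_RDWR | O_NOCTTY);
    if (fd < 0) {
        perror("open serial port");
        return 1;
    }

    struct termios tio;
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);                 /* 8N1, no echo, no flow control */
    cfsetispeed(&tio, B9600);
    cfsetospeed(&tio, B9600);
    tcsetattr(fd, TCSANOW, &tio);

    write(fd, "F", 1);               /* e.g. 'F' = move forward */
    close(fd);
    return 0;
}

On a Windows host, the same command bytes would simply be written to the corresponding COM port instead.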

Kinect sensor. The Kinect sensor is packed with an array of sensors and specialised devices that pre-process the information received. The Kinect and the computer (running Windows or Linux) communicate through a single USB cable.

Main features of Kinect sensor include:

Gesture recognition. It can recognise gestures like hand movements, based on inputs from an RGB camera and depth sensor (a rough mapping from hand position to motion command is sketched after this list).


Speech recognition. It can recognise spoken words and convert them into text, although accuracy strictly depends on the dictionary used.
Input is from a microphone array.
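
The article does not give the exact gesture set, but the gesture recognition step can be sketched as follows: the Kinect's skeletal tracking (not shown here) supplies the position of the operator's hand, and the program maps that position to a single-byte motion command. The coordinate convention, the dead-zone threshold and the command characters below are all assumptions for illustration.

/* Sketch: mapping a tracked hand position to a motion command.
 * The hand offset is assumed to come from the Kinect skeletal-tracking API
 * (not shown); the characters 'F', 'B', 'L', 'R', 'S' are illustrative only. */
#include <stdio.h>

/* Hand position relative to the shoulder, in metres (assumed convention). */
typedef struct {
    float dx;   /* positive: hand to the right of the shoulder */
    float dy;   /* positive: hand above the shoulder           */
} HandOffset;

static char gesture_to_command(HandOffset h)
{
    const float threshold = 0.25f;       /* dead zone, chosen arbitrarily */

    if (h.dy >  threshold) return 'F';   /* hand raised    -> forward    */
    if (h.dy < -threshold) return 'B';   /* hand lowered   -> backward   */
    if (h.dx >  threshold) return 'R';   /* hand to right  -> turn right */
    if (h.dx < -threshold) return 'L';   /* hand to left   -> turn left  */
    return 'S';                          /* inside dead zone -> stop     */
}

int main(void)
{
    HandOffset sample = { 0.05f, 0.40f };   /* example reading */
    printf("command: %c\n", gesture_to_command(sample));
    return 0;
}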

Fig. 3: Circuit of the robot

The main components are the RGB camera, depth sensor and microphone array. The depth sensor combines an IR laser projector with a
monochrome CMOS sensor to get 3D video data. Besides these, there is a motor to tilt the sensor array up and down for the best view of
the scene, and an accelerometer to sense position.



Robot. Fig. 3 shows the circuit of the robot. It is built around an ATmega16 MCU (IC2), RS-232 driver IC MAX232 (IC1), 5V regulator IC 7805 (IC4), motor driver IC L293D (IC3) and a few discrete components.

The robot's COM port is connected to the computer through the USB-to-serial converter. Control commands for the robot are sent over this serial link, and the RS-232 levels are converted into 5V TTL/CMOS levels by IC1. These TTL/CMOS signals are fed directly to the MCU (IC2), which controls motors M1 and M2 to move the robot in all directions. Port pins PB4 through PB7 of IC2 are connected to input pins IN1 through IN4 of IC3, respectively, to provide the driving inputs. EN1 and EN2 are tied to VCC to keep IC3 permanently enabled. LED1 and LED2 are connected to port pins PB1 and PB2 of IC2 for testing purposes.
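
A minimal sketch of the MCU-side firmware is given below. It follows the pin mapping described above (IN1 through IN4 on PB4 through PB7, test LEDs on PB1 and PB2) and assumes an 8MHz clock, 9600-baud UART and the single-character commands used in the earlier sketches; the exact logic levels for each direction depend on how the motors are wired and are assumptions here.

/* Sketch of the ATmega16 firmware (assumptions: F_CPU = 8 MHz, 9600 baud,
 * 'F'/'B'/'L'/'R'/'S' command bytes, IN1..IN4 of the L293D on PB4..PB7). */
#define F_CPU 8000000UL
#include <avr/io.h>

static void uart_init(void)
{
    UBRRH = 0;
    UBRRL = (uint8_t)(F_CPU / 16 / 9600 - 1);             /* 9600 baud     */
    UCSRB = (1 << RXEN);                                  /* receive only  */
    UCSRC = (1 << URSEL) | (1 << UCSZ1) | (1 << UCSZ0);   /* 8N1 format    */
}

static char uart_receive(void)
{
    while (!(UCSRA & (1 << RXC)))        /* wait until a byte arrives */
        ;
    return UDR;
}

/* Write IN1..IN4 (bit0..bit3) to PB4..PB7 without disturbing PB0..PB3. */
static void drive(uint8_t in_bits)
{
    PORTB = (PORTB & 0x0F) | (uint8_t)(in_bits << 4);
}

int main(void)
{
    DDRB |= 0xF0 | (1 << PB1) | (1 << PB2);   /* motor pins and test LEDs */
    uart_init();

    for (;;) {
        switch (uart_receive()) {
        case 'F': drive(0x05); break;    /* IN1, IN3 high: both forward  */
        case 'B': drive(0x0A); break;    /* IN2, IN4 high: both reverse  */
        case 'L': drive(0x04); break;    /* one motor only: turn left    */
        case 'R': drive(0x01); break;    /* other motor only: turn right */
        default:  drive(0x00); break;    /* 'S' or unknown: stop         */
        }
    }
}

Since EN1 and EN2 are tied to VCC, the firmware only needs to select direction; no PWM speed control is involved.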


