
Real-time Navigation for Visually-Impaired Students
Using an Android App Module

Project Report Submitted by:


Jennifer Kezia V
Naveen Kumar B
Samuel Stanly
Anu Kumari
Sibi R

Department of Electrical and Electronics Engineering


Table of Contents

Executive Summary
Objectives
Technology Platform and Architecture
    MIT APP INVENTOR VERSION 2
Project Description
    Homescreen
    Student Registration Database
    CREATING A PATH
        Fusion Tables
    Compass and GPS
    TAKE ME TO CLASS
        Text To Speech
        Speech Recognizer
Relevance of the project in E-Governance
Future Outcomes

Executive Summary:
In the present era, owning a smartphone is close to a necessity for every person in society. Smartphones have reached a state in which they help governments and nations shape the way society progresses. From an app as simple as WhatsApp for communication to Uber for booking taxis, they have become a core part of our day-to-day life.

Smartphone applications have changed the traditional way of using cellphones: they are no longer used only for calling and messaging, but for a wide range of other purposes.

Every school has traffic jams between classes. The sound of rushing students fills the air; the rushing, bumping and pushing are hurdles every student must clear to get to class on time. These challenges loom twice as high for the visually impaired. Suddenly the world becomes an unpredictable (and noisy) obstacle course, and they must rely on other techniques to accomplish life's daily activities.

Our app is user-friendly and designed to assist the indoor navigation process in a unique way (with outdoor support in the foreseeable future). This ambitious app guides visually-impaired students and dramatically enriches their campus experience, giving campus orientation a whole new meaning.

A Mobility Specialist can spend a few hours training a visually-impaired student in the spatial awareness, location and protective techniques needed to navigate school hallways. This replaces the many hours otherwise spent teaching the student where each new wall will lead.

Through our app venture we aim to give the visually-impaired student a technological way to reach campus wings, classrooms, bathrooms, the gym and other parts of campus just by speaking into their phone. A Mobility Specialist can set up various paths between locations on campus for a student to later use to move about the school. The student talks into the device and requests help navigating to and from particular locations. Our app cuts the training time in half.

Modern technological advances allow our app to open invisible doors that were once thought permanently shut. It will allow blind students to reach their highest level of independence yet. Through this app, visually-impaired students can keep beating the odds and stay connected just like every other student on a school campus. Let us take you by the arm, digitally, and welcome you into our app.

Objectives:

Through our innovative, state-of-the-art Android app module we hope to accomplish the following tasks for the benefit of the visually impaired, so as to rid them of a cumbersome burden and make their lives hassle-free:

- Aid those who are visually impaired, have a hard time seeing, or work as mobility specialists.
- Create a database for uploading student details and downloading the existing student database.
- Assemble an algorithm for recording programmed paths between various sources and destinations for visually-challenged persons.
- Integrate the compass with our app module, and record geographic position coordinates (latitude and longitude) from the location sensor.
- Devise GPS (Global Positioning System) and step-counter support for navigating the student from source to destination without any hassle.

Fig. App module architecture: the Compass, GPS and Step counter feed the Navigation process.
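The step counter named in the figure is not elaborated in the report. A common approach is to count peaks in the accelerometer magnitude; a minimal sketch in Python (not App Inventor's block language), where the threshold and sample values are illustrative assumptions:

```python
# Count steps as upward crossings of a magnitude threshold.
# The 11.0 m/s^2 threshold and the sample values are illustrative
# assumptions, not calibrated figures.

def count_steps(magnitudes, threshold=11.0):
    """Count each upward crossing of the threshold as one step."""
    steps = 0
    above = False
    for m in magnitudes:
        if m > threshold and not above:
            steps += 1
            above = True
        elif m <= threshold:
            above = False
    return steps

# Three simulated strides around gravity (~9.8 m/s^2):
samples = [9.8, 12.1, 9.5, 9.8, 12.6, 9.7, 9.8, 12.3, 9.6]
print(count_steps(samples))   # 3
```

A real implementation would also low-pass filter the signal and debounce the crossings, but the crossing count is the core idea.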
Technology Platform and Architecture

MIT APP INVENTOR VERSION 2

App Inventor for Android is an open-source web application originally


provided by Google, and now maintained by the Massachusetts Institute of
Technology (MIT).

It allows newcomers to computer programming to create software applications for the Android operating system (OS). It uses a graphical interface, very similar to Scratch and the StarLogo TNG user interface, which allows users to drag and drop visual objects to create an application that can run on Android devices. In creating App Inventor, Google drew upon significant prior research in educational computing, as well as work done within Google on online development environments.

App Inventor and the projects on which it is based are informed by constructionist learning theories, which emphasize that programming can be a vehicle for engaging powerful ideas through active learning. As such, it is part of an ongoing movement in computers and education that began with the work of Seymour Papert and the MIT Logo Group in the 1960s and has also manifested itself in Mitchel Resnick's work on Lego Mindstorms and StarLogo.

App Inventor includes:

- A designer, in which a program's components are specified. This includes visible components, such as buttons and images, which are placed on a simulated screen, and non-visible components, such as sensors and web connections.
- A blocks editor, in which the program's logic is created.
- A compiler based on the Kawa language framework and Kawa's dialect of the Scheme programming language, developed by Per Bothner and distributed as part of the GNU operating system by the Free Software Foundation.
- An app for real-time debugging on a connected Android device.

The application was made available by request on July 12, 2010, and released publicly on December 15, 2010. The App Inventor team was led by Hal Abelson and Mark Friedman. In the second half of 2011, Google released the source code, terminated its
server, and provided funding for the creation of The MIT Center for Mobile Learning,
led by App Inventor creator Hal Abelson and fellow MIT professors Eric Klopfer and
Mitchel Resnick. The MIT version was launched in March 2012.

The blocks editor in the original version ran in a separate Java process, using the Open Blocks Java library for creating visual blocks programming languages.

Figure. Android Development Suite

Open Blocks is distributed by the Massachusetts Institute of Technology's Scheller Teacher Education Program (STEP) and is derived from master's thesis research by Ricarose Roque. Professor Eric Klopfer and Daniel Wendel of the Scheller Program supported the distribution of Open Blocks under an MIT License. The MIT AI2 Companion app enables real-time debugging on connected devices via Wi-Fi, not just USB. As of May 2014, the service had 87,000 weekly active users and 1.9 million registered users in 195 countries, for a total of 4.7 million apps built.

Fig. Blocks Editor

Java Web Start programs are applications that launch from your Web
browser, but run as separate programs. The Blocks Editor is a part of App Inventor
that runs separately from your browser.

Project Description

Our project incorporates the following aspects to cover the complete navigation needs of visually-challenged persons. The features are as follows:

- STUDENT SETTINGS customizes the application for a particular student.

- CREATE A PATH is a module for the mobility specialists, who program the path that the visually-impaired student will take.

- TAKE ME TO CLASS is for the end user (the student). The student tells the app by voice where he/she is located and where he/she would like to go. The path recorded by the mobility specialist then pops up and guides the student from the SOURCE to the DESTINATION.

Homescreen:
The homescreen of the app presents the user with the option to enter any of the
aforementioned dialog boxes.

The Instructions option introduces the entry-level user to the various features offered in the app module.

It presents the necessary information in an elegant and sleek way so that the user can navigate through the app without much aggravation.

The entire objective of the app is to make the user feel at ease with it, so the visual aesthetic is kept as simple as possible; the end user is a student, and the app must not look intimidating if it is to achieve maximum adoption.

Fig. Homescreen of the APP Module.

Fig. Homescreen showing Student and school name

Student Registration Database:

Fig. Student Registration Database

You retrieve data from the database with the TinyDB.GetValue block. When you call GetValue, you request particular data by providing a tag. For example, you can request the custom response using the same tag as was used in StoreValue, responseMessage. The call to GetValue returns the data, so you must plug it into a variable.

Fig. Block For loading the database when app starts.

Fig. Database Recorded

The blocks put the data returned from GetValue into the variable response and then check whether response has a length greater than 0. If it does, the database returned a nonempty custom response, which should be placed in the Response Label. If the length of the returned value is 0, no data with the tag responseMessage has been stored, and no action is necessary.
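App Inventor's blocks are visual rather than textual, but the store-then-check logic above can be sketched in Python with a stand-in for TinyDB (the class and helper names here are illustrative, not real App Inventor APIs):

```python
# Minimal stand-in for App Inventor's TinyDB component:
# StoreValue saves data under a tag; GetValue returns it, or the
# empty string when the tag was never stored.

class TinyDB:
    def __init__(self):
        self._store = {}

    def store_value(self, tag, value):
        self._store[tag] = value

    def get_value(self, tag, default=""):
        return self._store.get(tag, default)

def load_response(db):
    """Mirror the blocks: fetch the tag, act only if non-empty."""
    response = db.get_value("responseMessage")
    if len(response) > 0:
        return response   # would go into the Response Label
    return None           # nothing stored, so no action

db = TinyDB()
assert load_response(db) is None          # no data stored yet
db.store_value("responseMessage", "Jennifer Kezia V")
print(load_response(db))                  # Jennifer Kezia V
```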

CREATING A PATH
Now that we have uploaded the student and school name into the database, it is time for the Mobility Specialists to record the path into the Fusion Tables.

Fusion Tables:
Fusion Tables can be considered the online database for the app module. They offer cloud-hosted storage free of cost, which boosts the versatility of the app module.

Fig. Creating a path which is recorded in the Fusion tables.
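A Fusion Table is essentially rows and columns, so a recorded path can be sketched as one row per segment. The column names and coordinate values below are illustrative assumptions, not the project's actual schema:

```python
# One row per recorded segment; path_table stands in for the
# online Fusion Table. Column names and coordinates are assumed.

path_table = []

def record_segment(source, destination, segment, direction, lat, lng):
    """Append one row, as the Create-a-Path screen would."""
    path_table.append({
        "Source": source, "Destination": destination,
        "Segment": segment, "Direction": direction,
        "Lat": lat, "Lng": lng,
    })

def lookup_path(source, destination):
    """Fetch the rows for one route, ordered by segment number."""
    rows = [r for r in path_table
            if r["Source"] == source and r["Destination"] == destination]
    return sorted(rows, key=lambda r: r["Segment"])

record_segment("Entrance", "Room 101", 2, "left", 12.9402, 80.1534)
record_segment("Entrance", "Room 101", 1, "straight", 12.9400, 80.1532)
print([r["Direction"] for r in lookup_path("Entrance", "Room 101")])
# ['straight', 'left']
```

Sorting by segment number means the Mobility Specialist can record segments in any order and playback still proceeds from source to destination.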

From the figure above, we can see that the Mobility Specialist can choose the starting and destination locations for recording.

The Direction dialog box gives the student a sense of direction for navigating the classroom corridors. Get Coordinates links with the phone's location sensor to obtain the latitude and longitude of the position where the segment changes direction, while the compass supplies the current heading.

Fig. Screen showing direction and Coordinates recording.

The path can be divided into four segments, each covering a part of the total distance. By dividing the total distance into parts, we break down the complexity of the navigation process. This theme of separation aligns with the core idea of the app: to be as simple as possible for the end user.
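The division into parts can be sketched as slicing a list of recorded waypoints into four roughly equal segments (the waypoints themselves are placeholders here):

```python
# Split a recorded list of waypoints into (up to) four consecutive
# segments of roughly equal length, mirroring the idea of breaking
# the route into simpler parts.

def split_into_segments(waypoints, parts=4):
    """Return consecutive slices covering the whole list."""
    size, extra = divmod(len(waypoints), parts)
    segments, start = [], 0
    for i in range(parts):
        end = start + size + (1 if i < extra else 0)
        if start < end:
            segments.append(waypoints[start:end])
        start = end
    return segments

route = list(range(10))   # ten placeholder waypoints
print([len(s) for s in split_into_segments(route)])   # [3, 3, 2, 2]
```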

Compass and GPS:
The compass and GPS are already built into most smartphones and are used here, avoiding extra hardware costs and keeping the app as budget-friendly as possible.

Fig. The GPS and Compass Sensor is embedded in the app development
suite.

From the above we can see how simply technologies as complex as GPS and the compass can be incorporated with the application suite.

In this way the path is recorded and stored in the FUSION TABLES, a free cloud-hosted platform.
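During playback, the stored coordinates and the live GPS fix must be turned into guidance. The report does not give the formulas; a standard sketch (haversine distance and initial bearing, written in Python rather than App Inventor blocks) looks like this:

```python
import math

# Given the current GPS fix and the next recorded point, compute
# the distance still to walk and the bearing the student should
# face (to be compared against the compass heading).

def distance_m(lat1, lng1, lat2, lng2):
    """Great-circle distance in metres (haversine formula)."""
    r = 6371000.0   # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lng2 - lng1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def bearing_deg(lat1, lng1, lat2, lng2):
    """Initial bearing from point 1 to point 2, degrees clockwise from north."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lng2 - lng1)
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return (math.degrees(math.atan2(y, x)) + 360) % 360

# One degree of longitude at the equator is about 111 km due east:
print(round(distance_m(0, 0, 0, 1)))   # 111195
print(bearing_deg(0, 0, 0, 1))         # 90.0
```

Over corridor-length distances the app would mostly compare the computed bearing against the compass heading and the remaining distance against the step count.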

Now we come to the final part of the app, TAKE ME TO CLASS, which performs the actual navigation and makes use of speech-to-text technology.

TAKE ME TO CLASS:
We have come to the final part of the app module, and here we offer some insights into the speech technology on which the app relies. Let us take you through the algorithm.

Text To Speech
The TextToSpeech component turns any string into audible speech. The speech is based on the Android device's speech settings; however, you can set the language and country using the Country and Language blocks.

The .Speak block takes whatever text is input in the message socket and turns it into audible spoken words. You can place either a text block or input from a text box field into the message socket.

The .BeforeSpeaking event is called after the .Speak block but before the .AfterSpeaking event; you can use it to execute blocks before the .AfterSpeaking event occurs. The .AfterSpeaking event is triggered after the text has been rendered to speech.

The example in Figure shows a button being used to call the .Speak block to speak
text that has been entered into a text box. The .AfterSpeaking event then calls a
procedure to reset the application.
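The event ordering described above (.Speak fires .BeforeSpeaking, the speech is rendered, then .AfterSpeaking fires) can be sketched in Python; the class below is an illustrative stand-in, not the App Inventor component:

```python
# Log the TextToSpeech event order instead of playing audio:
# Speak -> BeforeSpeaking -> (speech rendered) -> AfterSpeaking.

events = []

class TextToSpeechSketch:
    def before_speaking(self):
        events.append("BeforeSpeaking")

    def after_speaking(self):
        events.append("AfterSpeaking")   # e.g. reset the application

    def speak(self, message):
        self.before_speaking()
        events.append("speak: " + message)   # audio would play here
        self.after_speaking()

tts = TextToSpeechSketch()
tts.speak("Turn left at the next corridor")
print(events)
# ['BeforeSpeaking', 'speak: Turn left at the next corridor', 'AfterSpeaking']
```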

The example in the Figure shows the SoundRecorder component being used to record a sound, with a SoundPlayer component's source set to the newly recorded sound. The pictured blocks follow this logic.

When Button1 is pressed, if the varRecording variable is true, the .Stop block is called and the variable is set to false. If varRecording is false, the .Start block is called to start recording and varRecording is set to true.

Speech Recognizer
The Speech Recognizer uses Google's network-dependent voice-to-text system to transcribe a user's vocal input. The component requires network connectivity to function.

The .GetText call initiates the Android speech component, which prompts the user for speech input and then sends the sound clip to Google's speech-to-text system. The resulting text is sent back to the device and the .AfterGettingText event is triggered.

The .AfterGettingText event is triggered when the Google servers send back the text
from the speech input. The result parameter contains the text for use in your
application.

The .BeforeGettingText event is called after the .GetText call is made but before .AfterGettingText is triggered by the returning text.

Figure shows a button calling the speech components and a label being
populated with a text result.
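Once .AfterGettingText returns a transcript, the app still has to extract the source and destination from it. A minimal sketch follows; the "from X to Y" phrasing is an assumed convention for illustration, not something the report specifies:

```python
import re

# Pull (source, destination) out of a spoken request such as
# "Take me from the library to room 101". The phrasing pattern
# is an assumption for illustration.

def parse_request(transcript):
    """Return (source, destination), or None if no match."""
    m = re.search(r"from (.+?) to (.+)", transcript.lower())
    if m:
        return m.group(1).strip(), m.group(2).strip()
    return None

print(parse_request("Take me from the library to room 101"))
# ('the library', 'room 101')
```

The returned pair can then be used to look up the matching recorded path before playback begins.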

As a result, we have the final screen of the app module as follows:

Before VOICE-NAVIGATION Process

After VOICE-NAVIGATION Process

Relevance of the project in E-Governance
Today nearly every individual possesses a smartphone, so the base for this application is readily available at hand. The application has been developed with the main objective of acting as an aid in the area where the visually impaired struggle the most: navigation.

Although many projects have been proposed and implemented by the government for the betterment of the physically handicapped, very few have been developed specifically for the visually impaired. Through this project we aim to help visually-impaired students lead confident, normal lives like any other individual.

Our project is pro bono and very easy to implement, as it does not require complicated training or procedures. It is easy to understand and utilize; in other words, it is user-friendly.

Future Outcomes:

- It is a common scene to see a visually-impaired person asking someone to help him/her cross the road, or tapping a stick ahead before walking; this is highly demotivating and discouraging to them.

- The Government can set up this app in every school and college for the benefit of the visually impaired, which is the need of the hour.

- By implementing this app module, the government needs to set up the module only once, and it can be expected to last for a long period of time.

- This would encourage the visually challenged to be more self-dependent and give them a sense of encouragement in pursuing their day-to-day lives.

- Like our past creations and innovations, this one may be used to improve the human condition or it may be misused and abused. That is not a technical choice we are faced with; it's a social one. Let's make the right choice and bring out the best in the future of technology.

