SCHOOL OF ENGINEERING
COCHIN UNIVERSITY OF SCIENCE AND TECHNOLOGY
KOCHI-682022, KERALA, INDIA
CERTIFICATE
Certified that the major project report entitled REAL TIME EYE STATE RECOGNITION SYSTEM WITH ZERO TRAINING OVERHEAD is a bonafide record of work done by KAVYA MANOHARAN, LEO ALEXANDER, NAICINY C B, RAGINI and RESHMA C towards the partial fulfillment for the award of the degree of B.Tech in Electronics and Communication of Cochin University of Science and Technology, Kochi-682022.
Co-ordinators
Dr. Binu Paul
Dr. S. Mridula
ACKNOWLEDGEMENT
We express our sincere thanks to Dr. Binu Paul, Head of the Division of Electronics Engineering, for her kind cooperation, encouragement and help.
We express our gratitude to our co-ordinators Dr. Binu Paul and Dr. S. Mridula, Associate Professors, Division of Electronics Engineering, for their expert guidance and advice in presenting the Major Project.
We also take this opportunity to express a deep sense of gratitude to Mr. Salih Muhammed for coordinating our activities and for helping us throughout this project.
Lastly, we thank the Almighty, our parents, brothers, sisters and friends for their constant encouragement, without which this project would not have been possible.
ABSTRACT
Eye state recognition is one of the main stages of many image processing systems, such as driver drowsiness detection systems and closed-eye photo correction. Driver drowsiness is one of the main causes of road accidents around the world. In these circumstances, a fast and accurate driver drowsiness detection system can prevent such accidents. Here, a fast algorithm for determining the state of an eye, based on the difference between the iris/pupil color and the white area of the eye, is proposed. In the proposed method, vertical projection is used to determine the eye state. This method is suitable for hardware implementation, to be used in a fast and online drowsiness detection system. The proposed method, along with the other needed preprocessing stages, is implemented on Field Programmable Gate Array chips. The results show that the proposed low-complexity algorithm has sufficient speed and accuracy to be used in real-world conditions.
CONTENTS

1      Introduction                                15
2      Real Time Eye State Recognition            17
2.1    Flow Chart                                 17
3      MATLAB                                     21
3.1    Introduction To MATLAB                     21
3.1.1  MATLAB: Matrix Laboratory                  21
3.1.2                                             21
3.1.3                                             21
3.1.4  Flow Control                               22
3.2                                               23
3.3    Implementation In MATLAB                   23
3.4    MATLAB Program                             27
3.5    Simulation results                         31
4      Hardware Implementation Design             35
5      Xilinx                                     43
5.1    Introduction to Xilinx                     43
6      XILINX IMPLEMENTATION                      47
6.1    Writing an image to a RAM                  47
6.2    Image Ram                                  47
6.3    Adder Unit                                 47
6.4    PV RAM                                     48
6.5    PV RAM control circuit                     48
6.6    Smoothening Unit and Control circuitry     50
6.7                                               53
6.8                                               55
6.9                                               56
6.10                                              60
7      Conclusion                                 63
       References                                 65
       Appendix                                   67
       Index                                      69
LIST OF FIGURES

Figure No.   Description                                                        Page No.
Figure 1.1   Overall system for hardware implementation of an eye state
             recognition                                                        16
Figure 2.1                                                                      17
Figure 2.2   Image, original vertical projection, and smoothed ZPPV of an
             open eye                                                           19
Figure 2.3   Image, original vertical projection, and smoothed ZPPV of a
             closed eye                                                         20
Figure 3.1                                                                      30
Figure 3.2                                                                      30
Figure 3.3                                                                      31
Figure 3.4                                                                      32
Figure 3.5   Working and not-working cases for closed eyes                      33
Figure 4.1   PV_RAM and its ports; control circuit                              36
Figure 4.2   Smoothing unit and its control circuit                             38
Figure 4.3   Max/min searching unit and its control circuit                     39
Figure 4.4   Condition checking unit: type and location checking, threshold
             checking, control circuit                                          41
Figure 5.1                                                                      43
Figure 5.2                                                                      44
Figure 5.3   System generator token, Gateway in block, Gateway out block        45
Figure 6.1                                                                      47
Figure 6.2                                                                      47
Figure 6.3                                                                      47
Figure 6.4                                                                      48
Figure 6.5                                                                      48
Figure 6.6                                                                      49
Figure 6.7                                                                      50
Figure 6.8                                                                      51
Figure 6.9                                                                      52
Figure 6.10                                                                     52
Figure 6.11                                                                     53
Figure 6.12                                                                     54
Figure 6.13                                                                     54
Figure 6.14                                                                     55
Figure 6.15                                                                     55
Figure 6.16                                                                     56
Figure 6.17  Type and location checking block of condition checking unit        58
Figure 6.18                                                                     57
Figure 6.19                                                                     58
Figure 6.20  Output of threshold checking unit of condition checking unit       59
Figure 6.21                                                                     60
Figure 6.22  Output of control circuit for the condition checking unit          61
Figure A.1   Overall system for eye state recognition in Xilinx                 67
LIST OF TABLES

EC,SOE,CUSAT, 2011 Admission

Table No.   Description                       Page No.
Table 3.1   Results of simulation in MATLAB   31
CHAPTER-1
Introduction
All over the world, driver fatigue and drowsiness cause many car accidents every day. In fact, drowsiness is the cause of about 20% of all car accidents in the world. With the increasing popularity of automobiles, traffic accidents occur frequently, and some of them are caused by fatigued driving, especially when driving alone. Traffic accidents would be largely reduced if a judging rule could determine whether drivers are awake or not, and warn them when they begin to fall asleep; it is therefore meaningful to research fatigue detection algorithms, which are also a key technology in smart-vehicle driving. As a result, an electronic device to monitor the driver's awareness is needed. This device should monitor and detect the driver's drowsiness online and activate an alarm system immediately. In recent years, much research on these systems has been done and the results reported. One of these methods is to monitor the movement of the vehicle to detect drowsiness of the driver [6]. This method depends very much on the type of vehicle and the condition of the road. Another method is to process the electrocardiogram (ECG) signals of the driver [7]. In this system, some ECG probes need to be connected to the driver, which can disturb the driver. There are other methods based on processing images of the driver's face and eyes; some of the methods in this category process the image of the driver and monitor his/her eye blinking.
In this report, a new algorithm to recognize the state of an eye, without the constraints of the previous methods, is proposed. This algorithm is less sensitive to lighting conditions than other algorithms, with no need for a training phase. In order to verify the correctness of the proposed algorithm, a computer simulation was developed. The results show fast performance and acceptable accuracy for the proposed train-less eye state recognition algorithm.
Here, the gray-level values of an image of an eye are processed. Column-wise sums of these values are calculated and smoothened. Local maxima and minima are found, and specified conditions are checked. The state of the eye is determined from these conditions.
Figure 1.1 Overall system for hardware implementation of an eye state recognition
CHAPTER-2
Real Time Eye State Recognition
2.1 Flow Chart
The vertical projection of an image I is defined as

PV(j) = sum over i = 1 to m of I(i, j),   j = 1, 2, ..., n

where i is the row number, j is the column number, and PVlen = n is the size of this projection vector. For example, the original vertical projection of an image of an eye shown in Figure 2.2(a) is depicted in Figure 2.2(b). The vertical projection vector needs to be smoothened. To obtain a smooth vector, we use an averaging filter. The size of this averaging filter, AFlen, is considered to be the floor of PVlen/7:

AFlen = floor(PVlen/7)
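As a concrete illustration, the projection-vector and filter-length definitions above can be sketched as follows (the report's own code is MATLAB; this Python sketch, with a made-up 3x7 gray-level image, is only illustrative):

```python
def vertical_projection(image):
    """Column-wise sum of gray values: PV(j) = sum over rows i of I(i, j)."""
    rows = len(image)
    cols = len(image[0])  # PVlen = n
    return [sum(image[i][j] for i in range(rows)) for j in range(cols)]

def averaging_filter_length(pv_len):
    """AFlen = floor(PVlen / 7)."""
    return pv_len // 7

# A made-up 3x7 gray-level image: bright sides, dark middle (like an open eye)
image = [
    [200, 180, 40, 10, 45, 190, 210],
    [195, 170, 35,  5, 50, 185, 205],
    [205, 175, 30,  8, 55, 180, 200],
]
print(vertical_projection(image))    # [600, 525, 105, 23, 150, 555, 615]
print(averaging_filter_length(136))  # 19, the filter length used later
```

Note how the dark pupil/iris columns produce a dip in the projection, which is exactly what the state decision below exploits.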
As shown in Figure 2.2(a), the image of an open human eye has three different areas: the pupil/iris in the middle and two white parts on the left and right sides. However, in the image of a closed eye, these areas are not discriminated. The effect of this observation on the projection vector is the basis of our proposed algorithm to determine the state of the eye. As shown in Figure 2.2(b), the projection vector of an eye in the open state has three areas. The projection vector over the pupil and iris area has smaller gray values than the two other, white areas. As a result, the projection vector has a local minimum in its middle, belonging to the pupil area, and two local maxima, belonging to the two white parts of the eye. The method which searches for these local minimum and maxima in the projection vector is as follows. First, we add AFlen/2 zeros to the left and right sides of the projection vector, to generate the zero-padded projection vector (ZPPV), with a length of ZPlen = PVlen + AFlen. If at least one group in the ZPPV of the image satisfies both of the following conditions, then the eye is open; otherwise it is closed.
Condition-1: The ratio of the difference between Ysmax and Ymin to Ysmax is greater than a threshold value, Th:

(Ysmax - Ymin) / Ysmax > Th
Condition-2: The minimum that satisfies Condition-1 is located almost in the middle of the ZPPV. That is, the location of this minimum, Xmin, lies between 0.4*ZPlen and 0.6*ZPlen:

0.4*ZPlen < Xmin < 0.6*ZPlen
The ratio stated in Condition-1 is based on the difference between the color of the pupil (black) and the white area of the eye. This difference varies as the state of the eye changes from open to closed; that is, when an open eye is closing, it passes through different stages. Condition-1 verifies that an eye is open when this ratio is greater than the threshold value.
On the other hand, in a relaxed and open eye, such as in a driving situation, the pupil is almost at the center of the eye; Condition-2 checks this to validate the openness of the eye.
As an example, consider the image of an open eye shown in Figure 2.2(a). Figure 2.2(b) is its projection vector, and Figure 2.2(c) shows the smoothed and zero-padded projection vector of this image. Based on our experiments, the threshold of Condition-1 is considered to be Th = 0.05. Figure 2.2(c) satisfies both conditions of an open eye; therefore it belongs to an open eye.
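Putting the whole chapter together, a minimal end-to-end sketch of the decision rule might look like this (Python for illustration; the extrema search here uses a simplified neighbour comparison rather than the 9-bit shift register used later in the report, and the synthetic projection vectors are invented):

```python
def eye_state(pv, th=0.05):
    """Decide open/closed from a vertical projection vector using the two
    conditions above. Smoothing and extrema search are simplified stand-ins
    for the report's exact machinery."""
    pv_len = len(pv)
    af_len = max(pv_len // 7, 1)                 # AFlen = floor(PVlen/7)
    pad = af_len // 2                            # zero padding on each side
    zppv = [0.0] * pad + list(pv) + [0.0] * (af_len - pad)
    zp_len = len(zppv)                           # ZPlen = PVlen + AFlen
    # Moving-average smoothing (trailing window of AFlen samples)
    sm = [sum(zppv[max(0, j - af_len + 1):j + 1]) / af_len
          for j in range(zp_len)]
    # Local extrema: (index, value, is_max)
    ext = []
    for j in range(1, zp_len - 1):
        if sm[j] >= sm[j - 1] and sm[j] > sm[j + 1]:
            ext.append((j, sm[j], True))
        elif sm[j] <= sm[j - 1] and sm[j] < sm[j + 1]:
            ext.append((j, sm[j], False))
    # Look for a max-min-max group meeting both conditions
    for k in range(len(ext) - 2):
        (_, v1, m1), (x_min, y_min, m2), (_, v3, m3) = ext[k:k + 3]
        if m1 and not m2 and m3:
            ys_max = min(v1, v3)                 # the smaller maximum
            cond1 = (ys_max - y_min) > th * ys_max
            cond2 = 0.4 * zp_len < x_min < 0.6 * zp_len
            if cond1 and cond2:
                return "open"
    return "closed"

open_pv = [600] * 10 + [100] * 8 + [600] * 10    # bright-dark-bright columns
print(eye_state(open_pv))                        # open
print(eye_state([300] * 28))                     # closed (flat projection)
```

A flat projection has no max-min-max group in the middle, so Condition-2 can never be met and the eye is reported closed.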
Figure 2.2 (a) image of an Open eye (b) original Vertical projection of the open eye (c) Smoothed and ZPPV
of the open eye
Figure 2.3 (a) image of a closed eye (b) original Vertical projection of the closed eye (c) Smoothed and ZPPV
of the closed eye
CHAPTER-3
MATLAB
3.1 Introduction To MATLAB
3.1.1 MATLAB: Matrix Laboratory
MATLAB was later marketed and further developed by MathWorks Inc. (founded in 1984), www.mathworks.com.
MATLAB is a software package which can be used to perform analysis and solve mathematical and engineering problems. It has excellent programming features and graphics capability, and is easy to learn and flexible.
stem(y) - The data sequence y is plotted as stems from the x-axis, terminated with circles at the data values.
The plot and stem functions can take a large number of arguments.
if
The if statement evaluates a logical expression and executes a group of statements
when the expression is true.
If Statement Syntax:
if (Condition_1)
MATLAB Commands;
elseif (Condition_2)
MATLAB Commands;
elseif (Condition_3)
MATLAB Commands;
else
MATLAB Commands;
end
for
The for loop repeats a group of statements a fixed, predetermined number of times.
A matching end delineates the statements.
For loop syntax:
for i=Index_Array
MATLAB Commands;
end
while
Repeats a group of statements an indefinite number of times under control of a
logical condition. A matching end delineates the statements.
While Loop Syntax
while (condition)
MATLAB Commands;
end
switch case
The switch statement executes groups of statements based on the value of a variable
or expression. The keywords case and otherwise delineate the groups.
6.
AF=ones(1,aflen);
for j=1:z-1
if(j<aflen) % partial overlap from the left
o=0;
for f=1:j
o=o+1*zpv(1,f); % tap weights are all 1
end;
AF(1,j)=o/aflen;
elseif (j>=aflen)&(j<=z) % complete overlap
pp=(j-aflen)+1;
y=0;
for f=pp:j
y=y+1*zpv(1,f);
end;
AF(1,j)=y/aflen;
end;
end;
7. Next the maxima and minima are searched for in the smoothened projection vector.
8. Each element of the smoothened projection vector is compared with the preceding element.
9. A nine-bit shift register is maintained: if the current element is greater than or equal to the preceding one, a 1 is shifted in, otherwise a 0.
10. If the shift-register pattern is 000001111 then a maximum has occurred; if it is 111110000 then a minimum has occurred; otherwise it is neither a maximum nor a minimum.
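Steps 8-10 can be sketched in Python (illustrative only; the report's own implementation is the MATLAB code that follows). The register holds rise/fall bits with the newest bit first, matching the patterns above:

```python
def find_extrema(af):
    """Scan the smoothened vector with a 9-bit shift register of rise/fall
    bits (newest first). 000001111 marks a maximum, 111110000 a minimum;
    the extremum is reported 4 samples behind the scan point, as in the
    MATLAB listing (AF(1, i-4))."""
    bits = [0] * 9
    extrema = []  # (index, value, type) with type 1 = max, 0 = min
    for i in range(1, len(af)):
        a = 1 if af[i] >= af[i - 1] else 0  # rise (1) or fall (0)
        bits = [a] + bits[:8]               # shift the register
        if bits == [0, 0, 0, 0, 0, 1, 1, 1, 1]:
            extrema.append((i - 4, af[i - 4], 1))  # maximum found
        elif bits == [1, 1, 1, 1, 1, 0, 0, 0, 0]:
            extrema.append((i - 4, af[i - 4], 0))  # minimum found
    return extrema

print(find_extrema([0, 1, 2, 3, 4, 3, 2, 1, 0, -1]))    # [(5, 3, 1)]
print(find_extrema([5, 4, 3, 2, 1, 0, 1, 2, 3, 4, 5]))  # [(6, 1, 0)]
```

Note that the reported index trails the true peak slightly because of the one-sided comparison; this mirrors the behaviour of the MATLAB listing.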
if AF(i)>=AF(i-1)% compare
a=1;
else
a=0;
end
A(9)=A(8); A(8)=A(7); A(7)=A(6);
A(6)=A(5);A(5)=A(4);A(4)=A(3);A(3)=A(2);A(2)=A(1);A(1)=a; % shift register
if A==[0 0 0 0 0 1 1 1 1 ] %% for maxm
disp ('maxm')
typ(w)=1;
val(w)=AF(1,i-4);
ind(w)=i-4;
w=w+1;
elseif A==[1 1 1 1 1 0 0 0 0 ] %% for minm
disp ('minm')
11. On finding a maximum or minimum, its type (max/min), index, and value in the smoothened vector are stored in three different arrays (registers), which were initialized beforehand.
12. Now the conditions are checked concurrently. If the bit pattern in the type array (register) is 101, then we have a minimum between two maxima. We find the smaller maximum Ysmax and apply the constraints:
a. (Ysmax - Ymin) > 0.05 * Ysmax
b. 0.4*ZPlen < Xmin < 0.6*ZPlen (i.e. Xmin lies in the middle of the image of the eye).
typ(w)=0;
val(w)=AF(1,i-4);
ind(w)=i-4;
w=w+1;
end
end
h=1;
i=2;
s=size(typ);
s=max(s);
s=s-2;
for i=1:s
if typ(i)==1 & typ(i+1)==0 & typ(i+2)==1
ysmax(h)=min(val(i),val(i+2));
ymin(h)=val(i+1);
xmin(h)=ind(i+1);
h=h+1;
end
end
flag=0;
h=size(ysmax);
h=max(h);
for n=1:h
if ((ysmax(n)-ymin(n))>(0.05*ysmax(n)))&(xmin(n)>(0.4*zplen)) & (xmin(n)<(0.6*zplen))
flag=1;
break;
else
flag=0;
end
end
13. If conditions are satisfied then eye is open else the eye is closed.
if (flag==1)
disp('Eye is open')
elseif (flag==0)
disp('Eye is Closed')
end
14. Hence the algorithm is implemented using MATLAB code.
AF(1,j)=o/aflen;
elseif (j>=aflen)&(j<=z)% complete overlap
pp=(j-aflen)+1;
y=0;
for f=pp:j
y=y+1*zpv(1,f);
end;
AF(1,j)=y/aflen;
end;
end;
% plotting the smoothened graph
subplot(3,1,3);
plot(AF)
title('smoothened')
figure;
imshow(b)
A=zeros(1,9);
w=1;
[d, zplen]=size(AF);
%initializations
typ(w)=0;
val(w)=0;
ind(w)=0;
ysmax=0;
ymin=0;
for i=2:zplen-1
if AF(i)>=AF(i-1)% compare the elements of smoothened projection vectors
a=1;
else
a=0;
end
A(9)=A(8); A(8)=A(7); A(7)=A(6);
A(6)=A(5);A(5)=A(4);A(4)=A(3);A(3)=A(2);A(2)=A(1);A(1)=a; % shift register
end
%displaying of result
if (flag==1)
disp('Eye is open')
elseif (flag==0)
disp('Eye is Closed')
end
Table 3.1 Results of simulation in MATLAB

                       Open eyes   Closed eyes   Total
Number of Correct          37          29          66
Number of Incorrect         6           5          11
Accuracy (%)             86.0        85.3        85.7

Open Eye

Closed Eye

Figure 3.5 (a) Working closed eyes (b) Not working
CHAPTER-4
Hardware Implementation Design
In order to implement the proposed algorithm on a hardware platform, we assume that the image of an eye is stored in a Random Access Memory (RAM). In this implementation, we use a RAM of 136 x 82 bytes, called IMAGE_RAM, to store an image. We also use a true dual-port RAM, called PV_RAM, with 136 15-bit words, to store the projection vector. Three other major units in the design are the data smoothing unit, the local max/min search unit, and the condition checking unit. These units are controlled through a control circuit. The vertical projection vector is obtained in the first part of this system, containing IMAGE_RAM, PV_RAM, and ADDER1. All elements of each column of the image in IMAGE_RAM are added together by ADDER1, and the result is stored into the vertical projection vector, PV_RAM, through port B. The stored data is read from PV_RAM through port A; this data corresponds to a column of the eye image. The address of IMAGE_RAM is generated by the IMAGE_RAM_ADDRESS register/counter, a ring counter which counts from 0 to 11151. The address of PV_RAM is generated by PV_RAM_ADDRESS1, a ring counter which counts from 0 to 135. When the projection vector is completed, the PV_C flag is set, and then the smoothing unit starts its process.
Figure 4.1 (a) PV_RAM and its ports (b) control circuit to obtain the projection vector
In the smoothing procedure, we need to divide the summation of the projection vector data that overlap the smoothing filter by the length of the smoothing filter, AFlen. To simplify the hardware implementation of this procedure, we approximate AFlen by the nearest power of two, 2^k, less than AFlen. In other words, if 2^k < AFlen < 2^(k+1), then the denominator is considered to be 2^k.
Dividing a number by 2^k is a k-bit shift to the right. In our simulation, for example, since AFlen = 19, we considered k = 4 and, instead of a division in the averaging procedure, we shift the result 4 bits to the right. The smoothing flag, SC_F, is set during the smoothing procedure.
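Assuming integer column sums, the shift-based approximation can be sketched in Python (illustrative; `shift_average` is a hypothetical helper name, not part of the design):

```python
def shift_average(window_sum, af_len):
    """Hardware-style averaging: instead of dividing by af_len, divide by
    2**k (a k-bit right shift), where 2**k <= af_len < 2**(k+1)."""
    k = af_len.bit_length() - 1  # largest k with 2**k <= af_len
    return window_sum >> k

# AFlen = 19 gives k = 4, so the divide-by-19 is approximated by >> 4
print(shift_average(1900, 19))  # 118 (an exact divide-by-19 would give 100)
```

The approximation slightly overestimates the average (it divides by 16 instead of 19), but since every sample is scaled the same way, the relative shape of the smoothed vector, which is all the conditions depend on, is preserved.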
Figure 4.2 (a) Smoothing unit (b) control circuit of smoothing unit
Figure 4.3 (a) max/min searching unit (b) control circuit
The last unit of this system is the condition checking unit. In this unit, the conditions on the maxima and minima are checked concurrently. In each step of type checking, three successive bits of the type register are selected. If these three bits have the pattern 101, then a minimum exists between two maxima. We call the minimum of this group Ymin; the index of this minimum in the ZPPV is called Xmin. Also, in each group, one of the maxima is smaller than the other; we call it Ysmax.
If Xmin is between (54, 81), then Condition-2 is satisfied.

Condition-2: 0.4*ZPlen < Xmin < 0.6*ZPlen

Condition-1 can be checked as follows. First, we rewrite Condition-1:

(Ysmax - Ymin) > Th * Ysmax

We set Th = 0.05. Substituting the value of Th in the above inequality,

(Ysmax - Ymin) > 0.05 * Ysmax

where, in binary form (approximating 0.05 by 2^-5 + 2^-6 = 0.046875), we have

(Ysmax - Ymin) > (2^-5 + 2^-6) * Ysmax

Both sides of the inequality are multiplied by 2^8, so it becomes

2^8 * (Ysmax - Ymin) > (2^3 + 2^2) * Ysmax

which is simpler and easier to implement.
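The fixed-point transformation can be checked numerically; the following Python sketch (with hypothetical helper names) verifies that the integer inequality agrees with the 0.046875 threshold form:

```python
def condition1_fixed_point(ys_max, y_min):
    """Integer form: 2^8 * (Ysmax - Ymin) > (2^3 + 2^2) * Ysmax."""
    return ((ys_max - y_min) << 8) > ((1 << 3) + (1 << 2)) * ys_max

def condition1_threshold(ys_max, y_min, th=0.046875):
    """Reference form with the threshold 2^-5 + 2^-6 = 0.046875 (~0.05)."""
    return (ys_max - y_min) > th * ys_max

# The two forms agree for integer projection values
for ys, ym in [(600, 100), (600, 590), (500, 480)]:
    assert condition1_fixed_point(ys, ym) == condition1_threshold(ys, ym)
print(condition1_fixed_point(600, 100))  # True: clearly an open-eye gap
print(condition1_fixed_point(600, 590))  # False: difference below threshold
```

The agreement is exact because both sides of the inequality are scaled by the same positive constant (2^8), leaving the comparison unchanged.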
Figure 4.4 Condition checking unit (a) type and location checking (b) threshold checking (c) control circuit
of condition checking unit.
This procedure is repeated for all frames, and PV_RAM is cleared when the processing of one frame is completed. Therefore, data are overwritten in PV_RAM when IMAGE_RAM_ADDRESS points to the first column of IMAGE_RAM. The control circuit controls the flow of data. When the first row of IMAGE_RAM is read, the inputs of the adder are connected to IMAGE_RAM and 0. When the other rows of IMAGE_RAM are read, the inputs of the adder are connected to IMAGE_RAM and output port A of PV_RAM.
CHAPTER-5
Xilinx
5.1 Introduction to Xilinx
Xilinx System Generator provides a set of Simulink blocks (models) for several
hardware operations that could be implemented on various Xilinx FPGAs. These blocks
can be used to simulate the functionality of the hardware system using Simulink
environment. The nature of most DSP applications requires Floating point format for data
representation. While this is easy to implement on several computer systems running high
level modelling software such as Simulink, it is more challenging in the hardware world
due to the complexity of the implementation of floating point arithmetic. These challenges
increase with portable DSP systems where more restricting constraints are applied to the
system design. For these reasons Xilinx System Generator uses fixed point format to
represent all numerical values in the system. System generator provides some blocks to
transform data provided from the software side of the simulation environment (in our case
it is Simulink) and the hardware side (System Generator blocks). This is an important
concept to understand during the design process using Xilinx System Generator. The
System Generator runs within the Simulink simulation environment which is part of
MATLAB mathematical package. To start Xilinx System Generator, we need to open MATLAB (version 2011 onwards) and open the Simulink library: type simulink at the MATLAB command prompt or click the Simulink button in the MATLAB toolbar to open the Simulink Library Browser. Examine the available blocks in the Simulink Library Browser. The following elements, among others, should appear:
Xilinx Blockset
Xilinx XtremeDSP Kit
The Simulink library browser shows a list of all the different Toolboxes installed within
MATLAB.
Xilinx System Generator components will appear under three categories:
1. Xilinx Blockset
2. Xilinx Reference Blockset
3. Xilinx XtremeDSP Kit
The category Xilinx Blockset contains all the basic blocks used in a wide variety of applications.
Create a new Simulink model by selecting File > New > Model.
From the Simulink Library Browser, open the Xilinx Blockset library to access the blocks. We must use Xilinx Gateway In / Gateway Out blocks to define the FPGA boundary, and we must also place a Xilinx System Generator token in the design.
Figure 5.3 (a) system generator token (b) Gateway in block (c) gateway out block
Insert the required blocks from the Xilinx blockset in the Simulink library to create the required model. To connect with blocks other than Xilinx blocks, we need to use Gateway In and Gateway Out blocks: to bring a signal from outside the Xilinx blockset we use a Gateway In, and to take a signal outside the Xilinx blockset we use a Gateway Out.
CHAPTER-6
XILINX IMPLEMENTATION
6.1 Writing an image to a RAM
The Image From File block reads an image from a file. We can use the File name parameter to specify the image file we want to import into the model. A color space conversion block can be used to convert color information to intensity values. The Reshape unit changes the dimensions of a vector or matrix input signal; the output dimensionality is set to a 1-D array, which is input to an Unbuffer block that converts the frame to scalar samples output at a higher rate. This is input to the Single Port RAM in our model, which stores the image pixel values in a 1-D array to be used for processing.
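The pipeline of this section can be mimicked in plain Python (illustrative sketch; the BT.601 luma weights 0.299/0.587/0.114 are an assumption about the color-space conversion block's formula, and the 2x2 frame is invented):

```python
def to_intensity(rgb_image):
    """Color-space conversion stand-in: RGB -> intensity using BT.601 luma
    weights (an assumption about the actual block's formula)."""
    return [[round(0.299 * r + 0.587 * g + 0.114 * b) for (r, g, b) in row]
            for row in rgb_image]

def reshape_1d(image):
    """Reshape + Unbuffer stand-in: emit the frame as a 1-D stream of scalars."""
    return [pixel for row in image for pixel in row]

# A made-up 2x2 RGB frame streamed into a list standing in for the RAM
frame = [[(255, 0, 0), (0, 255, 0)],
         [(0, 0, 255), (255, 255, 255)]]
image_ram = reshape_1d(to_intensity(frame))
print(image_ram)  # [76, 150, 29, 255]
```

In the hardware model, the same row-major stream of intensity values is what the address counter walks over when forming the projection vector.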
6.2 Image Ram
It is a single-port RAM that stores the image; the size of the image RAM equals the size of the image. The port address is used to select the required pixel value of the image.
6.3 Adder Unit
It performs the required addition to generate the projection vector. It is a simple adder that sums the two inputs given to it.
6.4 PV RAM
It is a dual-port RAM that supports data widths up to 256 bits. The ports are functionally identical and independent of each other; that is, they enable shared access to a single memory space. Here, port A is set as the read port and port B as the write port. The calculated projection vector is stored in the PV RAM.
6.5 PV RAM control circuit
To get the sum of all elements in each column, we use a two-input multiplexer (mux1) that selects the stored per-column sum from the PV RAM output once the image RAM address generation is complete for one row of pixel values of the original image. The port B address of the PV RAM is generated using another two-input multiplexer (mux) that inputs a delayed PV RAM address until pvc_f is set, and then takes its input from the s_address2 signal from the smoothening control unit. The write enable signal for port B of the PV RAM is generated using a third multiplexer (mux3), which disables web after pvc_f is set. The port A address of the PV RAM is selected using another multiplexer (mux2) that inputs the PV RAM address until pvc_f is set and then inputs the address from the smoothening unit.
6.6 Smoothening Unit and Control circuitry
When the projection vector is completed, the PV_C flag is set, and then the smoothing unit starts its process. For the implementation of the smoothing unit, we use the following procedure. Since the length of the projection vector is 136, the length of the smoothing filter is equal to 19. The values of all tap weights of this smoothing filter (an averaging filter) are considered to be 1. In the smoothing process, there are three distinct phases. In the first phase, the smoothing filter has some overlap with the data projection vector from the left. In the second phase, the smoothing filter completely overlaps the data projection vector. In the third phase, the smoothing filter has some overlap with the projection vector from the right. The figure below shows these three phases.
On the other side, Condition-2 (the location check) is performed. For this, each bit of location_2 is taken as the Xmin, and this value is compared with the threshold values using the comparators.

Condition-2: 0.4*ZPlen < Xmin < 0.6*ZPlen

The ZPlen value has been calculated in the MATLAB program part and is found to be 153. The outputs from all the comparators are given to an AND gate that checks whether all the conditions are satisfied. If the conditions are satisfied, then the output of the AND gate becomes 1. This output is given to a D flip-flop. The output Q of this flip-flop is taken as the Condition-2 result, and the qbar output is fed back into the AND gate. Thus this condition checking fires only once, and the output remains zero at other times.
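The latch-once behaviour described above can be modelled in Python (an illustrative sketch that simplifies the exact hardware clocking; `ConditionLatch` is a hypothetical name):

```python
class ConditionLatch:
    """Models the D flip-flop gating described above: the comparator AND is
    enabled by qbar, so the check fires only once; Q then holds the latched
    result."""
    def __init__(self):
        self.q = 0  # flip-flop output Q

    def clock(self, comparators_ok):
        # AND gate: comparator result gated by qbar
        fire = int(comparators_ok and not self.q)
        if fire:
            self.q = 1  # latch the condition once
        return fire

latch = ConditionLatch()
print([latch.clock(ok) for ok in [0, 1, 1, 0, 1]])  # [0, 1, 0, 0, 0]
print(latch.q)  # 1 (condition stays latched)
```

Once Q is set, qbar goes low and the AND output stays zero for the rest of the frame, which is the "check only once" behaviour the text describes.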
Figure 6.17 Type and location checking block of condition checking unit
The threshold checking unit evaluates the ratio (Ysmax - Ymin)/Ysmax to check the threshold condition. The result is then ANDed with the Condition-2 output, and the final output is taken from the D flip-flop.
Figure 6.22 Output of control circuit for the condition checking unit.
CHAPTER-7
Conclusion
In this project, an algorithm to determine the state of an eye from its image was presented. In this algorithm, we used the fact that the pupil and iris are darker than the white part of the eye, and we used the vertical projection to distinguish the state of the eye. The proposed method performed well under different lighting and eye color conditions. The computer simulation included 77 eye images, of which 66 gave the desired output; hence the system showed 85.7% accuracy. The hardware design and implementation in Xilinx were also presented.
Applications include real-time detection of the eye state of a person driving a vehicle, detection of other similar patterns based on gray-scale values, security maintenance purposes, and control of interfaces such as a computer mouse by monitoring eye blinks.
References
1. Mohammad Dehnavi and Mohammad Eshghi (2012), "Design and implementation of a real time and train less eye state recognition system", EURASIP Journal on Advances in Signal Processing, asp.eurasipjournals.com/content/pdf/1687-6180-2012-30.pdf, pp. 1-12.
2. Drowsy Driving, http://dimartinilaw.com/motor_vehicle_accidents/
3. News site, http://www.newsinferno.com/accident/drowsydriving-
APPENDIX
Index
Averaging filter.................................................18, 24
Caltech frontal database.........................................23, 31
Color space conversion...........................................47
Condition checking unit..........................................39
Drowsiness.......................................................17
Dual port RAM....................................................35
FPGA.............................................................43
Gateway in.......................................................45
Gateway out......................................................45
Gray level values................................................17
Image RAM........................................................35, 41
Max/min searching unit...........................................38
Pixel values.....................................................24
Projection vector................................................17, 24
PV_C flag........................................................35, 49
Register bank....................................................38
SC_F flag........................................................37
Simulink.........................................................43
Single port RAM..................................................47
Smoothened projection vector.....................................38
Smoothing filter.................................................24, 36
Smoothing unit...................................................36, 38
Step counter.....................................................38, 53
System generator.................................................43
Tap weights......................................................36
Threshold value..................................................18
Type register....................................................25
Vertical projection..............................................17
Xilinx blockset..................................................44
Zero padded projection vector....................................40