
IEEE TRANSACTIONS ON INDUSTRIAL ELECTRONICS, FEBRUARY 2015

Proximity Sensing Based on Dynamic Vision Sensor for Mobile Devices
Jae-Yeon Won, Hyunsurk Ryu, Tobi Delbruck, Fellow, IEEE, Jun Haeng Lee, and Jiang Hu, Senior Member, IEEE
In addition, many sensors are used in mobile phones to provide additional features such as motion recognition [1], [18], [20].
Recently, the Dynamic Vision Sensor (DVS), which mimics the human optic nerve, has been studied as a power-efficient and fast-responding sensor by Lichtsteiner et al. [5], [14]. It detects only changes of brightness and has the fastest response speed among image sensors so far. Its fast response time enables the DVS to be used in various applications such as motion recognition. Meanwhile, a proximity sensor is also equipped in touch screen-based smartphones to avoid erroneous operation caused by unintended contact with the skin during a call [12]. Among the various proximity sensors, the optical proximity sensor is the most commonly used, and research has been proposed to expand its functionality toward additional features such as motion recognition [3]. It emits infrared light for certain periods and measures the absolute amount of light coming back by reflection; it assumes that an object is close if the reflected light exceeds a certain amount. However, it loses detailed information by measuring only the aggregated absolute amount of light, and it is hard to differentiate environmental noise. It is also very sensitive to the quality and mounting angle of the transparent plastic protection cover, which becomes a main reason that the fraction defective increases at the production stage.
The proposed proximity sensing design, which utilizes time-domain analysis, is robust to environmental noise and can decrease the fraction defective in the production process. Physically, the proposed design is constructed with a DVS and a light source. It uses two main algorithms to detect proximity: distance estimation based on the spatial information of the reflection, and temporal pattern recognition of DVS events with an additional light source. A DVS event is generated when the DVS detects that the brightness of a pixel has changed. The reflection of the light source appears at a different spatial position according to the distance to the reflecting object, so the distance can be calculated from the spatial information of the DVS events. Temporal pattern recognition is then used to remove the environmental noise caused by other light sources; it analyzes the temporal patterns of the events while the light source is turned on or off. In addition, when the proposed proximity sensor replaces a conventional proximity sensor, attractive features such as motion recognition using the DVS can be utilized as well.
This paper explains conventional proximity sensors in Section II. The concept of the DVS and the proposed DVS-based design for proximity sensing are explained in Section III. Section IV shows experimental results and performance analysis. Conclusions are drawn in Section V.

Abstract: The Dynamic Vision Sensor (DVS) is a sensor that detects the temporal contrast of brightness and has the fastest response time compared with conventional frame-based sensors, which capture static brightness every frame. This fast response time allows fast motion recognition, which is a very attractive function from the consumer's point of view. In particular, its low power consumption due to event-based processing is a key feature for mobile applications. In recent touch screen-based smartphones, a proximity sensor is equipped to prevent malfunction due to undesired contact with the skin during a call. In addition, the main processor stops the operation of the touch screen and turns the display off when any object is close to the proximity sensor, minimizing power consumption. Considering the importance of power consumption and reliable operation, proximity sensing is clearly an essential part of a touch screen-based smartphone. In this paper, a design of proximity sensing utilizing a DVS is proposed. It can estimate the distance from the DVS to an object by analyzing the spatial information of the reflection of an additional light source. It also uses pattern recognition based on time-domain analysis of the reflection while the light source is turned on, to avoid wrong proximity detection caused by noise such as other light sources and motions. The contributions of the proposed design are threefold. First, it calculates an accurate distance in real time using only the spatial information of the reflection. Second, the proposed design can eliminate environmental noise by using pattern matching based on time-domain analysis, whereas conventional optical proximity sensors, which are mainly used in smartphones, are very sensitive to environmental noise because they use the total amount of brightness over a certain period. Third, our design replaces conventional proximity sensors while retaining the additional benefits of the DVS itself.
Index Terms: Proximity sensor, dynamic vision sensor, pattern recognition, smartphone, mobile devices.

I. INTRODUCTION

THE mobile phone has evolved from an electrical device whose only purpose was communication into an essential device that manages much of our daily needs. In particular, the market share of smartphones, mobile phones with computing functions, has increased worldwide because of their various capabilities, such as communication, internet media, mass media, and video recording.

Copyright (c) 2014 IEEE. Personal use of this material is permitted. However, permission to use this material for any other purposes must be obtained from the IEEE by sending a request to pubs-permissions@ieee.org.
This work was supported in part by Samsung Advanced Institute of
Technology, Samsung Electronics Co. Ltd., Republic of Korea.
J.-Y. Won is with the Department of Electrical and Computer Engineering, Texas A&M University, TX 77843 USA (email: jaeyeon9@neo.tamu.edu).
H. Ryu is with Samsung Advanced Institute of Technology, Yongin 446-712, Korea (email: eric ryu@samsung.com).
T. Delbruck is with the Institute of Neuroinformatics, University of Zurich, and ETH Zurich, Zurich, CH-8057, Switzerland (email: tobi@ini.phys.ethz.ch).
J. H. Lee is with Samsung Advanced Institute of Technology, Yongin 446-712, Korea (email: junhaeng2.lee@samsung.com).
J. Hu is with the Department of Electrical and Computer Engineering, Texas A&M University, TX 77843 USA (email: jianghu@ece.tamu.edu).

II. CONVENTIONAL PROXIMITY SENSORS


Various proximity sensors have been researched, and their applications are diverse as well. The magnetic proximity sensor is a non-contact sensor that generates magnetic waves [11],


[13]. If an object is close to the magnetic proximity sensor, an induced current flows in the object due to the magnetic wave, and the sensor detects the impedance change of the detection coil caused by the induced current. This sensor detects the proximity of metals only and is easily affected by other magnetic substances. Inductive proximity sensors and capacitive sensors are also used to detect the surfaces of metallic objects [8], [17]. They are constructed as switches with a ferrite core, an oscillator, a detector, and a solid-state switch. If an object is close to the sensor, the switch is closed; otherwise, it is open. They are mainly used for metal detectors, traffic lights, car washes, and occupancy checking.
The optical proximity sensor is constructed with an emitter, which is a light source, and a receiver, and it is mainly used in smartphone applications [7]. The receiver, such as a photo-transistor, perceives the existence of light: the light source emits light and the receiver operates according to the amount of light reflected back. When an object is far from the proximity sensor, the amount of light coming back is almost zero and the sensor decides that no object is close. Because it uses the absolute amount of light aggregated over a certain period, however, its accuracy is not high and it is hard to differentiate the effects of environmental noise, i.e., unintended brightness changes. The ultrasonic proximity sensor also has an emitter and a receiver [2], [9]; instead of the light used by the optical proximity sensor, it uses an ultrasound wave as the source. It observes the travel time of the wave reflected back by objects and can estimate the distance from the measured time, but it is more suitable for measuring relatively long distances. In addition, various approaches for specific applications have been studied [4], [6], [10], [15], [19]. Therefore, we propose a proximity sensing design that is robust to noise and suitable for mobile applications.

A. Real-Time Distance Estimation
The proximity sensing design proposed in this paper is constructed with a DVS and a light source such as a laser or a light emitting diode (LED). The main controller turns the light source on and off periodically and changes the brightness enough to be detected by the DVS. When an object is at a certain distance from the DVS, the light is reflected from the object; the DVS detects the change of brightness and outputs the spatial information of the reflection. When the distance to the object varies, the spatial position detected by the DVS also varies. The proposed method calculates the distance based on the spatial information of the DVS events. To measure the distance more accurately, a rectilinear light source is preferred because its diffusion is smaller. This section assumes that an ideal rectilinear light source is used in the proposed design; a method for using a more realistic light source is explained in Section III-B.
Fig. 1. An illustration of how the design works: (a) settings of a DVS and a light source (offset S, object at distance d, reflection, minimum detectable size TMIN, blind distance DMIN); (b) spatial address of the reflection (sL, sR) in the DVS output.

Fig. 1 illustrates how the proposed proximity sensor works. The distance S and the angle θ2 between the light source and the DVS can be adjusted for various applications. If S is small, the variation of the address of the pixels detected by the DVS is not large enough as the distance of an object from the DVS changes. In other words, it is preferable to set S large enough to obtain large variations of the spatial information. Although a larger S allows more distinct distance estimation, S is limited by the design of the application, because every application has its own size limitation. A larger S also affects the blind distance DMIN, within which an object cannot be detected by the DVS. The angle θ2 is related to DMIN and to the minimum size of an object that can be detected by the DVS. A larger θ2 produces a larger variation of the pixel addresses and decreases DMIN, but small objects at the center of the DVS cannot be detected. If the objects to be detected are large enough, θ2 can be made large because small objects need not be considered. Therefore, the design parameters S and θ2 are set according to the specifications of the application. The angle θ1 is the viewing angle of the DVS and is given by the specification of the DVS itself. Fig. 1 also shows the variables used to calculate the distance d from the DVS to an object. In Fig. 1(a), which is a side view of the design, the reflected image is located according to the ratio of rL to rR, which are physical distances within the total area that can be detected by the DVS. Fig. 1(b) shows the spatial address of the reflection in the DVS output, and (1) holds, where sL and sR are the distances from the two sides of the DVS output to the reflection.

III. PROPOSED PROXIMITY SENSOR DESIGN

The Dynamic Vision Sensor generates events, which we term DVS events here, when it detects that the brightness change of a pixel exceeds a certain threshold. A DVS event includes the type of the event (ON or OFF) and its address, which is the spatial information. Regarding the type of event, the DVS can detect whether the brightness increased or decreased: we term the events generated when the brightness increases on-events, and those generated when the brightness decreases off-events. The proposed method uses an additional light source to generate intended brightness changes. When an object is at a certain distance from the DVS, the DVS detects the brightness changes of the reflection on the object as the light source is controlled. The spatial address information and the temporal patterns of the reflections are both used in the proposed method. Sections III-A and III-B explain how the DVS can measure the distance of an object through modulation by an additional synchronized light source and the spatial information of the reflections. However, with the spatial information alone, it cannot be determined whether the events are caused by the synchronized light source or by other sources if there are many light sources around. To minimize the effect of other light sources, which are regarded as noise, this paper also proposes a method that analyzes the temporal pattern of the reflection in Section III-C; it utilizes the pattern of the reflected light during the time when the light source is turned on or off. Regarding the integration of these two methods, the design determines proximity, meaning that an object is close to the DVS, only when both methods indicate proximity.
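As an illustration of the event stream described above, the following Python sketch shows one possible way to represent DVS events in software; the field names and the helper function are illustrative assumptions, not the sensor's actual interface.

from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class DvsEvent:
    x: int             # pixel column address (spatial information)
    y: int             # pixel row address
    on: bool           # True for an on-event (brightness increased), False for an off-event
    timestamp_us: int  # event time in microseconds

def split_by_polarity(events: List[DvsEvent]) -> Tuple[List[DvsEvent], List[DvsEvent]]:
    """Separate on-events from off-events; the proposed method mostly works with on-events."""
    on_events = [e for e in events if e.on]
    off_events = [e for e in events if not e.on]
    return on_events, off_events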

Equations (2) to (8) explain how to calculate the distance d using sL and sR.


The variables a, b, and c can be calculated as shown in (2), (3), and (4), respectively.

rL : rR = sL : sR    (1)

a = S / tan θ2    (2)

b = d - a = d - S / tan θ2    (3)

c = b tan θ2 = (d - S / tan θ2) tan θ2 = d tan θ2 - S    (4)

Fig. 2. Minimization of light diffusion using a convex lens: (a) focal point (LED, lens, and distances d1, d2, d3); (b) DVS output (active region and regions R1, R2, R3).

This section explains a method for using a less rectilinear light source, with more diffusion of light, as found in realistic equipment. In the example of an LED as a light source, which has more diffusion, the diffused light can be focused through a convex lens onto a specific point. Fig. 2 shows that a convex lens concentrates the light diffused from the LED to one spot at a certain distance d2. Although the lens concentrates the light to one spot at that distance, the diffusion of light at other distances cannot be avoided. The focal point is therefore placed at the median distance of the critical range, so that the distance is measured accurately within the critical range and the error due to light diffusion is minimized. For distances outside the critical range, the spatial information of the reflection does not need to be acquired, because the design can recognize that an object is at a far distance simply from the small number of events generated by the reflected light.
Fig. 2(a) shows how to choose the focal point to minimize the diffusion of the light using a convex lens. When the critical range to be observed is the interval between d1 and d3, the focal point is set at the median of d1 and d3; therefore, the focal point d2 is at (d1 + d3)/2. Even with the focal point at the middle, some diffusion of light remains within the critical range. In that case, one dedicated spatial position of the diffused reflection is calculated by averaging all spatial information in the diffused area. Other DVS events outside the active range can also be eliminated, because only DVS events in the active region are averaged. Fig. 2(b) shows the active region, which is the area where reflections of the light source can appear. If an object is farther than the center point, its reflection appears on the left side of the active region in the DVS output; if it is closer than the center point, its reflection appears on the right side of the active region. Thus, the distance can be measured from the spatial information of the DVS events even when the reflected light is diffused.
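The averaging step described above can be sketched as follows; the active-region bounds are assumed to be known from the setup geometry, and the event addresses are plain column indices.

from typing import List, Optional

def reflection_column(event_columns: List[int], active_min_x: int, active_max_x: int) -> Optional[float]:
    """Estimate one dedicated column address for a diffused reflection by averaging
    the column addresses of the DVS events that fall inside the active region.
    Events outside the active region are ignored; returns None if no event qualifies."""
    inside = [x for x in event_columns if active_min_x <= x <= active_max_x]
    if not inside:
        return None
    return sum(inside) / len(inside)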

The physical distances rL and rR are calculated as shown in (5) and (6), and (7) is derived by using (1), (5), and (6).

rL = d tan θ1 - c    (5)

rR = d tan θ1 + c    (6)

(d tan θ1 - c) : (d tan θ1 + c) = sL : sR    (7)

From (7), the distance d from the DVS to the object is obtained as (8), because S, θ1, and θ2 are set by the specification and sL and sR are the observed values.

d = S(sL + sR) / [sL(tan θ1 + tan θ2) - sR(tan θ1 - tan θ2)]    (8)

Considering that θ1 is fixed by the specification of the DVS itself, S and θ2 are the parameters that can be adjusted to the application. In addition, S has only a limited adjustment range due to the size of the application. Thus, θ2 is the main adjustable parameter, and it determines the detectable size T and the detectable distance D. Within the given specifications, θ2 has a lower and an upper bound. With the settings shown in Fig. 1(a), the minimum detectable size TMIN for an object at distance D is given by (9); the size T of an object that should be detected must be greater than or equal to TMIN.

TMIN = 2(D tan θ2 - S) ≤ T    (9)

Also, the minimum distance that can be detected with the settings of Fig. 1(a), DMIN, is calculated by (10); the detectable distance D must be greater than or equal to DMIN.

DMIN = S / (tan θ1 + tan θ2) ≤ D    (10)
Assuming that θ1, T, D, and S are fixed by the specifications, the angle θ2 between the DVS and the light source is bounded as shown in (11) and can be adjusted for the application within this range.

arctan(S/D - tan θ1) ≤ θ2 ≤ arctan((T/2 + S)/D)    (11)
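The following sketch collects (8)-(10) in Python; it is a direct transcription of the formulas above, with angles given in degrees and sL, sR taken as pixel distances (their units cancel in (8)).

import math

def estimate_distance(sL: float, sR: float, S: float, theta1_deg: float, theta2_deg: float) -> float:
    """Distance d from the DVS to the object, from (8). sL and sR are the distances
    from the two sides of the DVS output to the reflection; S is the DVS-to-light-source
    offset; theta1 is the viewing angle of the DVS and theta2 the setup angle."""
    t1 = math.tan(math.radians(theta1_deg))
    t2 = math.tan(math.radians(theta2_deg))
    return S * (sL + sR) / (sL * (t1 + t2) - sR * (t1 - t2))

def design_limits(S: float, theta1_deg: float, theta2_deg: float, D: float):
    """Minimum detectable object size (9) at distance D and blind distance (10)
    for a candidate setup angle theta2."""
    t1 = math.tan(math.radians(theta1_deg))
    t2 = math.tan(math.radians(theta2_deg))
    t_min = 2.0 * (D * t2 - S)      # (9): a smaller (or negative) value means any size is detectable
    d_min = S / (t1 + t2)           # (10): objects closer than this cannot be seen by the DVS
    return t_min, d_min

For example, with S = 25 mm, θ1 = 45°, and θ2 = 30° (the settings used in Section IV), (10) gives DMIN of about 15.9 mm, which is consistent with the 20-200 mm detectable range used there.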

C. Pattern Recognition for Light Source Turn-On/Off Time


This section explains how the temporal pattern of the reflection is used to increase the robustness of proximity detection. In the proposed technique, the light source is modulated by switching it on and off; the observation of the corresponding reflection pattern can then tell whether the measured light is reflected from the source or comes from ambient light noise. The brightness of the light changes over the rising time (Tr) when the light source is turned on and over the falling time (Tf) when it is turned off. When the light intensity changes, the DVS generates on-events, which are DVS events with positive polarity, for the brightened pixels, and off-events, which are DVS events with negative polarity, for the darkened pixels. To acquire the realistic rising and falling times of the light source, a photodetector is used to measure the transition time of the light source.

B. Applying Various Light Sources

In the proposed design, the more rectilinear the propagation and the less the diffusion of the light source, the more accurate the spatial information of the reflection. In other words, more distinct spatial information is detected when a light source with a more rectilinear property, such as a laser, is used. Considering production cost and productivity, however, it may be more beneficial to use a less rectilinear light source such as a light emitting diode (LED).

Fig. 4. The patterns of the DVS on-events (number of on-events versus time over a 2 ms window, with the maximum and minimum over 100 trials): (a) by the light source; (b) by the reflected light.

Fig. 3 shows the LED control signal and the brightness of the LED, with the transition time measured by a photodetector and a digital storage oscilloscope. The rising time Tr of the LED is set to 1 ms; therefore, the controller turns the LED on for 1 ms and turns it off for the remaining period.
When the light source is turned on and off, the transitions of the DVS events are as shown in Fig. 4. The x-axis is the time from turning the light source on and the y-axis is the number of DVS events. In Fig. 4, the number of on-events was acquired over 100 repetitions of the light on/off operation. Only on-events are observed, because the transition of the on-events caused by turning the light source on is more distinct than that of the off-events, and this also minimizes the complexity of the algorithm. The max and min curves represent the maximum and minimum numbers of events among the 100 repetitions at each time. In this experiment, the transition of the light is measured over a 2 ms interval, during which the light is on for 1 ms and off for the other 1 ms. To acquire the pattern for 2 ms, the main controller counts the number of events per 100 us over the 2 ms. The pattern shown in Fig. 4(a) was acquired by an experiment in which the DVS observed the light source directly, without any reflecting object, to minimize reflection effects. It forms a distinct pattern of on-events that can be used to differentiate effects other than the intended light source control. The patterns of the reflection from an object at a certain distance from the DVS under light source control are shown in Fig. 4(b). The distance to the object is set to 50 mm, and the light source and the DVS are directed toward the reflecting object. The patterns shown in Fig. 4(b) are defined as the standard pattern, which is used to determine whether a DVS event occurs due to the controlled light source or due to noise. The distance setting for the standard pattern can vary with the specifications; the 50 mm used in the experiment is reasonable for mobile devices. As shown in Fig. 4(b), the pattern can be differentiated from ambient light noise. In other words, when an object is close to the light source, the observed pattern is similar to the standard pattern, whereas the events are regarded as caused by other light sources, i.e., noise, if the observed pattern has a different shape from the standard pattern. Even though the patterns vary slightly with the kind of object and the distance, they are formed quite similarly to the standard pattern. The variations of the patterns with objects and distance are discussed further in Section IV.
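A minimal sketch of the pattern acquisition described above, assuming on-event timestamps are measured in microseconds from the instant the controller turns the LED on:

from typing import List

def bin_on_events(timestamps_us: List[int], window_us: int = 2000, bin_us: int = 100) -> List[int]:
    """Count on-events per 100 us bin over a 2 ms window following light turn-on,
    producing the temporal pattern (20 samples with the default parameters)."""
    n_bins = window_us // bin_us
    counts = [0] * n_bins
    for t in timestamps_us:
        if 0 <= t < window_us:
            counts[t // bin_us] += 1
    return counts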
The reflection patterns vary with the reflecting object, the background light, the distance to the object, and so on. It is hard to determine whether a pattern belongs to the standard pattern using only the absolute number of DVS events. This paper therefore proposes an efficient algorithm that determines whether the temporal pattern matches the standard pattern using only a common rule based on the transition aspect of the pattern.
Fig. 3. Brightness transition of the LED used in this paper (LED control signal and measured brightness).

Fig. 5. Proposed pattern recognition: (a) original temporal pattern; (b) 4-averaged pattern; (c) transition of the pattern; (d) state diagram (states S0-S6, ST, and SF).

Fig. 5 explains how proximity is determined based on the transition aspect of the temporal pattern; here the distance to the object is set to 20 mm. By turning the light source on, on-events are generated. Fig. 5(a) shows that there are some irregular transitions, and an irregular pattern may result in misinterpretation. To avoid the effect of irregular transitions, an averaged pattern is used, as shown in Fig. 5(b). Other smoothing functions, such as a Gaussian filter, could be used here, but the averaging method used in this paper is simple to implement in hardware and effective enough.

ASm = (1/N)(CSm + CSm+1 + ... + CSm+N-1)  if m ≤ M - N + 1,  and ASm = 0 otherwise    (12)
Equation (12) describes the calculation of the averaged pattern, where ASm is the m-th averaged sample, CSn is the n-th current sample, N is the number of samples used in the averaging, and M is the maximum number of samples. Fig. 5(a) shows the original samples CSn and Fig. 5(b) shows the averaged samples ASm; AS0 is regarded as 0. Considering that heavier averaging causes more information loss, M is set to 20 and N is set to 4 to obtain a smooth pattern while keeping the shape of the original one. Fig. 5(c) shows the transition of the averaged samples, obtained as the difference between ASm+1 and ASm. A few positive transitions are followed by a few negative transitions, and this property is used as a common rule to decide whether the DVS events are caused by an object under the intended light control or by noise. In the example of Fig. 5(c), there are four consecutive positive transition values from 0 ms to 0.3 ms and eleven consecutive negative transition values from 0.4 ms to 1.4 ms. The numbers of positive and negative transitions can differ according to many conditions.
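A small Python sketch of (12) and of the transition computation used in Fig. 5(c); indices are 0-based here, which is why the validity condition reads m <= M - N rather than the 1-based M - N + 1 of (12).

from typing import List

def averaged_pattern(samples: List[int], n: int = 4) -> List[float]:
    """N-point forward moving average of the binned on-event counts, as in (12).
    samples[m] is the number of on-events in the m-th 100 us bin (M samples in total)."""
    m_max = len(samples)
    return [sum(samples[m:m + n]) / n if m <= m_max - n else 0.0
            for m in range(m_max)]

def transitions(avg: List[float]) -> List[float]:
    """Transition of the averaged pattern: the difference between consecutive averaged samples."""
    return [avg[m + 1] - avg[m] for m in range(len(avg) - 1)]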

Fig. 7. Estimation of the distance based on the address of the reflection (DVS output): (a) 20 mm; (b) 50 mm; (c) 100 mm; (d) 150 mm.

An LED with a lens is used as the light source; the lens and the LED are located horizontally 25 mm (S) from the DVS at the same vertical position. We also analyze a smaller setting distance S of 10 mm, for smaller applications, in Section IV-G. The setup angle θ2 is set to 30° to obtain higher sensitivity and to fully support the detectable range, since the range of θ2 is 14° ≤ θ2 ≤ 36° when the detectable range is 20 mm to 200 mm according to (11). The distance to the focal point d2 is set to 50 mm.
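As a quick check of the quoted range, the lower bound of (11) can be evaluated for these settings (S = 25 mm, θ1 = 45°, minimum detection distance D = 20 mm); the result of roughly 14° matches the lower end of the 14°-36° interval.

import math

# Lower bound on theta2 from (11) with S = 25 mm, D = 20 mm, theta1 = 45 degrees.
S, D, theta1 = 25.0, 20.0, math.radians(45.0)
theta2_low_deg = math.degrees(math.atan(S / D - math.tan(theta1)))
print(round(theta2_low_deg, 1))   # ~14.0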
A. Distance calculation using the spatial information
Figure 7 shows the spatial information of the reflections, i.e., the DVS output obtained using software that visualizes the DVS events over a certain period. White dots represent brightened pixels and black dots represent darkened pixels within the grey square, which is the region that the DVS can observe. To evaluate only the effect of the reflections of the synchronized light source, the experiments were executed without any other light source except the room light. As shown in Fig. 7(a), the position of the reflection is on the right side of the DVS output when the reflecting object is 20 mm away from the DVS. When the reflecting object is 50 mm away, Fig. 7(b) shows that the reflected light is at the center of the DVS output. When the object is farther than 50 mm from the DVS, the reflected light is observed on the left side of the DVS output, as shown in Figs. 7(c) and 7(d). The experiments confirm that the proposed design can calculate the distance using only the spatial information of the reflection. The resolution error is less than 1% (0.45 mm) when an object is 50 mm away and the setup angle of the light source (θ2) is set to 30°.

Fig. 6. The patterns of the DVS events by random light: (a) original temporal pattern; (b) transition pattern.

However, the property that some consecutive positive transitions are followed by some consecutive negative transitions holds regardless of the various conditions; this property is verified experimentally in Section IV. Objects at a far distance can be differentiated by the spatial address of the reflection, as described in Section III-A, even if they satisfy the common rule. Fig. 5(d) shows the state diagram of the proposed method for the case in which more than two consecutive positive transitions and more than three consecutive negative transitions are required to decide that the pattern matches the standard pattern. ST means the decision that the object is close, while SF indicates that the object is not close. As shown in Fig. 5(c), there are four positive transitions from 0 ms to 0.3 ms, so the state, which begins at S0, moves to S3 because there is no negative transition. Negative transitions then occur from 0.4 ms to 1.4 ms; the state moves to the next state on each negative transition and finally reaches ST, because there are more than three consecutive negative transitions.
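The state diagram of Fig. 5(d) can be sketched as the following matcher. The thresholds follow the description above (more than two consecutive positive transitions, then more than three consecutive negative transitions); the exact default values and the tolerance parameter, which corresponds to the loose condition used for DVS_STL in Section IV, should be treated as assumptions.

from typing import List

def matches_standard_pattern(trans: List[float], min_pos: int = 3, min_neg: int = 4,
                             tolerance: int = 0) -> bool:
    """Return True (state ST) if the transition sequence looks like the standard pattern:
    a run of positive transitions followed by a run of negative transitions.
    tolerance is the number of isolated negative transitions allowed inside the
    positive run (0 for the strict rule, 1 for the loose DVS_STL condition)."""
    pos_run = 0
    neg_run = 0
    dips = 0
    rising = True
    for t in trans:
        if rising:
            if t > 0:
                pos_run += 1
            elif t < 0:
                if pos_run >= min_pos:
                    rising = False          # enough positive transitions: start counting negatives
                    neg_run = 1
                elif dips < tolerance:
                    dips += 1               # tolerate an isolated dip inside the positive run
                else:
                    return False            # fell back before enough positive transitions (state SF)
        else:
            if t < 0:
                neg_run += 1
                if neg_run >= min_neg:
                    return True             # reached state ST
            elif t > 0:
                return False                # a rise during the falling run: not the standard pattern
    return False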
Figs. 6(a) and 6(b) show the original DVS on-events and the transition of the 4-averaged pattern caused by random light sources, respectively. The DVS events are irregular over the measuring time because the random light is not synchronized with the light source control. In the example of Fig. 6(b), there are two positive transitions at 0 ms and 0.1 ms and a negative transition at 0.2 ms: the state moves from S0 to S2 due to the first two positive transitions and then falls to SF because they are followed by a negative transition. Finally, it is determined that no object is close to the DVS, because the pattern does not match the standard pattern produced by the synchronized light source control. In addition, a threshold can be applied to the total number of generated events: even if the pattern matches the standard pattern according to the proposed algorithm, it is determined that there is no object if the total number of events is less than this activity threshold, because the number of events is not sufficient to decide proximity.

The proposed method thus uses the spatial and temporal patterns of the reflection together with light source modulation and a DVS that detects changes of brightness to determine proximity. The advantages and performance of the proposed method are demonstrated with experimental results in Section IV.
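Combining the two checks as described above might look like the following sketch; the 50 mm proximity threshold comes from Section IV-F, while the activity threshold value is an assumption.

from typing import Optional

def detect_proximity(distance_mm: Optional[float], pattern_ok: bool, total_on_events: int,
                     proximity_threshold_mm: float = 50.0, activity_threshold: int = 30) -> bool:
    """Report proximity only when the spatially estimated distance is within the threshold,
    the temporal pattern matches the standard pattern, and enough on-events were generated."""
    if distance_mm is None or total_on_events < activity_threshold:
        return False
    return pattern_ok and distance_mm <= proximity_threshold_mm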

B. Robustness to background lights


Fig. 8 shows the effects of background light. Fig. 8(a) is the temporal pattern for an object 20 mm away when the background light is turned on, and Fig. 8(b) is the temporal pattern when the background light is turned off. Figs. 8(c) and 8(d) show the transitions of the 4-averaged patterns. Comparing Fig. 8(b) with Fig. 8(a), more DVS events are generated in the dim environment because the change of the LED brightness is dominant in a dark environment. However, the temporal patterns are highly correlated with each other and share the same trait that a few positive transitions are followed by a few negative transitions. In other words, Figs. 8(c) and 8(d) show that both patterns have four positive transitions and more than three negative transitions. Therefore, the proposed method can be used to detect proximity in the presence of background light.
C. The effect of transparent plastic cover
Fig. 9 shows the effect of the transparent plastic cover used for protection. Fig. 9(a) shows the position of the reflection when an object is 50 mm away and a transparent plastic cover is mounted 5 mm from the DVS. As shown in Fig. 9(a), the transparent plastic cover does not affect the performance, because it is placed closer than the minimum detectable distance DMIN and the reflection from the cover is therefore not detected by the DVS.

IV. EXPERIMENT RESULTS


The proposed algorithm has been implemented and evaluated in an FPGA environment. The dynamic vision sensor is connected to the GPIOs of the FPGA, and the output of the light source is controlled by the FPGA. In the experiments, a 128 by 128 asynchronous dynamic vision sensor is used [14]. The event latency of the sensor is 15 us at the minimum. Its viewing angle is 90° in total, i.e., 45° (θ1) to the left side and 45° to the right side, with a lens (35 mm C-mount, focal length 4.5 mm).

Fig. 8. The effects of background light (BL) (number of on-events and amount of transition versus time): (a) original pattern with BL; (b) original pattern without BL; (c) transition pattern with BL; (d) transition pattern without BL.

Fig. 9. The effect of a transparent plastic cover: (a) reflection with cover; (b) temporal pattern with cover; (c) temporal pattern without cover.

Fig. 10. The effect of hand gestures: (a) brightness changes and temporal pattern of a hand shaking 50 mm away without light source control; (b) brightness changes and temporal DVS pattern of a hand shaking 50 mm away with light source control.

Figs. 9(b) and 9(c) show the temporal patterns of DVS events with and without the cover, respectively. Even though the temporal pattern is affected by the cover, it shares the same trait that a few positive transitions are followed by a few negative transitions. Therefore, the proposed method, temporal pattern recognition combined with spatial information, is robust to the effect of the transparent plastic cover.


D. The effect of noises


This section shows experimentally that the proposed method is robust to environmental noise caused by moving objects and other light sources. Figs. 10 and 11 show the temporal on-event patterns with hand gestures and with unintended light sources, respectively. Fig. 10(a) shows the temporal on-events without light source control: the temporal pattern of the hand gesture has a different shape from the standard pattern shown in Fig. 4(b) and can be regarded as noise. The effect of a hand gesture together with the intended light control is shown in Fig. 10(b), which shows the on-events when a hand is moving 50 mm away under light source control. Even though light intensity changes are generated by the shaking hand, they do not affect the pattern produced by the intended light control, because the measurement lasts only 2 ms from the start of the light control.

The effects of noise lights are analyzed in Fig. 11. Fig. 11(a) shows the effect of random lights without light source control; the temporal pattern is quite distinct from the standard pattern and can easily be recognized as noise. Fig. 11(b) shows the effect of random lights with a reflecting object 20 mm away under light source control. Even though the random lights affect the DVS pattern, their impact during the measuring time is negligible because the random lights are not synchronized with the light source control.

Fig. 11. The effects of random lights: (a) brightness changes and temporal DVS pattern of random lights without light source control; (b) brightness changes and temporal DVS pattern of an object 20 mm away with random lights and light source control.

As the distance to the object increases, the effect of random lights may become relatively dominant compared with the effect of the light source control. Although the effect of noise becomes dominant for objects far from the DVS, the disturbance caused by random lights can still be recognized by the proposed temporal pattern recognition method and treated as noise.
E. The effect of Reflecting Objects
Fig. 12 shows the temporal patterns of DVS events for various objects. There is a large variety of temporal patterns, as objects may have different colors, textures, surface roughness, and so on. Figs. 12(a) and 12(b) show the effect of the color of the reflecting object. White shows a pattern more similar to the standard pattern than black, because the reflectance of white is higher than that of black. Even though different colors show slightly different patterns, both of them share the same trait that a few positive transitions are followed by a few negative transitions.


Given that Scenario I is the most common situation during a call, this proves that the proposed design is very effective. The success rate in Scenario II can be increased to 100% by applying a loose condition that allows a single negative transition among the consecutive positive transitions in the temporal pattern recognition. Scenarios III and IV simulate worse situations. In Scenario III, the DVS detects the reflection of the light source together with the effect of a shaking hand, and the correct operation is to detect proximity. It shows a 94.4% success rate when DVS_STL is applied, which is high enough considering that the movement of the face and hand is small during a call. Scenario IV is close to the worst case. When the reflection of the events is on the right side of the DVS output, the design decides proximity, which is undesirable, if only the spatial information (DVS_S) is used. By applying DVS_ST or DVS_STL, the effect of the fan can be differentiated by temporal pattern matching, because the temporal pattern of the fan is not synchronized with the light control. Therefore, by applying the spatial information and the temporal pattern of the reflection together, proximity can be determined using the light source control.

In addition, we experimented with the common optical proximity sensing method used in smartphone applications [16]. The conventional optical proximity sensor detects proximity based on the aggregated brightness change produced by an additional light source; the brightness change is calculated as the difference in brightness between when the additional light source is turned on and when it is turned off. The conventional method also shows an effective success rate (100.0%) of proximity detection regardless of background light. In Scenario IV, however, the conventional method shows only a 42.9% success rate against the environmental noise, while the proposed DVS_STL shows an 81.5% success rate thanks to the temporal pattern recognition. Thus, the proposed method is more robust to environmental noise than the conventional optical method, as shown in Scenario IV.

Fig. 12. The temporal patterns of various objects (number of on-events versus time): (a) white paper; (b) black paper; (c) glazed wood; (d) LCD screen.
TABLE I
SUCCESS RATE OF PROXIMITY DETECTION

Method     Scenario I   Scenario II   Scenario III   Scenario IV
Optical    100.0%       100.0%        95.5%          42.9%
DVS_S      100.0%       100.0%        95.7%          15.9%
DVS_ST     100.0%       87.5%         92.1%          85.8%
DVS_STL    100.0%       100.0%        94.4%          81.5%

Other reflecting objects, glazed wood and an LCD screen, are examined in Figs. 12(c) and 12(d), respectively. The glazed wood shows almost the same pattern as the standard pattern, but the LCD screen distorts the light extensively due to its inner structure. Considering that the design focuses on human hands or faces, the proposed proximity sensing design remains useful even though some artificial objects cause distortions.
F. Performance Analysis
In this section, the performance of the proposed design is evaluated. The distance used to decide proximity is set to 50 mm; in other words, an object is determined to be close when it is closer than 50 mm. We use a sequence of more than two positive transitions followed by three negative transitions as the pattern that identifies the reflections of the modulated light. Table I shows the success rate of proximity detection, defined as the number of correct detections over the number of detection trials; 300 proximity detection trials were conducted. We analyzed three proposed methods in four different scenarios.
- DVS_S: only the spatial information of the reflection is applied.
- DVS_ST: the temporal pattern recognition shown in Fig. 5 is applied together with the spatial information of the reflection.
- DVS_STL: as DVS_ST, but a single negative transition is allowed among the consecutive positive transitions.
- Scenario I: a hand, not moving, 30 mm from the DVS with room lights on.
- Scenario II: the same settings as Scenario I without room lights.
- Scenario III: a hand shaking 30 mm from the DVS.
- Scenario IV: an electric fan 300 mm from the DVS, with the reflection of the fan intentionally placed on the right side of the DVS output.
Scenarios I and II represent normal operation during the day and at night, respectively. The correct detection for Scenario I is that the object is close, and it shows a 100% success rate of proximity detection.

G. Analysis of setting distance
For smaller applications, we analyzed the effect of the setting distance S between the light source and the DVS. Smartphones 50 mm wide or wider are very common, so the 25 mm setting used in this work is suitable for that dimension. A large S is also preferred to obtain higher sensitivity (distance per pixel) thanks to the larger variation with the distance to the object, as described in Section III-A. In some small applications, however, an S of 25 mm may be too large; we therefore describe the effect of a smaller setting distance. A smaller S can also be used for proximity sensing, but it comes with lower sensitivity due to the smaller variation with the distance to the object.

Fig. 13. The analysis of setting distance (object moving between 20 mm and 100 mm from the DVS): (a) setting distance S = 25 mm; (b) setting distance S = 10 mm.

Fig. 13 illustrates examples of 25 mm and 10 mm setting distances, with the other settings the same as in the rest of the paper. For the analysis we assume that an object moves between 20 mm and 100 mm from the DVS; this range of movement is suitable for the analysis because the proximity design focuses on close objects.

In Fig. 13(a), the variation of the angle is 52° when an object moves between 20 mm and 100 mm from the DVS and the setting distance is 25 mm. As an example of a small setting distance, Fig. 13(b) shows that the variation of the angle is 21° for the same movement when the setting distance is 10 mm. The corresponding sensitivities are about 1.1 mm/pixel for S = 25 mm and 2.6 mm/pixel for S = 10 mm. Therefore, a smaller setting distance between the DVS and the light source can also be used, although it reduces the sensitivity.
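The angular swings and sensitivities quoted above can be reproduced with the following sketch, under the simplifying assumption that the 90° viewing angle maps uniformly onto the 128 pixel columns.

import math

def reflection_angle_deg(d: float, S: float, theta2_deg: float) -> float:
    """Angle of the reflection from the DVS optical axis for an object at distance d."""
    t2 = math.tan(math.radians(theta2_deg))
    return math.degrees(math.atan(t2 - S / d))

def sensitivity_mm_per_pixel(d_near: float, d_far: float, S: float, theta2_deg: float,
                             fov_deg: float = 90.0, pixels: int = 128) -> float:
    """Approximate sensitivity over a movement range, in mm of object motion per pixel."""
    swing = abs(reflection_angle_deg(d_far, S, theta2_deg) - reflection_angle_deg(d_near, S, theta2_deg))
    return (d_far - d_near) / (swing / (fov_deg / pixels))

# With theta2 = 30 degrees, S = 25 mm gives a swing of about 52 degrees and roughly
# 1.1 mm/pixel for a 20-100 mm movement; S = 10 mm gives about 21 degrees and 2.6 mm/pixel.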

[11] K. Koibuchi, K. Sawa, T. Honma, T. Hayashi, K. Ueda, and H. Sasaki, "Eddy-current type proximity sensor with closed magnetic circuit geometry," IEEE Trans. Magn., vol. 43, no. 4, pp. 1749-1752, Apr. 2007.
[12] N. D. Lane, E. Miluzzo, H. Lu, D. Peebles, T. Choudhury, and A. T. Campbell, "A survey of mobile phone sensing," IEEE Commun. Mag., vol. 48, no. 9, pp. 140-150, Sep. 2010.
[13] J. Li, C. C. Jobes, and J. L. Carr, "Comparison of magnetic field distribution models for a magnetic proximity detection system," IEEE Trans. Ind. Appl., vol. 49, no. 3, pp. 1171-1176, May-June 2013.
[14] P. Lichtsteiner, C. Posch, and T. Delbruck, "A 128x128 120 dB 15 us latency asynchronous temporal contrast vision sensor," IEEE J. Solid-State Circuits, vol. 43, no. 2, pp. 566-576, Feb. 2008.
[15] F. Marino, P. D. Ruvo, G. D. Ruvo, M. Nitti, and E. Stella, "Hiper 3-D: An omnidirectional sensor for high precision environmental 3-D reconstruction," IEEE Trans. Ind. Electron., vol. 59, no. 1, pp. 579-591, Jan. 2012.
[16] G. Milette and A. Stroud, Professional Android Sensor Programming. John Wiley & Sons, NJ, 2012.
[17] T. Mizuno, T. Mizuguchi, Y. Isono, T. Fujii, Y. Kishi, K. Nakaya, M. Kasai, and A. Shimizu, "Extending the operating distance of inductive proximity sensor using magnetoplated wire," IEEE Trans. Magn., vol. 45, no. 10, pp. 4463-4466, Oct. 2009.
[18] M. Murakami, J. K. Tan, H. Kim, and S. Ishikawa, "Human motion recognition using directional motion history images," in Proc. of the Int. Conf. on Control Automation and Systems, 2010, pp. 1512-1514.
[19] R. Richa, M. Balicki, R. Sznitman, E. Meisner, R. Taylor, and G. Hager, "Vision-based proximity detection in retinal surgery," IEEE Trans. Biomed. Eng., vol. 59, no. 8, pp. 2291-2301, Aug. 2012.
[20] W. Wei and A. Yunxiao, "Vision-based human motion recognition: A survey," in Proc. of the Int. Conf. on Intelligent Networks and Intelligent Systems, 2009, pp. 386-389.

V. CONCLUSION
The proposed proximity sensing design, which utilizes a dynamic vision sensor (DVS), can detect the proximity of an object while providing robustness to interference from unintended light sources and from the transparent plastic cover mounted for protection. The fast response time of the DVS allows the proposed design to measure the distance from the DVS to the object in real time based on the spatial information of the DVS events. The proposed method can also change the critical range, i.e., the range of measurable distances, by changing the angle of the light source. A more rectilinear light source is preferred for measuring the distance more accurately, and this paper also described how to use a more diffuse light source, such as an LED, for the sake of cost-effectiveness. Compared with conventional optical proximity sensors, which are vulnerable to environmental noise, the proposed method is robust to noise because it uses the temporal pattern of the DVS events synchronized with the light source control. The experiments show that the proposed design can achieve up to 100% accuracy in normal scenarios by applying the common rule with a loose condition, and over 80% accuracy even in the worst scenario. As an additional advantage, the proposed design can replace the conventional proximity sensor and utilize the benefits of the DVS itself when it is not being used for proximity detection.

Jae-Yeon Won received the B.S. degree in electrical and electronic engineering from Yonsei University, Seoul, Korea, in 2007 and the M.S.
degree in electrical engineering and computer science from Seoul National
University, Seoul, in 2009. He is currently working toward the Ph.D. degree
in the Department of Electrical and Computer Engineering, Texas A&M University, College Station, TX, since 2010. His research interests include hardware design
and power optimization for Chip Multi Processor (CMP) and neuromorphic
computing.
Hyunsurk (Eric) Ryu graduated from POSTECH (B.S., 1992, M.S., 1994,
Ph.D., 1998). He has been a Member of Technical Staff in Samsung Advanced
Institute of Technology, Samsung Electronics, after receiving his Ph.D.
Recently he has expanded his research interests to bio-mimic computing,
communication and cognitive applications.
Tobi Delbruck (M89-SM06-F13) received his BSc in physics and applied
mathematics from UCSD in 1983 and a PhD from Caltech in 1993. He
worked for 5 years on electronic imaging at Arithmos, Synaptics, National
Semiconductor, and Foveon. He is professor at ETH Zurich in the Inst. of Neuroinformatics. He is a fellow of IEEE. He has been awarded 6 IEEE awards,
including the 2006 ISSCC Jan Van Vessem Outstanding European Paper
Award. He co-organizes the Telluride Neuromorphic Cognition Engineering
summer workshop and the demonstration sessions at ISCAS, and is incoming
chair of the CAS Sensory Systems Technical Committee and associate editor
of the IEEE Transactions on Biomedical Circuits and Systems. His current
interests include bio-inspired and neuromorphic sensory processing.
Jun Haeng Lee received the B.S., M.S., and Ph.D. degrees in electrical
engineering from the Korea Advanced Institute of Science and Technology
(KAIST), Daejeon, Korea, in 1999, 2001, and 2005, respectively. He is
currently with Samsung Advanced Institute of Technology (SAIT) of Samsung
Electronics Co. Ltd., Gyeonggi-do, Republic of Korea. His current research
interests include neuromorphic engineering and biologically plausible model
for artificial intelligence.
Jiang Hu (M01-SM07) received the B. S. degree in optical engineering from
Zhejiang University, China, in 1990, the M. S. degree in physics in 1997, and
the Ph. D. degree in electrical engineering from the University of Minnesota
in 2001. He has been with IBM Microelectronics from January 2001 to June
2002. Currently, he is an associate professor in the Department of Electrical
and Computer Engineering at the Texas A&M University. His research interest
is on Computer-Aided Design for VLSI circuits and systems, especially on
large scale circuit optimization, clock network synthesis, robust design and
on-chip communication. He received a best paper award at the ACM/IEEE
Design Automation Conference in 2001, an IBM Invention Achievement
Award in 2003 and a best paper award at the IEEE/ACM International
Conference on Computer-Aided Design in 2011. He has served as technical
program committee member for DAC, ICCAD, ISPD, ISQED, ICCD, DATE,
ASPDAC, ISLPED and ISCAS, technical program chair and general chair for
the ACM International Symposium on Physical Design, and associate editor
for IEEE Transactions on CAD and ACM Transactions on Design Automation
of Electronic Systems.

REFERENCES
[1] E. Y. Ahn, J. H. Lee, T. Mullen, and J. Yen, "Dynamic vision sensor camera based bare hand gesture recognition," in Proc. of the IEEE Symposium on Computational Intelligence for Multimedia, Signal and Vision Processing, 2011, pp. 52-59.
[2] C. Canali, G. D. Cicco, B. Morten, M. Prudenziati, and A. Taroni, "A temperature compensated ultrasonic sensor operating in air for distance and proximity measurements," IEEE Trans. Ind. Electron., vol. 29, no. 4, pp. 336-341, Nov. 1982.
[3] H. Cheng, A. M. Chen, A. Razdan, and E. Buller, "Contactless gesture recognition system using proximity sensors," in Proc. of the IEEE Int. Conf. on Consum. Electron., 2011, pp. 149-150.
[4] W. Chung, H. Kim, Y. Yoo, C. Moon, and J. Park, "The detection and following of human legs through inductive approaches for a mobile robot with a single laser range finder," IEEE Trans. Ind. Electron., vol. 59, no. 8, pp. 3156-3166, Aug. 2012.
[5] T. Delbruck and P. Lichtsteiner, "Photoarray for detecting time-dependent image data," Patent US 2008/0135731 A1, Jun. 12, 2008.
[6] W. Hortschitz, H. Steiner, M. Sachse, M. Stifter, F. Kohl, J. Schalko, A. Jachimowicz, F. Keplinger, and T. Sauter, "Robust precision position detection with an optical MEMS hybrid device," IEEE Trans. Ind. Electron., vol. 59, no. 12, pp. 4855-4862, Dec. 2012.
[7] T. Inari and N. Aoki, "A proximity sensor using power variation of a laser diode caused by returned light and its characteristics of stability," in Proc. of the IEEE Annual Conf. on Ind. Electron., 2005, pp. 2114-2118.
[8] M. Jagiella, S. Fericean, and A. Dorneich, "Progress and recent realizations of miniaturized inductive proximity sensors for automation," IEEE Sensors J., vol. 6, no. 6, pp. 1734-1741, Dec. 2006.
[9] S. J. Kim and B. K. Kim, "Dynamic ultrasonic hybrid localization system for indoor mobile robots," IEEE Trans. Ind. Electron., vol. 60, no. 10, pp. 4562-4573, Oct. 2013.
[10] Y. K. Kim, Y. Kim, Y. S. Jung, I. G. Jang, K. Kim, S. Kim, and B. M. Kwak, "Developing accurate long-distance 6-DOF motion detection with one-dimensional laser sensors: Three-beam detection system," IEEE Trans. Ind. Electron., vol. 60, no. 8, pp. 3386-3395, Aug. 2013.
