I. INTRODUCTION
© 2014 IEEE. Personal use of this material is permitted. However, permission to use this material for any other purposes must be obtained from the IEEE by sending a request to pubs-permissions@ieee.org.
This work was supported in part by the Samsung Advanced Institute of Technology, Samsung Electronics Co. Ltd., Republic of Korea.
J.-Y. Won is with the Department of Electrical and Computer Engineering, Texas A&M University, TX 77843 USA (email: jaeyeon9@neo.tamu.edu).
H. Ryu is with the Samsung Advanced Institute of Technology, Yongin 446-712, Korea (email: eric ryu@samsung.com).
T. Delbruck is with the Institute of Neuroinformatics, University of Zurich and ETH Zurich, Zurich CH-8057, Switzerland (email: tobi@ini.phys.ethz.ch).
J. H. Lee is with the Samsung Advanced Institute of Technology, Yongin 446-712, Korea (email: junhaeng2.lee@samsung.com).
J. Hu is with the Department of Electrical and Computer Engineering, Texas A&M University, TX 77843 USA (email: jianghu@ece.tamu.edu).
[Figure: sensing geometry of the proposed design. A light source and a DVS behind a lens are separated by a baseline S; light reflects off the object at distance d, producing left and right spans sL and sR on the sensor. The LED active region is divided into regions R1-R3 at distances d1-d3.]

(d tan θ1 - c) : (d tan θ1 + c) = sL : sR    (7)

d = S(sL + sR) / [ sL(tan θ1 + tan θ2) - sR(tan θ1 - tan θ2) ]    (8)

[Equations (6) and (9): content not recoverable from the extraction.]
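The ratio in (7) and the distance formula in (8) can be checked numerically. The sketch below is only an illustration of one reading of those fragments: the baseline S, the angles theta1 and theta2, and the span measurements s_L and s_R are hypothetical values, not numbers from the paper, and the signs in the denominator follow the reconstructed form of (8).

```python
import math

def distance_from_spans(S, theta1, theta2, s_L, s_R):
    """Distance d to the object from the left/right spans s_L, s_R of the
    reflected light on the sensor, following the reconstructed Eq. (8).
    Symbols are as in the surrounding text; the signs are a best guess."""
    numerator = S * (s_L + s_R)
    denominator = (s_L * (math.tan(theta1) + math.tan(theta2))
                   - s_R * (math.tan(theta1) - math.tan(theta2)))
    return numerator / denominator

# Hypothetical geometry: 25 mm baseline, 20 and 30 degree emitter angles.
d = distance_from_spans(S=25.0, theta1=math.radians(20),
                        theta2=math.radians(30), s_L=10.0, s_R=8.0)
print(round(d, 1))  # estimated distance in mm
```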
[Figure: temporal DVS patterns, "amount of transition" vs. time (0-2.0 ms), at object distances of (a) 20 mm, (b) 50 mm, (c) 100 mm, and (d) 150 mm. Averaged on-events and max/min envelopes are shown, with pattern states S0-S6, ST, and SF annotated.]
[Figure: max/min DVS event counts vs. time (0-2.0 ms); panel (a) shows the reflection with the transparent cover.]
[Fig. 9(a): Brightness changes and temporal DVS pattern of random lights without light source control by the DVS.]

Fig. 9(b) and Fig. 9(c) show the temporal pattern of DVS events with a cover and without a cover, respectively. Even though the temporal pattern is affected by the cover, it shares the same trait: a few positive transitions are followed by a few negative transitions. Therefore, the proposed method, which combines temporal pattern recognition with spatial information, is robust to the effect of the transparent plastic cover.
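The trait described above, a burst of positive transitions followed by a burst of negative transitions, can be sketched as a simple check over a sequence of signed transition counts. The function name and the burst threshold below are illustrative choices, not details from the paper:

```python
def matches_on_off_pattern(transitions, min_burst=2):
    """Return True if the signed transition sequence shows a run of
    positive values followed by a run of negative values, as expected
    when the synchronized light source turns on and then off."""
    pos = [i for i, t in enumerate(transitions) if t > 0]
    neg = [i for i, t in enumerate(transitions) if t < 0]
    if len(pos) < min_burst or len(neg) < min_burst:
        return False
    # The positive burst must precede the negative burst entirely.
    return max(pos) < min(neg)

# A few ON events followed by OFF events matches the pattern.
print(matches_on_off_pattern([3, 5, 2, 0, -4, -3, -1]))  # prints True
```

Because the test only looks at the ordering of signed bursts, it is insensitive to the amplitude distortion a transparent cover introduces, which is the robustness property the paragraph above describes.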
Detection accuracy of the conventional optical sensor and the DVS-based methods under each scenario:

              Optical   DVSS     DVSST    DVSSTL
Scenario I    100.0%    100.0%   100.0%   100.0%
Scenario II   100.0%    100.0%   87.5%    100.0%
Scenario III  95.5%     95.7%    92.1%    94.4%
Scenario IV   42.9%     15.9%    85.8%    81.5%
[Figure: experimental setups with the DVS and the light source separated by S = 25 mm and S = 10 mm; object distances of 20 mm and 100 mm are marked.]
V. CONCLUSION
The proposed proximity sensing design, which utilizes a dynamic vision sensor (DVS), can detect the proximity of an object while remaining robust to interference from unintended light sources and to the transparent plastic cover equipped for protection. The fast response time of the DVS allows the proposed design to measure the distance from the DVS to the object in real time based on the spatial information of the DVS events. The proposed method can also change the critical range, i.e., the range of measurable distances, by changing the angle of the light source. A more rectilinear light source is preferred for measuring the distance more accurately. This paper also described how to use a more diffuse light source, such as LEDs, for the sake of cost-effectiveness. Compared with conventional optical proximity sensors, which are vulnerable to environmental noise, the proposed method is robust to noise because it uses the temporal pattern of the DVS events synchronized with the light source control. Experiments show that the proposed design achieves up to 100% accuracy in normal scenarios by applying a common rule with a loose condition, and over 80% accuracy even in the worst scenarios. As an additional advantage, the proposed design can replace the conventional proximity sensor, and the benefits of the DVS itself remain available when it is not used for proximity detection.
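As a rough illustration of the synchronized sensing described in the conclusion, the sketch below pulses a light source in step with two sampling windows and checks the combined DVS transition stream for the on/off reflection signature. Everything here is hypothetical: `sensing_cycle`, `read_dvs_events`, and the window length stand in for hardware interfaces and timing the paper does not spell out.

```python
def proximity_detected(transitions):
    """True when the synchronized window contains both a positive (light ON)
    burst and a negative (light OFF) burst of DVS transitions."""
    on_burst = sum(t for t in transitions if t > 0)
    off_burst = -sum(t for t in transitions if t < 0)
    return on_burst > 0 and off_burst > 0

def sensing_cycle(drive_led, read_dvs_events, window_ms=2):
    """One cycle: pulse the LED in sync with two sampling windows, then
    test the combined transition stream for the reflection signature."""
    drive_led(True)
    events = read_dvs_events(window_ms)   # expect positive transitions
    drive_led(False)
    events += read_dvs_events(window_ms)  # expect negative transitions
    return proximity_detected(events)

# Hypothetical stand-ins for the hardware interfaces.
led_states = []
fake_stream = iter([[4, 3], [-5, -2]])   # ON window, then OFF window
result = sensing_cycle(led_states.append, lambda ms: next(fake_stream))
print(result)  # prints True
```

Synchronizing the read windows with the LED drive is what lets the temporal check reject ambient light that does not follow the on/off schedule.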
Jae-Yeon Won received the B.S. degree in electrical and electronic engineering from Yonsei University, Seoul, Korea, in 2007 and the M.S. degree in electrical engineering and computer science from Seoul National University, Seoul, in 2009. He has been working toward the Ph.D. degree in the Department of Electrical and Computer Engineering, Texas A&M University, College Station, TX, since 2010. His research interests include hardware design and power optimization for chip multiprocessors (CMPs) and neuromorphic computing.
Hyunsurk (Eric) Ryu graduated from POSTECH (B.S., 1992; M.S., 1994; Ph.D., 1998). He has been a Member of Technical Staff at the Samsung Advanced Institute of Technology, Samsung Electronics, since receiving his Ph.D. Recently he has expanded his research interests to bio-mimetic computing, communication, and cognitive applications.
Tobi Delbruck (M'89-SM'06-F'13) received his B.Sc. in physics and applied mathematics from UCSD in 1983 and a Ph.D. from Caltech in 1993. He worked for 5 years on electronic imaging at Arithmos, Synaptics, National Semiconductor, and Foveon. He is a professor at ETH Zurich in the Institute of Neuroinformatics and a Fellow of the IEEE. He has been awarded 6 IEEE awards, including the 2006 ISSCC Jan Van Vessem Outstanding European Paper Award. He co-organizes the Telluride Neuromorphic Cognition Engineering summer workshop and the demonstration sessions at ISCAS, and is incoming chair of the CAS Sensory Systems Technical Committee and associate editor of the IEEE Transactions on Biomedical Circuits and Systems. His current interests include bio-inspired and neuromorphic sensory processing.
Jun Haeng Lee received the B.S., M.S., and Ph.D. degrees in electrical engineering from the Korea Advanced Institute of Science and Technology (KAIST), Daejeon, Korea, in 1999, 2001, and 2005, respectively. He is currently with the Samsung Advanced Institute of Technology (SAIT) of Samsung Electronics Co. Ltd., Gyeonggi-do, Republic of Korea. His current research interests include neuromorphic engineering and biologically plausible models for artificial intelligence.
Jiang Hu (M'01-SM'07) received the B.S. degree in optical engineering from Zhejiang University, China, in 1990, the M.S. degree in physics in 1997, and the Ph.D. degree in electrical engineering from the University of Minnesota in 2001. He was with IBM Microelectronics from January 2001 to June 2002. Currently, he is an associate professor in the Department of Electrical and Computer Engineering at Texas A&M University. His research interests are in computer-aided design for VLSI circuits and systems, especially large-scale circuit optimization, clock network synthesis, robust design, and on-chip communication. He received a best paper award at the ACM/IEEE Design Automation Conference in 2001, an IBM Invention Achievement Award in 2003, and a best paper award at the IEEE/ACM International Conference on Computer-Aided Design in 2011. He has served as a technical program committee member for DAC, ICCAD, ISPD, ISQED, ICCD, DATE, ASP-DAC, ISLPED, and ISCAS, as technical program chair and general chair for the ACM International Symposium on Physical Design, and as an associate editor for the IEEE Transactions on CAD and the ACM Transactions on Design Automation of Electronic Systems.