
TSBB09 Image Sensors
2011-HT2
Lecture A: Image Formation

The history of image formation

The idea of a camera is linked to how a human perceives the world with her eyes.
But in the early days we had only vague or incorrect ideas about
- what light is
- how the eye maps objects in the world to the image that we perceive

Prior to the camera: the artist/painter

[Figures: paintings from Ancient Egypt, 13th century Europe, and the 15th century]

The Renaissance
Early investigations of perspective were made already by the ancient Greeks (~500 BC)
and by Arab scientists (~1000 AD).
It was not until the 15th century that artists began to use perspective as a basis
for their paintings:
- parallel lines should meet at a single point
- Brunelleschi (~1415)

[Christ Handing the Keys to Saint Peter, by Perugino, 1481]

Camera obscura

Since ancient times it has been known that a brightly illuminated scene can be
projected to an image
- in a dark room (Latin: camera obscura)
- through a small hole (aperture)
- the image becomes rotated 180°

Camera obscura

Full-sized dedicated camera obscura rooms were built in mansions and castles in the
17th and 18th centuries.

[From Diderot's Encyclopédie, 1772]

Camera obscura

The first photograph was taken with a small camera obscura in 1826 by Niépce.
8 h exposure time!

[A camera obscura at Melville Garden in Massachusetts, around 1880]

Camera obscura

Today:
- large sized: as tourist attractions
- small sized: for hobby photographs

Laterna Magica

Devices that can project an image onto a screen have been described since the 16th
century (in Europe, possibly earlier elsewhere?).
Referred to as laterna magica (magic lamp).
We need only
- an image painted on a transparent material (glass)
- a strong light source
- a lens
- a suitable screen

[From the Ars Magna Lucis et Umbrae, 1671]
History of photography

1826: Niépce takes the first proper photograph. 8 h exposure time!
1839: Daguerre develops the first practical method for photography
1839: Talbot invents the process for taking negative images that can be copied
1839: Herschel invents glass negatives
1861: Maxwell demonstrates color photographs
1878: Muybridge demonstrates moving images
1887: Celluloid film is introduced
1888: Kodak markets its first easy-to-use camera
1891: Edison patents his kinetoscopic camera
1895: Lumière Bros. invent the cinématographe
1925: Leica introduces the 35 mm film format for still images
1936: Kodachrome color film
1948: Land invents the Polaroid camera
1957: First digitized image
1959: AGFA introduces the first automatic camera
1969: Boyle and Smith invent the first CCD chip for image capture (based on the bubble memory)
1973: Fairchild Semiconductor markets the first CCD chip (100 × 100 pixels)
1975: Bayer at Kodak: first single-chip color CCD camera
1981: Sony markets the Mavica, the 1st consumer digital camera. Stores images on a floppy disc
1986: Kodak presents the first megapixel CCD camera
2005: Film-based photography company AgfaPhoto files for insolvency
2006: Dalsa Corporation presents a 111 Mpixel CCD camera
2009: Kodak announces that it will discontinue production of Kodachrome film

Source: en.wikipedia.org

Basic physics

Electromagnetic radiation consists of electromagnetic waves
- with energy
- that propagate through space
The waves consist of transversal electric and magnetic fields that alternate with a
temporal frequency ν (Hertz) and a spatial wavelength λ (meter).

Frequency and wavelength

The relation between frequency and wavelength is

    c = νλ

c is the speed of light and depends on the medium: c ≤ c0, where c0 = the speed of
light in vacuum ≈ 3 × 10⁸ m/s.
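As a quick numeric check of c = νλ, here is a minimal Python sketch (not part of the
original slides; it uses the rounded vacuum speed quoted above):

```python
C0 = 3e8  # speed of light in vacuum [m/s], rounded as on the slide

def frequency(wavelength_m: float) -> float:
    """Frequency nu [Hz] of a wave with the given wavelength, from c = nu*lambda."""
    return C0 / wavelength_m

def wavelength(frequency_hz: float) -> float:
    """Wavelength lambda [m] of a wave with the given frequency."""
    return C0 / frequency_hz

# Green light at 550 nm corresponds to roughly 5.5e14 Hz.
print(f"{frequency(550e-9):.2e} Hz")
```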

Particles and energy

Light can also be represented as particles, photons.
The energy of a photon is

    E = hν = hc/λ

where h is Planck's constant (≈ 6.626 × 10⁻³⁴ Js).

Particles and energy

The energy depends on the frequency, and energy is preserved.
If the speed of light changes from one medium to another
- the frequency is constant, to keep the energy constant
- the wavelength must change
c2 and λ2 must change with the same factor relative to c1 and λ1:

    c2 < c1  ⇒  ν2 = ν1  and  λ2 < λ1
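A worked example of the photon energy formula, as a small Python sketch (the 550 nm
test wavelength is an illustrative choice, not from the slides):

```python
H = 6.626e-34  # Planck's constant [J s]
C0 = 3e8       # speed of light in vacuum [m/s]

def photon_energy(wavelength_m: float) -> float:
    """Energy [J] of a single photon, E = h*nu = h*c/lambda."""
    return H * C0 / wavelength_m

# A green 550 nm photon carries about 3.6e-19 J; shorter wavelengths carry more.
print(f"{photon_energy(550e-9):.2e} J")
```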

Spectrum

In practice, light normally consists of
- photons with a range of energies, or
- waves with a range of frequencies
This mix of frequencies/wavelengths/energies is called the spectrum of the light.
The spectrum gives the total amount of energy for each frequency/wavelength/energy.
Monochromatic light consists of only one frequency/wavelength.
- Can be produced by special light sources, e.g., lasers

[Figure: spectra with the same total energies; natural light spreads its photons
over a range of energies E, while monochromatic light concentrates them at a single
energy.]

Classification of light spectrum

[Figure: the electromagnetic spectrum]

Polarization

The electromagnetic field has a direction
- perpendicular to the direction of motion
The polarization of the light is defined as the direction of the electric field.
Natural light consists of waves with polarization in all possible directions:
unpolarized light.
Special light sources or filters can produce polarized light of well-defined
polarization.

Polarization

Plane polarization:
- the electric field varies only in a single plane

Polarization

Circular/elliptical polarization:
- the electric field vector rotates
- can be constructed as the sum of two plane polarized waves with a 90° phase shift
Conversely: plane polarized light can be decomposed as a sum of two circularly
polarized waves that rotate in opposite directions.

Coherence

The phase of the light waves can either be
- random: incoherent light (natural light)
- in a systematic relation: coherent light
Coherent light is usually related to monochromatic light sources.
Compare a red LED and a red laser:
- both produce light within a narrow range
- the LED light is incoherent
- the laser light is coherent

Radiometry

Light radiation has energy.
Each photon has a particular energy related to its frequency (E = hν).
The number of photons of a particular frequency gives the amount of energy for this
frequency.
- Described by the spectrum
- Unit: Joule (or Watt second)
- Is usually not measured directly

Radiometry

The power of the radiation, i.e., the energy per unit time, is the radiant flux.
- Since the energy depends on the frequency, so does the radiant flux
- Unit: Watt (Joule per second)
- Is usually not measured directly

Radiometry

The radiant flux per unit area is the flux density.
- Since the flux depends on the frequency, so does the flux density
- Unit: Watt per square meter
- Can be measured directly, as the energy through a specific area during a specific
  time!
Irradiance: flux density incident upon a surface.
Excitance or emittance: flux density emitted from a surface.

Radiometry

For point sources, or distant sources of small extent, the flux density can also be
measured per unit solid angle.
The radiant intensity is the radiant flux per unit solid angle.
- Unit: Watt per steradian

Basic principle

Based on preservation of energy: a constant light source must produce the same
amount of energy through a given solid angle regardless of the distance to the
source.
- The radiant intensity is constant
- The radiant flux density decreases with the square of the distance to the source
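The inverse-square behaviour can be illustrated with a small Python sketch (a toy
example with made-up numbers, not from the slides):

```python
def flux_density(radiant_intensity: float, distance: float) -> float:
    """Flux density [W/m^2] at a distance [m] from a point source.

    The radiant intensity [W/sr] is constant with distance; a patch of area A
    at distance r subtends the solid angle A/r^2, so the flux density through
    it falls off as 1/r^2.
    """
    return radiant_intensity / distance**2

# Doubling the distance quarters the flux density.
print(flux_density(10.0, 1.0))  # 10.0 W/m^2
print(flux_density(10.0, 2.0))  # 2.5 W/m^2
```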

The radiometric chain

[Figures: a light source illuminates a surface, which is observed by a sensor. The
chain is successively extended with a second surface, a second light source, and an
intervening medium.]

Interaction between light and matter

Most types of light-matter interactions can be represented by
- n = the material's refractive index
- α = the material's absorption coefficient
Both parameters depend on the wavelength.
More complex interactions include polarization effects or non-linear effects.
All these effects are different for different wavelengths!

Light incident upon a surface

When light meets a surface
- some part of it is transmitted through the new medium
  (possibly with another speed and direction)
- some part of it is absorbed by the new medium
  (usually: the light energy is transformed to heat)
- some part of it is reflected

Basic principle

Based on preservation of energy:

    E0 = E1 + E2 + E3

E0 = incoming energy
E1 = transmitted energy
E2 = reflected energy
E3 = absorbed energy

Refraction

The light that is transmitted into the new medium is refracted due to the change in
light speed.
Snell's law of refraction:

    sin θ1 / sin θ2 = n2 / n1 = c1 / c2
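A small Python sketch of Snell's law (the air and glass refractive indices are
illustrative assumptions, not from the slides):

```python
import math

def refraction_angle(theta1_rad: float, n1: float, n2: float) -> float:
    """Angle of the refracted ray, from n1*sin(theta1) = n2*sin(theta2)."""
    s = n1 * math.sin(theta1_rad) / n2
    if abs(s) > 1.0:
        raise ValueError("total internal reflection, no refracted ray")
    return math.asin(s)

# Air (n ~ 1.0) to glass (n ~ 1.5): a 30 degree ray bends towards the normal.
theta2 = refraction_angle(math.radians(30.0), 1.0, 1.5)
print(f"{math.degrees(theta2):.1f} degrees")  # ~19.5
```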

Absorption

Absorption implies attenuation of transmitted or reflected light.
Materials get their colors as a result of different amounts of absorption for
different wavelengths.
Example: a green object attenuates wavelengths in the green band less than in other
bands.

Absorption

The absorption of light in matter depends on the length that the light travels
through the material:

    a = e^(−αx)

a = attenuation of the light (0 ≤ a ≤ 1)
α = the material's absorption coefficient
x = length that the light travels in the material
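The exponential attenuation can be evaluated directly; a minimal Python sketch with
an assumed absorption coefficient:

```python
import math

def attenuation(alpha: float, x: float) -> float:
    """Remaining fraction of the light, a = exp(-alpha*x), 0 <= a <= 1."""
    return math.exp(-alpha * x)

# With alpha = 2.3 per meter, only about 10% of the light remains after 1 m,
# and about 1% after 2 m.
print(f"{attenuation(2.3, 1.0):.3f}")  # ~0.100
print(f"{attenuation(2.3, 2.0):.3f}")  # ~0.010
```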

Absorption spectrum

The spectrum of the reflected/transmitted light is given by

    s2(λ) = s1(λ) a(λ)

s1 = incident spectrum
s2 = reflected/transmitted spectrum
a = absorption spectrum (0 ≤ a(λ) ≤ 1)

Reflection

Highly dependent on the surface type:
- Mirror: light is reflected in a single direction
- Lambertian surface: light is reflected equally much in all directions,
  independent of the direction of the incident light
A real surface is often a mix between the two cases.
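The spectral product s2(λ) = s1(λ) a(λ) in a small numpy sketch; the three-sample
spectrum and the "green object" values are made-up illustration numbers:

```python
import numpy as np

lam = np.array([450.0, 550.0, 650.0])  # sample wavelengths [nm]: blue, green, red
s1 = np.array([1.0, 1.0, 1.0])         # flat incident spectrum (illustrative)

# Hypothetical absorption spectrum of a green object: the green band is
# attenuated the least (0 <= a(lambda) <= 1).
a = np.array([0.2, 0.9, 0.3])

s2 = s1 * a  # reflected/transmitted spectrum
print(dict(zip(lam, s2)))  # the green sample dominates
```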

Emission

Independent of its interaction with incident light (well, almost):
any object, even one that is not considered a light source, emits electromagnetic
radiation.
- Primarily in the IR band, based on its temperature
- More on this in the lecture on IR sensors

Scattering

All media (other than vacuum) scatter light.
- Examples: air, water, glass
We can think of the medium as consisting of small particles that reflect the light
with some probability
- in any possible direction
- with different probability for different directions
The effect is weak and roughly proportional to λ⁻⁴.
In general, the probability also depends on the distribution of particle sizes.

Scattering

Scattering is not an absorption.
It rather means that the light ray does not travel along a straight line through the
medium: there is a probability that a certain photon exits the medium in another
direction than it entered.

Scattering

Examples:
- the sky is blue because of scattering of the sunlight
- a strong laser beam becomes visible in air
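The λ⁻⁴ dependence explains the numbers behind the blue sky; a one-line check in
Python (the wavelength choices are illustrative):

```python
def relative_scattering(lambda_nm: float, ref_nm: float) -> float:
    """Scattering strength of lambda relative to ref, assuming a lambda^-4 law."""
    return (ref_nm / lambda_nm) ** 4

# Blue light at 450 nm is scattered roughly 4x more strongly than red at 650 nm.
print(f"{relative_scattering(450.0, 650.0):.1f}")  # ~4.4
```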

The plenoptic function

At a point x = (x1, x2, x3) in space we can measure how much light energy travels in
the direction n = (n1, n2, n3), ‖n‖ = 1.

The plenoptic function

The plenoptic function is the corresponding radiance intensity function p(x, n)
(5-dimensional, since x is 3-dimensional and n is 2-dimensional).
It can also be a function of
- frequency ν
- time t
- (polarization)
giving p(x, n, ν, t) (7-dimensional).

A light camera

A (light) camera is a device that samples the plenoptic function in a particular
way. Different types of cameras sample in different ways:
- pinhole camera
- orthographic camera
- push-broom camera
- light-field camera

The pinhole camera

The most common camera model is the pinhole camera (Swedish: hålkamera).
An ideal model of the camera obscura.

The pinhole camera model

[Figure: the pinhole camera]
- Each point in the image plane is illuminated by a single ray passing through the
  aperture
- The aperture is the opening through which all light enters the camera
- The image plane is where we measure the image
- For an ideal pinhole camera the aperture is a single point

The pinhole camera model

Mathematically we need only know the location of the image plane and the aperture.
- The rest is physics + practical implementation
- In fact, it suffices to know the aperture (why?)
In the literature, the aperture point is also called
- camera center
- camera focal point
- the camera front

The pinhole camera model

The image plane and the camera center define a camera-centered coordinate system
(x1, x2, x3): x1 and x2 are parallel to the image plane, while x3 is perpendicular
to the plane and defines the viewing direction of the camera.
- The principal or optical axis is the line through the camera center along x3
- R, the point where the optical axis intersects the image plane, is the principal
  point or image center
- The (x1, x2) plane is the principal plane or focal plane
- f = focal distance, the distance between the image plane and the camera center

The pinhole camera model

P = (x1, x2, x3) is a point in 3D space and Q = (y1, y2) is its projection in the
image plane.
The green line is the projection line of point P: all points on the line are
projected onto Q.
- Alternatively: the projection line of Q

The pinhole camera model

If we look at the camera coordinate system along the x2 axis, two similar triangles
give:

    y1 / f = x1 / x3   or   y1 = f x1 / x3

Looking along the x1 axis gives a similar expression for y2.

The pinhole camera model

This can be summarized as:

    (y1, y2) = (f / x3) (x1, x2)
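The projection equation in code; a minimal numpy sketch (the point coordinates and
the 15 mm focal distance are illustrative assumptions):

```python
import numpy as np

def project_pinhole(points: np.ndarray, f: float) -> np.ndarray:
    """Project Nx3 points in camera coordinates onto the image plane.

    Implements (y1, y2) = (f/x3) * (x1, x2); assumes x3 > 0 (points in front
    of the camera).
    """
    return f * points[:, :2] / points[:, 2:3]

# Two points in the same direction but at different depths: the more distant
# one projects closer to the principal point.
P = np.array([[0.1, 0.2, 1.0],
              [0.1, 0.2, 2.0]])   # [m]
print(project_pinhole(P, f=0.015))  # f = 15 mm
```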

The virtual image plane

The projected image is rotated 180° relative to how we see the 3D world.
- Reflection in both the y1 and y2 coordinates = rotation
- Must be de-rotated before we can view it
- In the film-based camera, the image is manually rotated
- In the digital camera this is taken care of by reading out the pixels in rotated
  order

The virtual image plane

Mathematically this is equivalent to placing the image plane in front of the focal
point. Easier to draw?
- Projection lines work as before: from P through the focal point, intersecting at Q
- This defines the virtual image plane
- Cannot be realized in practice
- Produces the same image as the rotated image from the real image plane

The virtual image plane

[Figure]
- P = a point in 3D space
- O = the camera focal point
- The projection of P onto the virtual image plane:

    (y1, y2) = (f / x3) (x1, x2)

Lenses vs. infinitesimal aperture

The pinhole camera model doesn't work in practice, since
- if we make the aperture small, too little light enters the camera
- if we make the aperture larger, the image becomes blurred
Solution: we replace the aperture with a lens or a system of lenses.

Thin lenses

The simplest model of a lens:
focuses all points in an object plane onto the image plane.

[Figure: object plane at distance a and image plane at distance b from the lens]

The object plane

The object plane consists of all points that appear sharp when projected through the
lens onto the image plane.
The object plane is an ideal model of where the sharp points are located.
In practice the object plane may be non-planar, e.g. described by the surface of a
sphere.
- The shape of the object plane depends on the quality of the lens (system)
- For thin lenses the object plane can often be approximated as a plane

Thin lenses

The thin lens is characterized by a single parameter: the focal length fL.

    1/a + 1/b = 1/fL

To change a (the distance to the object plane), we need to change b, since fL is
constant.
- a = ∞ for b = fL !
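Solving the thin-lens equation for b; a small Python sketch (the 15 mm focal length
is an illustrative value):

```python
def image_distance(a: float, f_l: float) -> float:
    """Distance b from the lens to the image plane, from 1/a + 1/b = 1/fL.

    As a -> infinity, b -> fL. For a <= fL no real image is formed, so the
    result is only meaningful for a > fL.
    """
    return 1.0 / (1.0 / f_l - 1.0 / a)

# A 15 mm lens focused at 1.5 m: b is only slightly longer than fL.
print(f"{image_distance(1.5, 0.015) * 1e3:.3f} mm")  # ~15.152 mm
```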

Diffraction limited systems

Due to the wave nature of light, even when various lens effects are eliminated, it
cannot be focused to an arbitrarily small point if it has passed an aperture.
Example in 1D, for coherent light:
Huygens's principle: treat the incoming light as a set of point light sources.
This gives a diffraction pattern at the image plane.
- x = vertical position in the aperture

Diffraction limited systems

Each point along the aperture, at position x, acts as a wave source.
In the image plane, at position x′, each point source contributes with a wave that
has a phase difference Δφ = 2π x sin θ / λ relative to the position at the centre of
the aperture.
θ is the angle from point x′ to the aperture and, assuming that x′ << f, it follows
that sin θ ≈ x′ / f.
We get:

    Δφ ≈ 2π x x′ / (λ f)

Diffraction limited systems

The principle of superposition means that the resulting wave function at the image
plane is a sum/integral of the contributions from the different light sources.

Diffraction limited systems

This phenomenon generalizes to 2D:
the resulting wave function is the 2D Fourier transform of the incoming spatial
amplitude (as a function of x).
Example: a circular aperture of diameter D
- amplitude of incoming light: constant over the aperture
- resulting wave function: a first order Bessel function
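The 2D Fourier-transform relation can be checked numerically: the squared magnitude
of the FFT of a circular aperture yields the Airy-like pattern. A small numpy sketch
(the grid size and aperture diameter are arbitrary choices):

```python
import numpy as np

N = 512  # samples per side of the grid
y, x = np.mgrid[-N // 2:N // 2, -N // 2:N // 2]

D = 32   # aperture diameter in grid samples
aperture = (x**2 + y**2 <= (D / 2) ** 2).astype(float)  # unit amplitude on a disk

# Far-field (Fraunhofer) amplitude = 2D Fourier transform of the aperture;
# the observed intensity is its squared magnitude.
field = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(aperture)))
intensity = np.abs(field) ** 2

# Most of the energy ends up in the central lobe: the Airy disk.
print(f"peak: {intensity[N // 2, N // 2]:.3e}")
print(f"total: {intensity.sum():.3e}")
```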

The Airy disk

The smallest resolvable distance in the image plane, Δx, is given by the distance to
the first zero of the resulting wave function:

    Δx = 1.22 λ f / D

λ = light wavelength
f = lens focal length
D = lens diameter

The Airy disk

[Figure: camera front, focal plane, image plane]
The Airy disk: the image of the circular aperture pattern projected onto the image
plane.

The Airy disk

Conclusions:
- The image cannot have a better resolution than Δx
- No need to measure the image with higher resolution than Δx!
- Be aware of cameras with high pixel resolution and high diffraction
- Image resolution is not defined by the number of pixels in the camera!

The point spread function

The Airy disk is also called point spread function, blur disk, or circle of
confusion.
In general, the point spread function can be related to several effects that make
the image of a point appear blurred:
- diffraction
- lens imperfections
- imperfections in the position of the image plane
It is often modeled as constant over the image.
- Can be variable for poor optical systems

Depth of field

We have now placed a lens at the aperture.
Points that are off the object plane become blurred, proportionally to the
displacement from the object plane.
Due to the point spread function, it makes sense to accept blur in the order of Δx.
- This blur will be there anyway, due to diffraction
The depth of field d is the displacement along the optical axis from the object
plane that gives blur ≤ Δx.

Depth of field

For a camera where a < ∞, an approximation (assuming d << a) of d is

    d ≈ 2 Δx a (a − fL) / (D fL)

a = distance from lens to object plane
fL = lens focal length
D = lens diameter
Δx = required image plane resolution
d = depth of field
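The depth-of-field approximation in code; a minimal Python sketch using the example
numbers from the next slide (F-number 8, Δx ≈ 4 μm):

```python
def depth_of_field(a: float, f_l: float, d_lens: float, dx: float) -> float:
    """Depth of field d ~ 2*dx*a*(a - fL)/(D*fL), valid when d << a."""
    return 2.0 * dx * a * (a - f_l) / (d_lens * f_l)

# fL = 15 mm at F-number 8 (so D = fL/8), dx = 4 um, object plane at 1.5 m.
f_l = 0.015
print(f"{depth_of_field(1.5, f_l, f_l / 8.0, 4e-6):.2f} m")  # ~0.63 m
```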

The F-number

fL / D is the F-number of the lens or lens system.
For a lens where a = ∞, points that are further away than

    dmin = fL D / (4 Δx)

are blurred less than Δx.

The F-number

Example:
- a typical F-number of a camera: F = 8
- blue light: wavelength λ ≈ 420 nm
- Airy disk diameter: Δx = 1.22 F λ ≈ 4 μm
For a lens with fL = 15 mm we get
- d ≈ 0.6 m at a = 1.5 m
- dmin ≈ 1.8 m at a = ∞
This means that the depth of field is within a manageable range.
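The worked example can be reproduced directly; a small Python sketch of the two
formulas above:

```python
def airy_diameter(f_number: float, wavelength: float) -> float:
    """Airy disk diameter, dx = 1.22 * F * lambda."""
    return 1.22 * f_number * wavelength

def d_min(f_l: float, d_lens: float, dx: float) -> float:
    """Distance beyond which points blur less than dx, for a = infinity."""
    return f_l * d_lens / (4.0 * dx)

dx = airy_diameter(8.0, 420e-9)  # ~4.1e-6 m, i.e. ~4 um
print(f"dx   = {dx * 1e6:.1f} um")
print(f"dmin = {d_min(0.015, 0.015 / 8.0, dx):.1f} m")  # ~1.7 m (slide rounds to 1.8)
```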

Thin lenses and the pinhole camera

b is the distance from the lens/aperture to the image plane.
- This is the focal length the pinhole camera would have, had there not been a lens;
  the same as the pinhole camera focal length f
a is the distance from the lens/aperture to the object plane.
- a and b are related by 1/a + 1/b = 1/fL, where fL is the focal length of the lens.
  Often fL is variable
- All points within the depth of field are projected with maximum sharpness Δx onto
  the image plane
The geometric effect of a lens in the aperture is that
- the camera center is placed at the center of the lens
- the effective focal distance of the pinhole camera becomes b

Lens distortion

A lens or a lens system can never map straight lines in the 3D scene exactly to
straight lines in the image plane.
Depending on the lens type, a square pattern will typically appear like a barrel or
a pincushion.

Radial lens distortion

This effect is called lens distortion (geometric distortion) and can, in the
simplest case, be modeled as a radial distortion:

    (y1, y2) = r (cos φ, sin φ)        correct image coordinates, according to the
                                       pinhole camera model
    (y1′, y2′) = h(r) (cos φ, sin φ)   observed (real) image coordinates

The observed positions of points in the image are displaced in the radial direction,
relative to the image center, from the positions described by the pinhole camera
model.

[Figure: no distortion, barrel distortion, pincushion distortion]

Radial lens distortion

h is approximately a linear function with some non-linear deviation.
- The deviation from a linear function usually grows with r

Lens distortion

Which distortion function h is used depends on the type of lens and other practical
considerations:
- number of parameters
- invertibility
More complicated distortion models include angular dependent distortion.
- Cheap lenses ⇒ significant distortion
- Expensive lenses ⇒ almost no distortion
Once modeled, we can compensate for the distortion.
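As an illustration, a common one-parameter radial model is h(r) = r(1 + k1 r²); note
that this particular polynomial form is an assumption here, not a formula from the
slides. A small numpy sketch:

```python
import numpy as np

def radial_distort(y: np.ndarray, k1: float) -> np.ndarray:
    """Map ideal pinhole coordinates to observed ones using h(r) = r*(1 + k1*r^2).

    y: Nx2 image coordinates relative to the image center.
    k1 < 0 gives barrel distortion, k1 > 0 gives pincushion distortion.
    """
    r2 = np.sum(y ** 2, axis=1, keepdims=True)
    return y * (1.0 + k1 * r2)

pts = np.array([[0.1, 0.0],
                [0.5, 0.5]])
print(radial_distort(pts, k1=-0.2))  # points are pulled towards the center
```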

Vignetting

Even if the light that enters the camera is constant in all directions, the image
plane will receive different amounts of illumination.
This effect is called vignetting.

Vignetting

Sometimes used as a photographic effect, but it is usually unwanted.
Can be compensated for in digital cameras.

[Mechanical vignetting: image from a digital camera with a very light lens]

The cos⁴ law

We can see the aperture as a light source in the form of a small area that
illuminates the image plane.

[Figure]
- Light from a larger solid angle, emitted from point A, is focused here
- Light from a smaller solid angle, emitted from point B, is focused here

The cos⁴ law

For an image point at angle θ off the optical axis:
- the flux density decreases with the square of the distance to the light source:
  cos² θ
- the effective area of the detector relative to the aperture varies as cos θ
- the effective area of the aperture relative to the detector varies as cos θ
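A quick evaluation of the cos⁴ falloff; a minimal Python sketch (the field angle is
an illustrative choice):

```python
import math

def relative_illumination(theta_rad: float) -> float:
    """Relative illumination at field angle theta, according to the cos^4 law."""
    return math.cos(theta_rad) ** 4

# 25 degrees off the optical axis the illumination has dropped to about 67%.
print(f"{relative_illumination(math.radians(25.0)):.2f}")  # ~0.67
```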

The cos⁴ law

This effect exists also in lens-based cameras.
This means that, in general, there is an attenuation of the image towards the edges,
approximately according to cos⁴ θ.
Can be compensated for in a digital camera.

Chromatic aberration

The refraction index of matter (lenses) is wavelength dependent.
- Example: a prism can decompose light into its spectrum
A ray of white light is decomposed into rays of different colors that intersect the
image plane at different points.

Chromatic aberration

Sometimes clearly visible if you look close to the edges through a pair of glasses.
