Neighborhood
In the context of image topology, neighborhood has a specific technical meaning:
4-neighborhood N4(p): the set of pixels situated above, below, to the left, and to the right of the reference pixel p.
8-neighborhood N8(p): the set of all of p's immediate neighbors.
Diagonal neighborhood ND(p): the pixels that belong to the 8-neighborhood but not to the 4-neighborhood.
N4(p) ∪ ND(p) = N8(p)
Two pixels p and q are 4-adjacent if they have the same value and are 4-neighbors of each other. Diagonal moves are not allowed.
Two pixels p and q are 8-adjacent if they have the same value and are 8-neighbors of one another. Diagonal moves are allowed, but they may give rise to redundant (ambiguous) paths.
By Oge Marques Copyright © 2011 by John Wiley & Sons, Inc. All rights reserved.
Basic terminology
Mixed Adjacency
Two pixels p and q are m-adjacent if they have the same value and
q is in the set N4(p), OR
q is in the set ND(p) AND the set N4(p) ∩ N4(q) contains no pixels of that value.
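The three adjacency tests above can be sketched in code. This is a minimal illustration, not the book's code; all function names are mine, and it assumes a binary image where foreground pixels have value 1.

```python
def n4(p):
    """4-neighbors of pixel p = (x, y): above, below, left, right."""
    x, y = p
    return {(x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)}

def nd(p):
    """Diagonal neighbors of p: in the 8-neighborhood but not the 4-neighborhood."""
    x, y = p
    return {(x - 1, y - 1), (x - 1, y + 1), (x + 1, y - 1), (x + 1, y + 1)}

def in_bounds(img, p):
    return 0 <= p[0] < len(img) and 0 <= p[1] < len(img[0])

def same_value(img, p, q):
    # assumption: "same value" means both pixels are foreground (value 1)
    return img[p[0]][p[1]] == 1 and img[q[0]][q[1]] == 1

def adjacent4(img, p, q):
    return same_value(img, p, q) and q in n4(p)

def adjacent8(img, p, q):
    return same_value(img, p, q) and q in (n4(p) | nd(p))

def adjacent_m(img, p, q):
    """m-adjacent: 4-adjacent, OR diagonal neighbors whose shared
    4-neighborhood N4(p) ∩ N4(q) contains no foreground pixel."""
    if not same_value(img, p, q):
        return False
    if q in n4(p):
        return True
    if q in nd(p):
        shared = n4(p) & n4(q)
        return not any(in_bounds(img, r) and img[r[0]][r[1]] == 1 for r in shared)
    return False

img = [[0, 1, 1],
       [0, 1, 0],
       [0, 0, 1]]
```

With this image, pixels (0, 2) and (1, 1) are 8-adjacent but not m-adjacent, because they share the foreground 4-neighbor (0, 1); m-adjacency removes exactly this kind of redundant diagonal connection.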
Paths
A 4-path between two pixels p and q is a sequence of pixels starting with p
and ending with q such that each pixel in the sequence is 4-adjacent to its
predecessor in the sequence.
An 8-path indicates that each pixel in the sequence is 8-adjacent to its
predecessor.
Similarly, an m-path indicates that each pixel in the sequence is m-adjacent
to its predecessor.
Connectivity
If there is a 4-path between pixels p and q, they are said to be 4-connected.
The existence of an 8-path between them means that they are 8-connected.
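Connectivity can be tested directly from these definitions: two pixels are 4-connected exactly when a breadth-first search over 4-neighbors reaches one from the other. A minimal sketch (function name is mine, not the book's):

```python
from collections import deque

def connected4(img, p, q):
    """True if there is a 4-path of equal-valued pixels from p to q (BFS)."""
    rows, cols = len(img), len(img[0])
    value = img[p[0]][p[1]]
    if img[q[0]][q[1]] != value:
        return False
    seen, frontier = {p}, deque([p])
    while frontier:
        x, y = frontier.popleft()
        if (x, y) == q:
            return True
        # expand only the 4-neighbors: no diagonal moves
        for nx, ny in ((x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)):
            if (0 <= nx < rows and 0 <= ny < cols
                    and (nx, ny) not in seen and img[nx][ny] == value):
                seen.add((nx, ny))
                frontier.append((nx, ny))
    return False

img = [[1, 1, 0],
       [0, 1, 0],
       [0, 0, 1]]
# (0, 0) and (1, 1) are 4-connected; (0, 0) and (2, 2) are not
# (the pixel at (2, 2) touches the others only diagonally).
```

An 8-connectivity test would differ only in the neighbor list, adding the four diagonal offsets.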
Paths and Connectivity: example of 4-connected paths and the resulting components (figure).
Distance Measures
The city-block distance between p = (x, y) and q = (s, t) is D4(p, q) = |x − s| + |y − t|. The pixels with D4 = 1 are the 4-neighbors of p(x, y).
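The D4 (city-block) measure has two standard companions not spelled out on this slide: the Euclidean distance De and the chessboard distance D8. A minimal sketch of all three (function names are mine):

```python
import math

def d_e(p, q):
    """Euclidean distance between pixels p and q."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def d4(p, q):
    """City-block distance: |x - s| + |y - t|; D4 = 1 gives the 4-neighbors."""
    return abs(p[0] - q[0]) + abs(p[1] - q[1])

def d8(p, q):
    """Chessboard distance: max(|x - s|, |y - t|); D8 = 1 gives the 8-neighbors."""
    return max(abs(p[0] - q[0]), abs(p[1] - q[1]))

p, q = (0, 0), (2, 3)
# d4(p, q) == 5, d8(p, q) == 3, d_e(p, q) == sqrt(13)
```

Note that D4 counts only horizontal/vertical steps while D8 lets a diagonal step cover both axes at once, which is why D8 ≤ De ≤ D4 for any pair of pixels.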
Dm Distance Example (continued):
➢ Case 1: If p1 = 0 and p3 = 0, the length of the shortest m-path (the Dm distance) is --------
➢ Case 2: If p1 = 1 and p3 = 0, then p2 and p are no longer m-adjacent (see the m-adjacency definition), so the length of the shortest m-path is -----
➢ Case 3: If p1 = 0 and p3 = 1, the same reasoning applies, and the length of the shortest m-path is -------
➢ Case 4: If p1 = 1 and p3 = 1, the length of the shortest m-path is -------
Transform domain
Operations are performed on a transformed version of the image, rather than directly on its pixels.
Global (point) operations
Point operations apply the same mathematical function
(transformation function) to all pixels in a uniform
manner, regardless of their location in the image or the
values of their neighbors.
Global (point) operations
Example: Overall Intensity Reduction by s = r/2
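The s = r/2 transformation can be sketched as follows: a minimal Python illustration (not the book's MATLAB code), where integer division stands in for rounding and the image is a nested list of 8-bit intensities.

```python
def halve_intensity(img):
    """Point operation s = r // 2 applied uniformly to every pixel,
    regardless of its location or its neighbors' values."""
    return [[pixel // 2 for pixel in row] for row in img]

img = [[200, 100],
       [50, 255]]
halve_intensity(img)  # [[100, 50], [25, 127]]
```

Because the same function is applied to every pixel independently, the result depends only on each pixel's own value, which is the defining property of a point operation.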
Neighborhood-oriented operations
Neighborhood-oriented operations consist of determining
the resulting pixel value at coordinates (x,y) as a function
of its original value and the value of (some of) its
neighbors, using a convolution operation.
The convolution of a source image with a small 2D array
(mask or kernel or template or window) produces a
destination image in which each pixel value depends on its
original value and the value of (some of) its neighbors.
The convolution mask determines which neighbors are
used as well as the relative weight of their original values.
Example: Spatial Domain Filters
Neighborhood-oriented operations
Masks are normally 3×3.
Each mask coefficient can be interpreted as a weight.
Convolution
Just like 1D convolution, extended to two dimensions.
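The mask-based operation described above can be sketched directly. This is a minimal illustration (not the book's code): a naive 2D convolution with a 3×3 mask that skips border pixels for brevity. True convolution flips the mask; for the symmetric averaging mask used here it coincides with correlation.

```python
def convolve(img, mask):
    """Convolve a grayscale image (nested list) with a 3x3 mask.
    Border pixels are left unchanged for simplicity."""
    rows, cols = len(img), len(img[0])
    out = [row[:] for row in img]
    for x in range(1, rows - 1):
        for y in range(1, cols - 1):
            # mask[1 - i][1 - j] implements the flip of true convolution
            out[x][y] = sum(img[x + i][y + j] * mask[1 - i][1 - j]
                            for i in (-1, 0, 1) for j in (-1, 0, 1))
    return out

# 3x3 averaging (blur) mask: each of the nine pixels weighted 1/9
avg = [[1 / 9] * 3 for _ in range(3)]
```

Each coefficient of the mask is the relative weight given to the corresponding neighbor, so a constant region passes through the averaging mask unchanged while edges are smoothed.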
Operations combining multiple images
There are many image processing applications that combine two images, pixel by pixel, using an arithmetic or logical operator, resulting in a third image Z:
X opn Y = Z
where opn is an arithmetic (+, −, ×, ÷) or logical (AND, OR, XOR) operator.
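X opn Y = Z can be sketched generically: a minimal illustration (function names are mine) applying any pixel-wise operator to two equal-sized images, here 8-bit addition with clipping and bitwise AND.

```python
def combine(x, y, opn):
    """Combine two images pixel by pixel: Z = X opn Y."""
    return [[opn(a, b) for a, b in zip(rx, ry)] for rx, ry in zip(x, y)]

def add8(a, b):
    """8-bit addition, clipped to the valid range [0, 255]."""
    return min(a + b, 255)

x = [[200, 10]]
y = [[100, 5]]
combine(x, y, add8)                 # [[255, 15]]  (200 + 100 clips to 255)
combine(x, y, lambda a, b: a & b)   # bitwise AND of each pixel pair
```

Clipping (or rescaling) after arithmetic combination matters in practice, since the raw sum or difference of two 8-bit images can fall outside the displayable range.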
Operations in a transform domain
A transform is a mathematical tool that allows the conversion of a set of values to another set of values, creating, therefore, a new way of representing the same information, e.g., the 2D Fourier Transform.
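As a concrete toy illustration of such a transform, a naive 2D discrete Fourier transform can be written directly from its defining double sum. This sketch (function name is mine) uses only the standard library and is far too slow for real images, where FFT-based routines are used instead.

```python
import cmath

def dft2(img):
    """Naive 2D DFT: F(u, v) = sum_x sum_y f(x, y) * exp(-2*pi*i*(u*x/M + v*y/N))."""
    M, N = len(img), len(img[0])
    return [[sum(img[x][y] * cmath.exp(-2j * cmath.pi * (u * x / M + v * y / N))
                 for x in range(M) for y in range(N))
             for v in range(N)]
            for u in range(M)]

F = dft2([[1, 1],
          [1, 1]])
# The DC coefficient F[0][0] equals the sum of all pixel values (here 4);
# a constant image has no energy at any other frequency.
```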
Image sensing and acquisition
What will we learn?
What are the main parameters involved in the design of an
image acquisition solution?
What is sampling?
What is quantization?
Image acquisition, formation, and digitization
(Figure: an array of light-sensitive cells, each registering a different light intensity.)
Light, Color, and EM Spectrum
The existence of light (or other forms of EM radiation) is
an essential requirement for an image to be created,
captured, and perceived.
Light:
• A photon is a tiny packet of vibrating EM energy.
• EM waves can be visualized as propagating sinusoidal waves of a certain frequency f or wavelength 𝜆, or they can be thought of as a stream of massless particles called photons, each traveling in a wavelike pattern and moving at the speed of light.
• Photon energy: 𝐸 = ℎf. Because their frequency is so high, gamma-ray photons carry enough energy to be dangerous to living organisms.
• Photon wavelength: 𝜆 = 𝑣/𝑓, where 𝑣 is the propagation speed (equal to c in vacuum).
• The HVS (human visual system) is sensitive to photons of wavelengths between 400 and 700 nm.
• Here we exclusively focus on images within the visible range of the EM spectrum.
• Light is the preferred energy source for most imaging tasks
because:
• It is safe, cheap, easy to control and process with optical hardware,
• It is easy to detect using relatively inexpensive sensors, and
• It can be readily processed by signal processing hardware.
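The two photon relations above can be combined into E = hc/𝜆 for light in vacuum. A minimal worked example (the constants are the standard rounded values, not from the slides):

```python
h = 6.626e-34  # Planck's constant, J*s
c = 3.0e8      # speed of light in vacuum, m/s

def photon_energy(wavelength_m):
    """Photon energy E = h*f = h*c/lambda (vacuum)."""
    return h * c / wavelength_m

# Green light at 510 nm: roughly 3.9e-19 J per photon.
E_green = photon_energy(510e-9)
```

Shorter wavelengths mean higher frequencies and therefore higher photon energies, which is the quantitative basis for the remark about gamma rays above.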
Types of images
Reflection images
Emission images
Absorption images
Types of images
Reflection images: formed as the result of (ambient or artificial) radiation being reflected by the surfaces of the objects being imaged.
Light and Color Perception
• Light is a particular type of EM radiation that can be sensed by
the human eye.
• Colors perceived by humans are determined by:
• the nature of the light reflected by the object, which is a
function of the spectral properties of the light source, and
• the absorption and reflectance properties of the object.
Radiance of Light Sources
• The radiance (physical power) of a light source is
expressed in terms of its spectral power distribution
(SPD).
• Commonly found light sources in imaging systems are:
• Sunlight,
• Tungsten lamp,
• Light-Emitting Diode (LED),
• Mercury Arc lamp, and
• Helium–Neon laser.
• The human perception of each of these light sources
will vary—from the yellowish nature of light produced
by tungsten light bulbs to the extremely bright and
pure red laser beam.
Spectral Power Distribution (SPD) of Common Physical Light Sources
Visible Light
Radiance of Light Sources
The blue light has a wavelength of about 475 nm. Because
the blue wavelengths are shorter in the visible spectrum,
they are scattered more efficiently by the molecules in
the atmosphere. This causes the sky to appear blue.
The visible green light has a wavelength of about 510 nm. Grass appears green because all of the colors in the visible part of the spectrum are absorbed by the leaves of the grass except green, which is reflected.
The visible yellow light has a wavelength of about 570 nm.
Low-pressure sodium lamps, like those used in some
parking lots, emit a yellow (wavelength 589 nm) light.
The visible red light has a wavelength of about 650 nm.
At sunrise and sunset, red or orange colors are present
because the wavelengths associated with these colors are
less efficiently scattered by the atmosphere than the
shorter wavelength colors (e.g., blue and purple). A large
amount of blue and violet light has been removed as a
result of scattering and the longwave colors, such as red
and orange, are more readily seen.
Radiance of Light Sources
At 6,000 K, which is the temperature of the Sun's surface, the peak color is yellow and there is a good production of visible light. This is why our eyes have evolved to see this range of light: it is what the Sun produces the most of.
Do it yourself
Color Encoding and Representation
Color can be encoded using three numerical components
and appropriate spectral weighting functions.
Colorimetry is the science that deals with the
quantitative study of color perception.
It is concerned with the representation of tristimulus
values (amounts of R, G, and B), from which the perception
of color is derived.
The simplest way to encode color in cameras and displays is
by using the red (R), green (G), and blue (B) values of each
pixel.
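The per-pixel (R, G, B) encoding mentioned above is commonly stored as a single 24-bit "true color" value, 8 bits per component. A minimal sketch (function names are mine):

```python
def pack_rgb(r, g, b):
    """Pack 8-bit R, G, B components into one 24-bit integer (0xRRGGBB)."""
    return (r << 16) | (g << 8) | b

def unpack_rgb(v):
    """Recover the (R, G, B) triple from a packed 24-bit value."""
    return ((v >> 16) & 0xFF, (v >> 8) & 0xFF, v & 0xFF)

px = pack_rgb(255, 128, 0)  # an orange pixel, 0xFF8000
# unpack_rgb(px) == (255, 128, 0)
```

With 8 bits per component this gives 2^24 (about 16.7 million) representable colors, which is why the format is often called "true color".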
Color properties
Human perception of light (and hence color) is commonly described in terms of three parameters: brightness, hue, and saturation.
Brightness:
subjective perception of luminous intensity, or “the
attribute of a visual sensation according to which an area
appears to emit more or less light”.
Hue:
“the attribute of a visual sensation according to which an
area appears to be similar to one of the perceived colors, red,
yellow, green and blue, or a combination of two of them”.
the dominant wavelength of an SPD.
Color properties
Saturation
“the colorfulness of an area judged in proportion to its
brightness”.
From a spectral viewpoint, the more an SPD is
concentrated at one wavelength, the more saturated will
be the associated color. Hence Saturation is the “purity” of
color.
The addition of white light (which contains power at all wavelengths) causes color desaturation.
Hue and Saturation taken together are called
“chromaticity” and hence a color may be
characterized by its brightness and chromaticity.
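The brightness/hue/saturation description corresponds closely to the HSV color model, and Python's standard `colorsys` module (used here purely as an illustrative stand-in, not the book's tooling) converts between RGB and HSV:

```python
import colorsys

# Pure red: hue 0.0, fully saturated (a pure spectral-like color),
# full brightness (value).
h, s, v = colorsys.rgb_to_hsv(1.0, 0.0, 0.0)

# Adding "white" (equal amounts of all three primaries) desaturates the
# color while leaving its hue unchanged:
h2, s2, v2 = colorsys.rgb_to_hsv(1.0, 0.5, 0.5)  # a pink: same hue, lower s
```

This matches the slide: hue and saturation (chromaticity) identify *which* color, while brightness says how much light there appears to be.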
Color properties – MS Paint:
Image acquisition
Two main building blocks of image acquisition are:
Image Sensor
Camera Optics
Image sensors
Goal: to convert EM energy into electrical signals that
can be processed, displayed and/or interpreted as
images.
Common technologies:
CCDs (charge-coupled devices)
CMOS (complementary metal oxide semiconductor)
Image sensors
Digital cameras typically use 2D (area) CCD sensors. Flat-bed scanners employ 1D (line) CCDs that move across the image as each row is scanned.
CCD image sensors
The nominal resolution of a CCD sensor is the size of the
scene element that images to a single pixel on the image
plane.
Example: if a 20 cm × 20 cm sheet of paper is imaged to form a 500×500 digital image, then the nominal resolution of the sensor is 0.04 cm (the sheet's 400 cm² map onto 250,000 pixels, i.e., 0.0016 cm², or a 0.04 cm × 0.04 cm area, per pixel).
The field of view (FOV) of an imaging sensor is a measure
of how much of a scene it can see, e.g. 10 cm × 10 cm.
Since this may vary with depth, it is often more meaningful to
refer to the angular field of view, e.g. 55° × 40°.
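The nominal-resolution arithmetic from the example above can be checked directly (variable names are mine):

```python
scene_side_cm = 20.0   # side of the imaged sheet of paper
image_side_px = 500    # side of the resulting digital image

# Scene length covered by one pixel side, and scene area per pixel:
nominal_resolution_cm = scene_side_cm / image_side_px          # 0.04 cm
area_per_pixel_cm2 = scene_side_cm ** 2 / image_side_px ** 2   # 0.0016 cm^2
```

Note that the nominal resolution is the square root of the per-pixel area, so the two figures on the slide (0.04 cm and 0.0016 cm²) describe the same quantity.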
Color CCD cameras
Bayer pattern (single-CCD cameras)
Color CCD cameras
Beam splitter (three-CCD cameras)
CMOS cameras
X3 color sensor (Foveon)
Camera optics
Camera optics
Aberrations: (a) pincushion, (b) barrel.
MATLAB Image Acquisition Toolbox
Pixel arrays of several imaging standards
Image digitization
Sampling
The process of digitizing the spatial coordinates of the image, i.e., measuring the image function at discrete intervals in space.
Quantization
The process of replacing a continuously varying
function with a discrete set of quantization levels.
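Quantization can be sketched as mapping each 8-bit intensity onto a reduced set of evenly spaced levels. This is a minimal illustration (function name is mine), assuming the common [0, 255] intensity range:

```python
def quantize(value, levels):
    """Uniformly requantize an 8-bit intensity to one of `levels` levels."""
    step = 256 / levels
    return int(value // step) * int(step)

# With 4 levels the step is 64, so intensities map as:
# 0-63 -> 0, 64-127 -> 64, 128-191 -> 128, 192-255 -> 192.
quantize(100, 4)  # 64
```

Coarse quantization (few levels) is what produces the false-contouring artifacts discussed later when gray levels are reduced too aggressively.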
Spatial resolution
A way of expressing the density of pixels in an
image using units such as dots per inch (dpi).
The greater the spatial resolution, the more pixels
are used to display the image within a certain fixed
physical size.
Artifacts associated with poor spatial resolution:
Pixelation
Jaggedness
Loss of detail
Moiré patterns
Spatial resolution – Example 5.1
Requantization in MATLAB
grayslice function
I1 = imread('ml_gray_640_by_480_256.png');
I2 = grayslice(I1, 64); % requantize to 64 gray levels; display with imshow(I2, gray(64))
Requantization in MATLAB – Example 5.2 (results)
Arithmetic and Logical Operations