
Basic terminology

 Neighborhood
 In the context of image topology, neighborhood has a specific technical meaning:
 4-Neighborhood: Set of pixels situated above, below, to the right, and to the
left of the reference pixel (p),
 8-Neighborhood: Set of all of p’s immediate neighbors
 Diagonal Neighborhood: Pixels that belong to the 8-neighborhood, but not to
the 4-neighborhood, make up the diagonal neighborhood

N4(p) ∪ ND(p) = N8(p)

4-Neighborhood (a.k.a. Edge Neighborhood)   Diagonal Neighborhood (a.k.a. Point Neighborhood)   8-Neighborhood
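The three neighborhood sets can be written out in a few lines of Python (an illustration only, not from the book; coordinates are (row, column) and image borders are ignored for brevity):

```python
def n4(p):
    """4-neighborhood: pixels above, below, left, and right of p."""
    r, c = p
    return {(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)}

def n8(p):
    """8-neighborhood: all of p's immediate neighbors."""
    r, c = p
    return {(r + dr, c + dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1)
            if (dr, dc) != (0, 0)}

def nd(p):
    """Diagonal neighborhood: 8-neighbors that are not 4-neighbors."""
    return n8(p) - n4(p)

# N4 and ND together make up N8
p = (5, 5)
assert n4(p) | nd(p) == n8(p)
assert len(n4(p)) == 4 and len(nd(p)) == 4 and len(n8(p)) == 8
```

These helpers make the identity N4(p) ∪ ND(p) = N8(p) directly checkable.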
By Oge Marques Copyright © 2011 by John Wiley & Sons, Inc. All rights reserved.
Basic terminology
 Adjacency

 Two pixels p and q are 4-adjacent if they have the same value and are
4-neighbors of each other
 You can’t move diagonally

 Two pixels p and q are 8-adjacent if they have the same value and are
8-neighbors of one another.
 You can move diagonally, but this may give rise to redundant paths

 A third type of adjacency, known as mixed adjacency (a.k.a. m-adjacency), is often used to eliminate ambiguities (redundant paths) that may arise when 8-adjacency is used.

Basic terminology
 Mixed Adjacency

 Two pixels p and q are m-adjacent if they have the same value and
 q is in the set N4(p) OR
 q is in the set ND(p) AND the set N4(p) ∩ N4(q) contains no pixels of that value.

 m-adjacency is a modified form of 8-adjacency


 It is used to eliminate redundant paths
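The definition above can be rendered in Python as follows (a sketch, with V = {1} assumed; `value` is a hypothetical lookup function returning a pixel's value, not part of any library):

```python
def n4(p):
    r, c = p
    return {(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)}

def nd(p):
    r, c = p
    return {(r - 1, c - 1), (r - 1, c + 1), (r + 1, c - 1), (r + 1, c + 1)}

def m_adjacent(p, q, value):
    """p and q are m-adjacent if both have value 1 and either q is a
    4-neighbor of p, or q is a diagonal neighbor of p and no pixel of
    value 1 lies in N4(p) intersected with N4(q)."""
    if value(p) != 1 or value(q) != 1:
        return False
    if q in n4(p):
        return True
    if q in nd(p):
        return not any(value(t) == 1 for t in n4(p) & n4(q))
    return False

# Two diagonal 1s whose shared 4-neighbor also has value 1:
img = {(0, 0): 1, (0, 1): 1, (1, 0): 0, (1, 1): 1}
value = lambda p: img.get(p, 0)
assert m_adjacent((0, 0), (0, 1), value)      # 4-neighbors -> m-adjacent
assert not m_adjacent((0, 0), (1, 1), value)  # diagonal blocked by (0, 1)
```

The second assertion shows exactly how m-adjacency removes the redundant diagonal link when a 4-connected route already exists.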

Basic terminology
 Mixed Adjacency

Basic terminology

Basic terminology
 Paths
 A 4-path between two pixels p and q is a sequence of pixels starting with p
and ending with q such that each pixel in the sequence is 4-adjacent to its
predecessor in the sequence.
 An 8-path indicates that each pixel in the sequence is 8-adjacent to its
predecessor.
 Similarly, an m-path indicates that each pixel in the sequence is m-adjacent
to its predecessor.
 Connectivity
 If there is a 4-path between pixels p and q, they are said to be 4-connected.

 The existence of an 8-path between them means that they are 8-connected.

 If an m-path can be drawn between two pixels, they are m-connected.
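Connectivity can be tested with a breadth-first search over the chosen adjacency; a minimal Python sketch (illustration only; the binary image is a list of lists):

```python
from collections import deque

N4_STEPS = [(-1, 0), (1, 0), (0, -1), (0, 1)]
N8_STEPS = N4_STEPS + [(-1, -1), (-1, 1), (1, -1), (1, 1)]

def connected(grid, p, q, steps):
    """True if a path of same-valued pixels joins p and q using the
    given neighbor steps (N4_STEPS -> 4-connected, N8_STEPS -> 8-connected)."""
    v = grid[p[0]][p[1]]
    if grid[q[0]][q[1]] != v:
        return False
    seen, todo = {p}, deque([p])
    while todo:
        r, c = todo.popleft()
        if (r, c) == q:
            return True
        for dr, dc in steps:
            nr, nc = r + dr, c + dc
            if (0 <= nr < len(grid) and 0 <= nc < len(grid[0])
                    and (nr, nc) not in seen and grid[nr][nc] == v):
                seen.add((nr, nc))
                todo.append((nr, nc))
    return False

grid = [[1, 0, 0],
        [0, 1, 0],
        [0, 0, 1]]   # a diagonal line of 1s
assert not connected(grid, (0, 0), (2, 2), N4_STEPS)  # not 4-connected
assert connected(grid, (0, 0), (2, 2), N8_STEPS)      # but 8-connected
```

The diagonal line is the classic case of a pattern that is 8-connected but not 4-connected.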

Basic terminology
 Connectivity

Basic terminology
 Paths and Connectivity
4-connected paths
8-connected (but not 4-connected) paths
Pattern that is not 8-connected
Basic terminology
 Components in MATLAB:
 bwlabel – for labeling connected components in binary images
 label2rgb – assigns each labeled region a different color

 Result

Original (Binary) Image Result for 8-connectivity Result for 4-connectivity
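A rough pure-Python analog of what bwlabel computes, sketched here for illustration (this is a flood-fill labeling, not MATLAB's actual algorithm):

```python
def label_components(grid, conn=4):
    """Label the connected foreground (1) components of a binary grid,
    roughly what MATLAB's bwlabel does. Returns (labels, count)."""
    steps = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    if conn == 8:
        steps += [(-1, -1), (-1, 1), (1, -1), (1, 1)]
    h, w = len(grid), len(grid[0])
    labels = [[0] * w for _ in range(h)]
    count = 0
    for r in range(h):
        for c in range(w):
            if grid[r][c] == 1 and labels[r][c] == 0:
                count += 1                       # start a new component
                stack = [(r, c)]
                labels[r][c] = count
                while stack:                     # flood-fill it
                    cr, cc = stack.pop()
                    for dr, dc in steps:
                        nr, nc = cr + dr, cc + dc
                        if (0 <= nr < h and 0 <= nc < w
                                and grid[nr][nc] == 1 and labels[nr][nc] == 0):
                            labels[nr][nc] = count
                            stack.append((nr, nc))
    return labels, count

grid = [[1, 0],
        [0, 1]]                       # two diagonal foreground pixels
_, n4_count = label_components(grid, conn=4)
_, n8_count = label_components(grid, conn=8)
assert n4_count == 2 and n8_count == 1   # 8-connectivity merges them
```

This reproduces the behavior shown in the figure: the same binary image can yield fewer components under 8-connectivity than under 4-connectivity.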

Basic terminology
 Distances between pixels

Distance Measures
 The pixels with D4 = 1 are the 4-neighbors of p(x,y).

 D4 (city-block) distance: D4(p,q) = |x_p − x_q| + |y_p − y_q|. Pixels having a D4 distance from p(x,y) less than or equal to some value r form a diamond centered at p(x,y).

 D8 (chessboard) distance: D8(p,q) = max(|x_p − x_q|, |y_p − y_q|). Pixels having a D8 distance from p(x,y) less than or equal to some value r form a square centered at p(x,y).
(r = 2 in the example shown.)
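Both distances have simple closed forms; a Python sketch (illustration only) that also generates the diamond and square regions of radius r:

```python
def d4(p, q):
    """City-block distance: |x1 - x2| + |y1 - y2|."""
    return abs(p[0] - q[0]) + abs(p[1] - q[1])

def d8(p, q):
    """Chessboard distance: max(|x1 - x2|, |y1 - y2|)."""
    return max(abs(p[0] - q[0]), abs(p[1] - q[1]))

def region(center, r, dist):
    """All pixels within distance r of center:
    a diamond for d4, a square for d8."""
    cx, cy = center
    return {(x, y) for x in range(cx - r, cx + r + 1)
                   for y in range(cy - r, cy + r + 1)
                   if dist((x, y), center) <= r}

p = (0, 0)
assert d4(p, (1, 1)) == 2 and d8(p, (1, 1)) == 1  # a diagonal neighbor
assert len(region(p, 1, d4)) == 5    # p plus its 4-neighbors (D4 = 1)
assert len(region(p, 2, d8)) == 25   # a 5x5 square centered at p (r = 2)
```

Note that a diagonal neighbor is at D4 distance 2 but D8 distance 1, which is exactly why the two distances trace out different shapes.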
Basic terminology
 Dm Distance
 Defined as “the shortest m-path between two m-connected
pixels.”
 In this case, the distance between two pixels will depend on
the values of the pixels along the path, as well as the values
of their neighbors.
Example:
➢ Consider the following arrangement of pixels and assume that p, p2, and p4 have
value 1 and that p1 and p3 can have a value of 0 or 1.
➢ Suppose we consider adjacency between pixels of value 1 (i.e., V = {1})

Basic terminology
 Dm Distance Example (continued):
➢ Case 1: If p1 = 0 and p3 = 0, the length of the shortest m-path (the Dm distance) is --------
➢ Case 2: If p1 = 1 and p3 = 0, p2 and p are no longer m-adjacent (see the m-adjacency definition), so the length of the shortest m-path is --------
➢ Case 3: If p1 = 0 and p3 = 1, the same applies, and the shortest m-path is --------
➢ Case 4: If p1 = 1 and p3 = 1, the length of the shortest m-path is --------

Case-I Case-II Case-III Case-IV


Basic terminology
 Dm Distance Example (continued):
➢ Case 1: If p1 = 0 and p3 = 0, the length of the shortest m-path (the Dm distance) is 2 (p, p2, p4).
➢ Case 2: If p1 = 1 and p3 = 0, p2 and p are no longer m-adjacent (see the m-adjacency definition), so the length of the shortest m-path is 3 (p, p1, p2, p4).
➢ Case 3: If p1 = 0 and p3 = 1, the same applies, and the shortest m-path is 3 (p, p2, p3, p4).
➢ Case 4: If p1 = 1 and p3 = 1, the length of the shortest m-path is 4 (p, p1, p2, p3, p4).

Case-I Case-II Case-III Case-IV
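All four cases can be verified mechanically with a shortest-path search over m-adjacency. A Python sketch (illustration only), encoding the standard arrangement with p at the bottom-left: p=(2,0), p1=(1,0), p2=(1,1), p3=(0,1), p4=(0,2):

```python
from collections import deque

def n4(p):
    r, c = p
    return {(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)}

def nd(p):
    r, c = p
    return {(r - 1, c - 1), (r - 1, c + 1), (r + 1, c - 1), (r + 1, c + 1)}

def shortest_m_path(img, start, goal):
    """Length (number of moves) of the shortest m-path, or None."""
    def one(t):
        return img.get(t, 0) == 1
    def m_adj(p, q):
        if not (one(p) and one(q)):
            return False
        if q in n4(p):
            return True
        # diagonal move allowed only if no value-1 pixel in N4(p) & N4(q)
        return q in nd(p) and not any(one(t) for t in n4(p) & n4(q))
    dist, todo = {start: 0}, deque([start])
    while todo:
        p = todo.popleft()
        if p == goal:
            return dist[p]
        for q in n4(p) | nd(p):
            if q not in dist and m_adj(p, q):
                dist[q] = dist[p] + 1
                todo.append(q)
    return None

def dm(p1_val, p3_val):
    img = {(2, 0): 1, (1, 1): 1, (0, 2): 1,       # p, p2, p4 are always 1
           (1, 0): p1_val, (0, 1): p3_val}        # p1, p3 vary
    return shortest_m_path(img, (2, 0), (0, 2))

# Cases 1-4 from the slide:
assert [dm(0, 0), dm(1, 0), dm(0, 1), dm(1, 1)] == [2, 3, 3, 4]
```

The search reproduces the four answers above: setting p1 or p3 to 1 blocks the corresponding diagonal move and lengthens the shortest m-path.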


Overview of image processing operations
 Main Categories of Image Processing Operations
 Spatial domain
 Operations performed on original pixel values
 Global (a.k.a. point) operations
 Neighborhood-oriented (a.k.a. local or area) operations
 Operations combining multiple images

 Transform domain
 Operations are performed on transformed image

Global (point) operations
 Point operations apply the same mathematical function
(transformation function) to all pixels in a uniform
manner, regardless of their location in the image or the
values of their neighbors.

 Example: Contrast Adjustment

Global (point) operations
 Example: Overall Intensity Reduction by s = r/2
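As a minimal Python sketch of this point operation (a toy list-of-lists image with 0-255 values; MATLAB would express the same thing as a matrix operation):

```python
def point_op(image, f):
    """Apply the same transformation function f to every pixel,
    regardless of its location or its neighbors' values."""
    return [[f(r) for r in row] for row in image]

img = [[0, 64, 128],
       [192, 255, 32]]
darker = point_op(img, lambda r: r // 2)   # s = r/2 (integer division)
assert darker == [[0, 32, 64], [96, 127, 16]]
```

Because f depends only on the pixel's own value, any transformation function (contrast stretch, negative, thresholding) plugs into the same loop.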

Neighborhood-oriented operations
 Neighborhood-oriented operations consist of determining
the resulting pixel value at coordinates (x,y) as a function
of its original value and the value of (some of) its
neighbors, using a convolution operation.
 The convolution of a source image with a small 2D array
(mask or kernel or template or window) produces a
destination image in which each pixel value depends on its
original value and the value of (some of) its neighbors.
 The convolution mask determines which neighbors are
used as well as the relative weight of their original values.
 Example: Spatial Domain Filters

Neighborhood-oriented operations
 Masks are normally 3×3.
 Each mask coefficient can be interpreted as a weight.

Convolution
 Works just like 1D convolution, extended to both spatial dimensions.
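A direct Python sketch of 2D convolution with a 3×3 mask (an illustration, with zero padding at the borders assumed; the index flip is what distinguishes convolution from correlation):

```python
def convolve3x3(image, mask):
    """Each output pixel is the weighted sum of its 3x3 neighborhood,
    with weights taken from the (flipped) convolution mask."""
    h, w = len(image), len(image[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for j in (-1, 0, 1):
                for i in (-1, 0, 1):
                    yy, xx = y + j, x + i
                    if 0 <= yy < h and 0 <= xx < w:          # zero padding
                        acc += image[yy][xx] * mask[1 - j][1 - i]  # flipped
            out[y][x] = acc
    return out

avg = [[1/9] * 3 for _ in range(3)]      # 3x3 averaging (smoothing) mask
img = [[9, 9, 9],
       [9, 9, 9],
       [9, 9, 9]]
out = convolve3x3(img, avg)
assert abs(out[1][1] - 9.0) < 1e-9   # interior pixel: full-weight average
assert abs(out[0][0] - 4.0) < 1e-9   # corner: only 4 samples fall inside
```

Swapping the averaging mask for a different set of weights (e.g., a sharpening or edge mask) changes the filter without changing the loop.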
Operations combining multiple images
 There are many image processing applications that combine two images, pixel by pixel, using an arithmetic or logical operator, resulting in a third image, Z:
X opn Y = Z
 opn: an arithmetic (+, −, ×, ÷) or logical (AND, OR, XOR) operator
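X opn Y = Z can be sketched pixel by pixel in Python (illustration only; here with absolute difference, as used in change detection, and logical AND for binary masking):

```python
def combine(x, y, op):
    """Combine two equal-sized images pixel by pixel into a third image."""
    return [[op(a, b) for a, b in zip(rx, ry)] for rx, ry in zip(x, y)]

x = [[10, 20], [30, 40]]
y = [[10, 5], [30, 0]]
diff = combine(x, y, lambda a, b: abs(a - b))                 # change detection
mask = combine(x, y, lambda a, b: int(bool(a) and bool(b)))   # logical AND
assert diff == [[0, 15], [0, 40]]
assert mask == [[1, 1], [1, 0]]
```

Any of the listed operators (+, −, ×, ÷, AND, OR, XOR) can be passed as `op` without changing the combining loop.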

Operations in a transform domain
 A transform is a mathematical tool that allows the
conversion of a set of values to another set of values,
creating, therefore, a new way of representing the same information, e.g., the 2D Fourier transform.

 Example: Frequency Domain Filters

Image sensing and acquisition
What will we learn?
 What are the main parameters involved in the design of an
image acquisition solution?

 How do contemporary image sensors work?

 What is image digitization and what are the main parameters that impact the digitization of an image or video clip?

 What is sampling?

 What is quantization?

 How can I use MATLAB to resample or requantize an image?
Image acquisition, formation, and
digitization
 Image: 2D representation of a 3D object.
How do we go from a 3D scene/object to a 2D digital image?

 Light source: illuminates the object; it is essential for image formation.
 Imaging system:
 Front end (e.g., optical lens): collects the incoming energy and focuses it onto an image plane
 Sensors: capture the reflected energy and convert the optical information into its electrical equivalent (analog voltage readings)
 Image digitization: an appropriate number of (horizontal and vertical) samples and an appropriate number of quantization levels are selected
Image acquisition, formation, and
digitization
Light-sensitive cells (a different light intensity falls on each cell)
Light, Color, and EM Spectrum
The existence of light (or other forms of EM radiation) is
an essential requirement for an image to be created,
captured, and perceived.

Light, Color, and EM Spectrum

Light:
• A photon is a tiny packet of vibrating EM energy.
• EM waves can be visualized as propagating sinusoidal waves of a certain frequency f (or wavelength 𝜆), or they can be thought of as a stream of massless particles called photons, each traveling in a wavelike pattern and moving at the speed of light.
• Photon energy: 𝐸 = ℎf (this is why very high-frequency gamma rays are so dangerous to living organisms)
• Photon wavelength: 𝜆 = 𝑣/𝑓
• The human visual system (HVS) is sensitive to photons of wavelengths between 400 and 700 nm.
• Here we exclusively focus on images within the visible range of the EM spectrum.
• Light is the preferred energy source for most imaging tasks because:
• it is safe, cheap, and easy to control and process with optical hardware,
• it is easy to detect using relatively inexpensive sensors, and
• it can be readily processed by signal processing hardware.
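The photon-energy relation E = hf can be checked numerically (a Python illustration; constants are rounded and the wavelengths are chosen only for the example):

```python
h = 6.626e-34        # Planck's constant, J*s (rounded)
c = 3.0e8            # speed of light, m/s (rounded)

def photon_energy(wavelength_m):
    """E = h * f, with f = c / wavelength (propagation in vacuum assumed)."""
    return h * (c / wavelength_m)

E_green = photon_energy(550e-9)      # visible green light, ~550 nm
E_xray = photon_energy(1e-10)        # hard X-ray, ~0.1 nm
assert 3.5e-19 < E_green < 3.7e-19   # roughly 3.6e-19 J per photon
assert E_xray > 1000 * E_green       # shorter wavelength -> far more energy
```

The last assertion makes the slide's point concrete: photon energy grows as wavelength shrinks, which is why gamma rays and X-rays carry so much more energy per photon than visible light.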

Types of images

 Reflection images
 Emission images
 Absorption images

Types of images
 Reflection images: formed as the result of (ambient or artificial) radiation that has been reflected from the surfaces of objects, e.g., most of the images we perceive in our daily life.
 The information that can be extracted is primarily about the surfaces of objects, for example, their shapes, colors, and textures.
 Emission images: formed by objects that are self-luminous, such as stars and light bulbs (within the visible light range), and thermal or infrared images (beyond the visible light range) for night vision.
 Absorption images: formed as the result of radiation that passes through an object; they provide information about the object's internal structure. The most common example is the X-ray image.

Light and Color Perception

Light and Color Perception
• Light is a particular type of EM radiation that can be sensed by
the human eye.
• Colors perceived by humans are determined by:
• the nature of the light reflected by the object, which is a
function of the spectral properties of the light source, and
• the absorption and reflectance properties of the object.

 Newton's experiments showed that sunlight is actually composed of many different "colors" of light rather than just one.
 The "colors" were not in the light itself but in the effect of the light on the visual system.

Radiance of Light Sources
• The radiance (physical power) of a light source is
expressed in terms of its spectral power distribution
(SPD).
• Commonly found light sources in imaging systems are:
• Sunlight,
• Tungsten lamp,
• Light-Emitting Diode (LED),
• Mercury Arc lamp, and
• Helium–Neon laser.
• The human perception of each of these light sources
will vary—from the yellowish nature of light produced
by tungsten light bulbs to the extremely bright and
pure red laser beam.

Spectral Power Distribution (SPD) of
Common Physical Light Sources

Spectral Power Distribution (SPD) of
Common Physical Light Sources

Visible Light

Radiance of Light Sources
The blue light has a wavelength of about 475 nm. Because
the blue wavelengths are shorter in the visible spectrum,
they are scattered more efficiently by the molecules in
the atmosphere. This causes the sky to appear blue.
The visible green light has a wavelength of about 510 nm.
Grass appears green because the leaves absorb all of the colors
in the visible part of the spectrum except green, which is
reflected back to the observer.
The visible yellow light has a wavelength of about 570 nm.
Low-pressure sodium lamps, like those used in some
parking lots, emit a yellow (wavelength 589 nm) light.
The visible red light has a wavelength of about 650 nm.
At sunrise and sunset, red or orange colors are present
because the wavelengths associated with these colors are
less efficiently scattered by the atmosphere than the
shorter wavelength colors (e.g., blue and purple). A large
amount of blue and violet light has been removed as a
result of scattering and the longwave colors, such as red
and orange, are more readily seen.
Radiance of Light Sources
At 6,000 K, which is the temperature of the Sun's surface, the peak color is yellow and there is a good production of visible light. This is why our eyes have evolved to see this range of light: it is what the Sun produces the most of.

 Do it yourself

Color Encoding and Representation
 Color can be encoded using three numerical components
and appropriate spectral weighting functions.
 Colorimetry is the science that deals with the
quantitative study of color perception.
 It is concerned with the representation of tristimulus
values (amounts of R, G, and B), from which the perception
of color is derived.
 The simplest way to encode color in cameras and displays is
by using the red (R), green (G), and blue (B) values of each
pixel.

Color properties
 Human perception of light (and hence color) is commonly described in terms of three parameters: brightness, hue, and saturation.

 Brightness:
 subjective perception of luminous intensity, or “the
attribute of a visual sensation according to which an area
appears to emit more or less light”.

 Hue:
 “the attribute of a visual sensation according to which an
area appears to be similar to one of the perceived colors, red,
yellow, green and blue, or a combination of two of them”.
 the dominant wavelength of an SPD.

Color properties
 Saturation
 “the colorfulness of an area judged in proportion to its
brightness”.
 From a spectral viewpoint, the more an SPD is
concentrated at one wavelength, the more saturated will
be the associated color. Hence Saturation is the “purity” of
color.
 The addition of white light (which contains power at all wavelengths) causes color desaturation.
Hue and Saturation taken together are called
“chromaticity” and hence a color may be
characterized by its brightness and chromaticity.

Color properties – MS Paint:

Image acquisition
 Two main building blocks of image acquisition are:
 Image Sensor
 Camera Optics

Image sensors
 Goal: to convert EM energy into electrical signals that
can be processed, displayed and/or interpreted as
images.

 Common technologies:
 CCDs (charge-coupled devices)
 CMOS (complementary metal oxide semiconductor)

Image sensors
 Goal: to convert EM energy

A CCD sensor is made up of an array of light-sensitive cells called photosites, manufactured in silicon, each of which produces a voltage proportional to the intensity of light falling on it. A photosite has a finite capacity of about 10^6 energy carriers, which imposes an upper bound on the brightness of the objects to be imaged.
Image sensors
 Goal: to convert EM energy

 Digital cameras
typically use 2D
(area) CCD sensors.

 Flat-bed scanners
employ 1D (line)
CCDs that move
across the image as
each row is
scanned.

CCD image sensors
 The nominal resolution of a CCD sensor is the size of the
scene element that images to a single pixel on the image
plane.
 Example: if a 20 cm × 20 cm sheet of paper is imaged to form a 500 × 500 digital image, then the nominal resolution of the sensor is 0.04 cm (400 cm² / 250,000 pixels = 0.0016 cm² per pixel, i.e., 0.04 cm per side).
 The field of view (FOV) of an imaging sensor is a measure
of how much of a scene it can see, e.g. 10 cm × 10 cm.
 Since this may vary with depth, it is often more meaningful to
refer to the angular field of view, e.g. 55° × 40°.
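The nominal-resolution arithmetic from the example above can be checked directly (Python; the numbers are taken from the 20 cm sheet / 500-pixel example):

```python
# Nominal resolution: the size of the scene element that maps to one pixel.
sheet_cm, pixels = 20.0, 500        # a 20 cm sheet imaged across 500 pixels
nominal = sheet_cm / pixels         # cm of scene per pixel side
assert abs(nominal - 0.04) < 1e-12

# Equivalent area check: 400 cm^2 over 250,000 pixels = 0.0016 cm^2 each
area_per_pixel = (sheet_cm ** 2) / pixels ** 2
assert abs(area_per_pixel - 0.0016) < 1e-12
assert abs(area_per_pixel ** 0.5 - nominal) < 1e-12   # side = sqrt(area)
```

Either route (linear size per pixel, or area per pixel and then the square root) gives the same 0.04 cm figure.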

Color CCD cameras
 Bayer pattern
(single-CCD
cameras)

Color CCD cameras
 Beam splitter (three-CCD cameras)

CMOS cameras

 X3 color
sensor
(Foveon)

Camera optics

Focal length (f) Magnification Factor (m)

Camera optics
 Aberrations
 (a) Pincushion
 (b) Barrel

MATLAB Image Acquisition Toolbox

Pixel arrays of several imaging
standards

Image digitization

 Digitization = sampling + quantization

Sampling

 The process of measuring the value of a 2D function at discrete intervals along the x and y dimensions.
 Parameters:
 Sampling rate: number of samples across the height
and width of the image.
 Sampling pattern: physical arrangement of the
samples.
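The sampling step can be sketched in Python (a toy illustration: the "scene" is a stand-in mathematical function rather than a real optical image, and a rectangular sampling pattern is assumed):

```python
import math

def sample(f, width, height, nx, ny):
    """Sample f(x, y) on an nx-by-ny rectangular grid over
    [0, width) x [0, height); nx and ny set the sampling rate."""
    dx, dy = width / nx, height / ny          # sampling intervals
    return [[f(i * dx, j * dy) for i in range(nx)] for j in range(ny)]

f = lambda x, y: math.sin(x) * math.cos(y)    # stand-in continuous scene
img = sample(f, 2 * math.pi, 2 * math.pi, 8, 8)
assert len(img) == 8 and len(img[0]) == 8     # 8x8 samples
assert abs(img[0][0]) < 1e-12                 # f(0, 0) = sin(0)*cos(0) = 0
```

Raising nx and ny increases the sampling rate and captures finer spatial detail; lowering them produces the pixelation artifacts discussed under spatial resolution.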

Quantization
 The process of replacing a continuously varying
function with a discrete set of quantization levels.
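Quantization can likewise be sketched in Python (an illustration of a uniform quantizer over 8-bit values, not any particular hardware's scheme):

```python
def quantize(values, levels):
    """Map 8-bit values (0-255) onto `levels` evenly spaced
    quantization levels, returning the level index of each value."""
    step = 256 / levels                 # width of each quantization bin
    return [int(v // step) for v in values]

row = [0, 50, 100, 150, 200, 255]
assert quantize(row, 4) == [0, 0, 1, 2, 3, 3]   # 4 levels: coarse
assert quantize(row, 2) == [0, 0, 0, 1, 1, 1]   # 2 levels: binary
```

Fewer levels mean coarser bins, which is what produces the false contouring visible in heavily requantized images such as the grayslice results later in this section.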

Spatial resolution
 A way of expressing the density of pixels in an
image using units such as dots per inch (dpi).
 The greater the spatial resolution, the more pixels
are used to display the image within a certain fixed
physical size.
 Artifacts associated with poor spatial resolution :
 Pixelation
 Jaggedness
 Loss of detail
 Moiré patterns

Spatial resolution – Example 5.1

Requantization in MATLAB
 grayslice function

 Example 5.2 (code)

I1 = imread('ml_gray_640_by_480_256.png');             % 8-bit (256-level) grayscale image
I2 = grayslice(I1,128); figure, imshow(I2,gray(128));  % requantize to 128 levels
I3 = grayslice(I1,64);  figure, imshow(I3,gray(64));   % 64 levels
I4 = grayslice(I1,32);  figure, imshow(I4,gray(32));   % 32 levels
I5 = grayslice(I1,16);  figure, imshow(I5,gray(16));   % 16 levels
I6 = grayslice(I1,8);   figure, imshow(I6,gray(8));    % 8 levels
I7 = grayslice(I1,4);   figure, imshow(I7,gray(4));    % 4 levels
I8 = grayslice(I1,2);   figure, imshow(I8,gray(2));    % 2 levels (binary)

Requantization
in MATLAB

 Example 5.2
(results)

Arithmetic and Logical Operations
