Lecture 1
Introduction & Fundamentals
Spring 2008
Introduction to the course
http://www.cs.nmt.edu/~ip
Office Hours:
Wednesday 9-11 AM, Cramer 231A
or by appointment via email:
liu@cs.nmt.edu or liuqzsc@nmt.edu
Textbooks
Weeks 1 & 2 2
Introduction to the course
Grading
Article Reading and Presentation: 15%
Homework: 20%
Exam: 15%
Project: 50%
Total: 100%
Extra credit: 50%. If your project's method and experimental
results achieve state-of-the-art performance, you will earn the
extra 50% credit.
Introduction to the course
Article Reading and Project
Medical image analysis (MRI/PET/CT/X-ray tumor
detection/classification)
Face, fingerprint, and other object recognition
Image and/or video compression
Image segmentation and/or denoising
Digital image/video watermarking/steganography and
detection
Whatever you're interested in
Introduction to the course
Evaluation of article reading and project
Report
Article reading
Submit a survey of the articles you read and the list of the
articles
Project
Submit an article including introduction, methods,
experiments, results, and conclusions
Submit the project code, the readme document, and some
testing samples (images, videos, etc.) for validation
Presentation
Journals & Conferences
in Image Processing
Journals:
IEEE Transactions on Image Processing
IEEE Transactions on Medical Imaging
International Journal of Computer Vision
IEEE Transactions on Pattern Analysis and Machine Intelligence
Pattern Recognition
Computer Vision and Image Understanding
Image and Vision Computing
Conferences:
CVPR: IEEE Conference on Computer Vision and Pattern Recognition
ICCV: International Conference on Computer Vision
ACM Multimedia
ICIP: IEEE International Conference on Image Processing
SPIE
ECCV: European Conference on Computer Vision
CAIP: International Conference on Computer Analysis of Images and Patterns
Introduction
What is Digital Image Processing?
Digital Image
a two-dimensional function f(x, y)
x and y are spatial coordinates
The amplitude of f at any pair of coordinates (x, y) is called the intensity or gray level at that point
Pixel
the elements of a digital image
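As a concrete illustration (not from the slides), a gray-scale digital image can be held as a 2-D NumPy array, so that evaluating f(x, y) is just an array lookup; the values below are made up:

```python
import numpy as np

# A digital image as a two-dimensional function f(x, y): a small 4x4
# gray-scale image stored as a NumPy array. Each element is a pixel;
# its value is the intensity (gray level) at that spatial coordinate.
f = np.array([
    [  0,  50, 100, 150],
    [ 10,  60, 110, 160],
    [ 20,  70, 120, 170],
    [ 30,  80, 130, 180],
], dtype=np.uint8)

print(f.shape)   # image dimensions (rows, columns)
print(f[2, 3])   # intensity at coordinates x = 2, y = 3
```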
Origins of Digital Image Processing
Sources for Images
Electromagnetic (EM) energy spectrum
Acoustic
Ultrasonic
Electronic
Synthetic images produced by computer
Electromagnetic (EM) energy spectrum
Major uses
Gamma-ray imaging: nuclear medicine and astronomical observations
X-rays: medical diagnostics, industry, and astronomy, etc.
Ultraviolet: lithography, industrial inspection, microscopy, lasers, biological imaging,
and astronomical observations
Visible and infrared bands: light microscopy, astronomy, remote sensing, industry,
and law enforcement
Microwave band: radar
Radio band: medicine (such as MRI) and astronomy
Examples: Gamma-Ray Imaging
Examples: X-Ray Imaging
Examples: Ultraviolet Imaging
Examples: Light Microscopy Imaging
Examples: Visual and Infrared Imaging
Examples: Infrared Satellite Imaging
(satellite images of the USA, 1993 and 2003)
Examples: Automated Visual Inspection
Examples: Automated Visual Inspection
Results of automated reading of the plate content by the system
Example of Radar Image
Examples: MRI (Radio Band)
Examples: Ultrasound Imaging
Fundamental Steps in DIP
(diagram callouts recovered from the figure:)
Image acquisition: represent the image for computer processing
Image enhancement: improving the appearance; the result is more suitable than the original
Image segmentation: partition an image into its constituent parts or objects
Extracting image components
Light and EM Spectrum
λ = c / ν,  E = h ν  (h: Planck's constant)
Light and EM Spectrum
Monochromatic light: light void of color
Intensity is its only attribute, varying from black to white
Monochromatic images are therefore referred to as gray-scale
images
Image Acquisition
Transform illumination energy into digital images
Image Acquisition Using a Single Sensor
Image Acquisition Using Sensor Strips
Image Acquisition Process
A Simple Image Formation Model
f(x, y) = i(x, y) · r(x, y)
where i(x, y) is the illumination and r(x, y) the reflectance at the point (x, y)
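A minimal numerical sketch of this formation model; the illumination and reflectance values below are made up for illustration, with r bounded in (0, 1) as reflectance must be:

```python
import numpy as np

# Simple image formation model: f(x, y) = i(x, y) * r(x, y).
i = np.full((3, 3), 5000.0)          # uniform illumination (lm/m^2)
r = np.array([[0.01, 0.65, 0.93],    # reflectance: near-black to near-white
              [0.20, 0.50, 0.80],
              [0.05, 0.35, 0.90]])
f = i * r                            # element-wise product
print(f.min(), f.max())              # darkest and brightest pixels
```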
Some Typical Ranges of illumination
Illumination
Lumen: a unit of light flow, or luminous flux
Lumen per square meter (lm/m²): the metric unit of measure for the illuminance of a surface
On a cloudy day, the sun may produce less than 10,000 lm/m² of illumination on the surface of the Earth
Some Typical Ranges of Reflectance
Reflectance
Image Sampling and Quantization
Sampling: digitizing the coordinate values
Quantization: digitizing the amplitude values
Representing Digital Images
The number of bits required to store an M × N digital image with k bits per intensity value is
b = M × N × k
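The storage formula b = MNk can be checked with a quick computation (the 1024 × 1024 example size is mine):

```python
# Storage required for an M x N image with k bits per pixel: b = M * N * k.
def image_bits(M, N, k):
    return M * N * k

# e.g., a 1024 x 1024 image with 8 bits per pixel:
b = image_bits(1024, 1024, 8)
print(b, "bits =", b // 8, "bytes")
```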
Spatial and Intensity Resolution
Spatial resolution
A measure of the smallest discernible detail in an image
Commonly stated in line pairs per unit distance, or dots (pixels) per unit distance, e.g., dots per inch (dpi)
Intensity resolution
The smallest discernible change in intensity level
Commonly stated in bits, e.g., 8 bits, 12 bits, or 16 bits
Image Interpolation
http://www.dpreview.com/learn/?/key=interpolation
Image Interpolation:
Nearest Neighbor Interpolation
Each output location takes the value of its nearest input pixel:
f1(x2, y2) = f(round(x2), round(y2)) = f(x1, y1)
f1(x3, y3) = f(round(x3), round(y3)) = f(x1, y1)
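A minimal sketch of nearest-neighbor resizing under this rule (function name and the tiny test image are mine):

```python
import numpy as np

# Nearest-neighbor interpolation: each output pixel takes the value of the
# closest input pixel, f1(x, y) = f(round(x), round(y)).
def nearest_neighbor_resize(f, new_h, new_w):
    h, w = f.shape
    out = np.empty((new_h, new_w), dtype=f.dtype)
    for x in range(new_h):
        for y in range(new_w):
            # map the output coordinate back into the input grid and round
            src_x = min(int(round(x * h / new_h)), h - 1)
            src_y = min(int(round(y * w / new_w)), w - 1)
            out[x, y] = f[src_x, src_y]
    return out

f = np.array([[10, 20],
              [30, 40]], dtype=np.uint8)
g = nearest_neighbor_resize(f, 4, 4)   # upsample 2x2 -> 4x4
print(g)
```

Note the characteristic blocky output: each input pixel is simply replicated.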
Image Interpolation:
Bilinear Interpolation
f2(x, y) = (1 − a)(1 − b) f(l, k) + a(1 − b) f(l+1, k)
         + (1 − a) b f(l, k+1) + a b f(l+1, k+1)
where l = floor(x), k = floor(y), a = x − l, b = y − k.
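The formula above translates directly into code; a minimal sketch (the 2 × 2 test array is mine):

```python
import numpy as np

# Bilinear interpolation, following the slide's formula:
#   f2(x, y) = (1-a)(1-b) f(l, k)   + a(1-b) f(l+1, k)
#            + (1-a) b   f(l, k+1) + a b    f(l+1, k+1)
# with l = floor(x), k = floor(y), a = x - l, b = y - k.
def bilinear(f, x, y):
    l, k = int(np.floor(x)), int(np.floor(y))
    a, b = x - l, y - k
    return ((1 - a) * (1 - b) * f[l, k]
            + a * (1 - b) * f[l + 1, k]
            + (1 - a) * b * f[l, k + 1]
            + a * b * f[l + 1, k + 1])

f = np.array([[0.0, 10.0],
              [20.0, 30.0]])
print(bilinear(f, 0.5, 0.5))  # midpoint: average of the four neighbors
```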
Image Interpolation:
Bicubic Interpolation
The intensity value assigned to point (x,y) is obtained by
the following equation
f3(x, y) = Σ (i = 0 to 3) Σ (j = 0 to 3) a_ij · x^i · y^j
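The slide gives bicubic interpolation in its general 16-coefficient form. One common practical realization (an assumption here, not the slide's own derivation) is cubic convolution with the Keys kernel (a = −0.5), applied one axis at a time; a 1-D sketch:

```python
import numpy as np

# Keys cubic-convolution kernel (a = -0.5), a standard way to realize
# (bi)cubic interpolation without solving for the a_ij coefficients.
def keys_kernel(s, a=-0.5):
    s = abs(s)
    if s < 1:
        return (a + 2) * s**3 - (a + 3) * s**2 + 1
    if s < 2:
        return a * s**3 - 5 * a * s**2 + 8 * a * s - 4 * a
    return 0.0

def cubic_interp(samples, x):
    # samples are values at integer positions; valid for 1 <= x <= len-2
    l = int(np.floor(x))
    t = x - l
    return sum(samples[l + m] * keys_kernel(m - t) for m in (-1, 0, 1, 2))

# The kernel's weights sum to 1 and reproduce linear functions exactly:
samples = [2 * i + 3 for i in range(6)]   # f(i) = 2i + 3
print(cubic_interp(samples, 2.3))          # should be 2*2.3 + 3 = 7.6
```

For 2-D images the same kernel is applied along rows and then columns, using a 4 × 4 pixel neighborhood per output point.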
Examples: Interpolation
Basic Relationships Between Pixels
Neighborhood
Adjacency
Connectivity
Paths
Basic Relationships Between Pixels
Adjacency
Let V be the set of intensity values
m-adjacency (mixed adjacency): two pixels p and q with values from V are m-adjacent if
(i) q is in the set N4(p), or
(ii) q is in the set ND(p) and the set N4(p) ∩ N4(q) has no pixels whose
values are from V.
Basic Relationships Between Pixels
Path
A (digital) path (or curve) from pixel p with coordinates (x0, y0) to pixel
q with coordinates (xn, yn) is a sequence of distinct pixels with
coordinates (x0, y0), (x1, y1), …, (xn, yn), where pixels (xi, yi) and
(xi−1, yi−1) are adjacent for 1 ≤ i ≤ n.
We can define 4-, 8-, and m-paths based on the type of adjacency
used.
Examples: Adjacency and Path
V = {1, 2}

0 1 1
0 2 0
0 0 1

(rows and columns are numbered 1–3, so the top-right pixel is (1,3))

The 8-path from (1,3) to (3,3):
(i) (1,3), (1,2), (2,2), (3,3)
(ii) (1,3), (2,2), (3,3)
The m-path from (1,3) to (3,3):
(1,3), (1,2), (2,2), (3,3)
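The example can be verified mechanically; this sketch uses 0-based coordinates (so the slide's (1,3) becomes (0,2)), and the helper names are mine:

```python
import numpy as np

# Check 8- and m-adjacency on the slide's 3x3 example with V = {1, 2}.
A = np.array([[0, 1, 1],
              [0, 2, 0],
              [0, 0, 1]])
V = {1, 2}

def n4(p):                      # the four horizontal/vertical neighbors
    x, y = p
    return {(x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)}

def nd(p):                      # the four diagonal neighbors
    x, y = p
    return {(x - 1, y - 1), (x - 1, y + 1), (x + 1, y - 1), (x + 1, y + 1)}

def in_v(p):                    # pixel inside the array with value in V
    x, y = p
    return 0 <= x < 3 and 0 <= y < 3 and A[x, y] in V

def adjacent8(p, q):
    return in_v(p) and in_v(q) and q in (n4(p) | nd(p))

def adjacent_m(p, q):
    if not (in_v(p) and in_v(q)):
        return False
    if q in n4(p):
        return True
    # diagonal case: no shared 4-neighbor may have a value from V
    return q in nd(p) and not any(in_v(r) for r in n4(p) & n4(q))

print(adjacent8((0, 2), (1, 1)))   # diagonal neighbors: 8-adjacent
print(adjacent_m((0, 2), (1, 1)))  # shared 4-neighbor (0,1) is in V: not m-adjacent
```

This is exactly why the m-path must detour through (1,2): the direct diagonal step (1,3) to (2,2) is ruled out by the shared 4-neighbor.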
Basic Relationships Between Pixels
Connected in S
Let S represent a subset of pixels in an image. Two pixels
p with coordinates (x0, y0) and q with coordinates (xn, yn)
are said to be connected in S if there exists a path between them
consisting entirely of pixels in S.
Basic Relationships Between Pixels
The boundary of the region R is the set of pixels in the region that
have one or more neighbors that are not in R.
If R happens to be an entire image, then its boundary is defined as the
set of pixels in the first and last rows and columns of the image.
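A minimal sketch of the boundary definition above; the slide does not fix which neighborhood to use, so 4-neighbors are assumed here, and the tiny region is mine:

```python
import numpy as np

# Boundary of a region R: the pixels of R that have at least one
# 4-neighbor outside R (neighborhood choice is an assumption).
R = np.array([[0, 0, 0, 0],
              [0, 1, 1, 0],
              [0, 1, 1, 0],
              [0, 0, 0, 0]])

def boundary(R):
    h, w = R.shape
    out = set()
    for x in range(h):
        for y in range(w):
            if R[x, y] != 1:
                continue
            for nx, ny in ((x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)):
                # neighbors outside the array count as outside R
                if not (0 <= nx < h and 0 <= ny < w) or R[nx, ny] == 0:
                    out.add((x, y))
                    break
    return out

print(sorted(boundary(R)))  # every pixel of this small region is on its boundary
```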
Question 1
1 1 1
1 0 1
0 1 0
0 0 1
1 1 1
1 1 1
(Region 1: top three rows; Region 2: bottom three rows)
Question 2
1 1 1
1 0 1
0 1 0
0 0 1
1 1 1
1 1 1
(Part 1: top three rows; Part 2: bottom three rows)
In the following arrangement of pixels, the two
regions (of 1s) are disjoint (if 4-adjacency is used)
1 1 1
1 0 1
0 1 0
0 0 1
1 1 1
1 1 1
(Region 1: top three rows; Region 2: bottom three rows)
In the following arrangement of pixels, the two
regions (of 1s) are disjoint (if 4-adjacency is used)
1 1 1
1 0 1
0 1 0
0 0 1
1 1 1
1 1 1
(foreground: top three rows; background: bottom three rows)
Question 3
0 0 0 0 0
0 1 1 0 0
0 1 1 0 0
0 1 1 1 0
0 1 1 1 0
0 0 0 0 0
Question 4
0 0 0 0 0
0 1 1 0 0
0 1 1 0 0
0 1 1 1 0
0 1 1 1 0
0 0 0 0 0
Distance Measures
For pixels p, q, and z with coordinates (x, y), (s, t), and (v, w), respectively, D is a distance function (metric) if
a. D(p, q) ≥ 0 (D(p, q) = 0 iff p = q),
b. D(p, q) = D(q, p), and
c. D(p, z) ≤ D(p, q) + D(q, z).
Euclidean distance:
De(p, q) = [(x − s)² + (y − t)²]^(1/2)
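A one-line worked instance of the Euclidean distance (the function name and sample points are mine):

```python
import math

# Euclidean distance between pixels p = (x, y) and q = (s, t):
#   De(p, q) = [(x - s)^2 + (y - t)^2]^(1/2)
def de(p, q):
    (x, y), (s, t) = p, q
    return math.sqrt((x - s) ** 2 + (y - t) ** 2)

print(de((0, 0), (3, 4)))  # the 3-4-5 right triangle: distance 5.0
```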
Question 5
0 0 0 0 0
0 0 1 1 0
0 1 1 0 0
0 1 0 0 0
0 0 0 0 0
0 0 0 0 0
Question 6
0 0 0 0 0
0 0 1 1 0
0 1 1 0 0
0 1 0 0 0
0 0 0 0 0
0 0 0 0 0
Question 7
0 0 0 0 0
0 0 1 1 0
0 1 1 0 0
0 1 0 0 0
0 0 0 0 0
0 0 0 0 0
Question 8
0 0 0 0 0
0 0 1 1 0
0 0 1 0 0
0 1 0 0 0
0 0 0 0 0
0 0 0 0 0
Introduction to Mathematical Operations in DIP
Array vs. Matrix Operation
Linear vs. Nonlinear Operation
Given an operator H with H[f(x, y)] = g(x, y), consider
H[a_i f_i(x, y) + a_j f_j(x, y)]
  = H[a_i f_i(x, y)] + H[a_j f_j(x, y)]       (additivity)
  = a_i H[f_i(x, y)] + a_j H[f_j(x, y)]       (homogeneity)
  = a_i g_i(x, y) + a_j g_j(x, y)
If both properties hold, H is said to be a linear operator;
H is said to be a nonlinear operator if it does not meet the
above qualification.
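This test is easy to run numerically. The sum operator passes it, while the max operator fails; the two small arrays below are a standard counterexample, not from the slide itself:

```python
import numpy as np

# Linearity check: does H[a1 f1 + a2 f2] equal a1 H[f1] + a2 H[f2]?
f1 = np.array([[0, 2], [2, 3]])
f2 = np.array([[6, 5], [4, 7]])
a1, a2 = 1, -1

# Sum operator: the identity holds, so sum is linear.
assert np.sum(a1 * f1 + a2 * f2) == a1 * np.sum(f1) + a2 * np.sum(f2)

# Max operator: the identity fails, so max is nonlinear.
lhs = np.max(a1 * f1 + a2 * f2)
rhs = a1 * np.max(f1) + a2 * np.max(f2)
print(lhs, rhs)  # the two sides differ
```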
Arithmetic Operations
Example: Addition of Noisy Images for Noise Reduction
Let g_i(x, y) = f(x, y) + n_i(x, y), i = 1, …, K, be K noisy observations of the same scene, where the noise n_i is zero-mean and uncorrelated. Averaging the observations gives

g_bar(x, y) = (1/K) Σ (i = 1 to K) g_i(x, y)

E{g_bar(x, y)} = (1/K) Σ (i = 1 to K) E{f(x, y) + n_i(x, y)}
             = f(x, y) + (1/K) Σ (i = 1 to K) E{n_i(x, y)}
             = f(x, y)

σ²_g_bar(x, y) = (1/K²) Σ (i = 1 to K) σ²_n_i(x, y) = (1/K) σ²_n(x, y)

So the expected value of the average equals the noise-free image, and the noise variance decreases by a factor of K.
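A simulation sketch of this result, using synthetic Gaussian noise of my own choosing (σ = 10, K = 100): the residual noise in the averaged image should shrink by roughly √K:

```python
import numpy as np

# Averaging K noisy observations of the same scene reduces the noise
# standard deviation by about sqrt(K).
rng = np.random.default_rng(0)
f = np.full((64, 64), 100.0)                  # noise-free image
K = 100
noisy = [f + rng.normal(0.0, 10.0, f.shape) for _ in range(K)]
g_bar = np.mean(noisy, axis=0)                # the averaged image

err_single = np.std(noisy[0] - f)             # roughly sigma = 10
err_mean = np.std(g_bar - f)                  # roughly 10 / sqrt(K) = 1
print(err_single, err_mean)
```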
An Example of Image Subtraction: Mask Mode Radiography
An Example of Image Multiplication
Set and Logical Operations
Let A be the set of elements of a gray-scale image
The elements of A are triplets of the form (x, y, z), where
x and y are spatial coordinates and z denotes the intensity
at the point (x, y):
A = {(x, y, z) | z = f(x, y)}
The complement of A is denoted Ac:
Ac = {(x, y, K − z) | (x, y, z) ∈ A}
K = 2^k − 1; k is the number of intensity bits used to represent z
Set and Logical Operations
The union of two gray-scale images (sets) A and B is
defined as the set
A ∪ B = {max_z(a, b) | a ∈ A, b ∈ B}
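Both gray-scale set operations reduce to element-wise array operations; a minimal sketch with two made-up 8-bit images:

```python
import numpy as np

# Gray-scale set operations: the complement replaces each intensity z by
# K - z (K = 2^k - 1), and the union of two images is the element-wise max.
k = 8
K = 2 ** k - 1                       # 255 for 8-bit images
A = np.array([[0, 100], [200, 255]], dtype=np.uint8)
B = np.array([[50, 120], [180, 255]], dtype=np.uint8)

A_c = (K - A).astype(np.uint8)       # complement: 0 -> 255, 255 -> 0
union = np.maximum(A, B)             # union: element-wise maximum

print(A_c)
print(union)
```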
Spatial Operations
Single-pixel operations
Alter the values of an image's pixels based on their intensity:
s = T(z)
e.g., computing the negative of an 8-bit image with s = 255 − z
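A single-pixel operation applies T independently to every pixel; the image negative is one standard instance (the tiny test image is mine):

```python
import numpy as np

# Single-pixel operation s = T(z): the negative of an 8-bit image,
# T(z) = 255 - z, applied element-wise to every pixel at once.
z = np.array([[0, 64], [128, 255]], dtype=np.uint8)
s = 255 - z
print(s)
```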
Spatial Operations
Neighborhood operations
Geometric spatial transformations consist of two basic operations:
(1) a spatial transformation of coordinates, (x, y) = T{(v, w)}, and
(2) intensity interpolation that assigns intensity values to the spatially
transformed pixels.
Affine transform
[x  y  1] = [v  w  1] · [ t11  t12  0
                          t21  t22  0
                          t31  t32  1 ]
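A sketch of the coordinate transform above. The matrix entries for a pure rotation (t11 = cos θ, t12 = sin θ, and so on) are an illustrative choice of mine, not values from the slide:

```python
import numpy as np

# Affine coordinate transform [x y 1] = [v w 1] T, here a 90-degree
# rotation about the origin.
theta = np.pi / 2
T = np.array([[ np.cos(theta), np.sin(theta), 0],
              [-np.sin(theta), np.cos(theta), 0],
              [ 0,             0,             1]])

vw1 = np.array([1.0, 0.0, 1.0])   # input pixel at (v, w) = (1, 0)
x, y, _ = vw1 @ T                 # transformed coordinates
print(round(x, 6), round(y, 6))   # (1, 0) rotates to (0, 1)
```

Scaling, translation, and shearing use the same 3 × 3 form with different entries, which is why affine transforms compose by matrix multiplication.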
Intensity Assignment
Forward Mapping
(x, y) = T{(v, w)}
It's possible that two or more pixels can be transformed to the same
location in the output image.
Inverse Mapping
(v, w) = T⁻¹{(x, y)}
The nearest input pixels determine the intensity of the output pixel
value.
Inverse mappings are more efficient to implement than forward
mappings.
A bilinear spatial transformation (e.g., for image registration) with eight unknown coefficients:
x = c1 v + c2 w + c3 v w + c4
y = c5 v + c6 w + c7 v w + c8
A general 2-D linear transform pair represents f(x, y) in terms of coefficients T(u, v) and basis functions s(x, y, u, v):

f(x, y) = Σ (u = 0 to M−1) Σ (v = 0 to N−1) T(u, v) s(x, y, u, v)

For the 2-D discrete Fourier transform (DFT), the pair is

T(u, v) = Σ (x = 0 to M−1) Σ (y = 0 to N−1) f(x, y) e^(−j2π(ux/M + vy/N))

f(x, y) = (1/(MN)) Σ (u = 0 to M−1) Σ (v = 0 to N−1) T(u, v) e^(j2π(ux/M + vy/N))
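NumPy's FFT follows the same convention (negative exponent forward, 1/(MN) in the inverse), so the transform pair can be verified as a round trip:

```python
import numpy as np

# 2-D DFT pair via NumPy: fft2 computes the forward sum T(u, v);
# ifft2 applies the 1/(MN) factor and conjugate exponent to recover f.
f = np.array([[1.0, 2.0],
              [3.0, 4.0]])
T = np.fft.fft2(f)
f_back = np.fft.ifft2(T).real

print(T[0, 0])                 # the (0,0) coefficient is the sum of all pixels
print(np.allclose(f, f_back))  # forward then inverse recovers the image
```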
Σ (over k) p(z_k) = 1   (the p(z_k) form a normalized intensity histogram)
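A normalized histogram, p(z_k) = n_k / (MN), necessarily sums to 1; a quick check on a made-up 2 × 2 image:

```python
import numpy as np

# Normalized histogram p(z_k) = n_k / (M*N): the fractions of pixels at
# each intensity level must sum to one.
f = np.array([[0, 1], [1, 3]], dtype=np.uint8)
counts = np.bincount(f.ravel(), minlength=4)   # n_k for levels 0..3
p = counts / f.size                             # p(z_k)
print(p, p.sum())
```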