
Georeferencing and processing of Terrestrial Laser Scanner Point Clouds and Unmanned Aerial Vehicle images
Overview
• TU Wien / GEO
• 4 topics
• Laser scanning principles: geometry and physics
• TLS orientation: ICP algorithm
• Incremental BBA: SIFT, F-matrix, RANSAC, etc.
• Point cloud processing: concepts
Vienna University of Technology - TU Wien
• Mission „technology for people“
• Founded 1815
• 28,000 students, 5,000 staff
• Faculties
  • Mathematics and Geoinformation
  • Physics
  • Informatics
  • Electrotechnical Engineering
  • Civil Engineering
  • Architecture and Spatial Planning
  • Mechanical Engineering
  • Chemistry

Department of Geodesy and Geoinformation
• 8 research groups
  • Photogrammetry
  • Remote Sensing (Microwave RS)
  • Remote Sensing (Climate and Environmental RS)
  • Advanced Geodesy
  • Geoinformation
  • Cartography
  • Engineering Geodesy
  • Geophysics

Photogrammetry group
• 15 people: Prof. (1), PostDocs (5), PhDs (9) + Support (Projekt-Admin, IT-Admin)
• ~20 % financed by TU Wien
• National and EU funds (FWF, FFG, FP7, H2020, …)
• Orientation, calibration, 3D modelling, laser scanning
• Environmental applications

Laser Scanning
Fundamental equations
1. Georeferencing (geometry)

$$\mathbf{X} = \mathbf{X}_0 + \mathbf{R}\,\rho \begin{pmatrix} \cos\alpha\cos\varphi \\ \sin\alpha\cos\varphi \\ \sin\varphi \end{pmatrix}$$

2. Radar equation (physics)

Which quantity/symbol do those equations have in common?


In which coordinate systems are those equations formulated?
3rd aspect: time
• Speed of a single measurement
• Frequency of measurements
• Synchronization of measurements
• Duration of a scan
• Duration of setup
• Transfer of data volume

Terrestrial Laser Scanning – 15-minute epochs – acquired during the night
detect circadian movement of a birch tree („sleeping“)
Puttonen et al.: https://doi.org/10.3389/fpls.2016.00222
Terms

• Light Amplification by Stimulated Emission of Radiation


… monochromatic, strongly collimated, coherent light
• Light Detection And Ranging
Emit a pulse of light and analyse its backscatter:
from solid targets (ground surface), semi-transparent targets (vegetation),
volumetric backscatter (atmosphere)
• Laser Scanning
scan a laser beam over an extended field of view
• ALS TLS MLS ULS ALB PLS
A, T, M, U, P refer to platform, but what is the relevant difference ?
Laser equation (Radar eq.)
PT 4 1 4 D 2
PR    A      ATM SYS  PBK
4R   4
2 2
4R 2
 4
Received power PR [W]
Transmitted power PT [W]
Equally distributed along a sphere [m-2]
Antenna gain  (opening angle in respect to the sphere) [],dB
Area A of target [m2] and with reflection coef. 
Equally distributed backscatter along a sphere [m-2]
Backscatter in cone with opening angle 
Receiver aperture D [m2]
Atmospheric and system loss ATM,  SYS []
Background radiation (sun, shot noise, …) PBK [W]
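As a plug-in-numbers illustration of the equation above, here is a minimal Python sketch; all function and variable names as well as the example values are ours, chosen only for demonstration:

```python
import math

def received_power(P_T, R, beta, rho, A, Omega, D, eta_atm=1.0, eta_sys=1.0, P_BK=0.0):
    """Laser (radar) equation as assembled above, factor by factor."""
    spread_out  = 1.0 / (4 * math.pi * R**2)   # transmit spread over a sphere [m^-2]
    gain        = 4 * math.pi / beta**2        # antenna gain (beam opening angle beta)
    target      = rho * A                      # target area with reflection coefficient
    spread_back = 1.0 / (4 * math.pi * R**2)   # backscatter spread over a sphere [m^-2]
    cone        = 4 * math.pi / Omega          # backscatter concentrated into a cone
    aperture    = math.pi * D**2 / 4           # receiver aperture area [m^2]
    return (P_T * spread_out * gain * target * spread_back * cone * aperture
            * eta_atm * eta_sys + P_BK)

# Purely illustrative values: 100 m range, 0.5 mrad beam, extended target
R, beta = 100.0, 0.5e-3
print(received_power(P_T=1.0, R=R, beta=beta, rho=0.3,
                     A=math.pi * (R * beta)**2 / 4, Omega=math.pi, D=0.05))
```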
Influence of target size on received power

$$P_R = \frac{\pi P_T D^2}{4 R^4 \beta^2} \cdot \frac{\rho A}{\Omega} \cdot \eta_{ATM}\,\eta_{SYS} + P_{BK}$$

⇒ Influence of target size on received power
• Extended target: $A = \pi R^2 \beta^2 / 4$ ⇒ $P_R \propto 1/R^2$
  Example: open terrain
• Linear target: $A = R \beta d$ ⇒ $P_R \propto 1/R^3$
  Example: wire with diameter d
• (very small) Point target: $A = \text{const}$ ⇒ $P_R \propto 1/R^4$
  Example: leaf
Reflectance and LiDAR wavelength
[Figure: reflectance vs. wavelength; typical LiDAR wavelengths: 532 nm (bathymetric), 690 nm (terrestrial), 1064 nm (airborne + terrestrial), 1550 nm (topographic)]
Measure of received power: „intensity“ image
Single and multi target Lidar
2 principles of laser range finding common in TLS
• Pulse round trip
  - emit a short pulse (e.g. duration: __ = length: __)
  - start timer
  - detect echo (e.g. energy above a threshold)
  - stop timer
  - compute the two-way distance with the speed of light
  time lapse for 100 m distance: ___
• Phase difference
Range measurement with pulses
[Schematic: a pulse generator triggers the laser transmitter and simultaneously starts a counter; the pulse travels through the (free) atmosphere to the reflector/target, and the returning echo, sensed by the detector (photodiode, photomultiplier, …), stops the counter.]
LIDAR: LIght Detection And Ranging
Multiple targets along the laser beam
• A laser pulse has a certain energy distribution across the beam direction (footprint area)
• Within the beam („instantaneous field of view“) multiple different (area, linear, point) targets can be illuminated and can generate a detectable echo
• One pulse can therefore generate multiple echoes, i.e.: first echo, intermediate echoes, last echo
Single and multi target Lidar
2 principles of laser range finding
• Pulse round trip
• Phase difference
  - modulate the laser light with a harmonic signal (AM)
  - measure the phase difference between the emitted and the received modulated signal

Example (modulation wavelength L = 100 m):
phase angle of the emitted signal = 10°, at the reflecting surface = 170°, of the received signal = 330°
phase difference = 330° − 10° = 320°
distance = 320°/360° × L/2 = 44.4 m
Phase difference
• uniqueness range = ½ modulation wavelength L:
  reflecting surfaces at distances d and d + L/2 return the same signal
• measures distances 'continuously' … fast
• increase accuracy by using shorter modulation wavelengths
• use multiple modulation waves of different lengths;
  the longest wavelength defines the uniqueness range
• accuracy depends on the (shortest) modulation wavelength,
  but also on object reflectivity
• no separation between different targets along the same beam
  … single target principle
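The arithmetic of both ranging principles fits in a few lines; here is a minimal Python sketch (function names are ours) that reproduces the 44.4 m phase example above:

```python
C = 299_792_458.0  # speed of light [m/s]

def pulse_range(round_trip_time):
    """Pulse round trip: two-way travel time -> one-way distance."""
    return C * round_trip_time / 2.0

def phase_range(phase_emitted_deg, phase_received_deg, mod_wavelength):
    """Phase difference: the measured phase difference is a fraction of
    the uniqueness range, which is half the modulation wavelength L."""
    delta = (phase_received_deg - phase_emitted_deg) % 360.0
    return delta / 360.0 * mod_wavelength / 2.0

# Example from above: L = 100 m, emitted 10 deg, received 330 deg
print(phase_range(10.0, 330.0, 100.0))   # -> 44.44 m
```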
From Ranging to Points
measurements → points

http://www.riegl.com

$$\mathbf{P}_j = r_j \begin{pmatrix} \cos\alpha_j \cos\beta_j \\ \sin\alpha_j \cos\beta_j \\ \sin\beta_j \end{pmatrix}$$

$\mathbf{P}_j$ … point j of scan i
measured by the scanner: range $r$ and angles $\alpha$, $\beta$

Single scan
[Figure: point cloud and range image of a single scan, range r color coded]
measurements → georeferenced points

$$\mathbf{P}_{i,j} = \mathbf{P}_{0,i} + \mathbf{R}(\omega_i, \varphi_i, \kappa_i)\; r_{i,j} \begin{pmatrix} \cos\alpha_{i,j} \cos\beta_{i,j} \\ \sin\alpha_{i,j} \cos\beta_{i,j} \\ \sin\beta_{i,j} \end{pmatrix}$$

$\mathbf{P}_{i,j}$ … point j of scan i

What does $\mathbf{P}_{0,i}$ (exactly) refer to?

Additional (optional) components
- Levelling ($\omega_i$, $\varphi_i$) by electronic spirit bubble
- Magnetic compass ($\kappa_i$)
- $\mathbf{P}_{0,i}$ by GNSS
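As an illustration of this georeferencing step, here is a minimal NumPy sketch; the function names and the ω, φ, κ rotation order are our assumptions (the exact parameterization is instrument- and convention-specific):

```python
import numpy as np

def rotation_matrix(omega, phi, kappa):
    """R(omega, phi, kappa): rotations about x, y, z (radians); one common choice."""
    co, so = np.cos(omega), np.sin(omega)
    cp, sp = np.cos(phi),   np.sin(phi)
    ck, sk = np.cos(kappa), np.sin(kappa)
    Rx = np.array([[1, 0, 0], [0, co, -so], [0, so, co]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[ck, -sk, 0], [sk, ck, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def georeference(P0, omega, phi, kappa, r, alpha, beta):
    """P_ij = P_0i + R(omega_i, phi_i, kappa_i) r_ij d(alpha_ij, beta_ij),
    for arrays of per-point measurements r, alpha, beta of one scan."""
    r, alpha, beta = map(np.asarray, (r, alpha, beta))
    d = np.stack([np.cos(alpha) * np.cos(beta),
                  np.sin(alpha) * np.cos(beta),
                  np.sin(beta)], axis=-1)            # unit direction vectors
    return np.asarray(P0) + (r[..., None] * d) @ rotation_matrix(omega, phi, kappa).T
```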
decreasing density and occlusions
TLS orientation - ICP
Relative orientation / registration

Targets or ICP
Tie points
• Signalized vs. natural points
• Identify in intensity image OR range image OR point cloud

Georeferencing
• GCP: known coordinates in WGS84/UTM or another superior system
• Tie point: coordinates not known in advance, but observed in several scans
Targets
Targets for tie points and GCPs
[Figure: decreasing point number on a target for ranges from 6 m to 25 m]
How to measure targets?
1. Digitize in the intensity image
   2D marker, retro-reflective material
2. Fit a primitive to the 3D object
   select the points of a sphere and determine an adjusting (best-fit) sphere
   [Figure: point cloud of a sphere as tie point / GCP]
Degree of automation?
e.g. automatically select bright image parts; for step 2, see the fitting sketch below
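For step 2, a sphere can be fitted by least squares; the algebraic formulation below is one possible automated variant (a minimal sketch, assuming the candidate points have already been selected as a NumPy array):

```python
import numpy as np

def fit_sphere(pts):
    """Algebraic least-squares sphere fit: ||p||^2 = 2 c.p + (r^2 - ||c||^2)
    is linear in the unknown center c and the combined radius term."""
    pts = np.asarray(pts, dtype=float)
    A = np.hstack([2 * pts, np.ones((len(pts), 1))])
    b = (pts**2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:3]
    radius = np.sqrt(sol[3] + center @ center)
    return center, radius
```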
Example: retro-reflective markers as targets
3 scans, retro-reflective targets (2 cm)
Example: spheres as targets
• Tie point: a special object (sphere, b/w square, …) used to generate corresponding points between scans
• GCP: ground control point, known coordinates in a superior coordinate system
• Multiple scans: GCPs can also serve as tie points

DFG/FWF project PROSA
Example: spheres as targets
• 4 TLS scans: TLS1 – TLS4
• Tie points: spheres R1 - R5
• The tie points are in this example also GCPs; the TLS data are embedded in an ALS scan

Illustration: Philipp Glira
Relative orientation with tie points
• Identify with sub-pixel accuracy
• Can be observed in photos or by tacheometry
• How many tie points required to register 2 scans?

Alternative mode of orientation
• 1,000,000 pts/scan: why use only 3 (5, 10, … a small number) for orientation?
• Effort to place tie points (retro-reflective markers) in the scene or to detect them (natural tie points)
• Precise manual selection or measurement necessary
• Alternative: use the entire point cloud for orientation!
• Iterative closest point algorithm (ICP); a better name: iterative corresponding point
Registration with ICP
• Given: 2 point clouds
  • on the same surface
  • in different local coordinate systems
• Simple case: correspondences between points p_i and q_i
  • exist
  • are known
• Solution: estimate the unknown parameters of a Euclidean (rigid-body) transformation in 3D

How to?
Registration with ICP
Problems
• Correspondences are generally not known
• Point-to-point correspondences do NOT exist
Solution
• Replace: corresponding point → closest point
• Solve for the transformation parameters
• Iterate
Registration with ICP

1. Find for each p_i the closest point in Q = {q_1, …, q_m}
2. Solve for t and R to minimize the offsets, and apply them to the p_i
3. Find for each p_i the closest point in Q = {q_1, …, q_m} … and iterate
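A minimal point-to-point ICP sketch of exactly this loop, assuming NumPy/SciPy and roughly pre-aligned clouds (no convergence test or distance rejection, which a practical implementation would add):

```python
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(P, Q):
    """Least-squares R, t aligning corresponding P to Q (SVD / Kabsch)."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    U, _, Vt = np.linalg.svd((P - cP).T @ (Q - cQ))
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:      # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cQ - R @ cP

def icp(P, Q, iterations=50):
    """Point-to-point ICP: closest points stand in for correspondences."""
    tree = cKDTree(Q)
    P_cur = P.copy()
    for _ in range(iterations):
        _, idx = tree.query(P_cur)            # closest point in Q for each p_i
        R, t = best_rigid_transform(P_cur, Q[idx])
        P_cur = P_cur @ R.T + t               # apply and iterate
    return P_cur
```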



Registration with ICP
Additional notes
• num. pts. in Q ≠ num. pts. in P: use all points, or only the short distances (e.g. the shortest 10 % of all distances)
• Rough alignment required (manual)
• Does not work well for flat surfaces
• Requires many iterations
• But! No tie points required!

Alternative approaches:
1. Subsample one point set
2. Distance to estimated tangent planes
3. Reject large distances
Incremental BBA
Existing solutions
• Input: an unsorted set of images
• Result: image exterior and interior orientation + object points
• Pix4D, Agisoft PhotoScan/Metashape, 123D Catch, Zephyr, OrientAL, …
How does it work?
Incremental Bundle Block Adjustment
• Automatic pairwise image orientation
  • keypoint extraction in each image (SIFT)
  • match keypoints
  • robustly estimate the relative orientation
    Robustness: RANSAC
    Orientation description: Fundamental matrix
• Estimate the relative orientation for all image pairs
• Build the bundle block incrementally by starting with the best image pair and iteratively adding one image at a time
Automatic relative orientation
• Requirement: photos show similar features
  • (more or less) equal measurement time
  • (more or less) similar viewing direction
• Extract SIFT features and match them
• Determine the relative orientation using RANSAC
SIFT
• Scale Invariant Feature Transform
• Method for extracting salient points in images and describing them
• SIFT provides keypoints and descriptors, which are independent of
  • image scale
  • orientation around the optical axis
  • viewing direction (with restrictions)
  • brightness and contrast (linear grey value transformations)
Image pyramids
[Figure: image 1 in different scales next to image 2]
Which scale of image 1 fits to image 2?
SIFT
5 steps

1. Find points
2. Refine points and select best points
3. Determine local point orientation
4. Describe image gradients in local point
neighborhood

5. Match points of an image pair


SIFT: finding the points
• Apply different magnitudes of smoothing to the image („scale“)
• Perform this for each level of the image pyramid („octave“)
  Vary the scale; all images have the same size.
  Next octave: images with half the size.
  On this octave, again variation in scale.
SIFT: finding the points
• Build differences between the images of one octave (Difference of Gaussians)
• Find extreme values in those difference images:
  select points if they are extrema in comparison to all of their 26 neighbors
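A minimal sketch of one such octave, assuming SciPy (the scale values are illustrative only):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def dog_octave(image, sigmas=(1.0, 1.4, 2.0, 2.8, 4.0)):
    """Smooth the image with increasing scale, then build the differences
    between neighbouring smoothing levels (Difference of Gaussians)."""
    levels = [gaussian_filter(image.astype(float), s) for s in sigmas]
    return [b - a for a, b in zip(levels, levels[1:])]

# Keypoint candidates: pixels that are extrema compared to their 26
# neighbours (8 in the same DoG image, 9 each in the scale above and below).
```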
SIFT
• Step 1: keypoint extraction … done
• Step 2: improve keypoint location (skipped here)
• Step 3: determine orientation (skipped here)
• Step 4: local description
SIFT – Step 4: local description
• Describe the surrounding of a keypoint based on gradients of the grey values.
• Split the keypoint surrounding into 16 fields.
• Each field is a 4x4 window.
• Determine in each field a histogram of the grey value gradients. Histogram bins are 45° wide (8 classes).
• This gives 16 × 8 = 128 values.
Step 5: matching points between image pairs
• Each keypoint corresponds to a feature vector in a 128-dimensional space.
• An image is represented by its keypoints with their 128d feature vectors.
• For each feature point of image 2, find the nearest feature point of image 1 (Euclidean distance). This is a match!
• If the 2nd nearest point has a similar distance, eliminate the correspondence.
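A minimal matching sketch with exactly this ratio test, assuming an OpenCV build that includes SIFT (version 4.4 or later); the ratio threshold 0.8 is illustrative:

```python
import cv2

def sift_matches(img1, img2, ratio=0.8):
    """Keypoints + 128-d descriptors per image, then nearest-neighbour
    matching from image 2 to image 1 with the ratio test: drop a match
    if the 2nd nearest neighbour is almost as close as the nearest."""
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img1, None)
    kp2, des2 = sift.detectAndCompute(img2, None)
    knn = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des2, des1, k=2)
    return [m for m, n in knn if m.distance < ratio * n.distance]
```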
SIFT match
So far we have matching points.
Now we want to compute the relative orientation.
Are all those matches correct?
Let us first have a look at how to compute the relative orientation efficiently.
Relative orientation
Fundamental matrix F
Coplanarity condition: $r_1^T (\mathbf{b} \times r_2) = 0$
With the axiator $\mathbf{S}(\mathbf{b})$: $r_1^T \mathbf{S}(\mathbf{b})\, r_2 = 0$
Replace $r$ by rotation matrix, calibration matrix, and measured point: $r = \mathbf{R}\mathbf{C}p$, $p = (x, y, 1)^T$

$$p_1^T \mathbf{C}_1^T \mathbf{R}_1^T \mathbf{S}(\mathbf{b}) \mathbf{R}_2 \mathbf{C}_2\, p_2 = p_1^T \mathbf{F}\, p_2 = 0$$

F … fundamental matrix (3×3)
It can be shown that only 7 of the 9 elements of F are free (i.e. there are 2 conditions).
Given 7 point pairs, F can be estimated (linear system)!
Epipolar geometry
Fundamental matrix F
For $p_1 = (x, y, 1)^T$, the epipolar line in image 2 is $l_2 = \mathbf{F}^T p_1 = (a, b, c)^T$.
(Euclidean!) distance of $p_2$ from $l_2$:

$$v = \frac{l_2^T\, p_2}{\sqrt{a^2 + b^2}}$$
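The same distance as a small NumPy function (a sketch using the convention p1ᵀ F p2 = 0 from the previous slide):

```python
import numpy as np

def epipolar_distance(F, p1, p2):
    """Euclidean distance of p2 from the epipolar line l2 = F^T p1;
    p1, p2 are homogeneous points (x, y, 1)."""
    a, b, c = F.T @ p1
    return (a * p2[0] + b * p2[1] + c) / np.hypot(a, b)
```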
Matching points → relative orientation
• Matches from SIFT are contaminated by outliers
• Computing the relative orientation from all points is likely to give wrong results because of those outliers
• A robust method is required that can cope with a high outlier ratio: RANSAC
• RANSAC = RANdom SAmple Consensus
RANSAC
• Requirements
  • The problem has a small number of parameters
  • An efficient solution exists for a minimal number of observations
  • A rough estimate of the outlier ratio is available
• E.g.: relative orientation with SIFT matches
  • 7 parameters
  • Fundamental matrix elements determined with eigenvector analysis
  • 30 % outliers
RANSAC
Provides … a robust estimator for unknown parameters from observations contaminated with many gross errors
Paradigm
• Randomly select observations: a minimal set for the solution
• Evaluate the solution against the total set of observations; this provides a „score“
• Many repetitions of selection and evaluation
• Choose the solution with the highest score as the robust estimate of the unknowns
RANSAC for F-matrix estimation
• Score = number of observations (correspondences) that support a certain model (one fundamental matrix)
• Threshold (e.g.) t = 5 pixels
• Correspondences (matches) count if |v| < t, with

$$v = \frac{l_2^T\, p_2}{\sqrt{a^2 + b^2}}$$
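The generic loop, sketched in Python; `fit_minimal` and `residual` are hypothetical callbacks (e.g. a seven-point F-matrix solver and the epipolar distance v above), and the number of draws is illustrative:

```python
import numpy as np

def ransac(observations, fit_minimal, residual, threshold=5.0,
           draws=1000, sample_size=7):
    """Fit on random minimal samples; score = number of observations
    whose |residual| stays below the threshold; keep the best model."""
    rng = np.random.default_rng()
    best_model, best_score = None, -1
    for _ in range(draws):
        sample = rng.choice(len(observations), size=sample_size, replace=False)
        model = fit_minimal([observations[i] for i in sample])
        score = sum(abs(residual(model, o)) < threshold for o in observations)
        if score > best_score:
            best_model, best_score = model, score
    return best_model, best_score
```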
Relative orientations
• Displayed as a weighted graph
• Nodes are the images
• Edges are the relative orientations (edge weight = quality of the relative orientation)
• E.g.: 4 images (6 pairs)
  best relative orientation: Photo 1-2
  no relative orientation: Photo 2-3
  Best solution for the incremental BBA: 1-2, then +4, then +3
41 photos of a scene, oblique images
Graph after pairwise relative orientation
• Nodes = photos
• Edges: weighted by the number of correspondences
• E.g.: Photos 39-44 are strongly connected, with a weaker link to the other sub-blocks
Point Clouds
Why point clouds?
Because Laser Scanning measures (almost) points, and Image Matching provides (almost) points, too.
[Figures: Google image search for „point“; a point cloud with additional information]
Point cloud definition
• A point cloud is a set of points.
• The term „cloud“ emphasizes that there is no real structure in or between the points, but that most of the points somehow lie close together in a common surrounding.
• In a point cloud, additional attributes may be given.
• Attributes can be of different types.

Otepka et al., 2013: Georeferenced point clouds: A survey of features and point cloud management. ISPRS International Journal of Geo-Information.
Point cloud attributes
[Table: possible attributes and their types]
Point cloud processing
• Processing paradigm
  • Each point carries information
  • Modeling based on surfaces/lines aggregates information … and thus also loses information
  • 2.5d raster vs. 3d distribution of points
  • Averaging within a cell vs. high resolution at the spot
  • Interpolation of voids vs. preservation of the information gap
  • Rather incorporate aggregation into each application program than build one aggregated model for all applications
Point cloud processing
• Exploratory data analysis
range, maxima, mean, median, std.dev., quantiles, …
• Transformation: shift, rotation, etc.
• Selection or visualization based on attributes
• Feature computation
Local processing, process points depending
on their neighborhood
• Segmentation
• Classification
Neighborhood
• A neighborhood of a point is a subset of all
given points
• Neighborhood definitions for surface interpolation
are defined in 2D ( x,y ) only
• Neighborhood definitions in point clouds can be
defined in 3D
• Fixed distance neighbors in 2D / 3D
• K nearest neighbors in 2D / 3D
• Cylindrical neighborhood
• Box neighborhood
Neighborhood
Formal definitions, neighbors of point p in the point set P
• Fixed distance neighborhood, parameter d:
  $N_{FDN}(p) = \{\, q \in P : \lVert p - q \rVert \le d \,\}$
• k nearest neighbors, parameter k:
  $N_{kNN}(p)$ = the k closest points of P to p
• Cylindrical neighborhood, parameters d and h:
  $N_{CYL}(p) = \{\, q \in P : \sqrt{(x_p - x_q)^2 + (y_p - y_q)^2} \le d \;\text{AND}\; |z_p - z_q| \le h \,\}$
• Spherical neighborhood, box neighborhood, …
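These queries map directly onto a k-d tree; a sketch with SciPy (radii, k, and the toy data are illustrative):

```python
import numpy as np
from scipy.spatial import cKDTree

pts = np.random.rand(10_000, 3)                    # toy point cloud P
tree = cKDTree(pts)
p = pts[0]

fdn_3d = tree.query_ball_point(p, r=0.05)          # fixed distance, 3D
_, knn = tree.query(p, k=8)                        # k nearest neighbours, 3D

tree_2d = cKDTree(pts[:, :2])                      # cylindrical: 2D disc ...
disc = tree_2d.query_ball_point(p[:2], r=0.05)
cyl = [i for i in disc if abs(pts[i, 2] - p[2]) <= 0.1]   # ... plus height bound
```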


Neighborhood
Notes
• Points are typically in their own neighborhood
• Neighborhoods may be empty (e.g. FDN)
• Neighborhoods may not be unique (e.g. kNN)

Filin and Pfeifer, 2005: Neighborhood Systems for Airborne Laser Scanner Data. Photogrammetric Engineering & Remote Sensing 71(5).
Features
Features at a point are derived by analysing the point
and its neighborhood.
Useful features are:
• Normal vector
• Roughness
• Calibrated reflectance measure
• Echo number
•…
Normal vector
• Methods for computation
• Usage in visualization
• Normal vector display
Normal vector computation
• Differential geometry
• Point cloud

Point → neighborhood → local surface interpolation → tangent plane derivation → normal vector
Normal vector computation
• Point, neighborhood, local surface interpolation, tangent plane derivation, normal vector
• Local surface interpolation = Moving Least Squares surface interpolation?

Generally, point clouds do not follow / do not allow / cannot be parameterized as z = f(x, y).
Normal vector computation by orthogonal regression plane
• Minimize the orthogonal distances between the (unknown) plane and the given points.
• Local surface model = plane (tangent plane)
• Determining this plane requires solving an eigenvalue problem.
• Result: plane parameters (normal vector) + residuals

Go for the mathematics? See the sketch below.
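A minimal sketch of that eigenvalue problem: the normal is the eigenvector to the smallest eigenvalue of the moment matrix of the centered neighborhood points:

```python
import numpy as np

def plane_normal(neigh):
    """Orthogonal regression plane through an (n x 3) neighbourhood.
    Returns the plane normal and the orthogonal residuals."""
    centered = neigh - neigh.mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(centered.T @ centered)  # ascending
    normal = eigvecs[:, 0]          # eigenvector to the smallest eigenvalue
    return normal, centered @ normal
```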


Normal vectors for point cloud visualization
• Assume a lighting direction
• Compute the angle ψ between the light direction and the point normal vector
• Assign a grey value (brightness) depending on the angle
[Plot: brightness as a function of the angle ψ, from 1 at 0° to 0 at 180°]
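A small shading sketch; the linear ramp (1 at 0°, 0 at 180°) is read off the plot above, and a cosine mapping would work similarly (normals assumed to be unit length):

```python
import numpy as np

def shade(normals, light_dir):
    """Brightness per point from the angle between its normal (n x 3)
    and the lighting direction: linear ramp from 1 (0 deg) to 0 (180 deg)."""
    light = np.asarray(light_dir, dtype=float)
    light /= np.linalg.norm(light)
    angle = np.degrees(np.arccos(np.clip(normals @ light, -1.0, 1.0)))
    return 1.0 - angle / 180.0
```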
Feature: Roughness
[Figure: idealized surface vs. actual surface, with measurements]
• Roughness describes the difference between the idealized and the actual surface
• Roughness is measured by characterizing the distribution of differences between the actual and the idealized surface
  • Standard deviation
  • Maximum valley depth / peak height
  • RMS slope of the difference surface
  • …
• Roughness measures can be estimated from surface measurements:
  1. Perform the measurement
  2. Estimate the idealized surface from the measurements
  3. Analyze the residuals
Does this overestimate or underestimate roughness?
TLS point cloud; local plane-fit roughness, std.dev. in meters
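A minimal sketch of such a plane-fit roughness value for a single neighborhood, following steps 1-3 above (std.dev. of the residuals to an orthogonal regression plane):

```python
import numpy as np

def plane_fit_roughness(neigh):
    """Std.dev. of the orthogonal residuals of an (n x 3) neighbourhood
    to its orthogonal regression plane (estimated idealized surface)."""
    centered = neigh - neigh.mean(axis=0)
    _, eigvecs = np.linalg.eigh(centered.T @ centered)
    residuals = centered @ eigvecs[:, 0]   # distances to the fitted plane
    return residuals.std()
```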
