
International Circular of Graphic Education and Research, No. 7, 2014

Complete Digital Workflow for HDR Photography


Natalia Gurieva

Keywords: high dynamic range, photography, algorithm of HDR merging, digital workflow

HDR (High Dynamic Range) imaging is the general name of the technology to capture, store, and edit images with a luminosity range exceeding the capabilities of standard technologies. It allows recording a much greater range of tonal detail than a camera could capture in a single exposure. Standard dynamic range photos can represent a luminosity dynamic range of about 1000:1, while real scenes often possess a dynamic range of 100000:1 or higher [1]. The goal of our research is to correctly manage the colors of HDR photos, which include the tonal range of real-world scenes. HDR technology allows working with the full range of luminosity of the scene, since HDR images use floating-point values with 32 bits per color channel.
The investigation focuses on the development of a predictable workflow of graphic data for HDR photography for further automation. The basic and extended workflow is demonstrated, and a tunable algorithm of HDR merging is proposed.

1. Introduction
The main idea of HDR is to highlight the best exposure for different areas of the same photo. Dynamic range is defined as the difference between the lightest tonal values (the highlights) and the darkest tonal values in which minimal details can still be perceived (the shadows). Areas so bright that no details can be seen are said to be “blown out”; dark areas that have gone totally black are “plugged”. Mastering the craft of extending digital dynamic range opens the possibility of compositions that would have been impossible in the past because of dynamic range limitations [2]. HDR photography typically works best when applied to photos taken outdoors or when there is a wide range of light/color present. Scenes suitable for HDR techniques are shown in Figure 1.
HDR made its most rapid progress with the arrival of the digital age and the exponential creation of new software, each version released more powerful than its predecessor and with enough possibilities to choose the most appropriate method for our workflow. But, actually, the first ideas of how to extend the dynamic range of a photograph emerged in 1850, when Gustave Le Gray tried to render landscapes of the sea and the sky [3]. Such rendering was impossible at the time using standard methods, the luminosity range being too extreme. Le Gray used one negative for the sky and another one with a longer exposure for the sea, combined both, and thus invented HDR imaging. Later, in 1954, some experts tried to make the first tone mappings using methods like dodging and burning, increasing or decreasing the exposition of the negatives to extract all the dynamic range they could.
In general, HDR means combining multiple captures (3 or more) with different exposure settings to extend the dynamic range (Figure 2) [4]. The brightness distribution of digital images is represented in the histograms below. If the image has optimum contrast (as in the result of merging), the histogram includes practically all

Figure 1: Examples of scenes suitable for HDR techniques


brightness from highlights to shadows. If the image has low contrast (original images with exposure bracketing -2, 0, +2), the histogram includes only a small number of values. Out-of-range values are recorded as “black” (underexposed) or “white” (overexposed) rather than as the precisely graduated shades of colour and tone required to describe details. For example, the first image in Figure 2 is underexposed, the second image has a normal exposure and the third is overexposed.
HDR photos improve over LDR (Low Dynamic Range) images because there is a severe limitation regarding LDR images in terms of how much information can be stored. A typical conventional LDR image is stored with three channels: R, G and B. Each channel contains 8 bits of data per pixel to store the intensity of a particular pixel in that channel. 8 bits of data per pixel allows just 256 different integers; in 8-bit mode a value of 0.49999 would be read roughly as 0 instead of 0.49999. HDR images are freed from this limitation: instead of integers, floating-point values are used, while the precision varies depending upon which kind of floating-point storage is used and on the architecture of the computer.
With HDR images it is possible to record in one photo the range from dark to light that is detectable by the human eye. In such photos, details in the shadows, highlights and mid-tones can be recorded at the same time.

Figure 2: HDR Photo Exposure Bracketing (-2, 0, +2 stops and result of merging)


To merge a high dynamic range image from a set of differently exposed images of the same scene, the pixels in the individual exposures have to be related to the corresponding pixels in the other exposures.

2. HDR photo workflow

The HDR digital photo workflow consists of four main phases: taking a series of photos with different exposures, merging for HDR, digital image processing or editing, and preparing for output [5-7]. During that workflow, image details and significant colors have to be preserved.
The workflow includes every step of the photographic process, from selecting camera settings and composing a frame to the final print or published web gallery. The main workflow phases are presented in Figure 3.
All the phases and steps shown present the basic components of each digital HDR photo workflow. They can be modified for personal needs and requirements with respect to productivity, HDR merging result quality or the desired final print quality. The manifold variety of image types requires specific methods of merging and further corrections. That is why recommendations on using different merging techniques depending on the content of the scene are given. The basic steps involved in the HDR workflow are discussed below.

3. Data capturing for HDR

The workflow begins with camera settings before exposure: dynamic range evaluation of the scene, general camera settings, image-specific camera settings and composing a frame. The file format also has to be mentioned: JPEG or RAW. RAW files are just the raw sensor data. In JPEG files, all automatic and manual settings of the camera (e.g. white balance, contrast, tonal value corrections, sharpness, color interpolation plus JPEG-specific data reduction algorithms) are automatically merged into the stored image data. The way to assure image data without unknown alterations and compressions is to use the RAW format.

Capture:
• Shooting: color management in camera; viewing conditions, ISO; exposure, histograms; scene composing, framing; shooting in RAW format.
• Transfer camera–computer: download images to computer; image inspection (sharpness, color accuracy, dynamic range or range of exposure accuracy, tonal response and contrast, noise, chromatic aberration, lens distortion, lens flare); selecting images to process; backup/archiving.
• RAW editing and conversion: cropping; white balance; contrast/brightness corrections; tonality adjustment; saturation; noise reduction; chromatic aberration; lens distortion, lens flare; sharpening if necessary; convert/export to a standard format and color space.

Merging to HDR:
• RAW converters: applying DNG or ICC profiles of the camera; converting to ProPhoto RGB space.
• Merging to HDR: choosing software and, depending on the content, one of the strategies of merging; reduction of merging artefacts.
• Tone mapping: choosing software and method of tone mapping.
• Final tonal editing/optimizing: optimizing tonality; color corrections (global, then selective using masks); optimizing highlights and shadows; final sharpening.

Digital image editing:
• Color management in photo editing software: applying the ICC profile of the monitor for a correct visualization; converting to Lab color space after corrections for further gamut mapping.

Preparing for output:
• Color management for output: gamut mapping using the output ICC printer profile.
• Soft proof: converting the image from the working color space to Lab; imitating the output gamut and paper color.
• Print: using RIPs for high-quality printing; printer settings (print formats, paper types); printing without additional color management in the driver if the software will manage colors.
Figure 3: Complete digital workflow from capturing RAW data and rendering HDR scenes to an output-referred encoded image


When using the multiple exposure method of capturing data for a single HDR image, it is necessary to take multiple shots, altering the exposure for the different light levels. Depending on the type of the scene and its dynamic range, different capturing strategies can be applied. For example, in the case of midday sun with strong shadows it is enough to take 3 shots about 1–1.33 stops apart; inside buildings with some light coming through the windows, at least 5 bracketed shots about 2 stops apart have to be taken.
The shutter speed should be varied rather than the f-stop, since varying the f-stop introduces differences in focus due to the varying depth of field. The best detail levels and clarity are achieved by starting with a very fast shutter speed – just enough to pick up the brightest highlights in the scene with none of the pixels reaching a value of 255. For the next exposure, the exposure time should be doubled, repeating this until the image is mostly white, with just the darkest objects showing some detail. It is not necessary to take an exposure at every f-stop; it is possible to increase the exposure by two stops or several stops. Each image represents a slice of the scene’s dynamic range, and the slices can then be digitally merged into a high dynamic range image.
For a successful recovery of the original dynamic range, the dynamic ranges of the individual exposure slices have to overlap; otherwise the combination algorithms cannot join them. An important side benefit is noise reduction for free, because each pixel in the resulting HDR represents the average of the most useful parts of each exposure. The most important are the middle tones. The shadows, where all the image noise tends to get overwhelming, are generally blown out and replaced with the corresponding color values from the lower exposures.
After the multiple shots it is necessary to check the exposures using histograms, the white balance and the ISO settings. After this, all the photos in RAW format should be transferred from the camera to the computer.

4. HDR merging

Processing HDR photos involves the following stages: editing the RAW data of the source images (i.e. correction of the colors and of geometrical distortions such as alignment, rotation, cropping), converting the RAW files into an RGB color space, merging the series of source images into a single HDR image, tone mapping and final tonal optimization. When multiple exposures are combined, we preserve the information from each of the exposures. Then the resulting 32-bit images are ready to have their wide contrast range adjusted to fit into the contrast range of the monitor or the output device.

4.1 RAW editing, conversion and color translation

The RAW conversion process translates the Bayer pattern from the sensor of the camera, using a camera ICC or DNG profile, into RGB format. This processing involves a number of operations, such as: decoding the image data of raw files; demosaicing (interpolating the partial raw data received from the color-filtered image sensor into a matrix of colored pixels); changing the white balance (if we need to correct the color temperature of the light that was used to take the photo); straightening, rotation or cropping (if necessary); exposure correction (in a small range); defective pixel removal (replacing data in known bad locations with interpolations from nearby locations); noise reduction (removing small defects or fluctuations, eliminating small detail for smoothness); removal of systematic noise; optical correction (lens distortion correction, vignetting correction, and color fringing correction); contrast enhancement, brightness and saturation tuning; color corrections (first globally, then selectively); sharpening (increasing visual acuity by unsharp masking); and final color translation (converting from the camera’s native color space, defined by the spectral sensitivities of the image sensor, into the output color space ProPhoto RGB).
The gamuts of the various color spaces in Figure 4 indicate that there are colors that can be printed on an Epson 4900 that fall outside both sRGB and even Adobe RGB. ProPhoto RGB can contain all colors that a digital camera can capture – even highly saturated colors. Cameras don’t capture, and inkjet printers don’t print, in the sRGB color space [8, 9]. That is why it is recommended to select ProPhoto RGB as the output color space. It is also recommended to avoid applications that cannot use embedded ICC profiles, because the colors observed on a screen or on a final print will otherwise be unpredictable.
Some raw formats also allow nonlinear quantization. This nonlinearity allows compression of the raw data without visible degradation of the image, by removing invisible and irrelevant information from the image. Although noise is discarded, this has nothing to do with (visible) noise reduction.


Figure 4: Comparison of gamuts of various color spaces in Gamut View: a) highlights L=75, b) midtones L=50, c) shadows L=25

4.2 Strategies of manual and automated merging
There are quite a lot of software programs available that can automatically combine the different exposure versions (such as Photomatix Pro, Adobe Photoshop HDR Pro, Picturenaut, Luminance HDR, easyHDR, DynamicPhoto HDR). The program searches for correctly exposed image areas and ignores overexposed and underexposed regions of the photo. Every program has its own method to achieve this, but HDR images are rarely perfect when the HDR program finishes with them. That is why graphic artists and photographers prefer to apply manual adjustments after the merging process, using additional layers of images with different exposures to achieve their goal for every specific photo.

Depending on the content, the photographer can select one of three strategies of manual HDR image rendering:

• Using automated algorithms and then adding layers to the auto-HDR. This works when the photographer wants to reach some special glow or highlight one specific area in the image. With automatic HDR we have a base for further manual processing; then we can add a few hand-processed layers on top to realize a special artistic idea (Figure 5, a).

• Using as a base an image with normal exposure and then adding lighter and darker versions on top (Figure 5, b).

• Using as a base the darkest image and then adding lighter images to highlight some details of the scene: this means processing the darkest, most underexposed capture first, then blending successively lighter versions on top (Figure 5, c).

When the picture contains many details and manual merging therefore becomes a significant problem, causing technical difficulties like curving around every leaf of a plant (Figure 5, a), automatic merging is preferable. In this work, a highly tuneable algorithm for creating an HDR image has been proposed. The algorithm is based on the main principles of manual image rendering. It does not involve image recognition techniques and implements a weighted summation of patches based on their histograms only (1).

image_HDR|R,G,B = image_1|R,G,B · c_1 + image_2|R,G,B · c_2 + image_3|R,G,B · c_3,  with  c_1 + c_2 + c_3 = 1,   (1)

where c_1, c_2 and c_3 are weights defining the transparency of each image.
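Equation (1) is a straightforward weighted summation; a minimal NumPy sketch with scalar weights (a toy example – in the proposed algorithm the weights are selected per patch):

```python
import numpy as np

def merge_hdr(images, weights):
    """Eq. (1): weighted summation of exposures; the weights c_i act as
    transparencies and must sum to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(w * img for img, w in zip(images, weights))

# Toy example: three constant "exposures" of a 2x2 patch.
dark   = np.full((2, 2), 0.1)
normal = np.full((2, 2), 0.5)
bright = np.full((2, 2), 0.9)
hdr = merge_hdr([dark, normal, bright], [0.25, 0.5, 0.25])
# each pixel -> 0.1*0.25 + 0.5*0.5 + 0.9*0.25 = 0.5
```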
After all three images are loaded, auxiliary grayscale images are created using the relation [10]:

GRAY = R · 0.2989 + G · 0.587 + B · 0.1140,   (2)

which are then used for the coefficient estimation at every nearly-uniform patch.
Initially, the size of the patches to analyse was taken as 8×8. The histograms of the three patches from the different images were analysed. We first found the weighted centre of each histogram using the expression:

Mid_j = ( Σ_{i=0}^{255} i · h_i ) / ( Σ_{i=0}^{255} h_i ).   (3)

According to the positions of the Mid_j, the transparency coefficients are selected. A problem appeared at gradient colors, where, according to this technique, a sharp edge appeared between two parts taken from different images. However, the problem has been overcome by using a smoothing function close to the boundary brightness, like this:

c_3 = 0.5 − 0.5 · sin( (Mid_1 − b_1) · 2π / (4·τ) ),
c_2 = 1 − c_3,   (4)

where b_1 is the brightness limit for the shadows and τ is the transition zone. An example of the coefficients computed for some specific b_1, b_2 and τ is presented in Figure 7. As a result, instead of sharp edges, a smooth gradient appears in the picture, which is a mix of two different images.
One of the major problems of the proposed technique was due to the fixed patch size: because of it, defects appear at the edges of objects. However, we have overcome this problem by analyzing the dispersion of the patch histogram, found as follows:

D_j = ( Σ_{i=0}^{255} h_i · |Mid_j − i| ) / ( Σ_{i=0}^{255} h_i ).   (5)
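The quantities in (2)–(5) are all simple histogram statistics; a sketch assuming NumPy, with b_1 and τ chosen arbitrarily for illustration (the clamp of c_3 to [0, 1] is our addition, not stated in the paper):

```python
import numpy as np

def grayscale(img):
    # Eq. (2): luminance weights for the R, G, B channels
    return img[..., 0] * 0.2989 + img[..., 1] * 0.587 + img[..., 2] * 0.1140

def weighted_centre(hist):
    # Eq. (3): weighted centre Mid_j of a 256-bin histogram h_i
    i = np.arange(256)
    return (i * hist).sum() / hist.sum()

def shadow_coefficient(mid1, b1, tau):
    # Eq. (4): smooth transition for c3 near the shadow boundary b1;
    # the clip to [0, 1] is an assumption, not from the paper.
    c3 = 0.5 - 0.5 * np.sin((mid1 - b1) * 2 * np.pi / (4 * tau))
    return float(np.clip(c3, 0.0, 1.0))

def dispersion(hist, mid):
    # Eq. (5): mean absolute deviation of the histogram from Mid_j
    i = np.arange(256)
    return (hist * np.abs(mid - i)).sum() / hist.sum()

# Example: histogram of a uniform mid-gray 8x8 patch.
patch = np.full((8, 8), 100)
hist = np.bincount(patch.ravel(), minlength=256)
mid = weighted_centre(hist)                     # -> 100.0
c3 = shadow_coefficient(mid, b1=100, tau=16.0)  # at the boundary -> 0.5
d = dispersion(hist, mid)                       # uniform patch -> 0.0
```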

Figure 5: Different scenes which required different strategies of HDR merging


Figure 6: Images for HDR merging (0, -2, +2 stops) and their histograms in MATLAB

It is obvious that if a patch contains parts of two objects, the dispersion of its histogram will be much larger than that of a uniform or even slightly gradient image. If the dispersion exceeds a certain limit, the patch is subdivided and its parts are analyzed separately, until the largest element with a satisfying histogram dispersion is found. The result of such HDR merging by two different tools, without additional post-processing, is presented in Figure 8.
Analyzing the histograms, important conclusions can be made. Namely, our approach allows preserving more details in the whole dynamic range. The histogram looks more uniform, covering more shades (15…25) and semi-tones (130…200). On the other hand, comparing the images, the one with the wider histogram (see Figure 8b) looks more contrasty without introducing any additional effects.
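The patch-subdivision rule described above can be sketched recursively; the dispersion threshold and minimum patch size below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def dispersion(patch):
    """Mean absolute deviation of the patch values from their mean --
    a direct, histogram-free analogue of Eq. (5)."""
    return float(np.abs(patch - patch.mean()).mean())

def analyse(patch, top=0, left=0, limit=20.0, min_size=2):
    """Split a patch into quadrants until every part's dispersion is
    below `limit`; yield accepted tiles as (row, col, height, width)."""
    h, w = patch.shape
    if dispersion(patch) <= limit or h <= min_size or w <= min_size:
        yield (top, left, h, w)
        return
    h2, w2 = h // 2, w // 2
    for r0, r1, c0, c1 in ((0, h2, 0, w2), (0, h2, w2, w),
                           (h2, h, 0, w2), (h2, h, w2, w)):
        yield from analyse(patch[r0:r1, c0:c1], top + r0, left + c0,
                           limit, min_size)

# A patch whose left half is black and right half is white is split
# once; each uniform quadrant is then accepted as-is.
two_tone = np.zeros((8, 8))
two_tone[:, 4:] = 255.0
tiles = list(analyse(two_tone))   # -> four 4x4 tiles
```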

Figure 7: Transparency coefficients for merging the images into HDR according to (1)


Figure 8: Result of HDR merging: a) result of merging in Adobe Photoshop HDR Pro, b) our method realized in MATLAB, c) their histograms
5. Post-processing and preparing for output

The resulting HDR image, with its high color depth, needs to be rendered for a pleasing effect and correct viewing on low-dynamic-range monitors or prints. The next step of HDR image production (tone mapping) involves reducing the image’s tonal range to a viewable/printable format without losing any important information (details). The tone-reproduction rendering often includes separate tone mapping and gamma compression steps. Most HDR programs offer various tone mapping methods with various parameters for adjustment. Depending on the particular photographer’s needs (e.g. producing just aesthetically pleasing images, reproducing as many image details as possible, maximizing the image contrast), one of the tone mapping methods can be selected. Tone mapping methods allow reducing the dynamic range or contrast ratio of the entire image while retaining localized contrast between neighbouring pixels, trying to represent the whole dynamic range while retaining realistic color and contrast.
Among the final steps of the digital HDR workflow we should also mention image optimization and preparing for output. During optimization it is often necessary to selectively dodge/burn specific parts of a finished HDR photo, or to adjust the colors to a more realistic level. Then, depending on the type of media the photographer chooses (inkjet prints, publishing on the Web or producing other types of output), the whole phase can be divided into several steps: back up the HDR images, scale the images for output, output sharpening, generate output (convert to the output device color space), soft proof, check results and publish.
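As one concrete instance of a global operator, the widely used Reinhard-style mapping compresses an unbounded luminance range into [0, 1); this is a minimal sketch of that well-known operator, not the method of any particular HDR program:

```python
import numpy as np

def reinhard_tonemap(lum, key=0.18):
    """Global Reinhard-style operator: scale the luminance by a key
    value relative to its log-average, then compress with L/(1+L),
    which maps any positive luminance into [0, 1)."""
    log_avg = float(np.exp(np.log(lum + 1e-6).mean()))
    scaled = key * lum / log_avg
    return scaled / (1.0 + scaled)

# A scene spanning six orders of magnitude fits into the display range:
scene = np.array([0.01, 1.0, 100.0, 10000.0])
ldr = reinhard_tonemap(scene)
```

Note how the mapping preserves the ordering of the input luminances (localized contrast survives) while the global contrast ratio is drastically reduced.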


6. Conclusions

In this paper, different approaches to HDR image capturing, rendering and the latest post-processing techniques have been joined into one general workflow. For convenience, we have presented all the stages of the workflow as a single flowchart, for further process automation as a multiparameter optimization problem.
Obviously, there is no unique way to achieve the final goal. Photographers always have to adapt the general complete workflow to their personal experience, the content of the images, the available software and the peculiarities of their job. Moreover, the proposed basic structure of the workflow should help photographers, guiding them in developing specialized workflows for HDR panoramic images, close-ups, high-contrast scenes and architecture photography.
As a first step towards full automation of the personalized HDR workflow, an HDR merging algorithm involving many parameters has been proposed, which leads to further optimization work.

References

[1] White Paper (2009): High Dynamic Range Imaging: Images and Sensors. Fundamentals, Method of Functioning, Application. IDS Imaging Development Systems, p. 21. Online: www.ids-imaging.com/en/IDS_Whitepaper_2009-3_HDR_EN.pdf
[2] Davis, H. (2010): Creative Close-Ups: Digital Photography Tips and Techniques. Wiley Publishing Inc., p. 239.
[3] Botelho, A. M.: Early Paper Photographic Processes: The Calotype. Le Gray’s Waxed Paper Negative Process. Andrew W. Mellon Fellowship, Advanced Residency Program in Photograph Conservation. Online: http://www.notesonphotographs.org/images/6/69/Botelho_calotype_for_web.pdf
[4] Goesele, M., et al. (2005): High Dynamic Range Techniques in Graphics: from Acquisition to Display. Eurographics 2005, Tutorial T7.
[5] Schewe, J.; Fraser, B. (2004): A Color Managed Raw Workflow from Camera to Final Print. Technical paper. Online: http://www.adobe.com/digitalimag/pdfs/color_managed_raw_workflow.pdf
[6] Gurieva, N.; Tkachenko, V. (2012): The Role of Color Management in Digital Prepress Workflow. In: International Circular of Graphic Education and Research, Issue no. 5, pp. 44–51.
[7] Long, B. (2010): Complete Digital Photography. Fifth edition, Course Technology, USA, p. 581.
[8] Grey, T. (2008): Color Confidence: The Digital Photographer’s Guide to Color Management. 2nd edition, Sybex Inc., San Francisco, p. 252.
[9] Homann, J.-P. (2008): Digital Color Management: Principles and Strategies for the Standardized Print Production. X.media.publishing Series, Springer, Berlin, p. 212.
[10] Mathworks (2013): MATLAB R2013a Documentation. Mathworks, Cambridge, MA.

All photos in this paper were taken by Natalia Gurieva.


Camera: Canon EOS REBEL T3i, lens: EF-S18–55mm
f/3.5-5.6 IS II.


Natalia Gurieva
Department of Digital Art and Management,
Division of Engineering,
University of Guanajuato,
Salamanca, Gto., Mexico

gurieva.natalia@gmail.com
