Mahmood R. Golzarian 1
1 Agricultural Machinery Research and Design Center, University of South Australia, Adelaide, Australia
Mahmood.golzarian@postgrads.unisa.edu.au
Abstract
The main task of a machine vision system is to distinguish the object of interest (in the case of this project, a plant) from unimportant regions (referred to here as background). Distinguishing the objects of interest is simplified if high contrast is created between the objects of interest and the background. The objective of this study is to find the color index by which the algorithm can create the highest contrast between plant and non-plant regions. For this study, images were taken of varying numbers of wheat plants at several growth stages, in a loamy sand soil and under diffused light conditions. Three regions were predefined on the images: plant, pebble, and soil. Regions for plants, soil and pebbles were cropped separately within each image, aiming to provide a pooled representation of each object class in each image. For each image, 13 mean color indices were computed for each of the three regions of interest (plant, soil, and pebble). The results of analysis of variance (ANOVA) and subsequent t-tests indicated that the modified excessive green index (MEGI) can potentially create the highest contrast between plant and non-plant regions compared with the other color indices.
Keywords: Computer vision system, image analysis, color plant segmentation, color indices.
1. Introduction
Agriculture may considerably benefit from machine vision technology in applications such as automatic control of farm equipment, automatic fruit sorting, and plant/weed detection for monitoring crop establishment. Monitoring crop establishment can directly help grain growers and farm advisers improve grain yield, manage crops more effectively (e.g. for weeds), and achieve a more efficient use of resources [1]. Plant identification and plant counting are two underlying factors in monitoring crop establishment. In conventional methods, plant identification and counting are done by an operator using quadrats (Fig. 1). This process is a physically demanding, tedious and time-consuming task. Operator fatigue and boredom are recognized problems of the conventional method, which lead to inaccuracies likely to increase over time and limit the number of samples that can be relied upon.
Figure 1. The use of a quadrat to monitor crop establishment (Photos courtesy of Footscray City College)
Computer vision may potentially provide a solution for monitoring crop establishment. Computer vision is the study and application of methods which allow computers to "understand" image content, or the content of multidimensional data in general. The term "understand" here means that specific information is extracted from the image data for a specific purpose: either to present it to a human operator, or to control some process such as an industrial robot, an agricultural implement or an autonomous vehicle [2]. If a machine vision system is to offer a competitive advantage over the conventional method, it must have clearly defined objects of interest to enhance the recognition step. Therefore, there is a need to develop a reliable process able to achieve clearly segmented objects of interest. Segmentation is a process in which several image processing techniques and analyses are applied to images to separate the background area from the object under consideration (technically referred to as the area or object of interest). This study investigated different processing methods using color features to segment plant species from a soil background in a selection of images. The objective of this study is to find the best color index by which the algorithm is able to create the highest contrast between plant and non-plant objects.
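The index-based segmentation idea described above can be illustrated with a minimal Python sketch: a color index is computed per pixel and then thresholded to split plant from background. The index here is the excessive green index discussed later in the paper; the pixel values and the threshold of 0.0 are made up for illustration and are not taken from the study.

```python
# A minimal sketch of index-based segmentation: compute a color index per
# pixel, then threshold it. Threshold and pixel values are illustrative.

def segment_by_index(pixels, index_fn, threshold=0.0):
    """Return a boolean mask: True where index_fn(pixel) exceeds threshold."""
    return [index_fn(p) > threshold for p in pixels]

def excess_green(p):
    """Normalized-channel form of the excessive green index: 2g - r - b."""
    r, g, b = p
    total = r + g + b
    if total == 0:
        return 0.0
    return (2 * g - r - b) / total

# Two hypothetical pixels: a greenish (plant-like) and a brownish (soil-like) one.
pixels = [(40, 120, 30), (120, 90, 60)]
mask = segment_by_index(pixels, excess_green)
```

In a real system the mask would be computed over every pixel of the image; this scalar version only shows the decision rule.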
Fed into the red, green and blue inputs of a color monitor, the three component images produce a color image on the screen (Fig. 2).
Figure 2. Three R, G and B component images form RGB color image (after ref [4])
Each dimension has 256 levels, numbered 0 to 255. In total, 256^3 different colors can be represented by (R, G, B); e.g., black is shown as (0, 0, 0) while white is shown as (255, 255, 255). To make the color independent of changes in lighting intensity, a process called normalization is applied uniformly across the spectral distribution [5]. Using equation (1), the normalized red, green and blue values are obtained from the three components of RGB space.
r = R / (R + G + B),  g = G / (R + G + B),  b = B / (R + G + B)    (1)
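Equation (1) can be sketched in a few lines of Python; the input values below are arbitrary illustrations.

```python
# Sketch of chromaticity normalization, equation (1): each channel is divided
# by R + G + B, which removes dependence on overall lighting intensity.

def normalize_rgb(R, G, B):
    total = R + G + B
    if total == 0:  # pure black: no chromaticity information
        return (0.0, 0.0, 0.0)
    return (R / total, G / total, B / total)

r, g, b = normalize_rgb(100, 150, 50)
# By construction r + g + b = 1 (equation (2)), and scaling the input by any
# positive factor leaves (r, g, b) unchanged.
```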
Regarding the definition of r, g and b, the following relationship among these three parameters can be obtained:
r + g + b = 1    (2)
The simple (R - G) index is often used, tending to be negative for plants while positive for soil. The normalized difference between the red and green channels (r - g) was used by the authors of ref [6] to enhance the contrast between vegetation and background. The excessive green index (EGI) transforms a 24-bit RGB source image into a 256 gray-level image in which plant pixels appear brighter than soil [7, 8, 9]. This index is defined as follows:
EGI = 2g - r - b    (3)
where r, g and b are the normalized red, green and blue values respectively. Following (2) and (3), EGI can be re-written as a linear function of g:
EGI = 3g - 1    (4)
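The identity between (3) and (4) can be checked numerically with a short Python sketch (the pixel values are arbitrary):

```python
# Sketch verifying the EGI identity: with normalized channels,
# 2g - r - b equals 3g - 1 because r + g + b = 1 (equations (3) and (4)).

def egi(R, G, B):
    total = R + G + B
    if total == 0:
        return -1.0  # g = 0 for a black pixel, so 3g - 1 = -1
    r, g, b = R / total, G / total, B / total
    return 2 * g - r - b

def egi_linear(R, G, B):
    total = R + G + B
    g = G / total if total else 0.0
    return 3 * g - 1
```

Both forms agree on any pixel, so the cheaper linear form (4) can be used in practice.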
The modified excessive green index (MEGI) is a variation of EGI proposed by some researchers [10], whereby:
MEGI = EGI; MEGI = 0 for g < r or g < b    (5)
The normalized difference index (NDI) further increases the contrast between plant and soil, and refs [10, 11] formulated NDI as follows:
NDI = (G - R) / (G + R)    (6)
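As a minimal Python sketch of the two indices (pixel values are illustrative, and NDI is taken here in its (G - R)/(G + R) form per refs [10, 11]):

```python
# MEGI keeps the EGI value only where green dominates both other
# channels, zeroing everything else; NDI contrasts green against red.

def megi(R, G, B):
    total = R + G + B
    if total == 0:
        return 0.0
    r, g, b = R / total, G / total, B / total
    if g < r or g < b:  # green is not the dominant channel: suppress
        return 0.0
    return 2 * g - r - b  # EGI, equation (3)

def ndi(R, G):
    if R + G == 0:
        return 0.0
    return (G - R) / (G + R)
```

The suppression in MEGI is what zeroes bluish pebble pixels that plain EGI can leave with non-zero values.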
where R, G and B are the red, green and blue values respectively. Hue was also used by some researchers for plant-background segmentation [7, 8]. In the hue, saturation and intensity (HSI) color model, hue is the angle measured from the red axis to the point of interest, and it describes pure color (Fig. 3). The relationship between hue and RGB values is formulated in (7).
hue = cos^-1 { [(R - G) + (R - B)] / (2 [(R - G)^2 + (R - B)(G - B)]^0.5) }    (7)
Figure 4. Comparison of non-normalized red, green and blue values for three groups (plant, pebble, soil background)

The non-normalized R - G factor could only be used to differentiate plant areas from the soil background. However, when bluish pebble areas exist, this factor cannot separate plant from pebbles. For this color index, the mean value for plant regions was -36.077 with a standard error (SE) of 0.644, whereas the mean value for pebble-related regions was -33.402 with an SE of 0.767 (n = 60) (Fig. 5).
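The per-region statistics reported above (a mean and standard error per group) can be sketched as follows; the sample values in the snippet are made up for illustration and are not the study's data.

```python
# Sketch of the group summary used in the comparisons: mean and standard
# error of an index within each labeled region. Sample values are hypothetical.
import math

def mean_and_se(values):
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)  # sample variance
    return mean, math.sqrt(var / n)                       # SE = s / sqrt(n)

plant_rg = [-35.0, -37.2, -36.1, -36.5]    # hypothetical per-image R - G means
pebble_rg = [-33.0, -34.1, -33.5, -32.9]
plant_mean, plant_se = mean_and_se(plant_rg)
pebble_mean, pebble_se = mean_and_se(pebble_rg)
```

In the study itself these group means then fed ANOVA and t-tests to judge which index best separates plant from non-plant regions.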
For those pixels whose blue value is greater than both the green value and the red value, hue is modified to hue = 360 - hue. This condition was set to eliminate the possibility that two points lying just to the right and just to the left of the red axis (the reference line) have very different hue values while in fact being almost the same color. To solve this problem, all hue values greater than 300 are shifted to hue - 360; thus a color with a hue of 345 would be -15 in this modified hue system.
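The wraparound adjustment described above can be sketched directly; the cutoff of 300 is the one stated in the text.

```python
# Sketch of the modified-hue rule: hues near 360 and hues near 0 are almost
# the same color, so values above 300 are shifted by -360 to keep similar
# colors numerically close.

def modified_hue(hue):
    if hue > 300:
        return hue - 360
    return hue

# e.g. a hue of 345 becomes -15, landing next to a hue of 5 rather than
# 340 units away from it.
```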
Figure 5. Comparison of R - G values for the three categories (plant, soil and pebble)
the results for a bigger range of plant types and different soil and other non-plant backgrounds.
Acknowledgments
The author is particularly grateful to Ferdowsi University of Mashhad, Iran for currently providing a research scholarship. The help of Dr. Desbiolles for proof-reading and editing the manuscript is also gratefully acknowledged.
Figure 6. Comparison of EGI values for the three categories (plant, soil and pebble)

Figure 7. Comparison of g values for the three regions of interest (plant, soil and pebble)

Figure 8. Comparison of MEGI values for the three regions of interest (plant, soil and pebble)

References:
1. GRDC. 1999. Crop establishment. Grains Research and Development Corporation, 18 p. Viewed 21 May 2006 at http://topcrop.grdc.com.au/publications/cmg/check2.htm
2. G. Kormann and W. Flohr. 2006. Development of a constituent sensor for agricultural applications. CIGR World Congress, Germany.
3. A. McAndrew. 2004. Introduction to Digital Image Processing with MATLAB. Thomson Course Technology, Boston, MA.
4. R.C. Gonzalez, R.E. Woods and S.L. Eddins. 2004. Digital Image Processing Using MATLAB. Pearson Prentice Hall, NJ, United States.
5. H.D. Cheng, X.H. Jiang, Y. Sun and J. Wang. 2001. Color image segmentation: advances and prospects. Pattern Recognition, 34, pp. 2259-2281.
6. J. Blasco, J.V. Benlloch, M. Agusti and E. Molto. 1998. Machine vision for precise control of weeds. In: SPIE 98: Proceedings of the International Society for Optical Engineering, pp. 336-343.
7. D.M. Woebbecke, G.E. Meyer, K.V. Bargen and D.A. Mortensen. 1995. Color indices for weed identification under various soil, residue, and lighting conditions. Transactions of the ASAE, 38(1), pp. 259-269.
8. R.D. Lamm, D.C. Slaughter and D.K. Giles. 2002. Precision weed control system for cotton. Transactions of the ASAE, 45(1), pp. 231-238.
9. L. Tian, D.C. Slaughter and R.F. Norris. 1997. Outdoor field machine vision identification of tomato seedlings for automated weed control. Transactions of the ASAE, 40(6), pp. 1764-1768.
10. W. Mao, Y. Wang and Y. Wang. 2003. Real-time detection of between-row weeds using machine vision. ASAE Annual International Meeting.
11. A.J. Pérez, F. López, J.V. Benlloch and S. Christensen. 2000. Color and shape analysis techniques for weed detection in cereal fields. Computers and Electronics in Agriculture, 25, pp. 197-212.
12. D.C. Montgomery, G.C. Runger and N.F. Hubele. 2004. Engineering Statistics. Wiley, New York.