UP-GAP
ENESAD
Quétigny, France
F. Truchetet
LE2I
Université de Bourgogne
Le Creusot, France
ABSTRACT
Keywords: Spectral modeling, BRDF, RGB color space, field modeling, weed
discrimination.
INTRODUCTION
Angle calculation
The spectral response obtained for each field pixel takes neither the lighting nor the action of a filter into account. These two steps, essential to create a picture that simulates reality, are performed using simple calculation processes.
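As a minimal sketch of these two steps, the spectral response of a pixel can be multiplied by an illuminant spectral power distribution and a filter transmittance curve; all spectra below are hypothetical placeholders, not the models used in the paper:

```python
import numpy as np

# Wavelength grid (nm) over the visible range.
wl = np.arange(400, 701, 10)

# Hypothetical reflectance spectrum of one field pixel
# (in the paper this would come from PROSPECT or SOILSPECT).
reflectance = 0.2 + 0.3 * np.exp(-((wl - 550.0) / 40.0) ** 2)

# Hypothetical illuminant spectral power distribution (flat, "equal energy").
illuminant = np.ones_like(wl, dtype=float)

# Hypothetical optical filter transmittance (band-pass around 550 nm).
filter_t = np.exp(-((wl - 550.0) / 60.0) ** 2)

# The two steps: light the surface, then pass the result through the filter.
radiance = reflectance * illuminant * filter_t
```

With a non-trivial illuminant (e.g. a tabulated D65 curve), the same element-wise product applies unchanged.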
Filter simulation
RGB color spaces are specified by their primary colors and a white point (often D65). They differ principally in their gamut, the set of colors that the color space can represent. Two color spaces are the most widely used: Adobe RGB and sRGB (Adobe RGB will be used in the following examples).
The X, Y and Z values are then transformed into an RGB color space using a matrix that depends on the chosen RGB color space. Many resources (matrices, conversion formulae…) and details about these operations can be found on Bruce Lindbloom's website (Lindbloom).
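The two operations (spectrum integration into CIE XYZ, then the matrix transform into RGB) can be sketched as follows. The color matching functions here are crude Gaussian stand-ins for illustration only; real applications use the tabulated CIE 1931 curves. The matrix and companding shown are the standard sRGB (D65) ones rather than Adobe RGB:

```python
import numpy as np

wl = np.arange(400, 701, 5, dtype=float)

# Crude Gaussian approximations of the CIE 1931 color matching functions
# (illustrative only; use the tabulated CMFs in practice).
def gauss(x, mu, s):
    return np.exp(-0.5 * ((x - mu) / s) ** 2)

x_bar = 1.06 * gauss(wl, 600.0, 38.0) + 0.36 * gauss(wl, 442.0, 16.0)
y_bar = 1.01 * gauss(wl, 556.0, 47.0)
z_bar = 1.84 * gauss(wl, 446.0, 22.0)

# Hypothetical spectrum reaching the sensor for one pixel.
spectrum = 0.5 + 0.4 * np.cos((wl - 550.0) / 60.0)

# Step 1: integrate the spectrum against the CMFs to obtain XYZ tristimuli,
# normalized so that Y lies in [0, 1].
dl = wl[1] - wl[0]
norm = np.sum(y_bar) * dl
X = np.sum(spectrum * x_bar) * dl / norm
Y = np.sum(spectrum * y_bar) * dl / norm
Z = np.sum(spectrum * z_bar) * dl / norm

# Step 2: XYZ -> linear RGB with the sRGB (D65) matrix, then gamma companding.
M = np.array([[ 3.2406, -1.5372, -0.4986],
              [-0.9689,  1.8758,  0.0415],
              [ 0.0557, -0.2040,  1.0570]])
rgb_lin = np.clip(M @ np.array([X, Y, Z]), 0.0, 1.0)
rgb = np.where(rgb_lin <= 0.0031308,
               12.92 * rgb_lin,
               1.055 * rgb_lin ** (1 / 2.4) - 0.055)
```

Switching to Adobe RGB, as in the paper's examples, only changes the matrix and the companding exponent (a pure 2.2 gamma); the structure of the computation is identical.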
RESULTS
RGB values are now available for every field pixel; these values are then modified by the integration step of the world-to-picture transform. Nevertheless, the lack of multi-angular spectra makes it impossible to create pictures that simulate a field correctly. As a consequence, the field presented in Fig. 4 was obtained with inappropriate PROSPECT and SOILSPECT parameters: the data used to estimate them were not multi-angular.
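Producing the full field picture is then a per-pixel application of the spectrum-to-RGB conversion, which vectorizes naturally. A sketch with hypothetical random spectra and simplified Gaussian color matching functions (both placeholders, as before):

```python
import numpy as np

h, w = 4, 6                              # tiny hypothetical field image
wl = np.arange(400, 701, 10, dtype=float)

# Hypothetical per-pixel spectra (random values, for illustration only).
rng = np.random.default_rng(0)
spectra = rng.uniform(0.0, 1.0, size=(h, w, wl.size))

# Simplified Gaussian color matching functions (illustrative stand-ins).
def gauss(x, mu, s):
    return np.exp(-0.5 * ((x - mu) / s) ** 2)

cmf = np.stack([gauss(wl, 600.0, 38.0),   # x_bar (crude)
                gauss(wl, 556.0, 47.0),   # y_bar (crude)
                gauss(wl, 446.0, 22.0)])  # z_bar (crude)

# Integrate every pixel's spectrum against the CMFs in one einsum call.
xyz = np.einsum('hwl,cl->hwc', spectra, cmf) / cmf[1].sum()

# XYZ -> linear sRGB (D65 matrix), clipped to the displayable range.
M = np.array([[ 3.2406, -1.5372, -0.4986],
              [-0.9689,  1.8758,  0.0415],
              [ 0.0557, -0.2040,  1.0570]])
rgb = np.clip(np.einsum('hwc,rc->hwr', xyz, M), 0.0, 1.0)
```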
CONCLUSION
In this paper, we have presented the bases for adding a spectral layer to the spatial field modeling previously developed by the authors. This approach uses validated BRDF models: one dedicated to simulating vegetation reflectance (PROSPECT) and the other to simulating soil reflectance (SOILSPECT). These models describe the variation of a reflectance spectrum depending on the viewing and lighting angles, and allow a plant (or a soil) to be characterized by a set of parameters linked to physical parameters. The ability to reproduce optical filter effects is also considered, allowing a larger number of experimental devices to be fitted. The resulting pictures are expressed in an RGB color space, with the ability to choose a particular one. The transformation from spectra to RGB is done in a two-step process: a first step integrates the spectrum into the CIE XYZ space, and a second transforms the XYZ tristimuli into RGB tristimuli.
Data acquisition could be part of the future work to complete this spectral approach; a comparison between real and virtual scenes would then be very interesting to validate the modeling. Another interesting point is the development of a crop/weed discrimination algorithm that uses both spatial and spectral information to overcome the limitations of the spatial approach on its own.
ACKNOWLEDGEMENTS
The authors are grateful for the financial support provided by Tecnoma (trademark of the EXEL Industries group: http://www.tecnoma.com) and the Regional Council of Burgundy.
REFERENCES
Bossu, J., Gée, C., Guillemin, J. P. and Truchetet, F. (2006). Development of methods based on double Hough transform and Gabor filtering to discriminate crop and weeds in agronomic images. SPIE 18th Annual Symposium Electronic Imaging Science and Technology, San Jose, USA, 15-19 January.
Hahn, F. and Muir, A. Y. (1994). "Spectral sensing for crops and weed discrimination." Acta Hort. (ISHS) 372: 179-186.
Jones, G., Gée, C. and Truchetet, F. (2007b). Simulation of agronomic images for an automatic evaluation of crop/weed discrimination algorithms. Eighth International Conference on Quality Control by Artificial Vision, Le Creusot, France, 23-25 May, SPIE. European Conference on Precision Agriculture, Skiathos, Greece, 3-6 June.
Vioix, J., Sliwa, T. and Gée, C. (2006). An automatic inter- and intra-row weed detection in agronomic images. EurAgEng, Germany, 3-7 September.
Vioix, J. B., Douzals, J. P., Truchetet, F., Assemat, L. and Guillemin, J. P. (2002). "Spatial and spectral methods for weed detection and localization." EURASIP Journal on Applied Signal Processing 7: 679-685.