...between these regions is a single-valued function of the row coordinate, and that this boundary does not intersect either the left or right edge of the image. This boundary function is represented explicitly by the set of pixels which lie on it, so that nothing further is assumed about the shape of the boundary. Figure 2 shows a representable segmentation, and Figure 3 shows some non-representable segmentations. This representation was...

4. Results

Using the algorithm described above, we have successfully harvested approximately one acre of alfalfa autonomously. This occurred during one week of testing at a site in Hickory, Pennsylvania, in sparse crop, with curved crop lines. (Not all of our testing involved actual harvesting; to avoid cutting all of our available crop, many tests were conducted by simply driving next to the crop line, using the same perception system to track the line.) Our peak speed while harvesting was approximately 4.5 miles an hour; the average speed was approximately 3 miles an hour.

The RGB camera was mounted at the level of the top of the cab (about 4 meters high) and about 2 meters to the side, directly over the crop line. This allowed us to control the harvester without the need for camera calibration; our control algorithm was merely based on steering to keep the cut line in the center of the image. Steering commands were temporally smoothed over a one-second time interval to prevent jerky motion (a sketch of one plausible smoothing scheme appears at the end of this section). We currently have no quantitative means for evaluating the precision of the cut; however, we estimate that the crop line was tracked successfully to within a tolerance of roughly one foot.

[Figure 5: Sample image segmentations]

The images in Figure 5 were processed using a discriminant of red/green. The images shown are 640 x 480 pixels; typically, when running in real time, we use only a 400 x 300 window in the center of the image. The first is from El Centro, California; the second is from our harvester testbed during a tracking run in Hickory, Pennsylvania. The black dots indicate the location of the computed crop cut boundary.
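The smoothing step above is simple enough to prototype. The following is a minimal sketch, not the Demeter implementation: the paper states only that commands were smoothed over a one-second interval, so the choice of a moving average, and the names SteeringSmoother and smooth, are our assumptions for illustration.

```python
from collections import deque
import time

class SteeringSmoother:
    """Moving-average smoothing of steering commands over a fixed time
    window (assumed form; the paper only says commands were smoothed
    over a one-second interval)."""

    def __init__(self, window_s=1.0):
        self.window_s = window_s
        self.history = deque()  # (timestamp, command) pairs

    def smooth(self, command, now=None):
        """Record a raw steering command and return the average of all
        commands issued within the last window_s seconds."""
        now = time.monotonic() if now is None else now
        self.history.append((now, command))
        # Discard commands that have aged out of the smoothing window.
        while now - self.history[0][0] > self.window_s:
            self.history.popleft()
        return sum(c for _, c in self.history) / len(self.history)
```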
5. Future work

We plan to improve Demeter's perception system in the future on several fronts. First, we are examining improvements to this algorithm such as detecting and removing shadow noise and using a custom-built filtered camera instead of an RGB camera. Second, we plan to integrate the crop line tracker with GPS and inertial sensing in order to provide additional information about the location of the crop cut line and also to help with tasks such as end-of-row detection. Finally, we plan to continue our investigation into alternative ways of sensing the crop line, such as by texture segmentation.

Acknowledgments

The authors would like to thank Nick Collela and Red Whittaker for their technical input; Regis Hoffman and Tom Pilarski for their coding assistance; and Kerien Fitzpatrick, Henning Pangels, and Simon Peffers for assistance with the Demeter harvester experiments. This work was supported by NASA under contract number NAGW-3903.

Appendix: Algorithm derivation

Let j_d be the rightmost column to the left of the discontinuity in the step function, and suppose that the column numbers vary from 0 to j_{max}. Then the m_l, m_r, and error terms can be calculated as functions of j_d as follows (these are defined for j_d from 0 to j_{max} - 1):

    m_l(j_d) = \frac{\sum_{j=0}^{j_d} d(i,j)}{j_d + 1}

    m_r(j_d) = \frac{\sum_{j=j_d+1}^{j_{max}} d(i,j)}{j_{max} - j_d}

    error(j_d) = \frac{\sum_{j=0}^{j_d} \left( d(i,j) - m_l \right)^2 + \sum_{j=j_d+1}^{j_{max}} \left( d(i,j) - m_r \right)^2}{j_{max} + 1}

Clearly, it requires order n time to compute error(j_d) from the d(i,j) alone. It is possible, however, to compute error(j_d + 1) from error(j_d) in constant time. To accomplish this, we express the calculation of error(j_d) in terms of the following sums:

    t_l(j_d) = \sum_{j=0}^{j_d} d(i,j)    (1)

    t_r(j_d) = \sum_{j=j_d+1}^{j_{max}} d(i,j)    (2)

    t_{2l}(j_d) = \sum_{j=0}^{j_d} \left[ d(i,j) \right]^2    (3)

    t_{2r}(j_d) = \sum_{j=j_d+1}^{j_{max}} \left[ d(i,j) \right]^2    (4)

From these, error(j_d) is calculated as follows:

    error(j_d) = \frac{\left( t_{2l}(j_d) - \frac{t_l(j_d)^2}{j_d + 1} \right) + \left( t_{2r}(j_d) - \frac{t_r(j_d)^2}{j_{max} - j_d} \right)}{j_{max} + 1}    (5)

Since Eqns 1-4 can be computed recursively, it takes only a constant number of operations to compute t_l, t_r, t_{2l}, t_{2r}, and error at j_d + 1 given their values at j_d.

We can make the computation faster still by computing, instead of error(j_d), some monotonic 1-to-1 function of error(j_d) such as the following:

    f(j_d) = error(j_d) \cdot (j_{max} + 1) - \sum_{j=0}^{j_{max}} \left[ d(i,j) \right]^2    (6)

which can be computed recursively using the following three equations:

    t_l(0) = d(i,0), \quad t_l(j_d) = t_l(j_d - 1) + d(i,j_d)    (7)

    t_r(0) = \sum_{j=1}^{j_{max}} d(i,j), \quad t_r(j_d) = t_r(j_d - 1) - d(i,j_d)    (8)

    f(j_d) = -\frac{t_l(j_d)^2}{j_d + 1} - \frac{t_r(j_d)^2}{j_{max} - j_d}    (9)

This additional speedup results in a roughly 30% increase in cycle rate compared to computing error(j_d) directly from Eqn. 5.
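The recursion in Eqns 7-9 maps directly onto a few lines of code. The sketch below is our own illustration, not the authors' code: given one image row's discriminant values in a 1-D sequence d (for instance, a red/green measure per pixel, as in Figure 5), it maintains t_l and t_r per Eqns 7 and 8 and returns the column j_d minimizing f of Eqn. 9. Because f differs from error(j_d) only by the monotonic transformation of Eqn. 6, the minimizing column is the same as for the least-squares error of Eqn. 5, while each candidate column costs only O(1) work.

```python
def best_split(d):
    """Return the rightmost column left of the best-fit step
    discontinuity for one row of discriminant values d[0..jmax],
    by minimizing f(jd) of Eqn. 9.  Requires len(d) >= 2."""
    jmax = len(d) - 1
    tl = d[0]                    # tl(0) = d(i,0)                  (Eqn. 7)
    tr = sum(d[1:])              # tr(0) = sum_{j=1..jmax} d(i,j)  (Eqn. 8)
    best_jd = 0
    best_f = -tl * tl - tr * tr / jmax          # f(0), from Eqn. 9
    for jd in range(1, jmax):    # f is defined for jd = 0 .. jmax-1
        tl += d[jd]              # tl(jd) = tl(jd-1) + d(i,jd)
        tr -= d[jd]              # tr(jd) = tr(jd-1) - d(i,jd)
        f = -tl * tl / (jd + 1) - tr * tr / (jmax - jd)   # Eqn. 9
        if f < best_f:
            best_jd, best_f = jd, f
    return best_jd
```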

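As a quick sanity check of the hypothetical best_split sketch above, a synthetic row with a clean step at a known column recovers that column (the values are arbitrary):

```python
row = [5.0] * 120 + [1.0] * 80   # bright "uncut crop" region, then dark "cut" region
print(best_split(row))            # prints 119, the rightmost column left of the step
```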