DEVELOPMENT OF A REAL-TIME VISION SYSTEM FOR AN AUTONOMOUS MODEL AIRPLANE

Danko Antolovic

Submitted to the faculty of the University Graduate School in partial fulfillment of the requirements for the degree Master of Science in the Department of Computer Science, Indiana University

October 2001

Accepted by the Graduate Faculty, Indiana University, in partial fulfillment of the requirements for the degree of Master of Science.

Thesis committee:

Prof. Steven D. Johnson, Ph.D.
Prof. Florin Cutzu, Ph.D.
Prof. Michael E. Gasser, Ph.D.

September 21, 2001

Copyright © 2001 Danko Antolovic
ALL RIGHTS RESERVED

ACKNOWLEDGMENTS

This thesis describes a hardware/software system constructed in a series of Y790 independent study courses in the Department of Computer Science at Indiana University. The work was done under the supervision of Professor Steven D. Johnson, to whom I am grateful for his support, his interest in the progress of the project, and his insightful and critical comments. Professor Johnson constructed the camera gimbal currently in use. I am happy to have had the help of Mr. Bryce Himebaugh, engineer and pilot extraordinaire. Besides constructing the A/D converter and the servo circuit, Bryce has shared his knowledge and skill through many helpful and enjoyable discussions. I also wish to thank Professor Robert DeVoe of the IU School of Optometry. His expertise on animal vision has helped me establish a broader context for some of the problems encountered in robotic perception.
Finally, my thanks go to Laurie, my spouse, for her patience during this, the latest of my academic stints.

ABSTRACT

Danko Antolovic
DEVELOPMENT OF A REAL-TIME VISION SYSTEM FOR AN AUTONOMOUS MODEL AIRPLANE

This thesis describes a real-time embedded vision system capable of tracking two-dimensional objects in a relatively simple (uncluttered) scene, in live video. This vision system is intended as a component of a robotic flight system, used to keep a model airplane in a holding pattern above an object on the ground. The system uses a two-pronged approach to object tracking, taking into account both the motion of the scene and the graphic "signature" of the object. The vision system consists of these main components: a motion-detection and filtering ASIC, implemented on FPGAs; a scene-analysis program running on a Motorola ColdFire processor; a dual-port RAM holding the image data; and a digital camera on a motorized gimbal.

CONTENTS

1. Introduction to the Skeyeball Vision Project
   1.1 History of the vision system
   1.2 Structure of this document
2. Functional Overview of the Vision System
   2.1 Vision methodology
   2.2 Biological parallels
3. Project Status
   3.1 Capabilities and limitations
   3.2 Measurements of the tracking speed
   3.3 Summary remarks on the perception problem
4. Hardware Architecture
   4.1 Architectural components
   4.2 Biomorphic approach to architecture
5. Design Summary
   5.1 XC4010 digital design
       5.1.1 Front-end FPGA
       5.1.2 Back-end FPGA
   5.2 MCF5307 (ColdFire) code
6. NTSC Video Signal
   6.1 Even field
   6.2 Odd field
7. Formatting the Image Scan
   7.1 Vertical formatting
   7.2 Horizontal formatting
   7.3 Auxiliary components
   7.4 Signals
8. Digitizing and Thresholding
   8.1 Black-and-white inversion
9. Setting the Threshold Automatically
   9.1 Heuristic procedure
   9.2 Threshold calculation on the back-end FPGA
       9.2.1 Data path
       9.2.2 Control
       9.2.3 Signals
10. Digital Zoom
    10.1 Zoom implementation on the FPGA
11. Round Robin Procedure for Data Sharing
    11.1 Status byte
    11.2 Round robin on the front-end FPGA
    11.3 Round robin on the MCF5307 processor
12. Pixel Read/Write Cycle
13. Frame Comparison and the Motion Vector
    13.1 Methodology
    13.2 Computation
    13.3 Design components
    13.4 Signals
14. Writing the Motion Vector to DPRAM
15. Parameters of the Front-End FPGA
16. IRQ5/Parallel Port Complex
    16.1 IRQ5 handler
    16.2 Duty-cycle generator
    16.3 Servo motion feedback
    16.4 Displacement vector
    16.5 Saccadic blanking
    16.6 IRQ/PP circuit on the back-end FPGA
17. Auxiliary Features
    17.1 Serial communication with the MCF5307
    17.2 Diagnostic data logging
    17.3 Soft restart of the vision program
    17.4 Radio controls
        17.4.1 Radio decoder's signals
18. Feature Recognition on the MCF5307 Processor
    18.1 Main data structures in the MCF5307 code
19. Initialization of the SBC5307 Board
20. Characteristics of the Camera/Servo System
21. Supplementary Diagrams
References

1. INTRODUCTION TO THE SKEYEBALL VISION PROJECT

Skeyeball is an ongoing project in the Department of Computer Science at Indiana University. It is centered on a radio-controlled model airplane that is being converted into a semi-autonomous vehicle. Its primary means of perception is a computer vision system; the airplane will also be equipped with attitude sensors, a digital video and telemetry downlink, and a digital command uplink. The objective is to give the airplane the autonomy to fly beyond the line of sight, navigate, and find objects of interest by their visual appearance rather than by location.
The objective of the work described here was to build a vision system that follows an object in a relatively simple (uncluttered) scene, in live video. This vision system will be integrated into a larger robotic navigation system used to steer the airplane into a holding pattern above a selected feature on the ground.

1.1 History of the vision system

The Skeyeball vision was first

