Pupil Detection and Tracking System
Lior Zimet, Sean Kao
EE 249 Project
Mentors: Dr. Arnon Amir, Yoshi Watanabe
December 5, 2002

Outline
- Introduction and Goals
- Design Methodology
- Model of Computation
- Mapping and Implementation
- Verification
- Conclusions

Motivation
- Human-computer interfaces are becoming more important
- New interfaces may benefit from knowing the location of the user's eyes:
  - Auto-stereoscopic displays
  - Virtual reality interfaces
  - Facial recognition systems
  - Eye gaze tracking

Goals
- Exercise the design process from functional specification through implementation and verification
- Develop an embedded system that finds the two-dimensional location of a user's pupils
- Apply the various methodologies we have learned
- Use a heterogeneous collection of components in a real-time environment

Background
- Human pupils may be found using two infrared light sources
- An on-axis light source gives the "red-eye" effect
- An off-axis light source gives a dark-pupil effect
- We can synchronize the two light sources with the capturing device

Difference of the two Images
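The difference step described above can be sketched in a few lines: subtracting the dark-pupil (off-axis) frame from the red-eye (on-axis) frame leaves only the pupils bright, and a threshold isolates them. This is a minimal pure-Python sketch; the toy frames, threshold level, and function names are ours, not the project's.

```python
# Hypothetical sketch of the bright-pupil / dark-pupil difference step.
# Frames are plain lists of lists of 8-bit grey values.

def difference_image(on_axis, off_axis):
    """Subtract the dark-pupil (off-axis) frame from the red-eye
    (on-axis) frame; only the pupils stay bright."""
    return [[max(a - b, 0) for a, b in zip(row_on, row_off)]
            for row_on, row_off in zip(on_axis, off_axis)]

def threshold(image, level):
    """Binarize: 1 where the difference exceeds the threshold."""
    return [[1 if v > level else 0 for v in row] for row in image]

# Toy 3x4 frames: the pupil is the bright spot at row 1, column 2 of
# the on-axis frame; background brightness is common to both frames.
on_axis  = [[10, 10, 12, 10],
            [10, 11, 200, 10],
            [12, 10, 10, 10]]
off_axis = [[10, 10, 11, 10],
            [10, 10, 20, 10],
            [11, 10, 10, 10]]

mask = threshold(difference_image(on_axis, off_axis), 50)
print(mask[1][2])  # the pupil pixel survives the threshold
```

In the real system this subtraction runs line by line in hardware rather than on whole frames, but the arithmetic is the same.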
Design Methodology
- Begin with a definition of the system, with illustrations
- Make a formal specification using the Unified Modeling Language
- Describe the system using a behavioral model
- Explore the architectural space
- Map the functionality onto the chosen architecture
- Verify the implementation

System Definition
- Take in an image illuminated with two different light sources
- Find the pupils in the image
- Calculate the position of the pupils
- Output the coordinates
[Block diagram: Sensor & Light Control -> Video Stream Handling -> Image Processing / Pupil Detection -> Tracking -> Output Interface -> 2D Location; with a User Interface]

UML Diagrams
- UML is used to describe a system formally
- The use-case diagram shows the functions of the system without implying how they are done
- The class diagram shows what functional blocks are used
- Sequence diagrams show how the use cases are executed

Use Case Diagram
[Actors: User, Image Target, Receiver. Use cases: Activate System, Modify Settings, Illuminate Target, Capture Image, Synchronize Illumination with Image Capture, Calculate Pupil Position, Format Data]
«invariant» {On-axis and off-axis LEDs may not be lit at the same time.}
Use case flow:
1. User activates the system.
2. User may modify system characteristics.
3. System synchronizes illumination with image capture.
4. System calculates pupil position based on captured images.
5. System outputs the position based on the output format.
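The invariant above, that the on-axis and off-axis LEDs are never lit together, is what the Synchronizer enforces by pairing each light source with a frame type. A minimal sketch of that behavior, with class and method names that are ours rather than the project's:

```python
# Hypothetical sketch of the Synchronizer invariant: the on-axis and
# off-axis LEDs alternate with the frame type (Odd/Even) and are
# never lit at the same time.

class Synchronizer:
    def __init__(self):
        self.frame_type = "Odd"   # toggles on every captured frame
        self.on_axis_lit = False
        self.off_axis_lit = False

    def next_frame(self):
        """Advance one frame: light exactly one source, matching the
        frame type, then toggle for the next capture."""
        self.on_axis_lit = (self.frame_type == "Odd")
        self.off_axis_lit = not self.on_axis_lit
        # Invariant from the use-case diagram: never both lit.
        assert not (self.on_axis_lit and self.off_axis_lit)
        lit = "on-axis" if self.on_axis_lit else "off-axis"
        self.frame_type = "Even" if self.frame_type == "Odd" else "Odd"
        return lit

sync = Synchronizer()
print([sync.next_frame() for _ in range(4)])
# alternates: on-axis, off-axis, on-axis, off-axis
```

Consecutive odd/even frames then form the on-axis and off-axis pair whose difference isolates the pupils.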
Class Diagram
- IR Light Source («implementation class»): +Activate(), +Deactivate(), -Location : bool = {on-axis, off-axis}
- Synchronizer («implementation class»): +Activate_OnAxis_Light(), +Activate_OffAxis_Light(), -FrameType : bool = {Odd, Even}
- Sensor («implementation class»): +Initialize(), -Get_Image(), -Send_Pixel_Data(), -Send_Next_Frame_Signal(), -Send_Next_Line_Signal(), -Send_Next_Pixel_Signal(), -Brightness : int, -Resolution : int
- Memory Handler («implementation class»): -Store_Pixel(), -Advance_Line(), -Advance_Frame(), -Get_Pixel(), -Current_Location : long
- Pixel («struct»): +LocationX : int, +LocationY : int, -Value : byte
- Frame («struct»): +Size : long
- Line («struct»): +Size : int
- System Clock («implementation class»): +Send_Tick(), -Frequency
- Image Processor («implementation class»): -Subtract2Lines(), -Threshold_Line(), -Connect_Components(), -Track_Components(), -Store_Pupil_Locations(), +Send_Pupil_Locations()
- Pupil Locations («struct»): -LocationX : int, -LocationY : int
- Display («implementation class»): +Display_Locations()
- Controller («implementation class»): +Initialize()

Y-Chart of the Project
[Y-chart: System Behavior and System Architecture -> Mapping -> Refine -> Implementation of System]
First, we describe the system behavior with an appropriate model of computation.

Model of Computation
- The system processes large amounts of data in a similar way
- A dataflow model is the most appropriate; our system is not control-centric
- But some parts of the system are easier to describe using sequential algorithms
- Hence, mixed models of computation

Simulink versus Ptolemy (Virgil)
- Ptolemy can mix models of computation and supports synchronous dataflow
- But Virgil does not have a simple way to integrate sequential algorithms
- Simulink has extensive support for sequential algorithms (Matlab)
- But Simulink lacks clearly defined semantics: it combines dataflow and discrete-event behavior

Simulink Model of Computation
- Use one clock to synchronize the system
- There is no implicit definition of how many tokens are generated or consumed
- We use counters and synchronous signals to determine how many "tokens" are on the edges

Simulink Model
- Compute the properties of objects on a line
- Connect objects in different lines
- Update the properties of known objects in a frame
- Calculate (X, Y) coordinates from those properties
- Display the coordinates
[Model blocks: image capture model (source), FIFO, subtraction]
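The per-line steps above amount to a run-based connected-components pass: find runs of bright pixels on each line, link runs that overlap the previous line's runs into components, and report each component's centroid as the pupil's (X, Y) position. A simplified sketch under our own naming (it links each run to the first overlapping component and does not handle merging components, which a full implementation would):

```python
# Hypothetical sketch of the line-oriented connected-components step.

def find_runs(line):
    """Runs of 1-pixels on one line, as (start, end) column spans."""
    runs, start = [], None
    for x, v in enumerate(line + [0]):      # sentinel closes a run
        if v and start is None:
            start = x
        elif not v and start is not None:
            runs.append((start, x - 1))
            start = None
    return runs

def centroids(mask):
    """Link overlapping runs line to line; return component centroids."""
    components = []              # each: list of (y, start, end) runs
    prev = []                    # previous line: (start, end, comp index)
    for y, line in enumerate(mask):
        cur = []
        for s, e in find_runs(line):
            # connect to a previous-line run that shares a column
            match = next((c for ps, pe, c in prev
                          if s <= pe and e >= ps), None)
            if match is None:
                match = len(components)
                components.append([])
            components[match].append((y, s, e))
            cur.append((s, e, match))
        prev = cur
    result = []
    for comp in components:
        xs = [x for y, s, e in comp for x in range(s, e + 1)]
        ys = [y for y, s, e in comp for x in range(s, e + 1)]
        result.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return result

mask = [[0, 1, 1, 0, 0],
        [0, 1, 1, 0, 0],
        [0, 0, 0, 0, 0]]
print(centroids(mask))  # one component centred at (1.5, 0.5)
```

Working one line at a time, rather than on whole frames, is what lets the step run in streaming hardware with only a line or two of buffering.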
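The token bookkeeping mentioned for the Simulink model, counters and synchronous signals deciding how many "tokens" sit on an edge, can be pictured as a FIFO edge with an explicit counter that gates the downstream block's firing. This is our own illustrative sketch, not the project's implementation:

```python
# Hypothetical sketch of counter-based token bookkeeping on a FIFO
# edge: an explicit counter says how many "tokens" are available, and
# the downstream block fires only when the counter is nonzero.

class FifoEdge:
    def __init__(self):
        self.buffer = []
        self.count = 0           # explicit token counter on the edge

    def produce(self, token):
        self.buffer.append(token)
        self.count += 1

    def consume(self):
        """Fire the downstream block only when a token is available;
        otherwise the block stalls for this clock tick."""
        if self.count == 0:
            return None
        self.count -= 1
        return self.buffer.pop(0)

edge = FifoEdge()
for pixel in (7, 8, 9):          # upstream produces three tokens
    edge.produce(pixel)
out = [edge.consume() for _ in range(4)]
print(out)  # [7, 8, 9, None]: the fourth tick finds the edge empty
```

In Simulink the same effect is obtained with counter blocks and enable signals, since the tool itself does not track token counts the way a synchronous-dataflow scheduler would.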