Eye Ball Tracking with OpenCV

Overview

Eye Ball Tracking with OpenCV refers to a computer vision project that involves using the OpenCV library to track the movement of a person's eye. This technology can be used in a variety of applications, such as eye-controlled interfaces for disabled individuals, marketing research studies to track user attention, and improving driver safety systems. By analyzing the movement of the eye, the project can provide valuable insights into human behavior and interaction with technology.

What are We Building?

Introduction

Eye tracking technology has gained popularity in recent years due to its numerous applications. In this project, we will be using OpenCV, a popular computer vision library, to build an eye tracking system that can detect and track the movement of a person's eye.

[Image: final output]

Pre-requisites

To successfully complete this project, you will need a basic understanding of the following topics:

Python programming language

  • Basic understanding of data types, variables, operators, and control structures in Python.
  • Familiarity with functions and modules in Python.

Image processing and manipulation

  • Understanding of the basic concepts of image processing, such as pixels, color spaces, and image filters.
  • Knowledge of image manipulation techniques such as resizing, cropping, and rotation.

Computer vision concepts

  • Understanding of computer vision concepts such as object detection, feature extraction, and image segmentation.
  • Familiarity with machine learning concepts such as supervised and unsupervised learning, as well as deep learning.

OpenCV library

  • Familiarity with the basic functions of OpenCV such as reading and displaying images, and manipulating pixel values.
  • Understanding of the image processing and computer vision functions available in the OpenCV library, such as object detection, feature extraction, and image filtering.

What is OpenCV?

OpenCV is an open-source library that is widely used in computer vision applications. It provides various tools and functions to process and analyze images and videos. OpenCV is written in C++, but it also has bindings for Python, making it accessible to a wider audience.

How are We Going to Build this?

We will be using OpenCV and Python to build the eye tracking system. The general approach we will take involves the following steps:

  • Capture the video stream from the webcam.
  • Pre-process the video frames to improve the quality and reduce noise.
  • Detect the face in the frame using a pre-trained face detection model.
  • Isolate the eyes from the face region.
  • Apply image processing techniques to detect the pupil and iris of the eyes.
  • Calculate the gaze direction using the position of the pupil in the eye.

We will walk through each step in detail and provide code snippets to help you implement the system.

Final Output

  • A real-time video feed from the webcam, with a superimposed circle highlighting the detected iris.
  • The direction of eye gaze, displayed as an arrow or other visual cue on the video feed.
  • The eye gaze direction may be indicated in different ways depending on the implementation, such as by changing the color of the circle or arrow, or by displaying a message.
  • Overall, the output is a real-time visual representation of the user's eye gaze, which can be used for various purposes such as user interaction or monitoring.

[Image: final output]

Requirements

To build the eye tracking system with OpenCV, you will need to install the following libraries and modules:

  • OpenCV: This is the main library we will be using to process the video frames and detect the eyes. You can install it using pip: pip install opencv-python.
  • NumPy: This is a numerical computing library that we will use for various operations on the image data. You can install it using pip: pip install numpy.
  • Dlib: This is a C++ library that provides a facial landmark detector, which we will use to detect the location of the eyes in the face. You can install it using pip: pip install dlib.
  • imutils: This is a set of convenience functions to make it easier to work with OpenCV. You can install it using pip: pip install imutils.
  • PyAutoGUI: This is a Python library that allows you to control the mouse and keyboard, which we will use to move the cursor based on the gaze direction. You can install it using pip: pip install pyautogui.

Once you have installed these libraries, you should be able to run the code for the eye tracking system. Note that the core implementation below relies only on OpenCV and NumPy; Dlib, imutils, and PyAutoGUI are needed only if you extend the project with facial-landmark detection or cursor control.

Implementation of Eye Ball Tracking with OpenCV

Here are the step-wise instructions for implementing Eye Ball Tracking with OpenCV:

  • Step 1. Import the necessary libraries.
  • Step 2. Load the pre-trained Haar Cascade classifier for detecting the face.
  • Step 3. Load the pre-trained Haar Cascade classifier for detecting the eyes.
  • Step 4. Initialize the video capture object to capture frames from the webcam.
  • Step 5. Start a while loop to continuously capture frames from the webcam.
  • Step 6. Convert the captured frame to grayscale for better image processing.
  • Step 7. Detect the face in the grayscale image using the face cascade classifier.
  • Step 8. Detect the eyes within the detected face region using the eye cascade classifier.
  • Step 9. Use the Hough Circle Transform to detect the iris within the eye region.
  • Step 10. Draw a circle around the detected iris.
  • Step 11. Determine the center of the iris and the center of the eye region.
  • Step 12. Calculate the distance between the centers to determine the direction of eye gaze.
  • Step 13. Display the processed frame with the iris circle and direction of gaze.

Here's an incremental approach to implementing the Eye Ball Tracking with OpenCV project:

Imports

First, we need to import the libraries and modules required for this project:

Sample Data

You can download the Haar cascade classifiers for face and eye detection from the data/haarcascades directory of the OpenCV GitHub repository:

  • Face detection: haarcascade_frontalface_default.xml
  • Eye detection: haarcascade_eye.xml

You can simply download these XML files and save them in the same directory as your Python script. Then, you can use them to detect faces and eyes in your video feed.

Implementation

The implementation of Eye Ball Tracking with OpenCV involves the following steps:

  • Preprocessing
  • Pupil Detection
  • Gaze Estimation

Load the Haar Cascade Classifiers

Capture the video feed

To track the eye movements in real-time, you need to capture the video feed from your webcam. Here's how you can do that:

Detect the face and eyes

Now, you can start processing each frame of the video feed. First, you need to detect the face in the frame using the face cascade classifier:

Once you have detected the face, you can detect the eyes within the region of interest using the eye cascade classifier:

Calculate the eye gaze direction

To calculate the eye gaze direction, you can use the position of the pupils in the eyes. You can find the position of the pupils by thresholding the image and finding the center of the resulting blob. Here's how you can do that:

Display the output

Finally, you can display the output frame with the eye gaze direction:

This will display the video feed with rectangles drawn around the detected eyes and the pupil center marked with a blue circle.

After you are done processing the video feed, you should release the resources:

This will release the video capture and destroy all the OpenCV windows.

Output

[Image: output]

Conclusion

  • Eye Ball Tracking with OpenCV can be a useful tool for gaze analysis, human-computer interaction, and assistive technologies.
  • The project uses Haar cascade classifiers for face and eye detection; these are pre-trained machine learning models that can detect faces and eyes in an image or video feed.
  • The project processes the video feed frame by frame, detecting faces and eyes, and then finding the pupil center using image processing techniques such as thresholding and contour detection.
  • The project displays the output video feed with rectangles drawn around the detected eyes and the pupil center marked with a blue circle.
  • The accuracy of the eye tracking system can be affected by factors such as lighting conditions, head movement, and eye shape.
  • The project can be improved by using more advanced eye tracking techniques such as feature-based tracking or machine learning-based methods.