The EyeTeractive project aims to develop an accessible and efficient gaze-tracking system by leveraging paraconsistent logic and artificial intelligence. The system translates eye movements into computational commands, providing inclusive solutions for users with physical or motor limitations. By pairing cost-effective hardware, such as standard webcams, with robust algorithms, the project overcomes the limitations of traditional gaze-tracking technologies.
- Paraconsistent Logic: Handles uncertainties and inconsistencies in gaze-tracking data caused by variable lighting and image quality.
- AI-Powered Processing: Uses the ResNet architecture and Dlib library for accurate iris detection and gaze direction estimation.
- Accessibility: Eliminates the need for expensive hardware like Tobii Eye Trackers, enabling compatibility with low-cost devices.
- Real-Time Interaction: Processes and maps gaze movements into actionable commands in real time.
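The real-time mapping from gaze direction to commands is not specified in detail here; a minimal dispatch sketch could look like the following (the direction labels and command bindings are illustrative assumptions, not the project's actual API):

```python
# Minimal sketch: map discrete gaze directions to UI commands.
# Direction labels and bindings are illustrative assumptions.

def make_dispatcher(bindings):
    """Return a function that executes the command bound to a gaze label."""
    def dispatch(direction):
        action = bindings.get(direction)
        return action() if action else None
    return dispatch

# Hypothetical bindings: gaze left/right scrolls, gaze up clicks.
dispatch = make_dispatcher({
    "LEFT": lambda: "scroll_left",
    "RIGHT": lambda: "scroll_right",
    "UP": lambda: "click",
})

print(dispatch("LEFT"))   # scroll_left
print(dispatch("DOWN"))   # None (unbound direction is ignored)
```

Keeping the bindings in a plain dictionary makes it easy to reconfigure commands per user without touching the tracking code.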
Traditional gaze-tracking solutions often require proprietary hardware that is expensive and not universally accessible. The EyeTeractive project addresses these issues by:
- Reducing dependency on specialized hardware.
- Enhancing robustness in challenging environments (e.g., variable lighting).
- Promoting digital inclusion by making the technology accessible to diverse user groups.
- Develop an efficient model to translate gaze positions into computational commands using paraconsistent logic and artificial intelligence.
- Validate the model's performance under adverse conditions, such as poor lighting and extreme viewing angles.
- Ensure scalability and applicability in low-cost devices.
- Data Collection:
  - 7,000+ frames were collected from 27 videos of volunteers at the Federal University of Rondônia.
  - Participants followed a moving circle on a screen to capture gaze data.
- Model Development:
  - Utilized the Dlib library for facial landmark and iris region detection.
  - Trained the model using the ResNet architecture for robust classification and detection tasks.
  - Avoided data augmentation to preserve real-world scenario validity.
- Paraconsistent Logic Integration:
  - Applied paraconsistent logic to process and translate gaze data, addressing inconsistencies effectively.
  - Calculated precise gaze directions using horizontal and vertical displacement metrics.
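The project's exact paraconsistent formulation is not reproduced here. In the common two-valued annotated form (LPA2v), each evidence pair — a favorable degree μ and an unfavorable degree λ — yields a certainty degree Gc = μ − λ and a contradiction degree Gct = μ + λ − 1, which lets contradictory detector outputs be flagged instead of crashing the pipeline. A sketch under those assumptions (thresholds and decision rule are illustrative):

```python
# Sketch of two-valued annotated paraconsistent logic (LPA2v) applied to
# gaze evidence. Thresholds and the decision rule are illustrative
# assumptions, not the project's actual parameters.

def paraconsistent_analysis(mu, lam):
    """Given favorable evidence mu and unfavorable evidence lam, both in
    [0, 1], return the certainty degree Gc and contradiction degree Gct."""
    gc = mu - lam          # certainty: +1 = true, -1 = false
    gct = mu + lam - 1.0   # contradiction: +1 = inconsistent, -1 = indeterminate
    return gc, gct

def classify(mu, lam, threshold=0.5):
    """Decide whether the evidence supports a gaze reading (hypothetical rule)."""
    gc, gct = paraconsistent_analysis(mu, lam)
    if abs(gct) > abs(gc):
        return "inconsistent" if gct > 0 else "indeterminate"
    if gc >= threshold:
        return "true"
    if gc <= -threshold:
        return "false"
    return "uncertain"

# Two detectors strongly agree the iris moved left:
print(classify(0.9, 0.1))  # true
# The detectors strongly disagree (e.g. glare corrupted one of them):
print(classify(0.9, 0.9))  # inconsistent
```

The "inconsistent" branch is what distinguishes a paraconsistent treatment from classical thresholding: contradictory frames can be discarded or re-sampled rather than misread as a gaze command.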
- Calculating the Horizontal Axis:
  - This image illustrates the calculations used to determine the horizontal displacement of the iris.
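The figure's exact formula is not reproduced here; a common way to compute horizontal displacement is to normalize the iris centre's x-coordinate across the eye's width. A sketch under that assumption (corner coordinates and thresholds are made up for illustration):

```python
# Sketch: horizontal gaze ratio from eye-corner and iris-centre x-coordinates.
# Coordinates and thresholds are illustrative assumptions.

def horizontal_ratio(iris_x, eye_left_x, eye_right_x):
    """Normalize the iris x-position to [0, 1] across the eye width:
    0 = iris at the left corner, 1 = iris at the right corner."""
    return (iris_x - eye_left_x) / (eye_right_x - eye_left_x)

def horizontal_direction(ratio, lo=0.4, hi=0.6):
    """Map the normalized ratio to a coarse direction label."""
    if ratio < lo:
        return "LEFT"
    if ratio > hi:
        return "RIGHT"
    return "CENTER"

# Eye spans pixels 100..160; iris centre detected at x = 112.
r = horizontal_ratio(112, 100, 160)
print(round(r, 2), horizontal_direction(r))  # 0.2 LEFT
```

The same ratio computed on the vertical axis gives the vertical displacement metric mentioned above.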
- Eye Position Interpretation:
  - This graph visualizes the processed interpretation of the eye's position.
- Adapted Paraconsistent Logic:
  - This chart represents the application of adapted paraconsistent logic to interpret gaze directions.
- Iris Mask Extraction:
  - This image shows the segmented mask of the eye region extracted using the Dlib library.
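Dlib's 68-point shape predictor places the left eye at landmarks 36–41 and the right eye at 42–47. A minimal mask sketch under made-up landmark coordinates (using a bounding box around the six points as a simplification of the project's actual segmentation):

```python
import numpy as np

# Sketch: build a binary mask for the eye region from Dlib-style landmarks.
# The six points below are made-up coordinates; a real pipeline would take
# them from dlib's 68-point shape predictor (left eye = points 36-41).
# A bounding-box fill stands in for a true polygon fill here.

def eye_mask(shape, eye_points):
    """Return a uint8 mask with the eye's bounding box set to 255."""
    mask = np.zeros(shape, dtype=np.uint8)
    xs = [x for x, _ in eye_points]
    ys = [y for _, y in eye_points]
    mask[min(ys):max(ys) + 1, min(xs):max(xs) + 1] = 255
    return mask

# Hypothetical left-eye landmarks on a 120x200 frame.
left_eye = [(100, 50), (110, 45), (120, 45), (130, 50), (120, 55), (110, 55)]
mask = eye_mask((120, 200), left_eye)
print(mask.sum() // 255)  # 341 masked pixels (31 x 11 bounding box)
```

A polygon fill (e.g. OpenCV's `cv2.fillPoly` over the six landmark points) would follow the eyelid contour more closely than this bounding box.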
- Biometric analysis.
- Eye-based interface control for individuals with motor impairments.
- Behavioral studies using gaze tracking.
- Python 3.8+
- Dlib
- OpenCV
- TensorFlow or PyTorch (for ResNet implementation)
- Clone the repository:
  ```bash
  git clone https://github.com/user/EyeTeractive.git
  ```
- Install dependencies:
  ```bash
  pip install -r requirements.txt
  ```
- Run the application:
  ```bash
  python main.py
  ```
- Jáder Louis de Souza Gonçalves
- Samih Santos de Oliveira
- Prof. Dr. Lucas Marques da Cunha