BCI using EEG

This repository presents an approach to enhancing human-machine interaction through real-time facial gesture recognition, integrated with EEG signal analysis for controlling prosthetic devices. The goal is to leverage brain-computer interface (BCI) technology to create a more intuitive user experience.

Key Features

  • Real-Time Facial Gesture Recognition: Developed a robust model to accurately detect and interpret facial gestures, facilitating the control of assistive technologies.
  • EEG Signal Processing: Utilized machine learning techniques, including Independent Component Analysis (ICA), Principal Component Analysis (PCA), and Linear Discriminant Analysis (LDA), to extract meaningful motor execution signals from EEG data; a sketch of this pipeline follows the list below.
  • Exploration of Motor Imagery: Investigated the potential of motor imagery for enhancing prosthetic control, providing insights into the challenges and opportunities within BCI systems.
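To make the EEG feature above concrete, here is a minimal sketch of an ICA → PCA → LDA pipeline on epoched EEG data. It is an illustration only, not the repository's actual code: the array shapes, component counts, and random placeholder data are all assumptions, and scikit-learn's FastICA stands in for whatever ICA implementation the project uses. In practice, ICA is often applied to continuous recordings for artifact removal (e.g., with MNE-Python) before feature extraction; chaining all three stages in one scikit-learn pipeline just keeps the sketch self-contained.

# Illustrative ICA -> PCA -> LDA pipeline on placeholder EEG epochs.
# All shapes, parameters, and data below are assumptions for this sketch.
import numpy as np
from sklearn.decomposition import PCA, FastICA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n_trials, n_channels, n_times = 200, 32, 256
X = rng.standard_normal((n_trials, n_channels, n_times))  # placeholder epochs
y = rng.integers(0, 2, size=n_trials)                     # placeholder labels

# Flatten each epoch into one feature vector per trial.
X_flat = X.reshape(n_trials, -1)
X_train, X_test, y_train, y_test = train_test_split(
    X_flat, y, test_size=0.25, random_state=0, stratify=y)

# ICA unmixes sources, PCA reduces dimensionality,
# and LDA is the final linear classifier.
clf = make_pipeline(
    FastICA(n_components=20, random_state=0, max_iter=1000),
    PCA(n_components=10),
    LinearDiscriminantAnalysis())
clf.fit(X_train, y_train)
print(f"Held-out accuracy: {clf.score(X_test, y_test):.2f}")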

Technologies Used

  • Programming Languages: Python, MATLAB

Getting Started

To set up the project locally, clone this repository using:

git clone https://github.com/Pianissimo-3115/EEG-project

For training the model, refer to the linked Google Colab notebook, which also includes eye blink detection.
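As a rough illustration of blink detection on EEG (blinks appear as large deflections on frontal channels such as Fp1/Fp2), below is a minimal amplitude-threshold detector. The function, threshold, and refractory period are hypothetical examples and are not the notebook's actual method, which may well differ.

import numpy as np

def detect_blinks(signal, fs, threshold_uv=80.0, refractory_s=0.3):
    # Return sample indices of detected blinks in a frontal EEG channel.
    # `signal` is a 1-D array in microvolts; `fs` is the sampling rate in Hz.
    # The 80 uV threshold and 0.3 s refractory period are illustrative only.
    above = np.flatnonzero(np.abs(signal) > threshold_uv)
    blinks, last = [], -np.inf
    for idx in above:
        if idx - last > refractory_s * fs:  # skip samples within one blink
            blinks.append(idx)
            last = idx
    return blinks

# Quick check on synthetic data: a flat trace with two injected spikes.
fs = 250
trace = np.zeros(5 * fs)
trace[300], trace[900] = 120.0, 150.0
print(detect_blinks(trace, fs))  # -> [300, 900]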

Feel free to explore the code; feedback and contributions are welcome.
