# BCI using EEG

This repository explores human-machine interaction through real-time facial gesture recognition, integrated with EEG signal analysis for controlling prosthetic devices. The goal is to apply brain-computer interface (BCI) technologies to create a more intuitive user experience.

## Key Features

- **Real-Time Facial Gesture Recognition**: A model that detects and interprets facial gestures in real time, enabling hands-free control of assistive technologies.
- **EEG Signal Processing**: Machine learning techniques, including Independent Component Analysis (ICA), Principal Component Analysis (PCA), and Linear Discriminant Analysis (LDA), used to extract meaningful motor execution signals from EEG data; a minimal sketch of such a pipeline is shown below.
- **Exploration of Motor Imagery**: An investigation of motor imagery for prosthetic control, highlighting both the challenges and the opportunities within BCI systems.
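
For illustration, here is a minimal sketch of an ICA → PCA → LDA pipeline of the kind described above, written with MNE-Python and scikit-learn. The file name, event labels, and excluded component index are placeholders, and these libraries are an assumption rather than this repository's confirmed toolchain:

```python
# Minimal sketch of an ICA -> PCA -> LDA pipeline (assumed libraries:
# MNE-Python and scikit-learn; not necessarily what this repository uses).
import mne
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Placeholder recording; substitute your own motor-execution dataset.
raw = mne.io.read_raw_fif("motor_execution_raw.fif", preload=True)
raw.filter(1.0, 40.0)  # band-pass before ICA, as MNE recommends

# ICA to separate and remove ocular/muscular artifact components.
ica = mne.preprocessing.ICA(n_components=20, random_state=42)
ica.fit(raw)
ica.exclude = [0]  # placeholder index, chosen after visual inspection
raw_clean = ica.apply(raw.copy())

# Epoch around movement events and flatten each epoch into a feature vector.
events, event_id = mne.events_from_annotations(raw_clean)
epochs = mne.Epochs(raw_clean, events, event_id, tmin=-0.2, tmax=0.8,
                    baseline=None, preload=True)
X = epochs.get_data().reshape(len(epochs), -1)
y = epochs.events[:, -1]

# PCA reduces dimensionality; LDA classifies the motor-execution classes.
clf = make_pipeline(StandardScaler(), PCA(n_components=30),
                    LinearDiscriminantAnalysis())
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```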

## Technologies Used

- **Programming Languages**: Python, MATLAB

## Getting Started

To set up the project locally, clone this repository using:

```bash
git clone https://github.com/Pianissimo-3115/EEG-project
```

For training the model, refer to the accompanying Google Colab notebook, which also covers eye blink detection; a minimal blink-detection sketch follows.
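
As an illustration of how eye blinks can be detected from a webcam stream, the sketch below uses the eye aspect ratio (EAR) with MediaPipe Face Mesh. This is an assumption-laden example rather than the notebook's actual implementation: the landmark indices and threshold are common defaults, not values from this repository.

```python
# Minimal eye-blink detection sketch using the eye aspect ratio (EAR) with
# MediaPipe Face Mesh. Landmark indices and the threshold are common
# defaults, not values taken from this repository's notebook.
import cv2
import mediapipe as mp
import numpy as np

LEFT_EYE = [33, 160, 158, 133, 153, 144]  # p1..p6 around the left eye
EAR_THRESHOLD = 0.21                      # below this, the eye is "closed"

def eye_aspect_ratio(pts):
    # EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|)
    return (np.linalg.norm(pts[1] - pts[5]) +
            np.linalg.norm(pts[2] - pts[4])) / (2.0 * np.linalg.norm(pts[0] - pts[3]))

face_mesh = mp.solutions.face_mesh.FaceMesh(refine_landmarks=True)
cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    result = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if result.multi_face_landmarks:
        lm = result.multi_face_landmarks[0].landmark
        h, w = frame.shape[:2]
        # Convert the normalized eye landmarks to pixel coordinates.
        pts = np.array([(lm[i].x * w, lm[i].y * h) for i in LEFT_EYE])
        if eye_aspect_ratio(pts) < EAR_THRESHOLD:
            print("blink detected")
    cv2.imshow("webcam", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```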

Feel free to explore the code; feedback and contributions are welcome.