
Brain-Signal-Based Navigation of an Intelligent Wheelchair Using Visual Perception

Final Year Undergraduate Project 2023

Electrical Engineering Department

University of Moratuwa

Team Members:

  • D. J. Temcious Fernando
  • Apisaruthan Thanabalasingam
  • Sobikanth Mahendran

Supervisors:

  • Prof. A.G.B.P. Jayasekara, Faculty of Engineering, Department of Electrical Engineering
  • Prof. R.A.R.C. Gopura, Faculty of Engineering, Department of Mechanical Engineering

Problem Statement:

Individuals with spinal cord injuries or paralysis, despite having a healthy brain and vision, face a significant challenge in independently navigating their wheelchairs, even in known environments. Because they cannot use their hands and legs for wheelchair control, they must rely constantly on others for movement, which limits their ability to explore and navigate their surroundings freely. There is therefore a pressing need for solutions that let these individuals regain control over their wheelchair navigation.

Conceptualization:

Navigating a wheelchair with Steady State Visual Evoked Potential (SSVEP) brain signals works as follows. When a person focuses their attention on a light flickering at a particular frequency, an oscillation at that same frequency appears in their brain activity; this is the SSVEP signal. The SSVEP signal can be extracted from the recorded brain activity using appropriate techniques and converted into a control signal. In a known environment, a patient who wishes to move from their current location to another can issue navigation commands to the wheelchair through this brain-generated control signal, and the wheelchair can then navigate autonomously to the specified location. This removes the need for another person to physically move the wheelchair, giving the patient greater independence and freedom of movement.
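The core idea — a flicker at frequency f evokes a matching spectral peak in the EEG — can be sketched with a simple FFT-based detector. This is an illustrative toy on synthetic data (NumPy assumed), not the project's actual extraction pipeline:

```python
import numpy as np

np.random.seed(0)  # deterministic noise for the demo

def detect_ssvep_frequency(eeg, fs, candidates):
    """Return the candidate flicker frequency with the strongest
    spectral peak in the EEG segment (minimal FFT-based sketch)."""
    spectrum = np.abs(np.fft.rfft(eeg))
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    # Score each candidate by the magnitude of its nearest FFT bin.
    powers = [spectrum[np.argmin(np.abs(freqs - f))] for f in candidates]
    return candidates[int(np.argmax(powers))]

# Synthetic single-channel EEG with a 7 Hz SSVEP component in noise.
fs = 256
t = np.arange(0, 4, 1.0 / fs)          # 4-second analysis window
eeg = np.sin(2 * np.pi * 7 * t) + 0.5 * np.random.randn(len(t))
print(detect_ssvep_frequency(eeg, fs, [6, 7, 8, 9]))  # → 7
```

With a 4 s window at 256 Hz the FFT bin spacing is 0.25 Hz, so each stimulus frequency lands exactly on a bin, which keeps the toy detector honest.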

System Overview:

The system is divided into two major sections: brain signal processing and automatic wheelchair navigation. In the brain signal processing section, a user interface presents four squares, each flickering at a distinct frequency, with each frequency corresponding to a different location in the known environment. The brain signal is acquired with a brain signal acquisition system and passed through preprocessing, feature extraction, and classification stages to produce a control signal. This control signal is the input to the wheelchair navigation section, which drives the wheelchair autonomously to the selected location within the known environment.
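A common choice for the classification stage in SSVEP systems is canonical correlation analysis (CCA) against sinusoidal reference signals. The README does not name the project's exact classifier, so the following is a hedged sketch of that standard technique (NumPy assumed, synthetic data):

```python
import numpy as np

def max_canonical_corr(X, Y):
    """Largest canonical correlation between the column spaces of X and Y."""
    Qx, _ = np.linalg.qr(X - X.mean(axis=0))
    Qy, _ = np.linalg.qr(Y - Y.mean(axis=0))
    return np.linalg.svd(Qx.T @ Qy, compute_uv=False)[0]

def classify_ssvep(eeg, fs, stimulus_freqs, n_harmonics=2):
    """Pick the stimulus frequency whose sine/cosine reference set
    correlates best with the EEG (shape: samples x channels)."""
    t = np.arange(eeg.shape[0]) / fs
    scores = []
    for f in stimulus_freqs:
        refs = np.column_stack(
            [fn(2 * np.pi * f * (h + 1) * t)
             for h in range(n_harmonics) for fn in (np.sin, np.cos)])
        scores.append(max_canonical_corr(eeg, refs))
    return stimulus_freqs[int(np.argmax(scores))]

# Two-channel synthetic EEG dominated by an 8 Hz component.
np.random.seed(1)
fs = 250
t = np.arange(0, 3, 1.0 / fs)
eeg = np.column_stack([np.sin(2 * np.pi * 8 * t), np.cos(2 * np.pi * 8 * t)])
eeg += 0.5 * np.random.randn(*eeg.shape)
print(classify_ssvep(eeg, fs, [6, 7, 8, 9]))  # → 8
```

Including harmonics in the reference set typically improves robustness, since SSVEP responses often contain energy at multiples of the stimulus frequency.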

Figure: System overview (simulation model diagram).

Results:

In the experiment, brain signals corresponding to frequencies of 6 Hz, 7 Hz, 8 Hz, and 9 Hz were successfully extracted and converted into control signals. The brain signals, obtained in real-time from the subjects, underwent signal processing and classification to accurately identify the frequencies of interest. By mapping these frequencies to specific navigation commands, the control signals were generated. These control signals were then utilized for automatic navigation of a robot model using ROS2 in a simulation environment.
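The mapping from a classified frequency to a navigation goal can be expressed as a small lookup table. The location names and coordinates below are hypothetical placeholders (the README does not list the actual locations); in the real system the resulting goal would be handed to the ROS2 navigation stack:

```python
# Hypothetical mapping from detected SSVEP frequency (Hz) to a named
# location and an (x, y) navigation goal in the known environment.
COMMAND_MAP = {
    6: ("kitchen",  (2.0, 1.5)),
    7: ("bedroom",  (-1.0, 3.0)),
    8: ("bathroom", (0.5, -2.0)),
    9: ("doorway",  (4.0, 0.0)),
}

def to_goal(detected_hz):
    """Translate a classified frequency into a navigation goal; in the
    real system this goal would be sent to the ROS2 navigation stack."""
    if detected_hz not in COMMAND_MAP:
        raise ValueError(f"unmapped frequency: {detected_hz} Hz")
    return COMMAND_MAP[detected_hz]

print(to_goal(7))  # → ('bedroom', (-1.0, 3.0))
```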


Figure 01: Extracted brain signals after preprocessing.


Figure 02: Real-time automatic navigation of the robot in simulation (Gazebo and Rviz) after detection of the 7 Hz frequency signal and selection of the corresponding location.

Implementing this idea directly on the wheelchair itself has not yet been realized, primarily due to hardware limitations. As an alternative, simple controls on the wheelchair are activated through real-time classification of brain signals, bypassing the traditional joystick via an Arduino microcontroller. Each classified brain signal is assigned to a specific path, enabling real-time navigation of the wheelchair along predefined routes.
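Bypassing the joystick amounts to sending one command per classified signal to the Arduino over serial. The protocol below (byte values, framing, port name) is an assumption for illustration, not the project's actual wire format:

```python
# Hypothetical serial protocol: one ASCII command byte per predefined
# path, written to the Arduino that drives the wheelchair motors in
# place of the joystick. Byte values and framing are assumptions.
PATH_COMMANDS = {6: b'A', 7: b'B', 8: b'C', 9: b'D'}

def frame_command(detected_hz):
    """Build the byte frame sent to the Arduino for a classified signal."""
    cmd = PATH_COMMANDS[detected_hz]
    return b'<' + cmd + b'>'   # simple start/stop delimiters

# With pyserial this frame would be written to the board, e.g.:
#   serial.Serial('/dev/ttyACM0', 9600).write(frame_command(8))
print(frame_command(8))  # → b'<C>'
```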

Figure 03: Hardware model.

Impact:

This project holds significant potential to benefit patients worldwide who have a healthy mind and visual abilities. By leveraging the advancements in brain signal processing and control systems, it provides a valuable solution for individuals with limited mobility, specifically those who are unable to use their hands or legs to control a wheelchair.
