
Repository created to support the Maestro project

Introduction

The project consists of a system designed to teach and correct the movements of a blind student who wants to learn a gesture. With an OpenCV-based algorithm and a wearable device, these students gain a better perception of whether they are performing the gesture correctly or not.

The movement correction is delivered to the user as real-time haptic feedback while he or she is executing the gesture.

This work proposes human gesture evaluation with visual detection and haptic feedback as an additional tool. The idea is to monitor the gesture visually using markers, follow the executed trajectory, and then send haptic feedback to the user.
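
For reference, the snippet below shows how a marker center can be extracted with the aruco module shipped in opencv_contrib. It is only an illustrative sketch: the project wraps this logic in its own Vision class (e.g. calculateTagCenter()) and links against the standalone ArUco 3.x library, whose API differs slightly.

    #include <vector>
    #include <opencv2/opencv.hpp>
    #include <opencv2/aruco.hpp>   // requires opencv_contrib

    int main() {
        cv::VideoCapture cap(0);   // default camera
        cv::Ptr<cv::aruco::Dictionary> dict =
            cv::aruco::getPredefinedDictionary(cv::aruco::DICT_4X4_50);

        cv::Mat frame;
        while (cap.read(frame)) {
            std::vector<int> ids;
            std::vector<std::vector<cv::Point2f> > corners;
            cv::aruco::detectMarkers(frame, dict, corners, ids);

            if (!ids.empty()) {
                // The marker center is the mean of its four detected corners.
                cv::Point2f center(0.f, 0.f);
                for (const cv::Point2f& c : corners[0]) center += c * 0.25f;
                cv::circle(frame, cv::Point(center), 5, cv::Scalar(0, 255, 0), -1);
            }

            cv::imshow("tracking", frame);
            if (cv::waitKey(1) == 27) break;   // ESC quits
        }
        return 0;
    }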

The project later evolved in a different direction: to filter the gesture, we use a machine learning algorithm. A SOM (Self-Organizing Map) neural network was tested first, but the problem turned out to fit better with a Grow When Required (GWR) network.
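
The GWR network itself lives in a separate sub-project and is not reproduced in this README. Purely to illustrate the "grow when required" idea, a heavily reduced sketch is shown below; the names, thresholds, learning rates, and the omission of the edge graph are our own assumptions, not the project's implementation.

    #include <cmath>
    #include <cstddef>
    #include <vector>

    // Very reduced GWR-style sketch: a new node is grown whenever the best-matching
    // node is both a poor match (low activity) and already well trained (low
    // habituation). The edge graph and edge aging of the full algorithm are omitted.
    struct Node {
        std::vector<double> w;      // weight vector (e.g. a 2-D trajectory point)
        double habituation = 1.0;   // decays toward 0 as the node keeps firing
    };

    double dist(const std::vector<double>& a, const std::vector<double>& b) {
        double s = 0.0;
        for (std::size_t i = 0; i < a.size(); ++i) s += (a[i] - b[i]) * (a[i] - b[i]);
        return std::sqrt(s);
    }

    void gwrStep(std::vector<Node>& nodes, const std::vector<double>& x,
                 double activityThresh = 0.8, double habituationThresh = 0.3) {
        if (nodes.size() < 2) {                       // seed with the first inputs
            Node seed; seed.w = x; nodes.push_back(seed); return;
        }

        // Find the best-matching node.
        std::size_t b = 0;
        for (std::size_t i = 1; i < nodes.size(); ++i)
            if (dist(nodes[i].w, x) < dist(nodes[b].w, x)) b = i;

        double activity = std::exp(-dist(nodes[b].w, x));

        if (activity < activityThresh && nodes[b].habituation < habituationThresh) {
            // Grow when required: insert a node halfway between input and winner.
            Node fresh; fresh.w.resize(x.size());
            for (std::size_t i = 0; i < x.size(); ++i)
                fresh.w[i] = 0.5 * (nodes[b].w[i] + x[i]);
            nodes.push_back(fresh);
        } else {
            // Otherwise adapt the winner toward the input and habituate it.
            for (std::size_t i = 0; i < x.size(); ++i)
                nodes[b].w[i] += 0.1 * nodes[b].habituation * (x[i] - nodes[b].w[i]);
            nodes[b].habituation *= 0.9;
        }
    }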

Installing the prerequisites

This project was originally developed and run on Linux (Ubuntu 16.04) with OpenCV 3.4.1 and ArUco 3.0.10. The machine had a 7th-generation Intel Core i5 and 8 GB of RAM.
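
After installing, a minimal sanity check such as the following (not part of the project) can confirm which OpenCV build will be linked; the standalone ArUco 3.x library is not verified by it.

    #include <iostream>
    #include <opencv2/core/version.hpp>

    int main() {
        // Should print 3.4.1 if the expected OpenCV build is picked up.
        std::cout << "OpenCV version: " << CV_VERSION << std::endl;
        return 0;
    }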

Configuring the Wearable

  1. Regular motors activation
  2. Alternated motors activation

Regular motors activation

to do

Alternated motors activation

to do

Running the code

The main code is in the ../structure directory. To execute it, you must first choose the mode of execution by editing the main.cpp file and the project that contains the neural network (GWR) algorithm.

It has 4 modes:

  1. Recording a new Gesture
  2. Running the data through the GWR
  3. Running the correction algorithm without the Wearable
  4. Running the correction algorithm with the Wearable

Recording a new Gesture

Open the file main.cpp and edit it to look like this:

    Vision vision(argc, argv);
    Trajectory trajectory("../data/square.csv");

    trajectory.unnormalize(Point(FRAME_WIDTH/2, FRAME_HEIGHT/2));

    // Output files: the recorded trajectory points and the raw video.
    trajectory.saveMovement("../data/new_movement.csv");
    vision.record("../../Videos/random_test.avi");

    // Break out of this loop (e.g. on a key press) so that endSaving() below is reached.
    while (1) {
        vision.calculateTagCenter();
        if (vision.isTargetOn()) {
            // Marker found: store its center as a new trajectory point.
            trajectory.savePoint(vision.getCenter());
        }

        vision.show();
        vision.saveVideo();   // the current frame is written once per iteration
    }

    trajectory.endSaving();

Running the data through the GWR

to do

Running the correction algorithm without the Wearable

Open the file main.cpp and edit it to look like this:

    Vision vision(argc, argv);
    Trajectory trajectory("../data/square.csv");
    trajectory.unnormalize(Point(FRAME_WIDTH/2, FRAME_HEIGHT/2));
    trajectory.saveMovement("../data/random_test.csv");

    // Break out of this loop (e.g. on a key press) so that endSaving() below is reached.
    while (1) {
        vision.calculateTagCenter();
        // Overlay the reference trajectory up to the current point.
        vision.drawTrajectory(trajectory, trajectory.getCurrentPointId());
        if (vision.isTargetOn()) {
            // Advance along the reference trajectory and visualize the deviation.
            trajectory.setNextPoint0(vision.getCenter());
            vision.drawError(vision.getCenter(), trajectory.getCurrentPoint());
            trajectory.savePoint(vision.getCenter());
        }

        vision.show();
    }

    trajectory.endSaving();

Running the correction algorithm with the Wearable

Open the file main.cpp and edit it to look like this:

    Vision vision(argc, argv);
    Weareable weareable;
    Trajectory trajectory("../data/square.csv");

    trajectory.unnormalize(Point(FRAME_WIDTH/2, FRAME_HEIGHT/2));
    trajectory.saveMovement("../data/random_test.csv");

    // Connect to the wearable over the network before the tracking loop starts.
    weareable.setIP((char*)"10.6.4.107");
    weareable.start();

    // Break out of this loop (e.g. on a key press) so that endSaving() below is reached.
    while (1) {
        vision.calculateTagCenter();
        vision.drawTrajectory(trajectory, trajectory.getCurrentPointId());
        if (vision.isTargetOn()) {
            trajectory.setNextPoint0(vision.getCenter());
            vision.drawError(vision.getCenter(), trajectory.getCurrentPoint());

            // Make the bracelet vibrate according to the current trajectory error.
            weareable.send(trajectory.getError(vision.getCenter()));

            trajectory.savePoint(vision.getCenter());
        }

        vision.show();
    }

    trajectory.endSaving();
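
The Weareable class hides the actual link to the bracelet firmware (written in nesC), and its wire protocol is not documented in this README. Purely as a hypothetical illustration of what a network send() could look like, a plain UDP sender in C++ might be written as follows; the port, payload format, and the use of UDP at all are assumptions.

    #include <arpa/inet.h>
    #include <string>
    #include <sys/socket.h>
    #include <sys/types.h>
    #include <unistd.h>

    // Hypothetical illustration only: sends one error value as a text datagram.
    // The real Weareable class may use a different transport and payload format.
    bool sendErrorUdp(const char* ip, int port, double error) {
        int sock = socket(AF_INET, SOCK_DGRAM, 0);
        if (sock < 0) return false;

        sockaddr_in addr{};
        addr.sin_family = AF_INET;
        addr.sin_port = htons(port);
        inet_pton(AF_INET, ip, &addr.sin_addr);

        std::string payload = std::to_string(error);
        ssize_t sent = sendto(sock, payload.c_str(), payload.size(), 0,
                              reinterpret_cast<sockaddr*>(&addr), sizeof(addr));
        close(sock);
        return sent == static_cast<ssize_t>(payload.size());
    }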

Article(s) Published

Human Gesture Evaluation with Visual Detection and Haptic Feedback

Abstract: Learning a gesture pertains to learning an expression of motion by a human, involving the hands, arms, face, head, and/or body. In this work, we propose to employ haptic feedback as an additional tool in the gesture following/evaluation loop. Accordingly, the user wears a haptic wearable device in the form of a bracelet which vibrates according to the trajectory error. Our research hypothesis is then to investigate whether such a haptic device aids the user in correcting his movement in relation to the prerecorded trajectory.

Access at: https://dl.acm.org/citation.cfm?id=3243104
