
Mixed-Reality Cozmo Robot Digital Twin

The goal of this project is to use augmented reality to visualize the world map and navigation stack of a small ground-based robot.

  • The primary purpose is educational: to give a more intuitive understanding of the robot's localization and of how its estimated position differs from its actual location.
  • Secondarily, this project works towards a platform for third-person remote control of robots exploring an unfamiliar environment (a third-person view is often more natural than first-person).

(Demo images: demo1, demo2)

Dependencies

This system runs on top of CONIX ARENA for the AR graphics rendering. ARENA is cross-platform and can be run on iOS, Android, web browsers, and any headset supporting WebXR (including HoloLens, Oculus, and Magic Leap).

The robot platform used is the Cozmo. The framework used to communicate with the robot is cozmo-tools by Dave Touretzky.

To install the Python (3.7 or newer) dependencies:

pip3 install -r requirements.txt

This will install ARENA-py, Flask (used to bridge the Cozmo process with the ARENA process), and the other required libraries. To install cozmo_fsm and its dependencies, see the instructions in the cozmo-tools repository.
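Conceptually, the bridge is just a small HTTP server that both processes talk to: the Cozmo process pushes its pose estimate, and the ARENA process pushes waypoints back. The sketch below is only a minimal illustration of that idea, not the project's actual implementation; the /pose and /waypoint routes, the port, and the JSON fields are assumptions made for this example.

```python
# Minimal sketch of a Flask bridge between the Cozmo and ARENA processes.
# The /pose and /waypoint endpoints and their payloads are illustrative
# assumptions, not the routes used by arena_app.py.
from flask import Flask, jsonify, request

app = Flask(__name__)

latest_pose = {"x": 0.0, "y": 0.0, "theta": 0.0}  # robot's estimated pose
waypoints = []                                     # waypoints queued by the AR client

@app.route("/pose", methods=["GET", "POST"])
def pose():
    # The Cozmo process POSTs its particle-filter estimate; the ARENA process GETs it.
    if request.method == "POST":
        latest_pose.update(request.get_json())
        return "", 204
    return jsonify(latest_pose)

@app.route("/waypoint", methods=["GET", "POST"])
def waypoint():
    # The ARENA process POSTs a tapped waypoint; the Cozmo process GETs and consumes it.
    if request.method == "POST":
        waypoints.append(request.get_json())
        return "", 204
    return jsonify(waypoints.pop(0) if waypoints else None)

if __name__ == "__main__":
    app.run(port=5000)
```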

Running the demo

To run the demo, first run python3 arena_app.py. The first time you run this, it will request authentication with ARENA using a Google account (either in the terminal or by opening a browser window); you must complete this authentication in order to use ARENA. Once it has completed, you should see a log entry like the following, which gives the link you can open in a web browser to see the scene in ARENA.

Loading: https://arenaxr.org/YOURARENAUSERNAME/cozmo-new, realm=realm
Connecting to the ARENA...
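arena_app.py manages this connection for you. Purely for orientation, a minimal ARENA-py script that connects to a scene and adds an object looks roughly like the sketch below; this is a hedged example, not part of this project, and the namespace and scene names are placeholders that should match the URL printed in the log above.

```python
# Minimal ARENA-py scene sketch (illustrative only; arena_app.py already does this).
from arena import *

scene = Scene(host="arenaxr.org", namespace="YOURARENAUSERNAME", scene="cozmo-new")

@scene.run_once
def add_marker():
    # Placeholder object standing in for the Cozmo model at its estimated pose.
    scene.add_object(Box(object_id="cozmo_placeholder",
                         position=(0, 0.05, 0),
                         scale=(0.09, 0.07, 0.07)))

scene.run_tasks()
```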

The augmented reality demo runs best on a modern iPhone or iPad (inside XR Browser) but can also be run on a sufficiently powerful Android device using Chrome Beta.

Next, print and cut out a "universal marker." The default configuration uses only one marker, tags/tags_1_4.pdf. This should be printed, cut out, folded up, and glued together. The top side is an AprilTag (for the ARENA app to detect), while the front and back sides carry ArUco markers (for Cozmo to detect). This marker allows the AR app to localize itself relative to Cozmo, making it easier to visualize exactly where the robot thinks it is.

(Photo: hardware — the assembled universal marker)
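For background on what the ArUco faces are for: Cozmo detects these markers in its camera feed and uses them as landmarks for its particle filter. cozmo_fsm handles this internally; the snippet below is only a generic OpenCV sketch of the same idea, with an assumed marker dictionary, and is not taken from cozmo-tools.

```python
# Generic ArUco landmark detection sketch using OpenCV (not the project's code).
# Uses the pre-4.7 cv2.aruco API; newer OpenCV versions use aruco.ArucoDetector.
# The dictionary choice (DICT_4X4_50) is an assumption for illustration.
import cv2

aruco = cv2.aruco
dictionary = aruco.getPredefinedDictionary(aruco.DICT_4X4_50)

def detect_landmarks(frame_bgr):
    """Return {marker_id: 4x2 corner pixel array} for ArUco markers in a frame."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    corners, ids, _rejected = aruco.detectMarkers(gray, dictionary)
    if ids is None:
        return {}
    return {int(i): c.reshape(4, 2) for i, c in zip(ids.flatten(), corners)}
```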

Finally, to run the program on Cozmo, launch a new shell (while arena_app.py is still running in its own process). Start with genfsm cozmo_ar.fsm to generate the Python FSM file. Then launch simple_cli (from cozmo-tools), run show all (to open the camera feed and particle viewer), and then runfsm("cozmo_ar") to start the example FSM.

(Image: fsm)
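Within this setup, the Cozmo-side process has to exchange data with arena_app.py: it reports its pose estimate and picks up waypoints placed by the AR user. A hedged sketch of that exchange, reusing the illustrative /pose and /waypoint routes from the bridge sketch above (not the project's real interface), could look like:

```python
# Illustrative Cozmo-side client for the Flask bridge (not the project's actual code).
# Assumes the /pose and /waypoint routes from the earlier bridge sketch.
import requests

BRIDGE = "http://127.0.0.1:5000"

def publish_pose(robot):
    """Send the robot's current pose estimate to the ARENA process."""
    pose = {
        "x": robot.pose.position.x,                    # mm, robot world frame
        "y": robot.pose.position.y,
        "theta": robot.pose.rotation.angle_z.radians,
    }
    requests.post(f"{BRIDGE}/pose", json=pose, timeout=1.0)

def next_waypoint():
    """Fetch the next waypoint placed by the AR user, or None if there is none."""
    return requests.get(f"{BRIDGE}/waypoint", timeout=1.0).json()
```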

If you are viewing the ARENA scene in a browser or on a mobile device (and the origin AprilTag on the 1_4 universal marker has been detected), you should now see a 3D model of Cozmo at its estimated location, as well as a small colored circle that follows your "gaze". Tapping the screen places a waypoint at this colored circle and commands the robot to drive there.
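In arena_app.py this tap-to-waypoint behavior is built in. As a rough illustration of the mechanism, an ARENA-py click handler attached to a clickable object can forward the clicked position to the bridge, as in the hedged sketch below; the ground-pad object, the event fields, and the /waypoint route are assumptions for this example, and the handler signature can differ across arena-py versions.

```python
# Sketch of a clickable ARENA object whose taps become robot waypoints
# (illustrative only; the event fields and /waypoint route are assumptions).
import requests
from arena import *

scene = Scene(host="arenaxr.org", namespace="YOURARENAUSERNAME", scene="cozmo-new")

def on_tap(scene, evt, msg):
    # Event fields are assumed here; check your arena-py version's event format.
    if evt.type == "mousedown":
        pos = evt.data.position
        # Forward the tapped location to the Cozmo process via the Flask bridge.
        requests.post("http://127.0.0.1:5000/waypoint",
                      json={"x": pos.x, "z": pos.z}, timeout=1.0)

# Thin, semi-transparent pad on the floor that the user taps to place waypoints.
pad = Box(object_id="waypoint_pad", position=(0, 0.01, 0), scale=(2, 0.02, 2),
          material=Material(opacity=0.3, transparent=True),
          clickable=True, evt_handler=on_tap)
scene.add_object(pad)
scene.run_tasks()
```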

Demo Video

This demo video shows the three modalities of the system. The first segment shows the top-down AR mode, in which the AR system visualizes exactly where the robot thinks it is (the drift/error shows the imprecision in the particle filter); the mobile device can place "waypoints" for Cozmo to drive to. The second segment shows the "immersive" mode, used for remote control of the robot from afar; in this mode the entire world is scaled up, and the user places waypoints using the hand-tracking module on the HoloLens. The third segment is the immersive mode shown in a browser from a third-person perspective (the system is designed so that multiple users can view it in AR at once).

https://youtu.be/AXnh5hB7WAg
