
ihmcrobotics / ihmc-open-robotics-software

Robotics software featuring legged locomotion algorithms and a momentum-based controller core with optimization. Supporting software for world-class robots including humanoids, running birds, exoskeletons, mechs and more.

Home Page: https://robots.ihmc.us

MATLAB 0.33% Shell 0.05% Java 96.73% CMake 0.09% C++ 1.59% C 0.46% HTML 0.07% TeX 0.02% Python 0.46% CSS 0.01% PostScript 0.01% Dockerfile 0.04% GLSL 0.14% Batchfile 0.01%
robotics java ihmc humanoid-robot walking whole-body-control

ihmc-open-robotics-software's Introduction

IHMC Open Robotics Software

Robots

Licensing

All of the software in IHMC Open Robotics Software is licensed under the Apache 2.0 license.

Developing with IHMC Open Robotics Software from source

IHMC Open Robotics Software uses the Gradle build system, and requires JDK 17. We recommend working in IntelliJ.

Our developers are a mix of Windows and Linux users. We officially support:

  • Windows 10/11
  • Ubuntu 20.04+

Arch Linux will work fine for development. Other GNU/Linux distros will likely work, but are largely untested. macOS support is partial and incomplete at this time.

To get set up, use our public Confluence pages: https://ihmcrobotics.atlassian.net/wiki/spaces/PUBLIC/overview

Other IHMC Libraries

IHMC Open Robotics Software both depends on and is depended on by many other IHMC Robotics libraries.

You can find these and all of our other repositories at https://github.com/ihmcrobotics

ROS APIs

We provide a ROS 2 API for many of the core components in our software stack. You can find the .msg definitions for use in your own projects in this project's ihmc-interfaces folder.

We have ROS 1 support via the ROS 2 ros1_bridge package. You can find the ROS 1 message definitions and instructions on using the ROS 1 Bridge here: https://github.com/ihmcrobotics/ihmc_msgs

Building .jars

IHMC Open Robotics Software is pre-configured for generating Maven publications. You can publish directly from the source code right into your local Maven repository, e.g. the $HOME/.m2 directory. These builds will be tagged with a build "version" of "LOCAL" instead of an incrementing version number.

An example workflow for developing against a local clone of the software:

  1. Clone IHMC Open Robotics Software
  2. Make modifications
  3. Publish to your local $HOME/.m2 repository

To publish jars to your local Maven repository:

$ cd /path/to/ihmc-open-robotics-software
$ gradle publishAll -PcompositeSearchHeight=0

To depend on the jars in your local Maven repository:

In this example we'll set up a compile-time dependency on the locally built Simulation Construction Set project. In the build.gradle of the project you wish to link against Simulation Construction Set:

repositories {
  mavenLocal()
  <your other repositories>
}

dependencies {
  api("us.ihmc:simulation-construction-set:LOCAL") {
     changing = true
  }
}

Creating a project

To create a project that uses IHMC Open Robotics Software, your project hierarchy needs to take a particular form.

First be sure you have completed the section above titled "Clone repositories".

Next, create your project folder:

mkdir -p src/ihmc/my-project-a

Follow the project setup tutorial at https://github.com/ihmcrobotics/ihmc-build#quick-project-setup.

Your directory structure should now look something like:

src/ihmc
├── my-project-a
│   └── build.gradle.kts
│   └── gradle.properties
│   └── settings.gradle.kts
├── my-project-b
│   └── ...
├── ihmc-open-robotics-software
│   └── atlas
│   └── common-walking-control-modules
│   └── ...
├── my-multi-project-c
│   └── subproject-a
│   │  └── build.gradle.kts
│   └── subproject-b
│      └── build.gradle.kts
├── ...
├── build.gradle.kts
├── gradle.properties
└── settings.gradle.kts

If this is set up correctly, you will have applied the ihmc-build plugin and can use the dependency resolver methods exposed by the build extension. Alternatively, you can manually declare dependencies on projects using the normal Gradle syntax for project dependencies. A sample build.gradle dependency block:

/* Normal Gradle way */
dependencies {
  api(project(":ihmc-open-robotics-software:ihmc-java-toolkit"))
  testApi(project(":ihmc-open-robotics-software:ihmc-java-toolkit-test"))
}

/* ihmc-build way */
mainDependencies {
  api("us.ihmc:ihmc-java-toolkit:source")
}
testDependencies {
  api("us.ihmc:ihmc-java-toolkit-test:source")
}

Maintainers

ihmc-open-robotics-software's People

Contributors

aprvs, bhavyanshm, btshrewsbury, calvertdw, christiandebuys, cmschmid, ds58, eaozone, edilee, egalbally, james-p-foster, jcarff, jerrypratt, jespersmith, kylc, lessthanoptimal, luigipenco93, mahopkins, mj-fl, nathaniel0728, neyssette, perrymacmurray, pneuhaus, potatopeeler3000, rjgriffin42, stephenmcc, sylvainbertrand, tkoolen, tomasztb, wmrifenb


ihmc-open-robotics-software's Issues

Remove yaw drift

We have now removed the IMU with the magnetometers and are back to using the original version (model 15). The orientation is now stable and we have the state estimation performance we saw in training. Happy days!

That still includes a 1-2 degree yaw bias per minute, which we really want to remove. I see some IHMC code modules that do bias estimation:

  1. Does this module work? I remember Peter talking about this for Atlas.
  2. Is it enabled, or can we enable it?
  3. If no to 1 and 2, can we feed back a bias estimate (rather than just the position correction) to your state estimator?

The joint: leftHipYaw has not been registered.

Running the new-api-beta1 on a Valkyrie
"Be a robot"
roslaunch val_deploy val_controller_manager.launch model_file:=model/urdf/valkyrie_C_no_hands.urdf

roslaunch ihmc_valkyrie_ros val_wholebody_control_robot.launch

The controller partially starts and is selectable from remoteValkyrieVisualizer; however, selecting HighLevelState -> walking does nothing.

[ INFO] [1462985624.653557850]: Starting JVM with arguments: -Djava.class.path=ValkyrieController.jar -XX:+UseSerialGC -Xmx10g -Xms10g -XX:NewSize=8g -XX:MaxNewSize=8g -XX:CompileThreshold=1000 -verbosegc -Djava.library.path=lib/
Starting Java VM from path /home/val/valkyrie
Started Java VM: success
[ INFO] [1462985624.949677928]: Partying hard with max memory of: 3720871936
[ INFO] [1462985625.163270628]: Loading robot model from: 'models/val_description/sdf/valkyrie_C.sdf'
[ INFO] [1462985627.890139359]: INFO: Looking for network parameters in network parameters file at /home/val/.ihmc/IHMCNetworkParameters.ini
[ INFO] [1462985627.890958536]: INFO: Found Network parameters file at /home/val/.ihmc/IHMCNetworkParameters.ini
[ INFO] [1462985627.892397782]: INFO: Looking for network parameters in environment variables
[ INFO] [1462985627.892870819]: INFO: Environment variables will override entries in the network parameters file.
Attaching native thread 24639 with priority 45 to JVM
[ INFO] [1462985628.565184643]: Looking for forceSensorDefinition leftAnkleRoll
[ INFO] [1462985628.569506447]: Looking for forceSensorDefinition rightAnkleRoll
[ INFO] [1462985630.401027353]: PelvisRotationalStateUpdater: More than 1 IMU sensor, using only the first one: pelvis_pelvisRearImu
[ INFO] [1462985630.473485884]: PelvisIMUBasedLinearStateCalculator: More than 1 IMU sensor, using only the first one: pelvis_pelvisRearImu
Attaching native thread 24685 with priority 45 to JVM
[ INFO] [1462985634.281517648]: Announcing logging session on: name:eth0 (eth0)
[ INFO] [1462985634.326534100]: Trying port 56572
Attaching native thread 24971 with priority 40 to JVM
Attaching native thread 24973 with priority 94 to JVM
[ INFO] [1462985645.759891008]: Setting estimator thread affinity to processor 1
[ERROR] [1462985645.892412199]: Exception in thread "Thread-6" java.lang.RuntimeException: java.lang.RuntimeException: The joint: leftHipYaw has not been registered.
[ERROR] [1462985645.892545157]: at us.ihmc.wholeBodyController.DRCControllerThread.run(DRCControllerThread.java:347)
[ERROR] [1462985645.892646076]: at us.ihmc.wholeBodyController.concurrent.MultiThreadedRealTimeRobotController$MultiThreadedRobotControlElementRunner.run(MultiThreadedRealTimeRobotController.java:79)
[ERROR] [1462985645.892726211]: at us.ihmc.realtime.RealtimeThread.runFromNative(RealtimeThread.java:133)
[ERROR] [1462985645.892911464]: Caused by: java.lang.RuntimeException: The joint: leftHipYaw has not been registered.
[ERROR] [1462985645.893013333]: at us.ihmc.commonWalkingControlModules.controllerCore.command.lowLevel.LowLevelOneDoFJointDesiredDataHolder.throwJointNotRegisteredException(LowLevelOneDoFJointDesiredDataHolder.java:409)
[ERROR] [1462985645.893109047]: at us.ihmc.commonWalkingControlModules.controllerCore.command.lowLevel.LowLevelOneDoFJointDesiredDataHolder.setDesiredJointTorque(LowLevelOneDoFJointDesiredDataHolder.java:176)
[ERROR] [1462985645.893217997]: at us.ihmc.commonWalkingControlModules.controllerCore.command.lowLevel.LowLevelOneDoFJointDesiredDataHolder.setDesiredTorqueFromJoints(LowLevelOneDoFJointDesiredDataHolder.java:123)
[ERROR] [1462985645.893306241]: at us.ihmc.wholeBodyController.diagnostics.DiagnosticsWhenHangingController.doControl(DiagnosticsWhenHangingController.java:320)
[ERROR] [1462985645.893392394]: at us.ihmc.wholeBodyController.diagnostics.DiagnosticsWhenHangingController.doAction(DiagnosticsWhenHangingController.java:860)
[ERROR] [1462985645.893472100]: at us.ihmc.robotics.stateMachines.GenericStateMachine.doAction(GenericStateMachine.java:114)
[ERROR] [1462985645.893562863]: at us.ihmc.commonWalkingControlModules.highLevelHumanoidControl.HighLevelHumanoidControllerManager.doControl(HighLevelHumanoidControllerManager.java:172)
[ERROR] [1462985645.893646710]: at us.ihmc.simulationconstructionset.robotController.ModularRobotController.doControl(ModularRobotController.java:20)
[ERROR] [1462985645.893724373]: at us.ihmc.wholeBodyController.DRCControllerThread.run(DRCControllerThread.java:339)
[ERROR] [1462985645.893799006]: ... 2 more
[ INFO] [1462985647.791642668]: Accepted client: java.nio.channels.SocketChannel[connected local=/10.185.0.30:56572 remote=/10.185.0.5:45825]

Valkyrie Simulator does not obey joint limits / does not match URDF joint limits

Originally reported by: Wolf Merkt (Bitbucket: wxmerkt, GitHub: Unknown)


Values returned from the ValkyrieROSAPISimulator are often outside the joint limits.

Hence, it'd be good if either the URDFs reflected the true joint limits or the simulator did not exceed them.

We are seeing this issue with the most recent, frozen SVN version. Joints that we noticed deviating from the joint limits in the URDF files include, e.g., upperNeckPitch and neckYaw.

Could you perhaps comment on how to work around this issue in the meantime?

Affected joints:

  • Neck Joints (yaw, pitch)
  • Ankle Joints (pitch)

Thanks,
Wolfgang
(Edinburgh)


ROS_MASTER_URI not correctly parsed

Not explicitly setting the IP address in the ROS_MASTER_URI causes the IHMC API to be unable to hear incoming command messages. (This does not affect outgoing sensor messages.)

Explicitly setting the IP in the URI currently works:

  • export

The default URI that is set when sourcing ROS does not work:
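As a debugging aid, the host the JVM actually extracts from the URI can be checked with a few lines of standard Java. This is a diagnostic sketch, not IHMC code; the http://localhost:11311 fallback used here is the stock ROS default and is only an assumption about the environment:

```java
import java.net.URI;

// Quick diagnostic: which host does the JVM see in ROS_MASTER_URI?
// A URI like http://localhost:11311 resolves to the loopback interface,
// which would explain why incoming command messages are never heard.
public class RosMasterUriCheck {
    public static String hostOf(String rosMasterUri) {
        return URI.create(rosMasterUri).getHost();
    }

    public static void main(String[] args) {
        String uri = System.getenv().getOrDefault("ROS_MASTER_URI", "http://localhost:11311");
        System.out.println("ROS master host: " + hostOf(uri));
    }
}
```

If this prints localhost on a multi-machine setup, setting the explicit IP in the URI (as noted above) is the workaround.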

IHMC Log Files should log stdout/stderr

IHMCLogger should log stdout/stderr so we can get at output from runs when robots are not being managed using IHMC Mission Control (currently, the only robot managed by IHMC Mission Control is Atlas).

There is an internal ticket on the IHMC JIRA for this, but as JIRA isn't public yet and we don't have a nice way of bridging our JIRA with GitHub Issues, I wanted to file this issue as well.

IHMC JIRA key is ROSI-381.

PointCloud2 from Multisense SL in SCS

Would it be possible to add support for PointCloud2 msg type from the Multisense SL in the SCS simulator?

(Moved from ihmc_valkyrie_ros as suggested by Doug)

add artificial noise to simulated state estimate

@dljsjr or @SylvainBertrand: is there a good way of adding drift to the state estimator (in sim)?
The idea would be to change the x, y, z and yaw of the floating base while walking.
I'm guessing you might have a way of turning it on for a unit test.

I have been able to kind of get something to work by adding to rootJointPosition in:

PelvisLinearStateUpdater.computePositionFromMergingMeasurements()

This causes the estimator to jump forward a few cm on occasion but isn't working right:

FramePoint point = new FramePoint();
point.set(0.003, -0.003, 0.0);   // constant offset added to the merged measurement every tick
rootJointPosition.add(point);

(Thanks a lot for adding back in the localization module @dljsjr , it works just fine!)
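A rate-based variant of the idea above, which accumulates a small offset every tick instead of adding the same constant to each merged measurement, might behave more like real drift. This is a plain-Java sketch; the class and method names are illustrative, not IHMC API:

```java
// Accumulates an artificial drift at a fixed rate per control tick.
// Drift applied this way grows smoothly, instead of producing the
// occasional jump seen when a constant offset is added per call.
public class DriftInjector {
    private final double xRatePerTick, yRatePerTick, yawRatePerTick;
    private double x, y, yaw;

    public DriftInjector(double xRatePerTick, double yRatePerTick, double yawRatePerTick) {
        this.xRatePerTick = xRatePerTick;
        this.yRatePerTick = yRatePerTick;
        this.yawRatePerTick = yawRatePerTick;
    }

    // Call once per estimator update.
    public void tick() {
        x += xRatePerTick;
        y += yRatePerTick;
        yaw += yawRatePerTick;
    }

    public double getX()   { return x; }
    public double getY()   { return y; }
    public double getYaw() { return yaw; }
}
```

The accumulated x/y/yaw offsets would then be added to the root joint pose each estimator update, e.g. after computePositionFromMergingMeasurements() runs.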

OutOfMemoryError: Direct Buffer Memory when starting val_wholebody_control_robot.launch

[ERROR] [1465575403.536153554]: java.lang.OutOfMemoryError: Direct buffer memory
[ERROR] [1465575403.536453683]: at java.nio.Bits.reserveMemory(Bits.java:658)
[ERROR] [1465575403.536588590]: at java.nio.DirectByteBuffer.<init>(DirectByteBuffer.java:123)
[ERROR] [1465575403.536665159]: at java.nio.ByteBuffer.allocateDirect(ByteBuffer.java:306)
[ERROR] [1465575403.536824277]: at us.ihmc.multicastLogDataProtocol.MultiClientStreamingDataTCPServer$ClientHandler$1.newInstance(MultiClientStreamingDataTCPServer.java:194)
[ERROR] [1465575403.536917967]: at us.ihmc.multicastLogDataProtocol.MultiClientStreamingDataTCPServer$ClientHandler$1.newInstance(MultiClientStreamingDataTCPServer.java:189)
[ERROR] [1465575403.536991759]: at us.ihmc.concurrent.ConcurrentRingBuffer.<init>(ConcurrentRingBuffer.java:90)
[ERROR] [1465575403.537073445]: at us.ihmc.multicastLogDataProtocol.MultiClientStreamingDataTCPServer$ClientHandler.<init>(MultiClientStreamingDataTCPServer.java:188)
[ERROR] [1465575403.538797140]: at us.ihmc.multicastLogDataProtocol.MultiClientStreamingDataTCPServer$ClientHandler.<init>(MultiClientStreamingDataTCPServer.java:172)
[ERROR] [1465575403.538896573]: at us.ihmc.multicastLogDataProtocol.MultiClientStreamingDataTCPServer.<init>(MultiClientStreamingDataTCPServer.java:44)
[ERROR] [1465575403.538984550]: at us.ihmc.multicastLogDataProtocol.MultiClientStreamingDataTCPServer.<init>(MultiClientStreamingDataTCPServer.java:52)
[ERROR] [1465575403.539060087]: at us.ihmc.robotDataCommunication.YoVariableProducer.start(YoVariableProducer.java:89)
[ERROR] [1465575403.540361913]: at us.ihmc.robotDataCommunication.YoVariableServer.start(YoVariableServer.java:95)
[ERROR] [1465575403.540690742]: at us.ihmc.valkyrieRosControl.ValkyrieRosControlController.init(ValkyrieRosControlController.java:256)
[ERROR] [1465575403.540781285]: at us.ihmc.rosControl.IHMCRosControlJavaBridge.initFromNative(IHMCRosControlJavaBridge.java:103)
[ERROR] [1465575403.540938449]: Initializing controller 'ihmc_valkyrie_control_java_bridge' failed

We have also been getting this error when running the controller for Gazebo. Any ideas?

NullPointerException in JointPrivilegedConfigurationHandler

In JointPrivilegedConfigurationHandler.java, the method submitPrivilegedConfigurationCommand throws a NullPointerException. I have traced it to around line 159:

int jointIndex = jointIndices.get(joint).intValue();

I added a check:

if (jointIndices.get(joint) == null)
   System.err.println("JOINT INDICES IS NULL " + joint.getName());

which prints leftForearmYaw. Is leftForearmYaw even a joint we are supposed to be able to move?

Full exception
[ERROR] [1466027433.206373074]: Exception in thread "Thread-6" java.lang.RuntimeException: java.lang.NullPointerException
[ERROR] [1466027433.206484247]: at us.ihmc.wholeBodyController.DRCControllerThread.run(DRCControllerThread.java:347)
[ERROR] [1466027433.206575785]: at us.ihmc.wholeBodyController.concurrent.MultiThreadedRealTimeRobotController$MultiThreadedRobotControlElementRunner.run(MultiThreadedRealTimeRobotController.java:79)
[ERROR] [1466027433.206648428]: at us.ihmc.realtime.RealtimeThread.runFromNative(RealtimeThread.java:133)
[ERROR] [1466027433.206813253]: Caused by: java.lang.NullPointerException
[ERROR] [1466027433.206926639]: at us.ihmc.commonWalkingControlModules.inverseKinematics.JointPrivilegedConfigurationHandler.submitPrivilegedConfigurationCommand(JointPrivilegedConfigurationHandler.java:164)
[ERROR] [1466027433.207017935]: at us.ihmc.commonWalkingControlModules.momentumBasedController.optimization.MotionQPInputCalculator.updatePrivilegedConfiguration(MotionQPInputCalculator.java:131)
[ERROR] [1466027433.207114014]: at us.ihmc.commonWalkingControlModules.momentumBasedController.optimization.InverseDynamicsOptimizationControlModule.submitPrivilegedConfigurationCommand(InverseDynamicsOptimizationControlModule.java:276)
[ERROR] [1466027433.207202409]: at us.ihmc.commonWalkingControlModules.controllerCore.WholeBodyInverseDynamicsSolver.submitInverseDynamicsCommandList(WholeBodyInverseDynamicsSolver.java:216)
[ERROR] [1466027433.207282803]: at us.ihmc.commonWalkingControlModules.controllerCore.WholeBodyControllerCore.submitControllerCoreCommand(WholeBodyControllerCore.java:87)
[ERROR] [1466027433.207365684]: at us.ihmc.commonWalkingControlModules.highLevelHumanoidControl.HighLevelHumanoidControllerManager.doControl(HighLevelHumanoidControllerManager.java:174)
[ERROR] [1466027433.207441665]: at us.ihmc.simulationconstructionset.robotController.ModularRobotController.doControl(ModularRobotController.java:20)
[ERROR] [1466027433.207512246]: at us.ihmc.wholeBodyController.DRCControllerThread.run(DRCControllerThread.java:339)

Have the high-level behavior STANDING be reported only when walking has fully finished

the high level state of the controller is reported as an integer in:
ihmc_ros/valkyrie/output/behavior

When walking, this value is reported as WALKING (i.e. 4). When the robot puts its foot on the ground for the final step in a walking plan, this immediately changes to STANDING (i.e. 3). But the robot is still moving: it settles its foot on the ground and sways into the standing position, which takes about 3 seconds.

@dljsjr, would it be possible to have this state transition happen after the robot has come to a standstill? Accounting for this is confusing for autonomy (and different from how BDI behaved).
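For autonomy code consuming this topic, the integer can be decoded roughly as follows. Only the two values stated in this report (STANDING = 3, WALKING = 4) are included; the rest of the controller's enum is not reproduced here:

```java
// Decodes the integer published on ihmc_ros/valkyrie/output/behavior.
// STANDING = 3 and WALKING = 4 come from the report above; any other
// codes are unknown in this sketch and reported as such.
public class HighLevelBehavior {
    public static final int STANDING = 3;
    public static final int WALKING = 4;

    public static String name(int code) {
        switch (code) {
            case STANDING: return "STANDING";
            case WALKING:  return "WALKING";
            default:       return "UNKNOWN(" + code + ")";
        }
    }
}
```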

Specs of the "New API"

Hi @dljsjr,
Currently, talk of the "current/legacy" versus "new" API is very high-level for us, so it is hard to figure out what the impact will be and how it will affect us.

Hence, would it be possible for us to receive a specification document for the new API? We'd like to see

  • what will change,
  • how it will be designed (what endpoints, what messages, where do they feed to in the controller, what will we get back as feedback),
  • and also have a note on any breaking/behaviour changes.

Also, if you could give us a heads-up about what changes with the controller during the rewrite (on a functional level) that would help immensely. Getting our hands on a spec document now will also enable us to give feedback early on to save engineer hours later regarding feature/amendment requests.

Thanks!

Valkyrie Forearm Control

Hi!
We'd like to know the status of controlling the position-controlled forearm (not finger!) joints in the whole-body controller: is that already supported and exercised?

Thanks!

Valkyrie starting procedure should be simplified

Only one command should be needed from the user to:

  • Perform the calibration of the joint torque offsets and the foot six-axis force sensors
  • Start the walking controller as soon as the robot touches the ground.

SCS Simulator on Multiple Monitors - Hybrid Laptop

Originally reported by: Velin Dimitrov (Bitbucket: velinddimitrov, GitHub: velinddimitrov)


The SCS simulator appears to fail when launching from Eclipse on a hybrid laptop (a W530 in this case). The error appears to be related to parsing the connected displays, especially when they are connected to digital ports on the discrete graphics card.

SCS works fine when the laptop is not docked and using the built-in display. Using NVidia-352 drivers on Ubuntu 14.04.

Appears related to this: LWJGL/lwjgl#112

Maybe a new version of LWJGL will fix this? I'm not sure.

Console output from Eclipse:
INFO: Loading network parameters from /home/vdimitrov/workspace/IHMCOpenRoboticsSoftware/Valkyrie/IHMCNetworkParameters.ini
Exception in thread "jME3 Main" java.lang.ExceptionInInitializerError
at org.lwjgl.opengl.Pbuffer.getCapabilities(Pbuffer.java:259)
at com.jme3.system.lwjgl.LwjglOffscreenBuffer.initInThread(LwjglOffscreenBuffer.java:58)
at com.jme3.system.lwjgl.LwjglOffscreenBuffer.run(LwjglOffscreenBuffer.java:154)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ArrayIndexOutOfBoundsException: 0
at org.lwjgl.opengl.LinuxDisplay.getAvailableDisplayModes(LinuxDisplay.java:951)
at org.lwjgl.opengl.LinuxDisplay.init(LinuxDisplay.java:738)
at org.lwjgl.opengl.Display.<clinit>(Display.java:138)
... 4 more

xrandr output:
Screen 0: minimum 8 x 8, current 3840 x 1200, maximum 16384 x 16384
LVDS-0 disconnected (normal left inverted right x axis y axis)
DP-0 disconnected (normal left inverted right x axis y axis)
DP-1 disconnected (normal left inverted right x axis y axis)
DP-2 connected 1920x1200+1920+0 (normal left inverted right x axis y axis) 518mm x 324mm
1920x1200 60.0_+
1920x1080 60.0
1680x1050 60.0
1600x1200 60.0
1280x1024 60.0
1280x960 60.0
1024x768 60.0
800x600 60.3
640x480 59.9
DP-3 disconnected (normal left inverted right x axis y axis)
DP-4 connected primary 1920x1200+0+0 (normal left inverted right x axis y axis) 546mm x 352mm
1920x1200 60.0_+
1920x1080 60.0 60.0 59.9 50.0 60.1 60.0 50.0
1680x1050 60.0
1600x1200 60.0
1440x900 59.9
1366x768 59.8
1280x1024 75.0 60.0
1280x960 60.0
1280x720 60.0 59.9 50.0
1152x864 75.0
1024x768 75.0 70.1 60.0
800x600 75.0 72.2 60.3 56.2
720x576 50.0
720x480 59.9
640x480 75.0 72.8 59.9 59.9
DP-5 disconnected (normal left inverted right x axis y axis)
LVDS1 connected
1920x1080 60.0 + 59.9 50.0
1680x1050 60.0 59.9
1600x1024 60.2
1400x1050 60.0
1280x1024 60.0
1440x900 59.9
1280x960 60.0
1360x768 59.8 60.0
1152x864 60.0
1024x768 60.0
800x600 60.3 56.2
640x480 59.9
VGA1 disconnected
VIRTUAL1 disconnected
1680x1050 (0x31c) 146.2MHz
h: width 1680 start 1784 end 1960 total 2240 skew 0 clock 65.3KHz
v: height 1050 start 1053 end 1059 total 1089 clock 60.0Hz
1280x1024 (0x31e) 108.0MHz
h: width 1280 start 1328 end 1440 total 1688 skew 0 clock 64.0KHz
v: height 1024 start 1025 end 1028 total 1066 clock 60.0Hz
1440x900 (0x32b) 106.5MHz
h: width 1440 start 1520 end 1672 total 1904 skew 0 clock 55.9KHz
v: height 900 start 903 end 909 total 934 clock 59.9Hz
1280x960 (0x31f) 108.0MHz
h: width 1280 start 1376 end 1488 total 1800 skew 0 clock 60.0KHz
v: height 960 start 961 end 964 total 1000 clock 60.0Hz
1024x768 (0x320) 65.0MHz
h: width 1024 start 1048 end 1184 total 1344 skew 0 clock 48.4KHz
v: height 768 start 771 end 777 total 806 clock 60.0Hz
800x600 (0x321) 40.0MHz
h: width 800 start 840 end 968 total 1056 skew 0 clock 37.9KHz
v: height 600 start 601 end 605 total 628 clock 60.3Hz
800x600 (0x336) 36.0MHz
h: width 800 start 824 end 896 total 1024 skew 0 clock 35.2KHz
v: height 600 start 601 end 603 total 625 clock 56.2Hz
640x480 (0x322) 25.2MHz
h: width 640 start 656 end 752 total 800 skew 0 clock 31.5KHz
v: height 480 start 490 end 492 total 525 clock 59.9Hz


Support hand state and control in sim

Originally reported by: Maurice Fallon (Bitbucket: mauricefallon, GitHub: mauricefallon)


Publishing the Valkyrie hand state (i.e. joint angles) and controlling the hands (open/close/position) aren't currently exposed in simulation. Could this be added?

Some hand state integration for Atlas used to exist in AtlasObstacleInterfaceDemo, but I believe that has now been removed.

( @dljsjr suggested you wanted to make the sim interface more complete)



Arm controller with separate PID gains for each joint

Originally reported by: Vladimir Ivan (Bitbucket: vivan, GitHub: vivan)


The PID controllers for the arm joints currently share one set of PID gains between them.
These gains are set up in ManipulationControlModule.java on line 98:

YoPIDGains jointspaceControlGains = armControllerParameters.createJointspaceControlGains(registry);

This severely limits the accuracy and responsiveness of the arms. Each joint drives a link(age) with different inertia properties, so sharing the PID gains forces them to be set to conservative values.

We (the Edinburgh team) would like to tune these gains and run a couple of experiments while we are at JSC. This is high on our priority list, since we can't execute any manipulation/reaching plans without good tracking performance.

I'm happy to help with making this happen.
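The requested change amounts to replacing the single shared gains object with a per-joint lookup that falls back to shared defaults. Here is a minimal plain-Java sketch; the class and field names are hypothetical, and the real implementation would go through the YoVariable registry rather than plain fields:

```java
import java.util.HashMap;
import java.util.Map;

// Per-joint PID gains with a shared fallback, instead of one set of
// gains shared by every arm joint.
public class PerJointGains {
    public static final class PidGains {
        public final double kp, ki, kd;
        public PidGains(double kp, double ki, double kd) {
            this.kp = kp; this.ki = ki; this.kd = kd;
        }
    }

    private final Map<String, PidGains> gainsByJoint = new HashMap<>();
    private final PidGains defaultGains;

    public PerJointGains(PidGains defaultGains) {
        this.defaultGains = defaultGains;
    }

    public void setGains(String jointName, PidGains gains) {
        gainsByJoint.put(jointName, gains);
    }

    // Joints that were not individually tuned keep the shared defaults.
    public PidGains getGains(String jointName) {
        return gainsByJoint.getOrDefault(jointName, defaultGains);
    }
}
```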


Integration of laser-based localization

@dljsjr: I am introducing a laser-based module to correct the drift in Valkyrie's state estimation. I am subscribing to the /ihmc_ros/valkyrie/output/robot_pose message generated by the IHMC state estimator, computing a correction using ICP-based point-cloud registration, and publishing a corrected pose.

It would be great if the state estimator could read this message and update the state. Would you be happy to re-introduce the interface for this in the IHMC code base? I refer to the topics /ihmc_ros/localization/pelvis_odom_pose_correction and /ihmc_ros/localization/pelvis_pose_correction which do not seem to be active anymore.

Robot might be stomping when taking a step

When taking a step, the robot seems to bring the foot down faster than it brings it up. There doesn't seem to be enough force to trigger an e-stop or fault (so it might be normal), but it looks like it is using more force than needed.

To replicate:
Either in sim or on the robot, send a footstep data list with step commands. The test in the attached data file was run on the simulator, sending first a footstep in place, then several with forward movement; Z is set to the value of the current foot position (from the tf tree) in all cases.

log file: https://www.dropbox.com/s/9aju9kic1g04df7/maybeStomping.data.data.gz?dl=0

Max torque deflection

For the torque deflection model the controller uses very different upper limits on the deflection:
ValkyrieStateEstimatorParameters -> jointAngleMaxDeflection = 0.25
AtlasStateEstimatorParameters -> jointAngleMaxDeflection = 0.1

0.1 rad (about 6 degrees) is pretty large, but 0.25 rad (about 14 degrees) is incredibly high. I don't think the joint would ever reach this limit without roughly 2000 Nm of torque. I don't think Val can generate that ;)

@SylvainBertrand increased it from 0.2 to 0.25 in late January. Any idea why?

@manuelli
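As a sanity check on the numbers above: if the deflection model is a linear torsional spring (tau = k * theta), the torque needed to reach a limit scales with the joint stiffness. The stiffness below is a purely hypothetical value, chosen so that the 0.25 rad limit corresponds to the 2000 Nm figure mentioned in this report:

```java
// Torque required to reach a given deflection under a linear spring model.
public class DeflectionCheck {
    public static double torqueAtDeflection(double stiffnessNmPerRad, double deflectionRad) {
        return stiffnessNmPerRad * deflectionRad;
    }

    public static void main(String[] args) {
        double k = 8000.0; // hypothetical joint stiffness [Nm/rad], not from the codebase
        System.out.println(torqueAtDeflection(k, 0.10)); // Atlas limit
        System.out.println(torqueAtDeflection(k, 0.25)); // Valkyrie limit, ~2000 Nm
    }
}
```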

Issues with WholeBodyTrajectoryRosMessage in new API

We've updated nearly all of our interface code to the new API. Footstep plans and individual arm trajectories are fine, as are all the incoming sensor signals. (Yeah!)

But I'm having lots of issues with WholeBodyTrajectoryRosMessage (which we plan to use a lot). These issues are likely to be trivial. I'm guessing the bits of information which make up the message are not being properly set:

ISSUE ONE
It seems that I cannot provide only a portion of the WholeBodyTrajectoryRosMessage, e.g. arm/pelvis trajectories but not hand and foot poses (which would contradict them). I was able to set execution_mode=QUEUE for the hand/feet trajectories, which forces the controller to ignore those parts of the message (it does print a warning), but I don't think this is what you intend. I also explored changing the unique_id bit to 0 for these parts, which I think is not being checked for these unwanted components.

In the old API you could provide an empty list and the controller would ignore the components I don't want to provide.

ISSUE TWO
Pelvis tracking seems to be completely wrong for the WholeBodyTrajectoryRosMessage. It looks like the robot is shifting onto the left foot, which I didn't tell it to do; it looks like it's standing on one foot. (Occasionally the controller crashed and mentioned the Flamingo stance.)

So that you can easily reproduce, I've created a 50 line python script which will send a simple WholeBodyTrajectoryRosMessage. I expect this script will raise up the arms and shift the pelvis to the back & left by 6cm:
https://www.dropbox.com/sh/2ahizs03s8fwshl/AAD04x4huNFp4e9AWAeY9Yd5a?dl=0

But currently the pelvis is moving in the wrong direction so as to stand on one foot:
https://www.dropbox.com/s/zzwkmdro263kqon/simple_command_whole_body.m4v?dl=0

There are also simple pelvis and arm scripts which do work exactly as expected, and in theory the WholeBodyTrajectoryRosMessage should just be a combination of them.

We would really appreciate very simple python scripts to demo what should be the input for various messages. I remember such scripts from DRCSim being a huge time saver.

I see that you are busy with Unit A, and expect it to take time for you to test this.

Valkyrie feet contact points appear to have moved

I'm wondering if the contact points for Valkyrie's feet have changed, or possibly the URDF containing them? I notice the contact points are not at the corners of the feet; they are now a few cm further forward. Is this intentional?

This is what they look like:
[image: val_contact_points]

And Atlas for comparison has them right at the back of the feet:
[image: atlas_contact_points]

FYI: @jlack1987 @dljsjr

whole_body_trajectory message with two trajectory points close together in time blows up the controller

Originally reported by: Vladimir Ivan (Bitbucket: vivan, GitHub: vivan)


When a whole_body_trajectory contains two trajectory points close in time (<0.2 s), the robot jumps or even blows up (in the simulation).

This behaviour can be reproduced in the simulator:

  1. Run ValkyrieROSSimulator.java
  2. Publish the control message from the attached ROS bag file.

The smaller the time between the two trajectory points, the bigger the effect.
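Until the controller guards against this, a sender-side check is easy to add. This is a sketch in plain Java; the 0.2 s threshold comes from this report, and representing the trajectory as an array of point times is an assumption about the message layout:

```java
// Rejects trajectories whose consecutive points are closer in time than
// a minimum spacing, which this report says destabilizes the controller.
public class TrajectoryTimeSpacingCheck {
    public static boolean hasMinimumSpacing(double[] timesSeconds, double minDtSeconds) {
        for (int i = 1; i < timesSeconds.length; i++) {
            if (timesSeconds[i] - timesSeconds[i - 1] < minDtSeconds)
                return false;
        }
        return true;
    }
}
```

A sender would call hasMinimumSpacing(times, 0.2) before publishing and re-time the trajectory if it fails.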


Simulated head control doesn't match reality

Originally reported by: Maurice Fallon (Bitbucket: mauricefallon, GitHub: mauricefallon)


We would like to implement automated head tracking ('Look At Target'). Currently, head orientation control is implemented differently on the robot (QP-based) and in the simulator (joint position control).

It's not clear if the same HeadOrientationPacket commands in simulation and reality achieve the same look direction. Do they?

This issue was mentioned here:
https://bitbucket.org/ihmcrobotics/ihmc_ros/issues/53/headorientationpacketmessage-documentation

BTW: @btshrewsbury's comment on that ticket mentions that the problem is the redundant joint (UPPER_NECK_PITCH), but couldn't you just remove it from the QP and not support looking up?


Valkyrie Simulator does not obey joint limits / does not match URDF joint limits

Originally reported by: Wolf Merkt (Bitbucket: wxmerkt, GitHub: Unknown)


Values returned from the ValkyrieROSAPISimulator are often outside the joint limits.

Hence, it'd be good if either the URDFs reflected the true joint limits or the simulator did not exceed them.

We are seeing this issue with the most recent, frozen SVN version. Joints that we noticed deviating from the joint limits in the URDF files include upperNeckPitch and neckYaw.

Could you perhaps comment on how to work around this issue in the meantime?

Affected joints:

  • Neck Joints (yaw, pitch)
  • Ankle Joints (pitch)

Thanks,
Wolfgang
(Edinburgh)


ihmc_msgs dependency problem

This package does not properly declare its dependency on geometry_msgs.

Fix:
package.xml (add following lines):

  <build_depend>geometry_msgs</build_depend>
  <run_depend>geometry_msgs</run_depend>

CMakeLists.txt (change following lines):

catkin_package(
#  INCLUDE_DIRS include
#  LIBRARIES ihmc_msgs
  CATKIN_DEPENDS std_msgs geometry_msgs
#  DEPENDS system_lib
)

Left Elbow Tracking

We are carrying out some simple experiments to assess tracking performance with Valkyrie, and we are having issues with the left elbow joint. Basically, we cannot achieve angles greater than -1.4 rad. The issue shows up when we start the robot (it is not able to achieve the symmetric pose we command) and when we command arm motions; we cannot straighten the elbow even if we move that joint alone. The joint is not mechanically stuck, because we can move it manually when the robot is off. When the robot is on, we can push the forearm, but it immediately goes back to the wrong pose.

This is the wrong pose when we start the robot
wrong_starting_pose

When we tried to move just the left elbow joint we got this
leftelbowplot

Alex, our control guy, suggested that there could be an offset on the desired joint acceleration. We are still investigating, because it seems other joints can be affected (likely the shoulder pitch). We'll try to post more details here.
Can you help trace back the origin of this problem?

A small log can be found here
http://terminator.robots.inf.ed.ac.uk/public/logs/20160509_100353_left_elbow.zip

WholeBodyTrajectoryMessage is validated before getting processed

Validating the WholeBodyTrajectoryMessage, and especially the messages it contains, was overlooked when writing the new API.
This is done for other messages and has the advantage of preventing a controller crash while also telling the user why a message is not valid.
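A sketch of the kind of pre-check meant here, with purely illustrative field names (a plain-dict stand-in, not the actual IHMC message API):

```python
def validate_trajectory_message(msg):
    """Return a list of human-readable problems; an empty list means valid.

    `msg` is a plain dict standing in for a WholeBodyTrajectoryMessage;
    the field names ("trajectories", "time") are illustrative only.
    """
    problems = []
    for part_name, points in msg.get("trajectories", {}).items():
        if not points:
            problems.append("%s: empty trajectory" % part_name)
            continue
        last_t = 0.0
        for i, pt in enumerate(points):
            t = pt["time"]
            if t <= last_t:
                problems.append("%s: point %d at time %.3f is not strictly increasing"
                                % (part_name, i, t))
            last_t = t
    return problems

bad = {"trajectories": {"pelvis": [{"time": 1.0}, {"time": 0.5}]}}
for p in validate_trajectory_message(bad):
    print(p)  # reported to the user instead of crashing the controller
```

Rejecting the message with a reason, rather than forwarding it to the controller, is the behavior the other messages already get.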

Bug in RigidBody.java

Originally reported by: Twan Koolen (Bitbucket: tkoolen, GitHub: tkoolen)


Somebody added the CoM offset as a separate field in us.ihmc.robotics.screwTheory.RigidBody.java. This information already exists in its RigidBodyInertia. Two issues:

  • the way it is computed in the constructor on line 41 is wrong. It should be computed from the RigidBodyInertia's crossPart (which is probably minus com offset * mass in the RigidBodyInertia's body frame, for historical reasons; check this)
  • there is a setCoMOffset method, which only changes the comOffset member but doesn't update the inertia member.
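For reference, the relationship in the first bullet can be sketched as follows, with the sign convention left as an explicit flag since that convention is exactly what the report says needs checking (illustrative Python, not the Java in RigidBody.java):

```python
def com_offset_from_inertia(cross_part, mass, cross_is_minus_m_c=True):
    """Recover the CoM offset c from a spatial inertia's cross term.

    If the cross term is stored as -m*c (the historical convention the
    report suspects), then c = -cross_part / m; otherwise c = cross_part / m.
    `cross_part` is a 3-vector in the inertia's body frame.
    """
    sign = -1.0 if cross_is_minus_m_c else 1.0
    return [sign * x / mass for x in cross_part]

# mass 2 kg, cross term (-0.2, 0.4, 0.6) under the suspected -m*c convention
print(com_offset_from_inertia([-0.2, 0.4, 0.6], 2.0))
```

The point being: the constructor should derive comOffset from the RigidBodyInertia's crossPart this way (once the sign is confirmed), rather than compute it independently.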

By the way, I'm not very happy that you didn't preserve the revision history for the entire IHMCUtilities project (and presumably others as well).


received plan feedback message

In the old API there was a message called LastReceivedMessage.msg. This was published whenever the controller received a message from us.
It used to be published on /ihmc_ros/valkyrie/output/last_received_message

We used it for a few things (to check that messages were not being dropped, to sequence together autonomous actions and to test plan tracking)

It seems to have been removed around commit 19d71e3.

@dljsjr : Could you add it back in?

(this is @mauricefallon)

Streaming head orientation messages blocks controller

Originally reported by: Maurice Fallon (Bitbucket: mauricefallon, GitHub: mauricefallon)


I observed that if HeadOrientationPacketMessage is streamed (e.g. at 100 Hz) to simulated Valkyrie in SCS, the head doesn't move at all.

(The idea is that the camera is tracking something and streams HeadOrientationPacketMessage messages, causing the head to look at the target.)

I guess that the new messages overwrite the gains of the controller?

Sending a single individual command (and then stopping) is fine.


Minimal pose changes lead to qdd_d close to 0, freakout

This is a follow up on the Valkyrie controller freakout issue reported last night by @mauricefallon and @jlack1987. I was able to reliably recreate a similar situation in simulation.

Observation: if a second trajectory that is only minimally different from the first is sent close to joint limits, the qdd_d values are all 0 (or very close to it), and the robot freaks out / the simulator stops.

The console output is:

Loading robot model from: 'models/val_description/sdf/valkyrie_sim.sdf'
[INFO] (NetworkParameters.java:34): Loading network parameters from /home/wxm/.ihmc/IHMCNetworkParameters.ini
[WARN] (JointStateUpdater.java:90): Could not find the given pelvis and/or chest IMUs: pelvisIMU = pelvis_pelvisRearImu, chestIMU = torso_leftTorsoImu
[WARN] (JointStateUpdater.java:98): Pelvis IMU is null.
[WARN] (DRCSimulationFactory.java:276): Initializing Estimator to Actual!
[INFO] (DRCNetworkProcessor.java:290): Connecting to controller using intra process communication
[INFO] (OffscreenBufferVideoServer.java:39): Starting video stream
[INFO] (JMERenderer.java:859): GPULidar scene updated. Took 5.4E-4 s
[INFO] (JMERenderer.java:766): A PBOAwtPanel showed on screen.
[INFO] (JMERenderer.java:859): GPULidar scene updated. Took 1.5E-5 s
IHMC ROS API node successfully started.
[WARN] (HandControlModule.java:914): Waypoint 18 is out of the joint position limits, cancelling action.

The git revision is 3e86994. The git blame shows the last lines that output any message to be from the 24th of January (Valkyrie Operator Computer; cf. here).

Here's a minimal rosbag to recreate it: just start the ValkyrieROSAPISimulator or OpenHumanoidsSimulator, start playing the rosbag, and watch for when the LIDAR stops spinning / the HandControlModule message appears.

Here's the corresponding video; watch how the SCS threads die out. (We've reproduced this on a laptop; our workstations have more RAM and CPU, so that wasn't the issue.)

Can I provide any other logs that would be helpful?

cc: @dljsjr @SylvainBertrand

Falling behavior is improved

At the moment, when the robot falls, the controller switches to a zero-torque mode.
This needs to be changed to some sort of compliant joint PD control that brings the robot back to a stand-prep configuration.
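A minimal sketch of the intended behavior per joint (gains, limits, and names are illustrative placeholders, not tuned values or IHMC code):

```python
def compliant_pd_torque(q, qd, q_stand_prep, kp=50.0, kd=5.0, tau_max=30.0):
    """Joint-level PD toward a stand-prep configuration, with a torque clamp
    to keep the response compliant instead of rigidly position-controlled.
    All gains and the saturation limit here are illustrative only.
    """
    tau = kp * (q_stand_prep - q) - kd * qd
    # Saturate so a large pose error after a fall doesn't command huge torques.
    return max(-tau_max, min(tau_max, tau))

# Far from stand prep: torque saturates at the clamp instead of spiking.
print(compliant_pd_torque(q=0.8, qd=0.0, q_stand_prep=0.2))
# Near stand prep: small corrective torque.
print(compliant_pd_torque(q=0.25, qd=0.0, q_stand_prep=0.2))
```

The clamp is what distinguishes this from ordinary stiff PD: after a fall the pose error is large, and without saturation the joints would be driven hard rather than compliantly.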

rostopic low_level_control_mode not working

@dljsjr
After starting the robot up including running val_wholebody_control_robot.launch and startROSNetworkProcessor.sh I should be able to execute the command

rostopic pub /ihmc_ros/valkyrie/control/low_level_control_mode ihmc_valkyrie_ros/ValkyrieLowLevelControlModeRosMessage "requested_control_mode: 1
unique_id: 1"

And have the robot enter diagnostics mode; however, that does not happen. I get the following error from runROSNetworkProcessor:

Exception in thread "IntraprocessCommunicatorCallback-6007-thread-2" com.esotericsoftware.kryo.KryoException: java.lang.IllegalArgumentException: Class is not registered: us.ihmc.humanoidRobotics.communication.packets.valkyrie.ValkyrieLowLevelControlModeMessage$ControlMode
Note: To register this class use: kryo.register(us.ihmc.humanoidRobotics.communication.packets.valkyrie.ValkyrieLowLevelControlModeMessage$ControlMode.class);
Serialization trace:
requestedControlMode (us.ihmc.humanoidRobotics.communication.packets.valkyrie.ValkyrieLowLevelControlModeMessage)
at com.esotericsoftware.kryo.serializers.FieldSerializer$ObjectField.write(FieldSerializer.java:585)
at com.esotericsoftware.kryo.serializers.FieldSerializer.write(FieldSerializer.java:213)
at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:571)
at com.esotericsoftware.kryonet.KryoSerialization.write(KryoSerialization.java:50)
at com.esotericsoftware.kryonet.TcpConnection.send(TcpConnection.java:192)
at com.esotericsoftware.kryonet.Connection.sendTCP(Connection.java:59)
at us.ihmc.communication.net.KryoObjectClient.sendTCP(KryoObjectClient.java:111)
at us.ihmc.communication.net.KryoObjectCommunicator.send(KryoObjectCommunicator.java:151)
at us.ihmc.communication.packetCommunicator.PacketCommunicator.send(PacketCommunicator.java:163)
at us.ihmc.communication.PacketRouter.forwardPacket(PacketRouter.java:125)
at us.ihmc.communication.PacketRouter.processPacketRouting(PacketRouter.java:113)
at us.ihmc.communication.PacketRouter.access$100(PacketRouter.java:12)
at us.ihmc.communication.PacketRouter$PacketRoutingAction.receivedPacket(PacketRouter.java:280)
at us.ihmc.communication.packetCommunicator.PacketCommunicator$GlobalPacketObjectConsumer.consumeObject(PacketCommunicator.java:179)
at us.ihmc.communication.net.local.IntraprocessObjectCommunicator.receiveObject(IntraprocessObjectCommunicator.java:63)
at us.ihmc.communication.net.local.IntraprocessCommunicationNetwork$IntraprocessCommunicator$1.run(IntraprocessCommunicationNetwork.java:169)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.IllegalArgumentException: Class is not registered: us.ihmc.humanoidRobotics.communication.packets.valkyrie.ValkyrieLowLevelControlModeMessage$ControlMode
Note: To register this class use: kryo.register(us.ihmc.humanoidRobotics.communication.packets.valkyrie.ValkyrieLowLevelControlModeMessage$ControlMode.class);
at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:443)
at com.esotericsoftware.kryo.Kryo.getSerializer(Kryo.java:463)
at com.esotericsoftware.kryo.serializers.FieldSerializer$ObjectField.write(FieldSerializer.java:567)
... 18 more
INFO: Success! Connected KryoClient to SCS at ip /10.185.0.30 on port 4895

IHMC ROS API has a better workflow for using rosbags to reproduce reported bugs on other people's physical robots

CC'ing @iamwolf @jlack1987 @SylvainBertrand @mithrandir123 @kodyensley on this for transparency and profit.

Working with rosbags is currently a pain in the ass for us (IHMC) and we should resolve this somehow, so I'm opening up a discussion issue to bat ideas around. If we're going to support a ROS API then we need to support rosbags.

The rosbag problem is many-faceted:

  1. Transform information is lacking, which means that rosbags are useless when interacting with the real robot as the location of "world" will be poorly defined and change in between runs,
  2. Many IHMC engineers use Windows and OS X and so don't have a local ROS installation (yay rosjava), and
  3. rosjava does not have an implementation of the rosbag API, which compounds number 2 from above.

The current short-term workaround is to only allow rosbags that are recorded from a simulation and only play them back in simulation, and then either force non-Linux devs to use virtual machines or companion computers running Ubuntu that can consume the rosbags.

Unfortunately I have no idea how to really fix problem number 1, and I'm open to suggestions from people who are smarter than I am and know ROS much better than I do. While the proposed solutions above for fixing 2 and 3 are viable, they also kinda suck. I'm pretty sure we're gonna have to end up writing some code to fix this. So if anybody has any thoughts on this, dump them here.

Robot feet point inwards when going to home position

After starting the robot, when servo-ing the legs (after all rotors have been initialized and the IHMC controller is running), instead of going to the home position the legs go to a position where each foot points inwards (towards the other). This seems to persist even after sending footsteps with a different orientation (i.e. the robot takes the step and twists the foot, but right before finishing the step the foot twists inwards again).

Log (from cold boot): https://www.dropbox.com/s/pe6pcvftocryrde/legsInwardLog.zip?dl=0

sending an incorrectly formatted message causes the ROS receiver to crash

Sending an incorrectly formatted footstep plan causes the controller to stop responding to all future messages. It also does this silently, without any warning or printf.

My guess (extrapolating from the same kind of behaviour in Python) is that the Controller's ROS receiving thread crashes but the controller is still active.

This can be readily replicated in simulation by putting a faulty robot_side id into a message (the usual ids are 0 or 1):

footstep_data_list: 
  - 
    robot_side: 1
    location: 
      x: 0.154122320942
      y: -0.125441902765
      z: 0.0927366825572
    orientation: 
      x: 0
      y: 0
      z: 0
      w: 1
    predicted_contact_points: []
    trajectory_type: 1
    swing_height: 0.05
    unique_id: 0
  - 
    robot_side: 2  <------------------------------- should be either zero or one
    location: 
      x: 0.339570214942
      y: 0.133368271576
      z: 0.0928407591907
    orientation: 
      x: 0
      y: 0
      z: 0
      w: 1
    predicted_contact_points: []
    trajectory_type: 1
    swing_height: 0.05
    unique_id: 0
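Until the receiver validates this itself, a simple client-side guard catches the bad id before it ever reaches the controller (plain Python; the field name follows the message above):

```python
VALID_ROBOT_SIDES = {0, 1}  # the usual ids, per the message above

def check_footstep_list(footsteps):
    """Raise ValueError on the first footstep with an invalid robot_side.

    `footsteps` is a list of plain dicts standing in for the entries of
    footstep_data_list in the ROS message above.
    """
    for i, step in enumerate(footsteps):
        if step["robot_side"] not in VALID_ROBOT_SIDES:
            raise ValueError("footstep %d: robot_side=%r, expected 0 or 1"
                             % (i, step["robot_side"]))

steps = [{"robot_side": 1}, {"robot_side": 2}]
try:
    check_footstep_list(steps)
except ValueError as e:
    print(e)  # loud client-side failure instead of a silent receiver crash
```

Of course this only papers over the real bug: the receiving thread should reject the malformed message and keep running, rather than die silently.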

Yaw bias estimation performs very poorly - when in motion

@SylvainBertrand: I think the yaw bias estimation has a number of shortcomings.

Firstly, we now have a Vicon system and can measure state estimation performance very accurately. I can tell that the yaw bias for our robot, in the following experiment, was essentially constant at -0.0008 rad/sec.

But the IHMC-estimated value when the robot is moving is very unstable. I think this video is self explanatory:
https://www.youtube.com/watch?v=CvfuvTd-YJQ
Log of this:
http://terminator.robots.inf.ed.ac.uk/public/logs/20160503_102222_ValkyrieRosControlController_yaw_bias_snippet/

The observations from the experiment are:

  1. the bias is well estimated when stationary, at -0.0008 (but with too much bandwidth) (0-20 seconds)
  2. when the arms move, the bias becomes really unreliable (at 20 seconds)
  3. when walking, the robot shifts between double support (where the bias is really unstable) and single support (where the bias is fixed)

I think there is likely one bug covering 2 and 3: it doesn't account for the movement of the pelvis when the robot is in motion.

The knock-on result is that the robot's orientation drifts a huge amount when walking.
This is from a full log where the robot is walking about 80% of the time: 10 degrees in 400 seconds!
ihmc_yaw_drift
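As a back-of-the-envelope check on those numbers, 10 degrees over 400 seconds corresponds to an effective bias error of roughly half the magnitude of the well-estimated stationary bias:

```python
import math

drift_rad = math.radians(10.0)   # 10 degrees of accumulated yaw drift
duration = 400.0                 # over ~400 s of mostly-walking log
effective_bias_error = drift_rad / duration
print("%.2e rad/s" % effective_bias_error)

stationary_bias = -0.0008        # rad/s, from the stationary segment
print(effective_bias_error / abs(stationary_bias))  # roughly 0.55
```

So when the robot is moving, the estimator is effectively missing on the order of half the true bias, which fits the "unreliable in motion, fine when stationary" pattern above.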

Only claim required sliderboard channels

As discussed in Slack with @dljsjr:

  • Have a boolean DEBUG or TUNING flag that determines whether all sliderboard channels are mapped or just 7 and 8 (for CoM height and smoothing between standing and walking control), i.e. if (DEBUG_OR_TUNING) { claim_all_sliders_and_knobs }

CP overshoot to the right

We had an issue with the robot falling when switching from double support to single support with the right foot forward. It consistently happens on that side, despite days spent trying different combinations of footstep parameters. We haven't been able to confirm on the new API due to a message mismatch (I'm not around this week and can't debug), but we couldn't send footsteps to the robot when I left.

Log: https://drive.google.com/open?id=0ByK9BDWokDuoWW5MX2ZIcEMxWWM

WholeBodyTrajectory ROS API: Short whole-body plan results in fall

We are trying to test the bugFix/badArmTrajectoryPacket branch (as in PR #19) at the moment. When executing a trajectory, issued through the WholeBodyTrajectory ROS API, that only raises the robot's pelvis z value, the robot e-stops. Initial analysis showed a spike in the qdd, very prominent e.g. for the shoulder joints.

A cropped SCS log can be found here.

@SylvainBertrand, could you please have a look and let us know whether we can provide any other information?

Thanks!
