Comments (4)
Additionally, if both the LIDAR and the ego vehicle / ego body frames have z-axes parallel to the current local ground plane normal, then it is not clear why the extrinsics between them are time-varying (in the calibrated_sensor data for the LIDAR). The offset between them should be invariant to time, shouldn't it?
I suspect the diagram is not precise when it shows the IMU frame as parallel to the ground plane normal, since the data format says the ego_pose is in global coordinates (i.e. the normal of the local ground plane is not involved in its computation), which contradicts the diagram.
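For reference, a minimal check along these lines (a sketch, assuming the devkit API and the v1.0-mini split at its default dataroot) would be to collect the LIDAR-to-ego translation and rotation at every keyframe of one scene and see whether more than one distinct value appears:

# A minimal sketch, assuming the nuScenes devkit with the v1.0-mini split at its
# default dataroot; scene index 0 is just an example.
from nuscenes import NuScenes

nusc = NuScenes(version='v1.0-mini', dataroot='/data/sets/nuscenes')

sample_token = nusc.scene[0]['first_sample_token']
extrinsics = set()
while sample_token:
    sample = nusc.get('sample', sample_token)
    lidar = nusc.get('sample_data', sample['data']['LIDAR_TOP'])
    cs = nusc.get('calibrated_sensor', lidar['calibrated_sensor_token'])
    # Collect the LIDAR -> ego extrinsics (translation + rotation) at this keyframe.
    extrinsics.add((tuple(cs['translation']), tuple(cs['rotation'])))
    sample_token = sample['next']

# If the extrinsics were time-varying within the scene, this set would hold more than one entry.
print('Distinct LIDAR extrinsics within the scene:', len(extrinsics))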
Dear @emanther: thanks for your questions, and apologies for the delay.
- For the global coordinate system, x and y point along the ground and z points straight up, in the direction of gravity.
- The ego-vehicle coordinate system is, as you say, the local ground plane that contains the points where the four wheels touch the ground. It is translated and rotated with respect to the global coordinate system as defined in the ego_pose table. I have included some code that calculates the angle between the global z and the local z. Hopefully that will help you get started on building what you need.
- You are right: the calibrated_sensor does not change over time. This is also reflected in the schema and in my code snippet below.
"""
This code is to sort out a reply to
https://github.com/nutonomy/nuscenes-devkit/issues/122
"""
import numpy as np
import matplotlib.pyplot as plt
from pyquaternion import Quaternion
from nuscenes import NuScenes
GLOBAL_X_AXIS = np.array([1, 0, 0])
GLOBAL_Z_AXIS = np.array([0, 0, 1])
def unit_vector(vector):
""" Returns the unit vector of the vector. """
return vector / np.linalg.norm(vector)
def angle_between(v1, v2):
""" Returns the angle in degrees between vectors 'v1' and 'v2' """
v1_u = unit_vector(v1)
v2_u = unit_vector(v2)
return 180/np.pi * np.arccos(np.clip(np.dot(v1_u, v2_u), -1.0, 1.0))
def main():
nusc = NuScenes('v1.0-mini')
scene_number = 6
# Grab first sample
sample = nusc.get('sample', nusc.scene[scene_number]['first_sample_token'])
# Cycle through all samples and store ego pose translation and rotation.
poses = []
while not sample['next'] == '':
sample = nusc.get('sample', sample['next'])
lidar = nusc.get('sample_data', sample['data']['LIDAR_TOP'])
pose = nusc.get('ego_pose', lidar['ego_pose_token'])
poses.append((pose['translation'], Quaternion(pose['rotation'])))
# Print values for first pose.
(x, y, _), q = poses[0]
print('First location (x, y): {:.1f}, {:.1f}'.format(x, y))
print('Orientation at first location: {:.2f} degrees'.format(q.degrees))
# Plot all poses on global x, y plan along with orientation.
# This doesn't really answer the question, it's just as a sanity check.
for (x, y, z), q in poses:
plt.plot(x, y, 'r.')
# The x axis of the ego frame is pointing towards the front of the vehicle.
ego_x_axis = q.rotate(GLOBAL_X_AXIS)
plt.arrow(x, y, ego_x_axis[0], ego_x_axis[1])
plt.axis('equal')
plt.xlabel('x-global')
plt.ylabel('y-global')
# Calculate deviations from global vertical axis and plot.
# This demonstrates that the z-axis of the ego poses are not
# all pointing upwards.
vertical_offsets = []
for (_, _, _), q in poses:
# The z axis of the ego frame indicates deviation from global vertical axis (direction of gravity)
ego_z_axis = q.rotate(GLOBAL_Z_AXIS)
vertical_offsets.append(angle_between(ego_z_axis, GLOBAL_Z_AXIS))
plt.savefig('xyposes.png')
plt.figure()
plt.hist(vertical_offsets)
plt.xlabel('Angle (degrees)')
plt.ylabel('Count')
plt.title('Offset from global vertical axis.')
plt.savefig('vertical_offsets.png')
# Finally show that all calibrated sensor tokens for a particular scene is indeed the same
sample = nusc.get('sample', nusc.scene[scene_number]['first_sample_token'])
cali_sensor_tokens = []
while not sample['next'] == '':
sample = nusc.get('sample', sample['next'])
lidar = nusc.get('sample_data', sample['data']['LIDAR_TOP'])
cali_sensor = nusc.get('calibrated_sensor', lidar['calibrated_sensor_token'])
cali_sensor_tokens.append(cali_sensor['token'])
# Assert that they all point to the same calibrated sensor record
assert len(set(cali_sensor_tokens)) == 1
if __name__ == "__main__":
main()
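To make the frame relationships concrete, here is a minimal sketch (not part of the devkit, with a made-up point, again assuming the v1.0-mini split at its default dataroot) of the usual transform chain: a LiDAR point goes from the sensor frame to the ego frame via calibrated_sensor, and from the ego frame to the global frame via ego_pose.

# Illustrative sketch of the sensor -> ego -> global transform chain.
# The point coordinates are hypothetical; the records come from the dataset.
import numpy as np
from pyquaternion import Quaternion
from nuscenes import NuScenes

nusc = NuScenes(version='v1.0-mini', dataroot='/data/sets/nuscenes')

sample = nusc.get('sample', nusc.scene[6]['first_sample_token'])
lidar = nusc.get('sample_data', sample['data']['LIDAR_TOP'])
cs = nusc.get('calibrated_sensor', lidar['calibrated_sensor_token'])  # sensor -> ego extrinsics
pose = nusc.get('ego_pose', lidar['ego_pose_token'])                  # ego -> global pose

p_sensor = np.array([10.0, 0.0, -1.5])  # hypothetical point in the LIDAR frame

# Sensor frame -> ego-vehicle frame.
p_ego = Quaternion(cs['rotation']).rotate(p_sensor) + np.array(cs['translation'])

# Ego-vehicle frame -> global frame. ego_pose is expressed directly in global
# coordinates, so the local ground plane normal never enters this computation.
p_global = Quaternion(pose['rotation']).rotate(p_ego) + np.array(pose['translation'])
print('Point in global frame:', p_global)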
Dear @oscar-nutonomy, thanks for your detailed reply! I now understand that the ego_pose z-axis / IMU z-axis are not quite parallel to gravity, and that the calibrated sensor tokens are persistent within all samples of a scene. I'm assuming almost all of the non-parallel "error" is due to the slope of the plane the vehicle's wheels rest on.
The time-varying aspect of the calibrated sensors I noticed must have been cross-scene (probably cross-vehicle). I checked all scenes in the trainval set; each has a unique calibrated_sensor_token.
Yes: the "errors" (offsets) are precisely because of the ground sloping.
And yes: for simplicity we record one calibrated_sensor record per scene, even though some may be similar across scenes (if they are from the same log).
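A quick way to see this is to group scenes by their log and compare the LIDAR_TOP extrinsics of the corresponding calibrated_sensor records. A minimal sketch (again assuming the v1.0-mini split at its default dataroot) could look like this:

# Sketch: group scenes by log and compare the LIDAR_TOP calibrated_sensor records.
# Each scene has its own record (unique token), but records from scenes of the
# same log may carry identical translation/rotation.
from collections import defaultdict
from nuscenes import NuScenes

nusc = NuScenes(version='v1.0-mini', dataroot='/data/sets/nuscenes')

extrinsics_by_log = defaultdict(set)
for scene in nusc.scene:
    first_sample = nusc.get('sample', scene['first_sample_token'])
    lidar = nusc.get('sample_data', first_sample['data']['LIDAR_TOP'])
    cs = nusc.get('calibrated_sensor', lidar['calibrated_sensor_token'])
    extrinsics_by_log[scene['log_token']].add(
        (tuple(cs['translation']), tuple(cs['rotation'])))

for log_token, values in extrinsics_by_log.items():
    print('log {}: {} distinct LIDAR extrinsics'.format(log_token[:8], len(values)))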