
Question about getting global coordinates of a lidar point cloud from relative coordinates in Webots

I need to do custom mapping of the surroundings with a lidar on a mobile robot in Webots. What I use for that (see the device-setup sketch after the list):

  • GPS for getting the robot's position.
  • Compass for getting the robot's heading.
  • Lidar for getting info about the surroundings.
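
To make the setup concrete, here is a minimal device-initialization sketch; the device names "gps", "compass", and "lidar" are assumptions and must match the names in your .wbt world file:

from controller import Robot

robot = Robot()
timestep = int(robot.getBasicTimeStep())

# Device names are assumptions; they must match your world file.
gps = robot.getDevice("gps")
gps.enable(timestep)
compass = robot.getDevice("compass")
compass.enable(timestep)
lidar = robot.getDevice("lidar")
lidar.enable(timestep)
lidar.enablePointCloud()  # required before getPointCloud() returns data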

Maybe someone familiar with Webots can show a basic code example, explain the math behind it, or point out a method that I missed in Webots?

I did translation and rotation of the relative points from the lidar, which worked well when the robot is on a flat surface (2D rotation). But no matter how much I tried, I couldn't figure out how to get accurate global coordinates from the relative point-cloud points when the robot is even slightly tilted (3D rotation).
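
For reference, a minimal sketch of the flat-ground (yaw-only) transform; the heading formula from the Compass' north vector is a common Webots recipe, and the axis indices are an assumption that depends on the world's coordinate system:

import math

def to_global_2d(point_xy, robot_xy, compass_values):
    # Heading angle from the Compass' north vector (axis convention assumed).
    theta = math.atan2(compass_values[0], compass_values[1])
    x, y = point_xy
    # Standard 2D rotation, then translation to the robot's position.
    gx = robot_xy[0] + x * math.cos(theta) - y * math.sin(theta)
    gy = robot_xy[1] + x * math.sin(theta) + y * math.cos(theta)
    return gx, gy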

My guess is that this requires spatial transformation matrices, but I am not sure how to use the Webots Compass values in a rotation matrix.
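
For completeness, a sketch of what such a 3D rotation matrix looks like when built from roll/pitch/yaw angles (available in Webots from an InertialUnit via getRollPitchYaw()); the ZYX composition order is an assumption:

import numpy as np

def rotation_matrix(roll, pitch, yaw):
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])  # roll about x
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])  # pitch about y
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])  # yaw about z
    return Rz @ Ry @ Rx  # apply roll, then pitch, then yaw

# global_point = rotation_matrix(roll, pitch, yaw) @ relative_point + robot_position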

After getting some useful info on StackExchange, here is a basic example of the solution in Python. Note that the Compass alone only gives the direction of north, which is not enough to recover the full 3D orientation (it leaves the tilt about that axis undetermined), so an InertialUnit is used instead:

from scipy.spatial.transform import Rotation

# The InertialUnit device must be retrieved and enabled first, e.g.:
# inertial_unit = robot.getDevice("inertial unit"); inertial_unit.enable(timestep)
RobotPoint = gps.getValues()
# Webots returns the quaternion as [x, y, z, w], which matches SciPy's
# scalar-last convention in Rotation.from_quat().
STR = Rotation.from_quat(inertial_unit.getQuaternion())
for RelativeCloudPoint in lidar.getPointCloud():
    # Rotate the lidar-relative point into the world frame...
    RotatedPoint = STR.apply([RelativeCloudPoint.x, RelativeCloudPoint.y, RelativeCloudPoint.z])
    # ...then translate by the robot's GPS position.
    GlobalCloudPoint = RotatedPoint + RobotPoint

Use the InertialUnit to get a quaternion for the spatial rotation. Apply it to the relative coordinates, then add the robot's real coordinates from the GPS to the rotated point. In the end you get the global coordinates of the points you need.
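
As a usage note, SciPy's Rotation.apply() also accepts an (N, 3) array, so the whole scan can be transformed in one call (a sketch assuming the same devices and STR rotation as above):

import numpy as np

points = np.array([[p.x, p.y, p.z] for p in lidar.getPointCloud()])
global_points = STR.apply(points) + np.array(gps.getValues())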
