Simulating Sensors: LiDAR, Cameras, and IMUs

📡 Sensor Types

LiDAR (Light Detection and Ranging)

<gazebo reference="lidar_link">
  <sensor name="lidar" type="ray">
    <pose>0 0 0 0 0 0</pose>
    <ray>
      <scan>
        <horizontal>
          <samples>360</samples>
          <resolution>1</resolution>
          <min_angle>-3.14159</min_angle>
          <max_angle>3.14159</max_angle>
        </horizontal>
      </scan>
      <range>
        <min>0.1</min>
        <max>30.0</max>
      </range>
    </ray>
    <plugin name="gazebo_ros_lidar" filename="libgazebo_ros_ray_sensor.so">
      <ros>
        <namespace>/robot</namespace>
        <remapping>~/out:=scan</remapping>
      </ros>
      <output_type>sensor_msgs/LaserScan</output_type>
    </plugin>
  </sensor>
</gazebo>
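With this configuration (360 samples spanning -π to π), each entry in the published `LaserScan.ranges` array corresponds to a fixed bearing. A minimal sketch of that index-to-angle mapping, with a hypothetical helper name and the values from the SDF above as defaults:

```python
import math

def index_to_angle(i, samples=360, min_angle=-math.pi, max_angle=math.pi):
    """Map a LaserScan ranges[] index to its bearing in radians.

    Assumes evenly spaced samples from min_angle to max_angle,
    matching the <horizontal> block in the sensor config above.
    """
    increment = (max_angle - min_angle) / samples
    return min_angle + i * increment
```

For the 360-sample scan above, index 0 points backwards (-π rad) and index 180 points straight ahead (0 rad), so you can recover the direction of any detected obstacle directly from its array position.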

RGB-D Camera

<gazebo reference="camera_link">
  <sensor name="depth_camera" type="depth">
    <camera>
      <horizontal_fov>1.047</horizontal_fov>
      <image>
        <width>640</width>
        <height>480</height>
      </image>
      <clip>
        <near>0.1</near>
        <far>10.0</far>
      </clip>
    </camera>
    <plugin name="depth_camera_controller" filename="libgazebo_ros_camera.so">
      <ros>
        <namespace>/camera</namespace>
      </ros>
      <camera_name>depth</camera_name>
      <frame_name>camera_link</frame_name>
    </plugin>
  </sensor>
</gazebo>
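Depth pixels become 3D points via the pinhole camera model: the focal length follows from the `horizontal_fov` (1.047 rad) and image width (640 px) declared above. A sketch of that back-projection, assuming square pixels and a principal point at the image center (the helper name is hypothetical):

```python
import math

def pixel_to_point(u, v, depth, width=640, height=480, hfov=1.047):
    """Back-project a depth pixel (u, v, depth in meters) to a 3D point
    in the camera frame, using the pinhole model.

    Defaults mirror the <camera> block above; fx is derived from the
    horizontal field of view: fx = width / (2 * tan(hfov / 2)).
    """
    fx = width / (2.0 * math.tan(hfov / 2.0))
    fy = fx  # square pixels assumed
    cx, cy = width / 2.0, height / 2.0
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)
```

A pixel at the image center maps straight down the optical axis: `pixel_to_point(320, 240, 2.0)` returns a point 2 m directly in front of the camera.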

IMU (Inertial Measurement Unit)

<gazebo reference="imu_link">
  <sensor name="imu_sensor" type="imu">
    <plugin name="imu_plugin" filename="libgazebo_ros_imu_sensor.so">
      <ros>
        <namespace>/imu</namespace>
        <remapping>~/out:=data</remapping>
      </ros>
      <initial_orientation_as_reference>false</initial_orientation_as_reference>
    </plugin>
  </sensor>
</gazebo>
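The IMU reports orientation as a quaternion (`msg.orientation`), which is rarely useful raw; for a ground robot you usually want the heading (yaw). The standard quaternion-to-yaw conversion, as a small sketch:

```python
import math

def quaternion_to_yaw(x, y, z, w):
    """Extract heading (yaw, radians) from an orientation quaternion,
    using the standard ZYX Euler extraction."""
    siny_cosp = 2.0 * (w * z + x * y)
    cosy_cosp = 1.0 - 2.0 * (y * y + z * z)
    return math.atan2(siny_cosp, cosy_cosp)
```

For example, a pure 90° rotation about z, `(0, 0, sin(π/4), cos(π/4))`, yields a yaw of π/2.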

📊 Processing Sensor Data

import math

import rclpy
from rclpy.node import Node
from sensor_msgs.msg import LaserScan, Image, Imu
from cv_bridge import CvBridge

class SensorFusion(Node):
    def __init__(self):
        super().__init__('sensor_fusion')

        # The LiDAR plugin above publishes under the /robot namespace
        self.create_subscription(LaserScan, '/robot/scan', self.lidar_callback, 10)
        self.create_subscription(Image, '/camera/depth/image_raw', self.depth_callback, 10)
        self.create_subscription(Imu, '/imu/data', self.imu_callback, 10)

        self.bridge = CvBridge()

    def lidar_callback(self, msg):
        # Drop inf/NaN returns (out-of-range beams) before taking the minimum
        valid = [r for r in msg.ranges if math.isfinite(r)]
        if valid:
            self.get_logger().info(f'Closest obstacle: {min(valid):.2f} m')

    def depth_callback(self, msg):
        # Convert the ROS Image to an OpenCV array (32FC1 depth, meters)
        depth_image = self.bridge.imgmsg_to_cv2(msg, desired_encoding='passthrough')

    def imu_callback(self, msg):
        self.get_logger().info(f'Orientation: {msg.orientation}')

def main():
    rclpy.init()
    rclpy.spin(SensorFusion())
    rclpy.shutdown()

if __name__ == '__main__':
    main()

🎯 Key Takeaways

  • LiDAR provides 360° obstacle detection
  • RGB-D cameras combine color and depth
  • IMUs measure orientation and acceleration
  • Sensor fusion improves perception accuracy
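As a concrete taste of that last point, one of the simplest fusion schemes is a complementary filter: it blends a gyro's fast but drifting angle integration with an accelerometer's noisy but drift-free angle estimate. A minimal sketch (the function name and the 0.98 weight are illustrative, not from this lesson's code):

```python
def complementary_filter(prev_angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Fuse one IMU sample into an angle estimate (radians).

    alpha weights the integrated gyro rate (low noise, drifts);
    (1 - alpha) weights the accelerometer angle (noisy, no drift).
    """
    return alpha * (prev_angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle
```

Called once per IMU message with `dt` as the sample period, it tracks fast motion from the gyro while the accelerometer term slowly pulls the estimate back toward the true angle.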

Next: Hands-On Lab →