Sensors#
The Simulation framework provides sensor interfaces for agents to perceive the environment. The currently supported sensor types are the Camera, Stereo Camera, and Contact Sensor.
Camera#
Configuration#
The `CameraCfg` class defines the configuration for camera sensors. It inherits from `SensorCfg` and controls resolution, clipping planes, intrinsics, and active data modalities.
| Parameter | Type | Default | Description |
|---|---|---|---|
| `width` | `int` | | Width of the captured image in pixels. |
| `height` | `int` | | Height of the captured image in pixels. |
| `intrinsics` | `tuple` | | Camera intrinsics `(fx, fy, cx, cy)`. |
| `extrinsics` | `ExtrinsicsCfg` | | Pose configuration (see below). |
| | `float` | | Near clipping plane distance. |
| | `float` | | Far clipping plane distance. |
| `enable_color` | `bool` | | Enable RGBA image capture. |
| `enable_depth` | `bool` | | Enable depth map capture. |
| | `bool` | | Enable segmentation mask capture. |
| | `bool` | | Enable surface normal capture. |
| | `bool` | | Enable 3D position map capture. |
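The intrinsics tuple corresponds to the standard pinhole camera model. The sketch below shows how `(fx, fy, cx, cy)` map to a 3x3 camera matrix and to pixel coordinates; `to_camera_matrix` and `project` are illustrative helpers, not part of the framework API:

```python
def to_camera_matrix(fx: float, fy: float, cx: float, cy: float):
    """Build the 3x3 pinhole intrinsic matrix K from (fx, fy, cx, cy)."""
    return [
        [fx, 0.0, cx],
        [0.0, fy, cy],
        [0.0, 0.0, 1.0],
    ]

def project(K, x: float, y: float, z: float):
    """Project a camera-frame 3D point (x, y, z) to pixel coordinates (u, v)."""
    u = K[0][0] * x / z + K[0][2]
    v = K[1][1] * y / z + K[1][2]
    return u, v

K = to_camera_matrix(600, 600, 320.0, 240.0)
u, v = project(K, 0.1, 0.0, 1.0)  # a point 0.1 m to the right, 1 m ahead
```

With the example intrinsics, the point lands 60 pixels right of the principal point, i.e. at `(380.0, 240.0)`.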
Camera Extrinsics#
The `ExtrinsicsCfg` class defines the position and orientation of the camera.
| Parameter | Type | Default | Description |
|---|---|---|---|
| `parent` | `str` | | Name of the link to attach to (e.g., `"ee_link"`). |
| `pos` | | | Position offset relative to the parent link. |
| `quat` | | | Orientation quaternion in `[w, x, y, z]` order. |
| | | | (Optional) Camera eye position for look-at mode. |
| | | | (Optional) Target position for look-at mode. |
| | | | (Optional) Up vector for look-at mode. |
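In look-at mode, the camera orientation is derived from the eye, target, and up vectors. The following is a rough plain-Python sketch of the underlying math, not the framework's actual implementation:

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return [c / n for c in v]

def cross(a, b):
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def look_at_basis(eye, target, up):
    """Orthonormal camera basis: forward (toward target), right, and true up."""
    forward = normalize([t - e for t, e in zip(target, eye)])
    right = normalize(cross(forward, up))
    true_up = cross(right, forward)
    return forward, right, true_up

# Camera at height 1 m looking along +x:
f, r, u = look_at_basis(eye=[0, 0, 1], target=[1, 0, 1], up=[0, 0, 1])
```

Here `forward` comes out as `[1, 0, 0]`, i.e. pointing from the eye toward the target, and `true_up` re-orthogonalizes the provided up vector.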
Usage#
You can create a camera sensor using `sim.add_sensor()` with a `CameraCfg` object.
Code Example#
```python
from embodichain.lab.sim.sensors import Camera, CameraCfg

# 1. Define Configuration
camera_cfg = CameraCfg(
    width=640,
    height=480,
    intrinsics=(600, 600, 320.0, 240.0),
    extrinsics=CameraCfg.ExtrinsicsCfg(
        parent="ee_link",        # Attach to robot end-effector
        pos=[0.09, 0.05, 0.04],  # Relative position
        quat=[0, 1, 0, 0],       # Relative rotation [w, x, y, z]
    ),
    enable_color=True,
    enable_depth=True,
)

# 2. Add Sensor to Simulation (`sim` is an existing simulation instance)
camera: Camera = sim.add_sensor(sensor_cfg=camera_cfg)
```
Observation Data#
Retrieve sensor data using `camera.get_data()`. The data is returned as a dictionary of tensors on the specified device.
| Key | Data Type | Shape | Description |
|---|---|---|---|
| | | | RGBA image data. |
| | | | Depth map in meters. |
| | | | Segmentation mask / instance IDs. |
| | | | Surface normal vectors. |
| | | | 3D position map (OpenGL coordinates). |
Note: B represents the number of environments (batch size).
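The depth and 3D position modalities are related through the intrinsics: un-projecting a depth pixel recovers its camera-frame 3D point. A minimal numpy sketch, assuming a pinhole model with depth measured along the optical axis (not the framework's own conversion, which uses OpenGL conventions):

```python
import numpy as np

def depth_to_points(depth: np.ndarray, fx, fy, cx, cy) -> np.ndarray:
    """Un-project an (H, W) depth map to an (H, W, 3) camera-frame position map."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.stack([x, y, depth], axis=-1)

depth = np.full((480, 640), 2.0)  # a flat wall 2 m in front of the camera
points = depth_to_points(depth, 600, 600, 320.0, 240.0)
# The pixel at the principal point (v=240, u=320) lies on the optical axis.
```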
Stereo Camera#
Configuration#
The `StereoCameraCfg` class defines the configuration for stereo camera sensors. It inherits from `CameraCfg` and includes additional settings for the right camera and stereo-specific features such as disparity computation.
In addition to the standard `CameraCfg` parameters, it supports the following:
| Parameter | Type | Default | Description |
|---|---|---|---|
| `intrinsics_right` | `tuple` | | Intrinsics for the right camera `(fx, fy, cx, cy)`. |
| `left_to_right_pos` | | | Position offset of the right camera relative to the left camera (the stereo baseline). |
| | | | Rotation offset of the right camera relative to the left camera. |
| `enable_disparity` | `bool` | | Enable disparity map computation. Note: requires depth capture to be enabled. |
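For a rectified stereo pair, disparity relates depth to the baseline via d = fx * b / z. A small sketch of that relation (illustrative only; the framework's actual computation may differ):

```python
def disparity_from_depth(depth_m: float, fx: float, baseline_m: float) -> float:
    """Disparity in pixels for a point at depth `depth_m` (rectified stereo)."""
    return fx * baseline_m / depth_m

# With fx = 600 px and a 5 cm baseline, a point 1 m away
# yields a disparity of about 30 pixels.
d = disparity_from_depth(depth_m=1.0, fx=600.0, baseline_m=0.05)
```

The same relation run in reverse (z = fx * b / d) is how depth is typically recovered from a matched disparity map.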
Usage#
You can create a stereo camera sensor using `sim.add_sensor()` with a `StereoCameraCfg` object.
Code Example#
```python
from embodichain.lab.sim.sensors import StereoCamera, StereoCameraCfg

# 1. Define Configuration
stereo_cfg = StereoCameraCfg(
    width=640,
    height=480,
    # Intrinsics for Left (inherited) and Right cameras
    intrinsics=(600, 600, 320.0, 240.0),
    intrinsics_right=(600, 600, 320.0, 240.0),
    # Baseline configuration (e.g., 5 cm baseline)
    left_to_right_pos=(0.05, 0.0, 0.0),
    extrinsics=StereoCameraCfg.ExtrinsicsCfg(
        parent="head_link",
        pos=[0.1, 0.0, 0.0],
    ),
    # Data modalities
    enable_color=True,
    enable_depth=True,
    enable_disparity=True,
)

# 2. Add Sensor to Simulation (`sim` is an existing simulation instance)
stereo_camera: StereoCamera = sim.add_sensor(sensor_cfg=stereo_cfg)
```
Contact Sensor#
Configuration#
The `ContactSensorCfg` class defines the configuration for contact sensors. It inherits from `SensorCfg` and enables filtering and monitoring of contact events between specific rigid bodies and articulation links in the simulation.
| Parameter | Type | Default | Description |
|---|---|---|---|
| `rigid_uid_list` | `list[str]` | | List of rigid body UIDs to monitor for contacts. |
| `articulation_cfg_list` | `list[ArticulationContactFilterCfg]` | | List of articulation link contact filter configurations. |
| `filter_need_both_actor` | `bool` | | Whether to report a contact only when both actors are in the filter list. If `True`, contacts involving only one filtered actor are ignored. |
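The effect of `filter_need_both_actor` can be sketched in plain Python (illustrative only, not the sensor's implementation; `keep_contact` is a hypothetical helper):

```python
def keep_contact(pair, filter_set, need_both: bool) -> bool:
    """Decide whether a contact between the two actors in `pair` is reported."""
    a_in, b_in = pair[0] in filter_set, pair[1] in filter_set
    return (a_in and b_in) if need_both else (a_in or b_in)

filtered = {"cube2", "finger1_link"}
r1 = keep_contact(("cube2", "finger1_link"), filtered, need_both=True)   # True
r2 = keep_contact(("cube2", "table"), filtered, need_both=True)          # False
r3 = keep_contact(("cube2", "table"), filtered, need_both=False)         # True
```

With `need_both=True`, only the cube-finger contact survives; relaxing it also admits contacts where just one actor is filtered.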
Articulation Contact Filter Configuration#
The `ArticulationContactFilterCfg` class specifies which articulation links to monitor for contacts.
| Parameter | Type | Default | Description |
|---|---|---|---|
| `articulation_uid` | `str` | | Unique identifier of the articulation (robot or articulated object). |
| `link_name_list` | `list[str]` | | List of link names in the articulation to monitor. If empty, all links are monitored. |
Usage#
You can create a contact sensor using `sim.add_sensor()` with a `ContactSensorCfg` object.
Code Example#
```python
import torch

from embodichain.lab.sim.sensors import (
    ArticulationContactFilterCfg,
    ContactSensor,
    ContactSensorCfg,
)

# 1. Define Contact Filter Configuration
contact_filter_cfg = ContactSensorCfg()

# Monitor contacts for specific rigid bodies
contact_filter_cfg.rigid_uid_list = ["cube0", "cube1", "cube2"]

# Monitor contacts for specific articulation links
contact_filter_art_cfg = ArticulationContactFilterCfg()
contact_filter_art_cfg.articulation_uid = "UR10_PGI"
contact_filter_art_cfg.link_name_list = ["finger1_link", "finger2_link"]
contact_filter_cfg.articulation_cfg_list = [contact_filter_art_cfg]

# Only report contacts when both actors are in the filter list
contact_filter_cfg.filter_need_both_actor = True

# 2. Add Sensor to Simulation (`sim` is an existing simulation instance)
contact_sensor: ContactSensor = sim.add_sensor(sensor_cfg=contact_filter_cfg)

# 3. Update and Retrieve Contact Data
sim.update(step=1)
contact_sensor.update()
contact_report = contact_sensor.get_data()

# 4. Filter contacts by specific user IDs
cube2_user_ids = sim.get_rigid_object("cube2").get_user_ids()
finger1_user_ids = sim.get_robot("UR10_PGI").get_user_ids("finger1_link").reshape(-1)
filter_user_ids = torch.cat([cube2_user_ids, finger1_user_ids])
filter_contact_report = contact_sensor.filter_by_user_ids(filter_user_ids)

# 5. Visualize Contact Points
contact_sensor.set_contact_point_visibility(
    visible=True,
    rgba=(0.0, 0.0, 1.0, 1.0),  # Blue color
    point_size=6.0,
)
```
Observation Data#
Retrieve contact data using `contact_sensor.get_data()`. The data is returned as a dictionary of tensors on the specified device.
| Key | Data Type | Shape | Description |
|---|---|---|---|
| | | | Contact positions in the arena frame (world coordinates minus the arena offset). |
| | | | Contact normal vectors. |
| | | | Contact friction forces. Note: this value may currently be inaccurate. |
| | | | Contact impulse magnitudes. |
| | | | Contact penetration distances. |
| | | | Pair of user IDs for the two actors in contact. Use with `filter_by_user_ids()`. |
| | | | Environment IDs indicating which parallel environment each contact belongs to. |
Note: N represents the number of contacts detected.
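Filtering by user IDs amounts to masking the rows of the report whose ID pair intersects the requested set. A numpy sketch of that masking (the real `filter_by_user_ids` operates on the report's torch tensors):

```python
import numpy as np

pair_user_ids = np.array([[3, 7], [5, 9], [7, 2]])  # (N, 2) actor-ID pairs
positions = np.arange(9.0).reshape(3, 3)            # (N, 3) contact positions

filter_ids = np.array([7])
# Keep contacts where either actor's user ID is in the filter set.
mask = np.isin(pair_user_ids, filter_ids).any(axis=1)
filtered_positions = positions[mask]
```

Here contacts 0 and 2 involve user ID 7, so two of the three rows survive; the same mask would be applied to every per-contact tensor in the report.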
Additional Methods#
- `filter_by_user_ids(item_user_ids)`: Filter the contact report to include only contacts involving specific user IDs.
- `set_contact_point_visibility(visible, rgba, point_size)`: Enable or disable visualization of contact points with customizable color and size.