Sensors#
The Simulation framework provides sensor interfaces for agents to perceive the environment. Supported sensor types include the Camera, StereoCamera, and ContactSensor.
Camera#
Configuration#
The CameraCfg class defines the configuration for camera sensors. It inherits from SensorCfg and controls resolution, clipping planes, intrinsics, and active data modalities.
| Parameter | Description |
|---|---|
| `width` | Width of the captured image in pixels. |
| `height` | Height of the captured image in pixels. |
| `intrinsics` | Camera intrinsics (fx, fy, cx, cy). |
| `extrinsics` | Pose configuration (see Camera Extrinsics below). |
| `near` | Near clipping plane distance. |
| `far` | Far clipping plane distance. |
| `enable_color` | Enable RGBA image capture. |
| `enable_depth` | Enable depth map capture. |
| `enable_segmentation` | Enable segmentation mask capture. |
| `enable_normal` | Enable surface normal capture. |
| `enable_position` | Enable 3D position map capture. |
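The intrinsics follow the pinhole camera model. As an illustrative sketch of how the four values map a camera-frame point to a pixel, assuming the (fx, fy, cx, cy) ordering used in the usage example below (`project` is a helper written for this sketch, not part of the library):

```python
def project(point, intrinsics):
    """Project a camera-frame point (x, y, z) to pixel coordinates (u, v)."""
    fx, fy, cx, cy = intrinsics
    x, y, z = point
    # Perspective divide by depth, then scale by focal length and
    # shift by the principal point.
    return (fx * x / z + cx, fy * y / z + cy)

# A point on the optical axis lands exactly at the principal point (cx, cy).
assert project((0.0, 0.0, 2.0), (600, 600, 320.0, 240.0)) == (320.0, 240.0)
```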
Camera Extrinsics#
The ExtrinsicsCfg class defines the position and orientation of the camera.
| Parameter | Description |
|---|---|
| `parent` | Name of the link to attach to (e.g., `ee_link`). |
| `pos` | Position offset relative to the parent link. |
| `quat` | Orientation quaternion [w, x, y, z] relative to the parent link. |
| `eye` | (Optional) Camera eye position for look-at mode. |
| `target` | (Optional) Target position for look-at mode. |
| `up` | (Optional) Up vector for look-at mode. |
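The pos/quat offset composes with the parent link's world pose to give the camera's world pose. The orientation part of that composition can be sketched with a Hamilton product, assuming the [w, x, y, z] convention shown in the usage example (`quat_mul` is an illustrative helper, not a library call):

```python
def quat_mul(q1, q2):
    """Hamilton product of two quaternions in [w, x, y, z] order."""
    w1, x1, y1, z1 = q1
    w2, x2, y2, z2 = q2
    return (
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    )

identity = (1.0, 0.0, 0.0, 0.0)
flip_x = (0.0, 1.0, 0.0, 0.0)  # 180 degrees about the x-axis
# Composing with the identity parent orientation leaves the offset unchanged.
assert quat_mul(identity, flip_x) == flip_x
```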
Usage#
You can create a camera sensor using sim.add_sensor() with a CameraCfg object.
Code Example#
```python
from embodichain.lab.sim.sensors import Camera, CameraCfg

# 1. Define Configuration
camera_cfg = CameraCfg(
    width=640,
    height=480,
    intrinsics=(600, 600, 320.0, 240.0),
    extrinsics=CameraCfg.ExtrinsicsCfg(
        parent="ee_link",        # Attach to robot end-effector
        pos=[0.09, 0.05, 0.04],  # Relative position
        quat=[0, 1, 0, 0],       # Relative rotation [w, x, y, z]
    ),
    enable_color=True,
    enable_depth=True,
)

# 2. Add Sensor to Simulation
camera: Camera = sim.add_sensor(sensor_cfg=camera_cfg)
```
Observation Data#
Retrieve sensor data using camera.get_data(). The data is returned as a dictionary of tensors on the specified device.
| Key | Description |
|---|---|
| `color` | RGBA image data. |
| `depth` | Depth map in meters. |
| `segmentation` | Segmentation mask / instance IDs. |
| `normal` | Surface normal vectors. |
| `position` | 3D position map (OpenGL coordinates). |
Note: each tensor's leading dimension is the number of environments (batch size B).
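Because the depth map is metric and the intrinsics are known, individual pixels can be lifted back to camera-frame 3D points. A minimal sketch, again assuming the (fx, fy, cx, cy) ordering used in the configuration examples (`backproject` is illustrative, not a library call):

```python
def backproject(u, v, depth, intrinsics):
    """Lift pixel (u, v) with metric depth back to a camera-frame point."""
    fx, fy, cx, cy = intrinsics
    # Inverse of the pinhole projection: undo the principal-point shift
    # and the focal-length scaling, then restore the depth coordinate.
    return ((u - cx) * depth / fx, (v - cy) * depth / fy, depth)

# The principal point back-projects onto the optical axis.
assert backproject(320.0, 240.0, 1.5, (600, 600, 320.0, 240.0)) == (0.0, 0.0, 1.5)
```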
Stereo Camera#
Configuration#
The StereoCameraCfg class defines the configuration for stereo camera sensors. It inherits from CameraCfg and includes additional settings for the right camera and stereo-specific features like disparity computation.
In addition to the standard CameraCfg parameters, it supports the following:
| Parameter | Description |
|---|---|
| `intrinsics_right` | Intrinsics (fx, fy, cx, cy) for the right camera. |
| `left_to_right_pos` | Position offset of the right camera relative to the left (the stereo baseline). |
| `left_to_right_quat` | Rotation offset of the right camera relative to the left. |
| `enable_disparity` | Enable disparity map computation. |
Usage#
You can create a stereo camera sensor using sim.add_sensor() with a StereoCameraCfg object.
Code Example#
```python
from embodichain.lab.sim.sensors import StereoCamera, StereoCameraCfg

# 1. Define Configuration
stereo_cfg = StereoCameraCfg(
    width=640,
    height=480,
    # Intrinsics for Left (inherited) and Right cameras
    intrinsics=(600, 600, 320.0, 240.0),
    intrinsics_right=(600, 600, 320.0, 240.0),
    # Baseline configuration (e.g., 5cm baseline)
    left_to_right_pos=(0.05, 0.0, 0.0),
    extrinsics=StereoCameraCfg.ExtrinsicsCfg(
        parent="head_link",
        pos=[0.1, 0.0, 0.0],
    ),
    # Data modalities
    enable_color=True,
    enable_depth=True,
    enable_disparity=True,
)

# 2. Add Sensor to Simulation
stereo_camera: StereoCamera = sim.add_sensor(sensor_cfg=stereo_cfg)
```
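For a rectified pair, disparity and metric depth are linked by the standard stereo relation depth = fx * baseline / disparity. A sketch using the focal length and 5 cm baseline from the configuration above (an illustrative helper, not the library's disparity implementation):

```python
def disparity_to_depth(disparity, fx=600.0, baseline=0.05):
    """Convert a disparity value in pixels to metric depth in meters."""
    # Larger disparity means the point is closer to the camera pair.
    return fx * baseline / disparity

# With fx = 600 px and a 5 cm baseline, 30 px of disparity is 1 m of depth.
assert disparity_to_depth(30.0) == 1.0
```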
Contact Sensor#
Configuration#
The ContactSensorCfg class defines the configuration for contact sensors. It inherits from SensorCfg and enables filtering and monitoring of contact events between specific rigid bodies and articulation links in the simulation.
| Parameter | Description |
|---|---|
| `rigid_uid_list` | List of rigid body UIDs to monitor for contacts. |
| `articulation_cfg_list` | List of articulation link contact filter configurations. |
| `filter_need_both_actor` | Whether to report a contact only when both actors are in the filter list. |
| `max_contacts_per_env` | Maximum number of contacts per environment that the sensor can handle. |
Articulation Contact Filter Configuration#
The ArticulationContactFilterCfg class specifies which articulation links to monitor for contacts.
| Parameter | Description |
|---|---|
| `articulation_uid` | Unique identifier of the articulation (robot or articulated object). |
| `link_name_list` | List of link names in the articulation to monitor. If empty, all links are monitored. |
Usage#
You can create a contact sensor using sim.add_sensor() with a ContactSensorCfg object.
Code Example#
```python
from embodichain.lab.sim.sensors import ContactSensor, ContactSensorCfg, ArticulationContactFilterCfg
import torch

# 1. Define Contact Filter Configuration
contact_filter_cfg = ContactSensorCfg()

# Monitor contacts for specific rigid bodies
contact_filter_cfg.rigid_uid_list = ["cube0", "cube1", "cube2"]

# Monitor contacts for specific articulation links
contact_filter_art_cfg = ArticulationContactFilterCfg()
contact_filter_art_cfg.articulation_uid = "UR10_PGI"
contact_filter_art_cfg.link_name_list = ["finger1_link", "finger2_link"]
contact_filter_cfg.articulation_cfg_list = [contact_filter_art_cfg]

# Only report contacts when both actors are in the filter list
contact_filter_cfg.filter_need_both_actor = True

# Set maximum contacts per environment
contact_filter_cfg.max_contacts_per_env = 128

# 2. Add Sensor to Simulation
contact_sensor: ContactSensor = sim.add_sensor(sensor_cfg=contact_filter_cfg)

# 3. Update and Retrieve Contact Data
sim.update(step=1)
contact_sensor.update()
contact_report = contact_sensor.get_data()

# Access contacts for a specific environment using is_valid mask
env_id = 0
env_valid_mask = contact_report["is_valid"][env_id]
env_contact_positions = contact_report["position"][env_id][env_valid_mask]

# Or get all valid contacts across all environments
valid_mask = contact_report["is_valid"]
all_valid_positions = contact_report["position"][valid_mask]  # Shape: (total_valid_contacts, 3)

# 4. Filter contacts by specific user IDs
cube2_user_ids = sim.get_rigid_object("cube2").get_user_ids()
finger1_user_ids = sim.get_robot("UR10_PGI").get_user_ids("finger1_link").reshape(-1)
filter_user_ids = torch.cat([cube2_user_ids, finger1_user_ids])

# Filter for specific environments
filter_contact_report = contact_sensor.filter_by_user_ids(filter_user_ids, env_ids=[env_id])

# 5. Visualize Contact Points
contact_sensor.set_contact_point_visibility(
    visible=True,
    rgba=(0.0, 0.0, 1.0, 1.0),  # Blue color
    point_size=6.0,
    env_ids=[env_id],  # Optional: visualize only specific environments
)
```
Observation Data#
Retrieve contact data using contact_sensor.get_data(). The data is returned as a dictionary of tensors on the specified device.
| Key | Description |
|---|---|
| `position` | Contact positions in the arena frame (world coordinates minus the arena offset). |
| `normal` | Contact normal vectors. |
| `friction` | Contact friction forces. Note: this value may currently be inaccurate. |
| `impulse` | Contact impulse magnitudes. |
| `distance` | Contact penetration distances. |
| `user_ids` | Pair of user IDs for the two actors in contact; use with filter_by_user_ids() to select specific actors. |
| `is_valid` | Boolean mask indicating which contact slots contain valid data. Use this mask to filter out unused slots. |
Note: Use the is_valid mask to access only valid contacts:

```python
# Get all valid contacts across all environments
valid_mask = contact_report["is_valid"]
valid_positions = contact_report["position"][valid_mask]  # Shape: (total_valid_contacts, 3)

# Or access per-environment
env_id = 0
num_valid = contact_report["is_valid"][env_id].sum().item()
env_positions = contact_report["position"][env_id, :num_valid]
```
Additional Methods#
- filter_by_user_ids(item_user_ids, env_ids=None): Filter the contact report to include only contacts involving specific user IDs. Optionally restrict the filter to specific environment IDs.
- set_contact_point_visibility(visible, rgba, point_size, env_ids=None): Enable or disable visualization of contact points with customizable color and size. Optionally visualize only specific environments.
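The idea behind filter_by_user_ids can be sketched in plain Python: keep only the contact slots whose user-id pair intersects the query set. The records below are made up for illustration; the actual method operates on the batched tensors returned by get_data():

```python
def filter_contacts(user_id_pairs, query_ids):
    """Return indices of contacts where either actor's user ID is in query_ids."""
    query = set(query_ids)
    return [i for i, (a, b) in enumerate(user_id_pairs) if a in query or b in query]

# Hypothetical user-id pairs for three contact slots.
pairs = [(3, 7), (5, 9), (7, 11)]
# Querying for actor 7 keeps the first and third contacts.
assert filter_contacts(pairs, [7]) == [0, 2]
```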