
autoware_motion_velocity_run_out_module package from autoware_universe repo


Package Summary

Tags No category tags.
Version 0.39.0
License Apache License 2.0
Build type AMENT_CMAKE
Use RECOMMENDED

Repository Summary

Description
Checkout URI https://github.com/autowarefoundation/autoware_universe.git
VCS Type git
VCS Version main
Last Updated 2025-04-25
Dev Status UNKNOWN
CI status No Continuous Integration
Released UNRELEASED
Tags planner ros calibration self-driving-car autonomous-driving autonomous-vehicles ros2 3d-map autoware
Contributing Help Wanted (0)
Good First Issues (0)
Pull Requests to Review (0)

Package Description

run out module for the motion_velocity_planner

Additional Links

No additional links.

Maintainers

  • Maxime Clement

Authors

  • Maxime Clement

Run Out

Role

The run_out module adds deceleration and stop points to the ego trajectory in order to prevent collisions with objects that are moving towards the ego vehicle path.

Activation

This module is activated if the launch parameter launch_mvp_run_out_module is set to true.

Inner-workings / Algorithms

This module calculates the times when the ego vehicle and the objects are predicted to overlap each other’s trajectories. These times are then used to decide whether to stop before the overlap or not.

Next we explain the inner-workings of the module in more detail.

1. Ego trajectory footprint

In this first step, the trajectory footprint is constructed from the corner points of the vehicle. 4 linestrings are constructed from the 4 corners (front left, front right, rear left, rear right) projected at each trajectory point.

At this step, the footprint size can be adjusted using the ego.lateral_margin and ego.longitudinal_margin parameters.

The following figures show the 4 corner linestrings calculated for the red trajectory.

Figures: ego_front_left_footprint, ego_front_right_footprint, ego_rear_left_footprint, ego_rear_right_footprint (front left, front right, rear left, and rear right corner linestrings).

These can be visualized on the debug markers with the ego_footprint_(front|rear)_(left|right) namespaces.
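
As an illustration of the corner projection described above, here is a minimal, self-contained C++ sketch; the types and parameter names (Pose, front_length, etc.) are simplified assumptions for illustration and not the module's actual API:

```cpp
#include <array>
#include <cmath>
#include <cstddef>
#include <vector>

// Simplified types for illustration; the module uses Autoware's trajectory
// and vehicle_info structures.
struct Pose { double x, y, yaw; };
struct Point2d { double x, y; };

// Project the 4 vehicle corners at one trajectory pose, with the
// ego.longitudinal_margin / ego.lateral_margin parameters applied.
std::array<Point2d, 4> project_corners(
  const Pose & pose, double front_length, double rear_length, double half_width,
  double longitudinal_margin, double lateral_margin)
{
  const double front = front_length + longitudinal_margin;
  const double rear = rear_length + longitudinal_margin;
  const double width = half_width + lateral_margin;
  const double c = std::cos(pose.yaw);
  const double s = std::sin(pose.yaw);
  const auto to_map = [&](double lon, double lat) {
    return Point2d{pose.x + lon * c - lat * s, pose.y + lon * s + lat * c};
  };
  // order: front left, front right, rear left, rear right
  return {to_map(front, width), to_map(front, -width), to_map(-rear, width), to_map(-rear, -width)};
}

// One linestring per corner, with one point per trajectory pose.
std::array<std::vector<Point2d>, 4> build_corner_linestrings(
  const std::vector<Pose> & trajectory, double front_length, double rear_length,
  double half_width, double longitudinal_margin, double lateral_margin)
{
  std::array<std::vector<Point2d>, 4> linestrings;
  for (const auto & pose : trajectory) {
    const auto corners = project_corners(
      pose, front_length, rear_length, half_width, longitudinal_margin, lateral_margin);
    for (std::size_t i = 0; i < corners.size(); ++i) {
      linestrings[i].push_back(corners[i]);
    }
  }
  return linestrings;
}
```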

2. Extracting map filtering data

In the second step, we extract geometric information from the vector map that will be used to filter dynamic objects. For each object classification label, we prepare the following sets of geometries based on the parameters defined for that label (objects.{CLASSIFICATION_LABEL}):

  • polygons to ignore objects (ignore.polygon_types and ignore.lanelet_subtypes);
    • polygons for the ego trajectory footprint are also added if ignore.if_on_ego_trajectory is set to true.
  • polygons to ignore collisions (ignore_collisions.polygon_types and ignore_collisions.lanelet_subtypes);
  • segments to cut predicted paths (cut_predicted_paths.polygon_types, cut_predicted_paths.linestring_types, and cut_predicted_paths.lanelet_subtypes).
    • the rear segment of the current ego footprint is also added if cut_predicted_paths.if_crossing_ego_from_behind is set to true.

The following figure shows an example where the polygons to ignore objects are shown in blue, to ignore collisions in green, and to cut predicted paths in red.

map_filtering_data

These geometries can be visualized on the debug markers with the filtering_data_(ignore_objects|ignore_collisions|cut_predicted_paths) namespaces. The classification label corresponding to the published debug markers can be selected with parameter debug.object_label.
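
As a rough illustration of what is extracted per classification label, a simplified data structure could look like the sketch below; the type and field names are assumptions for illustration, not the module's actual types:

```cpp
#include <map>
#include <string>
#include <utility>
#include <vector>

// Simplified 2D geometry aliases; the module uses Boost.Geometry types.
using Point2d = std::pair<double, double>;
using Polygon2d = std::vector<Point2d>;
using Segment2d = std::pair<Point2d, Point2d>;

// Geometries prepared for one classification label, built from the
// objects.{CLASSIFICATION_LABEL} parameters described above.
struct FilteringDataPerLabel
{
  std::vector<Polygon2d> ignore_objects_polygons;       // ignore.polygon_types / lanelet_subtypes
  std::vector<Polygon2d> ignore_collisions_polygons;    // ignore_collisions.*
  std::vector<Segment2d> cut_predicted_paths_segments;  // cut_predicted_paths.*
};

// One entry per label, e.g. filtering_data["PEDESTRIAN"], filtering_data["CAR"].
using FilteringData = std::map<std::string, FilteringDataPerLabel>;
```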

3. Dynamic objects filtering

In this step, objects and their predicted paths are filtered based on their classification label and the corresponding parameters objects.{CLASSIFICATION_LABEL}.

An object is ignored if one of the following conditions is true:

  • its classification label is not in the list defined by the objects.target_labels parameter;
  • its velocity is below the ignore.stopped_velocity_threshold and ignore.if_stopped is set to true;
  • its current footprint is inside one of the polygons prepared in the previous step.

However, if it was decided to stop for the object in the previous iteration, or if a collision was detected with the object, then it cannot be ignored.

If an object is not ignored, its predicted path footprints are generated similarly to the ego footprint. First, we only keep predicted paths that have a confidence value above the confidence_filtering.threshold parameter. If confidence_filtering.only_use_highest is set to true, then for each object only the predicted paths with the highest confidence value are kept. Next, the remaining predicted paths are cut according to the segments prepared in the previous step.
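
A minimal sketch of this confidence filtering, assuming a simplified PredictedPath type (the real module works with the predicted objects message, and whether the threshold comparison is strict is an assumption):

```cpp
#include <algorithm>
#include <vector>

struct PredictedPath
{
  double confidence;
  // ... path points omitted for brevity ...
};

// Keep paths above confidence_filtering.threshold; if only_use_highest is true,
// additionally keep only the paths sharing the object's highest confidence.
std::vector<PredictedPath> filter_predicted_paths(
  const std::vector<PredictedPath> & paths, double threshold, bool only_use_highest)
{
  std::vector<PredictedPath> kept;
  for (const auto & path : paths) {
    if (path.confidence > threshold) {
      kept.push_back(path);
    }
  }
  if (only_use_highest && !kept.empty()) {
    const auto max_it = std::max_element(
      kept.begin(), kept.end(),
      [](const PredictedPath & a, const PredictedPath & b) { return a.confidence < b.confidence; });
    const double max_confidence = max_it->confidence;
    kept.erase(
      std::remove_if(
        kept.begin(), kept.end(),
        [&](const PredictedPath & p) { return p.confidence < max_confidence; }),
      kept.end());
  }
  return kept;
}
```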

The following figures show an example where crosswalks are used to ignore pedestrians and to cut their predicted paths.

Figures: objects_footprints (debug markers), objects_of_interest (objects of interest).

The result of the filtering can be visualized on the debug markers with the objects_footprints namespace which shows in yellow which predicted path will be used for collision checking in the next step.

In addition, the objects of interest markers show which objects are not ignored, with a color corresponding to the decision made for that object (green for nothing, yellow for slowdown, and red for stop).

4. Collision detection

Now that we have prepared the ego trajectory footprint, the dynamic objects, and their predicted paths, we calculate the times when they are predicted to collide.

The following operations are performed for each object that was not ignored in the previous step.

First, we calculate the intersections between each pair of linestrings between the ego and object footprints. For each intersection, we calculate the corresponding point, the time when ego and the object are predicted to reach that point, and the location of that point on the ego footprint (e.g., on the rear left linestring).

All these intersections are then combined into intervals representing when the overlap between the ego trajectory and object predicted paths starts and ends. An overlap is represented by the entering and exiting intersections for both ego and the object.

These overlaps, calculated for all the object’s predicted paths, are then combined if they overlap in time (allowing for the collision.time_overlap_tolerance parameter) and classified into the following collision types (a sketch of this classification is given after the list):

  • ignored_collision if one of the following conditions is true:
    • parameter collision.ignore_conditions.if_ego_arrives_first.enable is set to true and:
      • ego enters the overlap at least time_margin seconds before the object.
        • time_margin is calculated based on the time when ego enters the overlap, which is used to interpolate the margin using the mapping between margin.ego_enter_times and margin.time_margins.
      • ego does not overlap the object’s path by more than max_overlap_duration seconds.
    • parameter collision.ignore_conditions.if_ego_arrives_first_and_cannot_stop.enable is set to true and:
      • ego cannot stop before entering the interval by using the deceleration limit set with deceleration_limit.
  • collision if ego and the object are predicted to be in the overlap at the same time.
    • the time distance between the time intervals of ego and the object must be smaller than the time_margin parameter (a distance of 0 means that the intervals overlap).
  • pass_first_no_collision if ego is predicted to exit the overlap before the object enters it.
  • no_collision in all other cases.
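
The classification of one combined overlap can be summarized with the following sketch; the TimeInterval type and the boolean inputs for the ignore conditions are simplifications of the logic described above, not the module's actual code:

```cpp
#include <algorithm>
#include <string>

// Time interval during which ego (or the object) overlaps the other's path.
struct TimeInterval { double enter_time; double exit_time; };

// Signed distance between two time intervals: 0 or less means they overlap.
double interval_distance(const TimeInterval & a, const TimeInterval & b)
{
  return std::max(a.enter_time - b.exit_time, b.enter_time - a.exit_time);
}

std::string classify_overlap(
  const TimeInterval & ego, const TimeInterval & object, double time_margin,
  bool ignore_because_ego_arrives_first, bool ignore_because_ego_cannot_stop)
{
  if (ignore_because_ego_arrives_first || ignore_because_ego_cannot_stop) {
    return "ignored_collision";
  }
  if (interval_distance(ego, object) < time_margin) {  // intervals (almost) overlap in time
    return "collision";
  }
  if (ego.exit_time < object.enter_time) {  // ego exits before the object enters
    return "pass_first_no_collision";
  }
  return "no_collision";
}
```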

In the case where a collision is detected, the corresponding collision time is calculated based on the yaw difference between the object and the ego vehicle at the first intersection point (a sketch of this adjustment is given after the list below).

  • default: collision time is set to the time when ego enters the overlap.
  • object yaw is within collision.same_direction_angle_threshold of the ego yaw:
    • if the object is faster than ego, no collision will happen and the type is changed to no_collision.
    • the collision time is increased based on the velocity difference.
  • object yaw is within collision.opposite_direction_angle_threshold of the opposite of the ego yaw:
    • the collision time is increased based on the estimated time when ego will collide with the object after entering the overlap.
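
A simplified sketch of this yaw-based adjustment of the collision time; the exact way the time is "increased based on the velocity difference" is an assumption here (relative-velocity catch-up and head-on closing times), not the formula used by the module:

```cpp
#include <cmath>
#include <optional>

// Wrap an angle difference to [-pi, pi].
double normalize_angle(double angle) { return std::atan2(std::sin(angle), std::cos(angle)); }

// Returns the predicted collision time, or std::nullopt when the case is
// reclassified as no_collision (same direction, object faster than ego).
std::optional<double> estimate_collision_time(
  double ego_enter_time, double ego_yaw, double ego_velocity, double object_yaw,
  double object_velocity, double distance_inside_overlap,
  double same_direction_angle_threshold, double opposite_direction_angle_threshold)
{
  const double yaw_diff = std::abs(normalize_angle(object_yaw - ego_yaw));
  if (yaw_diff < same_direction_angle_threshold) {  // object moves in ego's direction
    if (object_velocity >= ego_velocity) {
      return std::nullopt;  // ego never catches up: no_collision
    }
    // collision time increased based on the velocity difference (assumed catch-up time)
    return ego_enter_time + distance_inside_overlap / (ego_velocity - object_velocity);
  }
  if (std::abs(normalize_angle(object_yaw - ego_yaw - M_PI)) < opposite_direction_angle_threshold) {
    // opposite direction: assumed time for ego and the object to meet inside the overlap
    return ego_enter_time + distance_inside_overlap / (ego_velocity + object_velocity);
  }
  return ego_enter_time;  // default: time when ego enters the overlap
}
```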

The following figure shows the collision points in red and a table showing each overlap found and the corresponding collision type, ego and object time intervals, and predicted ego collision time.

collisions

The collision points and the table can be visualized on the debug markers with the collisions_points and collisions_table namespaces.

5. Decisions

We now make a decision for each object on whether to stop, slow down, or do nothing. For each object, we consider what decision each collision requires and keep the one with the highest priority, as follows (a sketch of this rule is given after the list):

  • stop types have higher priority than slowdown types.
  • for two decisions of the same type, the one whose predicted collision time is earlier has higher priority.
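
This priority rule can be expressed as a simple comparison; the Decision type below is an illustrative simplification:

```cpp
// Decision made for one detected collision.
struct Decision
{
  enum class Type { nothing = 0, slowdown = 1, stop = 2 } type;
  double collision_time;  // predicted collision time [s]
};

// True when 'a' has higher priority than 'b': stop beats slowdown,
// and for equal types the earlier predicted collision wins.
bool has_higher_priority(const Decision & a, const Decision & b)
{
  if (a.type != b.type) {
    return static_cast<int>(a.type) > static_cast<int>(b.type);
  }
  return a.collision_time < b.collision_time;
}
```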

Once a decision is made, the history of the object is updated, which makes it possible to know, for each previous time step, what decision was made and what type of collision was identified.

To decide on the type of the decision, we use the decision history of the object and first check whether it satisfies the following conditions to stop:

  • if the current collision type is collision and collisions with the object have been identified for a consecutive duration of at least stop.on_time_buffer seconds.
  • if the previous decision was stop and the last identified collision with the object was less than stop.off_time_buffer seconds ago.

If the conditions to stop are not met, we check the following conditions to slow down (a sketch of this hysteresis is given after the list):

  • if the current collision type is collision and collisions with the object have been identified for a consecutive duration of at least slowdown.on_time_buffer seconds.
  • if the previous decision was slowdown and the last identified collision with the object was less than slowdown.off_time_buffer seconds ago.
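
Both conditions implement a simple hysteresis on the decision. A minimal sketch, assuming we track the start of the current run of consecutive collision detections and the time of the last one (names are illustrative):

```cpp
// Hysteresis on a stop (or slowdown) decision using the on/off time buffers.
// first_collision_time: start of the current run of consecutive collision detections.
// last_collision_time: time of the most recent detected collision with the object.
bool update_decision(
  bool collision_currently_detected, bool decision_was_active, double first_collision_time,
  double last_collision_time, double now, double on_time_buffer, double off_time_buffer)
{
  // add the decision: collisions detected consecutively for at least on_time_buffer seconds
  if (collision_currently_detected && (now - first_collision_time) >= on_time_buffer) {
    return true;
  }
  // keep the decision: the last collision was identified less than off_time_buffer seconds ago
  if (decision_was_active && (now - last_collision_time) < off_time_buffer) {
    return true;
  }
  return false;
}
```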

decisions

The decision table can be visualized on the debug markers with the decisions namespace.

6. Calculate the stops or slowdowns

Finally, for each object, we calculate how the velocity profile will be modified based on the decision made:

  • stop: insert a 0 velocity ahead of the predicted collision point by the distance set in the stop.distance_buffer parameter.
  • slowdown: insert a $V_{slow}$ velocity between the collision point and the point ahead of it by the distance set in the slowdown.distance_buffer parameter.
    • $V_{slow}$ is calculated as the maximum between the safe velocity and the comfortable velocity.
      • safe velocity: velocity required to be able to stop over the distance_buffer assuming a deceleration as set by the stop.deceleration_limit parameter.
      • comfortable velocity: velocity ego would reach assuming it constantly decelerates at the slowdown.deceleration_limit parameter until the slowdown point.
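
Under a constant-deceleration assumption, these two velocities can be written as follows, where $d_{buffer}$ is the distance buffer, $d_{slow}$ the distance from ego to the slowdown point, and $a_{stop}$, $a_{slow}$ the stop and slowdown deceleration limits; this is an illustrative reading of the description above, not a formula taken from the implementation:

$$V_{safe} = \sqrt{2\,a_{stop}\,d_{buffer}}, \qquad V_{comfort} = \sqrt{\max\left(0,\; v_{ego}^2 - 2\,a_{slow}\,d_{slow}\right)}, \qquad V_{slow} = \max(V_{safe}, V_{comfort})$$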

The slowdowns and stops inserted in the trajectory are visualized with the virtual walls.

virtual_walls

If an inserted stop point requires a stronger deceleration than set by the stop.deceleration_limit parameter, then an ERROR diagnostic is published to indicate that the stop is infeasible.

Use of Rtree for fast spatial queries

In step 1, each segment of the 4 linestrings of the ego trajectory footprint is stored in an Rtree along with the corresponding trajectory point index. This makes it possible to efficiently find intersections with an object’s predicted path, along with the corresponding ego trajectory segment from which the interpolated time_from_start can be calculated.

In step 2, the polygons and linestrings used for filtering the objects are stored in Rtree objects to efficiently find whether an object is inside a polygon or if its predicted path intersects a linestring.

For more information about Rtree, see https://beta.boost.org/doc/libs/1_82_0/libs/geometry/doc/html/geometry/spatial_indexes/introduction.html
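
A minimal, self-contained sketch of this idea using the Boost.Geometry rtree (not the module's actual code; storing each footprint segment together with its trajectory point index follows the description above):

```cpp
#include <boost/geometry.hpp>
#include <boost/geometry/index/rtree.hpp>

#include <cstddef>
#include <iterator>
#include <utility>
#include <vector>

namespace bg = boost::geometry;
namespace bgi = boost::geometry::index;

using Point2d = bg::model::point<double, 2, bg::cs::cartesian>;
using Segment2d = bg::model::segment<Point2d>;
// each footprint segment is stored with its trajectory point index so the
// interpolated time_from_start can be recovered after a query
using IndexedSegment = std::pair<Segment2d, std::size_t>;
using SegmentRtree = bgi::rtree<IndexedSegment, bgi::rstar<16>>;

SegmentRtree build_footprint_rtree(const std::vector<Point2d> & corner_linestring)
{
  std::vector<IndexedSegment> values;
  for (std::size_t i = 0; i + 1 < corner_linestring.size(); ++i) {
    values.emplace_back(Segment2d(corner_linestring[i], corner_linestring[i + 1]), i);
  }
  return SegmentRtree(values.begin(), values.end());  // packing (bulk) construction
}

// candidate ego footprint segments that may intersect one predicted path segment
std::vector<IndexedSegment> query_candidates(
  const SegmentRtree & rtree, const Segment2d & predicted_path_segment)
{
  std::vector<IndexedSegment> candidates;
  rtree.query(bgi::intersects(predicted_path_segment), std::back_inserter(candidates));
  return candidates;
}
```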

Accounting for prediction inaccuracies

When calculating predicted collisions between ego and the objects, we assume that the input ego trajectory contains accurate time_from_start values. Similarly, accurate predicted paths are expected to be provided for the objects.

To allow for errors in these predictions, margins around the time intervals can be added using the collision.time_margin parameter. Higher values of this parameter make it more likely to detect a collision and generate a stop.

The time buffers on_time_buffer and off_time_buffer delay the addition or removal of the decisions to stop/slowdown. Higher values prevent incorrect decisions in case of noisy object predictions, but also increase the reaction time, possibly causing stronger decelerations once a decision is made.

Module Parameters

{{ json_to_markdown("planning/motion_velocity_planner/autoware_motion_velocity_run_out_module/schema/run_out.schema.json") }}

Flow Diagram

Flow diagram

Debugging and Tuning Guide

Ego does not stop for the incoming object

Possible reasons to investigate:

  • the object classification label is not in the objects.target_labels;
  • the object is inside an ignore polygon (objects.LABEL.ignore.polygon_types or lanelet_subtypes);
  • the object is on the ego trajectory (objects.LABEL.ignore.if_on_ego_trajectory) or behind ego (if_behind_ego);
  • the predicted path of the object is cut (objects.LABEL.cut_predicted_paths.polygon_types, lanelet_subtypes, or linestring_types);
  • the collision is ignored;
    • ego does not have time to stop (ignore_conditions.if_ego_arrives_first_and_cannot_stop);
      • deceleration_limit can be increased.
    • ego is predicted to pass before the object (ignore_conditions.if_ego_arrives_first), including the time margin (calculated from margin.ego_enter_times and margin.time_margins);
      • time_margins can be increased.
  • the collision is not detected;
    • collision.time_margin can be increased.
    • the ego footprint can be made larger (ego.lateral_margin and ego.longitudinal_margin).

CHANGELOG
No CHANGELOG found.

Wiki Tutorials

This package does not provide any links to tutorials in its rosindex metadata. You can check on the ROS Wiki Tutorials page for the package.

Launch files

No launch files found

Messages

No message files found.

Services

No service files found

Plugins

No plugins found.
