How to stop adding nodes while the robot is stationary?

Posted by COCO_PEAK_NOODLE


Why do this? As we know, in pure localization mode Cartographer keeps growing its submaps. When the surrounding environment has changed and the robot is standing still, the localization keeps accumulating IMU drift, which eventually produces a wrong pose.

Here is one solution: stop adding nodes while the robot is stationary. The approach is to check the front end's scan-matching result; when the reported change in distance and angle is small, stop sending nodes to the back end, and have the front end return the active_submap without growing it. The code is as follows:

std::unique_ptr<KalmanLocalTrajectoryBuilder2D::MatchingResult>
KalmanLocalTrajectoryBuilder2D::AddAccumulatedRangeData(
    const common::Time time,
    const sensor::RangeData& gravity_aligned_range_data,
    const transform::Rigid3d& gravity_alignment,
    const absl::optional<common::Duration>& sensor_duration) {
  if (gravity_aligned_range_data.returns.empty()) {
    LOG(WARNING) << "Dropped empty horizontal range data.";
    return nullptr;
  }

  if (gravity_aligned_range_data.returns.size() < 50) {
    LOG(WARNING) << "Missing lots of horizontal range data.";
    return nullptr;
  }

  // Match state: 0 = add a node as usual, 1 = robot is stationary.
  int current_match_state = 0;  // default is no error

  transform::Rigid3d gravity_alignment_variety = gravity_alignment;

  // Computes a gravity aligned pose prediction.
  const transform::Rigid3d non_gravity_aligned_pose_prediction =
      extrapolator_->ExtrapolatePose(time);

  // Compare the predicted pose against the last accepted pose.
  if (last_pose_estimate.translation() != Eigen::Vector3d::Zero()) {
    pose_prediction_distance =
        (non_gravity_aligned_pose_prediction.translation() -
         last_pose_estimate.translation())
            .norm();
    pose_prediction_angle = transform::GetAngle(
        non_gravity_aligned_pose_prediction.inverse() * last_pose_estimate);
  }

  // Same as upstream cartographer.
  const transform::Rigid2d pose_prediction = transform::Project2D(
      non_gravity_aligned_pose_prediction * gravity_alignment_variety.inverse());

  const sensor::PointCloud& filtered_gravity_aligned_point_cloud =
      sensor::AdaptiveVoxelFilter(options_.adaptive_voxel_filter_options())
          .Filter(gravity_aligned_range_data.returns);
  if (filtered_gravity_aligned_point_cloud.empty()) {
    return nullptr;
  }

  // local map frame <- gravity-aligned frame
  pose_estimate_2d =
      ScanMatch(time, pose_prediction, filtered_gravity_aligned_point_cloud);
  if (pose_estimate_2d == nullptr) {
    LOG(WARNING) << "Scan matching failed.";
    return nullptr;
  }
  last_pose_estimate_2d = *pose_estimate_2d;

  // Upstream cartographer:
  // const transform::Rigid3d pose_estimate =
  //     transform::Embed3D(*pose_estimate_2d) * gravity_alignment;

  transform::Rigid3d pose_estimate =
      transform::Embed3D(*pose_estimate_2d) * gravity_alignment_variety;

  // Recompute the deltas against the scan-matched pose.
  if (last_pose_estimate.translation() != Eigen::Vector3d::Zero()) {
    pose_prediction_distance =
        (pose_estimate.translation() - last_pose_estimate.translation())
            .norm();
    pose_prediction_angle =
        transform::GetAngle(pose_estimate.inverse() * last_pose_estimate);
  }

  // Thresholds: 5 mm of translation and 0.25 degrees of rotation.
  if (pose_prediction_distance < 0.005 &&
      pose_prediction_angle < (0.25 / 180 * M_PI)) {
    // The robot is still in place. This is not a failure; we just do not
    // tell the back end to optimize.
    current_match_state = 1;  // robot is in place

    // Reuse the last pose estimate so the pose cannot creep while the
    // robot stands still.
    pose_estimate = last_pose_estimate;
  }

  // Kalman filter: add the pose observation.
  pose_tracker_->AddPoseObservation(
      time, pose_estimate,
      options_.scan_matcher_variance() *
          kalman_filter::PoseCovariance::Identity());
  // Kalman filter: get the filtered pose estimate.
  kalman_filter::PoseCovariance covariance_estimate;
  pose_tracker_->GetPoseEstimateMeanAndCovariance(
      time, &scan_matcher_pose_estimate_, &covariance_estimate);

  // Pass the Kalman-filtered pose instead of the raw 'pose_estimate'.
  extrapolator_->AddPose(time, scan_matcher_pose_estimate_);
  last_pose_estimate = scan_matcher_pose_estimate_;

  sensor::RangeData range_data_in_local =
      TransformRangeData(gravity_aligned_range_data,
                         transform::Embed3D(pose_estimate_2d->cast<float>()));

  std::unique_ptr<InsertionResult> insertion_result = nullptr;

  if (current_match_state == 0) {
    insertion_result = InsertIntoSubmap(
        time, range_data_in_local, filtered_gravity_aligned_point_cloud,
        scan_matcher_pose_estimate_, gravity_alignment_variety.rotation());
  } else if (current_match_state == 1) {
    // The robot did not move: return the active submaps without
    // inserting the scan.
    insertion_result = InsertIntoSubmapWithoutNode(
        time, range_data_in_local, filtered_gravity_aligned_point_cloud,
        scan_matcher_pose_estimate_, gravity_alignment_variety.rotation());
  } else {
    return nullptr;
  }

  last_gravity_alignment = gravity_alignment_variety;

  const auto wall_time = std::chrono::steady_clock::now();
  if (last_wall_time_.has_value()) {
    const auto wall_time_duration = wall_time - last_wall_time_.value();
    kLocalSlamLatencyMetric->Set(common::ToSeconds(wall_time_duration));
    if (sensor_duration.has_value()) {
      kLocalSlamRealTimeRatio->Set(common::ToSeconds(sensor_duration.value()) /
                                   common::ToSeconds(wall_time_duration));
    }
  }
  const double thread_cpu_time_seconds = common::GetThreadCpuTimeSeconds();
  if (last_thread_cpu_time_seconds_.has_value()) {
    const double thread_cpu_duration_seconds =
        thread_cpu_time_seconds - last_thread_cpu_time_seconds_.value();
    if (sensor_duration.has_value()) {
      kLocalSlamCpuRealTimeRatio->Set(
          common::ToSeconds(sensor_duration.value()) /
          thread_cpu_duration_seconds);
    }
  }
  last_wall_time_ = wall_time;
  last_thread_cpu_time_seconds_ = thread_cpu_time_seconds;

  // MatchingResult carries current_match_state so callers can tell
  // stationary frames apart from normal ones.
  return absl::make_unique<MatchingResult>(
      MatchingResult{time, scan_matcher_pose_estimate_,
                     std::move(range_data_in_local),
                     std::move(insertion_result),
                     current_match_state});
}
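The angle test relies on transform::GetAngle, which measures the magnitude of a Rigid3d's rotation. A minimal sketch of that computation (an assumption that it matches cartographer's implementation, shown only to make the 0.25-degree threshold concrete):

#include <cmath>
#include "Eigen/Geometry"

// Rotation magnitude of a quaternion, in radians, in [0, pi].
double RotationAngle(const Eigen::Quaterniond& q) {
  // Angle of the axis-angle representation; numerically stable for
  // small rotations, which is what the stationarity test cares about.
  return 2.0 * std::atan2(q.vec().norm(), std::abs(q.w()));
}

With this, the stationarity condition reads: the pose moved less than 5 mm and rotated less than 0.25 degrees since the last accepted pose.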

The variant that returns an insertion result without inserting into the submap:

std::unique_ptr<KalmanLocalTrajectoryBuilder2D::InsertionResult>
KalmanLocalTrajectoryBuilder2D::InsertIntoSubmapWithoutNode(
    common::Time time, const sensor::RangeData& range_data_in_local,
    const sensor::PointCloud& filtered_gravity_aligned_point_cloud,
    const transform::Rigid3d& pose_estimate,
    const Eigen::Quaterniond& gravity_alignment) {
  if (motion_filter_.IsSimilar(time, pose_estimate)) {
    return nullptr;
  }
  // Read the current submaps without inserting the range data, so the
  // active submaps do not grow.
  std::vector<std::shared_ptr<const Submap2D>> insertion_submaps =
      active_submaps_.submaps();

  return absl::make_unique<InsertionResult>(InsertionResult{
      std::make_shared<const TrajectoryNode::Data>(TrajectoryNode::Data{
          time,
          gravity_alignment,
          filtered_gravity_aligned_point_cloud,
          {},  // 'high_resolution_point_cloud' is only used in 3D.
          {},  // 'low_resolution_point_cloud' is only used in 3D.
          {},  // 'rotational_scan_matcher_histogram' is only used in 3D.
          pose_estimate}),
      std::move(insertion_submaps)});
}
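For contrast, the only structural difference from the normal path is how insertion_submaps is obtained. In upstream cartographer's LocalTrajectoryBuilder2D::InsertIntoSubmap, the scan is written into the active submaps, which is exactly what makes them grow (a sketch based on upstream; this fork's InsertIntoSubmap may differ in details):

// Normal insertion path (upstream-style sketch): writing the range data
// is what grows the active submaps.
std::vector<std::shared_ptr<const Submap2D>> insertion_submaps =
    active_submaps_.InsertRangeData(range_data_in_local);

// Stationary path used above: read-only access, the submaps stay frozen.
// std::vector<std::shared_ptr<const Submap2D>> insertion_submaps =
//     active_submaps_.submaps();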

Finally, the modified code where the ROS front end publishes the robot pose. It comes down to this one statement: when the robot is detected as stationary, refresh the update time so navigation does not treat the pose as stale:

// robot is still in place; update the time so navigation does not time out
if (distance < 0.005 && angle < 0.25) {
  last_update_time = ros::Time::now();
}
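For context, distance and angle here presumably come from comparing the pose about to be published with the previously published one; note that angle appears to be in degrees, since 0.25 is compared without conversion. A minimal self-contained sketch of such a comparison (hypothetical helper names, planar-robot assumption):

#include <cmath>
#include <geometry_msgs/Pose.h>

// Hypothetical helper: planar translation delta in meters.
double TranslationDelta(const geometry_msgs::Pose& current,
                        const geometry_msgs::Pose& previous) {
  return std::hypot(current.position.x - previous.position.x,
                    current.position.y - previous.position.y);
}

// Standard yaw-from-quaternion formula (planar robot assumption).
double YawOf(const geometry_msgs::Quaternion& q) {
  return std::atan2(2.0 * (q.w * q.z + q.x * q.y),
                    1.0 - 2.0 * (q.y * q.y + q.z * q.z));
}

// Hypothetical helper: absolute yaw delta in degrees.
double YawDeltaDegrees(const geometry_msgs::Pose& current,
                       const geometry_msgs::Pose& previous) {
  double delta = YawOf(current.orientation) - YawOf(previous.orientation);
  // Normalize to [-pi, pi] before taking the magnitude.
  delta = std::atan2(std::sin(delta), std::cos(delta));
  return std::fabs(delta) * 180.0 / M_PI;
}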
