As mentioned in the previous section, GlobalTrajectoryBuilder::AddSensorData receives the laser scan data after it has been converted into a point cloud. The first step is scan matching, inside GlobalTrajectoryBuilder's AddSensorData:
```cpp
std::unique_ptr<typename LocalTrajectoryBuilder::MatchingResult>
    matching_result = local_trajectory_builder_->AddRangeData(
        sensor_id, timed_point_cloud_data);
```
local_trajectory_builder_ is initialized in the constructor and can be either a LocalTrajectoryBuilder2D or a LocalTrajectoryBuilder3D. We start with LocalTrajectoryBuilder2D, located at cartographer/mapping/internal/2d/local_trajectory_builder_2d.h.
Point cloud data structures

Before getting into matching, let us first look at the data structures Cartographer uses to store point clouds.

Before time synchronization, a point cloud is a sensor::TimedPointCloudData. Each cloud carries a timestamp `time`, which is the time of the last point in the cloud:
```cpp
struct TimedPointCloudData {
  common::Time time;
  Eigen::Vector3f origin;
  TimedPointCloud ranges;
  std::vector<float> intensities;
};
```
Next, TimedPointCloud is a vector holding all the points:

```cpp
using TimedPointCloud = std::vector<TimedRangefinderPoint>;
```

Each point carries its own time, which is negative (relative to the last point of the scan) and is computed from the scan's time_increment:
```cpp
struct TimedRangefinderPoint {
  Eigen::Vector3f position;
  float time;
};
```
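As a minimal illustration of how these negative per-point times can be produced from a scan's time_increment, here is a hypothetical sketch. SimpleTimedPoint and MakeTimedPoints are made-up names for illustration, and a plain (x, y) pair stands in for Eigen::Vector3f:

```cpp
#include <utility>
#include <vector>

// Hypothetical sketch: deriving per-point relative times from time_increment.
// The last point of the scan gets time 0; earlier points get negative offsets.
struct SimpleTimedPoint {
  float x, y;   // stand-in for Eigen::Vector3f position
  float time;   // seconds relative to the last point (<= 0)
};

std::vector<SimpleTimedPoint> MakeTimedPoints(
    const std::vector<std::pair<float, float>>& positions,
    float time_increment) {
  std::vector<SimpleTimedPoint> points;
  const int n = static_cast<int>(positions.size());
  for (int i = 0; i < n; ++i) {
    // Point i was measured (n - 1 - i) increments before the last point.
    points.push_back({positions[i].first, positions[i].second,
                      -time_increment * static_cast<float>(n - 1 - i)});
  }
  return points;
}
```

With three points and a 0.5 s increment, the times come out as -1.0, -0.5, 0.0, matching the convention that the cloud's `time` refers to the last point.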
The LocalTrajectoryBuilder2D header

First the header file, which declares two structs for storing results:

```cpp
class LocalTrajectoryBuilder2D {
 public:
  struct InsertionResult {
    std::shared_ptr<const TrajectoryNode::Data> constant_data;
    std::vector<std::shared_ptr<const Submap2D>> insertion_submaps;
  };
  struct MatchingResult {
    common::Time time;
    transform::Rigid3d local_pose;
    sensor::RangeData range_data_in_local;
    std::unique_ptr<const InsertionResult> insertion_result;
  };
  // ...
};
```
The member variables:

```cpp
ActiveSubmaps2D active_submaps_;
MotionFilter motion_filter_;
scan_matching::RealTimeCorrelativeScanMatcher2D
    real_time_correlative_scan_matcher_;
scan_matching::CeresScanMatcher2D ceres_scan_matcher_;
std::unique_ptr<PoseExtrapolator> extrapolator_;
int num_accumulated_ = 0;
sensor::RangeData accumulated_range_data_;
absl::optional<std::chrono::steady_clock::time_point> last_wall_time_;
absl::optional<double> last_thread_cpu_time_seconds_;
absl::optional<common::Time> last_sensor_time_;
RangeDataCollator range_data_collator_;
```
Constructor

```cpp
LocalTrajectoryBuilder2D::LocalTrajectoryBuilder2D(
    const proto::LocalTrajectoryBuilderOptions2D& options,
    const std::vector<std::string>& expected_range_sensor_ids)
    : options_(options),
      active_submaps_(options.submaps_options()),
      motion_filter_(options_.motion_filter_options()),
      real_time_correlative_scan_matcher_(
          options_.real_time_correlative_scan_matcher_options()),
      ceres_scan_matcher_(options_.ceres_scan_matcher_options()),
      range_data_collator_(expected_range_sensor_ids) {}
```
The constructor simply initializes the member variables.

`options` corresponds to all the parameters in the configuration file trajectory_builder_2d.lua:

- options.submaps_options() returns the parameters inside the `submaps` block of that lua file
- options_.motion_filter_options() corresponds to the parameters under `motion_filter`
- options_.ceres_scan_matcher_options() returns everything under `ceres_scan_matcher`
- options_.real_time_correlative_scan_matcher_options() returns everything under `real_time_correlative_scan_matcher`
The overall structure of the lua configuration file:

```lua
TRAJECTORY_BUILDER_2D = {
  use_imu_data = true,
  min_range = 0.,
  max_range = 30.,
  min_z = -0.8,
  max_z = 2.,
  missing_data_ray_length = 5.,
  num_accumulated_range_data = 1,
  voxel_filter_size = 0.025,
  adaptive_voxel_filter = { ... },
  loop_closure_adaptive_voxel_filter = { ... },
  use_online_correlative_scan_matching = false,
  real_time_correlative_scan_matcher = { ... },
  ceres_scan_matcher = { ... },
  motion_filter = { ... },
  imu_gravity_time_constant = 10.,
  pose_extrapolator = { ... },
  submaps = { ... },
}
```
The AddRangeData function

```cpp
std::unique_ptr<LocalTrajectoryBuilder2D::MatchingResult>
LocalTrajectoryBuilder2D::AddRangeData(
    const std::string& sensor_id,
    const sensor::TimedPointCloudData& unsynchronized_data) {
  auto synchronized_data =
      range_data_collator_.AddRangeData(sensor_id, unsynchronized_data);
```
Point cloud time synchronization

The first step synchronizes the timestamps of point clouds coming from multiple lidars and returns the synchronized data (how the synchronization works is covered in the next post). The returned data structure is:

```cpp
struct TimedPointCloudOriginData {
  struct RangeMeasurement {
    TimedRangefinderPoint point_time;
    float intensity;
    size_t origin_index;
  };
  common::Time time;
  std::vector<Eigen::Vector3f> origins;
  std::vector<RangeMeasurement> ranges;
};
```

Note that `time` is the time of the last point in the cloud.
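A small sketch of how origin_index resolves each merged measurement back to the sensor origin it was measured from. All names here (Point2, Measurement, SyncedData, OriginOf) are simplified stand-ins for illustration, not Cartographer's actual types:

```cpp
#include <cstddef>
#include <vector>

// Hypothetical sketch: measurements from several range sensors are merged
// into one time-ordered vector, and each one keeps an index into a shared
// vector of sensor origins.
struct Point2 { float x, y; };

struct Measurement {
  Point2 point;
  std::size_t origin_index;  // index into SyncedData::origins
};

struct SyncedData {
  std::vector<Point2> origins;      // one origin per range sensor
  std::vector<Measurement> ranges;  // merged, time-ordered measurements
};

// Look up the origin a given measurement was taken from.
Point2 OriginOf(const SyncedData& data, std::size_t i) {
  return data.origins.at(data.ranges.at(i).origin_index);
}
```

This mirrors how the undistortion loop later uses `synchronized_data.origins.at(synchronized_data.ranges[i].origin_index)`.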
If an IMU is not used, the pose extrapolator is initialized here (the extrapolator itself is covered later):

```cpp
const common::Time& time = synchronized_data.time;
if (!options_.use_imu_data()) {
  InitializeExtrapolator(time);
}
```
The whole for loop below uses the pose extrapolator to estimate a pose for each individual point and collects the results into range_data_poses:

```cpp
std::vector<transform::Rigid3f> range_data_poses;
range_data_poses.reserve(synchronized_data.ranges.size());
bool warned = false;
for (const auto& range : synchronized_data.ranges) {
  common::Time time_point = time + common::FromSeconds(range.point_time.time);
  if (time_point < extrapolator_->GetLastExtrapolatedTime()) {
    if (!warned) {
      LOG(ERROR)
          << "Timestamp of individual range data point jumps backwards from "
          << extrapolator_->GetLastExtrapolatedTime() << " to " << time_point;
      warned = true;
    }
    time_point = extrapolator_->GetLastExtrapolatedTime();
  }
  range_data_poses.push_back(
      extrapolator_->ExtrapolatePose(time_point).cast<float>());
}
```
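What the extrapolator does per point can be illustrated with a toy 1-D constant-velocity model, including the clamping of backward time jumps seen in the warning branch above. Extrapolator1D is a made-up illustration, not Cartographer's PoseExtrapolator, which extrapolates a full Rigid3d pose from IMU/odometry:

```cpp
// Hypothetical 1-D sketch: predict a "pose" (here just a position) at each
// point's timestamp, clamping timestamps that jump backwards so the
// extrapolation never runs in reverse.
struct Extrapolator1D {
  double last_time = 0.0;  // time of the last extrapolation
  double position = 0.0;   // pose at last_time
  double velocity = 1.0;   // assumed constant velocity

  double ExtrapolatePose(double time) {
    if (time < last_time) time = last_time;  // clamp backwards jumps
    position += velocity * (time - last_time);
    last_time = time;
    return position;
  }
};
```

Calling this once per point timestamp yields one pose per point, which is exactly the shape of range_data_poses.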
Motion distortion

Next comes the removal of motion distortion.

Motion distortion arises because, while the lidar rotates through one full scan, all measurements are expressed in the coordinate frame of the first beam. If the robot moves during the scan, the measured ranges become inconsistent: for example, as a wall gets closer, plotting the raw ranges in the original frame produces a slanted wall (see the figure below).

So, as long as we know how far the robot has moved at the moment each point was measured (which is exactly what the pose extrapolator gave us), we can use that per-point pose to map each measurement back into a common coordinate frame and obtain much more accurate values:
```cpp
for (size_t i = 0; i < synchronized_data.ranges.size(); ++i) {
  const sensor::TimedRangefinderPoint& hit =
      synchronized_data.ranges[i].point_time;
  // Sensor origin at the moment this point was measured, in the local frame.
  const Eigen::Vector3f origin_in_local =
      range_data_poses[i] *
      synchronized_data.origins.at(synchronized_data.ranges[i].origin_index);
  // Undistort: transform the hit with its own per-point pose.
  sensor::RangefinderPoint hit_in_local =
      range_data_poses[i] * sensor::ToRangefinderPoint(hit);
  const Eigen::Vector3f delta = hit_in_local.position - origin_in_local;
  const float range = delta.norm();
  if (range >= options_.min_range()) {
    if (range <= options_.max_range()) {
      accumulated_range_data_.returns.push_back(hit_in_local);
    } else {
      // Too far away: shorten the ray to missing_data_ray_length and record
      // it as a miss, so the free space along it can still be updated.
      hit_in_local.position =
          origin_in_local +
          options_.missing_data_ray_length() / range * delta;
      accumulated_range_data_.misses.push_back(hit_in_local);
    }
  }
}
++num_accumulated_;
```
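The hit/miss classification in this loop can be reduced to the following 2-D sketch. Vec2, Classify, and the parameter values used in testing are illustrative stand-ins, not Cartographer's API:

```cpp
#include <cmath>

// Hypothetical 2-D sketch of the classification above: a return inside
// [min_range, max_range] is kept as a hit; one beyond max_range is shortened
// to missing_data_ray_length and stored as a miss; anything below min_range
// is dropped.
struct Vec2 { float x, y; };

enum class Classification { kDropped, kHit, kMiss };

Classification Classify(Vec2 origin, Vec2& hit, float min_range,
                        float max_range, float missing_data_ray_length) {
  const Vec2 delta{hit.x - origin.x, hit.y - origin.y};
  const float range = std::sqrt(delta.x * delta.x + delta.y * delta.y);
  if (range < min_range) return Classification::kDropped;
  if (range <= max_range) return Classification::kHit;
  // Too far: shorten the ray so the free space along it can still be marked.
  const float scale = missing_data_ray_length / range;
  hit = {origin.x + scale * delta.x, origin.y + scale * delta.y};
  return Classification::kMiss;
}
```

Shortening the out-of-range ray rather than discarding it is what lets the grid still mark cells along the beam as free, even though the endpoint is unreliable.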
As for the data structure of accumulated_range_data_, it stores the final values transformed into the local SLAM frame:

```cpp
struct RangeData {
  Eigen::Vector3f origin;
  PointCloud returns;
  PointCloud misses;
};
```
Starting a match

At this point we have one valid frame of data. Once the number of accumulated frames reaches the size set in the configuration (num_accumulated_range_data), a match is triggered. Before matching, the current robot pose is taken from the pose of the last point:
```cpp
if (num_accumulated_ >= options_.num_accumulated_range_data()) {
  const common::Time current_sensor_time = synchronized_data.time;
  absl::optional<common::Duration> sensor_duration;
  if (last_sensor_time_.has_value()) {
    sensor_duration = current_sensor_time - last_sensor_time_.value();
  }
  last_sensor_time_ = current_sensor_time;
  num_accumulated_ = 0;
  const transform::Rigid3d gravity_alignment = transform::Rigid3d::Rotation(
      extrapolator_->EstimateGravityOrientation(time));
  // Use the pose of the last point as the origin of the accumulated data.
  accumulated_range_data_.origin = range_data_poses.back().translation();
  return AddAccumulatedRangeData(
      time,
      TransformToGravityAlignedFrameAndFilter(
          gravity_alignment.cast<float>() * range_data_poses.back().inverse(),
          accumulated_range_data_),
      gravity_alignment, sensor_duration);
}
return nullptr;
```
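The accumulation gate at the top of this block can be sketched on its own. Accumulator and AddFrame are made-up names for illustration; in Cartographer the counter is the member num_accumulated_ and the threshold comes from the lua option num_accumulated_range_data:

```cpp
// Hypothetical sketch of the accumulation gate: a matching step only runs
// once enough frames have been collected; otherwise the caller gets nullptr
// and accumulation continues.
struct Accumulator {
  int num_accumulated;
  int num_accumulated_range_data;  // threshold from the lua options

  // Returns true when a scan match should be run after this frame.
  bool AddFrame() {
    ++num_accumulated;
    if (num_accumulated >= num_accumulated_range_data) {
      num_accumulated = 0;  // start the next accumulation window
      return true;
    }
    return false;
  }
};
```

With the default num_accumulated_range_data = 1, every frame triggers a match, which is why single-lidar setups see one MatchingResult per scan.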