Marker id is now in the request field
Hydran00 committed Jul 9, 2024
1 parent 5bf47d1 commit 3f33279
Showing 6 changed files with 159 additions and 123 deletions.
README.md (20 changes: 10 additions & 10 deletions)
@@ -31,31 +31,31 @@ ros2 launch aruco_camera_pose_estimator_cpp aruco_camera_pose_estimator.launch.p
```
In another sourced terminal, call the service:
```
ros2 service call /calibration/get_camera_pose calibration_srv/srv/Calibration "req:
data: ''"
ros2 service call /calibration/get_camera_pose calibration_srv/srv/Calibration "marker_id: <your-marker-id>"
```
The response is a PoseStamped message corresponding to the camera pose with respect to the world frame.
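The same request can also be issued from a node. The following is a minimal, illustrative rclcpp client sketch: the service name, service type, and the `marker_id` request field match the call above, while the response field name (`pose` here) is an assumption to check against the actual `Calibration.srv` definition.
```cpp
// Illustrative client sketch; field names other than marker_id are assumptions.
#include <memory>

#include <rclcpp/rclcpp.hpp>
#include "calibration_srv/srv/calibration.hpp"

int main(int argc, char **argv) {
  rclcpp::init(argc, argv);
  auto node = rclcpp::Node::make_shared("camera_pose_client");
  auto client = node->create_client<calibration_srv::srv::Calibration>(
      "/calibration/get_camera_pose");
  client->wait_for_service();

  auto request = std::make_shared<calibration_srv::srv::Calibration::Request>();
  request->marker_id = 9;  // the id of the aruco marker to look for

  auto future = client->async_send_request(request);
  if (rclcpp::spin_until_future_complete(node, future) ==
      rclcpp::FutureReturnCode::SUCCESS) {
    // The PoseStamped is assumed to be exposed as `pose`; a frame_id of
    // "error" signals that the data collection timed out.
    auto response = future.get();
    RCLCPP_INFO(node->get_logger(), "Camera pose frame: %s",
                response->pose.header.frame_id.c_str());
  }
  rclcpp::shutdown();
  return 0;
}
```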
### Maths behind this implementation
The OpenCV Aruco library lets us compute $M_{c}^{a} \in \mathbb{R}^{4 \times 4}$, which expresses the rototranslation of the Aruco marker with respect to the camera frame.
The OpenCV Aruco library lets us compute $M_{a}^{c} \in \mathbb{R}^{4 \times 4}$, which expresses the rototranslation of the Aruco marker with respect to the camera frame.

Since we want $M_{w}^{c}$ (the camera pose with respect to the world frame), we can compute this matrix by multiplying homogeneous matrices:
Since we want $M_{c}^{w}$ (the camera pose with respect to the world frame), we can compute this matrix by multiplying homogeneous matrices:

$M_{w}^{c} = M_{w}^{a} M_{a}^{c}$
$M_{c}^{w} = M_{a}^{w} M_{c}^{a}$

where $M_{a}^{c} \in \mathbb{R}^{4 \times 4}$ is the inverse of $M_{c}^{a}$ and $M_{w}^{a} \in \mathbb{R}^{4 \times 4}$ is the rototranslation of the aruco marker with respect to the world frame.
where $M_{c}^{a} \in \mathbb{R}^{4 \times 4}$ is the inverse of $M_{a}^{c}$ and $M_{a}^{w} \in \mathbb{R}^{4 \times 4}$ is the rototranslation of the aruco marker with respect to the world frame.

**Note that if you just need the camera pose with respect to the aruco marker, you can neglect $M_{w}^{a}$, i.e. set it equal to the identity matrix.**
**Note that if you just need the camera pose with respect to the aruco marker, you can neglect $M_{a}^{w}$, i.e. set it equal to the identity matrix.**
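As a concrete illustration of the composition above, the camera pose could be assembled with Eigen along these lines (a sketch under the stated conventions, not the package's actual implementation): build $M_{a}^{w}$ from the configured marker offsets, invert the marker-in-camera transform returned by OpenCV, and multiply the two.
```cpp
// Sketch of the composition M_c^w = M_a^w * M_c^a, with M_c^a = (M_a^c)^{-1}.
#include <Eigen/Geometry>

Eigen::Isometry3d camera_in_world(
    const Eigen::Vector3d &aruco_xyz_offset,      // translation part of M_a^w
    const Eigen::Quaterniond &aruco_rot_offset,   // rotation part of M_a^w
    const Eigen::Isometry3d &marker_in_camera) {  // M_a^c from the detection
  // M_a^w: marker pose in the world frame, built from the configured offsets.
  Eigen::Isometry3d marker_in_world = Eigen::Isometry3d::Identity();
  marker_in_world.linear() = aruco_rot_offset.toRotationMatrix();
  marker_in_world.translation() = aruco_xyz_offset;

  // M_c^a: the camera pose in the marker frame is the inverse (not the
  // transpose) of the marker pose in the camera frame.
  Eigen::Isometry3d camera_in_marker = marker_in_camera.inverse();

  // M_c^w = M_a^w * M_c^a
  return marker_in_world * camera_in_marker;
}
```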

### Parameters
#### ROS2 Parameters
- `input_topic_name`: ROS2 input topic name of the camera image stream
- `output_service_name`: ROS2 output service name of the computed camera pose
- `aruco_marker_id`: id of the aruco in the dictionary.
- `n_observation`: the number of measurements used for the computation of the mean rototranslation (`timeout_ms` should be set accordingly)
- `frame_id`: the name of the world frame in which the camera pose will be expressed
- `timeout_ms`: the amount of time in milliseconds before the data collection is considered failed. A higher `n_observation` means more time is needed to collect images from the stream. When the timeout is reached, the service response is sent with `error` in the `frame_id` field.
- `show_img`: whether to show the axes of the aruco marker when one is found.
- `aruco_size`: the aruco edge size in centimeters
- `aruco_XYZ_offset_from_baseframe`: XYZ displacement of the aruco marker with respect to the world frame. This corresponds to the entries (0,3), (1,3), (2,3) of $M_{w}^{a}$
- `aruco_rot_offset_from_baseframe`: orientation displacement of the aruco marker with respect to the world frame, specified as a (w, x, y, z) quaternion. After conversion to a rotation matrix, it corresponds to the 3x3 block starting at (0,0) in $M_{w}^{a}$
- `aruco_XYZ_offset_from_baseframe`: XYZ displacement of the aruco marker with respect to the world frame. This corresponds to the entries (0,3), (1,3), (2,3) of $M_{a}^{w}$
- `aruco_rot_offset_from_baseframe`: orientation displacement of the aruco marker with respect to the world frame, specified as a (w, x, y, z) quaternion. After conversion to a rotation matrix, it corresponds to the 3x3 block starting at (0,0) in $M_{a}^{w}$
#### Camera calibration parameters
Camera intrinsics and distortion coefficients can be set by editing `calibration_params.yaml` under `config`.
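For reference, these values follow the standard OpenCV pinhole and plumb-bob conventions; a sketch of how such parameters are typically packed into the `cv::Mat` intrinsic and distortion structures used by `cv::aruco::estimatePoseSingleMarkers` (not necessarily the package's exact code) is shown below.
```cpp
// Standard OpenCV 3x3 intrinsic matrix and 1x5 distortion vector layout.
#include <opencv2/core.hpp>

cv::Mat make_intrinsics(double fx, double fy, double cx, double cy) {
  return (cv::Mat_<double>(3, 3) << fx, 0.0, cx,
                                    0.0, fy, cy,
                                    0.0, 0.0, 1.0);
}

cv::Mat make_distortion(double k1, double k2, double p1, double p2, double k3) {
  // OpenCV expects the coefficient order (k1, k2, p1, p2, k3).
  return (cv::Mat_<double>(1, 5) << k1, k2, p1, p2, k3);
}
```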

aruco_camera_pose_estimator_cpp/config/config.yaml (2 changes: 1 addition & 1 deletion)
@@ -2,7 +2,7 @@ pose_service:
  ros__parameters:
    input_topic_name: "/camera/camera/color/image_raw"
    output_service_name: "/calibration/get_camera_pose"
    aruco_marker_id: 9
    frame_id: "base_link"
    n_observation: 10 # number of measurements for computing the average
    timeout_ms: 2000
    show_img: false
aruco_camera_pose_estimator_cpp/include/image_processor.hpp (69 changes: 35 additions & 34 deletions)
@@ -10,42 +10,43 @@
#include <rclcpp/rclcpp.hpp>
#include <sensor_msgs/msg/image.hpp>

class ImageProcessor : public rclcpp::Node {
public:
ImageProcessor(const std::string &topic_name, uint32_t N,
Eigen::Vector3d &mean_tvec, Eigen::Quaterniond &mean_quat,
bool &mean_computed, bool show_img, uint16_t aruco_marker_id,
double arucobase_size, double cx, double cy, double fx,
double fy, double k1, double k2, double k3, double p1,
double p2);
std::string topic_name_;
uint32_t N_;
Eigen::Vector3d &mean_tvec_;
Eigen::Quaterniond &mean_quat_;
bool &mean_computed_;
int idx_;
rclcpp::Subscription<sensor_msgs::msg::Image>::SharedPtr sub_;
void image_callback(const sensor_msgs::msg::Image::SharedPtr msg);
class ImageProcessor : public rclcpp::Node
{
public:
ImageProcessor(const std::string &topic_name, uint32_t N,
Eigen::Vector3d &mean_tvec, Eigen::Quaterniond &mean_quat,
bool &mean_computed, bool show_img,
double arucobase_size, double cx, double cy, double fx,
double fy, double k1, double k2, double k3, double p1,
double p2);
std::string topic_name_;
uint32_t N_;
Eigen::Vector3d &mean_tvec_;
Eigen::Quaterniond &mean_quat_;
bool &mean_computed_;
int idx_;
rclcpp::Subscription<sensor_msgs::msg::Image>::SharedPtr sub_;
uint16_t aruco_marker_id_;
void image_callback(const sensor_msgs::msg::Image::SharedPtr msg);

private:
std::vector<cv::Vec3d> rvec_cam_list_;
std::vector<cv::Vec3d> tvec_cam_list_;
cv::Ptr<cv::aruco::Dictionary> dictionary_;
cv::Ptr<cv::aruco::DetectorParameters> detector_params_;
cv::Mat image_;
bool show_img_;
uint16_t aruco_marker_id_;
double arucobase_size_;
double cx_, cy_, fx_, fy_;
double k1_, k2_, k3_, p1_, p2_;
cv::Mat intrinsic_camera_ = cv::Mat_<double>(3, 3);
cv::Mat distortion_camera_ = cv::Mat_<double>(1, 5);
private:
std::vector<cv::Vec3d> rvec_cam_list_;
std::vector<cv::Vec3d> tvec_cam_list_;
cv::Ptr<cv::aruco::Dictionary> dictionary_;
cv::Ptr<cv::aruco::DetectorParameters> detector_params_;
cv::Mat image_;
bool show_img_;
double arucobase_size_;
double cx_, cy_, fx_, fy_;
double k1_, k2_, k3_, p1_, p2_;
cv::Mat intrinsic_camera_ = cv::Mat_<double>(3, 3);
cv::Mat distortion_camera_ = cv::Mat_<double>(1, 5);

bool camera_pose_estimation(cv::Mat &frame, cv::Vec3d &tvec_camera,
cv::Vec3d &rvec_camera);
bool camera_pose_estimation(cv::Mat &frame, cv::Vec3d &tvec_camera,
cv::Vec3d &rvec_camera);

Eigen::Quaterniond quaternion_avg(
const std::vector<Eigen::Quaterniond> &quat_list);
Eigen::Quaterniond quaternion_avg(
const std::vector<Eigen::Quaterniond> &quat_list);
};

#endif // IMAGE_PROCESSOR_HPP
#endif // IMAGE_PROCESSOR_HPP
aruco_camera_pose_estimator_cpp/src/image_processor.cpp (59 changes: 40 additions & 19 deletions)
@@ -13,7 +13,7 @@ using namespace std::chrono_literals;
ImageProcessor::ImageProcessor(
const std::string &topic_name, const uint32_t N, Eigen::Vector3d &mean_tvec,
Eigen::Quaterniond &mean_quat, bool &mean_computed, bool show_img,
uint16_t aruco_marker_id, double arucobase_size, double cx, double cy,
double arucobase_size, double cx, double cy,
double fx, double fy, double k1, double k2, double k3, double p1, double p2)
: Node("image_processor"),
topic_name_(topic_name),
@@ -22,7 +22,6 @@ ImageProcessor::ImageProcessor(
mean_quat_(mean_quat),
mean_computed_(mean_computed),
show_img_(show_img),
aruco_marker_id_(aruco_marker_id),
arucobase_size_(arucobase_size),
cx_(cx),
cy_(cy),
@@ -32,7 +31,9 @@
k2_(k2),
k3_(k3),
p1_(p1),
p2_(p2) {
p2_(p2)
{
aruco_marker_id_ = 0;
idx_ = 0;
dictionary_ = cv::aruco::getPredefinedDictionary(cv::aruco::DICT_4X4_50);
detector_params_ = cv::aruco::DetectorParameters::create();
@@ -46,16 +47,21 @@ ImageProcessor::ImageProcessor(

bool ImageProcessor::camera_pose_estimation(cv::Mat &frame,
cv::Vec3d &tvec_camera,
cv::Vec3d &rvec_camera) {
cv::Vec3d &rvec_camera)
{
cv::Mat gray;
cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
std::vector<int> ids;
std::vector<std::vector<cv::Point2f>> corners;
cv::aruco::detectMarkers(gray, dictionary_, corners, ids, detector_params_);

if (!ids.empty()) {
for (size_t i = 0; i < ids.size(); ++i) {
if (ids[i] == aruco_marker_id_) {
// RCLCPP_INFO(this->get_logger(), "Looking for aruco with id %d", aruco_marker_id_);
if (!ids.empty())
{
for (size_t i = 0; i < ids.size(); ++i)
{
if (ids[i] == aruco_marker_id_)
{
// RCLCPP_INFO(this->get_logger(), "Detected Aruco base marker");
std::vector<cv::Vec3d> rvec, tvec;
cv::aruco::estimatePoseSingleMarkers(corners, arucobase_size_,
@@ -64,7 +70,8 @@ bool ImageProcessor::camera_pose_estimation(cv::Mat &frame,
tvec_camera = tvec[0];
rvec_camera = rvec[0];
// Draw axis for the aruco marker
if (show_img_) {
if (show_img_)
{
cv::aruco::drawAxis(frame, intrinsic_camera_, distortion_camera_,
rvec[0], tvec[0], 0.1);
cv::imshow("Aruco Marker", frame);
@@ -78,9 +85,11 @@ bool ImageProcessor::camera_pose_estimation(cv::Mat &frame,
}

Eigen::Quaterniond ImageProcessor::quaternion_avg(
const std::vector<Eigen::Quaterniond> &quat_list) {
const std::vector<Eigen::Quaterniond> &quat_list)
{
Eigen::Matrix4d M = Eigen::Matrix4d::Zero();
for (const auto &q : quat_list) {
for (const auto &q : quat_list)
{
Eigen::Vector4d q_vec(q.w(), q.x(), q.y(), q.z());
M += q_vec * q_vec.transpose();
}
@@ -92,18 +101,24 @@ Eigen::Quaterniond ImageProcessor::quaternion_avg(
}

void ImageProcessor::image_callback(
const sensor_msgs::msg::Image::SharedPtr msg) {
const sensor_msgs::msg::Image::SharedPtr msg)
{
// RCLCPP_INFO(this->get_logger(), "Received image");
cv_bridge::CvImagePtr cv_ptr;
try {
try
{
cv_ptr = cv_bridge::toCvCopy(msg, msg->encoding);
} catch (cv_bridge::Exception &e) {
}
catch (cv_bridge::Exception &e)
{
RCLCPP_ERROR(this->get_logger(), "cv_bridge exception: %s", e.what());
return;
}
cv::Vec3d tvec_camera, rvec_camera;
bool success =
camera_pose_estimation(cv_ptr->image, tvec_camera, rvec_camera);
if (success) {
if (success)
{
tvec_cam_list_[idx_] = tvec_camera;
rvec_cam_list_[idx_] = rvec_camera;
idx_++;
Expand All @@ -112,27 +127,33 @@ void ImageProcessor::image_callback(
cv_ptr.reset();

// Check if we've reached N samples
if (idx_ == static_cast<int>(N_)) {
if (idx_ == static_cast<int>(N_))
{
idx_ = 0;
Eigen::Vector3d mean_tvec = Eigen::Vector3d::Zero();
for (const auto &tvec : tvec_cam_list_) {
for (const auto &tvec : tvec_cam_list_)
{
mean_tvec += Eigen::Vector3d(tvec[0], tvec[1], tvec[2]);
}
mean_tvec /= tvec_cam_list_.size();
mean_tvec_ = mean_tvec;
std::vector<Eigen::Quaterniond> quat_list(tvec_cam_list_.size());
cv::Mat R_cv;
Eigen::Matrix3d R;
for (size_t i = 0; i < tvec_cam_list_.size(); ++i) {
for (size_t i = 0; i < tvec_cam_list_.size(); ++i)
{
cv::Rodrigues(rvec_cam_list_[i], R_cv);
cv::cv2eigen(R_cv, R);
quat_list[i] = Eigen::Quaterniond(R).normalized();
}
mean_quat_ = quaternion_avg(quat_list);
// Lock mean_computed with mutex (if multithreading is involved)
mean_computed_ = true;
} else {
if (idx_ >= static_cast<int>(N_)) {
}
else
{
if (idx_ >= static_cast<int>(N_))
{
idx_ = 0;
}
}