
Commit

updated readme with ov_secondary links
rpng-guest committed Jul 7, 2020
1 parent 8774a97 commit 28f0a52
Showing 3 changed files with 24 additions and 4 deletions.
22 changes: 21 additions & 1 deletion ReadMe.md
@@ -19,6 +19,7 @@ Please take a look at the feature list below for full details on what the system

## News / Events

* **May 18, 2020** - Released secondary pose graph example repository [ov_secondary](https://github.com/rpng/ov_secondary) based on [VINS-Fusion](https://github.com/HKUST-Aerial-Robotics/VINS-Fusion). OpenVINS now publishes marginalized feature tracks, feature 3d positions, and the first camera's intrinsics and extrinsics. See [PR#66](https://github.com/rpng/open_vins/pull/66) for details and discussion.
* **April 3, 2020** - Released [v2.0](https://github.com/rpng/open_vins/releases/tag/v2.0) update to the codebase with some key refactoring, ros-free building, improved dataset support, and single inverse depth feature representation. Please check out the [release page](https://github.com/rpng/open_vins/releases/tag/v2.0) for details.
* **January 21, 2020** - Our paper has been accepted for presentation in [ICRA 2020](https://www.icra2020.org/). We look forward to seeing everybody there! We have also added links to a few videos of the system running on different datasets.
* **October 23, 2019** - OpenVINS placed first in the [IROS 2019 FPV Drone Racing VIO Competition
@@ -63,9 +64,27 @@ Please take a look at the feature list below for full details on what the system
* Out of the box evaluation on EurocMav and TUM-VI datasets
* Extensive evaluation suite (ATE, RPE, NEES, RMSE, etc..)

## Demo Videos


## Codebase Extensions

* **[ov_secondary](https://github.com/rpng/ov_secondary)** -
This is an example secondary thread which provides loop closure in a loosely coupled manner for [OpenVINS](https://github.com/rpng/open_vins).
This is a modification of the code originally developed by the HKUST aerial robotics group and can be found in their [VINS-Fusion](https://github.com/HKUST-Aerial-Robotics/VINS-Fusion) repository.
Here we stress that this is a loosely coupled method, thus no information is returned to the estimator to improve the underlying OpenVINS odometry.
This codebase has been modified in a few key areas, including exposing more loop closure parameters, subscribing to camera intrinsics, simplifying configuration such that only topics need to be supplied, and some tweaks to the loop closure detection to improve frequency. A minimal sketch of such a loosely coupled subscriber is given after this list.

* **[ov_maplab](https://github.com/rpng/ov_maplab)** -
This codebase contains the interface wrapper for exporting visual-inertial runs from [OpenVINS](https://github.com/rpng/open_vins) into the ViMap structure taken by [maplab](https://github.com/ethz-asl/maplab).
The state estimates and raw images are appended to the ViMap as OpenVINS runs through a dataset.
After completion of the dataset, features are re-extracted and triangulated with maplab's feature system.
This can be used to merge multi-session maps, or to perform a batch optimization after first running the data through OpenVINS.
Some examples have been provided along with a helper script to export trajectories into the standard groundtruth format (sketched below).
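
  As a small illustration of the "standard groundtruth format" mentioned above, the sketch below writes one TUM-style line per pose, `timestamp tx ty tz qx qy qz qw`. The exact column ordering is an assumption here, and this is a generic example rather than the actual helper script shipped with ov_maplab.

  ```cpp
  // Sketch: dump a trajectory to a TUM-style text file (assumed format:
  // "timestamp tx ty tz qx qy qz qw", one pose per line).
  #include <fstream>
  #include <iomanip>
  #include <string>
  #include <vector>

  struct StampedPose {
    double t;                    // timestamp [s]
    double px, py, pz;           // position in the global frame [m]
    double qx, qy, qz, qw;       // orientation quaternion
  };

  bool write_tum_trajectory(const std::string &path,
                            const std::vector<StampedPose> &traj) {
    std::ofstream out(path);
    if (!out.is_open()) return false;
    out << "# timestamp tx ty tz qx qy qz qw\n";
    out << std::fixed << std::setprecision(9);
    for (const auto &p : traj) {
      out << p.t << " " << p.px << " " << p.py << " " << p.pz << " "
          << p.qx << " " << p.qy << " " << p.qz << " " << p.qw << "\n";
    }
    return true;
  }
  ```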



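To make the loosely coupled design of [ov_secondary](https://github.com/rpng/ov_secondary) concrete, here is a minimal sketch of a secondary ROS node that only *subscribes* to the estimator's output and does its loop closure work on a separate thread, never publishing anything back to OpenVINS. The topic names (`/ov_msckf/odomimu`, `/ov_msckf/points_msckf`), message types, and the placeholder `detect_and_close_loops()` call are illustrative assumptions, not taken from the ov_secondary code.

```cpp
// Sketch only: a loosely coupled secondary node (assumed topic names/types).
// It listens to the estimator's output and works on its own thread;
// nothing is ever published back, so the OpenVINS odometry is untouched.
#include <mutex>
#include <queue>
#include <thread>

#include <ros/ros.h>
#include <nav_msgs/Odometry.h>
#include <sensor_msgs/PointCloud.h>

std::mutex buf_mtx;
std::queue<nav_msgs::Odometry::ConstPtr> odom_buf;
std::queue<sensor_msgs::PointCloud::ConstPtr> feat_buf;

void odom_cb(const nav_msgs::Odometry::ConstPtr &msg) {
  std::lock_guard<std::mutex> lck(buf_mtx);
  odom_buf.push(msg);
}

void feat_cb(const sensor_msgs::PointCloud::ConstPtr &msg) {
  std::lock_guard<std::mutex> lck(buf_mtx);
  feat_buf.push(msg);
}

// Secondary worker: drain the buffers and run loop-closure detection.
// A real implementation would build keyframes, query a place-recognition
// database, and optimize a pose graph here.
void loop_closure_thread() {
  ros::Rate rate(20);
  while (ros::ok()) {
    std::queue<nav_msgs::Odometry::ConstPtr> odom_work;
    std::queue<sensor_msgs::PointCloud::ConstPtr> feat_work;
    {
      std::lock_guard<std::mutex> lck(buf_mtx);
      std::swap(odom_work, odom_buf);
      std::swap(feat_work, feat_buf);
    }
    while (!odom_work.empty()) {
      // detect_and_close_loops(odom_work.front(), feat_work);  // placeholder
      odom_work.pop();
    }
    rate.sleep();
  }
}

int main(int argc, char **argv) {
  ros::init(argc, argv, "secondary_loop_closure");
  ros::NodeHandle nh;
  // Only topic names need to be supplied; intrinsics/extrinsics arrive as messages.
  ros::Subscriber sub_odom = nh.subscribe("/ov_msckf/odomimu", 100, odom_cb);
  ros::Subscriber sub_feat = nh.subscribe("/ov_msckf/points_msckf", 100, feat_cb);
  std::thread worker(loop_closure_thread);
  ros::spin();
  worker.join();
  return 0;
}
```

Because the communication is one-way, a delay or failure in the loop closure thread can never corrupt the real-time estimator, which is the main appeal of the loosely coupled design.
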
## Demo Videos

[![](docs/youtube/KCX51GvYGss.jpg)](http://www.youtube.com/watch?v=KCX51GvYGss "OpenVINS - EuRoC MAV Vicon Rooms Flyby")
[![](docs/youtube/Lc7VQHngSuQ.jpg)](http://www.youtube.com/watch?v=Lc7VQHngSuQ "OpenVINS - TUM VI Datasets Flyby")
[![](docs/youtube/vaia7iPaRW8.jpg)](http://www.youtube.com/watch?v=vaia7iPaRW8 "OpenVINS - UZH-FPV Drone Racing Dataset Flyby")
@@ -76,6 +95,7 @@ Please take a look at the feature list below for full details on what the system
[![](docs/youtube/ExPIGwORm4E.jpg)](http://www.youtube.com/watch?v=ExPIGwORm4E "OpenVINS - UZH-FPV Drone Racing Dataset Demonstration")



## Credit / Licensing

This code was written by the [Robot Perception and Navigation Group (RPNG)](https://sites.udel.edu/robot/) at the University of Delaware.
4 changes: 2 additions & 2 deletions ov_msckf/launch/pgeneva_ros_tum.launch
@@ -4,11 +4,11 @@
<arg name="max_cameras" default="2" />
<arg name="use_stereo" default="true" />
<arg name="bag_start" default="0" />
<arg name="bag" default="/home/patrick/datasets/tum/dataset-room2_512_16.bag" />
<arg name="bag" default="/home/patrick/datasets/tum/dataset-room1_512_16.bag" />

<!-- imu starting thresholds -->
<arg name="init_window_time" default="1.0" />
<arg name="init_imu_thresh" default="0.75" />
<arg name="init_imu_thresh" default="0.70" /> <!-- room1:0.70, room2:0.75, room3:0.70 -->

<!-- saving trajectory path and timing information -->
<arg name="dosave" default="false" />
2 changes: 1 addition & 1 deletion ov_msckf/launch/pgeneva_serial_tum.launch
@@ -32,7 +32,7 @@
<param name="max_cameras" type="int" value="2" />
<param name="dt_slam_delay" type="double" value="3" />
<param name="init_window_time" type="double" value="1.0" />
<param name="init_imu_thresh" type="double" value="0.8" />
<param name="init_imu_thresh" type="double" value="0.70" /> <!-- room1:0.70, room2:0.75, room3:0.70 -->
<rosparam param="gravity">[0.0,0.0,9.80766]</rosparam>
<param name="feat_rep_msckf" type="string" value="GLOBAL_3D" />
<param name="feat_rep_slam" type="string" value="ANCHORED_FULL_INVERSE_DEPTH" />
