Implementing ORB_SLAM2 with ROS and a Kinect

Hello all, I have been trying to implement ORB_SLAM2 with a Kinect in ROS. I have cloned the ORB_SLAM2 source code from GitHub.
Now I am confused about how to launch it with the Kinect. Can anyone suggest a way?

Hi raktim,

This package looks very promising :slight_smile: but I will not be able to help you if you do not tell me what the problem is. Maybe let's start from the beginning: are you able to run the depth sensor on your Kinect and view its output, for example in Rviz? And which version of the Kinect do you have?
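For reference, here is a minimal sketch of how the Kinect stream is usually brought up and inspected, assuming the freenect stack is installed (package and topic names may differ between ROS distributions and drivers):

```shell
# Terminal 1: start the Kinect driver with registered depth
roslaunch freenect_launch freenect.launch depth_registration:=true

# Terminal 2: open the visualizer
rosrun rviz rviz
# In Rviz, add an Image display on /camera/rgb/image_color and a
# PointCloud2 display on /camera/depth_registered/points to check
# that both the RGB and depth streams are coming through.
```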

Regards,
Hubert.

No sir, I have not been able to run the depth sensor or view its output.
I am using the Xbox 360 Kinect and want to create a map in real time with it.

Hi raktim,

If you are just starting your adventure with the Kinect and ROS, I recommend taking a closer look at the rtabmap_ros package. You can find it here. On the ROS wiki you can also find tutorials, which should be very helpful when starting out. It uses the freenect or openni packages to get depth sensor data, so remember to install one of them before you start using rtabmap_ros.
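A rough sketch of the setup steps, assuming ROS Kinetic (substitute your own distribution name) and the freenect driver:

```shell
# Install RTAB-Map's ROS package from the repositories
sudo apt-get install ros-kinetic-rtabmap-ros

# Terminal 1: Kinect driver with registered depth
roslaunch freenect_launch freenect.launch depth_registration:=true

# Terminal 2: start mapping; clearing the database starts a fresh map
roslaunch rtabmap_ros rtabmap.launch rtabmap_args:="--delete_db_on_start"
```

The rtabmap_ros wiki tutorials walk through the same sequence in more detail, including variants for openni.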

Regards,
Hubert.

Hey Hubert, thanks a lot for the suggestion. Now that I've installed RTAB-Map and created a map of my room using the Kinect, I want to do something more with it. Do you have any suggestions? I was thinking of checking out the loop closure detection and the map-building efficiency under different lighting conditions. Please do comment on this.
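To build intuition for what I'd be evaluating: appearance-based loop closure compares the current frame against past frames, e.g. as bag-of-visual-words histograms. A toy sketch of that idea (RTAB-Map's real pipeline uses incremental visual words and Bayesian filtering, so this is only an illustration; all names here are hypothetical):

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two visual-word histograms."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

def detect_loop_closure(current_hist, past_hists, threshold=0.8):
    """Return the index of the most similar past frame, or None.

    A toy stand-in for appearance-based loop closure: each frame is
    summarized as a histogram of visual-word counts, and a loop is
    hypothesized when the best similarity exceeds a threshold.
    """
    scores = [cosine_similarity(current_hist, h) for h in past_hists]
    if not scores:
        return None
    best = int(np.argmax(scores))
    return best if scores[best] >= threshold else None

# Toy example: past frame 0 resembles the current frame, frame 1 does not.
past = [[9, 1, 0, 2], [0, 5, 7, 1]]
current = [8, 1, 0, 3]
print(detect_loop_closure(current, past))  # -> 0
```

Testing this under different lighting would then mean measuring how often the detector fires correctly (and falsely) as the descriptors degrade.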

Regards,
Raktim.

Hi raktim,

Your ideas for further projects are very interesting, but maybe you would also be interested in going in a different direction. I suggest exploring object recognition and determining the location of objects. You can find a lot of examples on the internet.

You can use PCL, ORK, etc. The possibilities of these libraries are huge.
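As a small taste of the "determining location" part: once an object's points have been segmented out of the Kinect point cloud (with PCL, ORK, or anything else), its position can be estimated from the cluster centroid. A minimal numpy sketch of that step (the function name is just for illustration):

```python
import numpy as np

def object_centroid(points):
    """Estimate an object's position as the mean of its 3-D points.

    points: (N, 3) array of XYZ coordinates, e.g. a cluster segmented
    out of a Kinect point cloud. NaN rows (invalid depth) are ignored.
    """
    pts = np.asarray(points, dtype=float)
    valid = pts[~np.isnan(pts).any(axis=1)]
    if len(valid) == 0:
        raise ValueError("no valid points in cluster")
    return valid.mean(axis=0)

# Toy cluster with one invalid (NaN) depth reading
cluster = [[0.9, 0.1, 2.0],
           [1.1, -0.1, 2.0],
           [1.0, 0.0, 2.2],
           [np.nan, np.nan, np.nan]]
print(object_centroid(cluster))  # -> approximately [1.0, 0.0, 2.067]
```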

I can’t wait to see the results of your work.

Regards,
Hubert.