
ArCore Support now that Tango is dead? #378

Open
eric-schleicher opened this issue Sep 11, 2017 · 14 comments
@eric-schleicher

For my purposes, one of the most useful aspects of the Tango device is using it to integrate a device pose into ROS. Curiously, I don't (currently) use the Tango depth sensor, as I'm using a much more powerful RGBD sensor.

With the death of Tango and the resulting advent of ARCore, is there any thought about where to take this project / add support for ARCore? On the surface it looks like it could radically open up opportunities for people to write ROS-powered consumer applications.

Somewhat selfishly, it would be awesome to use a son-of-tango-streamer on ARCore-supporting phones as a device-pose appliance for my ROS-based projects. I'm willing to go do that myself, but it seemed that this was the right place to surface the question.

Thoughts?

@eric-schleicher eric-schleicher changed the title from "Question ArCore" to "ArCore Support now that Tango is dead?" Sep 11, 2017
@smits
Contributor

smits commented Sep 19, 2017

It's definitely possible, but it will require some work, as TangoRosStreamer is currently based on Tango's C API, which is not yet available for ARCore. Let me get in contact with the Google Tango team to assess the possibility of a C API release. If that is not possible, we would basically have to rewrite TangoRosStreamer on top of ARCore's Java API.

@eric-schleicher
Author

Thx, I'll stay tuned.

@jimwhite

End of Tango announced today with ARCore being the new new thing (https://blog.google/products/google-vr/arcore-developer-preview-2/). Any update on prospects for a ROS Streamer for ARCore?

@eric-schleicher
Author

At this point it would make sense to just re-write a really trimmed down version in java. :(

@daranguiz

There was a C API introduced for ARCore in their v2 release (https://developers.google.com/ar/reference/c/). Any thoughts there?

@eric-schleicher
Author

This is still very compelling (streaming device poses from an Android device into ROS). I just lack the C chops to implement it (sad face).

@daranguiz

Quick update: I just wrote my own ROS node in Java. It's not great (only 5 Hz or so), but it gets the job done for our data. I can't share the code, but I'm using Float64MultiArray to handle the pose and Image to handle the camera data via rosjava.
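For anyone trying the same approach: a Float64MultiArray payload is just a flat double array, so a device pose has to be flattened before publishing. A minimal sketch of that flattening (the translation/quaternion ordering is an assumption, not a layout the commenter specified; the actual rosjava message wiring is omitted):

```java
import java.util.Arrays;

// Hypothetical helper: flatten a device pose (translation + unit quaternion)
// into the 7-element double[] that a std_msgs/Float64MultiArray would carry.
public class PoseFlattener {
    public static double[] flatten(double tx, double ty, double tz,
                                   double qx, double qy, double qz, double qw) {
        // Assumed layout: [tx, ty, tz, qx, qy, qz, qw]; the subscriber must
        // agree on this convention, since Float64MultiArray carries no names.
        return new double[] { tx, ty, tz, qx, qy, qz, qw };
    }

    public static void main(String[] args) {
        double[] data = flatten(1.0, 2.0, 0.5, 0.0, 0.0, 0.0, 1.0);
        System.out.println(Arrays.toString(data));
        // prints "[1.0, 2.0, 0.5, 0.0, 0.0, 0.0, 1.0]"
    }
}
```

In rosjava the resulting array would be assigned to the message's `setData` field before publishing; a geometry_msgs/PoseStamped would be the more self-describing choice if the extra dependency is acceptable.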

@eric-schleicher
Author

eric-schleicher commented Jan 5, 2018

The 5 Hz part: was that an arbitrary choice, or some external limitation?

(edit) Perhaps because you're sending the image as well?

@daranguiz

Yep, I'm sending the image too. You could probably go full rate (30 Hz) if you send over USB tethering (enable USB tethering, then use the appropriate IP address on your desktop) instead of over wifi, but I didn't try going any faster.

@eric-schleicher
Author

In a different example, using JavaScript/WebVR from a browser, I was able to push poses over websockets using roslibjs to rosbridge at about 100 Hz (no photos); IIRC it was about 32 kB/s.

Unfortunately that wasn't ARCore, but networking-wise that should be plenty of speed to move something simple like poses.
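A back-of-the-envelope check of that figure (my arithmetic, not from the thread): at 100 Hz the raw binary payload of a 7-double pose is only a few kB/s, so the observed ~32 kB/s leaves ample headroom for roslibjs's JSON and websocket framing overhead.

```java
// Rough bandwidth estimate for streaming bare poses at 100 Hz.
public class PoseRate {
    public static void main(String[] args) {
        int hz = 100;                  // pose update rate
        int doublesPerPose = 7;        // x, y, z translation + x, y, z, w quaternion
        int bytesPerDouble = 8;
        int rawBytesPerSec = hz * doublesPerPose * bytesPerDouble;
        System.out.println(rawBytesPerSec);
        // prints "5600": well under the observed ~32 kB/s, the remainder
        // being JSON serialization and websocket framing overhead.
    }
}
```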

@daranguiz

Oh, yeah. If you're just streaming poses, you can definitely go at the ARCore max rate (30 Hz, the same as the GL update rate). I'm streaming a 40 Hz topic alongside the image topic and my subscriber is receiving the updates in real time.

@eric-schleicher
Author

eric-schleicher commented Jan 17, 2018

@daranguiz if you can't share your project, can you point me to how to get the RGB image from the camera? I'm building effectively the same thing (posting the pose and RGB to ROS). I see how to get the pose, but not the actual camera texture (in the HelloAR example).

The divergence from the streamer topic here is only to keep it known that ARCore/ROS integration is a need.

@daranguiz

daranguiz commented Jan 17, 2018

@eric-schleicher have you looked at the new computer vision sample app released alongside the ARCore v2 developer preview? Link is here. They implement a way to grab the camera texture and save it to CPU-accessible memory in the /utility folder; for usage, see line 259 of MainActivity. CameraImageBuffer is just a wrapper class with an internal ByteBuffer. You should be able to convert that to a bitmap and send it via a sensor_msgs/Image topic.

Sorry to be so cagey - let me know if this works for you!
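Connecting the two comments above: once the camera pixels are in a CPU-side ByteBuffer, the remaining step is copying them into the flat byte[] that a sensor_msgs/Image `data` field expects, honoring the row stride (camera buffers often pad each row). A sketch under those assumptions; the buffer layout and stride handling are illustrative, not taken from the (unshared) ARCore sample code:

```java
import java.nio.ByteBuffer;
import java.util.Arrays;

// Hypothetical sketch: copy a CPU-accessible camera buffer (such as the
// ByteBuffer inside the sample's CameraImageBuffer) into the tightly packed
// byte[] a sensor_msgs/Image "data" field expects, dropping row padding.
public class ImageBufferToRosData {
    public static byte[] toImageData(ByteBuffer buf, int width, int height,
                                     int bytesPerPixel, int rowStride) {
        byte[] out = new byte[width * height * bytesPerPixel];
        int rowBytes = width * bytesPerPixel;
        for (int row = 0; row < height; row++) {
            buf.position(row * rowStride);      // skip any per-row padding
            buf.get(out, row * rowBytes, rowBytes);
        }
        return out;
    }

    public static void main(String[] args) {
        // 2x2 single-channel image with a row stride of 3 (1 padding byte per row).
        ByteBuffer buf = ByteBuffer.wrap(new byte[] {10, 20, 99, 30, 40, 99});
        byte[] data = toImageData(buf, 2, 2, 1, 3);
        System.out.println(Arrays.toString(data)); // prints "[10, 20, 30, 40]"
    }
}
```

The Image message would then carry this array plus matching `width`, `height`, `step` (= width * bytesPerPixel), and an `encoding` string such as "mono8" or "rgb8".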

@eric-schleicher
Author

No worries! That helps a bunch!
