This repository has been archived by the owner on Mar 18, 2022. It is now read-only.

A little help with the stream #3

Open
Adrianogba opened this issue May 15, 2019 · 7 comments

Comments

@Adrianogba

Adrianogba commented May 15, 2019

Do you recommend any tutorials or other examples on how to get this data from the camera and send it somewhere?
I need to solve these issues:

  • Take a 3D picture and send it somewhere
  • Identify whether there is a person in this 3D image
  • (optional, but pretty helpful) Stream the data to a web service

Can you give me any help with any of these?
And thanks for this example; I was really lost on how to work with this camera before finding it.

@Michael-List
Owner

There are nearly no examples or tutorials. I figured out most of it via reverse engineering. I updated the example a little bit with commit 264dc73. You should download the .aar file for Android and then unzip it. I decompiled the .class files with IntelliJ and looked at the code.

The OrbbecCamAndroid class now includes a get3DVectors() and a getDepthData() method, which should partly solve your first problem.

Unfortunately I cannot test the code because I don't have the hardware, but you are welcome to send a pull request with improvements.
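If it helps, here is a rough usage sketch. It is untested, and the constructor and the return types of the two methods are assumptions based on the decompiled classes, so treat it only as a starting point:

```java
import android.util.Log;

// Untested sketch: the constructor and the return types of getDepthData()
// and get3DVectors() are assumptions, not confirmed API.
void logClosestPoint(OrbbecCamAndroid camera) {
    // Raw depth frame: one value per pixel, assumed to be millimetres (0 = no measurement).
    short[] depth = camera.getDepthData();

    // 3D points: assumed to be a flat array of x/y/z triples in camera coordinates.
    float[] points = camera.get3DVectors();

    int closestMm = Integer.MAX_VALUE;
    for (short d : depth) {
        if (d > 0 && d < closestMm) {
            closestMm = d;
        }
    }
    Log.d("Orbbec", "Closest measured point: " + closestMm + " mm, "
            + (points.length / 3) + " 3D points in this frame");
}
```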

@Adrianogba
Author

@Michael-List Thanks, this was really helpful for understanding better how to get these. But about face detection and face recognition, do you have any clues? I need a way to detect whether the face being tracked is a real person and not a photo. I know this isn't perfect, but it's a lot better than using just a regular camera.
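
My rough idea so far is to look at the spread of depth values inside the face rectangle coming from a 2D face detector, since a printed photo is nearly flat. This is just an untested sketch; the row-major millimetre frame layout and the threshold are assumptions:

```java
// Naive liveness check (untested sketch): a flat print has almost no depth relief,
// while a real face spreads over several centimetres.
boolean looksLikeRealFace(short[] depth, int frameWidth,
                          int left, int top, int right, int bottom) {
    double sum = 0, sumSq = 0;
    int count = 0;
    for (int y = top; y < bottom; y++) {
        for (int x = left; x < right; x++) {
            int d = depth[y * frameWidth + x];
            if (d > 0) {                      // skip pixels with no measurement
                sum += d;
                sumSq += (double) d * d;
                count++;
            }
        }
    }
    if (count == 0) return false;
    double mean = sum / count;
    double stdDev = Math.sqrt(Math.max(0, sumSq / count - mean * mean));
    return stdDev > 15.0;                     // threshold in mm, needs tuning on real data
}
```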

@Michael-List
Owner

I'm sorry but I have no clue about face detection and recognition. I only used the hardware to get 3D point clouds.

@abolfazljalali

Hello @Michael-List ,

I am using your application to get depth data frames from the hardware.
I can send the DepthFrame data via a TCP socket to a Python server, but I don't know how to decode it and create an image from it.
Can you please help me with this, or with how to create an image or video stream in Android from the DepthFrame data?

@Michael-List
Owner

Hi @abolfazljalali ,

I never created an image/video stream from that. I used the depth data to calculate the volume of bowls. To debug the whole thing I saved the depth data and visualized it with Matlab. Also, I didn't send the data anywhere; everything was calculated on the device.

But you could convert the depth data to JSON for sending it to a server; maybe that helps.
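
Something along these lines could work. It is untested; the width/height parameters, the field names, and the millimetre unit are just assumptions about the frame layout:

```java
import org.json.JSONArray;
import org.json.JSONException;
import org.json.JSONObject;

// Untested sketch: wrap one depth frame in JSON before sending it to a server.
// Field names and the millimetre unit are assumptions, not a defined format.
JSONObject depthFrameToJson(short[] depth, int width, int height) throws JSONException {
    JSONArray values = new JSONArray();
    for (short d : depth) {
        values.put((int) d);   // one depth value per pixel, assumed millimetres
    }
    JSONObject frame = new JSONObject();
    frame.put("width", width);
    frame.put("height", height);
    frame.put("unit", "mm");
    frame.put("depth", values);
    return frame;
}
```

Keep in mind that JSON is verbose for full frames; sending the raw 16-bit values over your TCP socket and documenting the width, height, and byte order is more compact.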

@abolfazljalali

Michael, thank you for your response. I just need to know the image format or encoding of the depth data. I sent it through a TCP socket to a Python server and plotted it with the Pillow image library, but I don't know the format of the Point Stream (Pose) and the ColoredStream.
Can you tell me what those data look like, if you have worked with them?
Thank you in advance.

@Michael-List
Owner

I'm sorry, I can't help you with that. It has been nearly two years since I did a project with this, and I don't have the hardware; otherwise I could run a test and take a look at it.
