This repository has been archived by the owner on Mar 22, 2022. It is now read-only.

help with camera feed live processing #843

Open
daddym85 opened this issue Jan 18, 2022 · 5 comments

@daddym85

Hi there,
I’m working on the MS HoloLens 2. My goal is to analyze in real time the frames coming from the webcam feed and process them with the Barracuda ML engine. I’m following this tutorial to view the camera feed on a MeshRenderer component: https://microsoft.github.io/MixedReality-WebRTC/manual/unity/helloworld-unity-localvideo.html.
However, my problem is getting a Texture2D with the raw images from the live camera stream. Any suggestions?
Thanks,
Davide

@Xiink

Xiink commented Jan 19, 2022


This is my solution:
First I create a RenderTexture and use Graphics.Blit() to render the video material's output into it. It looks like this (the three-argument overload renders through the material into the RenderTexture):

Graphics.Blit(null, renderTexture, videoMaterial);

Then you can read the image back from the renderTexture. My method converts it to Base64; it looks like this:

// Create a Texture2D matching the RenderTexture and read the pixels back.
Texture2D tex = new Texture2D(renderTexture.width, renderTexture.height, TextureFormat.RGB24, false);
RenderTexture.active = renderTexture; // make it the source for ReadPixels
tex.ReadPixels(new Rect(0, 0, renderTexture.width, renderTexture.height), 0, 0);
tex.Apply();
var bytes = tex.EncodeToJPG();            // JPEG-encode the frame
var str = Convert.ToBase64String(bytes);  // then Base64 for transport
UnityEngine.Object.Destroy(tex);          // avoid leaking a texture every frame
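Since the original goal was feeding frames to Barracuda rather than Base64, the same Texture2D can also be turned into an input tensor directly. A minimal sketch, assuming Barracuda is installed and `modelAsset` references an imported ONNX model (the component and field names here are illustrative, not from the tutorial):

```csharp
using Unity.Barracuda;
using UnityEngine;

public class BarracudaRunner : MonoBehaviour
{
    public NNModel modelAsset;   // ONNX model asset imported into the project
    private IWorker worker;

    void Start()
    {
        var model = ModelLoader.Load(modelAsset);
        worker = WorkerFactory.CreateWorker(WorkerFactory.Type.Auto, model);
    }

    // Run inference on one captured frame.
    public Tensor Run(Texture2D frame)
    {
        // Barracuda can build a tensor directly from a texture (3 channels = RGB).
        using (var input = new Tensor(frame, channels: 3))
        {
            worker.Execute(input);
            return worker.PeekOutput();
        }
    }

    void OnDestroy()
    {
        worker?.Dispose();
    }
}
```

The output tensor layout depends on the model; dispose of it (or copy it) before the next frame's Execute call.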

Hope this helps you, and forgive my terrible English.

@daddym85
Author

Excuse me, I'm a newbie in Unity. I have the renderTexture on a RawImage object, and I'm confused about the "Graphics.Blit" call.
How can I get a Texture2D containing the camera feed frame by frame?
Thanks
Davide

@Xiink

Xiink commented Jan 20, 2022


Did you want to get the camera image, or only a Texture2D?
You can open the MR-WebRTC "VideoRender.cs" source code.
The function "TryProcessI420Frame" will help you.

You cannot use a single Texture2D to get the raw image, because in the tutorial the video frames coming from the webcam source are encoded in the I420 format. This format contains three separate components: the luminance Y, and the chroma U and V.
The video data for the Y, U, and V channels is uploaded into three separate Texture2D objects, and the material uses a custom shader to sample those textures and convert the YUV values to ARGB on the fly before rendering the quad with it.

The material output is the real raw image.
So I use "Graphics.Blit" to render the material result into a renderTexture object, and then I can get the raw image from it.
I use this to do face recognition.
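Put together, the per-frame capture described above might look like this. This is a sketch, not the exact code from the tutorial: `videoMaterial` stands for the material whose custom shader converts the three I420 plane textures to ARGB, and the resolution and component name are assumptions:

```csharp
using UnityEngine;

public class FrameCapture : MonoBehaviour
{
    // Material whose shader converts the I420 (Y/U/V) textures to ARGB.
    public Material videoMaterial;

    private RenderTexture renderTexture;

    void Start()
    {
        // Size should match the incoming video resolution.
        renderTexture = new RenderTexture(1280, 720, 0, RenderTextureFormat.ARGB32);
    }

    void LateUpdate()
    {
        // Render the material's converted output into the RenderTexture.
        Graphics.Blit(null, renderTexture, videoMaterial);

        // Read the pixels back into a Texture2D for further processing.
        Texture2D tex = new Texture2D(renderTexture.width, renderTexture.height, TextureFormat.RGB24, false);
        RenderTexture.active = renderTexture;
        tex.ReadPixels(new Rect(0, 0, renderTexture.width, renderTexture.height), 0, 0);
        tex.Apply();
        RenderTexture.active = null;

        // ... process `tex` here (e.g. hand it to an ML pipeline) ...

        UnityEngine.Object.Destroy(tex);
    }
}
```

ReadPixels is a synchronous GPU readback and is expensive; on a HoloLens it may be worth capturing at a reduced rate rather than every frame.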

@daddym85
Copy link
Author

Sorry for the delay, I want the current camera image on a Texture2D.
Thanks,
Davide

@Xiink

Xiink commented Feb 10, 2022

Sorry for the delay, I was on New Year's holiday.
This is my test project:
https://github.com/Xiink/UnityWebRTC-Take-Photo-Sample
You can try this example.

Note:
You need to change the MySignaler Local Peer Id and the Server-Http Server Address to your own Peer Id and your own IP.

