Building for iOS: "Shader Is Not Supported On This GPU (none of subshaders/fallbacks are suitable)" #15

Open
nickcastel50 opened this issue Jul 23, 2018 · 6 comments

Comments

@nickcastel50

Hello,

First, very cool project! I wanted to try building this for iOS, but I'm running into an issue. What I've done:

  1. Removed the NetworkCommunication code (as referenced in another issue, since we are not building for the Windows Store)
  2. Stripped the HoloLens gesture control code
  3. Hardcoded the IP and manually called the connect function

The project compiles fine, but I see a warning under the GSG Billboard shader object, saying: "Shader Is Not Supported On This GPU (none of subshaders/fallbacks are suitable)". Does anyone know the reason for this? I'm not quite sure why the iOS GPU wouldn't be able to properly render this shader.

Thanks for any information you can provide, and thanks again for the cool project!

@MarekKowalski
Owner

Hi,

I got an email from someone who had a similar issue previously; he used this shader:
https://github.com/googlearchive/tango-examples-unity/blob/master/UnityExamples/Assets/TangoPrefabs/Shaders/PointCloud.shader

And apparently he got it working on an iPhone 8.
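
If it helps, swapping it in should just be a matter of pointing the point cloud material at the new shader, either in the inspector or at runtime with something roughly like this (untested sketch; which object holds the material will depend on your scene):

// Untested sketch: re-point the point cloud material at the Tango shader.
// Assumes this script sits on the object that renders the point cloud.
Renderer rend = GetComponent<Renderer>();
rend.material.shader = Shader.Find("Tango/PointCloud");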

Let me know if it worked!

Marek

@nickcastel50
Author

Thank you @MarekKowalski! I will try this and report back. I think I still have an error to fix: I was getting a black screen on iOS after the Unity logo appeared, and the logs said "SocketException: Connection Refused". This is strange to me because on the PC where I am running the server I can see that the port connection is established, and looking at the Wireshark logs I can see the point cloud stream being sent to my phone's IP over port 48002.

I thought that even with the bad shader it should still render, albeit as a solid pink color? Maybe I'm wrong; I will let you know as soon as I try the shader you provided. Thanks again!

@nickcastel50
Author

nickcastel50 commented Jul 28, 2018

Okay @MarekKowalski, I got it rendering on the iPhone! Since I removed the gesture control code, I had to rotate the camera 180 degrees around the Y axis to see it (quick sketch below). That shader is still an issue, though: it only renders depth; it doesn't render color at all.
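
For reference, the rotation was nothing fancy, just something along these lines on the camera object (or the equivalent change in the inspector):

// Quick workaround: with the gesture control code removed, the point cloud
// ended up behind the camera, so spin the camera 180 degrees around Y.
void Start()
{
    transform.Rotate(0f, 180f, 0f);
}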

Also, one other thing: the background is black. Would you know how I would go about making the background transparent and using my phone's camera feed as the background, for some sort of AR experience? Thanks again for your help and for the project!
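
My guess is it would start with something like the snippet below (untested, not from the project; MakeBackgroundTransparent is just a name I made up), with the camera feed rendered behind it, but I'd appreciate a pointer if there's a better way:

// Guess: clear the point cloud camera to a fully transparent color so a
// camera-feed layer rendered behind it can show through.
void MakeBackgroundTransparent(Camera cam)
{
    cam.clearFlags = CameraClearFlags.SolidColor;
    cam.backgroundColor = new Color(0f, 0f, 0f, 0f);
}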

Edit: I found this thread, which has some great discussion of the issue: https://forum.unity.com/threads/ios-11-metal2-has-no-geometry-shader.499676/

So, in short, Apple's Metal API doesn't support geometry shaders. Admittedly I don't know much about writing shaders, so I am not sure how to get the same functionality from your geometry shader using compute shaders. If you could shed some light on this, it would be greatly appreciated!

Double edit: I modified the shader to use the color from the application. It doesn't do the triangle geometry yet; I will have to look into doing the geometry with compute code or procedural drawing, roughly along the lines of the sketch below.
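
From what I've read, the usual Metal-friendly replacement for a point-expanding geometry shader is to put the points into a ComputeBuffer and let a vertex shader expand each point into a quad using the vertex index. A rough, untested C# sketch of that direction (ProceduralPointRenderer, PointData, _PointBuffer and the "Custom/PointQuads" shader name are all placeholders I made up):

using UnityEngine;

public class ProceduralPointRenderer : MonoBehaviour
{
    // One streamed point: 12 bytes of position + 4 bytes of packed color.
    struct PointData
    {
        public Vector3 position;
        public Color32 color;
    }

    public Material pointMaterial;   // would use a hypothetical "Custom/PointQuads" shader
    ComputeBuffer pointBuffer;
    int pointCount;

    public void SetPoints(PointData[] points)
    {
        pointCount = points.Length;
        if (pointBuffer != null)
            pointBuffer.Release();
        pointBuffer = new ComputeBuffer(pointCount, 16);
        pointBuffer.SetData(points);
        pointMaterial.SetBuffer("_PointBuffer", pointBuffer);
    }

    void OnRenderObject()
    {
        if (pointBuffer == null)
            return;
        pointMaterial.SetPass(0);
        // Six vertices per point: the vertex shader would read
        // _PointBuffer[vertexId / 6] and offset each corner of the quad.
        Graphics.DrawProcedural(MeshTopology.Triangles, pointCount * 6);
    }

    void OnDestroy()
    {
        if (pointBuffer != null)
            pointBuffer.Release();
    }
}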

In the meantime, here is the modified (points-only) shader code:

// Created by Team Tango
// Modified by Nick Castellucci to preserve color on iOS for LiveScan3D
//
// Don't remove the following line. It is used to bypass the Unity
// upgrader change. This is necessary to make sure the shader
// continues to compile on Unity 5.2.
// UNITY_SHADER_NO_UPGRADE
Shader "Tango/PointCloud" {
    Properties {
        point_size("Point Size", Float) = 5.0
    }
    SubShader {
        Pass {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag

            #include "UnityCG.cginc"

            struct appdata
            {
                float4 vertex : POSITION;
                // Add color here because LiveScan3D transmits per-vertex color.
                float4 color : COLOR;
            };

            struct v2f
            {
                float4 vertex : SV_POSITION;
                float4 color : COLOR;
                float size : PSIZE;
            };

            // Left over from the original Tango shader; unused in this version.
            float4x4 depthCameraTUnityWorld;
            float point_size;

            v2f vert (appdata v)
            {
                v2f o;
                o.vertex = mul(UNITY_MATRIX_MVP, v.vertex);
                o.size = point_size;

                // Use the color passed in from the application instead of
                // deriving it from depth as the original shader did.
                o.color = v.color;
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                return i.color;
            }
            ENDCG
        }
    }
}
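
For context, the COLOR input above only works because the point cloud mesh already carries per-vertex colors from the stream; the receiving side just has to write them into the mesh, roughly like this (illustrative only, not the project's exact code; UpdatePointCloud is a name I made up):

// Illustrative only: push the streamed positions and colors into a mesh
// rendered as points, so the shader's COLOR semantic picks them up.
void UpdatePointCloud(Mesh mesh, Vector3[] vertices, Color32[] colors)
{
    mesh.Clear();
    mesh.vertices = vertices;
    mesh.colors32 = colors;
    int[] indices = new int[vertices.Length];
    for (int i = 0; i < vertices.Length; i++)
        indices[i] = i;
    mesh.SetIndices(indices, MeshTopology.Points, 0);
}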

@TheBricktop

How did you manage to autoconnect at start? It always freezes my editor when I try to do it like this.

@MarekKowalski
Owner

MarekKowalski commented Aug 18, 2018

@nickcastel50, congratulations on your progress, and sorry I did not answer; for some reason I missed your post completely :( Unfortunately, I know little about shaders as well, so I won't be able to help you there. Your project sounds quite interesting; please post more info about your progress, or maybe show a video.

@TheBricktop, have you made sure the live view window is open and showing the point cloud (on the same device as Unity)? If it is and it still does not work, please check your firewall settings.

Marek

@nickcastel50
Author

How did you manage to autoconnect at start? It always freezes my editor when I try to do it like this.

@TheBricktop: you can just modify the KeyboardInput.cs file in @MarekKowalski's project. Just call keyboardDone.Invoke(YOUR_IP_ADDRESS); in the Start() function. It works fine for testing.

Something like this:

void Start()
{
    // Skip the on-screen keyboard and connect to the server immediately.
    if (keyboardDone != null)
        keyboardDone.Invoke(YOUR_IP_ADDRESS);
}

void Update()
{
}
