diff --git a/content/learning-paths/smartphones-and-mobile/ray_tracing/_index.md b/content/learning-paths/smartphones-and-mobile/ray_tracing/_index.md index 1f9757d32..9de7cf122 100644 --- a/content/learning-paths/smartphones-and-mobile/ray_tracing/_index.md +++ b/content/learning-paths/smartphones-and-mobile/ray_tracing/_index.md @@ -1,17 +1,17 @@ --- -title: Introduction to Ray tracing with Vulkan on Android +title: Learn about Ray Tracing with Vulkan on Android minutes_to_complete: 120 -who_is_this_for: Vulkan developers who are familiar with rendering and are interested in adding ray tracing to their applications. +who_is_this_for: This Learning Path is for Vulkan developers who are familiar with rendering and are interested in deploying ray tracing in their applications. learning_objectives: - - Learn how the Vulkan ray tracing API works. - - Learn how to use ray tracing to implement realistic shadows, reflections and refractions. + - Describe how the Vulkan ray tracing API works. + - Describe how to use ray tracing to implement realistic shadows, reflections, and refractions. - Implement basic ray tracing effects in a Vulkan renderer. prerequisites: - - An appropriate Android device (e.g., Vivo X100) supporting the required Vulkan extensions. + - An appropriate Android device that supports the required Vulkan extensions (for example, Vivo X100). - Knowledge of the Vulkan API. - A Vulkan renderer. Most code is generic and should be easy to incorporate into any deferred PBR renderer. diff --git a/content/learning-paths/smartphones-and-mobile/ray_tracing/_review.md b/content/learning-paths/smartphones-and-mobile/ray_tracing/_review.md index 5caaeee33..61f2f758f 100644 --- a/content/learning-paths/smartphones-and-mobile/ray_tracing/_review.md +++ b/content/learning-paths/smartphones-and-mobile/ray_tracing/_review.md @@ -2,28 +2,17 @@ review: - questions: question: > - Which is the recommended way to do ray traversal on Arm GPUs? + Which of these is the recommended way to do ray traversal on Arm GPUs? answers: - - Ray Query - - Ray Tracing Pipeline + - Ray query + - Ray tracing pipeline correct_answer: 1 explanation: > Ray query is the most efficient way to implement ray traversal on Arm GPUs. - questions: question: > - Select the correct statement: - answers: - - A TLAS contains the model geometry - - A BLAS uses instances to group other TLASes - - A BLAS contains the model geometry - correct_answer: 3 - explanation: > - BLASes (Bottom-Level Acceleration Structures) contain the actual geometry data, usually as triangles. TLASes (Top-Level Accelerations Structures) contain other BLASes and use instances to group and link them with other properties. - - - questions: - question: > - How should we design the acceleration structure? + When designing acceleration structure, which of the following statements is true? answers: - Empty space does not matter, reduce BLAS overlap. - Minimize empty space and minimize overlap. @@ -32,53 +21,18 @@ review: correct_answer: 2 explanation: > On ray tracing, the quality of your acceleration structure can have a huge performance impact. Try to reduce overlap across BLASes and reduce empty space inside a BLAS as much as possible. - - - questions: - question: > - Is bindless necessary for ray tracing? - answers: - - We do not need it for shadows, but it is needed for reflections and refractions. - - Technically no, but we need it to implement our effects. - - It is not needed, but it makes implementing our effects a lot easier. 
- correct_answer: 3 - explanation: > - Bindless or descriptor indexing is independent of ray tracing. It is possible to implement our ray tracing effects without using it, however it will make it very easy and simple to access the data of the intercepted objects. This helps a lot when implementing reflections and refractions. - - questions: question: > Can reflections handle objects outside the screen? answers: - Ray tracing reflections can reflect objects not on the screen but Screen Space Reflections can only reflect objects on the screen. - Both Screen Space Reflections and ray tracing reflections can reflect objects not on the screen. - - Neither Screen Space Reflections or ray tracing reflections can reflect objects on the screen. + - Neither Screen Space Reflections nor ray tracing reflections can reflect objects on the screen. - Ray tracing reflections can only reflect objects on the screen but Screen Space Reflections can reflect objects not on the screen. correct_answer: 1 explanation: > Screen Space Reflections obtains the information from the G-buffer so it can only reflect object currently on the screen. Ray tracing reflections offer better quality since they can handle any object in the acceleration structure, including objects not on the screen. - - questions: - question: > - Which sentence is true for our ray tracing effects? - answers: - - In ray tracing shadows and reflections we use bindless to retrieve the material of the intercepted object and illuminate it. - - In ray tracing shadows we do not care about which objects we hit, only whether we hit an object or not. - - In ray tracing reflections we can use the flag gl_RayFlagsTerminateOnFirstHitEXT. - correct_answer: 2 - explanation: > - In ray tracing reflections we need to know which object we are hitting to retrieve its material and illuminate it. In ray tracing shadows we do not care which exact object we hit, just whether we hit an object at all. This allows us to enable the gl_RayFlagsTerminateOnFirstHitEXT optimization for shadows but not for reflections. - - - - questions: - question: > - What is the difference between transparency and refraction? - answers: - - Light goes through a transparent object in a straight line, but refractions bend the light. - - None, refractions are how transparency is implemented in ray tracing. - - Refractions are a faster way to implement transparency. - correct_answer: 1 - explanation: > - In a transparent material, light goes through in a straight line, so light rays enter and exit the material in the same direction. Refractions bend the light inside the object, so the ray exits the object in a different direction. We can use refractions to simulate ray tracing transparency, however there are simpler, more efficient ways. - # ================================================================================ # FIXED, DO NOT MODIFY # ================================================================================ diff --git a/content/learning-paths/smartphones-and-mobile/ray_tracing/rt01_introduction.md b/content/learning-paths/smartphones-and-mobile/ray_tracing/rt01_introduction.md index 10ffc5069..0f8e3eea1 100644 --- a/content/learning-paths/smartphones-and-mobile/ray_tracing/rt01_introduction.md +++ b/content/learning-paths/smartphones-and-mobile/ray_tracing/rt01_introduction.md @@ -1,34 +1,32 @@ --- -title: "Introduction: What is ray tracing?" +title: "What is ray tracing?" weight: 2 ### FIXED, DO NOT MODIFY layout: learningpathall --- -## Introduction: What is ray tracing? 
-
-Ray tracing is a technique to render images using ray casting methods to simulate light transport. Traditionally, this is not suitable for real time or games, and is mostly used by CGI or movies.
+Ray tracing is a technique to render images using ray casting methods to simulate light transport. Historically, this was not suitable for real-time computer graphics or gaming, and was mostly used for CGI in movies.
Vulkan includes a new set of extensions for ray intersection tests, designed to facilitate ray tracing.
-The API allows applications to define a ray, setting an origin and a direction and then test this ray against the scene geometry. By default, the API will return the closest hit that the ray will intercept in our scene.
+The API allows applications to define a ray, setting an origin and a direction, and then test this ray against the scene geometry. By default, the API returns the closest hit that the ray intercepts in the scene.
-Vulkan's ray tracing API is very flexible and allows creative uses like collision detection in physics simulations. However, the main use case for ray tracing is rendering. This is because ray tracing allows applications to simulate the behavior of light, in a way resembling physical reality.
+Vulkan's ray tracing API is very flexible and enables creative use, such as collision detection in physics simulations. The main use case for ray tracing, however, is *rendering*. This is because ray tracing allows applications to simulate the behavior of light in a way that resembles physical reality.
-In the real world a light source emits light rays in all directions, these rays interact with multiple objects. When a ray intersects an object, the light will interact with it, causing the object to absorb and reflect certain amounts of light.
+In the real world, a light source emits light rays in all directions. These rays interact with multiple objects. When a ray intersects an object, the object absorbs and reflects certain amounts of light.
-Traditionally, developers render games using rasterization. Rasterization works by transforming the geometry into triangle primitives and projecting them onto the screen. GPUs then use a depth buffer to resolve the visibility and decide which pixels are covered by each triangle.
+Traditionally, developers render games using *rasterization*. Rasterization works by transforming the geometry into triangle primitives and projecting them onto the screen. GPUs then use a depth buffer to resolve the visibility and decide which pixels are covered by each triangle.
-With ray tracing, we can instead use path tracing. A path tracer does not need to use rasterization, instead it can launch rays from the camera. These rays will bounce around the scene until they produce a final image, resolving visibility using the closest hit. In the real world, rays travel from a light until they reach the camera, but this is extremely inefficient since most rays will not reach our eyes, this is why in rendering we launch rays in the reverse order, starting from the camera.
+With ray tracing, you can instead use *path tracing*. A path tracer does not need to use rasterization; instead, it can launch rays from the camera. These rays bounce around the scene until they produce a final image, resolving visibility using the closest hit. In the real world, rays travel from a light until they reach the camera, but this is extremely inefficient as most rays do not reach our eyes.
This is why in rendering you can launch rays in the reverse order, starting from the camera. -Path tracing is extremely costly since we will need to launch thousands of rays per pixel to produce a non-noisy image. Rendering a frame in real time using path tracing is not feasible even on desktop high end GPUs. The common approach is to have a hybrid renderer, with a traditional rasterization pass to resolve visibility and compute the G-buffer, and then implement each ray tracing effect as a separate post-process. +Path tracing is extremely costly as it requires thousands of rays per pixel to be launched to produce a non-noisy image. Rendering a frame in real time using path tracing is just not feasible, even on desktop high-end GPUs. The common solution therefore is to have a hybrid renderer, with a traditional rasterization pass to resolve visibility and compute the G-buffer, and then implement each ray tracing effect as a separate post-process. {{< tabpane >}} - {{< tab header="Example 1: RT ON" title="Example 1: ray tracing ON" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/city_rt_on.png">}}{{< /tab >}} - {{< tab header="Example 1: RT OFF" title="Example 1: ray tracing OFF" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/city_rt_off.png">}}{{< /tab >}} - {{< tab header="Example 2: RT ON" title="Example 2: ray tracing ON" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/bonza_rt_on.png">}}{{< /tab >}} - {{< tab header="Example 2: RT OFF" title="Example 2: ray tracing OFF" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/bonza_rt_off.png">}}{{< /tab >}} - {{< tab header="Example 3: RT ON" title="Example 3: ray tracing ON" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/immortalis_rt_on.png">}}{{< /tab >}} - {{< tab header="Example 3: RT OFF" title="Example 3: ray tracing OFF" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/immortalis_rt_off.png">}}{{< /tab >}} + {{< tab header="Example 1: RT ON" title="Example 1: Ray tracing ON" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/city_rt_on.png">}}{{< /tab >}} + {{< tab header="Example 1: RT OFF" title="Example 1: Ray tracing OFF" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/city_rt_off.png">}}{{< /tab >}} + {{< tab header="Example 2: RT ON" title="Example 2: Ray tracing ON" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/bonza_rt_on.png">}}{{< /tab >}} + {{< tab header="Example 2: RT OFF" title="Example 2: Ray tracing OFF" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/bonza_rt_off.png">}}{{< /tab >}} + {{< tab header="Example 3: RT ON" title="Example 3: Ray tracing ON" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/immortalis_rt_on.png">}}{{< /tab >}} + {{< tab header="Example 3: RT OFF" title="Example 3: Ray tracing OFF" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/immortalis_rt_off.png">}}{{< /tab >}} {{< /tabpane >}} \ No newline at end of file diff --git a/content/learning-paths/smartphones-and-mobile/ray_tracing/rt02_setup.md b/content/learning-paths/smartphones-and-mobile/ray_tracing/rt02_setup.md index aeccb8c2b..983eb1277 100644 --- a/content/learning-paths/smartphones-and-mobile/ray_tracing/rt02_setup.md +++ b/content/learning-paths/smartphones-and-mobile/ray_tracing/rt02_setup.md @@ -1,5 +1,5 @@ --- -title: "Setup: Enabling ray tracing" +title: "Setup: enabling ray tracing" weight: 3 ### FIXED, DO 
NOT MODIFY @@ -8,13 +8,13 @@ layout: learningpathall ## Setup: Enabling ray tracing -Vulkan uses the same API for ray tracing on PC and mobile devices. This makes it extremely easy to implement and test our ray tracing effects on PC and deploy them on mobile. Porting existing ray tracing effects from PC to mobile should also be simple. +Vulkan uses the same API for ray tracing on PC and mobile devices. This makes it extremely easy to implement and test ray tracing effects on PC and deploy them on mobile. Porting existing ray tracing effects from PC to mobile should also be simple. -Arm® Mali™-G7-series GPUs after the [Arm® Mali™-G715](https://developer.arm.com/Processors/Mali-G715) might or might not support for ray tracing, depending on the phone. Immortalis GPUs like [Arm® Immortalis-G715](https://developer.arm.com/Processors/Immortalis-G715), [Arm® Immortalis-G720](https://developer.arm.com/Processors/Immortalis-G720) or [Arm® Immortalis-G925](https://developer.arm.com/Processors/Immortalis-G925) always support ray tracing. These GPUs are available in multiple devices already in the market. Moreover, ray tracing is a promising modern technology, so multiple GPU vendors are supporting the API. Most recent high end Android smartphones support ray tracing, making it a feature game developers can rely on. +Arm® Mali™-G7-series GPUs after the [Arm® Mali™-G715](https://developer.arm.com/Processors/Mali-G715) might or might not support ray tracing, depending on the phone model. Immortalis GPUs such as [Arm® Immortalis-G715](https://developer.arm.com/Processors/Immortalis-G715), [Arm® Immortalis-G720](https://developer.arm.com/Processors/Immortalis-G720), or [Arm® Immortalis-G925](https://developer.arm.com/Processors/Immortalis-G925) support ray tracing. These GPUs are available in multiple devices already in the market. Moreover, ray tracing is a promising modern technology, so multiple GPU vendors support the API. Most recent high-end Android smartphones support ray tracing, making it a feature game developers can rely on. -Vulkan offers ray tracing as a series of extensions, making it easy to query for support and to enable it. The most relevant extensions are: `VK_KHR_acceleration_structure`, `VK_KHR_ray_query` and `VK_KHR_ray_tracing_pipeline`. +Vulkan offers ray tracing as a series of extensions, making it easy to query for support and to enable it. The most relevant extensions are: `VK_KHR_acceleration_structure`, `VK_KHR_ray_query`, and `VK_KHR_ray_tracing_pipeline`. -We can query the physical device to check if it supports the necessary extensions. Here is a helper function: +You can query the physical device to check if it supports the extensions. 
Here is a helper function: ``` cpp uint32_t device_extension_count; @@ -37,7 +37,7 @@ auto is_extension_supported = }; ``` -We can use it to query the ray tracing extensions' features: +You can use it to query the ray tracing extensions' features: ``` cpp VkPhysicalDeviceFeatures2 features2{VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_FEATURES_2}; @@ -64,7 +64,7 @@ if (is_extension_supported(VK_KHR_RAY_TRACING_PIPELINE_EXTENSION_NAME)) vkGetPhysicalDeviceFeatures2(physical_device, &features2); ``` -Finally, enable the extensions when creating the logical device: +Finally, you can enable the extensions when creating the logical device: ``` cpp std::vector enabled_extensions{}; diff --git a/content/learning-paths/smartphones-and-mobile/ray_tracing/rt03_ray_traversal.md b/content/learning-paths/smartphones-and-mobile/ray_tracing/rt03_ray_traversal.md index b77db4936..c4b1b6cae 100644 --- a/content/learning-paths/smartphones-and-mobile/ray_tracing/rt03_ray_traversal.md +++ b/content/learning-paths/smartphones-and-mobile/ray_tracing/rt03_ray_traversal.md @@ -1,36 +1,42 @@ --- -title: "Ray traversal: Ray Query vs Ray Tracing Pipeline" +title: "Ray traversal: ray tracing pipeline versus ray query" weight: 4 ### FIXED, DO NOT MODIFY layout: learningpathall --- -## Ray traversal: Ray Query vs Ray Tracing Pipeline - To control ray traversal, and launch rays, Vulkan offers two options: -### Ray Tracing Pipeline +### Ray tracing pipeline -The first option is to use `VK_KHR_ray_tracing_pipeline`. This is an opaque and more driver managed approach, it uses a new set of shader stages to allow users to control ray traversal. +The first option is to use `VK_KHR_ray_tracing_pipeline`. This is an opaque and more driver-managed approach. It uses a new set of shader stages to allow users to control ray traversal. -First, we use a `Ray Generation` shader to define our rays and set the origin and direction of each ray that we wish to trace. As the GPU traverses the acceleration structure, it may call one of two shaders. If it intersects user-defined geometry it will call the `Intersection` shader. If it intersects non-opaque geometry, it will call an `Any-Hit` shader to decide if the hit should be considered. +First, you can use a `Ray Generation` shader to define rays and set the origin and direction of each ray that you want to trace. As the GPU traverses the acceleration structure, it may call one of two shaders. If it intersects user-defined geometry, it will call the `Intersection` shader. If it intersects non-opaque geometry, it will call an `Any-Hit` shader to decide if the hit should be considered. -Once ray traversal is complete, one of two things can happen: If the ray does not hit anything in the scene, we call the `Miss` shader. A common use case of a `Miss` shader is to sample an environment map containing the sky. +Once ray traversal is complete, one of two things can happen. If the ray does not hit anything in the scene, you call the `Miss` shader. A common use case of a `Miss` shader is to sample an environment map containing the sky. -If instead we have a confirmed hit, the GPU will invoke our `Closest-Hit` shader. In this shader we can then determine what data we need from the object that we intersected, and return that data to the `Ray Generation` shader. For example, we may illuminate the object and later store the generated color into an output image. +If instead you have a confirmed hit, the GPU invokes our `Closest-Hit` shader. 
In this shader, you can then determine what data you need from the object that you intersected, and return that data to the `Ray Generation` shader. For example, you might illuminate the object and later store the generated color into an output image.
![Diagram of Ray Tracing Pipeline #center](images/RTPipeline_diagram.svg "Diagram of Ray Tracing Pipeline")
-### Ray Query
+### Ray query
+
+The `VK_KHR_ray_query` extension allows you to use ray tracing from existing shader stages. Usually, you will launch rays from `Compute` or `Fragment` shaders, but it is also possible to launch rays from other stages, such as `Vertex` or even `Ray Generation`.
-The `VK_KHR_ray_query` extension allows us to use ray tracing from existing shader stages. Usually, we will launch our rays from `Compute` or `Fragment` shaders, but it is also possible to launch our rays from other stages like `Vertex` or even `Ray Generation`.
+This makes it easy to add ray tracing to existing shaders, but it means that you need to manage the ray traversal. However, in most cases, ray traversal is simple to manage:
-This makes it very easy to add ray tracing to our existing shaders, but it will mean that we will need to manage the ray traversal ourselves. However, ray traversal is very simple to manage in most cases. In the shader, we will first define a ray using `rayQueryInitializeEXT` to set its ray parameters. Then we can start ray traversal using `rayQueryProceedEXT`. `rayQueryProceedEXT` will return false once ray traversal is complete, in non-opaque geometry we will need to use `rayQueryConfirmIntersectionEXT` to confirm non-opaque candidates. If we only have opaque geometry, we recommend to just call `rayQueryProceedEXT` and ignore its return value.
+* In the shader, you first define a ray using `rayQueryInitializeEXT` to set its ray parameters.
+* Then you can start ray traversal using `rayQueryProceedEXT`.
+* `rayQueryProceedEXT` returns false once ray traversal is complete.
+* In non-opaque geometry, you need to use `rayQueryConfirmIntersectionEXT` to confirm non-opaque candidates.
+* If you only have opaque geometry, you can call `rayQueryProceedEXT` and ignore its return value.
![Diagram of Ray Query #center](images/RQuery_diagram.svg "Diagram of Ray Query")
-Once the ray traversal is complete, we would use `rayQueryGetIntersectionTypeEXT` to query if we have hit something or missed. If we hit something, we can use other ray query functions to query some data and use this result in our shader:
+Once the ray traversal is complete, you can use `rayQueryGetIntersectionTypeEXT` to query whether you have hit something or missed.
+
+If you hit something, you can use other ray query functions to query the data and use the result in your shader:
``` glsl
layout(set = 0, binding = 10) uniform accelerationStructureEXT top_level_acceleration_structure;
@@ -52,7 +58,7 @@ void trace_ray()
float ray_t_max = 1e24; // Will reject candidates if the distance is greater. Useful for point lights with a radius.
rayQueryInitializeEXT(rayQuery, top_level_acceleration_structure, flags, cull_mask, ray_origin, ray_t_min, ray_direction, ray_t_max);
- // The geometry is opaque so we do not need to check the return value
+ // The geometry is opaque so you do not need to check the return value
rayQueryProceedEXT(rayQuery);
const bool committed_intersection = true;
@@ -75,6 +81,6 @@ void trace_ray()
}
```
-### Which one to use
+### Do I use ray query or the ray tracing pipeline?
-
-For most simple examples, Ray Query usually offers better performance, and it is what we recommend you use to implement ray tracing effects. Right now, Mali devices in the market only offer Ray Query support. On Arm GPUs, Ray Query is hardware accelerated for both `Compute` and `Fragment` shaders, but we recommend using `Fragment` shaders, to benefit from frame buffer compression. We recommend reading our [AFBC best practices](https://developer.arm.com/documentation/101897/latest/Buffers-and-textures/AFBC-textures) to learn more.
+For most simple examples, ray query usually offers better performance, and it is the recommended way to implement ray tracing effects. At the time of writing, Mali devices in the market only offer ray query support. On Arm GPUs, ray query is hardware accelerated for both `Compute` and `Fragment` shaders, but it is recommended to use `Fragment` shaders, to benefit from frame buffer compression. For further information, see [AFBC best practices](https://developer.arm.com/documentation/101897/latest/Buffers-and-textures/AFBC-textures).
diff --git a/content/learning-paths/smartphones-and-mobile/ray_tracing/rt04_acceleration_structure.md b/content/learning-paths/smartphones-and-mobile/ray_tracing/rt04_acceleration_structure.md
index 175dc2711..af98c63d4 100644
--- a/content/learning-paths/smartphones-and-mobile/ray_tracing/rt04_acceleration_structure.md
+++ b/content/learning-paths/smartphones-and-mobile/ray_tracing/rt04_acceleration_structure.md
@@ -8,31 +8,31 @@ layout: learningpathall
## Acceleration structure
-For ray tracing, the first thing that we need is to create an acceleration structure to represent our scene.
+For ray tracing, the first thing you need to do is create an acceleration structure to represent your scene.
-An acceleration structure is an optimised data structure for fast ray intersection tests. It organizes the geometry so that we can efficiently narrow down what each ray intersects. Vulkan represents them using `VK_KHR_acceleration_structure`, and although the actual structure is opaque and defined by the implementation, it is usually implemented as a treelike structure, and the API gives us some options to control the topology and keep it balanced.
+An acceleration structure is an optimized data structure for fast ray intersection tests. It organizes the geometry so that you can efficiently narrow down what each ray intersects. Vulkan represents acceleration structures using `VK_KHR_acceleration_structure`, and although the actual structure is opaque and defined by the implementation, it is usually implemented as a tree-like structure, and the API gives you some options to control the topology and keep it balanced.
The API makes a distinction between *Bottom-Level Acceleration Structures* (BLAS) and *Top-Level Accelerations Structures* (TLAS).
-*Bottom-Level Acceleration Structures* contain the actual geometry data, usually as triangles, but they could also be mathematically defined models. The actual implementation of a BLAS is opaque to users, but the driver usually organizes this data in a Bounding Volume Hierarchy (BVH), so that when we cast a ray, we can minimize the number of triangle intersection checks.
+*Bottom-Level Acceleration Structures* contain the actual geometry data, usually as triangles, but they can also be mathematically defined models. The actual implementation of a BLAS is opaque to users, but the driver usually organizes this data in a Bounding Volume Hierarchy (BVH), so that when you cast a ray, you can minimize the number of triangle intersection checks.
-*Top-Level Accelerations Structures* contain Bottom-Level Acceleration Structures. Top-Level Acceleration Structures use instances to group BLASes and link them with other properties. These properties include a custom ID, which we can use to define a material; or a transform matrix, which we can use to efficiently update animations.
+*Top-Level Acceleration Structures* contain Bottom-Level Acceleration Structures. Top-Level Acceleration Structures use instances to group BLASes and link them with other properties. These properties include a custom ID, which you can use to define a material, or a transform matrix, which you can use to efficiently update animations.
![Scheme of an Acceleration structure #center](images/tlas_blas_diagram.png "Scheme of an Acceleration structure")
### Acceleration structure best practice
-Building BLASes and TLASes is relatively expensive, so we recommend trying to reduce the number of acceleration structures builds and updates.
+As building BLASes and TLASes is relatively expensive, it is best to try to reduce the number of acceleration structure builds and updates.
-Skinned animations are costly in ray tracing as they require us to continuously update the BLASes of these meshes. We recommend limiting the number of skinned objects in your scenes. Depending on your use case you might consider not updating all the acceleration structures every frame. It is usually possible to reuse the same old acceleration structure across a few frames, without noticeable artifacts. Similarly, we can implement some kind of heuristic to reduce the update frequency of objects too far from the camera, although this can be tricky since most effects still need to consider objects outside the view frustum.
+Skinned animations are costly in ray tracing as they require you to continuously update the BLASes of the meshes. It is recommended that you limit the number of skinned objects in your scenes. Depending on your use case, you might consider not updating all the acceleration structures in every frame. It is usually possible to reuse the same acceleration structure across a few frames, without noticeable artifacts. Similarly, you can implement some kind of heuristic to reduce the update frequency of objects too far from the camera, although this can be tricky since most effects still need to consider objects outside the view frustum.
-When building a new BLAS, we generally recommend using `VK_BUILD_ACCELERATION_STRUCTURE_PREFER_FAST_TRACE_BIT_KHR` for static geometry. `VK_BUILD_ACCELERATION_STRUCTURE_PREFER_FAST_BUILD_BIT_KHR` can be faster in certain situations, like dynamic geometry or streaming. If updating BLASes becomes problematic, one should consider testing different options.
+When building a new BLAS, it is generally recommended to use `VK_BUILD_ACCELERATION_STRUCTURE_PREFER_FAST_TRACE_BIT_KHR` for static geometry. `VK_BUILD_ACCELERATION_STRUCTURE_PREFER_FAST_BUILD_BIT_KHR` can be faster in certain situations, such as dynamic geometry or streaming. If updating BLASes becomes problematic, you can consider testing alternative options.
-When building new TLASes we recommend using `PREFER_FAST_BUILD`. For TLASes `PREFER_FAST_TRACE` can be useful for a static scene without updates; however, the build itself is more expensive. In general, BLASes are in use for a long time, but TLASes are only in use for a few frames. This means that it is usually not worth it to build your TLAS with `PREFER_FAST_TRACE`.
+When building new TLASes, it is recommended to use `PREFER_FAST_BUILD`. For TLASes, `PREFER_FAST_TRACE` can be useful for a static scene without updates; however, the build itself is more expensive. In general, BLASes are in use for a long time, but TLASes are only in use for a few frames. This means that it is usually not efficient to build your TLAS with `PREFER_FAST_TRACE`.
Similarly, building a new acceleration structure is significantly slower than updating or refitting an existing one. Try to avoid creating new acceleration structures from scratch, and if possible, set the `srcAccelerationStructure` field. Setting a previous version of the acceleration structure as a base for the new acceleration structure will be significantly faster since it triggers an update or refit instead of a build. Remember to also set `VK_BUILD_ACCELERATION_STRUCTURE_ALLOW_UPDATE_BIT_KHR` when creating the acceleration structure.
-In [our ray tracing best practices](https://developer.arm.com/documentation/101897/latest/Ray-tracing/Acceleration-structures) we recommend to use `PREFER_FAST_TRACE` for static BLASes. For TLASes and dynamic BLASes we recommend `PREFER_FAST_BUILD`, usually in combination with `ALLOW_UPDATE`.
+In [ray tracing best practices](https://developer.arm.com/documentation/101897/latest/Ray-tracing/Acceleration-structures), it is recommended that you use `PREFER_FAST_TRACE` for static BLASes. For TLASes and dynamic BLASes, `PREFER_FAST_BUILD` is recommended, usually in combination with `ALLOW_UPDATE`.
``` cpp
void create_blas(bool allow_update, std::vector &&geometries, std::vector &&geometry_primitive_counts, VkAccelerationStructureKHR prev_as = VK_NULL_HANDLE)
@@ -159,16 +159,16 @@ void build_as(VkAccelerationStructureBuildGeometryInfoKHR as_geom_info, std::vec
}
```
-Finally, the quality of your geometry can have a huge impact when building your acceleration structure. Try to reduce the number of vertices as much as possible. Grouping objects close together into the same BLAS can also have a positive impact. At the same time, try to reduce overlapping across BLASes and reduce empty space inside a BLAS as much as possible. Empty space will increase the size of a BLAS, which might cause extra AABB hits. Similarly, if BLASes overlap, we might need to evaluate triangles on multiples BLASes.
+Finally, the quality of your geometry can have a huge impact when building your acceleration structure. Try to reduce the number of vertices as much as possible. Grouping objects close together into the same BLAS can also have a positive impact. At the same time, try to reduce overlapping across BLASes and reduce empty space inside a BLAS as much as possible. Empty space will increase the size of a BLAS, which might cause extra AABB hits. Similarly, if BLASes overlap, you might need to evaluate triangles on multiple BLASes.
{{< tabpane >}}
{{< tab header="Example of a scene with a bad acceleration structure" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/city_scene_as_overlapping.png" title="Example of a scene with a bad acceleration structure: Significant overlap and empty space. This will result in unnecessary triangle intersections." >}}{{< /tab >}}
{{< tab header="Example of a scene with a good acceleration structure" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/city_scene_as_separate.png" title="Example of a scene with a good acceleration structure: Minimum overlap and empty space." >}}{{< /tab >}}
{{< /tabpane >}}
-When tracing a scene, it is common to have opaque and non-opaque objects. Non-opaque objects will invoke the `Any-Hit` shader and require extra evaluation in ray query. Vulkan allows to use the flags `VK_GEOMETRY_INSTANCE_FORCE_OPAQUE_BIT_KHR` and `VK_GEOMETRY_OPAQUE_BIT_KHR` to mark specific instances and geometries as opaque. We recommend using the appropriate flags to mark relevant geometry as opaque.
+When tracing a scene, it is common to have opaque and non-opaque objects. Non-opaque objects will invoke the `Any-Hit` shader and require extra evaluation in ray query. Vulkan allows you to use the flags `VK_GEOMETRY_INSTANCE_FORCE_OPAQUE_BIT_KHR` and `VK_GEOMETRY_OPAQUE_BIT_KHR` to mark specific instances and geometries as opaque. It is recommended that you use the appropriate flags to mark relevant geometry as opaque.
-Vulkan also offers shader flags, they will affect all objects, but they offer better performance. You can use `gl_RayFlagsCullNoOpaqueEXT` to cull non-opaque geometries or `gl_RayFlagsOpaqueEXT`, to mark all objects as opaque. To enable compiler optimizations, you should also use `gl_RayFlagsSkipAABBEXT` and ensure the flags can be resolved statically. If possible, prefer `gl_RayFlagsCullNoOpaqueEXT` it might improve shader performance.
+Vulkan also offers shader flags. These affect all objects, but they offer better performance. You can use `gl_RayFlagsCullNoOpaqueEXT` to cull non-opaque geometries, or `gl_RayFlagsOpaqueEXT` to mark all objects as opaque. To enable compiler optimizations, you should also use `gl_RayFlagsSkipAABBEXT` and ensure the flags can be resolved statically. If possible, prefer `gl_RayFlagsCullNoOpaqueEXT`, as it might improve shader performance.
```glsl
// [GOOD] Compiler optimizations enabled
diff --git a/content/learning-paths/smartphones-and-mobile/ray_tracing/rt05_bindless.md b/content/learning-paths/smartphones-and-mobile/ray_tracing/rt05_bindless.md
index 41d8d1879..0187f9cbb 100644
--- a/content/learning-paths/smartphones-and-mobile/ray_tracing/rt05_bindless.md
+++ b/content/learning-paths/smartphones-and-mobile/ray_tracing/rt05_bindless.md
@@ -8,13 +8,13 @@ layout: learningpathall
## Bindless materials
-Bindless resources is implemented by `VK_EXT_descriptor_indexing` and has been a core feature of Vulkan since version 1.2. It is independent of ray tracing and technically it is possible to implement our ray tracing effects without using it, however it will make it very easy and simple to access the data of the intercepted objects.
+Bindless resources are implemented by `VK_EXT_descriptor_indexing` and have been a core feature of Vulkan since version 1.2. They are independent of ray tracing, and technically it is possible to implement ray tracing effects without using them; however, they make it much easier to access the data of the intercepted objects.
-Descriptor Indexing allows applications to define arrays of buffers and textures that shaders can access using dynamic and non-uniform indices. This extension will then enable us to keep all our resources organized in look-up tables.
+Descriptor Indexing allows applications to define arrays of buffers and textures that shaders can access using dynamic and non-uniform indices. This extension then enables you to keep all your resources organized in look-up tables.
-In a traditional rasterized renderer, applications bind all the necessary information of an object to each draw call, allowing us to retrieve the material, position, texture, etc. that is needed to draw it. However, when we use ray tracing, we cannot follow this approach to identify the materials of the hit object. Instead, we use a unique identifier to retrieve the necessary information from look-up tables.
+In a traditional rasterized renderer, applications bind all the necessary information of an object to each draw call, allowing you to retrieve the material, position, and texture needed to draw it. However, when you use ray tracing, this approach will not enable you to identify the materials of the hit object. Instead, you must use a unique identifier to retrieve the necessary information from look-up tables.
-The ray query API allows us to obtain an instance ID, a geometry ID, and a primitive ID from the acceleration structure that we can use to query these tables.
+The ray query API allows you to obtain an instance ID, a geometry ID, and a primitive ID from the acceleration structure that you can use to query these tables.
``` glsl
#extension GL_EXT_ray_query : require
@@ -25,7 +25,7 @@ uint intersection_primitive_id = rayQueryGetIntersectionPrimitiveIndexEXT(rayQue
uint intersection_geometry_global_index = intersection_custom_instance_id + intersection_geometry_id;
```
-As we build the acceleration structure, we keep track of how many geometries each instance contains, and make sure that everything can be identified uniquely using the sum of the instance custom ID and the geometry ID. We build a set of look-up tables so that with this unique identifier we can access everything that we need to render the intersected triangle.
+As you build the acceleration structure, you can keep track of how many geometries each instance contains, and make sure that everything can be identified uniquely using the sum of the instance custom ID and the geometry ID. You can build a set of look-up tables so that with this unique identifier you can access everything that you need to render the intersected triangle:
``` cpp
uint32_t custom_index = 0;
@@ -42,7 +42,7 @@ for (const auto &scene_blas : get_scene_blases())
}
```
-With the Primitive ID we can access the relevant index buffer and retrieve all the vertex attributes, including texture coordinates. The API also provides the barycentric coordinates of the intersection point. Using this we can mimic a `Fragment` shader and interpolate the attributes, like for example the UV coordinates.
+With the Primitive ID you can access the relevant index buffer and retrieve all the vertex attributes, including texture coordinates. The API also provides the barycentric coordinates of the intersection point. Using this, you can mimic a `Fragment` shader and interpolate the attributes, for example the UV coordinates.
``` glsl
#extension GL_EXT_nonuniform_qualifier : require
@@ -112,9 +112,9 @@ vec2 get_intersection_uv(in rayQueryEXT rayQuery)
}
```
-Note that to optimize memory consumption, we have compressed the mesh and material identifier in a single `uint`.
+Note that to optimize memory consumption, the mesh and material identifiers are compressed into a single `uint`.
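+As an illustration, one way to pack and unpack such an identifier is shown below. The 16/16 bit split and the helper names are assumptions for this sketch; the real layout depends on how many meshes and materials your renderer needs to address.
+
+``` glsl
+// Hypothetical packing scheme: mesh index in the upper 16 bits,
+// material index in the lower 16 bits of a single uint.
+uint pack_mesh_material(uint mesh_index, uint material_index)
+{
+    return (mesh_index << 16) | (material_index & 0xFFFFu);
+}
+
+void unpack_mesh_material(uint packed_id, out uint mesh_index, out uint material_index)
+{
+    mesh_index     = packed_id >> 16;
+    material_index = packed_id & 0xFFFFu;
+}
+```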
-
-Once we have our UV coordinates, we can retrieve the material ID from a different table and use it to index into our texture arrays. We also keep separate bindless arrays, each one containing the different textures we will need to use. Finally, we can then use the UV coordinates to sample the textures and compute the final color.
+Once you have your UV coordinates, you can retrieve the material ID from a different table and use it to index into your texture arrays. You also keep separate bindless arrays, each one containing the different textures you will need to use. Finally, you can then use the UV coordinates to sample the textures and compute the final color.
``` glsl
layout(set = BINDLESS_MATERIALS_SET, binding = binding_index) uniform texture2D base_color_textures[BINDLESS_MAX_MATERIALS];
diff --git a/content/learning-paths/smartphones-and-mobile/ray_tracing/rt06_reflections.md b/content/learning-paths/smartphones-and-mobile/ray_tracing/rt06_reflections.md
index 05ffef4fc..f6b37af4e 100644
--- a/content/learning-paths/smartphones-and-mobile/ray_tracing/rt06_reflections.md
+++ b/content/learning-paths/smartphones-and-mobile/ray_tracing/rt06_reflections.md
@@ -8,39 +8,39 @@ layout: learningpathall
## Reflections
-Illuminating and lighting a scene is a complex topic. When a light hits an object part of that light is reflected; this is known as specular reflection.
+Illuminating and lighting a scene is a complex topic. When a light hits an object, part of that light is reflected; this is known as specular reflection.
{{< tabpane >}}
- {{< tab header="Example 1: ON" title="Example 1: reflections ON" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/city_reflections_on.png">}} {{< /tab >}}
- {{< tab header="Example 1: OFF" title="Example 1: reflections OFF" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/city_reflections_off.png">}} {{< /tab >}}
- {{< tab header="Example 2: ON" title="Example 2: reflections ON" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/bonza_reflections_mirror.png">}} {{< /tab >}}
- {{< tab header="Example 2: OFF" title="Example 2: reflections OFF" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/bonza_reflections_no.png">}} {{< /tab >}}
- {{< tab header="Example 3: ON" title="Example 3: reflections ON" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/immortalis_reflections_mirror.png">}} {{< /tab >}}
- {{< tab header="Example 3: OFF" title="Example 3: reflections OFF" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/immortalis_reflections_no.png">}} {{< /tab >}}
+ {{< tab header="Example 1: ON" title="Example 1: Reflections ON" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/city_reflections_on.png">}} {{< /tab >}}
+ {{< tab header="Example 1: OFF" title="Example 1: Reflections OFF" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/city_reflections_off.png">}} {{< /tab >}}
+ {{< tab header="Example 2: ON" title="Example 2: Reflections ON" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/bonza_reflections_mirror.png">}} {{< /tab >}}
+ {{< tab header="Example 2: OFF" title="Example 2: Reflections OFF" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/bonza_reflections_no.png">}} {{< /tab >}}
+ {{< tab header="Example 3: ON" title="Example 3: Reflections ON" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/immortalis_reflections_mirror.png">}} {{< /tab >}}
+ {{< tab header="Example 3: OFF" title="Example 3: Reflections OFF" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/immortalis_reflections_no.png">}} {{< /tab >}}
{{< /tabpane >}}
-The direction of the reflected ray depends on the surface normal and the direction of the incident ray, and it is very easy to compute since GLSL already offers the built-in function `reflect` to compute it.
+The direction of the reflected ray depends on the surface normal and the direction of the incident ray. It is easy to compute, as GLSL already offers the built-in function `reflect`.
![Diagram of reflections #center](images/reflections_diagram.svg "Diagram of reflections")
-As one can see, reflections are very suited for ray tracing. Traditional non ray tracing techniques try to simulate this ray in multiple ways, but they produce small artifacts. For example, most games use Screen Space Reflections (SSR).
+As you can see, reflections are ideal for ray tracing. Traditional techniques that do not use ray tracing try to simulate this ray in multiple ways, but they produce small artifacts. For example, most games use Screen Space Reflections (SSR).
-It is common to find corner cases and bugs on Screen Space Reflection, at the same time, this technique is more difficult to implement, requiring more magic numbers. The main limitation of Screen Space Reflections is that they depend on the G-buffer information, so occluded objects and objects outside the view frustum cannot be reflected, causing visible artifacts that are common in current games.
+It is common to find corner cases and bugs in Screen Space Reflections. At the same time, this technique is more difficult to implement, requiring more magic numbers. The main limitation of Screen Space Reflections is that they depend on the G-buffer information, so occluded objects and objects outside the view frustum cannot be reflected, causing visible artifacts that are common in current games.
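+For reference, the mirror direction mentioned above is a single call to the GLSL built-in `reflect`; a minimal sketch, where the variable names are placeholders rather than part of the demo:
+
+``` glsl
+// view_direction points from the camera towards the surface,
+// surface_normal is the (normalized) G-buffer normal.
+vec3 reflection_direction = reflect(normalize(view_direction), normalize(surface_normal));
+```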
{{< tabpane >}}
- {{< tab header="Example 1: SSR" title="Example 1: screen space reflections" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/reflections_artifacts_1_ssr.png">}}{{< /tab >}}
- {{< tab header="Example 1: RT" title="Example 1: ray tracing reflections" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/reflections_artifacts_1_rt.png">}}{{< /tab >}}
- {{< tab header="Example 2: SSR" title="Example 2: screen space reflections" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/reflections_artifacts_2_ssr.png">}}{{< /tab >}}
- {{< tab header="Example 2: RT" title="Example 2: ray tracing reflections" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/reflections_artifacts_2_rt.png">}}{{< /tab >}}
- {{< tab header="Example 3: SSR" title="Example 3: screen space teflections" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/reflections_artifacts_3_ssr.png">}}{{< /tab >}}
- {{< tab header="Example 3: RT" title="Example 3: ray tracing reflections" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/reflections_artifacts_3_rt.png">}}{{< /tab >}}
- {{< tab header="Example 4: SSR" title="Example 4: screen space reflections" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/reflections_artifacts_4_ssr.png">}}{{< /tab >}}
- {{< tab header="Example 4: RT" title="Example 4: ray tracing reflections" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/reflections_artifacts_4_rt.png">}}{{< /tab >}}
+ {{< tab header="Example 1: SSR" title="Example 1: Screen space reflections" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/reflections_artifacts_1_ssr.png">}}{{< /tab >}}
+ {{< tab header="Example 1: RT" title="Example 1: Ray tracing reflections" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/reflections_artifacts_1_rt.png">}}{{< /tab >}}
+ {{< tab header="Example 2: SSR" title="Example 2: Screen space reflections" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/reflections_artifacts_2_ssr.png">}}{{< /tab >}}
+ {{< tab header="Example 2: RT" title="Example 2: Ray tracing reflections" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/reflections_artifacts_2_rt.png">}}{{< /tab >}}
+ {{< tab header="Example 3: SSR" title="Example 3: Screen space reflections" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/reflections_artifacts_3_ssr.png">}}{{< /tab >}}
+ {{< tab header="Example 3: RT" title="Example 3: Ray tracing reflections" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/reflections_artifacts_3_rt.png">}}{{< /tab >}}
+ {{< tab header="Example 4: SSR" title="Example 4: Screen space reflections" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/reflections_artifacts_4_ssr.png">}}{{< /tab >}}
+ {{< tab header="Example 4: RT" title="Example 4: Ray tracing reflections" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/reflections_artifacts_4_rt.png">}}{{< /tab >}}
{{< /tabpane >}}
### Generating Reflection rays from the G-buffer
-In this demo, we have a raster pass to compute our G-buffer, so we are already saving all the information we need to compute our reflection rays when producing the G-buffer. The first step is just to read this information from the G-Buffer, and use the normal, position and roughness to generate the reflection ray that we will launch.
+In this demo, there is a raster pass to compute the G-buffer, so the G-buffer already contains all the information you need to compute your reflection rays. The first step is just to read this information from the G-buffer, and use the normal, position, and roughness to generate the reflection ray that you will launch.
``` glsl
vec3 get_g_buffer_depth(ivec2 depth_coord_texel)
@@ -102,14 +102,14 @@ vec4 main()
### Trace our reflection rays
-Once we have created our rays, we can trace them, either using the Ray Tracing Pipeline or Ray Query. As we explained previously in [the ray traversal section](../rt03_ray_traversal) we currently recommend using Ray Query from a `Fragment` shader.
+Once you have created your rays, you can trace them, either using the ray tracing pipeline or ray query. As explained previously in [the ray traversal section](../rt03_ray_traversal), it is recommended to use ray query from a `Fragment` shader.
``` glsl
bool trace_ray(vec3 ray_orig, vec3 ray_dir, rayQueryEXT rayQuery, uint flags, uint cull_mask, float ray_t_min, float ray_t_max)
{
rayQueryInitializeEXT(rayQuery, top_level_acceleration_structure, flags, cull_mask, ray_origin, ray_t_min, ray_direction, ray_t_max);
- // The geometry is opaque so we do not need to check the return value
+ // The geometry is opaque so you do not need to check the return value
rayQueryProceedEXT(rayQuery);
const bool committed_intersection = true;
@@ -162,7 +162,7 @@ vec4 main()
### Solving reflections hits
-If our ray hits a valid object, we will need to retrieve some material information to illuminate the hit. Previously, in the [bindless material section](../rt05_bindless) we showed that obtaining this information is quite easy thanks to descriptor indexing, which will allow us to access the material information necessary to illuminate the ray. If the ray fails to hit anything, we can just sample the environment map of the skybox to reflect the sky.
+If the ray hits a valid object, you need to retrieve some material information to illuminate the hit. The [bindless material section](../rt05_bindless) demonstrated that obtaining this information is easy thanks to descriptor indexing, which allows you to access the material information necessary to illuminate the ray. If the ray fails to hit anything, you can sample the environment map of the skybox to reflect the sky.
``` glsl
void obtain_rq_hit_data(rayQueryEXT rayQuery, out vec4 hit_material_properties, out vec3 hit_pos, out vec3 hit_normal, out vec2 hit_uv, out uint material_id)
@@ -261,9 +261,9 @@ vec4 main()
### Glossy and Rough reflections
-When creating our rays, we can use the roughness to decide how many reflection rays we need to launch.
+When creating rays, you can use the roughness to decide how many reflection rays you need to launch.
-When we think of reflections, we usually think of mirror-like reflections, however non-mirror objects also reflect part of their incident light producing glossy or rough reflections. In a PBR render we use the material roughness to decide the reflected light dispersion. A roughness of 0 will indicate that the material is a mirror, so we will need a single ray to solve it. On the other hand, objects with a higher roughness will reflect the incident ray in multiple directions, producing rough reflections. A render will need to launch multiple rays to solve the color of rough reflections.
+When you think of reflections, you usually think of mirror-like reflections; however, non-mirror objects also reflect part of their incident light, producing glossy or rough reflections. In a PBR renderer, you can use the material roughness to decide the reflected light dispersion. A roughness of 0 indicates that the material is a mirror, so you will need a single ray to solve it. On the other hand, objects with a higher roughness will reflect the incident ray in multiple directions, producing rough reflections. A renderer will need to launch multiple rays to solve the color of rough reflections.
``` glsl
vec4 main()
@@ -313,29 +313,29 @@ vec4 main()
}
```
-Launching multiple rays can be expensive, so we recommend limiting the number of rays in your shaders. At the same time, rough reflections will launch rays with more divergence, being more expensive than pure mirror reflections. It is more efficient to traverse the acceleration structure if all rays in the same warp follow a similar path, so try to minimize ray divergence. If we want glossy and rough reflections, we might need to add a denoising pass to deal with this noise. This will further increase the cost of reflections. Please consider the performance impact before adding non-mirror reflections.
+Launching multiple rays can be expensive, so try to limit the number of rays in your shaders. At the same time, rough reflections launch rays with more divergence, making them more expensive than pure mirror reflections. It is more efficient to traverse the acceleration structure if all rays in the same warp follow a similar path, so try to minimize ray divergence. If you want glossy and rough reflections, you might need to add a denoising pass to deal with the noise. This further increases the cost of reflections. Consider the performance impact before adding non-mirror reflections.
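+As a rough illustration of keeping the ray budget bounded, a simple heuristic could derive the ray count from the material roughness. The budget constant and the function name below are assumptions for this sketch, not part of the demo:
+
+``` glsl
+// Assumed per-pixel budget for reflection rays in this sketch.
+const uint MAX_REFLECTION_RAYS = 4u;
+
+uint reflection_ray_count(float roughness)
+{
+    // A perfect mirror (roughness == 0) only needs a single ray.
+    if (roughness == 0.0)
+    {
+        return 1u;
+    }
+    // Scale the number of rays with roughness, but never exceed the budget.
+    return clamp(uint(ceil(roughness * float(MAX_REFLECTION_RAYS))), 1u, MAX_REFLECTION_RAYS);
+}
+```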
{{< tabpane >}}
- {{< tab header="Example 1: No" title="Example 1: no reflections" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/bonza_reflections_no.png">}}{{< /tab >}}
- {{< tab header="Example 1: Mirror" title="Example 1: mirror reflections" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/bonza_reflections_mirror.png">}}{{< /tab >}}
- {{< tab header="Example 1: Rough" title="Example 1: rough reflections" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/bonza_reflections_no_mirror.png">}}{{< /tab >}}
- {{< tab header="Example 2: No" title="Example 2: no reflections" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/immortalis_reflections_no.png">}}{{< /tab >}}
- {{< tab header="Example 2: Mirror" title="Example 2: mirror reflections" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/immortalis_reflections_mirror.png">}}{{< /tab >}}
- {{< tab header="Example 2: Rough" title="Example 2: rough reflections" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/immortalis_reflections_no_mirror.png">}}{{< /tab >}}
+ {{< tab header="Example 1: No" title="Example 1: No reflections" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/bonza_reflections_no.png">}}{{< /tab >}}
+ {{< tab header="Example 1: Mirror" title="Example 1: Mirror reflections" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/bonza_reflections_mirror.png">}}{{< /tab >}}
+ {{< tab header="Example 1: Rough" title="Example 1: Rough reflections" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/bonza_reflections_no_mirror.png">}}{{< /tab >}}
+ {{< tab header="Example 2: No" title="Example 2: No reflections" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/immortalis_reflections_no.png">}}{{< /tab >}}
+ {{< tab header="Example 2: Mirror" title="Example 2: Mirror reflections" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/immortalis_reflections_mirror.png">}}{{< /tab >}}
+ {{< tab header="Example 2: Rough" title="Example 2: Rough reflections" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/immortalis_reflections_no_mirror.png">}}{{< /tab >}}
{{< /tabpane >}}
### Reflections with multiple bounces
-In the real world, ray lights usually have multiple bounces. We can simulate this behavior by launching additional reflection rays if we hit a reflective surface. However, this will be expensive, and we recommend evaluating if the extra complexity is worth it.
+In the real world, light rays usually bounce multiple times. You can simulate this behavior by launching additional reflection rays if you hit a reflective surface. However, this will be expensive, and you should evaluate whether the extra complexity is worth it.
{{< tabpane >}}
- {{< tab header="Example 1: No" title="Example 1: no reflections" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/reflections_bounce_b_no.png">}}{{< /tab >}}
- {{< tab header="Example 1: 1 bounce" title="Example 1: reflections with 1 bounce" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/reflections_bounce_b_1.png">}}{{< /tab >}}
- {{< tab header="Example 1: 2 bounces" title="Example 1: reflections with 2 bounces" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/reflections_bounce_b_2.png">}}{{< /tab >}}
- {{< tab header="Example 2: No" title="Example 2: no reflections" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/reflections_bounce_a_no.png">}}{{< /tab >}}
- {{< tab header="Example 2: 1 bounce" title="Example 2: reflections with 1 bounce" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/reflections_bounce_a_1.png">}}{{< /tab >}}
- {{< tab header="Example 2: 2 bounces" title="Example 2: reflections with 2 bounces" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/reflections_bounce_a_2.png">}}{{< /tab >}}
- {{< tab header="Example 2: 4 bounces" title="Example 2: reflections with 4 bounces" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/reflections_bounce_a_4.png">}}{{< /tab >}}
+ {{< tab header="Example 1: No" title="Example 1: No reflections" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/reflections_bounce_b_no.png">}}{{< /tab >}}
+ {{< tab header="Example 1: 1 bounce" title="Example 1: Reflections with 1 bounce" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/reflections_bounce_b_1.png">}}{{< /tab >}}
+ {{< tab header="Example 1: 2 bounces" title="Example 1: Reflections with 2 bounces" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/reflections_bounce_b_2.png">}}{{< /tab >}}
+ {{< tab header="Example 2: No" title="Example 2: No reflections" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/reflections_bounce_a_no.png">}}{{< /tab >}}
+ {{< tab header="Example 2: 1 bounce" title="Example 2: Reflections with 1 bounce" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/reflections_bounce_a_1.png">}}{{< /tab >}}
+ {{< tab header="Example 2: 2 bounces" title="Example 2: Reflections with 2 bounces" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/reflections_bounce_a_2.png">}}{{< /tab >}}
+ {{< tab header="Example 2: 4 bounces" title="Example 2: Reflections with 4 bounces" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/reflections_bounce_a_4.png">}}{{< /tab >}}
{{< /tabpane >}}

``` glsl
@@ -434,6 +434,6 @@ vec4 main()
}
```

-Finally, we can see how our reflection algorithm looks:
+Finally, you can see how your reflection algorithm looks:

![Diagram of our reflection algorithm #center](images/reflections_algorithm_diagram.drawio.svg "Diagram of our reflection algorithm")
\ No newline at end of file
diff --git a/content/learning-paths/smartphones-and-mobile/ray_tracing/rt07_shadows.md b/content/learning-paths/smartphones-and-mobile/ray_tracing/rt07_shadows.md
index e553b1835..b8195233f 100644
--- a/content/learning-paths/smartphones-and-mobile/ray_tracing/rt07_shadows.md
+++ b/content/learning-paths/smartphones-and-mobile/ray_tracing/rt07_shadows.md
@@ -8,28 +8,28 @@ layout: learningpathall

## Shadows

-Light sources emit light that will illuminate most objects. However, not all the emitted light will reach all objects; some light rays will be intercepted by other objects before they can reach a light source. This process creates darker areas and produces some shadows in our images.
+Light sources emit light that illuminates most objects. However, not all the emitted light reaches all objects; some light rays are intercepted by other objects along the way. This process creates darker areas and produces shadows in the images.

{{< tabpane >}}
- {{< tab header="Example 1: ON" title="Example 1: shadows ON" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/city_shadows_on.png">}} {{< /tab >}}
- {{< tab header="Example 1: OFF" title="Example 1: shadows OFF" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/city_shadows_off.png">}} {{< /tab >}}
- {{< tab header="Example 2: ON" title="Example 2: shadows ON" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/bonza_shadows_on.png">}} {{< /tab >}}
- {{< tab header="Example 2: OFF" title="Example 2: shadows OFF" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/bonza_shadows_off.png">}} {{< /tab >}}
- {{< tab header="Example 3: ON" title="Example 3: shadows ON" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/immortalis_shadows_on.png">}} {{< /tab >}}
- {{< tab header="Example 3: OFF" title="Example 3: shadows OFF" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/immortalis_shadows_off.png">}} {{< /tab >}}
+ {{< tab header="Example 1: ON" title="Example 1: Shadows ON" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/city_shadows_on.png">}} {{< /tab >}}
+ {{< tab header="Example 1: OFF" title="Example 1: Shadows OFF" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/city_shadows_off.png">}} {{< /tab >}}
+ {{< tab header="Example 2: ON" title="Example 2: Shadows ON" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/bonza_shadows_on.png">}} {{< /tab >}}
+ {{< tab header="Example 2: OFF" title="Example 2: Shadows OFF" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/bonza_shadows_off.png">}} {{< /tab >}}
+ {{< tab header="Example 3: ON" title="Example 3: Shadows ON" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/immortalis_shadows_on.png">}} {{< /tab >}}
+ {{< tab header="Example 3: OFF" title="Example 3: Shadows OFF" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/immortalis_shadows_off.png">}} {{< /tab >}}
{{< /tabpane >}}

-In our demo, we use a similar algorithm, to produce all our ray tracing effects. For shadows, similarly to what we did for reflections, we start by sampling some information from the G-buffer, like our position and normal. Then we use this information to generate shadows rays, that we can trace either using Ray Tracing Pipeline or Ray Query. If our shadow ray hits an object, it means that the ray is obstructed and we are in shadow, however, if the ray reaches the light without hitting anything it means that we are illuminated.
+The demo uses a similar algorithm to produce all of its ray tracing effects. For shadows, similarly to reflections, you start by sampling some information from the G-buffer, such as the position and normal. Then you use this information to generate shadow rays that you can trace using either the ray tracing pipeline or ray query. If the shadow ray hits an object, the ray is obstructed and the point is in shadow. If the ray reaches the light without hitting anything, the point is illuminated.
-![Diagram of our shadow algorithm # center](images/shadows_algorithm_diagram.drawio.svg "Diagram of our shadow algorithm")
+![Diagram of the shadow algorithm #center](images/shadows_algorithm_diagram.drawio.svg "Diagram of the shadow algorithm")

-When tracing shadows we launch a ray from the objects towards the lights, so we will need to launch a separate ray for each light that casts shadows, so please, consider limiting the number of lights casting shadows in your scenes. Launching a single ray per light source will only allow us to produce hard shadows, but it is possible to obtain accurate soft shadows by launching multiple shadow rays and doing some area sampling, you can learn more in [this tutorial](https://medium.com/@alexander.wester/ray-tracing-soft-shadows-in-real-time-a53b836d123b). Soft shadows are more costly to produce, so evaluate if they are necessary.
+When tracing shadows, a ray is launched from the object towards the light, so you need to launch a separate ray for each light that casts shadows. Consider limiting the number of lights casting shadows in your scenes. Launching a single ray per light source only produces hard shadows, but it is possible to obtain accurate soft shadows by launching multiple shadow rays and doing some area sampling; you can learn more in [this tutorial](https://medium.com/@alexander.wester/ray-tracing-soft-shadows-in-real-time-a53b836d123b). Soft shadows are more costly to produce, so evaluate whether they are necessary.

![Diagram of shadows #center](images/shadows_diagram.png "Diagram of shadows")

-In reflections and refractions, we need to know which object we hit to illuminate it. However, in shadows we only care about whether we hit an object or not; we do not need to know the exact object we hit. This allows us to skip accessing bindless resources for illumination, and we can use the GLSL flag `gl_RayFlagsTerminateOnFirstHitEXT`. With this flag, rays will terminate on the first confirmed hit instead of evaluating successive hits to verify which hit is closer. This optimization flag can allow us to skip multiple triangle intersection checks and improve our performance.
+In reflections and refractions, you need to know which object you hit to illuminate it. However, in shadows you only care about whether you hit an object or not; you do not need to know the exact object you hit. This allows you to skip accessing bindless resources for illumination, and you can use the GLSL flag `gl_RayFlagsTerminateOnFirstHitEXT`. With this flag, rays terminate on the first confirmed hit instead of evaluating successive hits to find which hit is closest. This optimization can allow you to skip multiple triangle intersection checks and improve performance.

-Another important optimization is that we know that objects not facing the light will always be in shadow, so we will not need to launch a shadow ray. This usually happens when the light is behind the object, and it is easy to check by calculating the dot product.
+Another important optimization: objects not facing the light are always in shadow, so you do not need to launch a shadow ray for them. This usually happens when the light is behind the object, and it is easy to check with a dot product, as shown in the sketch below.
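+The sketch below shows one possible way to combine the dot product early-out with a ray query shadow ray; it is illustrative only, and the variable names (`top_level_as`, `world_pos`, `normal`, `light_dir`) are placeholders rather than names from the demo.
+
+``` glsl
+// Sketch only: hard shadow test for a directional light using ray query.
+float shadow_factor = 0.0;   // 0 = in shadow, 1 = lit
+float n_dot_l = dot(normal, -light_dir);
+if (n_dot_l > 0.0)   // surfaces facing away from the light are already in shadow
+{
+    rayQueryEXT ray_query;
+    rayQueryInitializeEXT(ray_query, top_level_as,
+                          gl_RayFlagsTerminateOnFirstHitEXT | gl_RayFlagsOpaqueEXT,
+                          0xFF,                       // cull mask
+                          world_pos + normal * 0.01,  // small offset avoids self-intersection
+                          0.0,                        // t_min
+                          -light_dir,                 // from the surface towards the light
+                          10000.0);                   // t_max
+    while (rayQueryProceedEXT(ray_query)) { }
+    if (rayQueryGetIntersectionTypeEXT(ray_query, true) ==
+        gl_RayQueryCommittedIntersectionNoneEXT)
+    {
+        shadow_factor = 1.0;   // nothing blocked the ray, the point is lit
+    }
+}
+```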
``` glsl
const uint LightTypeDirectional = 0;
diff --git a/content/learning-paths/smartphones-and-mobile/ray_tracing/rt08_refractions.md b/content/learning-paths/smartphones-and-mobile/ray_tracing/rt08_refractions.md
index b44b106e4..e7eb82ee9 100644
--- a/content/learning-paths/smartphones-and-mobile/ray_tracing/rt08_refractions.md
+++ b/content/learning-paths/smartphones-and-mobile/ray_tracing/rt08_refractions.md
@@ -10,23 +10,23 @@ layout: learningpathall

## Refractions

Refractions are one of the main advantages of ray tracing. They are extremely difficult to simulate using non ray tracing techniques, so most games do not use them yet.

-To simulate transparency and opacity, we usually use an alpha texture, to indicate that the light can pass thought the object. At first glance transparency and refractions might look similar, but in a transparent material light goes through in a straight line, so light rays enter and exit the material in the same direction. Refractions bend the light inside the object and change the direction of the ray, allowing artists to create impressive effects that are not feasible without ray tracing. Transparency is a special case of refractions with a refraction index of 1.0, however there are more efficient and simple ways to simulate ray tracing transparency and opacity than using refractions.
+To simulate transparency and opacity, an alpha texture is usually used to indicate that the light can pass through the object. At first glance, transparency and refractions might look similar, but in a transparent material light goes through in a straight line, so light rays enter and exit the material in the same direction. Refractions bend the light inside the object and change the direction of the ray, allowing artists to create impressive effects that are not feasible without ray tracing. Transparency is a special case of refractions with a refraction index of 1.0; however, there are simpler, more efficient ways to simulate ray tracing transparency and opacity than using refractions.

{{< tabpane >}}
- {{< tab header="Example: no refractions" title="Example 1: no refractions or transparency" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/refractions_no.png">}} {{< /tab >}}
- {{< tab header="Example: refractions" title="Example: ray tracing refractions" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/refractions_10.png">}} {{< /tab >}}
- {{< tab header="Example: transparency" title="Example: transparency" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/refractions_transparent.png">}} {{< /tab >}}
+ {{< tab header="Example: No refractions" title="Example 1: No refractions or transparency" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/refractions_no.png">}} {{< /tab >}}
+ {{< tab header="Example: Refractions" title="Example: Ray tracing refractions" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/refractions_10.png">}} {{< /tab >}}
+ {{< tab header="Example: Transparency" title="Example: Transparency" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/refractions_transparent.png">}} {{< /tab >}}
{{< /tabpane >}}

-Refractions are simulated using *Snell's law*. Similarly to reflections, GLSL has a built-in function `refract` that allows us to compute the direction of the refracted ray using the direction of the incident ray, the surface normal, and the ratio between materials' refraction indices.
+Refractions are simulated using *Snell's law*. Similarly to reflections, GLSL has a built-in function `refract` that allows you to compute the direction of the refracted ray using the direction of the incident ray, the surface normal, and the ratio between the materials' refraction indices.
![Diagram of refractions #center](images/refractions_diagram.svg "Diagram of refractions")

-Our refractions algorithm is like our reflection algorithm. We start by retrieving some information from the G-buffer like normal, position or refractive index. Then we use this information to generate our ray, that we trace using either Ray Tracing Pipeline or Ray Query.
+The refractions algorithm is like the reflection algorithm. You start by retrieving some information from the G-buffer, such as the normal, position, or refractive index. Then you use this information to generate a ray that you trace using either the ray tracing pipeline or ray query.

![Diagram of our refraction algorithm #center](images/refractions_algorithm_diagram.drawio.svg "Diagram of our refraction algorithm")

-Just like reflections, we use bindless to get the material information necessary to illuminate the hit object. Refractions require a careful handling of back faces and the blending of multiple layers to obtain a correct result. If we hit a back face triangle we are exiting the refractive material, so we will not illuminate the hit, but we will still need to refract the ray. Finally, we will need to do an inverse blending of all refracted hit results.
+Just like reflections, you use bindless to get the material information necessary to illuminate the hit object. Refractions require careful handling of back faces and the blending of multiple layers to obtain a correct result. If you hit a back face triangle, you are exiting the refractive material, so you do not illuminate the hit, but you still need to refract the ray. Finally, you need to do an inverse blending of all refracted hit results.

``` glsl
@@ -85,8 +85,8 @@ vec4 main()
    const bool is_front_face = rayQueryGetIntersectionFrontFaceEXT(rayQuery, true);
    if (is_front_face)
    {
-        // For back faces we bend the ray
-        // For front faces we also compute the hit color
+        // For back faces you bend the ray
+        // For front faces you also compute the hit color
        // See reflection example for an example of how to illuminate a ray query hit
        vec4 refraction_color = vec4(0);
        illuminate_hit_data(hit_material_properties, hit_pos, hit_normal, hit_uv, material_id, illuminated_hit_color);
@@ -107,12 +107,12 @@ vec4 main()
}
```

-To obtain a good refractive result we will need to launch a ray with multiple bounces; to obtain a useful result we will need to use a minimum of 2 bounces. Increasing the number of bounces will produce better visual quality, however refractions are a costly effect, and launching rays with multiple bounces can have a significant performance cost, so one should considered the performance impact.
+To obtain a good refractive result, you need to launch a ray with multiple bounces; to obtain a useful result, you need a minimum of two bounces. Increasing the number of bounces produces better visual quality; however, refractions are a costly effect, and launching rays with multiple bounces can have a significant performance cost, so consider the performance impact.
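+As an illustration of the refraction step itself, the following sketch generates a refraction ray with the GLSL `refract()` built-in, flipping the normal and the index ratio when the ray exits through a back face. It is not the demo's code, and `material_ior` is a placeholder for the refractive index read from the G-buffer.
+
+``` glsl
+// Sketch only: computing the refracted direction for one hit.
+const float air_ior = 1.0;
+bool entering = dot(ray_dir, surface_normal) < 0.0;
+
+// Flip the normal and the index ratio when exiting through a back face
+vec3  n   = entering ? surface_normal : -surface_normal;
+float eta = entering ? (air_ior / material_ior) : (material_ior / air_ior);
+
+vec3 refracted_dir = refract(ray_dir, n, eta);
+if (refracted_dir == vec3(0.0))
+{
+    // refract() returns a zero vector on total internal reflection: fall back to a mirror bounce
+    refracted_dir = reflect(ray_dir, n);
+}
+```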
{{< tabpane >}}
- {{< tab header="Example: no" title="Example 1: no refractions" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/refractions_no.png">}} {{< /tab >}}
- {{< tab header="Example: 1 bounce" title="Example: ray tracing refractions with 1 bounce" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/refractions_1.png">}} {{< /tab >}}
- {{< tab header="Example: 2 bounces" title="Example: ray tracing refractions with 2 bounces" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/refractions_2.png">}} {{< /tab >}}
- {{< tab header="Example: 4 bounces" title="Example: ray tracing refractions with 4 bounces" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/refractions_4.png">}} {{< /tab >}}
- {{< tab header="Example: 10 bounces" title="Example: ray tracing refractions with 10 bounces" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/refractions_10.png">}} {{< /tab >}}
+ {{< tab header="Example: No" title="Example 1: No refractions" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/refractions_no.png">}} {{< /tab >}}
+ {{< tab header="Example: 1 Bounce" title="Example: Ray tracing refractions with 1 bounce" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/refractions_1.png">}} {{< /tab >}}
+ {{< tab header="Example: 2 Bounces" title="Example: Ray tracing refractions with 2 bounces" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/refractions_2.png">}} {{< /tab >}}
+ {{< tab header="Example: 4 Bounces" title="Example: Ray tracing refractions with 4 bounces" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/refractions_4.png">}} {{< /tab >}}
+ {{< tab header="Example: 10 Bounces" title="Example: Ray tracing refractions with 10 bounces" img_src="/learning-paths/smartphones-and-mobile/ray_tracing/images/refractions_10.png">}} {{< /tab >}}
{{< /tabpane >}}
\ No newline at end of file
diff --git a/content/learning-paths/smartphones-and-mobile/ray_tracing/rt09_optimizing.md b/content/learning-paths/smartphones-and-mobile/ray_tracing/rt09_optimizing.md
index ddacfad6b..ee5728f00 100644
--- a/content/learning-paths/smartphones-and-mobile/ray_tracing/rt09_optimizing.md
+++ b/content/learning-paths/smartphones-and-mobile/ray_tracing/rt09_optimizing.md
@@ -8,16 +8,16 @@ layout: learningpathall

## Optimizing

-We recommend developers to check [our ray tracing best practices](https://developer.arm.com/documentation/101897/latest/Ray-tracing) to learn how to optimize ray tracing content. We also greatly recommend that you profile early and often, since ray tracing can be very costly.
+See [our ray tracing best practices](https://developer.arm.com/documentation/101897/latest/Ray-tracing) to learn more about how to optimize ray tracing content, and profile early and often, because ray tracing can be very costly.

[Arm Performance Studio](https://developer.arm.com/Tools%20and%20Software/Arm%20Performance%20Studio) provides developers with a set of tools to help you ensure everything works correctly and efficiently:

-- [RenderDoc for Arm GPUs](https://developer.arm.com/Tools%20and%20Software/RenderDoc%20for%20Arm%20GPUs), with support for Ray Query, can greatly help you debug your effects.
+- [RenderDoc for Arm GPUs](https://developer.arm.com/Tools%20and%20Software/RenderDoc%20for%20Arm%20GPUs), with support for ray query, can help you debug your effects.
- [Streamline Performance Analyzer](https://developer.arm.com/Tools%20and%20Software/Streamline%20Performance%20Analyzer) contains useful counters to optimize your ray tracing content, and verify you have a coherent and efficient ray traversal.
-- [Mali Offline Compiler](https://developer.arm.com/Tools%20and%20Software/Mali%20Offline%20Compiler) is an extremely useful tool. We hugely recommend using it to check the behavior of your ray tracing shaders. For ray tracing it is important to ensure that ray traversal is hardware accelerated and not emulated.
+- [Mali Offline Compiler](https://developer.arm.com/Tools%20and%20Software/Mali%20Offline%20Compiler) is an extremely useful tool; use it to check the behavior of your ray tracing shaders. For ray tracing, it is important to ensure that ray traversal is hardware accelerated and not emulated.

-Please check the [Arm Performance Studio learning path](https://learn.arm.com/learning-paths/smartphones-and-mobile/ams) for more information about these tools.
+See the [Arm Performance Studio learning path](https://learn.arm.com/learning-paths/smartphones-and-mobile/ams) for more information about these tools.

-On some occasions, like conditionally evaluating Ray Query, Arm GPUs are not able to use the ray tracing hardware and will instead use a slower solution using software emulation. Mali Offline compiler can help to detect this problem. Check that your shaders do not produce this line `Has slow ray traversal: true`. Proper hardware ray tracing should show this line `Has slow ray traversal: false`
\ No newline at end of file
+In some cases, such as conditionally evaluating ray query, Arm GPUs are not able to use the ray tracing hardware and instead fall back to a slower software emulation path. Mali Offline Compiler can help you detect this problem: check that the report for your shaders does not contain the line `Has slow ray traversal: true`. Proper hardware ray tracing shows `Has slow ray traversal: false`.
\ No newline at end of file
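+As a rough illustration of the pattern to aim for, the sketch below keeps the ray query traversal loop in uniform control flow rather than wrapping it in a per-fragment branch. The variable names (`top_level_as`, `origin`, `direction`) are placeholders, and the exact conditions that trigger slow traversal depend on the shader and the compiler, so always confirm with the Mali Offline Compiler report.
+
+``` glsl
+// Sketch only: unconditional ray query traversal.
+// Wrapping the rayQueryProceedEXT() loop in divergent control flow is the kind of
+// conditional evaluation that can force the slower, software-emulated traversal path.
+rayQueryEXT ray_query;
+rayQueryInitializeEXT(ray_query, top_level_as, gl_RayFlagsOpaqueEXT, 0xFF,
+                      origin, 0.001, direction, 10000.0);
+while (rayQueryProceedEXT(ray_query))
+{
+    // No per-candidate work is needed for opaque-only traversal
+}
+bool hit = rayQueryGetIntersectionTypeEXT(ray_query, true) !=
+           gl_RayQueryCommittedIntersectionNoneEXT;
+```
+
+After compiling the shader with Mali Offline Compiler, confirm that the report contains `Has slow ray traversal: false`.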