Consider wgpu as future renderer backend (abstracts vk & gl) #431
-
I'm pretty sure that wgpu is too abstract for internal compositor work, as it does not give us direct access to the extensions that compositors rely on; for OpenGL, several come to mind right away. But that aside, I would love to use wgpu in downstream compositors, so I think the easiest way to achieve that would be to make one of the wgpu-hal backends (either OpenGL or Vulkan) use a Smithay context as its backend.

TL;DR: compositors behave a bit differently than typical clients, so it's not a matter of adding a wgpu dependency and calling it a day.
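To make that concrete, here is a purely hypothetical sketch of what such a bridge could look like. None of the names below are real Smithay or wgpu-hal API, and the actual wgpu-hal trait surface to implement is much larger; the point is only the shape of the idea, namely that the compositor keeps creating and owning the GL context (with its compositor-specific extensions) and lends it to wgpu-hal instead of letting wgpu create one of its own.

```rust
use std::ffi::c_void;

// Purely hypothetical sketch: nothing here is real Smithay or wgpu-hal API. The
// compositor keeps creating and owning its GL context, and a thin bridge exposes
// that context to a wgpu-hal backend instead of letting wgpu create one itself.
pub struct SmithayGlBridge; // stand-in for a wrapper around Smithay's EGL context

impl SmithayGlBridge {
    /// Symbol lookup routed through the context Smithay already set up; a GL-based
    /// wgpu-hal bridge would need a hook of roughly this shape.
    pub fn get_proc_address(&self, _symbol: &str) -> *const c_void {
        std::ptr::null() // placeholder: would resolve symbols via Smithay's EGL bindings
    }

    /// Make the compositor's context current before wgpu-hal issues any GL calls.
    pub fn make_current(&self) {
        // placeholder: would call through to eglMakeCurrent via Smithay
    }
}
```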
-
On another note, @heavyrain266, I would recommend joining the Smithay Matrix channel.
-
My general impression is that Smithay will always need to do low-level interaction with OpenGL and Vulkan, to set things up with the appropriate extensions and such. But we can provide the necessary hooks to use a high-level API like wgpu on top of the Smithay infrastructure, so that compositor writers can use Smithay's API to import client buffers as textures (and so on), and then manipulate them with wgpu to do whatever rendering they wish. That is mostly the idea that emerged in the discussion of #129.
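A rough sketch of what that could look like from the compositor author's side, assuming a wgpu 0.19-style API. `ClientBufferContents` and `upload_client_buffer` are hypothetical names (not Smithay API), and the copy-based upload is a simplification: the real goal would be zero-copy dmabuf import, which is exactly the low-level part Smithay would keep handling.

```rust
use wgpu::{Device, Extent3d, Queue, Texture, TextureDescriptor, TextureDimension, TextureFormat, TextureUsages};

/// Hypothetical shape of the hook discussed above: Smithay hands the compositor the
/// contents of a client buffer, and the compositor turns it into a wgpu texture it
/// can composite however it likes. Not a real Smithay type.
pub struct ClientBufferContents {
    pub width: u32,
    pub height: u32,
    pub rgba: Vec<u8>, // tightly packed RGBA8, for the sake of the sketch
}

pub fn upload_client_buffer(device: &Device, queue: &Queue, buf: &ClientBufferContents) -> Texture {
    let size = Extent3d { width: buf.width, height: buf.height, depth_or_array_layers: 1 };
    let texture = device.create_texture(&TextureDescriptor {
        label: Some("wayland client buffer"),
        size,
        mip_level_count: 1,
        sample_count: 1,
        dimension: TextureDimension::D2,
        format: TextureFormat::Rgba8UnormSrgb,
        usage: TextureUsages::TEXTURE_BINDING | TextureUsages::COPY_DST,
        view_formats: &[],
    });
    // Copy the client pixels into the texture; a real compositor would import the
    // dmabuf directly instead of copying.
    queue.write_texture(
        wgpu::ImageCopyTexture {
            texture: &texture,
            mip_level: 0,
            origin: wgpu::Origin3d::ZERO,
            aspect: wgpu::TextureAspect::All,
        },
        &buf.rgba,
        wgpu::ImageDataLayout {
            offset: 0,
            bytes_per_row: Some(4 * buf.width),
            rows_per_image: Some(buf.height),
        },
        size,
    );
    texture
}
```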
-
I think it is possible to set your own required Vulkan extensions in wgpu. Please take a look at this example; it is about OpenXR integration with wgpu, but the principle is the same:
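For context, wgpu itself exposes optional backend capabilities through `wgpu::Features` rather than raw extension strings; a minimal sketch of that, assuming a wgpu 0.19-style API, is below. Injecting arbitrary Vulkan instance/device extensions (as the OpenXR integration does) instead means creating the Vulkan objects yourself and handing them to wgpu through wgpu-hal's unsafe constructors, which is not shown here.

```rust
// Minimal sketch, assuming a wgpu 0.19-style API: optional capabilities are requested
// as `Features` at device creation, and wgpu enables the matching backend extensions
// internally. The specific feature below is only an example.
async fn open_device(adapter: &wgpu::Adapter) -> (wgpu::Device, wgpu::Queue) {
    let needed = wgpu::Features::TEXTURE_FORMAT_16BIT_NORM;
    assert!(
        adapter.features().contains(needed),
        "adapter does not expose the required capabilities"
    );
    adapter
        .request_device(
            &wgpu::DeviceDescriptor {
                label: Some("compositor device"),
                required_features: needed,
                required_limits: wgpu::Limits::default(),
            },
            None, // no API trace
        )
        .await
        .expect("device creation failed")
}
```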
-
This is somewhat related to #129 and #134. For example, bevyengine/bevy uses gfx-rs/wgpu to abstract graphics APIs on every platform; on Linux it handles OpenGL and Vulkan without much effort. Smithay could use this library for its renderer to automatically pick either OpenGL or Vulkan, depending on which one is supported by the user's hardware, and/or to provide a simple way to switch between them.
In my honest opinion, this would make future development much easier and let you focus more on e.g. #363 and #423.
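For illustration, a minimal sketch of how that backend selection would look, assuming a wgpu 0.19-style API and `pollster` as a dependency for blocking on the async request: ask wgpu for either the Vulkan or the GL backend and let it pick whichever the driver actually supports. A compositor would pass `compatible_surface: None`, since it has no window owned by another compositor to render into.

```rust
// Minimal sketch (wgpu 0.19-style API assumed): let wgpu choose between the Vulkan
// and GL backends based on what the hardware/driver supports.
fn pick_adapter() -> wgpu::Adapter {
    let instance = wgpu::Instance::new(wgpu::InstanceDescriptor {
        backends: wgpu::Backends::VULKAN | wgpu::Backends::GL,
        ..Default::default()
    });
    let adapter = pollster::block_on(instance.request_adapter(&wgpu::RequestAdapterOptions {
        power_preference: wgpu::PowerPreference::HighPerformance,
        force_fallback_adapter: false,
        compatible_surface: None, // a compositor renders headless / to DRM, not to a window
    }))
    .expect("no Vulkan or GL adapter available");
    // Report which backend was actually chosen, e.g. `Vulkan` or `Gl`.
    let info = adapter.get_info();
    println!("using {} via {:?}", info.name, info.backend);
    adapter
}
```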