implementation without vertex texture fetch
Hi, lots of embedded/mobile GPUs can't do texture fetches in the vertex shader. What would be your suggestion for implementing something similar that would work on these GPUs? Thanks!
Hey,
I'm not very familiar with embedded/mobile systems, so I can't really give you a qualified answer. I see that compute shaders have been supported since OpenGL ES 3.1, but my guess is that they are not well supported on the systems you are talking about either?
Sadly, I can't think of another solution off the top of my head. There seem to be quite a few discussions on the internet about this problem in general, so maybe you will find a workaround there. You could of course also just skip the vertex displacement entirely and write a pure fragment shader; from the right angle this might still look fine.
The most common version of OpenGL ES out there is 2.0 :( I think it would have to be done somehow via the fragment shader... or the displacement could be provided to the vertex shader as a set of vec4 uniforms, which are still available...
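Something like this, maybe (a rough, untested sketch in GLSL ES 2.0; the grid size, packing scheme, and names are all placeholders, and the UVs are assumed to lie in [0, 1]):

```glsl
// Untested sketch: replace the vertex texture fetch with a small uniform
// vec4 array holding a coarse grid of wave heights, re-uploaded from the
// CPU each frame. GLSL ES 1.00 guarantees dynamic indexing of uniform
// arrays in vertex shaders, so this stays within ES 2.0 limits.
attribute vec3 a_position;
attribute vec2 a_uv;
uniform mat4 u_mvp;

const int GRID = 8;            // assumed 8x8 grid of wave heights
uniform vec4 u_heights[16];    // 64 heights packed into 16 vec4s

void main() {
    // Snap the vertex UV to the nearest grid cell and flatten to an index.
    int ix = int(a_uv.x * float(GRID - 1) + 0.5);
    int iz = int(a_uv.y * float(GRID - 1) + 0.5);
    int i = iz * GRID + ix;
    vec4 q = u_heights[i / 4];
    // GLSL ES 1.00 has no % operator and no variable indexing into a vec4,
    // so unpack the component by hand.
    int c = i - (i / 4) * 4;
    float h = (c == 0) ? q.x : (c == 1) ? q.y : (c == 2) ? q.z : q.w;
    gl_Position = u_mvp * vec4(a_position + vec3(0.0, h, 0.0), 1.0);
}
```

The catch is that the grid has to stay small: ES 2.0 only guarantees 128 vertex uniform vectors (gl_MaxVertexUniformVectors), and the matrices have to fit into that budget too.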
Oh, okay. I was under the impression that >= OpenGL ES 3.1 might be more common due to this page: https://developer.android.com/about/dashboards But nevertheless, lower versions still make up a substantial share.
I'm sorry that I can't give you a more specific answer. I'm afraid you will need to find some sort of workaround. Keep me posted if you do!
That page is strange... We have access to production devices in the low-end to mid-range segment here, and over the last 2 years none of them had OpenGL ES 3.x or the Vulkan API, only OpenGL ES 2.0. Those are the statistics I observe with my own eyes in the price category I'd buy in myself. Maybe the picture is different for expensive/flagship devices, but not for the ones I would actually buy.
You can comment out the vertex shader from the Water mesh material and it just does the fragment-space normal displacement.
If you absolutely wanted truly displaced water (which, let's face it, is asking a lot of an embedded device), you could theoretically go through the trouble of modifying the Water mesh's vertex data on the CPU using the simulation texture data, maybe with Godot's SurfaceTool. You'd check each vertex's color, and if it's red, use the simulation texture's image data to raise or lower that vertex.
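Here's a rough, untested sketch of that idea in GDScript. I've reached for MeshDataTool instead of SurfaceTool because it lets you read and rewrite the existing vertices directly; the red-channel threshold and the assumption that the mesh UVs line up with the simulation texture are just guesses at the setup:

```gdscript
# Untested sketch (Godot 3.x): displace the water mesh on the CPU from the
# simulation texture. Assumes `mesh` is an ArrayMesh whose displaceable
# vertices are painted red and whose UVs map onto the simulation texture.
func displace_on_cpu(mesh: ArrayMesh, sim_image: Image, amplitude: float) -> void:
    var mdt := MeshDataTool.new()
    mdt.create_from_surface(mesh, 0)
    sim_image.lock()  # required before get_pixel() in Godot 3.x
    for i in range(mdt.get_vertex_count()):
        if mdt.get_vertex_color(i).r > 0.5:  # only the red (simulated) vertices
            var uv := mdt.get_vertex_uv(i)
            var h := sim_image.get_pixel(
                int(uv.x * (sim_image.get_width() - 1)),
                int(uv.y * (sim_image.get_height() - 1))).r
            var v := mdt.get_vertex(i)
            v.y = h * amplitude
            mdt.set_vertex(i, v)
    sim_image.unlock()
    mesh.surface_remove(0)  # commit_to_surface() appends, so clear the old surface
    mdt.commit_to_surface(mesh)
```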
But of course, that's really inefficient; avoiding exactly that kind of CPU work is the very reason we have vertex shaders. So maybe aim for something that fits within the limitations of your target device, like the fragment-only simulation, or blending/alternating between two water normal maps over time. Real-time computer graphics is all about compromise :)
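For the normal-map idea, the fragment-only version can be as small as this (untested sketch in Godot's shading language; the textures and scroll speeds are placeholders):

```glsl
// Untested sketch: fake moving water by blending two scrolling normal maps
// in the fragment shader only, so no vertex texture fetch is needed.
shader_type spatial;

uniform sampler2D normal_a;
uniform sampler2D normal_b;

void fragment() {
    vec3 n_a = texture(normal_a, UV + vec2(TIME * 0.03, 0.0)).rgb;
    vec3 n_b = texture(normal_b, UV - vec2(0.0, TIME * 0.02)).rgb;
    NORMALMAP = mix(n_a, n_b, 0.5);
}
```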