language-glsl
Inferring array types
While using WebGL in Elm, I discovered that the GLSL parser doesn't seem to infer array types correctly.
For instance, uniform int array[32]; is inferred as having just the type Int.
I'm not sure what the convention is on the Haskell side, but in Elm I'd expect this to be inferred as Array Int.
Is this simply not supported, or is it a bug?
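To make the expected behavior concrete, here is a minimal sketch (not language-glsl's actual API, and the regex only covers this simple declaration form): it parses a GLSL uniform declaration and keeps the `[32]` suffix in the inferred type, instead of dropping it and reporting just the element type.

```python
import re

# Matches a simple GLSL uniform declaration, with an optional
# fixed-size array suffix, e.g. "uniform int array[32];".
DECL = re.compile(
    r"uniform\s+(?P<ty>\w+)\s+(?P<name>\w+)(\[(?P<size>\d+)\])?\s*;"
)

def infer_type(decl: str):
    """Return 'Int' for a scalar, ('Array', 'Int', n) for an array."""
    m = DECL.match(decl.strip())
    if m is None:
        raise ValueError(f"not a uniform declaration: {decl!r}")
    base = {"int": "Int", "float": "Float"}.get(m.group("ty"), m.group("ty"))
    if m.group("size") is not None:
        # Keep the array dimension in the type rather than discarding it.
        return ("Array", base, int(m.group("size")))
    return base

print(infer_type("uniform int array[32];"))  # ('Array', 'Int', 32)
print(infer_type("uniform int x;"))          # 'Int'
```

The point is only that the array suffix carries type information, so a parser that returns Int for both declarations has lost something.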
My guess is it's a bug; the parser has a couple of bugs in it. I've found that when parsing function parameters it will sometimes incorrectly parse function (int value) as function (in tvalue), or something like that.