"char" is (potentially) signed, so casting it to uint32_t will
sign-extend it. Because we use |= to accumulate it into "word" and
don't mask out the higher bits, we effectively set subsequent bytes in
the same word to 0xff for input bytes > 0x7f. That potentially
includes the \0 terminator. For example, "é" (U+00e9) is "\xc3\xa9"
when encoded as UTF-8, and would get us 0xffffffc3 instead of
0x0000a9c3.
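
A minimal standalone sketch of the failure mode and the usual fix
(converting through "unsigned char" before widening); the helper names
below are illustrative, not the actual vkd3d-shader code:

    #include <inttypes.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Broken: where "char" is signed, bytes > 0x7f become negative and
     * sign-extend, smearing 0xff over the rest of the word. */
    static uint32_t pack_word_broken(const char *s)
    {
        uint32_t word = 0;
        unsigned int i;

        for (i = 0; i < 4 && s[i]; ++i)
            word |= (uint32_t)s[i] << (8 * i);
        return word;
    }

    /* Fixed: converting through "unsigned char" keeps each byte in
     * [0x00, 0xff] before it is widened to uint32_t. */
    static uint32_t pack_word_fixed(const char *s)
    {
        uint32_t word = 0;
        unsigned int i;

        for (i = 0; i < 4 && s[i]; ++i)
            word |= (uint32_t)(unsigned char)s[i] << (8 * i);
        return word;
    }

    int main(void)
    {
        const char *s = "\xc3\xa9"; /* "é" (U+00e9) encoded as UTF-8. */

        printf("broken: %#010" PRIx32 "\n", pack_word_broken(s)); /* 0xffffffc3 */
        printf("fixed:  %#010" PRIx32 "\n", pack_word_fixed(s));  /* 0x0000a9c3 */
        return 0;
    }
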
Commit 1ed8d907b3 inadvertently dropped
emitting the tessellator domain for domain shaders. Although Vulkan
environments allow us to write the tessellator domain from the hull
shader, the domain shader, or both, that's not generally true for OpenGL
environments.

Otherwise ubsan reports errors such as:
libs/vkd3d-shader/spirv.c:7266:5: runtime error: null pointer passed as argument 1, which is declared to never be null
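
An error like this typically comes from passing a null pointer together
with a zero size to a function such as memcpy(), whose pointer
arguments are declared non-null even for zero-sized copies; the usual
fix is to skip the call when the size is zero. A hypothetical sketch of
that guard; the function and names below are illustrative, not the code
at spirv.c:7266:

    #include <stdint.h>
    #include <string.h>

    static void copy_ids(uint32_t *dst, const uint32_t *src, size_t count)
    {
        /* memcpy(dst, NULL, 0) is undefined behaviour: both pointer
         * arguments are declared to never be null, which is exactly
         * what ubsan reports. Guarding on the count avoids the call. */
        if (count)
            memcpy(dst, src, count * sizeof(*src));
    }
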
The previous names "not normalised" and "fully normalised" have meanings
which are likely to change with time. On the other hand, including a
description of the normalisation level in the enumerant seems
excessive. Relating normalisation levels to shader model versions might
be a reasonable compromise.
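
A sketch of what that compromise could look like; the enum and
enumerant names here are hypothetical, not necessarily the ones
actually adopted:

    /* Hypothetical: name each normalisation level after the shader
     * model whose I/O conventions the IR matches at that point, rather
     * than describing the normalisation itself. */
    enum vsir_normalisation_level
    {
        VSIR_NORMALISED_SM1, /* I/O registers as in shader model 1-3. */
        VSIR_NORMALISED_SM4, /* I/O registers as in shader model 4-5. */
        VSIR_NORMALISED_SM6, /* I/O registers as in shader model 6. */
    };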