In WebGL you pass a format/type pair to readPixels. For a given texture internal format (attached to a framebuffer), only 2 combinations of format/type are valid.
From the spec:
For normalized fixed-point rendering surfaces, the combination format RGBA and type UNSIGNED_BYTE is accepted. For signed integer rendering surfaces, the combination format RGBA_INTEGER and type INT is accepted. For unsigned integer rendering surfaces, the combination format RGBA_INTEGER and type UNSIGNED_INT is accepted.
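For example, the combination that is always accepted for a normalized fixed-point surface (such as an RGBA8 texture) looks like this. This is only a minimal sketch, assuming an existing WebGL context gl, a texture tex, and its width and height:

// minimal sketch: read back a normalized fixed-point (e.g. RGBA8) texture
const fb = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, fb);
gl.framebufferTexture2D(
    gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, tex, 0);

// RGBA / UNSIGNED_BYTE is the combination every implementation must accept
// for normalized fixed-point color buffers
const pixels = new Uint8Array(width * height * 4);
gl.readPixels(0, 0, width, height, gl.RGBA, gl.UNSIGNED_BYTE, pixels);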
The second combination is implementation defined, which probably means you shouldn't use it in WebGL if you want your code to be portable. You can ask what that format/type combination is by querying
// assuming a framebuffer is bound with the texture to read attached
const format = gl.getParameter(gl.IMPLEMENTATION_COLOR_READ_FORMAT);
const type = gl.getParameter(gl.IMPLEMENTATION_COLOR_READ_TYPE);
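If you do read with the queried combination, the typed array you pass to readPixels has to match the returned type. The helper below is only an illustrative sketch (readAttachedPixels is a made-up name, not part of WebGL), and it assumes the queried format has 4 channels (RGBA or RGBA_INTEGER), which covers the combinations above:

// illustrative sketch, assumes a 4-channel read format (RGBA / RGBA_INTEGER)
function readAttachedPixels(gl, width, height) {
  const format = gl.getParameter(gl.IMPLEMENTATION_COLOR_READ_FORMAT);
  const type = gl.getParameter(gl.IMPLEMENTATION_COLOR_READ_TYPE);

  let pixels;
  switch (type) {
    case gl.UNSIGNED_BYTE: pixels = new Uint8Array(width * height * 4); break;
    case gl.INT:           pixels = new Int32Array(width * height * 4); break;
    case gl.UNSIGNED_INT:  pixels = new Uint32Array(width * height * 4); break;
    case gl.FLOAT:         pixels = new Float32Array(width * height * 4); break;
    default: throw new Error('unhandled read type');
  }

  gl.readPixels(0, 0, width, height, format, type, pixels);
  return {format, type, pixels};
}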
Also note that which texture formats are renderable, meaning you can attach them to a framebuffer and render to them, is also somewhat implementation defined.
WebGL2 lists many formats, but some are optional (LUMINANCE, for example) and some are not renderable by default but can possibly be made renderable by an extension (RGBA32F, for example).
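One way to find out whether a given format is actually renderable on the current implementation is to attach a texture of that format to a framebuffer and check completeness. The sketch below assumes a WebGL2 context gl and existing width and height, and uses RGBA32F, which WebGL2 only makes color-renderable when EXT_color_buffer_float is available:

// sketch: test whether RGBA32F is renderable on this implementation
const ext = gl.getExtension('EXT_color_buffer_float');
if (!ext) {
  console.log('can not render to floating point textures');
}

const tex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, tex);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA32F, width, height, 0, gl.RGBA, gl.FLOAT, null);

const fb = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, fb);
gl.framebufferTexture2D(
    gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, tex, 0);

if (gl.checkFramebufferStatus(gl.FRAMEBUFFER) !== gl.FRAMEBUFFER_COMPLETE) {
  console.log('this framebuffer is not renderable');
}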
The table below is live. You may notice that it gives different results depending on the machine, OS, GPU, or even browser. I know on my machine Chrome and Firefox give different results for some of the implementation defined values.