Hi there,
(Before I start: it could be a problem on my side, but it could still be a good question.)
I have a 16-bit texture in the form of a uint8_t array:
uint8_t* texture;
I cast it to 16-bit and try to upload it into an OpenGL texture, with no success.
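For context, the setup is roughly this (just a sketch, not my exact code; load_frame is a made-up placeholder, and I'm assuming little-endian, tightly packed rows, hence the unpack alignment of 2):

uint8_t* texture = load_frame();        // texture_width * texture_height * 2 bytes
uint16_t* pixels = (uint16_t*)texture;  // what I pass to glTexImage2D
glPixelStorei(GL_UNPACK_ALIGNMENT, 2);  // rows of 16-bit pixels may not be 4-byte aligned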
With this call I'm able to get an image, but the pixel data isn't 16-bit:
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE16, texture_width, texture_height, 0, GL_RED, GL_UNSIGNED_SHORT, (uint16_t*)texture);
This one gives a black screen (but it should work, no?):
glTexImage2D(GL_TEXTURE_2D, 0, GL_R16UI, texture_width, texture_height, 0, GL_R16UI, GL_UNSIGNED_SHORT, (uint16_t*)texture);
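From the wiki, my guess is that the format argument is the problem here: GL_R16UI is only valid as an internal format, and integer internal formats need GL_RED_INTEGER as the pixel transfer format. Integer textures also can't be filtered and need a usampler2D in the shader, as far as I understand. Is something like this sketch (untested, assuming the texture is already bound) the right direction?

glTexImage2D(GL_TEXTURE_2D, 0, GL_R16UI, texture_width, texture_height, 0,
             GL_RED_INTEGER, GL_UNSIGNED_SHORT, (uint16_t*)texture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST); // integer textures must not be filtered
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
// GLSL side (my assumption): uniform usampler2D tex;  uvec4 v = texture(tex, uv);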
Generally, it seems that no internal formats other than GL_RGBA, GL_RGB, GL_LUMINANCE16, and GL_LUMINANCE are accepted.
I also use GL_LUMINANCE in another case, not related to the 16-bit texture, and that works fine:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, texture_width, texture_height, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, texture);
I've read that GL_LUMINANCE isn't used anymore. What should I use in both cases when uploading a 16-bit monochrome texture?
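My understanding from the wiki is that the modern replacement would be a single-channel format, GL_R16 for normalized 16-bit data, plus a texture swizzle to broadcast red into RGB the way GL_LUMINANCE used to; a sketch of what I mean:

glTexImage2D(GL_TEXTURE_2D, 0, GL_R16, texture_width, texture_height, 0,
             GL_RED, GL_UNSIGNED_SHORT, (uint16_t*)texture); // normalized, samples as float with a plain sampler2D
GLint swizzle[4] = { GL_RED, GL_RED, GL_RED, GL_ONE };       // emulate GL_LUMINANCE: gray in RGB, alpha = 1
glTexParameteriv(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_RGBA, swizzle);

But I'm not sure whether GL_R16 or GL_R16UI is the right choice here, hence the question.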
khronos.org/opengl/wiki/Ima … ge_Formats
Thank you!