Retrieve 16-bit texture in C++ | b2018.25850

Hi there,
(Before I start: it could be a problem on my side, but it may still be a good question.)
I have a 16-bit texture in the form of a uint8_t array:

uint8_t* texture

I cast it to 16-bit and try to upload it into an OpenGL texture, with no success.

With this one I'm able to get an image, but the pixel data isn't 16-bit:

glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE16, texture_width, texture_height, 0, GL_RED, GL_UNSIGNED_SHORT, (uint16_t*)texture);

This one gives a black screen (but it should work, no?):

glTexImage2D(GL_TEXTURE_2D, 0, GL_R16UI, texture_width, texture_height, 0, GL_R16UI, GL_UNSIGNED_SHORT, (uint16_t*)texture);
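(As far as I can tell, GL_R16UI is only valid as an internal format, not as the format argument; for an integer internal format the pixel format is supposed to be GL_RED_INTEGER. A sketch of what I think the call should look like, not verified:)

glTexImage2D(GL_TEXTURE_2D, 0, GL_R16UI, texture_width, texture_height, 0, GL_RED_INTEGER, GL_UNSIGNED_SHORT, (uint16_t*)texture);
// an integer texture also seems to need GL_NEAREST filtering and a usampler2D in the shader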

Generally it seems that other than the internal formats GL_RGBA, GL_RGB, GL_LUMINANCE16 and GL_LUMINANCE, none of the other formats are accepted.

I also use GL_LUMINANCE in another case, which works fine (not related to the 16-bit texture):

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, texture_width, texture_height, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, texture);

I read that GL_LUMINANCE is not used anymore.
What should I use in both cases when retrieving a 16-bit monochrome texture?
khronos.org/opengl/wiki/Ima … ge_Formats
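(From what I understand, the modern single-channel replacements would be GL_R8 for the 8-bit case and GL_R16 for a normalized 16-bit case, with GL_RED as the format and the shader reading the .r channel. A sketch of my reading, not verified:)

glTexImage2D(GL_TEXTURE_2D, 0, GL_R8, texture_width, texture_height, 0, GL_RED, GL_UNSIGNED_BYTE, texture); // 8-bit single channel
glTexImage2D(GL_TEXTURE_2D, 0, GL_R16, texture_width, texture_height, 0, GL_RED, GL_UNSIGNED_SHORT, (uint16_t*)texture); // normalized 16-bit single channel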

Thank you!

Update:
It seems to work now.

I'm using:

glTexImage2D(GL_TEXTURE_2D, 0, GL_R16F, texture_width, texture_height, 0, GL_RED, GL_UNSIGNED_SHORT, (uint16_t*)texture);

I can't understand the process.
I'm sending a uint16_t*,

with values representing millimetres:
167, 1030, … etc.,
in the range 0 to 1991,

while

in TouchDesigner I'm getting a different range:
0 to 0.03038.

Any ideas or explanations?
Maybe there is a flag that accepts the values as they are and doesn't try to normalize them?
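(Doing the math, the TouchDesigner range looks like plain unsigned-normalized conversion: with a float internal format such as GL_R16F and GL_UNSIGNED_SHORT data, the driver maps each sample from 0…65535 to 0.0…1.0, so the maximum comes out as roughly 1991 / 65535. A minimal sketch of that reasoning:)

// Assumption: GL_UNSIGNED_SHORT data uploaded with format GL_RED into a float
// internal format is treated as unsigned-normalized, i.e. divided by 65535.
float normalized_max = 1991.0f / 65535.0f; // ≈ 0.03038, which matches the TouchDesigner range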

I didn't find any other solution except to post-process the result to match the values I want.
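(A sketch of the kind of post-processing I mean, assuming the texture is sampled or read back as a normalized float in [0, 1]; scaling by 65535 approximately recovers the original millimetre values:)

// Assumption: 'normalized' is a value sampled or read back from the GL_R16F texture.
float mm = normalized * 65535.0f; // e.g. 0.03038 * 65535 ≈ 1991 mm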