Using unsigned byte textures with DirectX 10 / 11
I am attempting to do some processing in the pixel shader on a texture. The data for the texture comes from a chunk of 8-bit data in memory. The problem I am facing is how to read the data in the shader.
Code to create the texture and resource view:
In OnD3D11CreateDevice:
D3D11_TEXTURE2D_DESC tDesc;
tDesc.Height = 480;
tDesc.Width = 640;
tDesc.Usage = D3D11_USAGE_DYNAMIC;
tDesc.MipLevels = 1;
tDesc.ArraySize = 1;
tDesc.SampleDesc.Count = 1;
tDesc.SampleDesc.Quality = 0;
tDesc.Format = DXGI_FORMAT_R8_UINT;
tDesc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE;
tDesc.BindFlags = D3D11_BIND_SHADER_RESOURCE;
tDesc.MiscFlags = 0;
V_RETURN(pd3dDevice->CreateTexture2D(&tDesc, NULL, &g_pCurrentImage));
D3D11_SHADER_RESOURCE_VIEW_DESC rvDesc;
g_pCurrentImage->GetDesc(&tDesc);
rvDesc.Format = DXGI_FORMAT_R8_UINT;
rvDesc.Texture2D.MipLevels = tDesc.MipLevels;
rvDesc.Texture2D.MostDetailedMip = tDesc.MipLevels - 1;
rvDesc.ViewDimension = D3D_SRV_DIMENSION_TEXTURE2D;
V_RETURN(pd3dDevice->CreateShaderResourceView(g_pCurrentImage, &rvDesc, &g_pImageRV));
In OnD3D11FrameRender:
HRESULT okay;
if( !g_updateDone ) {
    D3D11_MAPPED_SUBRESOURCE resource;
    resource.pData = mImage.GetData();
    resource.RowPitch = 640;
    resource.DepthPitch = 1;
    okay = pd3dImmediateContext->Map(g_pCurrentImage, 0, D3D11_MAP_WRITE_DISCARD, 0, &resource);
    g_updateDone = true;
}
pd3dImmediateContext->PSSetShaderResources(0, 1, &g_pImageRV);
This returns no errors so far; everything seems to work.
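One thing worth noting about the Map call above: D3D11_MAPPED_SUBRESOURCE is an output parameter. Map fills in pData and RowPitch itself, so assigning them before the call has no effect, and the CPU data never actually reaches the texture; the usual pattern is to Map, copy row by row using the RowPitch that Map returned, then Unmap. Below is a minimal sketch of that pitch-respecting row copy. The helper name CopyRows is hypothetical, and the function is deliberately D3D-free so it runs anywhere; in real code the destination pointer and pitch come from the mapped subresource:

```cpp
#include <cstddef>
#include <cstdint>
#include <cstring>

// Copy a tightly packed width x height 8-bit image into a destination whose
// rows are dstPitch bytes apart (dstPitch >= width). The driver may pad rows,
// so RowPitch is not necessarily equal to the image width.
void CopyRows(uint8_t* dst, size_t dstPitch,
              const uint8_t* src, size_t width, size_t height)
{
    for (size_t y = 0; y < height; ++y)
        std::memcpy(dst + y * dstPitch, src + y * width, width);
}

// In OnD3D11FrameRender the pattern would look roughly like (sketch only):
//   D3D11_MAPPED_SUBRESOURCE mapped;
//   pd3dImmediateContext->Map(g_pCurrentImage, 0, D3D11_MAP_WRITE_DISCARD, 0, &mapped);
//   CopyRows(static_cast<uint8_t*>(mapped.pData), mapped.RowPitch,
//            mImage.GetData(), 640, 480);
//   pd3dImmediateContext->Unmap(g_pCurrentImage, 0);
```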
The HLSL shader:
//-----
// Textures and Samplers
//-----
Texture2D <int> g_txDiffuse : register( t0 );
SamplerState g_samLinear : register( s0 );
//-----
// shader input/output structure
//-----
struct VS_INPUT
{
float4 Position : POSITION; // vertex position
float2 TextureUV : TEXCOORD0;// vertex texture coords
};
struct VS_OUTPUT
{
float4 Position : SV_POSITION; // vertex position
float2 TextureUV : TEXCOORD0; // vertex texture coords
};
//-----
// Vertex shader
//-----
VS_OUTPUT RenderSceneVS( VS_INPUT input )
{
VS_OUTPUT Output;
Output.Position = input.Position;
Output.TextureUV = input.TextureUV;
return Output;
}
//-----
// Pixel Shader
//-----
float4 RenderScenePS( VS_OUTPUT In ) : SV_TARGET
{
int3 loc;
loc.x = 0;
loc.y = 0;
loc.z = 1;
int r = g_txDiffuse.Load(loc);
//float fTest = (float) r;
return float4( In.TextureUV.x, In.TextureUV.y, In.TextureUV.x + In.TextureUV.y, 1);
}
The thing is, I can't even debug it in PIX to see what r results in, because even with shader optimization disabled, the line int r = ... is never reached.
I tested:
float fTest = (float) r;
return float4( In.TextureUV.x, In.TextureUV.y, In.TextureUV.x + In.TextureUV.y, fTest);
but this would result in "cannot map expression to pixel shader instruction set", even though it's a float.
So how do I read and use 8-bit integers from a texture, preferably with no sampling at all?
Thanks for any feedback.
loc.z = 1;
This should be 0: in your case the texture has a single mip level, and the mip index in the HLSL Load intrinsic starts at 0, so loc.z = 0 addresses the only (most detailed) mip.
Oh my, this is a really old question, I thought it said 2012!
But anyway, as it's still open: because GPUs are optimised for floating-point arithmetic, you probably won't get much of a performance advantage from using a Texture2D<int> over a Texture2D<float>.
You could attempt to use a Texture2D<float> and then try:
return float4( In.TextureUV.x, In.TextureUV.y, In.TextureUV.x + In.TextureUV.y, g_txDiffuse.Load(loc));
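The float route works because an R8 texture exposed with a UNORM format (DXGI_FORMAT_R8_UNORM rather than R8_UINT) is converted by the hardware to a normalized float in [0, 1] when read as Texture2D<float>. A small CPU-side sketch of that conversion, just to illustrate what the shader would receive (the helper name is hypothetical; it assumes standard UNORM semantics, where the stored byte 255 maps to exactly 1.0):

```cpp
#include <cstdint>

// UNORM conversion: the value a Texture2D<float> load of a
// DXGI_FORMAT_R8_UNORM texel yields for a stored byte v.
float R8UnormToFloat(uint8_t v)
{
    return static_cast<float>(v) / 255.0f;
}
```

So storing raw bytes and declaring the SRV as R8_UNORM lets the pixel shader work in familiar 0..1 floats without any integer loads.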