
Offscreen rendering with my own shader based on Apple's Metal sample doesn't work

Apple sample program

https://developer.apple.com/documentation/metal/customizing_render_pass_setup

I converted this sample to Swift and want to use my own shader for offscreen rendering. The Swift conversion itself went without problems. With the following monochromatic fragment shader, the output appeared on screen as expected:

fragment float4 simpleFragmentShader(VertexOut vertexIn [[stage_in]]) {
    return float4(1.0, 0.0, 0.0, 1.0);
}

However, with the following shaders, the screen immediately turns black and nothing renders correctly. These shaders work fine in my ordinary Metal code; they just don't work in the sample environment above. When using a shader like this with the sample, what do I need to change, in the sample code or in the shader, to make it work?

vertex VertexOut simpleVertexShader(constant float4 *positions [[buffer(0)]],
                                    uint vid [[vertex_id]]) {
    
    VertexOut out;
    
    out.pos = positions[vid];
    
    return out;
}


fragment float4 simpleFragmentShader(VertexOut vertexIn [[stage_in]],
                                     constant float2 &resolution [[buffer(0)]]) {

    float2 resolution0 = float2(resolution[0], resolution[1]);
    float2 p = (vertexIn.pos.xy * 2.0 - resolution0) / min(resolution0.x, resolution0.y);

    return float4(p, 0.0, 1.0);
    
}
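The mapping this fragment shader performs can be sanity-checked on the CPU. Below is a minimal Swift sketch of the same arithmetic; the function name and sample values are illustrative, not from the sample project:

```swift
// CPU version of the fragment shader's mapping:
// p = (pos.xy * 2.0 - resolution) / min(resolution.x, resolution.y)
func normalized(x: Float, y: Float, resW: Float, resH: Float) -> (Float, Float) {
    let m = min(resW, resH)
    return ((x * 2.0 - resW) / m, (y * 2.0 - resH) / m)
}

// The center of an 800x600 target maps to (0, 0); the far corner
// maps to (4/3, 1), i.e. [-1, 1] scaled by the aspect ratio.
let center = normalized(x: 400, y: 300, resW: 800, resH: 600)
let corner = normalized(x: 800, y: 600, resW: 800, resH: 600)
print(center, corner)
```

Note that this mapping only gives sensible values when the resolution passed to the shader matches the actual size of the render target being drawn into.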

(Screenshot of the expected successful output.)

The basic setup, such as binding the resolution buffer, is not the problem:

renderEncoder.setFragmentBuffer(resolutionBuffer, offset: 0, index: 0)
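The two floats in that buffer must be laid out contiguously, exactly as the shader's `constant float2 &resolution [[buffer(0)]]` expects. A small self-contained Swift sketch of that layout, using raw memory as a stand-in for `MTLBuffer.contents()` (the 512x512 size is hypothetical):

```swift
import Foundation

// Stand-in for MTLBuffer.contents(): the shader reads two packed Floats
// (width, height) from buffer index 0.
let byteCount = MemoryLayout<Float>.size * 2
let contents = UnsafeMutableRawPointer.allocate(byteCount: byteCount,
                                                alignment: MemoryLayout<Float>.alignment)
defer { contents.deallocate() }

var resolution: [Float] = [512, 512]   // hypothetical offscreen texture size
memcpy(contents, &resolution, byteCount)

// The shader's resolution.y corresponds to the second packed Float.
let height = contents.load(fromByteOffset: MemoryLayout<Float>.size, as: Float.self)
print(height)
```

For offscreen rendering, the values written here should be the offscreen texture's size, not the drawable's, since that is the target the shader's pixel coordinates refer to.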

Edit: here are the shaders for the final on-screen rendering pass.

vertex VertexTextureOut textureVertexShader(constant float4 *positions [[buffer(0)]],
                                            constant float2 *texcoord [[buffer(1)]],
                                            uint vid [[ vertex_id ]]) {
    
    VertexTextureOut out;
    
    out.pos = positions[vid];
    out.texcoord = texcoord[vid];
    
    return out;
}

fragment float4 textureFragmentShader(VertexTextureOut vertexIn [[stage_in]],
                                      texture2d<float> renderTargetTexture [[texture(0)]]) {
    
    constexpr sampler simpleSampler;

    // Sample data from the texture.
    float4 colorSample = renderTargetTexture.sample(simpleSampler, vertexIn.texcoord);

    // Return the color sample as the final color.
    return colorSample;
}

The issue is that `vertexIn.texcoord.xy` already contains normalized coordinates, so the values are already in the range 0.0-1.0.
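In other words, `texcoord` is interpolated in [0, 1] regardless of the target's size, while the `[[stage_in]]` position arrives in pixel coordinates of the render target. A small Swift sketch contrasting the two ranges (the 640x480 size is hypothetical):

```swift
// texcoord at the far corner is always (1, 1); the rasterized position
// at the same corner is (width, height) in pixels.
let (w, h): (Float, Float) = (640, 480)   // hypothetical render target size
let texcoordCorner: (Float, Float) = (1.0, 1.0)
let posCorner: (Float, Float) = (w, h)

// Dividing the pixel position by min(w, h) brings it back to the same
// scale as texcoord, up to the aspect ratio.
let scaled = (posCorner.0 / min(w, h), posCorner.1 / min(w, h))
print(texcoordCorner, scaled)
```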

This works:

float2 p = vertexIn.texcoord.xy;
return float4(p, 0.0, 1.0);

Update: using your sample project and:

float2 p = (vertexIn.pos.xy) / min(resolution0.x, resolution0.y);
return float4(p, 0.0, 1.0);

I get these results, which seem correct:

(Screenshot of the result.)

Update 2: Sample project 2 crashes for me. I had to comment out these two lines (lines 63 and 65 of Renderer.swift):

//textureDescriptor.textureType = .type2DMultisample
//textureDescriptor.sampleCount = 4
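A plausible reason those lines cause a crash (an assumption, since the crash log isn't shown): a `.type2DMultisample` color target cannot be bound to the final pass's plain `texture2d<float>` parameter. Keeping MSAA would instead require resolving into a separate non-multisampled texture and sampling that, roughly like this sketch (the descriptor and texture names are illustrative):

```swift
// Sketch only, not runnable as-is: with multisampling enabled, the
// offscreen pass renders into an MSAA texture and resolves into a plain
// .type2D texture, which is what the final pass then samples.
offscreenPassDescriptor.colorAttachments[0].texture = msaaTexture           // .type2DMultisample
offscreenPassDescriptor.colorAttachments[0].resolveTexture = resolveTexture // .type2D
offscreenPassDescriptor.colorAttachments[0].storeAction = .multisampleResolve
```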

That got me to a partially black screen. Next I changed line 29 of Shaders.metal to:

float2 p = (vertexIn.pos.xy) / min(resolution0.x, resolution0.y);

Which gave me expected results:

(Screenshot of the expected result.)
