
Manually set a 1D Texture in Metal

I'm trying to fill a 1D texture with values manually and pass that texture to a compute shader (these are two pixels I want to set via code; they don't represent any image).

Due to the currently small number of Metal examples, all the ones I could find deal with 2D textures that are loaded by converting a UIImage to raw byte data, and creating a dummy UIImage felt like a hack to me.

This is the "naive" way I started with:

...
var manualTextureData: [Float] = [ 1.0, 0.0, 0.0, 1.0,
                                   0.0, 0.0, 1.0, 1.0 ];
let region: MTLRegion = MTLRegionMake1D(0, textureDescriptor.width);
myTexture.replaceRegion(region, mipmapLevel: 0, withBytes: &manualTextureData, bytesPerRow: 0);

but Metal doesn't recognize those values in the shader (it gets an empty texture, except for the first value).

I quickly realized that the Float array probably has to be converted into a byte array (e.g. [UInt8]), but I couldn't find a way to convert from [Float] to [UInt8] either.
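As an aside, a [Float] can be reinterpreted as raw bytes without converting element by element. A minimal sketch in current Swift (standalone, no Metal involved):

```swift
let floats: [Float] = [1.0, 0.0, 0.0, 1.0,
                       0.0, 0.0, 1.0, 1.0]

// View the array's contiguous storage as raw bytes (4 bytes per Float,
// little-endian IEEE 754 on Apple platforms) and copy them into [UInt8].
let bytes: [UInt8] = floats.withUnsafeBufferPointer { buf in
    Array(UnsafeRawBufferPointer(buf))
}
// bytes.count == floats.count * MemoryLayout<Float>.stride == 32
```

That said, the conversion turns out to be unnecessary, because the with-bytes Metal APIs accept a pointer to the Float array directly.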

Another possible option I considered is using a CVPixelBuffer object, but that also felt like a workaround.

So what's the right way to tackle this?

Thanks in advance.

  • Please note I'm not familiar with Objective-C, hence I'm not sure whether using CVPixelBuffer / UIImage is overkill for something that should be straightforward.

If you want a float texture, bytesPerRow should be the width times the size of one pixel in bytes, because each float channel takes 4 bytes. Metal just copies the memory and doesn't care about the values; that is your task ;-)

Something like:

myTexture.replaceRegion(region, mipmapLevel: 0, withBytes: &manualTextureData, bytesPerRow: manualTextureData.count * sizeof(Float));
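For reference, the full path might look like this in current Swift; the type1D texture, rgba32Float pixel format, and width of 2 are assumptions based on the eight floats in the question:

```swift
import Metal

// Assumes a Metal-capable device is available.
guard let device = MTLCreateSystemDefaultDevice() else { fatalError("no Metal device") }

// Two RGBA pixels, four float channels each.
var manualTextureData: [Float] = [1.0, 0.0, 0.0, 1.0,
                                  0.0, 0.0, 1.0, 1.0]

let descriptor = MTLTextureDescriptor()
descriptor.textureType = .type1D
descriptor.pixelFormat = .rgba32Float
descriptor.width = 2

let texture = device.makeTexture(descriptor: descriptor)!

// One row spans the whole 1D texture:
// 2 pixels * 4 channels * 4 bytes = 32 bytes.
let bytesPerRow = descriptor.width * 4 * MemoryLayout<Float>.stride
texture.replace(region: MTLRegionMake1D(0, descriptor.width),
                mipmapLevel: 0,
                withBytes: &manualTextureData,
                bytesPerRow: bytesPerRow)
```

With bytesPerRow set to the real row size instead of 0, the shader sees both pixels.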

I don't see any reason for you to pass data using 1D texture. Instead I would go with just passing a buffer. Like this:

var dataBuffer: MTLBuffer? = device.newBufferWithBytes(&manualTextureData, length: manualTextureData.count * sizeof(Float), options: MTLResourceOptions.OptionCPUCacheModeDefault)

Then you hook it to your renderCommandEncoder like this:

renderCommandEncoder.setFragmentBuffer(dataBuffer, offset: 0, atIndex: 1) // Note that if you want this buffer to be passed to your vertex shader, you should use setVertexBuffer

Then in your shader, you should add a parameter like this: const device float* bufferPassed [[ buffer(1) ]]

And then use it like this, inside your shader implementation:

float firstFloat = bufferPassed[0];

This will get the job done.
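In current Swift the same approach, adapted for the compute pipeline the question mentions, might look like this (the encoder name is an assumption):

```swift
import Metal

guard let device = MTLCreateSystemDefaultDevice() else { fatalError("no Metal device") }

var manualData: [Float] = [1.0, 0.0, 0.0, 1.0,
                           0.0, 0.0, 1.0, 1.0]

// length is a byte count: element count times the stride of Float.
let dataBuffer = device.makeBuffer(bytes: &manualData,
                                   length: manualData.count * MemoryLayout<Float>.stride,
                                   options: [])

// In a compute pass the analogue of setFragmentBuffer is setBuffer
// on the compute command encoder (computeEncoder assumed to exist):
// computeEncoder.setBuffer(dataBuffer, offset: 0, index: 1)
```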

Not really answering your question, but you could just define an array in your Metal shader instead of passing the values as a texture.

Something like:

constant float manualData[8] = { 1.0, 0.0, 0.0, 1.0,
                                 0.0, 0.0, 1.0, 1.0 };

vertex float4 world_vertex(unsigned int vid[[vertex_id]], ...) {
    int manualIndex = vid % 8;
    float manualValue = manualData[manualIndex];
    // something deep and meaningful here...
    return float4(manualValue);
}

Please forgive the terse reply, but you may find it useful to take a look at my experiments with Swift and Metal. I've created a particle system in Swift which is passed to a Metal compute shader as a one-dimensional array of Particle structs. By using posix_memalign, I'm able to eliminate the bottleneck caused by passing the array between Metal and Swift.
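The pattern described here, a page-aligned allocation shared between Swift and Metal without copying, might be sketched like this in current Swift; note that the buffer length must be a multiple of the page size for makeBuffer(bytesNoCopy:):

```swift
import Foundation
import Metal

guard let device = MTLCreateSystemDefaultDevice() else { fatalError("no Metal device") }

// 1024 floats * 4 bytes = 4096 bytes, exactly one 4 KB page.
let count = 1024
let byteCount = count * MemoryLayout<Float>.stride

// posix_memalign returns page-aligned memory, which
// makeBuffer(bytesNoCopy:) requires.
var memory: UnsafeMutableRawPointer?
posix_memalign(&memory, 0x1000, byteCount)

// Metal wraps the allocation instead of copying it, so the CPU side
// and the GPU buffer share the same storage; no per-frame copy.
let sharedBuffer = device.makeBuffer(bytesNoCopy: memory!,
                                     length: byteCount,
                                     options: [],
                                     deallocator: { pointer, _ in free(pointer) })
```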

I've blogged extensively about this: http://flexmonkey.blogspot.co.uk/search/label/Metal

I hope this helps.

Simon
