Splitting a byte[] into multiple byte[] arrays in C#
I am trying to "chunk" up the bytes of an image. This will allow me to upload a large image in portions. I have the image currently stored as one large byte[]. I would like to split the byte array into byte[]s with a max length of 512 elements each. However, I'm not sure how to do this in the most efficient way.
Does anyone know how I can do this in the most efficient manner?
I wrote an extension for this, originally for strings, but decided to make it generic.
public static T[] CopySlice<T>(this T[] source, int index, int length, bool padToLength = false)
{
    int n = length;
    T[] slice = null;

    if (source.Length < index + length)
    {
        n = source.Length - index;
        if (padToLength)
        {
            slice = new T[length];
        }
    }

    if (slice == null) slice = new T[n];
    Array.Copy(source, index, slice, 0, n);
    return slice;
}

public static IEnumerable<T[]> Slices<T>(this T[] source, int count, bool padToLength = false)
{
    for (var i = 0; i < source.Length; i += count)
        yield return source.CopySlice(i, count, padToLength);
}
Basically, you can use it like so:
byte[] myBytes; // original byte array
foreach (byte[] slice in myBytes.Slices(10))
{
    // do something with each slice
}
Edit: I also provided an answer on SO using Buffer.BlockCopy here, but BlockCopy will only work on byte[] arrays, so a generic version for strings wouldn't be possible.
The most efficient method would be: not to. If you already have the image as a single byte[], then for local code just specifying the offset and length (perhaps an ArraySegment<byte>) is usually sufficient. If your upload API only takes byte[], then you still shouldn't chunk it completely; just use a single 512-byte buffer and use Buffer.BlockCopy to load it with successive pieces of the data. You may need to resize (Array.Resize) the final chunk, but at most 2 arrays should be needed.
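That single-buffer approach can be sketched as follows (untested; `GetImageBytes` and `Upload` are hypothetical stand-ins for your data source and upload API):

```csharp
byte[] source = GetImageBytes(); // hypothetical: the image as one large array
byte[] buffer = new byte[512];   // reused for every full chunk

for (int offset = 0; offset < source.Length; offset += 512)
{
    int size = Math.Min(512, source.Length - offset);
    if (size < buffer.Length)
        buffer = new byte[size]; // final partial chunk: the only second array
    Buffer.BlockCopy(source, offset, buffer, 0, size);
    Upload(buffer);              // hypothetical upload call taking byte[]
}
```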
Even better: avoid needing a byte[] in the first place. Consider loading the data via a streaming API (this will work well if the data is coming from a file); just use Read (in a loop, processing the returned value etc.) to populate chunks of max 512. For example (untested, just for illustration):
byte[] buffer = new byte[512];
while (true) {
    int space = 512, read, offset = 0;
    while (space > 0 && (read = stream.Read(buffer, offset, space)) > 0) {
        space -= read;
        offset += read;
    }
    // either a full buffer, or EOF
    if (space != 0) { // EOF - final
        if (offset != 0) { // something to send
            Array.Resize(ref buffer, offset);
            Upload(buffer);
        }
        break;
    } else { // full buffer
        Upload(buffer);
    }
}
// requires a using System.Linq; directive for Skip/Take
public static IEnumerable<byte[]> Split(this byte[] value, int bufferLength)
{
    int countOfArray = value.Length / bufferLength;
    if (value.Length % bufferLength > 0)
        countOfArray++;
    for (int i = 0; i < countOfArray; i++)
    {
        yield return value.Skip(i * bufferLength).Take(bufferLength).ToArray();
    }
}
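For example, with a 1300-byte array (note that Skip/Take re-enumerate the source array from the start on every iteration, so this is not the most efficient option for very large arrays):

```csharp
byte[] data = new byte[1300];
foreach (byte[] chunk in data.Split(512))
{
    Console.WriteLine(chunk.Length); // 512, 512, then 276
}
```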
This is the extension I used.
I know this is old, but I needed the same solution and the following works perfectly for me; hope this helps someone.
private byte[][] ByteArrayToChunks(byte[] byteData, long bufferSize)
{
    byte[][] chunks = byteData
        .Select((value, index) => new { PairNum = Math.Floor(index / (double)bufferSize), value })
        .GroupBy(pair => pair.PairNum)
        .Select(grp => grp.Select(g => g.value).ToArray())
        .ToArray();
    return chunks;
}
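Usage would look something like this (the sizes here are just an illustration):

```csharp
byte[] data = new byte[1000];
byte[][] chunks = ByteArrayToChunks(data, 512);
// chunks.Length == 2; chunks[0].Length == 512, chunks[1].Length == 488
```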