
Getting binary data using SqlDataReader

I have a table named Blob (Id (int), Data (Image)). I need to use SqlDataReader to get that image data. Note that I don't want to Response.BinaryWrite() the data to the browser; I just need the binary data as a byte[] for some internal operations. The only way I can think of is to get the Id with SqlDataReader and then use SqlCommand.ExecuteScalar() again to get the data as a byte[] for that Id. Can I use just the SqlDataReader (SqlCommand.ExecuteReader) to get the image data as a byte[]? Am I missing anything?

You should be able to get it via (byte[])reader["Data"].

Also note that the image data type is deprecated, and will be removed in a future version of SQL Server; use varbinary(max) instead.
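
For example, a minimal sketch against the Blob table from the question, using the System.Data.SqlClient types (the connection string is a placeholder):

    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand("SELECT Data FROM Blob WHERE Id = @id", connection))
    {
        command.Parameters.AddWithValue("@id", 1);
        connection.Open();
        using (SqlDataReader reader = command.ExecuteReader())
        {
            if (reader.Read())
            {
                byte[] data = (byte[])reader["Data"];  // the image/varbinary column arrives as a byte[]
            }
        }
    }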

Yes, you can use SqlDataReader.GetBytes. You probably want to pass in null for the buffer in the first call, to find out how much data there is, and then call it again with an appropriately sized buffer.

You may just be able to use the indexer and cast the result to a byte array - I'm not sure. Worth a try:)
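
For example, a minimal sketch of that two-call pattern, assuming reader is an open SqlDataReader and ordinal is the index of the binary column:

    // First call with a null buffer only returns the total length of the field.
    long size = reader.GetBytes(ordinal, 0, null, 0, 0);
    byte[] data = new byte[size];
    // Second call fills a buffer of exactly that size.
    reader.GetBytes(ordinal, 0, data, 0, (int)size);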

In .NET Framework 4.5 you can use the GetStream method to access the binary data as a Stream.
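
For example, assuming an open SqlDataReader (ideally created with CommandBehavior.SequentialAccess) positioned on a row, the Data column from the question could be copied out like this:

    using (Stream stream = reader.GetStream(reader.GetOrdinal("Data")))
    using (var memory = new MemoryStream())
    {
        stream.CopyTo(memory);           // stream the BLOB without managing GetBytes offsets by hand
        byte[] data = memory.ToArray();  // byte[] for further processing
    }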

From MSDN. Not sure why I couldn't find that before.

    SqlConnection pubsConn = new SqlConnection("Data Source=localhost;Integrated Security=SSPI;Initial Catalog=pubs;");
    SqlCommand logoCMD = new SqlCommand("SELECT pub_id, logo FROM pub_info", pubsConn);

    FileStream fs;                          // Writes the BLOB to a file (*.bmp).
    BinaryWriter bw;                        // Streams the BLOB to the FileStream object.

    int bufferSize = 100;                   // Size of the BLOB buffer.
    byte[] outbyte = new byte[bufferSize];  // The BLOB byte[] buffer to be filled by GetBytes.
    long retval;                            // The bytes returned from GetBytes.
    long startIndex = 0;                    // The starting position in the BLOB output.

    string pub_id = "";                     // The publisher id to use in the file name.

    // Open the connection and read data into the DataReader.
    pubsConn.Open();
    SqlDataReader myReader = logoCMD.ExecuteReader(CommandBehavior.SequentialAccess);

    while (myReader.Read())
    {
      // Get the publisher id, which must occur before getting the logo.
      pub_id = myReader.GetString(0);  

      // Create a file to hold the output.
      fs = new FileStream("logo" + pub_id + ".bmp", FileMode.OpenOrCreate, FileAccess.Write);
      bw = new BinaryWriter(fs);

      // Reset the starting byte for the new BLOB.
      startIndex = 0;

      // Read the bytes into outbyte[] and retain the number of bytes returned.
      retval = myReader.GetBytes(1, startIndex, outbyte, 0, bufferSize);

      // Continue reading and writing while there are bytes beyond the size of the buffer.
      while (retval == bufferSize)
      {
        bw.Write(outbyte);
        bw.Flush();

        // Reposition the start index to the end of the last buffer and fill the buffer.
        startIndex += bufferSize;
        retval = myReader.GetBytes(1, startIndex, outbyte, 0, bufferSize);
      }

      // Write the remaining bytes, if any (retval is 0 when the BLOB size
      // is an exact multiple of the buffer size).
      if (retval > 0)
          bw.Write(outbyte, 0, (int)retval); // note: the original MSDN sample wrote retval - 1 bytes here, which dropped the last byte
      bw.Flush();

      // Close the output file.
      bw.Close();
      fs.Close();
    }

    // Close the reader and the connection.
    myReader.Close();
    pubsConn.Close();

Use this function for safe and flexible reading of bytes:

    /// <summary>
    /// Reads all available bytes from reader
    /// </summary>
    /// <param name="reader"></param>
    /// <param name="ordinal"></param>
    /// <returns></returns>
    private byte[] GetBytes(DbDataReader reader, int ordinal) // DbDataReader is the base class of both SqlDataReader and SqliteDataReader
    {
        byte[] result = null;

        if (!reader.IsDBNull(ordinal))
        {
            long size = reader.GetBytes(ordinal, 0, null, 0, 0); // get the length of the data
            result = new byte[size];
            int bufferSize = 1024;
            long bytesRead = 0;
            int curPos = 0;
            while (bytesRead < size)
            {
                // never request more bytes than remain, so the result array cannot be overrun
                int length = (int)Math.Min(bufferSize, size - bytesRead);
                long read = reader.GetBytes(ordinal, curPos, result, curPos, length);
                if (read == 0)
                    break; // defensive: stop if no more bytes are returned
                bytesRead += read;
                curPos += (int)read;
            }
        }

        return result;
    }
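
For example, it could be called while iterating a reader (the column name here is just a placeholder):

    using (var reader = command.ExecuteReader())
    {
        while (reader.Read())
        {
            byte[] data = GetBytes(reader, reader.GetOrdinal("Data"));
            // data is null when the column value is NULL
        }
    }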

No need to use the reader. Just use a DataSet to fetch the values from the database (via a stored procedure or any other method), cast the column value to byte[] (code below), and store it in a byte array. Your work is done.

byte[] productImage;
productImage = (byte[])ds.Tables[0].Rows[0]["Image"];
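
For example, a minimal sketch of filling such a DataSet (the connection string, query, and table name are placeholders):

    var ds = new DataSet();
    using (var connection = new SqlConnection(connectionString))
    using (var adapter = new SqlDataAdapter("SELECT Image FROM Products WHERE Id = @id", connection))
    {
        adapter.SelectCommand.Parameters.AddWithValue("@id", 1);
        adapter.Fill(ds);   // ds.Tables[0] now holds the row(s); the Image column can be cast as shown above
    }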

This is an old question, and I had been using Anton Bakulev's answer above for a while, until I ran into a case where my data was actually bigger than the 2 GB that an int curPos can address. When I tried changing that bufferIndex argument to 0, everything beyond the first bufferSize bytes came back corrupted. (Also, that tiny buffer size made loading anything above 2 MB painfully slow.)

No, you probably should not have over 2 GB of data in a single column in your database; try to avoid that. But just in case, here is a more robust and streamlined version of the code, as an SqlDataReader extension method:

public static byte[] ParseStrictByteArray(this SqlDataReader reader, string columnName)
{
    int colIdx = reader.GetOrdinal(columnName);
    long size = reader.GetBytes(colIdx, 0, null, 0, 0);
    byte[] imageValue = new byte[size];
    // essentially, we are loading all this data in memory, either way... Might as well do it in one swoop if we can
    int bufferSize = (int)Math.Min(int.MaxValue, size); 
    //int.MaxValue = 2,147,483,647 = roughly 2 GB of data, so if the data > 2GB we have to read in chunks
            
    if (size > bufferSize)
    {
        long bytesRead = 0;
        long position = 0; // long, so the offset does not overflow once we pass the 2 GB mark
        //we need to copy the data over, which means we DON'T want a full copy of all the data in memory.
        //We need to reduce the buffer size (but not too much, as multiple calls to the reader also affect performance a lot)
        bufferSize = 104857600; //this is roughly 100MB
        byte[] buffer = new byte[bufferSize];
        while (bytesRead < size)
        {
            if (size - bytesRead < bufferSize)
                bufferSize = Convert.ToInt32(size - bytesRead);

            bytesRead += reader.GetBytes(colIdx, position, buffer, 0, bufferSize);
            //shift the buffer into the final array (the long-based overload of Array.Copy is used because position is a long)
            Array.Copy(buffer, 0, imageValue, position, bufferSize);
            position += bufferSize;
        }
    }
    else 
    {
        //single read into the image buffer
        reader.GetBytes(colIdx, 0, imageValue, 0, bufferSize);
    }

    return imageValue;
} 
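
Usage is then a single call per row (the column name is a placeholder; as with any extension method, the method above needs to live in a static class):

    using (var reader = command.ExecuteReader(CommandBehavior.SequentialAccess))
    {
        while (reader.Read())
        {
            byte[] data = reader.ParseStrictByteArray("Data");
        }
    }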
