Hello friends! I have some database tables that store an Image/Bitmap as part of each record as a byte[]. I also have a method that loads all of a table's records into a DataTable, and I would like some suggestions on how I can convert this byte[] column in my DataTable to type System.Drawing.Image during or after the load from the DbDataReader.

Thanks in advance!

public DataTable SelectAll(string tableName, DbConnection conn)
{
    DataTable dt = new DataTable();

    try
    {
        // NOTE: tableName is concatenated directly into the SQL,
        // so only call this with trusted table names.
        string query = "SELECT * FROM " + tableName;

        conn.Open();

        using (DbCommand cmd = GetDbCommand(query, conn))
        {
            using (DbDataReader dr = cmd.ExecuteReader())
            {
                dt.Load(dr);
            }
        }
    }
    catch (DbException ex)
    {
        Console.WriteLine("Exception: {0}\r\n   Stack Trace: {1}", ex.Message, ex.StackTrace);
        System.Diagnostics.Debugger.Break();
    }
    finally
    {
        conn.Close();
    }

    return dt;
}

Just create a new column and move the data over, converting it to the desired data type.


I don't know of a one-liner to do this:

private void button1_Click(object sender, EventArgs e)
    {
      DataTable dt = new DataTable();
      dt.Columns.Add(new DataColumn("RecordId", typeof(int)));
      dt.Columns.Add(new DataColumn("Picture", typeof(byte[])));
      {
        DataRow row = dt.NewRow();
        row["RecordId"] = 1;
        row["Picture"] = System.IO.File.ReadAllBytes(@"C:\picture.bmp");
        dt.Rows.Add(row);
        row = dt.NewRow();
        row["RecordId"] = 2;
        row["Picture"] = System.IO.File.ReadAllBytes(@"C:\picture2.bmp");
        dt.Rows.Add(row);
      }

      string colName = Guid.NewGuid().ToString();
      dt.Columns.Add(new DataColumn(colName, typeof(Image)));
      foreach (DataRow row in dt.Rows)
      {
        using (System.IO.MemoryStream ms = new System.IO.MemoryStream((byte[])row["Picture"]))
        {
          // Copy into a new Bitmap; GDI+ otherwise requires the stream to
          // stay open for the lifetime of the image.
          Image bmp = new Bitmap(Image.FromStream(ms));
          row.BeginEdit();
          row[colName] = bmp;
          row.EndEdit();
        }
      }
      dt.AcceptChanges();
      dt.Columns.Remove("Picture");
      dt.Columns[colName].ColumnName = "Picture";
      dt.AcceptChanges();
      System.Diagnostics.Debugger.Break();
    }


Thanks Scott, that works fine for what I'm doing.

Out of curiosity, do you know of a way or example of overriding the default behavior of the datatable's load from the db reader? I was thinking if I had a large table schema with several column types that cannot be directly translated to system types that I would want to consider a more efficient way of performing a load and conversion all-in-one.

You can use .NET Reflector to dig into the SQL classes to see how it works. I'm sure it parses the rows one by one, so you could manually read the rows and convert the columns as you import the data instead of using DataTable.Load(). I personally try to do everything at the data access layer after the .NET framework loads it into a DataTable (like the earlier example). In the old days of ADO, data access did not work so hot -- and I'm reluctant to try extending their current data access and cause problems for myself.
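Here is a minimal sketch of that idea: importing rows manually from a DbDataReader and converting the byte[] column to an Image column as each row arrives. The column names RecordId/Picture match the earlier example; in real code the reader would come from cmd.ExecuteReader(), but a DataTableReader (which derives from DbDataReader) stands in here so the sample runs on its own.

```csharp
using System;
using System.Data;
using System.Data.Common;
using System.Drawing;
using System.Drawing.Imaging;
using System.IO;

public class ManualLoadDemo
{
    // Read rows one by one instead of DataTable.Load(), converting the
    // byte[] "Picture" column to an Image during the import.
    public static DataTable LoadConverting(DbDataReader dr)
    {
        DataTable dt = new DataTable();
        dt.Columns.Add("RecordId", typeof(int));
        dt.Columns.Add("Picture", typeof(Image));

        while (dr.Read())
        {
            DataRow row = dt.NewRow();
            row["RecordId"] = dr.GetInt32(0);
            using (MemoryStream ms = new MemoryStream((byte[])dr[1]))
                row["Picture"] = new Bitmap(Image.FromStream(ms)); // copy so the stream can close
            dt.Rows.Add(row);
        }
        return dt;
    }

    public static void Main()
    {
        // Fabricate an in-memory "database" table with one bitmap record.
        byte[] bmpBytes;
        using (Bitmap bmp = new Bitmap(2, 2))
        using (MemoryStream ms = new MemoryStream())
        {
            bmp.Save(ms, ImageFormat.Bmp);
            bmpBytes = ms.ToArray();
        }

        DataTable source = new DataTable();
        source.Columns.Add("RecordId", typeof(int));
        source.Columns.Add("Picture", typeof(byte[]));
        source.Rows.Add(1, bmpBytes);

        DataTable dt = LoadConverting(source.CreateDataReader());
        Console.WriteLine(dt.Columns["Picture"].DataType); // System.Drawing.Image
    }
}
```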

As far as efficiency goes, it would probably be the same. If you're pulling a PDF or an image down from SQL Server, it is stored as a byte[] on the server. On the client end you have to instantiate the container class and load the buffer. No matter where you perform that task, you're going to take a performance hit.

What you could do that my code didn't is set the picture byte[] column to DBNull.Value after the image is loaded. The way I wrote the code means you will use double the memory until all of the conversion is done; only then does it drop the byte buffers.

foreach (DataRow row in dt.Rows)
      {
        using (System.IO.MemoryStream ms = new System.IO.MemoryStream((byte[])row["Picture"]))
        {
          // Copy into a new Bitmap; GDI+ otherwise requires the stream to
          // stay open for the lifetime of the image.
          Image bmp = new Bitmap(Image.FromStream(ms));
          row.BeginEdit();
          row[colName] = bmp;
          row["Picture"] = DBNull.Value; //new line, reclaim memory
          row.EndEdit();
        }
      }

You cannot change the DataType of a column once it already has data. Why not just add a new column and remove the old one, as in this case?
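A quick sketch showing that restriction: once a column's table contains rows, assigning DataColumn.DataType throws an ArgumentException, which is why the add-a-column/remove-a-column approach is needed.

```csharp
using System;
using System.Data;

public class DataTypeDemo
{
    public static void Main()
    {
        DataTable dt = new DataTable();
        dt.Columns.Add("Picture", typeof(byte[]));
        dt.Rows.Add(new object[] { new byte[] { 0x42 } });

        try
        {
            // Throws: the DataType of a column cannot change once it holds data.
            dt.Columns["Picture"].DataType = typeof(string);
        }
        catch (ArgumentException)
        {
            Console.WriteLine("DataType is locked once the column has rows.");
        }
    }
}
```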

... I'm sure it parses the rows one by one, so you could manually read the rows and convert the columns as you import the data instead of using DataTable.Load().

...On the client end you have to instantiate the container class and load the buffer.

Thanks for the follow-up, sknake. I'm going to mark this as solved, but I was wondering if you could point me to any examples related to the excerpted quotes above?

DdoubleD,

Watching the behavior of DTS in MSSQL, I noticed that it submitted two queries -- the first fetches the schema only, and the second starts fetching the data. It appears what they did is this:

public DataTable QueryDataTableSchemaOnly(string query, List<DataParameter> lst)
    {
      DataTable result = null;
      using (DbConnection conn = GetDbConnection())
      {
        conn.Open();
        using (DbCommand cmd = GetDbCommand(query, lst, conn))
        {
          LogQuery(query, lst);
          using (DbDataReader dr = cmd.ExecuteReader(CommandBehavior.SchemaOnly))
          {
            result = new DataTable();
            result.Load(dr);
          }
        }
        conn.Close();
      }
      return result;
    }

Now that you have the schema information, you could re-run the query and execute a reader, but instead of loading the reader directly into a DataTable you could read row by row and convert the values to the desired data types as you import them. I have long suspected this is how the .NET framework works under the hood, but I have never dug into it.
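To make that concrete, here is a sketch that compresses the two-query idea into one pass: the target table is built from the reader's own schema (via GetFieldType, standing in for a separate CommandBehavior.SchemaOnly query), every byte[] column is swapped for an Image column, and the rows are then imported manually. A DataTableReader substitutes for a real database reader so the sample is self-contained.

```csharp
using System;
using System.Data;
using System.Data.Common;
using System.Drawing;
using System.Drawing.Imaging;
using System.IO;

public class SchemaFirstDemo
{
    // Build the target table from the reader's schema, replacing byte[]
    // columns with Image columns, then import and convert row by row.
    public static DataTable LoadWithSchema(DbDataReader dr)
    {
        DataTable dt = new DataTable();
        for (int i = 0; i < dr.FieldCount; i++)
        {
            Type fieldType = dr.GetFieldType(i);
            dt.Columns.Add(dr.GetName(i),
                fieldType == typeof(byte[]) ? typeof(Image) : fieldType);
        }

        while (dr.Read())
        {
            DataRow row = dt.NewRow();
            for (int i = 0; i < dr.FieldCount; i++)
            {
                if (dt.Columns[i].DataType == typeof(Image) && !dr.IsDBNull(i))
                {
                    using (MemoryStream ms = new MemoryStream((byte[])dr[i]))
                        row[i] = new Bitmap(Image.FromStream(ms)); // copy so the stream can close
                }
                else
                {
                    row[i] = dr.GetValue(i);
                }
            }
            dt.Rows.Add(row);
        }
        return dt;
    }

    public static void Main()
    {
        byte[] bmpBytes;
        using (Bitmap bmp = new Bitmap(2, 2))
        using (MemoryStream ms = new MemoryStream())
        {
            bmp.Save(ms, ImageFormat.Bmp);
            bmpBytes = ms.ToArray();
        }

        DataTable source = new DataTable();
        source.Columns.Add("RecordId", typeof(int));
        source.Columns.Add("Picture", typeof(byte[]));
        source.Rows.Add(1, bmpBytes);

        DataTable dt = LoadWithSchema(source.CreateDataReader());
        Console.WriteLine(dt.Columns["Picture"].DataType); // System.Drawing.Image
    }
}
```

In real code you would keep the separate SchemaOnly pass shown above if you want the schema before any data arrives; this single-reader version just demonstrates the convert-as-you-import step.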
