
Reduce memory footprint if possible using ADO.NET adapter

I need to synchronize a mobile database with the main one. The idea is the following: I have a REST service that accepts as input the name of the table I need to sync (along with a DateTime, so that on subsequent syncs only the delta since the last sync is exchanged).

The current implementation (since I want to be as generic as possible) is the following:

    using (SqlConnection connection = new SqlConnection(connectionString))
    using (SqlCommand command = new SqlCommand("SELECT * FROM XXX", connection))
    using (SqlDataAdapter sqlDataAdapter = new SqlDataAdapter(command))
    {
        // Fill opens and closes the connection automatically.
        DataSet ds = new DataSet();
        sqlDataAdapter.Fill(ds);

        using (var memoryStream = new MemoryStream())
        {
            ds.WriteXml(memoryStream, XmlWriteMode.WriteSchema);

            var data = Encoding.UTF8.GetString(memoryStream.ToArray());

            // just for test
            File.WriteAllText("c:\\temp\\XXX.txt", data);
        }
    }

Now, for a table that is around 256 MB on disk, I get a memory footprint of 1 GB. Any suggestions on how to reduce the memory footprint?

Thanks

Options to reduce memory:

  1. don't load the entire table; add a WHERE clause, perhaps using paging or similar
  2. don't use DataTable - it is not your friend for efficiency; if you just need the data, consider POCO types, perhaps with any ORM of your choice
  3. if you are trying to write to a file, don't use a MemoryStream: write directly to the FileStream so you don't need an additional copy of everything; there's also no need for that ToArray() - even if you use a MemoryStream, prefer GetBuffer() and limit to .Length; or use a StringWriter
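Point 3 can be sketched as follows. This is a minimal, self-contained example (the table contents and the output path are placeholders, and the DataSet here is built in memory rather than via SqlDataAdapter.Fill): the DataSet is written straight to a FileStream, so the XML never has to exist as a second in-memory copy alongside the DataSet.

```csharp
using System.Data;
using System.IO;

class Program
{
    static void Main()
    {
        // Stand-in for the table filled by SqlDataAdapter.Fill in the question.
        var table = new DataTable("XXX");
        table.Columns.Add("ID", typeof(int));
        table.Columns.Add("Name", typeof(string));
        table.Rows.Add(1, "Mary");
        table.Rows.Add(2, "Andy");

        var ds = new DataSet();
        ds.Tables.Add(table);

        // Write straight to the file: no MemoryStream, no ToArray(),
        // no intermediate string - one copy of the XML, and it lives on disk.
        string path = Path.Combine(Path.GetTempPath(), "XXX.xml");
        using (var fileStream = File.Create(path))
        {
            ds.WriteXml(fileStream, XmlWriteMode.WriteSchema);
        }
    }
}
```

If the REST service has to return the XML rather than write a file, the same idea applies: write the DataSet directly into the HTTP response stream instead of materializing the whole document as a string first.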

Let me explain: since I need to load the data (and on the first sync I need to fully load around 30 tables), I need to have the whole dataset. I know POCOs are better, but I would have to create 30 POCOs and manually perform the inserts on the client side. This way I load the DataSet and do a bulk insert.

If you take a look at the documentation, you will see the output for a very simple example DataTable. XML just creates a huge overhead from all the opening and closing XML tags.

From the link

This:

table.Rows.Add(new object[] { 1, "Mary" }); 
table.Rows.Add(new object[] { 2, "Andy" }); 
table.Rows.Add(new object[] { 3, "Peter" }); 
table.Rows.Add(new object[] { 4, "Russ" });

Results in

<Table1>
    <ID>1</ID>
    <Name>Mary</Name>
</Table1>
<Table1>
    <ID>2</ID>
    <Name>Andy</Name>
</Table1>
<Table1>
    <ID>3</ID>
    <Name>Peter</Name>
</Table1>
<Table1>
    <ID>4</ID>
    <Name>Russ</Name>
</Table1>

What I want to say is that in this small example, source content of 17 characters and 4 integers results in 156 characters of XML. This should explain why a memory footprint of 1 GB for a 256 MB database is to be expected.
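The overhead is easy to measure yourself. This sketch rebuilds the four-row table from the example above and compares a rough count of the raw payload against the serialized XML size (the exact byte count depends on the writer's formatting, so no precise number is claimed here):

```csharp
using System;
using System.Data;
using System.IO;

class Program
{
    static void Main()
    {
        var table = new DataTable("Table1");
        table.Columns.Add("ID", typeof(int));
        table.Columns.Add("Name", typeof(string));
        table.Rows.Add(1, "Mary");
        table.Rows.Add(2, "Andy");
        table.Rows.Add(3, "Peter");
        table.Rows.Add(4, "Russ");

        // Raw payload: the name characters plus the integer digits.
        int rawChars = 0;
        foreach (DataRow row in table.Rows)
            rawChars += ((string)row["Name"]).Length + row["ID"].ToString().Length;

        using (var stream = new MemoryStream())
        {
            table.WriteXml(stream);
            // The XML is several times larger than the raw payload,
            // entirely due to the repeated element tags.
            Console.WriteLine($"raw: {rawChars} chars, xml: {stream.Length} bytes");
        }
    }
}
```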
