
Using SqlDataReader for filling List<T> in C#

I have a class like below,

class Student
{
    public string Name {get; set;}
    public string Surname {get; set;}
    public int Age {get; set;}
    public string Address {get; set;}
}

And I have a MySQL table with 50,000 records. The structure of the table is as follows:

ID        NAME       SURNAME        AGE          ADDRESS
1         Joe        Philip         20           Moscow
2         Misha      Johny          25           London
...

And I have this C# code:

List<Student> students = new List<Student>();
string sql = "SELECT name,surname,age,address FROM Students";
command.CommandText = sql;
MySqlDataReader reader = command.ExecuteReader();
while(reader.Read())
{
    Student st = new Student();
    st.Name = reader["Name"].ToString();
    st.Surname = reader["Surname"].ToString();
    st.Age = Convert.ToInt32(reader["Age"].ToString());
    st.Address = reader["Address"].ToString();
    students.Add(st);
}

But it runs very slowly. What do you advise to make this code run faster?

UPDATE:

When I use this code,

DataTable dt = new DataTable();
adapter.Fill(dt);

It works very well and the speed is fine. So what is the problem when I try it with my own classes?

If the code runs slowly, the main cause is that there are 50,000 records. What exactly do you need 50,000 Student objects for? If you can find a way to solve your problem without reading all of those records and creating all of those objects, you'll have faster code.

Update

Using your own class is fine. Most of the time when things run slowly, it's because your code is I/O bound (you spend most of your time waiting for I/O). To avoid all that I/O, you can reduce the amount of data you retrieve (perhaps by eliminating irrelevant columns or rows), or do the processing on the database side through a more complex query or a stored procedure.
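For example, if only a subset of the rows is actually needed, push that filtering into the query itself so less data crosses the wire. A minimal sketch (the WHERE clause, the @city parameter and the connection variable are illustrative assumptions, not part of the original question):

// Sketch: retrieve only the rows and columns you actually need.
// The WHERE clause and the @city parameter are illustrative assumptions.
List<Student> students = new List<Student>();

using (MySqlCommand command = connection.CreateCommand())
{
    command.CommandText =
        "SELECT name, surname, age, address FROM Students WHERE address = @city";
    command.Parameters.AddWithValue("@city", "Moscow");

    using (MySqlDataReader reader = command.ExecuteReader())
    {
        while (reader.Read())
        {
            students.Add(new Student
            {
                Name = reader.GetString(0),
                Surname = reader.GetString(1),
                Age = reader.GetInt32(2),
                Address = reader.GetString(3)
            });
        }
    }
}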

Update 2

To answer your follow-up question (why creating a list of objects is slower than filling a DataSet), I would expect reading the entire query into a DataSet to be only slightly faster than object creation. I'm not familiar with how the MySQL .NET library is implemented, so it is surprising to me that the two methods would differ much in speed. Maybe MySqlDataReader is doing something dumb like using an internal DataSet. If the performance is drastically different between the two, it's probably something the author of that library should fix.

Update 3

This answer to "MySqlDataAdapter or MySqlDataReader for bulk transfer?" has a good tip: setting the BatchSize of the reader may help. If the batch size is too small, the reader becomes less efficient with a large number of records like yours.

Using the ordinal index of the column instead of the column name makes the performance a bit better. Use

st.Name = reader[0].ToString();

instead of

st.Name = reader["Name"].ToString();

and

st.Surname = reader[1].ToString();

instead of

st.Surname = reader["Surname"].ToString();
