ListView ColumnWidthChanged Event gets fired 'without reason'

I have a ListView and an event handler for its ColumnWidthChanged event that stores the new column widths in a config file every time the user changes them.

I've added flags around every piece of code that changes the columns internally, so that the handler only reacts to user input.
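For reference, a minimal sketch of the setup described above; the flag name, the handler name, and the SaveColumnWidth helper are placeholders for whatever the real project uses, not the original code:

    using System;
    using System.Windows.Forms;

    public partial class MainForm : Form
    {
        // Set to true while code (not the user) resizes columns.
        private bool _internalColumnChange;

        private void listView1_ColumnWidthChanged(object sender, ColumnWidthChangedEventArgs e)
        {
            if (_internalColumnChange)
                return; // ignore programmatic resizes

            // Placeholder for whatever writes the new width to the config file.
            SaveColumnWidth(e.ColumnIndex, listView1.Columns[e.ColumnIndex].Width);
        }

        private void SaveColumnWidth(int columnIndex, int width)
        {
            // Persist the value to the config file; omitted here.
        }
    }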

Still, the event fires at program start (once for every column in the ListView).

The call stack only shows the event handler, 'external code', and the main 'program.cs'.

I just can't figure out where the event is being triggered from. Any ideas?

You will need to move the code that wires up the ColumnWidthChanged event into the Shown event of your form.

What is happening is that the ListView takes up physical space on the form, so as it is being built it has to calculate the size of each column. As the data is populated, each column gets resized once it has its data. That is why the event fires once for each column before the control is even displayed.
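A minimal sketch of that suggestion, assuming the handler from the question is called listView1_ColumnWidthChanged: subscribe in Shown rather than in the constructor or designer, so the initial layout and data population no longer reach the handler.

    public MainForm()
    {
        InitializeComponent();

        // Do NOT subscribe to ColumnWidthChanged here (or in the designer).
        Shown += MainForm_Shown;
    }

    private void MainForm_Shown(object sender, EventArgs e)
    {
        // By the time Shown fires, the ListView has been laid out and populated,
        // so only genuine user resizes will reach the handler from here on.
        listView1.ColumnWidthChanged += listView1_ColumnWidthChanged;
    }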

As for why this does not happen with a DataGridView, my guess is that only the overall size of the control matters when it is added to the form; it waits to calculate the size of each column until after the DataGridView has already been added. It's slightly unintuitive behavior, but that appears to be what is happening based on the results you're seeing.
