
Why does an ASP.NET long-running page request hang forever for the client?

We ran into a problem with an ASP.NET web application written in C# and built with Web Forms (.ASPX pages). The system has some long-running requests which take about 10-15 minutes. This is normal, due to the heavy SQL queries and calculations running in the background. It is an old system, and this has never been a problem over the years. Some time ago a couple of users started complaining that they cannot download the reports which take more than 10 minutes to run. Technically, they run a report by hitting a button on the page, which performs a post-back to the server where the report is generated; when the file is ready, the server returns it to the client, where the “save file” dialog appears (depending on the browser). Unfortunately, those users don't get the “save file” dialog, the file never appears in the browser, and the page seems to be loading forever.

As I said, we have never seen this issue before (the system has been hosted on different servers over the years), and we are unable to reproduce the problem in our local development environment. Most interestingly, we cannot reproduce the problem when we run the same report from our personal computers, which are outside of the client's corporate AD domain network. However, we can reproduce the problem when running sample requests in Postman.

Here is a short summary of when the problem can be reproduced and when everything works fine:

  1. Server = [production server, IIS 10, Windows Server 2019]; Client = [Any browser (either Chrome or Edge) when this is a workstation in the client's corporate AD domain network]; Problem appears => YES
  2. Server = [production server, IIS 10, Windows Server 2019]; Client = [Any browser (either Chrome or Edge) when this is a computer OUTSIDE the corporate AD domain network]; Problem appears => NO
  3. Server = [production server, IIS 10, Windows Server 2019]; Client = [Postman OR sample C# console application making HTTP requests OR PowerShell script when the client computer is OUTSIDE the corporate AD domain network]; Problem appears => YES
  4. Server = [development machine, IIS 10, Windows 10]; Client = [Any browser (either Chrome or Edge) from a computer OUTSIDE the corporate AD domain network]; Problem appears => NO

We did various experiments, and the problem appears in all of the following use cases:

  • Do a post-back to the .ASPX page and reload some content on the screen
  • Do a post-back and return a file attachment (the “save file” dialog in the browser)
  • Sample page which just returns some plain text as a response

We found that in all those cases the problem appears if the HTTP request takes more than 240-250 seconds.

The problem is that no error appears, and we cannot see any indication in the browser's DevTools (F12), in Postman, or in Fiddler. It just looks as if the request hangs and remains “pending” forever.

All the “timeout” settings on the server that we know of are long enough. For example, the ASP.NET session timeout is 60 minutes, httpRuntime is configured with executionTimeout=3600, and the Application Pool Idle Time-out is set to 0.
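For reference, here is a sketch of how the server-side settings described above would typically appear in web.config (the values shown are the ones quoted above; the Application Pool Idle Time-out is set in IIS Manager, not in web.config):

```xml
<!-- Sketch of the timeout settings described above; values match those quoted
     in the text. Note: executionTimeout is only enforced when debug="false". -->
<system.web>
  <!-- ASP.NET session timeout, in minutes -->
  <sessionState timeout="60" />
  <!-- maximum request execution time, in seconds -->
  <httpRuntime executionTimeout="3600" />
</system.web>
```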

It looks as if, whenever the HTTP request takes more than 240-250 seconds, the connection hangs, and it happens only for some clients. Just to clarify: the process on the server completes every time and performs all its tasks without any problems.

Some time ago we did an experiment, and it seems the problem doesn't appear if the server periodically sends something to the client using Response.Write() and Response.Flush() while the long HTTP request is still running. However, we cannot use this workaround when the server sends a file attachment back to the browser. Also, beyond finding a workaround, we are trying to understand the cause of this issue, and we are looking for some kind of server setting that we can use to control this behavior.
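For illustration, the periodic-flush workaround described above might look roughly like this in a Web Forms code-behind (a sketch only; GenerateReport is a hypothetical placeholder for the long-running work):

```vb
Protected Sub RunReport_Click(sender As Object, e As EventArgs)
    ' run the heavy work on a background task (GenerateReport is hypothetical)
    Dim work = System.Threading.Tasks.Task.Run(AddressOf GenerateReport)
    ' while the work runs, push a byte to the client every 10 seconds so that
    ' no intermediary ever sees an idle connection
    While Not work.Wait(10000)
        Response.Write(" ")
        Response.Flush()
    End While
    ' once anything has been written and flushed, the headers are already on
    ' the wire, so a Content-Disposition file attachment can no longer be
    ' returned - which is why this workaround fails for the download case
End Sub
```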

Any ideas?

I'm quite sure the timeout settings involved are browser-based, not server-based. Updates and upgrades to browsers would be a likely reason this started occurring, and browsers receive many updates, often weekly.

And in fact, your testing suggests that you get different results based on the kind of client software used, not on any changes to the server.

So, using Postman etc. gives different results. This is not the server side of things, but the client software's settings. I would suggest you consider generating the data (or whatever takes a long time) on a new thread, and when it is done, having it set some session value.

And on the browser side, you have a client-side timer (or even an ASP.NET one) that, say, every 5 or 10 seconds calls a web method which returns the status of the data generation. When the data generation and the long-running parts are done, the client side can then click a button, or whatever, to actually navigate to or display the results.
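As a sketch of the status web method described here (hypothetical names; assumes the background worker sets Session("ReportReady") to True when it finishes):

```vb
<System.Web.Services.WebMethod(EnableSession:=True)>
Public Shared Function GetReportStatus() As String
    ' report "done" once the background worker has flagged the session;
    ' CBool(Nothing) is False, so an unset flag reads as still running
    If CBool(HttpContext.Current.Session("ReportReady")) Then
        Return "done"
    End If
    Return "running"
End Function
```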

So, you are in effect fighting the browser makers and their updates to their browsers, something you can't control much. You might tweak this and tweak that, only to see a whole rash of new issues some months later based on, say, some update to Firefox or whatever browser(s) they are using.

So, I would start a new thread for the long-running routine(s) and then let the post-back complete. At that point, as noted, a simple timer that checks every 10 or 20 seconds could call a web method. Or you can use an ASP.NET Timer: it triggers every 10 seconds, the code-behind runs to check the session, and if the data is ready the code-behind can navigate to the new page, or fill out the grid, or whatever objects and means you are using to render that data.

From the information you posted, it sure looks like you are fighting the browser makers and the various client-side timeout settings, which are often not under your control.

Anything that waits for more than, say, 10-15 seconds ignores the basic model of how a browser works (post-back, wait a bit, get a response). Any design other than this type of approach pretty much ignores the whole architecture of web-based software. Long waits as a result of a post-back are simply NOT a workable design approach. You might have gotten away with it until now, but you are clearly paying the price for that assumption and design.

Anything that is going to take more than 10 seconds is too long a time frame for a response.

For example, I have some PDF files, and I need to generate an image preview thumbnail for each one.

So, I load the grid (showing a generic placeholder image) and then start an ASP.NET timer.

I then check whether this process is done (I used session state). When it is, I simply refresh the grid and stop the timer. So, this approach required a minimum of changes to the existing code, yet it provides the ability to poll the status of that long-running process without any fancy AJAX, or even writing client-side code.

So I have this code:

    If bolUpdate Then
        ' process the images on a separate thread
        Dim MyP(1) As String        ' ParameterizedThreadStart allows only one parameter
        MyP(0) = cPinfo.ProjectHeaderID
        MyP(1) = cPinfo.PortalComp
        Dim mypthread As New Thread(New ParameterizedThreadStart(AddressOf ProcessThumbs))
        mypthread.Start(MyP)
    End If

    If bolUpdate Then
        ' start a timer to update the page when image processing is done
        Timer1.Enabled = True
    End If

So, I start a separate thread and then start the timer.

The timer code then waits for the image processing to complete. I trigger it once per second.

Protected Sub Timer1_Tick(sender As Object, e As EventArgs) Handles Timer1.Tick

    ' any rows with a null preview thumbnail are still being processed
    Dim strSQL As String = "SELECT * FROM myTable WHERE ProjectHeaderID = " &
                            cPinfo.ProjectHeaderID & " AND WebPreviewThumb IS NULL"
    Dim rst As DataTable
    rst = Myrst(strSQL, GetConstr(cPinfo.PortalComp))
    If rst.Rows.Count = 0 Then
        ' no files left to process - stop polling
        Timer1.Enabled = False
    End If

    LoadProofGrid()

End Sub

So, in the above, the tick pulls the data source for the grid display, and if any null image previews remain, the timer keeps running. Once all image previews are done, I stop the timer.

And I did not really have to reload the GridView on each cycle, but I like the effect (if you have 5 rows, you see updates appear for each preview image in the grid).

Now, I had specific data rows to test for completion, but I could just as well have had the worker thread set some Session() value like Session("ReportReady") = True and checked that in the timer event.
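One caveat if you go the session-flag route: HttpContext.Current is Nothing on a worker thread, so capture the session object before starting the thread. A sketch, assuming the default InProc session state (where the captured collection remains live in memory):

```vb
' capture the session reference while still on the request thread
Dim sess = HttpContext.Current.Session
Dim worker As New Thread(
    Sub()
        ' ... long-running work ...
        sess("ReportReady") = True   ' works with InProc session state
    End Sub)
worker.Start()
```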

So, I would not base your software design on a hope and a wing and a prayer; instead, add a wee bit of code to cycle around waiting for the work to complete. The nice part of a timer event is that you can show some type of progress bar, or even a nice "please wait - processing" message for the user.

And using a timer event was easy and clean, since no new markup (except for the Timer control) and no client-side JavaScript were required here. However, in most cases it is better to set up such timer code client-side, use AJAX calls, and get some status as to when the long-running process is done.

However, the timer approach above tends to mean far fewer changes to the existing page markup. And since the ASP.NET Timer DOES cause a post-back, place it in a small UpdatePanel so you don't suffer full-page post-backs and redraws during this waiting period. Once the process is done, your code-behind can continue to render the report or do whatever comes next.
