
python urllib2 timing

I'd like to collect statistics related to how long each phase of a web request takes. httplib offers:

    # (requires "import httplib" and "import time" at module level)
    def run(self):
        conn = httplib.HTTPConnection('www.example.com')
        start = time.time()
        conn.request('GET', '/')      # send the request
        request_time = time.time()
        resp = conn.getresponse()     # wait for status line and headers
        response_time = time.time()
        content = resp.read()         # pull the body so the transfer timer measures something
        conn.close()
        transfer_time = time.time()
        self.custom_timers['request sent'] = request_time - start
        self.custom_timers['response received'] = response_time - start
        self.custom_timers['content transferred'] = transfer_time - start
        assert (resp.status == 200), 'Bad Response: HTTP %s' % resp.status

Are these statistics available from a higher-level interface like urllib2? Is there a high-level library offering such statistics?

As mentioned in a related question, a good way to do this now is to use the requests library. You can use it to measure request latency, though I'm not sure whether it can measure content-transfer time directly. You could approximate that by comparing a HEAD request to a GET request, as in the sketch below.
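A minimal sketch of that HEAD-versus-GET comparison, assuming the requests library is installed; the URL is a placeholder, and the subtraction is only a rough estimate, since a HEAD response still pays connection and header costs:

    import time
    import requests

    url = 'http://www.example.com/'  # placeholder URL

    # Time a HEAD request: headers only, no body.
    start = time.time()
    requests.head(url)
    head_time = time.time() - start

    # Time a GET request: headers plus the full body.
    start = time.time()
    resp = requests.get(url)
    get_time = time.time() - start

    # The difference roughly approximates content-transfer time.
    print('HEAD: %.3fs  GET: %.3fs  ~transfer: %.3fs'
          % (head_time, get_time, get_time - head_time))
    # requests also reports send-to-headers latency directly:
    print('elapsed until headers parsed: %s' % resp.elapsed)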

time.time is not the most reliable or precise timer. For profiling, you can use Python's timeit module: http://docs.python.org/library/timeit.html. Here is a code snippet that uses timeit:

    import timeit

    statmnt = 'print "Replace print with the snippet you want to profile"'
    setup = 'print "Replace this line with snippet-specific imports"'
    n = 1  # number of times timeit should execute statmnt
    t = timeit.Timer(statmnt, setup)
    qTime = t.timeit(n)

In your case you will have to create three timeit objects: one each for the request, the response, and the content transfer. Refer to the documentation for more information on the timeit module.
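A rough sketch of that three-Timer idea (hypothetical, not from the original answer): each Timer's setup re-performs the earlier phases, and its statement times just one phase. Each Timer therefore opens its own connection to the placeholder host www.example.com, so the phases are measured on separate requests, and n is kept at 1 because every execution issues a real network call.

    import timeit

    # Shared setup: open a connection to a placeholder host.
    common = "import httplib; conn = httplib.HTTPConnection('www.example.com')"

    request_timer = timeit.Timer("conn.request('GET', '/')", common)
    response_timer = timeit.Timer(
        "resp = conn.getresponse()",
        common + "; conn.request('GET', '/')")
    transfer_timer = timeit.Timer(
        "resp.read()",
        common + "; conn.request('GET', '/'); resp = conn.getresponse()")

    n = 1  # each execution performs a real request, so run once
    print('request sent:        %fs' % request_timer.timeit(n))
    print('response received:   %fs' % response_timer.timeit(n))
    print('content transferred: %fs' % transfer_timer.timeit(n))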
