
Get status code of URL efficiently in Python, alternative to a for-loop

I want to check a list of urls (in a column of a dataframe df) for their status codes (404, 403 and 200 seem to be the interesting ones). I defined a function which does the job. However, it uses a for-loop which is inefficient (I have a long list of urls!).

Does anyone have a hint on how to do this more efficiently? Ideally the returned status code would also be stored in a new column of the dataframe, e.g. df['status_code_url'].

import requests

def url_access(df, column):
    e_404 = 0
    e_403 = 0
    e_200 = 0
    for i in range(len(df)):
        # request each URL only once per iteration
        status = requests.head(df[column][i]).status_code
        if status == 404:
            e_404 += 1
        elif status == 403:
            e_403 += 1
        elif status == 200:
            e_200 += 1
        else:
            print(status)

    return ("Statistics about " + column,
            '{:.1%}'.format(e_404 / len(df)) + " of links to instagram posts return 404",
            '{:.1%}'.format(e_403 / len(df)) + " of links to instagram posts return 403",
            '{:.1%}'.format(e_200 / len(df)) + " of links to instagram posts return 200")

Thank you a lot!

Use Pandas, apply and groupby:

import requests

def url_access(x):
    return requests.head(x).status_code


df['Status'] = df['url'].apply(url_access)

dfcount = df.groupby('Status')['url'].count().reset_index()
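If you also want the percentage breakdown that the original function printed, a minimal sketch (assuming the Status column created above) is to normalize the counts:

# share of each status code as a percentage (assumes df['Status'] from above)
pct = df['Status'].value_counts(normalize=True).mul(100).round(1)
print(pct)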

Basically, your task seems to be:

  1. get url status code
  2. collect responses for all of them
  3. calculate percentages

For the first step you can use:

def get_code(url):
    return requests.head(url).status_code

For the second step you apply this function to the dataframe column, see https://pandas.pydata.org/pandas-docs/stable/generated/pandas.DataFrame.apply.html

resp_df = df[column].apply(get_code)

For the third step you can use operations over the column to calculate percentages:

(resp_df == 404).sum() / len(resp_df)

(note code not run)
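Putting the three steps together, here is a minimal runnable sketch (the httpbin.org URLs are just hypothetical test endpoints; substitute your own dataframe and column name):

import pandas as pd
import requests

def get_code(url):
    # a HEAD request is enough to read the status code
    return requests.head(url).status_code

df = pd.DataFrame({"url": ["http://httpbin.org/status/200",
                           "http://httpbin.org/status/404"]})

resp_df = df["url"].apply(get_code)              # steps 1 and 2
pct_404 = (resp_df == 404).sum() / len(resp_df)  # step 3
print('{:.1%}'.format(pct_404))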

pandas.DataFrame.apply (or rather, the normal requests library) will only be able to make one request at a time. To do multiple requests in parallel, you can use requests_futures (install it with pip install requests-futures):

import pandas as pd
from requests_futures.sessions import FuturesSession

def get_request(url):
    session = FuturesSession()
    return session.head(url)


def get_status_code(r):
    return r.result().status_code

if __name__ == "__main__":
    urls = ['http://python-requests.org',
            'http://httpbin.org',
            'http://python-guide.org',
            'http://kennethreitz.com']
    df = pd.DataFrame({"url": urls})
    df["status_code"] = df["url"].apply(get_request).apply(get_status_code)

Afterwards you can, for example, use groupby, as suggested by @Aritesh in their answer:

stats = df.groupby('status_code')['url'].count().reset_index()
print(stats)
#    status_code  url
# 0          200    1
# 1          301    3
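As a side note on the design: the snippet above opens a new FuturesSession for every URL. Reusing a single session (and its thread pool) is a common alternative; a minimal sketch, assuming requests-futures is installed and df['url'] exists as above:

from requests_futures.sessions import FuturesSession

session = FuturesSession(max_workers=20)  # one shared thread pool for all requests

# start all HEAD requests first, then collect the results
futures = [session.head(url) for url in df["url"]]
df["status_code"] = [f.result().status_code for f in futures]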

With this you probably also want to add some protection against connection errors and a timeout:

import numpy as np
import requests

def get_request(url):
    session = FuturesSession()
    return session.head(url, timeout=1)

def get_status_code(r):
    try:
        return r.result().status_code
    except (requests.exceptions.ConnectionError, requests.exceptions.ReadTimeout):
        return 408 # Request Timeout

ips = np.random.randint(0, 256, (1000, 4))
df = pd.DataFrame({"url": ["http://" + ".".join(map(str, ip)) for ip in ips]})
df["status_code"] = df["url"].apply(get_request).apply(get_status_code)
df.groupby('status_code')['url'].count().reset_index()
#    status_code  url
# 0          200    3
# 1          302    2
# 2          400    2
# 3          401    1
# 4          403    1
# 5          404    1
# 6          408  990
