How to close aiohttp ClientSession

I am trying to make an app that might live for a day, a week, or longer. During the app's lifetime, it will make requests to different APIs. Some of these APIs might require logging in, so it is important that I have access to cookies at all times.

So what I need is a file (module) that the different APIs can use without blocking the app.

I am new to asynchronous programming (asyncio/aiohttp), and the examples I have seen show how to make a lot of requests from a list of URLs, but this is not what I need.

The problem with the code I have is that I either get a ClientSession is closed error or unclosed ClientSession warnings.

import asyncio  # only here for debugging purposes
import aiohttp

USER_AGENT = 'Mozilla/5.0 (Windows NT 6.1; WOW64; rv:61.0) Gecko/20100101 Firefox/61.1'


def default_headers():
    header = {
        'User-Agent': USER_AGENT
    }
    return header


class WebSession(object):
    session = None

    @classmethod
    def create(cls):
        cls.session = aiohttp.ClientSession()
        return cls.session

    @classmethod
    def close(cls):
        if cls.session is not None:
            cls.session.close()

async def request(method, url, **kwargs):

    if kwargs.get('headers', None) is None:
        kwargs['headers'] = default_headers()

    if WebSession.session is None:
        session = WebSession.create()
    else:
        session = WebSession.session


    async with session.request(method=method, url=url, **kwargs) as response:
        if isinstance(session, aiohttp.ClientSession):
            # if i close the session here, i will get the ClientSession closed error on 2. request.
            # await session.close()
            pass

        return response


async def get(url, **kwargs):
    return await request('GET', url=url, **kwargs)


async def post(url, **kwargs):
    return await request('POST', url=url, **kwargs)


async def get_url():
    res = await get('https://httpbin.org/get')
    print(f'Headers: {res.headers}')


m_loop = asyncio.get_event_loop()
m_loop.run_until_complete(get_url())
# if i run this without closing the ClientSession, i will get unclosed ClientSession warnings.
m_loop.run_until_complete(get_url())
m_loop.close()

I do get a response from the server; however, it is followed by this error/warning:

Unclosed client session
client_session: <aiohttp.client.ClientSession object at 0x03354630>
Unclosed connector
connections: ['[(<aiohttp.client_proto.ResponseHandler object at   0x033BBBF0>, 71.542)]']
connector: <aiohttp.connector.TCPConnector object at 0x033542D0>

If I uncomment the await session.close() and remove the pass, I get a response from the server on the first request, followed by RuntimeError: Session is closed on the second request.
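Both symptoms can be reproduced with a bare ClientSession, independent of the WebSession wrapper above (a minimal standalone sketch, assuming httpbin.org is reachable):

import asyncio
import aiohttp


async def two_requests(close_between):
    session = aiohttp.ClientSession()
    async with session.get('https://httpbin.org/get') as resp:
        print('first request:', resp.status)
    if close_between:
        await session.close()
    # if the session was closed above, the next request raises
    # "RuntimeError: Session is closed"; if it was not, both requests
    # succeed but aiohttp warns about an unclosed session at exit
    async with session.get('https://httpbin.org/get') as resp:
        print('second request:', resp.status)


loop = asyncio.get_event_loop()
loop.run_until_complete(two_requests(close_between=False))  # -> "Unclosed client session" warning
# loop.run_until_complete(two_requests(close_between=True))  # -> RuntimeError: Session is closed
loop.close()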

Ahh, I think I got it now.

The warnings I got, Unclosed client session and Unclosed connector, were aiohttp telling me "hey, you forgot to close the session". And this is exactly what happened with this small example: both calls to get_url would actually get a response from the server, and then the app would end. The session was left in an unclosed state when the app ended, which is why the above warnings were shown.

I was not supposed to close the session after each request, since there would be no way of making a new request with it at that point, at least not to my knowledge. That is why I got RuntimeError: Session is closed when trying to make a new request once the session was already closed.

So once I figured this out, I created a close function and simply called it before the loop (app) ended. Now I get no warnings/errors, and cookies are (I think) shared between all requests made while the app is running, whether they are GET or POST, which is exactly what I wanted.
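One way to verify the cookie sharing (a small standalone sketch using httpbin's cookie endpoints, not part of the app code below):

import asyncio
import aiohttp


async def check_cookies():
    session = aiohttp.ClientSession()
    # the first request sets a cookie; because both requests go through
    # the same session, its cookie jar sends the cookie back automatically
    async with session.get('https://httpbin.org/cookies/set?foo=bar') as resp:
        await resp.read()
    async with session.get('https://httpbin.org/cookies') as resp:
        print(await resp.json())   # expected: {'cookies': {'foo': 'bar'}}
    await session.close()


asyncio.get_event_loop().run_until_complete(check_cookies())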

I hope that someone else new to aiohttp/asyncio will benefit from this, as it took me some time (too long) to understand. As I am still new to aiohttp/asyncio, I don't know if this is the correct way of doing it, but at least it seems to work.

import asyncio  # only here for debugging purposes
import aiohttp

USER_AGENT = 'Mozilla/5.0 (Windows NT 6.1; WOW64; rv:61.0) Gecko/20100101 Firefox/61.1'


def default_headers():
    header = {
        'User-Agent': USER_AGENT
    }
    return header


class WebSession(object):
    session = None

    @classmethod
    def create(cls):
        cls.session = aiohttp.ClientSession()
        return cls.session

    @classmethod
    def close(cls):
        if cls.session is not None:
            # ClientSession.close() is a coroutine; return it so the caller can await it
            return cls.session.close()


async def request(method, url, **kwargs):

    if kwargs.get('headers', None) is None:
        kwargs['headers'] = default_headers()

    if WebSession.session is None:
        session = WebSession.create()
    else:
        session = WebSession.session

    return await session.request(method=method, url=url, **kwargs)


async def get(url, **kwargs):
    return await request('GET', url=url, **kwargs)


async def post(url, **kwargs):
    return await request('POST', url=url, **kwargs)


async def get_url():
    res = await get('https://httpbin.org/get')
    print(f'Headers: {res.headers}')


async def close():
    # run this before the app ends
    await WebSession.close()

# so imagine that this is our app.
m_loop = asyncio.get_event_loop()
# it's running now and doing stuff...

# then it makes a request to a url.
m_loop.run_until_complete(get_url())
# then some time passes, and then it makes another request to a url.
m_loop.run_until_complete(get_url())
# now the app gets stopped, whether by keyboard interrupt or some other means of stopping it
# then close the session
m_loop.run_until_complete(close())
# and then end the app..
m_loop.close()
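For comparison, most aiohttp examples scope the session with async with inside one top-level coroutine, so it is closed automatically when that coroutine ends. Whether that fits a long-lived app like the one above is a design choice, but here is a sketch of that pattern (same User-Agent, same test URL):

import asyncio
import aiohttp

USER_AGENT = 'Mozilla/5.0 (Windows NT 6.1; WOW64; rv:61.0) Gecko/20100101 Firefox/61.1'


async def main():
    # the session (and its cookie jar) lives exactly as long as main();
    # "async with" guarantees it gets closed when the coroutine exits
    async with aiohttp.ClientSession(headers={'User-Agent': USER_AGENT}) as session:
        async with session.get('https://httpbin.org/get') as res:
            print(f'Headers: {res.headers}')
        # ... the app keeps doing its work here, reusing `session` ...
        async with session.get('https://httpbin.org/get') as res:
            print(f'Headers: {res.headers}')


asyncio.get_event_loop().run_until_complete(main())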
