
asyncio coroutine was never awaited

I am attempting to run an experiment with two coroutines using asyncio. One loop scrapes the outside air temperature from a weather page, and the other just produces a made-up value with a random number. Both coroutines produce the same warning and nothing happens:

RuntimeWarning: coroutine 'get_weather' was never awaited

Would anyone have any ideas to try? I am trying to follow this SO post on a similar question, as well as this other SO post on running coroutines under Python 3.7, which is the version I am using.

import asyncio
import requests
from bs4 import BeautifulSoup
import pandas as pd
import numpy as np
import random


headers = {
    'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.95 Safari/537.36'}

async def get_weather():
    await asyncio.sleep(300)

    r = requests.get('https://www.google.com/search?q=weather%20duluth', headers=headers)
    soup = BeautifulSoup(r.text, 'html.parser')
    outTemp = soup.find("span", {"class": "wob_t"}).text
    intOutTemp = int(outTemp)
    print(f'got outside temperature data {intOutTemp}')
    return intOutTemp


async def get_temp():
    await asyncio.sleep(10)

    inTemp = (random.random() * 20) - 5  # -5 to 15
    print(f'got inside temperature data {inTemp}')
    return inTemp


def main():
    loop1 = get_weather()
    loop2 = get_temp()



if __name__ == '__main__':
    main()

UPDATED CODE

#py weatherAsyncio3.py

import asyncio
import aiohttp
from bs4 import BeautifulSoup
import pandas as pd
import numpy as np
import random

headers = {
    'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.95 Safari/537.36'}

async def get_weather():
    while True:
        await asyncio.sleep(1800)

        async with aiohttp.ClientSession() as session:
            async with session.get('https://www.google.com/search?q=weather%20duluth', headers=headers) as r:
                soup = BeautifulSoup(await r.text(), 'html.parser')
                outTemp = soup.find("span", {"class": "wob_t"}).text
                intOutTemp = int(outTemp)
                print(f'got outside temperature data {intOutTemp}')
                return intOutTemp

async def get_temp():
    while True:
        await asyncio.sleep(60)

        inTemp = (random.random() * 20) - 5  # -5 to 15
        print(f'got inside temperature data {inTemp}')
        return inTemp

def main():
    loop = asyncio.get_event_loop()
    futureWeather = asyncio.ensure_future(get_weather())
    loop.run_until_complete(futureWeather)

    futureTemp = asyncio.ensure_future(get_temp())
    loop.run_until_complete(futureTemp)



if __name__ == '__main__':
    main()

You've just created variables called loop1 and loop2; neither of them waits for the task to complete. Calling an async function only creates a coroutine object — it does not run it, which is why you get the "never awaited" warning. Using asyncio.run() as your entry point, you can pass your coroutines into asyncio.gather() to run both and wait for both to complete.
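As a minimal sketch of that fix (the bodies are stand-ins — a short sleep in place of the real scraping and the original random value):

```python
import asyncio
import random

async def get_weather():
    # stand-in for the real scrape; the sleep mimics the network round-trip
    await asyncio.sleep(0.1)
    return 72

async def get_temp():
    await asyncio.sleep(0.1)
    return (random.random() * 20) - 5  # -5 to 15

async def main():
    # gather() awaits both coroutines concurrently and returns their results
    outside, inside = await asyncio.gather(get_weather(), get_temp())
    print(outside, inside)

if __name__ == '__main__':
    asyncio.run(main())  # the entry point actually awaits the coroutines
```

With this shape the "never awaited" warning disappears, because every coroutine object created is handed to the event loop and awaited.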

To run a method in an infinite loop, use while True: and do not return from inside the loop. I would recommend either splitting the logic into two methods or renaming get_weather() to something like get_weather_continuously(). That makes other users aware of what the code does without having to read through the logic.
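One way to do that split (the names get_weather_once / get_weather_continuously are hypothetical, and the loop is bounded here only so the example terminates — the real version would use while True:):

```python
import asyncio

async def get_weather_once():
    # one fetch; the real version would scrape the weather page here
    await asyncio.sleep(0.05)  # stand-in for the network round-trip
    return 68

async def get_weather_continuously(interval, readings, max_polls):
    # poll repeatedly and never return mid-loop; collect results as a side effect
    for _ in range(max_polls):
        readings.append(await get_weather_once())
        await asyncio.sleep(interval)

readings = []
asyncio.run(get_weather_continuously(0.01, readings, 3))
print(readings)
```

The single-fetch coroutine stays reusable and testable on its own, while the continuous variant owns the scheduling.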

Also, requests.get() is blocking, so it does not run in parallel with your other coroutine. Google returns its results almost instantly, but if you point the URL at a server with a considerable delay, the issue becomes noticeable: the whole event loop stalls. Therefore I'd strongly recommend using aiohttp.
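To illustrate the stall without depending on a real server, here is a sketch where time.sleep() stands in for requests.get(). If you cannot switch to aiohttp, asyncio.to_thread() (Python 3.9+) is one way to push the blocking call off the event loop so the other coroutine keeps ticking:

```python
import asyncio
import time

def blocking_fetch():
    # stands in for requests.get(): blocks its thread for a full second
    time.sleep(1)
    return "response"

async def ticker(ticks):
    # records a tick every 0.2 s; it would stall if the event loop were blocked
    for _ in range(4):
        await asyncio.sleep(0.2)
        ticks.append(time.monotonic())

async def main():
    ticks = []
    # to_thread() runs the blocking call in a worker thread, so the
    # ticker keeps running while the "request" is in flight
    result, _ = await asyncio.gather(asyncio.to_thread(blocking_fetch), ticker(ticks))
    return result, ticks

if __name__ == '__main__':
    result, ticks = asyncio.run(main())
    print(result, len(ticks))
```

Had blocking_fetch() been awaited directly on the event loop (e.g. by calling requests.get() inside the coroutine), the ticker would produce no ticks until the fetch finished.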

There is also an issue on Windows whereby aiohttp throws an "Event loop is closed" error. As a workaround found on GitHub, I've added a commented-out call to asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy()) before asyncio.run(). Only use it if you need it.

Below is a working example:

import asyncio
import aiohttp
from bs4 import BeautifulSoup
import json
import pandas as pd
import numpy as np
import random

headers = {
    'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.95 Safari/537.36'}

def Logger(json_message):
    print(json.dumps(json_message))

async def get_weather():
    Logger({"start": "get_weather()"})
    while True:
        await asyncio.sleep(3)    
        async with aiohttp.ClientSession() as session:
            r = await session.get('https://www.google.com/search?q=weather%20duluth', headers=headers)
            soup = BeautifulSoup(await r.text(), 'html.parser')
            outTemp = soup.find("span", {"class": "wob_t"}).text
            intOutTemp = int(outTemp)

            Logger({"results": f'got outside temperature data {intOutTemp}'})
            Logger({"finish": "get_weather()"})
            #return intOutTemp # without a return, the while loop will run continuously.

async def get_temperature():
    Logger({"start": "get_temperature()"})
    while True:
        await asyncio.sleep(1)

        inTemp = (random.random() * 20) - 5  # -5 to 15
        Logger({"results": f'got inside temperature data {inTemp}'})
        Logger({"finish": "get_temperature()"})
        #return inTemp # without a return, the while loop will run continuously.

async def main():
    statements = [get_weather(), get_temperature()]
    Logger({"start": "gather()"})

    await asyncio.gather(*statements) # gather() allows both functions to run at the same time.
    Logger({"finish": "gather()"})

if __name__ == '__main__':
    #asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy()) # Use this to stop "Event loop is closed" error on Windows - https://github.com/encode/httpx/issues/914
    asyncio.run(main())
