I have three Python scripts.
The three scripts work as expected individually. The next step is to create an API endpoint that takes an account name as a request parameter and then triggers the three scripts for the received account. The final analysis results will be stored in a database.
The endpoint also needs a queueing mechanism to store the account names it receives. The queue will be polled, and if account names are available, they will be processed sequentially.
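The endpoint-plus-queue design described above can be sketched with the standard library alone. This is a minimal sketch, not a full implementation: `analyze_account` is a hypothetical stand-in for running the three scripts and writing to the database, and the in-memory `queue.Queue` could be swapped for Redis or a task queue like Celery in production.

```python
import queue
import threading

# In-memory FIFO queue of account names (assumption: a single process;
# a real deployment might use Redis or Celery instead).
account_queue: "queue.Queue[str]" = queue.Queue()
results = {}

def analyze_account(account_name: str) -> str:
    # Hypothetical stand-in for running the three analysis scripts
    # and saving their results to the database.
    return f"analysis for {account_name}"

def worker() -> None:
    # Background loop: poll the queue and process accounts sequentially.
    while True:
        name = account_queue.get()
        results[name] = analyze_account(name)
        account_queue.task_done()

# Start a single worker thread so accounts are handled one at a time.
threading.Thread(target=worker, daemon=True).start()

def handle_request(account_name: str) -> dict:
    # What the API endpoint handler would do: enqueue the account name
    # and return immediately, before the analysis has run.
    account_queue.put(account_name)
    return {"status": "queued", "account": account_name}
```

A web framework such as Flask or FastAPI would wrap `handle_request` in an actual route; the queueing logic itself is framework-agnostic.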
My API development experience is limited, so I am not sure of the best approach to tackle this problem.
To fetch data from an API and save it, I would recommend using asyncio to do something like:
import asyncio
import json
import time

import aiofiles as aiof
import aiohttp

FILENAME = "foo.txt"

async def fetch(session, url):
    async with session.get(url) as response:
        data = await response.json()
        async with aiof.open(FILENAME, "a") as out:
            # aiofiles methods are coroutines, so they must be awaited
            await out.write(json.dumps(data))
            await out.flush()

async def main():
    instagram_ids = []  # profile ids
    current = time.time()
    url = "INSTAGRAM_API_URL"  # should contain a "{}" placeholder for the id
    async with aiohttp.ClientSession() as session:
        tasks = [
            asyncio.create_task(fetch(session, url.format(profile_id)))
            for profile_id in instagram_ids
        ]
        await asyncio.gather(*tasks)
    print(time.time() - current)

asyncio.run(main())
since most of the time spent dealing with an API goes to waiting for the results.