Python coroutines, asyncio & asynchronous programming

Keywords: Python Redis network Session

1. Coroutines

A coroutine is a kind of micro-thread: a user-space context-switching technique that lets blocks of code within a single thread hand execution off to one another.
There are several ways to implement coroutines:

  • greenlet, an early third-party module
  • the yield keyword
  • asyncio, introduced in Python 3.4
  • the async and await keywords, Python 3.5+, now the mainstream approach [recommended]

1.1 Implementing coroutines with greenlet

pip install greenlet
# -*- coding: utf-8 -*-
from greenlet import greenlet


def func1():
    print(1)       # Step 1: output 1
    gr2.switch()   # Step 2: switch to func2
    print(2)       # Step 5: output 2
    gr2.switch()   # Step 6: switch back to func2


def func2():
    print(3)      # Step 3: output 3
    gr1.switch()  # Step 4: switch back to func1
    print(4)      # Step 7: output 4


gr1 = greenlet(func1)
gr2 = greenlet(func2)

gr1.switch()   # Start: run func1; the output order is 1 3 2 4

1.2 yield keyword

# -*- coding: utf-8 -*-


def func1():
    yield 1
    yield from func2()
    yield 2


def func2():
    yield 3
    yield 4


f1 = func1()
for item in f1:
    print(item)   # prints 1, 3, 4, 2

1.3 asyncio

Python 3.4 and later (the @asyncio.coroutine decorator shown here has since been deprecated in favour of async/await)

# -*- coding: utf-8 -*-
import asyncio


@asyncio.coroutine
def func1():
    print(1)
    yield from asyncio.sleep(2)  # On an IO-bound wait, automatically switch to another task in the task list
    print(2)


@asyncio.coroutine
def func2():
    print(3)
    yield from asyncio.sleep(2)  # On an IO-bound wait, automatically switch to another task in the task list
    print(4)


tasks = [
    asyncio.ensure_future(func1()),
    asyncio.ensure_future(func2()),
]

loop = asyncio.get_event_loop()
loop.run_until_complete(asyncio.wait(tasks))

Note: tasks are switched automatically whenever one of them blocks on IO

1.4 async and await keywords

Python 3.5 and later

# -*- coding: utf-8 -*-
import asyncio


async def func1():
    print(1)
    await asyncio.sleep(2)  # On an IO-bound wait, automatically switch to another task in the task list
    print(2)


async def func2():
    print(3)
    await asyncio.sleep(2)  # On an IO-bound wait, automatically switch to another task in the task list
    print(4)


tasks = [
    asyncio.ensure_future(func1()),
    asyncio.ensure_future(func2()),
]

loop = asyncio.get_event_loop()
loop.run_until_complete(asyncio.wait(tasks))
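
Switching only happens at an await (or yield from) expression; a plain blocking call such as time.sleep() holds the whole thread and nothing else can run in the meantime. A minimal sketch of the difference (timings are approximate):

# -*- coding: utf-8 -*-
# Rough sketch: only awaiting an awaitable lets the event loop switch tasks;
# time.sleep() blocks the whole thread, so the blocking tasks run back to back.
import asyncio
import time


async def cooperative():
    await asyncio.sleep(2)   # IO-style wait: the event loop may run other tasks here


async def blocking():
    time.sleep(2)            # blocks the whole thread: no switching is possible


loop = asyncio.get_event_loop()

start = time.time()
loop.run_until_complete(asyncio.wait([asyncio.ensure_future(cooperative()) for _ in range(2)]))
print("two cooperative tasks:", round(time.time() - start), "seconds")  # about 2

start = time.time()
loop.run_until_complete(asyncio.wait([asyncio.ensure_future(blocking()) for _ in range(2)]))
print("two blocking tasks:", round(time.time() - start), "seconds")     # about 4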

1.5 In practice, use greenlet or the async & await keywords (async & await is the mainstream choice)

2. The significance of coroutines

When a thread hits an IO wait, it should not be left waiting idly; coroutines let the thread use that waiting time to do other work.

# -*- coding: utf-8 -*-
import asyncio
import aiohttp


async def fetch(session, url):
    print("Send request")
    async with session.get(url, verify_ssl=False) as response:
        content = await response.content.read()
        file_name = url.rsplit("/")[-1]
        with open(file_name, mode="wb") as file_object:
            file_object.write(content)
        print("Download complete", url)


async def main():
    async with aiohttp.ClientSession() as session:
        url_list = [
            "http://ww1.sinaimg.cn/mw600/00745YaMgy1gedxxa59lyj30kk10p77m.jpg",
            "http://ww1.sinaimg.cn/mw600/00745YaMgy1gedxrlhlhaj30kk0dpmxj.jpg",
            "http://ww1.sinaimg.cn/mw600/00745YaMgy1gedxrlrw4tj30kk0pp78u.jpg"
        ]

        tasks = [asyncio.create_task(fetch(session, url)) for url in url_list]
        await asyncio.wait(tasks)

if __name__ == "__main__":
    asyncio.run(main())

3. Asynchronous programming

3.1 Event loop

Think of it as an endless loop:

"""
#Pseudocode

Task list = [task 1, task 2, task 3,...]

while True:
    Executable task list, completed task list = check all tasks in the task list, and return 'executable' and 'completed' tasks
    for ready task in list of executable tasks:
        Perform ready tasks
    for completed tasks in completed tasks list:
        Remove completed tasks from the task list
    Terminate the cycle when all tasks in the task list have been completed
"""
import asyncio

# Generate (or fetch the existing) event loop
loop = asyncio.get_event_loop()
# Put the tasks on the task list and run until they are done
# (here `tasks` stands for a list of Task/coroutine objects)
loop.run_until_complete(asyncio.wait(tasks))

3.2 Getting started quickly

A coroutine function is defined with async def function_name.
Calling the coroutine function, e.g. func(), returns a coroutine object.

async def func():
    pass
    
result = func()

Note: calling the coroutine function only creates the coroutine object; the code inside the function does not run.
To run the body of a coroutine function, the coroutine object must be handed to an event loop for processing.

import asyncio

async def func():
    print("Hello World")

result = func()

# Generate (or fetch the existing) event loop
loop = asyncio.get_event_loop()
# Put the coroutine object on the task list and run it
loop.run_until_complete(result)
# From Python 3.7 on, asyncio.run(result) can replace the two lines above

3.3 await

await + awaitable object (a coroutine object, Future object, or Task object --> an IO wait)
Example 1:

import asyncio


async def func():
    print("start")
    response = await asyncio.sleep(2)
    print("end", response)
    
asyncio.run(func())

Example 2:

# -*- coding: utf-8 -*-
import asyncio


async def others():
    print("start")
    await asyncio.sleep(2)
    print("end")
    return "Return value"


async def func():
    print("Start execution func")
    response = await others()
    print("IO Request ended with: ", response)


asyncio.run(func())

Example 3:

# -*- coding: utf-8 -*-
import asyncio


async def others():
    print("start")
    await asyncio.sleep(2)
    print("end")
    return "Return value"


async def func():
    print("Start execution func")
    response1 = await others()
    print("IO Request ended with: ", response1)

    response2 = await others()
    print("IO Request ended with: ", response2)

asyncio.run(func())

await waits for the awaitable to produce its result before the current coroutine continues.
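
Because each await suspends the current coroutine until its awaitable finishes, the two calls in Example 3 run strictly one after the other. A minimal timing sketch of that behaviour:

# -*- coding: utf-8 -*-
# Timing sketch: two sequential awaits of a 2-second coroutine take about 4 seconds.
import asyncio
import time


async def others():
    await asyncio.sleep(2)
    return "Return value"


async def func():
    start = time.time()
    await others()   # the coroutine is suspended here for about 2 seconds
    await others()   # and here for another 2 seconds
    print("elapsed:", round(time.time() - start), "seconds")  # roughly 4


asyncio.run(func())

Running the two calls concurrently is what Task objects are for, as the next section shows.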

3.4 Task object

Tasks are used to add multiple coroutines to the event loop and schedule them concurrently.
A Task is created with asyncio.create_task(coroutine_object). Besides asyncio.create_task(), the lower-level loop.create_task() or asyncio.ensure_future() functions can also be used. Manually instantiating Task objects is not recommended.
Note: asyncio.create_task(coroutine_object) was added in Python 3.7. Before Python 3.7, use the lower-level asyncio.ensure_future() function instead.
Example 1:

import asyncio


async def func():
    print("1")
    await asyncio.sleep(2)
    print("2")
    return "Return value"


async def main():
    print("Start execution main")
    # Create Task objects, adding the func coroutines to the event loop
    task1 = asyncio.create_task(func())
    task2 = asyncio.create_task(func())
    print("main End")

    # When a task hits an IO operation, the loop automatically switches to the other task (task2)
    # The awaits here wait for the corresponding tasks to finish and fetch their return values
    ret1 = await task1
    ret2 = await task2

    print(ret1, ret2)

asyncio.run(main())

Example 2:

# -*- coding: utf-8 -*-
import asyncio


async def func():
    print("1")
    await asyncio.sleep(2)
    print("2")
    return "Return value"


async def main():
    print("Start execution main")
    # Create Task objects, adding the func coroutines to the event loop
    task_list = [
        asyncio.create_task(func(), name="n1"),
        asyncio.create_task(func(), name="n2")
    ]

    print("main End")

    # When a task hits an IO operation, the loop automatically switches to the other task
    # The await here waits for all of the tasks to finish
    # done is a set containing the finished Task objects (their results are the return values of the two tasks above)
    done, pending = await asyncio.wait(task_list, timeout=None)

    print(done)

asyncio.run(main())

Example 3:

import asyncio


async def func():
    print("1")
    await asyncio.sleep(2)
    print("2")
    return "Return value"

# Coroutine objects are passed in directly: no event loop exists yet, so
# create_task cannot be called here; asyncio.wait wraps each coroutine in a
# Task object once the loop is running.
task_list = [
    func(),
    func()
]

done, pending = asyncio.run(asyncio.wait(task_list))
print(done)
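
The examples above use asyncio.wait, which returns an unordered done set; when results are wanted in submission order, asyncio.gather is a convenient alternative. A minimal sketch:

# -*- coding: utf-8 -*-
# Sketch: asyncio.gather returns the results in the order the awaitables were passed in.
import asyncio


async def func():
    print("1")
    await asyncio.sleep(2)
    print("2")
    return "Return value"


async def main():
    ret1, ret2 = await asyncio.gather(func(), func())
    print(ret1, ret2)


asyncio.run(main())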

3.5 asyncio.Future object

Task inherits from Future; the handling of await results inside a Task object is based on the Future object.
Example 1:

import asyncio


async def main():
    # Get current event loop
    loop = asyncio.get_running_loop()
    # Create a Future object that does nothing
    fut = loop.create_future()
    # Wait for the final result of the task (Future object). If there is no result, it will wait forever
    await fut


asyncio.run(main())

Example 2:

import asyncio


async def set_after(fut):
    await asyncio.sleep(2)
    fut.set_result("success")

async def main():
    # Get current event loop
    loop = asyncio.get_running_loop()
    # Create a Future object; it does nothing and has nothing bound to it, so it can never know when to finish
    fut = loop.create_future()

    # Create a Task bound to the set_after function; after 2s that function assigns a result to fut
    # In other words, the result of fut is set manually, so fut can finish
    await loop.create_task(set_after(fut))

    # Wait for the final result of the task (Future object). If there is no result, it will wait forever
    data = await fut
    print(data)


asyncio.run(main())

3.6 concurrent.futures.Future object

The Future object used when asynchronous operations are run in a thread pool or process pool.
The two kinds of Future are used together when coroutine-based asynchronous code has to call something that does not support it (for example a blocking MySQL driver): the blocking work is handed to a thread or process pool, i.e. thread- and process-based asynchronous programming.

import time
import asyncio
import concurrent.futures


def func():
    # A time-consuming operation
    time.sleep(2)
    return "success"


async def main():
    loop = asyncio.get_running_loop()

    # 1. Run in the default loop's executor
    # Step 1: internally, ThreadPoolExecutor's submit method is called to get a thread from the pool to run the func function, and a concurrent.futures.Future object is returned
    # Step 2: asyncio.wrap_future is called to wrap the concurrent.futures.Future object into an asyncio.Future object
    # The concurrent.futures.Future object does not support the await syntax, so it must be wrapped into an asyncio.Future object before it can be used
    fut = loop.run_in_executor(None, func)
    result = await fut
    print("default thread pool", result)

    # 2.Run in a custom thread pool
    with concurrent.futures.ThreadPoolExecutor() as pool:
        result = await loop.run_in_executor(pool, func)
        print("custom thread pool", result)

    # 3.Run in a custom process pool
    with concurrent.futures.ProcessPoolExecutor() as pool:
        result = await loop.run_in_executor(pool, func)
        print("custom process pool", result)


asyncio.run(main())

Case: asyncio + a module that does not support async (requests)

import asyncio
import requests


async def download_image(url):
    # Send the network request to download the picture (on network IO the task automatically switches to other tasks)
    print("Start downloading", url)
    loop = asyncio.get_running_loop()

    # The requests module does not support async, so a thread pool is used together with asyncio
    future = loop.run_in_executor(None, requests.get, url)

    response = await future
    print("Download complete")

    # Save picture to local file
    file_name = url.rsplit("/")[-1]
    with open(file_name, mode="wb") as file_object:
        file_object.write(response.content)


def main():

    url_list = [
        "http://ww1.sinaimg.cn/mw600/00745YaMgy1gedxxa59lyj30kk10p77m.jpg",
        "http://ww1.sinaimg.cn/mw600/00745YaMgy1gedxrlhlhaj30kk0dpmxj.jpg",
        "http://ww1.sinaimg.cn/mw600/00745YaMgy1gedxrlrw4tj30kk0pp78u.jpg"
    ]

    tasks = [download_image(url) for url in url_list]
    loop = asyncio.get_event_loop()
    loop.run_until_complete(asyncio.wait(tasks))


if __name__ == "__main__":
    main()

3.7 asynchronous iterator

What is an asynchronous iterator?
An object that implements the __aiter__() and __anext__() methods. __anext__() must return an awaitable object; async for resolves the awaitables returned by the asynchronous iterator's __anext__() method until it raises StopAsyncIteration.
What is an asynchronous iterable?
An object that can be used in an async for statement. It must return an asynchronous iterator from its __aiter__() method.

import asyncio


class Reader(object):
    """Custom asynchronous iterator(It is also an asynchronous and iterative object)"""
    def __init__(self):
        self.count = 0

    async def readline(self):
        # await asyncio.sleep(2)
        self.count += 1
        if self.count == 100:
            return None
        return self.count

    def __aiter__(self):
        return self

    async def __anext__(self):
        val = await self.readline()
        if val is None:
            raise StopAsyncIteration
        return val


async def func():
    obj = Reader()
    # async for can only be used inside a coroutine function
    async for item in obj:
        print(item)

asyncio.run(func())
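
The same asynchronous iteration can also be written as an async generator (a coroutine function containing yield), which provides __aiter__/__anext__ automatically; a minimal sketch:

# -*- coding: utf-8 -*-
# Sketch: an async generator is the shortest way to build an asynchronous iterable.
import asyncio


async def reader():
    for count in range(1, 100):
        # await asyncio.sleep(2)  # a real reader would await some IO here
        yield count


async def func():
    async for item in reader():
        print(item)


asyncio.run(func())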

3.8 asynchronous context manager

Such an object controls the environment of an async with statement by defining the __aenter__ and __aexit__ methods

import asyncio


class AsyncContextManager(object):
    def __init__(self):
        self.conn = None

    async def do_something(self):
        # Asynchronous operation database
        return 666

    async def __aenter__(self):
        # Asynchronously connect to the database (asyncio.sleep stands in for the real connection here)
        self.conn = await asyncio.sleep(2)
        return self

    async def __aexit__(self, exc_type, exc_val, exc_tb):
        # Asynchronously close the database connection
        await asyncio.sleep(2)


async def func():
    # async with can only be used in a coroutine function
    async with AsyncContextManager() as f:
        result = await f.do_something()
        print(result)

asyncio.run(func())

4. uvloop

uvloop is an alternative event loop for asyncio: it is faster than asyncio's default event loop, with performance comparable to Go.

pip install uvloop
import asyncio
import uvloop
asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())


#  Write asyncio code, which is consistent with the previous code

# The internal event loop will automatically change to uvloop

asyncio.run(...)

Note: the ASGI server uvicorn uses uvloop internally.
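
A minimal, complete sketch of this pattern (the hello-style coroutine is just for illustration):

# -*- coding: utf-8 -*-
# Minimal sketch: install the uvloop event loop policy, then write ordinary asyncio code.
import asyncio
import uvloop

asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())


async def main():
    print("running on uvloop")
    await asyncio.sleep(1)


# asyncio.run() creates its event loop through the policy, so it gets a uvloop loop
asyncio.run(main())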

5. Practical cases

5.1 Asynchronous Redis operations

When operating Redis from Python, connecting, issuing commands and disconnecting are all network IO.

pip install aioredis

Example 1:

import asyncio
import aioredis


async def execute(address, password):
    print("Start execution", address)
    # Network IO creates redis connection
    redis = await aioredis.create_redis(address, password=password)
    # Network IO sets the hash value car in redis, and internally sets three key value pairs redis = {"car": {"key1": 1, "key2": 2, "key3": 3}}
    await redis.hmset_dict("car", key1=1, key2=2, key3=3)
    # Get value from network IO to redis
    result = await redis.hgetall("car", encoding="utf-8")
    print(result)

    redis.close()
    # Network IO closes redis connection
    await redis.wait_closed()

    print("End", address)


asyncio.run(execute("redis://127.0.0.1:6379", "123456"))

Example 2:

import asyncio
import aioredis


async def execute(address, password):
    print("Start execution", address)
    # Network IO: connect to 47.93.4.197:6379 first; when that hits IO the task switches and 47.93.4.198:6379 gets connected
    redis = await aioredis.create_redis(address, password=password)
    # Network IO will automatically switch tasks when it encounters IO
    await redis.hmset_dict("car", key1=1, key2=2, key3=3)
    # Network IO will automatically switch tasks when it encounters IO
    result = await redis.hgetall("car", encoding="utf-8")
    print(result)

    redis.close()
    # Network IO will automatically switch tasks when it encounters IO
    await redis.wait_closed()

    print("End", address)


task_list = [
    execute("47.93.4.197:6379", "123456"),
    execute("47.93.4.198:6379", "123456"),
]

asyncio.run(asyncio.wait(task_list))

5.2 Asynchronous MySQL operations

pip install aiomysql

Example 1:

import asyncio
import aiomysql


async def execute():
    print("Start execution")
    # Network IO connection to MySQL
    conn = await aiomysql.connect(host="127.0.0.1", port=3306, user="root", password="123456", db="mysql")

    # Create cursor for network IO
    cur = await conn.cursor()
    # Network IO execution sql
    await cur.execute("seletc name from user")
    # Get sql results from network IO
    result = await cur.fetchall()
    print(result)
    # Network IO close connection
    await cur.close()
    conn.close()


asyncio.run(execute())

Example 2:

import asyncio
import aiomysql


async def execute(host, password):
    print("Start execution", host)
    # Network IO: connect to MySQL at 47.93.41.197 first; when that hits IO the task switches and 47.93.41.198 gets connected
    conn = await aiomysql.connect(host=host, port=3306, user="root", password=password, db="mysql")

    # Network IO will automatically switch tasks when it encounters IO
    cur = await conn.cursor()
    # Network IO will automatically switch tasks when it encounters IO
    await cur.execute("seletc name from user")
    # Network IO will automatically switch tasks when it encounters IO
    result = await cur.fetchall()
    print(result)
    # Network IO will automatically switch tasks when it encounters IO
    await cur.close()
    conn.close()
    print("End", host)


task_list = [
    execute("47.93.41.197", "123456"),
    execute("47.93.41.198", "123456")
]
asyncio.run(asyncio.wait(task_list))

5.3 FastAPI framework

Install:

pip install fastapi
pip install uvicorn # an ASGI server (ASGI can be thought of as asynchronous WSGI); internally based on uvloop

Example: mu.py

import asyncio
import aioredis
import uvicorn
from fastapi import FastAPI
from aioredis import Redis


app = FastAPI()
REDIS_POOL = aioredis.ConnectionsPool("redis://47.193.14.198:6379", password="123", minsize=1, maxsize=10)


@app.get("/")
def index():
    """General operation interface"""
    return {"msg": "hello world"}

@app.get("/red")
async def red():
    # Asynchronous endpoint
    print("Here comes the request")
    await asyncio.sleep(3)
    # Get a connection from the connection pool
    conn = await REDIS_POOL.acquire()
    redis = Redis(conn)
    # Set value
    await redis.hmset_dict("car", key1=1, key2=2, key3=3)
    # Read value
    result = await redis.hgetall("car", encoding="utf-8")
    print(result)
    # Return the connection to the pool
    REDIS_POOL.release(conn)
    return result


if __name__ == "__main__":
    uvicorn.run("mu:app", host="127.0.0.1", port=5000, log_level="info")

5.4 Web crawler

pip install aiohttp
import asyncio
import aiohttp


async def fetch(session, url):
    print("Send request", url)
    async with session.get(url, verify_ssl=False) as response:
        text = await response.text()
        print("result:", url, len(text))
        return text


async def main():
    async with aiohttp.ClientSession() as session:
        url_list = [
            "https://python.org",
            "https://www.baidu.com",
            "https://tianbaoo.github.io"
        ]

        tasks = [asyncio.create_task(fetch(session, url)) for url in url_list]
        done, pending = await asyncio.wait(tasks)
        print(done)

if __name__ == "__main__":
    asyncio.run(main())

6. Summary

The key point: make use of a thread's IO wait time to do other work.

Posted by Lateuk on Mon, 04 May 2020 04:42:37 -0700