Synchronization

Concept

Synchronous code executes step by step: each statement must finish before the next one starts.
Advanced
But some operations are time-consuming: fetching resources from elsewhere, requesting data from another website, querying a database, uploading files, and so on. Synchronous execution handles these poorly, as the example below shows.
```python
"""Function of this module: <synchronous / asynchronous demo>"""
import time

# Add a time-consuming operation
def longIO():
    print("Start time-consuming operation")
    time.sleep(5)
    print("End time-consuming operation")

# This is equivalent to a client request
def reqA():
    print("Start processing reqA")
    longIO()
    print("End processing reqA")

# This is equivalent to another client's request
def reqB():
    print("Start processing reqB")
    print("End processing reqB")

def main():
    # This is what synchronous execution looks like
    reqA()
    reqB()
    while True:
        # If you write an endless loop, you must sleep inside it.
        # Why? Without the sleep, CPU usage would sit at 100%.
        time.sleep(0.1)

if __name__ == '__main__':
    main()
```
Result
```
Start processing reqA
Start time-consuming operation   <- waits 5 seconds here
End time-consuming operation
End processing reqA
Start processing reqB
End processing reqB
```
The time-consuming operation in reqA blocks reqB entirely: synchronous handling wastes those five seconds, and nothing of tornado's efficiency shows up yet.
Asynchronous

Idea: while someone else handles one task, you move on to the next.
Summary
A time-consuming operation is handed over to someone else (another thread) while we continue executing; when that someone finishes, the result is delivered back to us.
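The same idea can be sketched with the standard library's `concurrent.futures` (this is a minimal illustration, not the article's tornado mechanism; `slow_fetch` is a made-up stand-in for the time-consuming operation):

```python
from concurrent.futures import ThreadPoolExecutor
import time

results = []

def slow_fetch():
    """Stand-in for a time-consuming operation (network, database, upload)."""
    time.sleep(0.2)
    return "data"

def on_done(future):
    # Called when the worker thread finishes; future.result() is the return value
    results.append(future.result())

with ThreadPoolExecutor(max_workers=1) as pool:
    future = pool.submit(slow_fetch)    # hand the slow work to another thread
    future.add_done_callback(on_done)   # be told when it is finished
    print("the main thread keeps going")  # we continue executing immediately

print(results)  # the pool has shut down, so the result has arrived by now
```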
Implementing asynchrony with a callback function
We have in fact used asynchrony before. JavaScript's ajax is the obvious example: after sending an ajax request we go off and do other work, and only deal with the request again when its response arrives.
The code is shown below:
```python
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# Created by victor
"""Function of this module: <asynchronous demo>"""
import time
import threading

# Add a time-consuming operation.
# Problem: we cannot receive the return value of run() directly.
# The solution is to pass in a function to be called with the result:
# this function is the callback.
def longIO(callback):
    def run(cb):
        print("Start time-consuming operation")
        time.sleep(3)
        print("End time-consuming operation")
        cb("victor is a wonderful man")
    threading.Thread(target=run, args=(callback,)).start()
```

This longIO part, like ajax, does not need to be written by us; a framework would provide it.

```python
def finish(data):
    print("Start processing callback function")
    print("Data received from longIO:", data)
    print("End processing callback function")

# This is equivalent to a client request
def reqA():
    print("Start processing reqA")
    longIO(finish)
    print("End processing reqA")

# This is equivalent to another client's request
def reqB():
    print("Start processing reqB")
    time.sleep(1)
    print("End processing reqB")

def main():
    reqA()
    reqB()
    while True:
        # Sleep inside the endless loop, otherwise CPU usage hits 100%
        time.sleep(0.1)

if __name__ == '__main__':
    main()
```
Asynchrony here means tornado can go on handling other requests; the browser that issued the slow request still has to wait for its response.
Implementing asynchrony with coroutines
You do not need a deep understanding of coroutines to implement asynchrony this way. And there is no heavyweight machinery to worry about: threads are already troublesome enough, never mind processes.
Version 1
A basic, bare-bones version.
```python
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# Created by victor
"""Function of this module: <asynchrony with a generator, version 1>"""
import time
import threading

gen = None

# Add a time-consuming operation
def longIO():
    def run():
        print("Start time-consuming operation")
        time.sleep(3)
        try:
            global gen
            gen.send("victor is wonderful!!!")  # deliver the result back into reqA
        except StopIteration:
            pass
        print("End time-consuming operation")
    threading.Thread(target=run).start()

# This longIO part, like ajax, does not need to be written by us.
# The earlier problem remains: we cannot receive run()'s return value,
# so here the generator's send() takes the place of the callback.

# This is equivalent to a client request
def reqA():
    print("Start processing reqA")
    res = yield longIO()  # reqA suspends here until gen.send(...) resumes it
    print("Data received from longIO:", res)
    print("End processing reqA")

# This is equivalent to another client's request
def reqB():
    print("Start processing reqB")
    time.sleep(1)
    print("End processing reqB")

def main():
    global gen
    gen = reqA()   # build the generator
    next(gen)      # run reqA up to its yield
    reqB()
    while True:
        # Sleep inside the endless loop, otherwise CPU usage hits 100%
        time.sleep(0.1)

if __name__ == '__main__':
    main()
```
Version 2
There is a problem. In version 1, reqA is not called the same way as reqB: reqA cannot be treated as an ordinary function, it has to be driven as a generator, even though conceptually we want to call it like any other function.
Reality:

```python
global gen
gen = reqA()  # build the generator
next(gen)     # run reqA up to its yield
```

Ideal:

```python
reqA()  # just a plain call
```
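The reason the "reality" version needs those extra lines is easy to demonstrate in isolation: calling a generator function only builds a generator object and runs none of its body until next() is called (this is a toy function, not part of the article's example):

```python
def req():
    print("Start")
    res = yield "suspended"   # execution pauses at this yield
    print("Got:", res)

g = req()          # nothing is printed yet: this only builds a generator
first = next(g)    # now the body runs up to the first yield
print(first)       # the value that was yielded
try:
    g.send("hello")  # resumes at the yield; res becomes "hello"
except StopIteration:
    pass             # the body ran to the end
```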
Time to bring in a decorator.
```python
# Can a decorator hide the bookkeeping? Yes:
def genCoroutine(func):
    def wrapper(*args, **kwargs):
        # These are exactly the three lines from version 1's main()
        global gen
        gen = func(*args, **kwargs)  # build the generator
        next(gen)                    # run it up to its yield
    return wrapper
```
reqA is then defined as:
```python
@genCoroutine
def reqA():
    print("Start processing reqA")
    res = yield longIO()  # reqA suspends here
    print("Data received from longIO:", res)
    print("End processing reqA")
```
And the call site becomes:
```python
# This is equivalent to another client's request
def reqB():
    print("Start processing reqB")
    time.sleep(1)
    print("End processing reqB")

def main():
    # No more manual generator bookkeeping:
    # gen = reqA(); next(gen) is now handled by the decorator
    reqA()
    reqB()
    while True:
        # Sleep inside the endless loop, otherwise CPU usage hits 100%
        time.sleep(0.1)

if __name__ == '__main__':
    main()
```
Version 3
Version 2 still has a problem: it relies on a global gen variable, which frankly should not be there at all.

This is the most involved version, so let's walk through it piece by piece.
- Module docstring and imports
```python
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# Created by victor
"""Function of this module: <asynchrony with generators, version 3>"""
import time
import threading
```
- Define the decorator (the most important part: this is where the asynchronous, concurrent behaviour is implemented)
```python
def genCoroutine(func):
    # This is where most people get confused.
    def wrapper(*args, **kwargs):
        # No global gen this time: the wrapper manages two generators itself
        gen1 = func()        # reqA's generator
        gen2 = next(gen1)    # running reqA to its yield returns longIO's generator

        # The worker thread executes longIO, then "wakes up" the suspended reqA
        def run(g):
            res = next(g)       # execute longIO up to its yield
            try:
                gen1.send(res)  # deliver the data back into reqA
            except StopIteration:
                pass            # reqA finished; nothing to do
        threading.Thread(target=run, args=(gen2,)).start()
    return wrapper
```
- The time-consuming operation (the trickiest part)
```python
# Add a time-consuming operation: this is where a handler obtains its data
# (a database, another server, anything slow).
def longIO():
    # You only write the time-consuming work itself; the threading above
    # (tornado, in practice) is handled for you.
    print("Start time-consuming operation")
    time.sleep(3)
    # Yield the data once the time-consuming work is done
    yield "victor is a cool man"
```
- Decorated function definition
```python
@genCoroutine
def reqA():
    print("Start processing reqA")
    res = yield longIO()  # reqA suspends here
    print("Data received from longIO:", res)
    print("End processing reqA")
```
- Another client's request
```python
# This is equivalent to another client's request
def reqB():
    print("Start processing reqB")
    time.sleep(1)
    print("End processing reqB")
```
- Program entry function
```python
def main():
    reqA()
    reqB()
    while True:
        time.sleep(0.1)

if __name__ == '__main__':
    main()
```
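To see the mechanism work end to end (and terminate), here is a condensed, self-contained rerun of version 3. It is not the article's exact script: the sleeps are shortened, output goes into a list, and the decorator returns its worker thread so the demo can join it instead of looping forever.

```python
import threading
import time

log = []

def genCoroutine(func):
    def wrapper(*args, **kwargs):
        gen1 = func()          # reqA's generator
        gen2 = next(gen1)      # longIO's generator, yielded by reqA
        def run(g):
            res = next(g)      # run longIO to its yield
            try:
                gen1.send(res) # hand the result back to reqA
            except StopIteration:
                pass
        t = threading.Thread(target=run, args=(gen2,))
        t.start()
        return t               # returned only so this demo can wait on it
    return wrapper

def longIO():
    log.append("longIO start")
    time.sleep(0.2)            # shortened time-consuming operation
    yield "victor is a cool man"

@genCoroutine
def reqA():
    log.append("reqA start")
    res = yield longIO()
    log.append("reqA got: " + res)

def reqB():
    log.append("reqB start")
    log.append("reqB end")

worker = reqA()  # starts the worker thread and returns it
reqB()           # runs immediately; it does not wait for longIO
worker.join()    # wait so the demo terminates instead of looping forever
print(log)
```

reqB's lines appear before reqA receives its data: the time-consuming operation no longer blocks the main flow.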
In practice we never need to write such a convoluted decorator ourselves: tornado has already written it for us. Whenever you have an asynchronous operation, you just apply the decorator and write nothing else.

Note that tornado's real code is not written exactly this way; writing it all by hand like this would be exhausting. What tornado exposes is easy to use.

Don't be discouraged if this doesn't click immediately; almost nobody understands it on the first pass. The principle sinks in only after you have used it in several ways.
Strictly speaking, this "coroutine" version is not a true coroutine at all, because it uses multiple threads, while coroutines by definition run within a single thread. It exists only to illustrate the principle behind tornado's implementation.
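For comparison, a real coroutine version runs both requests inside one thread. This is a sketch with the standard asyncio module (not tornado's actual implementation); the coroutine names mirror the article's functions:

```python
import asyncio
import threading

order = []
thread_ids = set()  # record which threads actually run the coroutines

async def long_io():
    thread_ids.add(threading.get_ident())
    await asyncio.sleep(0.2)   # yields control to the event loop; no thread blocks
    return "victor is a cool man"

async def req_a():
    order.append("start reqA")
    res = await long_io()      # req_a is suspended here, not a thread
    order.append("reqA got: " + res)

async def req_b():
    thread_ids.add(threading.get_ident())
    order.append("start reqB")
    await asyncio.sleep(0.1)
    order.append("end reqB")

async def main():
    await asyncio.gather(req_a(), req_b())  # the two requests interleave

asyncio.run(main())
print(order)
```

req_b finishes while req_a is still waiting, yet everything happened in a single thread: that is the defining property this article's thread-based version lacks.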