In this lesson, we will cover:
- What is Scrapli AsyncIO?
- Synchronous vs Asynchronous.
- Asynchronous over multithreading and multiprocessing.
- How to build a Scrapli AsyncIO script.
What is Scrapli AsyncIO?
Scrapli supports Python AsyncIO (asynchronous programming) by providing alternative async drivers and functions that can be used to build AsyncIO-based Scrapli scripts.
Synchronous vs Asynchronous
So what do I mean by Python AsyncIO (asynchronous programming)? As you will recall from the fundamentals section:
Synchronous support is the Python we all know and love – our code waits for something to complete before moving on.
Imagine we have 100 devices, each of which takes 5 seconds to return its config. Collecting all 100 configs would take 500 seconds, because we would call each function sequentially and only move on once the previous call had completed. What is actually happening is that, at the point we send the request to the device, we are blocked on IO, i.e. the input/output activity outside of Python that has to occur (the device processing our request and sending the data back). Wouldn’t it be better if we could get Python to do something else whilst we were waiting for the IO to complete?
Python Asynchronous (via the asyncio module) provides a solution. It allows us to suspend and resume functions, meaning Python is no longer blocked as previously described.
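As a standard-library-only sketch of this idea (no real devices are involved; asyncio.sleep and the device names here simply stand in for the device IO described above), running several simulated config collections concurrently takes roughly as long as the slowest one, not the sum of all of them:

```python
import asyncio
import time

async def fetch_config(device: str, delay: float = 0.2) -> str:
    # 'await' suspends this coroutine, freeing the event loop to start
    # the other requests while this (simulated) IO completes.
    await asyncio.sleep(delay)
    return f"{device}: config collected"

async def main() -> list[str]:
    devices = [f"router{i}" for i in range(10)]
    # Schedule all 10 "collections" and wait for them together.
    return await asyncio.gather(*(fetch_config(d) for d in devices))

start = time.perf_counter()
configs = asyncio.run(main())
elapsed = time.perf_counter() - start

print(len(configs))
# 'elapsed' is roughly 0.2s, well under the 2.0s (10 x 0.2s)
# that a sequential run would take.
print(elapsed)
```

The same pattern scales to the 100-device scenario above: the waits overlap, so total time approaches the time of the single slowest device rather than the sum.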
Another key thing to note is that this is all performed on a single core and a single thread. Therefore we are not actually performing any of our operations in parallel. Instead, we can think of asynchronous programming in Python as a way to:
efficiently schedule our tasks (functions) within Python, in order to reduce the time our script takes to complete.
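A quick way to confirm the single-thread point, using only the standard library, is to record the thread identity inside each task:

```python
import asyncio
import threading

async def task() -> int:
    await asyncio.sleep(0.01)  # yield control back to the event loop
    # Report which OS thread this coroutine is running on.
    return threading.get_ident()

async def main() -> set[int]:
    idents = await asyncio.gather(*(task() for _ in range(5)))
    return set(idents)

thread_ids = asyncio.run(main())
print(len(thread_ids))  # 1 — all five tasks ran on the same thread
```

All of the coroutines execute on the event loop's single thread; concurrency comes from interleaving at await points, not from parallel execution.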
Why Not Use Multi-processing or Multi-threading?
So you may be asking, ‘Can we not just use multi-processing and multi-threading?’
As is often the case in IT, things are never black and white!
As you may be aware, the CPython GIL (Global Interpreter Lock) imposes a limitation: no more than one thread can be executing Python bytecode at any given moment. This lock is necessary mainly because CPython’s memory management is not thread-safe. Therefore you can run multiple threads, but the execution in its true sense will not be parallel. Multi-threading has nevertheless been a good option when dealing with network devices, as much of what occurs is IO-bound. At the point a thread is waiting on IO (for example, waiting for the device to respond), another thread can run.
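As a sketch of why multi-threading still helps for IO-bound work despite the GIL (here time.sleep stands in for waiting on a device; like most blocking IO calls, it releases the GIL while waiting):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fetch_config(device: str) -> str:
    # time.sleep releases the GIL, so other threads run while we "wait".
    time.sleep(0.2)
    return f"{device}: config collected"

devices = [f"router{i}" for i in range(10)]

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=10) as pool:
    configs = list(pool.map(fetch_config, devices))
elapsed = time.perf_counter() - start

print(len(configs))
# 'elapsed' is roughly 0.2s, not 2.0s: the IO waits
# overlapped across the ten threads.
print(elapsed)
```

Only one thread executes Python bytecode at a time, but since the threads spend almost all their time blocked on IO, the waits overlap and the wall-clock time drops, much as it did with asyncio.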
The main problem with multi-threading is the memory and resource overhead of creating and managing multiple threads. This is where asyncio, which is single-threaded, can provide advantages.