Thursday, March 23, 2023

Speeding up Python code using multithreading


Hey lovely folks! 👋 A lot of the time we end up writing Python code that makes remote requests, reads multiple files, or processes some data. And in a lot of those cases I have seen programmers use a simple for loop that takes forever to finish executing. For example:

import requests
from time import time

url_list = [
    "https://via.placeholder.com/400",
    "https://via.placeholder.com/410",
    "https://via.placeholder.com/420",
    "https://via.placeholder.com/430",
    "https://via.placeholder.com/440",
    "https://via.placeholder.com/450",
    "https://via.placeholder.com/460",
    "https://via.placeholder.com/470",
    "https://via.placeholder.com/480",
    "https://via.placeholder.com/490",
    "https://via.placeholder.com/500",
    "https://via.placeholder.com/510",
    "https://via.placeholder.com/520",
    "https://via.placeholder.com/530",
]

def download_file(url):
    html = requests.get(url, stream=True)
    return html.status_code

begin = time()

for url in url_list:
    print(download_file(url))

print(f'Time taken: {time() - begin}')

Output:

<--truncated-->
Time taken: 4.128157138824463

This is a simple example: the code opens each URL, waits for it to load, prints its status code, and only then moves on to the next URL. This kind of code is a very good candidate for multi-threading.

Modern systems can run a lot of threads, which means you can do multiple tasks at once with very low overhead. Why don't we try to make use of that to make the above code process these URLs faster?

We'll make use of ThreadPoolExecutor from the concurrent.futures module. It's super easy to use. Let me show you some code and then explain how it works.

import requests
from concurrent.futures import ThreadPoolExecutor, as_completed
from time import time

url_list = [
    "https://via.placeholder.com/400",
    "https://via.placeholder.com/410",
    "https://via.placeholder.com/420",
    "https://via.placeholder.com/430",
    "https://via.placeholder.com/440",
    "https://via.placeholder.com/450",
    "https://via.placeholder.com/460",
    "https://via.placeholder.com/470",
    "https://via.placeholder.com/480",
    "https://via.placeholder.com/490",
    "https://via.placeholder.com/500",
    "https://via.placeholder.com/510",
    "https://via.placeholder.com/520",
    "https://via.placeholder.com/530",
]

def download_file(url):
    html = requests.get(url, stream=True)
    return html.status_code

begin = time()

processes = []
with ThreadPoolExecutor(max_workers=10) as executor:
    for url in url_list:
        processes.append(executor.submit(download_file, url))

for task in as_completed(processes):
    print(task.result())


print(f'Time taken: {time() - begin}')

Output:

<--truncated-->
Time taken: 0.4583399295806885

We just sped up our code by a factor of almost 9! And we didn't even do anything super involved. The performance benefit would have been even bigger if there were more URLs.

So what is going on? When we call executor.submit we are adding a new task to the thread pool. We store that task in the processes list. Later we iterate over the list and print out the result of each task.

The as_completed function yields the items (tasks) from the processes list as soon as they complete. There are two ways a task can reach the completed state: it has either finished executing or it got cancelled. We could also have passed a timeout parameter to as_completed; if the results aren't all available within that time interval, as_completed raises a TimeoutError instead of waiting forever.
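To see the timeout behaviour in isolation, here is a minimal, self-contained sketch. The slow_task function is a stand-in I made up (it just sleeps instead of downloading anything), so it runs without network access:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed, TimeoutError
from time import sleep

def slow_task(delay):
    # stand-in for a real download: sleeps, then returns its delay
    sleep(delay)
    return delay

completed = []
with ThreadPoolExecutor(max_workers=2) as executor:
    futures = [executor.submit(slow_task, d) for d in (0.1, 2)]
    try:
        # yields futures as they finish; raises TimeoutError if some
        # are still unfinished when 0.5 seconds have elapsed
        for future in as_completed(futures, timeout=0.5):
            completed.append(future.result())
    except TimeoutError:
        print("timed out waiting for the slower task")

print(completed)  # only the fast task finished in time: [0.1]
```

Note that leaving the with block still waits for the slow task to finish; the timeout only limits how long we wait to collect results, it doesn't cancel running threads.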

You should explore multi-threading a bit more. For trivial projects it's the quickest way to speed up your code. If you want to learn more, read the official docs. They're super helpful.
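As a starting point, if you don't need results in completion order, executor.map is an even shorter way to fan work out over the pool; it returns results in the same order as the inputs. A minimal sketch (fake_download and these URLs are made-up stand-ins so it runs offline):

```python
from concurrent.futures import ThreadPoolExecutor

def fake_download(url):
    # stand-in for download_file so the sketch needs no network access
    return 200

url_list = [f"https://example.com/{n}" for n in range(5)]

with ThreadPoolExecutor(max_workers=10) as executor:
    # unlike as_completed, map yields results in input order
    statuses = list(executor.map(fake_download, url_list))

print(statuses)  # [200, 200, 200, 200, 200]
```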

Have a very good day! So long!
