Python Stories Ep. 1

Hemanth Sharma
Jul 12, 2021

“Do we need a queue?”

We finished a demo one day for a team of 40. It went well.
But there were things we needed to fix.
The product we were working on was slow.

We had 8–9 microservices that ran sequentially. To make the pipeline faster we used Kafka queues, so while one request was hitting the 5th microservice, the next request’s 4th microservice could already start. That gave us some level of concurrency across microservices.
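
For illustration only, here is a rough sketch of what one stage of such a pipeline can look like. It assumes the kafka-python client and hypothetical topic names, neither of which comes from the original post:

from kafka import KafkaConsumer, KafkaProducer
import json

# Hypothetical topics: each microservice reads from its own input topic
# and hands the request off to the next stage's topic.
consumer = KafkaConsumer(
    "stage-4-input",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda m: json.loads(m.decode("utf-8")),
)
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

for message in consumer:
    request = message.value
    # ... do this stage's work on the request ...
    producer.send("stage-5-input", request)  # the next request can already be in stage 4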

But the Kafka consumers were not using threads. Threads were not working for us because somewhere in the Redis update code we were fetching data from Redis, appending more data to it, and then pushing it back.

When threads executed this code, by the time one thread had fetched the data from Redis, appended to it, and pushed it back, another thread could already have read the old data. That second thread would then append to the stale data and push it, so the new data from the first thread was lost.
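
To make the race concrete, here is a minimal sketch of that read-modify-write pattern, assuming redis-py as the client, the ids stored as a JSON list, and a hypothetical key name (none of these details are from the original code):

import json
import redis

r = redis.Redis()  # assumes a local Redis instance

def update_redis_data(new_ids):
    raw = r.get("request:ids")                 # 1. fetch data from redis
    current = json.loads(raw) if raw else []
    current.extend(new_ids)                    # 2. append ids in memory
    r.set("request:ids", json.dumps(current))  # 3. push new data to redis

# If thread A and thread B both finish step 1 before either reaches step 3,
# whichever thread writes last overwrites the other's ids: a lost update.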

It was an interesting problem for us. Now we had two choices in mind.
One was to go without threads, which is what we did :(
That, of course, caused the slowness.

The other was to put a queue in front of the Redis update code, so that the Redis updates would execute one after another.

“Do we need a queue?”
If we posed that question here: sure, with a queue the updates would execute one after another. But we would also need a mechanism to send the response back to the caller function (which is itself a different Kafka consumer). It would get messy.

The Solution:
The solution was hiding in one of Python’s long-lived standard library modules:
“threading”

It is the lock.

By wrapping a piece of code with threading.Lock(), one thread has to wait until the lock is released by another thread.

The code was changed from

class RedisUtils:

    def updateRedisData(self, ids):
        # fetch data from redis
        # append ids
        # push new data to redis
        ...

to

import threading

class RedisUtils:
    def __init__(self):
        self.lock = threading.Lock()

    def updateRedisData(self, ids):
        with self.lock:  # only one thread at a time gets past this point
            # fetch data from redis
            # append ids
            # push new data to redis
            ...
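
As a standalone illustration (not the production code), the same locking pattern can be exercised with a plain Python list standing in for Redis, so it runs without any external services:

import threading

lock = threading.Lock()
shared = []

def append_ids(ids):
    with lock:                  # only one thread at a time runs this block
        current = list(shared)  # "fetch"
        current.extend(ids)     # "append"
        shared[:] = current     # "push back"

threads = [threading.Thread(target=append_ids, args=([i],)) for i in range(10)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(sorted(shared))  # all ten ids survive: [0, 1, ..., 9]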

Voila, we had a perfectly working Redis update path, and the product’s response time was much faster.
