Hi,
I am a certified big data developer and have used PySpark in many of my applications. If your workload is large enough to distribute, I feel PySpark is a good fit, since Spark spreads the load across executors on different nodes. If you already have a Spark environment ready, you should start using it; otherwise the same parallelism can be achieved with Python's built-in threading mechanism in pure Python code.
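For reference, the pure-Python route might look something like this — a minimal sketch using the standard library's ThreadPoolExecutor, where process_item is just an illustrative placeholder for your actual per-item work:

```python
from concurrent.futures import ThreadPoolExecutor

def process_item(item):
    # Placeholder for the real work; threads help most with I/O-bound tasks
    return item * 2

items = [1, 2, 3, 4, 5]

# Run the work on a small pool of threads; map preserves input order
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(process_item, items))

print(results)  # [2, 4, 6, 8, 10]
```

With a Spark environment available, the equivalent idea would be distributing the same mapping over an RDD or DataFrame instead of a local thread pool.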
Let’s connect and discuss your requirements in more detail.
Thanks,
Naresh.