python - Sending over the same socket with multiprocessing.pool.map

I'm trying to pack and send columnar data over a socket connection.

To speed it up, I thought about splitting the packing (struct.pack) to multiple processes.

To avoid pickling the data both ways, I thought it might be better to have the packing processes send the data themselves, since socket objects can reportedly be pickled as of Python 3.4.

This is a simplified version of what I have at work:

import socket
from multiprocessing import Pool
from struct import pack

# Start and connect a socket
s = socket.socket()
s.connect((ip, port))  # ip and port defined elsewhere

# Data to be packed and sent in this order
data1 = 1, 2, 3, 4
data2 = 5, 6, 7, 8
data3 = 9, 10, 11, 12

# Top level column packer/sender for mp.pool
def send_column(column):
    return s.send(pack(f'{len(column)}i', *column))


pool = Pool()

# Will this necessarily send the data in order?
pool.map(send_column, (data1, data2, data3))

My question is - Is it guaranteed the data will be sent in order?

If not, what's a prudent way to make sure it does?

I thought about a global counter for processes to check if it's their turn yet, but I'd be happy to hear better ideas.

  1. The socket is shared by the worker processes, and the processes are scheduled by the operating system, so you have no control over their execution order. From our point of view the processes appear to run in a random order (this is not the full truth; check how OS scheduling algorithms work), so you cannot guarantee the order of execution or the order in which the data is written to the socket.
  2. From the network perspective, when you send data over a shared socket you typically don't wait for a response (with TCP), so the sends from the different processes appear to happen simultaneously, and the same goes for the responses.

To make sure the packets are delivered in order, you need to ensure the other end has received each packet before you send the next one, so you are limited to a synchronized send (send a packet only after the previous one was sent and you have made sure it was received). For your use case I would suggest a pool of processes that generate the packed objects and put them on a queue (the producers), with a single consumer that takes the objects off the queue and sends them over the network.
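The producer/consumer idea above can be simplified with `Pool.imap`, which already yields results in submission order even when workers finish out of order. A minimal sketch (the function names are mine, not from the question):

```python
import socket
import struct
from multiprocessing import Pool


def pack_column(column):
    # Worker: CPU-bound packing only; no socket access needed.
    return struct.pack(f'{len(column)}i', *column)


def pack_and_send(sock, columns):
    # Pool.imap yields results in the order the inputs were submitted,
    # even if the workers finish out of order, so the single sender
    # below writes the packed columns to the socket strictly in order.
    with Pool() as pool:
        for packed in pool.imap(pack_column, columns):
            sock.sendall(packed)  # sendall avoids short writes
```

You can try it without a real server by using `socket.socketpair()` to get a connected pair of sockets.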

Looking at your use-case, you have 2 time-intensive tasks:

  • packing/serializing the data
  • sending the data

Packing on your machine is a CPU-intensive task: it would probably not profit much (if at all) from multithreading, since the GIL means only one thread executes Python bytecode at a time. Packing in multiple processes would probably speed up the packing part, since multiple cores can be leveraged, but on the other hand you'll have to copy the data to a new space in main memory, since processes don't share memory. You should test whether multiprocessing makes sense there; if not, try shared memory, which would eliminate the speed loss from copying the data and still let you pack on multiple cores (but adds a lot of complexity to your code). For packing in general I'd also recommend looking at protobuf or flatbuffers.
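As a rough sketch of the shared-memory variant (requires Python 3.8+ for `multiprocessing.shared_memory`; the helper names are my own): since each column's packed size is known up front, workers can pack directly into disjoint slices of one shared buffer, so the packed bytes never have to be pickled back to the parent, and the buffer is in order by construction.

```python
import struct
from multiprocessing import Pool, shared_memory


def pack_into_shared(args):
    # Worker: pack one column directly into its slice of the shared buffer.
    shm_name, offset, column = args
    shm = shared_memory.SharedMemory(name=shm_name)
    struct.pack_into(f'{len(column)}i', shm.buf, offset, *column)
    shm.close()


def pack_columns_shared(columns):
    # Offsets are known up front (4 bytes per int), so each worker
    # writes into its own disjoint slice -- no locking needed.
    sizes = [4 * len(c) for c in columns]
    total = sum(sizes)
    offsets = [sum(sizes[:i]) for i in range(len(sizes))]
    shm = shared_memory.SharedMemory(create=True, size=total)
    try:
        with Pool() as pool:
            pool.map(pack_into_shared,
                     [(shm.name, off, col)
                      for off, col in zip(offsets, columns)])
        return bytes(shm.buf[:total])  # one contiguous, in-order blob
    finally:
        shm.close()
        shm.unlink()
```

The returned blob can then be sent with a single `sock.sendall()` call from the main process, which sidesteps the ordering question entirely.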

Sending the data, on the other hand, profits from concurrency not because the CPU needs so much time, but because of network delays and waiting for acknowledgement packets. A significant speedup can therefore be achieved with threads or asyncio, because waiting on a reply isn't sped up by using multiple cores.

I'd suggest you test whether packing on multiple cores using the multiprocessing library has the desired effect. If so, you'll have to index or timestamp your packets so you can realign them on the other side. There is no mechanism to "make sure they are sent in order", simply because enforcing that would throw away most of the time you saved through concurrency. So don't synchronize where you don't have to; otherwise you could skip working asynchronously altogether.

However, if packing on multiple processes only yields a negligible speedup (and this is what I suspect), I'd recommend packing/serializing the data in the main thread and then sending the data on one thread each, or using asyncio. For a how-to on that, please refer to this answer. You will have to expect data to arrive out of order, so either index your packets or timestamp them.
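Indexing can be as simple as a fixed header per packet. A sketch of such a scheme (`frame`/`reassemble` are hypothetical names, not from any library): each packet carries a sequence number and a length, and the receiver sorts by sequence number.

```python
import struct


def frame(index, payload):
    # Prefix the payload with a network-order (index, length) header
    # so the receiver can both split the stream and reorder it.
    return struct.pack('!II', index, len(payload)) + payload


def reassemble(stream):
    # Parse frames out of a byte stream and return the payloads
    # sorted by their index, regardless of arrival order.
    frames = {}
    pos = 0
    while pos < len(stream):
        index, length = struct.unpack_from('!II', stream, pos)
        pos += 8
        frames[index] = stream[pos:pos + length]
        pos += length
    return [frames[i] for i in sorted(frames)]
```

With this framing, threads (or asyncio tasks) can send in whatever order they finish, and the receiver still reconstructs the original column order.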

HTH

If for some reason you absolutely have to pack on multiple processes and send the data in order, you'll have to look at shared memory and set it up so the main process creates a process for each set of data and shares the memory of each dataset with the correct process. Each child process then writes its packed data into a shared memory object the parent can read. The parent process loops over the shared memory objects the children write to and only sends a piece of data if it is the first, or if the previous piece is marked as sent. Sending the data in this case should NOT happen using threads or anything asynchronous, as then the correct order would again not be guaranteed... That said, better not to use this solution (extremely complex, minimal gain); go with either of the two approaches above.
