Multiprocessing and streams

+2 votes
I have a stream of n items. Instead of retrieving all of them with a single API call, I wanted to make use of multiprocessing. So I wrote a little script that fetches the first n/2 items on one core and the remaining n/2 items on another core. However, I did not obtain any speed-up. Is interacting with streams simply not compatible with multiprocessing?

You can see my attempt here: https://gist.github.com/TinfoilHat0/c509deb1219c268b9b2e8aa8e5e6cee5#file-streams_in_parallel-py
asked Sep 6, 2018 by TinfoilHat

1 Answer

+1 vote
 
Best answer
Stream item fetching takes place under a concurrency lock inside MultiChain, so I'm afraid you won't be able to increase performance in this way. Instead, you may want to consider running two nodes, both subscribed to the stream, even on the same system. If you query those two nodes simultaneously, you should see a speedup.
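As a sketch of the two-node approach: assuming two local nodes listening on hypothetical RPC ports 8570 and 8571, the queries could be split across a multiprocessing pool like this. The actual JSON-RPC call (e.g. `liststreamitems` with a `count`/`start` window) is left as a comment and replaced with a simulated fetch so the example runs standalone; the node URLs and stream name are placeholders, not taken from the question.

```python
from multiprocessing import Pool

def split_ranges(n, parts):
    """Divide n items into (start, count) slices suitable for
    paging a liststreamitems-style call across several nodes."""
    base, rem = divmod(n, parts)
    ranges, start = [], 0
    for i in range(parts):
        count = base + (1 if i < rem else 0)
        ranges.append((start, count))
        start += count
    return ranges

def fetch_slice(args):
    node_url, start, count = args
    # In a real setup, issue a JSON-RPC request to node_url here, e.g.
    # liststreamitems("stream1", False, count, start) via your RPC client
    # of choice. Simulated result so this sketch is runnable as-is:
    return list(range(start, start + count))

if __name__ == "__main__":
    n = 10
    # Hypothetical RPC endpoints of two nodes subscribed to the same stream.
    nodes = ["http://127.0.0.1:8570", "http://127.0.0.1:8571"]
    jobs = [(nodes[i], s, c)
            for i, (s, c) in enumerate(split_ranges(n, len(nodes)))]
    with Pool(len(nodes)) as pool:
        slices = pool.map(fetch_slice, jobs)
    # Reassemble the stream items in their original order.
    items = [item for sl in slices for item in sl]
    print(items)
```

Since each worker talks to a different node, the per-node concurrency lock no longer serializes the two fetches.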
answered Sep 7, 2018 by MultiChain
selected Sep 10, 2018 by TinfoilHat