Stream Performance Under High Data Load

+1 vote
What are the performance guarantees for reading from and writing to streams, based on the amount of data in a stream? What performance should we expect for a stream with small entries that is 2 GB in size? What about 2 TB?
asked Apr 11, 2018 by anonymous

1 Answer

0 votes
There aren't performance guarantees per se, because performance depends on the specification of the server you are using.

We have tested streams in blockchains over 1 TB in size, with no observed problems. A stream write is just a regular blockchain transaction, so its cost does not depend on how many items have already been written to the stream.
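
As an illustrative sketch (not part of the original answer), a stream write can be made with the standard publish API over JSON-RPC. The node address, credentials, stream name and key below are placeholder assumptions:

```python
# Minimal sketch: publishing an item to a MultiChain stream over JSON-RPC.
# Host, port, credentials, stream and key names are placeholder assumptions.
import json
import requests

RPC_URL = "http://127.0.0.1:8570"             # assumed node address and RPC port
RPC_AUTH = ("multichainrpc", "rpc-password")  # assumed rpcuser / rpcpassword

def rpc_call(method, params):
    """Send a single JSON-RPC request to the MultiChain node."""
    payload = {"method": method, "params": params, "id": 1}
    response = requests.post(RPC_URL, data=json.dumps(payload), auth=RPC_AUTH)
    response.raise_for_status()
    result = response.json()
    if result.get("error"):
        raise RuntimeError(result["error"])
    return result["result"]

# A stream write is an ordinary transaction: stream name, item key, hex-encoded data.
data_hex = b"example payload".hex()
txid = rpc_call("publish", ["stream1", "key1", data_hex])
print("published item in transaction", txid)
```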

In terms of reading, all stream retrieval APIs use indexes, so they will remain efficient even for streams with a huge number of items. Like any index, you can expect the retrieval time to increase with the logarithm of the number of entries. So, very roughly, going from 10,000 to 100,000,000 items will double the retrieval time (the logarithm goes from 4 to 8). An example of paged retrieval is shown below.
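
For example, the indexed retrieval calls take count and start parameters, so a client can page through a slice of a very large stream instead of fetching everything. This sketch assumes the rpc_call helper and placeholder stream/key names from the example above:

```python
# Sketch of indexed retrieval: fetch only the latest items, regardless of
# how many items the stream holds overall.
# (Assumes the rpc_call helper and placeholder names from the sketch above.)

# Last 10 items published under "key1" in "stream1" (verbose=False).
latest_for_key = rpc_call("liststreamkeyitems", ["stream1", "key1", False, 10, -10])

# Last 10 items in the stream as a whole, under any key.
latest_overall = rpc_call("liststreamitems", ["stream1", False, 10, -10])

for item in latest_for_key:
    print(item["txid"], item["data"])
```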
answered Apr 11, 2018 by MultiChain