How to push data to MultiChain using a Node.js API

+1 vote
Hi All,

I have installed MultiChain with two nodes and tried the basic functionality: creating assets and streams, granting permissions, sending assets with data, etc.

Now I want to push some data, such as user information (initially a single record), from a Node.js application via the API.

Then I want to push data to MultiChain in bulk and in real time.

Could someone please guide me on how to get started?

PS: I am a newbie to Node.js and MultiChain :)
asked Nov 23, 2018 by blkchain_enthu
I found the answer to my question myself.
I used the third-party library multichain-node and was able to connect to MultiChain using RPC calls.

// Connect to the MultiChain node over JSON-RPC using multichain-node
var multichain = require("multichain-node")({
    port: 6750,
    host: "127.0.0.1",
    user: "multichainrpc",
    pass: "8xT6wyTZwqwfdasfawf"
});

// Sanity check: fetch general information about the node and the chain
multichain.getInfo((err, info) => {
    if (err) {
        throw err;
    }
    console.log(info);
});

//multichain.issue({address: "18NE1HbETjkLasdasdasd", asset: "testcoin", qty: 50000, units: 0.01, details: {hello: "world"}}, (err, res) => {
//    console.log(res)
//})

I would like to know how we can generate and push data to a stream continuously, since I want to test the performance.

Thanks.

1 Answer

0 votes

If your main purpose is pushing data to the chain, you should use a stream and the publish command. You can test the performance by simply performing many publish commands in a loop, one after the other.
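
For example, here is a minimal sketch of such a loop, reusing the multichain-node connection from the question. It assumes the library maps named parameters to the publish command the same way it does for getInfo and issue above; the stream name "teststream" is a placeholder, and in MultiChain 1.x the data payload must be hex-encoded:

// Sketch: push many key-value items to a stream in a loop to test throughput.
// Assumes the stream "teststream" exists and this node has write permission.
function publishMany(count) {
    for (let i = 0; i < count; i++) {
        // MultiChain 1.x expects the item data as a hex string
        const data = Buffer.from(JSON.stringify({user: "user" + i})).toString("hex");
        multichain.publish({stream: "teststream", key: "user" + i, data: data}, (err, txid) => {
            if (err) throw err;
            // each successful publish returns the txid of the transaction carrying the item
        });
    }
}

publishMany(1000);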

If you're on MultiChain 2.0 (alphas), check out the newly added publishmulti API, which lets you publish many stream items in a single transaction to increase data throughput further. (This was always possible, but the new API makes it easier.)
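
For instance, a hedged sketch of what a publishmulti call might look like from Node.js; the camelCase method name publishMulti and its availability in your multichain-node version are assumptions, and the item format follows the 2.0 publishmulti RPC:

// Sketch: publish several items in one transaction on MultiChain 2.0.
// The publishMulti method name is an assumption based on the library's
// camelCase mapping of RPC commands; check your multichain-node version.
const items = [
    {key: "user-1", data: Buffer.from("alice").toString("hex")},
    {key: "user-2", data: Buffer.from("bob").toString("hex")}
];

multichain.publishMulti({stream: "teststream", items: items}, (err, txid) => {
    if (err) throw err;
    console.log(txid); // a single txid: both items land in one transaction
});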

answered Nov 26, 2018 by MultiChain
Thanks for your prompt response!

I am using version 1.6, so I am trying to push lots of data to a stream in loops to check the performance.

So far I have pushed nearly 15k key-value pairs to a single stream, and when I retrieve an item by its key I don't see much latency; I get the response in less than a second.

I would like to know if there is any cut-off point or limit where the latency increases.

And how can we optimize in such cases?

Thanks in advance.
There is no hard cut-off point, but if a node is subscribed to a stream, it is indexing that stream's entire content. So you should expect retrieval performance of O(log n), where n is the total number of stream items. In practice this means that when you move from 100 to 10,000 items, you should expect retrieval time to (at worst) double, and the same again when you move from 10,000 to 100,000,000 items.
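
As a quick way to see this for yourself, you can time a key lookup on a subscribed stream. This is a sketch: the listStreamKeyItems name assumes the library's usual camelCase mapping of the liststreamkeyitems command, and "teststream"/"user-1" are placeholders:

// Sketch: time retrieval of one key from a subscribed stream
const start = Date.now();
multichain.listStreamKeyItems({stream: "teststream", key: "user-1"}, (err, items) => {
    if (err) throw err;
    console.log(items.length + " item(s) retrieved in " + (Date.now() - start) + " ms");
});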
...