Multichain Explorer throws exception during block catch-up

+2 votes

I am playing around with the Multichain Explorer and got the following error:

block_tx 2847 2878
block_tx 2848 2879

Exception at 792928
Failed to catch up {'blkfile_offset': 721627, 'blkfile_number': 100000, 'chain_id': 1, 'loader': u'default', 'conf': None, 'dirname': u'/home/xxxxxxxl/.multichain/documentChain', 'id': 1}
Traceback (most recent call last):
  File "Mce/", line 2855, in catch_up
  File "Mce/", line 3166, in catch_up_dir
    store.import_blkdat(dircfg, ds, blkfile['name'])
  File "Mce/", line 3298, in import_blkdat
    store.import_block(b, chain = chain)
  File "Mce/", line 1195, in import_block
    tx['tx_id'] = store.import_tx(tx, pos == 0, chain)
  File "Mce/", line 2115, in import_tx
    address = util.hash_to_address_multichain(vers, pubkey_hash, checksum)
  File "Mce/", line 141, in hash_to_address_multichain
    vh += hash[pos:pos+5]
TypeError: cannot concatenate 'str' and 'list' objects
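For context, this TypeError means the code tried to concatenate a list slice onto a string: in `hash_to_address_multichain`, `vh += hash[pos:pos+5]` fails if `hash` arrives as a list of integers instead of a string/bytes value. The sketch below (hypothetical function and variable names, not the actual Mce code) reproduces the failure mode and shows a defensive conversion, using Python 3 bytes to stand in for Python 2 strings:

```python
def hash_to_address_sketch(version, payload):
    """Illustrative only: concatenate 5-byte chunks of payload onto version.

    If payload is a list of ints (the situation behind the TypeError),
    convert it to bytes first so the += concatenation is type-consistent.
    """
    if isinstance(payload, list):
        payload = bytes(payload)  # list of ints -> bytes; avoids str + list
    vh = version
    for pos in range(0, len(payload), 5):
        vh += payload[pos:pos + 5]  # safe: bytes + bytes
    return vh

# Without the conversion, b'\x00' + [1, 2, 3, 4, 5] raises the same kind of
# TypeError seen in the traceback.
```

This only illustrates the type mismatch; the practical fix reported in this thread was updating the Explorer to a build with streams support.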

The Explorer version is from 29 Sep 2016.

The MultiChain version is alpha 24; I used a lot of the new stream definitions in the blockchain.

Any ideas?

If you need logs, please tell me which ones.


asked Sep 29, 2016 by hm
Hi guys!

I also have this issue, but my goal here is to understand the cause.

My best guess is usually the data published into the streams, or in this case that I need to update my protocol.

My setup:

MultiChain protocol: 10009
Multichain Explorer: latest commit 54fe63cb8b5ec0d05cd453dc2a6a21165992641c
You should update your Explorer to the latest.

1 Answer

0 votes
Hi HM, can you please zip up your .multichain/documentChain folder and email it to me, along with your explorer folder (with your .conf and .sqlite files)? Email to simon at coinsciences dot com. Thank you.
answered Oct 2, 2016 by simon
Is this issue resolved? I'm facing the same issue and have no idea how to fix it. Please help as soon as possible.
Have you tried the later Explorer? We've been adding streams support.
Yes, thank you. It works great :)