[bisq-network/bisq] Clean up trade statistics from duplicate entries (#3470)

chimp1984 notifications at github.com
Fri Oct 25 17:56:26 UTC 2019


Currently we have some duplicate entries in the trade statistics map because the hash has changed with changes to the extraMap field, and we added new entries in recent versions. As there was a republish routine for past trade statistics, such updates introduced duplicates.
To fix this we should add cleanup code that deletes those duplicates. We can check the trade ID and the most significant fields, like price and amount.
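
A rough sketch of what such cleanup code could look like. This is only an illustration of the dedup key (trade ID + price + amount), not the actual implementation; the TradeStat class and its field names are placeholders for TradeStatistics2:

```java
import java.util.ArrayList;
import java.util.Collection;
import java.util.LinkedHashMap;
import java.util.Map;

public class TradeStatisticsCleanup {

    // Placeholder for the real trade statistics entry; field names are assumptions.
    static class TradeStat {
        final String tradeId;
        final long price;
        final long amount;

        TradeStat(String tradeId, long price, long amount) {
            this.tradeId = tradeId;
            this.price = price;
            this.amount = amount;
        }
    }

    // Keep one entry per (tradeId, price, amount); entries that differ only in
    // extraMap (and therefore in their hash) are treated as duplicates and dropped.
    static Collection<TradeStat> removeDuplicates(Collection<TradeStat> all) {
        Map<String, TradeStat> unique = new LinkedHashMap<>();
        for (TradeStat stat : all) {
            String key = stat.tradeId + "|" + stat.price + "|" + stat.amount;
            unique.putIfAbsent(key, stat);
        }
        return new ArrayList<>(unique.values());
    }
}
```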
The republish routine was removed in v1.2, so no new duplicates should be added.
For future updates, when we add entries to that map it would only be an issue if maker and taker run different versions.
If the cleanup code does not cause performance issues we could leave it in place so it would handle such potential future scenarios as well. Alternatively we could add a JsonIgnore annotation to the map.
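
For the annotation alternative, the idea is to exclude the extraMap from the serialized form the hash is computed from, so new extraMap entries no longer change the hash. A minimal illustration using Jackson's @JsonIgnore; whether our hashing path actually goes through Jackson (or another JSON mechanism with its own exclude annotation), and the class and field names used here, are assumptions:

```java
import com.fasterxml.jackson.annotation.JsonIgnore;
import java.util.HashMap;
import java.util.Map;

public class TradeStatEntry {
    public String tradeId;   // hypothetical field names
    public long price;
    public long amount;

    // Excluded from the JSON output, so additions to extraMap in later
    // versions would no longer change the serialized form used for hashing.
    @JsonIgnore
    public Map<String, String> extraMap = new HashMap<>();
}
```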

Reply to this email directly or view it on GitHub:
https://github.com/bisq-network/bisq/issues/3470