[bisq-network/bisq] Clean up trade statistics from duplicate entries (#3476)
Julian Knutsen
notifications at github.com
Sun Nov 10 05:37:38 UTC 2019
julianknutsen commented on this pull request.
> @@ -240,7 +242,12 @@ public void onMessage(NetworkEnvelope networkEnvelope, Connection connection) {
// Processing 82645 items took now 61 ms compared to earlier version where it took ages (> 2min).
// Usually we only get a few hundred or at most a few thousand items. 82645 is all
// trade stats and all account age witness data.
- dataStorage.addPersistableNetworkPayloadFromInitialRequest(e);
+
+ // We only apply it once from first response
+ if (!initialRequestApplied) {
Ok, I think I understand. So we can have up to two loops through the ProtectedStoragePayload path and only one through the PersistableNetworkPayload path. The code handles duplicates just fine, but processing 2x the PersistableNetworkPayload objects takes a lot of time and we don't expect processing them from the second seed to provide meaningful data. Thanks for the rundown.
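The guard being discussed can be sketched as a small one-shot handler: apply the bulk payloads only from the first seed response and skip subsequent ones, since the second response would mostly duplicate the first. This is an illustrative sketch, not Bisq's actual code; the class and method names (`InitialDataHandler`, `applyFromInitialRequest`) are hypothetical.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of the "apply only once from first response" guard.
public class InitialDataHandler {
    private final Map<String, byte[]> store = new HashMap<>();
    private boolean initialRequestApplied = false;

    // Returns the number of payloads actually added to the store.
    public int applyFromInitialRequest(Map<String, byte[]> payloads) {
        if (initialRequestApplied) {
            // A second seed's response would largely duplicate the first,
            // so skip the expensive bulk processing entirely.
            return 0;
        }
        initialRequestApplied = true;
        int applied = 0;
        for (Map.Entry<String, byte[]> e : payloads.entrySet()) {
            // putIfAbsent also deduplicates within a single response.
            if (store.putIfAbsent(e.getKey(), e.getValue()) == null) {
                applied++;
            }
        }
        return applied;
    }
}
```

Note the flag short-circuits before any per-item work, which is the point of the change: duplicate handling was already correct, but iterating the full payload set a second time was the costly part.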
Reply to this email directly or view it on GitHub:
https://github.com/bisq-network/bisq/pull/3476#discussion_r344473375