[bisq-network/projects] Make compensation requests programmatically parsable (#32)

m52go notifications at github.com
Mon May 4 20:41:34 UTC 2020


> _This is a Bisq Network project. Please familiarize yourself with the [project management process](https://bisq.wiki/Project_management)._

## Description
<!-- Briefly summarize the proposed project. Strive for one or two sentences of plain language that any user, contributor or stakeholder will understand. -->

Currently, in order to determine how funds were allocated in a particular cycle, compensation requests must be manually analyzed and aggregated. This is time-consuming and error-prone. 

This mini-project seeks to establish and implement a new structure for compensation requests that can be parsed by a script.
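For illustration only, one possible shape for such a structure is a small machine-readable block embedded in each request. The sketch below uses YAML and invented field names, teams, and amounts; the actual format is still an open task (see the Tasks section below).

```python
# Hypothetical machine-readable portion of a compensation request.
# Field names, teams, and amounts are invented for illustration;
# the real format (Markdown table, YAML, etc.) is still to be decided.
import yaml  # PyYAML

SAMPLE_REQUEST = """
cycle: 12
total: 1000            # requested amount in BSQ
contributions:
  - team: dev
    amount: 600
    description: bug fixes in bisq-network/bisq
  - team: growth
    amount: 400
    description: wiki documentation updates
"""

request = yaml.safe_load(SAMPLE_REQUEST)
for item in request["contributions"]:
    print(item["team"], item["amount"])
```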

## Rationale
<!-- Make the case for the project. Why is it important? Why should it be done now? What will happen if we don't do it or delay doing it? -->

The [project reorganization implemented in Cycle 10](https://bisq.network/blog/q1-2020-update/) established a budgeting structure with team leads. This has helped the project _look forward_ and plan how resources should be allocated. But planning is useless if one cannot _look backward_ and evaluate results.

## Criteria for delivery
<!-- Make a checklist defining the end state of the project. How will we know that the project is complete, i.e. delivered? What will exist at the completion of this project that does not exist now? What will have changed? What communications, promotions and/or documentation will exist to ensure people know about these changes? -->

This project should result in:
- a new template for compensation requests
- a bot that "lints" compensation requests as they are made (and edited) to ensure they fit the new template and can indeed be parsed
- a bot that parses compensation requests after a cycle's voting period ends, and adds a breakdown of issuance by functional team as a comment on each compensation request issue
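To make the linting step concrete, here is a minimal sketch of the kind of check such a bot could run on each new or edited request, assuming the hypothetical YAML structure from the Description above. The required fields and team names are placeholders, not a decided format, and how the bot is triggered (webhook, GitHub Action, cron) is left to the ops task below.

```python
# Minimal sketch of a lint check, assuming the hypothetical YAML structure above.
# The team list and field names are placeholders; they depend on the final format.
import yaml

TEAMS = {"admin", "dev", "growth", "ops", "support"}  # assumed team names

def lint_request(body: str) -> list:
    """Return a list of problems; an empty list means the request parses cleanly."""
    try:
        request = yaml.safe_load(body)
    except yaml.YAMLError as e:
        return [f"could not parse request: {e}"]
    if not isinstance(request, dict):
        return ["request is not a structured document"]

    errors = []
    contributions = request.get("contributions") or []
    if not contributions:
        errors.append("no contributions listed")

    amounts = []
    for i, item in enumerate(contributions, start=1):
        if not isinstance(item, dict):
            errors.append(f"contribution {i} is not a key/value block")
            continue
        if item.get("team") not in TEAMS:
            errors.append(f"contribution {i}: unknown team {item.get('team')!r}")
        amount = item.get("amount")
        if isinstance(amount, (int, float)):
            amounts.append(amount)
        else:
            errors.append(f"contribution {i}: amount is missing or not a number")

    if not errors and sum(amounts) != request.get("total"):
        errors.append("contribution amounts do not add up to the requested total")
    return errors
```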

The existing compensation request template and wiki documentation will need to be updated to reflect the new requirements, and the changes announced in all major Keybase channels (#compensation, #dev, #chinese, #transifex, etc.) to ensure contributors are aware.

The project will be complete when all of the above are in place: linter, parser, and related communications.
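For the parser side, the core per-team aggregation is small once requests are machine-readable. Here is a sketch under the same assumed structure (how accepted requests are collected from GitHub is omitted):

```python
# Sketch of the per-team aggregation the parser bot would run after a voting
# period ends, assuming the same hypothetical structure as above.
from collections import defaultdict

def issuance_by_team(accepted_requests):
    """accepted_requests: iterable of dicts as produced by yaml.safe_load above."""
    totals = defaultdict(int)
    for request in accepted_requests:
        for item in request.get("contributions", []):
            totals[item["team"]] += item["amount"]
    return dict(totals)
```

The resulting mapping of team to BSQ issued is what would be posted back on each request and handed to team leads for budgeting.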

## Measures of success
<!-- After this project has been delivered, how will we know whether it was a success? What can be measured to indicate that this project was worth the time, effort and compensation it cost? -->

Contributors must make correctly formed compensation requests on their own (this will demonstrate _awareness_ of the initiative). The linter must alert compensation request makers to mistakes. The parser must comment on each approved compensation request issue with issuance numbers broken down by team.

The project can be considered a success if team leads actually use the issuance numbers provided by the parser bot for budgeting and tracking issuance over time.
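One way the "comment on each approved request" piece could work is a single call to the GitHub REST API's create-issue-comment endpoint. The repository name and token handling in this sketch are assumptions, not decided details:

```python
# Sketch of posting the cycle breakdown back to an issue via the GitHub REST API
# ("create an issue comment"). Repository name and token handling are assumptions.
import os
import requests

def post_breakdown(issue_number: int, totals: dict) -> None:
    lines = ["Issuance breakdown by functional team for this cycle:", ""]
    lines += [f"- {team}: {amount} BSQ" for team, amount in sorted(totals.items())]
    response = requests.post(
        f"https://api.github.com/repos/bisq-network/compensation/issues/{issue_number}/comments",
        headers={"Authorization": f"token {os.environ['GITHUB_TOKEN']}"},
        json={"body": "\n".join(lines)},
        timeout=30,
    )
    response.raise_for_status()
```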

## Risks
<!-- What risks do we run by undertaking the project? What sensitive areas of code are to be modified? What negative implications might this project have on privacy, security, performance, compatibility, maintainability, user experience, etc? -->

Largely not applicable, as no Bisq code is touched.

The most significant risk is probably a bot that reports incorrect numbers for some reason, but such a mistake should be discovered quickly and would only affect reporting (not issuance, the software, or anything else).

## Tasks
<!-- Make a checklist defining in as much detail as is foreseeable who will need to do what in order to deliver the project. The checklist may be modified throughout the course of the project as new tasks emerge. Alternatively, once the project proposal is approved, you may choose to migrate the task checklist to a dedicated GitHub project board. -->

I don't think this project is complex enough to warrant a whole GitHub board, so here's a checklist.

- [ ] Finalize compensation request format (Markdown table, YAML, etc.)
- [ ] Clarify planned results (e.g., issuance breakdown by functional team, anything else?)
- [ ] Create and test linting bot
- [ ] Create and test parsing bot
- [ ] Determine ops for bots (who will host them, where, costs, new roles if needed, etc)
- [ ] Edit wiki documentation and compensation request template to reflect changes
- [ ] Announce changes to contributors
- [ ] Follow implementation for 1 cycle: 1 proposal phase and 1 results phase
- [ ] Ensure results are used for budgeting

## Estimates
<!-- Estimate the cost in USD of delivering this project. Indicate which teams are involved and provide subtotal estimates per team. This section need not be complete for the project proposal to be approved but must be complete for budget to be allocated. -->

Since this is largely a reporting initiative, it probably makes the most sense for it to come out of the growth budget. Ongoing server costs should come from ops.

Maybe 1500 USD is sufficient for the whole project as described above (initial implementation and documentation)? This is based on it taking roughly a day to create the bots. Ongoing costs for the bots should be negligible. Open to feedback if any of this is off.

## Notes
<!-- Include anything else worth mentioning about this project. This section is optional and should be omitted if empty. -->

Tracking issuance in a more automated way is the first step of a bigger drive to report issuance, burn, and trading volume better. 

Compensation request details are an important first step to enabling other reporting, so a new project can be created to pursue further reporting once this project has been successfully completed.

