@todo milestone2: 13.07.2020 - ????
- datdot#22 spec: make presentable spec after public w3f wave4 announcement
- datdot-ui#1 datdot-ui - close
- datdot-node-rust#12 What is this? after posting link to latest specification in datdot-research - merge milestones from below
- create milestones and issues for 3 months from the [datdot concept]
- transition as many of our hackmd's into github repo readmes and/or issues
- update our sequence diagram and combine it with the substrate API you have written down and with the brotli encoding - we will put it all into a first draft of some kind of official "specification" which i will add to the datdot-research repo
  - datdot#16 specify datdot api
  - datdot#38 iframe api - update project description in readme by linking to the latest specification in datdot-research
- datdot#14 package datdot in docker app
- datdot#24 encoding compatible hypercores
- datdot#32 post project updates
- datdot#29 web3 grant status updates
- datdot-research#1 datdot-research
- datdot-chain#1 datdot-chain
- datdot-service#1 datdot-service
- datdot-research#5 ensure quality of datdot service
- datdot#23 user feedback and community channels - use a changelog like playproject-io/roadmapping
  - finalized/implemented spec versions
  - in progress/next spec version
  - future/backlog spec version
- update gitter/discord bot to report github activity correctly
previous roadmap:
future roadmapping:
milestones from below:
milestones
Month 1
- no economics
- no UI
- substrate logic (node)
- js service that reacts to the node
Month 2
- add UI
- implement basic economic model
- community test
Month 3
- improving economic model
- documentation
Development Roadmap
Milestone 1 (Month 1)
Implement basic JS & Substrate logic
- We will be using SRML for `balances`, `sudo` and write our own module `dat-verify` (= substrate logic (node))
  - it will verify `hypercores` in the substrate runtime - we will use datrs (or something inspired by it) to make that work
- details:
- randomly selects dat archives and emits events
- verifies data coming in from the service
- make the node run on docker
- implement structs for Proof and Node
- implement and harden randomness
- implement timing logic
- add on-initialize logic
- add register_backup to add and count users
- implement logic to submit dat addresses for pinning
- implement unregistering and initial challenge-response
- We will implement `adapter.js` (= js service that reacts to the node)
  - it will use the polkadot.js.org api and hypercore js libraries to encode and decode hypercores
  - we will use `dat sdk` and/or `dat-store`'s `service.js` to communicate with `adapter.js`
- details:
- listening for events on the node
- submitting data to the node (proofs and archives) or responding with the merkle proof from the dat archives
Deliverables:
- We will deliver a working SRML module
- We will create a docker container that runs a substrate node using the module
- We will deliver a basic javascript module as a helper to interact with the node
- We will record a screencast that explains how a user can spin up one of those Substrate nodes.
- Once the node is up, it will be possible to send test transactions that will show how the new functionality works and will create a screencast which shows step by step how it works
Milestone 2 (Month 2)
Implement basic economics & UI logic
- We will use the `balances` module to:
  - create a simple credit based system
  - by pinning you mint credits; by having your archive pinned, you burn credits
  - mint amount should be greater than the burn amount to solve bootstrapping (the burn amount should be defined by a market)
- when you submit the dat you also set the price you're willing to pay for the service
- priority service: users who pin more have priority to get their data pinned first
- details:
- Write a basic module that calls balances to mint and burn balances based on the outcomes of dat-verify
- implement minting tokens if you are seeding (earning) and successfully solve challenges
- implement burning creators' tokens when their data is pinned (payment)
- implement a rough and basic UI for expert users to try out the system as a whole
- run a little closed alpha (community) test and monitor and analyse usage to improve the economic model
- We will write detailed documentation and create a screencast to show how to use it
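The mint/burn rule described above can be captured in a toy model. The flat per-challenge amounts and the 2:1 mint-to-burn ratio are illustrative assumptions (the burn amount should really come from a market), and the function names are hypothetical.

```javascript
// Toy version of the credit system: seeders mint credits for each
// successfully answered challenge, creators burn credits for having
// their archive pinned. MINT > BURN so newcomers can bootstrap.
// (Assumed flat 2:1 rates -- the real burn rate is market-defined.)
const MINT_PER_CHALLENGE = 2
const BURN_PER_CHALLENGE = 1

function settleChallenge (balances, { seeder, creator, success }) {
  if (!success) return balances // failed proof: no payment, no minting
  const next = { ...balances }
  next[seeder] = (next[seeder] || 0) + MINT_PER_CHALLENGE
  next[creator] = (next[creator] || 0) - BURN_PER_CHALLENGE
  return next
}
```

In the real module these would be calls into the `balances` SRML module triggered by `dat-verify` outcomes, rather than plain object updates.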
Deliverables:
- We will deliver a refined working SRML module
- We will deliver a refined javascript module that helps interacting with the node
- We will deliver a basic web UI which works with a locally running substrate node
- We will create a docker container that runs all of this
- We will record a screencast that explains how a user can spin up the docker and use it
Milestone 3 (Month 3)
Implement refined economics and UI, and write documentation
- We will run a public beta and monitor and analyse usage to improve the economic model
- We will implement a convenient UI/UX
- it will use and wrap the work from previous milestones to make it easy for each of the user roles:
- pinners (seeders)
- register to become pinners
- get random dats to pin
- they get paid for their work
- dat creators (requestors)
- they submit dats to be pinned to keep their data available while their devices are offline
- node operators (should be seeders)
- run substrate node
- have to have enough disk space
- reliable connection
- get paid only when they seed (proof = successful challenge)
- paid in tokens (minted whenever a payment needs to happen)
- data consumers (public)
- reads the data
- we will use `electron` to build a desktop task bar application
- details:
- registering availability and requesting pinning
- We will write detailed documentation and create video workshops for users to understand how to use it
Deliverables:
- We will deliver a working electron task bar application to run the substrate node and UI
- We will write a small report with the results from the analysis of our public beta
- We will refine and describe the economic model we are using
- We will record a screencast to show how to install and use the electron app to pin your data or let it be pinned
- We will write detailed documentation which explains all features and how to use them
Future Milestones
We plan to further improve the electron app, the substrate node and the economics around it to make datdot work reliably in production.
This might require further grant applications, and eventually we might be able to become self-sustainable, but that depends on the economic model
we end up using. One big motivation for us is to use this as a reliable building block for future and past projects where people need to manage their personal data.