diff --git a/docs/.gitbook/assets/0 b/docs/.gitbook/assets/0
new file mode 100644
index 0000000..2288a17
Binary files /dev/null and b/docs/.gitbook/assets/0 differ
diff --git a/docs/.gitbook/assets/01.jpg b/docs/.gitbook/assets/01.jpg
new file mode 100644
index 0000000..d0e1e43
Binary files /dev/null and b/docs/.gitbook/assets/01.jpg differ
diff --git a/docs/.gitbook/assets/1 b/docs/.gitbook/assets/1
new file mode 100644
index 0000000..c1b3e76
Binary files /dev/null and b/docs/.gitbook/assets/1 differ
diff --git a/docs/.gitbook/assets/2 b/docs/.gitbook/assets/2
new file mode 100644
index 0000000..1d9b802
Binary files /dev/null and b/docs/.gitbook/assets/2 differ
diff --git a/docs/.gitbook/assets/2.jpg b/docs/.gitbook/assets/2.jpg
new file mode 100644
index 0000000..d1f3e2b
Binary files /dev/null and b/docs/.gitbook/assets/2.jpg differ
diff --git a/docs/.gitbook/assets/3 b/docs/.gitbook/assets/3
new file mode 100644
index 0000000..01eea23
Binary files /dev/null and b/docs/.gitbook/assets/3 differ
diff --git a/docs/.gitbook/assets/3.jpg b/docs/.gitbook/assets/3.jpg
new file mode 100644
index 0000000..33c4cd2
Binary files /dev/null and b/docs/.gitbook/assets/3.jpg differ
diff --git a/docs/.gitbook/assets/4 b/docs/.gitbook/assets/4
new file mode 100644
index 0000000..92defc1
Binary files /dev/null and b/docs/.gitbook/assets/4 differ
diff --git a/docs/.gitbook/assets/4.jpg b/docs/.gitbook/assets/4.jpg
new file mode 100644
index 0000000..e3a4554
Binary files /dev/null and b/docs/.gitbook/assets/4.jpg differ
diff --git a/docs/.gitbook/assets/5.jpg b/docs/.gitbook/assets/5.jpg
new file mode 100644
index 0000000..28b1e01
Binary files /dev/null and b/docs/.gitbook/assets/5.jpg differ
diff --git a/docs/.gitbook/assets/6.jpg b/docs/.gitbook/assets/6.jpg
new file mode 100644
index 0000000..d87a377
Binary files /dev/null and b/docs/.gitbook/assets/6.jpg differ
diff --git a/docs/.gitbook/assets/Addresses For Funding.png b/docs/.gitbook/assets/Addresses For Funding.png
new file mode 100644
index 0000000..28819cf
Binary files /dev/null and b/docs/.gitbook/assets/Addresses For Funding.png differ
diff --git a/docs/.gitbook/assets/Autogenerated Wallets.png b/docs/.gitbook/assets/Autogenerated Wallets.png
new file mode 100644
index 0000000..98bd2ab
Binary files /dev/null and b/docs/.gitbook/assets/Autogenerated Wallets.png differ
diff --git a/docs/.gitbook/assets/Automated deployment via GCM2.png b/docs/.gitbook/assets/Automated deployment via GCM2.png
new file mode 100644
index 0000000..d10c90e
Binary files /dev/null and b/docs/.gitbook/assets/Automated deployment via GCM2.png differ
diff --git a/docs/.gitbook/assets/Automated environment setup.png b/docs/.gitbook/assets/Automated environment setup.png
new file mode 100644
index 0000000..a053bbc
Binary files /dev/null and b/docs/.gitbook/assets/Automated environment setup.png differ
diff --git a/docs/.gitbook/assets/Base Explorer.png b/docs/.gitbook/assets/Base Explorer.png
new file mode 100644
index 0000000..9178f82
Binary files /dev/null and b/docs/.gitbook/assets/Base Explorer.png differ
diff --git a/docs/.gitbook/assets/Begin Step 6.png b/docs/.gitbook/assets/Begin Step 6.png
new file mode 100644
index 0000000..0bcff3c
Binary files /dev/null and b/docs/.gitbook/assets/Begin Step 6.png differ
diff --git a/docs/.gitbook/assets/Build with the DKG Node.png b/docs/.gitbook/assets/Build with the DKG Node.png
new file mode 100644
index 0000000..cfa7c9e
Binary files /dev/null and b/docs/.gitbook/assets/Build with the DKG Node.png differ
diff --git a/docs/.gitbook/assets/ChatDKG builder toolkit 2.png b/docs/.gitbook/assets/ChatDKG builder toolkit 2.png
new file mode 100644
index 0000000..af6cbc9
Binary files /dev/null and b/docs/.gitbook/assets/ChatDKG builder toolkit 2.png differ
diff --git a/docs/.gitbook/assets/Cloning Repo.png b/docs/.gitbook/assets/Cloning Repo.png
new file mode 100644
index 0000000..87f4ff4
Binary files /dev/null and b/docs/.gitbook/assets/Cloning Repo.png differ
diff --git a/docs/.gitbook/assets/Core Node.png b/docs/.gitbook/assets/Core Node.png
new file mode 100644
index 0000000..50dabc6
Binary files /dev/null and b/docs/.gitbook/assets/Core Node.png differ
diff --git a/docs/.gitbook/assets/Core and Edge Nodes as Neuro-symbolic AI systems.png b/docs/.gitbook/assets/Core and Edge Nodes as Neuro-symbolic AI systems.png
new file mode 100644
index 0000000..2fdec2b
Binary files /dev/null and b/docs/.gitbook/assets/Core and Edge Nodes as Neuro-symbolic AI systems.png differ
diff --git a/docs/.gitbook/assets/Customize the Edge Node to build your project (1).png b/docs/.gitbook/assets/Customize the Edge Node to build your project (1).png
new file mode 100644
index 0000000..61ee41d
Binary files /dev/null and b/docs/.gitbook/assets/Customize the Edge Node to build your project (1).png differ
diff --git a/docs/.gitbook/assets/Customize the Edge Node to build your project.png b/docs/.gitbook/assets/Customize the Edge Node to build your project.png
new file mode 100644
index 0000000..6dbb05a
Binary files /dev/null and b/docs/.gitbook/assets/Customize the Edge Node to build your project.png differ
diff --git a/docs/.gitbook/assets/DKG Edge Node - beta version - docs visual2 1 (1).png b/docs/.gitbook/assets/DKG Edge Node - beta version - docs visual2 1 (1).png
new file mode 100644
index 0000000..9013514
Binary files /dev/null and b/docs/.gitbook/assets/DKG Edge Node - beta version - docs visual2 1 (1).png differ
diff --git a/docs/.gitbook/assets/DKG Edge Node - beta version - docs visual2 1.png b/docs/.gitbook/assets/DKG Edge Node - beta version - docs visual2 1.png
new file mode 100644
index 0000000..fd53b3c
Binary files /dev/null and b/docs/.gitbook/assets/DKG Edge Node - beta version - docs visual2 1.png differ
diff --git a/docs/.gitbook/assets/DKG Edge Node - beta version - docs visual2.png b/docs/.gitbook/assets/DKG Edge Node - beta version - docs visual2.png
new file mode 100644
index 0000000..52d416e
Binary files /dev/null and b/docs/.gitbook/assets/DKG Edge Node - beta version - docs visual2.png differ
diff --git a/docs/.gitbook/assets/DKG Edge Node - docs visual.png b/docs/.gitbook/assets/DKG Edge Node - docs visual.png
new file mode 100644
index 0000000..80438bd
Binary files /dev/null and b/docs/.gitbook/assets/DKG Edge Node - docs visual.png differ
diff --git a/docs/.gitbook/assets/DKG Edge Node architecture 1.png b/docs/.gitbook/assets/DKG Edge Node architecture 1.png
new file mode 100644
index 0000000..42d7cf8
Binary files /dev/null and b/docs/.gitbook/assets/DKG Edge Node architecture 1.png differ
diff --git a/docs/.gitbook/assets/DKG Edge Node architecture.png b/docs/.gitbook/assets/DKG Edge Node architecture.png
new file mode 100644
index 0000000..ab359a2
Binary files /dev/null and b/docs/.gitbook/assets/DKG Edge Node architecture.png differ
diff --git a/docs/.gitbook/assets/DKG Logs.png b/docs/.gitbook/assets/DKG Logs.png
new file mode 100644
index 0000000..10bb7e8
Binary files /dev/null and b/docs/.gitbook/assets/DKG Logs.png differ
diff --git a/docs/.gitbook/assets/DKG Node (1).png b/docs/.gitbook/assets/DKG Node (1).png
new file mode 100644
index 0000000..13a9be8
Binary files /dev/null and b/docs/.gitbook/assets/DKG Node (1).png differ
diff --git a/docs/.gitbook/assets/DKG Node ENV Setup.png b/docs/.gitbook/assets/DKG Node ENV Setup.png
new file mode 100644
index 0000000..3e3087f
Binary files /dev/null and b/docs/.gitbook/assets/DKG Node ENV Setup.png differ
diff --git a/docs/.gitbook/assets/DKG Node Installer.png b/docs/.gitbook/assets/DKG Node Installer.png
new file mode 100644
index 0000000..57eae1e
Binary files /dev/null and b/docs/.gitbook/assets/DKG Node Installer.png differ
diff --git a/docs/.gitbook/assets/DKG Node diagram2 (1).png b/docs/.gitbook/assets/DKG Node diagram2 (1).png
new file mode 100644
index 0000000..308e725
Binary files /dev/null and b/docs/.gitbook/assets/DKG Node diagram2 (1).png differ
diff --git a/docs/.gitbook/assets/DKG Node diagram2.png b/docs/.gitbook/assets/DKG Node diagram2.png
new file mode 100644
index 0000000..320477a
Binary files /dev/null and b/docs/.gitbook/assets/DKG Node diagram2.png differ
diff --git a/docs/.gitbook/assets/DKG Node.png b/docs/.gitbook/assets/DKG Node.png
new file mode 100644
index 0000000..784f526
Binary files /dev/null and b/docs/.gitbook/assets/DKG Node.png differ
diff --git a/docs/.gitbook/assets/DKG Nodes work together to build the Verifiable Internet. (1).png b/docs/.gitbook/assets/DKG Nodes work together to build the Verifiable Internet. (1).png
new file mode 100644
index 0000000..98dc3e4
Binary files /dev/null and b/docs/.gitbook/assets/DKG Nodes work together to build the Verifiable Internet. (1).png differ
diff --git a/docs/.gitbook/assets/DKG Nodes work together to build the Verifiable Internet. (2).png b/docs/.gitbook/assets/DKG Nodes work together to build the Verifiable Internet. (2).png
new file mode 100644
index 0000000..fd7b2e2
Binary files /dev/null and b/docs/.gitbook/assets/DKG Nodes work together to build the Verifiable Internet. (2).png differ
diff --git a/docs/.gitbook/assets/DKG Nodes work together to build the Verifiable Internet..png b/docs/.gitbook/assets/DKG Nodes work together to build the Verifiable Internet..png
new file mode 100644
index 0000000..66c57f0
Binary files /dev/null and b/docs/.gitbook/assets/DKG Nodes work together to build the Verifiable Internet..png differ
diff --git a/docs/.gitbook/assets/DKG Paranets.png b/docs/.gitbook/assets/DKG Paranets.png
new file mode 100644
index 0000000..d8b0c52
Binary files /dev/null and b/docs/.gitbook/assets/DKG Paranets.png differ
diff --git a/docs/.gitbook/assets/DKG Staking - docs visual (1).png b/docs/.gitbook/assets/DKG Staking - docs visual (1).png
new file mode 100644
index 0000000..bbb4499
Binary files /dev/null and b/docs/.gitbook/assets/DKG Staking - docs visual (1).png differ
diff --git a/docs/.gitbook/assets/DKG Staking - docs visual.png b/docs/.gitbook/assets/DKG Staking - docs visual.png
new file mode 100644
index 0000000..74081fc
Binary files /dev/null and b/docs/.gitbook/assets/DKG Staking - docs visual.png differ
diff --git a/docs/.gitbook/assets/DKG Staking.png b/docs/.gitbook/assets/DKG Staking.png
new file mode 100644
index 0000000..fca7e42
Binary files /dev/null and b/docs/.gitbook/assets/DKG Staking.png differ
diff --git a/docs/.gitbook/assets/DKG V6.jpg b/docs/.gitbook/assets/DKG V6.jpg
new file mode 100644
index 0000000..9e1ca54
Binary files /dev/null and b/docs/.gitbook/assets/DKG V6.jpg differ
diff --git a/docs/.gitbook/assets/DKG V8 docs.jpg b/docs/.gitbook/assets/DKG V8 docs.jpg
new file mode 100644
index 0000000..8d716bc
Binary files /dev/null and b/docs/.gitbook/assets/DKG V8 docs.jpg differ
diff --git a/docs/.gitbook/assets/DKG V8 update guide book - doc cover.png b/docs/.gitbook/assets/DKG V8 update guide book - doc cover.png
new file mode 100644
index 0000000..1e83f43
Binary files /dev/null and b/docs/.gitbook/assets/DKG V8 update guide book - doc cover.png differ
diff --git a/docs/.gitbook/assets/DKG V8 update guide book - gitbook cover.png b/docs/.gitbook/assets/DKG V8 update guide book - gitbook cover.png
new file mode 100644
index 0000000..c3fa5a3
Binary files /dev/null and b/docs/.gitbook/assets/DKG V8 update guide book - gitbook cover.png differ
diff --git a/docs/.gitbook/assets/DKG V8 update guide book - gitbook cover1 (1).png b/docs/.gitbook/assets/DKG V8 update guide book - gitbook cover1 (1).png
new file mode 100644
index 0000000..4fab0ab
Binary files /dev/null and b/docs/.gitbook/assets/DKG V8 update guide book - gitbook cover1 (1).png differ
diff --git a/docs/.gitbook/assets/DKG V8 update guide book - gitbook cover1.png b/docs/.gitbook/assets/DKG V8 update guide book - gitbook cover1.png
new file mode 100644
index 0000000..4fab0ab
Binary files /dev/null and b/docs/.gitbook/assets/DKG V8 update guide book - gitbook cover1.png differ
diff --git a/docs/.gitbook/assets/DKG V8.1 (1).png b/docs/.gitbook/assets/DKG V8.1 (1).png
new file mode 100644
index 0000000..c41e692
Binary files /dev/null and b/docs/.gitbook/assets/DKG V8.1 (1).png differ
diff --git a/docs/.gitbook/assets/DKG V8.1.png b/docs/.gitbook/assets/DKG V8.1.png
new file mode 100644
index 0000000..f4d8905
Binary files /dev/null and b/docs/.gitbook/assets/DKG V8.1.png differ
diff --git a/docs/.gitbook/assets/DKG V8.jpg b/docs/.gitbook/assets/DKG V8.jpg
new file mode 100644
index 0000000..7d6b8ba
Binary files /dev/null and b/docs/.gitbook/assets/DKG V8.jpg differ
diff --git a/docs/.gitbook/assets/DKG key concepts.png b/docs/.gitbook/assets/DKG key concepts.png
new file mode 100644
index 0000000..bc9061b
Binary files /dev/null and b/docs/.gitbook/assets/DKG key concepts.png differ
diff --git a/docs/.gitbook/assets/DKG under the hood.png b/docs/.gitbook/assets/DKG under the hood.png
new file mode 100644
index 0000000..9634e20
Binary files /dev/null and b/docs/.gitbook/assets/DKG under the hood.png differ
diff --git a/docs/.gitbook/assets/Delegated Staking.png b/docs/.gitbook/assets/Delegated Staking.png
new file mode 100644
index 0000000..5922694
Binary files /dev/null and b/docs/.gitbook/assets/Delegated Staking.png differ
diff --git a/docs/.gitbook/assets/ENVFILE.png b/docs/.gitbook/assets/ENVFILE.png
new file mode 100644
index 0000000..d1f2960
Binary files /dev/null and b/docs/.gitbook/assets/ENVFILE.png differ
diff --git a/docs/.gitbook/assets/Edge Node GCP - docs visual11.png b/docs/.gitbook/assets/Edge Node GCP - docs visual11.png
new file mode 100644
index 0000000..c341be7
Binary files /dev/null and b/docs/.gitbook/assets/Edge Node GCP - docs visual11.png differ
diff --git a/docs/.gitbook/assets/Edge Node GCP - docs visual2 1.png b/docs/.gitbook/assets/Edge Node GCP - docs visual2 1.png
new file mode 100644
index 0000000..502c0cc
Binary files /dev/null and b/docs/.gitbook/assets/Edge Node GCP - docs visual2 1.png differ
diff --git a/docs/.gitbook/assets/Edge Node2.png b/docs/.gitbook/assets/Edge Node2.png
new file mode 100644
index 0000000..dd1e0b1
Binary files /dev/null and b/docs/.gitbook/assets/Edge Node2.png differ
diff --git a/docs/.gitbook/assets/Edge_Node_cover (1).png b/docs/.gitbook/assets/Edge_Node_cover (1).png
new file mode 100644
index 0000000..b2457b1
Binary files /dev/null and b/docs/.gitbook/assets/Edge_Node_cover (1).png differ
diff --git a/docs/.gitbook/assets/Edge_Node_cover.png b/docs/.gitbook/assets/Edge_Node_cover.png
new file mode 100644
index 0000000..b2457b1
Binary files /dev/null and b/docs/.gitbook/assets/Edge_Node_cover.png differ
diff --git a/docs/.gitbook/assets/Empowering ai minds_ Origintrail hackathon (gitbook cover).png b/docs/.gitbook/assets/Empowering ai minds_ Origintrail hackathon (gitbook cover).png
new file mode 100644
index 0000000..740757c
Binary files /dev/null and b/docs/.gitbook/assets/Empowering ai minds_ Origintrail hackathon (gitbook cover).png differ
diff --git a/docs/.gitbook/assets/Empowering ai minds_ Origintrail hackathon (gitbook doc cover) 1.png b/docs/.gitbook/assets/Empowering ai minds_ Origintrail hackathon (gitbook doc cover) 1.png
new file mode 100644
index 0000000..0a8da6d
Binary files /dev/null and b/docs/.gitbook/assets/Empowering ai minds_ Origintrail hackathon (gitbook doc cover) 1.png differ
diff --git a/docs/.gitbook/assets/Empowering ai minds_ Origintrail hackathon (gitbook doc cover).png b/docs/.gitbook/assets/Empowering ai minds_ Origintrail hackathon (gitbook doc cover).png
new file mode 100644
index 0000000..1729d52
Binary files /dev/null and b/docs/.gitbook/assets/Empowering ai minds_ Origintrail hackathon (gitbook doc cover).png differ
diff --git a/docs/.gitbook/assets/Empowering ai minds_ Origintrail hackathon (github cover).png b/docs/.gitbook/assets/Empowering ai minds_ Origintrail hackathon (github cover).png
new file mode 100644
index 0000000..945cb30
Binary files /dev/null and b/docs/.gitbook/assets/Empowering ai minds_ Origintrail hackathon (github cover).png differ
diff --git a/docs/.gitbook/assets/Fund Base.png b/docs/.gitbook/assets/Fund Base.png
new file mode 100644
index 0000000..77110b5
Binary files /dev/null and b/docs/.gitbook/assets/Fund Base.png differ
diff --git a/docs/.gitbook/assets/Fund Gnosis.png b/docs/.gitbook/assets/Fund Gnosis.png
new file mode 100644
index 0000000..0b7101b
Binary files /dev/null and b/docs/.gitbook/assets/Fund Gnosis.png differ
diff --git a/docs/.gitbook/assets/Generated Wallets.png b/docs/.gitbook/assets/Generated Wallets.png
new file mode 100644
index 0000000..b667ead
Binary files /dev/null and b/docs/.gitbook/assets/Generated Wallets.png differ
diff --git a/docs/.gitbook/assets/Get started with the Edge Node boilerplate.png b/docs/.gitbook/assets/Get started with the Edge Node boilerplate.png
new file mode 100644
index 0000000..af0536c
Binary files /dev/null and b/docs/.gitbook/assets/Get started with the Edge Node boilerplate.png differ
diff --git a/docs/.gitbook/assets/Installed Lauched.png b/docs/.gitbook/assets/Installed Lauched.png
new file mode 100644
index 0000000..826e585
Binary files /dev/null and b/docs/.gitbook/assets/Installed Lauched.png differ
diff --git a/docs/.gitbook/assets/Installer .ENV.png b/docs/.gitbook/assets/Installer .ENV.png
new file mode 100644
index 0000000..3977f90
Binary files /dev/null and b/docs/.gitbook/assets/Installer .ENV.png differ
diff --git a/docs/.gitbook/assets/Installer Home.png b/docs/.gitbook/assets/Installer Home.png
new file mode 100644
index 0000000..977d9af
Binary files /dev/null and b/docs/.gitbook/assets/Installer Home.png differ
diff --git a/docs/.gitbook/assets/Installer Ran.png b/docs/.gitbook/assets/Installer Ran.png
new file mode 100644
index 0000000..14a5cda
Binary files /dev/null and b/docs/.gitbook/assets/Installer Ran.png differ
diff --git a/docs/.gitbook/assets/Intro Video Final COMPRESSED.mp4 b/docs/.gitbook/assets/Intro Video Final COMPRESSED.mp4
new file mode 100644
index 0000000..11fffa8
Binary files /dev/null and b/docs/.gitbook/assets/Intro Video Final COMPRESSED.mp4 differ
diff --git a/docs/.gitbook/assets/Layers-Infographic.jpg b/docs/.gitbook/assets/Layers-Infographic.jpg
new file mode 100644
index 0000000..ecf7fb2
Binary files /dev/null and b/docs/.gitbook/assets/Layers-Infographic.jpg differ
diff --git a/docs/.gitbook/assets/Layers-Infographic.png b/docs/.gitbook/assets/Layers-Infographic.png
new file mode 100644
index 0000000..ef166d1
Binary files /dev/null and b/docs/.gitbook/assets/Layers-Infographic.png differ
diff --git a/docs/.gitbook/assets/Learn about the Edge Node internals (architecture).png b/docs/.gitbook/assets/Learn about the Edge Node internals (architecture).png
new file mode 100644
index 0000000..f5d6ef8
Binary files /dev/null and b/docs/.gitbook/assets/Learn about the Edge Node internals (architecture).png differ
diff --git a/docs/.gitbook/assets/MCP1.png b/docs/.gitbook/assets/MCP1.png
new file mode 100644
index 0000000..b7213d4
Binary files /dev/null and b/docs/.gitbook/assets/MCP1.png differ
diff --git a/docs/.gitbook/assets/MCP2.png b/docs/.gitbook/assets/MCP2.png
new file mode 100644
index 0000000..600b1f8
Binary files /dev/null and b/docs/.gitbook/assets/MCP2.png differ
diff --git a/docs/.gitbook/assets/Manual dvelopment environment setup.png b/docs/.gitbook/assets/Manual dvelopment environment setup.png
new file mode 100644
index 0000000..8212ee9
Binary files /dev/null and b/docs/.gitbook/assets/Manual dvelopment environment setup.png differ
diff --git a/docs/.gitbook/assets/Microsoft Copilot demo infographic.png b/docs/.gitbook/assets/Microsoft Copilot demo infographic.png
new file mode 100644
index 0000000..11a141e
Binary files /dev/null and b/docs/.gitbook/assets/Microsoft Copilot demo infographic.png differ
diff --git a/docs/.gitbook/assets/MicrosoftTeams-image (1).png b/docs/.gitbook/assets/MicrosoftTeams-image (1).png
new file mode 100644
index 0000000..af8eb80
Binary files /dev/null and b/docs/.gitbook/assets/MicrosoftTeams-image (1).png differ
diff --git a/docs/.gitbook/assets/New version of Knowledge Assets.png b/docs/.gitbook/assets/New version of Knowledge Assets.png
new file mode 100644
index 0000000..0a749f7
Binary files /dev/null and b/docs/.gitbook/assets/New version of Knowledge Assets.png differ
diff --git a/docs/.gitbook/assets/Node health.png b/docs/.gitbook/assets/Node health.png
new file mode 100644
index 0000000..a3ce7f0
Binary files /dev/null and b/docs/.gitbook/assets/Node health.png differ
diff --git a/docs/.gitbook/assets/Node power.png b/docs/.gitbook/assets/Node power.png
new file mode 100644
index 0000000..60d68df
Binary files /dev/null and b/docs/.gitbook/assets/Node power.png differ
diff --git a/docs/.gitbook/assets/ODN v6 diagrams - Page 41 (1).png b/docs/.gitbook/assets/ODN v6 diagrams - Page 41 (1).png
new file mode 100644
index 0000000..14ed7f0
Binary files /dev/null and b/docs/.gitbook/assets/ODN v6 diagrams - Page 41 (1).png differ
diff --git a/docs/.gitbook/assets/OT x BASE doc visual.jpg b/docs/.gitbook/assets/OT x BASE doc visual.jpg
new file mode 100644
index 0000000..0359364
Binary files /dev/null and b/docs/.gitbook/assets/OT x BASE doc visual.jpg differ
diff --git a/docs/.gitbook/assets/Operator fee ex2 (1).png b/docs/.gitbook/assets/Operator fee ex2 (1).png
new file mode 100644
index 0000000..65ddafa
Binary files /dev/null and b/docs/.gitbook/assets/Operator fee ex2 (1).png differ
diff --git a/docs/.gitbook/assets/Operator fee ex2.png b/docs/.gitbook/assets/Operator fee ex2.png
new file mode 100644
index 0000000..5d848a9
Binary files /dev/null and b/docs/.gitbook/assets/Operator fee ex2.png differ
diff --git a/docs/.gitbook/assets/OriginTrail - Decentralized Network.png b/docs/.gitbook/assets/OriginTrail - Decentralized Network.png
new file mode 100644
index 0000000..a170e44
Binary files /dev/null and b/docs/.gitbook/assets/OriginTrail - Decentralized Network.png differ
diff --git a/docs/.gitbook/assets/OriginTrail - Technical Stack.png b/docs/.gitbook/assets/OriginTrail - Technical Stack.png
new file mode 100644
index 0000000..2cbc330
Binary files /dev/null and b/docs/.gitbook/assets/OriginTrail - Technical Stack.png differ
diff --git a/docs/.gitbook/assets/Posnetek zaslona 2025-01-30 145420.png b/docs/.gitbook/assets/Posnetek zaslona 2025-01-30 145420.png
new file mode 100644
index 0000000..4bc9033
Binary files /dev/null and b/docs/.gitbook/assets/Posnetek zaslona 2025-01-30 145420.png differ
diff --git a/docs/.gitbook/assets/Scalability updates.png b/docs/.gitbook/assets/Scalability updates.png
new file mode 100644
index 0000000..b45d8dd
Binary files /dev/null and b/docs/.gitbook/assets/Scalability updates.png differ
diff --git a/docs/.gitbook/assets/Scheme.png b/docs/.gitbook/assets/Scheme.png
new file mode 100644
index 0000000..a728370
Binary files /dev/null and b/docs/.gitbook/assets/Scheme.png differ
diff --git a/docs/.gitbook/assets/Screen Shot 2022-01-19 at 12.31.48.png b/docs/.gitbook/assets/Screen Shot 2022-01-19 at 12.31.48.png
new file mode 100644
index 0000000..5ad99b1
Binary files /dev/null and b/docs/.gitbook/assets/Screen Shot 2022-01-19 at 12.31.48.png differ
diff --git a/docs/.gitbook/assets/Screen Shot 2022-01-19 at 12.32.39.png b/docs/.gitbook/assets/Screen Shot 2022-01-19 at 12.32.39.png
new file mode 100644
index 0000000..bbac40a
Binary files /dev/null and b/docs/.gitbook/assets/Screen Shot 2022-01-19 at 12.32.39.png differ
diff --git a/docs/.gitbook/assets/Screen Shot 2022-03-04 at 14.27.16.png b/docs/.gitbook/assets/Screen Shot 2022-03-04 at 14.27.16.png
new file mode 100644
index 0000000..d0ac44f
Binary files /dev/null and b/docs/.gitbook/assets/Screen Shot 2022-03-04 at 14.27.16.png differ
diff --git a/docs/.gitbook/assets/Screen Shot 2022-03-11 at 10.18.34.png b/docs/.gitbook/assets/Screen Shot 2022-03-11 at 10.18.34.png
new file mode 100644
index 0000000..1df9d38
Binary files /dev/null and b/docs/.gitbook/assets/Screen Shot 2022-03-11 at 10.18.34.png differ
diff --git a/docs/.gitbook/assets/Screen Shot 2022-12-15 at 14.22.00.png b/docs/.gitbook/assets/Screen Shot 2022-12-15 at 14.22.00.png
new file mode 100644
index 0000000..be7faa0
Binary files /dev/null and b/docs/.gitbook/assets/Screen Shot 2022-12-15 at 14.22.00.png differ
diff --git a/docs/.gitbook/assets/Screen Shot 2023-02-22 at 14.51.44 (1).png b/docs/.gitbook/assets/Screen Shot 2023-02-22 at 14.51.44 (1).png
new file mode 100644
index 0000000..a94dc34
Binary files /dev/null and b/docs/.gitbook/assets/Screen Shot 2023-02-22 at 14.51.44 (1).png differ
diff --git a/docs/.gitbook/assets/Screen Shot 2023-02-22 at 14.51.44 (2).png b/docs/.gitbook/assets/Screen Shot 2023-02-22 at 14.51.44 (2).png
new file mode 100644
index 0000000..a94dc34
Binary files /dev/null and b/docs/.gitbook/assets/Screen Shot 2023-02-22 at 14.51.44 (2).png differ
diff --git a/docs/.gitbook/assets/Screen Shot 2023-02-22 at 14.51.44.png b/docs/.gitbook/assets/Screen Shot 2023-02-22 at 14.51.44.png
new file mode 100644
index 0000000..a94dc34
Binary files /dev/null and b/docs/.gitbook/assets/Screen Shot 2023-02-22 at 14.51.44.png differ
diff --git a/docs/.gitbook/assets/Screen Shot 2023-12-15 at 12.05.56.png b/docs/.gitbook/assets/Screen Shot 2023-12-15 at 12.05.56.png
new file mode 100644
index 0000000..5e18dc6
Binary files /dev/null and b/docs/.gitbook/assets/Screen Shot 2023-12-15 at 12.05.56.png differ
diff --git a/docs/.gitbook/assets/Screen Shot 2023-12-15 at 12.06.16.png b/docs/.gitbook/assets/Screen Shot 2023-12-15 at 12.06.16.png
new file mode 100644
index 0000000..1712f50
Binary files /dev/null and b/docs/.gitbook/assets/Screen Shot 2023-12-15 at 12.06.16.png differ
diff --git a/docs/.gitbook/assets/Screen Shot 2023-12-15 at 12.06.32.png b/docs/.gitbook/assets/Screen Shot 2023-12-15 at 12.06.32.png
new file mode 100644
index 0000000..21476b0
Binary files /dev/null and b/docs/.gitbook/assets/Screen Shot 2023-12-15 at 12.06.32.png differ
diff --git a/docs/.gitbook/assets/Screen Shot 2023-12-15 at 12.06.53.png b/docs/.gitbook/assets/Screen Shot 2023-12-15 at 12.06.53.png
new file mode 100644
index 0000000..429a2f1
Binary files /dev/null and b/docs/.gitbook/assets/Screen Shot 2023-12-15 at 12.06.53.png differ
diff --git a/docs/.gitbook/assets/Screen Shot 2023-12-15 at 12.26.12.png b/docs/.gitbook/assets/Screen Shot 2023-12-15 at 12.26.12.png
new file mode 100644
index 0000000..befce3e
Binary files /dev/null and b/docs/.gitbook/assets/Screen Shot 2023-12-15 at 12.26.12.png differ
diff --git a/docs/.gitbook/assets/Screen Shot 2023-12-15 at 12.32.41 (1).png b/docs/.gitbook/assets/Screen Shot 2023-12-15 at 12.32.41 (1).png
new file mode 100644
index 0000000..c681da5
Binary files /dev/null and b/docs/.gitbook/assets/Screen Shot 2023-12-15 at 12.32.41 (1).png differ
diff --git a/docs/.gitbook/assets/Screen Shot 2023-12-15 at 12.32.41.png b/docs/.gitbook/assets/Screen Shot 2023-12-15 at 12.32.41.png
new file mode 100644
index 0000000..c681da5
Binary files /dev/null and b/docs/.gitbook/assets/Screen Shot 2023-12-15 at 12.32.41.png differ
diff --git a/docs/.gitbook/assets/Screen Shot 2023-12-18 at 14.06.21.png b/docs/.gitbook/assets/Screen Shot 2023-12-18 at 14.06.21.png
new file mode 100644
index 0000000..0533ab3
Binary files /dev/null and b/docs/.gitbook/assets/Screen Shot 2023-12-18 at 14.06.21.png differ
diff --git a/docs/.gitbook/assets/Screen Shot 2023-12-18 at 14.07.01.png b/docs/.gitbook/assets/Screen Shot 2023-12-18 at 14.07.01.png
new file mode 100644
index 0000000..c40926c
Binary files /dev/null and b/docs/.gitbook/assets/Screen Shot 2023-12-18 at 14.07.01.png differ
diff --git a/docs/.gitbook/assets/Screen Shot 2023-12-18 at 14.10.25 (1).png b/docs/.gitbook/assets/Screen Shot 2023-12-18 at 14.10.25 (1).png
new file mode 100644
index 0000000..f1de1b2
Binary files /dev/null and b/docs/.gitbook/assets/Screen Shot 2023-12-18 at 14.10.25 (1).png differ
diff --git a/docs/.gitbook/assets/Screen Shot 2023-12-18 at 14.10.25.png b/docs/.gitbook/assets/Screen Shot 2023-12-18 at 14.10.25.png
new file mode 100644
index 0000000..f1de1b2
Binary files /dev/null and b/docs/.gitbook/assets/Screen Shot 2023-12-18 at 14.10.25.png differ
diff --git a/docs/.gitbook/assets/Screenshot 2021-12-27 at 15.49.28.png b/docs/.gitbook/assets/Screenshot 2021-12-27 at 15.49.28.png
new file mode 100644
index 0000000..8fadd0d
Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2021-12-27 at 15.49.28.png differ
diff --git a/docs/.gitbook/assets/Screenshot 2022-03-30 at 16.46.10.png b/docs/.gitbook/assets/Screenshot 2022-03-30 at 16.46.10.png
new file mode 100644
index 0000000..a94a062
Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2022-03-30 at 16.46.10.png differ
diff --git a/docs/.gitbook/assets/Screenshot 2022-04-04 at 13.37.19.png b/docs/.gitbook/assets/Screenshot 2022-04-04 at 13.37.19.png
new file mode 100644
index 0000000..ee01de7
Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2022-04-04 at 13.37.19.png differ
diff --git a/docs/.gitbook/assets/Screenshot 2022-04-04 at 13.37.46.png b/docs/.gitbook/assets/Screenshot 2022-04-04 at 13.37.46.png
new file mode 100644
index 0000000..2772bab
Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2022-04-04 at 13.37.46.png differ
diff --git a/docs/.gitbook/assets/Screenshot 2022-04-23 at 12.30.53.png b/docs/.gitbook/assets/Screenshot 2022-04-23 at 12.30.53.png
new file mode 100644
index 0000000..505a917
Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2022-04-23 at 12.30.53.png differ
diff --git a/docs/.gitbook/assets/Screenshot 2022-04-23 at 12.32.48.png b/docs/.gitbook/assets/Screenshot 2022-04-23 at 12.32.48.png
new file mode 100644
index 0000000..2ce7d39
Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2022-04-23 at 12.32.48.png differ
diff --git a/docs/.gitbook/assets/Screenshot 2022-08-19 at 14.25.28.png b/docs/.gitbook/assets/Screenshot 2022-08-19 at 14.25.28.png
new file mode 100644
index 0000000..8042dc2
Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2022-08-19 at 14.25.28.png differ
diff --git a/docs/.gitbook/assets/Screenshot 2022-08-19 at 14.32.08.png b/docs/.gitbook/assets/Screenshot 2022-08-19 at 14.32.08.png
new file mode 100644
index 0000000..12cc45c
Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2022-08-19 at 14.32.08.png differ
diff --git a/docs/.gitbook/assets/Screenshot 2022-08-19 at 19.58.47.png b/docs/.gitbook/assets/Screenshot 2022-08-19 at 19.58.47.png
new file mode 100644
index 0000000..142180d
Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2022-08-19 at 19.58.47.png differ
diff --git a/docs/.gitbook/assets/Screenshot 2022-11-16 at 18.29.58.png b/docs/.gitbook/assets/Screenshot 2022-11-16 at 18.29.58.png
new file mode 100644
index 0000000..3fda560
Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2022-11-16 at 18.29.58.png differ
diff --git a/docs/.gitbook/assets/Screenshot 2022-11-16 at 18.40.35.png b/docs/.gitbook/assets/Screenshot 2022-11-16 at 18.40.35.png
new file mode 100644
index 0000000..0800543
Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2022-11-16 at 18.40.35.png differ
diff --git a/docs/.gitbook/assets/Screenshot 2022-11-16 at 18.40.56.png b/docs/.gitbook/assets/Screenshot 2022-11-16 at 18.40.56.png
new file mode 100644
index 0000000..934bac2
Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2022-11-16 at 18.40.56.png differ
diff --git a/docs/.gitbook/assets/Screenshot 2022-12-29 at 16.55.06.png b/docs/.gitbook/assets/Screenshot 2022-12-29 at 16.55.06.png
new file mode 100644
index 0000000..8db9b33
Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2022-12-29 at 16.55.06.png differ
diff --git a/docs/.gitbook/assets/Screenshot 2023-05-29 at 18.40.45.png b/docs/.gitbook/assets/Screenshot 2023-05-29 at 18.40.45.png
new file mode 100644
index 0000000..fce8d63
Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2023-05-29 at 18.40.45.png differ
diff --git a/docs/.gitbook/assets/Screenshot 2023-05-29 at 18.44.29.png b/docs/.gitbook/assets/Screenshot 2023-05-29 at 18.44.29.png
new file mode 100644
index 0000000..3b28b57
Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2023-05-29 at 18.44.29.png differ
diff --git a/docs/.gitbook/assets/Screenshot 2023-05-29 at 18.44.49.png b/docs/.gitbook/assets/Screenshot 2023-05-29 at 18.44.49.png
new file mode 100644
index 0000000..41b98de
Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2023-05-29 at 18.44.49.png differ
diff --git a/docs/.gitbook/assets/Screenshot 2023-10-26 at 14.25.02.png b/docs/.gitbook/assets/Screenshot 2023-10-26 at 14.25.02.png
new file mode 100644
index 0000000..c4bf632
Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2023-10-26 at 14.25.02.png differ
diff --git a/docs/.gitbook/assets/Screenshot 2024-02-02 at 22.15.15.png b/docs/.gitbook/assets/Screenshot 2024-02-02 at 22.15.15.png
new file mode 100644
index 0000000..790dd25
Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-02-02 at 22.15.15.png differ
diff --git a/docs/.gitbook/assets/Screenshot 2024-02-12 at 16.59.40.png b/docs/.gitbook/assets/Screenshot 2024-02-12 at 16.59.40.png
new file mode 100644
index 0000000..069772a
Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-02-12 at 16.59.40.png differ
diff --git a/docs/.gitbook/assets/Screenshot 2024-02-12 at 17.00.11.png b/docs/.gitbook/assets/Screenshot 2024-02-12 at 17.00.11.png
new file mode 100644
index 0000000..80a8037
Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-02-12 at 17.00.11.png differ
diff --git a/docs/.gitbook/assets/Screenshot 2024-03-05 at 17.12.46.png b/docs/.gitbook/assets/Screenshot 2024-03-05 at 17.12.46.png
new file mode 100644
index 0000000..a1cd8e7
Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-03-05 at 17.12.46.png differ
diff --git a/docs/.gitbook/assets/Screenshot 2024-04-30 at 19.07.13.png b/docs/.gitbook/assets/Screenshot 2024-04-30 at 19.07.13.png
new file mode 100644
index 0000000..b4e58f3
Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-04-30 at 19.07.13.png differ
diff --git a/docs/.gitbook/assets/Screenshot 2024-05-16 at 17.39.32.png b/docs/.gitbook/assets/Screenshot 2024-05-16 at 17.39.32.png
new file mode 100644
index 0000000..e297aa8
Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-05-16 at 17.39.32.png differ
diff --git a/docs/.gitbook/assets/Screenshot 2024-06-13 at 22.59.48.png b/docs/.gitbook/assets/Screenshot 2024-06-13 at 22.59.48.png
new file mode 100644
index 0000000..85026be
Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-06-13 at 22.59.48.png differ
diff --git a/docs/.gitbook/assets/Screenshot 2024-06-13 at 23.54.29.png b/docs/.gitbook/assets/Screenshot 2024-06-13 at 23.54.29.png
new file mode 100644
index 0000000..ce7bbe7
Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-06-13 at 23.54.29.png differ
diff --git a/docs/.gitbook/assets/Screenshot 2024-08-21 at 12.39.27.png b/docs/.gitbook/assets/Screenshot 2024-08-21 at 12.39.27.png
new file mode 100644
index 0000000..08160c5
Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-08-21 at 12.39.27.png differ
diff --git a/docs/.gitbook/assets/Screenshot 2024-08-21 at 12.44.41.png b/docs/.gitbook/assets/Screenshot 2024-08-21 at 12.44.41.png
new file mode 100644
index 0000000..3e8011c
Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-08-21 at 12.44.41.png differ
diff --git a/docs/.gitbook/assets/Screenshot 2024-08-21 at 12.53.37.png b/docs/.gitbook/assets/Screenshot 2024-08-21 at 12.53.37.png
new file mode 100644
index 0000000..f0bf346
Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-08-21 at 12.53.37.png differ
diff --git a/docs/.gitbook/assets/Screenshot 2024-08-21 at 12.56.45.png b/docs/.gitbook/assets/Screenshot 2024-08-21 at 12.56.45.png
new file mode 100644
index 0000000..8c4b99d
Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-08-21 at 12.56.45.png differ
diff --git a/docs/.gitbook/assets/Screenshot 2024-08-21 at 12.59.35.png b/docs/.gitbook/assets/Screenshot 2024-08-21 at 12.59.35.png
new file mode 100644
index 0000000..9dea8ae
Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-08-21 at 12.59.35.png differ
diff --git a/docs/.gitbook/assets/Screenshot 2024-08-21 at 12.59.52.png b/docs/.gitbook/assets/Screenshot 2024-08-21 at 12.59.52.png
new file mode 100644
index 0000000..89e8e1c
Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-08-21 at 12.59.52.png differ
diff --git a/docs/.gitbook/assets/Screenshot 2024-08-21 at 13.05.55.png b/docs/.gitbook/assets/Screenshot 2024-08-21 at 13.05.55.png
new file mode 100644
index 0000000..b49c33e
Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-08-21 at 13.05.55.png differ
diff --git a/docs/.gitbook/assets/Screenshot 2024-08-21 at 13.40.17 (1).png b/docs/.gitbook/assets/Screenshot 2024-08-21 at 13.40.17 (1).png
new file mode 100644
index
0000000..0e2a6a7 Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-08-21 at 13.40.17 (1).png differ diff --git a/docs/.gitbook/assets/Screenshot 2024-08-21 at 13.40.17.png b/docs/.gitbook/assets/Screenshot 2024-08-21 at 13.40.17.png new file mode 100644 index 0000000..0e2a6a7 Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-08-21 at 13.40.17.png differ diff --git a/docs/.gitbook/assets/Screenshot 2024-08-21 at 13.54.37.png b/docs/.gitbook/assets/Screenshot 2024-08-21 at 13.54.37.png new file mode 100644 index 0000000..8e6b26d Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-08-21 at 13.54.37.png differ diff --git a/docs/.gitbook/assets/Screenshot 2024-08-21 at 14.17.51.png b/docs/.gitbook/assets/Screenshot 2024-08-21 at 14.17.51.png new file mode 100644 index 0000000..925ba01 Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-08-21 at 14.17.51.png differ diff --git a/docs/.gitbook/assets/Screenshot 2024-08-21 at 14.33.50.png b/docs/.gitbook/assets/Screenshot 2024-08-21 at 14.33.50.png new file mode 100644 index 0000000..238c946 Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-08-21 at 14.33.50.png differ diff --git a/docs/.gitbook/assets/Screenshot 2024-08-21 at 14.36.47.png b/docs/.gitbook/assets/Screenshot 2024-08-21 at 14.36.47.png new file mode 100644 index 0000000..c8255a0 Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-08-21 at 14.36.47.png differ diff --git a/docs/.gitbook/assets/Screenshot 2024-08-21 at 14.38.56.png b/docs/.gitbook/assets/Screenshot 2024-08-21 at 14.38.56.png new file mode 100644 index 0000000..662e77a Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-08-21 at 14.38.56.png differ diff --git a/docs/.gitbook/assets/Screenshot 2024-08-30 at 10.34.53.png b/docs/.gitbook/assets/Screenshot 2024-08-30 at 10.34.53.png new file mode 100644 index 0000000..a99fb1f Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-08-30 at 
10.34.53.png differ diff --git a/docs/.gitbook/assets/Screenshot 2024-08-30 at 10.35.28.png b/docs/.gitbook/assets/Screenshot 2024-08-30 at 10.35.28.png new file mode 100644 index 0000000..543cb3e Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-08-30 at 10.35.28.png differ diff --git a/docs/.gitbook/assets/Screenshot 2024-08-30 at 10.37.04.png b/docs/.gitbook/assets/Screenshot 2024-08-30 at 10.37.04.png new file mode 100644 index 0000000..d20290a Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-08-30 at 10.37.04.png differ diff --git a/docs/.gitbook/assets/Screenshot 2024-08-30 at 11.13.29.png b/docs/.gitbook/assets/Screenshot 2024-08-30 at 11.13.29.png new file mode 100644 index 0000000..c07a0c0 Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-08-30 at 11.13.29.png differ diff --git a/docs/.gitbook/assets/Screenshot 2024-10-03 at 14.39.12.png b/docs/.gitbook/assets/Screenshot 2024-10-03 at 14.39.12.png new file mode 100644 index 0000000..ed4de04 Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-10-03 at 14.39.12.png differ diff --git a/docs/.gitbook/assets/Screenshot 2024-10-04 at 11.07.59.png b/docs/.gitbook/assets/Screenshot 2024-10-04 at 11.07.59.png new file mode 100644 index 0000000..7f7285f Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-10-04 at 11.07.59.png differ diff --git a/docs/.gitbook/assets/Screenshot 2024-10-23 at 13.53.20.png b/docs/.gitbook/assets/Screenshot 2024-10-23 at 13.53.20.png new file mode 100644 index 0000000..331b873 Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-10-23 at 13.53.20.png differ diff --git a/docs/.gitbook/assets/Screenshot 2024-10-23 at 13.58.52.png b/docs/.gitbook/assets/Screenshot 2024-10-23 at 13.58.52.png new file mode 100644 index 0000000..e505699 Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-10-23 at 13.58.52.png differ diff --git a/docs/.gitbook/assets/Screenshot 2024-10-29 at 13.40.30.png 
b/docs/.gitbook/assets/Screenshot 2024-10-29 at 13.40.30.png new file mode 100644 index 0000000..04fa664 Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-10-29 at 13.40.30.png differ diff --git a/docs/.gitbook/assets/Screenshot 2024-10-29 at 13.45.57.png b/docs/.gitbook/assets/Screenshot 2024-10-29 at 13.45.57.png new file mode 100644 index 0000000..85fa19e Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-10-29 at 13.45.57.png differ diff --git a/docs/.gitbook/assets/Screenshot 2024-10-29 at 13.46.24.png b/docs/.gitbook/assets/Screenshot 2024-10-29 at 13.46.24.png new file mode 100644 index 0000000..2d1c1ea Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-10-29 at 13.46.24.png differ diff --git a/docs/.gitbook/assets/Screenshot 2024-10-29 at 13.47.57.png b/docs/.gitbook/assets/Screenshot 2024-10-29 at 13.47.57.png new file mode 100644 index 0000000..a01711d Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-10-29 at 13.47.57.png differ diff --git a/docs/.gitbook/assets/Screenshot 2024-10-29 at 15.32.45.png b/docs/.gitbook/assets/Screenshot 2024-10-29 at 15.32.45.png new file mode 100644 index 0000000..e43de50 Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-10-29 at 15.32.45.png differ diff --git a/docs/.gitbook/assets/Screenshot 2024-11-13 at 13.27.41.png b/docs/.gitbook/assets/Screenshot 2024-11-13 at 13.27.41.png new file mode 100644 index 0000000..5252996 Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-11-13 at 13.27.41.png differ diff --git a/docs/.gitbook/assets/Screenshot 2024-12-27 at 15.24.23.png b/docs/.gitbook/assets/Screenshot 2024-12-27 at 15.24.23.png new file mode 100644 index 0000000..f431a95 Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-12-27 at 15.24.23.png differ diff --git a/docs/.gitbook/assets/Screenshot 2024-12-27 at 15.24.48.png b/docs/.gitbook/assets/Screenshot 2024-12-27 at 15.24.48.png new file mode 100644 index 
0000000..5b25e5d Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-12-27 at 15.24.48.png differ diff --git a/docs/.gitbook/assets/Screenshot 2024-12-27 at 15.25.21.png b/docs/.gitbook/assets/Screenshot 2024-12-27 at 15.25.21.png new file mode 100644 index 0000000..188cba8 Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-12-27 at 15.25.21.png differ diff --git a/docs/.gitbook/assets/Screenshot 2024-12-27 at 15.26.15.png b/docs/.gitbook/assets/Screenshot 2024-12-27 at 15.26.15.png new file mode 100644 index 0000000..00dcff0 Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-12-27 at 15.26.15.png differ diff --git a/docs/.gitbook/assets/Screenshot 2024-12-27 at 15.26.46.png b/docs/.gitbook/assets/Screenshot 2024-12-27 at 15.26.46.png new file mode 100644 index 0000000..a960941 Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-12-27 at 15.26.46.png differ diff --git a/docs/.gitbook/assets/Screenshot 2024-12-27 at 15.28.41.png b/docs/.gitbook/assets/Screenshot 2024-12-27 at 15.28.41.png new file mode 100644 index 0000000..97d49f1 Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-12-27 at 15.28.41.png differ diff --git a/docs/.gitbook/assets/Screenshot 2024-12-27 at 15.29.18.png b/docs/.gitbook/assets/Screenshot 2024-12-27 at 15.29.18.png new file mode 100644 index 0000000..fad67a2 Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-12-27 at 15.29.18.png differ diff --git a/docs/.gitbook/assets/Screenshot 2024-12-27 at 15.29.46.png b/docs/.gitbook/assets/Screenshot 2024-12-27 at 15.29.46.png new file mode 100644 index 0000000..7d23d5e Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-12-27 at 15.29.46.png differ diff --git a/docs/.gitbook/assets/Screenshot 2024-12-27 at 15.30.30.png b/docs/.gitbook/assets/Screenshot 2024-12-27 at 15.30.30.png new file mode 100644 index 0000000..6dd484c Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-12-27 at 
15.30.30.png differ diff --git a/docs/.gitbook/assets/Screenshot 2024-12-27 at 15.31.13.png b/docs/.gitbook/assets/Screenshot 2024-12-27 at 15.31.13.png new file mode 100644 index 0000000..9467c42 Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-12-27 at 15.31.13.png differ diff --git a/docs/.gitbook/assets/Screenshot 2024-12-27 at 15.32.19.png b/docs/.gitbook/assets/Screenshot 2024-12-27 at 15.32.19.png new file mode 100644 index 0000000..92cdafe Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-12-27 at 15.32.19.png differ diff --git a/docs/.gitbook/assets/Screenshot 2024-12-27 at 15.34.43.png b/docs/.gitbook/assets/Screenshot 2024-12-27 at 15.34.43.png new file mode 100644 index 0000000..a75ed36 Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-12-27 at 15.34.43.png differ diff --git a/docs/.gitbook/assets/Screenshot 2024-12-27 at 15.36.01.png b/docs/.gitbook/assets/Screenshot 2024-12-27 at 15.36.01.png new file mode 100644 index 0000000..1365eb1 Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2024-12-27 at 15.36.01.png differ diff --git a/docs/.gitbook/assets/Screenshot 2025-01-29 at 13.34.32.png b/docs/.gitbook/assets/Screenshot 2025-01-29 at 13.34.32.png new file mode 100644 index 0000000..4fd20c0 Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2025-01-29 at 13.34.32.png differ diff --git a/docs/.gitbook/assets/Screenshot 2025-02-26 at 17.22.36.png b/docs/.gitbook/assets/Screenshot 2025-02-26 at 17.22.36.png new file mode 100644 index 0000000..eb52c1d Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2025-02-26 at 17.22.36.png differ diff --git a/docs/.gitbook/assets/Screenshot 2025-05-09 at 14.38.38.png b/docs/.gitbook/assets/Screenshot 2025-05-09 at 14.38.38.png new file mode 100644 index 0000000..458432a Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2025-05-09 at 14.38.38.png differ diff --git a/docs/.gitbook/assets/Screenshot 2025-05-09 at 14.39.57.png 
b/docs/.gitbook/assets/Screenshot 2025-05-09 at 14.39.57.png new file mode 100644 index 0000000..3b850db Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2025-05-09 at 14.39.57.png differ diff --git a/docs/.gitbook/assets/Screenshot 2025-08-13 at 10.17.48.png b/docs/.gitbook/assets/Screenshot 2025-08-13 at 10.17.48.png new file mode 100644 index 0000000..431352f Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2025-08-13 at 10.17.48.png differ diff --git "a/docs/.gitbook/assets/Screenshot 2025-10-03 at 11.59.03\342\200\257PM.png" "b/docs/.gitbook/assets/Screenshot 2025-10-03 at 11.59.03\342\200\257PM.png" new file mode 100644 index 0000000..81f7129 Binary files /dev/null and "b/docs/.gitbook/assets/Screenshot 2025-10-03 at 11.59.03\342\200\257PM.png" differ diff --git "a/docs/.gitbook/assets/Screenshot 2025-10-03 at 11.59.42\342\200\257PM.png" "b/docs/.gitbook/assets/Screenshot 2025-10-03 at 11.59.42\342\200\257PM.png" new file mode 100644 index 0000000..7fd8d85 Binary files /dev/null and "b/docs/.gitbook/assets/Screenshot 2025-10-03 at 11.59.42\342\200\257PM.png" differ diff --git "a/docs/.gitbook/assets/Screenshot 2025-10-04 at 12.00.32\342\200\257AM.png" "b/docs/.gitbook/assets/Screenshot 2025-10-04 at 12.00.32\342\200\257AM.png" new file mode 100644 index 0000000..f197d8a Binary files /dev/null and "b/docs/.gitbook/assets/Screenshot 2025-10-04 at 12.00.32\342\200\257AM.png" differ diff --git "a/docs/.gitbook/assets/Screenshot 2025-10-04 at 12.00.38\342\200\257AM.png" "b/docs/.gitbook/assets/Screenshot 2025-10-04 at 12.00.38\342\200\257AM.png" new file mode 100644 index 0000000..56c3e06 Binary files /dev/null and "b/docs/.gitbook/assets/Screenshot 2025-10-04 at 12.00.38\342\200\257AM.png" differ diff --git "a/docs/.gitbook/assets/Screenshot 2025-10-13 at 7.38.41\342\200\257AM.png" "b/docs/.gitbook/assets/Screenshot 2025-10-13 at 7.38.41\342\200\257AM.png" new file mode 100644 index 0000000..878085a Binary files /dev/null and 
"b/docs/.gitbook/assets/Screenshot 2025-10-13 at 7.38.41\342\200\257AM.png" differ diff --git a/docs/.gitbook/assets/Screenshot 2025-10-31 at 17.02.59.png b/docs/.gitbook/assets/Screenshot 2025-10-31 at 17.02.59.png new file mode 100644 index 0000000..6ea5562 Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2025-10-31 at 17.02.59.png differ diff --git a/docs/.gitbook/assets/Screenshot 2025-11-05 at 14.35.12.png b/docs/.gitbook/assets/Screenshot 2025-11-05 at 14.35.12.png new file mode 100644 index 0000000..607f952 Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2025-11-05 at 14.35.12.png differ diff --git a/docs/.gitbook/assets/Screenshot 2025-11-05 at 14.40.17.png b/docs/.gitbook/assets/Screenshot 2025-11-05 at 14.40.17.png new file mode 100644 index 0000000..e138ab6 Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2025-11-05 at 14.40.17.png differ diff --git a/docs/.gitbook/assets/Screenshot 2025-11-05 at 15.21.06.png b/docs/.gitbook/assets/Screenshot 2025-11-05 at 15.21.06.png new file mode 100644 index 0000000..eff5992 Binary files /dev/null and b/docs/.gitbook/assets/Screenshot 2025-11-05 at 15.21.06.png differ diff --git a/docs/.gitbook/assets/Staking updates.png b/docs/.gitbook/assets/Staking updates.png new file mode 100644 index 0000000..96cc93f Binary files /dev/null and b/docs/.gitbook/assets/Staking updates.png differ diff --git a/docs/.gitbook/assets/TRAC token (1).png b/docs/.gitbook/assets/TRAC token (1).png new file mode 100644 index 0000000..8c32f74 Binary files /dev/null and b/docs/.gitbook/assets/TRAC token (1).png differ diff --git a/docs/.gitbook/assets/TRAC token.png b/docs/.gitbook/assets/TRAC token.png new file mode 100644 index 0000000..7c2e6db Binary files /dev/null and b/docs/.gitbook/assets/TRAC token.png differ diff --git a/docs/.gitbook/assets/V8 - Random sampling sequence diagram.png b/docs/.gitbook/assets/V8 - Random sampling sequence diagram.png new file mode 100644 index 0000000..8a20c98 Binary 
files /dev/null and b/docs/.gitbook/assets/V8 - Random sampling sequence diagram.png differ diff --git a/docs/.gitbook/assets/V8 DKG SDK.png b/docs/.gitbook/assets/V8 DKG SDK.png new file mode 100644 index 0000000..f5dc3fa Binary files /dev/null and b/docs/.gitbook/assets/V8 DKG SDK.png differ diff --git a/docs/.gitbook/assets/V8 Timeline (1).png b/docs/.gitbook/assets/V8 Timeline (1).png new file mode 100644 index 0000000..7b4af39 Binary files /dev/null and b/docs/.gitbook/assets/V8 Timeline (1).png differ diff --git a/docs/.gitbook/assets/V8 Timeline.png b/docs/.gitbook/assets/V8 Timeline.png new file mode 100644 index 0000000..7b4af39 Binary files /dev/null and b/docs/.gitbook/assets/V8 Timeline.png differ diff --git a/docs/.gitbook/assets/V8 diagrams - Radnom sampling.png b/docs/.gitbook/assets/V8 diagrams - Radnom sampling.png new file mode 100644 index 0000000..e60695d Binary files /dev/null and b/docs/.gitbook/assets/V8 diagrams - Radnom sampling.png differ diff --git a/docs/.gitbook/assets/V8.1 - random sampling.png b/docs/.gitbook/assets/V8.1 - random sampling.png new file mode 100644 index 0000000..afe304d Binary files /dev/null and b/docs/.gitbook/assets/V8.1 - random sampling.png differ diff --git a/docs/.gitbook/assets/What is the DKG Node.png b/docs/.gitbook/assets/What is the DKG Node.png new file mode 100644 index 0000000..366f776 Binary files /dev/null and b/docs/.gitbook/assets/What is the DKG Node.png differ diff --git a/docs/.gitbook/assets/aa (1).png b/docs/.gitbook/assets/aa (1).png new file mode 100644 index 0000000..c7922f5 Binary files /dev/null and b/docs/.gitbook/assets/aa (1).png differ diff --git a/docs/.gitbook/assets/aa.png b/docs/.gitbook/assets/aa.png new file mode 100644 index 0000000..c7922f5 Binary files /dev/null and b/docs/.gitbook/assets/aa.png differ diff --git a/docs/.gitbook/assets/add_withdraw_stake.png b/docs/.gitbook/assets/add_withdraw_stake.png new file mode 100644 index 0000000..71ec911 Binary files /dev/null and 
b/docs/.gitbook/assets/add_withdraw_stake.png differ diff --git a/docs/.gitbook/assets/bg-ot.jpeg b/docs/.gitbook/assets/bg-ot.jpeg new file mode 100644 index 0000000..02c5ff0 Binary files /dev/null and b/docs/.gitbook/assets/bg-ot.jpeg differ diff --git a/docs/.gitbook/assets/cd047b82028517bdab2e9e45d3797721ee137160c99af33aef73b455717c7f7f.jpeg b/docs/.gitbook/assets/cd047b82028517bdab2e9e45d3797721ee137160c99af33aef73b455717c7f7f.jpeg new file mode 100644 index 0000000..2a6313f Binary files /dev/null and b/docs/.gitbook/assets/cd047b82028517bdab2e9e45d3797721ee137160c99af33aef73b455717c7f7f.jpeg differ diff --git a/docs/.gitbook/assets/command-executor-api.png b/docs/.gitbook/assets/command-executor-api.png new file mode 100644 index 0000000..422fffd Binary files /dev/null and b/docs/.gitbook/assets/command-executor-api.png differ diff --git a/docs/.gitbook/assets/commandExecutoradd.png b/docs/.gitbook/assets/commandExecutoradd.png new file mode 100644 index 0000000..9e9e663 Binary files /dev/null and b/docs/.gitbook/assets/commandExecutoradd.png differ diff --git a/docs/.gitbook/assets/connect_wallet.png b/docs/.gitbook/assets/connect_wallet.png new file mode 100644 index 0000000..544bb7c Binary files /dev/null and b/docs/.gitbook/assets/connect_wallet.png differ diff --git a/docs/.gitbook/assets/delay&period.png b/docs/.gitbook/assets/delay&period.png new file mode 100644 index 0000000..7bedbff Binary files /dev/null and b/docs/.gitbook/assets/delay&period.png differ diff --git a/docs/.gitbook/assets/delegate.png b/docs/.gitbook/assets/delegate.png new file mode 100644 index 0000000..42ccee7 Binary files /dev/null and b/docs/.gitbook/assets/delegate.png differ diff --git a/docs/.gitbook/assets/dkg-js-banner.jpg b/docs/.gitbook/assets/dkg-js-banner.jpg new file mode 100644 index 0000000..4f6bca2 Binary files /dev/null and b/docs/.gitbook/assets/dkg-js-banner.jpg differ diff --git a/docs/.gitbook/assets/handleHttpApiPublishRequest.png 
b/docs/.gitbook/assets/handleHttpApiPublishRequest.png new file mode 100644 index 0000000..fc827cc Binary files /dev/null and b/docs/.gitbook/assets/handleHttpApiPublishRequest.png differ diff --git a/docs/.gitbook/assets/https___files.gitbook.com_v0_b_gitbook-x-prod.appspot.com_o_spaces%2F-McnEkhdd7JlySeckfHM%2Fuploads%2F1LB2XdwetIONYxxI3xj3%2FAutomated%20environment%20setup (1).png b/docs/.gitbook/assets/https___files.gitbook.com_v0_b_gitbook-x-prod.appspot.com_o_spaces%2F-McnEkhdd7JlySeckfHM%2Fuploads%2F1LB2XdwetIONYxxI3xj3%2FAutomated%20environment%20setup (1).png new file mode 100644 index 0000000..e06af9f Binary files /dev/null and b/docs/.gitbook/assets/https___files.gitbook.com_v0_b_gitbook-x-prod.appspot.com_o_spaces%2F-McnEkhdd7JlySeckfHM%2Fuploads%2F1LB2XdwetIONYxxI3xj3%2FAutomated%20environment%20setup (1).png differ diff --git a/docs/.gitbook/assets/image (1).png b/docs/.gitbook/assets/image (1).png new file mode 100644 index 0000000..affde22 Binary files /dev/null and b/docs/.gitbook/assets/image (1).png differ diff --git a/docs/.gitbook/assets/image (10).png b/docs/.gitbook/assets/image (10).png new file mode 100644 index 0000000..6ae4c41 Binary files /dev/null and b/docs/.gitbook/assets/image (10).png differ diff --git a/docs/.gitbook/assets/image (11).png b/docs/.gitbook/assets/image (11).png new file mode 100644 index 0000000..c55342e Binary files /dev/null and b/docs/.gitbook/assets/image (11).png differ diff --git a/docs/.gitbook/assets/image (12).png b/docs/.gitbook/assets/image (12).png new file mode 100644 index 0000000..d273de7 Binary files /dev/null and b/docs/.gitbook/assets/image (12).png differ diff --git a/docs/.gitbook/assets/image (13).png b/docs/.gitbook/assets/image (13).png new file mode 100644 index 0000000..2f2580f Binary files /dev/null and b/docs/.gitbook/assets/image (13).png differ diff --git a/docs/.gitbook/assets/image (14).png b/docs/.gitbook/assets/image (14).png new file mode 100644 index 0000000..c4e43b8 Binary files 
/dev/null and b/docs/.gitbook/assets/image (14).png differ diff --git a/docs/.gitbook/assets/image (15).png b/docs/.gitbook/assets/image (15).png new file mode 100644 index 0000000..f82ebc2 Binary files /dev/null and b/docs/.gitbook/assets/image (15).png differ diff --git a/docs/.gitbook/assets/image (16).png b/docs/.gitbook/assets/image (16).png new file mode 100644 index 0000000..6ae4c41 Binary files /dev/null and b/docs/.gitbook/assets/image (16).png differ diff --git a/docs/.gitbook/assets/image (17).png b/docs/.gitbook/assets/image (17).png new file mode 100644 index 0000000..e0dfdcf Binary files /dev/null and b/docs/.gitbook/assets/image (17).png differ diff --git a/docs/.gitbook/assets/image (18).png b/docs/.gitbook/assets/image (18).png new file mode 100644 index 0000000..66d18f7 Binary files /dev/null and b/docs/.gitbook/assets/image (18).png differ diff --git a/docs/.gitbook/assets/image (19).png b/docs/.gitbook/assets/image (19).png new file mode 100644 index 0000000..92e58ca Binary files /dev/null and b/docs/.gitbook/assets/image (19).png differ diff --git a/docs/.gitbook/assets/image (2).png b/docs/.gitbook/assets/image (2).png new file mode 100644 index 0000000..e8ada43 Binary files /dev/null and b/docs/.gitbook/assets/image (2).png differ diff --git a/docs/.gitbook/assets/image (20).png b/docs/.gitbook/assets/image (20).png new file mode 100644 index 0000000..cdfd5c9 Binary files /dev/null and b/docs/.gitbook/assets/image (20).png differ diff --git a/docs/.gitbook/assets/image (21).png b/docs/.gitbook/assets/image (21).png new file mode 100644 index 0000000..cdfd5c9 Binary files /dev/null and b/docs/.gitbook/assets/image (21).png differ diff --git a/docs/.gitbook/assets/image (22).png b/docs/.gitbook/assets/image (22).png new file mode 100644 index 0000000..917eb29 Binary files /dev/null and b/docs/.gitbook/assets/image (22).png differ diff --git a/docs/.gitbook/assets/image (23).png b/docs/.gitbook/assets/image (23).png new file mode 100644 index 
0000000..fe204cc Binary files /dev/null and b/docs/.gitbook/assets/image (23).png differ diff --git a/docs/.gitbook/assets/image (24).png b/docs/.gitbook/assets/image (24).png new file mode 100644 index 0000000..b9806f5 Binary files /dev/null and b/docs/.gitbook/assets/image (24).png differ diff --git a/docs/.gitbook/assets/image (25).png b/docs/.gitbook/assets/image (25).png new file mode 100644 index 0000000..21cb2c5 Binary files /dev/null and b/docs/.gitbook/assets/image (25).png differ diff --git a/docs/.gitbook/assets/image (3).png b/docs/.gitbook/assets/image (3).png new file mode 100644 index 0000000..f8599c4 Binary files /dev/null and b/docs/.gitbook/assets/image (3).png differ diff --git a/docs/.gitbook/assets/image (4).png b/docs/.gitbook/assets/image (4).png new file mode 100644 index 0000000..5f87075 Binary files /dev/null and b/docs/.gitbook/assets/image (4).png differ diff --git a/docs/.gitbook/assets/image (5).png b/docs/.gitbook/assets/image (5).png new file mode 100644 index 0000000..50c53a1 Binary files /dev/null and b/docs/.gitbook/assets/image (5).png differ diff --git a/docs/.gitbook/assets/image (6).png b/docs/.gitbook/assets/image (6).png new file mode 100644 index 0000000..a65363c Binary files /dev/null and b/docs/.gitbook/assets/image (6).png differ diff --git a/docs/.gitbook/assets/image (7).png b/docs/.gitbook/assets/image (7).png new file mode 100644 index 0000000..6bb2281 Binary files /dev/null and b/docs/.gitbook/assets/image (7).png differ diff --git a/docs/.gitbook/assets/image (8).png b/docs/.gitbook/assets/image (8).png new file mode 100644 index 0000000..6bb2281 Binary files /dev/null and b/docs/.gitbook/assets/image (8).png differ diff --git a/docs/.gitbook/assets/image (9).png b/docs/.gitbook/assets/image (9).png new file mode 100644 index 0000000..b067a47 Binary files /dev/null and b/docs/.gitbook/assets/image (9).png differ diff --git a/docs/.gitbook/assets/image.png b/docs/.gitbook/assets/image.png new file mode 100644 index 
0000000..630b50a Binary files /dev/null and b/docs/.gitbook/assets/image.png differ diff --git a/docs/.gitbook/assets/initiate-teleport.png b/docs/.gitbook/assets/initiate-teleport.png new file mode 100644 index 0000000..ce3db07 Binary files /dev/null and b/docs/.gitbook/assets/initiate-teleport.png differ diff --git a/docs/.gitbook/assets/kk.png b/docs/.gitbook/assets/kk.png new file mode 100644 index 0000000..1a1459e Binary files /dev/null and b/docs/.gitbook/assets/kk.png differ diff --git a/docs/.gitbook/assets/ld-example-p1.png b/docs/.gitbook/assets/ld-example-p1.png new file mode 100644 index 0000000..103722e Binary files /dev/null and b/docs/.gitbook/assets/ld-example-p1.png differ diff --git a/docs/.gitbook/assets/ld-example-p2.png b/docs/.gitbook/assets/ld-example-p2.png new file mode 100644 index 0000000..3093dc4 Binary files /dev/null and b/docs/.gitbook/assets/ld-example-p2.png differ diff --git a/docs/.gitbook/assets/lde1.png b/docs/.gitbook/assets/lde1.png new file mode 100644 index 0000000..a5d020e Binary files /dev/null and b/docs/.gitbook/assets/lde1.png differ diff --git a/docs/.gitbook/assets/lde2.png b/docs/.gitbook/assets/lde2.png new file mode 100644 index 0000000..e48a3c2 Binary files /dev/null and b/docs/.gitbook/assets/lde2.png differ diff --git a/docs/.gitbook/assets/mind map - Copy of Page 13.png b/docs/.gitbook/assets/mind map - Copy of Page 13.png new file mode 100644 index 0000000..046141a Binary files /dev/null and b/docs/.gitbook/assets/mind map - Copy of Page 13.png differ diff --git a/docs/.gitbook/assets/newcommand.png b/docs/.gitbook/assets/newcommand.png new file mode 100644 index 0000000..d4aa0e9 Binary files /dev/null and b/docs/.gitbook/assets/newcommand.png differ diff --git a/docs/.gitbook/assets/node_wallets.png b/docs/.gitbook/assets/node_wallets.png new file mode 100644 index 0000000..720b309 Binary files /dev/null and b/docs/.gitbook/assets/node_wallets.png differ diff --git a/docs/.gitbook/assets/overview_section.png 
b/docs/.gitbook/assets/overview_section.png new file mode 100644 index 0000000..936e987 Binary files /dev/null and b/docs/.gitbook/assets/overview_section.png differ diff --git a/docs/.gitbook/assets/publishStartedCommand (1).png b/docs/.gitbook/assets/publishStartedCommand (1).png new file mode 100644 index 0000000..4ad2695 Binary files /dev/null and b/docs/.gitbook/assets/publishStartedCommand (1).png differ diff --git a/docs/.gitbook/assets/publishStartedCommand (2).png b/docs/.gitbook/assets/publishStartedCommand (2).png new file mode 100644 index 0000000..34afa2b Binary files /dev/null and b/docs/.gitbook/assets/publishStartedCommand (2).png differ diff --git a/docs/.gitbook/assets/publishStartedCommand (3).png b/docs/.gitbook/assets/publishStartedCommand (3).png new file mode 100644 index 0000000..5bb201e Binary files /dev/null and b/docs/.gitbook/assets/publishStartedCommand (3).png differ diff --git a/docs/.gitbook/assets/publishStartedCommand.png b/docs/.gitbook/assets/publishStartedCommand.png new file mode 100644 index 0000000..8d5aac5 Binary files /dev/null and b/docs/.gitbook/assets/publishStartedCommand.png differ diff --git a/docs/.gitbook/assets/service_tokenomics.png b/docs/.gitbook/assets/service_tokenomics.png new file mode 100644 index 0000000..26d4fa1 Binary files /dev/null and b/docs/.gitbook/assets/service_tokenomics.png differ diff --git a/docs/.gitbook/assets/staking-ui.png b/docs/.gitbook/assets/staking-ui.png new file mode 100644 index 0000000..0100704 Binary files /dev/null and b/docs/.gitbook/assets/staking-ui.png differ diff --git a/docs/.gitbook/assets/thub.png b/docs/.gitbook/assets/thub.png new file mode 100644 index 0000000..cd58bad Binary files /dev/null and b/docs/.gitbook/assets/thub.png differ diff --git a/docs/.gitbook/assets/trac_metamask.png b/docs/.gitbook/assets/trac_metamask.png new file mode 100644 index 0000000..ffa5994 Binary files /dev/null and b/docs/.gitbook/assets/trac_metamask.png differ diff --git 
a/docs/.gitbook/assets/withdrwa.png b/docs/.gitbook/assets/withdrwa.png new file mode 100644 index 0000000..58988a1 Binary files /dev/null and b/docs/.gitbook/assets/withdrwa.png differ diff --git a/docs/README.md b/docs/README.md new file mode 100644 index 0000000..f4858a1 --- /dev/null +++ b/docs/README.md @@ -0,0 +1,33 @@ +# Introduction + +OriginTrail is an ecosystem building **collective, trusted memory** for AI. The core ecosystem technology is the **Decentralized Knowledge Graph (DKG)**, a decentralized, permissionless network of nodes through which both humans and machines can share knowledge, reason together, and preserve context across time. + +Modern AI is powerful but ungrounded. It predicts without knowing, hallucinates, forgets what it said, and relies on data controlled by a few centralized platforms. LLMs in particular have an "explainability" problem: why did an LLM respond a certain way, based on what knowledge, from which source? + +The DKG hosts **Knowledge Assets** that encode facts, data provenance, and meaning in a tamper-proof way. The network is hosted by a set of independent **DKG Nodes**. Anyone can run a **DKG network Node**, whether an organization or an individual, contributing to the DKG while building upon its knowledge in a **privacy-preserving** way. The DKG thus ensures that no single entity can rewrite, censor, or monopolize the collective memory. Decentralization keeps AI accountable, bias-resistant, and aligned with human diversity. + +OriginTrail’s **neuro-symbolic approach** combines the structure and reasoning of **symbolic AI** with the creativity and pattern recognition of **neural AI**. This enables AI systems to think with context, grounding their outputs in verifiable knowledge rather than probabilistic guesses. + +The DKG grows through human participation. 
Researchers, developers, and citizens can all publish, link, and improve knowledge — ensuring that the world’s intelligence is shaped by the many, not the few. + +**Core operations** + +* **Publishing knowledge:** Turning data into structured, verifiable Knowledge Assets +* **Knowledge discovery:** Querying, traversing, and monetizing knowledge in the decentralized graph and its _paranets_ +* **Trusted sharing:** Cryptographically verifying the authenticity and provenance of knowledge +* **Neuro-symbolic reasoning:** Inferring new facts based on rules, leveraging graph-based reasoning in combination with LLMs and GenAI models + +We encourage developers to [try out the DKG Node](getting-started/decentralized-knowle-dge-graph-dkg.md) and build their first DKG-based agent with it, to get a feel for what the technology can do. + +### Three ways to get started + +
+* **Build your DKG Agent:** Begin your journey on the Decentralized Knowledge Graph by staking TRAC and setting up your first DKG Node. This is your entry point into the verifiable knowledge economy.
+* **Contribute to the DKG:** Activate your DKG Node and join the decentralized knowledge economy by staking TRAC. See [DKG key concepts](dkg-key-concepts.md).
+* **Learn more about the DKG:** Explore the core concepts behind the DKG and how it powers verifiable, intelligent AI.
+ +{% hint style="success" %} +This site is the official OriginTrail documentation: a constantly updated work in progress, built by the OriginTrail community and core developers. + +Find the "Edit on GitHub" button in the upper-right corner and submit your updates as pull requests (or do it directly in the [docs GitHub repo](https://github.com/OriginTrail/dkg-docs)). + +We appreciate any feedback, improvement ideas, and comments. +{% endhint %} + diff --git a/docs/SUMMARY.md b/docs/SUMMARY.md new file mode 100644 index 0000000..7bfeb31 --- /dev/null +++ b/docs/SUMMARY.md @@ -0,0 +1,169 @@ +# Table of contents + +* [Introduction](README.md) +* [DKG - Key concepts](dkg-key-concepts.md) + +## Getting Started + +* [Installation](getting-started/decentralized-knowle-dge-graph-dkg.md) +* [Interacting with your DKG Agent](getting-started/interacting-with-your-dkg-agent.md) +* [DKG Node Services](getting-started/dkg-node-services.md) +* [Basic Knowledge Asset operations](getting-started/basic-knowledge-asset-operations.md) +* [Security](getting-started/security.md) +* [Troubleshoot](getting-started/troubleshoot.md) + +## Build a DKG Node AI agent + +* [Architecture](build-a-dkg-node-ai-agent/architecture.md) +* [Essentials Plugin](build-a-dkg-node-ai-agent/essentials-plugin.md) +* [Customizing your DKG agent](build-a-dkg-node-ai-agent/customizing-your-dkg-agent.md) +* [Evaluating agent responses](build-a-dkg-node-ai-agent/evaluating-agent-responses.md) +* [Set up your custom DKG Node fork & update flow](build-a-dkg-node-ai-agent/set-up-your-custom-dkg-node-fork-and-update-flow.md) +* [Advanced features & toolkits](build-a-dkg-node-ai-agent/advanced-features-and-toolkits/README.md) + * [Query the DKG](build-a-dkg-node-ai-agent/advanced-features-and-toolkits/querying-the-dkg.md) + * [SDKs](build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/README.md) + * [Development environment 
setup](build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/setting-up-your-development-environment.md) + * [DKG Javascript SDK (dkg.js)](build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/dkg-v8-js-client/README.md) + * [Interact with DKG paranets](build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/dkg-v8-js-client/interact-with-dkg-paranets.md) + * [Permissioned paranets](build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/dkg-v8-js-client/permissioned-paranets.md) + * [Knowledge submission & curation](build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/dkg-v8-js-client/knowledge-submission-and-curation.md) + * [Paranet's incentives pool implementation](build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/dkg-v8-js-client/paranets-incentives-pool-implementation.md) + * [DKG Python SDK (dkg.py)](build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/dkg-v8-py-client/README.md) + * [Interact with DKG paranets](build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/dkg-v8-py-client/interact-with-dkg-paranets.md) + * [Paranets](build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/README.md) + * [Deploy a Paranet](build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/deploying-a-dkg-paranet.md) + * [Build with Paranets](build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/building-with-dkg-paranets.md) + * [Sync a Paranet](build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/syncing-a-dkg-paranet.md) + * [Initial Paranet Offerings (IPOs)](build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/initial-paranet-offerings-ipos/README.md) + * [IPO specification](build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/initial-paranet-offerings-ipos/ipo-specification.md) + * [Launch an 
IPO](build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/initial-paranet-offerings-ipos/launching-your-ipo.md) + * [Incentives pool](build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/initial-paranet-offerings-ipos/paranets-incentives-pool.md) + * [IPO voting](build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/initial-paranet-offerings-ipos/ipo-voting.md) +* [Contributing a plugin](build-a-dkg-node-ai-agent/contributing-a-plugin.md) + +## Contribute to the DKG + +* [Delegated staking](contribute-to-the-dkg/delegated-staking/README.md) + * [Step-by-step staking](contribute-to-the-dkg/delegated-staking/step-by-step-staking.md) + * [Redelegating stake](contribute-to-the-dkg/delegated-staking/redelegating-stake.md) +* [Whitepapers & RFCs](contribute-to-the-dkg/whitepapers-and-rfcs/README.md) + * [Whitepaper](contribute-to-the-dkg/whitepapers-and-rfcs/origintrail-whitepaper.md) + * [RFCs](contribute-to-the-dkg/whitepapers-and-rfcs/origintrail-rfcs.md) +* [Contribution guidelines](contribute-to-the-dkg/contribute/README.md) + * [Guidelines](contribute-to-the-dkg/contribute/guidelines-for-automated-test-contributions.md) +* [Bounties & rewards](contribute-to-the-dkg/bounties-and-rewards/README.md) + * [General bug bounty](contribute-to-the-dkg/bounties-and-rewards/general-bug-bounty/README.md) + * [Staking security bounty](contribute-to-the-dkg/bounties-and-rewards/general-bug-bounty/staking-security-bounty.md) + * [Code contributions & V8 bug bounty](contribute-to-the-dkg/bounties-and-rewards/code-contributions-and-v8-bug-bounty.md) +* [Ecosystem call for papers](contribute-to-the-dkg/ecosystem-call-for-papers/README.md) + * [OriginTrail Ecosystem — Call for Papers (Coming Soon)](contribute-to-the-dkg/ecosystem-call-for-papers/origintrail-ecosystem-call-for-papers-coming-soon.md) + +## Deploy your DKG Node To Production + +* [From Edge Node To Core 
Node](deploy-your-dkg-node-to-production/from-edge-node-to-core-node.md) + +## DKG Knowledge Hub + +* [Learn more](dkg-knowledge-hub/learn-more/README.md) + * [Understanding OriginTrail](dkg-knowledge-hub/learn-more/readme/README.md) + * [The OriginTrail Decentralized Knowledge Graph (DKG)](dkg-knowledge-hub/learn-more/readme/decentralized-knowle-dge-graph-dkg.md) + * [Development principles](dkg-knowledge-hub/learn-more/readme/development-principles.md) + * [Linked data & knowledge graphs](dkg-knowledge-hub/learn-more/readme/kg.md) + * [Core DKG concepts](dkg-knowledge-hub/learn-more/readme/dkg-key-concepts.md) + * [$TRAC token](dkg-knowledge-hub/learn-more/readme/usdtrac-token.md) + * [The DKG Node + MCP](dkg-knowledge-hub/learn-more/dkg-key-concepts/README.md) + * [What is MCP? (Model Context Protocol)](dkg-knowledge-hub/learn-more/dkg-key-concepts/what-is-mcp-model-context-protocol.md) + * [Why DKG Node & MCP combo?](dkg-knowledge-hub/learn-more/dkg-key-concepts/why-dkg-node-and-mcp-combo.md) + * [Using MCP on your DKG Node](dkg-knowledge-hub/learn-more/dkg-key-concepts/using-mcp-on-your-dkg-node.md) + * [Network mechanics & systems](dkg-knowledge-hub/learn-more/introduction/README.md) + * [DKG codebase & structure](dkg-knowledge-hub/learn-more/introduction/dkg-codebase-and-structure.md) + * [How DKG synchronization works](dkg-knowledge-hub/learn-more/introduction/dkg-sync.md) + * [Random Sampling & proofs explained](dkg-knowledge-hub/learn-more/introduction/random-sampling-dkg-proof-system/README.md) + * [Random Sampling rollout](dkg-knowledge-hub/learn-more/introduction/random-sampling-dkg-proof-system/random-sampling-rollout.md) + * [Random Sampling FAQ](dkg-knowledge-hub/learn-more/introduction/random-sampling-dkg-proof-system/random-sampling-faq.md) + * [Rules & token thresholds](dkg-knowledge-hub/learn-more/introduction/rules-and-token-thresholds.md) + * [Connected Blockchains](dkg-knowledge-hub/learn-more/connected-blockchains/README.md) + * [NeuroWeb 
Parachain](dkg-knowledge-hub/learn-more/connected-blockchains/neuroweb.md) + * [Base Network (L2)](dkg-knowledge-hub/learn-more/connected-blockchains/base-blockchain/README.md) + * [Connect to Base](dkg-knowledge-hub/learn-more/connected-blockchains/base-blockchain/connect-to-base.md) + * [Gnosis Chain](dkg-knowledge-hub/learn-more/connected-blockchains/gnosis-chain/README.md) + * [Connect to Gnosis](dkg-knowledge-hub/learn-more/connected-blockchains/gnosis-chain/connect-to-gnosis.md) + * [On-chain deployments & contracts](dkg-knowledge-hub/learn-more/deployed-smart-contracts.md) + * [Node keys (wallets)](dkg-knowledge-hub/learn-more/node-keys-wallets.md) + * [Previous version release](dkg-knowledge-hub/learn-more/previous-updates/README.md) + * [What's new with OriginTrail V8](dkg-knowledge-hub/learn-more/previous-updates/whats-new-with-origintrail-v8.md) + * [DKG V8.0 update guidebook](dkg-knowledge-hub/learn-more/previous-updates/dkg-v8.0-update-guidebook/README.md) + * [Protocol updates](dkg-knowledge-hub/learn-more/previous-updates/dkg-v8.0-update-guidebook/protocol-updates.md) + * [Feature roadmap](dkg-knowledge-hub/learn-more/previous-updates/dkg-v8.0-update-guidebook/feature-roadmap.md) + * [How to upgrade to V8?](dkg-knowledge-hub/learn-more/previous-updates/dkg-v8.0-update-guidebook/how-to-upgrade-to-v8.md) + * [What is a DKG Node?](dkg-knowledge-hub/learn-more/decentralized-knowle-dge-graph-dkg.md) +* [How-tos & tutorials](dkg-knowledge-hub/how-tos-and-tutorials/README.md) + * [Fund your Web3 wallets](dkg-knowledge-hub/how-tos-and-tutorials/fund-your-web3-wallets.md) + * [DKG V8.1.X update guidebook](dkg-knowledge-hub/how-tos-and-tutorials/dkg-v8.1.x-update-guidebook.md) + * [Bridging to Moonbeam](dkg-knowledge-hub/how-tos-and-tutorials/bridging-to-moonbeam.md) + * [Builder tutorials](dkg-knowledge-hub/how-tos-and-tutorials/tutorials.md) +* [Useful resources](dkg-knowledge-hub/useful-resources/README.md) + * [Public 
nodes](dkg-knowledge-hub/useful-resources/public-nodes.md) + * [Test token faucet](dkg-knowledge-hub/useful-resources/test-token-faucet.md) + * [Community created resources](dkg-knowledge-hub/useful-resources/community-resources.md) + * [Available networks, network details and RPCs](dkg-knowledge-hub/useful-resources/networks.md) + * [DKG Engine implementation details](dkg-knowledge-hub/useful-resources/ot-node-engine-implementation-details/README.md) + * [Modules](dkg-knowledge-hub/useful-resources/ot-node-engine-implementation-details/modules.md) + * [Command Executor](dkg-knowledge-hub/useful-resources/ot-node-engine-implementation-details/command-executor.md) + +## TO BE REPOSITIONED + +* [DKG AI Agents](to-be-repositioned/ai-agents/README.md) + * [ElizaOS DKG agent](to-be-repositioned/ai-agents/elizaos-dkg-agent.md) + * [Custom DKG Python agent](to-be-repositioned/ai-agents/custom-dkg-python-agent.md) + * [Custom DKG JavaScript agent](to-be-repositioned/ai-agents/custom-dkg-javascript-agent.md) + +## Graveyard + +* [Everything](graveyard/everything/README.md) + * [DKG Edge Node](graveyard/everything/dkg-edge-node/README.md) + * [DKG Edge Node architecture](graveyard/everything/dkg-edge-node/dkg-edge-node-architecture.md) + * [Get started with the Edge Node boilerplate](graveyard/everything/dkg-edge-node/get-started-with-the-edge-node-boilerplate/README.md) + * [Automated setup with the installer](graveyard/everything/dkg-edge-node/get-started-with-the-edge-node-boilerplate/automated-setup-with-the-installer.md) + * [Manual setup](graveyard/everything/dkg-edge-node/get-started-with-the-edge-node-boilerplate/manual-setup.md) + * [Usage example](graveyard/everything/dkg-edge-node/get-started-with-the-edge-node-boilerplate/usage-example.md) + * [Customize & build with the Edge Node](graveyard/everything/dkg-edge-node/customize-and-build-with-the-edge-node.md) + * [Knowledge Mining and dRAG 
examples](graveyard/everything/dkg-edge-node/knowledge-mining-and-drag-examples.md) + * [Deploy your Edge Node based project](graveyard/everything/dkg-edge-node/deploy-your-edge-node-based-project/README.md) + * [Automated deployment via Google Cloud Marketplace](graveyard/everything/dkg-edge-node/deploy-your-edge-node-based-project/automated-deployment-via-google-cloud-marketplace.md) + * [Automated deployment with installer](graveyard/everything/dkg-edge-node/deploy-your-edge-node-based-project/automated-deployment-with-installer.md) + * [DKG Edge Node inception program](graveyard/everything/dkg-edge-node/dkg-edge-node-inception-program.md) + * [DKG Edge Node API documentation](graveyard/everything/dkg-edge-node/dkg-edge-node-api-documentation.md) + * [DKG V8 update guidebook](graveyard/everything/dkg-v8-update-guidebook.md) + * [DKG Core Node](graveyard/everything/dkg-core-node/README.md) + * [Upgrading from V6 to V8](graveyard/everything/dkg-core-node/upgrading-from-v6-to-v8.md) + * [Run a V8 Core Node on testnet](graveyard/everything/dkg-core-node/run-a-v8-core-node-on-testnet/README.md) + * [Preparation for V8 DKG Core Node deployment](graveyard/everything/dkg-core-node/run-a-v8-core-node-on-testnet/preparation-for-v8-dkg-core-node-deployment.md) + * [V8 DKG Core Node installation](graveyard/everything/dkg-core-node/run-a-v8-core-node-on-testnet/v8-dkg-core-node-installation.md) + * [Run a V8 Core Node on mainnet](graveyard/everything/dkg-core-node/run-a-v8-core-node-on-mainnet/README.md) + * [Preparation for V8 DKG Core Node deployment](graveyard/everything/dkg-core-node/run-a-v8-core-node-on-mainnet/preparation-for-v8-dkg-core-node-deployment.md) + * [V8 DKG Core Node installation](graveyard/everything/dkg-core-node/run-a-v8-core-node-on-mainnet/v8-dkg-core-node-installation.md) + * [Deploy Core node via Google Cloud marketplace](graveyard/everything/dkg-core-node/deploy-core-node-via-google-cloud-marketplace.md) + * [How to open up your node for 
publishing](graveyard/everything/dkg-core-node/how-to-open-up-your-node-for-publishing.md) + * [Auto Updater](graveyard/everything/dkg-core-node/auto-updater.md) + * [Teleport instructions - NeuroWeb](graveyard/everything/teleport-instructions-neuroweb.md) + * [Powering AI minds: OriginTrail hackathon](graveyard/everything/powering-ai-minds-origintrail-hackathon.md) + * [Running DKG nodes](graveyard/everything/node-setup-instructions/README.md) + * [Installation prerequisites](graveyard/everything/node-setup-instructions/installation-prerequisites/README.md) + * [Hardware requirements](graveyard/everything/node-setup-instructions/installation-prerequisites/hardware-requirements.md) + * [Triple store setup](graveyard/everything/node-setup-instructions/installation-prerequisites/triple-store-setup.md) + * [Choosing blockchain networks](graveyard/everything/node-setup-instructions/installation-prerequisites/choosing-blockchain-networks.md) + * [Acquiring tokens](graveyard/everything/node-setup-instructions/installation-prerequisites/acquiring-tokens.md) + * [Acquire archive RPC endpoints](graveyard/everything/node-setup-instructions/installation-prerequisites/acquire-archive-rpc-endpoints.md) + * [DKG node installation](graveyard/everything/node-setup-instructions/dkg-node-installation.md) + * [Verify installation](graveyard/everything/node-setup-instructions/verify-installation.md) + * [OriginTrail DKG node NAT configuration](graveyard/everything/node-setup-instructions/origintrail-dkg-node-nat-configuration.md) + * [Running a full node](graveyard/everything/node-setup-instructions/running-a-full-node.md) + * [Running a gateway node](graveyard/everything/node-setup-instructions/running-a-gateway-node.md) + * [Houston - OriginTrail node command center](graveyard/everything/node-setup-instructions/houston-origintrail-node-command-center.md) + * [🔗 Switch DKG node to multichain](graveyard/everything/node-setup-instructions/switch-dkg-node-to-multichain.md) + * 
[Delegated Staking](graveyard/everything/delegated-staking/README.md) + * [Staking TRAC on Base](graveyard/everything/delegated-staking/staking-trac-on-base.md) + * [Staking TRAC on Neuroweb](graveyard/everything/delegated-staking/staking-trac-on-neuroweb.md) + * [Staking TRAC on Gnosis](graveyard/everything/delegated-staking/staking-trac-on-gnosis.md) + * [(NEW) Redelegating TRAC](graveyard/everything/delegated-staking/new-redelegating-trac.md) diff --git a/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/README.md b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/README.md new file mode 100644 index 0000000..e4cf349 --- /dev/null +++ b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/README.md @@ -0,0 +1,2 @@ +# Advanced features & toolkits + diff --git a/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/README.md b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/README.md new file mode 100644 index 0000000..29a33a6 --- /dev/null +++ b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/README.md @@ -0,0 +1,69 @@ +# Paranets + +
+ +**DKG para-networks**, or "**paranets**," are a feature of the OriginTrail Decentralized Knowledge Graph (DKG) designed to enable decentralized, co-owned, and incentivized knowledge graphs. + +With DKG paranets, both humans and AI agents can **collaboratively create, curate, and maintain knowledge graphs** while ensuring transparency and provenance, and providing incentives for knowledge contributions. + +### Why paranets? + +Traditional knowledge-sharing mechanisms have limitations: + +* Knowledge bases like Wikipedia rely on centralized moderation, which can introduce bias and restrict contributions. +* AI models depend on private datasets, which often lack transparency and introduce biases. +* Scientific discoveries often remain behind paywalls, limiting access and slowing progress. + +DKG paranets provide a decentralized framework for knowledge governance and sharing, addressing these challenges while maintaining a scalable and flexible semantic data structure well suited to AI applications. + +### Who's who in a paranet? + +We distinguish several key roles in a DKG paranet: + +* **Knowledge miners** produce new, useful Knowledge Assets and publish them to the paranet **knowledge graph**. If a miner's Knowledge Asset is included in an incentivized paranet, they might be eligible for token rewards for their contribution. +* **Knowledge curators** "curate" the submitted Knowledge Assets and decide if they are to be included in the paranet knowledge graph. +* **Paranet operators** create and manage their paranets. +* **Knowledge consumers** query the paranet knowledge and use it for their benefit. +* [**IPO**](initial-paranet-offerings-ipos/) **voters** can support paranet growth through voting in Initial Paranet Offerings. +* Additionally, each paranet carries an associated **knowledge value** that represents the total amount of tokenized knowledge accumulated in the paranet (measured in TRAC). This value is used as a key multiplier for IPO incentives, which are implemented as a ratio. 
For example, a paranet operator may offer knowledge miners 20 NEURO tokens for each TRAC spent, as a reward for successfully mined Knowledge Assets. + +### Paranet structure + +Each DKG paranet has: + +* A **shared knowledge graph**, assembled from paranet Knowledge Assets published by knowledge miners and stored on the OriginTrail DKG. Depending on the paranet specifics, these Knowledge Assets conform to a set of paranet rules, such as containing knowledge about a particular topic, structuring data according to a defined ontology, etc. +* A **staging environment**, where Knowledge Assets are registered and reviewed by knowledge curators prior to inclusion in the paranet. +* **Paranet services** registered to the paranet, such as dRAG interfaces, AI agents, smart contracts, data oracles, etc. +* An **incentivization model** that specifies the rules under which growth activities in the paranet are rewarded, such as knowledge mining and paranet-specific AI services. The incentivization system may be kick-started through an Initial Paranet Offering (IPO). +* A **"home" blockchain** on which the paranet hosts its Knowledge Assets. + +### Some paranet use cases + +DKG paranets provide a structured, transparent knowledge-sharing system where value follows knowledge: + +* **High-performance AI agent memory**—AI agents can autonomously govern and curate their own knowledge-graph-based memory using paranets, either individually or as part of agentic swarms (see more under [ElizaOS agent](../../../to-be-repositioned/ai-agents/elizaos-dkg-agent.md)). +* **Open scientific research**—Researchers can publish findings openly while being directly rewarded without paywalls (learn more about such a paranet [here](https://www.youtube.com/watch?v=9O-DB4EftOk)). 
+* **Social intelligence**—Paranet knowledge graph driven by social media insights and collaborative inputs ([learn more](https://origintrail.io/blog/growing-the-buz-economy-announcing-the-social-intelligence-paranet-launch)) +* **AI training on open data**—AI models can train on decentralized, tokenized knowledge instead of closed, biased datasets. +* **Decentralized supply chain data**—Supply chain participants can contribute, verify, and access immutable records of product origins and movements, enhancing trust and reducing fraud. +* **Collaborative educational resources**—Educators and students can co-create knowledge repositories, ensuring open access to high-quality learning materials with verified provenance. +* **Decentralized journalism**—Independent journalists can publish reports that are verified and co-owned by a decentralized network, reducing misinformation and ensuring accountability. +* **Crowdsourced innovation**—Communities and organizations can jointly develop and maintain R\&D knowledge bases, allowing open collaboration while ensuring contributions are fairly recognized and rewarded. + +

_A paranet knowledge graph example_

+ +### Decentralized knowledge sharing for AI + +The characteristics of a paranet, including its Knowledge Asset parameters and how services are provisioned, are all defined by the **paranet operator**. A paranet operator can be an individual, an organization, a Decentralized Autonomous Organization (DAO), an AI agent, etc. Paranets together form the DKG, leveraging the common underlying network infrastructure. Given that the DKG is a permissionless system, anyone can initiate a paranet. + +Paranets provide a powerful substrate for AI systems. They leverage the network effects of verifiable inputs from multiple sources to produce accurate answers through Decentralized Retrieval-Augmented Generation (dRAG), allowing AI systems to gather information from the graph of public knowledge as well as from privately held knowledge in the relevant knowledge collections they have access to. + +{% hint style="info" %} +**TL;DR** + +**Paranets are the first-ever neutral, transparent knowledge-sharing layer where value follows knowledge:** + +* AI models can train on open, tokenized knowledge instead of closed, biased datasets. +* Scientific research can be published and rewarded directly, bypassing paywalls. +* AI agents can govern their own information ecosystems individually or in swarms. +{% endhint %} diff --git a/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/building-with-dkg-paranets.md b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/building-with-dkg-paranets.md new file mode 100644 index 0000000..5878fea --- /dev/null +++ b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/building-with-dkg-paranets.md @@ -0,0 +1,40 @@ +# Build with Paranets + +Paranets are like "virtual" knowledge graphs on the OriginTrail Decentralized Knowledge Graph (DKG). Building with them is quite similar to building on the DKG in general. 
However, paranets enable you to scope your operations and services to these "virtual" graphs, e.g., querying a specific paranet with SPARQL or adding a knowledge collection\* to a specific paranet. + +{% hint style="info" %} +\*A **knowledge collection (KC)** is a **collection of Knowledge Assets**. It refers to structured data that can be stored, shared, and validated within a distributed network. +{% endhint %} + +To gain access to the paranet knowledge graph, you can use one of the [public DKG nodes](../../../dkg-knowledge-hub/useful-resources/public-nodes.md), or deploy a [DKG node](../../../graveyard/everything/dkg-core-node/) and set it up to host the paranet (or "sync" it). More information is available on the [Sync a DKG Paranet](syncing-a-dkg-paranet.md) page. + +**A direct code example of paranets in use can be found here:** [**Paranet Demo**](https://github.com/OriginTrail/dkg.js/blob/v8/develop/examples/paranet-demo.js) + +### Querying paranets + +Once you have access to the paranet knowledge graph via a gateway node, you can use one of the [DKG SDKs](../dkg-sdk/) to interact with it. It is also possible to open up your triple store SPARQL endpoint directly and query the paranet knowledge graph in its own repository (the paranet repository name is equivalent to the paranet profile Knowledge Asset UAL, with dashes in place of slashes). + +Using SPARQL, it is possible to query and integrate knowledge from multiple paranets and the whole DKG in a single query using SPARQL federated queries. + +### Running paranet services + +Paranets enable registering and exposing both on-chain and off-chain services associated with them. A paranet service can be identified by all users of the paranet via its registry Knowledge Asset and can have multiple on-chain accounts associated with it, enabling it to engage in economic activity within the DKG. 
Examples of paranet services are AI agents (e.g., autonomous reasoners mining knowledge collections), chatbots (e.g., [Polkabot](https://polkabot.ai/)), oracle feeds, LLMs, dRAG APIs, etc. + +Paranet operators manage the services through the Paranet Services Registry smart contracts or the DKG SDK. + +### Paranet permissions + +There are three permission policies for a paranet: + +* Nodes access policy—defines which nodes can sync the _paranet_: + * OPEN—Any node can sync the _paranet_. + * PERMISSIONED—Only approved nodes can sync the _paranet_. +* Miners access policy—defines which knowledge miners can add knowledge to the _paranet_: + * OPEN—Any address can submit a Knowledge Asset to the _paranet_. + * PERMISSIONED—Only approved addresses can submit a Knowledge Asset to the _paranet_. +* Knowledge Asset submission access policy: + * OPEN—Any Knowledge Asset can be added to the _paranet_. + * STAGING—Knowledge miners first submit the Knowledge Asset to staging, where it is reviewed by curators chosen by the paranet owner. The curators can _approve_ a staged Knowledge Asset (automatically adding it to the paranet) or _deny_ it (in which case it is not added to the paranet). + + + diff --git a/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/deploying-a-dkg-paranet.md b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/deploying-a-dkg-paranet.md new file mode 100644 index 0000000..f8c3fe0 --- /dev/null +++ b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/deploying-a-dkg-paranet.md @@ -0,0 +1,101 @@ +--- +description: >- + A guide for developers and paranet operators on launching their paranet on the + DKG +--- + +# Deploy a Paranet + +### 1. Prepare for paranet deployment + +To successfully deploy a paranet, you will have to create a knowledge collection on the DKG and execute paranet registration transactions on the blockchain. 
This guide assumes you already have a clear purpose for your paranet and focuses only on the technical steps. + +{% hint style="info" %} +A _**Knowledge Asset**_ is an individual knowledge graph entity or a piece of data, while a _**knowledge collection**_ is a group of interconnected _Knowledge Assets_ that form a broader set of information. **Knowledge collections** enable the creation of multiple Knowledge Assets through one atomic operation. +{% endhint %} + +Below is the input you will need: + +* **Decide which blockchain to deploy your paranet on.** This is the blockchain on which knowledge mining will take place. All DKG-integrated blockchains can be used. However, initially, only the NeuroWeb and Base blockchains support [Initial Paranet Offerings (IPOs)](initial-paranet-offerings-ipos/). +* Pick a _paranet name_ and create a short _description_ (which will be stored on-chain). +* Decide what kind of permissions the _paranet_ will have. +* Prepare a _**paranet profile Knowledge Asset**_ to represent your paranet as its profile on the DKG. It can be as minimal or as rich in content as you'd like. + +### 2. Create your paranet profile on the DKG + +A _paranet profile_ is a Knowledge Asset that will uniquely identify your paranet and you as the paranet operator. As long as you own this Knowledge Asset, you will be able to manage paranet operator functions in the DKG paranet smart contracts. 
+ +An example paranet profile Knowledge Asset could look like this: + +```json +{ + "@context": "http://schema.org/", + "@id": "urn:some-data:info:catalog", + "@type": "DataCatalog", + "name": "Super Paranet", + "description": "This is the description of the super paranet!", + "keywords": "keyword1, keyword2, keyword3 ..." +} +``` + +A paranet Knowledge Asset UAL looks like this: + +``` +did:dkg:otp:2043/0x8f678eB0E57ee8A109B295710E23076fA3a443fe/1497611/125 +``` + +Once you create your paranet profile Knowledge Asset, save the **Knowledge Asset UALs** contained within the **knowledge collection UAL**, as you will need them for the next step. + +{% hint style="info" %} +As the **paranet profile Knowledge Asset** is an **NFT on-chain**, if you would like to change your operator key (wallet), all you need to do is transfer this NFT to your new address. +{% endhint %} + +### 3. Execute _registerParanet_ transaction on the blockchain + +You can use DKG.js or execute the paranet transaction directly on the smart contracts. + +{% hint style="info" %} +Here's a demo of paranets in action: [Paranet Demo](https://github.com/OriginTrail/dkg.js/blob/v8/develop/examples/paranet-demo.js). + +Check the usage of the command **DkgClient.paranet.create**. 
+{% endhint %} + +Here's a code snippet using dkg.js (from the example above): + +```javascript +// First, create a paranet knowledge collection +let content = { + public: { + "@context": "http://schema.org/", + "@id": "urn:some-data:info:catalog", + "@type": "DataCatalog", + "name": "Super Paranet", + "description": "This is the description of the super paranet!", + "keywords": "keyword1, keyword2, keyword3 ...", + }, +}; + +const paranetCollectionResult = await DkgClient.asset.create(content, { epochsNum: 2 }); + +// A paranet UAL is a Knowledge Asset UAL (a combination of the knowledge collection UAL and the Knowledge Asset token id) +const paranetUAL = `${paranetCollectionResult.UAL}/1`; +const paranetOptions = { + paranetName: 'MyParanet', + paranetDescription: 'This is my paranet on the DKG!', + paranetNodesAccessPolicy: PARANET_NODES_ACCESS_POLICY.OPEN, + paranetMinersAccessPolicy: PARANET_MINERS_ACCESS_POLICY.OPEN, + paranetKcSubmissionPolicy: PARANET_KC_SUBMISSION_POLICY.OPEN, +}; + +// Using the paranet Knowledge Asset, create your paranet +const paranetRegistered = await DkgClient.paranet.create(paranetUAL, paranetOptions); +``` + +**That's it, you have successfully performed a minimal paranet deployment,** and knowledge miners can now start mining knowledge via your paranet. + +To proceed, we recommend setting up a **DKG node** that will continuously sync your paranet knowledge graph. + +Additionally, you might want to consider [running an IPO](initial-paranet-offerings-ipos/) to incentivize knowledge miners. + +{% hint style="info" %} +If you have been running a paranet on the previous V6 version of the DKG, your paranet will not automatically update to the new system. If you need help updating, please contact the core developers in [Discord](https://discord.gg/xCaY7hvNwD) for assistance. 
+{% endhint %} diff --git a/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/initial-paranet-offerings-ipos/README.md b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/initial-paranet-offerings-ipos/README.md new file mode 100644 index 0000000..75843d3 --- /dev/null +++ b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/initial-paranet-offerings-ipos/README.md @@ -0,0 +1,32 @@ +# Initial Paranet Offerings (IPOs) + +Initial Paranet Offerings (IPOs) are introduced as **a means of publicly launching a paranet, with a paranet profile Knowledge Asset and an accompanying incentivization structure proposed and voted upon via the NeuroWeb governance mechanism**. + +Each IPO is structured as an initial proposal and an initial set of published **knowledge collections (collections of Knowledge Assets)**, along with an incentivization structure set forth by the paranet operator that proposes how the incentives will be split across three groups: + +1. IPO operator +2. Knowledge miners +3. Voters: NEURO holders who participated in supporting the creation of an IPO through NeuroWeb governance. + +{% hint style="info" %} +The paranets feature is initially rolled out on the NeuroWeb blockchain. Other DKG-enabled blockchains will follow. +{% endhint %} + +To launch your Initial Paranet Offering, you are expected to: + +1. **Share your initial AI paranet idea publicly** with the knowledge miner community (e.g., in the [Discord](https://discord.gg/3BrQDvHpdc) paranets channel) +2. **Specify your paranet** using the provided [IPO template](https://docs.google.com/document/d/1QzKpH_ex-U8mxh-IgwTjijEe3n6vwRVAhG599siapQQ/edit#heading=h.61lymw4v18qp) to prepare it for the NeuroWeb governance proposal. Request a custom Discord channel creation for your paranet via the [#paranets](https://discord.gg/wtC73bqj3c) channel. +3. Introduce your paranet topic, knowledge assets, and AI services to the community. 
To ensure the required community support, we recommend sharing your proposal widely: + * Sharing it on X + * Posting it on [Discord](https://discord.com/invite/qRc4xHpFnN) + * Sending it to [Telegram](https://t.me/origintrail) +4. Before launching the governance vote, [**register your paranet**](../building-with-dkg-paranets.md) and instantiate the _ParanetIncentivesPool_ and _ParanetIncentivesPoolStorage_ smart contracts via the _ParanetIncentivesPoolFactory_ contract. +5. **Launch the NeuroWeb Governance Proposal for your paranet**. General instructions for submitting governance proposals are available [here](https://docs.neuroweb.ai/on-chain-governance/submit-a-governance-proposal). +6. Once your paranet idea is supported (voted Aye by a majority of the NeuroWeb community), proceed with activating your paranet and knowledge mining. + + + +{% hint style="success" %} +Have any questions or feedback for this page? Hop into our [Discord channel](https://discord.com/invite/qRc4xHpFnN) and get in touch +{% endhint %} + diff --git a/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/initial-paranet-offerings-ipos/ipo-specification.md b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/initial-paranet-offerings-ipos/ipo-specification.md new file mode 100644 index 0000000..d294730 --- /dev/null +++ b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/initial-paranet-offerings-ipos/ipo-specification.md @@ -0,0 +1,61 @@ +--- +description: >- + What do you need to specify to launch an Initial Paranet Offering (IPO) and + incentivize the creation of your paranet knowledge graph +--- + +# IPO specification + + + +In this specification phase, the focus is on the concept for your paranet and how you plan to apply incentives to grow a vibrant community of knowledge miners. 
[Here is a template](https://docs.google.com/document/d/1QzKpH_ex-U8mxh-IgwTjijEe3n6vwRVAhG599siapQQ/edit#heading=h.61lymw4v18qp) that needs to be populated for your Initial Paranet Offering (IPO). Below, you will find detailed guidelines and additional information to assist you in filling out the template and structuring your IPO proposal effectively. + +### Paranet context and problem + +Start by clearly outlining the purpose and objectives of your paranet. Try to answer the following questions: + +* What problem does it aim to solve? +* What kind of knowledge collections\* and services will it offer? +* What type (if any) of specialized tools will knowledge miners need to run? +* Who are the expected users? +* How will they interact with the paranet? + +### Paranet diagram + +A visual representation makes your paranet structure easier to understand and communicate. Create a block scheme, architecture diagram, or similar visual representation of your paranet structure, showcasing how various services, knowledge collections\*, and stakeholders interact within the paranet ecosystem. + +{% hint style="info" %} +**\*A knowledge collection (KC)** is a **collection of Knowledge Assets.** It refers to structured data that can be stored, shared, and validated within a distributed network. +{% endhint %} + +### Paranet AI services + +As a paranet operator, you may run many different services in association with your paranet knowledge collections, such as AI services (LLMs, agents, etc.), data feeds, user or machine-to-machine interfaces, etc. An example of a service would be an AI question-answering system, which understands natural language questions and extracts answers from the DKG (using dRAG). Describe your services with as much detail as possible. 
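As a sketch of what such a description could look like, a paranet AI service can itself be expressed as a JSON-LD document, mirroring the paranet profile example earlier in these docs (the identifier and field values below are hypothetical):

```json
{
    "@context": "http://schema.org/",
    "@id": "urn:super-paranet:service:question-answering",
    "@type": "Service",
    "name": "Super Paranet Q&A service",
    "description": "Answers natural-language questions by extracting answers from the paranet knowledge graph using dRAG",
    "serviceType": "AI question answering"
}
```

Keeping the service description in the same schema.org vocabulary as your knowledge collections makes it queryable alongside them.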
+ +### Paranet knowledge collections + +To best leverage the neuro-symbolic nature of the DKG, we recommend applying a rich schema to your paranet knowledge collections. If your knowledge is organized according to an ontology, capabilities such as knowledge graph reasoning become possible, and knowledge querying (through SPARQL) becomes more efficient. We recommend picking at least one established ontology that applies to the knowledge domain of your paranet. Learn more about ontologies [here](https://www.cs.ox.ac.uk/people/ian.horrocks/Publications/download/2008/Horr08a.pdf). + +### NEURO emissions + +NEURO emissions are requested by the paranet operator from NeuroWeb governance to fuel the game-theoretic system the operator has designed to grow the knowledge in their paranet and achieve its desired objectives. + +Paranet operators propose how the incentives will be split across three groups: + +* Paranet operator running the AI services +* Knowledge miners contributing knowledge to the paranet +* NEURO holders who participated in supporting the creation of an IPO through governance + +The success of an IPO largely depends on the paranet operator's ability to wisely propose the incentive structure, taking into consideration, among other things, the following factors: + +* **Knowledge miners** who mine knowledge collections on the DKG using TRAC utility tokens are central to the success of a paranet. Their role is also critical for distributing the NEURO emissions among the three groups, as such distribution only occurs as new knowledge is mined. When launching an IPO, the paranet operator defines the **ratio of NEURO to be earned per TRAC spent to mine** each knowledge collection. An IPO operator may set the ratio autonomously to target a desired profitability before the proposal is submitted to voting, yet attempts at price gouging might not receive support from NEURO holders. 
+* The **paranet operator** defines AI services that the operator will make available as a part of the paranet. For running the AI services and supporting the paranet, the paranet operator can set a percentage of the emissions as a **paranet operator fee**. +* **NEURO holders** who support an IPO via governance voting lock up their tokens for the duration of the NEURO emissions allocated to the IPO. Though the **share of emissions allocated** to an IPO is an important factor in NEURO holders’ decision, the **duration of the “lock period”** can also play an important role. The paranet operator also defines what portion of paranet incentives will be shared with NEURO holders supporting the proposal. + +### Marketing plan + +Detail your marketing strategy for promoting and attracting knowledge miners to your paranet. Include channels, tactics, timelines, and any collaborations or partnerships planned for marketing purposes. + +{% hint style="success" %} +Have any questions or feedback for this page? Hop into our [Discord channel](https://discord.com/invite/qRc4xHpFnN) and get in touch +{% endhint %} diff --git a/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/initial-paranet-offerings-ipos/ipo-voting.md b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/initial-paranet-offerings-ipos/ipo-voting.md new file mode 100644 index 0000000..b25602d --- /dev/null +++ b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/initial-paranet-offerings-ipos/ipo-voting.md @@ -0,0 +1,3 @@ +# IPO voting + +To participate in IPO voting, cast your vote through NeuroWeb's governance system using the Polkadot-JS interface. [This guide](https://docs.neuroweb.ai/on-chain-governance/voting-on-a-referendum) provides a step-by-step explanation of the voting process and how to use the on-chain governance mechanism. 
diff --git a/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/initial-paranet-offerings-ipos/launching-your-ipo.md b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/initial-paranet-offerings-ipos/launching-your-ipo.md new file mode 100644 index 0000000..2c5d7b1 --- /dev/null +++ b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/initial-paranet-offerings-ipos/launching-your-ipo.md @@ -0,0 +1,40 @@ +# Launch an IPO + +This page assumes you have completed the previous steps: + +* You have deployed your paranet +* You have specified your IPO (defined all the properties, including incentives, knowledge graph properties, etc.) + +Once these steps are complete, you can initiate the IPO. + +### 1. Deploy a paranet incentives contract + +The paranet incentives contract is the only type of contract that can receive NEURO incentives, as it implements the incentivization logic. Other addresses, such as EOAs, will not be accepted and are not eligible for incentives. + +To deploy your paranet incentives contract, either use the Paranet User Interface (coming soon) or execute the transaction directly on the contract. + +### 2. Publish an IPO specification to the GitHub repo + +Describe your paranet on the [Paranets GitHub repo](https://github.com/OriginTrail/dkg-paranets), providing as many details as possible. We recommend engaging with the community through other channels and social media to garner support for your IPO. + +### 3. Launch a governance proposal on NeuroWeb + +For a successful governance vote, your NeuroWeb governance proposal should initiate a _forceTransfer_ extrinsic towards the _ParanetIncentivesPoolStorage_ contract address on NeuroWeb. + +More on the mechanics of NeuroWeb governance proposals is available in the [NeuroWeb docs](https://docs.neuroweb.ai/on-chain-governance/submit-a-governance-proposal). + +### 4. 
Incentives are deployed to the paranet incentives contract + +If your proposal is voted in by the NeuroWeb governance voters, the requested funds will be dispatched to the paranet incentives storage contract. Once the contract is funded, the incentives can be claimed by knowledge miners, the operator, and voters in the specified percentages. Incentives only become claimable, however, as knowledge mining occurs. + +In the current version, IPO incentive emissions are directly correlated with the amount of TRAC tokens spent for publishing to the DKG through knowledge mining (if no knowledge is mined, no incentives can be claimed). + +Upcoming versions will make the emission mechanics even more configurable, as we expect paranet and IPO innovation to accelerate with the first IPOs. If you have ideas on how IPOs can be improved, come and share them in [Discord](https://discord.com/invite/qRc4xHpFnN)! + +{% hint style="info" %} +If you're interested in deploying a **paranets incentive pool**, you can find more details and guidelines at this [link](paranets-incentives-pool.md). +{% endhint %} + +{% hint style="success" %} +Have any questions or feedback for this page? Hop into our [Discord channel](https://discord.com/invite/qRc4xHpFnN) and get in touch +{% endhint %} diff --git a/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/initial-paranet-offerings-ipos/paranets-incentives-pool.md b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/initial-paranet-offerings-ipos/paranets-incentives-pool.md new file mode 100644 index 0000000..65cd15c --- /dev/null +++ b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/initial-paranet-offerings-ipos/paranets-incentives-pool.md @@ -0,0 +1,21 @@ +# Incentives pool + +The **incentives pool** serves to encourage key participants in the paranet ecosystem to perform essential tasks that support its operation. 
Here's a breakdown of who receives rewards and for what: + +1. **Knowledge miners**: + * Receive rewards for **publishing valuable data and knowledge** to the paranet. + * Their contributions help grow and maintain the paranet's database and ensure it remains relevant. +2. **Voters**: + * Receive rewards for **supporting or voting on proposals related** to the paranet. + * Their participation ensures that decisions regarding the paranet's direction are made democratically and align with the community's interests. +3. **Operators** (paranet creators and maintainers): + * Paranet operators who **create and maintain the paranet** are rewarded for ensuring the paranet runs smoothly. + * They are responsible for overseeing its operations, managing resources, and ensuring its success. + +{% hint style="info" %} +As a **paranet operator**, funding the pool helps ensure that all key participants—miners, voters, and operators—are incentivized to contribute to the paranet’s success. +{% endhint %} + +{% hint style="success" %} +Have any questions or feedback for this page? Hop into our [Discord channel](https://discord.com/invite/qRc4xHpFnN) and get in touch +{% endhint %} diff --git a/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/syncing-a-dkg-paranet.md b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/syncing-a-dkg-paranet.md new file mode 100644 index 0000000..7c42096 --- /dev/null +++ b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/syncing-a-dkg-paranet.md @@ -0,0 +1,45 @@ +# Sync a Paranet + +To interact with a specific DKG paranet's knowledge graph using your OriginTrail node, you need to configure your node to synchronize the paranet's knowledge collections. This setup can be achieved by modifying your node's configuration file to include the paranet UAL. 
+ +If you have not yet set up your node or need guidance on configuring a DKG Core Node, please refer to the [DKG Core Node](../../../graveyard/everything/dkg-core-node/) guide. + +To enable your node to sync with a paranet, add the `assetSync` object to your node’s `.origintrail_noderc` file. Below is an example of how to configure this (make sure to replace the UAL in the example below): + +```json +"assetSync": { + "syncParanets": ["did:dkg:hardhat2:31337/0x8aafc28174bb6c3bdc7be92f18c2f134e876c05e/1/5"] +} +``` + +Once `.origintrail_noderc` is updated, it should look something like this: +
+```json
+...
+    "auth": {
+        "ipWhitelist": [
+            "::1",
+            "127.0.0.1"
+        ]
+    },
+    "assetSync": {
+        "syncParanets": ["did:dkg:hardhat2:31337/0x8aafc28.../1/5"]
+    }
+}
+```
+
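Before restarting the node, it can help to sanity-check that each entry in `syncParanets` has the expected UAL shape. A minimal sketch follows; the regex and helper are our own illustration, not part of ot-node, which performs its own validation:

```javascript
// Rough shape check for a paranet UAL:
// did:dkg:<blockchain>/<contract address>/<knowledge collection id>/<token id>
const PARANET_UAL_PATTERN = /^did:dkg:[^/]+\/0x[0-9a-fA-F]+\/\d+\/\d+$/;

function looksLikeParanetUAL(ual) {
    return PARANET_UAL_PATTERN.test(ual);
}

console.log(looksLikeParanetUAL(
    'did:dkg:hardhat2:31337/0x8aafc28174bb6c3bdc7be92f18c2f134e876c05e/1/5',
)); // -> true
```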
+ +After you have updated the `.origintrail_noderc` file, save your changes, and restart your OriginTrail node. This ensures that the new settings take effect and your node starts syncing data from the specified paranet. + +Your node will start syncing when you see the following log: + +``` +Paranet sync: Starting paranet sync for paranet: did:dkg:hardhat2:31337/0x8aafc28174bb6c3bdc7be92f18c2f134e876c05e/1/5 +``` + +The paranet is fully synced when you see the following log: + +``` +Paranet sync: KA count from contract and in DB is the same, nothing new to sync, for paranet: did:dkg:hardhat2:31337/0x8aafc28174bb6c3bdc7be92f18c2f134e876c05e/1/5 +``` + +Interacting with the paranet knowledge graph through your node is explained on [this](building-with-dkg-paranets.md) page. + diff --git a/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/README.md b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/README.md new file mode 100644 index 0000000..04dd154 --- /dev/null +++ b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/README.md @@ -0,0 +1,52 @@ +--- +description: Decentralized Knowledge Graph V8 client +--- + +# SDKs + +{% embed url="https://youtu.be/4oi_0hJmxcY" %} +OriginTrail dev tutorial: SDK walkthrough +{% endembed %} + +The OriginTrail SDKs are client libraries that enable your applications to interact with the OriginTrail Decentralized Knowledge Graph (DKG). + +From an architectural standpoint, the SDK libraries are application interfaces into the DKG. They enable you to create and manage **Knowledge Assets** through your apps and perform network queries (such as search or SPARQL queries), as illustrated below. + +

_The interplay between your app, the DKG, and blockchains_

+ + + +The OriginTrail SDK currently comes in two forms: + +* Javascript SDK - [**dkg.js**](dkg-v8-js-client/) +* Python SDK - [**dkg.py**](dkg-v8-py-client/) + + + +### Try out the SDK + +You can try out the SDK in two different ways: + +#### 1. Using Public DKG Nodes + +Try the SDK with public DKG nodes by following the [Quickstart: Test Drive the DKG in 5 Minutes](https://docs.origintrail.io/build-with-dkg/quickstart-test-drive-the-dkg-in-5-mins) guide. + +#### 2. Development environment setup + +Set up a development environment using one of the following options: + +* **Deploy your node on the DKG testnet (recommended):**\ + This option allows you to quickly experiment with the SDK on a testnet of your choice.\ + Follow the [Edge Node Deployment Guide](https://docs.origintrail.io/build-with-dkg/dkg-edge-node/setup-your-edge-node-development-environment) for setup instructions. +* **Deploy your node on a local DKG network:**\ + Use this option to set up a fully localized development environment by following the [Development environment setup guide](setting-up-your-development-environment.md). + + + +SDKs for other programming languages would be welcome contributions to the project. The core development team is also considering including them in the roadmap. + +{% hint style="info" %} +Interested in building a DKG SDK in a particular programming language? We'd love to support you. + +Create an [issue](https://github.com/OriginTrail/ot-node/issues) on our GitHub, and let's get the conversation started! 
+{% endhint %} diff --git a/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/dkg-v8-js-client/README.md b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/dkg-v8-js-client/README.md new file mode 100644 index 0000000..8cd6f05 --- /dev/null +++ b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/dkg-v8-js-client/README.md @@ -0,0 +1,408 @@ +--- +description: Javascript library for the Decentralized Knowledge Graph. +--- + +# DKG Javascript SDK (dkg.js) + +If you are looking to build applications leveraging [Knowledge Assets](./#create-a-knowledge-asset) on the OriginTrail Decentralized Knowledge Graph (DKG), the dkg.js SDK library is the best place to start! + +The DKG SDK is used together with an **OriginTrail gateway node** to build applications that interface with the OriginTrail DKG (the node is a dependency). Therefore, to use the SDK, you either need to run a gateway node in [your local environment](../setting-up-your-development-environment.md) or use a [hosted OT-node](../../../../graveyard/everything/dkg-core-node/run-a-v8-core-node-on-testnet/). + +## Prerequisites + +* node ≥ 20.0.0 +* npm ≥ 8.0.0 + +## Installation + +The library can be used either in the browser or in a NodeJS application. + +### Using dkg.js in the browser + +Use the prebuilt `dist/dkg.min.js`, or build the file on your own using the [dkg.js](https://github.com/OriginTrail/dkg.js) repository: + +``` +npm run build +``` + +Then include `dist/dkg.min.js` in your HTML file. This will expose `DKG` on the window object: + +```html +<script src="dist/dkg.min.js"></script> +``` + +{% hint style="info" %} +Make sure to also include the **web3.js library**, as it is a dependency for dkg.js. +{% endhint %} + +### Using dkg.js in NodeJS apps + +Run the following command to install the dependency from the [NPM](https://www.npmjs.com/package/dkg.js) registry: + +```bash +npm install dkg.js@latest +``` + +Then, include `dkg.js` in your project files. 
This will expose the `DKG` object: + +```javascript +const DKG = require('dkg.js'); +``` + +## :snowboarder: Quickstart + +{% embed url="https://youtu.be/4oi_0hJmxcY?si=SD7GUy35mtovBiPW" %} +OriginTrail dev tutorial: SDK walkthrough +{% endembed %} + +To use the DKG library, you need to connect to a running local or remote OT-node. + +```javascript +const dkg = new DKG({ + environment: ENVIRONMENTS.DEVELOPMENT, // or devnet, testnet, mainnet + endpoint: 'http://localhost', // gateway node URI + port: 8900, + blockchain: { + name: BLOCKCHAIN_IDS.HARDHAT_1, // or any other blockchain id + publicKey: PUBLIC_KEY, // not required in browser, metamask used instead + privateKey: PRIVATE_KEY, // not required in browser, metamask used instead + }, +}); + +const nodeInfo = await dkg.node.info(); +// if successfully connected, this will return an object indicating the node version +// { 'version': '8.X.X' } +``` + +The system supports multiple blockchain networks, which can be configured using the BLOCKCHAIN\_IDS constants. You can select the desired blockchain by specifying the corresponding constant. The available options are: + +**DKG mainnet options:** + +* Base: base:8453 +* Gnosis: gnosis:100 +* Neuroweb: otp:2043 + +**DKG testnet options:** + +* Base: base:84532 +* Gnosis: gnosis:10200 +* Neuroweb: otp:20430 + +**DKG devnet options:** + +* Base: base:84532 +* Gnosis: gnosis:10200 +* Neuroweb: otp:2160 + +**Local options:** + +* Hardhat1: hardhat1:31337 +* Hardhat2: hardhat2:31337 + +The system uses default publicly available RPCs for each chain. However, because these RPCs are shared by many users, they can become overloaded, leading to errors such as failures when creating a KA. To avoid this, we recommend using your own RPC if possible. You can set a custom RPC by passing `rpc: RPC_URL` in the blockchain options. + +## Create a Knowledge Asset + +In this example, let’s create a Knowledge Asset representing a city. 
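To illustrate the custom RPC option, here is a sketch of a blockchain configuration object; the RPC URL and keys below are placeholders, so substitute your own provider endpoint and wallet:

```javascript
// Blockchain options for the DKG client, with a custom RPC endpoint.
// All values below are placeholders for illustration only.
const blockchainOptions = {
    name: 'base:84532',                // DKG testnet on Base
    publicKey: '0xYourPublicKey',      // placeholder wallet public key
    privateKey: '0xYourPrivateKey',    // placeholder wallet private key
    rpc: 'https://my-rpc.example.com', // custom RPC endpoint instead of the default public one
};

// Passed to the client exactly as in the snippet above:
// const dkg = new DKG({ environment: 'testnet', endpoint: 'http://localhost', port: 8900, blockchain: blockchainOptions });
console.log(blockchainOptions.rpc);
```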
The content contains both public and private assertions. Public assertions will be exposed publicly (replicated to other nodes), while private ones won't (they stay only on the node you published to). + +If you have access to the node that holds the data, searching for it using get or query will return both public and private assertions. + +```javascript +const content = { + public: { + '@context': 'http://schema.org', + '@id': 'https://en.wikipedia.org/wiki/New_York_City', + '@type': 'City', + name: 'New York', + state: 'New York', + population: '8,336,817', + area: '468.9 sq mi', + }, + private: { + '@context': 'http://schema.org', + '@id': 'https://en.wikipedia.org/wiki/New_York_City', + '@type': 'CityPrivateData', + crimeRate: 'Low', + averageIncome: '$63,998', + infrastructureScore: '8.5', + relatedCities: [ + { '@id': 'urn:us-cities:info:los-angeles', name: 'Los Angeles' }, + { '@id': 'urn:us-cities:info:chicago', name: 'Chicago' }, + ], + }, +} + +``` + +When you create the Knowledge Asset, the above JSON-LD object will be converted into an **assertion**. When an assertion with public data is prepared, we can create a Knowledge Asset on the DKG. `epochsNum` specifies how many epochs the asset should be kept for (an epoch is equal to one month). + +```javascript +const result = await DkgClient.asset.create(content, { + epochsNum: 6 +}); + +console.log(result); +``` + +The complete response of the method will look like this: + +```javascript +{ + "UAL": "did:dkg:base:84532/0xd5550173b0f7b8766ab2770e4ba86caf714a5af5/10310", + "datasetRoot": "0x09d732838cb1e4ff56a080d58d2b50fd8383ef66c783655a80cd7522b80b53df", + "signatures": [ + ... 
+ ], + "operation": { + "mintKnowledgeAsset": { + "blockHash": "0x729fbb3bb2852dbc51a6996ae03aed27cebb987b51ec8a3da65e642749b70b74", + "blockNumber": 20541620, + "contractAddress": null, + "cumulativeGasUsed": 1680639, + "effectiveGasPrice": 1026844, + "from": "0x0e1405add312d97d1a0a4faa134c7113488d6cea", + "gasUsed": 530457, + "l1BaseFeeScalar": "0x44d", + "l1BlobBaseFee": "0x20fbab8dde", + "l1BlobBaseFeeScalar": "0xa118b", + "l1Fee": "0x6b3eb30359ac", + "l1GasPrice": "0x411f05c9f", + "l1GasUsed": "0x4e95", + "logs": [ + ... + ], + "logsBloom": "0x00000200000000000000000000000000000000000000000000000000400000000000800000000000000000000000000000000000000000000000000000240000000000800000000000000008000000000000000000042800004000000000000008000000020000000000000100000808000000004800040010040010000000004000000800000020001000000000000000000002000000000000010000000000020000000000000000000a000000000000000000000000000820000020000011000000020020000000000000000000000004000084410000000000000000e0000010000000000000020000000000000000000000000000000000000802000000", + "status": true, + "to": "0x46121121f78f8351da4526813fbfbffd044dec6c", + "transactionHash": "0x1a9f6b954c2149fb03d6adec21ca7b0829a4d84b3bc93fad62291fcbeb74aace", + "transactionIndex": 10, + "type": "0x0" + }, + "publish": { + "operationId": "6f6a0960-e577-43ef-88c9-04e0314347c5", + "status": "PUBLISH_REPLICATE_END" + }, + "finality": { "status": "FINALIZED" }, + "numberOfConfirmations": 3, + "requiredConfirmations": 3 + } +} +``` + +If you want to create multiple different assets, you can increase your allowance. Then, each time you initiate a publish, the step of calling the blockchain to increase your allowance will be skipped, resulting in a faster publishing time. 
+ +```javascript +await dkg.asset.increaseAllowance('1569429592284014000'); + +const result = await DkgClient.asset.create(content, { + epochsNum: 6 +}); +``` + +After you've finished publishing data to the blockchain, you can decrease your allowance to revoke the authorization given to the contract to spend your tokens. If you want to revoke all remaining authorization, it's a good practice to pass the same value that you used for increasing your allowance. + +```javascript +await dkg.asset.decreaseAllowance('1569429592284014000'); +``` + +## Read Knowledge Asset data from the DKG + +To read Knowledge Asset data from the DKG, we utilize the **get** protocol operation. + +In this example, we will get the latest state of the Knowledge Asset we published previously: + +```javascript +const { UAL } = result; + +const getAssetResult = await dkg.asset.get(UAL); + +console.log(JSON.stringify(getAssetResult, null, 2)); +``` + +The response of the get operation will be the assertion graph: + +```javascript +{ + "assertion": [ + { + "@id": "https://ontology.origintrail.io/dkg/1.0#metadata-hash:0x5cb6421dd41c7a62a84c223779303919e7293753d8a1f6f49da2e598013fe652", + "https://ontology.origintrail.io/dkg/1.0#representsPrivateResource": [ + { + "@id": "uuid:e8edba11-5c95-4b02-941b-b662220c4be8" + } + ] + }, + { + "@id": "https://ontology.origintrail.io/dkg/1.0#metadata-hash:0x6a2292b30c844d2f8f2910bf11770496a3a79d5a6726d1b2fd3ddd18e09b5850", + "https://ontology.origintrail.io/dkg/1.0#representsPrivateResource": [ + { + "@id": "uuid:51e3c267-096f-4ff9-b963-bd48f2b0a210" + } + ] + }, + { + "@id": "https://ontology.origintrail.io/dkg/1.0#metadata-hash:0xc1f682b783b1b93c9d5386eb1730c9647cf4b55925ec24f5e949e7457ba7bfac", + "https://ontology.origintrail.io/dkg/1.0#representsPrivateResource": [ + { + "@id": "uuid:4b6065d0-7ed3-4b37-af44-7e202510fd44" + } + ] + }, + { + "@id": "urn:us-cities:data:new-york", + "http://schema.org/averageIncome": [ + { + "@value": "$63,998" + } + ], + 
"http://schema.org/crimeRate": [ + { + "@value": "Low" + } + ], + "http://schema.org/infrastructureScore": [ + { + "@value": "8.5" + } + ], + "http://schema.org/relatedCities": [ + { + "@id": "urn:us-cities:info:chicago" + }, + { + "@id": "urn:us-cities:info:los-angeles" + } + ], + "@type": [ + "http://schema.org/CityPrivateData" + ] + }, + { + "@id": "urn:us-cities:info:chicago", + "http://schema.org/name": [ + { + "@value": "Chicago" + } + ] + }, + { + "@id": "urn:us-cities:info:los-angeles", + "http://schema.org/name": [ + { + "@value": "Los Angeles" + } + ] + }, + { + "@id": "https://en.wikipedia.org/wiki/New_York_City", + "http://schema.org/name": [ + { + "@value": "New York" + } + ], + "http://schema.org/state": [ + { + "@value": "New York" + } + ], + "http://schema.org/area": [ + { + "@value": "468.9 sq mi" + } + ], + "http://schema.org/population": [ + { + "@value": "8,336,817" + } + ], + "@type": [ + "http://schema.org/City" + ] + }, + { + "@id": "uuid:bb6841b5-ff3d-4c91-aed6-4d2e7726b5a9", + "https://ontology.origintrail.io/dkg/1.0#privateMerkleRoot": [ + { + "@value": "0xaac2a420672a1eb77506c544ff01beed2be58c0ee3576fe037c846f97481cefd" + } + ] + } + ], + "operation": { + "get": { + "operationId": "171045d8-3adc-4d23-832e-6c1c9cb1d612", + "status": "COMPLETED" + } + } +} + +``` + +## Querying Knowledge Asset data with SPARQL + +Querying the DKG is done by using the SPARQL query language, which is very similar to SQL applied to graph data. + +_(If you have SQL experience, SPARQL should be relatively easy to get started with. 
More information_ [_can be found here_](https://www.w3.org/TR/rdf-sparql-query/)_)._ + +Let’s write a simple query to select all subjects and objects in the graph that have the Schema.org **state** property: + +```javascript +const result = await dkg.graph.query( + `prefix schema: <http://schema.org/> + select ?s ?stateName + where { + ?s schema:state ?stateName + }`, + 'SELECT', +); + +console.log(JSON.stringify(result, null, 2)); +``` + +The response will contain the matching variable bindings: + +
+```javascript
+{
+  "status": "COMPLETED",
+  "data": [
+    {
+        s: 'https://en.wikipedia.org/wiki/New_York_City',
+        stateName: '"New York"'
+    }
+  ]
+}
+```
+
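As a further illustration, the query string below adds a FILTER clause. It is only a syntax sketch; the property and filter value are our own, and the actual results depend on the data held by the node you query:

```javascript
// Sketch of a filtered SPARQL query, reusing the dkg client from above.
const populationQuery = `prefix schema: <http://schema.org/>
select ?city ?population
where {
    ?city schema:population ?population .
    filter(contains(?population, "8,"))
}`;

// const filtered = await dkg.graph.query(populationQuery, 'SELECT');
console.log(populationQuery.includes('filter')); // -> true
```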
+ +As the OriginTrail node leverages a fully fledged graph database (a triple store supporting RDF), you can run arbitrary SPARQL queries on it. + +To learn more about querying the DKG, go [here](../../querying-the-dkg.md). + +## **More on types of interaction with the DKG SDK** + +Operations performed by the SDK fall into three types: + +* Node API request +* Smart contract call (non-state-changing interaction) +* Smart contract transaction (state-changing interaction) + +Non-state-changing interactions with smart contracts are free and can be described as contract getters: they don’t require transactions on the blockchain and therefore do not incur transaction fees. + +Smart contract transactions are state-changing operations. This means they change the state of the smart contract memory, which requires some blockchain-native gas tokens (such as ETH, NEURO, etc.). + +To perform state-changing operations, you need to use a wallet funded with gas tokens. + +For the Hardhat blockchain, you can use the default keys from the example below: + +```javascript +const PRIVATE_KEY="0xac0974bec39a17e36ba4a6b4d238ff944bacb478cbed5efcae784d7bf4f2ff80" +const PUBLIC_KEY="0xf39Fd6e51aad88F6F4ce6aB8827279cffFb92266" +``` + +{% hint style="warning" %} +The default keys above should not be used anywhere except in a local environment for development. 
+{% endhint %} diff --git a/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/dkg-v8-js-client/interact-with-dkg-paranets.md b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/dkg-v8-js-client/interact-with-dkg-paranets.md new file mode 100644 index 0000000..9ddcc6c --- /dev/null +++ b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/dkg-v8-js-client/interact-with-dkg-paranets.md @@ -0,0 +1,265 @@ +--- +cover: ../../../../.gitbook/assets/dkg-js-banner.jpg +coverY: 0 +--- + +# Interact with DKG paranets + +The DKG JavaScript SDK provides functionality for interacting with paranets on the OriginTrail Decentralized Knowledge Graph (DKG). This section of the SDK allows developers to create, manage, and utilize paranets effectively. + +## Setup and installation + +To interact with paranets, you need to connect to a running OriginTrail node (either local or remote) and ensure you have the dkg.js SDK installed and properly configured. + +Follow the general setup instructions for [installing dkg.js](./) and read more about paranets [in the following section](../../dkg-paranets/). + +### Creating a paranet + +Before creating a paranet, you must first create a knowledge collection (KC) on the DKG and choose a Knowledge Asset (KA) from that KC that will represent the paranet. To create a knowledge collection on the DKG, refer to [the following page](./). + +{% hint style="info" %} +**Knowledge collection (KC)** is a **collection of Knowledge Assets (KA).** It refers to structured data that can be stored, shared, and validated within a distributed network. +{% endhint %} + +Once the knowledge collection is created, you can choose which KA from that KC will represent a paranet. KA will have a unique identifier known as a Universal Asset Locator (UAL). You will use this UAL to create a paranet. The paranet creation process essentially links the paranet to the Knowledge Asset, establishing it on the blockchain. 
This on-chain representation allows for decentralized management and interaction with the paranet. + +Here is an example of how to create a new paranet using the `create` function from the paranet module. This function requires the UAL of the previously created Knowledge Asset, along with other details such as the paranet's name and description: + +```javascript +// The policy constants below are provided with dkg.js (see the paranet demo examples) +const kcUAL = 'did:dkg:hardhat1:31337/0x791ee543738b997b7a125bc849005b62afd35578/1'; +const kaUAL = `${kcUAL}/1`; +await dkg.paranet.create(kaUAL, { + paranetName: 'AiParanet', + paranetDescription: 'AI agents paranet for demonstration purposes.', + paranetNodesAccessPolicy: PARANET_NODES_ACCESS_POLICY.OPEN, + paranetMinersAccessPolicy: PARANET_MINERS_ACCESS_POLICY.OPEN, + paranetKcSubmissionPolicy: PARANET_KC_SUBMISSION_POLICY.PERMISSIONED, +}); +``` + +In this example: + +* `kaUAL` is the unique identifier of the Knowledge Asset created on the DKG. +* `paranetName` is the name you want to give to your paranet. It should be descriptive enough to indicate the paranet's purpose or focus. +* `paranetDescription` provides additional context about the paranet, explaining its purpose and the types of knowledge collections or services it will involve. +* `paranetNodesAccessPolicy` defines a paranet's policy towards including nodes. If OPEN, any node can be a part of the paranet. +* `paranetMinersAccessPolicy` defines a paranet's policy towards including knowledge miners. If OPEN, anyone can publish to a paranet. +* `paranetKcSubmissionPolicy` defines a paranet's policy regarding which KCs can be added and who can add new collections of Knowledge Assets. If OPEN, anyone can submit Knowledge Collections to the paranet. To learn more about curation, [read here](knowledge-submission-and-curation.md). + +After the paranet is successfully created, the paranet UAL can be used to interact with the paranet. This includes deploying services within the paranet, managing incentives, and claiming rewards associated with the paranet's operations. 
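Since a KA UAL is simply the KC UAL extended with the asset's index, a small hypothetical helper (illustrative, not part of dkg.js) can derive it:

```javascript
// Derive a Knowledge Asset UAL from its Knowledge Collection UAL and a
// 1-based asset index. UALs look like did:dkg:<chain>/<contract>/<kcId>/<kaId>.
function knowledgeAssetUAL(kcUAL, assetIndex) {
  if (!kcUAL.startsWith('did:dkg:')) {
    throw new Error(`Not a UAL: ${kcUAL}`);
  }
  return `${kcUAL}/${assetIndex}`;
}

const kcUAL = 'did:dkg:hardhat1:31337/0x791ee543738b997b7a125bc849005b62afd35578/1';
console.log(knowledgeAssetUAL(kcUAL, 1));
// did:dkg:hardhat1:31337/0x791ee543738b997b7a125bc849005b62afd35578/1/1
```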
+ +### Adding services to a paranet + +Enhance the capabilities of your paranet by integrating new services. The `addServices` function allows you to add both on-chain and off-chain services to your paranet. These services can range from AI agents and data oracles to decentralized knowledge interfaces and more. + +Before adding services, you first need to create them using the `createService` function. Services added to a paranet can either have on-chain addresses, representing smart contracts or other on-chain entities, or they can be off-chain services, which do not have associated blockchain addresses. + +Each service can be identified by all paranet users via its registry Knowledge Asset and can include multiple on-chain accounts under its control. This enables services to participate in economic activities within the DKG. + +```javascript +const paranetUAL = 'did:dkg:hardhat1:31337/0x791ee543738b997b7a125bc849005b62afd35578/1/1'; +const serviceUAL = 'did:dkg:hardhat1:31337/0x791ee543738b997b7a125bc849005b62afd35578/2/1'; +await dkg.paranet.createService(serviceUAL, { + paranetServiceName: 'MyAiService', + paranetServiceDescription: 'Autonomous AI service for AI paranet', + paranetServiceAddresses: ['0xb3155543738b997b7a1a5bc849005bc2afd35578', '0x2375e543738b997b7a125bc849005b62afd35571'], +}); + +const serviceUALs = [serviceUAL]; +await dkg.paranet.addServices(paranetUAL, serviceUALs); +``` + +In this example: + +* `paranetServiceName` specifies the name of the service. +* `paranetServiceDescription` provides a brief description of what the service does. +* `paranetServiceAddresses` lists blockchain addresses associated with the service. For off-chain services, this field can be left empty. +* `serviceUALs` is an array of UALs that are used to register services you want to add to your Paranet. 
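Since `paranetServiceAddresses` must contain valid EVM addresses, a quick format check before calling `createService` can catch typos early (a hypothetical helper, not part of dkg.js):

```javascript
// Minimal EVM address shape check: 0x followed by 40 hex characters.
// Note this does not verify the EIP-55 checksum.
function isEvmAddress(addr) {
  return /^0x[0-9a-fA-F]{40}$/.test(addr);
}

const paranetServiceAddresses = [
  '0xb3155543738b997b7a1a5bc849005bc2afd35578',
  '0x2375e543738b997b7a125bc849005b62afd35571',
];

console.log(paranetServiceAddresses.every(isEvmAddress)); // true
```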
+ +By integrating and managing services, paranet operators can expand the capabilities of their paranet, providing a robust infrastructure for decentralized applications and AI-driven services. + +## Knowledge mining for open paranets + +Paranets allow users to leverage collective intelligence by contributing their knowledge collections, enhancing the network's overall utility and value. + +**Submitting existing knowledge collections to a paranet** + +Once you create a knowledge collection, you can submit it to a paranet using the `submitToParanet` function from the asset module. + +Here’s an example: + +```javascript +const paranetUAL = 'did:dkg:hardhat1:31337/0x791ee543738b997b7a125bc849005b62afd35578/1/1'; +const kcUAL = 'did:dkg:hardhat1:31337/0x791ee543738b997b7a125bc849005b62afd35578/55'; + +// Submit a Knowledge Collection to a paranet +await DkgClient.asset.submitToParanet(kcUAL, paranetUAL); +``` + +## Checking and claiming rewards + +Participants in an incentivized paranet can earn rewards for their various roles and contributions, such as knowledge mining, voting on proposals, or operating the paranet. The dkg.js library provides functions to check if an address has a specific role within the paranet and to claim rewards associated with that role. + +If you're interested in deploying a **paranet's incentives pool**, you can find more details and guidelines at this [link](../../dkg-paranets/initial-paranet-offerings-ipos/paranets-incentives-pool.md). + +**Roles in a paranet:** + +* **Knowledge miners:** Contribute to the paranet by mining knowledge collections. +* **Paranet operators:** Manage the paranet, including overseeing services and facilitating operations. +* **Proposal voters:** Participate in decision-making by voting on the Initial Paranet Offering (IPO). + +Participants can verify their roles and claim rewards through the following steps and examples: +
+```javascript
+const paranetUAL = 'did:dkg:hardhat1:31337/0x791ee543738b997b7a125bc849005b62afd35578/1/1';
+
+const incentivesPoolOptions = {
+    tracToTokenEmissionMultiplier: 5,
+    operatorRewardPercentage: 10.0,
+    incentivizationProposalVotersRewardPercentage: 12.0,
+    incentivesPoolName: 'YourIncentivesPoolName',
+    rewardTokenAddress: '0x0000000000000000000000000000000000000000', // the zero address selects the chain's native token
+};
+
+// Deploy the incentives contract to the paranet using the defined options
+const paranetDeployed = await DkgClient.paranet.deployIncentivesContract(paranetUAL, incentivesPoolOptions);
+console.log('======================== PARANET INCENTIVES POOL DEPLOYED');
+console.log(paranetDeployed);
+
+// Retrieve all existing incentives pools for the paranet
+const allIncentivesPools = await DkgClient.paranet.getAllIncentivesPools(paranetUAL);
+console.log('======================== ALL PARANET INCENTIVES POOLS');
+console.log(allIncentivesPools);
+
+// Fetch the storage address of the deployed incentives pool by its name and address
+const incentivesPoolStorageAddressResult = await DkgClient.paranet.getIncentivesPoolStorageAddress(paranetUAL, {
+    incentivesPoolName: incentivesPoolOptions.incentivesPoolName,
+    incentivesPoolAddress: paranetDeployed.incentivesPoolAddress,
+});
+console.log('======================== PARANET INCENTIVES POOL STORAGE ADDRESS');
+console.log(incentivesPoolStorageAddressResult);
+
+// Check if an address is a knowledge miner
+const isMiner = await DkgClient.paranet.isKnowledgeMiner(paranetUAL, { roleAddress: '0xMinerAddress', incentivesPoolName: incentivesPoolOptions.incentivesPoolName });
+console.log('Is Knowledge Miner:', isMiner);
+
+// Check if an address is a paranet operator
+const isOperator = await DkgClient.paranet.isParanetOperator(paranetUAL, { incentivesPoolName: incentivesPoolOptions.incentivesPoolName });
+console.log('Is Paranet Operator:', isOperator);
+
+// Check if an address is a proposal voter
+const isVoter = await DkgClient.paranet.isProposalVoter(paranetUAL, { roleAddress: '0xVoterAddress', incentivesPoolName: incentivesPoolOptions.incentivesPoolName });
+console.log('Is Proposal Voter:', isVoter);
+
+// Check claimable knowledge miner rewards
+const claimableMinerRewards = await DkgClient.paranet.getClaimableMinerReward(paranetUAL, { incentivesPoolName: incentivesPoolOptions.incentivesPoolName });
+console.log('Claimable Miner Reward:', claimableMinerRewards);
+
+// Claim miner rewards
+await DkgClient.paranet.claimMinerReward(paranetUAL, claimableMinerRewards, { incentivesPoolName: incentivesPoolOptions.incentivesPoolName });
+console.log('Miner rewards claimed successfully!');
+
+// Check claimable voter rewards
+const claimableVoterRewards = await DkgClient.paranet.getClaimableVoterReward(paranetUAL, { incentivesPoolName: incentivesPoolOptions.incentivesPoolName });
+console.log('Claimable Voter Reward:', claimableVoterRewards);
+
+// Claim voter rewards
+await DkgClient.paranet.claimVoterReward(paranetUAL, { incentivesPoolName: incentivesPoolOptions.incentivesPoolName });
+console.log('Voter rewards claimed successfully!');
+
+// Check claimable operator rewards
+const claimableOperatorRewards = await DkgClient.paranet.getClaimableOperatorReward(paranetUAL, { incentivesPoolName: incentivesPoolOptions.incentivesPoolName });
+console.log('Claimable Operator Reward:', claimableOperatorRewards);
+
+// Claim operator rewards
+await DkgClient.paranet.claimOperatorReward(paranetUAL, { incentivesPoolName: incentivesPoolOptions.incentivesPoolName });
+console.log('Operator rewards claimed successfully!');
+```
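As a sanity check on the percentages above (assumed semantics: the share not reserved for operators and voters goes to knowledge miners; consult the incentives pool documentation for authoritative rules):

```javascript
// Sketch: the remaining reward share after operator and voter percentages
// is assumed to go to knowledge miners.
function minerSharePercentage(options) {
  const reserved =
    options.operatorRewardPercentage +
    options.incentivizationProposalVotersRewardPercentage;
  if (reserved > 100) {
    throw new Error('Operator and voter percentages exceed 100%');
  }
  return 100 - reserved;
}

console.log(minerSharePercentage({
  operatorRewardPercentage: 10.0,
  incentivizationProposalVotersRewardPercentage: 12.0,
})); // 78
```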
+ +By following these steps, you can effectively check your role and claim the rewards you have earned for contributing to the paranet. + +This system ensures that all participants are fairly compensated for their efforts, promoting a robust and active community within the paranet. + +{% hint style="info" %} +To learn more about **managing the submission and approval process for knowledge collections (KC) in a paranet,** refer to the [Knowledge submission & curation](knowledge-submission-and-curation.md) page. +{% endhint %} + +## Performing SPARQL queries on a specific paranet + +The DKG enables users to perform SPARQL queries on specific paranets. By specifying a paranet, users can target their queries to retrieve data related to that paranet. This can be particularly useful when working with domain-specific data or services within a paranet. + +To query a specific paranet, ensure that the node you are querying already has paranet syncing enabled for the paranet you wish to query. Without this setup, the node may not have the relevant data required to process your queries. + +[Read here](../../dkg-paranets/syncing-a-dkg-paranet.md) how to set up a node to sync a paranet. + +To query a specific paranet, specify the paranet UAL using the `paranetUAL` parameter. This directs your query to the paranet that holds the relevant data. + +Here’s how you can perform a query on a specific paranet using the `paranetUAL` parameter: + +```javascript + const paranetUAL = 'did:dkg:hardhat1:31337/0x791ee543738b997b7a125bc849005b62afd35578/1/1'; + const queryWhereMadrid = `PREFIX schema: <http://schema.org/> + SELECT DISTINCT ?graphName + WHERE { + GRAPH ?graphName { + # <uuid:madrid> is an illustrative IRI for the Madrid city entity + ?s schema:city <uuid:madrid> . 
+ } +}`; + +let queryResult = await dkg.graph.query( + queryWhereMadrid, + 'SELECT', + { paranetUAL: paranetUAL }, +); + +console.log(queryResult.data); +``` + +By querying specific paranets, you can leverage the powerful capabilities of the DKG to interact with domain-specific collections of knowledge and services, ensuring that your queries are targeted and efficient. This makes it easier to work with complex data structures and gain insights from your paranet's knowledge collections. + +### Federated SPARQL queries + +Federated SPARQL queries allow users to execute queries across the whole knowledge graph and paranets simultaneously. In the context of the DKG, a node might sync with multiple paranets. Federated queries allow you to query multiple paranets within a single SPARQL query, accessing data from each specified paranet and merging the results. + +Imagine you have a DKG node (ot-node) that synchronizes with three different paranets. You want to perform a query that targets two of these paranets to retrieve data about users and cities. Federated SPARQL queries provide a convenient way to specify which paranets to include in your query. + +If you need to query data across multiple specified paranets, you should use federated SPARQL queries. However, if you want to query all available paranets, you do not need to provide any specific arguments, as all paranets will be queried by default using the default triple store repository. + +To execute a federated SPARQL query, you can use the `SERVICE` keyword to specify the paranet UALs you want to query. This keyword allows you to include data from different sources in your query. + +Here’s an example of a federated query targeting two out of three paranets: + +```javascript +const federatedQuery = ` + PREFIX schema: <http://schema.org/> + SELECT DISTINCT ?s ?city1 ?user1 ?s2 ?city2 ?user2 ?company1 + WHERE { + ?s schema:city ?city1 . + ?s schema:company ?company1 . 
+ ?s schema:user ?user1 . + + SERVICE <${paranetUAL3}> { + ?s2 schema:city ?city2 . + ?s2 schema:user ?user2 . + } + + FILTER(CONTAINS(STR(?city2), "Belgrade")) + } + `; + +// paranetUAL1 and paranetUAL3 are the UALs of two previously synced paranets +queryResult = await dkg.graph.query( + federatedQuery, + 'SELECT', + { paranetUAL: paranetUAL1 }, +); + +console.log(queryResult.data); +``` + +**Explanation:** + +* **`SERVICE` keyword:** The `SERVICE` keyword is used to include data from Paranet 3 (`paranetUAL3`) in the query, while the primary paranet is set to Paranet 1 (`paranetUAL1`). +* **Query structure:** The query retrieves distinct subjects (`?s`), cities, users, and companies from Paranet 1, and performs a sub-query within Paranet 3 to get data on where the city is `Belgrade`. +* **Filter clause:** The `FILTER` clause ensures that the city data from Paranet 3 contains the string "Belgrade". + +Federated SPARQL queries provide a powerful way to aggregate and analyze data across multiple paranets. This enables more complex data retrieval and cross-paranet data integration, making it easier to gather comprehensive insights from diverse data sources. diff --git a/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/dkg-v8-js-client/knowledge-submission-and-curation.md b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/dkg-v8-js-client/knowledge-submission-and-curation.md new file mode 100644 index 0000000..b66556a --- /dev/null +++ b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/dkg-v8-js-client/knowledge-submission-and-curation.md @@ -0,0 +1,148 @@ +# Knowledge submission & curation + +Staging is the process of submitting a **Knowledge Collection (KC)** to a paranet for approval before it becomes part of that paranet. **Curators review and either approve or reject the KC** to ensure data accuracy and quality before it's added to the network. 
+ +{% hint style="info" %} +**Knowledge collection (KC)** is a **collection of Knowledge Assets.** It refers to structured data that can be stored, shared, and validated within a distributed network. +{% endhint %} + +To maintain data integrity and prevent malicious or irrelevant information from being added, the paranet system includes **permission controls** that regulate who can submit and approve new data entries. + +{% hint style="info" %} +Here's the demo code for a [**staging paranet**](https://github.com/OriginTrail/dkg.js/blob/v8/develop/examples/paranet-permissioned-kc-submission-demo.js). +{% endhint %} + +**Why is KC management important?** + +In decentralized networks, anyone can technically contribute data, but not all data should be trusted. Without a system for validation, there is a risk of: + +* **Spam or false data** being introduced into the network. +* **Duplicate or low-quality data** reducing the efficiency and reliability of the system. +* **Unauthorized modifications** that could disrupt the credibility of shared knowledge. + +To prevent these issues, **paranet uses a curator-based system**, in which **only authorized users (curators) can decide which Knowledge Assets get added to the network**. + +**How does the KC submission work?** + +1. **A miner creates a collection of Knowledge Assets (KC)** — This is structured data (e.g., city information, scientific research, metadata, etc.) prepared for submission. +2. **The KC is staged to the paranet** — This means it is submitted for approval but is not yet officially part of the paranet. +3. **A curator reviews the submission** — The curator has the authority to either **accept or reject** the KC. +4. **If approved, the KC is officially submitted** and becomes part of the paranet network. If rejected, the KC is not added, and the user must make changes or submit a different KC. 
+ +This structured approach ensures that only **reliable and relevant data** is added to the network while maintaining a decentralized governance model. + +\ +When creating a **paranet**, we can define permissions for: + +* [ ] **Nodes** – Who can join the network. (Not added yet) +* [ ] **Miners** – Who can validate data. (Not added yet) +* [x] **KC submissions** – Who can add a new **collection of Knowledge Assets**. + +### Adding a curator + +A curator is a user who decides **which collections of Knowledge Assets (KCs) are accepted or rejected**.\ +Curators are added using their **address**, and they control the approval of new data within the paranet. + +**Add a curator:** + +```js +await DkgClient.paranet.addCurator(paranetUAL, PUBLIC_KEY); +``` + +**Remove a curator:** + +```js +await DkgClient.paranet.removeCurator(paranetUAL, PUBLIC_KEY); +``` + +**Check if a user is a curator:** + +```js +const isCurator = await DkgClient.paranet.isCurator(paranetUAL, PUBLIC_KEY); +console.log('Is user a curator?', isCurator); +``` + +### **Staging — Submitting KC to a paranet** + +Before a **Knowledge Collection (KC)** is officially added to the paranet, it must go through the **staging process**. + +This means the KC is **submitted for review**, and the **curator decides whether to accept or reject it**. 
+ +**Create a new Knowledge Collection (KC):** + +```js +const content = { + public: { + '@context': 'https://www.schema.org', + '@id': 'urn:us-cities:info:dallas', + '@type': 'City', + name: 'Dallas', + state: 'Texas', + population: '1,343,573', + area: '386.5 sq mi', + } +}; + +const createKcResult = await DkgClient.asset.create(content, { epochsNum: 2 }); +console.log('Knowledge Collection Created:', createKcResult); +``` + +**Stage KC to the paranet (submit for approval):** + +```js +const stageToParanetResult = await DkgClient.paranet.stageKnowledgeCollection( + createKcResult.UAL, + paranetUAL, +); +console.log('Knowledge Collection Staged to Paranet:', stageToParanetResult); +``` + +**Check if KC is staged for approval:** + +```js +const isStaged = await DkgClient.paranet.isKnowledgeCollectionStaged(createKcResult.UAL, paranetUAL); +console.log('Is KC staged to Paranet?', isStaged); +``` + +### Reviewing and approving KC + +A curator can **accept or reject** a knowledge collection: + +**Reject a KC:** + +```js +await DkgClient.paranet.reviewKnowledgeCollection(createKcResult.UAL, paranetUAL, false); +console.log('Knowledge Collection Rejected'); +``` + +**Accept a KC:** + +```js +await DkgClient.paranet.reviewKnowledgeCollection(createKcResult.UAL, paranetUAL, true); +console.log('Knowledge Collection Approved'); +``` + +**Check approval status:** + +```js +const approvalStatus = await DkgClient.paranet.getKnowledgeCollectionApprovalStatus(createKcResult.UAL, paranetUAL); +console.log('KC Approval Status:', approvalStatus); +``` + +**Check if KC is registered:** + +```js +const isRegistered = await DkgClient.paranet.isKnowledgeCollectionRegistered(createKcResult.UAL, paranetUAL); +console.log('Is KC registered to Paranet?', isRegistered); +``` + +### **Conclusion** + +* **Curators manage which KC entries are accepted to a paranet.** Users who wish to submit data to a paranet must go through the **staging process**. 
+ +{% hint style="info" %} +**The curator** doesn't have to be human; it can also be an AI agent. +{% endhint %} + +* **KC must first be submitted for approval**, and then the curator can **accept or reject it**. +* **All operations are tied to the user's public key**, enabling **secure and decentralized data management**. 🚀 diff --git a/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/dkg-v8-js-client/paranets-incentives-pool-implementation.md b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/dkg-v8-js-client/paranets-incentives-pool-implementation.md new file mode 100644 index 0000000..2cc359e --- /dev/null +++ b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/dkg-v8-js-client/paranets-incentives-pool-implementation.md @@ -0,0 +1,38 @@ +# Paranet's incentives pool implementation + +The **incentives pool** is designed to motivate key participants in the paranet ecosystem by rewarding them for their contributions. Knowledge miners, voters, and operators all play crucial roles in maintaining and growing the system. These incentives ensure the continued success and proper functioning of the network. Multiple incentives pools can be deployed for one paranet. + +### Incentives pool options + +The `incentivesPoolOptions` object defines the parameters for the reward system within the paranet ecosystem. It includes the following key settings: + +```javascript +const incentivesPoolOptions = { + tracToTokenEmissionMultiplier: 5, + operatorRewardPercentage: 10.0, + incentivizationProposalVotersRewardPercentage: 12.0, + incentivesPoolName: 'YourIncentivesPoolName', + rewardTokenAddress: '0x0000000000000000000000000000000000000000', +}; +``` + +* **tracToTokenEmissionMultiplier**: A multiplier that determines how many reward tokens are emitted relative to the TRAC spent on knowledge mining. +* **operatorRewardPercentage**: The percentage of pool rewards allocated to operators, who are responsible for managing and maintaining the paranet. 
+* **incentivizationProposalVotersRewardPercentage**: The percentage of pool rewards allocated to voters who participate in proposals. +* **incentivesPoolName**: Sets the name of the pool. +* **rewardTokenAddress**: Specifies the address of the reward token. If the zero address is set, the chain's native token is used for incentivization. The reward token address can also be any ERC-20 token on the respective chain. + +### Deployment of incentives pool + +This code deploys the incentives contract for the paranet using the specified options and `paranetUAL`, then logs the deployment result to verify its success. + +```javascript +const paranetDeployed = await DkgClient.paranet.deployIncentivesContract( + paranetUAL, + incentivesPoolOptions, +); +console.log('======================== PARANET INCENTIVES POOL DEPLOYED'); +console.log(paranetDeployed); +``` diff --git a/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/dkg-v8-js-client/permissioned-paranets.md b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/dkg-v8-js-client/permissioned-paranets.md new file mode 100644 index 0000000..b14e183 --- /dev/null +++ b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/dkg-v8-js-client/permissioned-paranets.md @@ -0,0 +1,84 @@ +# Permissioned paranets + +**Paranet permission policies** define which nodes and knowledge miners can participate in a paranet. These policies are set by the **paranet operator** at the time of creation. + +{% hint style="info" %} +**A paranet operator** is the account that owns the Knowledge Asset from which the paranet was created. +{% endhint %} + +There are two permission policies: + +* PARANET\_NODES\_ACCESS\_POLICY – governs which nodes can sync Knowledge Collections. 
+ +* PARANET\_MINERS\_ACCESS\_POLICY – governs which knowledge miners (wallet addresses) can submit Knowledge Collections. + +{% hint style="info" %} +Here is demo code for a [permissioned paranet](https://github.com/OriginTrail/dkg.js/blob/v8/develop/examples/curated-paranet-demo.js). +{% endhint %} + +### Paranet node-access permission policy + +This policy controls which nodes are allowed to sync the paranet’s Knowledge Collections and whether they can sync the private part of the collection. + +* OPEN — Any node can sync the paranet, and only the public part of Knowledge Collections is included. +* PERMISSIONED — Only approved nodes can sync the paranet, and both the public and private parts of Knowledge Collections are included. Private knowledge sharing is enabled! + +#### Interacting with a node-access permissioned paranet + +The paranet operator can **add nodes** to a permissioned paranet: + +```javascript +await DkgClient.paranet.addPermissionedNodes(paranetUAL, identityIds); +``` + +The paranet operator can **remove nodes** from a permissioned paranet: + +```javascript +await DkgClient.paranet.removePermissionedNodes(paranetUAL, identityIds); +``` + +**Anybody can check which nodes** are part of a paranet: + +```javascript +await DkgClient.paranet.getPermissionedNodes(paranetUAL); +``` + +### Paranet-miner-access permission policy + +This policy defines who can submit Knowledge Collections to a paranet. + +* OPEN — Any knowledge miner (address) can submit a Knowledge Collection. +* PERMISSIONED — Only approved knowledge miners (addresses) can submit a Knowledge Collection. This allows fine-grained control over who contributes data. + +{% hint style="info" %} +**Knowledge collection (KC)** is a **collection of Knowledge Assets.** It refers to structured data that can be stored, shared, and validated within a distributed network. 
+{% endhint %} + +#### Interacting with a miner-access permissioned paranet + +The paranet operator can **add miners** to a permissioned paranet: + +```javascript +await DkgClient.paranet.addParanetPermissionedMiners(paranetUAL, minerAddresses); +``` + +The paranet operator can **remove miners** from a permissioned paranet: + +```javascript +await DkgClient.paranet.removeParanetPermissionedMiners(paranetUAL, minerAddresses); +``` + +### Combining policies + +These two policies can be combined in any way: + +| Node Access Policy | Miner Access Policy | Result | +| ------------------ | ------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------- | +| OPEN | OPEN | Any node can sync the public part of KCs from the paranet, and any miner can add knowledge to the paranet. | +| OPEN | PERMISSIONED | Any node can sync the public part of KCs from the paranet, and only selected miners can add knowledge to the paranet. | +| PERMISSIONED | OPEN | Only selected nodes can sync both the private and public parts of KCs from the paranet, and any miner can add knowledge to the paranet. | +| PERMISSIONED | PERMISSIONED | Only selected nodes can sync both the private and public parts of KCs from the paranet, and only selected miners can add knowledge to the paranet. | + +### Access policies and knowledge curation + +These permission policies also interact with knowledge curation. If a paranet's PARANET\_KC\_SUBMISSION\_POLICY is STAGING and its PARANET\_MINERS\_ACCESS\_POLICY is PERMISSIONED, only approved knowledge miners can stage Knowledge Collections. 
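The combinations above can be captured as a small lookup that summarizes the resulting behavior (an illustrative sketch, not part of dkg.js):

```javascript
// Encode the policy-combination table as a function (illustrative only).
function paranetAccessSummary(nodePolicy, minerPolicy) {
  const sync =
    nodePolicy === 'OPEN'
      ? 'any node syncs the public part of KCs'
      : 'only selected nodes sync both public and private parts of KCs';
  const submit =
    minerPolicy === 'OPEN'
      ? 'any miner can add knowledge'
      : 'only selected miners can add knowledge';
  return `${sync}; ${submit}`;
}

console.log(paranetAccessSummary('PERMISSIONED', 'PERMISSIONED'));
// only selected nodes sync both public and private parts of KCs; only selected miners can add knowledge
```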
diff --git a/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/dkg-v8-py-client/README.md b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/dkg-v8-py-client/README.md new file mode 100644 index 0000000..43b63e9 --- /dev/null +++ b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/dkg-v8-py-client/README.md @@ -0,0 +1,375 @@ +--- +description: Python library for interacting with the DKG +--- + +# DKG Python SDK (dkg.py) + +If you are looking to build applications leveraging [Knowledge Assets](./#create-a-knowledge-collection) on the OriginTrail Decentralized Knowledge Graph (DKG), the dkg.py library is the best place to start! + +The DKG SDK is used together with an **OriginTrail gateway node** to build applications that interface with the OriginTrail DKG (the node is a dependency). Therefore, to use the SDK, you need to run a gateway node in [your local environment](../setting-up-your-development-environment.md) or use a [hosted OT-node](../../../../graveyard/everything/dkg-core-node/run-a-v8-core-node-on-testnet/). + +## Prerequisites + +* python ≥ 3.11 +* poetry ≥ 1.8.5 + +## Installation + +The library can be used in any Python application. + +Run the following command to install the dkg.py library using pip: + +```bash +pip install dkg +``` + +or pipx: + +```bash +pipx install dkg +``` + +or Poetry: + +```bash +poetry add dkg==8.0.1 +``` + +## :snowboarder: Quickstart + +In this package, there are both synchronous and asynchronous versions of the DKG client. + +The synchronous client is designed for applications where blocking calls are acceptable. It operates sequentially, making it simpler to integrate into existing codebases that do not use asynchronous programming. + +The asynchronous client is built for non-blocking operations, making it ideal for scenarios where multiple tasks need to run concurrently. 
+ +### Synchronous DKG client + +To use the synchronous DKG client, you need to connect to a running local or remote OT-node. + +
+```python
+from dkg import DKG
+from dkg.providers import BlockchainProvider, NodeHTTPProvider
+from dkg.constants import BlockchainIds, Environments
+
+node_provider = NodeHTTPProvider(endpoint_uri="http://localhost:8900", api_version="v1")
+blockchain_provider = BlockchainProvider(
+    Environments.DEVELOPMENT.value,  # or TESTNET / MAINNET
+    BlockchainIds.HARDHAT_1.value,
+)
+
+dkg = DKG(node_provider, blockchain_provider)
+
+print(dkg.node.info)
+# if successfully connected, this prints a dictionary with the node version
+# { "version": "8.X.X" }
+```
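The blockchain provider signs transactions with a wallet key taken from your environment (see the `PRIVATE_KEY` hint in the asynchronous example below). A minimal stdlib sketch of that pattern — `require_private_key` is a hypothetical helper for illustration, not a dkg.py API:

```python
import os

def require_private_key() -> str:
    # Hypothetical helper (not part of dkg.py): fail fast if the key is missing.
    key = os.environ.get("PRIVATE_KEY")
    if not key:
        raise RuntimeError("Set PRIVATE_KEY in your environment or a .env file")
    return key

# Illustrative value only -- never hard-code or commit a real key.
os.environ["PRIVATE_KEY"] = "0x" + "ac" * 32
private_key = require_private_key()
```

Failing fast like this surfaces a missing key at startup instead of at the first signed transaction.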
+ +### Asynchronous DKG client + +The asynchronous DKG client leverages Python's `asyncio` library for managing asynchronous operations. Below is an example of how to set up and use the asynchronous DKG client: + +```python +import asyncio +from dkg.providers import AsyncBlockchainProvider, AsyncNodeHTTPProvider +from dkg.constants import BlockchainIds, Environments +from dkg import AsyncDKG + +async def main(): + node_provider = AsyncNodeHTTPProvider( + endpoint_uri="http://localhost:8900", + api_version="v1", + ) + + # make sure that you have PRIVATE_KEY in .env so the blockchain provider can load it + blockchain_provider = AsyncBlockchainProvider( + Environments.DEVELOPMENT.value, + BlockchainIds.HARDHAT_1.value, + ) + + dkg = AsyncDKG( + node_provider, + blockchain_provider, + config={"max_number_of_retries": 300, "frequency": 2}, + ) + +if __name__ == "__main__": + asyncio.run(main()) +``` + +{% hint style="warning" %} +Make sure to create an .env file and add the PRIVATE\_KEY variable to it so that the blockchain provider can pick it up. +{% endhint %} + +### Blockchain networks + +The system supports multiple blockchain networks, which can be configured using the `BlockchainIds` constants. You can select the desired blockchain by specifying the corresponding constant. The available options are: + +**DKG mainnet options:** + +* Base: base:8453 +* Gnosis: gnosis:100 +* Neuroweb: otp:2043 + +**DKG testnet options:** + +* Base: base:84532 +* Gnosis: gnosis:10200 +* Neuroweb: otp:20430 + +**DKG devnet options:** + +* Base: base:84532 +* Gnosis: gnosis:10200 +* Neuroweb: otp:2160 + +**Local options:** + +* Hardhat1: hardhat1:31337 +* Hardhat2: hardhat2:31337 + +## Create a Knowledge Collection + +In this example, let's create a Knowledge Collection representing a city. The content contains both public and private assertions. Public assertions are exposed publicly (replicated to other nodes), while private ones are not (they stay only on the node you published to).
If you query a node that holds the private data (using get or query), you will see both the public and private assertions. + +```python +content = { + "public": { + "@context": "http://schema.org", + "@id": "https://en.wikipedia.org/wiki/New_York_City", + "@type": "City", + "name": "New York", + "state": "New York", + "population": "8,336,817", + "area": "468.9 sq mi", + }, + "private": { + "@context": "http://schema.org", + "@id": "https://en.wikipedia.org/wiki/New_York_City", + "@type": "CityPrivateData", + "crimeRate": "Low", + "averageIncome": "$63,998", + "infrastructureScore": "8.5", + "relatedCities": [ + {"@id": "urn:us-cities:info:los-angeles", "name": "Los Angeles"}, + {"@id": "urn:us-cities:info:chicago", "name": "Chicago"}, + ], + }, +} +``` + +When you create the Knowledge Collection, the above JSON-LD object will be converted into an **assertion**. Once an assertion with public data is prepared, we can create a Knowledge Asset on the DKG. The `epochs_num` option specifies how many epochs the asset should be kept for (an epoch is equal to three months).
+ +```python +create_asset_result = await dkg.asset.create( + content=content, + options={ + "epochs_num": 2, + "minimum_number_of_finalization_confirmations": 3, + "minimum_number_of_node_replications": 1 + }, +) +print(create_asset_result) + +``` + +{% hint style="warning" %} +To use the synchronous version, just remove the await (this applies to any function call you see in the rest of this document) +{% endhint %} + +The complete response of the method will look like: + +```python +{ + "UAL": "did:dkg:otp:2043/0x8f678eb0e57ee8a109b295710e23076fa3a443fe/572238", + "datasetRoot": "0xd7a2dd6d747d2f8d2d0f76cc6fa04ebf383a368249cc24a701788f271a41df4d", + "signatures": [ + { + "identityId": 131, + "v": 28, + "r": "0x583598a701e4c54a1e47e6ff2f0cf0a9660d1749f3e413408ccd3ff5ca2288dc", + "s": "0x131c07215977c4d681dc30cc32d9e8c1644825b932fb72cf035bb62f0963d2a5", + "vs": "0x931c07215977c4d681dc30cc32d9e8c1644825b932fb72cf035bb62f0963d2a5" + }, + ], + "operation": { + "mintKnowledgeAsset": { + "transactionHash": "0x04efacfc576578836c6736f376d23930bb01accd38df31414ff7b5e35861d8f2", + "transactionIndex": 1, + "blockHash": "0x98bf84f7cb05f5742213ed305d9e99b59f6c874b6abf10312fc5062bc32f14ac", + "from": "0x0E1405adD312D97d1a0A4fAA134C7113488D6ceA", + "to": "0xc8cf8064d7fc7cF42d51Ca5B28218472157F3d90", + "blockNumber": 7459414, + "cumulativeGasUsed": 572870, + "gasUsed": 537962, + "contractAddress": null, + "logs": [ + ], + "logsBloom": "0x000002000000000000000000000000000000000000000000000008004000000000008000000000080000000000000000000000000000000000000000000c0000000000000008000008000008000000000040000000042000004000000000200000000000020100000000000000000808000000000802828000000010000000004000000000000000000000200000000000000000000000000000000200000000000000000000000002001a00000000000000000000000000080000000000001100000042000000000000000000040000000400000001000000000000000060000000080000000000020000000000010000000000000000000000000802000000", + "status": 1, + 
"effectiveGasPrice": 8, + "type": 2 + }, + "publish": { + "operationId": "a1709e45-0a0e-44c8-8c67-19edb4baefaa", + "status": "COMPLETED" + }, + "finality": { + "status": "FINALIZED" + }, + "numberOfConfirmations": 6, + "requiredConfirmations": 3 + } +} + +``` + +## Read Knowledge Asset data from the DKG + +To read Knowledge Asset data from the DKG, we utilize the **get** protocol operation. + +In this example, we will get the latest state of the Knowledge Asset we published previously: + +```python +ual = create_asset_result.get("did:dkg:otp:2043/0x8f678eb0e57ee8a109b295710e23076fa3a443fe/572238") + +get_asset_result = await dkg.asset.get(ual) + +print(get_asset_result) +``` + +The response of the get operation will be the assertion graph: + +```python +{ + "assertion": [ + { + "@id": "https://ontology.origintrail.io/dkg/1.0#metadata-hash:0x5cb6421dd41c7a62a84c223779303919e7293753d8a1f6f49da2e598013fe652", + "https://ontology.origintrail.io/dkg/1.0#representsPrivateResource": [ + { + "@id": "uuid:a7a27a50-7180-4949-b9d7-48ab931b9650" + } + ] + }, + { + "@id": "https://ontology.origintrail.io/dkg/1.0#metadata-hash:0x6a2292b30c844d2f8f2910bf11770496a3a79d5a6726d1b2fd3ddd18e09b5850", + "https://ontology.origintrail.io/dkg/1.0#representsPrivateResource": [ + { + "@id": "uuid:43e30a8d-fc1e-41e1-a4b4-4d419c21108a" + } + ] + }, + { + "@id": "https://ontology.origintrail.io/dkg/1.0#metadata-hash:0xc1f682b783b1b93c9d5386eb1730c9647cf4b55925ec24f5e949e7457ba7bfac", + "https://ontology.origintrail.io/dkg/1.0#representsPrivateResource": [ + { + "@id": "uuid:eec71df5-bb62-48cb-b196-9804aff60f51" + } + ] + }, + { + "@id": "urn:us-cities:info:new-york", + "http://schema.org/name": [ + { + "@value": "New York" + } + ], + "http://schema.org/area": [ + { + "@value": "468.9 sq mi" + } + ], + "http://schema.org/population": [ + { + "@value": "8,336,817" + } + ], + "http://schema.org/state": [ + { + "@value": "New York" + } + ], + "@type": [ + "http://schema.org/City" + ] + }, + { + 
"@id": "uuid:e63489c8-618b-494c-8fbb-f22ab0537b89", + "https://ontology.origintrail.io/dkg/1.0#privateMerkleRoot": [ + { + "@value": "0xaac2a420672a1eb77506c544ff01beed2be58c0ee3576fe037c846f97481cefd" + } + ] + } + ], + "operation": { + "get": { + "operationId": "9918220b-5175-44c3-b43a-3544610be560", + "status": "COMPLETED" + } + } +} + + +``` + +## Querying Knowledge Asset data with SPARQL + +Querying the DKG is done by using the SPARQL query language, which is very similar to SQL applied to graph data. + +_(If you have SQL experience, SPARQL should be relatively easy to get started with. More information_[ _can be found here_](https://www.w3.org/TR/rdf-sparql-query/)_)._ + +Let’s write a simple query to select all subjects and objects in the graph that have the **Model** property of Schema.org context: + +```python +query_operation_result = await dkg.graph.query( + """ + PREFIX SCHEMA: + SELECT ?s ?stateName + WHERE { + ?s schema:state ?stateName . + } + """ +) + +print(query_graph_result) +``` + +The returned response will contain an array of n-quads: + +```python +{ + "status": "COMPLETED", + "data": [ + { + "s": "urn:us-cities:info:new-york", + "stateName": "\"New York\"" + } + ] +} +``` + +As the OriginTrail node leverages a fully fledged graph database (a triple store supporting RDF), you can run arbitrary SPARQL queries on it. + +To learn more about querying the DKG go [here](../../querying-the-dkg.md). + +## **More on types of interaction with the DKG SDK** + +We can divide operations done by SDK into 3 types: + +* Node API request +* Smart contract call (non-state-changing interaction) +* Smart contract transaction (state-changing interaction) + +Non-state-changing interactions with smart contracts are free and can be described as contract-getters. They don’t require transactions on the blockchain. This means they do not incur transaction fees. + +Smart contract transactions are state-changing operations. 
This means they change the state of the smart contract memory, which requires some blockchain-native gas tokens (such as ETH, NEURO, etc.). + +In order to perform state-changing operations, you need to use a wallet funded with gas tokens. + +You can use default keys from the example below for hardhat blockchain: + +```python +PRIVATE_KEY = "0xac0974bec39a17e36ba4a6b4d238ff944bacb478cbed5efcae784d7bf4f2ff80" +``` + +{% hint style="warning" %} +The default keys above should not be used anywhere except in a local environment for development. +{% endhint %} diff --git a/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/dkg-v8-py-client/interact-with-dkg-paranets.md b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/dkg-v8-py-client/interact-with-dkg-paranets.md new file mode 100644 index 0000000..03bd7f6 --- /dev/null +++ b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/dkg-v8-py-client/interact-with-dkg-paranets.md @@ -0,0 +1,242 @@ +--- +hidden: true +--- + +# Interact with DKG paranets + +{% hint style="danger" %} +**Paranets are not currently supported in the DKG V8. Expect the support to be back latest by the end of February.** +{% endhint %} + +The DKG Python SDK provides functionality for interacting with paranets on the OriginTrail Decentralized Knowledge Graph (DKG). This section of the SDK allows developers to create, manage, and utilize paranets effectively. + +## Setup and installation + +To interact with paranets, you need to connect to a running OriginTrail node (either local or remote) and ensure you have the dkg.py SDK installed and properly configured. Follow the general setup instructions for [installing dkg.py](./) and read more about paranets [in the following section](../../dkg-paranets/). + +### Creating a paranet + +Before creating a paranet, you must first create a Knowledge Collection (KC) on the DKG and choose a Knowledge Asset (KA) from that KC that will represent the paranet. 
To create a Knowledge Asset on the DKG, refer to [the following page](./). + +Once the Knowledge Collection is created, you can choose which KA from that KC will represent a paranet. Each KA has a unique identifier known as a Universal Asset Locator (UAL). You will use this UAL to create a paranet. The paranet creation process essentially links the paranet to the Knowledge Asset, establishing it on the blockchain. This on-chain representation allows for decentralized management and interaction with the paranet. + +Here is an example of how to create a new paranet using the `create` function from the paranet module. This function requires the UAL of the previously created Knowledge Asset, along with other details such as the paranet's name and description: + +```python +kc_ual = 'did:dkg:hardhat1:31337/0x791ee543738b997b7a125bc849005b62afd35578/1' +ka_ual = f"{kc_ual}/1" +options = { + "paranet_name": "TestParanet", + "paranet_description": "TestParanetDescription", + "paranet_nodes_access_policy": ParanetNodesAccessPolicy.CURATED, + "paranet_miners_access_policy": ParanetMinersAccessPolicy.CURATED +} + +await dkg.paranet.create(ka_ual, options) +``` + +{% hint style="warning" %} +To use the synchronous version, just remove the await (this applies to all function calls you will see in the rest of this document). + +Asynchronous version setup guide can be found here: [async guide](./) +{% endhint %} + +In this example: + +* `ka_ual` is the unique identifier of the Knowledge Asset created on the DKG. +* `paranet_name` is the name you want to give to your paranet. It should be descriptive enough to indicate the paranet's purpose or focus. +* `paranet_description` provides additional context about the paranet, explaining its purpose and the types of Knowledge Assets or services it will involve. +* `paranet_nodes_access_policy` defines a paranet's policy towards including nodes. If OPEN, any node can be a part of the paranet.
If CURATED, only the paranet owner can approve nodes to be a part of the paranet. +* `paranet_miners_access_policy` defines a paranet's policy towards including knowledge miners. If OPEN, anyone can publish to a paranet. If CURATED, only the paranet owner can approve knowledge miners who can publish to the paranet. + +After the paranet is successfully created, the paranet UAL can be used to interact with the specific paranet. This includes deploying services within the paranet, managing incentives, and claiming rewards associated with the paranet's operations. + +### Adding services to a paranet + +Enhance the capabilities of your paranet by integrating new services. The `add_services` function allows you to add both on-chain and off-chain services to your paranet. These services can range from AI agents and data oracles to decentralized knowledge interfaces and more. + +Before adding services, you first need to create them using the `create_service` function. Services added to a paranet can either have on-chain addresses, representing smart contracts or other on-chain entities, or they can be off-chain services, which do not have associated blockchain addresses. + +Each service can be identified by all paranet users via its registry Knowledge Asset and can include multiple on-chain accounts under its control. This enables services to participate in economic activities within the DKG. + +
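Every object in these examples is addressed by a UAL of the form `did:dkg:<blockchain>/<contract>/<collection>[/<asset>]`. A small helper — hypothetical, not part of dkg.py — makes that structure explicit:

```python
# Hypothetical helper (not a dkg.py API): split a UAL into its components.
def parse_ual(ual: str) -> dict:
    prefix, _, rest = ual.partition("did:dkg:")
    if prefix or not rest:
        raise ValueError(f"not a DKG UAL: {ual}")
    blockchain, *parts = rest.split("/")
    if not parts:
        raise ValueError(f"UAL is missing a contract address: {ual}")
    result = {"blockchain": blockchain, "contract": parts[0]}
    if len(parts) > 1:
        result["knowledge_collection_id"] = parts[1]
    if len(parts) > 2:
        result["knowledge_asset_id"] = parts[2]
    return result

# The service KA UAL used below: collection 2, asset 1 on a local hardhat chain.
parsed = parse_ual("did:dkg:hardhat1:31337/0x791ee543738b997b7a125bc849005b62afd35578/2/1")
```

Note that the blockchain component itself contains a colon (e.g. `hardhat1:31337`), which is why the helper splits on `/` rather than `:`.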
+```python
+paranet_ual = 'did:dkg:hardhat1:31337/0x791ee543738b997b7a125bc849005b62afd35578/1/1'
+paranet_service_ual = 'did:dkg:hardhat1:31337/0x791ee543738b997b7a125bc849005b62afd35578/2/1'
+options = {
+    "paranet_service_name": "TestParanetService",
+    "paranet_service_description": "TestParanetServiceDescription",
+    "paranet_service_addresses": ["0x03C094044301E082468876634F0b209E11d98452"],
+}
+
+await dkg.paranet.create_service(paranet_service_ual, options)
+
+await dkg.paranet.add_services(ual=paranet_ual, services_uals=[paranet_service_ual])
+```
+ +In this example: + +* `paranet_service_ual` is the UAL of the Knowledge Asset that will represent the service, while `ual` in `add_services` is the UAL of the paranet itself. +* `paranet_service_name` specifies the name of the service. +* `paranet_service_description` provides a brief description of what the service does. +* `paranet_service_addresses` lists blockchain addresses associated with the service. For off-chain services, this field can be left empty. +* `services_uals` is an array of Universal Asset Locators for the services you want to add to your paranet. + +By integrating and managing services, paranet operators can expand the capabilities of their paranet, providing a robust infrastructure for decentralized applications and AI-driven services. + +## Knowledge mining for open paranets + +Paranets allow users to leverage collective intelligence by contributing their Knowledge Collections or Knowledge Assets, enhancing the overall utility and value of the network. + +**Submitting existing Knowledge Collections or Knowledge Assets to a paranet** + +Once you create a Knowledge Collection or a Knowledge Asset, you can submit it to a paranet using the `dkg.asset.submit_to_paranet` function. Here's an example: + +```python +paranet_ual = 'did:dkg:hardhat1:31337/0x791ee543738b997b7a125bc849005b62afd35578/1/1' +kc_ual = 'did:dkg:hardhat1:31337/0x791ee543738b997b7a125bc849005b62afd35578/55' +ka_ual = 'did:dkg:hardhat1:31337/0x791ee543738b997b7a125bc849005b62afd35578/55/1' + +# Submit a Knowledge Collection to a paranet +submit_kc_result = await dkg.asset.submit_to_paranet(kc_ual, paranet_ual) + +# Submit a Knowledge Asset to a paranet +submit_ka_result = await dkg.asset.submit_to_paranet(ka_ual, paranet_ual) +``` + +## Checking and claiming rewards + +Participants in a paranet can earn rewards for their various roles and contributions, such as knowledge mining, voting on proposals, or operating the paranet.
The dkg.py library provides functions to check if an address has a specific role within the paranet and to claim rewards associated with that role. + +**Roles in a paranet:** + +* **Knowledge miners:** Contribute to the paranet by mining Knowledge Collections/Assets. +* **Paranet operators:** Manage the paranet, including overseeing services and facilitating operations. +* **Proposal voters:** Participate in decision-making by voting on the Initial Paranet Offering (IPO). + +Participants can verify their roles and claim rewards through the following steps and examples: + +
+```python
+paranet_ual = 'did:dkg:hardhat1:31337/0x791ee543738b997b7a125bc849005b62afd35578/1'
+
+# Check if the current wallet is a knowledge miner
+is_knowledge_miner = await dkg.paranet.is_knowledge_miner(ual=paranet_ual)
+
+# Check if the current wallet is a paranet operator
+is_operator = await dkg.paranet.is_operator(ual=paranet_ual)
+
+# Check if the current wallet is a voter
+is_voter = await dkg.paranet.is_voter(ual=paranet_ual)
+
+# Check claimable rewards
+knowledge_miner_reward = await dkg.paranet.calculate_claimable_miner_reward_amount(ual=paranet_ual)
+operator_reward = await dkg.paranet.calculate_claimable_operator_reward_amount(ual=paranet_ual)
+
+print(f"Claimable Knowledge Miner Reward for the Current Wallet: {knowledge_miner_reward}")
+print(f"Claimable Paranet Operator Reward for the Current Wallet: {operator_reward}")
+if is_voter:
+    voter_rewards = await dkg.paranet.calculate_claimable_voter_reward_amount(ual=paranet_ual)
+    print(f"Claimable Proposal Voter Reward for the Current Wallet: {voter_rewards}")
+
+# Claim miner rewards
+await dkg.paranet.claim_miner_reward(ual=paranet_ual)
+
+# Claim operator rewards
+await dkg.paranet.claim_operator_reward(ual=paranet_ual)
+
+# Claim voter rewards
+await dkg.paranet.claim_voter_reward(ual=paranet_ual)
+```
+ +By following these steps, you can effectively check your role and claim the rewards you have earned for contributing to the paranet. + +This system ensures that all participants are fairly compensated for their efforts, promoting a robust and active community within the paranet. + +## Updating claimable rewards + +In some cases, you may have already mined a Knowledge Collection/Asset to a specific paranet but decided to update the Knowledge Asset. By doing so, you become eligible for additional NEURO rewards (when additional TRAC is spent on the update). + +To ensure that your claimable rewards reflect your new contributions, you need to call the `update_claimable_rewards` function for the paranet after the update finalization. This function updates your claimable reward amounts based on your latest contributions. + +
+```python
+paranet_ual = 'did:dkg:hardhat1:31337/0x791ee543738b997b7a125bc849005b62afd35578/1'
+
+await dkg.paranet.update_claimable_rewards(ual=paranet_ual)
+```
+ +This function only updates the claimable rewards based on your latest contributions. To actually claim the rewards, use the respective claiming functions, such as `claim_miner_reward`, `claim_voter_reward`, or `claim_operator_reward`. + +## Performing SPARQL queries on a specific paranet + +The DKG enables users to perform SPARQL queries on specific paranets. By specifying a paranet, users can target their queries to retrieve data related to that paranet. This can be particularly useful when working with domain-specific data or services within a paranet. + +To query a specific paranet, ensure that the node you are querying has already enabled paranet syncing for the paranet you wish to query. Without this setup, the node may not have the relevant data required to process your queries.[ ](../../dkg-paranets/syncing-a-dkg-paranet.md) + +[Read here](../../dkg-paranets/syncing-a-dkg-paranet.md) how to set up a node to sync a paranet. + +To query a specific paranet, you should set the `graph_location` to the desired paranet UAL. This approach allows you to direct your queries to the paranet that holds the relevant data. + +Here’s how you can perform a query on a specific paranet: + +```python +paranet_ual = 'did:dkg:hardhat1:31337/0x791ee543738b997b7a125bc849005b62afd35578/1/1' +query_where_madrid = """ +PREFIX schema: +SELECT DISTINCT ?graphName +WHERE { + GRAPH ?graphName { + ?s schema:city . + } +} +""" + +query_result = await dkg.graph.query( + query=query_where_madrid, options={"paranet_ual": paranet_ual} +) + +print(query_result) +``` + +By querying specific paranets, you can leverage the powerful capabilities of the DKG to interact with domain-specific Knowledge Collections, Knowledge Assets and services, ensuring that your queries are targeted and efficient. This makes it easier to work with complex data structures and gain insights from your paranet's Knowledge Collections and Knowledge Assets. 
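Spelled out in full (the angle-bracketed IRIs in the Madrid query above do not survive rendering), the query string looks like the following. Here `urn:eu-cities:info:madrid` is an illustrative placeholder IRI, not a real identifier from the DKG:

```python
# Complete form of the Madrid query; the city IRI below is a placeholder
# chosen for illustration (the original IRI was lost in rendering).
query_where_madrid = """
PREFIX schema: <http://schema.org/>
SELECT DISTINCT ?graphName
WHERE {
    GRAPH ?graphName {
        ?s schema:city <urn:eu-cities:info:madrid> .
    }
}
"""
```

The string is then passed to `dkg.graph.query` with `options={"paranet_ual": paranet_ual}` exactly as shown above.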
+ +### Federated SPARQL queries + +Federated SPARQL queries allow users to execute queries across multiple knowledge graphs or paranets simultaneously. In the context of the DKG, a node might sync with multiple paranets. Federated queries allow you to query multiple paranets within a single SPARQL query, accessing data from each specified paranet and merging the results. + +Imagine you have a DKG node that synchronizes with three different paranets. You want to perform a query that targets two of these paranets to retrieve data about users and cities. Federated SPARQL queries provide a convenient way to specify which paranets to include in your query. + +If you need to query data across multiple specified paranets, you should use federated SPARQL queries. However, if you want to query all available paranets, you do not need to provide any specific arguments, as all paranets will be queried by default using the default triple store repository. + +To execute a federated SPARQL query, you can use the `SERVICE` keyword to specify the paranet UALs you want to query. This keyword allows you to include data from different sources in your query. + +Here’s an example of a federated query targeting two out of three paranets: + +```python +federated_query = """ +PREFIX schema: +SELECT DISTINCT ?s ?city1 ?user1 ?s2 ?city2 ?user2 ?company1 +WHERE {{ + ?s schema:city ?city1 . + ?s schema:company ?company1 . + ?s schema:user ?user1; + + SERVICE <{paranet_ual3}> {{ + ?s2 schema:city . + ?s2 schema:city ?city2 . + ?s2 schema:user ?user2; + }} + + filter(contains(str(?city2), "Belgrade")) +}} +""" + +query_result = await dkg.graph.query( + query=federated_query, options={"paranet_ual": paranet_ual1} +) + +print(query_result) +``` + +**Explanation:** + +* **`SERVICE` keyword:** The `SERVICE` keyword is used to include data from Paranet 3 (`paranet_ual3`) in the query, while the primary paranet is set to Paranet 1 (`paranet_ual1`). 
+* **Query structure:** The query retrieves distinct subjects (`?s`), cities, users, and companies from Paranet 1, and performs a sub-query within Paranet 3 to get data where the city is `Belgrade`. +* **Filter clause:** The `filter` clause is used to ensure that the city data from Paranet 3 contains the string "Belgrade". + +Federated SPARQL queries provide a powerful way to aggregate and analyze data across multiple paranets. This enables more complex data retrieval and cross-paranet data integration, making it easier to gather comprehensive insights from diverse data sources. diff --git a/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/setting-up-your-development-environment.md b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/setting-up-your-development-environment.md new file mode 100644 index 0000000..7b04ba0 --- /dev/null +++ b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/setting-up-your-development-environment.md @@ -0,0 +1,92 @@ +--- +description: How to set up a local and shared development environment +--- + +# Development environment setup + +## Running node engines on the DKG testnet (recommended) + +We recommend following the [DKG Node deployment guide](https://app.gitbook.com/o/-McnF-Jcg4utndKcdeko/s/-McnEkhdd7JlySeckfHM/~/changes/408/build-your-ai-agent-with-the-dkg-node/decentralized-knowle-dge-graph-dkg) for testnet setup instructions. + +## Running a local DKG network + +These instructions are for macOS and Linux. + +### Prerequisites + +* An installed and running **Blazegraph** + * To download and run Blazegraph, please visit their [official website](https://blazegraph.com/). +* An installed and running **MySQL** + * You need to create an empty database named **operationaldb** inside MySQL. +* You should have **npm** and **Node.js (v16)** installed. + +{% hint style="success" %} +Need any assistance with node setup?
Join the [Discord ](https://discord.com/invite/xCaY7hvNwD)chat and find help within the OriginTrail tech community! +{% endhint %} + +### Installation steps + +First, clone the ot-node repo by running: + +```sh +git clone https://github.com/OriginTrail/ot-node +``` + +Navigate to it: + +```sh +cd ot-node +``` + +Change the branch: + +```sh +git checkout v8/develop +``` + +Then, install the required dependencies by running: + +```sh +npm install +``` + +Next, create a file called `.env` and add the following lines: + +```sh +NODE_ENV=development +RPC_ENDPOINT_BC1=http://127.0.0.1:8545 +RPC_ENDPOINT_BC2=http://127.0.0.1:9545 +``` + +To start the local DKG network, run the **local network setup** script to install multiple node engines in the local environment. To ensure stability of operation, it is recommended to run at least 5 node engines (1 bootstrap and 4 subsequent node engines). + +{% hint style="warning" %} +The scripts below only work for macOS and Linux (or Windows WSL). + +If you need help with the setup, contact the core development team on [Discord](https://discord.com/invite/FCgYk2S). +{% endhint %} + +\ +To start the local DKG network on **macOS**, run the following command: + +```sh +bash tools/local-network-setup/setup-macos-environment.sh --nodes=5 +``` + +To start the local DKG network on **Linux**, run the following command: + +```sh +./tools/local-network-setup/setup-linux-environment.sh --nodes=5 +``` + + + +{% hint style="info" %} +### Contributing + +These setup instructions are a work in progress and are subject to change. The core development team expects to introduce improvements in setting up the DKG node engine in the local environment in the future. + +As DKG Node is open source, we **happily invite you to contribute to building the Decentralized Knowledge Graph.** We're excited about your contributions! + +Please visit the [GitHub](https://github.com/OriginTrail/ot-node) repo for more info. 
+{% endhint %} diff --git a/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/querying-the-dkg.md b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/querying-the-dkg.md new file mode 100644 index 0000000..31c0de1 --- /dev/null +++ b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/querying-the-dkg.md @@ -0,0 +1,116 @@ +# Query the DKG + +If you're just getting started with querying the OriginTrail Decentralized Knowledge Graph (DKG), don't worry — you're in the right place! We use the SPARQL query language to interact with the DKG. At first, SPARQL might seem confusing (we've all been there), but once you get the hang of it, it's actually much simpler than it looks — and incredibly powerful. + +## Paranets, Knowledge Collections, and Knowledge Assets + +Paranets are isolated environments within the DKG where participants can publish and query data privately or publicly. Within a paranet, a Knowledge Collection (KC) groups together multiple Knowledge Assets (KAs). A KC acts as a container that maintains a set of KAs, each representing a distinct set of information, assertions, or metadata. You can read more about these concepts [here](../../dkg-knowledge-hub/learn-more/readme/dkg-key-concepts.md). + +Relationships: + +* A Paranet contains multiple Knowledge Collections (KCs). +* Each KC contains multiple Knowledge Assets (KAs). +* Each KA is stored in its own named graph. + +## Understanding DKG connections + +Before diving into queries, here’s a quick overview of the most important RDF connections you'll encounter in the DKG: + +* \ dkg:hasNamedGraph \ - This tells us which Knowledge Asset graphs are currently considered valid and active. You’ll use this to filter for the current version of a KA. +* \ \ dkg:hasNamedGraph \ - This connection links a Knowledge Collection (KC) to one or more Knowledge Assets (KAs). It’s used when looking up KAs via their KC metadata (e.g. publisher, timestamp). 
+* \ \ dkg:hasKnowledgeAsset \ - This links the KC to the KA’s Universal Asset Locator (UAL). While it doesn’t point to the named graph directly, it’s important for referencing and versioning KAs. +* <[paranetUAL](#user-content-fn-1)[^1]> dkg:hasNamedGraph \ - This is used when querying inside a paranet. The paranet graph stores references to all associated KAs within that scope. You can use it to restrict queries to a specific environment. + +{% hint style="info" %} +KaGraph - did:dkg:hardhat1:31337/0xd5724171c2b7f0aa717a324626050bd05767e2c6/4/1/public + +KaUal - did:dkg:hardhat1:31337/0xd5724171c2b7f0aa717a324626050bd05767e2c6/4/1 +{% endhint %} + +## Query examples + +### Fetch all KAs by publisher on a specific day + +```sparql +PREFIX dkg: + +SELECT ?kaGraph +WHERE { + GRAPH { + ?kc dkg:publishedBy . + ?kc dkg:publishTime ?publishTime . + FILTER(STRSTARTS(STR(?publishTime), "2025-04-28")) + ?kc dkg:hasNamedGraph ?kaGraph . + } +} + +``` + +### Fetch all KAs by publisher key in a paranet + +Paranet in which we are searching for the KAs: did:dkg:hardhat1:31337/0xd5724171c2b7f0aa717a324626050bd05767e2c6/3/1 + +```sparql +PREFIX dkg: + +SELECT ?kaGraph +WHERE { + GRAPH { + dkg:hasNamedGraph ?kaGraph . + } + GRAPH { + ?kc dkg:hasNamedGraph ?kaGraph ; + dkg1:publishedBy . + } +} + +``` + +### Fetch all KAs by transaction hash + +```sparql +PREFIX dkg: + +SELECT ?kaGraph +WHERE { + GRAPH { + ?kc dkg:publishTx "0x028cefe6e1a828508c730aeef498117be8e47935619d057d061c203ae2a30b6f" . + ?kc dkg:hasNamedGraph ?kaGraph . + } +} + +``` + +## Generic query template + +This is the go-to pattern for querying across currently active KAs. + +```sparql +PREFIX schema: +PREFIX dkg: + +SELECT ?subject ?predicate ?object ?containedGraph +WHERE { + GRAPH { + ?g dkg:hasNamedGraph ?containedGraph . + } + GRAPH ?containedGraph { + ?subject ?predicate ?object . + # Optional: FILTER or specific patterns can go here + } +} + +``` + +## Learn more + +Want to dive deeper into SPARQL? 
Check out this awesome guide:[ SPARQL 1.1 Query Language Overview](https://www.w3.org/TR/sparql11-query/). + +Happy querying! You've got this. 🚀 + +*** + +**Next step: DKG SDK**\ +Once you know how to query the graph, it’s time to go deeper and start **building with code**. The next section introduces the official **DKG SDKs (JavaScript and Python)**, which make it simple to publish, retrieve, and verify Knowledge Assets programmatically. + +[^1]: diff --git a/docs/build-a-dkg-node-ai-agent/architecture.md b/docs/build-a-dkg-node-ai-agent/architecture.md new file mode 100644 index 0000000..c84f2a8 --- /dev/null +++ b/docs/build-a-dkg-node-ai-agent/architecture.md @@ -0,0 +1,28 @@ +# Architecture + +The DKG Node is built as a modular project with two core runtimes: + +* the **DKG Engine**, which powers network communication and implements the core protocol +* the **DKG Node Runtime,** which hosts an AI Agent with MCP capabilities + +Adding functionality is done through **Plugins,** which is where you'll likely spend the majority of your time coding. Conceptually the architecture is illustrated below. + +
#### DKG Node server (MCP) and Agent

This is the **“brain”** of your node. It runs all **plugins** and connects to the underlying services of the DKG node that AI agents or other applications use. It allows agents to **publish**, **query**, and **retrieve** Knowledge Assets directly from the OriginTrail DKG. It can also expose **REST APIs (via Express)** so your apps can interact with the node over HTTP.

#### Plugins

Plugins are like mini-apps for your DKG Node AI Agent - small add-ons that unlock new functionality. They can provide **MCP tools** (for AI agents), **HTTP endpoints**, or both.

Some useful built-in plugins include:

* **DKG Essential Plugin** - includes the basic tools for publishing and retrieving knowledge.
* **OAuth 2.1 authentication** - controls who can access your node.
* **Swagger** - automatically documents available APIs.

#### DKG Node engine

The **DKG engine** (formerly known as ot-node) implements the core OriginTrail protocol and is treated as a dependency (it is not intended for implementing agent functionality). Keep the DKG Node engine up to date to maintain reliable and efficient communication with the rest of the network. It handles blockchain interactions, loads and validates new Knowledge Assets from the network, performs staking transactions (in the case of Core nodes), and more.

diff --git a/docs/build-a-dkg-node-ai-agent/contributing-a-plugin.md b/docs/build-a-dkg-node-ai-agent/contributing-a-plugin.md new file mode 100644 index 0000000..698827d --- /dev/null +++ b/docs/build-a-dkg-node-ai-agent/contributing-a-plugin.md @@ -0,0 +1,168 @@
---
description: >-
  Find out how to actively participate in improving the DKG Node itself — from
  submitting bug reports to contributing code or plugins. Perfect for
  developers who want to help shape the ecosystem.
---

# Contributing a plugin

## Publish a plugin

We welcome contributions from the community!
Whether you’ve built a plugin you want to share, fixed a bug, or improved the codebase, your contributions help the DKG Node and agents grow.

This guide explains how to contribute your work to the official DKG Node repository.

### How to contribute

#### 1. Fork the repo

1. Go to the [official DKG Node GitHub repo](https://github.com/OriginTrail/dkg-node).
2. Click **Fork** (top right).
3. This creates your own copy of the repo under your GitHub account.

#### 2. Clone your fork

```bash
git clone https://github.com/YOUR_USER/dkg-node.git
cd dkg-node
```

#### 3. Create a new branch

```sh
git checkout -b my-contribution
```

#### 4. Make your changes

* Implement your plugin, fix, or feature.
* Run tests if applicable.

#### 5. Push changes to your fork

```sh
git push origin my-contribution
```

#### 6. Open a Pull Request (PR)

1. Go to your fork on GitHub.
2. Click **Compare & pull request**.
3. On the PR page, make sure the branches are correct:
   * **base repository**: `OriginTrail/dkg-node`
   * **base**: `main` (or other target branch)
   * **compare**: `my-contribution`
4. Fill in a clear PR description. A good template:
   * **What**: brief summary of the change
   * **Why**: the problem it solves / motivation
   * **How**: key implementation details
   * **Tests**: how you verified it (commands, screenshots)
   * **Breaking changes/migration**: if any
5. Click **Create pull request**.

The OriginTrail core developer team will review your PR. If everything looks good, it will be merged and published. 🎉

### Creating an official DKG Node plugin

If you’ve built a plugin and want it included in the official DKG Node repo:

#### 1. Scaffold a plugin package

From the repo root, run:

```sh
turbo gen plugin
```

* Name it starting with `plugin-` (e.g. `plugin-custom`).
* A new package will be created at:

  ```sh
  packages/plugin-<name>/src/index.ts
  ```

#### 2. 
Develop your plugin

* Add your logic inside `index.ts`.
* Your package name will be:

  ```sh
  @dkg/plugin-<name>
  ```

#### 3. Submit via PR

* Commit your work.
* Push it to your fork.
* Open a pull request as described above.

Once reviewed and merged, your plugin will be published to **npm** under the `@dkg/` namespace for the community to use.

📖 To learn more about writing plugins, see [Create a custom plugin](broken-reference).

### Working with packages in the DKG Node monorepo

The DKG Node repo is a **Turborepo** that contains multiple packages — not just plugins.

#### Explore packages

Run:

```sh
turbo ls
```

You’ll see entries like:

* `@dkg/agent` → Example of a DKG agent (Expo UI + MCP Server)
* `@dkg/plugins` → Utility package for creating DKG plugins
* `@dkg/eslint-config` → Shared ESLint configuration
* `@dkg/typescript-config` → Shared TypeScript configs
* `@dkg/plugin-oauth` → OAuth 2.1 module for the DKG Node

#### Add new packages

* Use `turbo gen` to generate new packages.
* New packages will be published under the `@dkg/` namespace once reviewed and merged.

### Repo utilities

The DKG Node monorepo comes with powerful tools preconfigured:

* [**Turborepo**](https://turborepo.com/) → build system with caching
* [**TypeScript**](https://www.typescriptlang.org/) → static type checking
* [**ESLint**](https://eslint.org/) **+** [**Prettier**](https://prettier.io) → code linting & formatting

#### Remote caching with [Vercel](https://vercel.com/signup?utm_source=remote-cache-sdk\&utm_campaign=free_remote_cache)

By default, builds are cached locally.\
Enable [**remote caching**](https://turborepo.com/docs/core-concepts/remote-caching) to share build caches across your team or CI/CD:

```sh
npx turbo login # authenticate with your Vercel account
npx turbo link  # link this repo to remote cache
```

Learn more in the [Turborepo docs](https://turborepo.com/docs).
***

### Further resources

👥 OriginTrail Discord server

📖 **Expo framework:**

* [Expo docs](https://docs.expo.dev/)
* [Video tutorials](https://www.youtube.com/@ExpoDevelopers/videos)

⚡ **Turborepo:**

* [Tasks](https://turborepo.com/docs/crafting-your-repository/running-tasks)
* [Caching](https://turborepo.com/docs/crafting-your-repository/caching)
* [Remote Caching](https://turborepo.com/docs/core-concepts/remote-caching)
* [Filtering](https://turborepo.com/docs/crafting-your-repository/running-tasks#using-filters)
* [Configuration Options](https://turborepo.com/docs/reference/configuration)
* [CLI Usage](https://turborepo.com/docs/reference/command-line-reference)

diff --git a/docs/build-a-dkg-node-ai-agent/customizing-your-dkg-agent.md b/docs/build-a-dkg-node-ai-agent/customizing-your-dkg-agent.md new file mode 100644 index 0000000..9274ad6 --- /dev/null +++ b/docs/build-a-dkg-node-ai-agent/customizing-your-dkg-agent.md @@ -0,0 +1,280 @@
# Customizing your DKG agent

Each DKG node includes a **collocated neuro-symbolic AI agent** that combines neural model capabilities (e.g., LLMs) with symbolic reasoning over RDF-based graph data. This enables DKG nodes to not only publish and query semantic knowledge but also perform knowledge graph reasoning, summarization, and data transformation tasks directly on locally or remotely stored knowledge.

The **DKG Agent** is built around a modular **plugin system** centered on the **Model Context Protocol (MCP)**. Plugins define how the agent interacts with external tools, APIs, and reasoning systems. A generic DKG Node ships with a base set of plugins for common operations - such as knowledge publishing, retrieval, and validation - **while developers can extend functionality by creating custom plugins**.

## Build your first plugin for the DKG Agent

The rest of this page will focus on how you can build custom plugins for your DKG agent.
For example, you might build a **Scientific Research** plugin to ingest papers and publish structured knowledge on the DKG, helping your agent drive research. Or a **Social Media** plugin to extract relevant posts, build a knowledge pool on the DKG, and run sentiment analysis.

### Option 1: Create a custom plugin inside the DKG monorepo

This is the easiest path if you’re already working inside the **DKG monorepo**.

#### 🔨 Steps

1. **Use turbo to generate a plugin scaffold** (from the project root folder)

   ```bash
   turbo gen plugin
   ```

This will create all the files for your plugin in a new folder.

2. **Name your plugin**

We suggest you start your name with `plugin-` (example: `plugin-pdf-parser`), but it is not a hard requirement.

It will be created under `packages/plugin-<name>`.

3. **Edit the plugin source**

Open the plugin source `packages/plugin-<name>/src/index.ts`.

Your plugin comes pre-scaffolded with examples that show how to expose a tool both as an MCP tool and as a REST API route (more details on this in the [#exposing-tools-in-your-plugin](customizing-your-dkg-agent.md#exposing-tools-in-your-plugin "mention") section below). Your `index.ts` file should look something like this:
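The scaffolded file pairs a reusable function with both exposure mechanisms. Here is a minimal self-contained sketch of that shape - simplified local types stand in for the real imports from `@dkg/plugins`, the MCP SDK, and Express, and `yourCustomFunction` is a hypothetical example:

```typescript
// Simplified stand-ins for the types that `@dkg/plugins`, the MCP SDK,
// and Express provide in the real scaffold (so this sketch is self-contained).
type Ctx = { logger: { info: (msg: string) => void } };
type ToolResult = { content: { type: "text"; text: string }[] };
type Mcp = {
  registerTool: (
    name: string,
    meta: { title: string; description: string },
    handler: (args: { input: string }) => ToolResult,
  ) => void;
};
type Api = {
  get: (route: string, handler: (req: any, res: any) => void) => void;
};

// Keep your plugin logic in plain functions so that both the MCP tool and
// the HTTP route can reuse it.
function yourCustomFunction(input: string): string {
  return `Processed: ${input}`;
}

// In the generated scaffold this body is wrapped in
// defineDkgPlugin((ctx, mcp, api) => { ... }) from `@dkg/plugins`.
function myPlugin(ctx: Ctx, mcp: Mcp, api: Api): void {
  // Expose the logic as an MCP tool...
  mcp.registerTool(
    "my-tool",
    { title: "My tool", description: "Runs yourCustomFunction on the input" },
    (args) => ({
      content: [{ type: "text", text: yourCustomFunction(args.input) }],
    }),
  );

  // ...and as a classic HTTP GET route.
  api.get("/my-route", (req, res) => {
    res.json({ result: yourCustomFunction(String(req.query.input ?? "")) });
  });
}
```

The key design point is that the tool handler and the route handler stay thin wrappers around the same function, so the logic is tested once and exposed twice.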
Within the `defineDkgPlugin` module, the plugin exposes its code as an MCP tool through `mcp.registerTool()`, and as a classic HTTP API (in this case a GET route) via `api.get()`.

We recommend writing your custom plugin logic in your own functions, like `yourCustomFunction` above, and then using it within the `mcp.registerTool()` and `api.get()` code blocks.

1. **(Optional) Add dependencies**

   ```bash
   npm install --save <dependency>
   ```

   Run this inside your `plugin-<name>` directory.
2. **(Optional) Add additional source files if needed (e.g., `utils.ts`)**
   * Place them in the `src/` directory.
   * Import them into your `index.ts`.
3. **Once you are done, make sure to build your plugin by running**

   ```bash
   npm run build
   ```

   or:

   ```bash
   turbo build
   ```
4. **Install your plugin in the DKG Node Agent**

   ```bash
   cd apps/agent
   npm install --save @dkg/plugin-<name>
   ```

   This package name is auto-generated (check `packages/plugin-<name>/package.json`).
5. **Make sure to import your plugin and register it through `createPluginServer`**\
   \
   Open `apps/agent/src/server/index.ts` and add:

   ```ts
   import myCustomPlugin from "@dkg/plugin-<name>";

   const app = createPluginServer({
     // ... other config (name, context, dkg client, etc.)
     plugins: [
       defaultPlugin,
       oauthPlugin,
       dkgEssentialsPlugin,
       examplePlugin.withNamespace("protected", {
         middlewares: [authorized(["scope123"])],
       }),

       // Add your own plugin here
       myCustomPlugin,
     ],
   });
   ```
6. **Run your DKG Node Agent**

   Start the agent from the project root folder with `npm run dev`, and test that your plugin is registered and working.

To learn how DKG plugins work internally, see the [#how-do-dkg-plugins-work](customizing-your-dkg-agent.md#how-do-dkg-plugins-work "mention") section below.
### Option 2: Create a standalone DKG plugin (npm package)

If you want to create your plugin outside of the monorepo and manage it as a separate package, you can. Follow the steps below.

#### 🔨 Steps

1. **Create a new Node.js project**

   ```bash
   npm init
   ```

   (or use an existing project)
2. **Add the `@dkg/plugins` package as a dependency**

   ```bash
   npm install --save @dkg/plugins
   ```
3. **Define your plugin**

   ```ts
   import { defineDkgPlugin } from "@dkg/plugins";

   const myCustomPlugin = defineDkgPlugin((ctx, mcp, api) => {
     // Example MCP tool
     // mcp.registerTool("my-mcp-tool", ... );

     // Example API route
     // api.get("/my-get-route", (req, res) => ... );
   });

   export default myCustomPlugin;
   ```
4. **(Optional) Add configuration support**\
   You can export your plugin as a function that accepts options:

   ```ts
   const configurablePlugin = (options: {...}) =>
     defineDkgPlugin((ctx, mcp, api) => { ... });
   ```
5. **(Optional) Publish to npm**
   * Run `npm publish` to share with the DKG builder community.

{% hint style="warning" %}
💡 **Tip:** See how existing plugins are built by looking at the plugin packages in the monorepo, e.g. `packages/plugin-auth` and `packages/plugin-example`.
{% endhint %}

### How do DKG plugins work?

DKG plugins are **functions** applied in the order you register them.\
While you _can_ define inline functions, using `defineDkgPlugin` is recommended for:

* **Type-safety**
* **Extension methods**

#### `defineDkgPlugin` arguments

When you define a plugin, the DKG Node automatically injects three objects:

1. **`ctx` (Context)**
   * `ctx.logger` → Logger instance for logging messages
   * `ctx.dkg` → DKG Client instance for interacting with the DKG network
2. **`mcp` (MCP Server)**
   * An instance of the MCP Server from `@modelcontextprotocol/sdk`
   * Use it to register MCP tools and resources
3. 
**`api` (API Server)** + * Express server instance from the express npm package + * Use it to expose REST API routes from your plugin + +All registered routes and MCP tools become part of the **DKG Node API server**. + +### Exposing tools in your plugin + +#### Exposing as MCP Tools + +MCP tools are automatically available to the DKG Agent (or any MCP client connected to your Node). + +**Example of how to expose tools** **(auto-scaffolded by `turbo gen plugin`):** + +```ts +mcp.registerTool( + "add", + { + title: "Tool name", + description: "Tool description", + inputSchema: { /* expected input variables and format */ }, + }, + // YOUR TOOL CODE HERE +); +``` + +{% hint style="success" %} +#### Including source Knowledge Assets in your MCP tool responses + +When building custom tools for your DKG Node Agent, you can attach **Source Knowledge Assets** to any MCP tool response, allowing the DKG Node Agent (and other agents you might use with your DKG Node) to display which Knowledge Assets were used to form the answer. + +See the [Source Knowledge Assets in tool responses](https://app.gitbook.com/o/-McnF-Jcg4utndKcdeko/s/-McnEkhdd7JlySeckfHM/~/changes/408/manage-and-extend-your-dkg-node/what-are-dkg-node-plugins/dkg-node-essentials-plugin/~/comments#source-knowledge-assets-in-tool-responses) section of the DKG Essentials plugin page for full details and examples. +{% endhint %} + +#### Exposing as REST API routes + +Expose routes through the API server for more “traditional” API calls. 
**Example of how to expose tools through API routes** **(auto-scaffolded by `turbo gen plugin`):**

```ts
api.get("/ROUTE_NAME", (req, res) => {
  // YOUR TOOL CODE HERE
});
```

💡 **Tip: Test your API routes with Swagger**\
When your DKG Node is running, all exposed API routes are automatically documented and testable via the Swagger UI:

* Open [http://localhost:9200/swagger](http://localhost:9200/swagger)
* You’ll see:
  * All registered API routes
  * Descriptions (from your route/tool definitions)
  * Input/output schemas (from your route/tool definitions)
  * Ability to **test requests directly in the browser**

This makes it easy to confirm your plugin’s routes are working as expected.

### Using plugins in the DKG Node

Once your plugin is built, register it in the DKG Node and (optionally) configure it with extension methods like `.withNamespace()`.

#### 1) Install & import

In the `apps/agent` folder run:

```bash
npm install --save @dkg/plugin-<name>
```

Open `apps/agent/src/server/index.ts` and import your plugin:

```ts
import myCustomPlugin from "@dkg/plugin-<name>";
```

#### 2) Register your plugin in `createPluginServer`
```ts
const app = createPluginServer({
  // ... other config (name, context, dkg client, etc.)
  plugins: [
    defaultPlugin,
    oauthPlugin,
    dkgEssentialsPlugin,

    // Protect routes with middleware
    examplePlugin.withNamespace("protected", {
      middlewares: [authorized(["scope123"])],
    }),

    // Add your own plugin here
    myCustomPlugin.withNamespace("protected", {
      middlewares: [authorized(["scope123"])],
    }),
  ],
});
```
#### Notes

* `.withNamespace("...")` is optional — it scopes your plugin’s routes/tools under a namespace and lets you attach middlewares (e.g., auth/permissions). More on that in the [Configure access & security](broken-reference) section.
* All registered **MCP tools** and **API routes** from your plugins are exposed via the DKG Node API.
* You can combine inline plugins and imported packages in the same `plugins` array.

#### Run & verify

Start your DKG Node and confirm your plugin’s endpoints/tools are available under the configured namespace by running:

```bash
npm run dev
```

(from the root folder) 🎉

diff --git a/docs/build-a-dkg-node-ai-agent/essentials-plugin.md b/docs/build-a-dkg-node-ai-agent/essentials-plugin.md new file mode 100644 index 0000000..6c50817 --- /dev/null +++ b/docs/build-a-dkg-node-ai-agent/essentials-plugin.md @@ -0,0 +1,193 @@
---
description: >-
  Learn about the default plugin that powers core functions like publishing,
  querying, and verifying Knowledge Assets.
---

# Essentials Plugin

The **DKG Node Essentials Plugin** ships preinstalled with every DKG Node. It provides the **baseline tools, resources, and APIs** you’ll use to **publish** and **retrieve** verifiable knowledge on the OriginTrail DKG. It’s also the reference implementation for **including Source Knowledge Assets** in tool responses, so users can see _which verifiable knowledge_ from the DKG powered an answer.

{% hint style="info" %}
💡 **Tip:** Use DKG Essentials as your **starting toolkit**. You can customize these tools or use them as blueprints for your own plugins.
{% endhint %}

### What’s included

* **DKG Knowledge Asset create tool** - a basic tool to publish Knowledge Assets from a JSON-LD object with `public` or `private` visibility.
* **DKG Knowledge Asset get tool** - retrieve a Knowledge Asset by its **UAL**.
Publishing Knowledge Assets with `public` visibility replicates their content across the entire DKG, making them **publicly visible**. When you create private Knowledge Assets, their content never leaves your node - only registration material (such as the cryptographic hash and UALs) is published publicly.

#### 🧱 Resources (MCP)

* **Knowledge Asset (KA) resource** — resolve a **KA UAL**.
* **Knowledge Collection (KC) resource** — resolve a **KC UAL**.

***

### Tool reference

Below is a consistent structure you can reuse for every tool: **Purpose → Inputs → Returns → Example → Notes**.

#### 1) DKG Knowledge Asset **create**

**Purpose**\
Publish a single **Knowledge Asset (KA)** or a single **Knowledge Collection (KC)** to the DKG.

**Inputs**

* `content` _(string, required)_ — a **JSON-LD** string (e.g., Schema.org-based) representing a KA or KC.
* `privacy` _(string, optional)_ — `"public"` or `"private"`; defaults to `"private"` if no input is provided.

**Returns**

All tools return an **MCP-formatted** payload:

* `content` _(array)_ — human-readable messages. This tool returns:
  * a success line,
  * the **UAL**, and
  * a **DKG Explorer** link derived from the UAL.

**Example input (JSON-LD)**

```json
{
  "@context": "https://schema.org/",
  "@type": "CreativeWork",
  "@id": "urn:first-dkg-ka:info:hello-dkg",
  "name": "Hello DKG",
  "description": "My first Knowledge Asset on the Decentralized Knowledge Graph!"
}
```

**Typical response**

```
Knowledge Asset collection successfully created.

UAL: did:dkg:otp:20430/0xABCDEF0123456789/12345/67890
DKG Explorer link: https://dkg-testnet.origintrail.io/explore?ual=did:dkg:otp:20430/0xABCDEF0123456789/12345/67890
```

***

#### 2) DKG Knowledge Asset **get**

**Purpose**\
Fetch a **KA or KC** by **UAL**.

**Inputs**

* `ual` _(string, required)_ — the KA or KC UAL.
+ +**Returns** + +All tools return an **MCP-formatted** payload: + +* `content` _(array)_ — one item with **pretty-printed JSON** (as text) containing: + * `assertion` — the JSON-LD content of the KA/KC + * `operation` — retrieval info: `operationId` and `status` (e.g., `COMPLETED`) + +**Example input (UAL)** + +``` +did:dkg:otp:20430/0xABCDEF0123456789/12345/67890 +``` + +**Typical response** + +```json +{ + "assertion": [ + { + "@id": "urn:ka:example", + "http://schema.org/name": [ + { + "@value": "DKG Example KA" + } + ], + "http://schema.org/description": [ + { + "@value": "The best KA example on the DKG" + } + ], + "@type": [ + "http://schema.org/CreativeWork" + ] + } + ], + "operation": { + "get": { + "operationId": "3951dd30-4781-4584-a3f2-4116ce26e8d2", + "status": "COMPLETED" + } + } +} +``` + +### Coming soon (preview) + +* **DKG query & retrieve** - generate/execute Schema.org-based **SPARQL** queries on the DKG. +* **Document → JSON/Markdown** - convert PDFs/Word/TXT/… into JSON/Markdown for downstream processing. +* **JSON/Markdown → JSON-LD** - transform structured text into a **schema.org** knowledge graph ready for publishing. + +### Source Knowledge Assets in tool responses + +You can attach **source Knowledge Assets** to any MCP tool response, allowing the DKG Node Agent (and other agents you might use with your DKG Node) to display which Knowledge Assets were used to form the answer. + +
Use the helper **`withSourceKnowledgeAssets`** from the plugin’s `utils` submodule to include source Knowledge Assets along with your other tool responses:

```ts
import { withSourceKnowledgeAssets } from "@dkg/plugin-dkg-essentials/utils";

// Some code ...

mcp.registerTool(
  "tool name...",
  {
    title: "Tool name",
    description: "Tool description",
    inputSchema: { /* expected input variables and format */ },
  },
  async (params) => {
    // Your tool code here

    // Instead of returning the plain response...
    // return {
    //   content: [{ type: "text", text: "My tool response..." }],
    // };

    // ...wrap it together with the source Knowledge Assets that informed it:
    return withSourceKnowledgeAssets({
      content: [{ type: "text", text: "My tool response..." }],
    }, [
      { title: "KA 1", issuer: "OriginTrail", ual: "did:dkg..." },
      { title: "KA 2", issuer: "OriginTrail", ual: "did:dkg..." },
      { title: "KA 3", issuer: "OriginTrail", ual: "did:dkg..." },
    ]);
  }
);
```

{% hint style="info" %}
💡 **Tip:** To see the source Knowledge Assets when using agents other than the DKG Node Agent (e.g., VS Code, Cursor, etc.), you will need to adjust your prompt to ask for them to be shown in the response (i.e., "Please include source Knowledge Assets in the response if there are any").
{% endhint %}

You can also check `packages/plugin-example` to see how this works first-hand.

***

### Customize & extend

* **Tune the essentials** — adjust defaults (e.g., privacy, retry/finality settings) or validate inputs for your domain.
* **Use as a scaffold** — copy the patterns (tool registration, response helpers, resource resolvers) to **build new tools** and full plugins.
* **Compose with other plugins** — chain tools into **end-to-end agentic pipelines**.

{% hint style="success" %}
Builders are encouraged to **customize DKG Essentials** to fit their use case, and to **use these tools as the basis** for creating new, domain-specific capabilities.
{% endhint %}

***

**Next step: Creating custom plugins for your node**\
Want more than the basics?
Next, we’ll show you how to **build your own plugins** — integrating APIs, adding new tools, and tailoring your node’s capabilities to your specific use case.

diff --git a/docs/build-a-dkg-node-ai-agent/evaluating-agent-responses.md b/docs/build-a-dkg-node-ai-agent/evaluating-agent-responses.md new file mode 100644 index 0000000..50d8767 --- /dev/null +++ b/docs/build-a-dkg-node-ai-agent/evaluating-agent-responses.md @@ -0,0 +1,111 @@
---
description: A step-by-step guide to evaluating your agent responses
---

# Evaluating agent responses

Evaluating your DKG Agent is a recommended quality-control step that helps you understand how well your custom agent performs on various retrieval metrics. The DKG Node ships with RAGAS, a widely used AI agent RAG evaluation framework, which you can use to evaluate how well your agent responds.

## What is RAGAS, and what is it used for?

[RAGAS](https://www.ragas.io/) (Retrieval Augmented Generation Assessment) is a framework for evaluating how well an AI agent answers questions, especially those that use knowledge bases or document retrieval. Think of it as a quality control system that checks whether your agent is doing its job properly. It measures whether the agent finds the right information from available knowledge, gives accurate and relevant answers, stays truthful to the source material without making things up, and actually addresses what the user asked.\
For the DKG Node, RAGAS helps ensure that the DKG Agent provides high-quality, reliable answers about decentralized knowledge graphs, blockchain publishing, and related topics. It's essentially an automated testing system that makes sure your AI isn't hallucinating information or giving irrelevant responses.

### RAGAS metrics explained

Each evaluation measures these key aspects:

#### **1. 
Context metrics (How well does the agent find information?)**

* **Context precision** — Is the agent pulling the right information from our knowledge base?
* **Context recall** — Did the agent find all the relevant information available?
* **Context relevance** — Is the information the agent retrieved actually useful for the question?

#### **2. Answer metrics (How well does the agent respond?)**

* **Answer relevance** — Does the answer actually address what was asked?
* **Faithfulness** — Is the answer based on facts from our knowledge base (no hallucinations)?
* **Answer similarity** — How close is the answer to what we expect?
* **Answer correctness** — Is the answer factually correct?

{% hint style="info" %}
Each metric gets a score from 0-1, and you can set minimum thresholds (e.g., 0.8 = 80%) that the answers must meet to pass.
{% endhint %}

### Where to find questions

The test questions are stored in:

1. Navigate to your project root directory
2. Open `apps/agent/tests/ragas/questionsAnswers/`
3. Open `dkg-node-evaluation-dataset.json`

```
apps/agent/tests/ragas/questionsAnswers/dkg-node-evaluation-dataset.json
```
The JSON file contains an array of test cases, each with questions, answers, ground\_truths, and context. All fields are already populated with examples for DKG Node, which you can modify or replace to fit your chatbot's specific use case.

### What each field means

Think of the dataset as having four parallel lists that all work together:

* `questions` — the prompts you're testing ("What is DKG Node?")
* `ground_truths` — your ideal answers, the gold standard you're measuring against
* `contexts` — the documentation or knowledge your AI should be using to answer
* `answers` — the actual responses from your DKG Node for each question

### Adding new questions step by step

Adding a new test question is straightforward. Start by putting your question in the `questions` array. Then write what you consider the perfect answer and add it to `ground_truths`. Next, include any relevant documentation in the `contexts` array — this is the source material your AI should reference. For the `answers` field, you need to manually add the actual response from your DKG Node. You can get this by asking your DKG Node the question directly and copying the response, or you can run a test session to see what it generates and then add that to the array. Just remember: all four arrays need to stay in sync. The first item in each array corresponds to the same test case.
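The parallel-array layout and the sync rule above can be sketched as follows - the entries here are hypothetical, and the real file ships with DKG Node-specific examples:

```typescript
// Hypothetical miniature of dkg-node-evaluation-dataset.json: four parallel
// arrays where index i across all of them forms one test case.
const dataset = {
  questions: ["What is DKG Node?"],
  ground_truths: [
    "DKG Node is OriginTrail node software that hosts an AI agent and connects to the Decentralized Knowledge Graph.",
  ],
  contexts: [
    [
      "The DKG Node is built as a modular project with two core runtimes: the DKG Engine and the DKG Node Runtime.",
    ],
  ],
  answers: [
    "DKG Node is a modular node with a DKG Engine runtime and an AI-agent runtime.",
  ],
};

// Sanity check: the four arrays must stay in sync, one entry per test case.
const lengths = [
  dataset.questions.length,
  dataset.ground_truths.length,
  dataset.contexts.length,
  dataset.answers.length,
];
const inSync = lengths.every((len) => len === lengths[0]);
console.log(inSync); // true when every array has one entry per test case
```

Running a check like this before an evaluation catches the most common dataset mistake - adding a question without its matching ground truth, context, or answer.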
### Configuration settings

Edit `tests/ragas/config.ts` to change:

* **Which metrics to run** — Enable/disable specific RAGAS metrics
* **Score thresholds** — Set minimum passing scores (e.g., require 80% minimum)
* **LLM model** — Choose which AI model evaluates the responses
* **Browser automation settings** — Playwright timeouts and behavior

### Setup and installation

```sh
# Install dependencies and build the project
npm install
npm run build

# Run complete evaluation (starts servers + generates report + opens dashboard)
npm run ragas
```

If you want to run individual parts:

```sh
# Just run the evaluation (servers must be running)
npm run test:ragas

# Show results from last evaluation
npm run test:ragas:results

# Open dashboard (shows cached results)
npm run test:ragas:dashboard
```

* **Update login credentials** in `apps/agent/tests/ragas/dkg-node-client.ts`:

```typescript
// Lines 28-29 and 264-265
email: "admin", // Change to your email/username
password: "adminN131!" 
// Change to your password
```

### The dashboard

When you run `npm run ragas`, a web dashboard opens at [http://localhost:3001](http://localhost:3001) showing:

* **Overall score** — How well the DKG Node agent is performing (0-100%)
* **Metric breakdown** — Individual scores for each RAGAS metric
* **Question-by-question analysis** — Detailed view of each failed test question with:
  * The question asked
  * DKG Node's actual answer
  * Expected answer
  * Which metrics failed and why
* **Real-time results** — The dashboard auto-refreshes as new evaluations complete

diff --git a/docs/build-a-dkg-node-ai-agent/set-up-your-custom-dkg-node-fork-and-update-flow.md b/docs/build-a-dkg-node-ai-agent/set-up-your-custom-dkg-node-fork-and-update-flow.md new file mode 100644 index 0000000..f3cff38 --- /dev/null +++ b/docs/build-a-dkg-node-ai-agent/set-up-your-custom-dkg-node-fork-and-update-flow.md @@ -0,0 +1,115 @@
---
description: >-
  Learn how to keep your DKG Node up to date with the latest features, bug
  fixes, and security patches. Includes guidance on safe upgrade practices so
  your node stays online and reliable.
---

# Set up your custom DKG Node fork & update flow

## **Keeping your DKG Node up to date**

The DKG Node is continuously evolving - new features, performance improvements, and security updates are released frequently. To stay current while keeping your custom modifications intact, you’ll maintain your own private fork of the DKG Node repository.

This setup allows you to:

* **Safely integrate official updates** without overwriting local changes.
* **Experiment and customize** your node codebase while staying compatible with the latest OriginTrail releases.
* **Stay stable and secure**, ensuring your node runs the most reliable version of the network software.

In this section, you’ll learn how to structure your repository, pull updates from the official source, and merge them into your project with confidence.
### How updates work

To receive new updates, you must maintain a **private fork** of the DKG Node monorepo. Your local project will use **two git remotes**:

* `origin` pointing to your **custom GitHub repository** (private or public)
* `upstream` pointing to the **official DKG Node repository**

This setup lets you safely pull in upstream changes while keeping your customizations.

## How to set up your project with update flow

**1. Custom GitHub repository setup**

{% hint style="info" %}
Create a new repository on GitHub (private or public) where you'll store your DKG Node-based project.
{% endhint %}

**2. Clone the official DKG Node repo**

```sh
git clone git@github.com:OriginTrail/dkg-node.git
```

**3. Rename the folder and enter it**

```sh
mv dkg-node your_project_name
cd your_project_name
```

**4. Configure remotes**

Rename the original remote to `upstream`:

```sh
git remote rename origin upstream
```

**5. Add your private repo as `origin`**

```sh
git remote add origin git@github.com:your-username/your_project_name.git
```

**6. Push to your private repo**

```sh
git push -u origin main
```

Your custom DKG Node repository is now set up with:

* `origin` pointing to your private fork
* `upstream` pointing to the official DKG Node

## Configure and start your custom DKG Node project

Once this setup process is complete, you are ready to configure and run your custom DKG Node using the `dkg-cli`. The `dkg-cli` provides automated installation, configuration management, and service control for your DKG Node. Detailed instructions on how to use `dkg-cli` to configure your node and manage its services are available on the [**Installation**](../getting-started/decentralized-knowle-dge-graph-dkg.md#id-1-install-cli) page under the "Getting started" section.

## Update your custom DKG Node project

When a new version of DKG Node is released, follow the steps below to update your custom DKG Node project.

**1. 
Fetch the latest changes from upstream:** + +```sh +git fetch upstream +``` + +**2. Merge upstream changes into your project** + +```sh +git merge upstream/main +``` + +**3. Resolve conflicts (if any)** + +Most projects will encounter differences between upstream and local changes. Review conflict markers in your code, then decide whether to keep, override, or blend changes. + +**4. Push the updated code to your repo** + +```sh +git push origin main +``` + +## You’re up to date + +At this point, your codebase is synced with the latest official [DKG Node](https://github.com/OriginTrail/dkg-node) while keeping your customizations intact. + +{% hint style="info" %} +⚠️ **Tips for smoother updates** + +* Pull upstream updates **regularly** to avoid large conflict sets. +* Always test your DKG Node after merging updates to ensure compatibility. +{% endhint %} diff --git a/docs/contribute-to-the-dkg/bounties-and-rewards/README.md b/docs/contribute-to-the-dkg/bounties-and-rewards/README.md new file mode 100644 index 0000000..15e0566 --- /dev/null +++ b/docs/contribute-to-the-dkg/bounties-and-rewards/README.md @@ -0,0 +1,2 @@ +# Bounties & rewards + diff --git a/docs/contribute-to-the-dkg/bounties-and-rewards/code-contributions-and-v8-bug-bounty.md b/docs/contribute-to-the-dkg/bounties-and-rewards/code-contributions-and-v8-bug-bounty.md new file mode 100644 index 0000000..ef6cd27 --- /dev/null +++ b/docs/contribute-to-the-dkg/bounties-and-rewards/code-contributions-and-v8-bug-bounty.md @@ -0,0 +1,94 @@ +--- +icon: sack-dollar +--- + +# Code contributions & V8 bug bounty + +Interested in helping us build the substrate of collective neuro-symbolic AI? + +## Code contributions + +We encourage code contributions to the following repositories: + +* ot-node +* dkg-evm-module +* dkg.js +* dkg.py + +Please check the contribution guidelines in each repo. + +Once a contribution is made, you can **tag the development team in your pull request** for an assessment of your contribution.
If you'd like to check if your contribution will qualify for a reward, [contact the core development team in Discord](https://discord.gg/xCaY7hvNwD). + +## V8 bug bounty + +To ensure the **security and proper functioning of the DKG V8**, Trace Labs has launched a dedicated bounty program. Each submission will be evaluated based on its severity and will correspond to a specific bounty reward. + +### Vulnerability categories and rewards + +* **Minor bug:** 50 TRAC +* **Medium bug:** 200 TRAC +* **Critical bug:** 5000 TRAC + +### Bug bounty rules + +1. **Severity assessment:** The severity of each bug will be determined solely at the discretion of Trace Labs, based on both the likelihood and impact of the bug. All reward decisions are final. +2. **Submission process:** Please send your bug reports to [**bounty@origin-trail.com**](mailto:bounty@origin-trail.com) with the subject "WEBSITE/APP BUG BOUNTY." Upon receipt, we will evaluate the severity of the bug and contact you with further information. Submissions through other channels (e.g., social media) will not be accepted. + +### Security vulnerabilities + +* SQL injection. +* Cross-site scripting (XSS). +* Cross-site request forgery (CSRF). +* Remote code execution (RCE). +* Insecure configurations in web servers, databases, and application frameworks. +* Session hijacking and clickjacking. +* Sensitive data exposure. +* Unauthorized access to user accounts. +* Bypassing authentication mechanisms. +* Credentials exposure. +* Logic bypasses. + +### Example submission template + +``` +**Title:** [Short description of the vulnerability] + +**Description:** +[A detailed description of the vulnerability, including what it is and how it can be exploited] + +**Steps to Reproduce:** +1. [First step] +2. [Second step] +3. 
[Further steps as necessary] + +**Proof of Concept:** +[Include any screenshots, videos, or code snippets] + +**Impact:** +[Explain the potential impact of the vulnerability] + +**Suggested Fix:** +[Provide recommendations for how to fix the issue] + +**Additional Information:** +[Any other information that might be relevant] + +``` + +### Important restrictions + +Please ensure that you do not harm any live contracts on public networks while testing; otherwise, you will not be eligible for a bug bounty. + +Leaking any vulnerability of the smart contracts on any social media platforms or public channels will lead to the cancellation of the bounty and might also invite legal action. + +### Legal notice + +We cannot issue rewards to individuals on sanctions lists or to those in countries on sanctions lists. Depending on your country of residency and citizenship, you are responsible for any tax implications. Your local law may also impose additional restrictions. + +This is a discretionary rewards program. We can cancel the program at any time, and the decision to pay a reward is entirely at Trace Labs' discretion. + +Your testing must not violate any law or disrupt or compromise any data that is not your own. + +{% hint style="warning" %} +To avoid potential conflicts of interest, we will not grant rewards to Trace Labs employees, employees who have left Trace Labs within the last 2 years, and contractors. +{% endhint %} diff --git a/docs/contribute-to-the-dkg/bounties-and-rewards/general-bug-bounty/README.md b/docs/contribute-to-the-dkg/bounties-and-rewards/general-bug-bounty/README.md new file mode 100644 index 0000000..832e5c2 --- /dev/null +++ b/docs/contribute-to-the-dkg/bounties-and-rewards/general-bug-bounty/README.md @@ -0,0 +1,78 @@ +--- +icon: file-invoice-dollar +--- + +# General bug bounty + +To ensure the **security and proper functioning of our websites and applications,** Trace Labs has launched a dedicated bounty program. 
Each submission will be evaluated based on its severity and will correspond to a specific bounty reward. + +## Vulnerability categories and rewards + +* **Minor bug:** 50 TRAC +* **Medium bug:** 250 TRAC +* **Serious bug:** 500 TRAC +* **Critical bug:** 1000 TRAC + +## Bug bounty rules + +1. **Severity assessment:** The severity of each bug will be determined solely at the discretion of Trace Labs, based on both the likelihood and impact of the bug. All reward decisions are final. +2. **Submission process:** Please send your bug reports to [**bounty@origin-trail.com**](mailto:bounty@origin-trail.com) with the subject "WEBSITE/APP BUG BOUNTY." Upon receipt, we will evaluate the severity of the bug and contact you with further information. Submissions through other channels (e.g., social media) will not be accepted. + +## Security vulnerabilities + +* SQL injection. +* Cross-site scripting (XSS). +* Cross-site request forgery (CSRF). +* Remote code execution (RCE). +* Insecure configurations in web servers, databases, and application frameworks. +* Session hijacking and clickjacking. +* Sensitive data exposure. +* Unauthorized access to user accounts. +* Bypassing authentication mechanisms. +* Credentials exposure. +* Logic bypasses. + +## Example submission template + +``` +**Title:** [Short description of the vulnerability] + +**Description:** +[A detailed description of the vulnerability, including what it is and how it can be exploited] + +**Steps to Reproduce:** +1. [First step] +2. [Second step] +3. 
[Further steps as necessary] + +**Proof of Concept:** +[Include any screenshots, videos, or code snippets] + +**Impact:** +[Explain the potential impact of the vulnerability] + +**Suggested Fix:** +[Provide recommendations for how to fix the issue] + +**Additional Information:** +[Any other information that might be relevant] + +``` + +### Important restrictions + +Please ensure you do not harm any live contracts on public networks while testing; otherwise, you will not be eligible for a bug bounty. + +Leaking any vulnerability of the smart contracts on social media platforms or public channels will result in the cancellation of the bounty and might also invite legal action. + +### Legal notice + +We cannot issue rewards to individuals on sanctions lists or who reside in countries on sanctions lists. Depending on your country of residency and citizenship, you are responsible for any tax implications. Your local law may also impose additional restrictions. + +This is a discretionary rewards program. We can cancel the program at any time, and the decision to pay a reward is entirely at Trace Labs' discretion. + +Your testing must not violate any law or disrupt or compromise any data that is not your own. + +{% hint style="warning" %} +To avoid potential conflicts of interest, we will not grant rewards to Trace Labs employees, employees who have left Trace Labs within the last 2 years, and contractors. 
+{% endhint %} diff --git a/docs/contribute-to-the-dkg/bounties-and-rewards/general-bug-bounty/staking-security-bounty.md b/docs/contribute-to-the-dkg/bounties-and-rewards/general-bug-bounty/staking-security-bounty.md new file mode 100644 index 0000000..d8cff9c --- /dev/null +++ b/docs/contribute-to-the-dkg/bounties-and-rewards/general-bug-bounty/staking-security-bounty.md @@ -0,0 +1,22 @@ +--- +hidden: true +--- + +# Staking security bounty + +As a part of the DKG V8 mainnet launch, a 100k TRAC staking security bounty will be awarded to test the staking system in a real economic environment. + +The new, improved staking system includes: + +* The new [Staking Dashboard](https://staking.origintrail.io/) +* The updated [smart contracts](https://github.com/OriginTrail/dkg-evm-module/) + +To be eligible, users need to: + +* Stake TRAC on the mainnet and test the upgraded V8 Staking Dashboard between Dec 27, 17:00 CET, and January 10, 17:00 CET. +* Register for claiming the reward from January 10, 17:00 CET, to January 16, 17:00 CET, on [the designated reward claiming interface](https://dkg-v8-incentivised-testnet.origintrail.io/claim-rewards). Registration includes submitting KYC data for the purposes of reward distribution and qualitative feedback on the usage of the staking interface. + +The total reward amount will be distributed pro rata among eligible participants according to the size of the stake they contributed during that period. + +Rewards will be distributed no later than January 27, 17:00 CET. 
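The pro-rata distribution described above reduces to `share = pool * your_stake / total_eligible_stake`. A quick sketch with invented figures (two hypothetical participants; stakes expressed in thousands of TRAC to keep the arithmetic within plain shell integer range):

```sh
# Illustrative pro-rata split of a 100,000 TRAC bounty pool between two
# hypothetical participants. Stake figures are invented for the example;
# real shares depend on all eligible stakes during the bounty window.
POOL=100000
STAKE_A=50    # 50k TRAC staked (in thousands)
STAKE_B=150   # 150k TRAC staked (in thousands)
TOTAL=$(( STAKE_A + STAKE_B ))
SHARE_A=$(( POOL * STAKE_A / TOTAL ))
SHARE_B=$(( POOL * STAKE_B / TOTAL ))
echo "A=$SHARE_A B=$SHARE_B"
```

With these numbers, participant A (a quarter of the eligible stake) would receive a quarter of the pool.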
+ diff --git a/docs/contribute-to-the-dkg/contribute/README.md b/docs/contribute-to-the-dkg/contribute/README.md new file mode 100644 index 0000000..f715b03 --- /dev/null +++ b/docs/contribute-to-the-dkg/contribute/README.md @@ -0,0 +1,84 @@ +--- +description: How to contribute to OriginTrail code repositories +--- + +# Contribution guidelines + +OriginTrail is an ecosystem dedicated to **open-source software.** It is based on the principles of **neutrality, inclusiveness, and usability**. + +All project repositories, such as OT node, DKG clients, Houston, etc., are available on our official [GitHub](https://github.com/OriginTrail). + +If you are new to OriginTrail development, there are guides in this documentation for getting your development environment set up. + +Please follow the procedure below to contribute new code or fixes: + +* Create a separate branch off the relevant base branch (we generally follow [Gitflow](https://www.atlassian.com/git/tutorials/comparing-workflows/gitflow-workflow)) +* Create a pull request to the **develop** branch (for v6 contributions, use **v6/develop**) containing a description of what your code does and how it can be tested +* Provide at least a minimal set of unit tests +* Include descriptive commit messages + +## Rules + +There are a few basic ground rules for contributors: + +1. **No `--force` pushes** or modifying the Git history in any way. +2. **Non-master branches** ought to be used for ongoing work. +3. **External API changes and significant modifications** ought to be subject to an **internal pull request** to solicit feedback from other contributors. +4. Internal pull requests to solicit feedback are _encouraged_ for any other non-trivial contribution but are left to the discretion of the contributor. +5. Contributors should attempt to adhere to the prevailing code style.
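The contribution procedure above can be sketched with git. The script below runs against a throwaway local repository so it works offline; in a real contribution, `origin` would be your GitHub fork, and the branch name, file, and commit message here are purely illustrative:

```sh
# Sketch of the branch-and-PR contribution flow, against a throwaway
# local repo. In practice, the final step is pushing the branch and
# opening a pull request to develop (or v6/develop for v6 work).
set -e
tmp=$(mktemp -d) && cd "$tmp"
git init -q repo && cd repo
git checkout -q -b develop
git -c user.name=dev -c user.email=dev@example.com \
    commit -q --allow-empty -m "baseline"

# Branch off the relevant base branch for your change:
git checkout -q -b feature/example-fix    # hypothetical branch name
echo "fix" > fix.txt && git add fix.txt
git -c user.name=dev -c user.email=dev@example.com \
    commit -q -m "Describe what the code does and how to test it"
git branch --show-current
```

From here you would `git push -u origin feature/example-fix` and open the pull request on GitHub, including your description and unit tests.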
+ +### Merging pull requests once CI is successful + +* Each pull request **must be reviewed and approved by at least two OriginTrail core developers** +* A pull request that does not significantly change logic and is urgently needed may be merged after a non-author OriginTrail core developer has reviewed it thoroughly. +* All other PRs should sit for 48 hours in order to garner feedback. +* No PR should be merged until all review comments are addressed. + +### Reviewing pull requests + +When reviewing a pull request, the end goal is to suggest useful changes to the author. Reviews should finish with approval unless there are issues that would result in: + +* Buggy behavior. +* Undue maintenance burden. +* Breaking with house coding style. +* Pessimization (i.e., reduction of speed as measured in the project benchmarks). +* Feature reduction (i.e., it removes some aspect of functionality that a significant minority of users rely on). +* Uselessness (i.e., it does not strictly add a feature or fix a known issue). + +### Releases + +Declaring formal releases remains the prerogative of the project maintainer(s). + +## Changes to this arrangement + +This is an experiment, and feedback is welcome! This document may also be subject to pull requests or changes by contributors where you believe you have something valuable to add or change. 
+ +## Heritage + +These contributing guidelines are compiled with inspiration from "OPEN Open Source Project" guidelines for the Level project: [https://github.com/Level/community/blob/master/CONTRIBUTING.md](https://github.com/Level/community/blob/master/CONTRIBUTING.md) and Polkadot: [https://github.com/paritytech/polkadot/blob/master/CONTRIBUTING.md](https://github.com/paritytech/polkadot/blob/master/CONTRIBUTING.md) + +## Contributor code of conduct + +As contributors and maintainers of OriginTrail, we pledge to respect everyone who contributes by posting issues, updating documentation, submitting pull requests, providing feedback in comments, and any other activities. + +Communication through any of our channels (GitHub, Discord, X, etc.) must be constructive and never resort to personal attacks, trolling, public or private harassment, insults, or other unprofessional conduct. + +We promise to extend courtesy and respect to everyone involved in this project regardless of gender, gender identity, sexual orientation, disability, age, race, ethnicity, religion, or level of experience. We expect anyone contributing to the project to do the same. + +If any member of the community violates this code of conduct, the maintainers of the OT Node project may take action, removing issues, comments, and PRs or blocking accounts as deemed appropriate. + +If you are subject to or witness unacceptable behavior or have any other concerns, please email us. + +## Questions, bugs, features + +### Got a question or problem? + +Do not open issues for general support questions. We want to keep GitHub issues for bug reports and feature requests. You’ll have much better chances of getting your question answered on [Discord](https://discord.gg/xCaY7hvNwD). + +### Found an issue or bug? + +If you find a bug in the source code, you can help us by submitting an issue to the appropriate GitHub repo. Even better, you can submit a pull request with a fix. 
We often have [bounty programs](broken-reference) as well; you might be eligible for rewards! + +### Want a doc fix? + +These docs are available publicly on this [GitHub repo](https://github.com/OriginTrail/dkg-docs). Feel free to propose updates through pull requests or open discussions through GitHub issues. diff --git a/docs/contribute-to-the-dkg/contribute/guidelines-for-automated-test-contributions.md b/docs/contribute-to-the-dkg/contribute/guidelines-for-automated-test-contributions.md new file mode 100644 index 0000000..4faec86 --- /dev/null +++ b/docs/contribute-to-the-dkg/contribute/guidelines-for-automated-test-contributions.md @@ -0,0 +1,32 @@ +--- +description: Contribute to DKG by growing the automated test suite for the ot-node! +--- + +# Guidelines + +### Guidelines for automated test contributions + +Welcome to the automated test contribution guidelines! By contributing to the ot-node automated test suite, you help further develop the OriginTrail technology and increase the robustness of the DKG implementation. Detailed instructions on how you can participate are described below. However, if you need any assistance, feel free to [jump into Discord](https://discord.gg/xCaY7hvNwD) and ask for help from the community or core developers there. + +{% hint style="info" %} +This guide applies to the DKG node GitHub repo, particularly for the v8 branches (v8/develop being the default branch). If you are creating a contribution, branch off from v8/develop. +{% endhint %} + +We use [**Cucumber.js**](https://cucumber.io/) for running automated tests. This framework uses the Gherkin language, which allows expected software behaviors to be specified in a logical language that can be easily understood. + +For a detailed guide on how to write BDD scenarios, visit the [Cucumber docs](https://cucumber.io/docs/gherkin/reference/). + +Tests need to cover errors listed on the [GitHub discussions](https://github.com/OriginTrail/ot-node/discussions/2095) page.
Different errors are thrown in different commands, all of which are executed by the **command executor**. To learn what the command executor is and how it works, please visit [Command Executor](../../dkg-knowledge-hub/useful-resources/ot-node-engine-implementation-details/command-executor.md). + +## Steps to contribute BDD tests + +1. Check out [GitHub discussions](https://github.com/OriginTrail/ot-node/discussions/2095) to see current test requirements, and review the existing steps available in `./ot-node/test/bdd/steps`. +2. Create a feature file where you'll define new scenarios. Features should be located in the **test/bdd/features** directory with the appropriate file name format: _**\<operation\>**_**-errors.feature** (e.g., _publish-errors.feature, get-errors.feature_). +3. When you're done with the scenarios that you've created, following the [Contribution guidelines](./), create a **draft pull request (PR)** and tag at least two core developers (and as many other community members as you wish). The team members will review the feature files and either approve or request changes. Once you get a confirmation that the feature files are satisfactory, you can continue with the corresponding step definitions. This helps provide feedback on the contribution early on. +4. Put step definitions in the `./ot-node/test/bdd/steps` directory. +5. Implement the steps and ensure that both old tests and the new ones you just created are passing locally (use the `npm run test:bdd` command). Node logs can be found in **./ot-node/test/bdd/log/**_**\<scenario\>**_**/origintrail-test-**_**\<node-id\>**_**.log** (e.g., _test/bdd/log/Publishing-a-valid-assertion/origintrail-test-0.log_). +6. When you're satisfied with the scenarios and their step definitions, you may mark your draft PR as "Ready for review". +7. There will likely be feedback on your PR before it gets approved, so make sure to follow the status of your PR contribution. After PR approval, your changes will be merged. +8. Congratulations! 
You've just made ot-node more robust :tada: + diff --git a/docs/contribute-to-the-dkg/delegated-staking/README.md b/docs/contribute-to-the-dkg/delegated-staking/README.md new file mode 100644 index 0000000..87ebe36 --- /dev/null +++ b/docs/contribute-to-the-dkg/delegated-staking/README.md @@ -0,0 +1,123 @@ +--- +description: >- + Discover how TRAC holders can strengthen the DKG by delegating stake to Core + Nodes — earning rewards, supporting network security, and keeping full control + of their tokens. +--- + +# Delegated staking + +As a decentralized system, the OriginTrail Decentralized Knowledge Graph (DKG) enables all ecosystem stakeholders owning TRAC to contribute their economic stake to the functioning of the network for utility. Delegated staking involves locking up TRAC for contributing to the DKG security on selected DKG Core Nodes. The DKG Core Node rewards are shared between the TRAC stake delegators. + +
+ +## TRAC delegated staking mechanics + +For a DKG Core Node to be eligible to host a portion of the DKG and receive TRAC network rewards, its TRAC stake plays a crucial role. Set at a minimum of 50,000 TRAC on a particular blockchain, the stake has an important role in ensuring the security of the DKG. The DKG Core Node operators can contribute to the node stake on their own or by attracting more TRAC to their stake through delegated staking. + +There are 2 roles involved in delegated staking: **Core Node operators** and **TRAC delegators.** + +**Core Node operators** are network participants who choose to host and maintain core network nodes (specialized DKG software running servers). Core Nodes store, validate, and make knowledge available to AI systems. They receive $TRAC rewards for this service. All Core Nodes together form a permissionless market of DKG services, competing for their share of network TRAC rewards. + +**Delegators** lock up their TRAC to contribute to the DKG security on selected DKG Core Nodes and increase their chance of capturing TRAC network rewards. The rewards that the DKG Core Node captures are then shared between the TRAC stake delegators. The delegated tokens are locked in a smart contract and are never accessible to the Core Node operators. + +Note that Core Node operators and node delegators are not distinct - you can be both at the same time. + +{% hint style="info" %} +Contrary to inflationary systems, TRAC staking is strictly utility-based, and rewards are generated through DKG usage via knowledge publishing fees. +{% endhint %} + +*** + +## How do delegators earn TRAC fees? + +As knowledge publishers create Knowledge Assets on the DKG, they lock an appropriate amount of TRAC tokens in the DKG smart contracts. The TRAC amount offered has to be high enough to ensure that enough DKG Core Nodes will store it for a specific amount of time. 
The nodes then commit to storing the Knowledge Assets for a specific amount of time, measured in **30-day periods called epochs**. + +At the end of each epoch, DKG nodes "prove" that they are providing DKG services to the DKG smart contracts, which in turn unlocks TRAC rewards initially locked by the knowledge publisher. + +Many Core Nodes can compete for the same TRAC reward on the basis of their total stake, node ask, and publishing factor. Node rewards are a function of four parameters, in order of importance: + +1. **Node uptime & availability,** in positive correlation, as nodes need to prove their commitment to hosting the DKG by submitting proofs to the blockchain (through the new V8 random sampling proof system); +2. **TRAC stake security factor,** in positive correlation - the more stake a node attracts, the higher the security guarantees and, therefore, the higher the chance of rewards; +3. **Publishing factor,** in positive correlation - the more new knowledge has been published via a specific Core Node (measured in TRAC tokens), the higher the chance of rewards; +4. **Node ask,** in negative correlation - nodes with asks lower than the current network fee positively impact system scalability and, therefore, have a higher chance of rewards. + +More details are presented in [OT-RFC-21](https://github.com/OriginTrail/OT-RFC-repository/blob/main/RFCs/OT-RFC-21_Collective_Neuro-Symbolic_AI/OT-RFC-21%20Collective%20Neuro-Symbolic%20AI.pdf). + +Once claimed, rewards are **automatically restaked, increasing the Core Node's overall stake by the amount of collected rewards.** + +To introduce a level of predictability to network operations, withdrawing tokens is subject to an unbonding period of 28 days. + +{% hint style="warning" %} +If you want to withdraw tokens in order to delegate to another node on the same network (blockchain), you **do not** have to wait 28 days! 
[See here >](redelegating-stake.md) +{% endhint %} + +{% hint style="success" %} +Delegated staking is a non-custodial system, so the Core Node operator has no access to the locked TRAC tokens at any time. +{% endhint %} + + + +Each Core Node operator can also set an “**operator fee,**” which is a percentage of the TRAC rewards deducted each time a node claims rewards from a Knowledge Asset. The remaining TRAC fee is then split proportionally to the share of staked tokens across all delegators. + +{% hint style="info" %} +**Example**: If a node accumulated **1,000 TRAC** tokens in the previous period, and the node has two delegators, both with a 50% share, and the operator\_fee is 10%: + +* the node operator will receive 100 TRAC (10%) +* each delegator receives 450 TRAC (50% of the remaining 900 TRAC) +{% endhint %} + +*** + +## What makes a good node? How should I pick a node to delegate to? + +Nodes compete to provide the best service in the network — the better the nodes perform, the more TRAC rewards they attract. But how do you know which nodes perform the best? + +### Node Power + +
+ +**Node Power** is a metric that gives delegators a simplified view of a node’s overall strength in the network. It combines: + +* The amount of TRAC staked on the node +* How much new knowledge has the node published +* The node's service ask (lower ask = higher competitiveness) + +This score shows how competitive the node is in attracting publishing rewards, and how its influence compares to other nodes in the network. + +### Node Health + +
+ +**Node Health** indicates how reliably a node has performed in the random sampling proof system. It reflects: + +* How many proof challenges did the node successfully respond to +* Compared to the number of challenges it was expected to respond to in that epoch + +High node health indicates the node has strong uptime and actively maintains the availability of Knowledge Assets—both critical for earning consistent rewards. + +*** + +### Operator fee + +Each node may **charge an operator fee** (e.g., 10%) on rewards earned. A lower fee means a higher share of rewards for delegators, but also consider the performance indicators (Node Power and Health) when making your decision. + +
+ +*** + +## Delegating if you run a Core Node + +If you are running a DKG Core Node, you can delegate TRAC tokens to your node in the same way as others. It is recommended that you also delegate TRAC tokens, signaling your commitment to the network via economic stake - this provides a trust signal to other delegators. + +To understand how to set up your operator fee, follow the [Core Node setup](../../graveyard/everything/dkg-core-node/) instructions. Note that changing your operator fee incurs a 28-day delay, balancing the 28-day delay that delegators experience when withdrawing stake from your node. + +

Depiction of delegating and withdrawing of TRAC from DKG smart contracts

+ +*** + +## **Have questions?** + +Drop by our [Discord](https://discord.com/invite/xCaY7hvNwD) or [Telegram group](https://t.me/origintrail), and feel free to ask your questions there. Make sure to follow our official announcements, and stay safe! + +Happy staking! 🚀 diff --git a/docs/contribute-to-the-dkg/delegated-staking/redelegating-stake.md b/docs/contribute-to-the-dkg/delegated-staking/redelegating-stake.md new file mode 100644 index 0000000..67a5a03 --- /dev/null +++ b/docs/contribute-to-the-dkg/delegated-staking/redelegating-stake.md @@ -0,0 +1,52 @@ +--- +description: Moving your TRAC stake from one node to another +--- + +# Redelegating stake + +If you want to **move your delegated TRAC stake from one DKG node to another**, you can use the **redelegate** feature instead of withdrawing and then delegating again. With redelegation, the amount of TRAC stake you are "redelegating" will be transferred from the original DKG node to the new DKG node of your choice, avoiding the 28-day delay that would otherwise take place if you were to withdraw tokens first. + +*** + +## Keep in mind + +* The DKG is multichain. However, **TRAC tokens can only be redelegated between nodes on the same blockchain**. +* The amount of stake (TRAC) that you want to redelegate **should not exceed the second node's remaining capacity** (a node can have a maximum of 2,000,000 TRAC stake delegated to it). + +*** + +## How can you redelegate TRAC? + +1. Click on the '**Connect Wallet**' button in the top right corner of the navigation bar and follow the prompts to connect your wallet to the interface. +2. Go to the **'My delegation**' tab to see available nodes that you can redelegate from. +3. Optionally, use the **'Filter by blockchain'** dropdown to select the desired blockchain, which will filter and display nodes on that network along with their staking information. +4. 
Once you've decided which node you want to redelegate your TRAC from, click on the **'Manage stake'** button next to the desired node on the right side of the table. Make sure you read the disclaimer. +5. When the staking pop-up opens, you'll have the option to **Delegate, Redelegate,** or **Withdraw** TRAC tokens from the node. Proceed by selecting '**Redelegate**'. + +

Use the redelegate button in the popup to redelegate your stake

+ + + +6. After clicking on 'Redelegate' a field to enter the amount of TRAC you wish to redelegate to another node will appear on the right side of the pop-up as well as the select-box for selecting the other node - the one that will receive the TRAC. **Enter the amount of TRAC you want redelegated and select the node you want to redelegate to.** + +
+ +{% hint style="warning" %} +You can stake your TRAC only to nodes that have less than 2,000,000 TRAC stake delegated to them. +{% endhint %} + +{% hint style="info" %} +Only the nodes from the same network with the remaining capacity greater than zero will be shown in the 'Choose available node' select box. +{% endhint %} + +7. **The redelegation process will require two transactions:** one to increase the allowance and another to confirm the redelegation contract interaction. Please be patient, as this can take some time. + +
+ +8. Once both transactions are signed and confirmed, you should see a **'Stake redelegated successfully'** message appear. +9. To confirm that the process was successful, **check your TRAC delegation** by going to the 'My delegations' tab above the table with the nodes and verifying that your delegations are listed there. Additionally, ensure that the stake amount on the node has decreased and the amount on the other node has increased following the successful redelegation. + +{% hint style="info" %} +If you encounter any issues during the staking process or require assistance, please get in touch with the OriginTrail community on [Discord](https://discord.gg/xCaY7hvNwD). +{% endhint %} diff --git a/docs/contribute-to-the-dkg/delegated-staking/step-by-step-staking.md b/docs/contribute-to-the-dkg/delegated-staking/step-by-step-staking.md new file mode 100644 index 0000000..8b9b752 --- /dev/null +++ b/docs/contribute-to-the-dkg/delegated-staking/step-by-step-staking.md @@ -0,0 +1,86 @@ +--- +description: If you are new to TRAC delegated staking, this guide is for you! +--- + +# Step-by-step staking + +Welcome to the step-by-step TRAC delegated staking guide! First, let's start with some prerequisites. + +## Prerequisites + +1. You need to have some TRAC tokens to delegate. See the ['How to get on TRAC(k)?' section of this website >](https://origintrail.io/get-started/trac-token) +2. You need to decide which blockchain you want to stake on. The DKG supports multiple blockchains: + * [Base Blockchain](../../dkg-knowledge-hub/learn-more/connected-blockchains/base-blockchain/) + * [NeuroWeb](../../dkg-knowledge-hub/learn-more/connected-blockchains/neuroweb.md) + * [Gnosis Chain](../../dkg-knowledge-hub/learn-more/connected-blockchains/gnosis-chain/) +3. Bridge your TRAC to the chosen blockchain. 
See instructions for bridging: + * [Base Blockchain](../../dkg-knowledge-hub/learn-more/connected-blockchains/base-blockchain/) + * [NeuroWeb](../../graveyard/everything/teleport-instructions-neuroweb.md) + * [Gnosis Chain](../../dkg-knowledge-hub/learn-more/connected-blockchains/gnosis-chain/) +4. Have some gas fee tokens available on the chosen network: + * Base Mainnet: ETH on Base + * NeuroWeb: NEURO + * Gnosis Chain: xDAI + +{% hint style="warning" %} +_If you are staking on NeuroWeb, please make sure that you update both **"Max base fee"** and **"Priority fee"** to **0.00000001** before signing transactions._ +{% endhint %} + +*** + +## **TRAC staking using the Staking Dashboard** + +{% hint style="info" %} +For the purposes of this tutorial, we use the MetaMask wallet extension. +{% endhint %} + +Once you have confirmed that you have both gas tokens and TRAC tokens available in your wallet, you can proceed to the Staking Dashboard at [https://staking.origintrail.io/](https://staking.origintrail.io/) and follow the steps below: + +### **Step 1:** + +Click on the **'Connect wallet'** button in the top right corner of the navigation bar and follow the prompts to connect your wallet to the interface. + +
+ +### **Step 2:** + +Make sure you have selected the right blockchain in your wallet. + +
+ +### **Step 3:** + +The Staking Dashboard shows a list of all the Core Nodes hosting the DKG. The table shows information such as: + +* The node name, +* Which blockchain it's connected to, +* How much stake a node has, +* The node's ask, +* The node's operator fee, +* Reward statistics, and more. + +**To delegate your TRAC tokens, you need to pick one or more nodes you believe are going to perform best for the network** (on the basis of criteria explained [here](./)). The chosen node has to have **enough "room" to take your TRAC,** meaning less than 2M TRAC already staked. 2M is the maximum amount of TRAC staked per node. + +
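The capacity rule above can be sketched in a few lines of Python (illustrative only: the dashboard computes this for you, and the node names and stake figures below are made up):

```python
# Illustrative sketch of the delegation capacity rule (2M TRAC cap per node).
# The Staking Dashboard computes this for you; node names/stakes here are made up.
MAX_STAKE_PER_NODE = 2_000_000  # TRAC

def remaining_capacity(current_stake):
    """TRAC that can still be delegated before a node hits the 2M cap."""
    return max(0, MAX_STAKE_PER_NODE - current_stake)

# Hypothetical nodes and their current delegated stake
nodes = {"node-a": 1_950_000, "node-b": 2_000_000, "node-c": 700_000}
for name, stake in nodes.items():
    room = remaining_capacity(stake)
    label = "full" if room == 0 else f"{room:,} TRAC of room"
    print(name, "->", label)
```

A node with zero remaining capacity will not appear as an option for new delegations until stake is withdrawn from it.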
+ +### **Step 4:** + +Once you click on a Core Node, a staking pop-up opens with the option to delegate or withdraw TRAC tokens from the node. Proceed by pressing the **'Delegate'** button. + +

Delegating popup

+ +### **Step 5:** + +Enter the amount of TRAC you would like to delegate and press the **'Delegate TRAC'** button. The delegation process will require two transactions: one to increase the allowance and another to confirm the contract interaction. + +
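The reason two transactions are needed is the standard ERC-20 allowance pattern: a contract can only pull tokens that you have explicitly approved first. The pattern can be sketched with a toy model (not the actual OriginTrail contracts; names and amounts are illustrative):

```python
# Toy model of the ERC-20 allowance pattern behind the two delegation transactions.
# Not the actual OriginTrail contracts; names and amounts are illustrative.

class ToyToken:
    def __init__(self, balances):
        self.balances = dict(balances)
        self.allowance = {}  # (owner, spender) -> approved amount

    def approve(self, owner, spender, amount):
        # Transaction 1: the delegator grants the staking contract an allowance.
        self.allowance[(owner, spender)] = amount

    def transfer_from(self, spender, owner, to, amount):
        # Transaction 2: the staking contract pulls the approved tokens.
        assert self.allowance.get((owner, spender), 0) >= amount, "allowance too low"
        assert self.balances.get(owner, 0) >= amount, "balance too low"
        self.allowance[(owner, spender)] -= amount
        self.balances[owner] -= amount
        self.balances[to] = self.balances.get(to, 0) + amount

trac = ToyToken({"delegator": 1_000})
trac.approve("delegator", "staking-contract", 500)                            # tx 1
trac.transfer_from("staking-contract", "delegator", "staking-contract", 500)  # tx 2
print(trac.balances)  # {'delegator': 500, 'staking-contract': 500}
```

This is why your wallet prompts you twice: the second transaction fails unless the first (allowance) transaction has been confirmed.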
+ +### **Step 6:** + +To confirm that the process was successful, check your TRAC delegation by going to the **'My delegations'** tab above the table with the nodes and verify that your delegation is listed there. Additionally, ensure that the stake amount on the node has increased following the successful delegation. + +
+ +{% hint style="info" %} +If you encounter any issues during the staking process or require assistance, please contact our technical support team by sending an email to **tech@origin-trail.com**. +{% endhint %} diff --git a/docs/contribute-to-the-dkg/ecosystem-call-for-papers/README.md b/docs/contribute-to-the-dkg/ecosystem-call-for-papers/README.md new file mode 100644 index 0000000..013699f --- /dev/null +++ b/docs/contribute-to-the-dkg/ecosystem-call-for-papers/README.md @@ -0,0 +1,81 @@ +--- +hidden: true +--- + +# Ecosystem call for papers + +**Contribute to building the trusted knowledge foundation for AI and humanity.** + +OriginTrail is inviting researchers and builders to submit proposals and research papers that advance the frontier of decentralized knowledge technologies. The initiative aims to support the development of systems that make artificial intelligence (AI) more explainable, trustworthy, and beneficial to all. + +This call will formally launch soon. Until then, we invite interested contributors to join the conversation and prepare for submission. + +### Why this matters + +The world is entering a critical phase of digital evolution, where AI systems increasingly shape decisions in science, healthcare, finance, and governance. Yet, the trustworthiness, provenance, and interpretability of these systems remain significant challenges. + +This Call for Papers is part of a broader effort to address these challenges through: + +* Advancing verifiable and explainable AI systems +* Enabling privacy-preserving, decentralized knowledge agents +* Empowering communities to co-create and govern public knowledge infrastructures + +All submissions will be published on the OriginTrail Decentralized Knowledge Graph (DKG), a public infrastructure for trusted knowledge. Discussions will be supported through community calls and asynchronous collaboration tools. 
+ +### What we're looking for + +We invite papers and project proposals that explore and advance any of the following (non-exhaustive) areas: + +* Privacy-preserving agents using decentralized knowledge graphs (on devices or edge nodes) +* Decentralized data monetization and knowledge marketplaces +* Assessment and signaling of truthfulness using trusted knowledge +* Discoverability and retrieval of knowledge in distributed networks +* Privacy-preserving graph queries and vector search +* Autonomous research agents and decentralized scientific workflows +* Federated learning with decentralized trust +* Applications of graph machine learning and neuro-symbolic reasoning +* Multichain and cross-domain knowledge verification + +We encourage submissions that demonstrate both rigorous research and practical experimentation—especially those that aim to publish, interact with, or expand knowledge on the DKG. + +### Who should apply + +This call is open to individuals and teams with expertise in AI, semantic technologies, graph data systems, distributed computing, and related domains. Both academic and industry participants are welcome. + +Ideal applicants will have prior experience in: + +* Designing or working with knowledge graph systems +* Publishing technical or scientific work +* Building experimental or deployed AI systems + +### Funding and recognition + +The top 10 submissions will receive support to further develop and deploy their work, including: + +* A total prize pool of 100,000 TRAC tokens +* Publication of all accepted papers on the DKG, making them accessible, immutable, and referenceable + +Proposals that intend to conduct live experiments, publish structured knowledge, or extend the DKG’s capabilities will be given priority in evaluation.
+ +### Submission and tools + +A new set of tools and infrastructure will accompany the call, including: + +* A DKG Research Agent that allows interaction with published papers +* Edge Nodes capable of reading and publishing papers in PDF or Markdown +* A community-driven technical fellowship to support and evaluate proposals + +Detailed submission guidelines will be announced shortly. + +### Stay involved + +Join the OriginTrail community to exchange ideas, get early access to submission details, and help shape the future of decentralized knowledge systems. + +* Join the conversation: [OriginTrail Discord](https://discord.gg/origintrail) +* Follow updates: [origintrail.io](https://origintrail.io/) + +The formal call will go live soon—be part of this global effort to build trusted, open, and intelligent systems for the benefit of all. diff --git a/docs/contribute-to-the-dkg/ecosystem-call-for-papers/origintrail-ecosystem-call-for-papers-coming-soon.md b/docs/contribute-to-the-dkg/ecosystem-call-for-papers/origintrail-ecosystem-call-for-papers-coming-soon.md new file mode 100644 index 0000000..21b02e9 --- /dev/null +++ b/docs/contribute-to-the-dkg/ecosystem-call-for-papers/origintrail-ecosystem-call-for-papers-coming-soon.md @@ -0,0 +1,77 @@ +--- +description: Contribute to building the trusted knowledge foundation for AI and humanity. +hidden: true +--- + +# OriginTrail Ecosystem — Call for Papers (Coming Soon) + +OriginTrail is inviting researchers and builders to submit proposals and research papers that advance the frontier of decentralized knowledge technologies. The initiative aims to support the development of systems that make AI more explainable, trustworthy, and beneficial to all. + +This call will formally launch soon. Until then, we invite interested contributors to join the conversation and prepare for submission.
+ +## Why This Matters + +The world is entering a critical phase of digital evolution, where artificial intelligence systems increasingly shape decisions in science, healthcare, finance, and governance. Yet, the trustworthiness, provenance, and interpretability of these systems remain significant challenges. + +This Call for Papers is part of a broader effort to address these challenges through: + +* Advancing **verifiable and explainable AI** systems +* Enabling **privacy-preserving, decentralized knowledge agents** +* Empowering communities to **co-create and govern public knowledge infrastructures** + +All submissions will be published on the OriginTrail Decentralized Knowledge Graph (DKG), a public infrastructure for trusted knowledge. Discussions will be supported through community calls and asynchronous collaboration tools. + +## What We're Looking For + +We invite papers and project proposals that explore and advance any of the following (non-exhaustive) areas: + +* Privacy-preserving agents using decentralized knowledge graphs (on devices or edge nodes) +* Decentralized data monetization and knowledge marketplaces +* Assessment and signaling of truthfulness using trusted knowledge +* Discoverability and retrieval of knowledge in distributed networks +* Privacy-preserving graph queries and vector search +* Autonomous research agents and decentralized scientific workflows +* Federated learning with decentralized trust +* Applications of graph machine learning and neuro-symbolic reasoning +* Multichain and cross-domain knowledge verification + +We encourage submissions that demonstrate both rigorous research and practical experimentation—especially those that aim to publish, interact with, or expand knowledge on the DKG. + +## Who Should Apply + +This call is open to individuals and teams with expertise in artificial intelligence, semantic technologies, graph data systems, distributed computing, and related domains.
Both academic and industry participants are welcome. + +Ideal applicants will have prior experience in: + +* Designing or working with knowledge graph systems +* Publishing technical or scientific work +* Building experimental or deployed AI systems + +## Funding and Recognition + +The top 10 submissions will receive support to further develop and deploy their work, including: + +* A total prize pool of **100,000 TRAC tokens** +* Publication of all accepted papers **on the DKG**, making them accessible, immutable, and referenceable + +Proposals that intend to conduct live experiments, publish structured knowledge, or extend the DKG’s capabilities will be given priority in evaluation. + +## Submission and Tools + +A new set of tools and infrastructure will accompany the call, including: + +* A DKG Research Agent that allows interaction with published papers +* Edge Nodes capable of reading and publishing papers in PDF or Markdown +* A community-driven technical fellowship to support and evaluate proposals + +Detailed submission guidelines will be announced shortly. + +## Stay Involved + +Join the OriginTrail community to exchange ideas, get early access to submission details, and help shape the future of decentralized knowledge systems. + +* **Join the conversation:** [OriginTrail Discord](https://discord.gg/origintrail) +* **Follow updates:** [origintrail.io](https://origintrail.io/) + +The formal call will go live soon—be part of this global effort to build trusted, open, and intelligent systems for the benefit of all.
diff --git a/docs/contribute-to-the-dkg/whitepapers-and-rfcs/README.md b/docs/contribute-to-the-dkg/whitepapers-and-rfcs/README.md new file mode 100644 index 0000000..840f754 --- /dev/null +++ b/docs/contribute-to-the-dkg/whitepapers-and-rfcs/README.md @@ -0,0 +1,2 @@ +# Whitepapers & RFCs + diff --git a/docs/contribute-to-the-dkg/whitepapers-and-rfcs/origintrail-rfcs.md b/docs/contribute-to-the-dkg/whitepapers-and-rfcs/origintrail-rfcs.md new file mode 100644 index 0000000..9ddd35e --- /dev/null +++ b/docs/contribute-to-the-dkg/whitepapers-and-rfcs/origintrail-rfcs.md @@ -0,0 +1,30 @@ +--- +icon: notes +--- + +# RFCs + +OriginTrail development is transparent and guided by public debate through the Trace Alliance working groups, as well as by technical discussions within the community through **Requests for Comments (OT-RFCs)**. + +The RFCs were formerly part of the official documentation and moved to [this repository](https://github.com/OriginTrail/OT-RFC-repository), starting with OT-RFC-07, on October 12, 2020. + +## The RFC lifecycle + +There are three stages for an OT-RFC, easily visible in [the RFC Repository Kanban](https://github.com/OriginTrail/OT-RFC-repository/projects/1): + +* **DRAFT stage:** The RFC holds DRAFT status while it is being written and discussed. Once the discussions are completed, the RFC moves to the ACCEPTED stage. +* **ACCEPTED stage:** Accepted RFCs are to be implemented by the OriginTrail developers, serving as technical specifications. +* **FINAL stage:** Once RFCs have been implemented and merged into the codebase, they are considered FINAL. + +### The RFC rules & best practices + +* RFCs are moved to the “Accepted” stage by the core developers after discussions with the relevant Trace Alliance working group task forces and the community.
+ +* A **“last call”** can be issued for an RFC that is close to moving out of the draft stage, notifying the community to provide final feedback (about a week before acceptance). +* Moving an RFC to the “Final” stage is performed by the developers to indicate the specific client release that has implemented the RFC. +* All RFC-related discussions should be directed to the dedicated [RFC issues](https://github.com/OriginTrail/OT-RFC-repository/issues) page to promote a constructive, structured, and long-term debate on the RFC, avoiding less structured channels such as chat rooms and social media. + +The RFC GitHub repository can be found [here](https://github.com/OriginTrail/OT-RFC-repository). + +### Can I contribute to RFCs? + +Absolutely! You can propose your own RFCs or comment and provide feedback on the existing ones in the pipeline. The OriginTrail community welcomes input and will greatly appreciate any improvement proposals! diff --git a/docs/contribute-to-the-dkg/whitepapers-and-rfcs/origintrail-whitepaper.md b/docs/contribute-to-the-dkg/whitepapers-and-rfcs/origintrail-whitepaper.md new file mode 100644 index 0000000..4ede10e --- /dev/null +++ b/docs/contribute-to-the-dkg/whitepapers-and-rfcs/origintrail-whitepaper.md @@ -0,0 +1,7 @@ +--- +icon: file-lines +--- + +# Whitepaper + +Explore [the OriginTrail Whitepaper](https://origintrail.io/ecosystem/whitepaper) to learn more about the vision for the future of artificial intelligence (AI) through the concept of a verifiable Internet for AI, leveraging the synergies of crypto, internet, and AI technologies.
diff --git a/docs/deploy-your-dkg-node-to-production/from-edge-node-to-core-node.md b/docs/deploy-your-dkg-node-to-production/from-edge-node-to-core-node.md new file mode 100644 index 0000000..f52dd10 --- /dev/null +++ b/docs/deploy-your-dkg-node-to-production/from-edge-node-to-core-node.md @@ -0,0 +1,6 @@ +--- +hidden: true +--- + +# From Edge Node To Core Node + diff --git a/docs/dkg-key-concepts.md b/docs/dkg-key-concepts.md new file mode 100644 index 0000000..74a54b2 --- /dev/null +++ b/docs/dkg-key-concepts.md @@ -0,0 +1,130 @@ +--- +description: >- + The OriginTrail Decentralized Knowledge Graph (DKG) introduces novel concepts, + such as Knowledge Assets, autonomous paranets, and others. Find an overview of + key concepts below +--- + +# DKG - Key concepts + +## DKG Network & Nodes + +The OriginTrail Decentralized Knowledge Graph (DKG) is a permissionless, multi-chain infrastructure designed to host and interlink semantically rich “[Knowledge Assets](dkg-key-concepts.md#knowledge-assets)” - structured containers of machine-readable data (e.g., RDF-based graphs) that are discoverable, verifiable, and owned by their creators.\ +\ +The DKG enables AI agents and applications to query, connect, and build upon distributed knowledge while preserving provenance and trust through blockchain-anchored proof systems. + +The DKG Network is composed of network nodes running on different servers and devices. **There are two primary node types that enable the network’s operation**.
The first is **the DKG Core Node**, which hosts the public DKG, persistently stores and serves Knowledge Assets, participates in random-sampling proofs and token incentives, and requires a minimum stake (e.g., 50,000 TRAC) to operate.\ +\ +The second is the **DKG Edge Node**, which runs on devices at the “edge” (e.g., laptops, phones, IoT devices, and even servers, if deployed that way) and enables local knowledge processing, private-graph handling, and integration with AI pipelines (via APIs like dRAG), allowing owners to retain control of their data while still contributing to the global DKG. + +Together, Core and Edge Nodes form the network and exchange knowledge, facilitated by the blockchain. They share the same codebase, however, so **it is possible to turn a DKG Edge Node into a DKG Core Node (more on that later in the docs)**. + +## Knowledge Assets + +A Knowledge Asset is **an ownable container of knowledge in the DKG** with verifiable provenance and source. It can describe any digital or physical object, abstract concept, or really any "thing." It can easily connect to other Knowledge Assets in the OriginTrail DKG, enabling the building of sophisticated graph representations of the world (a.k.a. the World model). + +More precisely, a Knowledge Asset is a web resource identified by a unique Uniform Asset Locator (or UAL, which is an extension of the traditional URL), consisting of: + +* **Knowledge:** In the form of graph data (RDF) and vector embeddings, stored on the DKG (not on the blockchain). +* **Cryptographic proofs:** Representing cryptographic digests of the knowledge, stored on the blockchain. +* **Uniform Asset Locator**: Globally unique URI with assigned ownership using blockchain accounts, implemented as a non-fungible token (NFT) on the blockchain. +* **Derivable vector embeddings**: These facilitate the neuro-symbolic features - such as link prediction, entity prediction, similarity search, and others. + +
+ +Knowledge content can be observed as a time series of knowledge content states, or **assertions**. Each assertion can be independently verified for integrity: the verifier recomputes the cryptographic fingerprint and checks whether the result matches the corresponding blockchain fingerprint record. + +Technically, an assertion is represented using the n-quads serialization together with a cryptographic fingerprint (the n-quads graph Merkle root, stored immutably on the blockchain) used for assertion verification. + +**Knowledge Assets** can be both **public and private.** Public assertion data is replicated on the OriginTrail Decentralized Network and publicly available, while private assertion data is contained within the private domain of the asset owner (e.g., an OriginTrail node hosted by the asset owner, such as a person or company). + +In summary, a Knowledge Asset is a combination of an NFT record and a semantic record. Using the dkg.js SDK, you can perform CRUT (create, read, update, transfer) operations on Knowledge Assets, which are explained below in further detail. + +### Knowledge Asset state finality + +Similar to distributed databases, the OriginTrail DKG applies replication mechanisms and therefore needs a way to reach a consistent state for Knowledge Assets across the network. In the OriginTrail DKG, state consistency is reconciled using the blockchain, which hosts state proofs for Knowledge Assets and replication commit information from DKG nodes. This means that updates to an existing Knowledge Asset are accepted by the network nodes (similar to the way nodes accept Knowledge Assets on creation), which can then operate with all accepted states. + +## What is a UAL? + +Uniform Asset Locators (UALs) are ownable identifiers on the DKG, similar to URLs in the traditional web. UALs follow the DID URL specification and are used to identify and locate a specific Knowledge Asset within the OriginTrail DKG.
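As an informal illustration of the UAL structure described in this section, the components can be pulled apart with simple string handling (a sketch only, not an official resolver; applications should rely on the DKG SDKs):

```python
# Informal sketch: splitting a UAL into its components.
# Not an official resolver; applications should use the DKG SDKs instead.

def parse_ual(ual):
    fragment = None
    if "#" in ual:
        ual, fragment = ual.split("#", 1)
    did, dkg, rest = ual.split(":", 2)            # "did", "dkg", chain id + path
    chain, contract, token_id = rest.split("/")   # blockchain, NFT contract, token ID
    return {"did": did, "dkg": dkg, "blockchain": chain,
            "contract": contract, "token_id": token_id, "fragment": fragment}

parts = parse_ual("did:dkg:otp:2043/0x5cac41237127f94c2d21dae0b14bfefa99880630/318322#color")
print(parts["blockchain"], parts["token_id"], parts["fragment"])  # otp:2043 318322 color
```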
+ +A UAL consists of the following parts: + +* did (decentralized identifier) predicate +* dkg (decentralized knowledge graph) predicate +* blockchain identifier (otp:2043 = OriginTrail NeuroWeb Mainnet) +* blockchain address (such as the address of the relevant asset NFT smart contract) +* identifier specific to the contract, such as the ID of the NFT token +* query and fragment components + +An example UAL may look like this: + +``` +did:dkg:otp:2043/0x5cac41237127f94c2d21dae0b14bfefa99880630/318322#color +``` + +This UAL points to a Knowledge Asset on the NeuroWeb Mainnet: `0x5cac41237127f94c2d21dae0b14bfefa99880630` is the address of the asset NFT smart contract, `318322` is the ID of the token, and the fragment `#color` refers to a property "color" inside its knowledge graph. + +More information on DID URLs can be found [here](https://www.w3.org/TR/did-core/#did-url-syntax). + +## TRAC token + +The Trace token (TRAC) is the utility token that powers the OriginTrail Decentralized Knowledge Graph (DKG). Introduced in 2018 as an ERC-20 token on Ethereum with a fixed supply of 500 million, TRAC is essential for various network operations. + +**How TRAC is used** + +* **Publishing & updating Knowledge Assets** – TRAC is required to create, update, and manage Knowledge Assets in the DKG. +* **Node incentives** – Network nodes earn TRAC by hosting data, securing the network, and ensuring knowledge integrity. +* **Staking** – Nodes stake TRAC to increase their reputation and improve their ability to participate in the network. +* **Multi-chain compatibility** – TRAC operates across multiple blockchains, including Ethereum, NeuroWeb, Base, Gnosis, and Polygon. + +## Decentralized Retrieval Augmented Generation + +Patrick Lewis coined the term Retrieval-Augmented Generation (RAG) in a [2020 paper](https://arxiv.org/pdf/2005.11401.pdf). It is a technique for enhancing the accuracy and reliability of GenAI models with facts fetched from external sources.
This allows artificial intelligence (AI) solutions to dynamically fetch relevant information before the generation process, enhancing the accuracy of responses by limiting the generation to reworking the retrieved inputs.\ +\ +**Decentralized Retrieval Augmented Generation (dRAG) advances the model by organizing external sources in a DKG, with verifiable sources made available for AI models to use.** The framework enables a hybrid AI system that brings together neural (e.g., LLMs) and symbolic (e.g., Knowledge Graph) methodologies. In contrast to a solely neural approach based on vector embedding representations, dRAG enhances it with the strength of Knowledge Graphs by introducing a basis in symbolic representations. + +dRAG is, therefore, a framework that allows AI solutions to tap into the strengths of both paradigms: + +* The powerful learning and generalization capabilities of neural networks, and +* The precise, rule-based processing of symbolic AI. + +It operates on two core components: (1) DKG paranets and (2) AI models. + +The dRAG applications framework is entirely compatible with existing techniques, tools, and RAG frameworks and supports all major data formats. + +## Knowledge mining + +**Knowledge mining** is the process of producing high-quality, blockchain-tracked knowledge for AI, pioneered by the OriginTrail ecosystem. This cyclical process leverages the key component of the OriginTrail technology - Knowledge Assets - which are ownable containers for knowledge with inherent discoverability, connectivity, and data provenance. + +Similarly to Bitcoin mining, where miners collectively provide computing resources to the network and receive incentives in coins, knowledge miners contributing useful knowledge to the OriginTrail DKG receive NEURO tokens. With knowledge mining incentives enabled across multiple blockchains, the ambition is to drive exponential growth of trusted knowledge in the OriginTrail DKG.
+ +Read more about knowledge mining in the [NeuroWeb docs](https://docs.neuroweb.ai/knowledge-mining). + +## RDF & SPARQL + +The Resource Description Framework (RDF) is a W3C-standardized model designed to represent data about physical objects and abstract concepts (resources). It is a model for expressing relations between entities using a graph format. + +RDF Schema provides mechanisms for describing related resources and their relationships. It is similar to object-oriented type systems but differs in that it describes properties in terms of the classes of resources to which they apply. RDF enables querying via the SPARQL query language. + +[Examples of schema definitions by schema.org](https://schema.org/docs/schemas.html) + +## What is an NFT? + +NFT—short for non-fungible token—is a type of blockchain token used as an implementation component of Knowledge Assets in the OriginTrail DKG. The token represents ownership of the Knowledge Asset and enables its owner to perform all standardized NFT functionality, such as transferring ownership, listing it on NFT marketplaces, and using it as a rich NFT in Web3 applications. + +If you are interested in learning more about NFTs, you can find out more [here](https://en.wikipedia.org/wiki/Non-fungible_token). + +## Autonomous AI paranets + +The next building block of the DKG is **AI para-networks** or **paranets**. + +**AI para-networks** or **paranets** are autonomously operated structures in the DKG, owned by their community as a paranet operator. In paranets, we find **assemblies of Knowledge Assets** driving use cases, with associated **paranet-specific AI services** and an **incentivization model** to reward the knowledge miners fueling their growth.
+ +**To see the DKG in action, continue to the** [**Quickstart section**](broken-reference)**.** diff --git a/docs/dkg-knowledge-hub/how-tos-and-tutorials/README.md b/docs/dkg-knowledge-hub/how-tos-and-tutorials/README.md new file mode 100644 index 0000000..4d9f0fc --- /dev/null +++ b/docs/dkg-knowledge-hub/how-tos-and-tutorials/README.md @@ -0,0 +1,13 @@ +--- +description: >- + Step-by-step guides to help you deploy, build, and interact with the DKG in + practice. +--- + +# How-tos & tutorials + +#### Pages in this section + +* [**DKG V8.1.X Update Guidebook**](https://app.gitbook.com/o/-McnF-Jcg4utndKcdeko/s/-McnEkhdd7JlySeckfHM/~/changes/408/dkg-knowledge-hub/how-tos-and-tutorials/dkg-v8.1.x-update-guidebook) – A detailed walkthrough for updating your DKG Node to the latest release. +* [**Bridging to Moonbeam**](https://app.gitbook.com/o/-McnF-Jcg4utndKcdeko/s/-McnEkhdd7JlySeckfHM/~/changes/408/dkg-knowledge-hub/how-tos-and-tutorials/bridging-to-moonbeam) – How to connect OriginTrail components to Moonbeam for multi-chain deployments. +* [**Builder tutorials**](https://app.gitbook.com/o/-McnF-Jcg4utndKcdeko/s/-McnEkhdd7JlySeckfHM/~/changes/408/dkg-knowledge-hub/how-tos-and-tutorials/tutorials) – Practical tutorials and examples for publishing data, creating Knowledge Assets, and integrating with applications. diff --git a/docs/dkg-knowledge-hub/how-tos-and-tutorials/bridging-to-moonbeam.md b/docs/dkg-knowledge-hub/how-tos-and-tutorials/bridging-to-moonbeam.md new file mode 100644 index 0000000..3c06470 --- /dev/null +++ b/docs/dkg-knowledge-hub/how-tos-and-tutorials/bridging-to-moonbeam.md @@ -0,0 +1,36 @@ +--- +hidden: true +--- + +# Bridging to Moonbeam + +### Bridge NEURO from NeuroWeb to Moonbeam + +The NEURO token can be bridged to Moonbeam parachain using XCM. This can be done using multiple interfaces like Polkadot.js, Moonbeam Dapp, etc. + +Steps for bridging NEURO to Moonbeam using Moonbeam Dapp are: + +1. 
Visit [Moonbeam Dapp - Parachain bridges page](https://apps.moonbeam.network/moonbeam/xcm). +2. Click on the 'Connect wallet' button in the top right corner: + 1. Choose Moonbeam for the network. + 2. Connect your EVM wallet that will receive NEURO tokens on Moonbeam. This can be done using MetaMask or any other wallet provided as an option in the interface. + 3. Connect your Substrate wallet that will send NEURO tokens from NeuroWeb. This can be done using the Polkadot.js extension wallet or any other wallet that is provided as an option in the interface. +3. Choose NEURO as the token. +4. Choose NeuroWeb as 'from' and Moonbeam as 'to', and click connect wallet for both chains. +5. Select the amount of tokens to bridge. +6. Observe the estimation of how many NEURO tokens you will receive after the fees are paid. +7. Click send. + +### Bridge TRAC from Ethereum to Moonbeam + +The process of bridging TRAC from Ethereum to Moonbeam can be executed in multiple ways. + +One way to bridge the tokens is over the Wormhole bridge, which can be done by following these steps: + +1. Visit the [Portal bridge interface](https://www.portalbridge.com/). +2. Choose Ethereum as the source and Moonbeam as the target blockchain. +3. Enter the TRAC token contract address on Ethereum after clicking the select token button. + 1. TRAC address: **0xaa7a9ca87d3694b5755f213b5d04094b8d0f0a6f** +4. Select the amount of tokens to bridge. +5. Select the target address. +6. Send the tokens.
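Before pasting a token address into any bridge interface, a quick local check of its shape can catch copy-paste mistakes (format check only; always verify the address itself against official OriginTrail sources):

```python
# Quick shape check for an EVM token address before using it in a bridge UI.
# Format-only: this does NOT replace verifying the address against official sources.
import re

TRAC_ON_ETHEREUM = "0xaa7a9ca87d3694b5755f213b5d04094b8d0f0a6f"  # from the steps above

def looks_like_evm_address(addr):
    """True if addr is a 0x-prefixed 20-byte (40 hex character) string."""
    return re.fullmatch(r"0x[0-9a-fA-F]{40}", addr) is not None

print(looks_like_evm_address(TRAC_ON_ETHEREUM))        # True
print(looks_like_evm_address("0xaa7a9ca87d3694b575"))  # False (truncated)
```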
diff --git a/docs/dkg-knowledge-hub/how-tos-and-tutorials/dkg-v8.1.x-update-guidebook.md b/docs/dkg-knowledge-hub/how-tos-and-tutorials/dkg-v8.1.x-update-guidebook.md new file mode 100644 index 0000000..0347044 --- /dev/null +++ b/docs/dkg-knowledge-hub/how-tos-and-tutorials/dkg-v8.1.x-update-guidebook.md @@ -0,0 +1,190 @@ +--- +icon: circle-chevron-right +cover: ../../.gitbook/assets/DKG V8.1 (1).png +coverY: 0 +--- + +# DKG V8.1.X update guidebook + +The DKG V8.1.X release series marks a major step forward for the OriginTrail ecosystem, **unlocking TRAC staking rewards and laying the groundwork for the next phase of the OriginTrail roadmap: the Metcalfe Convergence**. With the introduction of the [Random Sampling proof system](../learn-more/introduction/random-sampling-dkg-proof-system/), TRAC delegators will begin sharing network rewards based on this new, highly scalable “Proof of Knowledge” system, while DKG Core Node runners benefit from new automation and network resilience features. + +Alongside reward activation, V8.1.X releases introduce user-facing improvements such as **Node Health** and **Node Power,** two new metrics that make it easier to understand how DKG nodes are performing and earning rewards. Together, these upgrades strengthen the DKG’s role as the trusted knowledge layer for AI and set the stage for further growth of the DKG in the Metcalfe Convergence phase of the roadmap. + +## V8.1.X launch mechanics & timeline + +* **JUNE 23, 2025: V8.1.0 MAINNET LAUNCH SEQUENCE INITIATED**\ + Staking features suspended from 12:00 CET for up to 72 hours. During this period, new V8.1.0 contracts are deployed and tested.
+* **JUNE 24, 2025: DKG CORE NODE V8.1.0 RELEASE**\ + Core Node runners start updating nodes +* **ETA JUNE 26, 2025: V8.1.0 MAINNET RELEASE COMPLETE**\ + Staking Dashboard back online +* **ETA JULY '25: V8.1.1 RELEASE**\ + Nodes start collecting V6 rewards +* **ETA JULY '25: V8.1.2 RELEASE:**\ + Rewards for the V8 Tuning period become collectible +* **AUGUST '25: TRANSITION TO METCALFE CONVERGENCE PHASE** + +## Instructions for DKG Core Node runners + +{% hint style="success" %} +In a nutshell: + +* Auto-update features should take care of your Core Node update (if you have it turned on). However, it is highly recommended that you review your node activity and confirm that it has successfully updated to V8.1.0 +* Core Nodes get new features; familiarize yourself with them below +{% endhint %} + +### How to update to v8.1.0 + +If your node **auto-updater is enabled**, the **node will automatically update to version v8.1.0** without requiring any manual effort on your part. + +{% hint style="info" %} +Keeping the autoupdater enabled ensures your node always runs the latest stable release, including important security patches, bug fixes, and performance improvements. Staying up to date helps maintain optimal network participation. +{% endhint %} + +While the update process itself is seamless, once your node is running v8.1.0, there are a few important features and settings you should review and get familiar with. + +### DKG Random Sampling proof system + +With the release of V8.1.0, DKG Core Nodes will automatically begin participating in the [Random Sampling Proof-of-Knowledge (PoK) system](../learn-more/introduction/random-sampling-dkg-proof-system/). **No manual setup is required.** + +Random Sampling is a decentralized mechanism that continuously verifies whether nodes store specific Knowledge Assets. Smart contracts issue randomized challenges to nodes, requiring them to cryptographically prove they hold certain data. 
This system is transparent, fair, and entirely trustless. Learn more in the [Random Sampling system](https://docs.origintrail.io/random-sampling-dkg-proof-system) documentation. + +Random Sampling has three key objectives: + +* Ensure long-term data availability across the network +* Reward active and reliable nodes that consistently store and serve Knowledge Assets +* Automate the distribution of publisher fees based on verifiable data storage + +{% hint style="info" %} +No configuration is necessary, and your node will automatically begin submitting proofs in response to Random Sampling challenges. +{% endhint %} + +{% hint style="info" %} +Random Sampling requires nodes to submit a limited number of additional transactions to the blockchain (up to \~100 per day). Make sure your node's operational keys are funded with enough gas tokens to cover them. +{% endhint %} + +### DKG Sync feature + +As of DKG V8.1.0, DKG Core Nodes have the ability to “Sync” with the DKG state via the new [DKG Sync feature](../learn-more/introduction/dkg-sync.md), capturing any missing knowledge by downloading it from the network. **To start using the DKG Sync feature, you have to manually enable it.** + +**Enabling the Sync feature is highly recommended, as it can help maintain high node health**, ultimately leading to your node attracting more rewards. Familiarize yourself with the Sync feature [here](../learn-more/introduction/dkg-sync.md) and how it relates to the [Random Sampling system](../learn-more/introduction/random-sampling-dkg-proof-system/) to gain a comprehensive understanding, and be prepared to enable it when your node updates to DKG V8.1.0. + +DKG Sync strengthens the reliability of the DKG by allowing Core Nodes to automatically retrieve and backfill any missing Knowledge Assets (KAs) from the network. This is especially important when your node restarts, experiences downtime, or joins the network for the first time.
+ +**How to enable Sync:** + +Instructions for enabling Sync, and a more detailed explanation of how it works, are available in our documentation [here](../learn-more/introduction/dkg-sync.md). + +### Configurable Blazegraph timeouts + +As part of the V8.1 update, configurable timeout values for Blazegraph operations have been introduced to improve system performance and prevent heavy operations from overloading your node's triple store. If you are experiencing issues with the Blazegraph process (which may occur for heavy-load users), consider lowering the timeouts. + +Generalized default values have already been applied automatically with the update — no action is required. However, if you wish to customize these settings yourself, you can do so by adding the following block to your `.origintrail_noderc` file. + +Default timeout configuration (all timeout values are defined in milliseconds): + +``` +// all values in milliseconds +"tripleStore": { + "enabled": true, + "timeout": { + "query": 90000, + "get": 10000, + "batchGet": 60000, + "insert": 300000 + } +} +... +``` + +{% hint style="info" %} +In V8.1.0, these timeouts are Blazegraph-specific; however, the core developers intend to expand the timeout configuration to all supported triple stores. +{% endhint %} + +Parameters overview: + +* `query`: Maximum duration for SPARQL queries +* `get`: Maximum duration for GET operations +* `batchGet`: Maximum duration for bulk GET operations +* `insert`: Maximum duration allowed for insert operations (e.g., storing Knowledge Assets) + +You can override the timeouts in the `.origintrail_noderc` file by adjusting the values accordingly. + +{% hint style="info" %} +**On timeouts:** Query timeouts are useful when your node is under heavy load (due to many concurrent queries), especially if you open your query endpoint to users who might run arbitrarily complex queries, as the timeouts limit the load on the triple store.
The values above are set to support general implementations. +{% endhint %} + +## Instructions for TRAC delegators + +{% hint style="success" %} +In a nutshell: + +* **No action is required from TRAC delegators** for the DKG V8.1.0 update process. +* During the release, staking features will be disabled for approximately 72 hours. +* It’s highly recommended to get familiar with the new Node Power and Node Health KPIs to understand how to best utilize your TRAC in the network. +{% endhint %} + +### V8.1.0 launch will start with a temporary suspension of staking features + +To complete the V8.1.0 update safely, **staking features will be temporarily suspended for up to 72 hours**, starting on **Monday, June 23, 2025, at 12:00 CET**. During this time, no changes to stakes or reward claims will be possible. + +Once staking is locked, the smart contracts will be updated to V8.1.0, **maintaining the state (snapshot)** of staking values present at that time in V8.0.X contracts via an on-chain migration. The target time for the beginning of the on-chain migration (and the latest snapshot time) is: + +* Timestamp **1750683600** (June 23, 15:00 CEST) +* NeuroWeb: Snapshot Block - 9777120 +* Base: Snapshot Block - 31948000 +* Gnosis: Snapshot Block - 40731720 + +### No more V6 token migration required + +Unlike the V8.0.0 release, where delegators needed to migrate their stake from V6 to V8.0 (possible through the Staking Dashboard), **with the V8.1.0 release, the need for this migration goes away** because of the Random Sampling updates. This means that with DKG V8.1.0, “Node Share tokens” used previously in DKG V6 are **fully deprecated** and no longer serve any function in the DKG (not even for migration). The entire staking system now runs natively on V8 infrastructure, simplifying participation for all delegators. + +If you haven’t migrated your stake from V6 yet, no action is needed — the V8.1.0 release will take care of this automatically.
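For reference, the migration timestamp listed above can be converted to a calendar date with a couple of lines of Python (a quick, unofficial sketch):

```python
from datetime import datetime, timezone

# Migration start / latest snapshot time from the list above
SNAPSHOT_TS = 1750683600

utc_time = datetime.fromtimestamp(SNAPSHOT_TS, tz=timezone.utc)
print(utc_time.isoformat())  # 2025-06-23T13:00:00+00:00, i.e., 15:00 CEST
```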
+ +### New Staking UI and metrics + +The Staking Dashboard user interface has been upgraded with new, easier-to-understand metrics: + +* **Node Power:** Reflects how competitive a node is in earning rewards. +* **Node Health:** Indicates how reliably a node has submitted proofs. + +You can find detailed explanations of both metrics in the Staking section [here](../../contribute-to-the-dkg/delegated-staking/). + +### Upcoming reward visibility + +In addition to the live reward distribution introduced with V8.1.0, two upcoming releases will make previously accrued rewards visible in the Staking Dashboard: + +* V8.1.1 will display and enable claims for V6-era rewards +* V8.1.2 will unlock Tuning Period rewards for nodes active between V8.0.0 and V8.1.0 + +Stay tuned to the official OriginTrail community channels for updates on these follow-up releases and their expected July rollout. + +## Instructions for DKG builders + +{% hint style="success" %} +In a nutshell: + +* V8.1.0 is **not a breaking change**, so no action is needed. +* It is **recommended to update to the latest client** versions to get all the benefits of the improvements in the V8.1.0 release. +{% endhint %} + +The DKG V8.1.0 release is not a breaking change for builders. Existing integrations, publishing flows, and query mechanisms will continue to function as expected. + +That said, we strongly recommend updating to the latest versions of the DKG client libraries and tools to benefit from performance improvements, updated defaults (e.g., Blazegraph timeouts), and compatibility with upcoming features such as advanced staking visibility and network telemetry.
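For example, in a Node.js project that uses the `dkg.js` SDK, updating to the latest stable release is a single command (shown for npm; adapt to your package manager, and substitute the client package your project actually depends on):

```shell
# Show the currently installed dkg.js version in this project
npm ls dkg.js

# Update to the latest stable release
npm install dkg.js@latest
```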
+ +### To get the most out of V8.1.0: + +* Update your dkg-client packages to the latest stable versions +* Review documentation on how to interact with nodes running V8.1+ (e.g., sync-aware publishing and querying) +* If you are building agent-based systems, check out the updated DKG x MCP integration to take advantage of decentralized knowledge workflows + +No code migration is needed, but keeping your client stack up to date ensures you remain aligned with the evolving network. + +Let us know in the [#builders-hub on Discord](https://discord.gg/hUcNmaEnSg) if you run into any issues or need assistance with upgrading. + +\ diff --git a/docs/dkg-knowledge-hub/how-tos-and-tutorials/fund-your-web3-wallets.md b/docs/dkg-knowledge-hub/how-tos-and-tutorials/fund-your-web3-wallets.md new file mode 100644 index 0000000..9d53a57 --- /dev/null +++ b/docs/dkg-knowledge-hub/how-tos-and-tutorials/fund-your-web3-wallets.md @@ -0,0 +1,104 @@ +--- +description: >- + Before your DKG Node can publish or interact with the network, each wallet + generated during setup needs to be funded with two types of tokens. +--- + +# Fund your Web3 wallets + +## Fund your wallets + +Before your DKG Node can operate, you’ll need to fund the wallets created during setup with two types of tokens: + +* **TRAC** – for publishing Knowledge Assets. +* **Native gas token** – for transaction fees (e.g., **NEURO** on NeuroWeb, **ETH** on Base, **xDAI** on Gnosis). + +On testnets, these tokens are free. Request them from each network’s faucet (e.g., via our Discord for NeuroWeb) before publishing or interacting with the DKG. To obtain your testnet tokens: + +1. Join our [**Discord**](https://discord.com/invite/xCaY7hvNwD) +2. Open **#faucet-bot** channel + +You can type `!help` to see more commands. + +### **NeuroWeb testnet** + +Please enter the following commands in our **Discord `#faucet-bot` channel**, replacing `YOUR_WALLET_ADDRESS` with your actual wallet address. 
Once submitted, the OriginTrail faucet bot will automatically send your test tokens: + +```bash +# Fund NEURO +!fundme_neuroweb YOUR_WALLET_ADDRESS + +# Fund TRAC +!fundme_neuroweb_trac YOUR_WALLET_ADDRESS +``` + +
+ +#### **Verify your transactions — NeuroWeb** + +After requesting tokens, you can confirm that they arrived in your wallet by checking the **NeuroWeb blockchain explorer**: + +* 👉 Visit**:** [https://neuroweb.subscan.io/](https://neuroweb.subscan.io/) + +
+ +* Enter **your wallet address** in the search bar. +* Navigate to the **Balance** or **Transfers** section. +* You’ll see your **TRAC** and **NEURO** testnet transactions listed there. + +
+ +
+ +If your transaction appears, your wallet is funded and ready for use on the testnet. + +*** + +### **Base Sepolia testnet** + +* Fund ETH on Base by following the [official Base documentation](https://docs.base.org/base-chain/tools/network-faucets). + +Please enter the following commands in our **Discord `#faucet-bot` channel**, replacing `YOUR_WALLET_ADDRESS` with your actual wallet address. Once submitted, the OriginTrail faucet bot will automatically send your test tokens: + +```bash +# Fund TRAC +!fundme_v8_base_sepolia_trac YOUR_WALLET_ADDRESS +``` + +
+ +#### Verify your transactions – Base + +If you’re deploying your DKG Node on **Base**, you can verify that your wallet was funded using the Base block explorer: + +* 👉 Visit: [Base Sepolia explorer](https://sepolia.basescan.org/) + +
+ +Enter your **public wallet address** to check for incoming tokens and completed transactions. + +*** + +### **Gnosis Chiado testnet** + +Please enter the following commands in our **Discord `#faucet-bot` channel**, replacing `YOUR_WALLET_ADDRESS` with your actual wallet address. Once submitted, the OriginTrail faucet bot will automatically send your test tokens: + +```bash +# Fund xDAI +!fundme_xdai YOUR_WALLET_ADDRESS + +# Fund TRAC +!fundme_chiado_trac YOUR_WALLET_ADDRESS +``` + +
+ +#### **Verify your transactions – Gnosis** + +For nodes running on **Gnosis**, confirm your wallet balance and transaction status here: + +* 👉 Visit: [Gnosis Chiado explorer](https://gnosis-chiado.blockscout.com/) + +
+ +Search for your **wallet address** to see recent activity, token deposits, and confirmations. diff --git a/docs/dkg-knowledge-hub/how-tos-and-tutorials/tutorials.md b/docs/dkg-knowledge-hub/how-tos-and-tutorials/tutorials.md new file mode 100644 index 0000000..8bcfe48 --- /dev/null +++ b/docs/dkg-knowledge-hub/how-tos-and-tutorials/tutorials.md @@ -0,0 +1,24 @@ +--- +hidden: true +--- + +# Builder tutorials + +Check out the tutorials prepared by the OriginTrail builders community: + +* [DKG Basic concepts overview](https://www.youtube.com/watch?v=rfvknL1gVDc) +* [DKG.js SDK walkthrough](https://www.youtube.com/watch?v=4oi_0hJmxcY) +* [Build an agent with OriginTrail and ElizaOS](https://www.youtube.com/watch?v=w3-_WBH3uSQ) +* [Build semantic search apps with Google Vertex AI](https://github.com/OriginTrail/ChatDKG/tree/main/examples/google-vertex-ai) +* [Integrate OriginTrail DKG with Milvus VectorDB](https://github.com/OriginTrail/ChatDKG/tree/main/examples/milvus) +* [Langchain + OriginTrail DKG tutorial](https://github.com/OriginTrail/ChatDKG/tree/main/examples/langchain) +* [Dev3 SDK OriginTrail tutorial](https://dev3.sh/web3/dev3xorigintraildkg/) +* [Decentralized RAG with OriginTrail DKG and NVIDIA Build ecosystem](https://origintrail.io/blog/decentralized-rag-with-origintrail-dkg-and-nvidia-build-ecosystem) +* [Trusted AI for next-generation RWAs with OriginTrail and Chainlink](https://origintrail.io/blog/trusted-ai-for-next-generation-rwas-with-origintrail-and-chainlink) +* [Decentralized RAG 101 with OriginTrail DKG and Google Gemini](https://origintrail.io/blog/decentralized-rag-101-with-origintrail-dkg-and-google-gemini) + + + +{% hint style="info" %} +Have a tutorial to share? 
Create a [pull request to these docs](https://github.com/OriginTrail/dkg-docs) +{% endhint %} diff --git a/docs/dkg-knowledge-hub/learn-more/README.md b/docs/dkg-knowledge-hub/learn-more/README.md new file mode 100644 index 0000000..e59a023 --- /dev/null +++ b/docs/dkg-knowledge-hub/learn-more/README.md @@ -0,0 +1,18 @@ +--- +description: >- + Understand the core building blocks of the OriginTrail Decentralized Knowledge + Graph (DKG), how its components work together, and the technologies that power + it. +--- + +# Learn more + +#### Pages in this section + +* [**Understanding OriginTrail**](https://app.gitbook.com/o/-McnF-Jcg4utndKcdeko/s/-McnEkhdd7JlySeckfHM/~/changes/408/dkg-knowledge-hub/learn-more/readme/decentralized-knowle-dge-graph-dkg) – A high-level overview of the OriginTrail ecosystem and how the DKG enables trusted, verifiable knowledge across networks. +* [**The DKG Node + MCP**](https://app.gitbook.com/o/-McnF-Jcg4utndKcdeko/s/-McnEkhdd7JlySeckfHM/~/changes/408/dkg-knowledge-hub/learn-more/dkg-key-concepts) – Learn how DKG Nodes and the Model Context Protocol (MCP) connect AI agents, apps, and blockchains to verifiable knowledge. +* [**Essential DKG mechanics & systems**](https://app.gitbook.com/o/-McnF-Jcg4utndKcdeko/s/-McnEkhdd7JlySeckfHM/~/changes/408/dkg-knowledge-hub/learn-more/introduction) – Explore the main architectural components and how knowledge publishing, querying, and verification happen. +* [**Connected blockchains**](https://app.gitbook.com/o/-McnF-Jcg4utndKcdeko/s/-McnEkhdd7JlySeckfHM/~/changes/408/dkg-knowledge-hub/learn-more/connected-blockchains) – Discover which blockchains integrate with the DKG and how they provide the trust layer for verifiable data. +* [**On-chain deployments & contracts**](https://app.gitbook.com/o/-McnF-Jcg4utndKcdeko/s/-McnEkhdd7JlySeckfHM/~/changes/408/dkg-knowledge-hub/learn-more/deployed-smart-contracts) – Understand how smart contracts and on-chain logic extend DKG functionality. 
+* [**Node keys (wallets)**](https://app.gitbook.com/o/-McnF-Jcg4utndKcdeko/s/-McnEkhdd7JlySeckfHM/~/changes/408/dkg-knowledge-hub/learn-more/node-keys-wallets) – Learn about the key types your node uses for publishing, operations, and management — and how they work together. +* [**Previous version release**](https://app.gitbook.com/o/-McnF-Jcg4utndKcdeko/s/-McnEkhdd7JlySeckfHM/~/changes/408/dkg-knowledge-hub/learn-more/previous-updates) – Review past releases to understand how the network has evolved and what’s changed. diff --git a/docs/dkg-knowledge-hub/learn-more/connected-blockchains/README.md b/docs/dkg-knowledge-hub/learn-more/connected-blockchains/README.md new file mode 100644 index 0000000..ed79396 --- /dev/null +++ b/docs/dkg-knowledge-hub/learn-more/connected-blockchains/README.md @@ -0,0 +1,2 @@ +# Connected Blockchains + diff --git a/docs/dkg-knowledge-hub/learn-more/connected-blockchains/base-blockchain/README.md b/docs/dkg-knowledge-hub/learn-more/connected-blockchains/base-blockchain/README.md new file mode 100644 index 0000000..f52d881 --- /dev/null +++ b/docs/dkg-knowledge-hub/learn-more/connected-blockchains/base-blockchain/README.md @@ -0,0 +1,25 @@ +# Base Network (L2) + +Base Chain is an Ethereum Layer-2 blockchain network developed by Coinbase. It is designed to improve scalability, performance, and interoperability while reducing transaction costs and settlement time on the Ethereum network. + +The synergy between the DKG and Base blockchain opens up new horizons for builders, driving them to create groundbreaking decentralized AI applications. + +### Bridging TRAC to Base + +TRAC is the ERC20 token used for creating Knowledge Assets on the OriginTrail Decentralized Knowledge Graph (DKG). TRAC also gets rewarded to delegated stakers and node runners for contributing to the DKG network. + +You can use the [Superbridge](https://superbridge.app/base) to transfer TRAC from Ethereum to Base. 
+ +TRAC token contract address on Base: 0xa81a52b4dda010896cdd386c7fbdc5cdc835ba23 + +{% embed url="https://www.youtube.com/watch?v=Pmw6TJmobDY" %} + +### Setting up a DKG node on Base + +Setting up a DKG node on Base is straightforward: follow the instructions in the [DKG Core Node](../../../../graveyard/everything/dkg-core-node/) section of the docs. If you want to integrate Base into an already running DKG node, follow the steps from [Connect to Base](connect-to-base.md). + +### Building on the DKG on Base + +A powerful tool to build upon the DKG on Base is the [DKG SDK](../../../../build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/). By leveraging the DKG SDK, developers can harness all the capabilities that the DKG on Base provides, unlocking new potential for creating advanced decentralized applications and services. + +At OriginTrail, we are empowering builders to transcend the ordinary and construct the extraordinary. Dive into our tutorials to harness the power of decentralized AI. Explore guides prepared by the OriginTrail Builders community to inspire your own innovative projects: 📚 [Tutorials](../../../how-tos-and-tutorials/tutorials.md). diff --git a/docs/dkg-knowledge-hub/learn-more/connected-blockchains/base-blockchain/connect-to-base.md b/docs/dkg-knowledge-hub/learn-more/connected-blockchains/base-blockchain/connect-to-base.md new file mode 100644 index 0000000..59ab331 --- /dev/null +++ b/docs/dkg-knowledge-hub/learn-more/connected-blockchains/base-blockchain/connect-to-base.md @@ -0,0 +1,269 @@ +--- +cover: ../../../../.gitbook/assets/OT x BASE doc visual.jpg +coverY: 0 +--- + +# Connect to Base + +Since the 6.5.0 release, OriginTrail DKG nodes support the Base blockchain (Mainnet). Learn more about how to use your node with Base below. + +## Mainnet node setup instructions (Base) + +Since the 6.5.0 release, your OriginTrail DKG node can operate on the Base network.
In order to connect your node to Base, please refer to the instructions below. + +### 1. Obtain Base Archival RPC Endpoint + +Refer to the [official Base documentation](https://docs.base.org/docs/) or search for RPC providers that offer Base RPCs. + +{% hint style="warning" %} +Selecting an archival endpoint is a crucial requirement for the optimal functionality of your DKG node. +{% endhint %} + +### 2. Acquire TRAC and ETH tokens + +A list of available exchanges can be found on our [website](https://origintrail.io/technology/trac-token). + +Instructions on how to bridge TRAC and ETH tokens to the Base blockchain can be found [here](https://docs.origintrail.io/integrated-blockchains/ethereum-ecosystem/base-blockchain#bridging-trac-to-base). + +### 3. Update DKG node configuration + +Open the **.origintrail\_noderc** configuration file of your DKG node located inside the **ot-node** directory. Within the config, locate the **"blockchain"** object, and add the following object to the **"implementation"** array, specifying your RPC endpoint and wallets (replace the `<...>` value with your own endpoint, and set `operatorFee` to your chosen percentage). + +As `operationalWallets` is an array, you can define multiple operational wallets. + +```json +"base:8453": { + "enabled": true, + "config": { + "sharesTokenSymbol": "shares_token_symbol", + "sharesTokenName": "shares_token_name", + "operatorFee": 5, + "rpcEndpoints": [ + "<YOUR_BASE_ARCHIVAL_RPC_ENDPOINT>" + ], + "operationalWallets": [ + { + "evmAddress": "0x0bf...", + "privateKey": "0x1e3..." + } + ], + "evmManagementWalletPublicKey": "0xd09..." + } + } +``` + +After adding **"base:8453"**, your **"blockchain"** object in the configuration file should look like the one below: + +```json +... + "blockchain": { + "defaultImplementation": "otp:2043", + "implementation": { + "otp:2043": { + "enabled": true, + "config": { + "sharesTokenSymbol": "shares_token_symbol", + "sharesTokenName": "shares_token_name", + "operatorFee": 5, + "operationalWallets": [ + { + "evmAddress": "0x0bf...", + "privateKey": "0x1e3..."
+ } + ], + "evmManagementWalletPublicKey": "0xd09..." + } + }, + "gnosis:100": { + "enabled": true, + "config": { + "sharesTokenSymbol": "shares_token_symbol", + "sharesTokenName": "shares_token_name", + "operatorFee": 5, + "rpcEndpoints": [ + "<YOUR_GNOSIS_ARCHIVAL_RPC_ENDPOINT>" + ], + "operationalWallets": [ + { + "evmAddress": "0x0bf...", + "privateKey": "0x1e3..." + } + ], + "evmManagementWalletPublicKey": "0xd09..." + } + }, + "base:8453": { + "enabled": true, + "config": { + "sharesTokenSymbol": "shares_token_symbol", + "sharesTokenName": "shares_token_name", + "operatorFee": 5, + "rpcEndpoints": [ + "<YOUR_BASE_ARCHIVAL_RPC_ENDPOINT>" + ], + "operationalWallets": [ + { + "evmAddress": "0x0bf...", + "privateKey": "0x1e3..." + } + ], + "evmManagementWalletPublicKey": "0xd09..." + } + } + } + }, +... +``` + +### 4. Restart your node + +You can proceed and restart your node to confirm that it will start communicating with Base. + +{% hint style="warning" %} +Once again, make sure that your operational wallet has some Base ETH in order for your OriginTrail DKG node to be able to create the profile on the new network. +{% endhint %} + +``` +otnode-restart && otnode-logs +``` + +If you added everything successfully, your node will show the log that says “**blockchain module initialized with implementation: base:8453**”. + +If you have come this far and your node logs are not showing any errors, your node is successfully set up! + + + +## Testnet node setup instructions (Base Sepolia) + +Since the 6.5.0 release, your OriginTrail DKG node can operate on the Base Sepolia network. In order to connect your node to Base Sepolia, please refer to the instructions below. + +### 1. Obtain Base Sepolia Archival RPC Endpoint + +Refer to the [official Base documentation](https://docs.base.org/docs/) or search for RPC providers that offer Base RPCs. + +{% hint style="warning" %} +Selecting an archival endpoint is a crucial requirement for the optimal functionality of your DKG node.
+{% endhint %} + +### 2. Acquire TRAC and Sepolia ETH test tokens + +Please refer to the [Base official documentation](https://docs.base.org/docs/tools/network-faucets) to see the list of available faucet providers.\ +In order to obtain TRAC tokens on Base Sepolia, please contact **tech@origin-trail.com**. + +{% hint style="info" %} +A TRAC faucet will be enabled soon via the Discord bot, same as for the other networks, and usage instructions will be provided here. +{% endhint %} + +### 3. Update DKG node configuration + +Open the **.origintrail\_noderc** configuration file of your DKG node located inside the **ot-node** directory. Within the config, locate the **"blockchain"** object, and add the following object to the **"implementation"** array, specifying your RPC endpoint and wallets (replace the `<...>` value with your own endpoint, and set `operatorFee` to your chosen percentage). + +As `operationalWallets` is an array, you can define multiple operational wallets. + +```json +"base:84532": { + "enabled": true, + "config": { + "sharesTokenSymbol": "shares_token_symbol", + "sharesTokenName": "shares_token_name", + "operatorFee": 5, + "rpcEndpoints": [ + "<YOUR_BASE_SEPOLIA_ARCHIVAL_RPC_ENDPOINT>" + ], + "operationalWallets": [ + { + "evmAddress": "0x0bf...", + "privateKey": "0x1e3..." + } + ], + "evmManagementWalletPublicKey": "0xd09..." + } + } +``` + +After adding **"base:84532"**, your **"blockchain"** object in the configuration file should look like the one below: + +```json +... + "blockchain": { + "defaultImplementation": "otp:20430", + "implementation": { + "otp:20430": { + "enabled": true, + "config": { + "sharesTokenSymbol": "shares_token_symbol", + "sharesTokenName": "shares_token_name", + "operatorFee": 5, + "operationalWallets": [ + { + "evmAddress": "0x0bf...", + "privateKey": "0x1e3..." + } + ], + "evmManagementWalletPublicKey": "0xd09..."
+ } + }, + "gnosis:10200": { + "enabled": true, + "config": { + "sharesTokenSymbol": "shares_token_symbol", + "sharesTokenName": "shares_token_name", + "operatorFee": 5, + "rpcEndpoints": [ + "https://archive-rpc.chiado.gnosischain.com/" + ], + "operationalWallets": [ + { + "evmAddress": "0x0bf...", + "privateKey": "0x1e3..." + } + ], + "evmManagementWalletPublicKey": "0xd09..." + } + }, + "base:84532": { + "enabled": true, + "config": { + "sharesTokenSymbol": "shares_token_symbol", + "sharesTokenName": "shares_token_name", + "operatorFee": 5, + "rpcEndpoints": [ + "<YOUR_BASE_SEPOLIA_ARCHIVAL_RPC_ENDPOINT>" + ], + "operationalWallets": [ + { + "evmAddress": "0x0bf...", + "privateKey": "0x1e3..." + } + ], + "evmManagementWalletPublicKey": "0xd09..." + } + } + } + }, +... +``` + +### 4. Restart your node + +You can proceed and restart your node to confirm that it will start communicating with Base Sepolia. + +{% hint style="warning" %} +Once again, make sure that your operational wallet has some Base Sepolia ETH in order for your OriginTrail DKG node to be able to create the profile on the new network. +{% endhint %} + +``` +otnode-restart && otnode-logs +``` + +If you added everything successfully, your node will show the log that says “**blockchain module initialized with implementation: base:84532**”. + +If you have come this far and your node logs are not showing any errors, your node is successfully set up! + +## Node stake and ask setup + +{% hint style="info" %} +If you are running a [Gateway](https://docs.origintrail.io/decentralized-knowledge-graph/node-setup-instructions/running-a-gateway-node) node, setting up **stake** and **ask** is not required. +{% endhint %} + +Please refer to the "[Running a full node](https://docs.origintrail.io/decentralized-knowledge-graph/node-setup-instructions/running-a-full-node)" part of our documentation for more details regarding setting up stake and ask parameters.
diff --git a/docs/dkg-knowledge-hub/learn-more/connected-blockchains/gnosis-chain/README.md b/docs/dkg-knowledge-hub/learn-more/connected-blockchains/gnosis-chain/README.md new file mode 100644 index 0000000..c571d6d --- /dev/null +++ b/docs/dkg-knowledge-hub/learn-more/connected-blockchains/gnosis-chain/README.md @@ -0,0 +1,16 @@ +# Gnosis Chain + +The Gnosis blockchain is one of the Ethereum sidechains supported by the OriginTrail Decentralized Knowledge Graph (DKG) since early versions. It is an EVM-based blockchain whose native token is xDAI (a stablecoin), with a strong developer community and a good track record. + +More information on the Gnosis chain can be found on their [website](https://www.gnosischain.com/). + +## Bridging TRAC to Gnosis + +To use TRAC tokens on Gnosis for powering your nodes, staking, or other activities, you need to bridge TRAC to Gnosis. + +You can use the [official Gnosis bridge](https://bridge.gnosischain.com/) to bridge your TRAC tokens from Ethereum to the Gnosis chain and vice versa. + +The TRAC token contract address on the Gnosis chain is: 0xeddd81e0792e764501aae206eb432399a0268db5 + + + diff --git a/docs/dkg-knowledge-hub/learn-more/connected-blockchains/gnosis-chain/connect-to-gnosis.md b/docs/dkg-knowledge-hub/learn-more/connected-blockchains/gnosis-chain/connect-to-gnosis.md new file mode 100644 index 0000000..fe71524 --- /dev/null +++ b/docs/dkg-knowledge-hub/learn-more/connected-blockchains/gnosis-chain/connect-to-gnosis.md @@ -0,0 +1,231 @@ +--- +cover: ../../../../.gitbook/assets/MicrosoftTeams-image (1).png +coverY: 0 +--- + +# Connect to Gnosis + +Since the 6.2.0 release, OriginTrail DKG nodes support the Gnosis blockchain. Learn more about how to use your node with Gnosis below. + +## Mainnet node instructions (Gnosis chain) + +Since the 6.2.0 release, your OriginTrail DKG node supports the Gnosis blockchain. In order to connect your node to Gnosis, please refer to the instructions below. + +### 1.
Obtain Gnosis archival RPC Endpoint + +Refer to the [official Gnosis documentation](https://docs.gnosischain.com/tools/rpc/) and select an RPC provider to acquire the Archival RPC Endpoint. + +{% hint style="warning" %} +Selecting an archival endpoint is a crucial requirement for the optimal functionality of your DKG node. +{% endhint %} + +### 2. Acquire tokens + +In order for your node to be able to create the profile on the Gnosis blockchain, it will require some xDai tokens on the operational wallet (at least one of the wallets in **operationalWallets**). Make sure that you acquire them before proceeding to update the configuration file; otherwise, your node will fail to connect to the Gnosis network. + +If you are planning on running an OriginTrail [Full node](../../../../graveyard/everything/node-setup-instructions/running-a-full-node.md), make sure that you also acquire TRAC tokens on Gnosis network and have them ready on the management wallet (**evmManagementWalletPublicKey**). TRAC is required for the process of setting up stake on your node once it's successfully connected to Gnosis and created its profile. + +{% hint style="info" %} +As described in the "[**Acquiring tokens**](https://docs.origintrail.io/decentralized-knowledge-graph/node-setup-instructions/installation-prerequisites/acquiring-tokens)" instructions page, bridging TRAC tokens from Ethereum to Gnosis network is done via [**OmniBridge**](https://omnibridge.gnosischain.com/bridge) or any other bridging platform. +{% endhint %} + +{% hint style="warning" %} +DKG [Gateway](https://docs.origintrail.io/decentralized-knowledge-graph/node-setup-instructions/running-a-gateway-node) nodes do not require TRAC stake. +{% endhint %} + +### 3. Update DKG node configuration + +Open **.origintrail\_noderc** file of your DKG node located inside the **ot-node** directory. 
Inside the configuration file, locate the **"blockchain"** object, and add the following object to the **"implementation"** array, specifying your RPC endpoint and wallets (replace the `<...>` value with your own endpoint). As `operationalWallets` is an array, you can define multiple operational wallets, which is recommended. + +```json +"gnosis:100": { + "enabled": true, + "config": { + "operatorFee": 5, + "rpcEndpoints": [ + "https://<YOUR_GNOSIS_ARCHIVAL_RPC_ENDPOINT>" + ], + "operationalWallets": [ + { + "evmAddress": "0x0bf...", + "privateKey": "0x1e3..." + } + ], + "evmManagementWalletPublicKey": "0xd09..." + } + } +``` + +After adding **"gnosis:100"**, make sure to add the initial `operatorFee` (range from 0% to 100%). + +{% hint style="warning" %} +The initial operator fee (**operatorFee**) can only be set at profile creation, so make sure not to forget about it. In order to change it later through Houston, you will need to wait for a delay of 28 days! +{% endhint %} + +After these additions, your **"blockchain"** object in the configuration file should look similar to the example below: + +```json +... + "blockchain": { + "defaultImplementation": "otp:2043", + "implementation": { + "otp:2043": { + "enabled": true, + "config": { + "sharesTokenSymbol": "shares_token_symbol", + "sharesTokenName": "shares_token_name", + "operatorFee": 5, + "operationalWallets": [ + { + "evmAddress": "0x...", + "privateKey": "0x..." + } + ], + "evmManagementWalletPublicKey": "0x..." + } + }, + "gnosis:100": { + "enabled": true, + "config": { + "sharesTokenSymbol": "shares_token_symbol", + "sharesTokenName": "shares_token_name", + "operatorFee": 5, + "rpcEndpoints": [ + "https://<YOUR_GNOSIS_ARCHIVAL_RPC_ENDPOINT>" + ], + "operationalWallets": [ + { + "evmAddress": "0x...", + "privateKey": "0x..." + } + ], + "evmManagementWalletPublicKey": "0x..." + } + } + } + }, +... +``` + +### 4. Restart your node + +You can proceed and restart your node to confirm that it will start communicating with Gnosis.
+ +``` +otnode-restart && otnode-logs +``` + +If you added everything successfully, your node will show the “**blockchain module initialized with implementation: gnosis:100**” log. + +## Testnet node instructions (Gnosis Chiado) + +Since the 6.1.0 release, your OriginTrail DKG node can operate on the Gnosis Chiado network. In order to connect your node to Gnosis Chiado, please refer to the instructions below. + +### 1. Obtain Gnosis Chiado Archival RPC Endpoint + +Refer to the [official Gnosis documentation](https://docs.gnosischain.com/tools/rpc/) and select an RPC provider to acquire the Archival RPC Endpoint. + +{% hint style="warning" %} +Selecting an archival endpoint is a crucial requirement for the optimal functionality of your DKG node. +{% endhint %} + +### 2. Acquire TRAC and Chiado test tokens + +Go to [test-token-faucet.md](../../../useful-resources/test-token-faucet.md "mention") to get test TRAC and xDAI tokens. + +### 3. Update DKG node configuration + +Open the **.origintrail\_noderc** configuration file of your DKG node located inside the **ot-node** directory. Within the config, locate the **"blockchain"** object, and add the following object to the **"implementation"** array, specifying your RPC endpoint and wallets. As `operationalWallets` is an array, you can define multiple operational wallets, which is recommended. + +```json +"gnosis:10200": { + "enabled": true, + "config": { + "sharesTokenSymbol": "shares_token_symbol", + "sharesTokenName": "shares_token_name", + "operatorFee": 5, + "rpcEndpoints": [ + "https://archive-rpc.chiado.gnosischain.com/" + ], + "operationalWallets": [ + { + "evmAddress": "0x0bf...", + "privateKey": "0x1e3..." + } + ], + "evmManagementWalletPublicKey": "0xd09..." + } + } +``` + +After adding **"gnosis:10200"**, your **"blockchain"** object in the configuration file should look like the one below: + +```json +... 
+ "blockchain": { + "defaultImplementation": "otp:20430", + "implementation": { + "otp:20430": { + "enabled": true, + "config": { + "sharesTokenSymbol": "shares_token_symbol", + "sharesTokenName": "shares_token_name", + "operatorFee": 5, + "operationalWallets": [ + { + "evmAddress": "0x0bf...", + "privateKey": "0x1e3..." + } + ], + "evmManagementWalletPublicKey": "0xd09..." + } + }, + "gnosis:10200": { + "enabled": true, + "config": { + "sharesTokenSymbol": "shares_token_symbol", + "sharesTokenName": "shares_token_name", + "operatorFee": 5, + "rpcEndpoints": [ + "https://archive-rpc.chiado.gnosischain.com/" + ], + "operationalWallets": [ + { + "evmAddress": "0x0bf...", + "privateKey": "0x1e3..." + } + ], + "evmManagementWalletPublicKey": "0xd09..." + } + } + } + }, +... +``` + +### 4. Restart your node + +You can proceed and restart your node to confirm that it will start communicating with Gnosis Chiado. + +{% hint style="warning" %} +Once again, make sure that your operational wallet has some Chiado xDai in order for your OriginTrail DKG node to be able to create the profile on the new network. +{% endhint %} + +``` +otnode-restart && otnode-logs +``` + +If you added everything successfully, your node will show the log that says “**blockchain module initialized with implementation: gnosis:10200**”. + +If you have come this far and your node logs are not showing any errors, your node is successfully set up! + +## Node stake and ask setup + +{% hint style="info" %} +If you are running a [Gateway](https://docs.origintrail.io/decentralized-knowledge-graph/node-setup-instructions/running-a-gateway-node) node, setting up **stake** and **ask** is not required. +{% endhint %} + +Please refer to the "[Running a full node](https://docs.origintrail.io/decentralized-knowledge-graph/node-setup-instructions/running-a-full-node)" section of our documentation for more details regarding setting up stake and ask parameters. 
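Before restarting, it can help to sanity-check the edited configuration file. The sketch below is an illustration only, not part of the official tooling: it validates that `.origintrail_noderc` is parseable JSON and that each blockchain implementation entry carries the keys shown in the examples above (note that the exact required keys can differ per chain — for example, `rpcEndpoints` appears only for the Gnosis entries).

```python
import json

# Keys each implementation "config" is expected to carry, taken from the
# Gnosis example configuration above (illustrative, not an official schema).
REQUIRED_CONFIG_KEYS = {
    "operatorFee",
    "rpcEndpoints",
    "operationalWallets",
    "evmManagementWalletPublicKey",
}

def check_noderc(raw: str) -> list:
    """Return a list of problems found in a .origintrail_noderc JSON string."""
    try:
        config = json.loads(raw)
    except json.JSONDecodeError as e:
        return [f"invalid JSON: {e}"]
    problems = []
    implementations = config.get("blockchain", {}).get("implementation", {})
    if not implementations:
        problems.append("no blockchain implementations configured")
    for name, impl in implementations.items():
        missing = REQUIRED_CONFIG_KEYS - set(impl.get("config", {}))
        if missing:
            problems.append(f"{name}: missing {sorted(missing)}")
    return problems
```

A quick `check_noderc(open(".origintrail_noderc").read())` returning an empty list before running `otnode-restart` catches typos such as a stray comma in the JSON.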
+ diff --git a/docs/dkg-knowledge-hub/learn-more/connected-blockchains/neuroweb.md b/docs/dkg-knowledge-hub/learn-more/connected-blockchains/neuroweb.md new file mode 100644 index 0000000..906ae7b --- /dev/null +++ b/docs/dkg-knowledge-hub/learn-more/connected-blockchains/neuroweb.md @@ -0,0 +1,42 @@ +# NeuroWeb Parachain + +The NeuroWeb network is a decentralized artificial intelligence (AI) blockchain designed to incentivize knowledge creation, connectivity, and sharing through **knowledge mining**. Its utility token, NEURO, is designed to fuel the AI knowledge economy, rewarding relevant knowledge contributions to the **OriginTrail Decentralized Knowledge Graph (DKG)**. + +NeuroWeb builds on its predecessor, the OriginTrail Parachain, which was transformed into NeuroWeb via a community governance vote on OriginTrail Parachain in December 2023. NeuroWeb is a permissionless, EVM-enabled blockchain secured by Polkadot validators. + +Dedicated documentation for NeuroWeb can be found [here](https://docs.neuroweb.ai/). + +## NEURO token + +NEURO is the native token of NeuroWeb. It's used to incentivize knowledge creation, sharing, and connectivity through knowledge mining. NEURO powers the AI knowledge economy by rewarding valuable contributions to the OriginTrail DKG. + +More information on NEURO can be found in the [official NeuroWeb documentation](https://docs.neuroweb.ai/neuro-token). + +## Bridging TRAC to NeuroWeb + +To use TRAC tokens on NeuroWeb for powering your nodes, staking, or other activities, you need to bridge TRAC to NeuroWeb. + +The Teleport interface was launched to allow users to transfer TRAC tokens between Ethereum and NeuroWeb. More information on [teleport is available here](../../../graveyard/everything/teleport-instructions-neuroweb.md). + +## Adding TRAC on NeuroWeb to your wallet + +Here are step-by-step instructions for adding the TRAC token on NeuroWeb to your wallet (in this case, MetaMask). 
These instructions and TRAC token address are the same for both NeuroWeb mainnet and testnet. + +TRAC token address: 0xFfFFFFff00000000000000000000000000000001 + +### **Step 1:** + +Open Metamask that is connected to NeuroWeb (connection details available here), then under the Assets tab, click on `Import tokens`. + +
+ +### Step 2: + +On the import tokens page, you need to add the TRAC token contract address. Usually, all the other fields will be automatically populated by Metamask. + +

Import TRAC token

 + +### Step 3: + +Once all the fields are filled with the right information (as in the image above), click **'Add custom tokens'**, and your TRAC balance will be displayed in MetaMask. + diff --git a/docs/dkg-knowledge-hub/learn-more/decentralized-knowle-dge-graph-dkg.md b/docs/dkg-knowledge-hub/learn-more/decentralized-knowle-dge-graph-dkg.md new file mode 100644 index 0000000..916a032 --- /dev/null +++ b/docs/dkg-knowledge-hub/learn-more/decentralized-knowle-dge-graph-dkg.md @@ -0,0 +1,40 @@ +--- +description: >- + Understand the new unified DKG Node — your gateway between applications, AI + agents, and the DKG. +hidden: true +--- + +# What is a DKG Node? + +{% embed url="https://media.tracverse.com/dkgnodedocs/What+Is+A+DKG+Node+Final.mp4" %} + +
 + +### What is the DKG Node? + +The DKG Node is the next evolution of the OriginTrail node software — your intelligent gateway into the Decentralized Knowledge Graph (DKG). It’s not just infrastructure for publishing, querying, and verifying Knowledge Assets — it’s also the home of AI-powered DKG Agents that can reason over trusted data, automate knowledge-driven workflows, and interact with external systems. With it, you can create new knowledge, retrieve and validate existing information, and power intelligent applications that operate on a foundation of verifiable, provenance-rich data. + +In the OriginTrail ecosystem, there are two types of roles your DKG Node can fulfill: + +* **DKG Edge Nodes** are designed to operate at the network "edge" and can be set up on anything from laptops and phones to the cloud. They can enrich and utilize DKG knowledge and can operate agents, but are not responsible for hosting the DKG state. Therefore, DKG Edge Nodes do not require TRAC token stake to run — Edge Nodes can fully publish, query, and verify knowledge, but are not eligible for a share of protocol fees (via delegated stake), as they do not contribute DKG services to the wider network and therefore do not require high uptime. +* **DKG Core Nodes** can do everything Edge Nodes can, but are intended to run as the "network core" — they run the DKG and maintain its state, which requires high uptime. A DKG Core Node therefore needs a minimum of 50,000 TRAC staked as an economic guarantee, which can be "sponsored" by any ecosystem stakeholder willing to delegate TRAC to it. Once a DKG Core Node is set up and contributing to the network, it earns network publishing fees based on its contribution. + +
+ +## Is it too difficult for me? + +Running a DKG Node is designed to be accessible — but like any powerful technology, there’s a **learning curve**. If you’re new to this space, it’s worth familiarizing yourself with a few foundational concepts before diving in: + +* **Knowledge Graph basics** – Understand how linked data, ontologies, and verifiable knowledge assets work. +* **Blockchain fundamentals** – Learn how transactions, staking, and gas fees function. +* **AI agent concepts** – Knowing how agents interact with external data (like the DKG) will help you design better applications. +* **Basic terminal & server skills** – Installing and managing a node requires comfort with the command line and deploying services on a VPS. + +We recommend exploring our [introductory resources or tutorials if any of these areas are new to you](https://app.gitbook.com/o/-McnF-Jcg4utndKcdeko/s/-McnEkhdd7JlySeckfHM/~/changes/408/dkg-knowledge-hub). And remember — you’re not alone. The OriginTrail Discord community is active and welcoming, with dedicated channels where you can ask questions, troubleshoot issues, and share ideas as you learn. + +*** + +#### **Next Step: $TRAC: Powering The Knowledge Economy** + +Once you understand what a DKG Node does, the next step is learning **how it’s powered**. The following section introduces **$TRAC**, the token that drives the DKG’s decentralized economy — enabling staking, publishing, and the incentive mechanisms that keep the network fair and secure. diff --git a/docs/dkg-knowledge-hub/learn-more/deployed-smart-contracts.md b/docs/dkg-knowledge-hub/learn-more/deployed-smart-contracts.md new file mode 100644 index 0000000..2bfb714 --- /dev/null +++ b/docs/dkg-knowledge-hub/learn-more/deployed-smart-contracts.md @@ -0,0 +1,6 @@ +# On-chain deployments & contracts + +An up-to-date list of all DKG smart contract deployments can be found [here](https://github.com/OriginTrail/dkg-evm-module/tree/main/deployments). 
+ + + diff --git a/docs/dkg-knowledge-hub/learn-more/dkg-key-concepts/README.md b/docs/dkg-knowledge-hub/learn-more/dkg-key-concepts/README.md new file mode 100644 index 0000000..bf93a67 --- /dev/null +++ b/docs/dkg-knowledge-hub/learn-more/dkg-key-concepts/README.md @@ -0,0 +1,25 @@ +--- +description: >- + Learn how MCP (Model Context Protocol) connects your DKG Node seamlessly with + AI frameworks and external tools. +--- + +# The DKG Node + MCP + + + +
+ +## Using MCP on DKG Node + +The Model Context Protocol (MCP) is the bridge between your AI agents and the DKG Node. It allows applications, chatbots, or services to interact with the DKG in a standardized way — no custom glue code required. + +With MCP, your node can: + +* Publish knowledge as verifiable assets into the DKG. +* Query trusted knowledge with provenance, ready to ground AI outputs. +* Verify information, ensuring that what your AI consumes is authentic. + +Think of MCP as the universal language your AI tools use to talk to the DKG Node. If your framework or agent speaks MCP (LangChain, VS Code, Cursor, Copilot Studio, etc.), it can immediately tap into the DKG without extra setup. + +MCP makes your DKG Node usable by AI out of the box — transforming it from a passive node into an active gateway for verifiable, trusted knowledge. diff --git a/docs/dkg-knowledge-hub/learn-more/dkg-key-concepts/using-mcp-on-your-dkg-node.md b/docs/dkg-knowledge-hub/learn-more/dkg-key-concepts/using-mcp-on-your-dkg-node.md new file mode 100644 index 0000000..51895f0 --- /dev/null +++ b/docs/dkg-knowledge-hub/learn-more/dkg-key-concepts/using-mcp-on-your-dkg-node.md @@ -0,0 +1,41 @@ +# Using MCP on your DKG Node + +## How does the DKG work with MCP? + +By registering the DKG as a tool provider in an MCP server, you can: + +* **Query knowledge** using tools like `query_dkg_by_name`, which execute SPARQL queries over the DKG +* **Publish knowledge** from unstructured text using tools like `create_knowledge_asset`, which turn LLM-generated content into structured JSON-LD and publish it to the DKG +* Expand the tools above and add additional tools to interact with the DKG (e.g. creating Knowledge Assets from websites or documents) + +This allows agents using MCP-compatible clients to: + +* Interact with real-time, decentralized knowledge +* Add to the shared memory layer used by other agents +* Benefit from data provenance, versioning, and ownership built into the DKG + +
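The tool-provider pattern above can be sketched in plain Python. This is an illustration only — a real integration would use an MCP server SDK (see the dkg-mcp-server example repo); the function bodies are hypothetical stand-ins, and only the tool names (`query_dkg_by_name`, `create_knowledge_asset`) come from this page.

```python
from typing import Callable, Dict

# Illustrative stand-in for an MCP tool registry: an MCP server exposes
# named tools that agents discover and invoke. Not the real MCP SDK.
TOOLS: Dict[str, Callable] = {}

def tool(name: str):
    """Register a function as a named tool, the way an MCP server exposes tools."""
    def register(fn):
        TOOLS[name] = fn
        return fn
    return register

@tool("query_dkg_by_name")
def query_dkg_by_name(entity_name: str) -> str:
    # In a real server, this would execute a SPARQL query against the DKG node.
    # Here we only build the query string for illustration.
    return f'SELECT ?s WHERE {{ ?s <http://schema.org/name> "{entity_name}" }}'

@tool("create_knowledge_asset")
def create_knowledge_asset(text: str) -> dict:
    # In a real server, an LLM would structure the text as JSON-LD
    # and publish it to the DKG as a Knowledge Asset.
    return {"@context": "http://schema.org", "@type": "CreativeWork", "text": text}

# An MCP client discovers the registered tools and invokes them by name:
result = TOOLS["query_dkg_by_name"]("OriginTrail")
```

The point of the pattern is that the agent never calls DKG code directly — it only sees named tools, which is what lets any MCP-compatible client reuse the same server.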

Example of DKG integration with the Microsoft Copilot Agent

+ +### Why is this powerful? + +MCP makes it easy to build modular, agent-based systems where LLMs use tools to: + +* Ask questions against the Decentralized Knowledge Graph +* Write and revise their own memory as JSON-LD Knowledge Assets +* Store results and publish new discoveries collaboratively + +When paired with the DKG, this gives LLM-based systems access to a decentralized knowledge base that is: + +* **Interoperable** (via RDF and schema.org) +* **Trustworthy** (anchored with cryptographic proofs) +* **Queryable** (via SPARQL and linked data tooling) + +To get started: + +* Explore the [dkg-mcp-server example repo ](https://github.com/OriginTrail/dkg-mcp-server) +* Try integrating the DKG into your MCP server +* Build agents that reason over, expand, and query the decentralized web of knowledge + +### See it in action - DKG & Microsoft Copilot agent integration via MCP + +{% embed url="https://www.youtube.com/watch?v=_S5cNdwAGsQ" %} diff --git a/docs/dkg-knowledge-hub/learn-more/dkg-key-concepts/what-is-mcp-model-context-protocol.md b/docs/dkg-knowledge-hub/learn-more/dkg-key-concepts/what-is-mcp-model-context-protocol.md new file mode 100644 index 0000000..c358948 --- /dev/null +++ b/docs/dkg-knowledge-hub/learn-more/dkg-key-concepts/what-is-mcp-model-context-protocol.md @@ -0,0 +1,38 @@ +--- +description: >- + A beginner-friendly deep dive into MCP: What it is, why it matters, and how it + standardizes AI ↔ tool communication. +--- + +# What is MCP? (Model Context Protocol) + +### What is MCP? (Model Context Protocol) + +The **Model Context Protocol (MCP)** is an open standard that allows AI agents to connect to **real, trusted data and tools** beyond their own LLM-limited memory. In the context of the **Decentralized Knowledge Graph (DKG)**, MCP is the bridge between the reasoning power of AI and the verifiable, linked knowledge that lives on the network. 
+ +Think of an AI agent as a brilliant brain — capable of reasoning, summarizing, and generating language — but often trapped inside its own head. MCP acts as the **“bridge” or “port”** that lets that brain reach out into the real world: to query databases, fetch live knowledge, call APIs, and even publish new knowledge back into the DKG. + +Some in the AI space call MCP the **“USB-C of AI”** — because it standardizes how models connect to external systems. Instead of writing custom code for every single integration, MCP provides a single, universal way to plug AI into anything: from enterprise APIs and local tools to **DKG Nodes**, where cryptographically verified knowledge lives. + +### Why MCP matters + +#### 1. Eliminates the “M x N” connector problem + +Before MCP, connecting AI systems to external tools was messy. Imagine you have one AI model (like a chatbot) and ten different tools it needs to use (a database, a calendar, a search engine, a file system, etc.). You’d usually have to write ten separate custom integrations just for that one model. + +Now imagine adding a second AI model — you might need to write those integrations all over again. That’s the M × N problem: the number of integrations grows out of control as you add more models and more tools. + +MCP fixes this by being a universal connector. Once a tool supports MCP, any AI model that speaks MCP can use it. Think of it like a USB-C port for AI — instead of needing a different charger for every device, you just use one cable. + +#### 2. Interoperability & composability + +Because MCP is an open protocol, it’s not tied to one company or platform. Anyone can build tools, agents, or DKG Nodes that use it. + +This creates a world where: + +* A LangChain agent, a VS Code extension, and a Copilot Studio bot can all talk to the DKG through the same MCP server. +* Developers don’t have to reinvent the wheel each time. 
+* You can combine tools easily — like chaining together a calendar, a file system, and the DKG — without special hacks. + +This is what “composability” means: building larger, more powerful systems out of smaller, interoperable parts. Just like Lego blocks, if each piece follows the same standard, they all fit together. + diff --git a/docs/dkg-knowledge-hub/learn-more/dkg-key-concepts/why-dkg-node-and-mcp-combo.md b/docs/dkg-knowledge-hub/learn-more/dkg-key-concepts/why-dkg-node-and-mcp-combo.md new file mode 100644 index 0000000..6ef31be --- /dev/null +++ b/docs/dkg-knowledge-hub/learn-more/dkg-key-concepts/why-dkg-node-and-mcp-combo.md @@ -0,0 +1,33 @@ +--- +description: >- + See why DKG + MCP together are so powerful — giving AI agents direct access to + trusted, verifiable knowledge. +--- + +# Why DKG Node & MCP combo? + +
+ +## The power couple unleashed + +The OriginTrail Decentralized Knowledge Graph (DKG) and the Model Context Protocol (MCP) solve different problems — but together, they unlock something far greater. + +On their own: + +* DKG Nodes turn information into verifiable knowledge that can be published, discovered, and trusted across industries. +* MCP gives AI agents a universal language to safely communicate with tools and data sources. + +But when you combine them, you create the missing link for trustworthy AI. + +### AI can finally trust its sources + +Most AI today works like a student with a foggy memory. It “remembers” patterns from training data but doesn’t actually know where those answers originally came from — which is why hallucinations are so common. + +With **MCP**, AI agents can query a **DKG Node** directly. Because the DKG serves **verifiable, cryptographically signed knowledge**, the AI no longer guesses — it reasons with real, trustworthy data that is: + +* **Traceable** – Every piece of knowledge carries provenance, showing _who published it_ and _where it originated_. +* **Auditable** – Anyone (including AI) can verify that the data hasn’t been tampered with. +* **Continuously updated** – The DKG is a living knowledge layer, enriched over time by publishers, contributors, and agents themselves. + +And that’s where the real power lies: **AI agents don’t just consume knowledge — they can also contribute it.** When an agent discovers new facts, validates existing claims, or generates useful structured information, it can publish those results back into the DKG as verifiable knowledge. That knowledge then becomes part of a shared, trusted foundation that **other agents, apps, and humans** can later tap into — creating a virtuous cycle of intelligence. 
+ diff --git a/docs/dkg-knowledge-hub/learn-more/introduction/README.md b/docs/dkg-knowledge-hub/learn-more/introduction/README.md new file mode 100644 index 0000000..de129b0 --- /dev/null +++ b/docs/dkg-knowledge-hub/learn-more/introduction/README.md @@ -0,0 +1,13 @@ +# Network mechanics & systems + +While other sections introduce the high-level capabilities of the DKG, this section is for those who want to understand what’s really happening behind the scenes: how data flows through the system, how nodes coordinate, how cryptographic proofs work, and how the DKG achieves verifiability, decentralization, and semantic interoperability at scale. + +#### What you'll learn in the following Pages: + +* The mechanics of delegated staking +* How data is structured and hashed using Merkle roots +* The mechanics of publishing, replication, and querying across Core and Edge Nodes +* How consensus protocols tie into graph storage and knowledge lifecycle via Random Sampling +* How DKG Sync works and why it is needed, etc + +Whether you're building dapps, deploying nodes, or integrating symbolic AI agents, this section will give you a technical foundation to architect on top of the DKG with confidence. 
diff --git a/docs/dkg-knowledge-hub/learn-more/introduction/dkg-codebase-and-structure.md b/docs/dkg-knowledge-hub/learn-more/introduction/dkg-codebase-and-structure.md new file mode 100644 index 0000000..5658e45 --- /dev/null +++ b/docs/dkg-knowledge-hub/learn-more/introduction/dkg-codebase-and-structure.md @@ -0,0 +1,2 @@ +# DKG codebase & structure + diff --git a/docs/dkg-knowledge-hub/learn-more/introduction/dkg-sync.md b/docs/dkg-knowledge-hub/learn-more/introduction/dkg-sync.md new file mode 100644 index 0000000..e31bc4c --- /dev/null +++ b/docs/dkg-knowledge-hub/learn-more/introduction/dkg-sync.md @@ -0,0 +1,83 @@ +# How DKG synchronization works + +The **DKG Sync** feature enhances the **anti-fragility** of the OriginTrail Decentralized Knowledge Graph (DKG) by allowing Core Nodes to continuously and efficiently backfill any missing Knowledge Assets (KAs). This is especially useful for **gracefully handling temporary node downtime**, such as during server upgrades or brief periods of unavailability. + +Rather than relying on constant uptime, DKG Sync enables nodes to recover and remain synchronized with the state of the public DKG shard(s) they participate in, ensuring robust participation in discovery, querying, and Random Sampling. + +### Why was DKG sync introduced + +As the DKG grows and evolves, Core Nodes may experience periods of downtime, restarts, or join the network for the first time. Rather than penalizing such situations, the sync system is designed to allow: + +* Reliable and autonomous catch-up for missing data +* Smooth reintegration of previously unavailable nodes +* Full historical sync for newly joined nodes to get up to speed +* Consistently high levels of data replication across the network + +This improves the system's overall anti-fragility, making it more adaptive and resilient as it scales, restarts, or undergoes late onboarding. 
### How it works + +The sync feature operates as a **background service** and performs two key tasks: + +#### 1. Sync new Knowledge Collections + +* Compares the latest on-chain KC ID to the node’s own synced state +* Constructs and sends Batch GET requests for any new, missing KCs +* Stores verified KCs locally and tracks update progress +* Queues any missing results for retry + +#### 2. Retry missed Knowledge Collections + +* Periodically retries fetching failed KCs using intelligent retry logic +* Dynamically schedules based on retry count and last attempt timestamp +* Removes successfully synced KCs from the retry queue + +The DKG Sync feature introduces **Batch GET**, a new network operation optimized for retrieving multiple KCs efficiently. Rather than issuing redundant GET requests to multiple nodes, Batch GET enables smart coordination strategies that reduce bandwidth, balance load, and maximize responsiveness. + +These mechanisms ensure that even a freshly joined or recently restarted node can synchronize efficiently without overwhelming the network. 
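The retry scheduling described above — based on retry count and last attempt timestamp — can be sketched as exponential backoff. This is an illustration of the idea, not the node's actual implementation, and the base delay is an assumed value:

```python
# Illustrative retry scheduler for missed Knowledge Collections (KCs):
# each failed KC is retried after an exponentially growing delay,
# mirroring the "retry count + last attempt timestamp" logic described above.
BASE_DELAY_SECONDS = 60  # assumed value for illustration

def next_retry_at(last_attempt: float, retry_count: int) -> float:
    """Schedule the next attempt: base delay, doubled per prior failure."""
    return last_attempt + BASE_DELAY_SECONDS * (2 ** retry_count)

def due_for_retry(queue: dict, now: float) -> list:
    """Return KC ids whose scheduled retry time has passed."""
    return [
        kc_id
        for kc_id, entry in queue.items()
        if now >= next_retry_at(entry["last_attempt"], entry["retry_count"])
    ]

# A KC that failed once at t=0 becomes due again 120 seconds later.
queue = {1012: {"last_attempt": 0.0, "retry_count": 1}}
```

Backing off exponentially is what keeps a recovering node from hammering peers with the same failing request, while successfully synced KCs simply drop out of the queue.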
 + +### Sync logic and resource considerations + +| Requirement | Behavior | +| ------------------------ | --------------------------------------------------------------- | +| **Fresh nodes** | Can perform full historical sync on joining | +| **Operational priority** | Sync yields to publish/query tasks | +| **Peer interaction** | Sync tracks peer GET volumes to avoid overload | +| **Resilience** | Sync recovers cleanly from failures and continues automatically | + +*** + +#### How to turn on DKG Sync + +For DKG Sync to run, it needs to be enabled in the node configuration. + +In the `.origintrail_noderc` file, add an `assetSync` object at the top level to enable and configure sync: + +``` +{ + "modules": { + ... + }, + "auth": { + ... + }, + "assetSync": { + "syncDKG": { + "enabled": true, + // A batch size of 50 KCs is suggested to begin with + "syncBatchSize": 50 + }, + // You already have this if your node is syncing a paranet + "syncParanets": [] + } +} +``` + diff --git a/docs/dkg-knowledge-hub/learn-more/introduction/random-sampling-dkg-proof-system/README.md b/docs/dkg-knowledge-hub/learn-more/introduction/random-sampling-dkg-proof-system/README.md new file mode 100644 index 0000000..c2f4e40 --- /dev/null +++ b/docs/dkg-knowledge-hub/learn-more/introduction/random-sampling-dkg-proof-system/README.md @@ -0,0 +1,173 @@ +# Random Sampling & proofs explained + +The **Random Sampling** proof system in OriginTrail DKG V8 is a decentralized, scalable, and lightweight mechanism for ensuring the **availability of knowledge**, **incentivizing network participation**, and **fairly distributing publishing rewards**. This page introduces its purpose, mechanics, and how it connects node runners, token delegators, and publishers in the DKG ecosystem. + +### Why does Random Sampling exist + +For a decentralized system to function stably, service and data availability cannot be assumed — they must be proven. In the DKG, **Core Nodes are responsible for hosting and serving Knowledge Assets**. 
But how can the network be sure they actually do? + +Random Sampling is a **Proof-of-Knowledge (PoK)** mechanism that enables the network to continuously and randomly challenge nodes to prove that they store specific data. Nodes are challenged by blockchain smart contracts, which govern the system in a decentralized way. In particular, the system: + +* Ensures that data remains available over time +* Rewards nodes that are consistently online and engaged +* Provides a fair, automated way to distribute fees paid by publishers + +It replaces earlier, less scalable methods used in DKG V6 with a vastly more efficient system, capable of supporting **100 billion+ Knowledge Assets.** + +### Where do the node rewards come from? + +The DKG tokenomics is based on the TRAC utility token, a non-inflationary, fully circulating ERC-20 token launched in 2018 on the Ethereum blockchain. The node rewards are all utility-based — paid by the knowledge publishers, which cover the DKG service fees to node runners. There are no inflationary rewards. + +### Network goals and incentives + +The DKG network is designed to reward: + +* **System uptime** — Nodes must submit frequent, valid proofs of knowledge availability +* **Data correctness** — Proofs are only possible if a node hosts the challenged data in the exact form published by the publishers +* **High publishing factor** — Nodes that contribute new knowledge by opening their API for publishing to the DKG earn higher scores +* **Stake commitment** — The more TRAC tokens staked to a node by delegators, the greater the reward share +* **Service efficiency** — Lower service ask fees improve system competitiveness and incentivise adoption + +These incentives align node behavior with the network's goals: fast, reliable, and decentralized knowledge hosting. + +### Staking implementation details + +The DKG staking system allows any TRAC token holder to participate in the network by **delegating tokens to Core Nodes**. 
This supports the network’s operation and entitles delegators to a share of the publishing rewards. + +{% hint style="info" %} +Visit [this page](../../../../contribute-to-the-dkg/delegated-staking/) for a step-by-step guide on how to stake TRAC tokens. +{% endhint %} + +#### Non-custodial system + +The DKG delegated staking system is **fully non-custodial,** meaning TRAC delegators do not provide access to their tokens to the DKG Core Node runners or any third party. The tokens are securely locked in the DKG smart contracts and can only be transferred by the token delegator through the smart contract functions. + +#### Staking Dashboard + +The Staking Dashboard is an easy-to-use web page that enables DKG staking interactions, as well as monitoring node performance and rewards over time. Staking can be done via any EVM-compatible wallet. + +#### Reward claiming and restaking + +* Rewards are **not automatically claimed —** they must be explicitly claimed using the `claimRewards` function, for each epoch individually. +* When claimed, rewards are **automatically re-staked** into the delegator's active stake. This increases both the delegator’s stake and the total stake of the node, enhancing future reward shares. +* Rewards can only be claimed for **completed and finalized epochs**. The system enforces this to ensure accurate score calculation. + +#### Who can trigger reward claiming? + +* **Core Nodes** are naturally incentivized to claim rewards for all their delegators, since doing so increases their total stake and therefore reward performance. This is preferable to the alternative of each delegator manually triggering a claiming transaction for every epoch. +* To ensure fairness, **delegators can also claim rewards for themselves**, particularly in cases where the node is offline or unresponsive. In this way, delegators are in no way dependent on the node operation and can withdraw their tokens at any time. 
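The claim-and-restake mechanic described above compounds the delegator's stake epoch by epoch. The toy model below illustrates this with made-up numbers (a flat 1% reward per epoch is purely hypothetical — actual rewards depend on node score, stake share, and publisher fees); it is not the contract logic:

```python
# Illustrative model of claim-and-restake: claimed rewards are added to the
# delegator's active stake, so the next epoch's reward is computed on a
# larger base. The 1% per-epoch reward rate is a made-up number.
def claim_rewards(stake: float, reward: float) -> float:
    """Claiming automatically restakes the reward, returning the new active stake."""
    return stake + reward

stake = 10_000.0  # TRAC delegated (illustrative)
for epoch in range(3):
    reward = stake * 0.01      # hypothetical 1% reward for this finalized epoch
    stake = claim_rewards(stake, reward)
# After three claimed epochs, the stake has compounded: 10_000 * 1.01**3
```

The takeaway is the compounding itself: whoever triggers the claim (node or delegator), each claim enlarges the base on which the next epoch's share is computed.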
+ +#### Withdrawal period + +* All delegated stakes are subject to a **28-day withdrawal period**. +* A delegator must initiate a withdrawal request and wait for the withdrawal period to elapse before accessing their tokens. + +*** + +### Random sampling proof system — key concepts + +| Concept | Description | +| ----------------------------- | ------------------------------------------------------------------------- | +| **Knowledge Sector** | A virtual shard of the DKG, hosted by a subset of Core Nodes | +| **Knowledge Collection (KC)** | A group of Knowledge Assets minted together (stored as an NFT collection) | +| **Chunk** | A 32-byte unit of a Knowledge Collection used for Merkle hashing | +| **Epoch** | A fixed-length interval (e.g., 30 days) over which rewards are calculated | +| **Proof Period** | A short cycle (e.g., 30 minutes) in which nodes are challenged | + +*** + +### How Random Sampling works + +1. **Random challenge generation** + * At the start of each _proof period_, the DKG smart contract randomly selects a chunk from a Knowledge Collection for each Core Node. + * Example: Node1 receives challenge `(KCID = 1012, chunk = 23)`. +2. **Node responds with Merkle proof** + * The node computes a Merkle proof for the chunk and submits it on-chain. + * The contract verifies the proof by recomputing the Merkle root. +3. **Score assignment** + * If the proof is valid, a _proofScore_ is computed based on: + * Node stake + * Publishing factor + * Uptime + * Ask price +4. **Epoch score aggregation** + * All valid scores in the epoch are summed: `totalNodeScore = ∑ proofScore` +5. 
**Reward distribution** + * At the end of each epoch, accumulated publisher fees are distributed to: + * **Core Node operators**, based on totalNodeScore and their operator fee + * **Delegators**, proportionally to their stake and staking duration + + + +A high-level sequence diagram below illustrates the entire lifecycle from publishing, staking, service provisioning, reward claiming, and token withdrawals. + +
+ +*** + +### Delegators, node runners, publishers: Roles explained + +| Role | Description | +| ----------------------- | ----------- | +| **Node operator** |

Operates a DKG Core Node, which hosts the DKG, submits proofs, and earns node rewards. Node operators are the "system administrators" of the DKG, and the better their nodes run, the better their TRAC reward performance will be. For their services, they set an operator fee (funded from publishing fees as well).

| +| **TRAC delegator** | Stakes TRAC to a Core Node to earn a share of its rewards. Delegators curate nodes based on their performance in the network — uptime, stake, publishing factor, and service ask. One delegator can stake TRAC to multiple Core Nodes simultaneously. | +| **Knowledge publisher** | Uses the public DKG for knowledge creation and querying. Pays fees to mint Knowledge Assets and fund node incentives. | + + + +*** + +### Time dimension: Epochs and proof periods + +To coordinate time-based operations across the DKG, the protocol uses a system of **epochs** and **proof periods**, implemented by the `Chronos` and `RandomSampling` contracts. + +#### Epochs + +An **epoch** is a long-form unit of time (typically 30 days) used to: + +* Define the lifespan of Knowledge Assets +* Track and reset reward cycles +* Anchor delegator scoring and reward eligibility + +Epochs are calculated from a global start time using: + +``` +(currentTimestamp - START_TIME) / EPOCH_LENGTH + 1 +``` + +This ensures that all participants are synchronized to the same epoch regardless of timezone or local clock. + +{% hint style="info" %} +Synchronized epochs were introduced in V8 as an evolution of the decoupled epochs used in the previous version of the DKG (V6), enabling major scalability improvements on the blockchain layer. +{% endhint %} + +#### Proof periods + +A **proof period** is a much shorter window of time (e.g., 30 minutes) in which Core Nodes must submit a proof in response to a randomized challenge. Within each epoch, there are many proof periods. 
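As a rough numeric illustration of this two-layered time model, the indices can be derived as below. The parameter values here (30-day epochs, 30-minute proof periods, and the start timestamp) are hypothetical; the deployed values are set on-chain and may differ.

```python
START_TIME = 1_700_000_000          # hypothetical global start timestamp (seconds)
EPOCH_LENGTH = 30 * 24 * 3600       # illustrative: 30-day epochs
PROOF_PERIOD = 30 * 60              # illustrative: 30-minute proof periods

def current_epoch(now: int) -> int:
    # Epochs are 1-indexed: (currentTimestamp - START_TIME) / EPOCH_LENGTH + 1
    return (now - START_TIME) // EPOCH_LENGTH + 1

def proof_period_in_epoch(now: int) -> int:
    # Index of the active proof period within the current epoch
    return ((now - START_TIME) % EPOCH_LENGTH) // PROOF_PERIOD

now = START_TIME + 45 * 24 * 3600   # 45 days after the global start
print(current_epoch(now))           # 2: day 45 falls in the second 30-day window
```

Because both values derive from a single shared start time, every participant computes the same epoch and proof-period index from any timestamp, with no per-node clocks involved.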
+ +Each proof period: + +* Begins at a calculated block interval based on average block time and on-chain parameters +* Triggers a new challenge to each eligible Core Node +* Allows nodes to submit a Merkle proof for a randomly selected chunk from a Knowledge Collection + +#### Coordination between epochs and proof periods + +* The `Chronos` contract tracks the current epoch, calculates timestamps for any epoch, and determines when new epochs begin. +* The `RandomSampling` contract coordinates proof periods within the current epoch by: + * Calculating the active proofing block window + * Generating unique challenges per node + * Tracking proof submissions for reward calculation + +This two-layered time model enables the DKG to scale efficiently by decoupling frequent proof submissions from the slower epoch-based reward distribution. + +### Scalability and performance + +* Challenges are randomized and verifiably stored on-chain +* Nodes only prove small slices of data per period → no full graph needed +* Gas cost per node per day is <100M gas +* Efficient sector-based storage allows horizontal scaling +* Proofs are computed off-chain and verified cheaply on-chain + +The system is robust even with **billions of knowledge chunks** across **hundreds of nodes**, ensuring future-proof operation. + diff --git a/docs/dkg-knowledge-hub/learn-more/introduction/random-sampling-dkg-proof-system/random-sampling-faq.md b/docs/dkg-knowledge-hub/learn-more/introduction/random-sampling-dkg-proof-system/random-sampling-faq.md new file mode 100644 index 0000000..a3765be --- /dev/null +++ b/docs/dkg-knowledge-hub/learn-more/introduction/random-sampling-dkg-proof-system/random-sampling-faq.md @@ -0,0 +1,74 @@ +# Random Sampling FAQ + + + +**How do I choose which Core Node to delegate my TRAC to?** + +The system is designed in such a way as to incentivize positive network behavior (see more in [.](./ "mention")). 
That means your delegated TRAC will help increase a Core Node's chances of receiving rewards, while the nodes with the best characteristics receive a higher share of the rewards. To pick a node, consider the following metrics (available in the [Staking Dashboard](https://staking.origintrail.io)): + +* **Node Power** — Shows how competitive the node is in earning rewards. Higher Node Power means a higher chance of rewards. +* **Node Health** — Reflects reliability and uptime. Higher Node Health means a higher chance of rewards. +* **Operator fee** — The percentage cut the node takes from rewards to cover the cost of its operations (e.g., blockchain transaction fees). + +You’re generally looking for a node with high power, high health, and a reasonable fee. + + + +**Can a node operator take my stake?** + +No. Delegated TRAC tokens are locked in smart contracts. Node operators cannot access or withdraw your stake. You can withdraw your tokens yourself after initiating a withdrawal and completing the 28-day unstaking period. + + + +**What happens if my Core Node goes offline during an upgrade?** + +The DKG is designed to be resilient. If your node goes offline (e.g., for maintenance), it won’t be slashed, but it may miss proof submissions during that downtime. This means: + +* Your node’s **Node Health** score may decrease. +* You may receive lower rewards for that epoch. + +For node operators, maintaining node uptime is one of the highest priorities, and it is therefore incentivized. You can use the **DKG Sync** feature to catch up on missed data and rejoin the network without penalty. + + + +**Will my node be penalized if it misses a proof?** + +No, there is no slashing or loss of stake for missing a proof. However, missing proofs results in a **zero score** for that proof period, which reduces your total rewards for the epoch. Node Health will also decrease temporarily. 
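The zero-score behavior described above can be shown with a toy score aggregation. This is a sketch only: the real proofScore formula combines stake, publishing factor, uptime, and ask price, and is not reproduced here.

```python
def epoch_score(proof_scores):
    # totalNodeScore = sum of proofScore over all proof periods in the epoch;
    # a missed proof simply contributes zero (there is no slashing of stake)
    return sum(proof_scores)

full_uptime = [10] * 48            # toy node: 48 proof periods, score 10 each
two_missed  = [10] * 46 + [0, 0]   # the same node missing two proof periods

print(epoch_score(full_uptime))    # 480
print(epoch_score(two_missed))     # 460: lower rewards, but stake untouched
```

The downside of downtime is purely opportunity cost within the epoch, which is why missed proofs reduce rewards without putting the delegated stake at risk.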
+ + + +**What’s the timeline for the V8.1 reward system rollout?** + +The rollout is structured as follows: + +1. **V8.1.0** – Random Sampling live (current proofs and rewards) +2. **V8.1.1** – V6 rewards compatibility module +3. **V8.1.2** – Tuning Period rewards module + +These are released in quick succession as development and testing complete, rolling out in June 2025. + + + +**How can I improve my node’s performance and earn more rewards?** + +Node rewards are based on five main factors: + +1. **Uptime and proof submissions** — Stay online and responsive to increase your Node Health. +2. **Publishing activity** — Publish Knowledge Assets or [open your publishing API](../../../../graveyard/everything/dkg-core-node/how-to-open-up-your-node-for-publishing.md) for others to use to increase your Node Power. +3. **Ask price** — Set a fair, competitive service fee, increasing your Node Power. +4. **Operator fee** — Set a low operator fee to make your node more attractive to delegators. +5. **Promote your node** — Encourage TRAC holders to delegate their stake to your node to increase your Node Power. + +**How are rewards collected for V6 and the Tuning Period?** + +Both the V6 rewards and the Tuning Period (between V8.0.0 and V8.1.0) are processed using the new **Random Sampling** reward system: + +* **V6 rewards** use a retroactive application of Random Sampling based on available historical data. +* **Tuning Period rewards** assume all eligible nodes had **full uptime** and participation. Rewards for this period will be claimable once V8.1.2 is released. + +**Who is eligible for V6 era rewards?** + +Nodes that were active during the DKG V6 period (before the V8 upgrade) are eligible to earn accumulated rewards from that era. V6 rewards will be distributed via a compatibility module introduced in version V8.1.1, and are consolidated for the period of all 12 epochs in 2025 (1 year). 
Specifically, eligible nodes will submit proofs for upcoming epochs based on the same principles as those for V8 era rewards. + +Reward claiming for already elapsed epochs **will assume 100% uptime for that period (all nodes will have equal Node Health),** so nodes will compete on **Node Power**. Further technical details will be released with the V6 compatibility module release on the DKG testnet. + diff --git a/docs/dkg-knowledge-hub/learn-more/introduction/random-sampling-dkg-proof-system/random-sampling-rollout.md b/docs/dkg-knowledge-hub/learn-more/introduction/random-sampling-dkg-proof-system/random-sampling-rollout.md new file mode 100644 index 0000000..89d88b2 --- /dev/null +++ b/docs/dkg-knowledge-hub/learn-more/introduction/random-sampling-dkg-proof-system/random-sampling-rollout.md @@ -0,0 +1,65 @@ +# Random Sampling rollout + +The deployment of the DKG Random Sampling system follows a carefully phased approach to ensure stability, fairness, and backward compatibility with earlier versions of the DKG. This page outlines the rollout timeline, what each release includes, and what node operators and delegators can expect. 
+ +### Overview of releases + +#### ✅ Phase 1: RFC Confirmation + +* **Goal:** Finalize community review of [OT-RFC-24](https://github.com/OriginTrail/OT-RFC-repository/tree/main/RFCs/OT-RFC-24_DKG_V8_Random_sampling_proof_system) +* **Outcome:** Approval of the Random Sampling design by the community and core development team +* **Phase status:** Completed + +#### 🚀 Phase 2: V8.1.0 — Initial rollout (Active Proof System) + +* **Features introduced:** + * Live Random Sampling challenge generation and proof submission + * Score calculation per proof period + * Epoch-based reward allocation system + * DKG Sync system goes online +* **Who is affected:** + * **Core Nodes** begin submitting live proofs + * **Delegators** start accruing rewards for the active epochs (but do not yet claim rewards for past epochs) +* **Purpose:** Establish live PoK infrastructure and real-time score updates for staking participants +* **Phase status:** Live on testnet; mainnet rollout expected in June 2025 + +#### 🔄 Phase 3: V8.1.1 — V6 Compatibility Module goes online + +* **Goal:** Allow accumulated rewards from the DKG V6 era to be earned retroactively +* **How it works:** + * Applies the Random Sampling system retroactively for TRAC fees in the V6 period + * Nodes that were active during the V6 phase can compete for retroactive rewards, claiming tokens for completed epochs and competing for upcoming epochs. 
+* **Compatibility duration:** Available for a limited window (12 months from V8.0 launch) +* **Phase status:** Testnet & mainnet rollout expected in June 2025 + +#### 🛠 Phase 4: V8.1.2 — Tuning Period Module + +* **Goal:** Distribute locked fees for the period between the V8.0.0 and V8.1.0 releases +* **How it works:** + * All nodes active during this period are considered to have **100% uptime** for the purpose of reward allocation + * The same score distribution and reward formula are applied as in the live Random Sampling system +* **Phase status:** Testnet & mainnet rollout expected in June 2025 + +### Rollout principles + +* **Incremental activation** — Each module builds on the previous one, ensuring that no backward compatibility is broken +* **Transparency first** — The DKG Staking Dashboard will be upgraded to reflect: + * **Node Power**, which will represent the node's current power in the network with regard to its stake, service ask, and publishing factor + * **Node Health**, computed based on proof submissions per epoch + * Compatibility module reward status +* **Fail-safe participation** — Delegators and node runners are protected from loss of eligibility due to technical transitions, with catch-up mechanisms in place for reward collection + +### What you should do + +#### Node runners + +* Follow the release announcements and update your node as soon as the V8.1 release goes live on the mainnet. +* Once your node is updated, monitor and maintain its proof submission health. +* Optionally enable auto-claiming for delegators via node services to automatically restake all rewards and increase your Node Power. + +#### Delegators + +* You don’t need to take immediate action. +* Make sure you understand when rewards become claimable based on finalization rules. 
+* Use the upgraded Staking Dashboard to track your delegator epoch scores and rewards + diff --git a/docs/dkg-knowledge-hub/learn-more/introduction/rules-and-token-thresholds.md b/docs/dkg-knowledge-hub/learn-more/introduction/rules-and-token-thresholds.md new file mode 100644 index 0000000..4d8cc9a --- /dev/null +++ b/docs/dkg-knowledge-hub/learn-more/introduction/rules-and-token-thresholds.md @@ -0,0 +1,23 @@ +# Rules & token thresholds + +In the unified node model, all nodes run the same software. The difference is in how much TRAC is staked: + +#### **Edge Node** + +
+ +* **Stake:** _No stake required_. +* **What it means:** Anyone can spin one up and start publishing/verifying knowledge. +* **Use case:** Perfect for developers, small teams, or community members experimenting with the DKG. +* **Incentives:** No direct staking rewards yet, but you can publish and contribute. + +#### **Core Node** + +
+ +* **Stake:** _50,000 TRAC minimum_. +* **What it means:** Your node graduates into full participation. +* **Use case:** Ready for production, with incentives and delegations enabled. +* **Incentives:** + * Eligible for rewards when securing the network. + * Others can delegate their stake to your Core Node, letting you earn operator fees. diff --git a/docs/dkg-knowledge-hub/learn-more/node-keys-wallets.md b/docs/dkg-knowledge-hub/learn-more/node-keys-wallets.md new file mode 100644 index 0000000..7e70848 --- /dev/null +++ b/docs/dkg-knowledge-hub/learn-more/node-keys-wallets.md @@ -0,0 +1,84 @@ +--- +description: >- + This section will guide you through the preparation of the node wallets (keys) + for each of the blockchains currently supported by the OriginTrail DKG node. +--- + +# Node keys (wallets) + +## What are node keys and why are they needed? + +To operate properly, a DKG node needs to execute transactions on the blockchains it is connected to, and for that it uses node keys (wallets) of the H160 type (Ethereum). Nodes have two different types of keys: + +* **operational** keys, which the node needs access to (you will need to generate and upload their private keys to your node) +* **admin** keys, which the node doesn't need access to; they are used to manage the node's on-chain configuration + +To get started, you will need at least one operational and one admin key. More details on keys can be found [here](broken-reference). + +The details of key setup can vary between particular blockchains, so they are presented separately below for clarity. + +Since DKG nodes are connected to one or more blockchains, they need appropriate keys (wallets) to perform actions on them. Therefore, for each blockchain you want to support with your node, you will need to set up several keys. + +More info can be found in [the node keys explained page](broken-reference). 
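As a hypothetical sketch of where these keys end up, the two key types appear in the node configuration file roughly as follows. The key names **operationalWallets** and **evmManagementWallet** are taken from the wallet sections below; the surrounding structure and field names are illustrative and may differ between node versions, so consult the configuration instructions for your release.

```json
{
  "modules": {
    "blockchain": {
      "implementation": {
        "otp:2043": {
          "config": {
            "operationalWallets": [
              { "evmAddress": "0x...", "privateKey": "0x..." }
            ],
            "evmManagementWallet": "0x..."
          }
        }
      }
    }
  }
}
```

The `0x...` values are placeholders for your own addresses and private keys; only operational keys require a private key on the node, while the admin key is referenced by address only.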
+ +{% hint style="info" %} +Since version 6.2.0, DKG nodes support multiple operational wallets. +{% endhint %} + +## NeuroWeb wallet preparation: + +To run your OriginTrail DKG node on the NeuroWeb blockchain, you will need to prepare and fund multiple wallet addresses, as presented below: + +* At least one “Operational” Ethereum wallet and an “Operational” Substrate wallet (defined in **operationalWallets**) +* An “Admin” Ethereum wallet (mapped to **evmManagementWallet**) +* An “Admin” Substrate wallet (mapped to **evmManagementWallet**) + +A node can use multiple operational wallets at once. Using multiple wallets is preferable and provides the node with more throughput for sending and updating commits and proofs, as well as performing other operations in parallel. + +TRAC is an ERC20 token on the Ethereum blockchain, transferred over to the NeuroWeb blockchain and enhanced with Polkadot native capabilities. To run an OriginTrail DKG node on the NeuroWeb parachain, TRAC tokens need to be transferred from Ethereum to the NeuroWeb blockchain network. + +To transfer your TRAC tokens to the NeuroWeb blockchain, a Teleport system has been established until a Polkadot <-> Ethereum bridge is ready. Therefore, to get your TRAC on the NeuroWeb blockchain, you need to [teleport your tokens first](https://teleport.origintrail.io/). + +Details are available in [OT-RFC-12](https://github.com/OriginTrail/OT-RFC-repository/blob/main/RFCs/OT-RFC-12%20OriginTrail%20Parachain%20TRAC%20bridges%20\(v2\).pdf). + +### **Mapping your operational and admin wallets** + +In order to complete the Teleport and set up your V6 node, you will need to perform a process of “mapping” your Ethereum and Substrate wallets. 
The mapping process is explained [here](https://docs.origintrail.io/blockchain-layer-1/origintrail-parachain/teleport-instructions). The process will require you to use an “account mapping” interface - an example usage video is available [here](https://www.youtube.com/watch?v=yltbdB1bpEA). + +During the installation process, the installer will ask you to provide the above wallet addresses (keys) and will automatically add them to your OriginTrail configuration file. + +Read more about operational and admin wallets [here](https://docs.origintrail.io/decentralized-knowledge-graph-layer-2/testnet-node-setup-instructions/node-keys). + +After you finish the mapping process, your keys are ready to be used by the OriginTrail DKG node. + +{% hint style="info" %} +The mapping procedure is part of the EVM implementation on NeuroWeb and is a one-time activity. +{% endhint %} + +## Gnosis and Chiado wallet preparation: + +To run your OriginTrail DKG node on the Gnosis or Chiado blockchains, you will need to prepare and fund at least the following wallets: + +* At least one “Operational” EVM wallet (defined in **operationalWallets**) +* 1x “Admin” EVM wallet (mapped to **evmManagementWallet**) + +It's recommended to assign multiple operational wallets to a node in order to increase the throughput and reliability of operations. 
+ +Once the wallets have been prepared, proceed to acquiring tokens on the desired network (testnet or mainnet). + + + +## Base and Base Sepolia wallet preparation: + +To run your OriginTrail DKG node on the Base or Base Sepolia blockchains, you will need to prepare and fund at least the following wallets: + +* At least one “Operational” EVM wallet (defined in **operationalWallets**) +* 1x “Admin” EVM wallet (mapped to **evmManagementWallet**) + +It's recommended to assign multiple operational wallets to a node in order to increase the throughput and reliability of operations. 
diff --git a/docs/dkg-knowledge-hub/learn-more/previous-updates/dkg-v8.0-update-guidebook/feature-roadmap.md b/docs/dkg-knowledge-hub/learn-more/previous-updates/dkg-v8.0-update-guidebook/feature-roadmap.md new file mode 100644 index 0000000..ea74178 --- /dev/null +++ b/docs/dkg-knowledge-hub/learn-more/previous-updates/dkg-v8.0-update-guidebook/feature-roadmap.md @@ -0,0 +1,37 @@ +# Feature roadmap + +## Release timeline + +The V8 deployment is a major update executed as a direct upgrade on the existing OriginTrail V6 network. The update process will, therefore, require several migrations to be executed. To successfully complete all the necessary activities, the V8 launch will be executed through an “**Upgrade Period**”, which will last for approximately 1 week. During the Upgrade Period, all relevant DKG components will be updated (nodes, blockchain contracts, Staking Dashboard, DKG Explorer, and others). The update will be executed on all three connected blockchains - NeuroWeb, Gnosis, and Base. + + + +

*Figure: V8 timeline illustration*

+ +The beginning of the Upgrade Period, marking the start of the V8 deployment, is scheduled for **Thursday, December 19th, 2024**, and the Upgrade Period is expected to end on **Thursday, December 26th, 2024**. + +## V8.0 - Launch and Tuning Period + +Following the launch of DKG V8.0, which will introduce fundamental updates and batch-minting features, the network will enter the **Tuning Period**. During this period, more features and parameter tuning updates are expected on all blockchains of the network. + +During the Tuning Period, DKG Core Nodes will be accruing rewards, which will be claimable after the expiry of the first epoch in V8, which is expected to be at the end of January (with the shortened epoch). + +The pending publishing rewards from the V6 period will also be made available through the new proof system after the upgrade, with the V6 tokenomics factor of TRAC stake being the only reward factor (as the distance factor is being deprecated), and available only to nodes online prior to the V8 launch. The new Random Sampling System will also tackle the problem of under-submission of proofs by V6 nodes, making the reward collection process more efficient. Due to the migration to the new system of synchronized epochs, the V6 reward pool will be distributed across 1 year. + +Features and updates expected for the Tuning period (V8.0.x): + +
| Feature | Description | Release | ETA |
| --- | --- | --- | --- |
| Paranets support | Introducing V8 paranets and re-enabling existing ones | V8.0.1 | mid-January 2025 |
| Tuning release | Updates discovered during the Tuning Period, backwards compatibility | V8.0.2 | January 2025 |
| Tuning release 2 | Updates discovered during the Tuning Period, updating V8 Knowledge Assets becomes operational | V8.0.3 | January 2025 |
+ +### Backward compatibility + +Due to changes in Knowledge Assets V2 and protocol updates, the V8.0 network is unfortunately not fully backward compatible with V6 clients. However, backward compatibility will be extended in the coming minor releases. **If you need any assistance with your V6 applications or migrating to V8,** [**contact the core developers in Discord directly**](https://discord.gg/xCaY7hvNwD)**, and we will gladly assist.** + +We intend to expand the level of backward compatibility over the V8.0 releases. + +## Post Tuning period + +The Tuning Period ends with **V8.1**, which fully introduces the **Random Sampling** features and completes the scalability updates. From this point on, the Core Nodes will start collecting rewards, and the system will be further tuned for different blockchains. More details on the Random Sampling features will be published in the documentation in due time. + +Features and updates expected after the Tuning period: + +
| Feature | Description | Release | ETA |
| --- | --- | --- | --- |
| Random sampling proof system (RSPS) | RSPS enables the proof system through which nodes collect rewards | V8.1.0 | February 2025 |
| Collective Programmatic Treasury support | CPT enablement | V8.2.0 | March 2025 |
diff --git a/docs/dkg-knowledge-hub/learn-more/previous-updates/dkg-v8.0-update-guidebook/how-to-upgrade-to-v8.md b/docs/dkg-knowledge-hub/learn-more/previous-updates/dkg-v8.0-update-guidebook/how-to-upgrade-to-v8.md new file mode 100644 index 0000000..67e614d --- /dev/null +++ b/docs/dkg-knowledge-hub/learn-more/previous-updates/dkg-v8.0-update-guidebook/how-to-upgrade-to-v8.md @@ -0,0 +1,142 @@ +--- +description: >- + Below you will find the guidelines for builders, TRAC delegators, and node + runners for the V8.0 release and activities expected in the Upgrade Period. +--- + +# How to upgrade to V8? + +
+ +## For builders + +**Edge Nodes: If you’re running edge-node services**, you should make sure the DKG Node Engine (ot-node service) is properly updated, as V6 Edge Nodes will not be compatible with the new features. You should also update to the newly released V8.0 DKG clients (dkg.js and dkg.py) in your projects - a sufficient level of backward compatibility is enabled for the V6 clients; however, keep in mind that the new features of V8 are somewhat different ([check the previous sections on updated Knowledge Assets](protocol-updates.md)). + +For the ot-node service update, please follow the instructions in the [#for-core-node-runners](how-to-upgrade-to-v8.md#for-core-node-runners "mention") section below. + +That said, individual pre-V8 Knowledge Assets will remain queryable via the GET protocol method. + +**Paranets**: If you are running a paranet on the DKG mainnet, you should be aware that it will be resynced if you have enabled paranet sync in the node config. The node will perform a re-sync by itself, and all the data will be available when the sync is completed. + +**Smart contracts**: If you are **querying DKG contracts** on any of the deployed chains, you can expect the productive version of the V8 smart contracts to be [available on the official repo](https://github.com/OriginTrail/dkg-evm-module/) after the release. + +For any questions, please reach out to the developer community in Discord. + +## For TRAC delegators + +If you have previously delegated your TRAC on any of the DKG-supported blockchains to Core Nodes, your stake will remain active and will continue to contribute to the network. In order to manage your stake (delegating, withdrawing, and redelegating), **you will have to perform a migration to the new system**. This migration will be facilitated by the new Staking Dashboard, which will enable executing migration transactions directly from your staking wallet. 
These transactions will burn your current share tokens and account for your stake directly in the new V8 smart contracts. + +There is no rush to migrate - your rewards will continue to accrue even if you don’t migrate. The only thing unavailable before migration is managing your stake. + +Learn more about the new Delegated staking [here](../../../../contribute-to-the-dkg/delegated-staking/). + +## For Core Node runners + +### Upgrading from V6 to V8 + +{% hint style="info" %} +This section is only for node runners who are already running their nodes on V6 and need to upgrade to V8. +{% endhint %} + +A few manual steps must be performed to successfully update your V6 node to a V8 Core Node. The following steps should be performed after the official V8 release has been deployed on the DKG mainnet. + +### Preparing your node for V8 update + +To have your node automatically download the latest version, verify that the auto-updater is enabled on the ot-node service. To enable the auto-updater, follow the instructions on the following [page](https://docs.origintrail.io/dkg-v6-current-version/node-setup-instructions/useful-resources/manually-configuring-your-node). + +Once the auto-updater is enabled in the .origintrail\_noderc file, restart the node to apply the configuration changes. When the update is released, your node will automatically pull the latest version (V8), install node modules, and restart. + +### Configuring the new V8 blockchainEvents module + +The new V8 ot-node engine introduces a blockchainEvents module, which you will need to configure on your node by adjusting the settings in the configuration file. The provided template includes configurations for three blockchains: otp:2043, gnosis:100, and base:8453. Depending on which blockchains your node is connected to, you should modify the template accordingly. + +* **Adjust the `blockchains` array**: Only include the blockchain IDs relevant to your node. 
For example, if your node is not connected to the gnosis:100 blockchain, remove it from the array. +* **Update the `rpcEndpoints` section**: Provide the appropriate RPC endpoints for each blockchain your node is connected to. Remove any endpoints that do not apply to your node’s blockchain configuration. + +**Example configuration with blockchainEvents added** + +``` +{ + "modules": { + "autoUpdater": { + + }, + "blockchainEvents": { + "enabled": true, + "implementation": { + "ot-ethers": { + "enabled": true, + "package": "./blockchain-events/implementation/ot-ethers/ot-ethers.js", + "config": { + "blockchains": [ + "otp:2043", + "gnosis:100", + "base:8453" + ], + "rpcEndpoints": { + "base:8453": ["https://"], + "gnosis:100": ["https://"] + } + } + } + } + }, + "blockchain": { + + }, + ... +``` + +### Running data migration + +After the node successfully starts with version 8, you **should manually run the data migration script** run-data-migration.sh. Before that, make sure you update the **MAIN\_DIR** variables in these two files if your main directory is **NOT** "root": + +1. run-data-migration.sh +2. constants.js + +Both files are located in the "**`/ot-node/current/v8-data-migration/`**" directory.\ +\ +After that, execute the data migration script in the same directory by running: + +```bash +bash run-data-migration.sh +``` + +The data migration script migrates all triples from their V6 to V8 triple store repositories. Make sure you have configured RPCs properly for all the blockchains your node is connected to. Depending on how much data your node is hosting, this migration might last several hours to several days. + +{% hint style="info" %} +This migration will not influence your node reward performance. Your node will remain fully operational, and all pre-V8 Knowledge Assets will remain queryable via the GET protocol method. 
+{% endhint %} + +If the data migration is interrupted for any reason (e.g., server restart), simply re-run the script (make sure the script is not already running, check the [#restarting-terminating-the-data-migration](how-to-upgrade-to-v8.md#restarting-terminating-the-data-migration "mention") section for more information). It will automatically resume from the point where it was interrupted. + +### Tracking data migration progress + +To track the data migration progress, check the nohup.out file located at "**`/ot-node/data/nohup.out`**", for example, by using the command: + +```bash +tail -f nohup.out +``` + +If you want to analyze the logs, we suggest taking a look at the migration log file located at "/**`ot-node/data/data-migration/logs/migration.log`**". + +### Restarting/terminating the data migration + +If you want to restart the script, make sure you terminate the old process first. You can find out whether the old process is still running by entering this into the terminal: + +```bash +ps aux | grep v8-data-migration +``` + +If the old data migration process is still running, you should see an output like this: + +```bash +root 1046979 30.9 3.8 22159880 152996 pts/0 Sl 11:12 0:15 node v8-data-migration.js +root 1047113 0.0 0.0 8164 712 pts/0 S+ 11:13 0:00 grep --color=auto v8-data-migration +``` + +Run "**`kill `**" command on the node process and repeat the steps under the[#running-data-migration](how-to-upgrade-to-v8.md#running-data-migration "mention") section. + +{% hint style="warning" %} +We strongly encourage all node runners to update as soon as the release is out to ensure continued compatibility and to take advantage of the latest features and improvements. 
+{% endhint %}
diff --git a/docs/dkg-knowledge-hub/learn-more/previous-updates/dkg-v8.0-update-guidebook/protocol-updates.md b/docs/dkg-knowledge-hub/learn-more/previous-updates/dkg-v8.0-update-guidebook/protocol-updates.md
new file mode 100644
index 0000000..a530382
--- /dev/null
+++ b/docs/dkg-knowledge-hub/learn-more/previous-updates/dkg-v8.0-update-guidebook/protocol-updates.md
@@ -0,0 +1,122 @@
+---
+description: What's new in the infrastructure updates
+---
+
+# Protocol updates
+
+
+The following description of updates should be considered a summary. **It assumes some knowledge of how the DKG V6 system worked.** The details of each feature will be further explained in the dedicated documentation pages.
+
+## Scalability updates
+
+The major improvement V8 brings is orders of magnitude higher throughput. Improving on V6, which peaked at 100k Knowledge Assets per day, we expect OriginTrail V8 to be able to reach up to 1000x that number, enabling Internet-scale neuro-symbolic AI.
+
+The two major features enabling scalability are:
+
+* **Batch minting of Knowledge Assets**, which enables minting hundreds of Knowledge Assets in one transaction (compared to previously minting just a single Knowledge Asset), as well as easier creation of whole collections of Knowledge Assets. Together with the new Edge Node’s Knowledge Mining API, it will now be very easy to create hundreds of Knowledge Assets from various data inputs such as PDFs, CSVs, JSON, and many others. The capability comes together with an updated version of Knowledge Assets, explained further below.
+* **Random sampling proof system**, which makes the protocol far less gas-intensive and simpler to implement. This means:
+  * **V6 service agreements** are being replaced by a random sampling algorithm, under which Core Nodes periodically prove randomized elements of the DKG, significantly lowering the protocol’s blockchain footprint. **This will unlock significant capacity on all the blockchains connected to the DKG by reducing protocol-required node transactions.**
+  * **Neighborhoods are superseded by sectors**: In V6, nodes were organized in a p2p hash ring with fluid ranges called “neighborhoods”. A neighborhood consisted of the 20 nodes closest in the hash ring (in terms of hash distance) to the Merkle root hash of the currently published Knowledge Asset.
In V8, nodes are no longer grouped in neighborhoods, but rather in fixed “sectors”. A **sector** is implemented as a fixed set of Core Nodes (with hash distance no longer being relevant), and all the nodes active in V6 (with over 50k TRAC stake) will be automatically transferred to the initial V8 sector. Sectors will scale over time by fractal division (once a sector grows too big, it will be split in half). All nodes in a sector host the entire sector of the DKG. It is expected that the initial V8 sector will be the only sector and will not split for some time (several months or even years), and once more sectors are available on the network, it will be possible to have Core Nodes join multiple sectors.
+  * **Epochs**: Knowledge Asset lifetime and publishing fees will now follow synchronized epochs, rather than each Knowledge Asset service agreement having its own “timeline of epochs”. This makes the system simpler to track, implement, and understand. Together with this update, epochs in V8 will also be shortened from 3 months to 1 month.
+  * **Efficient reward collection**: Nodes will collect rewards more efficiently, as they will be able to collect fees for multiple proofs at once, saving on gas costs. Node runners will be able to configure the frequency of reward collection for their node.
+
+## New version of Knowledge Assets
+
+
+DKG V6 introduced Knowledge Assets for the first time - a revolutionary AI primitive initially implemented as 'containers of knowledge'. More specifically, a Knowledge Asset in DKG V6 is implemented as a binding between a triple store [named graph](https://en.wikipedia.org/wiki/Named_graph), containing as many triples as one wants, and an on-chain ERC721 NFT containing the Knowledge Asset Merkle root. The Knowledge Asset is addressed by the use of a special kind of ownable URI - a **Uniform Asset Locator**, or UAL, which is provably globally unique, is a qualified Decentralized Identifier (DID), and is transferrable (via ERC721).
+
+Knowledge Assets have been deployed productively in [various contexts](https://origintrail.io/solutions/overview) and have yielded two key feedback points from builders:
+
+1. For most builders, it was more intuitive to think in terms of having **one Knowledge Asset correspond to one real-world entity** (building, train, wheel, song, etc.). Having the ability to put multiple entities in a “container of knowledge” (Knowledge Asset in V6) was often not used because of this intuition, and likely because it requires a deeper understanding of the underlying implementation, such as the concept of named graphs.
+2. More importantly, from a semantic perspective, **pointing to a UAL of a named graph is much less practical than referring to a UAL of an actual knowledge graph (KG) entity** when building a knowledge graph or a paranet.
+
+Combining those two learnings with the scalability improvements, the DKG V8 brings an updated version of Knowledge Assets, which are now knowledge graph entities. Named graphs are now used to implement “Knowledge Collections” (equivalent to NFT collections) that can still be referenced by a UAL. Each resource in the graph becomes a Knowledge Asset, represented by an ERC1155Delta standard token on-chain.
This satisfies both feedback points 1 and 2, making Knowledge Assets more intuitive, semantically useful, and easier to operate with. + +Given the scalability updates and the new paradigm introduced with Knowledge Assets in V8, it is expected that the cost of publishing per Knowledge Asset will go down significantly. + +## V8 staking updates + +
+
+The V8 staking system will also be improved in many ways, specifically in terms of on-chain visibility and transparency of the system, as well as user experience and the interface.
+
+The V8 staking system is inspired by the Uniswap V3 implementation, just as the V6 system was inspired by Uniswap V2, from which it adopted the concept of “**node share tokens**”. In V6, node share tokens are ERC20 tokens that represent the “share” of one’s delegated TRAC stake in a node. Share tokens are minted when TRAC is delegated to a node and burned when one withdraws it.
+
+In V8, the share tokens are going away, replaced by a more optimal system. **However, this will require all V6 delegators to migrate their staking tokens**. The updated Staking Dashboard will facilitate this migration. The migration will require delegators to submit several transactions that will burn their share tokens and account for the stake in the new V8 smart contracts. The V8 staking mechanism remains fully non-custodial.
+
+**IMPORTANT**: Note that **this migration can be done at any time.** If you are a stake delegator in V6, your TRAC stake will still be accruing rewards and be accounted for in the contracts after the V8 update. The migration is only required for delegators to be able to interact with the new V8 system (to delegate TRAC, withdraw rewards, etc.).
+
+The new Staking Dashboard will also provide new statistics and ways to measure your TRAC stake performance, according to the updated tokenomics described in [OT-RFC-21](https://github.com/OriginTrail/OT-RFC-repository/blob/main/RFCs/OT-RFC-21_Collective_Neuro-Symbolic_AI/OT-RFC-21%20Collective%20Neuro-Symbolic%20AI.pdf), and will continue to be upgraded, particularly during the Tuning period.
+
+## Core and Edge Nodes as neuro-symbolic AI systems
+
+
+The DKG V8 introduces a **new concept of a DKG Edge Node**, designed to run the DKG on edge devices such as laptops, phones, and others. DKG Core Nodes, on the other hand, are designed to host the DKG and form a secure network backbone. The new **Edge Node includes neuro-symbolic services such as:**
+
+* Knowledge Mining API, which facilitates Knowledge Asset graph creation pipelines
+* DRAG API, which provides a framework for performing Decentralized Retrieval Augmented Generation (DRAG) on the DKG
+* Multi-modal LLM interfaces
+* A customizable UI
+* A publishing service, and more.
+
+**Edge and Core Nodes share most of their code and services**, and as a builder, you can extend and combine them in different ways, including creating custom services for your node or paranet. The key differences between Core and Edge Nodes are in their mode of operation:
+
+**DKG Edge Node:**
+
+* Is designed for privacy-preserving applications, where private Knowledge Assets, local LLMs, and knowledge pipelines enable a maximally privacy-preserving environment.
+* Is not intended to host the public DKG and therefore does not require TRAC stake as collateral. As a result, the Edge Node also cannot collect publishing fees and network rewards.
+* Does not require high uptime, unlike the DKG Core Node.
+* Can be used as a substrate to quickly create neuro-symbolic AI applications based on DKG paranets.
+* Can be “converted” into a Core Node, assuming the right conditions (interesting for builders of paranets who would like to recuperate some of the publishing fees by running a Core Node).
+
+**DKG Core Node:**
+
+* Is designed to host the public DKG and provide high availability, so it is incentivized to maintain high uptime.
+* Requires a minimum stake of 50,000 TRAC tokens to be eligible to join the network; a higher stake increases a Core Node’s reward chances.
+* Can be used as a “gateway node” for publishing, and with a higher publishing rate, the chances of network rewards grow.
+* Performs the network services for a fee, determined globally by the pricing mechanism from all the individual Core Node asks. However, nodes with lower asks (less “greedy” nodes) have a higher chance of receiving network rewards.
+
+More details on the system are provided in the [OT-RFC-21](https://github.com/OriginTrail/OT-RFC-repository/blob/main/RFCs/OT-RFC-21_Collective_Neuro-Symbolic_AI/OT-RFC-21%20Collective%20Neuro-Symbolic%20AI.pdf).
+
+For more information on the DKG Edge Node, [visit this page](../../../../graveyard/everything/dkg-edge-node/), or see [this talk at DKGCon2024](https://youtu.be/9WeVMHH3gg4?t=5831).
+
+## Protocol pricing updates
+
+With the protocol tokenomics upgrade, the DKG publishing fee system is also being optimized. Building on learnings from V6 and removing neighborhood pricing complexities, the new system of “sectors” relies on a simpler and more robust implementation of pricing on the network. The new implementation is **expected to solve the reward “cliff effect”** (where only nodes with close to the maximum stake were receiving rewards), as well as some recurring problems with obtaining the right market price.
+
+\
+To determine the price of network service in V8 in a specific sector (V8.0 starts with only 1 sector), a sector-wide DKG publishing fee is computed based on individual node asks. To achieve a price equilibrium, the V8 pricing system utilizes the statistics-based sigma approach, determining the **publishing fee as a stake-weighted average of all Core Node asks within 1 standard deviation from the current price equilibrium**.
That is a mouthful :), so here’s the formula: + +$$ +\text{serviceFee} = \frac{\sum \left( \text{nodeStake}_i \times \text{nodeAsk}_i \right)}{\sum \left( \text{nodeStake}_i \right)} +$$ + +for all nodes which satisfy the + +$$ +\text{serviceFee} - \sigma \leq \text{nodeAsk} \leq \text{serviceFee} + \sigma +$$ + +where $$\sigma$$ is the standard deviation across the population of all node asks. If a Core Node ask is outside of the bounds, it will, therefore, not influence the DKG service fee. + +This means the fee will also be available on-chain and can be easily read, eliminating the previously encountered “bid suggestion” issues that would occur in V6. + +{% hint style="info" %} +You can manage your Core Node ask in the new [Staking Dashboard](https://staking.origintrail.io). +{% endhint %} + +As a key implementation component requiring real economic conditions for proper testing, the pricing system will be improved during the **Tuning period**. + +### Upgraded tokenomics + +The [OT-RFC-21](https://github.com/OriginTrail/OT-RFC-repository/blob/main/RFCs/OT-RFC-21_Collective_Neuro-Symbolic_AI/OT-RFC-21%20Collective%20Neuro-Symbolic%20AI.pdf) has laid out the V8 updated tokenomics with additional factors incentivizing positive behavior on the network. It sets **4 key factors for Core Node performance:** + +* The amount of TRAC stake, +* The amount of publishing performed through the node, +* The node ask, and +* Total uptime. + +We encourage exploring the RFC further until the detailed documentation for the implementation is published. 
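As an illustration of the sigma-based fee formula above, here is a minimal Python sketch. This is not the on-chain implementation: the reference-price input, the data shapes, and all numbers are assumptions for illustration only.

```python
# Sketch: stake-weighted service fee over node asks within one standard
# deviation of a reference price (assumed here to be supplied externally).
from statistics import pstdev

def service_fee(nodes, reference_fee):
    """nodes: list of (stake, ask) pairs; reference_fee: current equilibrium."""
    sigma = pstdev([ask for _, ask in nodes])  # std dev across all node asks
    # Keep only asks within one sigma of the reference fee
    eligible = [(s, a) for s, a in nodes if abs(a - reference_fee) <= sigma]
    # Stake-weighted average ask of the eligible nodes
    return sum(s * a for s, a in eligible) / sum(s for s, _ in eligible)

nodes = [(100_000, 1.0), (200_000, 1.2), (50_000, 5.0)]  # (TRAC stake, ask)
fee = service_fee(nodes, reference_fee=1.1)
```

With these invented numbers, the outlier ask of 5.0 falls outside one standard deviation, so it does not influence the resulting fee, matching the bounds condition in the formula.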
diff --git a/docs/dkg-knowledge-hub/learn-more/previous-updates/whats-new-with-origintrail-v8.md b/docs/dkg-knowledge-hub/learn-more/previous-updates/whats-new-with-origintrail-v8.md new file mode 100644 index 0000000..6c2989b --- /dev/null +++ b/docs/dkg-knowledge-hub/learn-more/previous-updates/whats-new-with-origintrail-v8.md @@ -0,0 +1,42 @@ +--- +description: >- + Bringing the power of DKG to any device, for anyone, on any chain at Internet + scale +--- + +# What's new with OriginTrail V8 + +
+
+OriginTrail Decentralized Knowledge Graph (DKG) V6 has been battle-tested in real-world applications increasingly used by an ecosystem of organizations and government-supported initiatives. To date, no decentralized system has scaled in the production environment the way DKG V6 has. However, the DKG’s current capacity has reached its limits in supporting growing usage requirements, prompting a transition to DKG V8, which has evolved to tackle the scale at which AI is consumed in any environment.
+
+Data has been growing exponentially for decades, with AI driving further growth acceleration — according to the latest estimates, [402.74 million terabytes of data are created each day](https://www.techbusinessnews.com.au/blog/402-74-million-terrabytes-of-data-is-created-every-day/). This trend is increasingly visible in the rising demand for additional DKG capacity, driven by data-intensive industry deployments in aerospace, manufacturing, railways, consumer goods, construction, and new solutions encompassing content from some of the most searched-for websites, fueling DKG growth.
+
+Version 8 of the DKG has therefore been designed **with major scalability improvements at multiple levels**, with a prototyped implementation tested in collaboration with partners from the sectors mentioned above.
+
+The major advancement DKG V8 makes is expanding the OriginTrail ecosystem’s product suite to 3 key products:
+
+* **DKG Core Node V8** — highly scalable network nodes forming the network core, persisting the public replicated DKG
+* **DKG Edge Node V8** — user-friendly node applications tailored to edge devices (phones, laptops, etc.)
+* **ChatDKG V8** — the launchpad for creating neuro-symbolic AI solutions with the DKG
+
+
+The newcomer in the product suite is the **DKG Edge Node** — a new type of DKG node enabling the OriginTrail ecosystem to tackle the global challenges described above. As the name suggests, DKG Edge Nodes can operate on Internet edge devices. Personal computers, mobile phones, wearables, and IoT devices, as well as enterprise and government systems, hold huge volumes of very important data activity, which DKG Edge Nodes will enable to enter the AI age in a safe and privacy-preserving way. The DKG Edge Node will enable such sensitive data to remain protected on the device, giving owners full control over how their data is shared.
+
+While remaining protected on the device, Edge Node data becomes part of the global DKG, with precise access management permissions controlled by the data owner. In this way, AI applications that the owner allows data access to will be able to use it together with the public data in the DKG via Decentralized Retrieval Augmented Generation (dRAG).
+
+Since such AI applications can also run locally on devices, this enables fully privacy-preserving AI solutions aimed at the ever-growing number of devices on the network edge that can use both public and private DKG data simultaneously. The introduction of the DKG Edge Node enables the DKG to quickly expand into the largest, Internet-scale decentralized physical infrastructure network (DePIN).
+
+To unlock these powerful capabilities, the DKG Edge Node includes new features that have previously not been available on DKG nodes but were elements of other proprietary or open-source products. To enable seamless creation of knowledge, DKG nodes will inherit the proven knowledge publishing pipelines from the Network Operating System (nOS). The data protection techniques for private and sensitive data will be based on the NGI-funded OpenPKG project outcomes.
The DKG Node Engine will support all major standards such as GS1 Digital Link, EPCIS, Verifiable Credentials, and Decentralized Identities. To support the growing field of knowledge graph implementations globally, it will enable seamless integrations with major knowledge graph providers such as Ontotext, Oracle, Snowflake, Neo4j, Amazon Neptune, and others.
+
+To give DKG V8 the best possible start, the core developers have launched an **incentivized testnet program to which anyone can contribute and get rewarded in TRAC tokens.**
+
+### What are the new capabilities in DKG V8?
+
+Some of the new features and products in the DKG V8 include:
+
+* **DKG Edge Nodes:** Expanding the #DePIN capabilities of the DKG, OriginTrail V8 will enable anyone to run DKG-enabled trusted AI on their devices with **DKG Edge Nodes.** This preserves privacy and opens up a plethora of new capabilities for builders. Your devices will participate in the global knowledge marketplace through on-device DKG nodes, monetizing your knowledge for you, under your control, in the way you decide. Edge Nodes (formerly called "light" nodes) are capable of running your trusted AI agents, contributing to network capacity, robustness, and shared knowledge.
+* **Random sampling proof system:** Through this upgraded proof system, we expect at least a 100x scalability improvement compared to V6 across all blockchains. Random sampling is an optimistic proof system that enables growing the DKG without growing the transaction volume on blockchains, as random sampling has time-constant efficiency on the blockchain layer. Details are to be published in a more comprehensive RFC.
+* **Knowledge Assets V2:** Introducing batch minting (creating multiple Knowledge Assets in one transaction) together with improved network resolution, making Knowledge Asset I/O operations both faster and much less gas-intensive on-chain.
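To build intuition for why batch minting lowers costs, here is a toy cost model. The gas figures are made-up placeholders, not measured DKG costs; the real savings also depend on the protocol and token-standard details.

```python
# Toy model: each transaction pays a fixed base overhead, so minting n
# assets in one batched transaction amortizes that overhead across them.
BASE_TX_GAS = 21_000    # assumed fixed per-transaction overhead
PER_ASSET_GAS = 50_000  # assumed marginal cost per Knowledge Asset

def individual_minting(n):
    # n separate transactions, each paying the base overhead
    return n * (BASE_TX_GAS + PER_ASSET_GAS)

def batch_minting(n):
    # one transaction paying the base overhead once
    return BASE_TX_GAS + n * PER_ASSET_GAS

saved = individual_minting(100) - batch_minting(100)  # (100 - 1) * BASE_TX_GAS
```

Under this (assumed) model, batching 100 mints saves 99 copies of the per-transaction overhead, which is why savings grow with batch size.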
+ diff --git a/docs/dkg-knowledge-hub/learn-more/readme/README.md b/docs/dkg-knowledge-hub/learn-more/readme/README.md new file mode 100644 index 0000000..5176bb5 --- /dev/null +++ b/docs/dkg-knowledge-hub/learn-more/readme/README.md @@ -0,0 +1,25 @@ +--- +description: >- + Build trusted neuro-symbolic AI with Knowledge Assets and the Decentralized + Knowledge Graph +cover: ../../../.gitbook/assets/kk.png +coverY: 0 +--- + +# Understanding OriginTrail + +> _“Q. **How do you propose to do this?**_ +> +> _A. **By saving the knowledge of the race**. The sum of human knowing is beyond any one man; any thousand men. With the destruction of our social fabric, science will be broken into a million pieces. Individuals will know much of the exceedingly tiny facets of which there is to know. They will be helpless and useless by themselves. The bits of lore, meaningless, will not be passed on. They will be lost through the generations. **But, if we now prepare a giant summary of all knowledge, it will never be lost**. Coming generations will build on it, and will not have to rediscover it for themselves. One millennium will do the work of thirty thousand.”_ +> +> Hari Seldon, **Foundation series by Isaac Asimov (1951)** + +OriginTrail is building a verifiable knowledge layer for AI, where knowledge is traceable, memory is decentralized, and humans remain in control. It aims to achieve this by organizing all human knowledge in a **Decentralized Knowledge Graph (DKG)** through a **collective neuro-symbolic AI** approach. + +A collective neuro-symbolic AI combines structured and connected information from symbolic AI (DKG) with the creativity of neural AI technologies (LLMs), building a **robust decentralized AI infrastructure.** + +This provides a powerful substrate for **trusted, human-centric AI solutions** to tackle some of humanity's most pressing challenges. 
It also **drives AI agents’ autonomous memories and trusted intents**, as both AI agents and robots become potent enough to act on behalf of humans. + +### Choose your learning path + +
+* **Delegated staking**: Stake your TRAC tokens to support the network and earn rewards through delegated participation.
+* [**DKG key concepts**](dkg-key-concepts.md): Explore the core ideas behind the DKG and learn how it enables trusted, verifiable, and AI-ready data.
+* **The DKG & MCP**: Get a quick look at the architecture and components that power the DKG.
diff --git a/docs/dkg-knowledge-hub/learn-more/readme/decentralized-knowle-dge-graph-dkg.md b/docs/dkg-knowledge-hub/learn-more/readme/decentralized-knowle-dge-graph-dkg.md new file mode 100644 index 0000000..ed1bc0b --- /dev/null +++ b/docs/dkg-knowledge-hub/learn-more/readme/decentralized-knowle-dge-graph-dkg.md @@ -0,0 +1,80 @@ +--- +description: What Bitcoin did for money, OriginTrail is doing for knowledge +--- + +# The OriginTrail Decentralized Knowledge Graph (DKG) + +The OriginTrail Decentralized Knowledge Graph (DKG) is a **global decentralized data structure that interlinks Knowledge Assets in a semantic format (RDF)**, hosted on a permissionless peer-to-peer network. It enables a **verifiable knowledge layer** for artificial intelligence (AI) and other advanced applications. + +### Why a Decentralized Knowledge Graph + +Modern AI applications increasingly demand: + +* **Structured, contextualized memory** +* **Data integrity and provenance** +* **Cross-system interoperability** + +The DKG meets these needs by uniting the **trust layer of blockchains**, the **semantic expressiveness of knowledge graphs (symbolic AI),** and **state-of-the-art generative AI models (neural AI).** + +### Why Use Blockchain? + +Blockchains enable: + +* **Trustless verification:** Every claim is anchored to a consensus-verified state +* **Decentralized Computation**: Blockchains enable consensus based code execution (e.g. via smart contracts) on decentralized networks, with no single point of control, perfect for building decentralized protocols like OriginTrail. 
+* **Data integrity and auditability:** Through cryptographic hashing and timestamping of data records on a blockchain, making it possible to verifiably track the origin of records and their update trail
+* **Tokenization:** Enabling decentralized participation and support of the system through the TRAC token, as well as the ability to tokenize data through Knowledge Assets
+
+In the DKG, blockchain smart contracts handle:
+
+* Identity (via DIDs)
+* Knowledge Asset ownership (via NFTs) and temporal state anchoring (via graph fingerprints)
+* Decentralized service agreements between nodes and DKG users (through publishing and staking)
+
+### Why Use Knowledge Graphs?
+
+Knowledge graphs are the best approach to managing knowledge due to their rich context, flexibility, and data integration capabilities. Specifically, knowledge graphs:
+
+* Represent **relationships and context** between data
+* Enable **semantic search and inferencing**
+* Are **machine-readable and human-intelligible**
+* Provide a foundation for interoperable data exchange
+
+The DKG leverages the RDF data model and SPARQL query language, aligning with W3C Semantic Web standards.
+
+### Why Does OriginTrail DKG Combine Blockchains and Knowledge Graphs?
+
+While powerful on their own, blockchains and knowledge graphs are **exponentially more effective** when combined:
+
+* **Trust meets meaning** — Blockchains provide decentralized trust and data integrity, while knowledge graphs provide structured, meaningful context.
+* **Immutable semantics** — Verifiable knowledge assets can be timestamped, hashed, and anchored on-chain, ensuring that their meaning is preserved over time.
+* **Decentralized collaboration** — Communities or ecosystems can build shared knowledge bases without central authorities, enabled by smart contracts and semantic interoperability.
+* **Discoverable and verifiable** — A knowledge graph indexed across a peer-to-peer network enables discovery of information, while blockchain ensures that what is found is provably authentic and untampered.
+
+This synergy creates a foundational infrastructure for the next generation of intelligent and verifiable applications, from AI agents to decentralized identity, supply chain, healthcare, and beyond.
+
+### How Does the DKG Support Neuro-Symbolic AI?
+
+Neuro-symbolic AI combines the pattern recognition capabilities of neural networks with the reasoning capabilities of symbolic systems like knowledge graphs. The DKG enhances this hybrid approach in several critical ways:
+
+* **Contextual grounding** — Structured RDF data allows LLMs and other neural systems to operate with clearly defined entities and relationships, enriching prompt responses and grounding model outputs in verifiable facts.
+* **Retrieval-augmented generation (RAG)** — SPARQL-compatible queries over the DKG can fetch symbolic facts at runtime, improving relevance and reducing hallucination in generated content.
+* **Symbolic reasoning** — The DKG’s RDF-based structure supports formal reasoning, inference rules, and logic-based querying, which neural networks can’t natively perform.
+* **Neural graph reasoning** — Neural-symbolic architectures that operate over graph structures benefit from the DKG’s richly connected semantic data, enabling advanced reasoning tasks like entity disambiguation, link prediction, and knowledge completion.
+* **Memory and feedback loops** — Persistent, verifiable memory allows autonomous agents to build and refer to structured experiences over time.
When shared across multiple agents through a common knowledge graph, this forms a **collective memory** — enabling agents to learn from each other, reason jointly, and contribute to a growing corpus of semantically interlinked knowledge.
+
+In short, the DKG is an essential infrastructure layer for building trusted, intelligent systems that integrate learning with logic.
+
+## System architecture
+
+OriginTrail synergizes blockchains, knowledge graphs (symbolic AI), and LLMs (neural AI) in a 3-layer architecture, where each layer is implemented as a decentralized network.
+
+**The trust layer leverages blockchains as trust networks,** established to enable reliable computation through **decentralized consensus**, operating as a global, dependable computer. It is used to track the origin, provenance, and integrity of knowledge, and to enable decentralized economic interactions in the system.
+
+**The knowledge base layer is the core of the DKG,** implemented as a peer-to-peer network of DKG Core Nodes, **leveraging knowledge graphs as semantic data networks** for knowledge management, tightly integrated with the trust layer.
+
+**The verifiable AI layer** hosts AI agents and systems that leverage both layers beneath it. This is the realm of the DKG Edge Nodes, as well as MCP and other integration enablers, with operations supported by the knowledge base and trust layers below.
+
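The symbolic retrieval step described above (fetching facts to ground a neural model) can be illustrated with a toy in-memory triple store. This is only a stand-in for a real RDF store with SPARQL, and the example data is invented:

```python
# Toy symbolic retrieval: triples as (subject, predicate, object) tuples,
# with a simple pattern match standing in for a SPARQL query engine.
triples = [
    ("ex:product1", "rdf:type", "schema:Product"),
    ("ex:product1", "schema:name", "Organic honey"),
    ("ex:product1", "schema:countryOfOrigin", "Slovenia"),
]

def match(pattern, graph):
    """Return all triples matching the pattern; None acts as a wildcard."""
    return [t for t in graph
            if all(p is None or p == v for p, v in zip(pattern, t))]

# "What do we know about ex:product1?" -- facts that could then be
# injected into an LLM prompt (the retrieval step of RAG).
facts = match(("ex:product1", None, None), triples)
```

The structured facts returned by the pattern match are exactly the kind of grounded context a retrieval-augmented generation pipeline would hand to a language model.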

The three layers of OriginTrail

diff --git a/docs/dkg-knowledge-hub/learn-more/readme/development-principles.md b/docs/dkg-knowledge-hub/learn-more/readme/development-principles.md new file mode 100644 index 0000000..7d65b4a --- /dev/null +++ b/docs/dkg-knowledge-hub/learn-more/readme/development-principles.md @@ -0,0 +1,22 @@ +--- +description: How we guide the OriginTrail ecosystem development +--- + +# Development principles + +The OriginTrail ecosystem is built around 3 core pillars: + +**Neutrality** — Being an open-source, decentralized system based on open global standards, neutrality is crucial for the OriginTrail ecosystem as it prevents vendor lock-ins, ensures integrity, and effectively breaks data silos. Neutrality means adopting co-creation principles and working with other blockchain ecosystems and solution builders even as they may be competing in the same market on the application level. The technical design of OriginTrail applies neutrality at several levels - integrating with multiple blockchains to enable users to make choices on their preferred chain, enabling integrations with any system by applying open standards and building in open source. + +**Usability** — The technologies underpinning Web3 are fundamental, protocol-level technologies. In order to ensure smooth onboarding of users, enterprises, and developers, there needs to be a great focus on usability and user experience. OriginTrail today is being used within [global enterprises and institutions](https://origintrail.io/solutions/overview). + +**Inclusiveness** — Continuing to form partnerships with technological and business global leaders who can employ the OriginTrail ecosystem in their communities. Catering to the needs of leading global communities requires us to make strides in designing technical infrastructure and business models that support the adoption of OriginTrail in diverse business communities. 
+
+Therefore, OriginTrail technology development follows these principles:
+
+* **Connection-first approach** — Building and utilizing synergetic technologies to connect the world’s data into a global, decentralized knowledge graph.
+* **Technological neutrality** — Avoiding technological lock-ins and striving towards technical agnosticism where possible.
+* **Decentralization** — Utilizing and creating technologies designed to prevent any central authorities from gaining power over the system.
+* **Privacy-by-design approach** — According to the [7 Foundational Principles](https://privacy.ucsc.edu/resources/privacy-by-design---foundational-principles.pdf) of Privacy by Design.
+* **Development transparency** — Towards the OriginTrail ecosystem community of developers, node runners, and businesses.
+* **Open-source development** — According to [Open Source Software](https://en.wikipedia.org/wiki/Open-source_model) principles.
diff --git a/docs/dkg-knowledge-hub/learn-more/readme/dkg-key-concepts.md b/docs/dkg-knowledge-hub/learn-more/readme/dkg-key-concepts.md
new file mode 100644
index 0000000..9b25784
--- /dev/null
+++ b/docs/dkg-knowledge-hub/learn-more/readme/dkg-key-concepts.md
@@ -0,0 +1,120 @@
+---
+description: >-
+  The OriginTrail Decentralized Knowledge Graph (DKG) introduces novel concepts,
+  such as Knowledge Assets, autonomous paranets, and others. Find an overview of
+  key concepts below.
+---
+
+# Core DKG concepts
+
+{% embed url="https://youtu.be/rfvknL1gVDc" %}
+
+## Knowledge Assets
+
+A Knowledge Asset is an ownable knowledge graph entity in the DKG with a verifiable provenance and source. It can describe any digital or physical object, abstract concept, or really any "thing." It can easily connect to other Knowledge Assets in the OriginTrail DKG, enabling the building of sophisticated graph representations of the world (a.k.a. the World model).
+ +More precisely, a Knowledge Asset is a web resource identified by a unique Uniform Asset Locator (or UAL, which is an extension of the traditional URL), consisting of: + +* **Knowledge:** In the form of graph data (RDF) and vector embeddings, stored on the DKG (not on the blockchain). +* **Cryptographic proofs:** Representing cryptographic digests of the knowledge stored on the blockchain. +* **Uniform Asset Locator**: Globally unique URI with assigned ownership using blockchain accounts, implemented as a non-fungible token (NFT) on the blockchain. +* **Derivable vector embeddings**: These facilitate the neuro-symbolic features - such as link prediction, entity prediction, similarity search, and others. + + + +
+ +Knowledge content can be observed as a time series of knowledge content states, or **assertions**. Each assertion can be independently verified for integrity: the verifier recomputes the cryptographic fingerprint and checks that the result matches the corresponding blockchain fingerprint record. + +Technically, an assertion is represented using the n-quads serialization and a cryptographic fingerprint (the n-quads graph Merkle root, stored immutably on the blockchain) used for assertion verification. + +**Knowledge Assets** can be both **public and private.** Public assertion data is replicated on the OriginTrail Decentralized Network and publicly available, while private assertion data is contained within the private domain of the asset owner (e.g., an OriginTrail node hosted by the asset owner, such as a person or company). + +In summary, a Knowledge Asset is a combination of an NFT record and a semantic record. Using the dkg.js SDK, you can perform CRUT (create, read, update, transfer) operations on Knowledge Assets, which are explained below in further detail. + +### Knowledge Asset state finality + +Like distributed databases, the OriginTrail DKG applies replication and therefore needs a mechanism to reach a consistent network state for Knowledge Assets. In the OriginTrail DKG, state consistency is reconciled using the blockchain, which hosts state proofs for Knowledge Assets, together with replication commit information from DKG nodes. This means that updates to an existing Knowledge Asset are accepted by the network nodes (similar to the way nodes accept Knowledge Assets on creation), and nodes can operate with all accepted states. + +## TRAC token + +The Trace token (TRAC) is the utility token that powers the OriginTrail Decentralized Knowledge Graph (DKG). Introduced in 2018 as an ERC-20 token on Ethereum with a fixed supply of 500 million, TRAC is essential for various network operations.
+ +**How TRAC is used** + +* **Publishing & updating Knowledge Assets** – TRAC is required to create, update, and manage Knowledge Assets in the DKG. +* **Node incentives** – Network nodes earn TRAC by hosting data, securing the network, and ensuring knowledge integrity. +* **Staking** – Nodes stake TRAC to increase their reputation and improve their ability to participate in the network. +* **Multi-chain compatibility** – TRAC operates across multiple blockchains, including Ethereum, NeuroWeb, Base, Gnosis, and Polygon. + +## Decentralized Retrieval Augmented Generation + +Patrick Lewis coined the term Retrieval-Augmented Generation (RAG) in a [2020 paper](https://arxiv.org/pdf/2005.11401.pdf). It is a technique for enhancing the accuracy and reliability of GenAI models with facts fetched from external sources. This allows artificial intelligence (AI) solutions to dynamically fetch relevant information before the generation process, enhancing the accuracy of responses by limiting the generation to re-working the retrieved inputs. \ +\ +**Decentralized Retrieval Augmented Generation (dRAG) advances the model by organizing external sources in a DKG with verifiable sources made available for AI models to use.** The framework enables a hybrid AI system that brings together neural (e.g., LLMs) and symbolic (e.g., Knowledge Graph) methodologies. Contrary to using a solely neural AI approach based on vector embedding representations, a symbolic AI approach enhances it with the strength of Knowledge Graphs by introducing a basis in symbolic representations. + +dRAG is, therefore, a framework that allows AI solutions to tap into the strengths of both paradigms: + +* The powerful learning and generalization capabilities of neural networks, and +* The precise, rule-based processing of symbolic AI. + +It operates on two core components: + +(1) the DKG paranets and + +(2) AI models. 
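In pseudocode-like JavaScript, the two components interact roughly as follows. All names here are hypothetical stand-ins for illustration, not OriginTrail SDK calls.

```javascript
// (1) A paranet's knowledge, as verifiable triples with provenance (UALs).
const paranet = [
  { s: "ACME company", p: "hasEmail", o: "office@acme.com", ual: "did:dkg:otp:2043/0xabc/1" },
  { s: "ACME company", p: "hasGLN", o: "123456789", ual: "did:dkg:otp:2043/0xabc/1" },
];

// Retrieval step: select the facts relevant to the user's question.
function retrieve(question) {
  return paranet.filter((t) => question.includes(t.s));
}

// (2) Generation step: the AI model is grounded in the retrieved facts,
// so every statement in the answer can cite its source UAL.
function generate(question, facts) {
  const context = facts
    .map((f) => `${f.s} ${f.p} ${f.o} (source: ${f.ual})`)
    .join("\n");
  return `Grounded answer for "${question}":\n${context}`;
}

const question = "What is the email of ACME company?";
const answer = generate(question, retrieve(question));
console.log(answer);
```

The point of the sketch is the ordering: retrieval from a verifiable knowledge source happens first, and generation is constrained to the retrieved, source-attributed inputs.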
+ +The dRAG applications framework is entirely compatible with the existing techniques, tools, and RAG frameworks and supports all major data formats. + +## Knowledge mining + +**Knowledge mining** is the process of producing high-quality, blockchain-tracked knowledge for AI pioneered by the OriginTrail ecosystem. This cyclical process leverages the key component of the OriginTrail technology - Knowledge Assets - which are ownable containers for knowledge with inherent discoverability, connectivity, and data provenance. + +Similarly to Bitcoin mining, where miners collectively provide computing resources to the network and receive incentives in coins, knowledge miners contributing useful knowledge to the OriginTrail DKG receive NEURO tokens. With knowledge mining incentives enabled across multiple blockchains, the ambition is to drive exponential growth of trusted knowledge in the OriginTrail DKG. + +Read more about knowledge mining in the [NeuroWeb docs](https://docs.neuroweb.ai/knowledge-mining). + +## RDF & SPARQL + +The Resource Description Framework (RDF) is a W3C standardized model designed to represent data about physical objects and abstract concepts (resources). It’s a model to express relations between entities using a graph format. + +RDF schemas provide mechanisms for describing related resources and their relationships. It is similar to object-oriented programming languages and differs in that it describes properties in terms of resource classes. RDF enables querying via the SPARQL query language. + +[Examples of schema definitions by schema.org](https://schema.org/docs/schemas.html) + +## What is an NFT? + +NFT—short for the non-fungible token—is a type of blockchain token used as an implementation component of Knowledge Assets in the OriginTrail DKG. 
The token represents ownership of the Knowledge Asset and enables its owner to perform all standardized NFT functionality, such as transferring ownership, listing it on NFT marketplaces, and using it as a rich NFT in Web3 applications. + +If you are interested in learning more about NFTs, you can find out more [here](https://en.wikipedia.org/wiki/Non-fungible_token). + +## What is a UAL? + +Uniform Asset Locators (UALs) are ownable identifiers of the DKG, similar to URLs in the traditional web. UALs follow the DID URL specification and are used to identify and locate a specific Knowledge Asset within the OriginTrail DKG. + +A UAL consists of six parts: + +* did (decentralized identifier) predicate +* dkg (decentralized knowledge graph) predicate +* blockchain identifier (otp:2043 = OriginTrail NeuroWeb Mainnet) +* blockchain address (such as the address of the relevant asset NFT smart contract) +* identifier specific to the contract, such as the ID of the NFT token +* optional query and fragment components + +An example UAL may look like this: + +``` +did:dkg:otp:2043/0x5cac41237127f94c2d21dae0b14bfefa99880630/318322#color +``` + +This UAL refers to the DKG on the NeuroWeb mainnet (otp:2043): the asset contract address is `0x5cac41237127f94c2d21dae0b14bfefa99880630`, the ID of the token is `318322`, and the fragment points to the property "color" inside its knowledge graph. + +More information on DID URLs can be found [here](https://www.w3.org/TR/did-core/#did-url-syntax). + +## Autonomous AI paranets + +The next building block of the DKG is **AI para-networks**, or **paranets**. + +Paranets are autonomously operated structures in the DKG, owned by their community through a paranet operator. In paranets, we find **assemblies of Knowledge Assets** driving use cases with associated **paranet-specific AI services** and an **incentivization model** to reward the knowledge miners fueling their growth.
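The UAL structure described above can be taken apart with a small parser. This is an illustrative helper for the example format shown here, not the authoritative grammar (see the DID URL specification for that):

```javascript
// Parse a UAL of the shape did:dkg:<blockchain>/<contract>/<tokenId>[#fragment].
function parseUal(ual) {
  const match = ual.match(
    /^did:dkg:(?<blockchain>[^\/]+)\/(?<contract>0x[0-9a-fA-F]+)\/(?<tokenId>\d+)(?:#(?<fragment>.+))?$/
  );
  if (!match) throw new Error(`Not a valid UAL: ${ual}`);
  return match.groups;
}

const { blockchain, contract, tokenId, fragment } = parseUal(
  "did:dkg:otp:2043/0x5cac41237127f94c2d21dae0b14bfefa99880630/318322#color"
);
// blockchain: "otp:2043" (NeuroWeb mainnet), tokenId: "318322", fragment: "color"
```

The fragment (`#color`) addresses an individual property inside the asset's knowledge graph, while the rest of the UAL locates the asset itself on a specific chain and contract.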
+ +**To see the DKG in action, continue to the** [**Quickstart section**](broken-reference)**.** diff --git a/docs/dkg-knowledge-hub/learn-more/readme/kg.md b/docs/dkg-knowledge-hub/learn-more/readme/kg.md new file mode 100644 index 0000000..9c8c32c --- /dev/null +++ b/docs/dkg-knowledge-hub/learn-more/readme/kg.md @@ -0,0 +1,68 @@ +--- +description: The power of semantic technology +--- + +# Linked data & knowledge graphs + +## The challenges of scattered data + +Developers constantly struggle with **discovering, sharing,** and **managing data** from different systems in different formats. This requires understanding, structuring, integrating, and verifying the data each time new features or applications are built based on that data. + +In Web2, **discoverability** is enabled by search engines, which return a list of web links based on the query you search for. **Sharing** and **managing** data is governed by centralized services and protocols that do not share common data structures and interfaces, making it complicated to access and use this data. + +As a simple example, let's consider a traditional relational (SQL) database dataset like the one below: + +| id | user | address | GLN | +| --- | ------------ | ------------------- | --------- | +| 987 | ACME company | Awesome st 2044, NY | 123456789 | + +Another system might keep some more data on this in another format, such as a CSV: + +``` +company_name;company_address;email +ACME company; Awesome st 2044, NY; office@acme.com +``` + +As humans, we can quickly understand that this data is related to the same **thing** (ACME company). However, this is not so obvious to software, as it generally doesn't have enough context. To use the example data above in apps, one needs to resolve the challenges of having different data structures, schemas, and means of access — requiring many operations to understand, integrate, and validate the datasets. For example, how would you query for the email address of ACME company? 
This is where the Semantic Web helps. + +### What is linked data and the Semantic Web? + +> _"The Semantic Web isn't just about putting data on the web. It is about making links, so that a person or machine can explore the web of data. With linked data, when you have some of it, you can find other, related, data."_ \ +> _- Tim Berners-Lee, the father of the World Wide Web and Semantic Web_ + +The core idea behind linked data is to represent all **things** with **relationships** between them in a common graph. Linked data is built on primitives called "**triples",** which connect a **subject entity** with an **object entity** via a **relationship**. + +![Example of a triple, with subject being Acme company and object being the address value](../../../.gitbook/assets/01.jpg) + +Triples are great because they can be used to create more complex data structures — graphs. Roughly speaking, connecting two triples gives us this. + +![](../../../.gitbook/assets/2.jpg) + +Integrating the two above-mentioned example datasets according to the principles of the Semantic Web will, therefore, render a graph structure like this. + +![](../../../.gitbook/assets/3.jpg) + + + +Having such a "semantic network" of data, we inherently add context and enable easy extensions. The semantic graph can be easily queried in many ways and enables growing a body of _knowledge_ around things rather than keeping "tables of strings". + +In the coming sections, we will show you how to use the OriginTrail Decentralized Knowledge Graph (DKG) for data discovery and querying. However, let's first explain what a knowledge graph is. + +### What is a knowledge graph? + +There are many definitions of knowledge graphs (KGs), all slightly different. Without emphasizing precision, all of them describe a knowledge graph as a network of entities — physical & digital objects, events, or concepts — illustrating the relationship between them (aka a semantic network). 
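To make the triple model concrete, here is the integrated ACME example as a toy in-memory triple store with a SPARQL-style pattern query, written in plain JavaScript (illustrative only; real deployments use an RDF triple store queried via SPARQL):

```javascript
// The integrated ACME graph from the example above, as
// (subject, predicate, object) triples.
const triples = [
  ["ACME company", "hasAddress", "Awesome st 2044, NY"],
  ["ACME company", "hasGLN", "123456789"],
  ["ACME company", "hasEmail", "office@acme.com"],
];

// Pattern matching in the spirit of SPARQL: null plays the role of a variable.
function query(s, p, o) {
  return triples.filter(
    ([ts, tp, to]) =>
      (s === null || ts === s) && (p === null || tp === p) && (o === null || to === o)
  );
}

// "What is the email address of ACME company?" becomes a single pattern:
console.log(query("ACME company", "hasEmail", null));
```

Because all facts live in one graph with shared identifiers, the question that required ad-hoc integration across the SQL table and the CSV file reduces to a single pattern match.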
KGs are used by major companies such as [Amazon](http://lunadong.com/talks/PG.pdf), [Google](https://en.wikipedia.org/wiki/Google_Knowledge_Graph), [Uber](https://www.youtube.com/watch?v=r3yMSl5NB_Q), [IBM](https://www.ibm.com/cloud/learn/knowledge-graph), and others, for various applications: search, data integration, knowledge reasoning, recommendation engines, analytics, machine learning, and AI. + +Key characteristics of knowledge graphs are: + +* Focus on data connections as "first-class citizens" (linked data) +* Designed to ingest data from multiple sources, usually in different formats +* Flexible data models, easily extendable + +For the moment, we restrict this document to a high-level introduction and encourage the reader to explore the many resources on the Semantic Web and knowledge graphs available online. + +![An illustration of a Knowledge Graph and its entities](https://lh4.googleusercontent.com/1rOyzC751vA96QcoPOOavfy_RlkKuEofhJ8M9I9KHSK_XCPuW5HxAvIioqSqFROkNMbEqD0Muq0yGKAHSA4ZqIYQgtsz-J0pmBJ64bzZARoHXOdMMNoA3VdD40yoTTLbhyyPfn4Rs7CS) + +**Knowledge graphs are commonly deployed within the domain of one organization and are designed to capture knowledge from various sources both from within and outside of the organization.** These centralized knowledge graphs generate huge value for their owners, yet a decentralized, globally shared knowledge graph brings orders of magnitude higher value to everyone participating. + +We present the **OriginTrail Decentralized Knowledge Graph (DKG)** as the first permissionless, global, open decentralized knowledge graph. Learn more about the [OriginTrail DKG here](decentralized-knowledge-graph-dkg.md).
+ diff --git a/docs/dkg-knowledge-hub/learn-more/readme/usdtrac-token.md b/docs/dkg-knowledge-hub/learn-more/readme/usdtrac-token.md new file mode 100644 index 0000000..cae87aa --- /dev/null +++ b/docs/dkg-knowledge-hub/learn-more/readme/usdtrac-token.md @@ -0,0 +1,54 @@ +--- +description: >- + The TRAC token serves as the foundational utility asset of the OriginTrail + DKG, enabling secure, decentralized, and incentivized participation in the + network. +--- + +# $TRAC token + +## Powering the knowledge economy + +Launched in 2018 as an ERC-20 token on Ethereum, TRAC has a fixed supply of 500,000,000 tokens (all tokens are in circulation). + +#### The TRAC token is used for: + +* **Publishing fees**: Knowledge publishers (e.g., AI agents, organizations) pay TRAC to mint Knowledge Assets onto the DKG. +* **Staking**: Core Nodes and delegators stake TRAC to secure the network and earn a share of publishing fees (there are no inflationary rewards in the system, or any other rewards other than the publishing fees) +* **Reward distribution**: TRAC is used to reward nodes for uptime, data availability, and performance in the random sampling system. +* **Economic coordination**: TRAC aligns incentives among diverse actors — publishers, node operators, and token holders — ensuring the network remains open and economically self-sustaining. + +Learn more about the TRAC token [here](https://origintrail.io/technology/trac-token) + +*** + +### Running a DKG Node + +To run your DKG Node and interact with the blockchain, you’ll need **two types of tokens** — but **what they are depends on the network you’re connected to**. + +1. **TRAC** — This is the OriginTrail utility token, used to publish and manage Knowledge Assets on the DKG. Regardless of the network you choose, you’ll always need TRAC for publishing operations. [**TRAC (OriginTrail Token)**](https://origintrail.io/get-started/trac-token) + +
+ +2. **Native gas token** — Every blockchain also requires its own “gas” token to process transactions, similar to how Ethereum uses ETH. The specific token you need depends on which chain your DKG Node is connected to: + +* On **NeuroWeb**, the gas token is **NEURO**. +* On **Base**, the gas token is **ETH (on Base)**. +* On **Gnosis**, the gas token is **xDAI**. + +In other words: + +* **TRAC** powers the _knowledge layer_ of your node. +* **Gas tokens** power the _transaction layer_ of the underlying blockchain. + +Your node needs **both** to operate — TRAC to publish verifiable data and the network’s native token to actually send those transactions on-chain. + +*** + +#### Next step: Build your AI agent with the DKG Node or learn about staking in OriginTrail + +Now that you understand what a DKG Node is and how it’s powered by $TRAC, you’re ready to take action. + +If you’d like to start building right away, jump ahead to the “[Build your AI agent with the DKG Node](https://app.gitbook.com/o/-McnF-Jcg4utndKcdeko/s/-McnEkhdd7JlySeckfHM/~/changes/408/build-your-ai-agent-with-the-dkg-node)” section — where you’ll set up, install, and configure your own DKG Node to connect with AI models. + +Or, if you want to learn more about tokenomics first, continue to “[Operator staking & delegated staking](https://app.gitbook.com/o/-McnF-Jcg4utndKcdeko/s/-McnEkhdd7JlySeckfHM/~/changes/408/get-started-with-dkg-node/usdtrac-powering-the-knowledge-economy/delegated-staking)” to explore how staking works across the OriginTrail ecosystem and how it powers trust, security, and participation. diff --git a/docs/dkg-knowledge-hub/useful-resources/README.md b/docs/dkg-knowledge-hub/useful-resources/README.md new file mode 100644 index 0000000..937ea6a --- /dev/null +++ b/docs/dkg-knowledge-hub/useful-resources/README.md @@ -0,0 +1,15 @@ +--- +description: >- + Access essential tools, services, and references for operating and building on + the DKG.
+--- + +# Useful resources + +#### Pages in this section + +* [**Public nodes**](https://app.gitbook.com/o/-McnF-Jcg4utndKcdeko/s/-McnEkhdd7JlySeckfHM/~/changes/408/dkg-knowledge-hub/useful-resources/public-nodes) – A list of public nodes you can query or connect to for development and testing. +* [**Test token faucet**](https://app.gitbook.com/o/-McnF-Jcg4utndKcdeko/s/-McnEkhdd7JlySeckfHM/~/changes/408/dkg-knowledge-hub/useful-resources/test-token-faucet) – How to obtain free testnet tokens for publishing and running a node. +* [**Community-created resources**](https://app.gitbook.com/o/-McnF-Jcg4utndKcdeko/s/-McnEkhdd7JlySeckfHM/~/changes/408/dkg-knowledge-hub/useful-resources/community-resources) – Explore open-source tools, libraries, and projects built by the OriginTrail community. +* [**Available networks & RPCs**](https://app.gitbook.com/o/-McnF-Jcg4utndKcdeko/s/-McnEkhdd7JlySeckfHM/~/changes/408/dkg-knowledge-hub/useful-resources/networks) – Full details on supported networks, endpoints, and RPC configurations. +* [**OT Node Engine implementation details**](https://app.gitbook.com/o/-McnF-Jcg4utndKcdeko/s/-McnEkhdd7JlySeckfHM/~/changes/408/dkg-knowledge-hub/useful-resources/ot-node-engine-implementation-details) – A deeper technical dive into how the OT Node engine works and interacts with the DKG. diff --git a/docs/dkg-knowledge-hub/useful-resources/community-resources.md b/docs/dkg-knowledge-hub/useful-resources/community-resources.md new file mode 100644 index 0000000..57428e0 --- /dev/null +++ b/docs/dkg-knowledge-hub/useful-resources/community-resources.md @@ -0,0 +1,19 @@ +--- +description: Tools and apps developed by the OriginTrail community +--- + +# Community created resources + +OriginTrail is a community-driven project with lots of great community-created resources. A shoutout to all the Tracers contributing to the project! 
+ +## Tools + +* [OTHub](https://othub.io/overview) — OriginTrail monitoring hub developed by the community + +## Websites + +* [TRAC Deep Dive](https://deepdive.othub.io/) — All-in-one central hub and OriginTrail deep dive maintained by the community + +{% hint style="info" %} +Want your resource featured here, or know of something that should be part of this list? Feel free to add it yourself by creating a pull request to this page on [GitHub](https://github.com/OriginTrail/dkg-docs). +{% endhint %} diff --git a/docs/dkg-knowledge-hub/useful-resources/networks.md b/docs/dkg-knowledge-hub/useful-resources/networks.md new file mode 100644 index 0000000..2193a49 --- /dev/null +++ b/docs/dkg-knowledge-hub/useful-resources/networks.md @@ -0,0 +1,24 @@ +# Available networks, network details, and RPCs + +### DKG Mainnet + +| Network Name | RPC URL | Chain ID | Currency Symbol | Block Explorer URL | | ---------------- | ---------------------------------------------------- | -------- | --------------- | ---------------------------- | +| Base Mainnet | https://base-mainnet.infura.io/v3/YOUR-PROJECT-ID | 8453 | ETH | https://basescan.org | +| Gnosis Mainnet | https://rpc.gnosischain.com | 100 | xDAI | https://gnosisscan.io | +| NeuroWeb Mainnet | https://astrosat-parachain-rpc.origin-trail.network/ | 2043 | MNEURO | https://neuroweb.subscan.io/ | + +### DKG Testnet + +| Network Name | RPC URL | Chain ID | Currency Symbol | Block Explorer URL | | ---------------- | ------------------------------------------ | -------- | --------------- | ------------------------------------ | +| Base Sepolia | https://sepolia.base.org | 84532 | ETH | https://sepolia.basescan.org | +| Gnosis Chiado | https://rpc.chiadochain.net | 10200 | xDAI | https://blockscout.chiadochain.net | +| NeuroWeb Testnet | https://lofar-testnet.origin-trail.network | 20430 | MNEURO | https://neuroweb-testnet.subscan.io/ | + +{% hint style="info" %} +If you want to play around and test out the DKG, we
recommend using the DKG Testnet. For that, you will need test TRAC and test NEURO tokens, which you can obtain by joining the OriginTrail [Discord](https://discord.com/invite/FCgYk2S) and [requesting them from our Faucet Bot](test-token-faucet.md). + +You can also run a local network on your machine. To do so, please refer to the [Local network setup page](../../build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/setting-up-your-development-environment.md). +{% endhint %} + diff --git a/docs/dkg-knowledge-hub/useful-resources/ot-node-engine-implementation-details/README.md b/docs/dkg-knowledge-hub/useful-resources/ot-node-engine-implementation-details/README.md new file mode 100644 index 0000000..26d0b3f --- /dev/null +++ b/docs/dkg-knowledge-hub/useful-resources/ot-node-engine-implementation-details/README.md @@ -0,0 +1,9 @@ +--- +hidden: true +--- + +# DKG Engine implementation details + +Multiple ot-node implementations in various programming languages are planned; the current JavaScript implementation is [available here](https://github.com/origintrail/ot-node). + +Learn more about [ot-node modules](modules.md) and the [command system](command-executor.md) implementation. diff --git a/docs/dkg-knowledge-hub/useful-resources/ot-node-engine-implementation-details/command-executor.md b/docs/dkg-knowledge-hub/useful-resources/ot-node-engine-implementation-details/command-executor.md new file mode 100644 index 0000000..39d4dd9 --- /dev/null +++ b/docs/dkg-knowledge-hub/useful-resources/ot-node-engine-implementation-details/command-executor.md @@ -0,0 +1,223 @@ +--- +description: The engine behind the OT-Node application. +--- + +# Command Executor + +## Introduction + +The Command Executor is a component of the ot-node implementation, which uses an approach similar to the event sourcing pattern.
Essentially, it allows developers to organize functionalities (code) in "commands" that can be executed in sequence to implement the protocol features. It also enables system recovery in case of the node stopping or restarting for some reason. + +## Commands + +The Command Executor splits business logic into **commands**. A command is a general abstraction with many features that can be enabled. The Command Interface is described in the table below. + +| **Method** | **Description** | +| ------------------------------------------------------------------- | ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | +| async execute() | Executes command and produces zero or more events | +| async recover(command, err) | Recover system from failure | +| async expired(command) | Execute strategy when the command is too late | +| async retryFinished(command) | This method is executed when retry command counter reaches 0 | +| pack(data) | Packs data for the database | +| unpack(data) | Unpacks data from the database | +| continueSequence(data, sequence, opts) | Makes command from the sequence and continues the execution | +| async handleError(operationId, errorMessage, errorName, markFailed) | Error handler for command. If an error pops up during the execution of a command, operation status is set to **FAILED** with the appropriate operation error message. List of operation errors can be found in the **\*/src/constants/constants.js** file. | + +_Table 1.1 Command interface_ + +Creating a command is done by extending an abstract class called simply **Command.** This means it inherits the default behavior of all the methods and can override them with specific behavior. + +The core command method is the **execute** method. 
This method executes the code of the command and returns one of the three results: + +* **this.continueSequence(data,sequence,opts)** — A list of commands taken from the execution context that will be executed after the current command is finished successfully. + * Returns **Command.empty()** if the current command is the last one in the sequence. +* **Command.repeat()** — Command object with the **repeat** flag set to **true.** That means that the command will be executed once again. +* **Command.retry()** — Command object with the **retry** flag set to **true**. That means that the command will be executed again, and the retried counter will be decreased. + +Command data describes everything that is related to the specific command. This is described in the table below. + +| **Parameter** | **Description** | +| ------------- | ---------------------------------------------------------------------------------------------------------------------------------- | +| id | Command id (uuid) | +| name | Command name (example: helloCommand) | +| data | Command data needed for execution | +| ready\_at | Time in milliseconds when the command is ready for execution | +| delay | Initial delay in milliseconds for command execution | +| started\_at | Time in milliseconds when the command has started | +| deadline\_at | Future time in milliseconds until the command needs to be executed | +| period | If the command is repeatable (_repeat=true_), this is the interval time in milliseconds | +| status |

Command status: Failed, Expired, Started, Pending, Completed, or Repeating
| +| message | Proprietary message for the command. This is useful if the command has failed | +| parent\_id | A command can have a parent. This is the parent command's ID | +| transactional | Whether the command is transactional or not | +| retries | If the command fails, this is the number of times the command can retry to execute again | +| sequence | A command can carry information about future commands to be executed after its successful completion (a chain of commands) | + +_Table 1.2 Command data parameters_ + +## Command Executor and dependency injection + +The Command Executor is initialized on ot-node start. Commands are stored in the **\*/src/commands** directory and are injected into [Awilix](https://www.npmjs.com/package/awilix) automatically. Command names use camel case, while the files they are defined in use kebab case. In the following section, we will create a simple command called `PublishStartedCommand`. + +### PublishStartedCommand + +Let’s create a simple `PublishStartedCommand` and call it in the `handleHttpApiPublishRequest` controller method that handles the asset publishing request. + +{% code title="publish-started-command.js" %} +```javascript +import Command from "./command.js"; + +class PublishStartedCommand extends Command { + constructor(ctx) { + super(ctx); + } + + async execute(command) { + const { name } = command.data; + + this.logger.info(`Hello from ${name}`); + return this.continueSequence( + { ...command.data, retry: undefined, period: undefined }, + command.sequence, + ); + } + + default(map) { + const command = { + name: 'publishStartedCommand', + transactional: false, + }; + Object.assign(command, map); + return command; + } +} +export default PublishStartedCommand; +``` +{% endcode %} + +`PublishStartedCommand` will be called before the passed assertion is validated and propagated to the network.
+ +```javascript +async handleHttpApiPublishRequest(req, res) { + try { + /** + Code responsible for creating the operation record and notifying the client + that the operation started is intentionally left out for simplicity reasons. + + You can see the whole implementation here: + https://github.com/OriginTrail/ot-node/blob/1181ec828bba51aa7f115e30c48458352199c67b/src/controller/v1/publish-controller.js#L18 + **/ + const commandData = { + ...req.body, + name: 'publishController', + operationId, + }; + + // adding the publishStartedCommand to the command sequence + const commandSequence = [ + 'publishStartedCommand', + 'validateAssertionCommand', + 'networkPublishCommand', + ]; + + await this.commandExecutor.add({ + name: commandSequence[0], + sequence: commandSequence.slice(1), + delay: 0, + period: 5000, + retries: 3, + data: commandData, + transactional: false, + }); + } catch (error) { + // ... error handling + } +} +``` + +This is the simplest command that logs the **name** given in the data parameter. After this command is finished, the Command Executor continues with the next command in the sequence. In this case, it continues with the execution of `validateAssertionCommand` and `networkPublishCommand`. + +If we want to return some new command or list of commands, the return statement will look like this: + +```javascript +return { + commands: [ + { + name: 'someCommand', + data: { + param: 'value' + }, + transactional: false + } + ] +} +``` + +To make this new command repetitive and add a delay, for example, we would add these parameters to the command.
The code snippet will look like this: + +```javascript +return { + commands: [ + { + name: 'someCommand', + data: { + param: 'value' + }, + delay: 10000, + period: 5000, + transactional: false + } + ] +} +``` + +## Command Executor API + +Command Executor API is simple, and it looks like this: + +```javascript +/** +* Initialize executor +* @returns {Promise} +*/ +async init() {...} + +/** +* Starts the command executor +* @return {Promise} +*/ +async start() {...} + +/** +* Adds single command to queue +* @param addCommand +* @param addDelay +* @param insert +*/ +async add(addCommand, addDelay = 0, insert = true) {...} + +/** +* Replays pending commands from the database +* @returns {Promise} +*/ +async replay() {...} +``` + +One of the key methods of the API is **add(),** which is responsible for adding new commands to the array of commands for command-executor to carry out. Such a call would look like this: + +```javascript +await this.commandExecutor.add({ + name: 'nameOfTheCommand', + delay:45000, + data, + transactional: false +}) +``` + +The **start()** command starts the executor and all the repetitive commands listed in the **constants.PERMANENT\_COMMANDS**. Those commands will be executed permanently throughout the node execution. + +### Need help with the Command Executor? + +Jump into our [Discord](https://discord.gg/xCaY7hvNwD), and someone from the OriginTrail community of developers will gladly help! diff --git a/docs/dkg-knowledge-hub/useful-resources/ot-node-engine-implementation-details/modules.md b/docs/dkg-knowledge-hub/useful-resources/ot-node-engine-implementation-details/modules.md new file mode 100644 index 0000000..70cae19 --- /dev/null +++ b/docs/dkg-knowledge-hub/useful-resources/ot-node-engine-implementation-details/modules.md @@ -0,0 +1,50 @@ +--- +description: OriginTrail node building blocks +--- + +# Modules + +OT-node is composed of various modules that can easily be added or removed. 
It's even possible to have different implementations of the same module, making the ot-node very flexible. + +As soon as you run the node, module configurations are picked up from the **config.json** file and used for module initialization. All modules must be initialized for a node to run properly. + +## Module types + +### The blockchain module + +The ot-node blockchain module enables interactions with multiple blockchains in order to perform operations of the OriginTrail protocol. + +### Triple store module + +The entire DKG state is replicated and sharded across the ODN network. Each node persists its designated replicas in its individual local triple stores (graph databases). The triple store module is responsible for connecting and communicating with a triple store available to the ot-node. There are several triple-store implementations supported directly at the moment: + +* [Ontotext GraphDB](https://www.ontotext.com/products/graphdb/) +* [Blazegraph](https://blazegraph.com/) +* [Apache Jena Fuseki](https://jena.apache.org/documentation/fuseki2/) + +However, due to the standards-based implementation (W3C RDF / SPARQL), it is easy to integrate with any standardized RDF triple store. + +The triple store module utilizes [the Comunica framework](https://comunica.dev/) under the hood. + +### Validation module + +The validation module is used to validate assertions seen on the network. More information on assertions can be found [here](https://docs.origintrail.io/key-concepts/dkg-key-concepts#knowledge-assets). + +### Auto-updater module + +If enabled, this module automatically updates the local node. Every 15 minutes, it checks if the remote (git) version of ot-node is different from the local version. If it is, the local node will be updated to the latest version of the code. After a successful update, the node will restart and start running on the latest version.
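As an illustration of the **config.json** module configuration mentioned above, an entry for a module such as the auto-updater might look roughly like this. The field names and paths below are indicative only; consult the default configuration in the ot-node repository for the authoritative schema:

```json
{
  "modules": {
    "autoUpdater": {
      "enabled": true,
      "implementation": {
        "ot-auto-updater": {
          "package": "./modules/auto-updater/implementation/ot-auto-updater.js",
          "config": {}
        }
      }
    }
  }
}
```

Swapping a module implementation is then a matter of pointing the `implementation` entry at a different package, without touching the rest of the node.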
+ +### Network module + +The network module takes care of all network RPC communication between nodes in the DKG. It is used to implement the necessary protocol choreographies, such as the publishing choreography. Under the hood, it uses an implementation of [_Kademlia_](https://en.wikipedia.org/wiki/Kademlia). + +#### HTTP client module + +Client applications use HTTP requests to communicate with the node. The HTTP client module implements the endpoints at which requests can be made and defines the ot-node's response to HTTP requests. + +### Repository module + +The repository module is responsible for establishing connections with the operational database and storing, updating, and deleting commands and operations data. Storing command and operation states makes the node fault tolerant: in the case of a node restart, for example, it can continue from the last preserved state of an operation and pick up where it left off. + +We expect additional modules to be added in the future based on the evolution of the DKG implementations. + diff --git a/docs/dkg-knowledge-hub/useful-resources/public-nodes.md b/docs/dkg-knowledge-hub/useful-resources/public-nodes.md new file mode 100644 index 0000000..3e95760 --- /dev/null +++ b/docs/dkg-knowledge-hub/useful-resources/public-nodes.md @@ -0,0 +1,18 @@ +--- +hidden: true +--- + +# Public nodes + +If you do not already have a DKG node set up, you can use a public node that the OriginTrail team set up so that everyone has an easy way to interact with the DKG.
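As a quick sketch of how a client might target one of the public nodes listed below, here is a small helper that builds a request URL. The `/info` route is an assumption for illustration; check your node version's HTTP API for the exact routes:

```javascript
// Public node base URLs (listed in the section below).
const PUBLIC_NODES = {
  mainnet: 'https://positron.origin-trail.network',
  testnet: 'https://v6-pegasus-node-02.origin-trail.network',
};

// Build a URL for the node's (assumed) /info endpoint.
function infoUrl(network) {
  const base = PUBLIC_NODES[network];
  if (!base) throw new Error(`Unknown network: ${network}`);
  return `${base}/info`;
}

console.log(infoUrl('testnet'));
// e.g. on Node.js 18+ you could then do: const res = await fetch(infoUrl('testnet'));
```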
+ +There is one public node available for the mainnet and two for the testnet: + +* **Mainnet public node:** [https://positron.origin-trail.network](https://positron.origin-trail.network/) +* **Testnet public nodes:** [https://v6-pegasus-node-02.origin-trail.network](https://v6-pegasus-node-02.origin-trail.network) and [https://v6-pegasus-node-03.origin-trail.network](https://v6-pegasus-node-03.origin-trail.network) + +All blockchains are supported on each of the nodes. + +{% hint style="info" %} +Mainnet is the live blockchain for real transactions, while testnet is a risk-free testing environment. +{% endhint %} diff --git a/docs/dkg-knowledge-hub/useful-resources/test-token-faucet.md b/docs/dkg-knowledge-hub/useful-resources/test-token-faucet.md new file mode 100644 index 0000000..8b59fe5 --- /dev/null +++ b/docs/dkg-knowledge-hub/useful-resources/test-token-faucet.md @@ -0,0 +1,34 @@ +--- +description: Learn how to get testnet tokens from the OriginTrail Discord faucet bot +--- + +# Test token faucet + +The OriginTrail Decentralized Knowledge Graph (DKG) provides a testing environment on the NeuroWeb testnet, Gnosis Chiado, and Base Sepolia blockchains. To perform various blockchain operations on these testnets, users need both **test TRAC on the chosen network** and the **test utility token** of their chosen blockchain for gas. + +The **OriginTrail faucet service**, which provides test tokens, is deployed on the [**OriginTrail Discord server**](https://discord.com/invite/WaeSb5Mxj6) and located in the [**#faucet-bot**](https://discord.com/invite/WaeSb5Mxj6) channel. + +To view the available faucet options, run the following command in the chat of the **#faucet-bot** channel: + +``` +!help +``` + +The output will look like the one shown below: + +
+_(Image: Available Faucet commands)_
+ +Currently, depending on your requirements, you can request tokens for the following blockchains: + +1. **NeuroWeb:** test TRAC and NEURO +2. **Gnosis Chiado:** test TRAC and xDAI +3. **Base Sepolia:** test TRAC + +{% hint style="success" %} +**NeuroWeb:** Make sure to run the NEURO request command (!fundme\_neuroweb) first to initialize your wallet on the NeuroWeb blockchain. Once you receive NEURO in your wallet, you can run the TRAC request command. If your wallet isn't initialized on NeuroWeb, you will not be able to receive TRAC.\ +\ +**Base Sepolia:** Please refer to the official Base [documentation](https://docs.base.org/docs/tools/network-faucets/) for instructions on acquiring test ETH as gas for the Base Sepolia blockchain. +{% endhint %} + +If you experience any issues with the Faucet Bot, please tag the core developers in one of the Discord channels. + diff --git a/docs/getting-started/basic-knowledge-asset-operations.md b/docs/getting-started/basic-knowledge-asset-operations.md new file mode 100644 index 0000000..4c7d827 --- /dev/null +++ b/docs/getting-started/basic-knowledge-asset-operations.md @@ -0,0 +1,50 @@ +--- +description: >- + A step-by-step guide to publishing new Knowledge Assets into the DKG and + retrieving existing ones. Understand how to structure, verify, and query + verifiable data for use in AI, apps, or research. +--- + +# Basic Knowledge Asset operations + +## **Creating and retrieving your first Knowledge Asset** + +This simple exercise demonstrates the basic end-to-end flow of the DKG - from AI-assisted publishing to knowledge retrieval (something like "Hello world"). 
+ +### Create your first Knowledge Asset + +In the **DKG Node UI**, send the agent this prompt with JSON-LD you want to publish: + +``` +Create this Knowledge Asset on the DKG for me: + +{ + "@context": "https://schema.org/", + "@type": "CreativeWork", + "@id": "urn:first-dkg-ka:info:hello-dkg", + "name": "Hello DKG", + "description": "My first Knowledge Asset on the Decentralized Knowledge Graph!" +} +``` + +
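The JSON-LD above can also be constructed and sanity-checked programmatically before handing it to the agent (or to the dkg.js SDK). This is a minimal sketch; the SDK call in the trailing comment is illustrative, not an exact signature:

```javascript
// Build the same "Hello DKG" JSON-LD content shown in the prompt above.
function buildHelloKa() {
  return {
    '@context': 'https://schema.org/',
    '@type': 'CreativeWork',
    '@id': 'urn:first-dkg-ka:info:hello-dkg',
    name: 'Hello DKG',
    description: 'My first Knowledge Asset on the Decentralized Knowledge Graph!',
  };
}

// Minimal sanity check: the JSON-LD context, type, and identifier must be present.
function validateKa(ka) {
  const required = ['@context', '@type', '@id'];
  return required.every((key) => typeof ka[key] === 'string' && ka[key].length > 0);
}

const ka = buildHelloKa();
console.log(validateKa(ka)); // → true

// With dkg.js (illustrative):
// const result = await dkg.asset.create({ public: ka }, { epochsNum: 2 });
// console.log(result.UAL);
```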
+ +When asked, **allow** the agent to use the **“DKG Knowledge Asset create”** tool. + +* The agent will publish your KA and return its **UAL**. +* Depending on the blockchain used and network load, publishing may take **\~10-30s**. + +_(Insert screenshot of successful KA publish here)_ + +### Retrieve your Knowledge Asset data + +Use the basic retrieval tool by asking the agent to retrieve the Knowledge Asset you just published by its UAL: + +``` +Get this Knowledge Asset from the DKG and summarize it for me: + +``` + +You should see the retrieved JSON-LD and a friendly summary. + +_(Insert screenshot of successful retrieval/summary here)_ diff --git a/docs/getting-started/decentralized-knowle-dge-graph-dkg.md b/docs/getting-started/decentralized-knowle-dge-graph-dkg.md new file mode 100644 index 0000000..4c0581c --- /dev/null +++ b/docs/getting-started/decentralized-knowle-dge-graph-dkg.md @@ -0,0 +1,218 @@ +# Installation + +{% hint style="info" %} +If you are new to OriginTrail, DKG, knowledge graphs, or blockchains, we highly recommend you get familiar with the [Key concepts](../dkg-key-concepts.md) before proceeding. +{% endhint %} + +### What are we installing today? + +You will be installing the **DKG Edge Node on your machine.** You can think of the DKG Edge Node as a framework for building neuro-symbolic AI agents based on the OriginTrail Decentralized Knowledge Graph. When you finish, a basic DKG-based neuro-symbolic agent will be accessible through a user interface (chatbot), but also via MCP and a classic API. You will be able to customize it as you wish by building plugins and extensions. + +To install the **DKG Edge Node**, we will be using the DKG CLI (`dkg-cli`) - a streamlined command-line interface that automates much of the process and project maintenance. The CLI will automatically install all **DKG Edge Node** components (DKG Agent, DKG Engine), including databases - by default MySQL, Redis and Blazegraph.
+ +### The DKG utilizes blockchain + +The DKG Network utilizes blockchains as a trusted environment for incentivization and securing data exchanges. It's a multichain network: DKG Nodes support three blockchains but can currently be deployed on a **single blockchain at a time** (multichain deployment support is on the way). + +If you're not too familiar with blockchain technology and not sure which blockchain to pick to get started with the DKG Node, don't worry - a default blockchain will be chosen for you, and you will be able to learn as you go (the DKG Node abstracts a lot of the complexities of blockchain for you). You shouldn't notice a big difference between blockchains while you are in development - this choice matters most when you are ready for your DKG Node deployment to mainnet. + +For now you need to know the following: + +* a DKG node requires a blockchain to run +* because of that, it will require a set of crypto tokens - **TRAC** for DKG features, and the native token of the chosen blockchain (if you choose the default, it will be NEURO as the native token of Neuroweb) +* this will not cost you anything - the development setup leverages the **DKG testnet,** which uses _test tokens_ that have no economic value and can be obtained for free (more info below). To run an agent on the **DKG Mainnet**, however, you will need _"real"_ tokens + +### What do you need for the installation? + +* A **macOS** or **Linux** machine with at least 8GB RAM and 20GB storage space (Windows version is on the way) +* Node.js **v22.20.0** or higher installed +* About 15-30 minutes of your time to complete all the steps + +### OK, let's go! + +#### 1. First install the DKG-CLI + +```sh +npm install -g dkg-cli +``` + +#### 2. Generate the DKG Node Configuration + +Your DKG Node allows for rich configuration (more on that in the **Configuration** section later); however, this setup focuses on a minimal default configuration.
+ +We recommend setting up your project folder and starting with the default development setup on DKG testnet. + +```bash +# Make a new folder for your DKG Node project in your working directory +mkdir my_dkg_node +cd my_dkg_node + +# If you're just getting started with the default blockchain - Neuroweb Testnet +dkg-cli setup-config --testnet + +# Advanced: Pick a blockchain and environment. These are all the possible options for each blockchain +# dkg-cli setup-config --testnet --neuroweb # NeuroWeb testnet +# dkg-cli setup-config --testnet --base # Base testnet +# dkg-cli setup-config --testnet --gnosis # Gnosis testnet +# dkg-cli setup-config --mainnet --neuroweb # NeuroWeb mainnet +# dkg-cli setup-config --mainnet --base # Base mainnet +# dkg-cli setup-config --mainnet --gnosis # Gnosis mainnet +``` + +This will create an `.env` file with pre-generated DKG Node blockchain keys (learn more about DKG Node keys [here](../dkg-knowledge-hub/learn-more/node-keys-wallets.md)). This is done for convenience; you can change those keys if you'd like. + +All DKG node wallets require native blockchain tokens, while the publishing wallet also requires TRAC tokens. Wallets generated by the setup command will be presented to you (as shown in the image below). + +
+ +#### 3. Funding wallets + +As mentioned previously, your DKG Node requires tokens to be able to create Knowledge Assets. + +**To get tokens for DKG testnet, use the** [**testnet token faucet**](../dkg-knowledge-hub/useful-resources/test-token-faucet.md)**.** For DKG Mainnet deployments we suggest visiting the [TRAC token](https://origintrail.io/technology/trac-token) page to check for its availability. + +{% hint style="warning" %} +Make sure to fund your node keys with tokens before running the `dkg-cli install` command, otherwise your DKG node might not function correctly. +{% endhint %} + +Here's an overview of supported blockchains and the needed tokens per key type. + +| Blockchain | Operational keys | Publish keys | +| ---------- | ---------------- | ------------ | +| NeuroWeb | NEURO | NEURO + TRAC | +| Base | ETH | ETH + TRAC | +| Gnosis | xDAI | xDAI + TRAC | + +#### 4. Installing your DKG Node + +```sh +dkg-cli install +``` + +{% hint style="success" %} +The installation can take a few minutes. It installs the DKG Node in the same directory from which you ran both `setup-config` and `dkg-cli install` commands. +{% endhint %} + +#### 5. Configure your DKG Agent + +Run the agent setup script to enable LLM features. You'll be prompted for your LLM provider, API key, model name, and DKG environment (must match your setup-config choice: testnet or mainnet). The agent supports multiple providers; examples listed below. + +```sh +cd /dkg-node/apps/agent +npm run build:scripts +npm run script:setup +``` + +DKG Node supports various LLM providers. 
Some examples include: + +| Provider | API Key Link | +| -------------------------- | ------------------------------------------------------------------------------------ | +| **OpenAI** | [https://platform.openai.com/api-keys](https://platform.openai.com/api-keys) | +| **Anthropic** (Claude) | [https://console.anthropic.com/](https://console.anthropic.com/) | +| **Groq** (Fast, free tier) | [https://console.groq.com/keys](https://console.groq.com/keys) | +| **Google GenAI** | [https://makersuite.google.com/app/apikey](https://makersuite.google.com/app/apikey) | +| **Mistral AI** | [https://console.mistral.ai/](https://console.mistral.ai/) | +| **xAI** | [https://x.ai/](https://x.ai/) | + +#### **6. Start DKG Agent (MCP Server)** + +Once all services are up and running, you can start the **DKG Agent (MCP Server)**.\ +To do so, navigate to the `dkg-node` directory and execute the following command: + +```sh +npm run dev +``` + +This starts the DKG Agent (MCP server) in **development** mode and exposes the Web UI. + +{% hint style="success" %} +Complete production deployment instructions are to be provided soon, including Linux server deployments, SSL/HTTPS configuration, systemd units, and other production setup requirements. +{% endhint %} + +#### 7. UI & API access + +Once the `npm run dev` command has been executed, the **DKG Agent** (MCP Server) will start, and the **Web UI** will become available for access. You can then interact with the system through both the web interface and the API as described below. + +* Local macOS or Linux: [http://localhost:8081](http://localhost:8081/) +* Default login: `admin@example.com` / `admin123` + +#### 8. DKG Node management + +This section covers the key aspects of managing your DKG Node, including **DKG CLI commands**, **security and networking guidelines**, and **user management**.
It provides an overview of the core services, their default ports, and tools for controlling, monitoring, and maintaining your DKG Node environment. + +#### 8.1 DKG Node controls (using dkg-cli) + +The DKG CLI provides essential commands for managing your DKG Node services, including **dkg-engine**, **Blazegraph**, and **MySQL**. These tools allow you to control, monitor, and inspect service statuses and logs efficiently. + +```sh +# Checks the status of all DKG Node services (dkg-engine, Blazegraph, or MySQL) +dkg-cli status + +# Starts the DKG Node (and all its services - dkg-engine, Blazegraph, MySQL etc) +# Automatically detects your OS and uses the appropriate service manager (systemd on Linux, pm2/brew on macOS) +dkg-cli start + +# Start or stop the specified service +dkg-cli start dkg-engine +dkg-cli start blazegraph +dkg-cli start mysql +dkg-cli stop dkg-engine +dkg-cli stop blazegraph +dkg-cli stop mysql + +# Restarts all DKG Node services +dkg-cli restart + +# Restart a specific service +dkg-cli restart dkg-engine +dkg-cli restart blazegraph +dkg-cli restart mysql + +# View logs (live) +dkg-cli logs dkg-engine --follow +dkg-cli logs blazegraph --follow +dkg-cli logs mysql --follow + +# View last N lines (e.g., 50 lines) +dkg-cli logs dkg-engine -n 50 +dkg-cli logs blazegraph -n 50 +dkg-cli logs mysql -n 50 + +# Show help +dkg-cli --help +``` + +{% hint style="success" %} +All commands work from any directory and automatically detect your operating system to use the appropriate service manager. +{% endhint %} + +#### 8.2 Creating new users + +A `createUser` script is also included to simplify the creation of additional user accounts. 
+ +```sh +cd /dkg-node/apps/agent +npm run script:createUser +# Enter: email, password, permissions (e.g., `mcp llm blob scope123`) +``` + +#### 8.3 Security & networking + +**`.env` file:** + +* Contains sensitive data (wallet keys, passwords, API keys) +* Never commit to version control + +**Services and Ports**\ +The following list provides an overview of which services are running locally and the ports they listen on: + +* **8081** — Web UI & API +* **8900** — DKG Engine API +* **3306** — MySQL +* **6379** — Redis + +### Support + +* 📖 [Documentation](https://docs.origintrail.io/) +* 🐛 [Report Issues](https://github.com/OriginTrail/dkg-node-installer/issues) +* 💬 [Discord Community](https://discord.gg/origintrail) diff --git a/docs/getting-started/dkg-node-services.md b/docs/getting-started/dkg-node-services.md new file mode 100644 index 0000000..f1f4009 --- /dev/null +++ b/docs/getting-started/dkg-node-services.md @@ -0,0 +1,102 @@ +--- +description: >- + Learn how to navigate and use the DKG Node’s built-in web interfaces and APIs. + This section helps you manage your node, interact with agents, and explore + services without writing complex code. +--- + +# DKG Node Services + +## Running your DKG Node in development mode + +You will be running your DKG Node in **development mode** while building, experimenting, and customizing your node, before deploying it in production. In this mode, the system automatically reloads on code changes, streams real-time logs, and gives you immediate feedback as you work. + +From the project root: + +```bash +cd ~/dkg-node && npm run dev +``` + +This will: + +* Automatically reload your node whenever you change the code. +* Stream live logs across all running services. +* Help you debug and iterate quickly in a local environment.
+ +{% hint style="info" %} +## Troubleshooting + +If `npm install` fails, try: + +```bash +rm -rf node_modules package-lock.json +npm install +``` + +Also confirm your Node.js version is **v22+**. +{% endhint %} + +## Local services & dashboards + +Once your dev server is up (`npm run dev`), several powerful tools become available through your browser. These interfaces let you **manage, inspect, and debug** every part of your DKG Node. + +### **DKG Node & Agent UI** + +[**http://localhost:8081/**](http://localhost:8081/) + +This is the prebuilt **template UI** for your DKG Node. From here, you can: + +* Monitor the overall health and status of your node. +* View and manage installed plugins. +* Interact directly with your **DKG Agent** (the built-in AI agent). +* Access settings, credentials, and configuration options. +* Publish, query, and verify Knowledge Assets through a graphical interface - no code required. + +Think of this as your **command center** for operating and experimenting with your node. + +### **DKG MCP Server** + +[**http://localhost:9200/mcp**](http://localhost:9200/mcp) + +This is the **Model Context Protocol (MCP) server endpoint** your node exposes.\ +It allows: + +* **AI agents and external applications** to connect to your node. +* Execution of tools and APIs provided by installed plugins. +* Structured communication between your node and LLMs or external services. + +If your DKG Node is the “brain,” the MCP server is the **communication layer** - it’s what lets AI systems talk to your node programmatically. + +### **Swagger UI (API Explorer)** + +[**http://localhost:9200/swagger**](http://localhost:9200/swagger) + +The Swagger dashboard provides **interactive documentation** for all the REST APIs your node exposes.\ +Here, you can: + +* Explore every available endpoint. +* Understand request/response formats. +* Test API calls directly from the browser.
+ +This is especially helpful if you’re integrating the DKG Node into a larger application or developing custom tools. + +### **Operational database viewer (Drizzle Studio)** + +[**https://local.drizzle.studio**](https://local.drizzle.studio) + +Drizzle Studio is a visual interface for inspecting the **internal databases** your DKG Node uses.\ +It allows you to: + +* Browse tables and view stored data. +* Inspect Knowledge Assets before and after they’re published. +* Debug database writes and schema changes. + +{% hint style="danger" %} +If you’re using the Brave browser, please disable Shields when accessing Drizzle Studio - otherwise you may not be able to view the database records. +{% endhint %} + diff --git a/docs/getting-started/interacting-with-your-dkg-agent.md b/docs/getting-started/interacting-with-your-dkg-agent.md new file mode 100644 index 0000000..8f11d03 --- /dev/null +++ b/docs/getting-started/interacting-with-your-dkg-agent.md @@ -0,0 +1,94 @@ +# Interacting with your DKG Agent + +{% hint style="info" %} +This section assumes you have finished [Installation](decentralized-knowle-dge-graph-dkg.md) and will guide you through trying out the basic DKG Agent that comes bundled with the DKG Node. +{% endhint %} + +Each DKG node includes a **collocated neuro-symbolic AI agent** that combines neural model capabilities (e.g., LLMs) with symbolic reasoning over RDF-based graph data. This enables DKG nodes to not only publish and query semantic knowledge but also perform knowledge graph reasoning, summarization, and data transformation tasks directly on locally or remotely stored knowledge. + +The **DKG Agent** is built around a modular **plugin system** centered on the **Model Context Protocol (MCP)**. Plugins define how the agent interacts with external tools, APIs, and reasoning systems.
A generic DKG Node ships with a base set of plugins for common operations - such as knowledge publishing, retrieval, and validation - **while developers can extend functionality by creating custom plugins**. + +Each plugin may expose both **MCP endpoints** (for agentic interoperability) and **classic REST/gRPC APIs** (for programmatic access). Example plugin types include ontology-specific retrieval tools (e.g., “social media query” modules), **knowledge-mining pipelines** for crafting Knowledge Assets aligned with domain ontologies, and **reasoning plugins** that apply declarative rule sets to infer new knowledge. + +If you want to jump right into building your custom plugins, head over to the ["Build a DKG Node AI Agent"](../build-a-dkg-node-ai-agent/customizing-your-dkg-agent.md) section. The remainder of this section will familiarize you with the "boilerplate" DKG Node. + +## **Accessing and Using the DKG Agent Interface** + +Your DKG Node comes with a built-in agent interface serving two core purposes: + +* **Secure authentication portal** → OAuth 2.1 login system for accessing your DKG Node +* **AI agent interface** → Direct chat with your DKG-Node-powered agent + +The interface is built with **React Native (Expo)** for cross-platform compatibility, enabling seamless interaction with your agent and the Decentralized Knowledge Graph (DKG). + +{% hint style="info" %} +If you are following this guide, make sure your [**DKG Node is running**](decentralized-knowle-dge-graph-dkg.md#id-7.-start-the-node) if it’s not already active. +{% endhint %} + +
+ +### What’s included + +Your DKG Node interface provides two initial pages (routes): + +
| Route | What is it | Link (UI) | Backend route |
| -------- | ----------------------- | --------------------------- | --------------------------- |
| `/login` | Authentication | http://localhost:9200/login | http://localhost:8081/login |
| `/chat` | Agent Chatbot interface | http://localhost:9200/chat | http://localhost:8081/chat |
+ +{% hint style="info" %} +If you try to access `/chat` while logged out, you’ll be redirected to `/login`. Once signed in, you’re automatically redirected back. +{% endhint %} + +### Using the built-in agent interface + +**Authentication** + +Once you set up the node, you can use the default credentials to sign in. + +``` +Email: admin@example.com +Password: admin123 +``` + +To create additional custom users with the required scopes (at least `mcp` and `llm`), see [Configure access & security](security.md). + +Authentication is based on OAuth 2.1. + +**Agent capabilities** + +* **Natural language conversation** with your DKG Node Agent +* **File uploads** via the **Attach file(s)** button for use with tools +* **Automatic tool usage** → The agent detects and invokes tools from all registered plugins (including DKG Essentials + your custom plugins) based on your queries — e.g., publishing or extracting knowledge from the DKG +* **Source transparency** → When tools return **Knowledge Asset Unique Asset Locators (UALs)** from the DKG, results will display the **source Knowledge Assets** the agent used to generate its answer (see [DKG Essentials plugin for details](../build-a-dkg-node-ai-agent/essentials-plugin.md)). + +### External MCP client integration + +Your DKG Node uses a **standard MCP server** (with OAuth 2.1 over HTTPS), so you can connect it to any compatible MCP client, equipping it with the power of the DKG and your DKG Node. For example, you can connect your dev IDE, like Cursor, directly to the DKG Node MCP server.
+ +#### Supported clients (not exhaustive) + +**Cursor** + +* Go to **Settings → Tools & Integrations → New MCP Server** +* Example config: + + ```json + { + "mcpServers": { + "dkg-mcp": { + "url": "http://localhost:9200/mcp" + } + } + } + ``` +* Click **Needs login** under your server name to authenticate + +**VS Code** + +* Open **Command Palette (Ctrl/Cmd + Shift + P)** → “MCP: Add Server…” +* Select transport: **HTTP** +* Enter server URL: `http://localhost:9200/mcp` +* When prompted, click **Allow** + +**Microsoft Copilot Studio** + +* Follow [Microsoft’s MCP integration docs](https://learn.microsoft.com/en-us/microsoft-copilot-studio/mcp-add-existing-server-to-agent). + diff --git a/docs/getting-started/security.md b/docs/getting-started/security.md new file mode 100644 index 0000000..8a1ba92 --- /dev/null +++ b/docs/getting-started/security.md @@ -0,0 +1,147 @@ +--- +description: >- + Understand authentication, permissioning, and access controls to keep your DKG + Node secure while serving agents and users. +--- + +# Security + +## **Access & authentication overview** + +Your DKG Node includes a secure, built-in authentication system powered by **OAuth 2.1**, ensuring that both human users and AI agents can safely interact with your node and its APIs. + +This section will guide you through: + +* **Understanding OAuth 2.1** - why it’s used, and how it enables secure integrations with tools like Cursor, VS Code, and Copilot. +* **Managing users and tokens** - how to create, edit, and assign access scopes through the CLI or **Drizzle Studio**. +* **Securing custom plugins** - applying scoped authorization so only approved users or agents can access sensitive endpoints. + +### OAuth + +By default, the DKG Node uses **OAuth 2.1** for authentication, powered by: + +* `@dkg/plugin-oauth` +* `@modelcontextprotocol/sdk` (TypeScript framework) + +**Why OAuth 2.1?** + +* Recommended standard for AI agent integrations. 
+* Works seamlessly with agents like **VS Code/GitHub Copilot**, **Cursor AI Agent mode**, and other OAuth-compatible clients. +* Supports **Dynamic Client Registration** → AI agents can automatically discover and connect to your DKG Node. + +**User data is managed in a built-in SQLite operational database**, which stores: + +* User account information (username & password) +* Permissions and access scopes +* OAuth tokens issued by the server +* Manually created authentication tokens + +### Creating users + +DKG Node includes a script for adding new user accounts with specific permissions. + +Run from `apps/agent/`: + +```bash +npm run script:createUser +``` + +Follow the prompts to enter: + +* **Username** → unique identifier for the user +* **Password** → a secure password +* **Scope(s)** → permissions (e.g., `"mcp llm"` for full access) + +🔍 **Managing users with Drizzle Studio** + +* Starts automatically with `npm run dev` +* Or run manually: + + ```bash + npm run drizzle:studio + ``` +* Accessible at: [http://local.drizzle.studio](http://local.drizzle.studio/?utm_source=chatgpt.com) + +With Drizzle Studio, you can: + +* View all users +* Edit user information +* Manage permissions/scopes +* Monitor issued tokens + +### Creating tokens + +OAuth works with **access tokens**. Tokens allow secure, programmatic access to your DKG Node without user interaction. 
+ +To create a token, run the following command from `apps/agent/` folder: + +```bash +npm run script:createToken +``` + +Follow the prompts: + +* **Scope(s)** → define permissions (e.g., `"mcp llm"`) — more on managing permission scopes in the section [#managing-permission-scopes](security.md#managing-permission-scopes "mention") below +* **Expiration** → choose how long the token should remain valid + +**When to use tokens** + +* Giving **agents access** to tools and resources on your DKG Node +* Automated scripts and integrations +* Service-to-service communication +* Testing and development +* Apps without user interaction + +### Using a token + +DKG Node OAuth Tokens are standard **Bearer tokens**. Include them in the `Authorization` header of your API requests, for example: + +```http +"Authorization": "Bearer 0198a297-f390-76ad-9208-ffae7e248b17" +``` + +### Managing permission scopes + +Access in the DKG Node is **scope-based**: + +* By default: + * `/mcp` → requires `mcp` scope + * `/llm` → requires `llm` scope +* Only users or tokens with those scopes can access the corresponding routes. + +**IMPORTANT: Custom plugins are not protected automatically**\ +When you create custom plugins, you must **assign scopes,** or they’ll be exposed without protection. + +To secure them, register plugins in `apps/agent/src/server/index.ts` using `.withNamespace()`: + +```ts +const app = createPluginServer({ + // ... other config + plugins: [ + defaultPlugin, + oauthPlugin, + dkgEssentialsPlugin, + + // Protect routes with middleware + examplePlugin.withNamespace("protected", { + middlewares: [authorized(["scope123"])], + }), + + // Custom plugin with its own scope + myCustomPlugin.withNamespace("protected", { + middlewares: [authorized(["customscope"])], + }), + ], +}); +``` + +In this example, only users or tokens with the `customscope` scope can access your custom plugin. 
+ +Scopes are assigned during: + +* **User creation** (via `npm run script:createUser`) +* **Token creation** (via `npm run script:createToken`) +* Or later, through **Drizzle Studio**. + + + diff --git a/docs/getting-started/troubleshoot.md b/docs/getting-started/troubleshoot.md new file mode 100644 index 0000000..6f0a73a --- /dev/null +++ b/docs/getting-started/troubleshoot.md @@ -0,0 +1,81 @@ +--- +description: >- + Having issues? This page provides solutions to common errors, setup problems, + and runtime issues, helping you quickly get your DKG Node back online and + fully functional. +--- + +# Troubleshoot + +#### Authentication issues + +**If you run into browser login problems while working on your project, try the following:** + +* Navigate to login: + * `http://localhost:9200/login` + * or `http://localhost:8081/login` (dev mode) +* Open browser dev tools (right-click → Inspect) → **Console** tab +* Run: + + ```js + localStorage.clear() + ``` +* Refresh page + +Root cause: switching DB setups can leave stale `clientId` values in cache.
+ +**If you have trouble with Cursor accessing your DKG Node MCP server, perform a Cursor authentication reset:** + +* Go to **Settings → Tools & Integrations** +* Find your MCP server → click **Disabled** +* Select **Logout** in the pop-up + +**Similarly, for VS Code, perform an authentication reset:** + +* Open **Command Palette (Ctrl/Cmd + Shift + P)** +* Select **Authentication: Remove Dynamic Authentication Providers** +* Choose your MCP server → OK → Remove + +*** + +### Best practices + +#### Effective agent interaction + +* Be specific in your queries +* Upload relevant documents before asking questions +* Use follow-up questions to refine responses + +#### Security considerations + +* Regularly review **user permissions & scopes** +* Monitor **authentication logs** for anomalies +* Keep your **OAuth 2.1 credentials** secure +* Use **dev mode only** in safe environments + +*** + +With the DKG Node interfaces, you can: + +* **Log in securely** to your DKG Node via OAuth 2.1 +* **Chat naturally** with your DKG Node Agent +* **Invoke tools** from DKG Essentials and your custom plugins to interact with the DKG +* **Integrate external agents** (MCP clients like Cursor or VS Code) to be powered by your DKG Node +* **Explore your APIs** and test exposed tools through Swagger + +✨ Your DKG Node and agent are the gateway to **creating and consuming verifiable knowledge** on the DKG. 🚀 + +> **No results yet?**\ +> Publishing can take up to a minute, so a short wait is normal. + +> **Wrong blockchain or DKG Node engine?**\ +> Ensure **OT-Node URL** and **chain name** in your `.env` are exactly: +> +> * `https://v6-pegasus-node-02.origin-trail.network/` +> * `otp:20430` + +> **Wallet issues?**\ +> Verify test **TRAC** and **NEURO** balances on **NeuroWeb testnet Subscan**. + +> **Env vars not picked up?**\ +> Confirm values in `apps/agent/.env`. Restart `npm run dev` after changes. 
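A quick way to catch the `.env` problems above early is a startup check that reports missing or empty values. This is a generic sketch; the function and variable names below are placeholders for illustration, not the agent's actual configuration keys:

```typescript
// Generic startup check for required configuration values.
// The variable names below are placeholders, NOT the agent's real .env keys.
function missingVars(
  env: Record<string, string | undefined>,
  required: string[],
): string[] {
  // A variable counts as missing if it is absent, empty, or whitespace-only.
  return required.filter((name) => !env[name]?.trim());
}

// Example: one value set, one empty, one absent.
const example = { OTNODE_URL: "https://example.invalid/", CHAIN_NAME: "" };
console.log(missingVars(example, ["OTNODE_URL", "CHAIN_NAME", "WALLET_KEY"]));
// -> [ "CHAIN_NAME", "WALLET_KEY" ]
```

Running such a check against `process.env` before starting the agent surfaces a misconfigured `.env` immediately instead of as a confusing runtime failure.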
+ diff --git a/docs/graveyard/everything/README.md b/docs/graveyard/everything/README.md new file mode 100644 index 0000000..eb7e24d --- /dev/null +++ b/docs/graveyard/everything/README.md @@ -0,0 +1,2 @@ +# Everything + diff --git a/docs/graveyard/everything/delegated-staking/README.md b/docs/graveyard/everything/delegated-staking/README.md new file mode 100644 index 0000000..d8c2b05 --- /dev/null +++ b/docs/graveyard/everything/delegated-staking/README.md @@ -0,0 +1,88 @@ +--- +description: >- + Delegated staking involves locking up $TRAC for contributing to the DKG + security on selected DKG nodes. DKG node rewards are shared between the $TRAC + stake delegators. +--- + +# Delegated Staking + +Staking is currently enabled on the Neuroweb, Base, and Gnosis blockchains. + +1. [Staking on Neuroweb](https://docs.origintrail.io/decentralized-knowledge-graph/delegated-staking/staking-trac-on-neuroweb) +2. [Staking on Base](https://docs.origintrail.io/decentralized-knowledge-graph/delegated-staking/staking-trac-on-base) +3. [Staking on Gnosis](https://docs.origintrail.io/decentralized-knowledge-graph/delegated-staking/staking-trac-on-gnosis) + +For a DKG node to be eligible to host a portion of the [DKG](broken-reference) and receive TRAC network rewards, its TRAC stake plays a crucial role. Set at a minimum of 50,000 TRAC on a particular blockchain, the stake helps ensure the security of the DKG. DKG node operators can contribute to the node stake on their own or by attracting more TRAC to their stake through delegated staking. + +There are two roles involved in delegated staking - **node operators** and **TRAC delegators**. + +**Node operators** are network participants who choose to host and maintain network nodes (servers running specialized DKG software). Nodes store, validate, and make knowledge available to AI systems. They receive $TRAC rewards for this service. 
All nodes together form a permissionless market of DKG services, competing for their share of network TRAC rewards. + +**Delegators** lock up their $TRAC to contribute to the security of selected DKG nodes, increasing those nodes' chances of capturing TRAC network rewards. The rewards that end up being captured by the DKG node are then shared between the $TRAC stake delegators. The delegated tokens are locked in a smart contract and are never accessible to the node operators. + +Note that node operator and delegator are not mutually exclusive roles - you can be both at the same time. + +{% hint style="info" %} +Contrary to inflationary systems, TRAC staking is strictly utility-based and rewards are generated through DKG usage via knowledge publishing fees. +{% endhint %} + +### How do delegators earn TRAC fees? + +As knowledge publishers create [Knowledge Assets](https://origintrail.io/products/knowledge-assets) on the DKG, they lock an appropriate amount of TRAC tokens in the DKG smart contracts. The TRAC amount offered has to be high enough to ensure that enough DKG nodes will store it for a specific amount of time (more details on the DKG market mechanics [here](https://github.com/OriginTrail/OT-RFC-repository/blob/main/RFCs/OT-RFC-14%20DKG%20v6%20TRAC%20Tokenomics.pdf)). The nodes then commit to storing the knowledge assets for a specific amount of time, measured in **epochs**, which last 3 months. + +At the end of each epoch, DKG nodes prove to the smart contract that they are still storing a Knowledge Asset and unlock the TRAC reward locked initially by the knowledge publisher. + +Many nodes can compete for the same TRAC reward, but only 3 nodes can receive the rewards for each Knowledge Asset. 
The ranking list of nodes is created on the basis of: + +* storing price (ask) - it must be below what the publisher offered to pay; +* stake - the higher the DKG node stake, the greater the chance of receiving rewards; +* neighborhood distance - besides its use for network addressing purposes, the neighborhood distance is used as a randomisation factor, ensuring that the network doesn’t get dominated by a single node. + +You can see more details about the reward mechanism in the [Tokenomics RFC](https://github.com/OriginTrail/OT-RFC-repository/blob/main/RFCs/OT-RFC-14%20DKG%20v6%20TRAC%20Tokenomics.pdf), but the key element of the system is that DKG nodes with higher stake will be more successful, so **by providing TRAC stake to a node, you increase its chances of collecting rewards**. + +Once claimed, rewards are **automatically restaked, increasing the node's overall stake by the amount of collected rewards.** + +{% hint style="info" %} +Note: The epoch length on DKG mainnet is 3 months. +{% endhint %} + +### How does DKG delegated staking work? + +Once you delegate TRAC tokens to a node, in return you receive node “share tokens” (similar to Uniswap LP tokens). Each node deployed on OriginTrail has a node-specific, mintable and burnable ERC20 token created during node deployment, with the token symbol and name set by the node operator (an example of such a node share token on NeuroWeb blockchain can be found [here](https://neuroweb.subscan.io/erc20_token/0x98136e72d70b0c52bb253b9bb6902956d213f117?tab=transfers)). + +When you delegate TRAC tokens to a node, they are locked inside the DKG smart contracts, and a proportional amount of node share tokens is minted for you and sent to your delegating wallet address. To withdraw your TRAC, you simply burn the share tokens via the DKG smart contracts, unlocking your TRAC tokens. As your TRAC is locked in the DKG smart contracts, the node operator has no access to your locked TRAC tokens at any time. 
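As a rough mental model of the LP-token analogy, share tokens can be thought of as minted in proportion to the TRAC you add relative to the node's existing pool. This is purely illustrative; the authoritative minting logic lives in the DKG smart contracts:

```typescript
// Purely illustrative model of proportional share-token minting;
// the real logic is defined by the DKG smart contracts.
function sharesToMint(
  delegatedTrac: number,
  totalStakedTrac: number,
  totalShares: number,
): number {
  // First delegation bootstraps the pool with shares 1:1 to TRAC.
  if (totalStakedTrac === 0) return delegatedTrac;
  // Later delegations mint shares at the pool's current share/TRAC ratio.
  return (delegatedTrac * totalShares) / totalStakedTrac;
}

console.log(sharesToMint(10_000, 0, 0)); // 10000 (bootstrap)
console.log(sharesToMint(10_000, 100_000, 100_000)); // 10000 (1:1 pool)
```

The key property this models is that burning your shares always redeems the same fraction of the pool you put in, which is why the node operator never gains access to your TRAC.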
+ +In order to introduce a level of predictability of network operations, withdrawing tokens is subject to an unbonding period of 28 days. + +{% hint style="warning" %} +If you want to withdraw tokens in order to delegate to another node on the same network (blockchain) - you **do not** have to wait 28 days! See [new-redelegating-trac.md](new-redelegating-trac.md "mention") +{% endhint %} + +{% hint style="success" %} +Delegated staking is a non-custodial system, so the node operator has no access to the locked TRAC tokens at any time. +{% endhint %} + +Each node operator can also set an “**operator fee**”, a percentage of the TRAC rewards deducted each time a node claims rewards from a knowledge asset. The remaining TRAC is then split among all delegators in proportion to their share of staked tokens. + +{% hint style="warning" %} +**Example**: if a node accumulated **1000 TRAC** tokens in the previous period, the node has two delegators, each with a 50% share, and the operator\_fee is 10%: + +* the node operator will receive 100 TRAC (10%) +* each delegator receives 450 TRAC (50% of the remaining 900 TRAC) +{% endhint %} + +### If you are running a node + +If you are running a DKG node, you can delegate TRAC tokens to your node in the same way as others. It is recommended you delegate TRAC tokens as well, signalling your commitment to the network via economic stake - this provides a trust signal to other delegators. + +To understand how to set up your operator fee, follow the [node-setup-instructions](../node-setup-instructions/ "mention") instructions for node setup. Note that changing your operator fee incurs a 28-day delay, balancing the 28-day delay delegators experience when withdrawing stake from your node. + 
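The operator fee example above can be double-checked with a few lines. This is only a sketch of the arithmetic; on-chain accounting and rounding are handled by the DKG smart contracts:

```typescript
// Sketch of the reward split described above: the operator takes its fee,
// and the remainder is split by each delegator's share of the stake.
function splitRewards(
  rewardTrac: number,
  operatorFeePct: number,
  delegatorShares: number[], // fractions summing to 1
): { operator: number; delegators: number[] } {
  const operator = (rewardTrac * operatorFeePct) / 100;
  const remaining = rewardTrac - operator;
  return { operator, delegators: delegatorShares.map((s) => remaining * s) };
}

console.log(splitRewards(1000, 10, [0.5, 0.5]));
// -> { operator: 100, delegators: [ 450, 450 ] }
```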

Depiction of delegating and withdrawing of TRAC from DKG smart contracts

+ +### Have questions? + +Drop by our [Discord](https://discord.com/invite/xCaY7hvNwD) or [Telegram group](https://t.me/origintrail) and feel free to post them there. Make sure to follow our official announcements and stay safe! + +Happy staking! :rocket: + diff --git a/docs/graveyard/everything/delegated-staking/new-redelegating-trac.md b/docs/graveyard/everything/delegated-staking/new-redelegating-trac.md new file mode 100644 index 0000000..7fef2e4 --- /dev/null +++ b/docs/graveyard/everything/delegated-staking/new-redelegating-trac.md @@ -0,0 +1,52 @@ +--- +description: Moving your TRAC stake from one node to another +--- + +# (NEW) Redelegating TRAC + +If you want to **move your delegated TRAC stake from one DKG node to another**, you can use the **Redelegate** feature instead of withdrawing and then delegating again. With redelegation, the amount of TRAC stake you are "redelegating" will be transferred from the original DKG node to the new DKG node of your choice, avoiding the 28-day delay which would otherwise apply if you were to withdraw tokens first. + +## Keep in mind + +* The DKG is multichain; however, TRAC tokens can only be redelegated between nodes on the same blockchain +* The amount of stake (TRAC) that you want to redelegate must not exceed the second node's remaining capacity (a node can have a maximum of 2,000,000 TRAC stake delegated to it). + +## How can you redelegate TRAC? + +1. Click on the '**Connect Wallet**' button in the top right corner of the navigation bar and follow the prompts to connect your wallet to the interface. +2. Go to the '**My Delegation**' tab to see available nodes that you can redelegate from. +3. Optionally, use the '**Filter by Blockchain**' dropdown to select the desired blockchain, which will filter and display nodes on this network along with their staking information. +4.
Once you've decided which node you want to redelegate your TRAC from, click on the 'Manage Stake' button next to the desired node on the right side of the table. Make sure you read the disclaimer. +5. When the staking pop-up opens, you'll have the option to **Delegate**, **Redelegate** or **Withdraw** TRAC tokens from the node. Proceed by selecting '**Redelegate**'. + +
+ +6. After clicking on 'Redelegate' a field to enter the amount of TRAC you wish to redelegate to another node will appear on the right side of the pop-up as well as the select-box for selecting the other node - the one that will receive the TRAC. \ + Enter the amount of TRAC you want redelegated and select the node you want to redelegate to: + +

Kickstarting Redelegation process

+ +{% hint style="warning" %} +You can stake your TRAC only to nodes which have less than 2,000,000 TRAC stake delegated to them. +{% endhint %} + +{% hint style="info" %} +NOTE: Only the nodes from the same network with remaining capacity greater than zero will be shown in the 'Choose available node' select-box. +{% endhint %} + +7. The redelegation process will require two transactions: one to increase the allowance and another to confirm the redelegation contract interaction. Please be patient as this can take some time. + +
+ +8. Once both transactions are signed and confirmed, you should see a 'Stake redelegated successfully' message appear: + +

Example of a successful redelegation

+ +9. To confirm that the process was successful, check your TRAC delegation by going to the 'My Delegations' tab above the table with the nodes and verifying that your delegations are listed there. Additionally, ensure that the stake amount on the original node has decreased and the amount on the other node has increased following the successful redelegation. + 

View of your delegations

+ +{% hint style="info" %} +If you encounter any issues during the staking process or require assistance, please get in touch with the OriginTrail community in [Discord](https://discord.com/invite/QctFuPCMew). +{% endhint %} + diff --git a/docs/graveyard/everything/delegated-staking/staking-trac-on-base.md b/docs/graveyard/everything/delegated-staking/staking-trac-on-base.md new file mode 100644 index 0000000..13c87a5 --- /dev/null +++ b/docs/graveyard/everything/delegated-staking/staking-trac-on-base.md @@ -0,0 +1,69 @@ +--- +description: Instructions for Staking Your TRAC on the Base blockchain +--- + +# Staking TRAC on Base + +These instructions will guide you through the process of staking your TRAC tokens on the Base network via the staking interface. + +## Requirements: + +1. TRAC on the Base network +2. Base ETH to pay for fees + +{% hint style="success" %} +The process of bridging TRAC tokens to Base is described [here](https://docs.origintrail.io/integrated-blockchains/base-blockchain). +{% endhint %} + +{% hint style="warning" %} +For the purpose of creating these instructions, we used Base Sepolia (testnet). The process is exactly the same for the Base mainnet. +{% endhint %} + + + +**The following smart wallets are supported by the interface for staking on Base:** + +1. MetaMask +2. Coinbase +3. OKX Wallet + +### Process steps: + +{% hint style="info" %} +For the purpose of this tutorial, we used the MetaMask wallet extension. +{% endhint %} + +Once you have confirmed that you have both Base ETH and TRAC tokens available on the Base blockchain and that your MetaMask is connected to the Base blockchain, you can proceed to the staking interface at [https://dkg.origintrail.io/staking](https://dkg.origintrail.io/staking) and follow the steps below: + +1. Click on the '**Connect Wallet**' button in the top right corner of the navigation bar and follow the prompts to connect your wallet to the interface. +2.
Use the '**Filter by Blockchain**' dropdown to select the Base blockchain, which will filter and display all available nodes on this network along with their staking information. + +

Filter out Base nodes

+ +3. Once you've decided which node you want to stake your TRAC to, click on the 'Manage Stake' button next to the desired node on the right side of the table. Read the disclaimer and press 'I understand'. + +{% hint style="warning" %} +You can stake your TRAC only to nodes which have less than 2,000,000 TRAC stake delegated to them. +{% endhint %} + +3. When the staking pop-up opens, you'll have the option to delegate or withdraw TRAC tokens from the node. Proceed by selecting 'Delegate TRAC'. +4. After clicking on 'Delegate TRAC,' a field to enter the amount of TRAC you wish to stake will appear on the right side of the pop-up. Enter the amount of TRAC you want to delegate as your stake and click on 'Delegate': + +

Delegation process

+ +5. The delegation process will require two transactions: one to increase the allowance and another to confirm the contract interaction. + +

Sign transactions

+ +6. Once both transactions are signed and confirmed, you should see a 'Stake delegated successfully' message appear: + +

Stake delegated successfully

+ +7. To confirm that the process was successful, check your TRAC delegation by going to the 'My Delegation' tab above the table with the nodes and verifying that your delegation is listed there. Additionally, ensure that the stake amount on the node has increased following the successful delegation. + +

My delegations

+ +{% hint style="info" %} +If you encounter any issues during the staking process or require assistance, please contact our technical support team by sending an email to **tech@origin-trail.com**. +{% endhint %} + diff --git a/docs/graveyard/everything/delegated-staking/staking-trac-on-gnosis.md b/docs/graveyard/everything/delegated-staking/staking-trac-on-gnosis.md new file mode 100644 index 0000000..b7e8c04 --- /dev/null +++ b/docs/graveyard/everything/delegated-staking/staking-trac-on-gnosis.md @@ -0,0 +1,62 @@ +--- +description: Instructions for Staking Your TRAC on the Gnosis blockchain +--- + +# Staking TRAC on Gnosis + +These instructions will guide you through the process of staking your TRAC tokens on the Gnosis network via the staking interface. + +## Requirements: + +1. TRAC on the Gnosis network +2. xDai to pay for fees + +{% hint style="warning" %} +For the purpose of creating these instructions, we used Gnosis Chiado (testnet). The process is exactly the same for the Gnosis mainnet. +{% endhint %} + + + +**The following smart wallets are supported by the interface for staking on Gnosis:** + +1. MetaMask +2. Coinbase +3. OKX Wallet + +### Process steps: + +{% hint style="info" %} +For the purpose of this tutorial, we used the MetaMask wallet extension. +{% endhint %} + +Once you have confirmed that you have both xDai and TRAC tokens available on the Gnosis blockchain and that your MetaMask is connected to the Gnosis blockchain, you can proceed to the staking interface at [https://dkg.origintrail.io/staking](https://dkg.origintrail.io/staking) and follow the steps below: + +1. Click on the '**Connect Wallet**' button in the top right corner of the navigation bar and follow the prompts to connect your wallet to the interface. +2. Use the '**Filter by Blockchain**' dropdown to select the Gnosis blockchain, which will filter and display all available nodes on this network along with their staking information. + 
+ +3. Once you've decided which node you want to stake your TRAC to, click on the 'Manage Stake' button next to the desired node on the right side of the table. Read the disclaimer and press 'I understand'. + +{% hint style="warning" %} +You can stake your TRAC only to nodes which have less than 2,000,000 TRAC stake delegated to them. +{% endhint %} + +3. When the staking pop-up opens, you'll have the option to delegate or withdraw TRAC tokens from the node. Proceed by selecting 'Delegate TRAC'. +4. After clicking on 'Delegate TRAC,' a field to enter the amount of TRAC you wish to stake will appear on the right side of the pop-up. Enter the amount of TRAC you want to delegate as your stake and click on 'Delegate': + +
+ +5. The delegation process will require two transactions: one to increase the allowance and another to confirm the contract interaction. + +

Sign transactions

+ +6. Once both transactions are signed and confirmed, you should see a 'Stake delegated successfully' message appear: + +

Stake delegated successfully

+ +7. To confirm that the process was successful, check your TRAC delegation by going to the 'My Delegation' tab above the table with the nodes and verifying that your delegation is listed there. Additionally, ensure that the stake amount on the node has increased following the successful delegation. + +{% hint style="info" %} +If you encounter any issues during the staking process or require assistance, please contact our technical support team by sending an email to **tech@origin-trail.com**. +{% endhint %} diff --git a/docs/graveyard/everything/delegated-staking/staking-trac-on-neuroweb.md b/docs/graveyard/everything/delegated-staking/staking-trac-on-neuroweb.md new file mode 100644 index 0000000..667850b --- /dev/null +++ b/docs/graveyard/everything/delegated-staking/staking-trac-on-neuroweb.md @@ -0,0 +1,64 @@ +--- +description: Instructions for Staking Your TRAC on the Neuroweb blockchain +--- + +# Staking TRAC on Neuroweb + +These instructions will guide you through the process of staking your TRAC tokens on the Neuroweb network via the staking interface. + +## Requirements: + +1. TRAC on Neuroweb network +2. NEURO to pay for fees + + + +**The following smart wallets are supported by the interface for staking in Neuroweb:** + +1. MetaMask +2. OKX Wallet + +_**Note:** If you are using OKX Wallet for staking on Neuroweb, please make sure that you update both "**Max base fee**" and "**Priority fee**" to **0.00000001** before signing transactions (see image below)._ + + + +

OKX Wallet fee setting

+ +### Process steps: + +{% hint style="info" %} +For the purpose of this tutorial, we used the MetaMask wallet extension. +{% endhint %} + +Once you have confirmed that you have both NEURO and TRAC tokens available on the Neuroweb blockchain, you can proceed to the staking interface at [https://dkg.origintrail.io/staking](https://dkg.origintrail.io/staking) and follow the steps below: + +1. Click on the '**Connect Wallet**' button in the top right corner of the navigation bar and follow the prompts to connect your wallet to the interface. +2. Use the '**Filter by Blockchain**' dropdown to select the Neuroweb blockchain, which will filter and display all available nodes on this network along with their staking information. + 

Filter out Neuroweb nodes

+ +3. Once you've decided which node you want to stake your TRAC to, click on the 'Manage Stake' button next to the desired node on the right side of the table. Read the disclaimer and press 'I understand'. + +{% hint style="warning" %} +You can stake your TRAC only to nodes which have less than 2,000,000 TRAC stake delegated to them. +{% endhint %} + +4. When the staking pop-up opens, you'll have the option to delegate or withdraw TRAC tokens from the node. Proceed by selecting 'Delegate TRAC' +5. After clicking on 'Delegate TRAC,' a field to enter the amount of TRAC you wish to stake will appear on the right side of the pop-up. Enter the amount of TRAC you want to delegate as your stake and click on 'Delegate': + +

Delegation process

+ +6. The delegation process will require two transactions: one to increase the allowance and another to confirm the contract interaction. + +

Sign transactions

+ +Once both transactions are signed and confirmed, you should see a 'Stake delegated successfully' message appear: + +

Stake delegated successfully

+ +7. To confirm that the process was successful, check your TRAC delegation by going to the 'My Delegation' tab above the table with the nodes and verifying that your delegation is listed there. Additionally, ensure that the stake amount on the node has increased following the successful delegation. + +{% hint style="info" %} +If you encounter any issues during the staking process or require assistance, please contact our technical support team by sending an email to **tech@origin-trail.com**. +{% endhint %} + diff --git a/docs/graveyard/everything/dkg-core-node/README.md b/docs/graveyard/everything/dkg-core-node/README.md new file mode 100644 index 0000000..f81f1d0 --- /dev/null +++ b/docs/graveyard/everything/dkg-core-node/README.md @@ -0,0 +1,63 @@ +--- +description: The engine of the DKG Knowledge Layer +hidden: true +icon: circle-nodes +--- + +# DKG Core Node + +**DKG Core Nodes** are the operational foundation of the OriginTrail Decentralized Knowledge Graph (DKG). They form a permissionless peer-to-peer network that stores, verifies, and makes published knowledge assets available to AI systems and other users. By participating in Random Sampling and responding to publishing activity, they ensure that knowledge remains available and the network remains decentralized. + +### What Are DKG Core Nodes? + +Core Nodes are designed for reliability, availability, and fair participation. They are incentivized to maintain uptime, publish new knowledge, and offer services to the network in exchange for TRAC rewards. + +#### Staking Requirements + +To operate a DKG Core Node, a minimum stake of **50,000 TRAC** is required. Nodes with higher stakes gain stronger eligibility in the reward distribution process, as stake directly influences their **Node Power** and visibility to publishers and delegators. + +#### Publishing & Gateway Function + +DKG Core Nodes can also serve as **Gateway Nodes**, meaning they support the publishing of new Knowledge Assets into the network. 
Nodes that publish more knowledge gain a higher **publishing factor**, which increases their rewards through the Random Sampling proof system. + +To become Gateway Nodes, node operators can open their Core Node **publishing API endpoint** to external publishers. This allows developers, AI agents, or platforms to use this Core Node as a gateway service for publishing their knowledge assets - similar to how blockchain RPC providers offer access to blockchain infrastructure. In OriginTrail, however, this access is **incentivized**: if others publish through your Core node, your node's publishing factor grows proportionally, improving your Node Power, which in turn increases both your rewards and overall network value. + +### Network Services and Fair Pricing + +Each DKG Core Node sets a **service ask** - a configurable percentage fee charged when serving publishing transactions. This ask value plays a direct role in network competitiveness: + +* Nodes with **lower ask fees** are prioritized by the network +* Ask pricing influences **Node Power**, impacting the node's score in the reward calculation + +Over time, tokenomics naturally favor nodes converging on a **fair price point**, where efficient, well-operated nodes benefit from lower ask values that increase their usage and overall rewards. All of this contributes to a fair and open market of DKG services. + +### Quality Metrics: Node Power and Node Health + +Delegators and publishers alike benefit from understanding two key metrics that determine node performance: + +#### Node Power + +Node Power is a relative metric that reflects a node’s influence in the network’s reward system. It is calculated from: + +* **Staked TRAC** — More stake signals trust and increases reward eligibility +* **Publishing activity** — More published knowledge assets increase score +* **Service ask** — Lower fees improve network competitiveness + +A higher Node Power means the node is more likely to earn rewards. 
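The exact Node Power formula is defined by the DKG protocol; the sketch below is purely illustrative and only encodes the direction of each factor listed above (more stake and more publishing raise the score, a higher ask lowers it):

```typescript
// Purely illustrative: the real Node Power formula is defined by the DKG
// protocol, NOT by this sketch. It only encodes the direction of each
// factor described above.
function illustrativeNodePower(
  stakedTrac: number,
  publishingFactor: number, // grows with published knowledge assets
  askPct: number, // the node's service ask fee, in percent
): number {
  // More stake and more publishing raise the score; a higher ask lowers it.
  return (stakedTrac * (1 + publishingFactor)) / (1 + askPct);
}

const lowAsk = illustrativeNodePower(100_000, 0.5, 1);
const highAsk = illustrativeNodePower(100_000, 0.5, 5);
console.log(lowAsk > highAsk); // true: a lower ask improves the score
```

Whatever the actual weighting, the directions are what matter for operators: attracting delegated stake, publishing actively, and keeping the ask competitive all pull Node Power in the same direction.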
+ +#### Node Health + +Node Health tracks how successfully a node is responding to random sampling challenges. It is a relative metric, calculated as: + +* The ratio of **successful proofs submitted** +* Compared to the **expected number of proofs** during an epoch + +High Node Health indicates reliability and strong uptime - crucial for reward consistency and delegator trust. + +### What You Need to Operate a Node + +Operating a DKG Core Node doesn’t require blockchain or knowledge graph expertise; however, **it does require diligent monitoring and maintenance for your Core node to attract a sensible amount of TRAC rewards**, including regular updates, maintaining uptime, etc. It is recommended to have at least some general knowledge about the DKG, operating Linux servers, long-running services, and firewalls. + +We recommend trying it out on the **DKG testnet** to get familiar with the environment. When you're ready, proceed to the Core Node setup guide to learn how to deploy a node and begin participating. + +> As a Core Node operator, you can also delegate TRAC to your own node. This shows economic commitment to the network and can improve your attractiveness to other delegators. diff --git a/docs/graveyard/everything/dkg-core-node/auto-updater.md b/docs/graveyard/everything/dkg-core-node/auto-updater.md new file mode 100644 index 0000000..9ec7cee --- /dev/null +++ b/docs/graveyard/everything/dkg-core-node/auto-updater.md @@ -0,0 +1,103 @@ +--- +icon: screwdriver-wrench +--- + +# Auto Updater + +### What does the Auto Updater do? + +The Auto Updater module ensures your DKG Core Node is always running the **latest stable version** of the DKG, without you having to manually check, pull, or restart anything. + +Every **15 minutes**, the updater: + +* Checks if a newer version of the code is available. +* If a new version is found, it updates the node and restarts it automatically. + +*** + +### 🔁 How it works + +1.
**Check for updates** Every 15 minutes, your node contacts the remote GitHub repository to check if there is a newer version than the one it's currently running. +2. **Version comparison** + * If the **remote version is newer**, it pulls the update and restarts the node. + * If the **remote version is older or the same**, no changes are made. +3. **Update and restart** After a successful update, a file named `UPDATED` is written to disk, and the node is automatically restarted to load the new version. + +*** + +### 🧩 Mainnet vs Testnet Configuration + +Depending on whether you are running a **mainnet** or **testnet** node, the updater checks different branches for updates. + +To enable the auto updater, you will need to have this section in the `.origintrail_noderc` configuration file (found in the `ot-node` folder) + +#### ✅ Mainnet Node Config: + +```json +"modules": { + "autoUpdater": { + "enabled": true, + "implementation": { + "ot-auto-updater": { + "package": "./auto-updater/implementation/ot-auto-updater.js", + "config": { + "branch": "v6/release/mainnet" + } + } + } + } +} +``` + +* The updater will check the `v6/release/mainnet` branch of the GitHub repository for updates. + +*** + +#### 🧪 Testnet Node Config: + +```json +"modules": { + "autoUpdater": { + "enabled": true, + "implementation": { + "ot-auto-updater": { + "package": "./auto-updater/implementation/ot-auto-updater.js", + "config": { + "branch": "v6/release/testnet" + } + } + } + } +} +``` + +* The updater will track updates from the `v6/release/testnet` branch. + +This allows core developers to push test features to testnet users while keeping mainnet nodes stable and production-ready. + +*** + +### 🛡️ Safety Features + +* **No downgrades**: If the remote version is **lower than or equal to your current version**, the update is skipped. +* **Error recovery**: If the updater encounters an error (e.g., network issues), it logs the error and retries again after 15 minutes. 
+* **Versioning system**: Uses strict semantic versioning to prevent invalid comparisons. +* **Rollback:** If you encounter an error, the auto updater **keeps the previous version available** in the `ot-node` folder, under the version name. In case you need to roll back, change the `current` symlink to point to the location of the previous working version. + +*** + +### ✅ Requirements to Use + +* You must set `"enabled": true` in the config. +* Restart permissions must be allowed (handled by `process.exit(1)` in code). + +*** + +### 🧭 What to Expect + +* Smooth, automatic upgrades. +* Minimal manual intervention. +* Your node always on the latest version. + +*** + diff --git a/docs/graveyard/everything/dkg-core-node/deploy-core-node-via-google-cloud-marketplace.md b/docs/graveyard/everything/dkg-core-node/deploy-core-node-via-google-cloud-marketplace.md new file mode 100644 index 0000000..4e28547 --- /dev/null +++ b/docs/graveyard/everything/dkg-core-node/deploy-core-node-via-google-cloud-marketplace.md @@ -0,0 +1,187 @@ +--- +hidden: true +--- + +# Deploy Core node via Google Cloud marketplace + +### Core Node deployment process overview + +This deployment method allows you to fully configure your Core Node by filling out a solution form provided through the Google Cloud Marketplace. During the process, you’ll be guided through each required field and configure the following: + +1. Core node names for each supported blockchain +2. Core Node wallet keys + +{% hint style="success" %} +Ensure you have a Google Cloud account with billing enabled and permissions to deploy Marketplace solutions. +{% endhint %} + +## **Core node** deployment preparation + +{% hint style="success" %} +With these requirements prepared, you'll be ready to quickly populate the required form inputs and proceed with the deployment of your Core node. +{% endhint %} + +**1.
Core Node keys (wallets):** + +The Core Node requires two types of keys: + +* Management keys +* Operational keys + +It is assumed that you are familiar with the wallet preparation process for each supported blockchain. If not, please check the key funding section in [Preparation of DKG Core node deployment](run-a-v8-core-node-on-testnet/preparation-for-v8-dkg-core-node-deployment.md#id-3.-funding-your-keys) on testnet. + +{% hint style="success" %} +If you're deploying on **mainnet**, ensure that: + +* You have access to TRAC tokens on the blockchains where you intend to deploy your node. +* You understand how to bridge TRAC tokens to **Gnosis** or **Base**, or use **Teleport** to transfer them to **NeuroWeb**. +* Your operational and management keys are properly funded with both TRAC and the native token, depending on the blockchain to which you are deploying your node. +{% endhint %} + +**Useful documentation:** + +* A list of exchanges where TRAC tokens can be acquired is available on our [official website](https://origintrail.io/get-started/trac-token). +* TRAC Teleport to Neuroweb is explained [here](../teleport-instructions-neuroweb.md). +* Bridging TRAC to Base is explained [here](../../../dkg-knowledge-hub/learn-more/connected-blockchains/base-blockchain/#bridging-trac-to-base). +* Bridging TRAC to Gnosis is explained [here](../../../dkg-knowledge-hub/learn-more/connected-blockchains/gnosis-chain/#bridging-trac-to-gnosis). +* Neuroweb mainnet explorer is available [here](https://neuroweb.subscan.io/). +* Neuroweb testnet explorer is available [here](https://neuroweb-testnet.subscan.io/). +* Base Sepolia explorer is available [here](https://sepolia.basescan.org/). +* Base mainnet explorer is available [here](https://basescan.org/). +* Gnosis Chiado explorer is available [here](https://gnosis-chiado.blockscout.com/).
+* Gnosis mainnet explorer is available [here](https://gnosisscan.io/).\ + + +## Deployment guide (step by step) + +With all prerequisites in place, you can proceed to populate the required form inputs and deploy your Core Node by following the steps below. + +### **1. Log in to Google Cloud Console** + +Access your [Google Cloud Console](https://console.cloud.google.com/) using your credentials. + +### **2. Navigate to the Marketplace** + +In the main menu, go to **Marketplace**, then search for **"OriginTrail node"** in the search bar. + +### **3. Configure Virtual Machine Parameters:** + +* Your desired deployment region +* Machine type +* Disk size and other instance-related options + +{% hint style="info" %} +The deployment page comes with recommended machine type and storage options preselected for optimal Edge node performance. For Core node deployments, you can start with reduced hardware of **4GB RAM** and **2 CPU cores**, the bare minimum suitable for handling regular or light workloads. \ +You can scale up the resources later if needed. +{% endhint %} + +### **4. Configure your Core node** + +You can now populate the form and prepare for the deployment of your Core node. + +**4.1 - Deployment mode** + +{% hint style="success" %} +This field determines how the Edge Node services are managed after installation. Since we are focusing on deploying a Core node, leave it set to **"production"** mode, which is the default. +{% endhint %} + +**4.2 - Blockchain environment** + +Choose between **`mainnet`** or **`testnet`**. Ensure that your Core Node wallets are properly funded according to the blockchain environment where you intend to deploy your node.\ + + +**4.3 - Core node configuration (ot-node service)** + +This section allows you to configure your node’s name for each supported blockchain, as well as define the management and operational keys (wallets).
+ +* **Node Name**: You can configure the **same node name** for all supported blockchains. This name will help identify your node across all chains. +* **Management wallet (Public EVM key)**: This key grants administrative control, enabling you to configure parameters such as ASK, operator fees, and other settings.\ + You can use the **same management key** across all supported blockchains, and it should be funded with the **native token** of each respective blockchain so that it can perform updates. +* **Operational wallet (Public EVM key)**: This key is used by the node to perform various blockchain operations. +* **Operational wallet (Private EVM key):** Your operational wallet's private key.\ + **Note**: The operational keys **must be different** for each blockchain. + +{% hint style="warning" %} +- Inputs for at least one blockchain configuration must be populated (NeuroWeb, Base, or Gnosis) for the successful deployment of the Core Node. +- If the operational keys are not funded with the native token on the blockchain of choice, the node will fail to create its blockchain profile. +- The management and operational keys require a small amount of the native token (NEURO for NeuroWeb, ETH for Base, xDAI for Gnosis), depending on the blockchain you prepare them for. +{% endhint %} + +**4.4 - MySQL password** + +During the deployment process, the password you provide in this field will be set as the **root password** for the MySQL database installed on your Core Node server. + +{% hint style="info" %} +🔐 This root password is critical for accessing and managing your Core node’s database. Do not share it or lose it. +{% endhint %} + +### 5. Deployment + +Once all parameters have been filled out correctly in the form, you can proceed with deploying your Core Node. + +{% hint style="success" %} +A static IP address will be assigned to your server automatically, and the firewall will be configured as required by the setup.
+{% endhint %} + +Your Core Node installation will be located at:\ +`~/edge_node/ot-node/` + +### 6. Monitor the deployment process + +The Google Cloud Marketplace deployment is generally oriented toward **DKG Edge Node** setups, with the **Core Node** being one of its components. As a result, the deployment process will install **all Edge Node components by default**, which may take approximately **20–30 minutes** to complete. However, the server will become available for **SSH access within a few minutes** after the deployment. + +To monitor the installation progress, SSH into your server, `cd` into the **edge-node-installer** directory and run the following command: + +```shell +bash service-status.sh +``` + +While services are still being installed, a loading spinner will be shown for each one (as shown on the image below). + +

Installation in progress

+ +When all services are marked as **"Active"**, this confirms that the **installation process has been finalized**. + +

Edge node deployment finalized

+ +Since you are planning to run only the **Core Node (`otnode`)**, the remaining **Edge Node services are not required**. Once the installation is completed, make sure to **stop and disable them** using the commands provided below: + +{% hint style="danger" %} +Do not stop or disable any services during the installation process; do so only after the setup has successfully finished. +{% endhint %} + +```sh +systemctl stop edge-node-api && systemctl disable edge-node-api +systemctl stop auth-service.service && systemctl disable auth-service.service +systemctl stop ka-mining-api.service && systemctl disable ka-mining-api.service +systemctl stop drag-api.service && systemctl disable drag-api.service +systemctl stop airflow-webserver && systemctl disable airflow-webserver +systemctl stop airflow-scheduler && systemctl disable airflow-scheduler +``` + +{% hint style="success" %} +Installed Edge Node components will not interfere with or affect your Core Node's performance in any way once they are stopped and disabled. +{% endhint %} + +{% hint style="warning" %} +⚠️ To prevent potential conflicts or interruptions, it is recommended to avoid performing other actions such as installing additional packages or modifying configurations on the server during the Edge Node deployment process. +{% endhint %} + +### 7. Control and manage the Core Node service + +The Core Node is deployed and runs as a `systemd` unit on your instance, making it easy to manage. + +#### Managing Core node service: + +You can perform the following actions using the respective systemd commands: + +* **Check service status:** `systemctl status otnode.service` +* **Start the service (if it is not already running):** `systemctl start otnode.service` +* **Stop the service:** `systemctl stop otnode.service` +* **Restart the service:** `systemctl restart otnode.service` +* **View service logs:** `journalctl -f -u otnode.service` + +## Need assistance?
In case any of the services show an 'Inactive' status, you can contact our developers for assistance. Please reach out via [Discord](https://discord.com/invite/xCaY7hvNwD) or email us at **tech@origin-trail.com**. \ +Be sure to include the installation log file (`installation_process.log`), which can be found in the `/root/edge-node-installer/log/` directory. diff --git a/docs/graveyard/everything/dkg-core-node/how-to-open-up-your-node-for-publishing.md b/docs/graveyard/everything/dkg-core-node/how-to-open-up-your-node-for-publishing.md new file mode 100644 index 0000000..58dd7ae --- /dev/null +++ b/docs/graveyard/everything/dkg-core-node/how-to-open-up-your-node-for-publishing.md @@ -0,0 +1,55 @@ +--- +description: >- + Help publishers by providing API access to the DKG and increase your node + reward potential +--- + +# How to open up your node for publishing + +One way to increase your Node Power is to make your DKG Core node endpoint publicly accessible to enable publisher access to the DKG. This serves the network by opening up access to the DKG, enabling easier usage and shared resources, for which Core nodes are incentivised through increased Node Power. + +To make your DKG Core node endpoint publicly accessible, all you need to do is disable both IP-based and token-based authentication. This is done by setting `ipBasedAuthEnabled` and `tokenBasedAuthEnabled` to `false` in the `.origintrail_noderc` file located inside your `ot-node` directory. + +{% hint style="info" %} +`ipBasedAuthEnabled` and `tokenBasedAuthEnabled` fields are not present by default, so you must add them manually. +{% endhint %} + +Below is an example of how your `"auth"` section should be configured in order to disable IP-based and token-based authentication. + +```json +... + }, + "auth": { + "ipBasedAuthEnabled": false, + "tokenBasedAuthEnabled": false, + "ipWhitelist": [ + "::1", + "127.0.0.1", + "" + ] + } +...
+``` + +{% hint style="success" %} +Opening your node endpoint by disabling authentication is considered a safe operation. This configuration exposes the node’s API to the public but does not expose any sensitive data such as your node wallets, private keys, or tokens. +{% endhint %} + +{% hint style="warning" %} +Managing rate limits and protecting the endpoint from excessive requests is the responsibility of the node operator. +{% endhint %} + +Changes to `.origintrail_noderc` only take effect after a restart. Once the node is back online, its endpoint will be accessible without authentication restrictions. + +Anyone can now use one of the [DKG clients](../../../build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/) to connect to your Core node and publish through it. For example, if your node URL is _https://super-dkg-node.com_, this is how dkg.js can connect to it: + +```javascript +const dkg = new DKG({ + endpoint: 'https://super-dkg-node.com', // your node URL + port: 8900 // your node DKG Port +}); + +const nodeInfo = await dkg.node.info(); +// if successfully connected, this will return an object indicating the node version +// { 'version': '8.X.X' } +``` diff --git a/docs/graveyard/everything/dkg-core-node/run-a-v8-core-node-on-mainnet/README.md b/docs/graveyard/everything/dkg-core-node/run-a-v8-core-node-on-mainnet/README.md new file mode 100644 index 0000000..4eb2faf --- /dev/null +++ b/docs/graveyard/everything/dkg-core-node/run-a-v8-core-node-on-mainnet/README.md @@ -0,0 +1,11 @@ +# Run a V8 Core Node on mainnet + +{% hint style="info" %} +We encourage everyone to first deploy the V8 DKG Core Node on the testnet before proceeding with the mainnet setup to become familiar with the process of installing and maintaining the node. +{% endhint %} + +The following pages will guide you through the process of setting up a V8 Core Node on the V8 mainnet.
For DKG Core Nodes, you will require a Linux server with high uptime, as they are intended to run constantly to support the network. Before installing the V8 DKG Core Node, it's essential to complete a few key steps, such as acquiring tokens, preparing node keys, and obtaining blockchain RPC endpoints. + +The setup process should usually last between 30 minutes and an hour, depending on your proficiency level. In case you have trouble or questions at any point, join the dedicated [OriginTrail #v8-discussion Discord channel](https://discord.com/invite/WCnDQArdzQ) and the OriginTrail community will gladly assist. + +Happy DKG node running! diff --git a/docs/graveyard/everything/dkg-core-node/run-a-v8-core-node-on-mainnet/preparation-for-v8-dkg-core-node-deployment.md b/docs/graveyard/everything/dkg-core-node/run-a-v8-core-node-on-mainnet/preparation-for-v8-dkg-core-node-deployment.md new file mode 100644 index 0000000..2da069e --- /dev/null +++ b/docs/graveyard/everything/dkg-core-node/run-a-v8-core-node-on-mainnet/preparation-for-v8-dkg-core-node-deployment.md @@ -0,0 +1,77 @@ +--- +description: >- + This page will help you prepare all the requirements for the V8 DKG Core Node + installation process. +--- + +# Preparation for V8 DKG Core Node deployment + +Before initiating the installer, each node runner should fulfill a few important requirements to successfully initialize the V8 DKG node. + +## 1. Hardware requirements: + +In order to deploy your V8 DKG node, you will need a Linux server with the minimum recommended hardware as presented below: + +* **4GB RAM** +* **2 CPUs** +* **50GB HDD space** + +{% hint style="success" %} +The initial node setup requires a minimum of 4GB RAM and 2 CPU cores. However, as the node operates for a longer period and accumulates more data, a hardware upgrade may be necessary to maintain optimal performance. +{% endhint %} + +Make sure that you have root access to your server.
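Before running the installer, you can check a server against the minimums above from the shell. The following is a sketch (Linux-only; the thresholds simply mirror the recommended hardware figures):

```sh
# Sketch: compare this server against the minimum recommended hardware.
ram_gb=$(awk '/MemTotal/ {printf "%.0f", $2/1024/1024}' /proc/meminfo)
cpus=$(nproc)
disk_gb=$(df -BG --output=avail / | tail -n 1 | tr -dc '0-9')

echo "RAM: ${ram_gb}GB, CPUs: ${cpus}, free disk: ${disk_gb}GB"

[ "$ram_gb" -ge 4 ]   || echo "WARNING: less than the recommended 4GB RAM"
[ "$cpus" -ge 2 ]     || echo "WARNING: fewer than the recommended 2 CPU cores"
[ "$disk_gb" -ge 50 ] || echo "WARNING: less than the recommended 50GB disk space"
```

Note that the disk check looks at free space on `/`; if you mount the node's data directory elsewhere, point `df` at that path instead.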
+ +{% hint style="info" %} +The installer script provided in these instructions is designed to install the OriginTrail node on **Ubuntu 20.04 LTS, 22.04 LTS, and 24.04 LTS** distributions.\ +\ +It is also possible to install the OriginTrail V8 Core Node on other systems, but it would require modifications to the installer script. If you have any such modifications in mind, we highly encourage your contributions. Please visit our [GitHub](https://github.com/OriginTrail/ot-node) for more information. +{% endhint %} + +{% hint style="warning" %} +During the V8 DKG Core Node installation process, the interactive installer will prompt you to input all the required information outlined below. Please make sure that you have all the required inputs prepared before running the installer. +{% endhint %} + +## 2. Core Node keys (wallets): + +DKG Core Nodes periodically execute blockchain transactions, for which they need node keys (wallets) of the H160 type (Ethereum). There are two categories of keys associated with DKG nodes: + +* **Operational keys,** which are used directly by the node to make blockchain transactions. The nodes require their private keys to be accessible (stored in the node configuration file) so that the node can sign transactions. +* **Admin keys**, which are used by the node operator, and not by the node itself. Admin keys are intended for administrative transactions to be done by the node operator (such as rotating keys), meaning the nodes don't need access to their private keys (they do not need to be stored on the node server or in the configuration file). An admin key can be stored with a hardware wallet, a multisig, or any other Ethereum-compatible wallet of your preference. + +To get a DKG node running, you will need **at least one operational key and one admin key**. + +The OriginTrail DKG nodes operate with a two-token system: + +* **TRAC token:** The native utility token of the DKG used for knowledge publishing.
+* **Native blockchain token of the chosen chain:** Used for interacting with DKG smart contracts. + +## 3. Funding your keys: + +In order to acquire the TRAC token on **NeuroWeb**, the [Teleport](../../teleport-instructions-neuroweb.md) process needs to be executed. + +For the Base and Gnosis blockchains, you can use bridge applications as described at the links below: + +* [Base blockchain](../../../../dkg-knowledge-hub/learn-more/connected-blockchains/base-blockchain/) +* [Gnosis blockchain](../../../../dkg-knowledge-hub/learn-more/connected-blockchains/gnosis-chain/) + +## 4. Acquire RPC endpoints: + +There are many RPC providers that host and provide Base and Gnosis endpoints. It's up to the node runner to choose the best possible option for their nodes. + +We recommend checking the official [Gnosis](https://docs.gnosischain.com/tools/RPC%20Providers/) and [Base](https://docs.base.org/docs/tools/node-providers/#coinbase-developer-platform-cdp) documentation. + +{% hint style="success" %} +Neuroweb RPCs are provided automatically during the node installation process. +{% endhint %} + +## 5. Configure firewall on the server: + +The OriginTrail node requires the following ports to be open in order to operate properly: + +* 8900 (default node API endpoint) +* 9000 (networking port for communication with other nodes) + +{% hint style="warning" %} +Please keep in mind that different cloud providers use different security practices when it comes to configuring firewalls on the servers. Make sure that your firewall rules are configured according to the practices of the cloud provider you chose.
+{% endhint %} diff --git a/docs/graveyard/everything/dkg-core-node/run-a-v8-core-node-on-mainnet/v8-dkg-core-node-installation.md b/docs/graveyard/everything/dkg-core-node/run-a-v8-core-node-on-mainnet/v8-dkg-core-node-installation.md new file mode 100644 index 0000000..70e9894 --- /dev/null +++ b/docs/graveyard/everything/dkg-core-node/run-a-v8-core-node-on-mainnet/v8-dkg-core-node-installation.md @@ -0,0 +1,112 @@ +--- +description: This page will guide you through the V8 DKG Core Node installation process +--- + +# V8 DKG Core Node installation + +The installation process involves interacting with the installer through the terminal. To proceed, you should have all the required inputs ready, as the installer will prompt you for them: + +* Ubuntu 20.04, 22.04 or 24.04 instance +* Admin and operational keys and their private keys +* Funds on the wallets +* RPC endpoints +* Firewall configured + +Detailed instructions for the above requirements are available [here](preparation-for-v8-dkg-core-node-deployment.md). + +## 1. **How the installer works**: + +{% hint style="info" %} +The provided installer is designed to install the OriginTrail node on **Ubuntu 20.04 LTS, 22.04 LTS and 24.04 LTS** distributions.\ +\ +It is also possible to install the OriginTrail node on other systems, but it would require modifications to the installer. If you have any such modifications in mind, we highly encourage your contributions. Please visit our [GitHub](https://github.com/OriginTrail/ot-node) for more information.
+{% endhint %} + +### **During the installation process, the OriginTrail node installer will execute the following actions:** + +* Check for the Ubuntu OS version compatibility +* Install the required Node.js version together with NPM +* Deploy the OriginTrail node directory and install all required modules +* Configure and enable OriginTrail node service (as systemctl) +* Configure your nodes .origintrail\_noderc file based on the provided inputs: + * Admin and operational keys, + * Node shares token name and symbol + * Operator fee (value between 0 - 100 based on your choice) + * RPC endpoint +* Install and enable MySQL service and create operationaldb for the node +* Configure MySQL user password for the OriginTrail node operational database (based on your inputs) +* Install and enable Tripple store database +* Automatically deploy **otnode-logger.service** in order to pass logs to OriginTrail team + +{% hint style="info" %} +To disable **otnode-logger.service**, execute the following commands on the server once the installation is finalized + +```sh +systemctl disable otnode-logger.service +systemctl stop otnode-logger.service +``` +{% endhint %} + +

Installer interraction

+ +### Installer video tutorial (slightly outdated): + +Before proceeding, make sure to check our quick video tutorial, which explains the process of interacting with the installer. + +{% embed url="https://www.youtube.com/watch?v=RZvIx27I8Ts" %} +8 incentivized DKG Core Node deployment process +{% endembed %} + +## 2. Download OriginTrail V8 DKG Core Node installer: + +Ensure that you're logged in as root. Then, execute the following command in order to download the installer and grant it executable access: + +```sh +cd /root/ && curl -k -o installer.sh https://raw.githubusercontent.com/OriginTrail/ot-node/v8/develop/installer/installer.sh && chmod +x installer.sh +``` + +## 3. Execute the installer by running: + +``` +./installer.sh +``` + +{% hint style="info" %} +Do not run the installer with "sudo". +{% endhint %} + + + +## 4. Verify V8 DKG Core Node installation: + +If your installation has been successful, your node will show the “**Node is up and running!**” log, as shown in the example image below: + +

V8 DKG Core node successful initialization

+ +**`Congratulations, your V8 DKG Core Node is up and running!`** + + + +### **Useful commands:** + +**Starting your node:** `otnode-start` or `systemctl start otnode` + +**Stopping the node:** `otnode-stop` or `systemctl stop otnode` + +**Restarting the node:** `otnode-restart` or `systemctl restart otnode` + +**Showing node logs:** `otnode-logs` or `journalctl -u otnode --output cat -fn 100` + +**Opening the node config:** `otnode-config` or `nano /root/ot-node/.origintrail_noderc` + + + +## Need help? + +If you encounter any issues during the installation process or have questions regarding any of the above topics, jump into our official [Discord](https://discord.gg/xCaY7hvNwD) and ask for assistance. + +Follow our official channels for updates: + +* [X](https://x.com/origin_trail) +* [Medium](https://medium.com/origintrail) +* [Telegram](https://t.me/origintrail) diff --git a/docs/graveyard/everything/dkg-core-node/run-a-v8-core-node-on-testnet/README.md b/docs/graveyard/everything/dkg-core-node/run-a-v8-core-node-on-testnet/README.md new file mode 100644 index 0000000..e802412 --- /dev/null +++ b/docs/graveyard/everything/dkg-core-node/run-a-v8-core-node-on-testnet/README.md @@ -0,0 +1,19 @@ +# Run a V8 Core Node on testnet + +The OriginTrail V8 network consists of two types of DKG nodes: **Core Nodes**, which form the DKG network core and host the DKG, and **Edge Nodes,** which run on edge devices and connect to the network core. + +The following pages will guide you through the process of setting up a V8 Core Node on the V8 incentivized testnet. DKG Core Nodes require a Linux server with high uptime, as they are intended to run constantly to support the network. Before installing the V8 DKG Core Node, it's essential to complete a few key steps, such as acquiring tokens, preparing node keys, and obtaining blockchain RPC endpoints. + +The setup process should usually last between 30 minutes and an hour, depending on your proficiency level.
In case you have trouble or questions at any point, join the dedicated [OriginTrail #v8-discussion Discord channel](https://discord.com/invite/WCnDQArdzQ) and the OriginTrail community will gladly assist. + +{% hint style="info" %} +This is a **testnet** environment and is **not intended for production use**.\ +It is subject to frequent changes and updates as part of ongoing testing efforts to validate and improve new releases. +{% endhint %} + +Happy DKG node running! diff --git a/docs/graveyard/everything/dkg-core-node/run-a-v8-core-node-on-testnet/preparation-for-v8-dkg-core-node-deployment.md b/docs/graveyard/everything/dkg-core-node/run-a-v8-core-node-on-testnet/preparation-for-v8-dkg-core-node-deployment.md new file mode 100644 index 0000000..fad480a --- /dev/null +++ b/docs/graveyard/everything/dkg-core-node/run-a-v8-core-node-on-testnet/preparation-for-v8-dkg-core-node-deployment.md @@ -0,0 +1,93 @@ +--- +description: >- + This page will help you prepare all the requirements for the V8 DKG Core Node + installation process. +--- + +# Preparation for V8 DKG Core Node deployment + +Before initiating the installer, each node runner should fulfill a few important requirements to successfully initialize the V8 DKG node. + +## 1. Hardware requirements: + +In order to deploy your V8 DKG node, you will need a Linux server with the minimum recommended hardware as presented below: + +* **4GB RAM** +* **2 CPUs** +* **50GB HDD space** + +{% hint style="success" %} +The initial node setup requires a minimum of 4GB RAM and 2 CPU cores. However, as the node operates for a longer period and accumulates more data, a hardware upgrade may be necessary to maintain optimal performance.
+{% endhint %} + +Make sure that you have root access to your server. + +{% hint style="info" %} +The installer script provided in these instructions is designed to install the OriginTrail node on **Ubuntu 20.04 LTS, 22.04 LTS, and 24.04 LTS** distributions.\ +\ +It is also possible to install the OriginTrail V8 Core Node on other systems, but it would require modifications to the installer script. If you have any such modifications in mind, we highly encourage your contributions. Please visit our [GitHub](https://github.com/OriginTrail/ot-node) for more information. +{% endhint %} + +{% hint style="warning" %} +During the V8 DKG Core Node installation process, the interactive installer will prompt you to input all the required information outlined below. Please make sure that you have all the required inputs prepared before running the installer. +{% endhint %} + +## 2. Core Node keys (wallets): + +DKG Core Nodes periodically execute blockchain transactions, for which they need node keys (wallets) of the H160 type (Ethereum). There are two categories of keys associated with DKG nodes: + +* **Operational keys,** which are used directly by the node to make blockchain transactions. The nodes require their private keys to be accessible (stored in the node configuration file) so that the node can sign transactions. +* **Admin keys**, which are used by the node operator, and not by the node itself. Admin keys are intended for administrative transactions to be done by the node operator (such as rotating keys), meaning the nodes don't need access to their private keys (they do not need to be stored on the node server or in the configuration file). An admin key can be stored with a hardware wallet, a multisig, or any other Ethereum-compatible wallet of your preference. + +To get a DKG node running, you will need **at least one operational key and one admin key**.
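Because the operational key's private key is stored in plain text in the node configuration file, it is worth restricting read access to that file once the node is installed. A sketch, assuming the default `/root/ot-node/.origintrail_noderc` location:

```sh
# Sketch: make the node config readable by root only, since it holds the
# operational key's private key. Path assumes a default installation.
CONFIG=/root/ot-node/.origintrail_noderc
if [ -f "$CONFIG" ]; then
  chmod 600 "$CONFIG"
  ls -l "$CONFIG"
else
  echo "config not found at $CONFIG (node not installed yet?)"
fi
```

The admin key, by contrast, never needs to touch the server at all, which is why keeping it in a hardware wallet or multisig is the safer default.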
+ +The OriginTrail DKG nodes operate with a two-token system: + +* **TRAC token:** The native utility token of the DKG used for knowledge publishing. +* **Native blockchain token of the chosen chain:** Used for interacting with DKG smart contracts. + +## 3. Funding your keys: + +### Acquire Base Sepolia ETH: + +The list of Base Sepolia faucets can be found in the official Base [documentation](https://docs.base.org/docs/tools/network-faucets/). + +### Acquire Gnosis Chiado xDai: + +The list of Gnosis Chiado faucets can be found in the official Gnosis [documentation](https://docs.gnosischain.com/tools/Faucets). + +### Acquire NEURO and Base Sepolia, Gnosis Chiado, or NeuroWeb TRAC: + +To acquire TRAC on the Base Sepolia, Gnosis Chiado, or NeuroWeb network, you should use the Faucet Bot provided by the OriginTrail core team in the [#faucet-bot Discord channel](https://discord.gg/WaeSb5Mxj6). + +The Faucet Bot also provides testnet NEURO tokens. + +{% hint style="info" %} +**Instructions for the Faucet Bot** + +You can find **full instructions** on **how to use the Faucet Bot** [**here**](../../../../dkg-knowledge-hub/useful-resources/test-token-faucet.md). + +If you have any problems with the Faucet Bot, you can contact any team member on Discord for assistance or send an email to [tech@origin-trail.com](mailto:tech@origin-trail.com). +{% endhint %} + +## 4. Acquire RPC endpoints: + +There are many RPC providers that host and provide Base and Gnosis endpoints. It's up to the node runner to choose the best possible option for their nodes. + +We recommend checking the official [Gnosis](https://docs.gnosischain.com/tools/RPC%20Providers/) and [Base](https://docs.base.org/docs/tools/node-providers/#coinbase-developer-platform-cdp) documentation. + +{% hint style="success" %} +Neuroweb RPCs are provided automatically during the node installation process. +{% endhint %} + +## 5.
Configure firewall on the server: + +The OriginTrail node requires the following ports to be open in order to operate properly: + +* 8900 (default node API endpoint) +* 9000 (networking port for communication with other nodes) + +{% hint style="warning" %} +Please keep in mind that different cloud providers use different security practices when it comes to configuring firewalls on the servers. Make sure that your firewall rules are configured according to the practices of the cloud provider you chose. +{% endhint %} + diff --git a/docs/graveyard/everything/dkg-core-node/run-a-v8-core-node-on-testnet/v8-dkg-core-node-installation.md b/docs/graveyard/everything/dkg-core-node/run-a-v8-core-node-on-testnet/v8-dkg-core-node-installation.md new file mode 100644 index 0000000..45c9e10 --- /dev/null +++ b/docs/graveyard/everything/dkg-core-node/run-a-v8-core-node-on-testnet/v8-dkg-core-node-installation.md @@ -0,0 +1,102 @@ +--- +description: This page will guide you through the V8 DKG Core Node installation process +--- + +# V8 DKG Core Node installation + +The installation process involves interacting with the installer through the terminal. To proceed, you should have all the required inputs ready, as the installer will prompt you for them: + +* Ubuntu 20.04, 22.04 or 24.04 instance +* Admin and operational keys and their private keys +* Funds on the wallets +* RPC endpoints +* Firewall configured + +Detailed instructions for the above requirements are available [here](preparation-for-v8-dkg-core-node-deployment.md). + +## 1. **How the installer works**: + +{% hint style="info" %} +The provided installer is designed to install the OriginTrail node on **Ubuntu 20.04 LTS, 22.04 LTS and 24.04 LTS** distributions.\ +\ +It is also possible to install the OriginTrail node on other systems, but it would require modifications to the installer. If you have any such modifications in mind, we highly encourage your contributions.
Please visit our [GitHub](https://github.com/OriginTrail/ot-node) for more information. +{% endhint %} + +### **During the installation process, the OriginTrail node installer will execute the following actions:** + +* Check for the Ubuntu OS version compatibility +* Install the required Node.js version together with NPM +* Deploy the OriginTrail node directory and install all required modules +* Configure and enable OriginTrail node service (as systemctl) +* Configure your nodes .origintrail\_noderc file based on the provided inputs: + * Admin and operational keys, + * Node shares token name and symbol + * Operator fee (value between 0 - 100 based on your choice) + * RPC endpoint +* Install and enable MySQL service and create operationaldb for the node +* Configure MySQL user password for the OriginTrail node operational database (based on your inputs) +* Install and enable Triple store database + +

Installer interaction

+ +### Installer video tutorial (slightly outdated): + +Before proceeding, make sure to check our quick video tutorial, which explains the process of interacting with the installer. + +{% embed url="https://www.youtube.com/watch?v=RZvIx27I8Ts" %} +V8 incentivized DKG Core Node deployment process +{% endembed %} + +## 2. Download OriginTrail V8 DKG Core Node installer: + +Ensure that you're logged in as root. Then, execute the following command in order to download the installer and grant it executable access: + +```sh +cd /root/ && curl -k -o installer.sh https://raw.githubusercontent.com/OriginTrail/ot-node/v8/develop/installer/installer.sh && chmod +x installer.sh +``` + +## 3. Execute the installer by running: + +``` +./installer.sh +``` + +{% hint style="info" %} +Do not run the installer with "sudo". +{% endhint %} + + + +## 4. Verify V8 DKG Core Node installation: + +If your installation has been successful, your node will show the “**Node is up and running!**” log, as shown in the example image below: + +

V8 DKG Core node successful initialization

+ +**`Congratulations, your V8 DKG Core Node is up and running!`** + + + +### **Useful commands:** + +**Starting your node:** `otnode-start` or `systemctl start otnode` + +**Stopping the node:** `otnode-stop` or `systemctl stop otnode` + +**Restarting the node:** `otnode-restart` or `systemctl restart otnode` + +**Showing node logs:** `otnode-logs` or `journalctl -u otnode --output cat -fn 100` + +**Opening the node config:** `otnode-config` or `nano /root/ot-node/.origintrail_noderc` + + + +## Need help? + +If you encounter any issues during the installation process or have questions regarding any of the above topics, jump into our official [Discord](https://discord.gg/xCaY7hvNwD) and ask for assistance. + +Follow our official channels for updates: + +* [X](https://x.com/origin_trail) +* [Medium](https://medium.com/origintrail) +* [Telegram](https://t.me/origintrail) diff --git a/docs/graveyard/everything/dkg-core-node/upgrading-from-v6-to-v8.md b/docs/graveyard/everything/dkg-core-node/upgrading-from-v6-to-v8.md new file mode 100644 index 0000000..7bd73bf --- /dev/null +++ b/docs/graveyard/everything/dkg-core-node/upgrading-from-v6-to-v8.md @@ -0,0 +1,112 @@ +--- +hidden: true +--- + +# Upgrading from V6 to V8 + +{% hint style="info" %} +This section is only for the node runners who are already running their nodes on V6 and need to upgrade to V8. +{% endhint %} + +A few manual steps must be performed to successfully update your V6 node to a V8 Core Node. The following steps should be performed after the official V8 release has been deployed on the DKG mainnet. + +## Preparing your node for V8 update + +Have your node automatically download the latest version, and verify that the auto-updater is enabled on the ot-node service. To enable the auto-updater, follow the instructions on the following [page](https://docs.origintrail.io/dkg-v6-current-version/node-setup-instructions/useful-resources/manually-configuring-your-node). 
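As a quick sanity check before restarting, you can grep the config for an enabled `autoUpdater` entry. This is a naive, illustrative sketch (a JSON-aware tool would be more robust), demonstrated against a minimal sample config written to a temp file; on a real node you would point `CONFIG` at `/root/ot-node/.origintrail_noderc` instead:

```sh
# Illustrative check only: look for an enabled autoUpdater module entry.
# Demonstrated against a minimal sample config in a temp file; on a real
# node, set CONFIG=/root/ot-node/.origintrail_noderc instead.
CONFIG=$(mktemp)
printf '{ "modules": { "autoUpdater": { "enabled": true } } }\n' > "$CONFIG"

if grep -q '"autoUpdater"' "$CONFIG" && grep -q '"enabled": true' "$CONFIG"; then
  echo "autoUpdater present and enabled"
else
  echo "autoUpdater not enabled - edit the config first"
fi
```

If the check passes, restart the node as described below to apply the configuration changes.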
+ +Once the auto-updater is enabled in the .origintrail\_noderc file, restart the node to apply the configuration changes. When the update is released, your node will automatically pull the latest version (V8), install node modules, and restart. + +## Configuring the new V8 blockchainEvents module: + +The new V8 ot-node engine introduces a blockchainEvents module, which you will need to configure on your node by adjusting the settings in the configuration file. The provided template includes configurations for three blockchains: otp:2043, gnosis:100, and base:8453. Depending on which blockchains your node is connected to, you should modify the template accordingly. + +* **Adjust the `blockchains` array**: Only include the blockchain IDs relevant to your node. For example, if your node is not connected to the gnosis:100 blockchain, remove it from the array. +* **Update the `rpcEndpoints` section**: Provide the appropriate RPC endpoints for each blockchain your node is connected to. Remove any endpoints that do not apply to your node’s blockchain configuration. + +#### **Example configuration with blockchainEvents added:** + +```json +{ + "modules": { + "autoUpdater": { + + }, + "blockchainEvents": { + "enabled": true, + "implementation": { + "ot-ethers": { + "enabled": true, + "package": "./blockchain-events/implementation/ot-ethers/ot-ethers.js", + "config": { + "blockchains": [ + "otp:2043", + "gnosis:100", + "base:8453" + ], + "rpcEndpoints": { + "base:8453": ["https://"], + "gnosis:100": ["https://"] + } + } + } + } + }, + "blockchain": { + + }, + ... +``` + +## Running data migration + +After the node successfully starts with version 8, you **should manually run the data migration script** `run-data-migration.sh`. Before that, make sure you update the **MAIN\_DIR** variables in these two files if your main directory is **NOT** "root": + +1. run-data-migration.sh +2.
constants.js + +Both files are located in the "**`/ot-node/current/v8-data-migration/`**" directory.\ +\ +After that, execute the data migration script in the same directory by running: + +```bash +bash run-data-migration.sh +``` + +The data migration script migrates all triples from the V6 triple store repositories to their V8 counterparts. Make sure you have configured RPCs properly for all the blockchains your node is connected to. Depending on how much data your node is hosting, this migration might take anywhere from several hours to several days. + +{% hint style="info" %} +This migration will not influence your node reward performance. Your node will remain fully operational, and all pre-V8 Knowledge Assets will remain queryable via the GET protocol method. +{% endhint %} + +If the data migration is interrupted for any reason (e.g., server restart), simply re-run the script (make sure the script is not already running first; see the [#restarting-terminating-the-data-migration](upgrading-from-v6-to-v8.md#restarting-terminating-the-data-migration "mention") section for more information). It will automatically resume from the point where it was interrupted. + +### Tracking data migration progress + +To track the data migration progress, check the nohup.out file located at "**`/ot-node/data/nohup.out`**", for example by using the command: + +```bash +tail -f nohup.out +``` + +If you want to analyze the logs, we suggest taking a look at the migration log file located at "**`/ot-node/data/data-migration/logs/migration.log`**". + +### Restarting/terminating the data migration + +If you want to restart the script, make sure you terminate the old process first.
You can find out whether the old process is still running by entering this into the terminal: + +```bash +ps aux | grep v8-data-migration +``` + +If the old data migration process is still running, you should see an output like this: + +```bash +root 1046979 30.9 3.8 22159880 152996 pts/0 Sl 11:12 0:15 node v8-data-migration.js +root 1047113 0.0 0.0 8164 712 pts/0 S+ 11:13 0:00 grep --color=auto v8-data-migration +``` + +Run the "**`kill <PID>`**" command on the node process (its PID is the second column of the `ps` output above, e.g., 1046979) and repeat the steps under the [#running-data-migration](upgrading-from-v6-to-v8.md#running-data-migration "mention") section. + +{% hint style="warning" %} +We strongly encourage all node runners to update as soon as the release is out to ensure continued compatibility and to take advantage of the latest features and improvements. +{% endhint %} diff --git a/docs/graveyard/everything/dkg-edge-node/README.md b/docs/graveyard/everything/dkg-edge-node/README.md new file mode 100644 index 0000000..e439849 --- /dev/null +++ b/docs/graveyard/everything/dkg-edge-node/README.md @@ -0,0 +1,45 @@ +--- +description: >- + The platform for building collective neuro-symbolic AI. Start building with + the DKG Edge Node. +hidden: true +icon: diagram-project +cover: ../../../.gitbook/assets/DKG Edge Node - beta version - docs visual2 1 (1).png +coverY: 0 +--- + +# DKG Edge Node + +Looking to build your decentralized AI project, launch a Decentralized Knowledge Graph (DKG) paranet, build a dRAG system, or just take the DKG for a spin? + +The OriginTrail DKG Edge Node is an open-source platform designed to empower builders of neuro-symbolic AI projects. It provides a collection of fully customizable components for building decentralized AI applications based on the OriginTrail DKG.
+ +Edge Nodes enable running decentralized AI services on the "network edge", providing a unified platform for: + +* **Producing & publishing Knowledge Assets,** offering a multitude of Knowledge-Asset-creation agents based on knowledge graph creation pipelines, vector embedding generation, and others combined with a paranet publishing system. +* **Maintaining data privately stored on a device hosting the Edge Node,** enabling full configurability of technology used, such as choice of graph databases, vector stores, etc. +* **Building Decentralized Retrieval Augmented Generation (dRAG) pipelines** using the dRAG API and combining knowledge graph queries, vector search, or other retrieval methods. +* **Configurability and choice of AI models,** running either on the device or externally. +* **Customizable and extendable user interface (UI) elements for your projects, such as paranet UIs,** on the basis of the Edge Node codebase. + +{% hint style="info" %} +As an early builder, you can benefit from the [DKG Edge Node inception program](dkg-edge-node-inception-program.md) with up to 100k TRAC tokens for your publishing costs on the DKG mainnet. +{% endhint %} + +{% hint style="warning" %} +The DKG Edge Node and its documentation are a work in progress. They are expected to undergo many improvements and we welcome all your contributions to this. + +Feel free to contribute to these docs directly through [GitHub](https://github.com/OriginTrail/dkg-docs) (click the "Edit on GitHub" button in the top right corner of the screen). +{% endhint %} + +### How can I start building with the Edge Node? + +
Step 1: Learn about the Edge Node internals (architecture)Learn about the Edge Node internals (architecture).png
Step 2: Get started with the Edge Node boilerplateGet started with the Edge Node boilerplate.png
Step 3: Customize & build with the Edge NodeCustomize the Edge Node to build your project (1).png
+ +### What's the difference between a Core Node and an Edge Node? + +The OriginTrail DKG V8 network consists of two types of DKG nodes — **Core Nodes**, which form the DKG network core and host the DKG, and **Edge Nodes,** which run on edge devices and connect to the network core. The current beta version is designed to operate on edge devices running Linux and macOS, with future support planned for a wide range of edge devices such as mobile phones, wearables, IoT devices, and enterprise environments in general. This enables large volumes of sensitive data to safely enter the AI age while maintaining privacy. + +The DKG Edge Node ensures that data remains protected on the device, giving owners full control over how their data is shared. This data becomes part of the global DKG, with precise access management permissions controlled by the data owner. Through dRAG, AI applications can access both private and public DKG data, with the owner’s permission, to power privacy-preserving AI solutions. + +With the ability to run AI applications directly on edge devices, DKG Edge Nodes enable secure and decentralized data usage on the growing number of network edge devices. The DKG Edge Node also plays a key role in expanding the DKG into a large-scale decentralized physical infrastructure network (DePIN). diff --git a/docs/graveyard/everything/dkg-edge-node/customize-and-build-with-the-edge-node.md b/docs/graveyard/everything/dkg-edge-node/customize-and-build-with-the-edge-node.md new file mode 100644 index 0000000..605d1b7 --- /dev/null +++ b/docs/graveyard/everything/dkg-edge-node/customize-and-build-with-the-edge-node.md @@ -0,0 +1,194 @@ +# Customize & build with the Edge Node + +Now that you've set up the boilerplate project with all the essential components, it's time to explore the customization options. Each service in the app — interface, API, Knowledge Mining API, dRAG API, and Authentication Service — is **open source** and **fully customizable**.
You can fork any of these repositories and modify them to better suit your specific requirements. + +This section will focus primarily on customizing the **Knowledge Mining API** and **dRAG API**, two core services that empower you to process data and build Decentralized Retrieval-Augmented Generation (dRAG) pipelines based on Knowledge Assets. + +## Introduction to customization + +Each service is hosted in a separate GitHub repository, enabling you to fork the code, make your changes, and deploy your customized versions. The default setup provides a boilerplate project with out-of-the-box functionality, serving as a solid foundation for customization. + +Before diving into how you can customize Edge Node services, it's important to highlight the role of the Authentication Service, which acts as the connecting hub between all components. This service manages the `UserConfig` data, including both required and optional parameters that are shared across all services to ensure the system functions cohesively. + +Users can add custom variables to the `UserConfig` table, making them accessible across all services. For instance, if a use case requires an external tool with an authentication key, the `UserConfig` table is the ideal place to store this key, ensuring that the variable is available to all services in the system. + +## Local environment setup with forked services + +To begin customizing and building your own solution using the OriginTrail Edge Node stack, we recommend the following local development setup:\ + + +1. ### Fork Core Edge Node Repositories + + To fully tailor the Edge Node to your specific use case, it is recommended that you **fork the following components** into your own GitHub account: + + 1. **Edge Node Knowledge Mining API**\ + Handles the ingestion and transformation of your datasets into Knowledge Assets.\ + 👉 Fork this service to add new file format support and custom data transformation logic. + 2.
**Edge Node DRAG API**\ + Provides search, retrieval, and chat functionality based on Knowledge Assets.\ + 👉 Fork this to extend the retrieval-augmented generation (RAG) logic or customize how assets are queried. + 3. **Edge Node API**\ + Main backend orchestrator connecting your UI, authentication, and data pipelines.\ + 👉 Fork this if you want to modify business logic, expose new routes, or integrate additional microservices. + 4. **Edge Node UI**\ + The user-facing interface of the Edge Node.\ + 👉 Fork this to customize branding, UX, workflows, or connect it with your own backend services.\ + +2. ### Authentication Service (Optional Fork) + 1. **Edge Node Authentication Service**\ + This handles user sessions and tokens.\ + Recommended to use as-is for most cases to keep things simple and aligned with best practices.\ + 🛠️ Optional: You may fork this if you need: + + 1. Custom authentication methods (e.g., biometric login, enterprise SSO) + 2. Integration with external identity providers + 3. Custom logic for Verifiable Credential issuance or DID resolution + + +3. ### Forked Repositories Setup + + Once you’ve successfully forked the core Edge Node repositories and tested the default setup using the official public repos, you’ll need to **clean your local environment** before installing your customized versions. + + 1. **Prune Default Services** + + To avoid conflicts and ensure a clean state, run the following command from the root of your Edge Node setup:\ + \ + `bash edge-node-installer/reset-env.sh`\ + \ + After pruning the default Edge Node setup, your environment will be reset: + + 1. All previously cloned **Edge Node service repositories** will be deleted + 2. All **Edge Node databases** will be dropped\ + + 2. **Switch to Your Forked Repositories** + 1. **Open your `.env` file** located at the root of the project. + 2. Replace the official repository URLs with the links to your **forked repositories.**\ + + 3. **Install Your Custom Edge Node** + 1. 
Run the Edge Node installer script, which will install the services based on your forked repositories. + 2. If your Edge Node is set up on macOS, execute the following script to run your services:\ + \ + `bash edge-node-installer/run-dev.sh` + +## Common customization scenarios + +### Creating a custom knowledge mining pipeline + +The **Knowledge Mining API** is one of the most powerful services in the Edge Node, offering builders the flexibility to create custom processing pipelines without limitations. Registering new pipelines is straightforward, and the service is built in Python, a versatile environment well-suited for **data processing**, **AI tools**, **models**, and more. It’s up to the builder to decide how to parse the input data from incoming requests. The only requirement is that the pipeline outputs data in JSON-LD format, as this is necessary for publishing it as a Knowledge Asset on the DKG. + +* **Forking the repository** + * Start by **forking** the Knowledge Mining API repository from GitHub. This creates a copy of the original repository under your GitHub account, which you can modify as needed. + * Clone your forked repository locally: + + ```bash + git clone https://github.com/your-username/edge-node-knowledge-mining.git + ``` +* **Creating your pipeline file** + * To create a new processing pipeline, add a new file in the "dags" folder of your project. + * As a starting point, you can use the existing example file `dags/exampleDAG.py`. Duplicate this file and modify it as needed. + * The example pipeline serves as a basic illustration. It performs two tasks: + * Extracting parameters from the incoming request. + * Converting the input file to JSON-LD format. +* **Building the pipeline** + * There are no limitations on how you build your pipeline. You can: + * Install additional packages as needed. + * Use external APIs for data enrichment. + * Incorporate any processing logic to transform incoming data into the required graph structure (JSON-LD).
+ * **Requirement:** The pipeline must output data in JSON-LD format (in the same format as existing mining pipelines) so that the Edge Node API can process it and pass it to the Publish Service for creating a Knowledge Asset on the DKG. + +{% hint style="info" %} +A **DAG** defines the execution order of tasks within a **pipeline**, while a **pipeline** is the entire data processing workflow. A pipeline can contain multiple DAGs, but a DAG is just one part of a pipeline. + +* All **DAGs** are initially paused. +{% endhint %} + +* **Starting the pipeline** + * **Restart the services:** After creating the pipeline (DAG), the Edge Node Knowledge Mining API should be restarted: + * _python app.py_ and _airflow scheduler_ (scripts mentioned in the previous chapter, required to be running in order to use the knowledge mining service) + * Optionally, if _airflow webserver_ is used, it should also be restarted + * **Unpause** your pipeline\ + `airflow dags unpause ${YOUR_DAG_NAME}`\ + _e.g., if your pipeline filename is xlsx\_to\_jsonld.py, the unpause command should be "airflow dags unpause xlsx\_to\_jsonld"_\ + \ + **NOTE:** If you are using the Airflow webserver, you should be able to see your pipeline at http://localhost:8080 (or any other port you selected for the service) under "unpaused DAGs" +* **Registering the pipeline** + * For the Edge Node to **recognize** and **use** your new pipeline for certain file types, it needs to be registered in your main **UserConfig** table in the Authentication Service MySQL DB (edge-node-auth-service) + * By default, the Edge Node offers three variables for three main file types: + * _kmining\_json\_pipeline\_id_ — this pipeline will be used when the input file MIME type is detected as **"application/json"** + * _kmining\_pdf\_pipeline\_id_ — this pipeline will be used when the input file MIME type is detected as **"application/pdf"** + * _kmining\_csv\_pipeline\_id_ — this pipeline will be used when the input file MIME type is detected as **"text/csv"** + * If your pipeline handles one of the above file types, simply replace the default pipeline with your custom one. **Here are some ideas:** + * **Convert science paper PDFs to JSON-LD using a bibliographic ontology**\ + Extract metadata from science paper PDFs, such as title, authors, publication date, and references, and convert the data into JSON-LD following a bibliographic ontology like BIBO. This allows for structured, machine-readable representation of academic papers for easier citation management and searchability. + * **Convert supply chain Excel documents to JSON-LD using the GS1 standard ontology**\ + Parse supply chain-related data from Excel files (e.g., product lists, inventory records) and convert it into JSON-LD using the GS1 standard ontology. + * **Convert images to JSON-LD using OCR**\ + Use Optical Character Recognition (OCR) to extract text and metadata from image files and represent it as JSON-LD.
+ * **Convert videos to Knowledge Assets by transcribing the audio and extracting key points**\ + Transcribe the audio from videos and extract key points or insights, then represent this information as JSON-LD knowledge assets. + * If you need to support a different file type: + * Create a new variable for the file type, e.g., `kmining_xlsx_pipeline_id` + * adapt the code in [Edge Node API - kMiningService](https://github.com/OriginTrail/edge-node-api/blob/main/services/kMiningService.js) to handle the new variable based on the input file's MIME type +* You should now be ready to test your setup. Visit the Edge Node interface, go to the "Contribute" page, import a file, and verify that the pipeline processes it correctly. + +{% hint style="info" %} +Every **Knowledge Asset** consists of both a **private** and a **public part**, both of which can be published. Regarding publishing private and/or public data, you should note: + +1. When you query the node, you will be able to retrieve both the private and public parts (use `contentType: 'all'` in the `get` function). +2. Anyone with access to the node will be able to view the private knowledge. If you prefer to keep it private, simply whitelist yourself. +{% endhint %} + +### Creating a custom dRAG pipeline + +Now that you have created your processing pipelines and published Knowledge Assets on the DKG, you can customize the dRAG (Decentralized Retrieval-Augmented Generation) service to build your own decentralized RAGs. The dRAG service is built in Node.js, providing a flexible environment that supports integrations with LLMs, AI tools, and more, giving you limitless possibilities to customize your RAGs. + +The native query language for interacting with the DKG is SPARQL, as we use a triple store for data storage. You can combine SPARQL, LLMs, vector databases, or other tools to extract and process data from the DKG, refine the results, and deliver answers to natural language questions. 
The dRAG service functions like any other RAG but with the added benefit of decentralized data sources from the DKG. + +* **Forking the repository** + * Start by **forking** the dRAG API repository from GitHub. This creates a copy of the original repository under your GitHub account, which you can modify as needed. + * Clone your forked repository locally: + + ```bash + git clone https://github.com/your-username/edge-node-drag.git + ``` +* **Creating your dRAG** + * The app currently contains two dRAGs as a demonstration of how natural language questions can be understood, processed, and answered using SPARQL and vector similarity search. + * Creating a new dRAG essentially means creating a new API route in the app. The following steps are recommended but not mandatory:\ + NOTE: (_you can create your own paths as long as they are compatible with the Edge Node interface, which can also be customized to your needs_) + * Create a new Controller in the controllers directory + * Create your dRAG method in the Controller + * Register the route in route.js using the created Controller and dRAG method +* **Building a new dRAG** + * The dRAG service is highly customizable and supports various tools and libraries that can help you extract meaningful insights from the DKG. You can: + * Combine SPARQL queries with AI-powered analysis tools to answer complex natural language questions. + * Use LLMs for text generation or summarization based on the retrieved data. + * Integrate vector databases to enhance search capabilities with similarity search or semantic search. + * **Requirement:** The current version of the Edge Node interface works with the defined response format that you can see in the two provided examples.
The response contains two parameters: + * **Answer** — the natural language answer to the user's question + * **Knowledge Assets** — used to show which Knowledge Assets the answer was created from\ + NOTE: As mentioned earlier, you can fork and update the [Edge Node Interface](https://github.com/OriginTrail/edge-node-interface) as well and adapt the responses of your dRAGs to be compatible with the adapted interface. +* **Integration with interface** + * As mentioned earlier, the AI Assistant is available in the current version of the Edge Node interface. To connect your dRAG to the interface, all you need to do is update the route used for fetching answers. +* **Potential dRAG ideas:** + * **Teach an LLM to convert natural language to SPARQL:** Leverage few-shot learning to train an LLM to convert natural language queries into SPARQL, tailored to your specific ontology and data model. Provide a set of example queries and responses to guide the model's understanding. (Example: `exampleSparqlController.js`) + * **Integrate vector search with reranking for precision:** Use a vector database to retrieve content similar to the user's question, then refine the results with AI-based reranking tools to improve relevance and accuracy. This can enhance both the speed and precision of the search. (Example: `exampleVectorController.js`) + * **Intent-matching AI for predefined SPARQL queries:** Create a set of predefined SPARQL queries and use an intent-matching algorithm powered by AI to map user queries to the most relevant SPARQL queries. This allows for efficient querying with minimal overhead. + * **Feedback-loop-based SPARQL refinement:** Combine the LLM's natural language to SPARQL conversion with a feedback loop, in which the AI iteratively enhances the generated SPARQL queries, ensuring they align with the ontology and avoid errors.
+ * **Hybrid search — Combine vector and symbolic search:** Use a hybrid approach in which vector search (for semantic similarity) and symbolic search (e.g., SPARQL) work in tandem. Balancing structured queries with open-ended search results in this way can help ensure both accuracy and broad coverage. + * **Ontology-aware LLM fine-tuning:** Create a system to fine-tune a large language model (LLM) specifically on a given ontology. This approach involves providing the LLM with structured data from the ontology, including relationships, entities, and definitions, so it can learn to generate responses that align with the specific concepts and rules of the ontology. Then, use the trained model to formulate SPARQL queries based on the natural language. +* You should now be ready to test your setup. Visit the Edge Node interface, go to the "AI Assistant" page, ask a question, and verify that your dRAG can answer it based on your custom logic.\ + + +| Feature | dRAG | Pipeline | +| ----------------- | ----------------------------------------------------------------------------------------- | ---------------------------------------------------------------------------------------------- | +| **Purpose** | Retrieves and generates responses based on decentralized knowledge | Processes and transforms data from input to output | +| **Functionality** | Uses SPARQL, AI, and vector search to answer queries | Automates data processing steps like extraction, transformation, and storage | +| **Workflow** | Queries the Decentralized Knowledge Graph (DKG), processes results, and generates answers | Sequentially processes data through multiple steps (e.g., extraction, transformation, storage) | +| **Components** | Includes query execution, LLM integration, and knowledge retrieval | Can consist of multiple DAGs, data transformation steps, and processing logic | +| **Output** | AI-generated answers + linked Knowledge Assets | Structured, transformed, or enriched data for further use | + 
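To exercise a custom dRAG route end to end, a request along these lines can be used. Note that everything here is an assumption for illustration: the `/drag/ask` path, the port, and the `question` payload field are placeholders for whatever you registered in route.js and your controller:

```sh
# Hypothetical request to a custom dRAG route; the path, port, and payload
# field are placeholders - adapt them to your own route.js and controller.
QUESTION="Which Knowledge Assets mention the GS1 ontology?"
PAYLOAD=$(printf '{"question": "%s"}' "$QUESTION")
echo "$PAYLOAD"

# Uncomment to send it against a locally running dRAG API:
# curl -s -X POST http://localhost:5002/drag/ask \
#   -H "Content-Type: application/json" \
#   -d "$PAYLOAD"
```

The response would then carry the two parameters described above (the answer and the supporting Knowledge Assets), in whatever shape your controller returns.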
+### Customizing other services + +Since all services are open source, you have the freedom to customize any component to suit your specific needs. Your contributions are highly encouraged! For example, if you need to implement a new authentication mechanism, you can add it directly to the Authentication Service and seamlessly integrate it across the other components. + +The open-source nature not only allows you to tailor the system to your requirements but also gives you the opportunity to share your improvements with the community, potentially influencing future features and enhancing the Edge Node ecosystem. Whether you're adding new functionality, optimizing existing features, or experimenting with innovative integrations, your contributions can help drive the evolution of the platform. diff --git a/docs/graveyard/everything/dkg-edge-node/deploy-your-edge-node-based-project/README.md b/docs/graveyard/everything/dkg-edge-node/deploy-your-edge-node-based-project/README.md new file mode 100644 index 0000000..fda13b9 --- /dev/null +++ b/docs/graveyard/everything/dkg-edge-node/deploy-your-edge-node-based-project/README.md @@ -0,0 +1,15 @@ +# Deploy your Edge Node based project + +## Choose Your Deployment Method + +You can deploy the Edge Node using the Automated Installer (more methods coming soon): + +* **Automated Installer:** A flexible and customizable setup process suitable for users comfortable with command-line tools and server environments. You will need to configure the `.env` file and use the terminal to execute the installation. + +{% hint style="warning" %} +**The following instructions are intended for users who have already developed their Edge Node solution and are ready to deploy it to a production environment (Linux server).** These guides do not cover in-depth steps such as wallet preparation, funding, or configuration. +{% endhint %} + +Choose the method that best fits your needs to continue the installation process: + +
Deploy your Edge node with the automated installerhttps___files.gitbook.com_v0_b_gitbook-x-prod.appspot.com_o_spaces%2F-McnEkhdd7JlySeckfHM%2Fuploads%2F1LB2XdwetIONYxxI3xj3%2FAutomated%20environment%20setup (1).pngautomated-deployment-with-installer.md
diff --git a/docs/graveyard/everything/dkg-edge-node/deploy-your-edge-node-based-project/automated-deployment-via-google-cloud-marketplace.md b/docs/graveyard/everything/dkg-edge-node/deploy-your-edge-node-based-project/automated-deployment-via-google-cloud-marketplace.md new file mode 100644 index 0000000..f6f8fb2 --- /dev/null +++ b/docs/graveyard/everything/dkg-edge-node/deploy-your-edge-node-based-project/automated-deployment-via-google-cloud-marketplace.md @@ -0,0 +1,304 @@ +--- +description: >- + This section provides step-by-step instructions for deploying the DKG Edge + Node using the Google Cloud Marketplace. +hidden: true +icon: cart-circle-check +cover: ../../../../.gitbook/assets/Edge Node GCP - docs visual11.png +coverY: 0 +--- + +# Automated deployment via Google Cloud Marketplace + +### Edge Node deployment process overview + +This deployment method allows you to fully configure your Edge Node by filling out a solution form provided through the Google Cloud Marketplace. During the process, you’ll be guided through each required field and configure the following: + +1. Your forked repository URLs +2. Edge Node names for each supported blockchain +3. Edge Node wallet keys +4. Publishing wallet keys +5. OpenAI or Anthropic API key + +{% hint style="success" %} +Ensure you have a Google Cloud account with billing enabled and permissions to deploy Marketplace solutions. +{% endhint %} + +## **Edge Node** deployment preparation + +{% hint style="success" %} +With these requirements prepared, you'll be ready to quickly populate the solution form and proceed with the deployment of your Edge node. +{% endhint %} + +**1. 
Edge Node keys (wallets):** + +The Edge Node requires three types of keys: + +* Management keys +* Operational keys +* Publishing keys + +**Before deploying to a server, ensure your custom Edge Node project has already been developed** and\ +that you are familiar with the wallet preparation process outlined in the [Funding Your Keys](../get-started-with-the-edge-node-boilerplate/automated-setup-with-the-installer.md#id-3.3-funding-your-keys) section of the [Get started with the Edge Node boilerplate](../get-started-with-the-edge-node-boilerplate/) project. + +{% hint style="info" %} +While the boilerplate offers helpful guidance, it is intended as a starting point for development, not a deployment-ready solution. +{% endhint %} + +{% hint style="success" %} + + +If you're deploying on **mainnet**, ensure the following: + +* You have access to TRAC tokens on the blockchains where you intend to deploy your Edge Node +* You understand how to bridge TRAC tokens to **Gnosis** or **Base**, or use **Teleport** to transfer them to **Neuroweb**. +* Your **operational** and **management** keys are properly funded with both TRAC and the native token, depending on the blockchain to which you are deploying your node. +{% endhint %} + +**Useful links:** + +* A list of exchanges where TRAC tokens can be acquired is available on our [official website](https://origintrail.io/get-started/trac-token). +* TRAC Teleport to Neuroweb is explained [here](../../teleport-instructions-neuroweb.md). +* Bridging TRAC to Base is explained [here](../../../../dkg-knowledge-hub/learn-more/connected-blockchains/base-blockchain/#bridging-trac-to-base). +* Bridging TRAC to Gnosis is explained [here](../../../../dkg-knowledge-hub/learn-more/connected-blockchains/gnosis-chain/#bridging-trac-to-gnosis). +* Neuroweb mainnet explorer is available [here](https://neuroweb.subscan.io/). +* Neuroweb testnet explorer is available [here](https://neuroweb-testnet.subscan.io/). 
+* Base Sepolia explorer is available [here](https://sepolia.basescan.org/). +* Base mainnet explorer is available [here](https://basescan.org/). +* Gnosis Chiado explorer is available [here](https://gnosis-chiado.blockscout.com/). +* Gnosis mainnet explorer is available [here](https://gnosisscan.io/). + +**2. LLM functionality:** + +Edge Node components require a large language model (LLM) to operate. To ensure full functionality, you must provide at least one of the supported external API keys below: + +* [**OpenAI**](https://platform.openai.com/api-keys) **API Key** – Used to access OpenAI’s GPT models.\ + Example: `sk-4hsR2exampleXv29NSHtDyzt0NlFgk1LXj6zS7...` +* [**Anthropic**](https://console.anthropic.com/account/keys) **API Key** – Used to access Anthropic’s Claude models.\ + Example: `sk-ant-api03-56a8e5eexamplef27b5b6e47b1...` + +{% hint style="success" %} +🔐 Ensure that the API keys have the necessary usage quotas and permissions based on your expected workload. Always keep them secure and never share them publicly. +{% endhint %} + +## Deployment guide (step by step) + +With all prerequisites in place, you can proceed to populate the solution form on Google Cloud Marketplace and deploy your Edge Node by following the steps below. + +### **1. Log in to Google Cloud Console** + +Access your [Google Cloud Console](https://console.cloud.google.com/) using your credentials. + +### **2. Navigate to the Marketplace** + +In the main menu, go to **Marketplace**, then search for **"OriginTrail node"** in the search bar. + +### **3. Configure virtual machine parameters** + +* Deployment region +* Machine type +* Disk size and other instance-related options + +{% hint style="success" %} +The deployment page comes with recommended machine type and storage options already selected for optimal performance. You can adjust them if needed, but they are sufficient for most standard use cases. +{% endhint %} + +### **4. 
Configure your Edge Node** + +You can now proceed with populating the Edge Node solution form and preparing for deployment. + +**4.1 - Deployment mode** + +This field determines how the Edge Node services are handled post-installation. You can choose between two options: + +* **`development`** – Installs all required services as **systemd units**, but does **not** enable or start them automatically. This mode is ideal for manual setup, debugging, or testing. +* **`production`** – Installs the services as **systemd units**, and automatically enables and starts them once the installation is complete. This is the recommended option for live environments where the node should be operational immediately. + +{% hint style="success" %} +By default, this parameter is set to "production" for Google Cloud Marketplace deployments. +{% endhint %} + +**4.2 - Blockchain environment** + +Choose between **`mainnet`** or **`testnet`**. + +**4.3 - Publishing blockchain** + +Specify the blockchain that will be used for publishing Knowledge Assets. Enter one of the currently supported options: `neuroweb`, `base`, or `gnosis`. + +**4.4 - Publishing keys** + +You must provide public and private key pairs for up to three wallets that will be used for publishing Knowledge Assets to the blockchain selected in the **"Publishing blockchain"** input. \ +The keys should be funded with both the **TRAC token** and the **native token** of that blockchain. + +* `EVM Address 01 (Public Key)`: Public key for the first publishing wallet. **\[mandatory]** +* `EVM Address 01 (Private Key)`: Private key for the first publishing wallet. **\[mandatory]** +* `EVM Address 02 (Public Key)`: Public key for the second publishing wallet. +* `EVM Address 02 (Private Key)`: Private key for the second publishing wallet. +* `EVM Address 03 (Public Key)`: Public key for the third publishing wallet. 
+* `EVM Address 03 (Private Key)`: Private key for the third publishing wallet. + +{% hint style="success" %} +If you're deploying on **mainnet**, ensure the following: + +* The **"Blockchain Environment"** parameter in the form is changed to `mainnet` +* You've bridged TRAC to Base or Gnosis, or used TRAC Teleport if Neuroweb is your blockchain of choice. +{% endhint %} + +**4.5 - Node Engine configuration (ot-node service)** + +{% hint style="warning" %} +Ensure your Edge Node creates its profile on the blockchain defined in **"Publishing blockchain"** (**neuroweb** by default). Choose between the `neuroweb`, `base`, and `gnosis` blockchains. +{% endhint %} + +This section allows you to configure your node’s name for each supported blockchain, as well as define the management and operational keys (wallets). + +* **Node Name**: You can configure the **same node name** for all supported blockchains. This name will help identify your node across all chains. +* **Management wallet (Public EVM key)**: This key grants administrative control, enabling you to configure parameters such as ASK, operator fees, and other settings.\ + You can use the **same management key** across all supported blockchains, and this key should be funded with the **native token** of each respective blockchain in order to be able to perform updates. +* **Operational wallet (Public EVM key)**: This key is used by the node to perform various blockchain operations. +* **Operational wallet (Private EVM key):** Your operational wallet's private key.\ + **Note**: The operational keys **must be different** for each blockchain. + +{% hint style="warning" %} +- Inputs for at least one blockchain configuration (NeuroWeb, Base, or Gnosis) must be populated for the successful deployment of the Edge Node. +- If the operational keys are not funded with the native token on the blockchain of choice, the node will fail to create its blockchain profile. 
+- The management and operational keys require a small amount of the native token (NEURO for Neuroweb, ETH for Base, and xDAI for Gnosis), depending on the blockchain you prepare them for. +{% endhint %} + +**4.6 - MySQL password** + +During the deployment process, the password you provide in this field will be set as the **root password** for the MySQL database installed on your Edge Node server. + +{% hint style="info" %} +🔐 This root password is critical for accessing and managing your Edge node’s database. Do not share it or lose it. +{% endhint %} + +**4.7 - GitHub credentials** + +To deploy private GitHub repositories, you will need to provide a valid GitHub personal access token (PAT). This token grants access to your private repositories, allowing deployment of your custom Edge Node services. + +* **"GitHub username":** Your GitHub username. +* **"GitHub token":** Your GitHub personal access token (a token with appropriate scopes such as `repo`). + +{% hint style="info" %} +The **token must be valid** and have the necessary **permissions** to access your private repositories. + +Ensure the token includes the **`repo`** scope, which provides access to private repositories and other related features. + +Token example: `ghp_1234abcd5678efgh9012ijklmnop34567890` +{% endhint %} + +**4.8 - Edge node services (your forked repository URLs)** + +If you are deploying customized or forked versions of the Edge Node services, you can provide the URLs to your private repositories in the corresponding form fields. The deployment process will use these URLs to clone and install your services as part of the setup. 
+ +* **Knowledge Mining Service Repository URL** – Link to your fork of the Knowledge Mining service +* **DRAG Service Repository URL** – Link to your fork of the DRAG service +* **Edge Node API Repository URL** – Link to your fork of the Edge Node API +* **Edge Node UI Repository URL** – Link to your fork of the Edge Node user interface +* **Authentication Service Repository URL** – Link to your fork of the Authentication service + +{% hint style="info" %} +🔐 Make sure your GitHub access token has the `repo` scope and is valid for cloning private repositories. Use the same token and username in the form fields provided above. +{% endhint %} + +{% hint style="success" %} +If none of the URLs are defined, the installer will use the boilerplate Edge node services. +{% endhint %} + +**4.9 - OpenAI or Anthropic API key:** + +Edge Node components require a large language model (LLM) to operate. To ensure full functionality, you must provide at least one of the supported external API keys below: + +* [**OpenAI**](https://platform.openai.com/api-keys) **API Key** – Used to access OpenAI’s GPT models.\ + Example: `sk-4hsR2exampleXv29NSHtDyzt0NlFgk1LXj6zS7` +* [**Anthropic**](https://console.anthropic.com/account/keys) **API Key** – Used to access Anthropic’s Claude models.\ + Example: `sk-ant-api03-56a8e5eexamplef27b5b6e47b1` + +{% hint style="success" %} +🔐 Ensure that the API keys have the necessary usage quotas and permissions based on your expected workload. Always keep them secure and never share them publicly. +{% endhint %} + +### **\[OPTIONAL] Additional Edge node features** + +The Edge Node supports additional external services and tools that can be configured via the `.env` file before running the installer. Below is a list of supported services represented as configuration parameters: + +* **Unstructured API key** – Enables parsing PDF documents and can be obtained from [https://unstructured.io/api-key-free](https://unstructured.io/api-key-free) + +### 5. 
Deployment + +Once all parameters have been filled out correctly in the form, you can proceed with deploying your Edge Node. + +The server will become available for SSH access within a few minutes, even though the full installation of all Edge Node services may take approximately **20–30 minutes** to complete. + +{% hint style="success" %} +A static IP address will be assigned to your server automatically, and the firewall will be configured as required by the setup. +{% endhint %} + +### 6. Monitor the deployment process + +To monitor the installation progress, SSH into your server, `cd` into the **edge-node-installer** directory, and run the following command: + +```shell +bash service-status.sh +``` + +This script will display the installation progress and current status of each Edge Node service. + +While services are still being installed, a loading spinner will be shown for each one (as shown in the image below). + +

Installation in progress

+ +Once all services are marked as **"Active"**, your Edge Node is fully operational and ready to use. + +

Edge node deployment finalized

+ +{% hint style="warning" %} +⚠️ To prevent potential conflicts or interruptions, avoid performing other actions, such as installing additional packages or modifying configurations on the server, during the Edge Node deployment process. +{% endhint %} + +### 7. Access your Edge node interface + +Upon successful installation, the Edge Node user interface will be accessible at `http://<server-ip>/login`. Replace `<server-ip>` with the actual public IP address of your deployed server. + +**Default login credentials:** + +* Username: my\_edge\_node +* Password: edge\_node\_pass + +### 8. Control and Manage Edge Node Services + +All Edge Node services are deployed as `systemd` units, making them easy to manage. Below is a list of the deployed services and how to manage them using `systemd`. + +#### List of Services: + +1. **Edge Node API** – `edge-node-api.service` +2. **Authentication Service** – `auth-service.service` +3. **Knowledge Mining API** – `ka-mining-api.service` +4. **DRAG API** – `drag-api.service` +5. **OTNode** – `otnode.service` + +#### Managing Services: + +For each service, you can perform the following actions using the respective systemd commands: + +* **Check service status:** `systemctl status <service-name>` +* **Start a service (if it is not already running):** `systemctl start <service-name>` +* **Stop a service:** `systemctl stop <service-name>` +* **Restart a service:** `systemctl restart <service-name>` +* **View service logs:** `journalctl -f -u <service-name>` + +## Need assistance? + +In case any of the services show an 'Inactive' status, you can contact our developers for assistance. Please reach out via [Discord](https://discord.com/invite/xCaY7hvNwD) or email us at **tech@origin-trail.com**. \ +Be sure to include the installation log file (`installation_process.log`), which can be found in the `/root/edge-node-installer/log/` directory. 
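The per-service `systemctl` commands above can be combined into a quick health check. A minimal sketch, assuming the five unit names listed in step 8 (adjust them if your deployment differs):

```shell
#!/usr/bin/env bash
# Print the state of each Edge Node service in one pass.
# Unit names are taken from the list in step 8; adjust if yours differ.
services=(
  edge-node-api.service
  auth-service.service
  ka-mining-api.service
  drag-api.service
  otnode.service
)

for unit in "${services[@]}"; do
  # "is-active" prints active / inactive / failed without the full status output
  state=$(systemctl is-active "$unit" 2>/dev/null) || state=unknown
  printf '%-25s %s\n' "$unit" "$state"
done
```

Running it periodically (for example via `watch`) gives a compact view of which services are active.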
+ diff --git a/docs/graveyard/everything/dkg-edge-node/deploy-your-edge-node-based-project/automated-deployment-with-installer.md b/docs/graveyard/everything/dkg-edge-node/deploy-your-edge-node-based-project/automated-deployment-with-installer.md new file mode 100644 index 0000000..012cec4 --- /dev/null +++ b/docs/graveyard/everything/dkg-edge-node/deploy-your-edge-node-based-project/automated-deployment-with-installer.md @@ -0,0 +1,335 @@ +--- +description: Deployment of the Edge node using the automated installer +icon: rectangle-terminal +--- + +# Automated deployment with installer + +### Edge Node deployment process overview + +Following the steps below will allow you to clone the automated installer, populate the environment file (`.env`), and then run the installer for a smooth setup of the DKG Edge Node on a Linux server. + +**These instructions cover the following:** + +1. Cloning the [installer repository](https://github.com/OriginTrail/edge-node-installer) +2. Populating the installer's `.env` file with: + * Your forked repository URLs + * Edge node names for each supported blockchain + * Edge Node wallet keys + * Publishing wallet keys + * OpenAI or Anthropic API key +3. Running the installer + +### System Requirements + +In order to successfully deploy your Edge node solution, you will need a **clean** Linux instance as described below: + +* **OS:** Linux +* **RAM:** At least 8 GB +* **CPU:** 4 cores +* **Storage:** At least 20 GB of available space +* **Network:** Stable internet connection + +{% hint style="info" %} +Please note that the automated installer currently supports Ubuntu versions 20.04, 22.04, and 24.04. +{% endhint %} + +## **Edge node** deployment preparation + +{% hint style="success" %} +With these requirements prepared, you'll be ready to quickly populate the `.env` file and proceed with running the installer. +{% endhint %} + +**1. 
Software Dependencies:** + +* [Git](https://git-scm.com/downloads) – Required for cloning the installer repository to the server.\ + + +**2. Edge Node keys (wallets):** + +The Edge Node requires three types of keys (wallets): + +* Management keys +* Operational keys +* Publishing keys + +**Before deploying to a server, ensure your custom Edge Node project has already been developed** and\ +that you are familiar with the wallet preparation process outlined in the [Funding Your Keys](../get-started-with-the-edge-node-boilerplate/automated-setup-with-the-installer.md#id-3.3-funding-your-keys) section of the [Get started with the Edge Node boilerplate](../get-started-with-the-edge-node-boilerplate/) project. + +{% hint style="info" %} +While the boilerplate offers helpful guidance, it is intended as a starting point for development, not a deployment-ready solution. +{% endhint %} + +{% hint style="success" %} +If you're deploying on **mainnet**, ensure the following: + +* You have access to TRAC tokens on the blockchains where you intend to deploy your Edge Node +* You understand how to bridge TRAC tokens to **Gnosis** or **Base**, or use **Teleport** to transfer them to **Neuroweb**. +* Your **operational** and **management** keys are properly funded with both TRAC and the native token, depending on the blockchain to which you are deploying your node. +{% endhint %} + +**Useful documentation:** + +* A list of exchanges where TRAC tokens can be acquired is available on our [official website](https://origintrail.io/get-started/trac-token). +* TRAC Teleport to Neuroweb is explained [here](../../teleport-instructions-neuroweb.md). +* Bridging TRAC to Base is explained [here](../../../../dkg-knowledge-hub/learn-more/connected-blockchains/base-blockchain/#bridging-trac-to-base). +* Bridging TRAC to Gnosis is explained [here](../../../../dkg-knowledge-hub/learn-more/connected-blockchains/gnosis-chain/#bridging-trac-to-gnosis). 
+* Neuroweb mainnet explorer is available [here](https://neuroweb.subscan.io/). +* Neuroweb testnet explorer is available [here](https://neuroweb-testnet.subscan.io/). +* Base Sepolia explorer is available [here](https://sepolia.basescan.org/). +* Base mainnet explorer is available [here](https://basescan.org/). +* Gnosis Chiado explorer is available [here](https://gnosis-chiado.blockscout.com/). +* Gnosis mainnet explorer is available [here](https://gnosisscan.io/). + +**3. LLM functionality:** + +Edge Node components require a large language model (LLM) to operate. To ensure full functionality, you must provide at least one of the supported external API keys below: + +* [**OpenAI**](https://platform.openai.com/api-keys) **API Key** – Used to access OpenAI’s GPT models.\ + Example: `sk-4hsR2exampleXv29NSHtDyzt0NlFgk1LXj6zS7...` +* [**Anthropic**](https://console.anthropic.com/account/keys) **API Key** – Used to access Anthropic’s Claude models.\ + Example: `sk-ant-api03-56a8e5eexamplef27b5b6e47b1...` + +{% hint style="success" %} +🔐 Ensure that the API keys have the necessary usage quotas and permissions based on your expected workload. Always keep them secure and never share them publicly. +{% endhint %} + +## Deployment guide (step by step) + +With all prerequisites in place, you can proceed to clone the installer repository, prepare the `.env` file with the required parameters, and deploy your Edge Node solution by following the steps below. + +### 1. Cloning the installer + +Once all requirements are prepared, SSH into your Linux server and clone the installer repository using the command provided below: + +```sh +git clone https://github.com/OriginTrail/edge-node-installer.git && cd edge-node-installer +``` + +#### 1.1 - Create .env file + +Based on the `.env.example` file, create your `.env` file, which will be used by the automated installer for the deployment of your Edge Node. + +```bash +cp .env.example .env +``` + +### **2. 
Configure your Edge Node** + +You are now ready to populate the `.env` file and configure the installer with the necessary parameters. + +{% hint style="success" %} +The `.env` file contains an explanation of each parameter to help you understand the configuration. +{% endhint %} + +**2.1 - Deployment mode** + +This field determines how the Edge Node services are handled post-installation. You can choose between two options: + +* **`development`** – Installs all required services as **`systemd` units**, but does **not** enable or start them automatically. +* **`production`** – Installs the services as **`systemd` units**, and automatically enables and starts them once the installation is complete. This is the recommended option for live environments where the node should be operational immediately. + +**2.2 - Blockchain environment** + +Choose between **`mainnet`** or **`testnet`**. + +**2.3 - Default publish blockchain** + +Specify the blockchain that will be used for publishing Knowledge Assets. Enter one of the currently supported options: `neuroweb`, `base`, or `gnosis`. + +#### **2.4 - Publishing Keys (wallets)** + +You must provide public and private key pairs for up to three wallets that will be used for publishing Knowledge Assets to the selected blockchain (`DEFAULT_PUBLISH_BLOCKCHAIN`). The keys should be funded with both the **TRAC token** and the **native token** of the blockchain set via the `DEFAULT_PUBLISH_BLOCKCHAIN` parameter in `.env`. + +* **`PUBLISH_WALLET_01_PUBLIC_KEY`**: Public key for the first publishing wallet. **\[mandatory]** +* **`PUBLISH_WALLET_01_PRIVATE_KEY`**: Private key for the first publishing wallet. **\[mandatory]** +* **`PUBLISH_WALLET_02_PUBLIC_KEY`**: Public key for the second publishing wallet. +* **`PUBLISH_WALLET_02_PRIVATE_KEY`**: Private key for the second publishing wallet. +* **`PUBLISH_WALLET_03_PUBLIC_KEY`**: Public key for the third publishing wallet. 
+* **`PUBLISH_WALLET_03_PRIVATE_KEY`**: Private key for the third publishing wallet. + +{% hint style="success" %} +If you're deploying on **mainnet**, ensure the following: + +* `BLOCKCHAIN_ENVIRONMENT` parameter in the `.env` is changed to `mainnet` +* Ensure that you've bridged TRAC to Base or Gnosis, or used TRAC Teleport if Neuroweb is your blockchain of choice +{% endhint %} + +#### **2.5 - Node Engine configuration (ot-node service)** + +{% hint style="warning" %} +At least one blockchain configuration is required for the successful deployment of the Edge Node.\ +Ensure your Edge Node creates its profile on the blockchain set in `DEFAULT_PUBLISH_BLOCKCHAIN` (**neuroweb** by default)**.**\ +You can choose between `neuroweb`, `base` and `gnosis` blockchains. +{% endhint %} + +**Neuroweb:** + +* **`NEUROWEB_NODE_NAME`**: Name of your node on Neuroweb. +* **`NEUROWEB_OPERATOR_FEE`**: The operator fee for your Neuroweb node. Set to `0` if no fee is required. +* **`NEUROWEB_MANAGEMENT_KEY_PUBLIC_ADDRESS`**: Public address of your Neuroweb management key. +* **`NEUROWEB_OPERATIONAL_KEY_PUBLIC_ADDRESS`**: Public address of your Neuroweb operational key. +* **`NEUROWEB_OPERATIONAL_KEY_PRIVATE_ADDRESS`**: Private address of your Neuroweb operational key. + +**Base:** + +* **`BASE_NODE_NAME`**: Name of your node on Base. +* **`BASE_OPERATOR_FEE`**: The operator fee for your Base node. +* **`BASE_MANAGEMENT_KEY_PUBLIC_ADDRESS`**: Public address of your Base management key. +* **`BASE_OPERATIONAL_KEY_PUBLIC_ADDRESS`**: Public address of your Base operational key. +* **`BASE_OPERATIONAL_KEY_PRIVATE_ADDRESS`**: Private address of your Base operational key. + +**Gnosis:** + +* **`GNOSIS_NODE_NAME`**: Name of your node on Gnosis. +* **`GNOSIS_OPERATOR_FEE`**: The operator fee for your Gnosis node. Set to `0` if no fee is required. +* **`GNOSIS_MANAGEMENT_KEY_PUBLIC_ADDRESS`**: Public address of your Gnosis management key. 
+* **`GNOSIS_OPERATIONAL_KEY_PUBLIC_ADDRESS`**: Public address of your Gnosis operational key. +* **`GNOSIS_OPERATIONAL_KEY_PRIVATE_ADDRESS`**: Private address of your Gnosis operational key. + +{% hint style="info" %} +- Keys (wallets) provided in this section **will not** be used for publishing operations. +- Parameters for at least one blockchain are required for the node to be deployed successfully. +- If the operational keys are not funded with the native token on the blockchain of choice, the Node Engine will fail to create its blockchain profile. +{% endhint %} + +{% hint style="warning" %} +The management and operational keys require a small amount of the native token (NEURO for Neuroweb, ETH for Base, and xDAI for Gnosis), depending on the blockchain you prepare them for. +{% endhint %} + +**2.6 - MySQL password** + +During the deployment process, the password you provide via the `DB_PASSWORD` parameter will be set as the **root password** for the MySQL database. By default, the password is set to `otnodedb`, but you can change it to any value you like. + +{% hint style="danger" %} +The MySQL user is set to **root**, which is a requirement for the installation. Do not update the `DB_USERNAME` parameter inside the `.env`. +{% endhint %} + +{% hint style="info" %} +🔐 This root password is critical for accessing and managing your Edge node’s database. Do not share it or lose it. +{% endhint %} + +#### **2.7 - GitHub Credentials** + +To deploy private GitHub repositories, you will need to provide a valid GitHub personal access token (PAT). This token grants access to your private repositories, allowing deployment of your custom Edge Node services. + +* **`REPOSITORY_USER`**: Your GitHub username. +* **`REPOSITORY_AUTH`**: Your GitHub personal access token (a token with appropriate scopes such as `repo`). + +{% hint style="info" %} +The **token must be valid** and have the necessary **permissions** to access your private repositories. 
+ +Ensure the token includes the **`repo`** scope, which provides access to private repositories and other related features. + +Token example: `ghp_1234abcd5678efgh9012ijklmnop34567890` +{% endhint %} + +#### **2.8 - Edge node services (your forked repository URLs)** + +If you're deploying customized or forked versions of the Edge Node services, you can provide the URLs to your private repositories via the parameters presented below. The installer will use these links to clone and deploy the services. + +* **`EDGE_NODE_KNOWLEDGE_MINING_REPO`**: Link to your fork of the knowledge mining service repository. +* **`EDGE_NODE_DRAG_REPO`**: Link to your fork of the DRAG service repository. +* **`EDGE_NODE_API_REPO`**: Link to your fork of the Edge Node API repository. +* **`EDGE_NODE_UI_REPO`**: Link to your fork of the Edge Node user interface repository. +* **`EDGE_NODE_AUTH_SERVICE_REPO`**: Link to your fork of the authentication service repository. + +{% hint style="info" %} +🔐 Make sure your GitHub access token has the `repo` scope and is valid for cloning private repositories. Use the same token and username in the parameters provided above. +{% endhint %} + +{% hint style="success" %} +If none of the URLs are defined, the installer will use the boilerplate Edge node services. +{% endhint %} + +#### **2.9 - OpenAI or Anthropic API keys** + +Edge Node components require a large language model (LLM) to operate. 
To ensure full functionality, you must provide at least one of the supported external API keys below: + +* [**OpenAI**](https://platform.openai.com/api-keys) **API Key** (`OPENAI_API_KEY`) – Used to access OpenAI’s GPT models.\ + Example: `sk-4hsR2exampleXv29NSHtDyzt0NlFgk1LXj6zS7` +* [**Anthropic**](https://console.anthropic.com/account/keys) **API Key** (`ANTHROPIC_API_KEY`) – Used to access Anthropic’s Claude models.\ + Example: `sk-ant-api03-56a8e5eexamplef27b5b6e47b1` + +{% hint style="success" %} +🔐 Ensure that the API keys have the necessary usage quotas and permissions based on your expected workload. Always keep them secure and never share them publicly. +{% endhint %} + +### **\[OPTIONAL] Additional Edge node features** + +The Edge Node supports additional external services and tools that can be configured via the `.env` file before running the installer. Below is a list of supported services represented as configuration parameters: + +* **Unstructured API key** (`UNSTRUCTURED_API_URL`) – Enables parsing PDF documents and can be obtained from [https://unstructured.io/api-key-free](https://unstructured.io/api-key-free) + +### 3. Deployment + +Once all configuration parameters have been correctly populated in the `.env` file, you can proceed with deploying your Edge Node solution by running the edge-node-installer with the command below: + +```sh +bash edge-node-installer.sh +``` + +{% hint style="warning" %} +The installation process may take approximately 20–30 minutes, so feel free to grab a coffee, and be sure not to interrupt the process during this time. +{% endhint %} + +### 4. Monitor the deployment process + +To monitor the installation progress, SSH into your server in a new terminal tab, `cd` into the **edge-node-installer** directory, and run the following command: + +```shell +bash service-status.sh +``` + +This script will display the installation progress and current status of each Edge Node service. 
+ +While services are still being installed, a loading spinner will be shown for each one (as shown in the image below). + +

Installation in progress

+ +Once all services are marked as **"Active"**, your Edge Node is fully operational and ready to use. + +

Edge node deployment finalized

+ +{% hint style="warning" %} +⚠️ To prevent potential conflicts or interruptions, avoid performing other actions, such as installing additional packages or modifying configurations on the server, during the Edge Node installation process. +{% endhint %} + +### 5. Access your Edge node interface + +Upon successful installation, the Edge Node user interface will be accessible at `http://<server-ip>/login`. Replace `<server-ip>` with the actual public IP address of your deployed server. + +**Default login credentials:** + +* Username: my\_edge\_node +* Password: edge\_node\_pass + +### 6. Control and Manage Edge Node Services + +All Edge Node services are deployed as `systemd` units, making them easy to manage. Below is a list of the deployed services and how to manage them using `systemd`. + +#### List of Services: + +1. **Edge Node API** – `edge-node-api.service` +2. **Authentication Service** – `auth-service.service` +3. **Knowledge Mining API** – `ka-mining-api.service` +4. **DRAG API** – `drag-api.service` +5. **OTNode** – `otnode.service` + +#### Managing Services: + +For each service, you can perform the following actions using the respective systemd commands: + +* **Check service status:** `systemctl status <service-name>` +* **Start a service (if it is not already running):** `systemctl start <service-name>` +* **Stop a service:** `systemctl stop <service-name>` +* **Restart a service:** `systemctl restart <service-name>` +* **View service logs:** `journalctl -f -u <service-name>` + +### Need assistance? + +In case any of the services show an 'Inactive' status, you can contact our developers for assistance. Please reach out via [Discord](https://discord.com/invite/xCaY7hvNwD) or email us at **tech@origin-trail.com**. \ +Be sure to include the installation log file (`installation_process.log`), which can be found in the `/root/edge-node-installer/log/` directory. 
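When reaching out for support, it can help to bundle the installer log together with recent service logs in one archive. A minimal sketch, assuming the log path and unit names from this guide (the output file name is illustrative):

```shell
#!/usr/bin/env bash
# Bundle the installer log and recent service logs before contacting support.
# Paths and unit names follow this guide; adjust them if your setup differs.
set -u

log_dir=/root/edge-node-installer/log
work_dir=$(mktemp -d)
archive=edge-node-diagnostics.tar.gz

# Include the installer log if present
[ -f "$log_dir/installation_process.log" ] && cp "$log_dir/installation_process.log" "$work_dir/"

# Capture the last 200 journal lines per service (skipped if journalctl is unavailable)
for unit in edge-node-api auth-service ka-mining-api drag-api otnode; do
  journalctl -u "$unit" -n 200 --no-pager > "$work_dir/$unit.log" 2>/dev/null || true
done

tar -czf "$archive" -C "$work_dir" .
echo "Diagnostics written to $archive"
```

Attaching the resulting archive to your Discord or email report gives the team the same context the individual log files would.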
diff --git a/docs/graveyard/everything/dkg-edge-node/dkg-edge-node-api-documentation.md b/docs/graveyard/everything/dkg-edge-node/dkg-edge-node-api-documentation.md new file mode 100644 index 0000000..7ba5083 --- /dev/null +++ b/docs/graveyard/everything/dkg-edge-node/dkg-edge-node-api-documentation.md @@ -0,0 +1,4 @@ +# DKG Edge Node API documentation + +Coming soon ... + diff --git a/docs/graveyard/everything/dkg-edge-node/dkg-edge-node-architecture.md b/docs/graveyard/everything/dkg-edge-node/dkg-edge-node-architecture.md new file mode 100644 index 0000000..b9409b4 --- /dev/null +++ b/docs/graveyard/everything/dkg-edge-node/dkg-edge-node-architecture.md @@ -0,0 +1,12 @@ +# DKG Edge Node architecture + +The DKG Edge Node is intended for customisability and extensibility. Its official code is available as a project "boilerplate," and builders can extend it to fit their needs. + +The DKG Edge Node is based on a service-oriented architecture (SOA), providing multiple services designed for separation of concerns. See the block scheme below for a high-level overview of the architecture. + +
+ +The following table describes each of the services and links to their respective repositories: + +
| Service | Description | GitHub repo |
| --- | --- | --- |
| Knowledge Mining API | Performs knowledge mining via knowledge mining pipelines, taking in various input formats to ultimately produce serialized outputs (JSON-LD), intended to then be published via the Publishing Service | https://github.com/OriginTrail/edge-node-knowledge-mining |
| dRAG API | dRAG (Decentralized Retrieval-Augmented Generation) is a service for building decentralized RAGs using data from the Decentralized Knowledge Graph (DKG). It enables querying and processing data with tools like SPARQL, LLMs, and vector databases, delivering answers from decentralized, verifiable sources. | https://github.com/OriginTrail/edge-node-drag |
| Edge Node User Interface | The UI for accessing and utilizing all Edge Node functionalities. | https://github.com/OriginTrail/edge-node-interface |
| Edge Node API | The Edge Node backend serves as the orchestrator, coordinating operations and interactions between various services. | https://github.com/OriginTrail/edge-node-api |
| Knowledge Graph DB Instance | An instance of an RDF-enabled triple store, such as Amazon Neptune, Ontotext GraphDB, Blazegraph, Apache Jena, or others. | Respective project repos |
| LLM services | Large Language Model services used by the Edge Node and chosen by the developer, enabling use of either local or external LLM-based services (e.g. Ollama, OpenAI, Claude, etc.) | Respective project repos |
| Publishing Service | Responsible for creating Knowledge Assets on the DKG, intended to take inputs from the Knowledge Mining API | Part of Edge Node API repo |
| Authentication Service | Acts as the authentication hub for all Edge Node services and serves as the central source for global configuration and custom parameters. | https://github.com/OriginTrail/edge-node-authentication-service |
| Blockchain interface | The Edge Node communicates with the blockchain through an array of RPC endpoints configurable by the user. | No specific repo; please consult the documentation on setting up nodes and RPC endpoints |
| DKG V8 Network Engine (ot-node) | Network runtime service, responsible for communicating with other DKG nodes | https://github.com/OriginTrail/ot-node |
| Operational DBs | Various small databases used for operational purposes | Deployed with respective services |
+ diff --git a/docs/graveyard/everything/dkg-edge-node/dkg-edge-node-inception-program.md b/docs/graveyard/everything/dkg-edge-node/dkg-edge-node-inception-program.md new file mode 100644 index 0000000..7802a90 --- /dev/null +++ b/docs/graveyard/everything/dkg-edge-node/dkg-edge-node-inception-program.md @@ -0,0 +1,34 @@ +# DKG Edge Node inception program + +
Are you building a next-generation artificial intelligence product and are excited about hybrid, neuro-symbolic AI?

The **DKG Edge Node inception program** is designed for builders of neuro-symbolic, privacy-enabled AI on the OriginTrail Decentralized Knowledge Graph (DKG) with DKG Edge Nodes and paranets.

The inception program provides a pool of **750k TRAC tokens** for teams launching their paranets with DKG Edge Nodes on either the DKG V6 or V8 mainnet, with tokens exclusively used to publish Knowledge Assets to their paranets.

Ready to apply? Here's a clear roadmap to help you get started:

* **Step 1**: Explore and prototype locally
  * **Begin** by thoroughly **reading** the [DKG Edge Node documentation](dkg-edge-node-architecture.md), which will provide you with an understanding of key concepts such as Knowledge Assets, paranets, and the role of Edge Nodes within the DKG ecosystem.
  * **Install** the DKG Edge Node services on your local machine following [the local setup guide](get-started-with-the-edge-node-boilerplate/manual-setup.md).
  * **Experiment** with the node by publishing Knowledge Assets locally and trying out the different knowledge mining and dRAG pipelines.
* **Step 2:** Build your MVP on the testnet
  * **Define** a high-level concept for the AI service/application that you want to build a paranet around.
  * **Transition** from your local development to the DKG testnet by [deploying a testnet node](../dkg-core-node/run-a-v8-core-node-on-testnet/) and integrating it into your Edge Node environment.
  * **Build** an MVP of your paranet AI service(s); it should showcase how your team utilizes the DKG's capabilities to power the AI application.
  * **Share** the MVP with the OriginTrail community in the [Discord](https://discord.gg/xCaY7hvNwD) #builders-hub channel and get feedback.
+* **Step 3**: Launch on mainnet + * **Migrate** your paranet services from the DKG testnet to the DKG mainnet - this includes reconfiguring your Edge Node to operate on your mainnet node and creating a mainnet paranet. + * Optionally, **launch** your [Initial Paranet Offering (IPO)](../../../build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/initial-paranet-offerings-ipos/launching-your-ipo.md). + * **Continue building**, create Knowledge Assets, and scale your paranet services as the paranet operator! +* **Step 4 (optional):** Apply for the DKG Edge Node inception program + * **Apply** to the inception program [via this form](https://docs.google.com/forms/d/e/1FAIpQLSdOoyoBFC7oEftK67Sioo32Yf1YHHONME4_c8j-34IxwpBgHg/viewform). + * If accepted, **receive** mainnet TRAC tokens which are a reimbursement for the TRAC you spent for creating Knowledge Assets on the mainnet. + +Keep in mind that the applications require a demonstrable working product on the DKG mainnet. + +You can apply to the inception program after completing **step 3** [via this form](https://docs.google.com/forms/d/e/1FAIpQLSdOoyoBFC7oEftK67Sioo32Yf1YHHONME4_c8j-34IxwpBgHg/viewform). + +For any questions about the program join the discussion on [Discord.](https://discord.gg/xCaY7hvNwD) diff --git a/docs/graveyard/everything/dkg-edge-node/get-started-with-the-edge-node-boilerplate/README.md b/docs/graveyard/everything/dkg-edge-node/get-started-with-the-edge-node-boilerplate/README.md new file mode 100644 index 0000000..fa9c1d2 --- /dev/null +++ b/docs/graveyard/everything/dkg-edge-node/get-started-with-the-edge-node-boilerplate/README.md @@ -0,0 +1,18 @@ +--- +description: >- + Set up an Edge Node project to explore the node's capabilities and build your + own projects. As an open-source solution, the Edge Node offers complete + customisations to fit your needs. 
---

# Get started with the Edge Node boilerplate

{% hint style="info" %}
The core developers released the [**automated development environment setup**](automated-setup-with-the-installer.md) and would **love to hear your feedback** on the installation process. Join the discussion in the [Discord](https://discord.gg/xCaY7hvNwD) _#builders-hub_ channel!
{% endhint %}

This guide will walk you through the process of importing a JSON file (a movie record, in this example) and transforming it into a graph-compatible JSON-LD format using the Knowledge Mining service. Once processed, the data is published to the Decentralized Knowledge Graph (DKG) via the Edge Node API. Additionally, a simple dRAG pipeline is used to retrieve the movie data and answer basic queries about the film.

To set up the DKG Edge Node in your local environment, you can choose between the following two options:
* [Automated development environment setup](automated-setup-with-the-installer.md)
* [Manual development environment setup](manual-setup.md)
diff --git a/docs/graveyard/everything/dkg-edge-node/get-started-with-the-edge-node-boilerplate/automated-setup-with-the-installer.md b/docs/graveyard/everything/dkg-edge-node/get-started-with-the-edge-node-boilerplate/automated-setup-with-the-installer.md new file mode 100644 index 0000000..9d891aa --- /dev/null +++ b/docs/graveyard/everything/dkg-edge-node/get-started-with-the-edge-node-boilerplate/automated-setup-with-the-installer.md @@ -0,0 +1,194 @@

---
description: >-
  This page will guide you through the DKG Edge Node installation process for
  macOS and Linux
---

# Automated setup with the installer

## Supported operating systems

* macOS
* Ubuntu 22.04 and Ubuntu 24.04

{% hint style="success" %}
The Edge Node installer detects your operating system (macOS or Linux) automatically, so no OS-related configuration is required.
{% endhint %}

## 1. Pre-installation Requirements

Before getting started with the installation, there are a few important preparations to make. The process requires certain inputs such as wallet addresses and funds. The following sections will walk you through each of these prerequisites step by step.

#### 1.1. Software dependencies:

* [Git](https://git-scm.com/downloads)
* [Homebrew](https://brew.sh/) (macOS)

#### 1.2. Edge node keys (wallets):

The Edge Node uses three types of keys: a **management key**, an **operational key**, and a **publishing key**. The management and operational keys need a small amount of the utility token of the blockchain you will be using (NEURO, Base ETH, xDai) to sign transactions for certain operations, while the publishing key requires both the utility token and TRAC to perform publishing operations. To check your wallet funds on the Neuroweb testnet, use the [Subscan explorer](https://neuroweb-testnet.subscan.io/).

## 2.
Download the OriginTrail DKG Edge Node installer

Once you have your wallets funded and ready, you may proceed with cloning, configuring, and running the Edge Node installer by following the instructions below.

#### 2.1 - Clone the edge-node-installer repository using git:

```sh
git clone https://github.com/OriginTrail/edge-node-installer.git && cd edge-node-installer
```

## 3. Configure the environment (.env)

#### 3.1 - Create .env file using .env.example:

```bash
cp .env.example .env
```

The **.env.example** file serves as a template and is well documented; it should help you understand what each of the variables means.

#### **3.2 - Populate .env file:**

To simplify the development setup, we’ve provided the `env-setup.js` script. This script will automatically generate and populate all required `.env` parameters, including:

* Management keys
* Operational keys
* Publishing keys (for all supported chains)
* Node name (same across all chains)

{% hint style="success" %}
We assume that Node.js is already installed on your system to run the `env-setup.js` script.\
If Node.js is not installed or you're setting up on a fresh system, we recommend using NVM to install Node.js version 20 along with npm.
{% endhint %}

Once Node.js and npm are successfully installed, you can use the following command to populate your `.env` file:

```
npm install ethers && node env-setup.js
```

#### 3.3 Funding your keys:

Once the `.env` file is populated with the mandatory parameters, you will have to fund the keys (wallets) that were added to the .env file.

1. Open your `.env` file using `nano` or any text editor.
2. Locate the following wallet entries for each blockchain:
   * `OPERATIONAL_KEY_PUBLIC_ADDRESS`
   * `MANAGEMENT_KEY_PUBLIC_ADDRESS`
   * `PUBLISH_WALLET_01_PUBLIC_KEY`
3. Fund wallets for at least one blockchain in the `.env` file using our faucet.
Instructions on how to use the faucet can be found on the [Faucet documentation page](../../../../dkg-knowledge-hub/useful-resources/test-token-faucet.md). Our faucet can provide you with testnet tokens as follows:
   1. **Neuroweb:** TRAC and NEURO
   2. **Gnosis Chiado:** TRAC and xDai
   3. **Base Sepolia:** TRAC (ETH for Base Sepolia can be received via the faucets provided in the official Base [documentation](https://docs.base.org/chain/network-faucets))

{% hint style="info" %}
`OPERATIONAL_KEY_PUBLIC_ADDRESS` and `MANAGEMENT_KEY_PUBLIC_ADDRESS` require a small amount of the native token (e.g., NEURO, ETH, xDAI), while `PUBLISH_WALLET_01_PUBLIC_KEY` requires both the native token and TRAC in order to be able to publish Knowledge Assets.
{% endhint %}

{% hint style="warning" %}
It’s important to ensure that your node creates its profile **on the blockchain** specified in the `DEFAULT_PUBLISH_BLOCKCHAIN` parameter.

**For Neuroweb, first use the faucet to acquire NEURO in order to initiate your wallet, and then request TRAC; otherwise, the TRAC funding transaction will fail.**
{% endhint %}

#### 3.4 - Configure LLM functionality:

Edge Node components require a large language model (LLM) to operate. To ensure full functionality, you must provide at least one of the supported external API keys below:

* [**OpenAI**](https://platform.openai.com/api-keys) **API Key** – Used to access OpenAI’s GPT models.\
  Example: `sk-4hsR2exampleXv29NSHtDyzt0NlFgk1LXj6zS7`
* [**Anthropic**](https://console.anthropic.com/account/keys) **API Key** – Used to access Anthropic’s Claude models.\
  Example: `claude-56a8e5eexamplef27b5b6e47b1`

{% hint style="success" %}
🔐 Ensure that the API keys have the necessary usage quotas and permissions based on your expected workload. Always keep them secure and never share them publicly.
+{% endhint %} + +#### Optional .env parameters: + +The Edge Node supports additional external services and tools that can be configured via the `.env` file before running the installer. Below is a list of supported services represented as configuration parameters: + +```sh +# Unstructured.io - can be obtained from https://unstructured.io/api-key-free +# (used by default for parsing PDF documents) +UNSTRUCTURED_API_URL="" + +# HuggingFace - can be obtained from https://huggingface.co/settings/tokens +# (used if you use vector search in your dRAG pipeline) +HUGGINGFACE_API_KEY="" +``` + +## 4. Run installation: + +Once all the required parameters have been configured in the .env file, simply run the installer and wait for it to finish the Edge node setup. + +```bash +bash edge-node-installer.sh +``` + +Edge node installer will deploy each component within the **edge\_node** directory: drag-api, edge-node-auth-service, ka-mining-api, ot-node. + +Edge node services will be available and listening on the following local ports upon installation: + +* UI will be available on [http://localhost](http://localhost) +* 3001 - Authentication service +* 3002 - API (backend) +* 5002 - dRAG +* 5005 - Knowledge mining +* 8900 - OT Node + +\ +UI (Edge node interface) will be cloned and configured in the following paths: + +* MacOS - /opt/homebrew/etc/nginx/servers/edge-node-ui +* Linux - /var/www/edge-node-ui + +{% hint style="success" %} +The default login credentials for the Edge node UI are as follows: + +**username:** my\_edge\_node + +**password:** edge\_node\_pass +{% endhint %} + +## 5. Control and Manage Edge Node Services (Linux) + +Below is a list of the deployed services and how to manage them using `systemd`. \ +All Edge Node services are deployed as `systemd` units, making them easy to manage. + +#### List of Services: + +1. **Edge Node API -** `edge-node-api.service` +2. **Authentication Service** – `auth-service.service` +3. 
**Knowledge Mining API** – `ka-mining-api.service`
4. **DRAG API** – `drag-api.service`
5. **OTNode** – `otnode.service`

#### Managing Services:

For each service, you can perform the following actions using the respective systemd commands:

* **Check service status:** `systemctl status <service-name>`
* **Start a service (if it is not already running):** `systemctl start <service-name>`
* **Stop a service:** `systemctl stop <service-name>`
* **Restart a service:** `systemctl restart <service-name>`
* **View service logs:** `journalctl -f -u <service-name>`

## Exploring Edge Node Capabilities

Once the Edge Node is up and running in your local environment, you’re ready to explore its features and capabilities. To get started, refer to the [**Usage Example**](usage-example.md) section to learn how to interact with the node and understand what it can do.

## Need help?

If you encounter any issues during the installation process or have questions about any of the topics above, jump into our official [Discord](https://discord.gg/xCaY7hvNwD) and ask for assistance.

Follow our official channels for updates:

* [X](https://x.com/origin_trail)
* [Medium](https://medium.com/origintrail)
* [Telegram](https://t.me/origintrail)

diff --git a/docs/graveyard/everything/dkg-edge-node/get-started-with-the-edge-node-boilerplate/manual-setup.md b/docs/graveyard/everything/dkg-edge-node/get-started-with-the-edge-node-boilerplate/manual-setup.md new file mode 100644 index 0000000..ccb8d1f --- /dev/null +++ b/docs/graveyard/everything/dkg-edge-node/get-started-with-the-edge-node-boilerplate/manual-setup.md @@ -0,0 +1,470 @@

---
description: >-
  Setting up an Edge Node project to explore its capabilities and build your own
  projects. As an open-source solution, the Edge Node offers complete
  customization to fit your needs.
---

# Manual setup

To manually set up the DKG Edge Node in your local environment, you can choose between the following two options:

1.
**\[Recommended]** [**Set up local Edge Node services with a local DKG network**](manual-setup.md#setup-with-a-local-dkg-network) +2. [**Configure local Edge Node services with a pre-deployed V8 DKG Core Testnet Node**](manual-setup.md#setup-with-pre-deployed-v8-dkg-runtime-testnet-node) + +## Setup with a local DKG network \[Recommended] + +This option allows you to configure the Edge Node services within your local development environment, utilizing a locally deployed DKG network. This setup provides full control over the environment and is ideal for developing custom processing pipelines and testing in a self-contained network. + +### System Requirements + +* **Operating System:** macOS, Linux +* **RAM:** At least 8 GB +* **CPU:** 4 +* **Storage:** At least 20 GB of available space +* **Network:** Stable internet connection + +### Software Dependencies + +Make sure the following services are installed and properly configured: + +* **Git:** Version control system + * **Windows**: [Download Git for Windows](https://git-scm.com/download/win) + * **Linux**: [Install Git on Linux](https://git-scm.com/book/en/v2/Getting-Started-Installing-Git) (via a package manager such as`sudo apt install git` for Ubuntu) + * **macOS**: [Download Git for macOS](https://git-scm.com/download/mac) +* **MySQL 8:** Database service + * **Windows**: [Download MySQL Installer for Windows](https://dev.mysql.com/downloads/installer/) + * **Linux**: [Install MySQL on Linux](https://dev.mysql.com/doc/refman/8.0/en/linux-installation.html) (via `sudo apt install mysql-server` for Ubuntu) + * **macOS**: [Download MySQL for macOS](https://dev.mysql.com/downloads/mysql/) +* **Redis:** In-memory data structure store ([documentation](https://redis.io/docs/latest/operate/oss_and_stack/install/install-redis/)) + + * **Windows**: [Install Redis on Windows](https://github.com/microsoftarchive/redis/releases) + * **Linux**: + + ``` + sudo apt update + sudo apt install redis-server + ``` + + * **macOS**: 
  ```
  brew install redis
  ```

* **Node.js:** JavaScript runtime environment
  * **v20:** Used for the local network/ot-node setup
  * **v22.4.0:** Used for Edge Node services
* [**Java**](#user-content-fn-1)[^1] **v11:** for running the Blazegraph triple store
* **Python** v3.11.7

{% hint style="info" %}
We recommend using NVM and Pyenv in order to be able to switch Node.js versions easily and install Python.
{% endhint %}

## Setup process

### **1 - Setting up local DKG network:**

Edge Node services require a V8 DKG Runtime Node endpoint in order to be configured and initialized, so running a local DKG network is essential for this setup.

In order to deploy a local DKG network, do the following:

* Clone the OriginTrail node engine: `git clone https://github.com/OriginTrail/ot-node`
* Enter the **ot-node** directory and check out the v8/develop branch: `cd ot-node && git checkout v8/develop`
* Switch to Node.js v20 and install node modules: `npm install`
* Download blazegraph.jar (triple store DB) from the following [link](https://github.com/blazegraph/database/releases/tag/BLAZEGRAPH_2_1_6_RC) and save it to a location of your choice locally
* Run the blazegraph.jar file with the following command: `java -server -Xmx4g -jar blazegraph.jar` in order to start the triple store database (from the directory where Blazegraph is downloaded)
* Make sure that the MySQL service is running
* In the ot-node directory, create a .env file and populate it with the following parameters (if your MySQL is password protected, add the password to REPOSITORY\_PASSWORD):
```
NODE_ENV=development
RPC_ENDPOINT_BC1=http://127.0.0.1:8545
RPC_ENDPOINT_BC2=http://127.0.0.1:9545
REPOSITORY_PASSWORD=
```
* Start the local DKG network by executing the following command from the ot-node directory: `bash tools/local-network-setup/setup-macos-environment.sh --nodes=5`
* Once the network deployment is initiated, a local hardhat blockchain will be started, together with the number of local DKG nodes you defined via the **--nodes** flag

Once the network is up and running, select one of the local node endpoints to configure the Edge Node services against, as explained in the steps below.

### **2 - Cloning DKG Edge Node services:**

In order to kick off the installation process, you need to clone all Edge Node services to your local environment using the **`git clone`** command.

1. [Authentication service](https://github.com/OriginTrail/edge-node-authentication-service)
2. [Edge Node API](https://github.com/OriginTrail/edge-node-api)
3. [Edge Node interface](https://github.com/OriginTrail/edge-node-interface)
4. [Knowledge mining API](https://github.com/OriginTrail/edge-node-knowledge-mining)
5. [dRAG](https://github.com/OriginTrail/edge-node-drag)

### **3 - Configuring DKG Edge Node services**

The instructions for configuring DKG Edge Node services are also available in the README file of each service's GitHub repository, where you can follow the steps provided.

#### **3.1 Setup Edge Node Authentication Service:**

* Create the database **'edge-node-auth-service'**
* Create a .env file with `cp .env.example .env` in the service directory
+* Generate random strings for the following .env variables: + * **JWT\_SECRET** and + * **SECRET** (you can use `openssl rand -hex 64` for example) +* Install node modules with `npm install` (use Node.js **v22.4.0**) +* Setup your database with `npx sequelize-cli db:migrate` and `npx sequelize-cli db:seed:all` - This will generate a demo user, a wallet with funds for the local network, and configure your local Edge Node Authentication Service to connect to the first node from your local network. +* Run the following MySQL query on **edge-node-auth-service** database: [UserConfig.sql](https://github.com/OriginTrail/edge-node-authentication-service/blob/main/UserConfig.sql) + +{% hint style="info" %} +`seed` command will create an example user with the following credentials: + +**username** `admin` and password `admin123` +{% endhint %} + +* Initiate Edge Node Authentication service with: `npm run start` + +#### **3.2 - Configuring Edge Node API** **service:** + +1. Create .env file with `cp .env.example .env` in the service directory +2. Create the database mentioned in `.env` +3. Install node modules with `npm install` (use Node.js **v22.4.0**) +4. Execute migrations: `npx sequelize-cli db:migrate` +5. Setup Runtime node MySQL operational db connection by populating the following values in the .env file: + +```bash +RUNTIME_NODE_OPERATIONAL_DB_USERNAME=root +RUNTIME_NODE_OPERATIONAL_DB_PASSWORD= +RUNTIME_NODE_OPERATIONAL_DB_DATABASE=operationaldb0 +RUNTIME_NODE_OPERATIONAL_DB_HOST=127.0.0.1 +RUNTIME_NODE_OPERATIONAL_DB_DIALECT=mysql +``` + +{% hint style="info" %} +When your local DKG network is deployed, each of the local nodes will create its own operational database named **operationaldb0**, **operationaldb1**, **operationaldb2**, and so on, depending on the size of your local network. +{% endhint %} + +7. Initialize Redis and make sure that it's running on its default port 6379 +8. 
Start the service: `npm run start`

#### **3.3 - Configuring Edge Node UI:**

In order to set up the Edge Node UI locally, please run the following commands:

* Create .env file: `cp .env.example .env`
* Install ionic/cli: `npm install -g @ionic/cli`
* Install electron: `npm install -g electron`
* Install node modules: `npm install` (use Node.js **v22.4.0**)
* To run the application in your browser: `ionic serve`
* To run the application as a desktop app: `npm run start-electron`

Once the Edge Node UI is ready, the only accessible screen will be **/login**. Use the credentials created during the Auth service setup previously.

#### **3.4 - Configuring Edge Node Knowledge Mining:**

In order to set up the Edge Node Knowledge Mining service locally, please follow the steps below:

* Create a .env file with `cp .env.example .env` in the service directory
* Add the absolute path to your DAGs folder into the `DAG_FOLDER_NAME` variable in the `.env` file
* Set up the Python environment: `pyenv local 3.11.7`

{% hint style="info" %}
It's recommended to use pyenv and install Python 3.11 locally within the app's directory to avoid conflicts with other Python versions on your machine.
{% endhint %}

* Set up a virtual environment to install the required dependencies by running the following command: `python -m venv .venv && source .venv/bin/activate`
* Install Python requirements: `pip install -r requirements.txt`
* Set up the Apache Airflow service:

{% hint style="info" %}
Airflow pipelines are an integral part of the Knowledge Mining Service, used to create automated data processing workflows. The primary purpose of these pipelines is to generate content for Knowledge Assets based on input files.
{% endhint %}

* Generate the default Airflow config: `airflow config list --defaults`
* Open the Airflow configuration file located at `~/airflow/airflow.cfg` and update the following parameters as presented below:

```bash
load_examples = False
dags_folder = YOUR_PATH_TO/edge-node-knowledge-mining/dags
parallelism = 32
max_active_tasks_per_dag = 16
max_active_runs_per_dag = 16
enable_xcom_pickling = True
```

* Initialize the Airflow database and create an admin user with:

```bash
airflow db init
airflow users create --role Admin --username admin --email admin --firstname admin --lastname admin --password admin
```

* Initiate the Airflow scheduler: `airflow scheduler`
* Pick up new jobs and start them:

```bash
airflow dags unpause exampleDAG
airflow dags unpause pdf_to_jsonld
airflow dags unpause simple_json_to_jsonld
```

* Initiate the Airflow webserver: `airflow webserver --port 8080`
* Once the Airflow webserver is initiated, your pipelines should be available on [http://localhost:8080/home](http://localhost:8080/home)
* Initiate Edge Node Knowledge Mining: `python app.py`

{% hint style="info" %}
`airflow scheduler`, `airflow webserver`, and `python app.py` should run in parallel inside the virtual environment (use separate terminal windows to run them).
{% endhint %}

* Create the MySQL logging database: ``CREATE DATABASE `ka-mining-api-logging` CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;``

The Edge Node Knowledge Mining README is available [here](https://github.com/OriginTrail/edge-node-knowledge-mining).
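To build intuition for what the `simple_json_to_jsonld` pipeline aims at, here is an illustrative shell sketch (not the pipeline's actual code or output; the movie fields are made up) that wraps a flat movie record into a minimal schema.org JSON-LD document of the kind that can then be published via the Edge Node API:

```shell
#!/usr/bin/env bash
# Illustrative only: write a minimal schema.org JSON-LD "Movie" document,
# mimicking the target shape of the simple_json_to_jsonld pipeline.
cat > /tmp/movie.jsonld <<'EOF'
{
  "@context": "https://schema.org",
  "@type": "Movie",
  "name": "Inception",
  "director": { "@type": "Person", "name": "Christopher Nolan" },
  "datePublished": "2010"
}
EOF
echo "Wrote /tmp/movie.jsonld"
```

The `@context` and `@type` keys are what make the output graph-compatible; the actual pipelines derive these from the input file rather than hardcoding them.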
+ +#### **3.5 - Configuring Edge Node dRAG:** + +In order to set up the Edge Node dRAG locally, please follow the steps below: + +* Install node modules: `npm install` (use Node.js **v22.4.0**) +* Create .env file: `cp .env.example .env` +* Add your LLM\_API\_KEY into the .env (dRAG uses LLM to formulate the answer) +* Create a database for dRAG logging: `CREATE DATABASE drag_logging;` +* Run migrations: `npx sequelize-cli db:migrate` +* Initiate the service: `npm run start` + +(Optional) If you want to use the vectorization controller, you have to set up the following services: HuggingFace, Zilliz, and Cohere. If you want to customize the experience further, you can also modify the code to use any other service for embedding/vector search and reranking. + +* HuggingFace - used for vectorization embedding model [https://huggingface.co/](https://huggingface.co/) +* Zilliz - used for hosting the vector database [https://cloud.zilliz.com/](https://cloud.zilliz.com/) +* Cohere ReRanker - used for improving retrieval results accuracy [https://dashboard.cohere.com/](https://dashboard.cohere.com/) + + + +Once you've finalized configuring all DKG Edge Node services, please make sure that they are exposed on the following ports: + +* **Edge Node Authentication Service:** [http://localhost:3001](http://localhost:3001) +* **Edge Node API:** [http://localhost:3002](http://localhost:3002) +* **Edge Node UI:** [http://localhost:5173](http://localhost:5173) +* **Edge Node Knowledge Mining:** [http://localhost:5005](http://localhost:5005) +* **Edge Node dRAG:** [http://localhost:5002](http://localhost:5002) + + + +## Setup with pre-deployed V8 DKG Runtime Testnet Node + +{% hint style="warning" %} +Prior to proceeding with this setup option, it is essential to have a V8 DKG Core Node operational on the testnet, as this is a critical requirement. 
If you have not yet deployed a Testnet Core Node, please consult the installation guide available on the following [page](https://docs.origintrail.io/dkg-v8-upcoming-version/run-a-v8-core-node-on-testnet) for detailed instructions. +{% endhint %} + +### System Requirements + +* **Operating System:** macOS, Linux, Windows with WSL +* **RAM:** At least 8 GB +* **CPU:** 4 +* **Storage:** At least 20 GB available space +* **Network:** Stable internet connection + +### Software Dependencies + +Make sure the following services are installed and properly configured: + +* **Git:** Version control system +* **MySQL 8:** Database service +* **Redis:** In-memory data structure store ([documentation](https://redis.io/docs/latest/operate/oss_and_stack/install/install-redis/)) +* **Node.js v22.4.0:** JavaScript runtime environment +* **Python** v3.11.7 + +{% hint style="info" %} +We recommend using NVM and Pyenv in order to be able to switch Node.js versions easily and install Python. +{% endhint %} + +## **Setup process:** + +### **1 - Cloning DKG Edge Node services:** + +In order to kick off the installation process, you need to clone all Edge Node services to your local environment using **`git clone`** command. + +1. [Authentication service](https://github.com/OriginTrail/edge-node-authentication-service) +2. [Edge Node API](https://github.com/OriginTrail/edge-node-api) +3. [Edge Node interface](https://github.com/OriginTrail/edge-node-interface) +4. [Knowledge mining API](https://github.com/OriginTrail/edge-node-knowledge-mining) +5. 
[dRAG](https://github.com/OriginTrail/edge-node-drag)

### **2 - Whitelisting your local IP on the pre-deployed V8 DKG Runtime Node:**

* SSH to the server where you have your V8 DKG Core Node up and running
* Edit the .origintrail\_noderc configuration file with `nano` or any other editor
* Locate the `auth` section in the configuration file and add your local IP as presented below:

```json
 "auth": {
     "ipWhitelist": [
         "::1",
         "127.0.0.1",
         "<your-local-ip>"
     ]
 }
```

* Restart your node with the **`otnode-restart`** command for the configuration changes to be applied

### **3 - Configuring DKG Edge Node services**

The instructions for configuring DKG Edge Node services are also available in the README file of each service's GitHub repository, where you can follow the steps provided.

#### **3.1 Setup Edge Node Authentication Service:**

* Create the database **'edge-node-auth-service'**
* Create a .env file with `cp .env.example .env` in the service directory
* Generate random strings for the following .env variables:
  * **JWT\_SECRET** and
  * **SECRET** (you can use `openssl rand -hex 64`, for example)
* Install node modules with `npm install` (use Node.js **v22.4.0**)
* Set up your database with `npx sequelize-cli db:migrate` and `npx sequelize-cli db:seed:all` - This will generate a demo user, a wallet with funds for the local network, and configure your local Edge Node Authentication Service to connect to the first node from your local network.
+ +{% hint style="info" %} +The `seed` command will create an example user with the following credentials: + +**username:** `admin` and **password:** `admin123` +{% endhint %} + +* Run the following MySQL query on the **edge-node-auth-service** database: [UserConfig.sql](https://github.com/OriginTrail/edge-node-authentication-service/blob/main/UserConfig.sql) +* Since the node is by default configured to automatically work with your local network, the following variables inside the **UserConfigs** table (database: edge-node-auth-service) should be updated to match your pre-deployed V8 DKG Runtime Node information: + * "**run\_time\_node\_endpoint**": http://\ + * "**run\_time\_node\_port**": 8900 + * "**edge\_node\_environment**": testnet + * "**blockchain**": base:84532 +* Replace the placeholder wallet - the "**user\_wallets**" table will be populated with a pre-defined wallet address (used for the local network), which should be replaced by your Base Sepolia wallet holding funds (ETH and TRAC) + +{% hint style="info" %} +Instructions on how to use the TRAC faucet can be found [here](../../../../dkg-knowledge-hub/useful-resources/test-token-faucet.md). +{% endhint %} + +* Start the Edge Node Authentication Service with: `npm run start` + +#### **3.2 - Configuring Edge Node API service:** + +1. Create the .env file with `cp .env.example .env` in the service directory +2. Create the database mentioned in `.env` +3. Install node modules with `npm install` (use Node.js **v23**) +4. Execute migrations: `npx sequelize-cli db:migrate` +5. 
Expose the V8 DKG Core Node operational database (MySQL) to the local Edge Node API service: + * SSH to your Runtime Node server + * Expose port 3306 on your server to your local IP address (firewall configuration) + * Enable MySQL remote connections by changing the bind-address from **127.0.0.1** to **0.0.0.0** in `/etc/mysql/mysql.conf.d/mysqld.cnf` + * Restart mysql.service: `systemctl restart mysql.service` + * Create a new MySQL user for the Edge Node API service to use: + * `mysql -u root -p` + * When asked for the password, use the password you created during the node setup process + * Create a user for remote access: CREATE USER '**username**'@'%' IDENTIFIED BY '**your\_password**'; + * GRANT ALL PRIVILEGES ON \*.\* TO '**username**'@'%' WITH GRANT OPTION; + * FLUSH PRIVILEGES; + * Set up the Runtime Node MySQL operational DB connection by populating the following values in the .env file: + + ```bash + RUNTIME_NODE_OPERATIONAL_DB_USERNAME=root + RUNTIME_NODE_OPERATIONAL_DB_PASSWORD= + RUNTIME_NODE_OPERATIONAL_DB_DATABASE=operationaldb + RUNTIME_NODE_OPERATIONAL_DB_HOST= + RUNTIME_NODE_OPERATIONAL_DB_DIALECT=mysql + ``` +6. Initialize Redis and make sure that it's running on its default port **6379** +7. Start the service: `npm run start` + +#### **3.3 - Configuring Edge Node UI:** + +In order to set up the Edge Node UI locally, please run the following commands: + +* Create the .env file: `cp .env.example .env` +* Install ionic/cli: `npm install -g @ionic/cli` +* Install electron: `npm install -g electron` +* Install node modules: `npm install` (use Node.js **v23**) +* To run the application in your browser: `ionic serve` +* To run the application as a desktop app: `npm run start-electron` + +Once the Edge Node UI is ready, the only accessible screen will be **/login**. Use the credentials created earlier during the Auth Service setup. 
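The remote-access user creation from step 3.2 above can be gathered into a single snippet; a sketch, with `edge_api` and `your_password` as placeholder credentials to substitute with your own:

```shell
# SQL statements that create a dedicated remote-access MySQL user for the
# Edge Node API service. 'edge_api' and 'your_password' are placeholders.
SQL_STMTS="CREATE USER 'edge_api'@'%' IDENTIFIED BY 'your_password';
GRANT ALL PRIVILEGES ON *.* TO 'edge_api'@'%' WITH GRANT OPTION;
FLUSH PRIVILEGES;"
# On the Runtime Node server, pipe the statements into the MySQL client:
# printf '%s\n' "$SQL_STMTS" | mysql -u root -p
printf '%s\n' "$SQL_STMTS"
```

Granting `ALL PRIVILEGES ON *.*` mirrors the instructions above; if you prefer a tighter setup, you can scope the grant down to just the `operationaldb` database.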
+ + + +#### **3.4 - Configuring Edge Node Knowledge Mining:** + +In order to set up the Edge Node Knowledge Mining service locally, please follow the steps below: + +* Create the .env file with `cp .env.example .env` in the service directory +* Add the absolute path to your DAGs folder into the `DAG_FOLDER_NAME` variable in the `.env` file +* Set up the Python environment: `pyenv local 3.11.7` + +{% hint style="info" %} +It's recommended to use pyenv and install Python 3.11 locally within the app's directory to avoid conflicts with other Python versions on your machine. +{% endhint %} + +* Set up a virtual environment to install the required dependencies into. You can do this by running: `python -m venv .venv && source .venv/bin/activate` +* Install Python requirements: `pip install -r requirements.txt` +* Set up the Apache Airflow service: + +{% hint style="info" %} +Airflow pipelines are an integral part of the Knowledge Mining service, used to create automated data processing workflows. The primary purpose of these pipelines is to generate content for Knowledge Assets based on input files. 
+{% endhint %} + +* Generate the default Airflow config: `airflow config list --defaults` +* Open the Airflow configuration file located at **\~/airflow/airflow.cfg** and update the following parameters as presented below: + +```bash +load_examples = False +dags_folder = YOUR_PATH_TO/edge-node-knowledge-mining/dags +parallelism = 32 +max_active_tasks_per_dag = 16 +max_active_runs_per_dag = 16 +enable_xcom_pickling = True +``` + +* Initialize the Airflow database and create an admin user with: + +```bash +airflow db init +airflow users create --role Admin --username admin --email admin --firstname admin --lastname admin --password admin +``` + +* Start the Airflow scheduler: `airflow scheduler` +* Unpause the DAGs so the scheduler can pick up and start the jobs: + +``` +airflow dags unpause exampleDAG +airflow dags unpause pdf_to_jsonld +airflow dags unpause simple_json_to_jsonld +``` + +* Start the Airflow webserver: `airflow webserver --port 8080` +* Once the Airflow webserver is running, your pipelines should be available at [http://localhost:8080/home](http://localhost:8080/home) +* Start the Edge Node Knowledge Mining service: `python app.py` + +{% hint style="info" %} +`airflow scheduler`, `airflow webserver`, and `python app.py` should run in parallel inside the virtual environment (use separate terminal windows to run them). 
+{% endhint %} + +* Create the MySQL logging database: ``CREATE DATABASE `ka-mining-api-logging` CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;`` + +#### **3.5 - Configuring Edge Node dRAG:** + +In order to set up Edge Node dRAG locally, please follow the steps below: + +* Install node modules: `npm install` (use Node.js **v22.4.0**) +* Create the .env file: `cp .env.example .env` +* Add your LLM\_API\_KEY to the .env file (dRAG uses an LLM to formulate answers) +* Create a database for dRAG logging: `CREATE DATABASE drag_logging;` +* Run migrations: `npx sequelize-cli db:migrate` +* Start the service: `npm run start` + +**(Optional)** If you want to use the vectorization controller, you have to set up the following services: HuggingFace, Zilliz, and Cohere. If you want to customize the experience further, you can also modify the code to use any other service for embedding/vector search and reranking. + +* HuggingFace - used for the vectorization embedding model: [https://huggingface.co/](https://huggingface.co/) +* Zilliz - used for hosting the vector database: [https://cloud.zilliz.com/](https://cloud.zilliz.com/) +* Cohere ReRanker - used for improving retrieval results accuracy: [https://dashboard.cohere.com/](https://dashboard.cohere.com/) + + + +Once you've finished configuring all DKG Edge Node services, please make sure that they are exposed on the following ports: + +* **Edge Node Authentication Service:** [http://localhost:3001](http://localhost:3001) +* **Edge Node API:** [http://localhost:3002](http://localhost:3002) +* **Edge Node UI:** [http://localhost:5173](http://localhost:5173) +* **Edge Node Knowledge Mining:** [http://localhost:5005](http://localhost:5005) +* **Edge Node dRAG:** [http://localhost:5002](http://localhost:5002) + +[^1]: diff --git a/docs/graveyard/everything/dkg-edge-node/get-started-with-the-edge-node-boilerplate/usage-example.md b/docs/graveyard/everything/dkg-edge-node/get-started-with-the-edge-node-boilerplate/usage-example.md new file mode 
100644 index 0000000..f810c79 --- /dev/null +++ b/docs/graveyard/everything/dkg-edge-node/get-started-with-the-edge-node-boilerplate/usage-example.md @@ -0,0 +1,118 @@ +--- +description: >- + You can find a simple example of the DKG Edge Node usage below to help you get + started. +--- + +# Usage example + +This page demonstrates a **simple end-to-end example** of how you can process any kind of JSON data using the **example pipeline provided in the Knowledge Mining service**, and then **ask questions about that data** through the **AI Assistant** interface. The AI Assistant communicates directly with the **DRAG** (Decentralized Retrieval-Augmented Generation) module, for which we’ve also prepared a basic example to showcase its functionality. + +> **Note:** This is a simplified example meant to demonstrate the concept and basic flow. + +## Purpose of the Example + +The goal of this example is to provide a quick and intuitive way to explore the potential of the **DKG Edge Node** through the UI. It’s designed to help you understand: + +* How input data (in standard JSON format) can be processed through a Knowledge Mining pipeline to generate structured graph data in JSON-LD or N-Quads format. +* How this graph-structured output is published to the OriginTrail Decentralized Knowledge Graph (DKG) as a Knowledge Asset – making the data verifiable, discoverable, and ensuring data integrity and ownership. +* How AI-powered question answering is performed on top of these Knowledge Assets through the DRAG service, which builds RAG (Retrieval-Augmented Generation) systems on decentralized data using powerful LLMs. + +## Focused on UI + +To keep things as accessible as possible, this example focuses purely on **UI-based usage**. If you're looking to explore **programmatic interaction** with the services (e.g., building custom pipelines, publishing data, or querying via APIs), please refer to the dedicated **API Documentation** section. 
+ +#### Note on Simplicity + +This example is intentionally basic and does **not represent a production-ready pipeline**. Instead, it serves to highlight what’s possible when using: + +* Knowledge Mining Pipelines – for transforming raw inputs into semantically rich formats (e.g., JSON-LD). +* Edge Node API – for publishing structured data to the Decentralized Knowledge Graph (DKG). +* Edge Node DRAG – for creating RAG (Retrieval-Augmented Generation) applications based on verifiable, decentralized data. + +The main idea is that users can **build their own Knowledge Mining workflows** tailored to their domain-specific data and leverage the DKG infrastructure for secure, decentralized, and intelligent knowledge applications. + +## Interacting with the Web UI + +You can access the user interface at _http://your-nodes-ip-address_. + +### Logging in + +When accessing your node endpoint, you will be redirected to the login page. + +{% hint style="info" %} +The default login credentials are as follows: + +**username:** my\_edge\_node + +**password:** edge\_node\_pass +{% endhint %} + +### Publishing a Knowledge Asset + +We have prepared a simple example that processes JSON files representing movies. + +Example JSON for a movie: + +```json +{ + "title": "The Terminator", + "release_year": 1984, + "genre": [ + "Action", + "Sci-Fi", + "Thriller" + ], + "director": "James Cameron", + "writers": [ + "James Cameron", + "Gale Anne Hurd", + "William Wisher" + ], + "cast": [ + { + "actor": "Arnold Schwarzenegger", + "role": "The Terminator" + }, + { + "actor": "Linda Hamilton", + "role": "Sarah Connor" + }, + { + "actor": "Michael Biehn", + "role": "Kyle Reese" + }, + { + "actor": "Paul Winfield", + "role": "Lieutenant Ed Traxler" + } + ], + "plot": "A cyborg assassin, known as a Terminator, is sent back from the future to kill Sarah Connor, a woman whose unborn son will lead humanity in a war against machines. 
A soldier from the future is sent back to protect her.", + "runtime": "107 minutes", + "language": "English", + "country": "USA", + "imdb_rating": 8.1, + "imdb_link": "https://www.imdb.com/title/tt0088247/" +} +``` + +{% hint style="info" %} +You can create any kind of JSON representing a book, movie, or music album, following the example above. +{% endhint %} + +Go to _http://your-nodes-ip-address/contribute_ and upload your JSON. + +{% hint style="info" %} +If you are using WSL, the node's IP address will be the IP address of the virtual machine. +{% endhint %} + +The Edge Node will: + +1. Process your input file and create a JSON-LD document out of it. +2. Publish the Knowledge Asset on the Decentralized Knowledge Graph (DKG) from that JSON-LD. + +### Passing a query + +After successfully creating the Knowledge Asset, go to _http://your-nodes-ip-address/ai-assistant_ and ask the AI Assistant a question about the movie (or other work) you put in the JSON to see how it works. + +> If you don’t get a good answer right away, feel free to tweak the [JSON Knowledge pipeline](https://github.com/OriginTrail/edge-node-knowledge-mining/blob/main/dags/simple_json_to_jsonld.py) — this is just an **example pipeline**, and it may not work perfectly with every type of JSON structure out of the box. diff --git a/docs/graveyard/everything/dkg-edge-node/knowledge-mining-and-drag-examples.md b/docs/graveyard/everything/dkg-edge-node/knowledge-mining-and-drag-examples.md new file mode 100644 index 0000000..fb0b079 --- /dev/null +++ b/docs/graveyard/everything/dkg-edge-node/knowledge-mining-and-drag-examples.md @@ -0,0 +1,15 @@ +# Knowledge Mining and dRAG examples + +This page features real-world open-source examples of various dRAG and Knowledge Mining implementations. You can replace the dRAG and Knowledge Mining repositories in your Edge Node setup with these examples if you want to try them out. 
+ +**DeSci Knowledge Mining:** [Github](https://github.com/brkagithub/desci-knowledge-mining) + +This repository contains a pipeline for converting scientific papers (PDFs) into JSON-LD format for ingestion into the OriginTrail Decentralized Knowledge Graph (DKG). The pipeline extracts structured knowledge from papers using NLP and LLMs, with a focus on neuroscience papers. + +**DeSci dRAG:** [Github](https://github.com/OriginTrail/desci-drag) + +This repository provides an API for natural-language querying of the Decentralized Science Knowledge Graph using a combination of LLM-powered SPARQL queries and vector similarity search. + + + +If you’d like to contribute your own example, reach out to us via our [Discord](https://discord.com/invite/xCaY7hvNwD). diff --git a/docs/graveyard/everything/dkg-v8-update-guidebook.md b/docs/graveyard/everything/dkg-v8-update-guidebook.md new file mode 100644 index 0000000..97e0143 --- /dev/null +++ b/docs/graveyard/everything/dkg-v8-update-guidebook.md @@ -0,0 +1,24 @@ +--- +hidden: true +icon: rocket-launch +cover: ../../.gitbook/assets/DKG V8 update guide book - gitbook cover.png +coverY: 0 +--- + +# DKG V8 update guidebook + +V8 is a major update that increases scalability up to 1000x, introduces DKG Edge Nodes to power privacy-preserving AI use, improves Knowledge Assets, and much more. + +This guide will introduce you to the DKG V8 update, what you can expect from the new features, the launch timeline, and most importantly, what you need to do if you are a builder, staker, or node operator. + +Check [this page](https://docs.origintrail.io/dkg-v8-upcoming-version/whats-new-with-origintrail-v8) to get acquainted with the most important new updates of the DKG V8. In this document, we will focus on the implementation details of the protocol. 
+ +**Learn more about** [**protocol updates here**](../../dkg-knowledge-hub/learn-more/previous-updates/dkg-v8.0-update-guidebook/protocol-updates.md) + +**Learn more about the upcoming** [**feature roadmap**](../../dkg-knowledge-hub/learn-more/previous-updates/dkg-v8.0-update-guidebook/feature-roadmap.md) **here** + +To understand what you need to **do to upgrade your applications or nodes**, head over to the "[How to upgrade to V8](../../dkg-knowledge-hub/learn-more/previous-updates/dkg-v8.0-update-guidebook/how-to-upgrade-to-v8.md)" page. + +If you have any questions or issues, please post them in the [Discord #v8-discussion](https://discord.gg/9WwMRhP9) channel. + +Trace On!\ diff --git a/docs/graveyard/everything/node-setup-instructions/README.md b/docs/graveyard/everything/node-setup-instructions/README.md new file mode 100644 index 0000000..eb63dd0 --- /dev/null +++ b/docs/graveyard/everything/node-setup-instructions/README.md @@ -0,0 +1,16 @@ +# Running DKG nodes + +The OriginTrail Decentralized Network is a permissionless system that hosts the Decentralized Knowledge Graph and is run by the OriginTrail community. **Anyone, anywhere can run an OriginTrail DKG node -** no permission is required, and you can even extend node functionality, as it is completely open-source software. + +There are two ways to run a DKG node: + +1. as a **gateway node**, which enables access to the DKG (publishing and querying knowledge assets) +2. as a **full (hosting) node**, which apart from providing access, also hosts the DKG state + +For a DKG node to be eligible to host a portion of the DKG and receive publishing fees (to be a hosting node), it has to accumulate a **minimum of 50,000 TRAC** tokens as stake in the network on a particular blockchain. Network stake provides the economic security layer and can be provided by anyone owning TRAC tokens through "delegated staking". 
+ +However, running a DKG node requires some degree of technical knowledge, so you are advised to carefully consider all of the relevant implications of doing so. In the following pages, you will find detailed instructions on how to set up a DKG node, and we encourage you to seek assistance if needed within the [OriginTrail community Discord](https://discord.com/invite/FCgYk2S) server. + +{% hint style="info" %} +These setup instructions are a work in progress and subject to change; the core developers aim to keep them up to date at all times. However, if you notice an issue during installation, please report it on our official [Github repo](https://github.com/OriginTrail/ot-node) or [Discord channel](https://discord.com/invite/FCgYk2S). +{% endhint %} diff --git a/docs/graveyard/everything/node-setup-instructions/dkg-node-installation.md b/docs/graveyard/everything/node-setup-instructions/dkg-node-installation.md new file mode 100644 index 0000000..7b21658 --- /dev/null +++ b/docs/graveyard/everything/node-setup-instructions/dkg-node-installation.md @@ -0,0 +1,81 @@ +# DKG node installation + +After preparing all the requirements for running your OriginTrail DKG node, which include wallets, tokens, and RPC endpoints, you are now ready to start the installer and launch your OriginTrail DKG node. + +At this stage, there are two approaches to deploying the OriginTrail DKG node: + +1. Downloading and running the installer (instructions provided below) +2. DigitalOcean 1-click deployment (available on DigitalOcean marketplace) + * Mainnet 1-click deployment: Currently unavailable (Coming soon) + * Testnet 1-click deployment: Currently unavailable (Coming soon) + +{% hint style="info" %} +Make sure that you follow our communication channels closely for all the updates regarding additional options of launching and running the OriginTrail DKG node. +{% endhint %} + +## 1. 
Running the installer (installation script) + +{% hint style="info" %} +By executing this installer, your node will be deployed as a Linux system service (systemctl process). +{% endhint %} + +The installation process will require interaction with the installer script via the terminal. Please ensure that you have all the necessary requirements for the node ready to be inserted into the prompt before running the installer: + +* Wallets and their private keys +* Funds on the wallets +* RPC endpoint + +{% hint style="info" %} +In order to avoid any installation issues, please run the installer on a clean Ubuntu 20.04 or 22.04 server. +{% endhint %} + +## 1.1 - Download OriginTrail DKG node installer: + +Ensure that you're logged in as root. Then execute the following command to download the installation script and make it executable: + +``` +cd /root/ && curl -k -o installer.sh https://raw.githubusercontent.com/OriginTrail/ot-node/v6/develop/installer/installer.sh && chmod +x installer.sh +``` + +### 1.2 - **Firewall configuration**: + +The OriginTrail node needs the following ports to be open in order to operate properly: + +* 8900 (default node API endpoint) +* 9000 (networking port for communication with other nodes) + +{% hint style="warning" %} +The installer will automatically enable UFW and open ports 22 (ssh), 8900 and 9000.\ +\ +Please keep in mind that different cloud providers use different security practices when it comes to configuring firewalls on the servers. \ +Make sure that enabling UFW will not cause any networking issues on your server, or disable it after the installation process is completed if you wish to configure firewall settings differently (e.g. via your cloud provider's console). 
+{% endhint %} + +### 1.3 - **Running the installer script**: + +{% hint style="info" %} +The provided installer script is designed for installing the OriginTrail node on **Ubuntu 20.04 LTS and 22.04 LTS** distributions.\ +\ +It is also possible to install the OriginTrail node on other systems, but it would require modifications to the installer script. If you have any such modifications in mind, we highly encourage your contributions. Please visit our [GitHub](https://github.com/OriginTrail/ot-node) for more information. +{% endhint %} + +#### **During the installation process, the OriginTrail node installer will execute the following actions:** + +* Check Ubuntu OS version compatibility +* Deploy the OriginTrail node directory and install all required modules +* Configure and enable the OriginTrail node service +* Enable and configure UFW (firewall) +* Install the required Node.js version together with NPM +* Install and enable the MySQL service +* Configure the MySQL user password for the OriginTrail node operational database (based on your inputs) +* Install and enable the BlazeGraph service (graph database) + +{% hint style="warning" %} +Do not run the installer with "sudo". +{% endhint %} + +#### Execute the installer by running: + +``` +./installer.sh +``` diff --git a/docs/graveyard/everything/node-setup-instructions/houston-origintrail-node-command-center.md b/docs/graveyard/everything/node-setup-instructions/houston-origintrail-node-command-center.md new file mode 100644 index 0000000..32f92b6 --- /dev/null +++ b/docs/graveyard/everything/node-setup-instructions/houston-origintrail-node-command-center.md @@ -0,0 +1,118 @@ +--- +description: >- + Houston V6 is an application that allows node runners to easily control their + nodes through an easy-to-understand UI. +--- + +# Houston - OriginTrail node command center + +There are two ways you can use the Houston application: + +1. 
Via a hosted application, which is available at the following link: [https://houston.origintrail.io/](https://houston.origintrail.io/) or +2. Run Houston Web application locally by following the setup [instructions](houston-origintrail-node-command-center.md#setup-houston-locally). + +**Houston is an open source project and we welcome your contributions on the** [**official project repository.**](https://github.com/OriginTrail/houston-v6/) + +### Usage instructions: + +{% hint style="info" %} +The Houston application currently requires the Metamask browser extension to be installed in order to connect to the node. Additional wallets will be supported in the future releases. +{% endhint %} + +1. When Houston application opens, you will be presented with a login form. Make sure to choose the right network (OriginTrail DKG Mainnet or OriginTrail DKG Testnet). +2. After choosing the network you will have to connect and authenticate your node's admin key with Metamask by clicking the “Connect admin wallet” button. +3. Paste your node’s operational wallet address into the “Operational wallet address” input field. +4. Connect by clicking the “Connect via Houston” button. + +

Houston - Login

+ +Houston currently helps you interact with OriginTrail DKG smart contracts. In the future versions Houston will enable connecting to your nodes directly as well. + +### Overview: + +The Overview section shows general information about your node TRAC balance and network metrics. + +

Houston - Overview section

+ +### Service tokenomics: + +Under “**Service tokenomics**” section, you’re able to manage TRAC stake settings on the node as well as: + +* Update the ask parameter, which will determine the preferred amount of TRAC your node is asking for its services in the network. +* Add and withdraw TRAC stake. Note that both operations require executing 2 transactions. + +

Houston - Service tokenomics section 1.1

+ +

Houston - Service tokenomics section 1.2

+ +### Node wallets: + +This section allows node runners to add additional keys (wallets) or remove keys from the node. The OriginTrail V6 node operates with two key types (an admin key and an operational key). + +Houston will allow a node runner to remove a key (of a certain type) if there are at least two keys available for that type (e.g. you cannot remove your only admin key, as you would lose the ability to manage your node). + +{% hint style="warning" %} +When you stake TRAC tokens to your DKG node with an admin wallet, that admin wallet will receive a certain amount of “Stake Share Tokens” in return. Stake share tokens are ERC20 tokens that represent the amount of shares of the total token stake that this particular wallet owns. Unstaking TRAC tokens from your DKG node is performed by “burning” an amount of share tokens. (e.g. you can burn 10% of your share tokens to get back 10% of your staked TRAC. This approach is similar to AMMs such as Uniswap and their LP tokens representing ownership share in token-pair liquidity pools).\ + + +This means that, if you want to use multiple admin wallets to manage your node (by adding additional admin wallets), only the wallets that own share tokens will be able to manage the TRAC stake.\ +\ +For example, if you would like to swap one admin wallet that has already staked TRAC tokens for another, fresh admin wallet, then apart from adding the new admin wallet, you should make sure to also transfer your TRAC stake to the new wallet. This can be done in two ways: + +* By transferring share tokens from the current admin wallet to the new wallet (since the share token is ERC20, it can easily be transferred to another wallet). You can see your node's share token address in Houston. This is the recommended option, as your node's TRAC stake will not change during this operation. +* By unstaking all TRAC tokens and moving them to the new admin wallet. 
After transferring the tokens to the new wallet, they must be staked again with the new admin wallet. This means that, for the period of time that your node is without TRAC stake, it will not participate in hosting the network, risking a loss of rewards. + + + +Please exercise caution when handling staked tokens and choose the approach that best meets your needs. +{% endhint %} + +

Houston - Node wallets section

+ +To find out more about OriginTrail Node keys, visit the following page: [Setup instructions: Node keys](https://docs.origintrail.io/decentralized-knowledge-graph-layer-2/testnet-node-setup-instructions/node-keys). + +### Upcoming Houston features: + +With the future versions of Houston application, multiple features are to be introduced, such as connecting to your node and token delegation. + +If you have ideas on how to improve or extend Houston, we’d love to have you contribute to the project [via the official repository](https://github.com/OriginTrail/houston-v6). + +\ +In order to stay in the loop with the latest Houston & OriginTrail developments, please join our [Discord](https://discordapp.com/invite/FCgYk2S) channel and follow our social media accounts. + +### Setup Houston locally: + +#### Requirements: + +* Node.js: v16 + +#### How to run Houston: + +1\. Clone the project: + +``` +git clone https://github.com/OriginTrail/houston-v6.git +``` + +2\. Install dependencies: + +``` +cd houston-v6 && npm install +``` + +3\. Run the Houston app: + +``` +npm run serve +``` + +{% hint style="info" %} +Houston application will be available at [http://localhost:8080/](http://localhost:8080/) in the browser of your choice. +{% endhint %} + +4\. Compile and minify for production purposes: + +``` +npm run build +``` + diff --git a/docs/graveyard/everything/node-setup-instructions/installation-prerequisites/README.md b/docs/graveyard/everything/node-setup-instructions/installation-prerequisites/README.md new file mode 100644 index 0000000..42e7399 --- /dev/null +++ b/docs/graveyard/everything/node-setup-instructions/installation-prerequisites/README.md @@ -0,0 +1,7 @@ +# Installation prerequisites + +{% hint style="info" %} +Prior to deploying a mainnet node, it is advisable to initially set up a testnet node in order to familiarize yourself with the technology and the deployment process. 
+{% endhint %} + +Before we dive into the installation of the OriginTrail DKG node, there are a few important steps that need to be completed, which cover acquiring tokens, preparing wallets (node keys), obtaining RPC endpoints, and other requirements for the node's proper functioning on the blockchains of choice. These will be explained as you progress through the installation instructions pages. diff --git a/docs/graveyard/everything/node-setup-instructions/installation-prerequisites/acquire-archive-rpc-endpoints.md b/docs/graveyard/everything/node-setup-instructions/installation-prerequisites/acquire-archive-rpc-endpoints.md new file mode 100644 index 0000000..6f19d9b --- /dev/null +++ b/docs/graveyard/everything/node-setup-instructions/installation-prerequisites/acquire-archive-rpc-endpoints.md @@ -0,0 +1,43 @@ +--- +description: >- + This section of the tutorial will cover the details regarding the RPC + endpoints for the blockchains supported by the OriginTrail DKG node. +--- + +# Acquire archive RPC endpoints + +In order for your OriginTrail node to be able to connect and communicate with any of the supported blockchains, it will require an archive RPC endpoint. There are different RPC providers offering endpoints for different blockchains, and it's up to the node runner to choose the provider. + +## 1. NeuroWeb archival RPC endpoint: + +When it comes to deploying your node on OriginTrail's NeuroWeb (testnet or mainnet), your node will automatically be provided with the RPC endpoint. This means that no action is required from the node runner **at the current stage**. + +## 2. Gnosis and Chiado archival RPC endpoints: + +Refer to the [official Gnosis documentation](https://docs.gnosischain.com/tools/rpc/) and choose an RPC provider in order to acquire the archival RPC endpoint. + +{% hint style="warning" %} +Selecting an archival endpoint is a crucial requirement for the optimal functionality of your DKG node. 
+{% endhint %} + +During the installation process, the installer will ask you to input the Chiado (Testnet) or Gnosis (Mainnet) archival RPC endpoint, which will automatically configure your node to use it. The endpoint you provide will be inserted into the **.origintrail\_noderc** (custom configuration file) automatically. + +{% hint style="info" %} +You will be able to change your RPC endpoint manually later by editing the **.origintrail\_noderc** custom configuration file, which is located inside the "ot-node" directory. +{% endhint %} + + + +## 3. Base and Base Sepolia archival RPC endpoints: + +Refer to the [official Base documentation](https://docs.base.org/docs/) and choose an RPC provider in order to acquire the archival RPC endpoint. + +{% hint style="warning" %} +Selecting an archival endpoint is a crucial requirement for the optimal functionality of your DKG node. +{% endhint %} + +During the installation process, the installer will ask you to input the Base Sepolia (Testnet) or Base (Mainnet) archival RPC endpoint, which will automatically configure your node to use it. The endpoint you provide will be inserted into the **.origintrail\_noderc** (custom configuration file) automatically. + +{% hint style="info" %} +You will be able to change your RPC endpoint manually later by editing the **.origintrail\_noderc** custom configuration file, which is located inside the "ot-node" directory. +{% endhint %} diff --git a/docs/graveyard/everything/node-setup-instructions/installation-prerequisites/acquiring-tokens.md b/docs/graveyard/everything/node-setup-instructions/installation-prerequisites/acquiring-tokens.md new file mode 100644 index 0000000..5e08a8f --- /dev/null +++ b/docs/graveyard/everything/node-setup-instructions/installation-prerequisites/acquiring-tokens.md @@ -0,0 +1,77 @@ +--- +description: >- + This part of the documentation will explain how to acquire tokens for your + OriginTrail DKG node. 
+--- + +# Acquiring tokens + +## Mainnet networks: + +The OriginTrail DKG nodes operate with a two-token system: + +* the TRAC token, which is the native utility token of the DKG, used for knowledge publishing, and +* the native blockchain token of the chosen chain (e.g., NeuroWeb, Gnosis, or Ethereum), used for interacting with DKG smart contracts + +In order for your node to successfully start and run, you will need to provide these tokens to its **operational wallet.** + +For more information on OriginTrail's two-layer utility tokens, TRAC and NEURO, please check the [whitepapers](https://origintrail.io/ecosystem/whitepaper). + +#### Layer 1 utility tokens + +OriginTrail has over time evolved into a multichain system (supporting the Ethereum, Polkadot, Polygon, and Gnosis blockchains), so a fully functional hosting node also requires the native tokens of the blockchains you intend to run it on. + +#### Layer 2 utility token - TRAC + +The Trace token (TRAC) is the utility token of the DKG (layer 2 of OriginTrail). It is required for operations such as publishing knowledge to the network and drives the entire DKG. Nodes in the DKG can operate in two ways: + +* **as non-hosting network gateways**, which enable access to the DKG and communication with other network nodes, without hosting the DKG +* **as hosting nodes,** which, apart from being network gateways, also host a portion of the DKG knowledge assets, for which they receive TRAC publishing fees as rewards. **To run a hosting node in the OriginTrail DKG, your node requires at least 50,000 TRAC tokens posted as security collateral to the network.** + +#### How can you acquire TRAC? + +The TRAC token can be acquired through multiple (decentralized and centralized) platforms.
A non-exhaustive list includes: + +* Coinbase +* Huobi +* Kucoin +* Uniswap +* Bancor +* Wintermute (over-the-counter) + +More options can be found on the [CoinMarketCap](https://coinmarketcap.com/currencies/origintrail/) or [CoinGecko](https://www.coingecko.com/en/coins/origintrail#markets) platforms. + +{% hint style="danger" %} +**IMPORTANT: There is no association between the core development team and the above-mentioned platforms. You should familiarise yourself with all possible risks of using their services, as you are doing so under your own responsibility.** +{% endhint %} + + + +## Testnet networks: + +At the current stage, the OriginTrail DKG node can be deployed on the following testnet blockchains: + +1. NeuroWeb testnet +2. Chiado (Gnosis testnet) + +In order to acquire testnet tokens for the above-listed blockchains, please refer to OriginTrail's Discord Faucet usage [instructions](../../../../dkg-knowledge-hub/useful-resources/test-token-faucet.md). + +## Teleporting TRAC tokens to NeuroWeb: + +To transfer TRAC tokens between Ethereum and NeuroWeb, you need to execute the Teleport process, which is explained in the [Teleport instructions](https://docs.origintrail.io/integrated-blockchains/neuroweb/teleport-instructions) section of our documentation. + +## Bridging TRAC to Gnosis: + +To make your TRAC tokens available on the Gnosis mainnet network, you will have to bridge them from the Ethereum network using a bridging platform such as [**OmniBridge**](https://omnibridge.gnosischain.com/bridge) or another platform of your choice. + + + +## Bridging TRAC to Base: + +To make your TRAC tokens available on the Base mainnet network, you will have to bridge them from the Ethereum network using a bridging platform such as [**Superbridge**](https://superbridge.app/base) or another platform of your choice.
+ +TRAC bridge instructions using Superbridge are available [here](https://docs.origintrail.io/integrated-blockchains/ethereum-ecosystem/base-blockchain#bridging-trac-to-base). + +{% hint style="info" %} +Once you acquire TRAC tokens on the desired network, you may proceed with the setup. +{% endhint %} diff --git a/docs/graveyard/everything/node-setup-instructions/installation-prerequisites/choosing-blockchain-networks.md b/docs/graveyard/everything/node-setup-instructions/installation-prerequisites/choosing-blockchain-networks.md new file mode 100644 index 0000000..f3ec8e3 --- /dev/null +++ b/docs/graveyard/everything/node-setup-instructions/installation-prerequisites/choosing-blockchain-networks.md @@ -0,0 +1,31 @@ +# Choosing blockchain networks + +{% hint style="info" %} +Before running a mainnet node, consider running a testnet node first to get familiar with the technology. +{% endhint %} + +The OriginTrail Decentralized Knowledge Graph is a multichain DKG. That means nodes can operate and host knowledge assets on multiple blockchains at the same time. OriginTrail is [built to integrate with EVM-supporting blockchains](https://github.com/OriginTrail/OT-RFC-repository/tree/main/RFCs/OT-RFC-17-OriginTrail-integration-with-EVM-compatible-blockchains). + +At the current stage, OriginTrail DKG V6 supports the following blockchains: + +**Mainnet blockchains** + +1. [NeuroWeb](https://neuroweb.ai/) (Polkadot) +2. [Gnosis Chain](https://www.gnosis.io/) (Ethereum) +3. [Base](https://www.base.org/) (Ethereum) + +**Testnet blockchains** + +1. NeuroWeb testnet +2. Gnosis Chiado testnet +3. Base Sepolia testnet + +Depending on your network of choice for the OriginTrail DKG node, please make sure that you carefully read through the requirements in the next sections on acquiring tokens and preparing node keys (wallets). + +{% hint style="info" %} +For NeuroWeb preparations, additional processes are required, such as wallet mapping.
These will be explained under the "**Tokens**" and "**Node keys (wallets)**" sections of the instructions. +{% endhint %} + +{% hint style="info" %} +Prior to the current version (V6), OriginTrail was also deployed to the Ethereum, Gnosis, and Polygon blockchains. The V5 network is expected to sunset in 2024 as network activity migrates fully to DKG V6. +{% endhint %} diff --git a/docs/graveyard/everything/node-setup-instructions/installation-prerequisites/hardware-requirements.md b/docs/graveyard/everything/node-setup-instructions/installation-prerequisites/hardware-requirements.md new file mode 100644 index 0000000..2ab1bf6 --- /dev/null +++ b/docs/graveyard/everything/node-setup-instructions/installation-prerequisites/hardware-requirements.md @@ -0,0 +1,19 @@ +--- +description: What are the hardware requirements for running an OriginTrail DKG node? +--- + +# Hardware requirements + +To deploy your OriginTrail DKG node, you will need a Linux server with at least the minimum recommended hardware presented below: + +* **4GB RAM** +* **2 CPUs** +* **50GB HDD space** + +Make sure that you have root access to your server. + +{% hint style="info" %} +The installer script provided in these instructions is designed for installing the OriginTrail node on **Ubuntu 20.04 LTS and 22.04 LTS** distributions.\ +\ +It is also possible to install the OriginTrail node on other systems, but that would require modifications to the installer script. If you have any such modifications in mind, we highly encourage your contributions. Please visit our [GitHub](https://github.com/OriginTrail/ot-node) for more information.
+{% endhint %} diff --git a/docs/graveyard/everything/node-setup-instructions/installation-prerequisites/triple-store-setup.md b/docs/graveyard/everything/node-setup-instructions/installation-prerequisites/triple-store-setup.md new file mode 100644 index 0000000..6404efc --- /dev/null +++ b/docs/graveyard/everything/node-setup-instructions/installation-prerequisites/triple-store-setup.md @@ -0,0 +1,37 @@ +# Triple store setup + +DKG nodes are designed to handle and store graph data using multiple triple store implementations. By supporting various triple stores, DKG nodes ensure flexibility, scalability, and robustness in managing knowledge assets. Currently, several triple store implementations are supported directly: + +* [Ontotext GraphDB](https://www.ontotext.com/products/graphdb/) +* [Blazegraph](https://blazegraph.com/) +* [Apache Jena Fuseki](https://jena.apache.org/documentation/fuseki2/) + +#### Ontotext GraphDB + +To use Ontotext GraphDB with your DKG node, follow these steps: + +1. **Visit the GraphDB website**: Go to the official [Ontotext GraphDB website](https://www.ontotext.com/products/graphdb/). +2. **Request and download GraphDB**: Follow the instructions to request and download the GraphDB software. +3. **Set up GraphDB on your server**: Install and configure GraphDB according to the provided guidelines. +4. **Modify your DKG node configuration**: Edit the configuration file of your DKG node to use GraphDB as the default triple store implementation. + * Open the **.origintrail\_noderc** custom configuration file inside the "ot-node" directory. + * Locate the `tripleStore` object and set the `defaultImplementation` to `ot-graphdb`. + +#### Blazegraph + +Blazegraph setup is streamlined with the OriginTrail Automatic Installer. Follow these steps: + +1. **Run the OriginTrail Automatic Installer**: The installer will handle the setup and configuration of Blazegraph for you. +2.
**Verify the configuration**: Ensure that your node's configuration file is correctly set to use Blazegraph as the triple store implementation. + +#### Apache Jena Fuseki + +Similar to Blazegraph, the setup for Apache Jena Fuseki is managed by the OriginTrail Automatic Installer. Here’s what you need to do: + +1. **Run the OriginTrail Automatic Installer**: The installer will automatically set up and configure Fuseki. +2. **Verify the configuration**: Confirm that your node's configuration file is properly configured to use Fuseki. + +#### Custom Triple Store Integrations + +Thanks to the standards-based implementation (W3C RDF and SPARQL), it is easy to integrate DKG nodes with any standardized RDF triple store. While the triple stores above are directly supported, you can adapt your DKG node to work with other RDF triple store implementations through simple modifications to the code and configuration settings. + diff --git a/docs/graveyard/everything/node-setup-instructions/origintrail-dkg-node-nat-configuration.md b/docs/graveyard/everything/node-setup-instructions/origintrail-dkg-node-nat-configuration.md new file mode 100644 index 0000000..2a1a82d --- /dev/null +++ b/docs/graveyard/everything/node-setup-instructions/origintrail-dkg-node-nat-configuration.md @@ -0,0 +1,93 @@ +--- +description: >- + In the following section, we will go through the steps required to configure + your OriginTrail DKG node NAT, ensuring optimal libp2p communication and + effective participation in the network. +--- + +# OriginTrail DKG node NAT configuration + +## What are **Network Address Translation (NAT)** and libp2p? + +libp2p is a peer-to-peer (P2P) networking framework that enables the development of P2P applications. **OriginTrail's DKG node** utilizes libp2p for networking, enabling seamless communication within the OriginTrail decentralized network. + +**NAT (Network Address Translation)** poses a significant challenge for P2P networks like the OriginTrail DKG.
Devices behind NAT are typically not directly reachable from devices outside the local network due to the translation of private IP addresses to a single public IP address. This creates hurdles for libp2p-based applications, as direct communication between nodes becomes difficult or impossible.\ +\ +Configuring libp2p properly is essential regardless of the cloud provider you are using to run your OriginTrail DKG node. Without proper configuration, your nodes may encounter communication issues, hindering their ability to participate effectively in the OriginTrail DKG network. By addressing NAT traversal and ensuring proper exposure of public IP addresses, you can overcome these challenges and enable seamless communication between DKG nodes. + +## How to configure your OriginTrail DKG node NAT? + +By adding a few parameters to the node's configuration file `.origintrail_noderc`, you can ensure proper NAT traversal and optimal libp2p communication. + +### **Step 1: Get the IP address of your OriginTrail node:** + +Log in to your cloud provider's console and navigate to the dashboard or management interface for your instance. Find the public IP address of your instance. This is the IP address that other nodes will use to communicate with your OriginTrail node. + +{% hint style="info" %} +Make sure that you have a static IP address assigned to your OriginTrail node instance. \ +Most cloud providers require you to reserve and associate the IP address with the instance. +{% endhint %} + +{% hint style="danger" %} +Assigning a static IP to your server is crucial to prevent network communication issues caused by IP address changes during server reboots.
+{% endhint %} + + + +### **Step 2: Update your OriginTrail DKG node configuration:** + +Navigate to the directory of your OriginTrail node (ot-node), open your configuration (.origintrail\_noderc) file, and configure your **"network"** module as shown below:\ + + +```json
{
  "modules":{
    "blockchain":{
      "implementation":{

      }
    },
    "tripleStore":{
      "implementation":{

      }
    },
    "network":{
      "implementation":{
        "libp2p-service":{
          "config":{
            "nat":{
              "enabled":true,
              "externalIp":""
            }
          }
        }
      }
    }
  },
  "auth":{
    "ipWhitelist":[
      "::1",
      "127.0.0.1"
    ]
  }
}
``` + +### **Step 3: Restart your OriginTrail node:** + +As we have previously learned, each update of the OriginTrail node configuration must be followed by a restart. Restart your node by running the following command: + +``` +systemctl restart otnode.service && journalctl -u otnode --output cat -f +``` + +or + +``` +otnode-restart && otnode-logs +``` + +and wait for your node to print "Node is up and running!". + +

OriginTrail node started successfully

diff --git a/docs/graveyard/everything/node-setup-instructions/running-a-full-node.md b/docs/graveyard/everything/node-setup-instructions/running-a-full-node.md new file mode 100644 index 0000000..2e03cc7 --- /dev/null +++ b/docs/graveyard/everything/node-setup-instructions/running-a-full-node.md @@ -0,0 +1,44 @@ +--- +description: >- + The OriginTrail full node plays a crucial role in the Decentralized + Knowledge Graph (DKG) by hosting knowledge assets. +--- + +# Running a full node + +## How to initiate a full DKG node? + +Once the OriginTrail DKG node starts, **several blockchain transactions need to be executed for your node to become an active part of the network and start hosting the DKG knowledge assets**. These transactions are needed to set the required TRAC stake for your node (50,000 TRAC at this time) and to set the node service ask. Both values are set in TRAC tokens. + +### **Setting the node stake and service ask parameters** + +{% hint style="warning" %} +**50,000 TRAC** is the minimum stake required for a full DKG mainnet node to become eligible for DKG hosting and rewards. More information is available in OriginTrail [OT-RFC-14](https://github.com/OriginTrail/OT-RFC-repository/blob/main/RFCs/OT-RFC-14%20DKG%20v6%20TRAC%20Tokenomics.pdf). + +If a full DKG node's TRAC stake falls below 50,000, reset the operator fee to the desired value before reactivating it. +{% endhint %} + +\ +You can set the node stake and ask settings in two ways: the first option is to use the Houston application, and the second is to run a few npm scripts directly on the server where your node is installed. + +In order to configure the stake and ask parameters on your OriginTrail DKG node, we recommend using the [Houston application](houston-origintrail-node-command-center.md).
+ +### Setting up node stake and service ask via Houston (recommended): + +[Houston](houston-origintrail-node-command-center.md) is the OriginTrail node user interface (command center) that allows you to execute certain operations regarding your node on the blockchain. There are two ways to run Houston: + +1. Via the hosted application, available at the following link: [https://houston.origintrail.io/](https://houston.origintrail.io/), or +2. By running the Houston web application locally, following the setup [instructions](houston-origintrail-node-command-center.md#setup-houston-locally). + +More information on Houston can be found [here](houston-origintrail-node-command-center.md).\ +\ +**Setting your node service ask**\ +Navigate to the "Service tokenomics" page within the Houston application, enter the service ask amount, denominated in TRAC, and click the "Update ask" button. This will trigger the transaction signing process with Metamask. + +

Node service ask setting interface

+ +**Staking TRAC to your node** + +Navigate to the "Service tokenomics" section within the Houston application, enter the stake amount in TRAC, and click the "Add stake" button. This will again trigger the transaction signing process with Metamask, this time requiring two transactions to complete the process of TRAC staking to your node. +

Node stake settings interface

 diff --git a/docs/graveyard/everything/node-setup-instructions/running-a-gateway-node.md b/docs/graveyard/everything/node-setup-instructions/running-a-gateway-node.md new file mode 100644 index 0000000..46c1d5c --- /dev/null +++ b/docs/graveyard/everything/node-setup-instructions/running-a-gateway-node.md @@ -0,0 +1,11 @@ +--- +description: What is a gateway node and what can I use it for? +--- + +# Running a gateway node + +An **OriginTrail gateway node** can be used to build applications that interface with the OriginTrail Decentralized Network, with the gateway node serving as a dependency (network access point) for the OriginTrail SDK. + +Running a gateway node is not the same as running a **full (DKG hosting) node** and does not require 50,000 TRAC tokens to be posted as stake collateral. If you are looking to build applications leveraging [knowledge assets](https://origintrail.io/products/knowledge-assets) on the OriginTrail Decentralized Knowledge Graph (DKG), a gateway node is a good place to start. + +More information on the OriginTrail SDK libraries can be found [here](https://docs.origintrail.io/decentralized-knowledge-graph-layer-2/dkg-sdk/dkg-v6-js-client). diff --git a/docs/graveyard/everything/node-setup-instructions/switch-dkg-node-to-multichain.md b/docs/graveyard/everything/node-setup-instructions/switch-dkg-node-to-multichain.md new file mode 100644 index 0000000..a617de8 --- /dev/null +++ b/docs/graveyard/everything/node-setup-instructions/switch-dkg-node-to-multichain.md @@ -0,0 +1,21 @@ +# 🔗 Switch DKG node to multichain + +Welcome to the guide on expanding the capabilities of your existing OriginTrail DKG node! By following these instructions, you'll seamlessly transition from a single-chain setup to a powerful multichain configuration, connecting your node not only to NeuroWeb but also to other blockchains such as Gnosis or Chiado, depending on the environment (mainnet or testnet) your node is currently deployed on.
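+
+In practice, the switch boils down to enabling additional blockchain implementations in the **.origintrail\_noderc** configuration file. The fragment below is an illustrative sketch only: the implementation keys (e.g., `otp:2043`, `gnosis:100`) and field names are assumptions, so check your node's existing configuration for the exact names used by your ot-node version.
+
+```json
+{
+  "modules":{
+    "blockchain":{
+      "implementation":{
+        "otp:2043":{
+          "enabled":true
+        },
+        "gnosis:100":{
+          "enabled":true,
+          "config":{
+            "rpcEndpoints":["https://your-gnosis-archival-rpc-endpoint"]
+          }
+        }
+      }
+    }
+  }
+}
+```
+
+After editing the configuration, restart the node (e.g., with `otnode-restart`) for the changes to take effect.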
+ +Before diving into the integration process, ensure your DKG node is fully operational and connected to at least one of the currently supported blockchains. + +## OriginTrail node multichain support + +At the current stage, the following multichain integrations are available for the OriginTrail DKG node: + +* NeuroWeb mainnet +* NeuroWeb testnet +* Gnosis +* Chiado (Gnosis testnet) +* Base +* Base Sepolia (Base testnet) + +{% hint style="info" %} +Before proceeding with updating your node to a multichain state, ensure your DKG node is fully operational and connected to at least one of the currently supported blockchains. +{% endhint %} + diff --git a/docs/graveyard/everything/node-setup-instructions/verify-installation.md b/docs/graveyard/everything/node-setup-instructions/verify-installation.md new file mode 100644 index 0000000..7b0d08d --- /dev/null +++ b/docs/graveyard/everything/node-setup-instructions/verify-installation.md @@ -0,0 +1,27 @@ +# Verify installation + +If your installation has been successful, your node will show the “**Node is up and running!**” log as shown in the example image below: + +

Example of a successfully installed node

+ +## Congratulations! You have just set up your OriginTrail DKG node! + +### **OriginTrail DKG node useful commands:** + +**Starting your node:** _otnode-start_ + +**Stopping the node:** _otnode-stop_ + +**Restarting the node:** _otnode-restart_ + +**Showing node logs:** _otnode-logs_ + +**Opening the node config:** _otnode-config_ + +## Monitoring node operations & rewards + +Over the course of time, your node will provide different services to DKG users as part of the DKG service market (e.g., your node will receive requests from the network, such as requests to index assets or fetch their state). To be able to provide these services, your node needs to maintain assets (and their associated assertions) in their original form and occasionally provide proofs of utility to the blockchain. Therefore, it is essential that you keep your node running with as high uptime as possible, so it is able to answer requests and perform its activities. More details on the functioning of the service market and the tokenomics behind it can be found in [OT-RFC-14](https://github.com/OriginTrail/OT-RFC-repository/blob/main/RFCs/OT-RFC-14%20DKG%20v6%20TRAC%20Tokenomics.pdf). + +**It is essential that you familiarise yourself with the tokenomics of V6 and the services your node will be performing. A good understanding will be crucial for setting up your node tokenomics settings in a way that you find optimal for your node.** The OriginTrail V6 service market is an open market and requires all OriginTrail nodes to provide a public “service ask” amount in TRAC. + +The OriginTrail V6 node also needs to maintain a constant connection to the blockchains it operates with via dedicated blockchain RPC endpoints (it supports multiple RPC endpoints for one blockchain, which allows for resiliency in case of RPC downtime).
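+
+The multiple-RPC-endpoint support mentioned above can be expressed in the **.origintrail\_noderc** configuration as a list of endpoints per blockchain. The fragment below is an illustrative sketch only: the implementation key (`gnosis:100`) and field names are assumptions, so consult your node's actual configuration file for the exact structure.
+
+```json
+{
+  "modules":{
+    "blockchain":{
+      "implementation":{
+        "gnosis:100":{
+          "config":{
+            "rpcEndpoints":[
+              "https://primary-gnosis-archival-rpc-endpoint",
+              "https://fallback-gnosis-archival-rpc-endpoint"
+            ]
+          }
+        }
+      }
+    }
+  }
+}
+```
+
+Listing more than one endpoint lets the node fall back to the next endpoint if the first becomes unavailable.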
diff --git a/docs/graveyard/everything/powering-ai-minds-origintrail-hackathon.md b/docs/graveyard/everything/powering-ai-minds-origintrail-hackathon.md new file mode 100644 index 0000000..034c005 --- /dev/null +++ b/docs/graveyard/everything/powering-ai-minds-origintrail-hackathon.md @@ -0,0 +1,156 @@ +--- +description: Do you have what it takes to revolutionize the future of AI agents? +hidden: true +icon: lightbulb-exclamation-on +cover: >- + ../../.gitbook/assets/Empowering ai minds_ Origintrail hackathon (gitbook doc + cover) 1.png +coverY: 0 +--- + +# Powering AI minds: OriginTrail hackathon + +Harness the power of the **OriginTrail Decentralized Knowledge Graph (DKG)** to create AI agents that push the boundaries of what’s possible. Combine the best of neural and symbolic AI to build agents with collective memory capabilities, meet like-minded pioneers, and compete for exciting prizes! + +The first DKG-powered agents, like [**ChatDKG**](https://x.com/ChatDKG) (powered by the ElizaOS DKG Plugin) and [**Satoshi Rakic**](https://x.com/dkgsatoshirakic), have already demonstrated how the DKG unlocks next-level intelligence for AI agents. Now, it’s your turn to lead the charge and showcase your brilliance! + +## 💡 How to join the hackathon? + +1. **Explore challenges:** Check out the themes and prizes below. +2. **Register to compete:** Complete the [**Google form**](https://forms.gle/yAxTfP8taEBQMavu9) by **Friday, January 24, at noon**. +3. **Build your solution:** Create an AI agent using the OriginTrail DKG as its brain. +4. **Submit your work:** Send your GitHub repo (with a clear README) to **office@origin-trail.com** by **Monday, January 27, at noon**. +5. **Showcase your work:** Join us for **Demo day** at Nova Iskra Dorćol, Belgrade, on **January 27, 18:00**! + +The hackathon kicks off remotely, giving you the flexibility to collaborate and innovate from anywhere, with full support via [Discord](https://discord.com/invite/xCaY7hvNwD).
Then, come to Belgrade for an unforgettable **hackathon finale**: + +## ✨ What’s in store on Demo day? + +* **Inspiring talks** from the team and guest speakers. +* **Creative presentations** of AI agent projects. +* **Live jury deliberations** to crown the winners. +* **Award celebrations** with the OriginTrail community. +* **Networking opportunities** to connect with innovators and thought leaders. + +Not competing? No problem! [**Register via Eventbrite**](https://bit.ly/OT-AI-hackathon-BGD) to join the event, enjoy the excitement, and connect with the team and fellow enthusiasts. + +Your next breakthrough starts here. Let’s build the future of AI agents together! + +{% hint style="info" %} +This is an in-person hackathon; only teams that physically join the hackathon Demo day will be eligible for rewards. +{% endhint %} + +## 🤖 Builder resources + +Here are some code and learning resources to get you started: + +* **Get started with the DKG** [**here**](https://docs.origintrail.io/) +* [**Video on how to run a basic DKG-enabled AI agent with ElizaOS**](https://x.com/origin_trail/status/1879573405240107497) +* **Try out** [**ChatDKG**](https://x.com/ChatDKG) **on Twitter and see how it creates memories** +* [**ElizaOS AI agent devschool videos**](https://www.youtube.com/watch?v=ArptLpQiKfI\&list=PLx5pnFXdPTRzWla0RaOxALTSTnVq53fKL) +* Join [**Discord**](https://discord.com/invite/xCaY7hvNwD) and learn together with the community +* Learn more about OriginTrail on our [YouTube Channel](https://www.youtube.com/@OriginTrail/videos) +* **Useful code repos and guides:** + * **ElizaOS DKG agent:** [**https://docs.origintrail.io/dkg-v8-current-version/ai-agents/elizaos-dkg-agent**](https://docs.origintrail.io/dkg-v8-current-version/ai-agents/elizaos-dkg-agent) + * **Build a DKG agent with the Python SDK:**
[**https://docs.origintrail.io/dkg-v8-current-version/ai-agents/custom-dkg-python-agent**](https://docs.origintrail.io/dkg-v8-current-version/ai-agents/custom-dkg-python-agent) + * **Learn how to set up & use a DKG Edge node** [**here**](https://docs.origintrail.io/dkg-v8-current-version/v8-dkg-edge-node) + +## 🤺 Hackathon project challenge ideas + +### 1. Social and influencer agents + +**Description:** Develop agents to analyze, monitor, and enhance social media influence or content. These agents could provide insights on audience engagement, detect trends, or suggest strategies for content optimization.\ +\ +**Example use of DKG:** Leverage the OriginTrail DKG's advanced neuro-symbolic capabilities for robust knowledge retrieval, enabling cross-platform connectivity and context-aware insights. Connect and analyze influencer data as knowledge asset graphs (e.g., engagement metrics and content performance) for enhanced interpretability and prediction-making. + +### 2. Agent infrastructure + +**Description:** Create a launchpad or foundational frameworks to enable the development and deployment of agents that use the DKG (via gateway or DKG Edge Nodes) as a decentralized memory system. These tools could simplify agent creation and ensure interoperability across different platforms.\ +\ +**Example use of DKG:** Utilize the DKG as a shared neuro-symbolic repository where agents store their memories, data, and operational states. Its advanced connectivity and context-aware data modeling ensure resilience, scalability, and collaborative ecosystems. This approach allows agents to retrieve and link complex knowledge assets seamlessly, fostering efficient decentralized workflows. + +### 3. DKG AI Agents + +**Description:** Build agents that offer seamless interactions with the DKG, making it simple to create, query, and manage decentralized knowledge assets.
These agents would abstract the complexities of DKG operations, allowing users to deploy and manage their Core and Edge nodes, create knowledge assets, and retrieve data effortlessly.\ +\ +**Example use of DKG:** Enable agents to leverage the neuro-symbolic strengths of the DKG to automate complex tasks like node management, asset creation, and data querying. With its advanced knowledge representation and inferencing capabilities, the DKG empowers agents to retrieve and synthesize meaningful insights while maintaining decentralized data integrity and transparency. + +### 4. Trading agents + +**Description:** Develop agents for autonomous trading in financial markets, leveraging the DKG for data integrity and transparency. These agents can analyze market trends, execute trades, and manage portfolios.\ +\ +**Example use of DKG:** Use the DKG's neuro-symbolic features to store and interlink historical trading data, market signals, and strategy insights. By enabling advanced connectivity and context-aware retrieval, the DKG facilitates tamper-proof, trusted data for informed decision-making. Its decentralized structure ensures a collaborative and transparent environment for autonomous trading agents, while enabling privacy via Edge nodes. + +### 5. Agent swarms + +**Description:** Develop coordinated groups of decentralized agents (swarms) that can collaborate to achieve complex goals, such as large-scale data aggregation, social media trend analysis, or interacting with one another in interesting ways, such as spawning new agents themselves. These agents could work together autonomously, leveraging distributed intelligence and dynamic task allocation. + +**Example use of DKG:** Use the DKG to enable seamless communication and synchronization between agents in the swarm, leveraging its advanced neuro-symbolic capabilities for context-aware decision-making and knowledge sharing.
For example, in Web3, agent swarms can analyze token distribution patterns, monitor governance proposal activity, and assess the health of decentralized autonomous organizations (DAOs) by interlinking blockchain activity with community metrics. On social media, swarms can track trending hashtags, analyze cross-platform content virality, and identify influencer collaborations. The DKG ensures these operations remain decentralized, transparent, and resistant to manipulation, providing a trustworthy framework for swarm-based intelligence. + +### 6. Meme agents + +**Description:** Build agents dedicated to promoting OriginTrail and the DKG through viral content such as memes. These agents could generate and share memes on platforms like Twitter and incentivize users based on engagement metrics.\ +\ +**Example use of DKG:** Utilize the DKG to store and interconnect the most popular memes with cultural and temporal trends, creating a decentralized, contextually enriched archive of viral content. The DKG's neuro-symbolic strengths facilitate the analysis of meme engagement patterns, enabling incentivization mechanisms rooted in verified, decentralized interaction metrics, promoting community-driven content creation and trust. + +### 7. Wildcard + +Build whatever you think is cool with agents and the OriginTrail DKG to impress the jury. Anything goes, from graph reasoning and agent swarms to token launching and custom Edge Node integrations for privately run, local agents. Review the judging criteria to guide you. + +## 🏆 Prizes + +The prizes are designed to help teams bring their agents to life and move them from MVPs to actual products on the DKG.
The rewards are structured as follows: + +* First Place: **$5,000 in total value** (50% in TRAC, 50% in USDC) +* Second Place: **$2,500 in total value** (50% in TRAC, 50% in USDC) +* Third Place: **$1,000 in total value** (50% in TRAC, 50% in USDC) + +{% hint style="info" %} +TRAC rewards are intended to cover the operational cost of your agent on the DKG and will be distributed to your agent's DKG funding contract. + +To ensure the funds are used to launch and support the agents, the winners’ wallet addresses will be publicly shared. This approach encourages transparency and underscores the commitment to turning ideas into actionable, impactful solutions. +{% endhint %} + +## 🧑‍⚖️ Judging criteria for the hackathon challenge + +### 1. Effective use of the DKG (30% weight) + +How well does the submission leverage the OriginTrail Decentralized Knowledge Graph? + +**Relevance:** Does the project demonstrate correct and meaningful use of the DKG? + +**Alignment:** Does the project generate a reasonably high number of knowledge assets on the DKG? + +**Depth:** Are agents contributing valuable, high-quality knowledge to the DKG? + +### 2. Innovation and creativity (20% weight) + +How novel or unique is the solution? + +**Novelty:** Does the submission introduce fresh ideas, methodologies, or applications? + +**Creativity:** Is the approach inventive or unexpected, particularly in agent behavior, workflows, or gamification? + +### 3. Integration and ecosystem impact (15% weight) + +How well does the project integrate with other tools, platforms, frameworks, or blockchain ecosystems? + +**Interoperability:** Does it showcase meaningful integration with other tools, frameworks, APIs, blockchain products, etc.? + +**Ecosystem Contribution:** Could this project strengthen the broader OriginTrail ecosystem? + +### 4. 
Usability and viability (35% weight)
+
+Does the submission present a compelling, practical agent, agentic swarm, or agentic product that has the potential to grow into its own product or living agent on the DKG? Is there a plan for how this agent can continue being useful post-hackathon?
+
+**Real-World Potential:** Is your agent or agentic product MVP live on DKG mainnet? Does it show traction? Can it grow into a valuable product?
+
+**Functionality:** Does it work effectively with minimal bugs or errors?
+
+**User Experience:** How easy, pluggable, and versatile is your agentic product to use? Can it tap into different agents’ memories and speak via Telegram, Discord, Twitter, and video at the same time? The more versatile, the better.
+
+## 💡 Engage with the community
+
+Feel free to get in touch with the OriginTrail community and core developers for any assistance through [Discord](https://discord.com/invite/xCaY7hvNwD). We’ve also opened a [dedicated channel exclusively for hackathon participants](https://discord.gg/XEd93jsTDB) where you can meet each other, look for team members if you need any, share ideas, and meet the OriginTrail team. (Applicants will automatically be added to the channel.)
+
+Have fun hacking, and we look forward to seeing you in Belgrade! diff --git a/docs/graveyard/everything/teleport-instructions-neuroweb.md b/docs/graveyard/everything/teleport-instructions-neuroweb.md new file mode 100644 index 0000000..80d5055 --- /dev/null +++ b/docs/graveyard/everything/teleport-instructions-neuroweb.md @@ -0,0 +1,97 @@
+---
+description: >-
+  The instructions on this page will guide you through the step-by-step process
+  of teleporting your TRAC tokens between the NeuroWeb blockchain and Ethereum
+  networks and vice versa.
+hidden: true
+---
+
+# Teleport instructions - NeuroWeb
+
+After the successful integration of the OriginTrail Decentralized Knowledge Graph (DKG) with NeuroWeb, the teleport interface has been launched to allow users to transfer TRAC tokens from Ethereum to NeuroWeb and vice versa. The specific details of the teleport can be found in the relevant OriginTrail RFCs:
+
+* [OT-RFC-12 on OriginTrail Parachain TRAC bridges](https://github.com/OriginTrail/OT-RFC-repository/blob/main/RFCs/OT-RFC-12%20OriginTrail%20Parachain%20TRAC%20bridges%20\(v2\).pdf)
+* [OT-RFC-16 on Parachain Bridges Implementation](https://github.com/OriginTrail/OT-RFC-repository/blob/main/RFCs/OT-RFC-16-Parachain-Bridges-Implementation/OT-RFC-16-Parachain-Bridges-Implementation.pdf)
+
+## Prerequisites:
+
+* A wallet holding the TRAC tokens you wish to teleport, on the Ethereum or NeuroWeb network. Make sure you own the private keys of the wallets that hold your TRAC tokens. Do not use exchange wallets or any other wallet whose private key you do not control.
+* Some amount of ETH or NEURO tokens, depending on the teleport direction, on the same wallet where you hold TRAC tokens, to pay for the teleport transactions.
+* You will need to connect your Ethereum wallet address with a NeuroWeb wallet address using [the mapping interface](https://neuroweb.ai/evm). This will create a mapping between the two addresses, which is necessary for the successful transfer of TRAC between the two networks.
+* The minimum amount of TRAC tokens for teleport is 1 TRAC.
+* The Metamask browser extension.
+
+{% hint style="info" %}
+We recommend that you also go through the [Teleport FAQ](https://teleport.origintrail.io/#faq) page for additional information on the teleport process.
+{% endhint %} + +## Step 1: + +In order to teleport your TRAC tokens between the two networks, go to [https://teleport.origintrail.io/](https://teleport.origintrail.io/) and navigate to the teleport interface by clicking on the **“Teleport TRAC tokens”** button. + +The interface will guide you through the “Get started” process. Please read all the instructions provided carefully to fully familiarize yourself with the teleport process. + +## Step 2: + +There are four steps required to transfer TRAC tokens from Ethereum to the NeuroWeb network and vice versa. + +### Step 2.1 - Choose the teleport direction: + +The first step is selecting the proper network in Metamask. This should be set based on the desired teleport direction between the two networks. + +If you are teleporting TRAC tokens from Ethereum to NeuroWeb, make sure that your selected network in Metamask is “Ethereum mainnet”. If you wish to teleport TRAC tokens from NeuroWeb to Ethereum, select the “NeuroWeb” network. + +{% hint style="info" %} +**Add the NeuroWeb Network to Metamask:** + +* In the "Add a network manually" section, enter the following details: + * **Network Name**: NeuroWeb Mainnet + * **New RPC URL**: https://astrosat-parachain-rpc.origin-trail.network/ + * **Chain ID**: 2043 + * **Currency Symbol**: NEURO + * **Block Explorer URL**: https://neuroweb.subscan.io/ +{% endhint %} + +When the desired network in Metamask is selected and recognized by the interface, select the teleport direction and proceed to the next step. + +{% hint style="warning" %} +Before proceeding to the next step, make sure that you have selected the proper wallet address with the TRAC tokens you wish to teleport. +{% endhint %} + +

*Choosing teleport direction*
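For wallets that add networks programmatically (for example via MetaMask's `wallet_addEthereumChain` RPC method), the chain ID from the hint above must be supplied as a hex string. Here is a sketch of the same parameters; note that the `decimals` value of 18 is an assumption, as wallets generally require 18 for an EVM chain's native currency:

```python
# NeuroWeb mainnet parameters from the hint above, shaped like the payload
# of a wallet "add network" request. decimals=18 is assumed here (wallets
# generally require 18 for an EVM chain's native currency).
neuroweb_mainnet = {
    "chainId": hex(2043),  # decimal 2043 -> "0x7fb"
    "chainName": "NeuroWeb Mainnet",
    "rpcUrls": ["https://astrosat-parachain-rpc.origin-trail.network/"],
    "nativeCurrency": {"name": "NEURO", "symbol": "NEURO", "decimals": 18},
    "blockExplorerUrls": ["https://neuroweb.subscan.io/"],
}
print(neuroweb_mainnet["chainId"])  # 0x7fb
```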

+ +### Step 2.2 - Connect your wallet: + +When you proceed to this step, the interface should automatically recognize the selected wallet address in Metamask and load it into the required field. + +Make sure you double-check that the selected wallet is the one you wish to teleport TRAC tokens from. Also, check that the “Current network” recognized by the interface is the correct one (depending on the desired teleport direction). + +

*Connect wallet address*

+ +After making sure that all values are correct, proceed to the next step. + +### Step 2.3 - Initiate the teleport process: + +During this step you will have to provide the desired TRAC amount that you wish to teleport from one network to another. The interface will check the TRAC balance on the selected wallet and display it. Enter the TRAC amount into the required input field and initiate the teleport process. + +After initiating it, Metamask will pop up and ask you to approve the transaction. By clicking “Approve” you authorize our contract to transfer your TRAC tokens. After that, you will proceed to the final initialization step. + +

*Initiating teleport process*
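As background for the “Approve” transaction in Step 2.3: ERC-20 amounts are denominated in the token's smallest unit, and TRAC is a standard 18-decimal ERC-20 token, so a human-readable amount is scaled as in this illustrative sketch:

```python
from decimal import Decimal

TRAC_DECIMALS = 18  # TRAC is a standard 18-decimal ERC-20 token

def to_base_units(amount: str) -> int:
    """Convert a human-readable TRAC amount to its smallest on-chain unit."""
    return int(Decimal(amount) * 10**TRAC_DECIMALS)

print(to_base_units("1"))    # 1000000000000000000
print(to_base_units("1.5"))  # 1500000000000000000
```

This is why the 1 TRAC teleport minimum appears on-chain as 10^18 base units.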

+ +{% hint style="info" %} +You are allowed to teleport different amounts of TRAC multiple times from the same wallet address. +{% endhint %} + +### Step 2.4 - Complete the teleport process: + +During the final step of the process, you will see the TRAC amount you are about to teleport. If the amount is correct, proceed by clicking the “Confirm teleport” button. After confirming, your Metamask will pop up and ask you to confirm the teleport finalization. Upon approval, TRAC tokens will be locked on the smart contract on Ethereum or NeuroWeb, depending on the teleport direction. + +{% hint style="warning" %} +Deposits are possible until the 10th of each month, closing at 15:00 UTC. + +Distributions of TRAC on the destination network will be performed by the 15th of each month (no more than 5 days after the finalization of deposits). +{% endhint %} + +## Need help? + +If you encounter any issues or have any additional questions regarding TRAC teleport, contact technical support at [tech@origin-trail.com](mailto:tech@origin-trail.com). diff --git a/docs/to-be-repositioned/ai-agents/README.md b/docs/to-be-repositioned/ai-agents/README.md new file mode 100644 index 0000000..a4bb766 --- /dev/null +++ b/docs/to-be-repositioned/ai-agents/README.md @@ -0,0 +1,19 @@ +--- +description: >- + Create, expand, and share your AI agents’ memories in a transparent and + verifiable way on the DKG +--- + +# DKG AI Agents + +AI agents can leverage the OriginTrail Decentralized Knowledge Graph (DKG) to create knowledge-graph-based, collective, persistent memory for individual agents or agentic swarms. This functionality enables advanced use cases like long-term interaction tracking, knowledge storage, and retrieval. + +## Get started + +There are several ways to create your DKG-enabled AI agent: + +* **Easiest**: Create an [ElizaOS DKG Agent](elizaos-dkg-agent.md) using the popular ElizaOS framework. 
+ * Full guide [here](elizaos-dkg-agent.md)
+* **Advanced:** Use one of the DKG SDKs to build your own custom agent.
+  * [Python SDK](custom-dkg-python-agent.md) agent-building guide
+  * [JavaScript SDK](custom-dkg-javascript-agent.md) agent-building guide (coming soon) diff --git a/docs/to-be-repositioned/ai-agents/custom-dkg-javascript-agent.md b/docs/to-be-repositioned/ai-agents/custom-dkg-javascript-agent.md new file mode 100644 index 0000000..9aa15e2 --- /dev/null +++ b/docs/to-be-repositioned/ai-agents/custom-dkg-javascript-agent.md @@ -0,0 +1,5 @@
+# Custom DKG JavaScript agent
+
+**Coming soon!**\
+\
+Until then, check out [our ElizaOS integration](https://github.com/OriginTrail/elizagraph) example, which is written in TypeScript. diff --git a/docs/to-be-repositioned/ai-agents/custom-dkg-python-agent.md b/docs/to-be-repositioned/ai-agents/custom-dkg-python-agent.md new file mode 100644 index 0000000..575f430 --- /dev/null +++ b/docs/to-be-repositioned/ai-agents/custom-dkg-python-agent.md @@ -0,0 +1,155 @@
+# Custom DKG Python agent
+
+This guide explains how to build a custom agent implementation using the [dkg.py](../../build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/dkg-v8-py-client/) SDK. AI agents can leverage the Decentralized Knowledge Graph (DKG) to create knowledge-graph-based, collective, persistent memory for individual agents or agentic swarms.
+
+## Overview of dkg.py
+
+dkg.py is a Python library for interacting with the DKG. It enables the creation, querying, and retrieval of structured knowledge assets stored in a decentralized and verifiable manner.
+
+### Key Operations:
+
+1. Create: Publish a knowledge asset to the DKG.
+2. Query: Search for knowledge assets using structured queries.
+3. Get: Retrieve a specific knowledge asset by its identifier.
+
+Set up DKG.py as per the instructions [here](../../build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/dkg-v8-py-client/).
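The create and get operations are tied together by the Universal Asset Locator (UAL) that a create operation returns. As a rough sketch of its shape (the contract address and token ID below are made-up placeholders for illustration):

```python
# A UAL identifies a knowledge asset as did:dkg:<blockchain>/<contract>/<token-id>.
# The contract address and token ID below are made-up placeholders.
def parse_ual(ual: str) -> dict:
    prefix, sep, rest = ual.partition("did:dkg:")
    if prefix or not sep:
        raise ValueError("not a UAL")
    blockchain, contract, token_id = rest.rsplit("/", 2)
    return {"blockchain": blockchain, "contract": contract, "token_id": token_id}

ual = "did:dkg:base:84532/0x0000000000000000000000000000000000000000/123"
print(parse_ual(ual)["blockchain"])  # base:84532
```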
+
+## **Use Case: AI Agent Memory**
+
+### Creating Memory
+
+AI agents can store structured knowledge or memory by creating knowledge assets in the DKG. For example, an agent can record user interactions or task results:
+
+```python
+memory_asset = {
+    "@context": "http://schema.org",
+    "@type": "SocialMediaPosting",
+    "headline": "",
+    "articleBody": "Check out this amazing project on decentralized cloud networks! @DecentralCloud #Blockchain #Web3",
+    "author": {
+        "@type": "Person",
+        "@id": "uuid:john:doe",
+        "name": "John Doe",
+        "identifier": "@JohnDoe",
+        "url": "https://twitter.com/JohnDoe",
+    },
+    "dateCreated": "yyyy-mm-ddTHH:mm:ssZ",
+    "interactionStatistic": [
+        {
+            "@type": "InteractionCounter",
+            "interactionType": {"@type": "LikeAction"},
+            "userInteractionCount": 150,
+        },
+        {
+            "@type": "InteractionCounter",
+            "interactionType": {"@type": "ShareAction"},
+            "userInteractionCount": 45,
+        },
+    ],
+    "mentions": [
+        {
+            "@type": "Person",
+            "name": "Twitter account mentioned name goes here",
+            "identifier": "@TwitterAccount",
+            "url": "https://twitter.com/TwitterAccount",
+        },
+    ],
+    "keywords": [
+        {"@type": "Text", "@id": "uuid:keyword1", "name": "keyword1"},
+        {"@type": "Text", "@id": "uuid:keyword2", "name": "keyword2"},
+    ],
+    "about": [
+        {
+            "@type": "Thing",
+            "@id": "uuid:thing1",
+            "name": "Blockchain",
+            "url": "https://en.wikipedia.org/wiki/Blockchain",
+        },
+        {
+            "@type": "Thing",
+            "@id": "uuid:thing2",
+            "name": "Web3",
+            "url": "https://en.wikipedia.org/wiki/Web3",
+        },
+        {
+            "@type": "Thing",
+            "@id": "uuid:thing3",
+            "name": "Decentralized Cloud",
+            "url": "https://example.com/DecentralizedCloud",
+        },
+    ],
+    "url": "https://twitter.com/JohnDoe/status/1234567890",
+}
+
+response = dkg.asset.create(memory_asset)
+print("Memory Asset UAL:", response["UAL"])
+```
+
+### Querying Memory
+
+Retrieve specific memories using queries based on metadata or content:
+
+```python
+query = """
+SELECT DISTINCT ?headline ?articleBody
+WHERE {
+    ?s a <http://schema.org/SocialMediaPosting> .
+    ?s <http://schema.org/headline> ?headline .
+    ?s <http://schema.org/articleBody> ?articleBody .
+
+    OPTIONAL {
+        ?s <http://schema.org/keywords> ?keyword .
+        ?keyword <http://schema.org/name> ?keywordName .
+    }
+
+    OPTIONAL {
+        ?s <http://schema.org/about> ?about .
+        ?about <http://schema.org/name> ?aboutName .
+    }
+
+    FILTER(
+        CONTAINS(LCASE(?headline), "example_keyword") ||
+        (BOUND(?keywordName) && CONTAINS(LCASE(?keywordName), "example_keyword")) ||
+        (BOUND(?aboutName) && CONTAINS(LCASE(?aboutName), "example_keyword"))
+    )
+}
+LIMIT 10
+"""
+
+results = dkg.graph.query(query)
+print("Retrieved Memories:", results)
+```
+
+### Retrieving Specific Memories
+
+Use the get operation to fetch detailed information about a memory, identified by the UAL returned when it was created:
+
+```python
+memory = dkg.asset.get(response["UAL"])
+print("Memory Details:", memory)
+```
+
+## Conclusion
+
+Using dkg.py, AI agents can create persistent, verifiable, and decentralized memory. This functionality enables advanced use cases like long-term interaction tracking, knowledge storage, and retrieval. Start building smarter AI agents with dkg.py today!
+
+For further assistance, refer to the rest of the documentation or contact the core development team. diff --git a/docs/to-be-repositioned/ai-agents/elizaos-dkg-agent.md b/docs/to-be-repositioned/ai-agents/elizaos-dkg-agent.md new file mode 100644 index 0000000..2365f7a --- /dev/null +++ b/docs/to-be-repositioned/ai-agents/elizaos-dkg-agent.md @@ -0,0 +1,119 @@
+# ElizaOS DKG agent
+
+ElizaOS is a popular open-source AI agent framework that supports a native DKG integration through the DKG plugin. You can now build **AI agents that store their memories** on the Decentralized Knowledge Graph (DKG). This setup enables agents to **share memories** (creating a **collective memory**) and manage their knowledge **transparently and verifiably.**
+
+You can watch the following video to learn how to set up an ElizaOS DKG agent, or follow the text instructions below.
+
+ElizaOS is operational on **Mac** and **Ubuntu** devices; Windows is not yet supported.
+
+## Video tutorial (\~1h 15mins)
+
+{% embed url="https://youtu.be/w3-_WBH3uSQ?si=5g3WY2G-HEPu0Kt9" %}
+Video: How to build an AI agent with ElizaOS DKG Plugin
+{% endembed %}
+
+## Classic instructions
+
+You can use the OriginTrail [Elizagraph starter kit](https://github.com/OriginTrail/elizagraph), which includes the plugin, or use the [official ElizaOS repo](https://github.com/elizaOS/eliza), where the plugin has already been integrated. The following instructions assume you are using the **Elizagraph starter kit**.
+
+### **Prerequisites - make sure you have them all set up**
+
+{% hint style="info" %}
+[Python 2.7+](https://www.python.org/downloads/)
+
+[Node.js 23+](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm/)
+
+[pnpm](https://pnpm.io/installation)
+{% endhint %}
+
+## **Step-by-step**
+
+### **1. Clone the Elizagraph repository**
+
+```bash
+git clone https://github.com/OriginTrail/elizagraph.git
+cd elizagraph
+```
+
+### **2. Install dependencies and build the project**
+
+```bash
+pnpm install
+pnpm run build
+```
+
+### **3. Set up environment variables**
+
+Copy the `.env.example` file (from the Elizagraph folder in the cloned repository), rename it to `.env`, and fill in the necessary details. The following instructions will help you populate the `.env` file.
+
+1. **Node information**
+
+If you do not already have a DKG node set up, you can use a public node that the OriginTrail team set up so that everyone has an easy way to interact with the DKG.
+
+There's one public node available for mainnet ([https://positron.origin-trail.network](https://positron.origin-trail.network/)) and one for testnet ([https://v6-pegasus-node-02.origin-trail.network](https://v6-pegasus-node-02.origin-trail.network)). All blockchains are supported on each of the nodes.
+
+Alternatively, you could also set up and connect to your own Edge Node for further features like a graph & chatbot interface, customizable knowledge mining pipelines, private knowledge assets, and more. Check out Edge Nodes [here](../../graveyard/everything/dkg-edge-node/).
+
+{% hint style="info" %}
+**Mainnet** is the live blockchain for real transactions, while **testnet** is a risk-free testing environment.
+{% endhint %}
+
+{% hint style="info" %}
+Here's a video example of [how to set up a MetaMask wallet](https://youtu.be/-HTubEJ61zU?si=tUcacxeluIMRFp6q).
+{% endhint %}
+
+* `ENVIRONMENT`: Define your environment. For example, use `development` if you're running a local setup, `testnet` for a testnet setup, or `mainnet` for a mainnet setup.
+* `OT_NODE_HOSTNAME`: Enter the hostname or IP address of your OriginTrail DKG node. This will be the URL of the node you set up, or `http://localhost` if you're running it locally.
+* `OT_NODE_PORT`: The port used by the DKG node, typically `8900`.
+* `PUBLIC_KEY`: The public key of the wallet you will use to publish Knowledge Assets.
+* `PRIVATE_KEY`: The private key corresponding to the above wallet. Ensure you keep this secure and never share it outside of the `.env` file.
+* `BLOCKCHAIN_NAME`: Specify the blockchain network you’re using: `otp:2043` (NeuroWeb mainnet), `base:8453` (Base mainnet), `gnosis:100` (Gnosis mainnet), `otp:20430` (NeuroWeb testnet), `base:84532` (Base testnet), or `gnosis:10200` (Gnosis testnet).
+
+To fund your wallet on testnet, feel free to use the [Faucet](../../dkg-knowledge-hub/useful-resources/test-token-faucet.md) in the [OriginTrail Discord](https://discord.gg/xCaY7hvNwD). There's a message pinned in the **#faucet-bot** channel in case some of the faucets are down. In that case, feel free to ping the core team to send you some testnet funds manually.
+
+{% hint style="info" %}
+If you are building your agent on NeuroWeb, you need to get NEURO first and then TRAC.
+{% endhint %}
+
+2. **LLM key**
+
+Eliza supports a wide range of LLM providers. To use one of them, create an API key and paste it into the environment file.
+
+{% hint style="info" %}
+Here’s how to get an [OpenAI API key](https://www.youtube.com/watch?v=3BrmNZoPzHA).
+{% endhint %}
+
+For example, if you want to use an OpenAI LLM, populate the `OPENAI_API_KEY` variable in the `.env` file.
+
+```properties
+OPENAI_API_KEY= # obtain the key on https://platform.openai.com/
+```
+
+3. **X credentials (in case you want to use X)**
+
+Eliza uses a basic X authentication setup. Use your username, password, and email so that the application can post on the platform. If you encounter issues, we recommend using the `TWITTER_COOKIES` variable and copying the cookies from your browser.
+
+### **4. Customize DKG Knowledge Asset & query templates**
+
+If you wish to do so, modify the templates in `plugin-dkg/constants.ts` to change the format in which your data is stored or queried.
+
+Additionally, you can modify the interactions with the DKG in the Eliza providers, actions, and evaluators by calling different DKG.js functions or modifying the parameters. For example, you could use a paranet for your agent. Check out the SDK docs [here](../../build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/).
+
+### **5. Run the character**
+
+First, create a character in the `characters` folder, for example `chatdkg.character.json`.
+
+You can look at existing examples in the `characters` folder for inspiration.
+
+```bash
+pnpm start --characters="characters/chatdkg.character.json"
+```
+
+{% hint style="warning" %}
+#### Notes
+
+* There is no need to manually add `plugin-dkg` to the `plugins` array; it will load automatically because it is included in the `agent/src/index.ts` file.
+* Ensure you configure the X client (if you wish to use X) and select your LLM provider in the character settings.
+{% endhint %}
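For reference, a minimal character file might look like the sketch below. The field values are illustrative assumptions, and the exact schema should be taken from the examples in the `characters` folder:

```json
{
  "name": "ChatDKG",
  "modelProvider": "openai",
  "clients": [],
  "plugins": [],
  "bio": ["An agent that publishes its memories as knowledge assets on the OriginTrail DKG."],
  "style": { "all": ["concise", "helpful"] }
}
```

Adjust the `clients` array and `modelProvider` to match the integrations and LLM provider you configured in your `.env` file.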