DeAI: decentralized artificial intelligence

Computing Power

Decentralized computing power refers to computing nodes that are owned by different entities and coordinated over the internet. Because large models consume extensive data, they demand high network bandwidth, which poses a significant challenge. Moreover, large models are typically trained across multiple AI accelerators. While Nvidia's NVLink interconnect moves data between chips at up to 900 GB/s (on the H100, far beyond the ~64 GB/s of a PCIe 5.0 x16 link), decentralized nodes connected over the internet are limited to 100 Gbps or less, and these links are often unreliable. Bandwidth is therefore a critical bottleneck in the era of large models.
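
To make the gap concrete, here is a minimal back-of-the-envelope sketch (not from the original text): it estimates how long a single full gradient exchange between two nodes would take over each kind of link. The model size and bandwidth figures are illustrative assumptions, not measurements.

```python
# Back-of-the-envelope estimate: time for one full gradient exchange
# between two nodes, for a hypothetical 70B-parameter model with
# fp16 gradients (2 bytes per parameter, ~140 GB per exchange).
# All model and link figures below are illustrative assumptions.

GRAD_BYTES = 70e9 * 2  # ~140 GB of gradients per exchange

# Link bandwidths in GB/s (1 Gbps = 0.125 GB/s)
links = {
    "NVLink (H100, ~900 GB/s)": 900.0,
    "PCIe 5.0 x16 (~64 GB/s)": 64.0,
    "100 Gbps internet link": 12.5,
    "1 Gbps internet link": 0.125,
}

for name, gb_per_s in links.items():
    seconds = GRAD_BYTES / (gb_per_s * 1e9)
    print(f"{name:<28} {seconds:>10,.2f} s per exchange")
```

Even at a generous 100 Gbps, each synchronization step takes on the order of ten seconds rather than a fraction of a second, and at consumer-grade bandwidth it takes minutes. This is why later sections turn to techniques such as local updating, compression, and parameter-efficient fine-tuning.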
