Network Scalability

Cryptography Protocol

Most secret-sharing protocols require multiple rounds of peer-to-peer communication. This poses a serious challenge for models containing billions of parameters: completing the required communication in a reasonable timeframe becomes impractical.
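To make the scale concrete, the back-of-envelope sketch below estimates the traffic of naively secret-sharing every parameter of a large model among a handful of parties over several protocol rounds. All figures (party count, round count, share size) are illustrative assumptions, not numbers from the text.

```python
# Back-of-envelope estimate (illustrative assumptions only):
# naively secret-sharing every parameter of a large model among
# n parties, with each of r protocol rounds sending one share per
# parameter from every party to every other party.

def mpc_traffic_bytes(num_params: int, n_parties: int, rounds: int,
                      bytes_per_share: int = 8) -> int:
    """Total bytes moved across the network for a full pass."""
    # Each party sends one share per parameter to each of the other
    # (n_parties - 1) peers, once per round.
    per_round = num_params * bytes_per_share * n_parties * (n_parties - 1)
    return per_round * rounds

if __name__ == "__main__":
    total = mpc_traffic_bytes(num_params=7_000_000_000,  # 7B-parameter model
                              n_parties=4, rounds=10)
    print(f"~{total / 1e12:.1f} TB of traffic")  # ~6.7 TB for this toy setup
```

Even with these modest toy parameters, a single pass already moves terabytes of data, which is why multi-round protocols break down at billion-parameter scale.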

To address this issue, protocol-level optimization is essential, notably for garbled-circuit MPC protocols \cite{beerliova2008perfectly, maestre2009distributed}. A key strategy is to design compilers that emit fewer garbled-circuit gates, shrinking the data that must be transmitted and easing the burden on network communication. To date, however, no such method has proven practical for large models.
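The following sketch shows why gate count translates directly into communication cost. It assumes the well-known free-XOR and half-gates optimizations from the garbled-circuit literature, under which XOR gates transmit nothing and each AND gate costs two ciphertexts; the gate counts below are made up for illustration.

```python
# Rough model of garbled-circuit transmission size (illustrative sketch).
# Under the free-XOR technique XOR gates transmit nothing, and under the
# half-gates construction each AND gate costs two ciphertexts, so a
# gate-reducing compiler directly shrinks the bytes on the wire.

KAPPA_BYTES = 16  # 128-bit security parameter -> 16-byte ciphertexts

def garbled_circuit_bytes(and_gates: int, xor_gates: int) -> int:
    """Bytes the garbler must send for one evaluation of the circuit."""
    return and_gates * 2 * KAPPA_BYTES + xor_gates * 0  # XOR gates are free

# A compiler that rewrites the circuit to halve the AND-gate count
# (gate counts are made-up numbers) halves the garbled-circuit traffic,
# even if it introduces extra "free" XOR gates in the process:
before = garbled_circuit_bytes(and_gates=10**9, xor_gates=3 * 10**9)
after = garbled_circuit_bytes(and_gates=5 * 10**8, xor_gates=4 * 10**9)
print(f"before: {before / 1e9:.0f} GB, after: {after / 1e9:.0f} GB")
```

This is why compiler work targets the non-XOR gate count specifically: trading AND gates for XOR gates is a net win for communication even when the total gate count grows.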
