DeAI: decentralized artificial intelligence
  • Introduction
    • General Terminology
  • Landscape
    • Data Providers
    • Computing Power
    • Model Training Task
    • Challenges
  • Privacy Preservation
    • Data Process
    • Privacy Preserved Training
    • Federated Learning
    • Cryptographic Computation
      • Homomorphic encryption
      • Multi-Party Computation
      • Trusted Execution Environment
    • Challenges
  • Security
    • Data Poisoning
    • Model Poisoning
    • Sybil Attacks
    • Impact of Large Models
    • Responsibility
  • Incentive mechanisms
    • Problem Formulation
    • Contribution Evaluation
    • Copyright
  • Verification of Computation
    • Computation on Smart Contract
    • Zero-Knowledge Proof
    • Blockchain Audit
    • Consensus Protocol
  • Network Scalability
    • Local Updating
    • Cryptography Protocol
    • Distribution Topology
    • Compression
    • Parameter-Efficient Fine Tuning
  • Conclusion

Network Scalability

DeAI leverages internet infrastructure to facilitate communication among diverse parties. This communication layer is fundamental to Federated Learning (FL) and Multi-Party Computation (MPC) within the DeAI paradigm: both rely heavily on protocols that require multiple rounds of interaction among participating nodes.
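
To make the round-based structure concrete, below is a minimal sketch of one such protocol in the FedAvg style: a coordinating server broadcasts the current model, each node replies with a locally computed update, and the server aggregates. The toy linear model, the `local_gradient` helper, and all parameter values are illustrative assumptions, not part of any particular DeAI system.

```python
import numpy as np

# Toy setup (all values illustrative): four nodes each hold a private
# dataset; a coordinating server drives the communication rounds.
rng = np.random.default_rng(0)
n_nodes, dim, rounds = 4, 10, 5
node_data = [(rng.normal(size=(50, dim)), rng.normal(size=50))
             for _ in range(n_nodes)]

def local_gradient(w, X, y):
    # Hypothetical helper: one gradient of a least-squares objective,
    # standing in for whatever each node computes locally.
    return 2 * X.T @ (X @ w - y) / len(y)

w = np.zeros(dim)  # global model parameters held by the server
for r in range(rounds):
    # One communication round: broadcast w, collect one update per node,
    # aggregate by simple averaging, and apply the result.
    updates = [local_gradient(w, X, y) for X, y in node_data]
    w -= 0.01 * np.mean(updates, axis=0)
```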

Efficient network communication is critical to the success of FL and MPC protocols in DeAI settings \cite{li2020federated}. However, the overhead of transmitting checkpoints and model updates can significantly degrade overall efficiency. Well-designed computation protocols minimize the number of communication rounds required between nodes, enhancing the efficiency and scalability of DeAI systems.
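
As an illustration of two of the approaches named in this section's outline, the sketch below combines local updating (several local optimization steps per round, so fewer rounds are needed) with compression via top-k sparsification (only the k largest-magnitude entries of each update are transmitted). The `local_update` and `top_k` helpers and the toy linear model are hypothetical, chosen only to show the mechanics.

```python
import numpy as np

def local_update(w_global, X, y, steps=10, lr=0.01):
    # Local updating: run several optimization steps before communicating,
    # so the protocol needs fewer communication rounds overall.
    w = w_global.copy()
    for _ in range(steps):
        w -= lr * 2 * X.T @ (X @ w - y) / len(y)
    return w - w_global  # only the accumulated delta is sent

def top_k(update, k):
    # Top-k sparsification: keep the k largest-magnitude entries and drop
    # the rest, shrinking each message from dim floats to roughly 2k
    # numbers (indices plus values).
    idx = np.argpartition(np.abs(update), -k)[-k:]
    sparse = np.zeros_like(update)
    sparse[idx] = update[idx]
    return sparse

# Illustrative usage on synthetic data.
rng = np.random.default_rng(1)
X, y = rng.normal(size=(50, 10)), rng.normal(size=50)
delta = top_k(local_update(np.zeros(10), X, y), k=2)
```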
