DeAI: decentralized artificial intelligence
  • Introduction
    • General Terminology
  • Landscape
    • Data Providers
    • Computing Power
    • Model Training Task
    • Challenges
  • Privacy Preservation
    • Data Process
    • Privacy Preserved Training
    • Federated Learning
    • Cryptographic Computation
      • Homomorphic encryption
      • Multi-Party Computation
      • Trusted Execution Environment
    • Challenges
  • Security
    • Data Poisoning
    • Model Poisoning
    • Sybil Attacks
    • Impact of Large Models
    • Responsibility
  • Incentive mechanisms
    • Problem Formulation
    • Contribution Evaluation
    • Copyright
  • Verification of Computation
    • Computation on Smart Contract
    • Zero-Knowledge Proof
    • Blockchain Audit
    • Consensus Protocol
  • Network Scalability
    • Local Updating
    • Cryptography Protocol
    • Distribution Topology
    • Compression
    • Parameter-Efficient Fine Tuning
  • Conclusion

Trusted Execution Environment

A Trusted Execution Environment (TEE) provides an isolated execution environment that guarantees code authenticity, runtime state integrity, and data confidentiality. Intel SGX is among the most widely studied TEE solutions; it provides a hardware mechanism for creating protected containers called enclaves. Current TEE solutions, however, are poorly suited to deep learning workloads: they incur significant overhead on memory-intensive tasks, offer only limited protected memory (e.g., 128 MB by default in Intel SGX), and support a restricted set of CPU instructions with no way to leverage GPUs. Efforts have therefore been made to offload the computationally intensive layers of a deep learning model to the GPU while preserving integrity and confidentiality inside the enclave. Despite these efforts, the complexity of TEE implementations has led to a number of discovered attacks on TEEs.
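Concretely, this offloading can be illustrated with the blinding-and-verification pattern of Slalom (Tramèr and Boneh, 2019): the enclave additively masks its private activations before handing a linear layer to the untrusted GPU, unblinds the returned result, and spot-checks it with Freivalds' randomized verification. The NumPy sketch below simulates the enclave/GPU split in a single process; all names and dimensions are illustrative, not taken from any particular system.

```python
# Minimal sketch of Slalom-style offloading (illustrative only).
# NumPy stands in for both the enclave runtime and the untrusted GPU.
import numpy as np

rng = np.random.default_rng(0)

n_in, n_out, batch = 512, 256, 64
W = rng.standard_normal((n_in, n_out))    # public layer weights
X = rng.standard_normal((batch, n_in))    # private activations (enclave-only)

# --- inside the enclave: blind the private inputs ---
R = rng.standard_normal((batch, n_in))    # one-time random masks
X_blinded = X + R                         # the GPU never sees X in the clear
R_W = R @ W                               # precomputed offline in the enclave

# --- untrusted GPU: computes on blinded data only ---
Y_blinded = X_blinded @ W                 # stands in for the GPU matmul

# --- back inside the enclave: verify, then unblind ---
# Freivalds' check: one random projection instead of redoing the full
# matmul, so verification is much cheaper than the offloaded computation.
s = rng.standard_normal(n_out)
assert np.allclose(Y_blinded @ s, X_blinded @ (W @ s)), "GPU result rejected"

Y = Y_blinded - R_W                       # recover X @ W inside the enclave
print(np.allclose(Y, X @ W))              # True
```

Note that Slalom itself works over a finite field with quantized weights so that the one-time masks are information-theoretically hiding; the floating-point masking above is only meant to convey the structure of the protocol.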
