Tether unveils mobile-friendly AI training platform

Stablecoin issuer launches a framework allowing large AI models to run on smartphones and non-NVIDIA GPUs, reducing hardware and memory barriers for developers.

Tether has launched an AI framework that runs large language models on smartphones and non-NVIDIA GPUs. The system is part of its QVAC platform and combines Microsoft’s BitNet architecture with LoRA (low-rank adaptation) techniques to reduce memory and computational requirements.
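To illustrate why LoRA cuts memory requirements, the sketch below (not Tether's code; the hidden size and rank are assumed values) compares the number of trainable parameters in full fine-tuning of one weight matrix versus a LoRA adapter of the same shape:

```python
# Illustrative sketch (assumed dimensions, not Tether's implementation):
# LoRA freezes the base weight matrix W and trains only two small
# low-rank matrices A and B, so far fewer parameters need gradients.

def full_finetune_params(d_in: int, d_out: int) -> int:
    # Full fine-tuning updates the entire d_out x d_in weight matrix.
    return d_in * d_out

def lora_params(d_in: int, d_out: int, rank: int) -> int:
    # LoRA trains A (rank x d_in) and B (d_out x rank);
    # inference uses the frozen base weight as W + B @ A.
    return rank * d_in + d_out * rank

if __name__ == "__main__":
    d = 4096  # assumed hidden size for a ~1B-parameter-class transformer layer
    r = 16    # assumed LoRA rank
    full = full_finetune_params(d, d)
    lora = lora_params(d, d, r)
    print(f"full fine-tune: {full:,} trainable params")   # 16,777,216
    print(f"LoRA adapter:   {lora:,} trainable params")   # 131,072
    print(f"reduction: {100 * (1 - lora / full):.1f}%")   # 99.2%
```

The ratio scales with the rank: smaller ranks shrink the trainable footprint further, which is what makes fine-tuning feasible on phone-class memory budgets.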

The framework enables cross-platform training on AMD, Intel, Apple Silicon, and mobile GPUs, allowing models with up to 1 billion parameters to be fine-tuned on phones in under 2 hours.

Larger models with up to 13 billion parameters are also supported on mobile devices. BitNet’s 1-bit architecture reduces VRAM requirements by nearly 78%, enabling larger models to run on limited hardware.
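The VRAM saving follows from simple arithmetic: weight memory scales with bits per parameter. The sketch below uses assumed bit widths (fp16 baseline, 4-bit quantization, and a BitNet-style ~1.58-bit ternary format) to show the effect; the figures are illustrative, not Tether's published numbers:

```python
# Illustrative sketch (assumed bit widths, not Tether's published figures):
# weight-memory footprint = parameters * bits-per-parameter / 8 bytes.

def weight_memory_gb(params_billions: float, bits_per_param: float) -> float:
    # Convert parameter count and bit width to gigabytes (1 GB = 1e9 bytes).
    return params_billions * 1e9 * bits_per_param / 8 / 1e9

if __name__ == "__main__":
    for bits in (16, 4, 1.58):  # fp16, 4-bit quantized, BitNet-style ternary
        gb = weight_memory_gb(13, bits)
        print(f"13B model at {bits} bits/param ~ {gb:.2f} GB of weights")
```

Going from 16-bit to roughly 1.58-bit weights drops a 13B model's weight footprint from about 26 GB to under 3 GB, which is why low-bit formats bring such models within reach of mobile hardware.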

The framework also improves inference performance: mobile GPUs outperform their CPUs, making on-device training and federated learning practical. By reducing reliance on cloud infrastructure, the system gives developers more flexibility in distributed environments.
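Federated learning typically follows the federated-averaging pattern: devices train locally and a coordinator averages only their model weights, never collecting raw data. A minimal sketch of that aggregation step (the client values and function names are assumptions for illustration):

```python
# Minimal sketch of federated averaging (FedAvg), the standard pattern
# behind federated learning; client data and names are assumed examples.

from typing import List

def federated_average(client_weights: List[List[float]]) -> List[float]:
    # Each client trains on-device and uploads only its weight vector;
    # the coordinator averages them element-wise without seeing raw data.
    n_clients = len(client_weights)
    return [sum(ws) / n_clients for ws in zip(*client_weights)]

if __name__ == "__main__":
    clients = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
    print(federated_average(clients))  # [3.0, 4.0]
```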

Tether’s expansion into AI mirrors a broader trend in the crypto sector, where companies are investing in AI infrastructure, autonomous agents, and high-performance computing.

Industry activity includes record revenue growth for AI and HPC operations, blockchain-integrated AI agents, and new tools for secure on-chain transactions.