Acurast recently launched a worldwide network of 225,000 smartphones that operate as secure compute nodes on Base, a milestone for decentralized artificial intelligence infrastructure. The rollout introduces a new paradigm for executing AI workloads onchain without relying on centralized cloud providers.
The initiative brings confidential, verifiable AI execution to mainstream Web3 development by turning ordinary mobile devices into computing resources, reducing dependence on traditional data centers.
Turning Smartphones Into a Global Compute Cloud
Central to the network is a simple idea: billions of smartphones already ship with powerful processors and hardware-based security. Acurast pools this unused capacity into a decentralized compute layer, enabling developers to run AI inference jobs on a geographically distributed network spanning more than 140 countries.
This approach contrasts with traditional cloud infrastructure, where workloads are concentrated in massive server farms. The architecture improves scalability, reduces single points of failure, and limits the exposure to censorship and data breaches inherent in centralized systems.
The network builds on Base, Ethereum scaling infrastructure, to run decentralized applications that require AI-driven processing faster and at lower cost.
Privacy Through Trusted Execution Environments
A key technical element enabling this model is the use of Trusted Execution Environments (TEEs) built into contemporary smartphones. These secure enclaves isolate sensitive computations so that neither the device owner nor outside parties can access the data being processed.
This architecture is especially useful for AI applications handling financial information, identity authentication, or enterprise workflows where privacy is a high priority. By combining blockchain verification with hardware-level protection, the system delivers both transparency and privacy, two properties traditionally considered incompatible.
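The general TEE pattern described above can be sketched in a few lines: a client checks a signed measurement (attestation) of the code running in the enclave before releasing any sensitive data to it. This is an illustrative sketch only, not Acurast's actual API; the key, function names, and HMAC-based signing below are stand-ins for hardware-backed attestation.

```python
import hashlib
import hmac
import os

# Hypothetical shared secret standing in for a device's hardware root of trust.
ATTESTATION_KEY = b"demo-hardware-root-of-trust"

def attest(enclave_measurement: bytes) -> bytes:
    """Device side: produce a signed 'quote' over the enclave's code measurement."""
    return hmac.new(ATTESTATION_KEY, enclave_measurement, hashlib.sha256).digest()

def verify_and_dispatch(measurement: bytes, quote: bytes,
                        expected: bytes, payload: bytes) -> str:
    """Client side: release the payload only if the enclave runs the expected code."""
    recomputed = hmac.new(ATTESTATION_KEY, measurement, hashlib.sha256).digest()
    if not hmac.compare_digest(quote, recomputed) or measurement != expected:
        raise PermissionError("attestation failed; refusing to send sensitive data")
    return f"dispatched {len(payload)} encrypted bytes"

# An approved inference workload is identified by a hash of its code.
measurement = hashlib.sha256(b"approved-inference-model-v1").digest()
quote = attest(measurement)
print(verify_and_dispatch(measurement, quote, measurement, os.urandom(32)))
```

In a real deployment the quote would be signed with an asymmetric key rooted in the device's secure hardware, but the control flow, verify before dispatch, is the same.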
Enabling Autonomous, Pay-Per-Use AI Services With Acurast
The integration also adds native payment capabilities, allowing AI agents and decentralized applications to pay programmatically for compute resources. This supports a usage-based model in which services compensate the devices providing processing power in real time.
This kind of functionality can enable machine-to-machine economies, in which autonomous software agents request data analysis, execute logic, and settle payments without human intervention. The result is an infrastructure layer designed for self-operating applications that interface directly with onchain services and APIs.
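The pay-per-use flow described above can be sketched as a simple metered exchange: an agent's wallet is debited per unit of compute, and the executing device is credited the moment the job runs. This is a hypothetical illustration of the billing model, not Acurast's payment contract; all class and field names are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Wallet:
    balance: int  # in smallest token units

    def transfer(self, to: "Wallet", amount: int) -> None:
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount
        to.balance += amount

@dataclass
class ComputeDevice:
    wallet: Wallet
    price_per_unit: int  # tokens charged per unit of compute

    def run_job(self, payer: Wallet, units: int) -> dict:
        cost = units * self.price_per_unit
        payer.transfer(self.wallet, cost)  # pay-as-you-go settlement
        return {"units": units, "cost": cost}

agent = Wallet(balance=1_000)
phone = ComputeDevice(wallet=Wallet(balance=0), price_per_unit=5)
receipt = phone.run_job(agent, units=20)  # agent pays 20 * 5 = 100 tokens
print(receipt, agent.balance, phone.wallet.balance)
```

Onchain, the `transfer` step would be a token transfer settled on Base, which is what makes the compensation programmatic and trust-minimized rather than invoiced after the fact.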
Developers Gain Access to Decentralized AI Tooling
Acurast lets builders join the network through its management interface, deploy AI agents using familiar blockchain wallets, and onboard their own devices.
Jesse Pollak says the partnership opens new opportunities for developers aiming to bring fully onchain real-world applications to life, pairing scalable blockchain infrastructure with decentralized compute resources.
Alessandro De Carli, founder of Acurast, stressed that AI systems managing digital assets cannot be trusted if they rely solely on centralized servers. Shifting computation to user-sovereign devices reinforces, and rewards participation in, the decentralized ethos of Web3.
Laying the Groundwork for Confidential Onchain AI
The launch is not merely an infrastructure upgrade; it marks a shift toward a decentralized physical infrastructure network capable of powering next-generation applications. By combining mobile hardware, blockchain settlement, and privacy-preserving computation, Acurast positions itself at the intersection of DePIN, AI, and autonomous digital services.
With widespread adoption, this smartphone-powered network could change how AI workloads are executed, replacing reliance on centralized clouds with a globally distributed fabric owned and controlled by users themselves.


