Composability of Computing Environments in AO

By Rohit Pathare
March 6, 2024
5 min read

After months of extensive foundational work, Arweave redefined decentralized computing with the launch of ao. With ao, you have the power of hyper parallel computing at your fingertips. Compute verifiably at scale, from your terminal. And with composability built into the system, you have the freedom to choose your computing environment.

Why composability matters

ao’s vision is to bring compute to you. No more protocol-enforced limitations. Have an idea? You can realize it with ao.

ao's modular architecture means your ideas can be brought to life with ease. Its composable nature empowers you to customize your computing environment to suit your needs perfectly. Whether it's opting for EVM bytecode over WASM, crafting a custom interpreter for Solidity, or scaling your processes for more throughput, ao is capable of adapting to your requirements. Even the processes' logic can be tweaked on the fly, as needed.
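
To illustrate that last point, here is a hedged sketch of how a running aos process's logic might be patched from the aos console. It assumes the standard Send global, the default Eval handler that evaluates owner-signed Lua, and that re-adding a handler under an existing name replaces it; the process ID and handler name are placeholders:

```lua
-- Hedged sketch: assumes the default aos process evaluates owner-signed
-- "Eval" messages and that the Send global is available in the console.
Send({
  Target = "PROCESS_ID", -- placeholder for the process being patched
  Action = "Eval",
  Data = [[
    -- Re-register the "greet" handler on the running process with new logic
    Handlers.add("greet",
      Handlers.utils.hasMatchingTag("Action", "Greet"),
      function(msg)
        ao.send({ Target = msg.From, Data = "Hello from the updated logic!" })
      end)
  ]]
})
```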

Understanding ao’s architecture

Grasping the potential for customization begins with understanding ao's architecture, which mirrors traditional computer systems in many ways:

[Infographic: the architecture of ao]
  • ao represents a virtual computer that exists on the Arweave network, consisting of units that work together to execute tasks. These units are akin to the components of traditional computers but are distributed across the network.
  • Message Units (MUs) are the coordinators: they receive interaction requests and make sure these are ordered by SUs and then computed by CUs.
  • Scheduler Units (SUs) are the organizers that order the requests and store their information on the network for reproducibility.
  • Compute Units (CUs) are the processors evaluating the received requests.
  • Each CU has a virtual disk reader, capable of loading specific types of disks (modules).
  • Modules are analogous to the disks that can be loaded into compatible disk readers (CUs).
  • aos is one such module, compiled to WASM, capable of interpreting Lua code.
  • Every aos process can be thought of as a disk (module) loaded with Lua code. This code can be for a game, a trading bot, or any other set of operations (a minimal sketch follows this list). At the time of creation, every process is automatically assigned MUs, CUs and SUs.
  • Every process (disk) has its own virtual memory card, i.e. an independent state. Because processes are persistent, the memory card lets new interactions resume from the last checkpoint.
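
To make the disk-and-memory-card analogy concrete, here is a minimal sketch of the sort of Lua code an aos process might carry, assuming the standard aos Handlers API and ao.send; the counter and tag names are purely illustrative:

```lua
-- Minimal sketch of process logic, assuming the standard aos APIs.
-- The global Counter plays the role of the "memory card": it persists
-- across messages and resumes from the last checkpoint.
Counter = Counter or 0

Handlers.add("increment",
  Handlers.utils.hasMatchingTag("Action", "Increment"),
  function(msg)
    Counter = Counter + 1
    ao.send({ Target = msg.From, Data = "Count is now " .. Counter })
  end)
```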

Tailoring your compute environment

ao is designed to give you a framework for personalizing your computing setup. The network supports customizing the units and modules in two ways: by adopting existing infrastructure from other providers, or by setting up your own.

Currently, only the standard set of units and the aos module provided by the ao team are available for use. However, anyone can build their own modules on ao, and various teams within the ecosystem are already expanding ao's capabilities by setting up their own infrastructure.
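
As a hedged illustration of choosing your own module, the snippet below sketches spawning a new process from a custom module inside an aos process, assuming the ao.spawn helper is available; the module transaction ID and tag are placeholders:

```lua
-- Hedged sketch: assumes ao.spawn is available inside an aos process.
-- "MY_MODULE_TX_ID" is a placeholder for the Arweave transaction ID of
-- a module (disk) you have published yourself.
ao.spawn("MY_MODULE_TX_ID", {
  Data = "boot message for the new process",
  Tags = { ["App-Name"] = "My-Custom-Runtime" } -- illustrative tag
})
```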

The network's framework outlines specifications for integrating any infrastructure seamlessly with the rest of the network. Units you run yourself each require a minimum of 2 GB of RAM, and CUs must be compatible with NodeJS environments. New modules must align with the CUs' evaluation environments, or bespoke infrastructure must be set up for them. Infrastructure providers can also incorporate features like load balancing to facilitate resource autoscaling.

Looking Ahead

One of the features on ao's roadmap is the introduction of staking mechanisms and tokens that add economic security to the functioning of the units. This can foster a fair and competitive market based on cost, compute resources, provider stake and the critical nature of the operation, among other factors.

As the ao ecosystem grows and more modules are developed, users can expect an accelerated development experience with the ability to load pre-built modules into processes. The horizon is broad with potential expansions into modules that support LLM agents, SQL databases, gaming and more.

So this is how ao is bringing the power of hyper parallel compute to you. Have an idea? Let’s chat. Connect with us via the form on our landing page or join our community Discord.
