Education

Composability of Computing Environments in AO


Discover how ao gives you the power of hyper parallel computing at your fingertips. Explore ao's composable architecture and see how you can leverage it to personalize your computing environment.


Mar 6, 2024

banner: the power of hyper parallel computing at your fingertips

After months of extensive foundational work, Arweave redefined decentralized computing with the launch of ao. With ao, you have the power of hyper parallel computing at your fingertips. Compute verifiably at scale, from your terminal. And with composability built into the system, you have the freedom to choose your computing environment.

Why composability matters

ao’s vision is to bring compute to you. No more protocol-enforced limitations. Have an idea? You can realize it with ao.

ao's modular architecture means your ideas can be brought to life with ease. Its composable nature empowers you to customize your computing environment to suit your needs perfectly. Whether it's opting for EVM bytecode over WASM, crafting a custom interpreter for Solidity, or scaling your processes for more throughput, ao adapts to your requirements. Even a process's logic can be tweaked on the fly, as needed.

Understanding ao’s architecture

Grasping the potential for customization begins with understanding ao's architecture, which mirrors traditional computer systems in many ways:

an infographic on the architecture of ao
  • ao represents a virtual computer that exists on the Arweave network, consisting of units that work together to execute tasks. These units are akin to the components of traditional computers but are distributed across the network.

  • Message Units (MUs) are the coordinators: they receive interaction requests, pass them to SUs for ordering, and then to CUs for computation.

  • Scheduler Units (SUs) are the organizers that order the requests and store their information on the network for reproducibility.

  • Compute Units (CUs) are the processors that evaluate the ordered requests.

  • Each CU has a virtual disk reader, capable of loading specific types of disks (modules).

  • Modules are analogous to the disks that can be loaded into compatible disk readers (CUs).

  • aos is one such module, compiled to WASM, capable of interpreting Lua code.

  • Every aos process can be thought of as a disk (module) loaded with Lua code. This code can be for a game program, a trading bot, or any other set of operations. At creation, every process is automatically assigned MUs, SUs, and CUs.

  • Every process (floppy) has its own virtual memory card (independent state). Because processes are persistent, the memory card lets new interactions resume from the last checkpoint.
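The flow described above can be sketched as a toy model. This is an illustrative sketch only, not ao's actual implementation: the class names, interfaces, and state layout are hypothetical, and the real units are distributed services on the Arweave network rather than in-memory objects.

```python
# Toy model of ao's unit pipeline (hypothetical names and interfaces):
# an MU receives a message, an SU orders and logs it, a CU evaluates it
# against the target process's independent state (its "memory card").

from dataclasses import dataclass

@dataclass
class Message:
    target: str   # ID of the process the message is addressed to
    data: str     # payload to evaluate

class SchedulerUnit:
    """SU: assigns a global order to messages and stores them for reproducibility."""
    def __init__(self):
        self.log = []  # stand-in for the message log persisted on Arweave

    def order(self, msg: Message) -> int:
        self.log.append(msg)
        return len(self.log) - 1  # sequence number

class ComputeUnit:
    """CU: evaluates ordered messages against each process's own state."""
    def __init__(self):
        self.state = {}  # process ID -> accumulated state ("memory card")

    def evaluate(self, msg: Message) -> str:
        history = self.state.setdefault(msg.target, [])
        history.append(msg.data)  # resume from the last checkpoint, apply message
        return f"{msg.target} evaluated {len(history)} message(s)"

class MessageUnit:
    """MU: coordinates the flow -- SU orders the request, then CU computes it."""
    def __init__(self, su: SchedulerUnit, cu: ComputeUnit):
        self.su, self.cu = su, cu

    def handle(self, msg: Message) -> str:
        seq = self.su.order(msg)
        return f"#{seq}: {self.cu.evaluate(msg)}"

mu = MessageUnit(SchedulerUnit(), ComputeUnit())
print(mu.handle(Message("proc-A", "ping")))  # -> #0: proc-A evaluated 1 message(s)
print(mu.handle(Message("proc-A", "ping")))  # -> #1: proc-A evaluated 2 message(s)
```

Note how state is keyed per process: each process accumulates its own history independently, which mirrors why every ao process can persist and resume from its last checkpoint without affecting others.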

Tailoring your compute environment

ao is designed with the goal of providing a framework that empowers you to personalize your computing setup. The network supports customization of the units and modules in two ways: by adopting existing infrastructure from other providers, or by setting up your own.

Currently, only the standard option of units and the aos modules provided by the ao team is available for use. However, anyone can build their own modules on ao, and various teams within the ecosystem are already working on expanding ao's capabilities by setting up their own infrastructure.

The network's framework outlines specifications for integrating any infrastructure seamlessly with the rest of the network. Creating your own units requires a minimum of 2 GB RAM each and, for CUs, compatibility with NodeJS environments. New modules must be compatible with the CUs' evaluation environments, or bespoke infrastructure must be set up to run them. Infrastructure providers can also incorporate features like load balancing to enable resource autoscaling.

Looking Ahead

One of the features on the roadmap for ao is the introduction of staking mechanisms and tokens that add economic security to the functionality of the units. This can foster a fair and competitive market based on cost, compute resources, provider stake, and the critical nature of the operation, among other factors.

As the ao ecosystem grows and more modules are developed, users can expect an accelerated development experience with the ability to load pre-built modules into processes. The horizon is broad with potential expansions into modules that support LLM agents, SQL databases, gaming and more.

This is how ao brings the power of hyper parallel compute to you. Have an idea? Let's chat. Connect with us via the form on our landing page or join our community Discord.

Rohit Pathare | ropats16

Rohit Pathare

Developer Relations

