
Composability of Computing Environments in AO

By Rohit Pathare · March 6, 2024 · 5 min read

After months of extensive foundational work, Arweave redefined decentralized computing with the launch of ao. With ao, you have the power of hyper parallel computing at your fingertips. Compute verifiably at scale, from your terminal. And with composability built into the system, you have the freedom to choose your computing environment.

Why composability matters

ao’s vision is to bring compute to you. No more protocol-enforced limitations. Have an idea? You can realize it with ao.

ao's modular architecture means your ideas can be brought to life with ease. Its composable nature lets you tailor your computing environment to your needs. Whether it's opting for EVM bytecode over WASM, crafting a custom interpreter for Solidity, or scaling your processes for more throughput, ao adapts to your requirements. Even a process's logic can be tweaked on the fly, as needed.

Understanding ao’s architecture

Grasping the potential for customization begins with understanding ao's architecture, which mirrors traditional computer systems in many ways:

[Infographic: the architecture of ao]
  • ao represents a virtual computer that exists on the Arweave network, consisting of units that work together to execute tasks. These units are akin to the components of traditional computers but are distributed across the network.
  • Message Units (MUs) are the coordinators that receive interaction requests and ensure they are ordered by SUs and then computed by CUs.
  • Scheduler Units (SUs) are the organizers that order the requests and store their information on the network for reproducibility.
  • Compute Units (CUs) are the processors evaluating the received requests.
  • Each CU has a virtual disk reader, capable of loading specific types of disks (modules).
  • Modules are analogous to the disks that can be loaded into compatible disk readers (CUs).
  • aos is one such module, compiled to WASM, capable of interpreting Lua code.
  • Every aos process can be thought of as a disk (module) loaded with Lua code. This code could be a game, a trading bot, or any other set of operations. At the time of creation, every process is automatically assigned MUs, SUs, and CUs (see the sketch after this list).
  • Every process (disk) has its own virtual memory card (independent state). Because processes are persistent, this memory card lets new interactions resume from the last checkpoint.
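
To make this flow concrete, here is a minimal client-side sketch using the @permaweb/aoconnect JavaScript library. The module ID, scheduler address, and wallet path are placeholders you would replace with real values; treat it as an illustration of the spawn → message → result cycle rather than a complete program.

```js
// A minimal sketch of the process lifecycle, assuming the
// @permaweb/aoconnect client and a local Arweave wallet file.
// MODULE_TX_ID and SCHEDULER_ADDRESS are placeholders.
import { readFileSync } from "node:fs";
import {
  spawn,
  message,
  result,
  createDataItemSigner,
} from "@permaweb/aoconnect";

const wallet = JSON.parse(readFileSync("./wallet.json", "utf-8"));
const signer = createDataItemSigner(wallet);

// 1. Spawn a process: load a module (the "disk") and assign it a
//    Scheduler Unit that will order its messages.
const processId = await spawn({
  module: "MODULE_TX_ID",         // e.g. the aos WASM module
  scheduler: "SCHEDULER_ADDRESS", // the SU for this process
  signer,
  tags: [{ name: "Name", value: "my-first-process" }],
});

// 2. Send a message: a Message Unit receives it, the Scheduler Unit
//    orders it and stores it on Arweave, and a Compute Unit evaluates
//    it against the process's state (its "memory card").
const messageId = await message({
  process: processId,
  signer,
  tags: [{ name: "Action", value: "Eval" }],
  data: "1 + 1", // Lua source, interpreted by the aos module
});

// 3. Read the evaluation result back from a Compute Unit.
const { Output } = await result({ message: messageId, process: processId });
console.log(Output);
```

The same Eval mechanism is also what allows a process's logic to be tweaked on the fly: sending new Lua code as a message updates the running process without redeploying anything.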

Tailoring your compute environment

ao is designed to give you a framework for personalizing your computing setup. The network supports customizing the units and modules in two ways: by adopting existing infrastructure from other providers, or by setting up your own.

Currently, only the standard units and the aos module provided by the ao team are available for use. However, anyone can build their own modules on ao, and various teams within the ecosystem are already expanding its capabilities by setting up their own infrastructure.

The network's framework defines specifications for integrating any infrastructure seamlessly with the rest of the network. Creating your own units requires a minimum of 2 GB of RAM per unit, and CUs must be compatible with NodeJS environments. New modules must align with the CUs' evaluation environments, or bespoke infrastructure must be set up for them. Infrastructure providers can also incorporate features like load balancing to support resource autoscaling.
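
Once a provider exposes its own units, a client can point at them directly. The sketch below again assumes the @permaweb/aoconnect library; the unit URLs and process ID are placeholders for whatever infrastructure you run or adopt.

```js
// A sketch of pointing a client at self-hosted or third-party units,
// assuming @permaweb/aoconnect. The URLs below are placeholders.
import { readFileSync } from "node:fs";
import { connect, createDataItemSigner } from "@permaweb/aoconnect";

const wallet = JSON.parse(readFileSync("./wallet.json", "utf-8"));

// connect() returns the same spawn/message/result API, bound to the
// units you choose instead of the defaults provided by the ao team.
const { message, result } = connect({
  MU_URL: "https://mu.example.com",   // your Message Unit
  CU_URL: "https://cu.example.com",   // your Compute Unit
  GATEWAY_URL: "https://arweave.net", // gateway for reading Arweave data
});

// Send a message to an existing process (assuming it defines a
// handler that matches the "Ping" action) and read the reply.
const messageId = await message({
  process: "PROCESS_ID",
  signer: createDataItemSigner(wallet),
  tags: [{ name: "Action", value: "Ping" }],
});

const { Messages } = await result({ message: messageId, process: "PROCESS_ID" });
console.log(Messages);
```

The scheduler, by contrast, is chosen per process: switching SUs is a matter of passing a different scheduler address when the process is spawned.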

Looking ahead

One of the features on ao's roadmap is the introduction of staking mechanisms and tokens that add economic security to the functioning of the units. This can foster a fair and competitive market based on cost, compute resources, provider stake, and the critical nature of the operation, among other factors.

As the ao ecosystem grows and more modules are developed, users can expect an accelerated development experience with the ability to load pre-built modules into processes. The horizon is broad with potential expansions into modules that support LLM agents, SQL databases, gaming and more.

This is how ao brings the power of hyper parallel compute to you. Have an idea? Let's chat. Connect with us via the form on our landing page or join our community Discord.
