LOS ANGELES, CA / ACCESS Newswire / March 26, 2026 / We at Axis Robotics are building the distributed scaling layer for Physical AI, and today we're announcing that our main product launched on the Base blockchain on March 25. Following two rounds of large-scale community testing that generated nearly 300,000 robotic trajectories, we're moving from test phase to global availability. Our team has developed an end-to-end system that redefines how data for Physical AI is generated, scaled, and distributed, and we believe the future of embodied intelligence won't be created by a handful of labs, but by broad, worldwide participation. That's why we're betting on simulation data and a globally distributed contributor network as Physical AI approaches a commercial inflection point and as demand for scalable, highly diverse training data accelerates across the industry.
Axis Robotics is pursuing a Simulation-First strategy to rebuild how data for Physical AI is generated, diversified, and scaled. As of 2025, multiple technological vectors in robotics have been converging faster than expected. Hardware supply chains for embodied robots are undergoing rapid commoditization, turning what were once expensive prototypes into devices capable of large-scale real-world deployment. Vision-Language-Action (VLA) models are giving robots semantic understanding, reasoning, and planning: the cognitive "brain" required for general-purpose behavior. And across the data stack, from video priors to advanced synthetic simulation, a multilayered data pyramid is emerging to fuel the continuous evolution of Physical AI.
Yet one bottleneck remains fundamental: data coverage.
Compared with LLMs and autonomous driving, physical intelligence still faces a significant pre-training data deficit. The industry is pursuing several parallel paths to bridge this gap: large-scale teleoperation datasets such as UMI, natural human-robot interaction via egocentric video, and fast-advancing synthetic simulation data pipelines. As these sources mature, academia and industry are arriving at a new consensus:
Pretraining on large-scale, high-quality simulation data, followed by fine-tuning on a small set of real-world demonstrations, is one of the most practical and effective paths forward.
But this consensus raises the bar: simulation data must be high-quality, low-cost, and truly scalable. Without this trifecta, training progress will remain constrained by the dual challenges of expensive real-world data and insufficient synthetic fidelity.
So the question naturally arises:
Is Physical AI's "GPT moment" approaching?
Axis's answer is yes, but only if we reinvent the way robotic data is produced, validated, and deployed at global scale.
Enabling Everyone to Contribute to Physical AI at Scale
Traditional robotic data collection relies on small expert teams or lab-bound teleoperation setups: expensive, limited in diversity, and inherently unscalable. Axis breaks this paradigm by building an end-to-end Physical AI data infrastructure that allows anyone, anywhere, to contribute meaningful data through distributed human participation. Robots will serve humans, but they will also be built and continuously evolved through large-scale human intelligence.
From day one, Axis understood that "providing data" alone is not enough. Solving the data bottleneck requires a full, vertically integrated pipeline, centered on three core components: task generation, data collection, and data evaluation & processing.
1. Dynamic Task Generation
A next-generation 3D task engine decomposes robotic capabilities into atomic skills, enabling infinite high-quality simulation tasks from a single prompt. From simple single-step behaviors to complex chained tasks, robots can continuously expand their capabilities inside a rich, ever-evolving task universe. The boundaries of data define the boundaries of robotic capabilities.
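The skill-composition idea can be illustrated with a minimal sketch: a small vocabulary of atomic skills composed over scene objects yields a combinatorially large space of chained tasks. Everything here is assumed for illustration; the skill vocabulary, object names, and the `generate_tasks` helper are hypothetical stand-ins, not Axis's actual engine.

```python
import itertools
import random

# Hypothetical atomic skill vocabulary (illustrative only).
ATOMIC_SKILLS = ["reach", "grasp", "lift", "move", "place", "release"]

def generate_tasks(objects, max_chain_len=3, seed=0):
    """Enumerate chained tasks by composing atomic skills over scene objects."""
    rng = random.Random(seed)
    tasks = []
    for length in range(1, max_chain_len + 1):
        # every ordered combination of skills of this chain length
        for skills in itertools.product(ATOMIC_SKILLS, repeat=length):
            obj = rng.choice(objects)
            tasks.append([f"{skill}({obj})" for skill in skills])
    return tasks

tasks = generate_tasks(["cup", "rose", "block"])
print(len(tasks))  # 6 + 36 + 216 = 258 candidate chains
```

Even this toy version shows how a handful of atomic skills expands into hundreds of candidate tasks; adding parameters, scenes, and object variation grows the task universe further.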

(task generation pipeline)
2. Zero-Barrier Data Collection
Axis brings complex simulation environments, historically restricted to robotics labs, into the browser and onto mobile devices. Users can operate robots in real time directly from a web page, generating high-value trajectory data as naturally as playing a game. No hardware. No local compute. No expertise required.
3. Data Evaluation & Processing
Every trajectory is automatically replayed, validated, smoothed, and filtered through Axis's evaluation system. It examines completeness, stability, validity, fluidity, and more, producing training-ready data assets at scale and replacing manual curation with systemic automation.
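A minimal sketch of what such an evaluation pass might look like, assuming a trajectory is a list of equal-length joint-position vectors. The `evaluate_trajectory` helper, its criteria, and its thresholds are illustrative assumptions covering a subset of the checks named above, not Axis's actual system.

```python
def evaluate_trajectory(traj, goal_reached, max_step=0.2):
    """Score a trajectory on completeness, validity, and fluidity.

    traj: list of joint-position vectors, one per timestep.
    Returns (passed, report) where report maps each criterion to a bool.
    """
    report = {
        # completeness: the task goal was actually achieved during replay
        "complete": goal_reached,
        # validity: non-trivial length and dimensionally consistent vectors
        "valid": len(traj) > 1 and len({len(q) for q in traj}) == 1,
    }
    if report["valid"]:
        # fluidity: no joint jumps by more than max_step per timestep
        steps = [
            max(abs(b - a) for a, b in zip(q0, q1))
            for q0, q1 in zip(traj, traj[1:])
        ]
        report["fluid"] = max(steps) <= max_step
    else:
        report["fluid"] = False
    return all(report.values()), report

ok, report = evaluate_trajectory(
    [[0.0, 0.0], [0.05, 0.1], [0.1, 0.15]], goal_reached=True
)
print(ok)  # True
```

In a production system, checks like these run automatically after replay, so only trajectories that pass every criterion enter the training set.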
MetaSim: The Core Engine Behind the Pipeline
Underneath this user-facing system lies MetaSim, Axis's unified infrastructure layer designed specifically for Physical AI. It handles simulator decoupling, data verification, and augmentation. Human demonstration data collected via the lightweight web simulator can be seamlessly reproduced in NVIDIA Isaac Sim for high-fidelity validation.
Leveraging Isaac Sim's physics and rendering engines, Axis performs high-fidelity reconstruction and large-scale domain randomization, dramatically amplifying Sim-to-Real robustness and overall training value. Each piece of data becomes more generalizable, more useful, and more aligned with real-world deployment.
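Domain randomization of this kind is typically implemented by sampling per-episode physics and rendering parameters so that one recorded demonstration can be replayed under many distinct conditions. A generic sketch follows; the parameter names and ranges are assumptions for illustration, not MetaSim's actual configuration.

```python
import random

def randomize_domain(rng):
    """Sample one randomized configuration of physics and rendering
    parameters for a simulation episode (ranges are illustrative)."""
    return {
        "friction": rng.uniform(0.4, 1.2),        # surface friction coefficient
        "object_mass_kg": rng.uniform(0.05, 0.5), # perturbed object mass
        "light_intensity": rng.uniform(300, 1500),# scene lighting, lux-like units
        "camera_jitter_m": rng.uniform(0.0, 0.02),# small camera pose noise
        "texture_id": rng.randrange(100),         # swap in a random surface texture
    }

rng = random.Random(42)
variants = [randomize_domain(rng) for _ in range(1000)]
# Replaying each demonstration under every variant multiplies one
# trajectory into many visually and physically distinct training samples.
```

This is the core Sim-to-Real trick: a policy trained across wide parameter variation is less likely to overfit to any single simulator configuration.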

(terminal screen recording)
Why Crypto Matters: Scaling Trustless Participation and Incentives
Infrastructure alone is not enough. To enable true global participation, Axis is integrating crypto as the coordination and incentive layer.
Crypto provides:
Transparent, verifiable contribution records
Distributed participation without geographic barriers
Incentive mechanisms aligned with actual data value
New possibilities for data assetization
This is crypto not as a narrative device, but as a practical delivery mechanism: a way to scale human contributions, ensure fairness, and transform data production into an ecosystem rather than a closed pipeline.
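Axis's actual on-chain design is not detailed here, but the "transparent, verifiable contribution records" idea can be sketched generically: linking each contribution record to the hash of the previous one makes the history tamper-evident, which is the property a blockchain provides at scale. The `record_contribution` helper below is a hypothetical, simplified stand-in.

```python
import hashlib
import json

def record_contribution(chain, contributor, trajectory_hash):
    """Append a contribution record that commits to the previous record's
    hash, making the shared history tamper-evident (a simplified
    stand-in for on-chain contribution records)."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    body = {"contributor": contributor, "trajectory": trajectory_hash, "prev": prev}
    # canonical JSON so the digest is deterministic across machines
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append({**body, "hash": digest})
    return digest

chain = []
record_contribution(chain, "alice", hashlib.sha256(b"traj-001").hexdigest())
record_contribution(chain, "bob", hashlib.sha256(b"traj-002").hexdigest())
assert chain[1]["prev"] == chain[0]["hash"]  # each record commits to the last
```

Altering any earlier record changes its hash and breaks every later link, so contributors and verifiers can audit the full record independently.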
Real-World Validation: From "Little Prince's Rose" to Large-Scale Community Testing
Axis has already validated the endโtoโend effectiveness of its data pipeline.
In the "Little Prince's Rose" community event, Axis collected over 10,000 high-quality trajectories in just three days. After automated replay verification and augmentation, these trajectories were fed directly into training pipelines and successfully deployed on a physical Franka arm, executing a fully autonomous flower-watering task.
This milestone demonstrated Axis's zero-shot Sim-to-Real transfer capability and proved, for the first time, that web-based crowdsourced simulation can produce training-grade robotic data.
Across two testing rounds totaling 15 days, more than 30,000 users contributed over 180,000 trajectories, all publicly visible on the live data dashboard: https://hub.axisrobotics.ai/

(live data dashboard: https://hub.axisrobotics.ai/)
Two Core Deliverables: High-Quality Data and an Open, Modular Infrastructure Stack
Axis believes that just as robots will eventually serve every person, every person should have the opportunity to help build the next generation of robots.
This mission rests on two pillars:
1. High-Quality Pretraining Datasets
Axis aims to set the standard for what qualifies as pretraining-ready robotic data: diverse tasks, rich scene layouts, multi-modal structure, and direct usability for foundation models. This is not about producing more data; it is about producing the right data.
2. A Scalable, Open Infrastructure Stack
Beyond data, Axis is building a flexible, modular infrastructure that will gradually open core interfaces across task generation, data collection, data processing, and training. Developers, researchers, companies, and communities will all be able to plug in, turning Physical AI from a closed pipeline into a global collaborative system.
Industry Partnerships and RealโWorld Deployment
Axis is partnering with manufacturing companies, robot embodiment companies, and model developers, including Lotus Robotics, Booster Robotic, Zeroth, and Manycore Technology, to build scalable pipelines for data generation, model training, and deployment.
For robot embodiment companies needing large-scale simulation data, Axis converts their hardware into high-fidelity, sim-ready digital twins, generates sim-ready environments, and distributes tasks globally through its browser-based platform. Users collect diverse trajectories at scale, enabling standardized, low-cost data production and enterprise collaboration.
As hardware costs fall and supply chains mature, industry value is rapidly shifting toward AI models and data infrastructure. Simulation data, enhanced by high-precision physics and domain randomization, is becoming a core production factor, representing a potential $100Bn-plus infrastructure category in the trillion-dollar Physical AI economy.
Axis's globally distributed data network transforms a historically expensive and centralized simulation workflow into an exponentially scalable system with powerful commercial potential.
Looking Ahead: Toward Physical AI's GPT Moment
Physical AI's GPT moment requires a system capable of capturing human intelligence and converting it into reliable, verifiable machine behavior. With its launch on Base Chain, Axis is deploying the distributed infrastructure designed for this future: a resilient, open network built for global collaboration at scale.

(product overview)
On March 25, we launched our main product to the world.
Users, researchers, developers, and AI labs can now join and contribute to what may become the largest and most diverse robotic training dataset ever built.
Physical AI will not be owned by a few. It will be built by all of us.
Company: Axis Robotics
Contact: Christine Sun
Email: christine@axis-labs.ai
Website: https://axisrobotics.ai/
SOURCE: Axis Robotics
View the original press release on ACCESS Newswire
