Vision

VNTROnet is a decentralized interface layer. It lets software, sensors, and people exchange context-rich signals across AR, AI, and the physical world.

What this enables

  • Spatial apps that run anywhere the web reaches
  • Device-to-device protocols for overlays and actions
  • Creator economies for 3D assets and interface modules

R&D Focus

We actively research advanced interface paradigms that push the boundaries of human-technology interaction. Details are kept private while under evaluation.

  • Multi-platform AR glasses integration (WebXR, OpenXR compatible)
  • Low-power radio + edge compute for distributed networks
  • Contextual AI overlays & agents (on-device processing)
  • Accessibility-first spatial interface design
  • Privacy-preserving protocols and local-first architecture

Network Deep Dive

How modules interact across the VNTROnet and what each module unlocks.

Architecture

  • Web-native, standards-first (HTTP, WebXR, WebRTC)
  • Edge-first: low-latency overlays and local processing
  • Privacy-preserving by default; opt-in telemetry
  • Modular subdomains for clear responsibility boundaries

Data Flows

  • Clients publish/subscribe to context streams
  • On-device inference; offload when user approves
  • Least-privilege access tokens scoped to modules
  • Event logs are user-owned and portable
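The flow above (clients publishing and subscribing to context streams under least-privilege tokens) can be sketched in TypeScript. Everything here is illustrative: VNTROnet has not published this API, so `ContextBus`, `ScopedToken`, and the stream names are assumptions, not a shipped SDK.

```typescript
// Hypothetical sketch of a context-stream bus with least-privilege tokens.
// ContextBus, ScopedToken, and the scope names are illustrative only.

type ScopedToken = {
  module: string;      // module the token was issued to, e.g. "explor"
  scopes: Set<string>; // streams this token may touch, e.g. "presence"
};

type Handler = (payload: unknown) => void;

class ContextBus {
  private subscribers = new Map<string, Handler[]>();

  // Least-privilege check: a token may only act on streams in its scopes.
  private authorize(token: ScopedToken, stream: string): void {
    if (!token.scopes.has(stream)) {
      throw new Error(`${token.module} lacks scope "${stream}"`);
    }
  }

  publish(token: ScopedToken, stream: string, payload: unknown): void {
    this.authorize(token, stream);
    for (const handler of this.subscribers.get(stream) ?? []) {
      handler(payload);
    }
  }

  subscribe(token: ScopedToken, stream: string, handler: Handler): void {
    this.authorize(token, stream);
    const list = this.subscribers.get(stream) ?? [];
    list.push(handler);
    this.subscribers.set(stream, list);
  }
}
```

Under this model a token scoped only to "presence" can neither publish to nor subscribe to any other stream, which is what keeps module access narrowly scoped.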

Module Roles

  • explor: immersive front-end (WebXR) for live experiences
  • app: wallet, identity, and permissions center
  • docs: developer portal, SDKs, and examples
  • labs: experimental previews that graduate into the network

Trust Model

Users choose what to share per-module. Credentials and keys stay in the app domain or on device. No cross-domain tracking.
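Per-module sharing can be pictured as a default-deny consent store. This is a sketch under stated assumptions (the `ConsentStore` name and its methods are hypothetical, not part of the app domain's real interface):

```typescript
// Illustrative per-module consent store: users grant or revoke data
// categories per module; nothing is shared unless explicitly granted.

class ConsentStore {
  // module name -> set of granted data categories
  private grants = new Map<string, Set<string>>();

  grant(module: string, category: string): void {
    const set = this.grants.get(module) ?? new Set<string>();
    set.add(category);
    this.grants.set(module, set);
  }

  revoke(module: string, category: string): void {
    this.grants.get(module)?.delete(category);
  }

  // Default-deny: unknown modules and ungranted categories return false.
  canShare(module: string, category: string): boolean {
    return this.grants.get(module)?.has(category) ?? false;
  }
}
```

The design choice worth noting is the default: absence of a grant means "no", so a new module starts with access to nothing.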

Interfaces

Device-agnostic overlays, keyboard/gaze/voice inputs, and accessible semantics for screen readers in spatial contexts.

Roadmap

  • Q4: Public SDK docs and example overlays
  • Q1: Beta app wallet + permissions center
  • Q2: Community modules directory

Network Map

A growing network of modules and connections—expanding as new ideas land.

AR/VR & Spatial Computing

VNTRO provides a universal spatial networking layer that works across AR/VR platforms, hardware, and web standards. Built on WebXR and OpenXR for maximum compatibility.

Multi-Platform Support

  • WebXR-first architecture for broad device compatibility
  • Works with AR glasses, VR headsets, and mixed reality devices
  • Cross-platform spatial protocols and device-agnostic APIs
  • Compatible with various OS platforms and hardware ecosystems
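A WebXR-first fallback chain like the one described above might look as follows. The `isSessionSupported` probe is the standard WebXR Device API; the `pickSessionMode` helper and its fallback order are this sketch's own assumptions:

```typescript
// WebXR-first device compatibility: prefer immersive-ar (AR glasses,
// passthrough headsets), fall back to immersive-vr, then to a flat
// "inline" session any browser can render.

type XRMode = "immersive-ar" | "immersive-vr" | "inline";

// Pure helper so the fallback order is easy to test.
function pickSessionMode(supported: Record<XRMode, boolean>): XRMode {
  if (supported["immersive-ar"]) return "immersive-ar";
  if (supported["immersive-vr"]) return "immersive-vr";
  return "inline";
}

// Browser-side probe (guarded, since navigator.xr may be undefined).
async function detectBestMode(): Promise<XRMode> {
  const xr = (navigator as any)?.xr;
  if (!xr) return "inline";
  const modes: XRMode[] = ["immersive-ar", "immersive-vr", "inline"];
  const results: boolean[] = await Promise.all(
    modes.map((m) => xr.isSessionSupported(m)),
  );
  return pickSessionMode({
    "immersive-ar": results[0],
    "immersive-vr": results[1],
    inline: results[2],
  });
}
```

Probing rather than sniffing device names is what keeps the chain device-agnostic: the same code path serves AR glasses, VR headsets, and plain browsers.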

Use Cases

  • Multi-user collaborative AR experiences
  • Healthcare and medical training applications
  • Educational spatial interfaces and immersive learning
  • Industrial IoT integration and spatial data visualization
  • Accessible AR/VR for diverse user needs

Explore AR/VR Developer Docs

Labs Spotlight

Where new ideas are incubated before graduating into the network.

Project SOL

WebXR interfaces for AR glasses. Glanceable spatial HUDs designed for accessibility.

Learn More

Project ECHO

Low-power mesh signaling for ambient awareness and presence without surveillance.

Learn More

Project PRISM

On-device AI overlays for task guidance with strong accessibility support.

Learn More

Visit Labs

Build the VNTROnet

Developers, designers, and hardware hackers welcome. Integrate devices, publish overlays, or extend the protocol.

SDKs & APIs

Use web-native libraries to publish and subscribe to context streams.

Docs

AR Modules

Ship portable overlays that run on standards-compliant browsers and devices.

Explore

Contribute

Join our community channels for devlogs, bounties, and early access.

Portal

VNTRO Labs

A forward-looking research division exploring the frontiers of interface design, accessibility, and machine context. We collaborate with research institutions and open-source communities.

Project SOL

Multi-platform AR glasses integration research. Micro-OLED spatial HUD experiments for low-friction glanceable data. WebXR-compatible interfaces.

Prototype

Project ECHO

Low-power radio protocols for distributed networks. Device signaling for ambient awareness and privacy-preserving presence. Inspired by amateur radio networks.

R&D

Project PRISM

Contextual AI overlays with on-device processing. Accessible spatial interfaces for task guidance and environmental annotations. Privacy-first design.

Field Tests

Visit Labs

Accessibility & Inclusivity

VNTRO is committed to building accessible spatial computing experiences that include everyone, regardless of ability, background, or identity. Technology is a medium for communication, and communication should be accessible to all.

Accessibility First

  • Multiple interaction modes: voice, gesture, gaze, traditional input
  • Screen reader support for spatial content descriptions
  • High contrast modes and colorblind-friendly palettes
  • Spatial audio and haptic feedback for visual information
  • Reduced motion support and cognitive accessibility
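One way screen-reader support for spatial content can work (a sketch of the idea, not VNTRO's implementation; `describeOverlay` and its fields are hypothetical) is to derive a spoken description from an overlay's label, bearing, and distance:

```typescript
// Illustrative: turn a spatial overlay's pose into a screen-reader-friendly
// description using clock-direction bearings ("3 o'clock" = to the right).

type Overlay = {
  label: string;
  bearingDeg: number; // 0 = straight ahead, increasing clockwise
  distanceM: number;
};

// Map a bearing to the nearest clock hour (12 = ahead, 3 = right, 6 = behind).
function clockDirection(bearingDeg: number): number {
  const normalized = ((bearingDeg % 360) + 360) % 360;
  const hour = Math.round(normalized / 30);
  return hour === 0 ? 12 : hour;
}

function describeOverlay(o: Overlay): string {
  const meters = o.distanceM.toFixed(1);
  return `${o.label}, ${clockDirection(o.bearingDeg)} o'clock, ${meters} meters away`;
}
```

Strings like these can be exposed as accessible names on overlay elements, giving non-visual users the same spatial information sighted users get from placement.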

Inclusive Design

We work with accessibility advocates, disability rights organizations, and inclusive design practitioners to ensure VNTRO works for everyone. Our platform is designed for diverse communities and multiple cultural contexts.

Accessibility Guide

Read Blog

Accessibility isn't an afterthought; it's built into our core design principles from the start.

The VNTROnet

Explore VNTROnet modules and what each unlocks.


Contact

For collaborations, integrations, or press inquiries.

Community

Announcements roll out in stages. Follow official channels for updates.

  • X / Twitter (soon)
  • Discord (invite coming)
  • GitHub (public repos in staging)

Security note: we never DM for funds, keys, or seed phrases.

Growing modules

mesh.vntro.net

Ambient device mesh.

Incubating

studio.vntro.net

Creator tools & patterns.

Incubating

market.vntro.net

Modules & overlays.

Incubating

hub.vntro.net

Signals directory.

Incubating

Propose a New Module