Introduction

What is LYS Labs?

LYS Labs is a blockchain data infrastructure platform focused on two delivery tiers: real-time structured data streams served in under 14ms, and contextualized, AI-ready data served in under 30ms. This lets clients - from high-frequency traders to AI model trainers - react before the rest of the market even knows what happened.

Context Matters

In Web3, data is power. But raw blockchain data is fragmented, noisy, and hard to use in real time - without context, it is just noise.

LYS transforms every transaction, event, and state change into contextualized, correlated, and enriched metrics that tell the full story - who is acting, why they might be acting, and how it connects to market dynamics.

We provide:

  • Real-time structured insights – every decoded trade, token launch, or wallet movement is available instantly.

  • Custom ontologies & schemas – define the exact data models you need for your strategies.

  • AI-ready pipelines – ontology-grounded graphs powering retrieval-augmented generation (RAG) for intelligent agents and trading models.

With <30ms end-to-end latency from chain to your API or WebSocket stream, we don’t just deliver fast data - we deliver fast answers.

Offering at a glance

  • 14ms – ultra-low-latency structured blockchain data

  • <30ms – fully contextualized data with correlations, aggregations, and enriched metrics

Your journey through these docs

This documentation is structured to mirror your learning curve:

1. Start here – Understand the platform and use cases

2. Dive into the architecture – Learn how the stack is built

3. Explore the data – From raw Solana logs to decoded events

4. Use the APIs – Live data, streaming endpoints, and GraphQL

5. Build – Train AI models or build apps on top of LYS

Who we serve

LYS isn’t just another analytics dashboard - it’s an intelligence platform. Here's how different power users benefit:

For High Frequency Traders:

  • Access structured wallet flows, PnL trails, and anomaly graphs in <14ms.

  • Execute faster with contextual alpha; skip 90% of data munging.

  • Use cases: Real PnL mapping, token flow velocity, backtestable signal construction.
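The "real PnL mapping" use case above boils down to matching a wallet's decoded sells against its earlier buys. A minimal sketch, assuming a hypothetical `(side, qty, price)` fill shape rather than the actual LYS schema:

```python
# Minimal sketch of realized-PnL mapping from decoded fills.
# The fill format here is hypothetical, not the LYS schema.
from collections import deque

def realized_pnl(fills):
    """FIFO realized PnL over (side, qty, price) fills for one token."""
    lots = deque()   # open buy lots: [qty, price]
    pnl = 0.0
    for side, qty, price in fills:
        if side == "buy":
            lots.append([qty, price])
        else:  # sell: match against the oldest buys first (FIFO)
            remaining = qty
            while remaining > 0 and lots:
                lot = lots[0]
                take = min(remaining, lot[0])
                pnl += take * (price - lot[1])
                lot[0] -= take
                remaining -= take
                if lot[0] == 0:
                    lots.popleft()
    return pnl

fills = [("buy", 100, 1.0), ("buy", 50, 1.2), ("sell", 120, 1.5)]
print(round(realized_pnl(fills), 2))  # 56.0
```

In production the fills would arrive on the live stream; the same FIFO matching then yields a running PnL trail per wallet.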

Key capabilities

1. Real-time pipelines

  • Solana decoded in 14ms

  • Live trade and liquidity streaming

  • Block-by-block support for EVM chains (coming soon)

2. Custom ontologies

  • Define your own schema over blockchain activity

  • Baseline templates provided for fast start
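To make "define your own schema" concrete, here is a hedged sketch of what a user-defined ontology over decoded events might look like. The field names are illustrative, not the actual LYS baseline templates:

```python
# Hypothetical sketch of a user-defined ontology over decoded events;
# field names are illustrative, not the LYS baseline templates.
from dataclasses import dataclass

@dataclass
class SwapEvent:
    signature: str      # transaction signature
    wallet: str         # acting wallet
    token_in: str
    token_out: str
    amount_in: float
    amount_out: float
    slot: int           # Solana slot of the event

def effective_price(e: SwapEvent) -> float:
    """Price implied by a single swap (token_out per token_in)."""
    return e.amount_out / e.amount_in

e = SwapEvent("sig123", "wallet9xQ", "SOL", "USDC", 2.0, 290.0, 250_000_000)
print(effective_price(e))  # 145.0
```

Once events are typed like this, strategy-specific metrics (implied price, slippage, flow direction) become simple functions over the schema.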

3. Knowledge graphs

  • Multi-hop wallet, token, protocol relationships

  • Graph-native queries for advanced insights
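A multi-hop wallet query is easiest to picture as a bounded graph traversal. In production this would be a graph-native query (e.g. Cypher); the toy sketch below uses a plain BFS over an in-memory edge list with hypothetical data just to show the idea:

```python
# Toy illustration of a multi-hop relationship query. In production this
# would be a graph-native query (e.g. Cypher); here a plain BFS over an
# in-memory edge list shows the idea.
from collections import deque

edges = {  # wallet -> wallets it has funded (hypothetical data)
    "A": ["B", "C"],
    "B": ["D"],
    "C": ["D", "E"],
    "D": ["F"],
}

def wallets_within(start, hops):
    """All wallets reachable from `start` in at most `hops` edges."""
    seen, frontier = {start}, deque([(start, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == hops:
            continue  # hop budget spent on this path
        for nxt in edges.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, depth + 1))
    seen.discard(start)
    return sorted(seen)

print(wallets_within("A", 2))  # ['B', 'C', 'D', 'E']
```

The same traversal over funding, trading, and protocol edges is what surfaces patterns like funding chains or coordinated wallet clusters.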

4. AI-Optimized retrieval

  • Ontology-Grounded RAG (OG-RAG) integration

  • Built for LLMs and autonomous agents
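The core idea of ontology-grounded retrieval is that candidate facts are filtered by ontology type before ranking, so an agent asking about wallets never retrieves unrelated facts. A deliberately simplified, entirely hypothetical sketch:

```python
# Simplified illustration of ontology-grounded retrieval: candidate
# facts are filtered by ontology type before matching, so a wallet
# query never surfaces token or protocol facts. Hypothetical data.
facts = [
    {"type": "wallet", "text": "wallet 9xQ funded 14 fresh wallets"},
    {"type": "token",  "text": "token liquidity doubled in 5 minutes"},
    {"type": "wallet", "text": "wallet 9xQ realized a 12 SOL profit"},
]

def retrieve(query_type, keyword):
    pool = [f for f in facts if f["type"] == query_type]  # ontology filter
    return [f["text"] for f in pool if keyword in f["text"]]

print(retrieve("wallet", "9xQ"))
```

Real OG-RAG replaces the keyword match with semantic retrieval over the knowledge graph, but the grounding step - constrain by ontology first - is the same.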

How it works: The LYS data journey

1. Raw data ingestion

  • Direct from full nodes, mempool, or Geyser (for Solana)

  • Avoiding third-party APIs means lower latency and more control

2. Parsing & decoding

  • Protocol-specific decoders identify real events (e.g. swaps, votes, liquidity adds)

  • Outputs are normalized to shared schemas
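Normalization means that however a protocol labels its fields, the decoder output lands in one shared shape. A sketch with hypothetical decoder outputs and field names:

```python
# Sketch of the normalization step: two protocol-specific decoder
# outputs mapped onto one shared schema. Field names are hypothetical.
def normalize_raydium(ev):
    return {"kind": "swap", "wallet": ev["user"],
            "amount_in": ev["in_amount"], "amount_out": ev["out_amount"]}

def normalize_orca(ev):
    return {"kind": "swap", "wallet": ev["owner"],
            "amount_in": ev["input"], "amount_out": ev["output"]}

raydium_ev = {"user": "9xQ", "in_amount": 2.0, "out_amount": 290.0}
orca_ev = {"owner": "7fP", "input": 1.0, "output": 144.0}

# Both land in the same shape, so downstream code is protocol-agnostic.
print(normalize_raydium(raydium_ev)["kind"], normalize_orca(orca_ev)["kind"])
# swap swap
```

Downstream indexers and aggregators then only ever see the shared schema, never protocol-specific layouts.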

3. Indexing & aggregation

  • Events are indexed in real time and written to memory + database

  • Aggregators summarize trends (OHLCV, buy/sell counts, rug flags)
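The OHLCV aggregation mentioned above is a per-window rollup of decoded trades. A minimal sketch, assuming a hypothetical `(timestamp, price, size)` trade shape:

```python
# Sketch of the per-window aggregation described above: decoded trades
# rolled up into OHLCV bars. The trade shape is hypothetical.
def ohlcv(trades, window):
    """trades: (timestamp, price, size); returns {bucket: (o, h, l, c, v)}."""
    bars = {}
    for ts, price, size in sorted(trades):
        bucket = ts - ts % window
        if bucket not in bars:
            bars[bucket] = [price, price, price, price, size]  # o h l c v
        else:
            bar = bars[bucket]
            bar[1] = max(bar[1], price)   # high
            bar[2] = min(bar[2], price)   # low
            bar[3] = price                # close follows last trade
            bar[4] += size                # volume accumulates
    return {b: tuple(v) for b, v in bars.items()}

trades = [(3, 1.0, 10), (7, 1.2, 5), (12, 1.1, 8)]
print(ohlcv(trades, 10))
# {0: (1.0, 1.2, 1.0, 1.2, 15), 10: (1.1, 1.1, 1.1, 1.1, 8)}
```

Buy/sell counts and rug flags are rollups of the same kind, just with different per-event reducers.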

4. Contextualization

  • Link events across wallets and time

  • Discover hidden correlations (e.g., airdrop farming, vote manipulation)

5. Delivery

  • APIs and WSS streams deliver data to bots, dashboards, or AI agents

  • Sandbox enables querying in natural language, Cypher, or via API
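On the WSS side, a client typically opens a socket and sends a subscription message naming a channel and filters. The sketch below only builds such a payload; the channel and field names are assumptions, not the documented LYS message format:

```python
# Hypothetical sketch of a WSS subscription payload; the channel and
# field names are assumptions, not the documented LYS message format.
import json

def subscribe_msg(channel, tokens):
    """Build the JSON subscription message a client would send on connect."""
    return json.dumps({
        "op": "subscribe",
        "channel": channel,              # e.g. a decoded trade stream
        "filters": {"tokens": tokens},   # restrict the stream server-side
    })

msg = subscribe_msg("trades.decoded",
                    ["So11111111111111111111111111111111111111112"])
print(json.loads(msg)["channel"])  # trades.decoded
```

Filtering server-side keeps the stream narrow, which matters when the budget from chain to client is under 30ms.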

Example: Token Launch Detection

Here’s what happens when a token launches on Pump.fun:

  • Decoder detects token creation, wallet funding, and initial liquidity

  • Aggregator tracks early buys/sells, bundle flags, price rise

  • Sandbox shows key metrics like bonding % change or suspicious wallets

  • AI agents receive this context and decide whether to trade, alert, or ignore
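The screening step in that flow can be pictured as a set of rules over the aggregated launch metrics. A toy version - the thresholds and metric names below are illustrative, not LYS's actual rug heuristics:

```python
# Toy version of the launch-screening logic described above; thresholds
# and metric names are illustrative, not LYS's actual rug heuristics.
def screen_launch(metrics):
    """Return the flags raised for a newly launched token."""
    flags = []
    if metrics["top_holder_pct"] > 30:     # supply concentrated in one wallet
        flags.append("concentrated_supply")
    if metrics["bundled_buys"] > 5:        # many buys landing in one bundle
        flags.append("bundle_activity")
    if metrics["creator_prior_rugs"] > 0:  # deployer has rugged before
        flags.append("known_rugger")
    return flags

launch = {"top_holder_pct": 42, "bundled_buys": 9, "creator_prior_rugs": 0}
print(screen_launch(launch))  # ['concentrated_supply', 'bundle_activity']
```

An agent consuming the stream would weigh these flags against the live metrics before deciding to trade, alert, or ignore.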

Getting started with LYS

You don’t need to be an engineer to use LYS. But if you are, we’ll give you every hook you need.

You can:

  • Use prebuilt aggregations (volume spikes, rug detection, bundle counts)

  • Subscribe to live streams of decoded Solana events

  • Train models on historical, structured token activity

Up next

In the next section, we’ll explore the LYS Platform in detail - how it all connects, what we’ve built, and how every layer serves data.

Ready? Let’s go.
