Pyth Data Marketplace — Expanding Pyth's Distribution Layer for Institutional Datasets

Summary

This post introduces the concept of the Pyth Data Marketplace, a new distribution model that would enable institutions to deliver their unique datasets through Pyth’s infrastructure—without contributing to Pyth’s aggregate price feeds.

The Marketplace represents a significant expansion of Pyth’s role: from the leading first-party oracle network to a global distribution layer for institutional data of any type.

Background

For the past three years, Pyth Network has built the foundation for how financial data is sourced, validated, and distributed. What started as real-time price aggregation has grown into the largest first-party market data network in the world—120+ institutional publishers, 100+ blockchains, and over $2 trillion in cumulative trading volume.

But there’s a gap.

Many institutions want to leverage Pyth’s infrastructure and global reach, but they:

  • Have proprietary datasets they won’t contribute to an aggregated feed

  • Offer non-price data (economic indicators, event data, indices, benchmarks)

  • Need full attribution and control over their data delivery

The Pyth Data Marketplace is designed to address exactly this.

What is the Pyth Data Marketplace?

A dedicated platform for institutions to distribute datasets on-chain without contributing to Pyth’s aggregate price feeds.

| Distribution Model | Description | Example Products |
| --- | --- | --- |
| Pyth Price Feeds | Aggregated real-time prices from 120+ publishers | Pyth Crypto+, Pyth Pro |
| Pyth Data Marketplace | Pass-through delivery of third-party datasets | Economic data, FX composites, event data, indices |

Key distinction: The Marketplace does not alter, aggregate, or transform partner datasets. It acts as a pass-through—data retains its source integrity, metadata, and attribution from end to end.

Important: Marketplace datasets are not covered by Pyth’s Oracle Integrity Staking program. Douro Labs provides infrastructure and delivery (and in some cases, only a referral service), not data validation. Data quality, accuracy, and SLAs remain the sole responsibility of the originating partner.

Why Does This Matter?

For Institutions:

  • Global reach without infrastructure investment — 100+ blockchains, 600+ applications, instant distribution

  • Full attribution and control — Their data, their branding, their pricing

  • New revenue channel — Monetize via Pyth’s subscriber base

  • Parallel distribution — Complements existing sales channels, doesn’t replace them

For the Pyth Ecosystem:

  • Expanded TAM — Revenue not capped by what Pyth produces itself

  • Stronger brand credibility — Being the home for institutional datasets builds trust

  • More surface area — Discovery leads to pull-through to Pyth Pro

  • Network effects — More data → more developers → more applications → more data

For Developers and Users:

  • New dataset categories — Macroeconomic releases, volatility surfaces, FX benchmarks, event probabilities, indices

  • Single API — Unified access to diverse institutional data

  • Proven infrastructure — Same battle-tested stack securing Pyth Price Feeds

How Would It Work?

Douro Labs would operate the Marketplace, using three flexible partnership models:

Model 1 — Pyth Delivery + Partner Billing

  • Partner delivers data to Pyth’s infrastructure

  • Douro Labs handles normalization and delivery

  • Customer contracts with Partner for data license and payment rails

  • Douro earns infrastructure fee or rev share

Model 2 — Full Pyth Distribution

  • Partner licenses data to Douro

  • Douro Labs handles delivery, billing, entitlements

  • Customer signs contracts with Douro

Model 3 — Referral (fastest onboarding)

  • Douro Labs lists feed(s) and introduces customers to data partners

  • Partner handles billing and data provisioning

  • Douro Labs earns referral fee / rev share

Governance Considerations

The Pyth Data Marketplace would require formal DAO approval. Likely path:

  1. Idea Bank discussion ← We are here

  2. Community feedback & iteration — Comments below ↓

  3. Constitutional PIP (CO-PIP) — Authorizing Douro Labs to operate the Marketplace on behalf of the DAO

No on-chain code changes anticipated—this is primarily a constitutional/mandate update.

Open Questions for the Community

  1. Revenue split — What’s an appropriate DAO share for the Marketplace revenue? Should it mirror the 60/40 split from Pyth Pro (CO-PIP-9)?

  2. Scope of authority — How much discretion should Douro have in negotiating partnership terms?

  3. Reporting — What level of transparency do you expect? Partner names? Revenue by partner?

  4. Brand positioning — How do we ensure the Marketplace strengthens (not dilutes) Pyth’s core oracle reputation?

  5. Expansion criteria — Should there be guidelines for what types of datasets are appropriate?

8 Likes

I do like this idea, but on one condition: the marketplace runs on Pyth token gas fees.

Data users would buy and hold Pyth tokens in reserve to pay for marketplace data requests. At current prices, data users would rush in to buy. Token utility rockets along with price. Data publishers of Pyth Network realize enormous gains.

2 Likes

Thanks for writing this up zenyas!

Overall, I’m supportive. The network effects argument is compelling as more data on Pyth means more potential data subscribers (for 3rd party data that can also convert to ‘Pyth’ data).

A few thoughts on the open questions:

  1. Revenue split: The 60/40 framework from CO-PIP-9 makes sense as a starting point, but I’d argue the split could vary by partnership model. Model 3 (referral-only) has lower Douro overhead — should that mean a higher DAO share, or is simplicity worth a flat rate across models?

  2. Scope of authority: I am ok with Douro having reasonable discretion on commercial terms, but I believe the DAO should establish some guardrails around exclusivity agreements and anything that could create conflicts with Pyth’s core feeds. Maybe a threshold (e.g., deals above $X or exclusive arrangements) that triggers DAO visibility?

  3. Reporting: Monthly like Pyth Pro works for me. It doesn’t need to be super detailed; it should track partner count, revenue breakdown by category, that kind of thing. Enough to see what’s growing.

1 Like

@Chop Thanks for bringing this to my attention. The idea is great! And I think it’s what Pyth has been designed to deliver.

Throw in some AI Agent integrations (I’m not technical, so can’t offer many suggestions).

This definitely flew under the radar; seeing the amount of interaction validates my opinion.

I’d like to see more conversation around this and a possible vote in the near future.

2 Likes

Love this idea, the more avenues for Pyth to be used the better in my opinion.

Would be great if we could find a way to integrate the use of Pyth tokens in one way or another alongside this.

Reporting: I’m with @KemarTiti same format as Pyth Pro is fine. It would be good however to have a visual representation on the insights hub that can show growth over time as people are onboarded. I would say the same of the number of Pyth Pro subscribers actually.

2 Likes

The idea is really good for diversified income at product-market fit, though a few things stand out:

  1. Reporting – how would Douro Labs disclose (or withhold) deal details? In the accounting world there are standards like IFRS; we need a clear path.
  2. Brand positioning – by creating a new dashboard?
  3. Expansion criteria – are the datasets confidential? In AI terms, “dataset” can also refer to raw data.
1 Like

First-party data remains Pyth’s strongest competitive advantage versus most oracle models.

Curious how the Marketplace complements this advantage rather than competing for internal resource allocation, and what will make institutions eager to check prices on the Marketplace.

I fully support this idea.

As @amensch alluded to, I would also like to understand how this might possibly be applied to the AI agentic economy. Or are Pyth Core or Pyth Pro more suited for agentic applications? Curious to hear the product team’s thoughts on this, even if it may require branching into a separate discussion.

Regarding the revenue split, it seems a little early to be fixing this, since we’re not sure about market uptake/adoption. What about taking a similar approach to how Pyth Core started: focusing on a strong core product/design (one that supports future revenue mechanics), keeping pricing low/negligible, and prioritizing adoption, rather than being too focused on immediate revenue?

Don’t have much opinions about the other open questions. The overall concept and direction seem sound, so for this particular case, I would say just “ship fast and iterate later”.

3 Likes

Further thinking about my earlier points:

I’m landing on flat 60/40 across all three models — same as Pyth Pro. Why?

  1. Marginal difference: The gap between 60/40 and 80/20 on Model 3 sounds significant in percentage terms, but in absolute dollars it’s unlikely to move the needle early on. If a referral deal generates $50K/year, the difference is $10K to the DAO. Not nothing, but not worth the complexity.

  2. Partner communication: “60/40 across the board” is one sentence. Variable rates by model require a matrix, edge case definitions, and inevitably disputes about which model a deal falls into.

That said, I’d propose baking in a 12-month review as part of the CO-PIP: “Revenue split will be reviewed 12 months after Marketplace launch, once sufficient data exists to evaluate whether model-specific rates are warranted. Any changes would require a follow-up OP-PIP.”

If after a year we see Model 3 is 80% of volume and Douro’s overhead is genuinely minimal, we can adjust. But let’s not over-engineer before we have data (and customers of the Pyth Marketplace).
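The arithmetic behind the “marginal difference” point can be sketched quickly. This is purely illustrative: the $50K deal size and the 80/20 alternative split are the hypothetical figures from this comment, not real Marketplace numbers.

```python
# Illustrative only: compares the DAO's take under a flat 60/40 split
# versus a hypothetical model-specific 80/20 split for a referral-only
# (Model 3) deal. The $50K/year figure is the example from this comment.

def dao_share(annual_revenue: float, dao_pct: float) -> float:
    """Return the DAO's cut of a deal's annual revenue."""
    return annual_revenue * dao_pct

referral_deal = 50_000  # hypothetical Model 3 deal, USD per year

flat = dao_share(referral_deal, 0.60)            # flat 60/40, as in Pyth Pro
model_specific = dao_share(referral_deal, 0.80)  # hypothetical higher DAO share

print(f"Flat 60/40:     ${flat:,.0f} to the DAO")
print(f"Model-specific: ${model_specific:,.0f} to the DAO")
print(f"Difference:     ${model_specific - flat:,.0f}")
```

On these numbers the gap is $10K/year per referral deal, which is the basis for the “not worth the complexity” argument above.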

More than just ‘scope of authority’, it feels like we should also codify the process here. I was actually thinking we could follow a governance model inspired by Optimism’s approach: optimistic approval with veto rights.

What does this mean // How does it look practically?

  1. Step 1: Douro Posts Onboarding Notice

A new forum section (“Marketplace Onboardings”) where Douro logs all new partner onboardings with:

  • Partner name

  • Dataset category

  • Commercial model

  • Overlap assessment

  • Projected annual value tier

This starts a 30-day window.

  2. Step 2: Informal Q&A Period (Days 1-30)

Anyone — Council members, token holders, community — can ask questions on the post. Douro responds and clarifies. Most onboardings should resolve here with no objection needed.

  1. Step 3: Formal Objection (If Needed)

If concerns aren’t resolved through Q&A, two paths exist:

| Who | How They Object | Threshold to Pause |
| --- | --- | --- |
| Pythian Council | Council resolution | Supermajority of the Council (6/9) must vote to object |
| Token Holder | OP-PIP | Must pass standard OP-PIP quorum + majority |

Important: Simply raising a vote is not enough. The objection must pass — reach quorum and achieve majority — to pause the onboarding and trigger Step 4. A failed objection vote = onboarding proceeds as planned.

The Council path is faster (trusted body with operational mandate). The token holder path is a permissionless backstop ensuring decentralization.

  4. Step 4: Resolution Process

Once an objection passes, Douro has 14 days to present a structured response addressing the concerns raised.

The same body that objected then votes on the resolution:

  • If Council objected → Council votes

  • If DAO objected via OP-PIP → DAO votes via follow-up OP-PIP

Three possible outcomes:

  1. Approve anyway — Concerns addressed, onboarding proceeds

  2. Reject — Onboarding blocked

  3. Renegotiate — Sent back with specific conditions

If no resolution vote occurs within 30 days of Douro’s response, the objection lapses and onboarding proceeds.

This adds a lapse clause so objections can’t hang indefinitely.
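To make the timeline concrete, here is a minimal sketch of the objection lifecycle as a state machine. Only the day counts come from the steps above; all names (`Status`, `onboarding_status`, the event arguments) are hypothetical, and the 14-day Douro response deadline is omitted since the proposal doesn’t specify what happens if it’s missed.

```python
# Minimal, illustrative sketch of the optimistic-approval lifecycle
# described in Steps 1-4. Day counts mirror the proposal; all names
# here are hypothetical, not an actual Pyth/Douro implementation.
from enum import Enum, auto
from typing import Optional

QA_WINDOW_DAYS = 30            # Steps 1-2: notice posted, Q&A open
RESOLUTION_DEADLINE_DAYS = 30  # lapse clause, counted from Douro's response

class Status(Enum):
    QA_OPEN = auto()   # inside the 30-day window, no passing objection
    PROCEEDS = auto()  # window closed, objection failed, or objection lapsed
    PAUSED = auto()    # a passing objection triggered Step 4
    REJECTED = auto()  # the resolution vote blocked the onboarding

def onboarding_status(day: int,
                      objection_passed_on: Optional[int] = None,
                      douro_responded_on: Optional[int] = None,
                      resolution: Optional[str] = None) -> Status:
    """Day 0 is the onboarding notice; returns the onboarding's status."""
    if objection_passed_on is None:
        # A failed objection vote counts the same as no objection at all.
        return Status.QA_OPEN if day < QA_WINDOW_DAYS else Status.PROCEEDS
    if resolution == "reject":
        return Status.REJECTED
    if resolution in ("approve", "renegotiate-approved"):
        return Status.PROCEEDS
    # Objection passed but no resolution vote yet: apply the lapse clause.
    if (douro_responded_on is not None
            and day >= douro_responded_on + RESOLUTION_DEADLINE_DAYS):
        return Status.PROCEEDS  # objection lapses; onboarding proceeds
    return Status.PAUSED
```

For example, an onboarding with no passing objection simply proceeds once day 30 arrives, while a passing objection holds it in `PAUSED` until a resolution vote lands or the 30-day lapse clock runs out.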

2 Likes

IMO this warrants its own thread — there’s a lot to unpack. My short version: Pyth Pro’s API is already agent-friendly. The Marketplace simply expands what’s accessible through that same endpoint.

The real opportunity is that agents (and humans) get a single API for both Pyth’s first-party feeds and third-party institutional datasets — potentially expanding well beyond financial data. That’s compelling.

That said, agent-native pricing (per-request, usage-based) requires both pricing and product work. Building something that scales from “occasional API call” to “enterprise package” isn’t trivial. Worth exploring, but probably somewhat unrelated to the Marketplace itself.

I’d push back on starting free or heavily discounted. Free adoption doesn’t prove product-market fit — it proves people like free stuff. You learn whether something has value when people pay for it.

Pyth Core followed the “free first” model. Pyth Pro didn’t — it launched at ~$10K/month packages. Today, Pro generates more revenue than Core by a wide margin. Lesson learned: price for value from day one, then iterate from there.

See my proposal above for the governance model and let me know if you have more questions but IMO it directly addresses transparency.

Every onboarding would be logged in a dedicated “Marketplace Onboardings” forum section with pre-established details. This also creates a public record without requiring DAO approval for every deal. The 30-day window + objection mechanism provides oversight without bureaucratic drag.

2 Likes

@zenyas,

This is a very interesting proposal and could represent an important step in expanding Pyth Network beyond price feeds into a broader financial data distribution layer.

The concept reminds me somewhat of how Bloomberg L.P. operates. A similar model applied to the Pyth Network could allow institutional datasets to be distributed through the network and reach not only DeFi protocols but also trading infrastructure, quant systems, and other data-driven platforms.

Beyond the direct revenue opportunity, one potential strategic benefit is distribution. If Pyth becomes a marketplace where institutional datasets are distributed onchain or through its infrastructure, it could gradually position Pyth as a default data layer for both crypto-native applications and non-crypto platforms integrating onchain data. Over time, this could create strong network effects where new protocols, trading infrastructure, and automated agents integrate Pyth not only for price feeds but also for a wider range of datasets.

It would also be interesting to understand how the proposal plans to approach onboarding data providers and managing licensing agreements, as these areas are typically complex in traditional financial data markets.

From SCP with Love

FYI the proposal is now up for vote: [ONGOING] CO-PIP-99: Establishment of the Pyth Data Marketplace and Assign Douro Labs as Operator

3 Likes

This is a clear approach to the bureaucratic flow, nice. Let’s grow together.