SEDA Twitter Spaces – A New Standard For Data Providers

In SEDA’s Twitter Spaces, which took place on March 21, 2024, MattDavid from SEDA hosted Peter and Stefan Rottler from SEDA and Shawn Douglass from Amberdata to discuss data infrastructure in DeFi, Amberdata’s work, SEDA’s journey to enhance crypto ecosystems, and more! Read our notes below to learn more.

Background

MattDavid (Host) – Web3 Marketing Specialist at SEDA

Peter (Guest) – Co-Founder of SEDA

Stefan Rottler (Guest) – Head of BD at SEDA

Shawn Douglass (Guest) – Co-Founder and CEO at Amberdata

SEDA – an intent-based, modular data layer that allows any blockchain to configure & interact with custom data feeds for price data, RPC data, or any available API endpoint.

Amberdata – delivers comprehensive digital asset data and insights into blockchain networks, crypto markets, and decentralized finance, empowering financial institutions with critical data for research, trading, risk, analytics, reporting, and compliance.

From Big Data Analytics to Building Market Infrastructure with Amberdata and SEDA

  • Shawn recounts his professional journey, starting with Amberdata in 2017, and his prior experience in big data analytics for social media and derivatives algorithmic trading.
  • Stefan shares his entry into the crypto space in 2017, influenced by a college acquaintance, and his career progression from academic involvement to working in SaaS tech in New York and Berlin, and now focusing on oracles and crypto infrastructure at SEDA.
  • Matt notes Stefan’s long-term involvement with SEDA, dating back to when it was known as Flux, highlighting his status as an OG within the company.
  • Peter recounts his start in crypto in 2017 and his initial project, a decentralized app store. The lack of data infrastructure optimized for mainstream applications pushed him toward building open market infrastructure, leading to the creation of a first-party oracle with Amberdata as an early data provider.
  • He discusses Amberdata’s functions, the impact of their partnership with SEDA, the Web3 pain points for data provider companies, and the motivation behind the partnership aimed at setting new data provider standards.
  • It’s noted that about 90% of smart contracts require off-chain data, highlighting the critical role of data in supporting smart contracts with accurate information.
  • Shawn explains that Amberdata provides institutional-grade market infrastructure for digital assets, detailing their data collection from various markets and on-chain venues, processing significant daily notional trading volumes, and delivering this data through APIs to a broad range of clients, including TradFi players and Oracle networks.

Ensuring Data Integrity and Accuracy in High-Volume Trading Environments

  • Matt asks how the quality and sourcing of data are ensured for usability.
  • Shawn explains that they connect to both centralized and decentralized trading venues, noting that decentralized venues make data integrity easier to verify. They use multiple collectors for each exchange and aim for low-latency, high-frequency data, with continuous updates and robust monitoring to ensure they capture all trading activity and price discovery.
  • For decentralized venues, Shawn says they process every transaction event log and pending pool activity to reconstitute on-chain actions. He highlights the complexity and scale of this data collection, noting that they manage 16 petabytes of data across multiple cloud providers and regions.
  • Shawn describes their operational scale, including thousands of servers used to collect, ingest, and process data into a reliable and accurate feed (see the sketch after this list).
  • He mentions the challenges of dynamically scaling to match trading volumes and the importance of avoiding data loss.
  • Matt asks about the amount of data managed, and Shawn confirms they handle petabytes of data from major blockchains and trading venues.
  • Peter comments on the disbelief even builders have regarding the extent of work required to ensure data accuracy and availability. He highlights the challenges of ingesting and serving a vast amount of data in a timely manner.
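
To picture the collector setup Shawn describes, here is a minimal sketch in Python: several independent collectors watch the same venue, and their streams are merged and de-duplicated so that an outage on any single collector does not lose trades. The Trade and merge_collector_streams names are hypothetical illustrations, not Amberdata APIs.

    from dataclasses import dataclass
    from typing import Iterable, List

    @dataclass(frozen=True)
    class Trade:
        venue: str
        trade_id: str      # venue-assigned identifier, used for de-duplication
        price: float
        size: float
        timestamp_ms: int

    def merge_collector_streams(streams: Iterable[List[Trade]]) -> List[Trade]:
        """Union of the trades seen by every collector, with duplicates dropped."""
        seen = set()
        merged: List[Trade] = []
        for stream in streams:
            for trade in stream:
                key = (trade.venue, trade.trade_id)
                if key not in seen:
                    seen.add(key)
                    merged.append(trade)
        return sorted(merged, key=lambda t: t.timestamp_ms)

    # Collector B missed trade t1; the merged view still contains all three trades.
    collector_a = [Trade("cex-1", "t1", 67000.0, 0.5, 1), Trade("cex-1", "t2", 67010.0, 0.2, 2)]
    collector_b = [Trade("cex-1", "t2", 67010.0, 0.2, 2), Trade("cex-1", "t3", 67005.0, 1.0, 3)]
    print(len(merge_collector_streams([collector_a, collector_b])))  # 3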

Enhancing Financial Infrastructure with High-Fidelity Data and Strategic Partnerships

  • Shawn highlights the substantial infrastructure costs incurred to support a wide range of financial services, including oracle networks, trading, portfolio management, and risk applications for major entities in the financial sector.
  • He describes SEDA’s role in providing core infrastructure for DeFi, aiming to enhance the financial fabric of the future with high-fidelity data for DeFi applications.
  • Matt asks about the real-time accuracy of data provided by Amberdata, given the dynamic nature of digital asset data.
  • Shawn explains that the data’s accuracy is point-in-time, with a latency of approximately 250 milliseconds for market makers managing order books over traditional web infrastructure. He says that while blockchain data is slower, it remains accurate at the captured moment.
  • Stefan discusses the importance of high-quality, relevant data for Web3 and crypto projects, stating that data acts as the oxygen for ecosystems. 
  • He outlines the challenge of data availability in emerging ecosystems and describes how SEDA aims to address this by providing a foundational layer of data across various platforms.
  • Stefan explains the distinction between SEDA and traditional oracles, highlighting SEDA’s capability to provide immediate access to diverse data across multiple chains through a single integration (see the sketch after this list). He notes the efficiency and speed of accessing data through SEDA, which is crucial for protocols seeking rapid growth and competitiveness.
  • Matt highlights the partnership with Amberdata for the best-quality data, explaining how price feeds are compiled by Amberdata from exchanges and other sources, then shared through SEDA with over 230 blockchains, allowing permissionless access to this data.
  • Peter discusses the challenges developers face in accessing high-quality data, especially for niche markets or newer tokens, emphasizing the importance of timely and efficient data delivery from oracles. 
  • He points out the broad applications of programmable data feeds, from lending markets to tokenized real estate and commodities, underlining that most crypto protocols are essentially addressing the oracle problem.
  • Peter mentions the collaboration with data providers to streamline access to on-chain data, highlighting the versatility of Amberdata in supporting various DeFi protocols with their needs for price feeds, lending, and other financial tools.
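
As a rough illustration of the single-integration idea Stefan and Matt describe, the sketch below assumes a hypothetical DataRequest shape (not SEDA’s actual interface): a consuming protocol states what data it wants, from which provider, and on which chain it should land, and the same request shape is reused regardless of the destination chain.

    from dataclasses import dataclass

    @dataclass
    class DataRequest:
        feed: str               # e.g. "BTC/USD spot price"
        source: str             # provider the feed is built from, e.g. "amberdata"
        destination_chain: str  # chain the consuming protocol lives on
        max_staleness_ms: int   # freshness requirement set by the consumer

    def describe(req: DataRequest) -> str:
        return (f"Deliver {req.feed} from {req.source} to {req.destination_chain}, "
                f"no older than {req.max_staleness_ms} ms")

    # The same request shape is reused unchanged across every destination chain.
    for chain in ["ethereum", "arbitrum", "a-new-appchain"]:
        print(describe(DataRequest("BTC/USD spot price", "amberdata", chain, 1_000)))

The point is only that targeting a new chain changes a parameter, not the integration itself.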

Amberdata: Comprehensive Blockchain Analytics Beyond Trading

  • Matt asks about the scope of Amberdata beyond trading and price functions, and niche data collections like gaming, sports, weather, or real assets.
  • Shawn explains Amberdata’s comprehensive data collection process, capturing every block and transaction across major blockchains to enable analytics, proprietary data sets, and price feeds for accounting, risk, and portfolio management. 
  • He talks about the significance of capturing pending transactions and mempool data for understanding market dynamics, particularly in DeFi and MEV scenarios, emphasizing the critical role of accurate and timely data in arbitrage and trading strategies.
  • Shawn explains the nuance of publishing high-quality data to Oracle networks, emphasizing concerns about data representation and the mixing of data qualities. 
  • He appreciates the early partnership’s focus on high-quality data sources and notes the complexity of adding newly emerging protocols, saying they invest in deep integration with protocols that show significant total value locked and momentum.
  • Shawn mentions the challenge of keeping up with the rapidly changing blockchain space, using Solana as an example of fluctuating developer interest and the high costs of running node infrastructure necessary for data collection and processing.
  • Peter shares that business decisions in data infrastructure focus on developer activity and the longevity of protocols. He describes designing the data layer so that future layer 1s and layer 2s can access it, enabling developers to plug into any blockchain, and addresses the need for high-quality data for new blockchain developments.
  • Peter discusses the concept of fair monetization for data providers within the SEDA platform, allowing them to set custom fees for their data (see the sketch after this list).
  • He highlights the importance of data in DeFi and the potential for data providers to capture the value generated by their data, contrasting this model with the value leakage to third-party searchers in the current ecosystem.
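
A minimal sketch of the monetization idea Peter outlines, assuming a simple per-request fee model: each provider attaches its own fee to its data, and every request that draws on a provider credits that provider directly. The fee amounts and provider names below are hypothetical.

    from collections import defaultdict
    from typing import Dict, List

    # Hypothetical per-request fees set by each provider, in micro-units of a settlement token.
    provider_fees: Dict[str, int] = {"amberdata": 500, "another-provider": 300}
    accrued: Dict[str, int] = defaultdict(int)

    def charge_request(providers: List[str]) -> int:
        """Charge a data request and credit each provider its own fee."""
        total = 0
        for p in providers:
            accrued[p] += provider_fees[p]
            total += provider_fees[p]
        return total

    charge_request(["amberdata", "another-provider"])   # a request using two providers
    charge_request(["amberdata"])                       # a request using one provider
    print(dict(accrued))  # {'amberdata': 1000, 'another-provider': 300}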

Shaping the Future of Finance: Towards a DeFi-Dominated Ecosystem

  • Shawn says that they are experimenting with ways to enable ecosystem growth, envisioning a future where financial markets resemble DeFi more than today’s traditional databases. 
  • They aim to facilitate this evolution and participate in emerging market opportunities, believing that the shift towards DeFi will create significant business opportunities as the value of transactions and services increases.
  • Shawn acknowledges the current economic scale of their endeavors isn’t huge but believes in the long-term potential of DeFi, emphasizing the importance of composability in DeFi for innovation and the creation of complex financial products. 
  • He highlights the traditional financial sector’s substantial spending on data and analytics, suggesting a future where data’s value is more widely recognized in the DeFi space.
  • Peter agrees, noting that data is often perceived as a public good despite the significant effort required to collect, clean, and prepare it to power applications. He mentions the billion-dollar MEV market on Ethereum, suggesting that driving MEV value back to data sources could improve monetization for providers.
  • Shawn expresses caution about MEV, highlighting concerns over practices that could be seen as value extraction at the expense of others, such as front-running. 
  • He advocates for digital assets and crypto to be used in a way that highlights radical transparency and facilitates positive outcomes, aiming for a financial ecosystem where incentives drive beneficial behavior.
  • Shawn discusses the importance of data integrity and the potential financial harm caused by inaccurate data feeds, emphasizing the need for a stable foundation for the ecosystem to grow and benefit everyone. 
  • He advocates for measures to prevent MEV and ensure the transparency and accuracy of data used in the crypto space.
  • Peter highlights the problem of data manipulation and the need for education about data consumption within the DeFi community. 
  • He criticizes the reliance on single data feeds by major DeFi protocols, stressing the importance of using high-quality data sources to mitigate risk and volatility. 
  • Peter advocates for educating both users and developers about the sources of their data and the implications of their choices.
  • Shawn says that launching a reference rate in December was essential because many of the top ETFs rely on a single data provider, raising institutional concerns about determining the accurate price of $BTC.
  • He highlights the need for reliable reference rates to settle derivatives contracts and ETFs, emphasizing adherence to IOSCO principles and the significant financial investment required to develop accurate, referenceable prices (see the aggregation sketch after this list).
  • Stefan comments on the configurability and customizability of SEDA, underlining its importance for builders in terms of security and efficiency. 
  • He appreciates how SEDA and Theta provide tools for developers to tailor the data for their products, fostering the creation of safer, better, and more efficient solutions.
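
One common way a multi-source reference price is built, and a useful contrast with relying on a single feed, is to take the latest quote from several venues, drop stale ones, and publish the median so that no single venue can move the rate on its own. The sketch below is a generic illustration, not Amberdata’s published methodology; the venues and numbers are made up.

    from statistics import median
    from typing import Dict, Tuple

    def reference_rate(quotes: Dict[str, Tuple[float, int]], max_age_ms: int = 5_000) -> float:
        """quotes maps venue -> (last trade price, quote age in ms)."""
        fresh = [price for price, age in quotes.values() if age <= max_age_ms]
        if not fresh:
            raise ValueError("no fresh quotes -- refuse to publish a rate")
        return median(fresh)

    # A single bad or manipulated venue barely moves the published rate.
    print(reference_rate({
        "venue-a": (67_000.0, 120),
        "venue-b": (67_010.0, 300),
        "venue-c": (66_995.0, 90),
        "venue-d": (12_345.0, 150),   # outlier / manipulated print
    }))  # -> 66997.5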

SEDA Enhances DeFi Ecosystem with Agile Data Integration and Development

  • Matt asks about the experiences discussing SEDA’s offerings with data providers and the reception of those conversations.
  • Stefan says that their goal is to work with the best data providers and make their data accessible to various protocols and projects. He explains that the integration process with data providers is straightforward, essentially allowing them to outsource business development efforts to SEDA, which then promotes their data across the DeFi ecosystem.
  • He talks about the challenges of keeping up with the rapidly changing landscape of data traction within the industry. The flexibility of SEDA’s architecture allows for real-time testing of data’s relevance and traction, benefiting both data providers and protocols by enabling agile experimentation and development with minimal cost.
  • Matt expresses excitement about eliminating reliance on single binary network API feeds in the DeFi industry and highlights the importance of understanding MEV and OEV, mentioning resources for deeper insight.
  • Shawn says that adding new assets is done programmatically, simplifying data integration across various platforms, and emphasizes that the complexity lies in DeFi protocol integrations rather than centralized trading venues.
  • Peter corrects Matt’s assumption about the pain of spinning up new smart contracts for feeds, explaining how SEDA’s design simplifies the process through a proxy node that allows new feeds to be supported without deploying new smart contracts (see the sketch after this list).
  • Shawn explains Amberdata’s comprehensive data collection approach, capturing transactions from the moment they occur, whether on centralized or decentralized platforms, readying data for immediate access.
  • Peter discusses the anticipation for the testnet launch of these features, inviting developers to experiment with the feeds on-chain and highlighting the collaboration with Amberdata.
  • Shawn encourages developers to use high-quality data sources like those used by institutional players to ensure the best outcomes for DeFi protocol users and to minimize MEV risks through solutions like SEDA, emphasizing the responsibility to build robust, reliable financial products.
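
The proxy design Peter mentions can be sketched roughly as a single entry point keyed by a feed identifier, so supporting a new feed means pushing values under a new key instead of deploying a new contract. The FeedProxy class below is an illustrative assumption, not SEDA’s actual interface.

    from typing import Dict, Tuple

    class FeedProxy:
        """Single entry point that stores the latest value for any number of feeds."""

        def __init__(self) -> None:
            self._feeds: Dict[str, Tuple[float, int]] = {}   # feed id -> (value, timestamp ms)

        def push(self, feed_id: str, value: float, timestamp_ms: int) -> None:
            """Write the newest value; previously unseen feed ids are created on the fly."""
            current = self._feeds.get(feed_id)
            if current is None or timestamp_ms > current[1]:
                self._feeds[feed_id] = (value, timestamp_ms)

        def read(self, feed_id: str) -> Tuple[float, int]:
            """Return the latest value and timestamp for a feed."""
            return self._feeds[feed_id]

    proxy = FeedProxy()
    proxy.push("BTC/USD", 67_000.0, 1_711_000_000_000)
    proxy.push("NEW-TOKEN/USD", 0.42, 1_711_000_000_500)   # new feed, no new deployment
    print(proxy.read("NEW-TOKEN/USD"))                     # (0.42, 1711000000500)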

Gold Standards: Amberdata’s Pioneering Launch and SEDA’s Security Milestones

  • Peter says using a variety of high-quality sources is crucial, highlights the importance of adopting the gold standard for data when building valuable projects, and expresses excitement for Amberdata’s upcoming launch.
  • Matt highlights the evolving technology and protocols that enable builders to produce high-quality projects. He points out the industry’s shift towards setting gold standards.
  • Shawn invites developers interested in Amberdata to visit their website for extensive resources, including YouTube channels, market updates, and research. He mentions Amberdata’s active presence on social media and their contact email for further engagement.
  • Matt adds that Amberdata’s content is exceptional and shares plans for case studies on the SEDA network’s integration with Amberdata post-launch, urging followers to engage with their Medium and Twitter for updates.
  • Peter announces the completion of a security audit by Trail of Bits and hints at an imminent mainnet launch for SEDA, mentioning ongoing preparations and advising followers to look out for updates on their official Twitter and website.

Show Information

  • Medium: Twitter (Audio)
  • Show: SEDA Twitter Space 
  • Show Title: Episode #5 A New Standard For Data Providers Ft. Amberdata
  • Show Date: March 21, 2024