Reliable, Fast Access to On-Chain Data Insights

What's this blog post about?

TokenAnalyst is building core infrastructure for integrating, cleaning, and analyzing blockchain data, with Apache Kafka® as its central data hub. The company offers historical and low-latency streams of on-chain data across multiple blockchains. Confluent Schema Registry helps it manage multiple semantically enriched data models for the same underlying data, while Kafka Streams powers reliable, low-latency streams for financial applications. KSQL lets the team experiment with raw or "lifted" (semantically enriched) streams and deploy new machine learning models without writing Java code. Expecting the importance of on-chain data to keep growing, TokenAnalyst anticipates joining on-chain with off-chain data, and it leverages Confluent Platform to build resilient data pipelines.
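The post describes deriving enriched streams with KSQL instead of Java. As a rough sketch of that pattern (the topic, stream, and column names below are hypothetical, not taken from the post), a raw on-chain stream registered with an Avro schema can be filtered into a "lifted" stream declaratively:

```sql
-- Hypothetical example: topic and column names are illustrative only.
-- With VALUE_FORMAT='AVRO', KSQL pulls the schema from Schema Registry,
-- so no column list is needed here.
CREATE STREAM eth_transactions WITH (
    KAFKA_TOPIC = 'ethereum-transactions',
    VALUE_FORMAT = 'AVRO'
);

-- Derive a semantically enriched ("lifted") stream of large transfers
-- without writing any Java code.
CREATE STREAM large_transfers AS
    SELECT tx_hash, from_address, to_address, value_wei
    FROM eth_transactions
    WHERE value_wei > 1000000000000000000;
```

The derived stream is itself a Kafka topic, so downstream consumers (such as a model-scoring service) can subscribe to it like any other stream.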

Company
Confluent

Date published
June 7, 2019

Author(s)
Jendrik Poloczek, Gil Friedlis, Matt Mangia

Word count
1416

Language
English

Hacker News points
None found.


By Matt Makai. 2021-2024.