Reliable, Fast Access to On-Chain Data Insights
TokenAnalyst is building core infrastructure for integrating, cleaning, and analyzing blockchain data, using Apache Kafka® as its central data hub. The company offers both historical and low-latency streams of on-chain data across multiple blockchains. Confluent Schema Registry helps them manage multiple semantically enriched data models for the same underlying data. On the event streaming side, Kafka Streams powers reliable, low-latency streams for financial applications, while KSQL lets them experiment with raw or lifted streams and deploy new machine learning models without writing Java code. With the importance of on-chain data expected to grow, TokenAnalyst anticipates joining on-chain data with off-chain sources, and leverages Confluent Platform to build resilient data pipelines.
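The KSQL workflow mentioned above (experimenting with raw or "lifted" streams without writing Java) could be sketched roughly as follows. This is an illustrative sketch only; the stream, topic, and column names (`eth_raw_transactions`, `value_wei`, and so on) are hypothetical and not taken from TokenAnalyst's actual pipeline:

```sql
-- Declare a KSQL stream over a hypothetical raw on-chain topic
-- (assumption: JSON-encoded transaction events).
CREATE STREAM eth_raw_transactions (
    tx_hash      VARCHAR,
    from_address VARCHAR,
    to_address   VARCHAR,
    value_wei    BIGINT
) WITH (KAFKA_TOPIC='eth_transactions', VALUE_FORMAT='JSON');

-- Derive a semantically "lifted" stream declaratively, without Java code:
CREATE STREAM eth_large_transfers AS
    SELECT tx_hash, from_address, to_address, value_wei
    FROM eth_raw_transactions
    WHERE value_wei > 1000000000000000000;  -- transfers above 1 ETH (10^18 wei)
```

Declarative definitions like these allow rapid experimentation before logic is promoted into a hardened Kafka Streams application.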
Company
Confluent
Date published
June 7, 2019
Author(s)
Jendrik Poloczek, Gil Friedlis, Matt Mangia
Word count
1416
Hacker News points
None found.
Language
English