External Data Hub
A component responsible for collecting and processing on-chain and off-chain data, then storing it in a database structure optimized for fast retrieval and aggregation. The platform supports multiple independent data pipelines, one per service domain, consisting of the following sub-modules:
Social Crawler: Collects data generated on popular social network services, primarily through the OpenAPIs those services provide or via commercial data providers.
Game Crawler: Targets large volumes of real-time data generated during gameplay, as well as data obtainable from popular game portals. In the former case, it uses SDKs or OpenAPIs provided by game service providers to collect domain-specific datasets, processing and storing the data in a form suited to downstream analysis.
DeFi Crawler: Collects, processes, and stores transaction events generated on-chain by well-known decentralized protocols. Off-chain DeFi-related community data is also targeted for collection.
eCommerce Crawler: Continuously collects and lists products from open marketplaces that can be purchased with tokens. Useful in scenarios such as hot deals, gifts, and events; it can be integrated into agents through tool functions, for example one that posts purchase links on social channels.
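The sub-modules above share a common pattern: each pipeline collects data for its domain, normalizes it, and writes it to storage indexed for fast retrieval and aggregation. A minimal sketch of that pattern in Python, assuming hypothetical names (`Record`, `Crawler`, `DataHub`, `SocialCrawler`) and using an in-memory SQLite table as a stand-in for the actual optimized DB:

```python
import sqlite3
from dataclasses import dataclass
from typing import Protocol


@dataclass
class Record:
    """Normalized record emitted by every crawler, regardless of domain."""
    domain: str   # e.g. "social", "game", "defi", "ecommerce"
    source: str   # originating service or protocol
    payload: str  # processed data, serialized for storage


class Crawler(Protocol):
    """Interface every domain pipeline's collector implements."""
    domain: str

    def collect(self) -> list[Record]: ...


class SocialCrawler:
    """Illustrative stand-in; a real crawler would call a social OpenAPI."""
    domain = "social"

    def collect(self) -> list[Record]:
        # A real implementation would page through the service's OpenAPI
        # or a commercial provider's feed here.
        return [Record(self.domain, "example-sns", '{"posts": 2}')]


class DataHub:
    """Runs one independent pipeline per service domain and stores results
    in a table indexed for fast per-domain retrieval and aggregation."""

    def __init__(self) -> None:
        self.db = sqlite3.connect(":memory:")
        self.db.execute(
            "CREATE TABLE records (domain TEXT, source TEXT, payload TEXT)")
        self.db.execute("CREATE INDEX idx_domain ON records (domain)")
        self.pipelines: dict[str, Crawler] = {}

    def register(self, crawler: Crawler) -> None:
        self.pipelines[crawler.domain] = crawler

    def run(self) -> None:
        # Each pipeline runs independently; a failure in one domain
        # would not need to block the others.
        for crawler in self.pipelines.values():
            rows = [(r.domain, r.source, r.payload)
                    for r in crawler.collect()]
            self.db.executemany(
                "INSERT INTO records VALUES (?, ?, ?)", rows)

    def count(self, domain: str) -> int:
        # Aggregation over the indexed domain column.
        return self.db.execute(
            "SELECT COUNT(*) FROM records WHERE domain = ?", (domain,)
        ).fetchone()[0]
```

Registering a crawler and running the hub (`hub.register(SocialCrawler()); hub.run()`) then lets per-domain queries like `hub.count("social")` answer directly off the index; the real component would apply the same separation between collection, normalization, and indexed storage.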