Make sure you have Docker installed, then install the DataHub CLI and start the local quickstart containers:
python3 -m pip install --upgrade acryl-datahub
datahub docker quickstart [--version TEXT (e.g. "v0.9.2")]
datahub docker ingest-sample-data
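Once the quickstart containers are up, the DataHub UI should be reachable at http://localhost:9002. As a quick sanity check you can also run the CLI's built-in container check (this subcommand ships with recent versions of the acryl-datahub CLI installed above; if yours lacks it, just open the UI in a browser):
datahub docker check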
Go to the local_airflow directory and run:
docker-compose -f docker-compose.yml up -d
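For reference, a minimal single-container Airflow compose file looks roughly like the sketch below; the image tag, host port, and volume mount are assumptions, and the docker-compose.yml shipped in local_airflow may differ.

```yaml
# Illustrative sketch only; not necessarily what local_airflow/docker-compose.yml contains
version: "3"
services:
  airflow:
    image: apache/airflow:2.5.0    # image tag assumed
    command: standalone            # webserver + scheduler in one container, local use only
    ports:
      - "8081:8080"                # host port assumed, to avoid clashing with DataHub GMS on 8080
    volumes:
      - ./dags:/opt/airflow/dags   # mount local DAGs into the container
```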
Go to the local_kafka directory and run:
docker-compose -f docker-compose.yml up -d
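The commands later in this guide assume the broker container is named kafka_test_broker and listens on port 49816, so the compose file is along the lines of the sketch below; the images and the ZooKeeper wiring are illustrative, and local_kafka/docker-compose.yml may differ.

```yaml
# Illustrative sketch only; images, versions, and listener settings are assumptions
version: "3"
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.3.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
  kafka_test_broker:
    image: confluentinc/cp-kafka:7.3.0
    depends_on:
      - zookeeper
    ports:
      - "49816:49816"
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_LISTENERS: PLAINTEXT://0.0.0.0:49816
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka_test_broker:49816
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
```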
Go to the local_datahub/recipes directory and run the Kafka ingestion recipe:
datahub ingest -c kafka_test_recipe.dhub.yaml
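The recipe file itself is not reproduced here, but a DataHub Kafka source recipe generally has the shape sketched below; the bootstrap address and GMS endpoint are assumptions based on the local broker and the quickstart defaults, so check kafka_test_recipe.dhub.yaml for the real values.

```yaml
# Sketch of a Kafka ingestion recipe; values are assumptions
source:
  type: kafka
  config:
    connection:
      bootstrap: "localhost:49816"      # local broker address assumed
sink:
  type: datahub-rest
  config:
    server: "http://localhost:8080"     # default quickstart GMS endpoint
```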
Optional: send Ethereum transactions to the local Kafka broker if you want to test Kafka ingestion into DataHub.
Go to the scripts directory and run:
python3 eth_tx.py
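eth_tx.py is not shown here; as a rough idea of what such a producer does, the sketch below pushes JSON-encoded transaction records onto the transaction topic with kafka-python. The broker address, the use of kafka-python, and the dummy payloads are assumptions; the real script may pull live data from an Ethereum node.

```python
# Rough sketch of a transaction producer; not the actual eth_tx.py
import json
import time

from kafka import KafkaProducer  # pip install kafka-python

producer = KafkaProducer(
    bootstrap_servers="localhost:49816",  # broker address assumed
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Publish a handful of dummy transaction records to the "transaction" topic
for i in range(10):
    tx = {"hash": f"0x{i:064x}", "value_wei": i * 10**18, "timestamp": int(time.time())}
    producer.send("transaction", tx)

producer.flush()
```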
To confirm the topic was created, list the topics on the broker:
docker exec kafka_test_broker \
  kafka-topics --bootstrap-server kafka_test_broker:49816 \
  --list
To inspect the messages, consume the transaction topic from the beginning:
docker exec --interactive --tty kafka_test_broker \
  kafka-console-consumer --bootstrap-server kafka_test_broker:49816 \
  --topic transaction \
  --from-beginning