
Snowflake-ML

Toy use case on using Snowflake as a full end-to-end ML platform.


Getting started

To get started, whether you want to contribute or just run the applications, first clone the repo and install the dependencies.

git clone [email protected]:datarootsio/snowflake-ml.git
cd snowflake-ml
pip install poetry==1.1.2  # optional, install poetry if needed
poetry install
pre-commit install  # optional, though recommended - install pre-commit hooks

Running the app

poetry run streamlit run dashboard/👋_hello.py

How it works

The app consists of:

  • Local Apache Kafka connector
  • Apache Kafka cluster on Confluent Cloud
  • Snowflake data warehouse
  • Streamlit app (run locally)

Both the Confluent Cloud and Snowflake infrastructure are managed by Terraform.
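
As a rough illustration of the ingestion side (a hedged sketch, not the connector shipped in this repo; the topic name, credentials and record layout are placeholders), a local producer pushing records to the Confluent Cloud cluster with the confluent-kafka Python client could look like this:

import json

from confluent_kafka import Producer

# Placeholder Confluent Cloud settings - in this project the cluster and its
# credentials are provisioned by Terraform, not hard-coded like this.
producer = Producer({
    "bootstrap.servers": "<BOOTSTRAP_SERVERS>",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<API_KEY>",
    "sasl.password": "<API_SECRET>",
})

# Publish one JSON-encoded record to a hypothetical "events" topic.
record = {"id": 1, "value": 42.0}
producer.produce("events", value=json.dumps(record).encode("utf-8"))
producer.flush()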

ML Solution Architecture

Taking a closer look at Snowflake, the landing tables are updated in near real time via Confluent Cloud's Snowflake Connector and Snowpipe. From there, the data is transformed through views and materialized views to produce aggregate statistics. In parallel, Snowflake streams, tasks and Python UDFs apply a machine learning model to the incoming data and store the predictions in a table that is read by the Streamlit app.
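
As a hedged sketch of that prediction path (the table, stream, task and UDF names below are placeholders, not the objects defined in this repo), the stream-plus-task pattern issued through the Snowflake Python connector looks roughly like this:

import snowflake.connector

# Placeholder credentials; in practice these come from the environment
# and the Terraform-managed Snowflake resources.
conn = snowflake.connector.connect(
    account="<ACCOUNT>",
    user="<USER>",
    password="<PASSWORD>",
    warehouse="<WAREHOUSE>",
    database="<DATABASE>",
    schema="<SCHEMA>",
)
cur = conn.cursor()

# Stream that tracks rows newly landed in a hypothetical LANDING table.
cur.execute("CREATE STREAM IF NOT EXISTS landing_stream ON TABLE landing")

# Task that periodically scores the new rows with a hypothetical Python UDF
# PREDICT() and appends the results to a PREDICTIONS table read by Streamlit.
cur.execute("""
    CREATE TASK IF NOT EXISTS score_new_rows
      WAREHOUSE = my_warehouse
      SCHEDULE = '5 MINUTE'
    AS
      INSERT INTO predictions
      SELECT id, predict(features) AS prediction FROM landing_stream
""")
cur.execute("ALTER TASK score_new_rows RESUME")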


Support

This project is maintained by dataroots. For any questions, contact us at [email protected] 🚀
