This repository contains an ELT pipeline that runs entirely in Docker containers. It extracts data from a source Postgres database, loads it into a destination Postgres database, and then runs transformations in the destination database using a dbt model. The pipeline can also be scheduled via cron jobs, and the whole workflow is orchestrated end to end by Airflow.
- airflow: This folder contains the configuration for Airflow.
- airflow/dags/elt_dag.py: Python script defining the Airflow DAG
- custom_postgres: This folder contains the configuration for the dbt model.
- custom_postgres/models/example/*.sql: SQL model files that reference the tables in the destination database
- custom_postgres/models/example/schema.yaml: YAML file with the schema for the tables in the destination database
- custom_postgres/models/example/sources.yaml: YAML file declaring the destination database as a dbt source
- elt_script: This folder contains the resources for the ELT activity.
- elt_script/Dockerfile: This Dockerfile sets up a Python environment and installs the PostgreSQL client tools. It also copies the ELT script into the container and sets it as the default command, along with the cron job configuration.
- elt_script/elt_script.py: This Python script performs the ELT process. It waits for the source PostgreSQL database to become available, then dumps its data to an SQL file and loads this data into the destination PostgreSQL database.
- source_db_init: This folder houses the SQL script that initializes the source database with sample data. It creates tables for users, films, film categories, actors, and film actors, and inserts sample data into these tables.
- Dockerfile: This file sets up the Airflow environment.
- docker-compose.yaml: This file spins up the following Docker containers:
  - Source Postgres database with example data
  - Destination Postgres database
  - Postgres database for Airflow
  - Airflow service
  - Webserver for Airflow
  - Scheduler for Airflow
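As a rough sketch, a `docker-compose.yaml` for the container layout above could look like the following; the service names, images, and credentials here are illustrative assumptions, not this repository's actual configuration:

```yaml
version: "3"

services:
  source_postgres:                 # source database, seeded with sample data
    image: postgres:15
    environment:
      POSTGRES_DB: source_db
      POSTGRES_PASSWORD: secret
    volumes:
      # init SQL in source_db_init runs on the container's first start
      - ./source_db_init:/docker-entrypoint-initdb.d

  destination_postgres:            # destination database the ELT script loads into
    image: postgres:15
    environment:
      POSTGRES_DB: destination_db
      POSTGRES_PASSWORD: secret

  postgres:                        # metadata database backing Airflow
    image: postgres:15
    environment:
      POSTGRES_DB: airflow
      POSTGRES_PASSWORD: secret

  webserver:                       # Airflow webserver, built from the root Dockerfile
    build: .
    command: webserver
    depends_on: [postgres]

  scheduler:                       # Airflow scheduler
    build: .
    command: scheduler
    depends_on: [postgres]
```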
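The model files under `custom_postgres/models/example/` follow dbt's usual pattern of selecting from sources declared in `sources.yaml` (or from other models via `ref`). The source and table names below are hypothetical examples, not the repository's actual models:

```sql
-- Hypothetical dbt model: materializes a films relation in the destination DB.
-- {{ source(...) }} resolves against sources.yaml; {{ ref(...) }} would point at another model.
SELECT
    f.film_id,
    f.title,
    f.release_date
FROM {{ source('destination_db', 'films') }} AS f
```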
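In broad strokes, the logic in `elt_script/elt_script.py` can be sketched as below. The host names, credentials, and file names are hypothetical stand-ins for the compose network's real values (in practice the password would come from an environment variable such as `PGPASSWORD`):

```python
import subprocess
import time

def dump_command(host, port, user, db, out_file):
    """Build the pg_dump invocation that exports the source database to a SQL file."""
    return ["pg_dump", "-h", host, "-p", str(port), "-U", user, "-d", db, "-f", out_file]

def load_command(host, port, user, db, in_file):
    """Build the psql invocation that replays the dump into the destination database."""
    return ["psql", "-h", host, "-p", str(port), "-U", user, "-d", db, "-f", in_file]

def wait_for_postgres(host, retries=10, delay_seconds=3):
    """Poll pg_isready until the server accepts connections or retries run out."""
    for _ in range(retries):
        if subprocess.run(["pg_isready", "-h", host], capture_output=True).returncode == 0:
            return True
        time.sleep(delay_seconds)
    return False

def run_elt():
    """Wait for the source DB, dump it, then load the dump into the destination."""
    if not wait_for_postgres("source_postgres"):  # hypothetical compose service name
        raise RuntimeError("source database never became available")
    subprocess.run(dump_command("source_postgres", 5432, "postgres",
                                "source_db", "data_dump.sql"), check=True)
    subprocess.run(load_command("destination_postgres", 5432, "postgres",
                                "destination_db", "data_dump.sql"), check=True)
```

`run_elt()` would then be invoked by the container's default command or triggered on a schedule by cron or the Airflow DAG.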