Dynamic_ETL_Pipeline_Project_with_Azure

Developed a dynamic ETL pipeline using Azure Data Factory to extract, transform, and load over 800k rows from the New York City Taxi API into Azure Data Lake Storage Gen2.
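In Azure Data Factory the extract step is a parameterized copy activity; the sketch below mirrors that dynamic expression in plain Python, building a per-month source URL and a bronze-layer sink path. The URL pattern, storage account name, and container layout are illustrative assumptions, not the repository's actual configuration.

```python
# Sketch of the parameterized extract/load step: the NYC TLC trip-data URL
# pattern and the ADLS Gen2 container layout below are assumptions for
# illustration only.

def source_url(year: int, month: int) -> str:
    """Build the per-month source file URL (hypothetical pattern)."""
    return (
        "https://d37ci6vzurychx.cloudfront.net/trip-data/"
        f"green_tripdata_{year}-{month:02d}.parquet"
    )

def sink_path(year: int, month: int) -> str:
    """Destination path in an assumed 'bronze' container of the data lake."""
    return (
        "abfss://bronze@storageacct.dfs.core.windows.net/"
        f"trip-data/{year}/{month:02d}.parquet"
    )
```

In ADF itself, `year` and `month` would be pipeline parameters fed into dynamic-content expressions on the dataset's URL and file path.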

Utilized Databricks for data transformation across Bronze, Silver, and Gold layers, optimizing storage with Parquet and Delta Lake formats.
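In the project this transformation runs as PySpark over Parquet/Delta tables in Databricks; the minimal sketch below uses plain Python dicts only to illustrate the layer responsibilities, with cleaning rules that are assumptions rather than the notebook's actual logic.

```python
# Minimal sketch of the Bronze -> Silver -> Gold flow. The specific column
# names and cleaning rules are assumed for illustration.

def to_silver(bronze_rows):
    """Silver: drop malformed records and normalise types (assumed rules)."""
    silver = []
    for row in bronze_rows:
        try:
            fare = float(row["fare_amount"])
        except (KeyError, TypeError, ValueError):
            continue  # discard rows with missing or unparseable fares
        if fare < 0:
            continue  # discard negative fares (refunds / bad reads)
        silver.append({"vendor": str(row.get("vendor_id", "unknown")), "fare": fare})
    return silver

def to_gold(silver_rows):
    """Gold: aggregate to a reporting-ready shape (total fare per vendor)."""
    totals = {}
    for row in silver_rows:
        totals[row["vendor"]] = totals.get(row["vendor"], 0.0) + row["fare"]
    return totals
```

The same shape in PySpark would be a filtered, casted DataFrame written to a Silver Delta table, then a `groupBy(...).agg(...)` written to Gold.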

Implemented advanced features like recursive file lookup and DDL schema management for scalable and efficient data workflows.
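The recursive file lookup in ADF is typically a Get Metadata / ForEach pattern walking nested folders; the sketch below shows the same idea locally with `pathlib`, discovering every Parquet file under a nested landing directory. The directory layout is a hypothetical example.

```python
from pathlib import Path

# Sketch of a recursive file lookup: find every .parquet file anywhere
# under a landing root, however deeply the folders are nested.

def find_parquet_files(root: str) -> list[str]:
    """Recursively list all .parquet files under root, sorted for stable runs."""
    return sorted(str(p) for p in Path(root).rglob("*.parquet"))
```

Each discovered path would then drive one iteration of the downstream copy or transform activity.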
