Batch ETL on Google Cloud Platform (GCP), built with Cloud Composer, Google Cloud Storage, Dataflow, and Cloud Build.
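A minimal sketch of how such a batch ETL might be orchestrated from Cloud Composer, assuming a pre-built Dataflow template; the bucket, template path, and job parameters are placeholders, not this repository's actual configuration.

```python
# Composer (Airflow) DAG sketch: wait for a daily file in GCS, then launch a
# templated Dataflow job. All GCS paths and names below are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataflow import (
    DataflowTemplatedJobStartOperator,
)
from airflow.providers.google.cloud.sensors.gcs import GCSObjectExistenceSensor

with DAG(
    dag_id="batch_etl_gcs_to_dataflow",
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Block until the day's input file has landed in the bucket.
    wait_for_file = GCSObjectExistenceSensor(
        task_id="wait_for_input_file",
        bucket="my-input-bucket",              # placeholder bucket
        object="raw/{{ ds }}/input.csv",       # placeholder object path
    )

    # Launch a Dataflow job from a template stored in GCS.
    run_dataflow = DataflowTemplatedJobStartOperator(
        task_id="run_dataflow_template",
        template="gs://my-templates/batch_etl",  # placeholder template
        job_name="batch-etl-{{ ds_nodash }}",
        parameters={"input": "gs://my-input-bucket/raw/{{ ds }}/input.csv"},
        location="us-central1",
    )

    wait_for_file >> run_dataflow
```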
Manage big data in the cloud to find the best-selling Audible books, generate reports and dashboards, and offer products and sales promotions that meet the needs of consumers in Thailand.
An ETL pipeline that moves an uploaded flat file from GCS, masks PII, loads the data into BigQuery, and creates a report in Looker.
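A minimal sketch of the mask-then-load step using Apache Beam (runnable on Dataflow), assuming a CSV with name, email, and amount columns; hashing stands in for whatever masking policy the project actually applies, and the bucket and table names are placeholders.

```python
# Read a flat file from GCS, hash the PII fields, and append to BigQuery.
import hashlib

import apache_beam as beam


def mask_pii(line: str) -> dict:
    """Parse one CSV line and replace PII fields with SHA-256 digests."""
    name, email, amount = line.split(",")
    return {
        "name_hash": hashlib.sha256(name.encode()).hexdigest(),
        "email_hash": hashlib.sha256(email.encode()).hexdigest(),
        "amount": float(amount),
    }


with beam.Pipeline() as pipeline:
    (
        pipeline
        | "ReadFlatFile" >> beam.io.ReadFromText(
            "gs://my-bucket/uploads/transactions.csv", skip_header_lines=1
        )
        | "MaskPII" >> beam.Map(mask_pii)
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "my-project:analytics.transactions_masked",
            schema="name_hash:STRING,email_hash:STRING,amount:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```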
Ask Ubuntu log analysis with PySpark on GCP | Pipeline orchestrated with Airflow (Cloud Composer)
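A minimal PySpark sketch of the kind of analysis described above, counting questions per tag; the input path and column names are assumptions, not the repository's actual schema.

```python
# Load Ask Ubuntu post data from GCS and rank the most frequent tags.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("askubuntu-logs").getOrCreate()

# Placeholder path; on Dataproc the GCS connector makes gs:// paths readable.
posts = spark.read.json("gs://my-bucket/askubuntu/posts/*.json")

top_tags = (
    posts
    .withColumn("tag", F.explode(F.split(F.col("tags"), ",")))
    .groupBy("tag")
    .count()
    .orderBy(F.col("count").desc())
)

top_tags.show(20)
```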
Inventory value is also important for determining a company's liquidity, or its ability to meet its short-term financial obligations. A high inventory value can indicate that a company has too much money tied up in inventory, which could make it difficult for the company to pay its bills.
A simple DAG for triggering a Cloud Data Fusion pipeline using Apache Airflow.
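A minimal sketch of such a DAG, using the Data Fusion operator from the apache-airflow-providers-google package; the region, instance, and pipeline names are placeholders.

```python
# Trigger a deployed Cloud Data Fusion pipeline on a daily schedule.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.datafusion import (
    CloudDataFusionStartPipelineOperator,
)

with DAG(
    dag_id="trigger_datafusion_pipeline",
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    start_pipeline = CloudDataFusionStartPipelineOperator(
        task_id="start_pipeline",
        location="us-central1",         # placeholder region
        instance_name="my-datafusion",  # placeholder Data Fusion instance
        pipeline_name="daily_ingest",   # placeholder deployed pipeline name
    )
```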
This project leverages GCS, Composer, Dataflow, BigQuery, and Looker on Google Cloud Platform (GCP) to build a robust data engineering solution for processing, storing, and reporting daily transaction data in the online food delivery industry.
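A minimal sketch of the daily load step in such a solution, appending the day's transaction file from GCS to a BigQuery table that Looker can report on; all bucket, dataset, and table names are placeholders.

```python
# Composer (Airflow) DAG sketch: daily GCS-to-BigQuery load of transactions.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="daily_food_delivery_load",
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load_transactions = GCSToBigQueryOperator(
        task_id="load_transactions",
        bucket="food-delivery-raw",                    # placeholder bucket
        source_objects=["transactions/{{ ds }}.csv"],  # placeholder object
        destination_project_dataset_table="my-project.sales.daily_transactions",
        source_format="CSV",
        skip_leading_rows=1,
        autodetect=True,
        write_disposition="WRITE_APPEND",
    )
```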