```mermaid
flowchart LR
    subgraph upload
        files[("file(s)")]
        data-provider["👤 Data Provider"]
    end
    subgraph download
        data-receiver["👤 Data Receiver"]
        files2[("file(s)")]
    end
    bucket[\"Cloud Storage\n Bucket"/]
    files --> |to| bucket
    data-provider --> |uploads| files
    bucket --> |provides\naccess to| files2
    files2 --> |received by| data-receiver
```
Template leveraging Cookiecutter to create a Cloud Storage bucket on Google Cloud with a service account and related key, enabling data or file upload and use.
This repository uses Terraform to maintain cloud resources. See `terraform/README.md` for documentation on Terraform elements.
See below for an overview of the roles that provide context for various parts of this repository.
- Terraform Administrator: this role involves administering cloud resources created with Terraform. Content found under the `terraform` directory and the steps under Tutorial: Bucket Infrastructure apply to this role.
- Data Provider: this role involves using content under `utilities/data-provider` to synchronize (add, update, or remove) data to the bucket created by a Terraform Administrator. Instructions specific to this role are provided under `utilities/data-provider/README.md`.
- Data Receiver: this role involves downloading content from the bucket after it has been uploaded by the Data Provider. Associated content may be found under `utilities/data-receiver/README.md`.
See below for steps which are required for installation.
- Clone the repository to your development environment.
- Install Terraform.
- Configure Terraform as necessary for your Google Cloud environment (see the sketch below for one example).
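As a rough illustration of what this configuration step may involve, a Google provider block is sketched below. The project and region values are placeholders rather than values from this repository, and the template may handle provider configuration differently (for example, through variables).

```hcl
# Minimal sketch of a Google provider configuration for local use.
# The project and region values are placeholder assumptions, not
# values taken from this repository.
provider "google" {
  project = "my-gcp-project"
  region  = "us-central1"
}
```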
See below for brief tutorials on how to implement the work found in this repository for your needs.
These steps cover how to store Terraform state management files in association with the bucket infrastructure below. Terraform tracks the cloud resources it creates in a state file (`.tfstate`). If multiple people want to manage the same resources at the same time, they all need access to the same state file, or else they risk overwriting or corrupting state data. One option for sharing this state file is a Google Cloud Storage bucket, which is the option used here.
Note: Terraform cloud state management must be set up before it is referenced as a backend.
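For illustration, once the state bucket exists, referencing it as a backend typically looks something like the sketch below; the bucket name and prefix shown here are placeholders, not values defined by this template.

```hcl
# Sketch of a GCS backend block referencing an existing state bucket.
# Bucket name and prefix are placeholder assumptions.
terraform {
  backend "gcs" {
    bucket = "my-terraform-state-bucket"
    prefix = "terraform/state"
  }
}
```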
- Make adjustments to the content as necessary (for example, this readme file).
- Fill in `terraform.tfvars` with values that make sense for your initiative (note: these are by default filled in from cookiecutter values; see the sketch after this list).
- Terraform init: to ensure Terraform is initialized, use the command `terraform -chdir=terraform/state-management init`.
- Terraform plan: to plan the work and observe any needs, use the command `terraform -chdir=terraform/state-management plan`.
- Terraform apply: to apply the work and create resources, use the command `terraform -chdir=terraform/state-management apply`.
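As an example of the kind of values involved, a filled-in `terraform.tfvars` might look roughly like the following; the variable names and values shown here are illustrative assumptions and may not match the variables this template actually defines.

```hcl
# Hypothetical terraform.tfvars contents; variable names are assumptions
# and may differ from those defined under terraform/state-management.
project_id  = "my-gcp-project"
region      = "us-central1"
bucket_name = "my-initiative-terraform-state"
```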
These steps cover how to control the infrastructure found within this repository.
| ❗ Please note: after applying the Terraform code with the steps below, a `service-account.json` file is added to your local `/utilities/data-provider` directory which contains sensitive data that may enable access to your cloud resources. This file should not be checked into source control! |
|---|
- Make adjustments to the content as necessary (for example, this readme file).
- Fill in `terraform.tfvars` with values that make sense for your initiative (note: these are by default filled in from cookiecutter values).
- Terraform init: to ensure Terraform is initialized, use the command `terraform -chdir=terraform/operations init`.
- Terraform plan: to plan the work and observe any needs, use the command `terraform -chdir=terraform/operations plan`.
- Terraform apply: to apply the work and create resources, use the command `terraform -chdir=terraform/operations apply` (a rough sketch of the resources involved follows this list).
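For context, the resources created by the apply step might look roughly like the sketch below; the resource names and arguments are assumptions and may differ from the actual definitions under `terraform/operations`.

```hcl
# Illustrative sketch only; resource names and arguments are assumptions
# and may differ from the definitions in this repository.
resource "google_storage_bucket" "data" {
  name                        = var.bucket_name
  location                    = var.region
  uniform_bucket_level_access = true
}

resource "google_service_account" "data_provider" {
  account_id   = "data-provider"
  display_name = "Data Provider"
}

# The private key from this resource is what typically ends up in a
# service-account.json file for the data provider.
resource "google_service_account_key" "data_provider" {
  service_account_id = google_service_account.data_provider.name
}
```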
When finished with the work, optionally use the following step.
- OPTIONAL: Terraform destroy: to destroy all created resources, use the command `terraform -chdir=terraform/operations destroy`.
These steps cover an example of how to use the bucket after creating the surrounding infrastructure.
- Data Upload (Data Provider): please see `utilities/data-provider/README.md` for more information.
- Data Download (Data Receiver): please see `utilities/data-receiver/README.md` for more information.
Development for this repository is assisted by the following technologies:
- Poetry: Used to help configure pre-commit for local development work.
- Pre-commit: Used for performing checks within the local development environment and via GitHub Actions automated testing. The following sub-items are used as checks through pre-commit-terraform and require local installation when testing outside of Dagger:
  - terraform_docs: Used to automatically generate Terraform-specific documentation.
  - tflint: Used to perform static analysis (linting) on Terraform content.
  - tfsec: Used to perform security-focused static analysis (security linting) on Terraform content.
- Dagger: Used to help orchestrate reproducible testing within the local development environment and for automated testing.