- `terraform init`: Initializes the Terraform project and downloads the required providers.
- `terraform plan`: Previews the changes Terraform will apply.
- `terraform apply`: Executes the plan and applies the changes defined in the configuration files.
- `terraform destroy`: Removes all resources defined in the configuration files.
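The four commands above form the standard lifecycle. As a minimal sketch, the snippet below only echoes each step in order rather than executing it, so it is safe to run anywhere:

```shell
# The four Terraform lifecycle commands, in the order they are normally run.
# Echoed rather than executed so this sketch has no side effects.
for cmd in init plan apply destroy; do
  echo "terraform $cmd"
done
```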
- Format Terraform files:

  ```shell
  terraform fmt
  ```
The service account created for this project is the following:
This account has been assigned the following permissions:
Note: The permissions outlined here (such as Storage Admin or BigQuery Admin) are broad and intended for development purposes. In a production environment, it is recommended to follow the principle of least privilege by granting more restricted permissions that allow only the necessary actions, which minimizes security risk.
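As one sketch of least privilege, a narrower role can be bound instead of a broad Admin role; the project ID and service account name below are placeholders, not values from this project:

```shell
# Illustrative only: PROJECT_ID and SERVICE_ACCOUNT are placeholder names.
PROJECT_ID="your-project-id"
SERVICE_ACCOUNT="terraform@${PROJECT_ID}.iam.gserviceaccount.com"

# Grant object-level storage access instead of the broad Storage Admin role.
# Guarded so the snippet is a no-op where the gcloud CLI is unavailable.
if command -v gcloud >/dev/null 2>&1; then
  gcloud projects add-iam-policy-binding "$PROJECT_ID" \
    --member="serviceAccount:${SERVICE_ACCOUNT}" \
    --role="roles/storage.objectAdmin"
fi
```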
- Authenticate with Google Cloud:

  ```shell
  gcloud auth application-default login
  ```
- Set the `GOOGLE_APPLICATION_CREDENTIALS` environment variable to the path of your service account key file:

  ```shell
  export GOOGLE_APPLICATION_CREDENTIALS="/path/to/your/key.json"
  ```
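A quick sanity check that the variable actually points at an existing key file can save a confusing Terraform error later; the path below is the same placeholder as above:

```shell
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/your/key.json"

# Warn early if the key file is missing at the configured path.
if [ -f "$GOOGLE_APPLICATION_CREDENTIALS" ]; then
  echo "Found credentials at $GOOGLE_APPLICATION_CREDENTIALS"
else
  echo "No key file at $GOOGLE_APPLICATION_CREDENTIALS" >&2
fi
```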
```shell
terraform plan -var="project=XXXXXXXX-XXXXXX"
terraform apply -var="project=XXXXXXXX-XXXXXX"
terraform destroy -var="project=XXXXXXXX-XXXXXX"
```

Sample output from `terraform init`, `terraform plan`, and `terraform apply`:

```
Terraform has been successfully initialized!
Plan: 3 to add, 0 to change, 0 to destroy.
Apply complete! Resources: 3 added, 0 changed, 0 destroyed.
```

It is useful to run a query against the external table in BigQuery to check the data types of the newly created Parquet files before loading them into BigQuery. This lets you verify the structure and ensure that the data is correctly formatted.
```sql
CREATE EXTERNAL TABLE `weather_dataset.test`
OPTIONS (
  format = 'PARQUET',
  uris = ['gs://aemet-weather-data-bucket/climatic_values/climatic_values/*.parquet']
);
```
```sql
SELECT column_name, data_type
FROM `weather_dataset.INFORMATION_SCHEMA.COLUMNS`
WHERE table_name = 'test';
```

This query returns the data types of the columns in the Parquet files, which helps ensure compatibility and avoid potential issues when importing the data into BigQuery.