This GitHub Action can be used to insert rows from a JSON file into a Google BigQuery table.
It does not perform any schema validation of the rows; BigQuery will return a list of errors if the inserts fail.
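For orientation, the rows file referenced below as bq_rows.json might look like the following. This is only a sketch: it assumes the action expects a JSON array of objects keyed by column name, and the column names are made up for the example.

[
  {"user_id": 123, "event": "signup", "created_at": "2021-06-01T12:00:00Z"},
  {"user_id": 456, "event": "login", "created_at": "2021-06-01T12:05:00Z"}
]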
name: "Insert rows to BigQuery"
on:
pull_request: {}
push:
branches: ["main"]
jobs:
deploy_schemas:
runs-on: ubuntu-latest
name: Insert rows to BigQuery
steps:
# To use this repository's private action,
# you must check out the repository
- name: Checkout
uses: actions/[email protected]
- name: Deploy schemas to BigQuery
uses: Atom-Learning/bigquery-upload-action
with:
gcp_project: 'gcp-us-project'
dataset_id: 'dataset-id'
table_id: 'table-id'
bq_rows_as_json_path: 'bq_rows.json'
credentials: ${{ secrets.GCP_SERVICE_ACCOUNT }}
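The workflow above reads the service account key from a repository secret named GCP_SERVICE_ACCOUNT. One way to create that secret, assuming the GitHub CLI is installed and the key has been downloaded to service-account-key.json (a hypothetical filename), is:

gh secret set GCP_SERVICE_ACCOUNT < service-account-key.json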
The action accepts the following inputs:

- gcp_project: The full name of the GCP project you want to deploy to. Example: gcp-us-project
- dataset_id: The dataset containing the table you want to insert the rows into. Example: best_dataset
- table_id: The table you want to insert the rows into. Example: awesome_table
- bq_rows_as_json_path: The path to the JSON file containing the rows you want to insert. Example: rows.json
- credentials: A Google Service Account with permission to create objects in the specified project. Can be stored as a repository secret.
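As noted above, the action does not validate rows against the table schema; BigQuery reports per-row errors at insert time. The sketch below is not the action's implementation, just a minimal illustration of how a streaming insert surfaces those errors with the google-cloud-bigquery Python client, reusing the hypothetical project, dataset, table, and file names from the examples above.

import json

from google.cloud import bigquery

# Assumes GOOGLE_APPLICATION_CREDENTIALS points at the service account key file.
client = bigquery.Client(project="gcp-us-project")

# Load the rows to insert (same shape as the bq_rows.json sketch above).
with open("bq_rows.json") as f:
    rows = json.load(f)

# insert_rows_json streams the rows and returns one error entry per rejected row.
errors = client.insert_rows_json("gcp-us-project.best_dataset.awesome_table", rows)

if errors:
    # Each entry identifies the row and the reason BigQuery rejected it.
    for error in errors:
        print(error)
else:
    print("All rows inserted successfully.")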
See the Contributing Guide for additional information.
To execute tests locally (requires that docker and docker-compose are installed):
docker-compose run test
See RELEASE.md for how to release a new version of this Action.
This GitHub Action was written by Wojciech Chmiel and is based on a fork of https://github.com/jashparekh/bigquery-action