
# Google Cloud: Copying BigQuery Tables Across Different Locations

Category: Tutorial

US$9.99

  • Learn how to create and run an Apache Airflow workflow in Cloud Composer. This self-paced lab takes place in the Google Cloud console and teaches you how to export tables from a BigQuery dataset located in the US to Cloud Storage buckets, copy those tables to buckets in Europe, and then import them into a BigQuery dataset in Europe.

  • This advanced lab is a good fit for data engineers and developers who want to use Cloud Composer to automate data processing tasks. You'll learn how to create a Cloud Composer environment, create and schedule an Airflow DAG, and use BigQuery operators to export and import data, as sketched in the example below.
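
As a rough illustration of what such a DAG can look like, here is a minimal sketch of a three-step export → copy → import pipeline. It assumes Airflow 2 with the apache-airflow-providers-google package (which Cloud Composer environments include); the project, dataset, bucket, and table names are placeholders, not the lab's exact values:

```python
import datetime

from airflow import models
from airflow.providers.google.cloud.transfers.bigquery_to_gcs import BigQueryToGCSOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.transfers.gcs_to_gcs import GCSToGCSOperator

with models.DAG(
    dag_id="bq_copy_us_to_eu",                 # hypothetical DAG id
    start_date=datetime.datetime(2024, 1, 1),
    schedule_interval=None,                    # trigger manually from the Airflow UI
) as dag:
    # Step 1: export the source table from the US dataset to a US bucket.
    export_to_gcs = BigQueryToGCSOperator(
        task_id="export_us_table",
        source_project_dataset_table="my-project.us_dataset.my_table",
        destination_cloud_storage_uris=["gs://us-bucket/my_table.avro"],
        export_format="AVRO",
    )

    # Step 2: copy the exported file from the US bucket to the Europe bucket.
    copy_to_eu = GCSToGCSOperator(
        task_id="copy_to_eu_bucket",
        source_bucket="us-bucket",
        source_object="my_table.avro",
        destination_bucket="eu-bucket",
        destination_object="my_table.avro",
    )

    # Step 3: import the copied file into the Europe dataset.
    import_to_bq = GCSToBigQueryOperator(
        task_id="import_eu_table",
        bucket="eu-bucket",
        source_objects=["my_table.avro"],
        destination_project_dataset_table="my-project.eu_dataset.my_table",
        source_format="AVRO",
        write_disposition="WRITE_TRUNCATE",
    )

    export_to_gcs >> copy_to_eu >> import_to_bq
```

The three tasks are chained with `>>` so each step runs only after the previous one succeeds, which matters here because the import reads the file the copy step produces.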

  • By the end of this lab, you'll be able to: create a Cloud Composer environment, create and schedule an Airflow DAG, and use BigQuery operators to export and import data.

  • Prerequisites: a Google Cloud account, the Google Cloud SDK installed and configured, and a basic understanding of Apache Airflow.

  • To complete this lab, you'll need to: create a Cloud Composer environment, create a BigQuery dataset and a Cloud Storage bucket in the US region, and create a BigQuery dataset and a Cloud Storage bucket in the Europe region. A client-library sketch of these setup steps appears below.
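
For reference, here is a hedged sketch of those setup steps using the google-cloud-bigquery and google-cloud-storage client libraries rather than the console. The project, dataset, and bucket names are placeholders (bucket names must be globally unique), and the lab itself may walk you through the console or gcloud/bq commands instead:

```python
from google.cloud import bigquery, storage

PROJECT = "my-project"  # placeholder project id

bq_client = bigquery.Client(project=PROJECT)
gcs_client = storage.Client(project=PROJECT)

# BigQuery dataset + Cloud Storage bucket in the US region.
us_dataset = bigquery.Dataset(f"{PROJECT}.us_dataset")
us_dataset.location = "US"
bq_client.create_dataset(us_dataset, exists_ok=True)
gcs_client.create_bucket("us-bucket", location="US")

# BigQuery dataset + Cloud Storage bucket in the Europe region.
eu_dataset = bigquery.Dataset(f"{PROJECT}.eu_dataset")
eu_dataset.location = "EU"
bq_client.create_dataset(eu_dataset, exists_ok=True)
gcs_client.create_bucket("eu-bucket", location="EU")
```

Setting `dataset.location` before `create_dataset` is the key detail: a dataset's location is fixed at creation time, which is exactly why the lab needs the export/copy/import workflow instead of a direct move.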

  • Once you've completed the lab, you'll be able to: export tables from a BigQuery dataset in the US to a Cloud Storage bucket in the US, copy them to a bucket in Europe, and import them into a BigQuery dataset in Europe. A row-count verification sketch follows.
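
As a quick sanity check after the DAG runs, a sketch like the following compares row counts between the source table and its European copy; the table IDs are placeholders matching the earlier examples:

```python
from google.cloud import bigquery

client = bigquery.Client()

def row_count(table_id: str) -> int:
    # BigQuery infers the job location from the referenced dataset.
    query = f"SELECT COUNT(*) AS n FROM `{table_id}`"
    rows = list(client.query(query).result())
    return rows[0].n

src = row_count("my-project.us_dataset.my_table")
dst = row_count("my-project.eu_dataset.my_table")
print(f"US rows: {src}, EU rows: {dst}, match: {src == dst}")
```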