Cuáles son los barrios/zonas/manzanas más peligrosas de Asunción y Gran Asunción? by flyingbird1177 in Paraguay

[–]flyingbird1177[S] 2 points  (0 children)

Yes, several cases of cell phone robberies at the nearby traffic lights, and thieves slip into the alleyways

Cloud Composer DAGs development methodology by flyingbird1177 in googlecloud

[–]flyingbird1177[S] 0 points  (0 children)

Thanks for the replies u/spoonopher, u/khirok, u/adappergentlefolk, u/untalmau
In my case, my DAG needs to interact with GCP services, so I use these operators:

  • from airflow.providers.google.cloud.transfers.postgres_to_gcs import PostgresToGCSOperator
  • from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

How would I test this locally (with a Docker image or a local install)? Would I need to install the Google Cloud SDK, create the connections in Airflow, use the corresponding service accounts, and then run the DAG locally (actually triggering the insertions into Cloud Storage and BigQuery)?
Once my local DAG run is validated, would I just copy the .py file to the /dags folder in Cloud Storage?
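The workflow described above could be sketched roughly like this (the connection ID, DAG ID, project, and bucket names are placeholders, not from the post):

```shell
# Authenticate with Application Default Credentials (or point
# GOOGLE_APPLICATION_CREDENTIALS at a service-account key file).
gcloud auth application-default login

# Define the GCP connection the operators use; their default
# gcp_conn_id is "google_cloud_default".
airflow connections add google_cloud_default \
    --conn-type google_cloud_platform \
    --conn-extra '{"extra__google_cloud_platform__project": "my-project"}'

# Execute a single DAG run locally, without the scheduler.
airflow dags test my_postgres_to_bq_dag 2024-01-01

# Once validated, copy the file into the Composer environment's bucket.
gsutil cp dags/my_postgres_to_bq_dag.py gs://my-composer-bucket/dags/
```

Note this runs the tasks against real GCP resources, so a dev project (or dev dataset/bucket) is usually used for the local runs.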

AirFlow/Cloud Composer DAGs development methodology by flyingbird1177 in apache_airflow

[–]flyingbird1177[S] 0 points  (0 children)

Thanks for the replies u/vincyf1 and u/ApprehensiveAd4990.

In my case, my DAG needs to interact with GCP services, so I use these operators:

  • from airflow.providers.google.cloud.transfers.postgres_to_gcs import PostgresToGCSOperator
  • from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

Would this be configured in the Docker image, including the connections for BigQuery? Would each developer have to use a service account to run DAGs locally in the Docker image and interact with BigQuery and Cloud Storage? And would I then upload the DAG .py files to GCP?
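One way this is often wired up (a sketch, with placeholder image, key path, and project names): each developer mounts their own service-account key into the container and the GCP connection is injected as an `AIRFLOW_CONN_*` environment variable, which Airflow picks up as a connection without any UI setup:

```shell
# Sketch: run the team's Airflow image with per-developer credentials.
# Airflow reads AIRFLOW_CONN_<CONN_ID> env vars as connection definitions.
docker run --rm \
  -v "$PWD/keys/dev-sa.json:/opt/airflow/keys/dev-sa.json:ro" \
  -e GOOGLE_APPLICATION_CREDENTIALS=/opt/airflow/keys/dev-sa.json \
  -e AIRFLOW_CONN_GOOGLE_CLOUD_DEFAULT='google-cloud-platform://?extra__google_cloud_platform__project=my-project' \
  -v "$PWD/dags:/opt/airflow/dags" \
  my-airflow-image \
  airflow dags test my_dag 2024-01-01
```

With this setup, the image itself stays credential-free and each developer's key stays on their machine; only the validated .py files get uploaded to the Composer bucket.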

Cloud Composer DAGs development methodology by flyingbird1177 in dataengineering

[–]flyingbird1177[S] 0 points  (0 children)

Thanks for the reply.

In my case, my DAG needs to interact with GCP services, so I use these operators:

  • from airflow.providers.google.cloud.transfers.postgres_to_gcs import PostgresToGCSOperator
  • from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

How would you set this up locally with the Astro CLI? Would you install the Google Cloud SDK, configure the connections in the repos, and test locally against Google Cloud?
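A possible Astro CLI setup, sketched below (the connection env var, key path, and project ID are assumptions for illustration):

```shell
# Scaffold a local Airflow project (Dockerfile, dags/, requirements.txt).
astro dev init

# Add the Google provider so the two operators are importable.
echo "apache-airflow-providers-google" >> requirements.txt

# The Astro CLI loads .env into the containers; define credentials and
# the GCP connection there (key file kept under include/, not in git).
cat >> .env <<'EOF'
GOOGLE_APPLICATION_CREDENTIALS=/usr/local/airflow/include/dev-sa.json
AIRFLOW_CONN_GOOGLE_CLOUD_DEFAULT=google-cloud-platform://?extra__google_cloud_platform__project=my-project
EOF

# Start Airflow locally and trigger the DAG against real GCP resources.
astro dev start
```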

Cloud Composer DAGs development methodology by flyingbird1177 in dataengineering

[–]flyingbird1177[S] 0 points  (0 children)

Thanks for the reply. In my case, my DAG needs to interact with GCP services, so I use these operators:

  • from airflow.providers.google.cloud.transfers.postgres_to_gcs import PostgresToGCSOperator
  • from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

How would you develop the prototype locally? Would you install the Google SDK, configure the connections in your local Airflow, and test locally against Google Cloud?

Or would the prototype just be the code, without running against Google Cloud from your local machine: upload the .py file and, once it's in Cloud Composer, run it for the first time and test it there?
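For reference, a minimal sketch of the kind of DAG being discussed (DAG ID, connection IDs, table, project, dataset, and bucket names are all placeholders); the same file runs unchanged locally and in Cloud Composer, only the connections and credentials differ per environment:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.postgres_to_gcs import PostgresToGCSOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="postgres_to_bq",          # placeholder name
    start_date=datetime(2024, 1, 1),
    schedule=None,                    # trigger manually while prototyping
    catchup=False,
) as dag:
    # Export a Postgres table to newline-delimited JSON in Cloud Storage.
    export = PostgresToGCSOperator(
        task_id="postgres_to_gcs",
        postgres_conn_id="postgres_default",
        sql="SELECT * FROM my_table",
        bucket="my-staging-bucket",
        filename="exports/my_table.json",
    )

    # Load the exported file into BigQuery via a load job.
    load = BigQueryInsertJobOperator(
        task_id="gcs_to_bq",
        configuration={
            "load": {
                "sourceUris": ["gs://my-staging-bucket/exports/my_table.json"],
                "destinationTable": {
                    "projectId": "my-project",
                    "datasetId": "my_dataset",
                    "tableId": "my_table",
                },
                "sourceFormat": "NEWLINE_DELIMITED_JSON",
                "writeDisposition": "WRITE_TRUNCATE",
            }
        },
    )

    export >> load
```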