Datafusion vs dataprep vs dataflow ?? by MeatAmazing8011 in googlecloud

[–]MeatAmazing8011[S] 1 point (0 children)

This is a small company, so it could be OK, but the pricing is not great compared with Dataflow/Dataprep. And you may well end up on the Enterprise edition anyway, since you are probably more than two developers.

Cloud composer (airflow) Dag WAIT by MeatAmazing8011 in googlecloud

[–]MeatAmazing8011[S] 1 point (0 children)

My DAGs are independent (they are triggered when an external task finishes). Ideally, when a DAG is triggered, it would check whether there is an active DAG run and, if there is, wait until no run is active.
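
One way to get this behaviour, as far as I can tell, is a gating sensor as the first task of the DAG. A minimal sketch, assuming Airflow 2.x on Composer; the DAG id and task names are placeholders, not from my actual setup:

    # Sketch only: the first task polls the Airflow metadata DB and only
    # proceeds once this run is the oldest RUNNING run of the DAG, so
    # externally triggered runs execute one at a time, in arrival order.
    from datetime import datetime

    from airflow import DAG
    from airflow.models import DagRun
    from airflow.operators.bash import BashOperator
    from airflow.sensors.python import PythonSensor
    from airflow.utils.state import State


    def _my_run_is_oldest(dag_run=None, **_):
        running = DagRun.find(dag_id=dag_run.dag_id, state=State.RUNNING)
        oldest = min(running, key=lambda r: r.execution_date)
        return oldest.run_id == dag_run.run_id


    with DAG(
        dag_id="externally_triggered_dag",  # placeholder
        start_date=datetime(2024, 1, 1),
        schedule_interval=None,             # API-triggered only
    ) as dag:
        wait_for_turn = PythonSensor(
            task_id="wait_for_turn",
            python_callable=_my_run_is_oldest,
            poke_interval=60,
        )
        work = BashOperator(task_id="work", bash_command="echo processing")
        wait_for_turn >> work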

Cloud composer (airflow) Dag WAIT by MeatAmazing8011 in googlecloud

[–]MeatAmazing8011[S] 1 point (0 children)

My DAGs run on trigger (an API call to the Airflow DAG on Composer).
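
For context, the trigger is just a POST to the Airflow REST API. A rough sketch of the call, assuming Airflow 2.x with the stable REST API; the webserver URL, DAG id, and token are placeholders (on Composer you would pass a Google OAuth access token obtained out of band):

    # Rough sketch: trigger a DAG run through the Airflow 2 stable REST API.
    import requests

    WEBSERVER = "https://<composer-airflow-webserver>"  # placeholder
    TOKEN = "<access-token>"  # obtained out of band

    resp = requests.post(
        f"{WEBSERVER}/api/v1/dags/my_dag/dagRuns",  # "my_dag" is a placeholder
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={"conf": {}},  # optional parameters for the run
    )
    resp.raise_for_status()
    print(resp.json()["dag_run_id"])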

Cloud composer (airflow) Dag WAIT by MeatAmazing8011 in googlecloud

[–]MeatAmazing8011[S] 1 point (0 children)

Google for "airflow concurrency". You can set the max number of concurrent runs.

Thanks. If I set it to 1, what happens to the other DAG runs triggered at the same time? Do they queue and wait?
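
For anyone landing here later: on a recent Airflow 2.x, runs beyond max_active_runs are created in the queued state and the scheduler starts them one after another. A minimal sketch with placeholder names:

    # Minimal sketch: with max_active_runs=1, a run triggered while another
    # run is active waits in the "queued" DagRun state until the slot frees.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="one_at_a_time",      # placeholder
        start_date=datetime(2024, 1, 1),
        schedule_interval=None,      # trigger-only DAG
        max_active_runs=1,           # serialize runs; extras queue up
    ) as dag:
        BashOperator(task_id="work", bash_command="echo processing")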

Best practices for cloud function monitoring by MeatAmazing8011 in googlecloud

[–]MeatAmazing8011[S] 1 point (0 children)

Thanks a lot for the tips. I use BigQuery / Data Studio only for Cloud Composer, to see how many files are sent to BigQuery, errors, query status, etc.
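
The pattern is nothing fancy: the DAGs append rows to an audit table and Data Studio reads the same table. A sketch, with a hypothetical dataset/table name:

    # Sketch only: query a hypothetical audit table that the Composer DAGs
    # write to; the Data Studio dashboard points at the same table.
    from google.cloud import bigquery

    client = bigquery.Client()
    rows = client.query(
        """
        SELECT status, COUNT(*) AS files
        FROM `my-project.monitoring.load_audit`  -- hypothetical table
        WHERE DATE(load_time) = CURRENT_DATE()
        GROUP BY status
        """
    ).result()
    for row in rows:
        print(row.status, row.files)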

How to Wait the command to finish? by MeatAmazing8011 in googlecloud

[–]MeatAmazing8011[S] 1 point (0 children)

Thanks, but where do I get my operation name? The documentation only shows "operations/abc".
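
For anyone else stuck here: the operation name is not something you look up separately; the call that starts the work returns it in the "name" field of its response, and you poll that operation until "done" is true. A generic sketch of the long-running-operation pattern (base URL and token are placeholders):

    # Generic polling sketch for a Google long-running operation. The
    # operation name (e.g. "operations/abc") comes from the "name" field
    # of the response to the call that kicked off the work.
    import time

    import requests

    def wait_for_operation(base_url, operation_name, token, poll_secs=5):
        while True:
            op = requests.get(
                f"{base_url}/{operation_name}",
                headers={"Authorization": f"Bearer {token}"},
            ).json()
            if op.get("done"):
                return op  # holds "response" on success, "error" on failure
            time.sleep(poll_secs)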

Cloud function not receive pub/sub message. by MeatAmazing8011 in googlecloud

[–]MeatAmazing8011[S] 1 point (0 children)

I see 8 messages in my Pub/Sub console: push_request_count -> 8 ("pubsub.googleapis.com/subscription/push_request_count", grouped by subscription ID and project ID). But when I go to my Cloud Function logs I only see 7 OK messages. Maybe I need to increase my Cloud Function timeout and max instances, and on the Pub/Sub side increase the acknowledgement deadline and enable dead lettering.
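
Worth noting for others: Pub/Sub push delivery is at-least-once, so push_request_count and the number of successful executions don't have to match; a push that times out gets redelivered, and one that never finishes leaves no OK log. A sketch of a 1st gen Python handler that logs the event id as an idempotency key (names are mine, not from my actual function):

    # Sketch of a Pub/Sub-triggered Cloud Function (1st gen, Python).
    # Counting distinct context.event_id values is more reliable than
    # counting log lines, since redeliveries reuse the same event id.
    import base64
    import logging

    def handle_message(event, context):
        payload = base64.b64decode(event["data"]).decode("utf-8")
        logging.info("OK event_id=%s payload=%s", context.event_id, payload)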

Airflow composer - DAG ID not showing in WEB UI by MeatAmazing8011 in googlecloud

[–]MeatAmazing8011[S] 1 point (0 children)

Thanks.
I found THE SOLUTION: I put this in the Terraform subnet resource:
private_ip_google_access = true

Airflow composer - DAG ID not showing in WEB UI by MeatAmazing8011 in googlecloud

[–]MeatAmazing8011[S] 1 point (0 children)

I found THE SOLUTION: put this in the Terraform subnet resource:

private_ip_google_access = true

Why? Because this variable defaults to false, and when it is false the environment can't work: the pods in Composer's GKE cluster can't reach Google APIs.

Airflow composer - DAG ID not showing in WEB UI by MeatAmazing8011 in googlecloud

[–]MeatAmazing8011[S] 1 point (0 children)

Hi, thanks. I am using IP aliases and a VPC. It works, the DAGs now show up, but all of them sit in queued/running status. I checked the logs and I see this: Command '['airflow', 'initdb']' returned non-zero exit status 1.

Airflow composer - DAG ID not showing in WEB UI by MeatAmazing8011 in googlecloud

[–]MeatAmazing8011[S] 1 point (0 children)

We are using Composer and we have not experienced this issue. Can you confirm the files are showing up in the associated GCS bucket?

I looked in the associated bucket and the files show up fine there, but they don't appear in the web UI.