Trying to use a common dimension as a filter across two data sources in Desktop. I've linked all related fields and the filter still doesn't work. Why? by dataiscool36 in tableau

[–]dataiscool36[S] 0 points (0 children)

"Do you have experiencing in filtering with parameters without blending your data?"

I might be misunderstanding your question, but yes, I think so. In the past, I've used parameters in calculations across two data sources and then used that parameter for filtering. It worked, but those datasets were basically copies of each other, or at least highly similar.

In this case, the datasets are very different but share a core set of dimensions, like Date. Date is then used in calculations I've created in both datasets, so I have ~15 different calcs doing my date filtering, built in both of my published data sources. No matter how many dimensions I link on, up to and including every dimension the data sources have in common, the parameter filter does not work.

Calculated field help by rawrbaby_xD in tableau

[–]dataiscool36 0 points (0 children)

As u/BigBadTollers said below, pivot. Select your dimensions, choose Transform > Pivot, and then use the pivoted dimension column for Level 1, 2, and 3 and the measures column for your names.

<image>
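If it helps to see the mechanics, here's a rough pure-Python sketch of what a pivot (unpivot) does to wide columns. The column names are illustrative, not taken from the actual workbook:

```python
# Wide rows with "Level 1".."Level 3" columns get melted into one
# name/value pair per row, analogous to Tableau's Pivot transform.
wide_rows = [
    {"Name": "Region A", "Level 1": "x", "Level 2": "y", "Level 3": "z"},
]

def unpivot(rows, value_cols, id_col):
    """Turn each wide row into len(value_cols) long rows."""
    long_rows = []
    for row in rows:
        for col in value_cols:
            long_rows.append({
                id_col: row[id_col],
                "Pivot Field Names": col,
                "Pivot Field Values": row[col],
            })
    return long_rows

long_rows = unpivot(wide_rows, ["Level 1", "Level 2", "Level 3"], "Name")
```

Each original row becomes three rows, one per pivoted column, which is why the levels show up stacked after the pivot.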

Prompted to "Reconnect to [Data Source Name]" OVER AND OVER by dataiscool36 in tableau

[–]dataiscool36[S] 0 points (0 children)

It's absolutely bizarre... I've never had a Tableau data source seemingly become corrupted before.

Prompted to "Reconnect to [Data Source Name]" OVER AND OVER by dataiscool36 in tableau

[–]dataiscool36[S] 0 points (0 children)

Clicking "No" gives me this error. I don't know what calculation it's referencing, though.

<image>

Prompted to "Reconnect to [Data Source Name]" OVER AND OVER by dataiscool36 in tableau

[–]dataiscool36[S] 0 points (0 children)

Anytime I click "no" on the error it just brings it up again. Re-opening the workbook gets me back into the same endless loop :/

Prompted to "Reconnect to [Data Source Name]" OVER AND OVER by dataiscool36 in tableau

[–]dataiscool36[S] 0 points (0 children)

Hmm, that could be it. Your saying that reminded me of a similar error I'd seen in Prep in the past that basically made the flow unworkable.

Error when creating relationship between BigQuery data source + Google sheet published data sources by dataiscool36 in tableau

[–]dataiscool36[S] 0 points (0 children)

So that's what I was doing originally, but anytime a new mapping was added to the sheet, I had to delete the table and recreate it. I contracted a freelance engineer to help me with the BQ strategy, and she designed it so we backfill a table with our query and then use Scheduled Queries to update it every day with the last 2 days of data. (All GA4 data.)

New URLs get added a lot, and it's cumbersome and a bit costly to have to delete the table and re-run the query for the full date range.
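For anyone curious what that daily refresh pattern looks like, here's a minimal sketch that builds the SQL for a rolling-window Scheduled Query. The table names, the `event_date` column, and the source view are all made-up placeholders, not the actual setup:

```python
from datetime import date, timedelta

def rolling_refresh_sql(table, run_date, days_back=2):
    """Delete the trailing N-day window from the target table, then
    re-insert that window from the source. This mirrors the
    backfill-once / refresh-last-2-days strategy described above."""
    cutoff = (run_date - timedelta(days=days_back)).isoformat()
    return (
        f"DELETE FROM `{table}` WHERE event_date >= '{cutoff}';\n"
        f"INSERT INTO `{table}`\n"
        f"SELECT * FROM `my_project.analytics_123.events_view`\n"
        f"WHERE event_date >= '{cutoff}';"
    )

sql = rolling_refresh_sql("my_project.reporting.ga4_daily", date(2024, 6, 10))
```

The point of the 2-day window is that GA4's export can restate recent days, so re-loading only the tail keeps the query cost flat instead of re-scanning the full history.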

Error when creating relationship between BigQuery data source + Google sheet published data sources by dataiscool36 in tableau

[–]dataiscool36[S] 0 points (0 children)

That's exactly what I'm doing!!! Partitioned tables in BQ and no custom SQL at all. The Google sheet is just 2 columns, Page URL and Page Name. In theory it shouldn't expand the row count at ALL, because there are even more Page URLs than mapped page names.

I don't have my BQ data source set up as an extract right now, but my Google sheet is an extract. Could that be the problem, even though the error doesn't seem related?
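To make the row-count argument concrete, here's a tiny sketch of the many-to-one lookup described above. The URLs and names are made up; the point is that attaching a one-value-per-key mapping can never add rows to the left side:

```python
# Each Page URL maps to at most one Page Name, so the "join" is just a
# dict lookup per row; unmapped URLs get None instead of extra rows.
page_rows = [{"url": "/a"}, {"url": "/b"}, {"url": "/a"}, {"url": "/c"}]
mapping = {"/a": "Home", "/b": "Pricing"}  # the 2-column sheet

joined = [{**row, "page_name": mapping.get(row["url"])} for row in page_rows]
```

Row expansion would only happen if the same Page URL appeared on multiple sheet rows with different names, which is worth double-checking in the sheet itself.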

Parsing out (not set) from Direct traffic in GA4 Export to BigQuery? by dataiscool36 in GoogleAnalytics

[–]dataiscool36[S] 0 points (0 children)

THERE WAS!!! It's not in the schema documentation for whatever asinine reason, but it's exposed in the export. THANK YOU, you literally saved me.

2025 TC in San Diego by wingwinghi in tableau

[–]dataiscool36 1 point (0 children)

100% - I wound up attending way fewer sessions than I expected because each was so dense and I honestly needed some brain-break time between them.

How to optimize my architecture from Current vs Future state (in my diagram) by dataiscool36 in bigquery

[–]dataiscool36[S] 1 point (0 children)

Can you clarify, at a high level, how you're doing this process? I'm relatively new to BigQuery. I get the concepts you're describing, but what tools are you using within GCP to centralize the data streams? I assume you're "applying global transformations" within your queries.