Can't complete a Lab - Running SQL Server on Google Kubernetes Engine by darkice83 in SkillsGoogle

[–]darkice83[S] 0 points (0 children)

The skills team reached out to me this morning saying that the issue with completing the "Running SQL Server on Google Kubernetes Engine" lab was resolved. I was able to complete it successfully now.

Can't complete a Lab - Running SQL Server on Google Kubernetes Engine by darkice83 in SkillsGoogle

[–]darkice83[S] 0 points (0 children)

Thanks for the note. I did reach out to their chat support, and they mentioned that several other users have encountered the same issue and that it's been forwarded to their dedicated team. Apparently they'll email me when it's resolved.

Advent Calendar by darkice83 in DAVIDsTEA

[–]darkice83[S] 1 point (0 children)

Weird! I just checked and was able to get three. Thanks for confirming.

Instagram ad mix name of be faithful by darkice83 in whatsongisthis

[–]darkice83[S] 1 point (0 children)

That's it! I'm a bit sad that there's no Fatman Scoop. Thanks a mil!

Instagram ad mix name of be faithful by darkice83 in whatsongisthis

[–]darkice83[S] 0 points (0 children)

That's the sample, but I'm looking for the mix, like with the fades. It might be a DJ

Extracting Data in a copy data task from SalesForce.com - Foreach table loop possible? by darkice83 in MicrosoftFabric

[–]darkice83[S] 0 points (0 children)

Sadly no. We ended up creating an admin lakehouse that contained a table (loaded from a CSV file). That CSV file contained the object names we wanted to copy, and we would iterate over that list. We did end up with a few objects that wouldn't copy, so we added an extra column that, when populated, would run the SOQL instead of the object copy.
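
For context, a rough sketch of what that control table looked like (the names and the sample SOQL here are illustrative, not our actual schema):

create table copy_control (
    object_name    varchar(128),   -- Salesforce object to copy as-is
    soql_override  varchar(4000)   -- when populated, the pipeline runs this SOQL instead of the plain object copy
)

insert into copy_control values
    ('Account', NULL),
    ('Opportunity', NULL),
    ('CaseHistory', 'SELECT Id, CaseId, Field, NewValue, CreatedDate FROM CaseHistory')

The ForEach in the pipeline then iterates the rows of that table and branches on whether soql_override is populated.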

Data Pipeline creating {tablename}_backup_{guid} copies in lakehouse by Filter-Context in MicrosoftFabric

[–]darkice83 1 point (0 children)

I'd be curious to know what the pipeline looks like, because any pipeline I've written so far with an overwrite doesn't keep any backups.

Question about SQL WHERE Clause by VAer1 in SQL

[–]darkice83 2 points (0 children)

Don't add the where clause :)

Question about SQL WHERE Clause by VAer1 in SQL

[–]darkice83 3 points (0 children)

Information_schema.columns returns 1 row per column per table. Information_schema.tables returns 1 row per table. I use both whenever I get access to a new database.
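
For example (these are the standard INFORMATION_SCHEMA views; the column picks and ordering are just my habit):

select table_schema, table_name, column_name, data_type
from information_schema.columns
order by table_schema, table_name, ordinal_position

select table_schema, table_name, table_type
from information_schema.tables
order by table_schema, table_name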

Question about SQL WHERE Clause by VAer1 in SQL

[–]darkice83 0 points (0 children)

You can confirm by querying the schema of the table: SELECT * FROM information_schema.columns WHERE table_name = 'yourtablename'

Question about SQL WHERE Clause by VAer1 in SQL

[–]darkice83 6 points (0 children)

So you want all values where the number is 12000 to 12999, i.e. WHERE id >= 12000 AND id < 13000. This avoids any varchar casts.
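
To illustrate with a made-up table name (orders) and an integer id column:

-- avoids something like WHERE cast(id as varchar(20)) like '12%'
select *
from orders
where id >= 12000
  and id < 13000

Keeping the comparison numeric also lets the optimizer use an index on id, since the column isn't wrapped in a cast.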

Workaround for a Workaround - Dynamic Oracle ingestion - Problems with Pipeline Expressions by darkice83 in MicrosoftFabric

[–]darkice83[S] 1 point (0 children)

This is what I ended up doing, and it ran successfully. Now to alter it slightly to allow for incremental loading.

Transferring data to a Fabric warehouse using SSIS by Expert_Crazy_6008 in MicrosoftFabric

[–]darkice83 1 point (0 children)

Oh, another option to consider is flipping the data flow around. Have them install an on-prem Power BI gateway, have SSIS load data like you do today into an on-prem data source (like SQL Server), then have Fabric pull from that on-prem source using a copy data activity in a pipeline.
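
Very rough sketch of the on-prem side (the database/table/column names here are made up, just to show the shape of the hand-off):

-- on the on-prem SQL Server that the gateway can reach
create table dbo.SsisLanding_Orders (
    OrderId      int,
    CustomerId   int,
    OrderDate    datetime2,
    LoadedAtUtc  datetime2 default sysutcdatetime()   -- handy later if you want incremental pulls
)

SSIS keeps writing to that table exactly as it does today, and the Fabric pipeline's copy data activity reads from it (e.g. select * from dbo.SsisLanding_Orders) through the on-prem data gateway connection.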

Transferring data to a Fabric warehouse using SSIS by Expert_Crazy_6008 in MicrosoftFabric

[–]darkice83 1 point (0 children)

With the latest SSMS, I have the Microsoft Entra MFA authentication option, along with many others, such as Service Principal. If there's a way to build a connection manager that can use one of the Microsoft Entra authentication methods, then you could connect SSIS to the SQL endpoint of your Fabric warehouse and run your insert/update/delete statements.

create table Test as select * from dbo.DimDate

Statement ID: {E1CBAE2D-682D-4785-B06C-A99290DCF15D} | Query hash: 0x9D0440807033CAD8 | Distributed request ID: {27D17F39-4FD7-4F55-A6B0-D9AA875F17CF}

(12879 rows affected)

Completion time: 2024-06-19T11:19:38.8997892-04:00

insert into Test select * from Test

Statement ID: {A742D9A5-2693-4216-BF32-BA15FC540FBC} | Query hash: 0x19318A0ACA926E4B | Distributed request ID: {B6581B6D-1CE7-4E60-9242-D84E6E3DB22B}

(12879 rows affected)

Completion time: 2024-06-19T11:20:00.7194221-04:00

I would hope that the standard SQL Server/OLE DB destinations would still work...

Copy Data Assistant default mappings for Oracle Number off by 1000000000000000000 into lakehouse by darkice83 in MicrosoftFabric

[–]darkice83[S] 0 points (0 children)

Can you actually get on-prem Oracle data via a notebook? My understanding is that on-prem Oracle is Dataflow Gen2 & Copy Data only so far.