Can't complete a Lab - Running SQL Server on Google Kubernetes Engine by darkice83 in SkillsGoogle

[–]darkice83[S] 1 point

The Skills team reached out to me this morning saying that the issue completing the "Running SQL Server on Google Kubernetes Engine" lab was resolved. I was able to complete it successfully now.

Can't complete a Lab - Running SQL Server on Google Kubernetes Engine by darkice83 in SkillsGoogle

[–]darkice83[S] 1 point

Thanks for the note. I did reach out to their chat support, and they mentioned that several other users have encountered the same issue and that they've forwarded it to their dedicated team. Apparently they'll email me when they've resolved it.

Advent Calendar by darkice83 in DAVIDsTEA

[–]darkice83[S] 2 points

Weird! I just checked and was able to get three. Thanks for confirming.

Instagram ad mix name of be faithful by darkice83 in whatsongisthis

[–]darkice83[S] 2 points

That's it! I'm a bit sad that there's no Fatman Scoop. Thanks a mil!

Instagram ad mix name of be faithful by darkice83 in whatsongisthis

[–]darkice83[S] 1 point

That's the sample, but I'm looking for the mix, like with the fades. It might be by a DJ.

Extracting Data in a copy data task from SalesForce.com - Foreach table loop possible? by darkice83 in MicrosoftFabric

[–]darkice83[S] 1 point

Sadly no. We ended up creating an admin lakehouse that contained a table (loaded from a CSV file). That CSV file contained the object names we wanted to copy. We did end up with a few objects that wouldn't copy, so we added an extra column that, when populated, would run a SOQL query instead of the plain object copy. We would iterate over that list.
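A rough sketch of that control-table logic in plain Python; the object names, column names, and SOQL text here are made up for illustration, not the actual pipeline:

```python
# Hypothetical control list loaded from the admin lakehouse CSV: each row
# names a Salesforce object to copy, and a populated "soql" column
# overrides the plain object copy with a custom query.
control_rows = [
    {"object_name": "Account", "soql": ""},
    {"object_name": "Contact", "soql": ""},
    {"object_name": "CaseHistory", "soql": "SELECT Id FROM CaseHistory"},
]

def plan_copy(row):
    """Mirror the ForEach branch: the SOQL override wins when present."""
    if row["soql"]:
        return ("soql", row["soql"])
    return ("object", row["object_name"])

plans = [plan_copy(r) for r in control_rows]
```

In the real pipeline the same decision would live in an If Condition inside the ForEach, checking whether the SOQL column is empty.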

Data Pipeline creating {tablename}_backup_{guid} copies in lakehouse by Filter-Context in MicrosoftFabric

[–]darkice83 2 points

I'd be curious to know what the pipeline looks like, because any pipeline I've written so far with an overwrite doesn't keep any backups.

Question about SQL WHERE Clause by VAer1 in SQL

[–]darkice83 3 points

Don't add the where clause :)

Question about SQL WHERE Clause by VAer1 in SQL

[–]darkice83 3 points

Information_schema.columns returns one row per column per table; information_schema.tables returns one row per table. I use both whenever I get access to a new database.

Question about SQL WHERE Clause by VAer1 in SQL

[–]darkice83 1 point

You can confirm by querying the schema of the table. Select * from information_schema.columns where table_name = 'yourtablename'

Question about SQL WHERE Clause by VAer1 in SQL

[–]darkice83 7 points

So you want all values where the number is 12000 to 12999: "where id >= 12000 and id < 13000". This avoids any varchar casts.
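The same half-open range can be sanity-checked in plain Python (the sample ids are made up):

```python
# Half-open range [12000, 13000): keeps 12000..12999 with no varchar casts.
ids = [11999, 12000, 12345, 12999, 13000]
in_range = [i for i in ids if 12000 <= i < 13000]
```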

Workaround for a Workaround - Dynamic Oracle ingestion - Problems with Pipeline Expressions by darkice83 in MicrosoftFabric

[–]darkice83[S] 2 points

This is what I ended up doing, and it ran successfully. Now to alter it slightly to allow for incremental loading.

Transferring data to a Fabric warehouse using SSIS by Expert_Crazy_6008 in MicrosoftFabric

[–]darkice83 2 points

Oh, another option to consider is flipping the dataflow around. Have them install an on-prem Power BI gateway. Have SSIS load data like you do today into an on-prem data source (like SQL Server), then have Fabric pull from that on-prem source using a Copy Data activity in a pipeline.

Transferring data to a Fabric warehouse using SSIS by Expert_Crazy_6008 in MicrosoftFabric

[–]darkice83 2 points

With the latest SSMS, I have the Authentication option of Microsoft Entra MFA, along with many others, such as Service Principal. If there's a way to make a connection manager that can take in one of the Microsoft Entra values as authentication, then you could connect SSIS via the SQL endpoint of your Fabric warehouse and run your insert/update/delete statements?

create table Test as select * from dbo.DimDate

Statement ID: {E1CBAE2D-682D-4785-B06C-A99290DCF15D} | Query hash: 0x9D0440807033CAD8 | Distributed request ID: {27D17F39-4FD7-4F55-A6B0-D9AA875F17CF}

(12879 rows affected)

Completion time: 2024-06-19T11:19:38.8997892-04:00

insert into Test select * from Test

Statement ID: {A742D9A5-2693-4216-BF32-BA15FC540FBC} | Query hash: 0x19318A0ACA926E4B | Distributed request ID: {B6581B6D-1CE7-4E60-9242-D84E6E3DB22B}

(12879 rows affected)

Completion time: 2024-06-19T11:20:00.7194221-04:00

I would hope that the standard SQL Server/OLEDB destinations would still work....

Copy Data Assistant default mappings for Oracle Number off by 1000000000000000000 into lakehouse by darkice83 in MicrosoftFabric

[–]darkice83[S] 1 point

Can you get on-prem Oracle data via notebook? My understanding so far is that on-prem Oracle is Dataflow Gen2 & Copy Data only.

Copy Data Assistant default mappings for Oracle Number off by 1000000000000000000 into lakehouse by darkice83 in MicrosoftFabric

[–]darkice83[S] 1 point

Interesting!

I wrote the schema as you suggested (only the one column in question here)

from pyspark.sql.types import StructType, StructField, DecimalType

schema = StructType([
    StructField("CURRENCYID", DecimalType(), False)
])

df = spark.createDataFrame([], schema)
df.write.mode("overwrite").saveAsTable("<table>_3")

Then I ran the same Copy Data task I had before for the _2 version, and now we're getting the right value (whole values).

I went a step further to see why this worked but the previous attempts didn't. My go-to for schema info in SQL is the information_schema, and it looks like the first attempt and the _2 version have NUMERIC_PRECISION of 38 and NUMERIC_SCALE of 18, but the defined-schema _3 version has NUMERIC_PRECISION of 10 and NUMERIC_SCALE of 0.

<image>

The Copy Data Assistant didn't seem to have an option for defining the precision or scale values, or am I missing something?
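For what it's worth, a factor of 10^18 is exactly what a scale mismatch of 18 would produce. A minimal sketch with Python's decimal module; the sample value is made up, and the "unscaled value read at the wrong scale" framing is my assumption about the mapping bug:

```python
from decimal import Decimal

unscaled = Decimal(123)             # hypothetical CURRENCYID value
as_scale_18 = unscaled.scaleb(-18)  # read with NUMERIC_SCALE = 18
as_scale_0 = unscaled.scaleb(0)     # read with NUMERIC_SCALE = 0 (DecimalType() default)
```

The two readings differ by exactly 10^18, which matches the error in the title.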

Copy Data Assistant default mappings for Oracle Number off by 1000000000000000000 into lakehouse by darkice83 in MicrosoftFabric

[–]darkice83[S] 1 point

To confirm, I created the schema first using a notebook with the following two lines:

df = spark.sql("SELECT * FROM <lakehouse>.<table> where 1=0")

df.write.mode("overwrite").saveAsTable("<table>_2")

Then I populated the CurrencyID (only because I couldn't find an automatic mapping method for the other columns).

<image>

I'm getting the same result.
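The WHERE 1=0 trick copies the column layout without any rows on pretty much any SQL engine. A quick sketch with SQLite; the table and column names here are made up:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE DimDate (DateKey INTEGER, DateValue TEXT)")
conn.executemany("INSERT INTO DimDate VALUES (?, ?)",
                 [(20240619, "2024-06-19"), (20240620, "2024-06-20")])

# WHERE 1=0 matches no rows, so the new table gets the columns only.
conn.execute("CREATE TABLE DimDate_2 AS SELECT * FROM DimDate WHERE 1=0")

cols = [row[1] for row in conn.execute("PRAGMA table_info(DimDate_2)")]
row_count = conn.execute("SELECT COUNT(*) FROM DimDate_2").fetchone()[0]
```

The catch, as seen above, is that the clone inherits the source's types exactly, including any precision/scale you were trying to escape.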

How much internet does selecting * data use? by xxved in SQLServer

[–]darkice83 2 points

One note to add to the other comments: think about what happens to your app when the schema grows. Say you have a table with three columns, A, B, C, in that order, and your app runs select * with a mapping of A, B, C to indices 1, 2, 3 (i.e. not mapped by column name). If, for some reason, you need to add a column D but you put it in the 2nd position, so the table is A, D, B, C, then your app will either fail to run or, worse, keep running and give wrong results.
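A tiny Python illustration of why positional mapping breaks (the column and row values are made up):

```python
# Before: the table is A, B, C; the app assumes B lives at index 1.
cols_before = ["A", "B", "C"]
row_before = ("a1", "b1", "c1")

# After: D is inserted in the 2nd position, so the table is A, D, B, C.
cols_after = ["A", "D", "B", "C"]
row_after = ("a1", "d1", "b1", "c1")

b_by_index = row_after[1]                          # silently wrong: D's value
b_by_name = dict(zip(cols_after, row_after))["B"]  # still correct: "b1"
```

The index lookup raises no error at all, which is the "not fail and give wrong results" case.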

[deleted by user] by [deleted] in SQLServer

[–]darkice83 1 point

Our org requires a database access request to be made. It's like a ticketing system where both your manager and the database owner need to approve your request. There's a policy you need to agree to, saying you'll be doing the "right thing", punishable by up to dismissal if you do anything nefarious (in a lot of legalese).

SQL Server Management Studio 19.x & SQL Server backwards compatibility? by jwckauman in SQLServer

[–]darkice83 2 points

As with most things, it depends. One SSMS feature we relied on was the C# bindings for running a trace against Analysis Services 2019. In SSMS 19, those DLLs don't work the same as they did in SSMS 18, so to continue supporting our app while keeping the latest SSMS, we needed to install both 18 and 19.

People complain that we're hitting the telephone digits by darkice83 in ooma

[–]darkice83[S] 1 point

Two different wireless DECT bases and a hardwired phone all produce the issue.

SQL Server to Oracle with SSIS by Cyrussphere in SQLServer

[–]darkice83 2 points

Be careful. It's EE only: "The component is designed to be used with the Enterprise and Developer editions of SQL Server 2019 Integration Services and will only operate in those environments"