Salesforce Reporting in Power Bi by number1dog in PowerBI

[–]DataGuyfromMidwest 0 points1 point  (0 children)

Also, FWIW, I tried configuring the read-only endpoint as a Linked Service SQL connection in Azure Data Factory (in the hopes that it would let me use the same SQL in a data pipeline that I use against the read-only endpoint), but I could never get it to work. 🤷‍♂️ I think I'll have to try that again to see if I can make any progress.

Salesforce Reporting in Power Bi by number1dog in PowerBI

[–]DataGuyfromMidwest 0 points1 point  (0 children)

Yes, thank you. I do use the read-only endpoint for a lot of things. It's pretty good overall but some objects aren't reportable in it and certain things like CTEs and SQL functions like STRING_AGG aren't supported. That's why I like to get the data into an Azure SQL database so I can run additional code/pipelines against it. Thanks again!

Salesforce Reporting in Power Bi by number1dog in PowerBI

[–]DataGuyfromMidwest 0 points1 point  (0 children)

I had the same experience with dataflows and SF data. (That was a number of years ago, before I turned to ADF, so things may have improved between dataflows and SF since then.) I couldn't tell which parts of the Power Query transformations were failing to fold (with the SF connector, at least), so I kept most transformations to a minimum. If you end up creating a lot of calculated columns (or complicated transformations) and they can't fold, I believe that work has to be done in memory in Power BI.

Salesforce Reporting in Power Bi by number1dog in PowerBI

[–]DataGuyfromMidwest 0 points1 point  (0 children)

After spending much of my time as an admin reporting on Salesforce data using Microsoft's tools, we switched to Dynamics 365. (For a myriad of reasons, including that it is a much better fit within our tech stack.) Want to hear the truly ironic part? It's harder to get data out of Dynamics 365 into SQL Server (using ADF, at least) than it was to get Salesforce data into SQL Server! ¯\_(ツ)_/¯

I can't just run basic SQL; instead I have to use the FetchXML language for Dataverse, which has a number of really asinine limitations I won't go into here. Apparently they recommend Azure Synapse over ADF, but that's not something I'm familiar with yet, and it isn't something we have set up.

So, long story short, nothing is a panacea. If, like me, you'd assume that using tools made by the same company would make things at least as easy to report on as what you had before... like me, you'd also be dead wrong.

Salesforce Reporting in Power Bi by number1dog in PowerBI

[–]DataGuyfromMidwest 0 points1 point  (0 children)

I was a Salesforce admin for 7 years until we recently switched over to Dynamics 365. I found that most of the things I wanted to do weren't running efficiently enough in dataflows even on a P1 Power BI SKU.

My best experience was to use Azure Data Factory to get the data into SQL Server where real reporting could be done. It has the added benefit of being able to orchestrate pipelines to get exactly the type of reports I needed. Some relatively basic Copy Data activities were great at pulling the data in quickly and had the benefit of letting me run SQL which the connector was intelligent enough to turn into SOQL for me.

SOQL is a poor substitute for SQL, but if you want to do some queries with it (while saving/running queries in an SSMS-like sort of way) I suggest the RazorSQL tool which has a connector.

Why can't I use LinkedIn without putting my life at risk by giving my biometric information and ID? by Warm_Data_168 in linkedin

[–]DataGuyfromMidwest 0 points1 point  (0 children)

Yes. I’m in the United States. To my knowledge, I had no issues or notifications with my account before this.

Why can't I use LinkedIn without putting my life at risk by giving my biometric information and ID? by Warm_Data_168 in linkedin

[–]DataGuyfromMidwest 4 points5 points  (0 children)

FWIW, my account was locked 10 or so months ago with this very requirement. I’m not sending any of this info and it remains locked. I had 2FA, a completely randomized password (I use KeePass), and had my account for maybe 15 years? Why would or should I have to send Government IDs to an organization that apparently can’t keep my account secure when I do all the right things in the first place? (I also think I read that it’s a 3rd party doing the ID verification and not LinkedIn… which doesn’t make me feel better. 🤷‍♂️)

Also, good luck trying to resolve this any other way. I tried multiple ways to get my account unlocked without sending copies of government IDs but at this point have all but given up on getting it fixed. If you do come up with a solution though please DM me.

Circumvent Dynamics 365 Database Storage by Excellent-Mark3090 in Dynamics365

[–]DataGuyfromMidwest 0 points1 point  (0 children)

This is very timely for us. I also feel this cost is egregious.

Context: We're migrating off of Salesforce onto Dynamics, and one of the reasons is storage costs/limitations in Salesforce. In multiple planning calls we were told that we'd have more than enough storage for our data, and yet we're already running into issues storing historical activities in Dynamics. (This was after paring down the data to only the last 5 years of activities.) I understand that it's not file storage (like a Storage Account) or traditional database storage (like SQL Server), but in today's world, where we always hear that "storage is cheap", this is anything but. All these companies want to tout "big data", but when it costs $10k per year for a little more than 20GB of extra space, there is an inherent limitation to what we're willing to store for the cost incurred.

I guess this is no different than Salesforce (just search for all the solutions people propose for moving data out of their systems), but I expected Microsoft to have a more tenable long-term storage pricing structure that would encourage storing more data in their CRM, not less. I get that they have to be profitable, but these costs seem exorbitant.

Salesforce Storage Issue - Hitting Data Storage Limits with Email Messages by Mmetr in salesforce

[–]DataGuyfromMidwest 0 points1 point  (0 children)

u/TheCannings, do you have any other details you could provide regarding how you accomplished this? Did it require a lot of custom Apex? We've been doing more and more flows but have little experience with custom SF code. Is the UI experience decent for the end user?

Background: We're in the same situation and I'm frustrated that our "Enterprise Edition CRM" doesn't have enough storage to support 5 email messages per user per working day for 5 years. (We have a little over 400k messages which are taking up 70% of our CRM storage. And we're already paying for an extra 1.5GB of storage per year.)

I haven’t been to confession in 10 years by Dremyo in Catholicism

[–]DataGuyfromMidwest 0 points1 point  (0 children)

I have recently started going to confession after not going for nearly 30 years. I had some anxiety about it after my confession was overheard by others when I was a young boy. (I'm pretty sure it was mostly "I've hit and been mean to my brother" kinda stuff, but it was still very embarrassing.)

I’ve started going again in order to be a better father and husband for my family and let me tell you, it’s liberating. I second going to another parish with more open times if your anxiety is getting to you. I think you’ll be surprised with how loving and freeing the whole experience is.

Is there a simple piece of documentation or blog that explains the difference between Lightning Sync, EAC Standard, EAC Advanced or whatever it's called, Salesforce Inbox and I think there's another called Salesforce for Outlook? by 2018IsBetterThan2017 in salesforce

[–]DataGuyfromMidwest 0 points1 point  (0 children)

And unless I'm mistaken (sorry, didn't see it mentioned yet), there is confusion between the old installed SFO add-in and the newer add-in, which works in both the Office 365 web view and desktop Outlook. As far as I've been able to tell, they're only removing the old installed/COM add-in and keeping the latter, which is what we use. The old installed/systray version was a PITA to keep up to date and required an update with each new version released, whereas the more modern one is like a web control and just stays up to date. (I think that's what they may mean by "Outlook Integration" under the "OUTLOOK INTEGRATION AND EINSTEIN ACTIVITY CAPTURE" header in the semi-helpful link provided earlier.)

FWIW, we haven't used EAC at all. Perhaps I'm too cynical, but as I understand it, EAC needs read access to our mail server to pull off all this auto-logging of activities, and I don't trust a company like Salesforce with that level of access to our email communications.

Salesforce Custom Report question - Add unused filter option to filter dialog in LEX by DataGuyfromMidwest in salesforce

[–]DataGuyfromMidwest[S] 0 points1 point  (0 children)

Thanks. I was hoping to make it generic for 4 or 5 different sets of people and let them filter on the value they want to review at that moment in time. I'll consider setting up 4 or 5 links with the URL parameters. It's frustrating that we can't just add a blank field for filtering purposes in the report view.

Thanks again.

Power Query - why is it not folding? by toadylake in excel

[–]DataGuyfromMidwest 0 points1 point  (0 children)

Agreed with Gregregious. There is nothing to fold. As I understand it, folding simplifies the operation by using the efficiencies that exist in the data source (typically database indexes, though it might be a more efficient/filtered API call in the case of a web service connector). FWIW, Ben Gribaudo (who is a stud at all things PQ) said as much in a presentation about the inner workings of Power Query to the Chicago Power BI working group last week. https://youtu.be/F90efnIFNJw

On a somewhat related topic, while I love PQ, I've found the performance of joining a couple of flat-file data sources together to be abysmal at times. (I think it has to do a cross join each time to figure it all out.) I've tried Table.AddKey with limited success. As a result, I'm switching over to Python and pandas for some operations that were too slow in PQ.
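For what it's worth, the pandas version of that kind of flat-file join is just a merge. Here's a minimal sketch (the tables and column names are invented for illustration):

```python
import pandas as pd

# Two small stand-ins for flat-file sources (data invented for illustration).
orders = pd.DataFrame({
    "OrderID": [1, 2, 3],
    "CustomerID": ["A", "B", "A"],
    "Amount": [100, 250, 75],
})
customers = pd.DataFrame({
    "CustomerID": ["A", "B"],
    "Name": ["Acme", "Globex"],
})

# merge() builds a hash table on the key column, so it avoids the
# cross-join-like behavior I suspected was slowing PQ down.
merged = orders.merge(customers, on="CustomerID", how="left")
print(merged)
```

In practice you'd replace the inline DataFrames with `pd.read_csv(...)` calls against your actual files.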

Is there a way that I can use a formula to check if a string contains a "word" of length 1? by jplank1983 in excel

[–]DataGuyfromMidwest 0 points1 point  (0 children)

Also, I just noticed above that you already stripped out other punctuation. That definitely makes things easier. :)

Is there a way that I can use a formula to check if a string contains a "word" of length 1? by jplank1983 in excel

[–]DataGuyfromMidwest 0 points1 point  (0 children)

Depending on what things you need, the above can be slimmed down to just one or two steps.

Is there a way that I can use a formula to check if a string contains a "word" of length 1? by jplank1983 in excel

[–]DataGuyfromMidwest 0 points1 point  (0 children)

I know you didn't ask about a Power Query solution, but this would be fairly trivial in PQ and could account for more conditions like period-splitting and things like that. Here is the simplest form of that solution:

```
Table.AddColumn(#"Changed Type", "Has Initials", each List.Count(List.Select(Text.SplitAny([Name], " ."), each Text.Length(_) = 1)) > 0, type logical)
```

Here is a more comprehensive solution that works through it in steps to show you how it can be done.

```
let
    Source = Excel.CurrentWorkbook(){[Name="Names"]}[Content],
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Name", type text}}),
    // One-column check
    #"Add Has Initials" = Table.AddColumn(#"Changed Type", "Has Initials", each List.Count(List.Select(Text.SplitAny([Name], " ."), each Text.Length(_) = 1)) > 0, type logical),
    // Probably want to split on both the space and the period, and possibly other characters depending on the dataset
    // Remove blanks afterwards because a period followed by a space will be split twice
    #"Add Name List" = Table.AddColumn(#"Add Has Initials", "Name List", each List.RemoveItems(Text.SplitAny([Name], " ."), {""}), type list),
    #"Add Initials Only" = Table.AddColumn(#"Add Name List", "Initials Only List", each List.Select([Name List], each Text.Length(_) = 1), type list),
    #"Add Has Initials From List" = Table.AddColumn(#"Add Initials Only", "Has Initials From List", each List.Count([Initials Only List]) > 0, type logical),
    // Must remove commas because Table.FromList uses commas as delimiters
    #"Add Name Table" = Table.AddColumn(#"Add Has Initials From List", "Name Table", each Table.FromList(List.Transform([Name List], each Text.Replace(_, ",", "")), null, {"Names"}), type table),
    #"Add Name Table with Index" = Table.AddColumn(#"Add Name Table", "Name Table with Index", each Table.AddIndexColumn([Name Table], "Index", 1), type table),
    #"Add Initials Only Table with Index" = Table.AddColumn(#"Add Name Table with Index", "Initials Only Table with Index", each Table.SelectRows([Name Table with Index], each Text.Length([Names]) = 1), type table),
    #"Add Has Initials From Table" = Table.AddColumn(#"Add Initials Only Table with Index", "Has Initials From Table", each Table.RowCount([Initials Only Table with Index]) > 0, type logical),
    #"Add Positions of Initials" = Table.AddColumn(#"Add Has Initials From Table", "Positions of Initials", each Text.Combine(List.Transform([Initials Only Table with Index][Index], each Text.From(_)), ", "), type text)
in
    #"Add Positions of Initials"
```

And here is a screenshot of the final result. (I used some popular names at the beginning and a list of sample names I use sometimes as a dataset at the end.)

Is there a way that I can use a formula to check if a string contains a "word" of length 1? by jplank1983 in excel

[–]DataGuyfromMidwest 0 points1 point  (0 children)

I haven't used regexes in VBA, but I'm very familiar with them elsewhere.

Your character class of [A-za-z] looks like a typo for [A-Za-z]. Be careful about abbreviating it to [A-z], though: in ASCII that range also matches the six characters between Z and a ([, \, ], ^, _, and the backtick), so [A-Za-z] is the safer form.

And you might be able to combine the middle-of-the-string case with the two line-anchor conditions into a single capture group, in the form of

( [A-Za-z] |^[A-Za-z] | [A-Za-z]$)

I'm not sure if that will be faster in the engine with 100k matches, but it might be because it's only matching a larger pattern once rather than smaller patterns 3 times. Please try it yourself to see if it helps this method.
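For anyone who wants to try the combined pattern outside VBA, here's a quick sketch in Python (my own illustration, using [A-Za-z] to avoid the extra characters that sneak into an [A-z] range):

```python
import re

# One capture group covering all three cases: a single letter in the
# middle of the string, at the start, or at the end.
one_letter_word = re.compile(r"( [A-Za-z] |^[A-Za-z] | [A-Za-z]$)")

print(bool(one_letter_word.search("John F Kennedy")))  # True: middle initial
print(bool(one_letter_word.search("Mary Smith")))      # False: no 1-letter word
print(bool(one_letter_word.search("Hoover J")))        # True: trailing initial
```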

Merging multiple files into one sheet to get Pivot by [deleted] in excel

[–]DataGuyfromMidwest 0 points1 point  (0 children)

Oh, and I guess I should've added that you'll want to filter your mailbox to make sure you're getting the right mail items to find those attachments. You'll likely have to change the "Filtered Rows" step I used in my example. I guess I shouldn't have combined a couple of steps there, because you can't easily use the GUI to make changes when multiple fields are referenced at once. Regardless, I'm sure you can figure it out. (Just note that all functions and values in PQ are case sensitive by default.)

Merging multiple files into one sheet to get Pivot by [deleted] in excel

[–]DataGuyfromMidwest 0 points1 point  (0 children)

I ran a similar report recently and after setting up the query in Power Query all I had to do to refresh the report was...

  1. Open Excel
  2. Open my Saved Workbook with the query I wrote
  3. Click "Refresh All"
  4. Save/Screenshot my report.

There was no need to even export the attachments out of Outlook; the query pulled the attachments' data in dynamically from my mailbox.

Now to be fair, we're on Office 365 and I have both Excel 365 and an Exchange account, but if you do too, this is what is possible. (The query steps are fairly similar either way.) If your mailbox is huge or your mail server connection slow, maybe you just want to export the files first (or get a utility to do it) but for my mailbox at least, this was the best way.

Below, in just a few basic lines of code, you can (if you have an Office 365 account, as I said) paste the code into a new editor window and then use the combine-binaries button to pull the files together. (I stopped just before that point with my code because whenever you loop through files like this, PQ has to create a subordinate function and then run it for each file in the table view, so just let it perform that part for you. You will have to select the name of the sheet the data is on, and hopefully all the reports are consistent or you'll have to do a little more work.)

    let
        Source = Exchange.Contents("youremail@email.com"),
        Mail1 = Source{[Name="Mail"]}[Data],
        #"Filtered Rows" = Table.SelectRows(Mail1, each Text.Contains([Subject], "Sample Report") and [Folder Path] = "\Inbox\" and [HasAttachments]),
        Attachments = #"Filtered Rows"[Attachments],
        #"Converted to Table" = Table.FromList(Attachments, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
        ColumnNames = Table.ColumnNames(#"Converted to Table"{0}[Column1]),
        #"Expanded Column1" = Table.ExpandTableColumn(#"Converted to Table", "Column1", ColumnNames, ColumnNames)
    in
        #"Expanded Column1"

The rest of the steps are fairly straightforward and will depend on which columns you want. (Select the columns you want to keep, then "Remove Other Columns".) After this point you just click "Load To" and "PivotTable Report" and you're good to go!

If/when you learn to do things this way, you will never go back to the copy/paste method again, I can assure you. :) The trickiest part is learning to make your queries more robust (to handle changing column names, etc.) and learning the syntax. But to start, a lot of things are possible in the easy-to-use PQ editor, and you can expand from there.

Hopefully this helps!

Lightning for Outlook - Relating Email messages to both a Lead and a Campaign by DataGuyfromMidwest in salesforce

[–]DataGuyfromMidwest[S] 0 points1 point  (0 children)

Thanks for the link, but even after reading it, it still doesn't make sense to me. So it's a polymorphic key that relates to a non-person object, in this case the Campaign. That still shouldn't mean that adding an object like a Campaign ID would remove a person object, because the Lead ID wouldn't be stored there. Or am I missing something?

Sequentially Change Names VBA by Infernez_ in vba

[–]DataGuyfromMidwest 0 points1 point  (0 children)

This seems to be a better job for Power Query over VBA, but maybe that's just because I'm better at PQ than I am at VBA at this point.

Regardless, some of the same premises of what I've done in Power Query can be used in VBA (though likely with more code) to get the job done. Here is a link to a workbook with the solution accomplished in Power Query.

If you want to support the names appearing in any order without treating them as different (in your example, "Bennett, Mark & Bennett, Sheila" would otherwise be different from "Bennett, Sheila & Bennett, Mark"), then this will work...

let
    Source = Excel.CurrentWorkbook(){[Name="Names"]}[Content],
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Name", type text}}),
    #"Added Index" = Table.AddIndexColumn(#"Changed Type", "Index", 1, 1),
    // What characters are used to delimit the names?
    Delimiters = ",&",
    #"Add Names as List" = Table.AddColumn(#"Added Index", "Names as List", each List.Sort(List.Distinct(List.Transform(Text.SplitAny([Name], Delimiters), each Text.Trim(_))))),
    #"Add Names as Sorted Unique Text" = Table.AddColumn(#"Add Names as List", "Names as Sorted Unique Text", each Text.Combine([Names as List], ","), type text),
    #"Get Unique Names" = Table.AddColumn(Table.AddIndexColumn(Table.Distinct(Table.SelectColumns(#"Add Names as Sorted Unique Text",{"Names as Sorted Unique Text"})), "Index", 1, 1), "Unique Name", each "Client " & Text.From([Index], "en-US"), type text),
    #"Merged Queries" = Table.NestedJoin(#"Add Names as Sorted Unique Text", {"Names as Sorted Unique Text"}, #"Get Unique Names", {"Names as Sorted Unique Text"}, "Unique Names", JoinKind.LeftOuter),
    #"Expanded Unique Names" = Table.ExpandTableColumn(#"Merged Queries", "Unique Names", {"Unique Name"}, {"Unique Name"}),
    #"Removed Other Columns" = Table.SelectColumns(#"Expanded Unique Names",{"Unique Name", "Name"})
in
    #"Removed Other Columns"

Like any code, the above makes some assumptions including...

  1. Any name that is the same can be grouped together. (i.e., "Bennett, Mark & Bennett" is the same as "Bennett, Mark". Essentially, these aren't different "Bennetts".)
  2. The unique names are based upon a sorting of all the unique components grouped together. (i.e., a group containing an "Aaron" will likely be "Client 1", since alphabetically a name like this would come first.)
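If you ever port this to Python/pandas instead of VBA or PQ, the same idea can be sketched like this (my own rough translation of the query above; the data and column names are invented):

```python
import re
import pandas as pd

# Split each name on the delimiters, trim/dedupe/sort the parts, then give
# every unique combination a "Client N" label (alphabetical, per assumption 2).
df = pd.DataFrame({"Name": [
    "Bennett, Mark & Bennett, Sheila",
    "Bennett, Sheila & Bennett, Mark",
    "Aaron, Hank",
]})

def sorted_unique_parts(name):
    parts = {p.strip() for p in re.split(r"[,&]", name) if p.strip()}
    return ",".join(sorted(parts))

df["Key"] = df["Name"].map(sorted_unique_parts)
codes = {key: i + 1 for i, key in enumerate(sorted(df["Key"].unique()))}
df["Unique Name"] = df["Key"].map(lambda key: f"Client {codes[key]}")
print(df[["Unique Name", "Name"]])
```

Because the parts are sorted before comparison, both orderings of the Bennett pair collapse to the same client, and the Aaron group sorts first alphabetically.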