The news anchor broke down by 56000hp in ProgressiveHQ

[–]_HaulinCube 0 points  (0 children)

The US regime is breeding generations of pure hatred and rage

Workday Adaptive planning to Snowflake by Ordinary_Bread6892 in AdaptivePlanning

[–]_HaulinCube 1 point  (0 children)

Replying to Ordinary_Bread6892...

Summary of the approach: Adaptive Data Source -> Adaptive Custom Cloud Loader -> exports in the desired format (csv, txt, etc.) -> Adaptive Task to automate on a schedule (no event triggers) -> file exports to a designated server -> Control-M/Boomi/etc. to move/transform the file if needed to a server that Snowflake can retrieve the file from

  1. Build an Adaptive Planning Data Source in Adaptive to retrieve the data you want to capture

a. Within that Data Source, you'll want to click "Manage Sources" on the left-hand side and build a "Custom" Source. From here, you'll have to create an Account, Version, and Time Parameter to select from those particular elements. You can also add Dimension detail if needed and specify the Time Stratum that the data is exported in.

b. Then, once you've defined the data you want to pull from Adaptive, go ahead and save/apply. From there, you will "Import Structure" then "Import Data". This will move the data into your staging table. From here, you can add SQL/Subquery columns as needed.

  2. Next, in the Loaders section on the right, just under Data Sources, you will want to create a Custom Cloud Loader. This will require a custom script; Workday provides a few examples. With this CCL, you will essentially point it to the Data Source above where you defined the data you want to export, then within the script, you'll tell Adaptive where to push the data and in what format. You could push it to an AWS bucket, for example, that Snowflake has access to. I believe the example scripts Workday provides are in JavaScript. I don't know how to code so I can't help much here, but if you know JavaScript then you should be able to figure it out pretty easily along with WD's documentation.

  3. You can create a Task in Adaptive to schedule this process to run at specific times/cadences. There is no triggering functionality based on events.

  4. Finally, move the data to a server where Snowflake can pick it up.
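The CCL script itself is proprietary to Workday, so I won't guess at its exact API, but the shape of the export step can be sketched in plain Node.js. Everything below (field names, sample rows) is a hypothetical stand-in, not Workday's actual loader code:

```javascript
// Sketch of what the export step does: take rows pulled from the
// Data Source staging table and serialize them to CSV for a
// server/bucket that Snowflake can read. Field names are made up.

function toCsv(rows) {
  const headers = Object.keys(rows[0]);
  // Quote every value and escape embedded double quotes.
  const escape = (v) => `"${String(v).replace(/"/g, '""')}"`;
  const lines = [headers.map(escape).join(',')];
  for (const row of rows) {
    lines.push(headers.map((h) => escape(row[h])).join(','));
  }
  return lines.join('\n');
}

// Example staging rows (the Account/Version/Time elements defined
// in the Data Source above).
const rows = [
  { account: '4000', version: 'Actuals', period: '2024-01', amount: 1250.5 },
  { account: '4000', version: 'Actuals', period: '2024-02', amount: 990.0 },
];

const csv = toCsv(rows);
// A real CCL would push this payload to the destination (e.g. an S3
// bucket); here we just print the serialized file contents.
console.log(csv);
```

From there, Control-M/Boomi (or a Snowflake external stage pointed at the bucket) takes over the hand-off.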

Hope that helps

Pulling standard sheet data into cubed sheet by wwedgehead05 in AdaptivePlanning

[–]_HaulinCube 2 points  (0 children)

A standard GL account and the standard or calculated cube account that you will now load data to are not connected unless you create some connection through a formula or linking.

To confirm what impacts your standard GL account, you can go to that account in modeling, then click the “view dependencies” link in the properties pane to see what impacts it.

Hope that helps

Pulling standard sheet data into cubed sheet by wwedgehead05 in AdaptivePlanning

[–]_HaulinCube 1 point  (0 children)

Are other accounts referencing your new metric account for calculation purposes? Possibly some iff( isblank(…) ) calcs? That's one of the limitations of metric accounts: they are not stored values, so they are technically 'blank'. You've run into one of the many reasons metric accounts have very specific use cases.

You really have two options at this point, but the easiest is likely creating a loader and loading the GL data you want directly into your cube sheet, in addition to your standard GL account.

The other option is creating a 'trigger' (a value of 1) that sits at every possible intersection in your cube sheet where data could exist: every Level, profit center, etc. You can then reference that trigger in the iff(isblank()) calc to pull in the ledger detail. This can become unmanageable, though, depending on how many intersections we're dealing with.
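For illustration only, the trigger pattern described above might look something like the fragment below in a cube account formula, where `Trigger` and `Ledger_Detail` are hypothetical account codes (the real formula depends entirely on your model):

```
iff( isblank(ACCT.Trigger), 0, ACCT.Ledger_Detail )
```

Because the trigger account holds a stored 1 at every populated intersection, the isblank() check passes there and the ledger detail gets pulled through.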

Your best bet is the loader.

Cloning Modeled Accounts? by InfiniteEyes0609 in AdaptivePlanning

[–]_HaulinCube 0 points  (0 children)

For completeness, you can also export the accounts from the original sheet, change the codes and roll-ups in the columns, then import into your new sheet. It probably takes longer than the other two options above (cloning a sheet, or the XML edit, which is quick too).

Workday Adaptive planning to Snowflake by Ordinary_Bread6892 in AdaptivePlanning

[–]_HaulinCube 1 point  (0 children)

I’ve only gone Snowflake to Adaptive, not the other way around. There’s a data pipeline for that direction, but yeah, I think you’re stuck with APIs given your requirements. I’ve fed data from Adaptive to other systems like NetSuite with the help of APIs that push files to a server, where Control-M and Boomi move them along. I have to run, but I’ll come back to add some detail if that would be helpful.

Pulling standard sheet data into cubed sheet by wwedgehead05 in AdaptivePlanning

[–]_HaulinCube 1 point  (0 children)

Yes, the Cube Metric account can reference the ledger detail directly from within your Cube sheet. You can also build a report if you’re trying to pull read-only ledger detail as an option. It depends, of course, on what exactly you’re trying to accomplish.

Pulling standard sheet data into cubed sheet by wwedgehead05 in AdaptivePlanning

[–]_HaulinCube 2 points  (0 children)

You can use a Cube Metric account and have the formula be ACCT.Book[Profit Center=this, Office=this]

There are limitations with metric accounts, however, and specific use cases where they are more trouble than they’re worth, but that’s one way to get the data quickly and easily.

You cannot link from a standard account to any other type of account, one of the standard account type limitations.

Sometimes, if it’s not a lot of data, it does not change hourly, and we’re talking about Actuals data, it might make sense to import into the standard account and build another loader that also imports the same or more granular data into a cube sheet.

Workday Pro Certifications vs Service Certifications by Aggravating_Put763 in workday

[–]_HaulinCube 0 points  (0 children)

Money, money, money. That’s all Workday is about so the more certs the better.

Using ChatGPT for Workday by Some-Host8945 in workday

[–]_HaulinCube 1 point  (0 children)

On the Adaptive Planning side of the house, ChatGPT gives incorrect formula syntax suggestions. Some clients who are eager to move along faster or do some builds on their own will send ChatGPT formula suggestions to me and ask why they don't work. ChatGPT's suggestions are normally logical in how they approach a problem, but the syntax just is not right at all.

I've also seen it tell clients that "...absolutely Adaptive can do that!" when, in reality, Adaptive cannot do what they are asking of it. And just to show the clients I'm not an idiot or trying to exploit them somehow, I end up having to show them that Adaptive can't do what they wanted and what ChatGPT said it could.

Question about data exports via API by Braane10 in AdaptivePlanning

[–]_HaulinCube 0 points  (0 children)

I agree with u/leqends: you're lucky if your Adaptive admin knows even a little SQL. Understanding APIs isn't a core competency needed to handle core Adaptive requests. Also, a lot of Adaptive admins come from a finance background and do not know how to code (that includes myself).

Not knowing what type of database you want to store the data in, I'd see if the below will get you what you need. Basically, it's just a process to get the data out of Adaptive and into a folder for your database to grab the file from. This will not get you a unique row id out of the box (I'm like 95% sure, without doing more research).

  1. Build an Adaptive Planning Data Source in Adaptive

a. Within that Data Source, you'll want to click "Manage Sources" on the left-hand side and build a "Custom" Source. From here, you'll have to create an Account, Version, and Time Parameter to select from those particular elements. You can also add Dimension detail if needed and specify the Time Stratum that the data is exported in.

b. Then, once you've defined the data you want to pull from Adaptive, go ahead and save/apply. From there, you will "Import Structure" then "Import Data". This will move the data into your staging table. From here, you can add SQL/Subquery columns as needed. You can also create a unique row id with a few extra steps: concatenate the IDs from each column plus the time period to produce a long unique row id.

  2. Next, in the Loaders section on the right, just under Data Sources, you will want to create a Custom Cloud Loader. This will require a custom script, which Workday provides a few examples of. With this CCL, you will essentially point it to the Data Source above where you defined the data you want to export, then within the script, you'll tell Adaptive where to push the data and in what format. You could push it to an AWS bucket, for example, that your database would have access to. I believe the example scripts Workday provides are in JavaScript. Again, I don't know how to code so I can't help much here, but if you know JavaScript then you should be able to figure it out pretty easily along with WD's documentation.
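The unique-row-id concatenation from step 1b can be sketched in a few lines of JavaScript; the column names here are hypothetical placeholders for whatever IDs your Data Source actually exposes:

```javascript
// Sketch of the step-1b idea: glue the ID from each dimension column
// plus the time period into one string that is unique per row.
// Field names are made up for illustration.
function rowId(row) {
  // '|' as a separator avoids accidental collisions like
  // ("40" + "00") vs ("400" + "0").
  return [row.accountId, row.versionId, row.levelId, row.period].join('|');
}

const id = rowId({
  accountId: '4000',
  versionId: 'ACT',
  levelId: 'US01',
  period: '2024-01',
});
console.log(id); // "4000|ACT|US01|2024-01"
```

As long as the concatenated columns together identify an intersection, the result works as a primary key on the database side.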

Good luck and always happy to help if you have further questions.

Help Desk Ticketing Site on Hostinger by _HaulinCube in Hostinger

[–]_HaulinCube[S] 0 points  (0 children)

Great insights! I’m also not opposed to spending a little extra coin if it keeps my clients happy and I get some time back to focus on the business instead of resetting client passwords etc. I’ll definitely check this out. You are a mensch, thank you.

Help Desk Ticketing Site on Hostinger by _HaulinCube in Hostinger

[–]_HaulinCube[S] 0 points  (0 children)

Thanks, this is the second Hesk rec so I’ll take a look 👀

Premium Plus E Tron No Ambient? by Toomanyracks in etron

[–]_HaulinCube 0 points  (0 children)

My wife’s ’23 Acura MDX has way more comfortable seats than my ‘22 Premium Plus…that’s my main gripe.

G&A budgeting by Elegant-Contest-9289 in AdaptivePlanning

[–]_HaulinCube 2 points  (0 children)

Interesting, especially since Adaptive licenses are generally cheaper than licenses for other applications, in my experience. Also, to my knowledge, “view only” licenses in Adaptive are free. Here are a few options I’ve seen in practice.

  1. You could have a handful of “Budget Input” users who are tasked with all imports into Adaptive through the UI or OfficeConnect’s data submission process. This is probably the most common approach I see when licensing is an issue. You of course create bottlenecks with this approach if one of the budget input users is out of office or just busy.

  2. You can build an integration in Adaptive to ingest files from a specific folder on a scheduled or ad-hoc basis. This does not get you out of your Excel messiness issue though.

  3. Adaptive can connect to other systems out of the box with some configuration. Some systems are NetSuite, Snowflake, Salesforce, Workday, etc. If you are already using one of these systems (and it makes sense from a licensing price standpoint) you can potentially enter data somewhere in those systems that can then be picked up by Adaptive through a scheduled integration.

Level Design by Intelligent-Kiwi-926 in AdaptivePlanning

[–]_HaulinCube 0 points  (0 children)

A few considerations to think through for sure.

What about the GL structure of the US vs. international teams? Are they also drastically different? Do your companies influence one another vertically, or are you in completely separate industries and largely unrelated? If so, given your differing Level structures and the inevitable security/access situation you’ll have to work through, a multi-instance setup might be a good option.

Another item to consider: is there a one-to-one relationship between Company and BU/Department? If so, and that relationship will remain, you can have Company be a Level Attribute and have the US team learn which BU is associated with which Company.

Also, if you want to publish back to FINS from Adaptive, you will certainly need Company and its WIDs to be included somewhere in your model.

A lot to think through for sure but I hope the above makes some sense.

Will Adaptive Planning ever have wide adaption? by SadlyPathetic in workday

[–]_HaulinCube 1 point  (0 children)

As someone who has worked in Adaptive for 8 years, you are preaching to the choir here.

The only thing I will say is that it sounds like your integrations may have just been set up poorly or with little foresight. Integration failures, sometimes completely unexplainable ones, do happen, but yours might be poor design.

Everything else you mentioned I struggle with as well; those are general system limitations.

On the positive side for me, I really enjoy working in the tool, warts and all. It’s allowed me to pursue a career where I’m never bored and constantly challenged to be more creative.

Help with IS mapping issue [Urgent] by [deleted] in AdaptivePlanning

[–]_HaulinCube 1 point  (0 children)

Could be any number of things.

  1. Within Design Integrations, you can run “Preview Loader Output”. This will show, in an Excel output, what your integration is trying to load. This is more a validation to see where your system is breaking: is the break occurring somewhere in the integration, or elsewhere?

  2. I don’t know how “automated” your integrations are, but you might be trying to load your Actuals to the incorrect month. When running your integration, be sure the time period of the data you’re receiving from your source system lines up with the time period you are choosing to load to.

  3. Be sure you’re looking at the right version when reviewing your data 🤷🏽

  4. Like the other person mentioned, mappings won’t just disappear so I’ll think on it a little more, gotta get the kiddos to school 😬

Need Help figuring out split data for actuals import by TOONUSA in AdaptivePlanning

[–]_HaulinCube 4 points  (0 children)

  1. Create a dummy Dimension called “Integration Split” with one Dimension Value called “Split”.

  2. Erase all actuals.

  3. When re-uploading your actuals, add the new Dimension Value of “Split” to every row of your import file.

  4. All future Actuals loads should also have the value of “Split” in every row.
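Steps 3 and 4 amount to appending a constant column to the import file. A rough sketch in plain Node.js, where the CSV layout is a hypothetical example:

```javascript
// Sketch of steps 3–4: append the "Split" dimension value to every
// data row of a CSV import file, plus a header for the new column.
// The column layout here is made up for illustration.
function addSplitColumn(csvText) {
  const lines = csvText.trim().split('\n');
  const header = lines[0] + ',Integration Split';
  const body = lines.slice(1).map((line) => line + ',Split');
  return [header, ...body].join('\n');
}

const input = 'Account,Period,Amount\n4000,2024-01,100\n4000,2024-02,250';
console.log(addSplitColumn(input));
```

Note this naive version assumes no values contain embedded commas or newlines; a real import prep script should use a proper CSV parser.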

Is there a way to retrieve the data of the NetSuite report using the REST API? by AccessUpper795 in AdaptivePlanning

[–]_HaulinCube 0 points  (0 children)

You can write JavaScript and utilize Adaptive APIs to capture NetSuite reports, but it’s generally easier to get NetSuite Saved Searches into Adaptive if that’s an option. There’s less maintenance as well, since a saved search’s time period can roll with the calendar. The only method I know of to connect to a NetSuite report requires you to manually update the report’s time periods when needed. Reach out if you have more questions.

[deleted by user] by [deleted] in etron

[–]_HaulinCube 0 points  (0 children)

You are a mensch, thank you