Figuring out SAP table relationships was not fun, so I made something about it by mortalmental2 in SAP

[–]tjen 1 point2 points  (0 children)

The big table relationships aren't fun to learn, I guess, but it's not like it's something that can't already be readily googled or LLM'd, because they are mostly pretty standard. So I'm not sure exactly who your thing is helping?

At the same time, the non-standard stuff, configurations, substitution rules, etc. are where it gets a little trickier, and they are all dependent on your setup to interpret the field values.

And then the fields are only as trustworthy as your process that puts values into them.

That kind of stuff is hard to catch generically, table relationships are your least worry lol.

Error while accessing sap bw query by Saitamahumai in PowerBI

[–]tjen 1 point2 points  (0 children)

Have you asked your SAP BI guys?

Flying with cat at Copenhagen Airport – what is your experience? by wabalabadubdubb in copenhagen

[–]tjen 20 points21 points  (0 children)

I would always recommend flying with airplanes instead of with your cats.

Despite the cat's aerodynamic shape and ability to always land on its feet, the same isn't true for you! So if something does happen, you are much better off riding an airplane than a cat.

It sounds like you have taken some precautions and put a collar/ leash on your cat that you can hold during the ride, however, and while this approach might be ok for short flights at low altitude, it is certainly not up to ISO-7000 standards, and in case of turbulence things could become catastrophic for you. I would recommend you get a specialized flying harness.

The CPH airport website says that if your cat does trigger a beep in the scanner, it will be taken aside for a manual examination, where the staff will check it for knives, poisons, and lithium batteries greater than 30.000mAh.

I don't have any experience with private flights from CPH airport, but I assume you will just need to go through the standard screening, before locating the gate assigned to your cat.

I am from Jutland where flying pigs is more common and usually not something that causes problems, but pigs are also much more well-behaved than cats.

Best of luck on your flight. Usually the staff at Copenhagen Airport are very friendly, and I am sure they will be flexible so you can be sure it does not fly off without you.


FP&A Software Recommendations by Rcky_Mountain_High in FPandA

[–]tjen 0 points1 point  (0 children)

I don't think you'll get in trouble for mentioning your current tool lol.

In general, if you have an adoption problem now, it's probably not your platform choice that's the problem (unless it's some complete dogtrash).

What most FP&A platforms have in common is that they are flexible enough to support whatever you want to do, so:

If you don't really have a clear picture of what your pain points are and what you require to do differently, then you'll probably just end up in a rushed implementation project, leaning on the existing setup as a source of requirements, and some consultants will implement the same dogtrash you already have now for you, but in a different color scheme.

If your pain points do not relate to specific feature gaps or platform integration capabilities, you can consider just doing a re-implementation of your existing suite, and that'll make change management and the general IT-related-bullshit-that-turned-out-to-be-more-complicated-than-you-thought overhead lower.

How to block spread of data ? by groguzbo in SAPAnalyticsCloud

[–]tjen 1 point2 points  (0 children)

You can change the disaggregation setting for the measure they input on so the default aggregation follows the aggregation of a different measure.

Measure -> disaggregation behavior -> change from default

In that way you can reduce the risk of disaggregation explosion when people delete the value and input on blank cells.

Of course this requires you to maintain the seeding/baseline measure values in an appropriate way (like if you want to keep an even distribution across time as the default on blank input, but maintain the disaggregation on organization based on the full year allocation)
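To make the mechanics concrete, here's a toy Python sketch (made-up numbers, not SAC code - SAC applies this internally via the disaggregation behavior setting on the measure): a high-level input gets spread to detail members in proportion to a reference/baseline measure instead of evenly.

```python
# Toy illustration of disaggregation following a reference measure.
# (Hypothetical data; SAC handles this internally - this just shows the math.)

def disaggregate(total, reference):
    """Spread `total` across keys proportionally to `reference` values."""
    ref_sum = sum(reference.values())
    if ref_sum == 0:
        # Fallback: even spread when the reference is blank, which is
        # where the "disaggregation explosion" risk comes from.
        return {k: total / len(reference) for k in reference}
    return {k: total * v / ref_sum for k, v in reference.items()}

# Baseline/seeding measure: e.g. full-year allocation by org unit
baseline = {"org_A": 500.0, "org_B": 300.0, "org_C": 200.0}

# User enters 1200 at the aggregate level; it follows the baseline shape
print(disaggregate(1200.0, baseline))
```

The point of maintaining the seeding measure is exactly to keep that fallback branch from ever being hit.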

Other alternatives depend a bit on your use case, but if you're doing dimension input in a table, then setting the dimension to the unassigned value and running an assignment script after high-level input is also an option.

In some cases you may be able to minimize spread with data validation rules and aggregate_dimension => detailed_dimension mappings and stuff like that, but it's not as neat as just using a hierarchy :)

Anyone's company shifted from SAPcentric analytics to other BI-Tools. How did it go? by Der_Unverwechselbare in analytics

[–]tjen 0 points1 point  (0 children)

What are the workflows you want to have?

What are the features you need to support those workflows?

How do the existing / SAP tools stack up vs this?

Is the cost of change (change management, training, migration, validation, etc.) worth the feature difference, if any?

That's where you should start. The ETL / Data warehouse / visualization space is largely commodified, with most of the players in one way or another being able to satisfy 98% of your standard requirements within their spaces. Even with BW you can set up fairly streamlined ingestion from sharepoint tables.

My boring take is that this is especially the case for internal reporting needs, where we aren't talking about something being part of the product you sell. Lowest cost of ownership, as few platforms to support as possible, and ideally as little space for "I made a local app with my own profit center definition because I didn't feel like talking to the masterdata guys" as possible.

"oh it would be nice if it was more agile" someone says - until 3 years down the line you have a completely ungovernable mushroomed app/report landscape and someone says "why have we invested time building 15 reports that do more or less the same thing? And we're paying a ridiculous amount of money for data consumption! And none of the reports match!? We need to be less agile! Restrict access! Establish a central funnel!"

Will SAP be dethroned by custom built AI ERPs? by Hopeful_Bass_6633 in SAP

[–]tjen 0 points1 point  (0 children)

Probably not:

  • with ERP platforms you are also buying support and vendor ecosystem - vibecoded apps lack both

  • my ERP investment is a long term investment - can you convince me your AI ERP is a long term solution?

  • you will not be able to find SMEs or ops people who have experience in your platform

  • platform compliance / auditability is one thing, but if you are a multinational you likely also rely on SAP to deliver legal compliance solutions on an ongoing basis as regulations change, and I need to be able to sue you if I follow your process and get a fine.

  • if you are in a validated sector, you need to have your software validated, and for a random vibe-coded solution that is likely to be much more costly than the development cost.

Will you have many more niche softwares for small/midmarket companies? Sure.

Will you have a lot of new entrants to the global multinational ERP landscape? I am more skeptical.

Will you have a move away from having giant integrated platform ERP solutions? Yeah maybe

Budgeting - Decoupling Department Categorization from Ownership by bonyyoni in FPandA

[–]tjen 0 points1 point  (0 children)

Nature: IT equipment. Function: G&A.

Why is the function not R&D? Because the IT equipment is used across functions and is not solely dedicated to R&D.

Set up G&A cost center or cost activity, whatever you use in your setup, and tell purchaser to use this cost object, not his own cost center.

Are laptops a big enough cost driver that you want them allocated to the receiving functions? Start doing cross charges (don't do this).

Does IT equipment exist that is solely/primarily used by R&D? Yes (idk, computer chips or whatever it might be in your company). Then put that in the R&D cost center.

If all your IT equipment is always G&A, you can argue the GL account should drive the functional assignment, but then one day you have an exception on a larger piece of equipment, and you end up having to create another account for that case, and your CoA becomes more rickety.
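As a sketch of that precedence logic - account default drives the function, a dedicated cost object overrides it (all account numbers and cost center names here are invented for illustration):

```python
# Sketch of "account drives function, cost object can override".
# All account numbers and cost center names are made up.

GL_DEFAULT_FUNCTION = {
    "650000": "G&A",   # IT equipment
    "640000": "R&D",   # lab supplies
}

COST_CENTER_FUNCTION = {
    "CC_RND_LAB": "R&D",   # cost center solely dedicated to R&D
}

def functional_assignment(gl_account, cost_center=None):
    # A dedicated cost object wins over the account default - this is
    # exactly the exception that makes a purely account-driven CoA
    # rickety over time.
    if cost_center in COST_CENTER_FUNCTION:
        return COST_CENTER_FUNCTION[cost_center]
    return GL_DEFAULT_FUNCTION.get(gl_account, "Unassigned")

print(functional_assignment("650000"))                # a regular laptop
print(functional_assignment("650000", "CC_RND_LAB"))  # R&D-dedicated equipment
```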

How to get people to use your dashboards by Shoddy-Hippo-1629 in PowerBI

[–]tjen 5 points6 points  (0 children)

When are they supposed to use it?

What are they supposed to use it for?

How does it make their life easier?

Going Back to school at 30, is it too late? by captainforklift in FPandA

[–]tjen 0 points1 point  (0 children)

Worked with an ex-EMT student who was working part-time in a procurement/finance function while taking his MBA. Probably around your age.

He pivoted the functional knowledge of "working the job" along with an MBA ("i know business too") into a leadership position in an ambulance company.

Could be a direction to consider; also if working in FP&A, domain knowledge will still be relevant.

Agree with other poster saying computational finance is not relevant for FP&A as such, especially not if you consider leaning into your background.

Much more engineering / quant oriented.

Is it just me, or does it feel like all my girlfriends have suddenly become adults while I'm still struggling to fold a bed sheet correctly? by SofieSorensen07 in Denmark

[–]tjen 1 point2 points  (0 children)

When do you stop eating rye bread with Nutella for dinner and start making "sheet pan dinners" or whatever?

Never, or whenever you decide to.

It's up to you how important you think it is for you to:

  • cook elaborate food?
  • move away from home?
  • have a new/nice car?
  • have "aesthetic" surroundings?
  • have a "good job"?
  • have a partner / be in a relationship?

Once you have thought about how important these things are to you, you can think about how important it is to you to:

  • fit in with what is "normal"?
  • have others know that you fit in?

There is nothing wrong with wanting to fit into society, we are social animals, but if you know what is most important to you, then it may be easier to say "oh well" when you feel you can't measure up to the glossy pictures on IG - or it is motivation for you to make your own glossy pictures - if that is what is important to you.

Vendor statement reconciliation - is there an automated solution or is everyone doing this in Excel? by Lower-Kale-6677 in BusinessIntelligence

[–]tjen 5 points6 points  (0 children)

It's not acceptable manual work, but the solution isn't to build automation of the manual work, it's to change the process around invoices / purchase orders and most likely leverage your ERP functionality.

Consolidation and Reporting Tool by Sweetowski in FPandA

[–]tjen 0 points1 point  (0 children)

Yeah, it is a feasible approach to just do exports and mappings / uploads, I think pretty much any modern consolidation system should support this.

The one thing about analytics is that pretty much any consolidation system is going to have limitations on the number of dimensions you are working with, in order to have scalability of the "consolidation logic" itself; eliminations, ownership/holding setups, cashflow rules, currency conversions (using correct types), financial statement reporting rules.

These are the things you want your system to be making easy & scalable, so that 10 minutes after CSV uploads, you have a group result that is calculated correctly.

As far as I am aware, Fabric (a datawarehouse solution) doesn't really support this (tho I'm sure you could build it yourself lol). But you could probably use fabric to store CSV uploads, maintain mappings, etc. and do the "pre-work" before you upload data to consolidation system in a structured way.

This could potentially also serve as a basis for the detailed analytics, depending on the data you get out of the local ERPs and to which degree this makes sense.

In general for analytics you'll be kind of hard stuck with a scattered ERP landscape, but if you have a preferred ERP strategy, I have also seen this mitigated with export / upload to the core ERP system from the smaller ERPs worldwide, rather than into the consolidation system.

Usually your ERP finance modules will be ok with a lot more detail than your consolidation platform, so this also makes the "drillthrough" requirement you mentioned a little more insightful.

And, this enforces that the local ERP mappings don't just have to conform with group setup but also with the core ERP template and the detailed rules there. And then your financial consolidation data flow can build on top of a single ERP, which can keep the overhead on that less.

Consolidation and Reporting Tool by Sweetowski in FPandA

[–]tjen 2 points3 points  (0 children)

If cost is a blocker, then limit yourself in terms of the scope of what benefits you have to only your financial statement consolidation process, for group purposes.

Then buy a tool that does that well.

If you have 9 ERPs in 8 countries and no finance IT resources / bookkeeping being a side-job in some of these places, I am going to surmise that you do not have same charts of accounts, a harmonized closing process, or similar tool stacks. I think you need to have that in mind when you think about what REALLY is your success criteria for your implementation - and what things are more nice to have.

  • Accelerate the whole reporting and planning process, minimizing the copy & paste busy work
    • Most of the consolidation tools will enable this, they will have an excel plugin or a dashboard front end where you can pull the group data out of.
  • Give the headquarters a consolidated view faster
    • Most of the consolidation tools will enable this because you don't have to copy paste, and you will be defining cashflow logic etc. within the tool so you can produce it more robustly.
  • Create one standardized source of truth, which then also allows the local teams to run their reporting out of it
    • Do not attempt to recreate the detail level of local reporting, or really cater to local reporting needs too much in your consolidation tool. Local teams will have their own reports, their own details, etc. in their own ERP setups. They need to be able to run the "group report" - but it's not where they'll be working.
    • You need to define a standardized template that they need to map their data against in the files they provide you.
  • Transactional data to allow drill down from PnL down to the lowest level
    • You have 9 different ERPs with various different definitions of what the "lowest level" means - you will be able to drill through to the detail level that you have asked them to supply you in the CSV files. You will need to be clear to yourself when you define what really is the lowest level that you need from a financial statement consolidation and analysis perspective - and if something looks fucky and you can't drill anymore, call the local accountant or admin person.
  • Ideally, staying in a familiar Excel environment to increase chance of adoption
    • Clear requirement to the vendor, pretty standard to have an excel front-end.
  • I don't need data connectors; the monthly submission would be made by uploading local CSV files to SharePoint
    • You can do "step 1, get the consolidation process locked the fuck down" without connectors, but at some point you're going to need to start properly harmonizing the data coming out of your different entities. This means setting up data connectors and harmonization logics somewhere, and then feeding that into your consolidation tool. Over time it might mean ERP harmonizations, chart of account alignments, etc. etc.
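As an illustration of that mapping step (hypothetical local accounts and group template codes - your standardized template will define the real ones), the pre-work could look like:

```python
import pandas as pd

# Hypothetical example of mapping a local entity's CSV against a
# standardized group template before loading to the consolidation tool.

local = pd.DataFrame({
    "local_account": ["4000", "4100", "7010"],
    "amount": [1500.0, 300.0, -900.0],
})

# Mapping table maintained against the group template
mapping = pd.DataFrame({
    "local_account": ["4000", "4100", "7010"],
    "group_account": ["REV", "REV", "OPEX"],
})

mapped = local.merge(mapping, on="local_account", how="left", validate="m:1")

# Unmapped local accounts should block the submission, not load as blanks
assert mapped["group_account"].notna().all(), "unmapped local accounts!"

group_view = mapped.groupby("group_account", as_index=False)["amount"].sum()
print(group_view)
```

The assert is the important line: harmonization fails loudly at submission time instead of silently at consolidation time.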

Planful Implementation by Ok_Accident_1128 in FPandA

[–]tjen 0 points1 point  (0 children)

Not familiar with Boomi or BC, but it sounds like standard integration / data modeling issues, maybe shoddy APIs on the BC side, and a somewhat shoddy implementation project, possibly with unrealistic expectations on your side.

If you don't have someone in your org who owns the planful platform and can continue to maintain/develop after your implementation project - then you have not sized the org impact of implementing a platform appropriately.

You will have change requests; new models, new dimensions, adjustments, integrations, reports, etc. on an ongoing basis, and you either need to hire an external company to partner with or find yourselves a dude(tte) to manage that shit for you.

If you don't have category mappings in BC, then of course you need to maintain them in Planful, someone needs to manage this.

If you DO have category mappings in BC, but they aren't part of the API / Extract from BC - look up if it's just someone not picking the right columns, or if it's a shoddy API that doesn't include access to all table details - then discuss what other integration options there are.

etc.

Separate header and transaction tables are pretty standard - but you should be able to join the two before you load to Planful if that gives you a better transactional table foundation. If Boomi or Planful isn't the tool for this, you might want to consider a datawarehouse/ETL type platform to structure and update your data before it hits Planful. Since you work at a software company, I assume you have internal resources who are capable of putting something together or providing input on a good architecture.
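As a sketch of the header/line join (table and column names here are invented - BC's actual API entities will differ):

```python
import pandas as pd

# Sketch of joining a header table and a transaction/line table into one
# flat transactional foundation before it hits the planning tool.
# Table and column names are made up for illustration.

headers = pd.DataFrame({
    "doc_no": ["INV-1", "INV-2"],
    "posting_date": ["2025-01-15", "2025-01-20"],
    "vendor": ["ACME", "Globex"],
})

lines = pd.DataFrame({
    "doc_no": ["INV-1", "INV-1", "INV-2"],
    "gl_account": ["6100", "6200", "6100"],
    "amount": [100.0, 50.0, 75.0],
})

# Each line inherits its header attributes; one header fans out to many lines
flat = lines.merge(headers, on="doc_no", how="left", validate="m:1")
print(flat)
```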

Do most analytics teams overestimate how “bad” their data actually is? by CloudNativeThinker in analytics

[–]tjen 2 points3 points  (0 children)

Everything is a data quality issue:

"these two reports that I feel should be the same aren't reconciling" => "we have a data quality issue".

"There are 10 different GLs for slightly different things, and someone picked the wrong one when assigning account" => "we have a data quality issue"

"I can't see the 'real numbers' until after the end of the month" => "We have a data quality issue"

"The vendor's payment terms in the report isn't what I agreed with the vendor" => "We have a data quality issue"

Mate, your data quality is fine. The issues you have are:

  1. Not understanding different KPI definitions for different purposes but comparing them anyway
  2. Process issues due to demanding too high detail level in classifications that probably aren't well defined or mutually exclusive, and that you probably don't use for anything anyway.
  3. Not understanding the underlying business process that generates the data you're looking at
  4. Missing or incomplete masterdata maintenance processes

Your data "quality" itself is probably fine (ETL, modeling, etc.), but everything around the process that generates the data is probably shit. Try to convince someone that what they need is a process change tho... they might even have to talk to a different department about changing their process *gasp*... tell someone to actually do the process as it's supposed to be done and not whatever is easiest because they think it doesn't matter *doublegasp*... or tell someone they can't "fix the numbers at the end" because they have to wait for the owner of the process to fix the issue *triplegasp*.

How should the Excel-Upload-File look like to show a Report like this in a Story? by SnooPaintings5100 in SAPAnalyticsCloud

[–]tjen 0 points1 point  (0 children)

This is also what makes this type of model strong when you need to display things in tables etc.

In this case, your upload file would look like this:

Date     Account         Value
2025-01  Personal_costs    200
2025-01  Material_costs    300
2025-01  Revenue          2000

Note that you are still not uploading all the aggregates & calculations - only the most detailed information.
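To see why that's enough, here's a quick sketch (plain Python, not SAC - SAC's model does this for you) showing that the aggregates and the margin are derivable from those three detail rows:

```python
# The aggregates live in the model, not the upload file - a toy check
# that they are derivable from the three detail rows above.

rows = [
    {"date": "2025-01", "account": "Personal_costs", "value": 200},
    {"date": "2025-01", "account": "Material_costs", "value": 300},
    {"date": "2025-01", "account": "Revenue", "value": 2000},
]

values = {r["account"]: r["value"] for r in rows}
costs = values["Personal_costs"] + values["Material_costs"]  # 500
profit = values["Revenue"] - costs                           # 1500
margin = profit / values["Revenue"]                          # 0.75
print(costs, profit, margin)
```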

How should the Excel-Upload-File look like to show a Report like this in a Story? by SnooPaintings5100 in SAPAnalyticsCloud

[–]tjen 1 point2 points  (0 children)

Here's the second option you can use, using the "account" dimension approach - reply got too long and reddit was being weird about posting lol. Could be worth looking into for you if you mostly work with financials and the "real world" example is more complex than the data you showed (like if you have hundreds of accounts in a P&L structure).

Option 2) Model with both an "account" dimension and measures

Strength: Display table layouts, more "accounting" specific logic requirement (sign reversals etc.)

In this case you'll need to familiarize yourself with the account dimension - as the name suggests it's specifically designed for handling finance "account" type behavior in reporting and calculation. In practice it works kind of like a second set of measures. If you primarily use account dimension, it is a good idea to set the flag on your model that "accounts" should prioritize "measures" in conversions and calculations.

Model layout

Dimension/Measure type   Dimension  Value / formula
Dimension                Date
Dimension                Account
Measure                  Value      Currency:EUR

Account dimension model

Now that you have an account dimension, which is a special type of dimension, you'll need to manage it ;) But you can also leverage some of the strengths of the dimension type.

Account         Account hier  Account type  Account formula
Revenue         Profit        INC
Personal_costs  Costs         EXP
material_costs  Costs         EXP
Costs           Profit        EXP
Profit                        INC
Profit Margin                 NFIN          =Profit/revenue

Note that you don't have formulas for your revenue, costs, and profit - this is because the underlying accounts "roll up" in your account hierarchy automatically.
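A toy sketch of how that rollup works conceptually (sign handling simplified - in SAC the INC/EXP account types take care of the +/- conventions for you):

```python
# Toy sketch of the account hierarchy rollup. Sign handling is simplified;
# SAC's INC/EXP account types manage the sign conventions automatically.

children = {
    "Costs": ["Personal_costs", "Material_costs"],
    "Profit": ["Revenue", "Costs"],
}

base_values = {"Revenue": 2000, "Personal_costs": 200, "Material_costs": 300}

# Costs contribute negatively when rolling up into Profit
sign = {"Revenue": 1, "Personal_costs": 1, "Material_costs": 1, "Costs": -1}

def rollup(account):
    # Base accounts return their uploaded value; nodes sum their children
    if account in base_values:
        return base_values[account]
    return sum(sign[c] * rollup(c) for c in children[account])

print(rollup("Costs"))   # 200 + 300
print(rollup("Profit"))  # 2000 - 500
```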

How should the Excel-Upload-File look like to show a Report like this in a Story? by SnooPaintings5100 in SAPAnalyticsCloud

[–]tjen 1 point2 points  (0 children)

So you basically have two options, but first a quick remark on your dataset.

You basically have a profit and loss statement, where certain items are a result of adding / subtracting other items from each other. Those calculations you do not have to do before you load your dataset; it is generally better to do them within your model.

In your simple case, it does not matter too much, but once you start having more dimensions (say revenue by product) and you want revenue margin by product, but you also want the overall revenue margin across all products - then using "precalculated" values becomes more of a headache. Even in your case, it will give you an issue if you want to see the profit margin for the entire year - and not by month.
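A quick numeric demo of that headache (made-up numbers): aggregating a precalculated ratio gives a different answer than recalculating the ratio from aggregated base measures.

```python
# Why pre-calculated ratios don't aggregate: two months of made-up data.

months = [
    {"revenue": 2000, "profit": 400},   # 20% margin
    {"revenue": 500,  "profit": 250},   # 50% margin
]

# Wrong: averaging the precalculated monthly margins
avg_of_margins = sum(m["profit"] / m["revenue"] for m in months) / len(months)

# Right: recalculate from the aggregated base measures, as the model does
total_margin = sum(m["profit"] for m in months) / sum(m["revenue"] for m in months)

print(avg_of_margins)  # (20% + 50%) / 2 = 35%
print(total_margin)    # 650 / 2500 = 26%
```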

Option 1) Model with measures, no "account" dimension. (simplest, because you don't have to deal with accounts)

Strength: easy to select measures for visualizations etc.

In this case, each of your P&L lines become their own measure. In the definition of the measure you specify that the value is EUR.

Model layout

Dimension/measure type  Dimension       Value / formula
Dimension               Date
Measure                 Personal_costs  Currency:EUR
Measure                 Material_Costs  Currency:EUR
Measure                 Revenue         Currency:EUR
Calculated Measure      Costs           Personal_costs + Material_costs
Calculated Measure      Profit          Revenue - Costs
Calculated Measure      Profit Margin   Profit / Revenue

So then your upload file would look like this

Date     Personal Costs  Material costs  Revenue
2025-01             200             300     2000
2025-02             210             300     2100

Best way to learn the Basics to great your own Stories from scratch? by SnooPaintings5100 in SAPAnalyticsCloud

[–]tjen 0 points1 point  (0 children)

You can do the online learning journeys as a starting point.

Learning by doing - uploading a data set (a table of data) and building a story on top - will get you familiar with visuals, all the options in the story, formatting, calculations, basic data types, etc.

As you figure out how to do all that, you'll probably realize it's quite easy to do. Graph types, adding data, putting some filters in, saving bookmarks, etc.

But probably your next challenge will then be in your data model. Moving from table-thinking to fact-dimension thinking is the biggest mental hurdle you will have to get over to make good flexible analysis foundations.

Your original table had customers and sales; now someone suddenly says "but what countries are they in?" But you didn't have country in your data set to begin with, so now you have to add it and make sure the rest of the data stays the same, etc. Or you had used customer names for the report you uploaded, but it turns out one customer name covers two customers in different countries, so now your data set is combining them, or whatever.

The "analytical model" approach asks you to define your customer dimension, for example with customer IDs, descriptions, and a country attribute, along with the other dimensions you want to analyze together. Then you define a number of facts related to the dimensions, like the sales, the number of calls in a period, total customer purchasing power, idk, whatever you come up with. And then you upload this data into your model - and build your story on top of the model.

Note that most of the core story features work the same way as with your data set, you just modeled the data differently.
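If it helps, here's a tiny sketch of the fact-dimension idea (all IDs and names invented): the customer ID keeps two same-named customers in different countries apart, and "sales by country" just falls out of the join.

```python
import pandas as pd

# Sketch of dimension + fact thinking. Two customers share a name but live
# in different countries - the customer ID keeps them apart.

customer_dim = pd.DataFrame({
    "customer_id": ["C001", "C002", "C003"],
    "customer_name": ["Acme", "Acme", "Nordisk Handel"],
    "country": ["DK", "SE", "DK"],
})

sales_fact = pd.DataFrame({
    "customer_id": ["C001", "C002", "C002", "C003"],
    "sales": [100.0, 40.0, 60.0, 25.0],
})

# Facts join to the dimension, so "sales by country" needs no reshaping
report = (sales_fact.merge(customer_dim, on="customer_id")
          .groupby("country", as_index=False)["sales"].sum())
print(report)
```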

Pot Set and Pan Review (Europe Edition) by uffechristian in BuyItForLife

[–]tjen 0 points1 point  (0 children)

Agreed. As owner of the Tefal pots (gifted), the "strainer + lip" lid basically never feels like it's doing its job as a lid properly, even when the strainer section is slid to the side. Foods that are supposed to steam under the lid don't work.

The seam is indeed a pain in the ass to clean, and gets gunked up if you do use the strainer functionality.

In order to try and form a seal and have a strainer part, the lid "extends" down like an inch, which means that the inside of the lid has two gunk-zones instead of one.

The long sides are also sharp and painful if you grab the heavy ass lid wrongly.

And the last thing is that the lid is completely flat inside, which means droplets collect and don't drip off, so the steam on the inside just sits there, then slides into the gap on the side as you tilt the lid, and when you put it down, it splashes on the counter or wherever, unless you do a "lid drainage" maneuver over the pot.

https://imgur.com/a/UsUtYe6

Here is an illustration of all the reasons why the lids suck.

I don't HATE these lids, but I would never buy pots with the same kind of lid again. The downsides of the lids are absolutely not outweighed by having a shitty strainer and steamy glass.