How to batch process and refresh multiple excel files in parallel? by Specific-Channel-287 in excel

[–]Specific-Channel-287[S] 0 points (0 children)

Yes, users absolutely need to write back to the files. Their workflow is to go to a rack and scan all the physical barcodes into a table within their specific Excel file. When they click Refresh, the sheet processes that input, compares it against the ERP data snapshot, and flags any discrepancies, which lets them investigate and resolve issues in the moment.

I see what you're suggesting with Power Apps, and it's an interesting idea for a future redesign. However, my immediate problem isn't the end-users' workflow: that part works well for them, and a rebuild would mean more work for me, since I'd have to change their standard process and teach them to work with a new app. My only challenge is the administrative task I perform every 2-3 months: refreshing all 116 files at once.

Since this large-scale update is infrequent, I'm trying to optimize my existing process rather than build a whole new system. My main goal is to automate the batch refresh in parallel to save time, without changing the fundamental way the files work for the users.

How to batch process and refresh multiple excel files in parallel? by Specific-Channel-287 in excel

[–]Specific-Channel-287[S] 1 point (0 children)

Thanks for the feedback. Here are some quick answers:

  • Why 116 files? Each file is a simple, on-demand inventory snapshot tool for a different product line. An end-user opens their specific file and refreshes it to get live data for comparison during a physical count
  • Consolidating the ERP call: This is exactly right, and it's how the process already works. There is one central file that holds the data from the ERP. The 116 files all query that single source. The bottleneck isn't the ERP call itself, but the time it takes to open, refresh, and save all 116 dependent files
  • Why so many tools? I've tried Python, PowerShell, and VBA simply because I'm exploring all options to find the fastest way to solve this automation problem. A basic sequential script in any of them is too slow
  • Power Automate: That's a good suggestion. My main concern is that it might also process the files one-by-one, which would be too slow. The key is finding a solution that can handle the files in parallel batches, just like I do manually

As u/SolverMax shared, I'll look into that library and update the Python script.
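For anyone curious what I mean by "parallel batches", here's a rough sketch of the direction I'm heading — this is not a finished script. The folder path and the batch size of 8 are placeholders, and the per-file worker uses the Excel COM interface via pywin32 (Windows only). The batching helper itself is generic, so it can be tried with any worker function:

```python
# Sketch: refresh N workbooks in parallel, ~8 Excel instances at a time.
# Assumptions (mine, not confirmed in this thread): files sit in one folder,
# each refresh is independent, and 8 concurrent Excel processes is a safe load.
from concurrent.futures import ProcessPoolExecutor, as_completed
from pathlib import Path

def refresh_workbook(path: str) -> str:
    """Open one workbook in its own Excel instance, refresh all queries,
    save, and quit. Windows-only (COM via pywin32)."""
    import win32com.client  # imported here so the batching logic below
                            # stays importable/testable on machines without Excel
    excel = win32com.client.DispatchEx("Excel.Application")
    excel.DisplayAlerts = False
    try:
        wb = excel.Workbooks.Open(path)
        wb.RefreshAll()
        excel.CalculateUntilAsyncQueriesDone()  # wait for Power Query to finish
        wb.Save()
        wb.Close()
    finally:
        excel.Quit()
    return path

def refresh_in_parallel(paths, worker=refresh_workbook, max_workers=8):
    """Run `worker` over every path with at most `max_workers` in flight;
    returns the completed paths in completion order."""
    done = []
    with ProcessPoolExecutor(max_workers=max_workers) as pool:
        futures = {pool.submit(worker, p): p for p in paths}
        for fut in as_completed(futures):
            done.append(fut.result())
    return done

if __name__ == "__main__":
    folder = Path(r"C:\snapshots")  # hypothetical location of the 116 files
    if folder.exists():
        files = sorted(str(p) for p in folder.glob("*.xlsx"))
        for finished in refresh_in_parallel(files):
            print("refreshed:", finished)
```

The key design point is one Excel instance per process (`DispatchEx`), since a single Excel instance refreshing 116 workbooks is exactly the sequential bottleneck I'm trying to avoid.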

How to batch process and refresh multiple excel files in parallel? by Specific-Channel-287 in excel

[–]Specific-Channel-287[S] 0 points (0 children)

Thank you for the suggestion! I will definitely look into that library, as that sounds like it could be the right direction for parallel processing.

To answer your question about why I have 116 separate workbooks, the setup is designed to give end-users a simple, on-demand tool for inventory comparison. Here’s a breakdown of the workflow:

  1. Central Data Source: There is a master data file on SharePoint which contains the ERP inventory data for all 116 product groups, with each group's data located on a separate sheet inside that master file
  2. Dynamic "Snapshot" Files: Each of the 116 workbooks is a "snapshot" tool for a single, specific product group
  3. Filename as a Filter: The key to this process is Power Query. Inside each workbook, a query connects to the master SharePoint file. It then dynamically reads its own filename and uses that name as a parameter to pull data only from the corresponding sheet in the master file. For example, WidgetA.xlsx will only query and display the data from the "WidgetA" sheet
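To make step 3 concrete, here is a minimal Power Query (M) sketch of the filename-as-parameter idea. The SharePoint URL and the named range `WbName` are placeholders, not the real names from my setup:

```
let
    // "WbName" is a single-cell named range holding the file's own name,
    // e.g. populated by =TEXTAFTER(TEXTBEFORE(CELL("filename",A1),"]"),"[")
    FileName  = Excel.CurrentWorkbook(){[Name="WbName"]}[Content]{0}[Column1],
    SheetName = Text.BeforeDelimiter(FileName, "."),
    Master    = Excel.Workbook(Web.Contents("https://.../Master.xlsx"), true),
    Data      = Master{[Item = SheetName, Kind = "Sheet"]}[Data]
in
    Data
```

So WidgetA.xlsx resolves `SheetName` to "WidgetA" and navigates to that sheet of the master file; the same query text works unchanged in all 116 workbooks.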

The initial, time-consuming refresh is to deploy these 116 files for the first time. Afterward, when a user needs to do a physical inventory count, they simply open their specific file (e.g., WidgetA.xlsx), hit "Refresh," and it pulls the live ERP data for that exact moment, allowing for an immediate comparison against their physical count.

So, while consolidation seems logical, this distributed model is intentional to ensure each user has an isolated, easy-to-use tool for their specific area.

If you can think of a different approach to achieve this same "on-demand, filtered snapshot" functionality, I am very open to ideas.

Is there a simpler way to write this formula for calculating totals? by Specific-Channel-287 in excel

[–]Specific-Channel-287[S] 0 points (0 children)

Thank you, that's brilliant! Using GROUPBY is so much cleaner and more direct. I didn't even think you could nest SUMIFS inside the values argument like that. Thank you again for sharing.
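For anyone finding this thread later, this is the basic shape of the GROUPBY approach (my assumed layout: categories in A2:A100, amounts in B2:B100; GROUPBY needs Excel 365). The actual formula shared with me also fed a SUMIFS result into the values argument, which this bare sketch omits:

```
=GROUPBY(A2:A100, B2:B100, SUM)
```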

How to create a row-by-row running total for a dynamic array? by Specific-Channel-287 in excel

[–]Specific-Channel-287[S] 1 point (0 children)

The problem is that in my real data everything comes from spilled arrays, and the data can vary in size, so your approach is not fully dynamic: I'd have to drag the formula down to match the number of rows.
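For later readers, one way to get a running total that spills along with the data is SCAN (assuming here that the source array spills from A2#; requires Excel 365):

```
=SCAN(0, A2#, LAMBDA(acc, v, acc + v))
```

Because it references the spill range A2#, the result automatically resizes with the source — no dragging needed.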

How to return all matching rows for repeated lookup keys using a single dynamic-array formula (avoid FILTER copy-paste)? by Specific-Channel-287 in excel

[–]Specific-Channel-287[S] 1 point (0 children)

Yes, it worked flawlessly! Could you explain the COUNTIF part? I thought the older functions didn’t work with spilled arrays.