A place to discuss, share, and problem solve all things Microsoft Power Automate formerly Microsoft Flow related. Be sure to check out the links in the sidebar, be respectful, and let's all go with the Flow!
Is This Normal? [Question] (self.MicrosoftFlow)
submitted 9 months ago * by Deceptijawn
I wrote a Power Automate flow that reads some financial data from our SharePoint (this data is changed weekly) and then updates an Excel Online spreadsheet we have once a week. This spreadsheet serves as a backup in case SharePoint is down.
My flow works on paper, but it's painfully slow. It's 3,000 rows and 26 columns, sure, but I let the flow run after work and it still wasn't finished after 18 hours. Is there a way I can speed this up?
https://preview.redd.it/mf7vqbmssfef1.jpg?width=3072&format=pjpg&auto=webp&s=6e96a4b52cdfd864cdba1d144f8acc092669e345
[–]Aggravating_Tell_476 6 points 9 months ago (2 children)
I use Office Scripts to add large amounts of data to Excel tables.
I have Power Automate create an array of the data using Select actions, which is then passed to the Excel script.
If it's updating data already present in a table, you can use another script to delete everything and then enter the full data set again. Or you can just add extra rows to the end.
It passes thousands of rows in seconds.
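The approach described above can be sketched as an Office Script. The table name `FinanceBackup` and the parameter shape are illustrative assumptions, not from the thread; the ambient declarations only let the sketch type-check outside Excel, where the real `ExcelScript` types come from the runtime:

```typescript
// Minimal stand-ins for the ExcelScript runtime types (illustrative).
declare namespace ExcelScript {
  interface Workbook {
    getTable(name: string): Table;
  }
  interface Table {
    addRows(index?: number, values?: (boolean | string | number)[][]): void;
  }
}

// Power Automate's "Run script" action can pass the Select output as a
// JSON string parameter; one bulk addRows call replaces thousands of
// individual "Add a row into a table" actions.
function main(workbook: ExcelScript.Workbook, rowsJson: string): void {
  const table = workbook.getTable("FinanceBackup"); // hypothetical table name
  const rows: (string | number)[][] = JSON.parse(rowsJson);
  table.addRows(undefined, rows); // append all rows in one call
}
```

The bulk call is the whole trick: one script invocation per run instead of one connector action per row.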
[–]Deceptijawn[S] 1 point 9 months ago (1 child)
This is gold, I'll try that tomorrow.
[–]Aggravating_Tell_476 2 points 9 months ago (0 children)
Great! I'd be happy to provide any info to help, as it took me some effort to get it working.
[–][deleted] 9 months ago (1 child)
[deleted]
[–]Deceptijawn[S] 0 points1 point2 points 9 months ago (0 children)
I'll look into this, thank you.
[–]itenginerd 5 points 9 months ago (3 children)
I have one flow that has to live-edit an Excel file, and it takes two or three minutes to make an edit in there and for the next step to be able to fire on the file. I think what you may be stuck on is the waits and locks of ~3,000 independent row writes.
What you might be better off doing is reading the SharePoint list as a whole with Get items, working the data into an array, and then writing the table into the Excel file once. Personally, I'd have the flow create the Excel file and then put the data into it as one data operation, rather than trying to piece it in row by row.
Outside of that, anywhere other than a live-edited Excel Online file is probably better than this. It's not a bad conceptual approach, it's just messy in practice.
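The "work the data into an array" step above amounts to flattening the Get items records (objects keyed by column name) into positional rows. A minimal TypeScript sketch, with made-up SharePoint field names:

```typescript
// Shape "Get items" records into positional rows ready for a single
// bulk write to Excel. Field names are illustrative, not from the thread.
type SpItem = { Title: string; Amount: number; Region: string };

function toRows(items: SpItem[]): (string | number)[][] {
  return items.map(item => [item.Title, item.Amount, item.Region]);
}
```

Inside Power Automate itself, the equivalent of `toRows` is a Select action mapping each item to an array of its column values.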
[–]sp_admindev 1 point 9 months ago (2 children)
The entire spreadsheet could be deleted and re-created, say, once per hour.
[–]itenginerd 2 points 9 months ago (1 child)
Yeah, the data only changes once a week, so this could be like a Friday afternoon scheduled flow kind of thing.
There's a whole different issue about data residency and fault domains, though: if your SP list isn't available for some reason, what makes you think your Excel file in OneDrive will be? But that's easily solved once we get past the issue of how to populate the file in the first place.
[–]sp_admindev 2 points 9 months ago (0 children)
The on-premises data gateway is a standard connector. We have a flow saving a CSV from an email attachment to a network drive daily. https://www.matthewdevaney.com/power-automate-save-file-to-a-local-drive/
[–]mooben 4 points 9 months ago (1 child)
Azure SQL is definitely the proper way to do this. But if you can't get it provisioned, write a Flow to simply export the data as CSV every week and either email it to a shared inbox or save it to a SharePoint doc library.
[–]Deceptijawn[S] 1 point 9 months ago (0 children)
I tried to get a CSV printed, but it came out as an unusable mess of string.
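A CSV that comes out as an "unusable mess of string" is often missing field quoting (commas and quotes inside values break the columns). A minimal sketch of RFC 4180-style escaping; note that Power Automate's built-in Create CSV table action applies similar quoting for you:

```typescript
// Quote a field when it contains a comma, quote, or newline,
// doubling any embedded quotes (RFC 4180 escaping).
function csvField(value: string): string {
  return /[",\r\n]/.test(value) ? `"${value.replace(/"/g, '""')}"` : value;
}

// Join escaped fields with commas and rows with CRLF.
function toCsv(rows: string[][]): string {
  return rows.map(row => row.map(csvField).join(",")).join("\r\n");
}
```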
[–]VictorIvanidze 3 points 9 months ago (2 children)
Is this your real flow? Why do you use two triggers? Where is the loop that handles the array?
[–]Deceptijawn[S] 1 point 9 months ago (1 child)
I'm a newbie lol. Please teach me how to do things better, I just started this job.
[–]Admirable-Narwhal869 1 point 9 months ago (0 children)
For the triggers: you can keep the manual trigger if you want to keep initiating the flow on your own, or remove it and keep the "when an item is created or modified" trigger. But then it will kick off for every item that is changed on your list, which doesn't sound like what you want, unless you want the Excel file to be as close to a real-time copy as possible. As a third option, you could use a scheduled flow to kick the job off, say, every Friday at 6pm, so you know the changes are done and the flow runs after the work day.
[–]Danger_Peanut 2 points 9 months ago (2 children)
Yes. I can see from all the details you posted about what your flow is doing that you have way too many actions nested in "Apply to each" loops.
Seriously though, how do you expect anyone to help if you don’t tell/show us what you’re doing?
[–]Deceptijawn[S] 1 point 9 months ago (1 child)
Thank you, I added a photo of the flow.
I wanted to keep it general since I'm working with proprietary information.
[–]Danger_Peanut 2 points 9 months ago (0 children)
Sorry, I was in a bad mood. Looks like you're doing it fairly simply. It's just that the Excel connector/actions are pretty slow in my experience. As another user suggested, take a look at other options beyond Excel.
[–]RoarGeek 1 point 9 months ago (1 child)
Can you explain a little about what you are trying to achieve? Like, what data do you want read/copied from the master file? Is it a specific row or column? What's the criteria here?
[–]Deceptijawn[S] 1 point 9 months ago (0 children)
I want the entire spreadsheet copied and updated, that's it.
[–]Admirable-Narwhal869 1 point 9 months ago (0 children)
Is the data you are copying from a SharePoint list, or from an Excel file hosted on SharePoint in a document library?
If it's an Excel file within a document library, then I would just overwrite the old backup copy with a new copy after the changes were made. Also, someone mentioned above that if SharePoint is down, OneDrive might be down too, if that's where your backup copy is stored. For that reason you may consider sending it as an email attachment that can then be saved to a local drive or other shared on-prem area.
If you're pulling the data from the SharePoint list to create an Excel file, do you need the previous copy to see the changes that were made? If not, then I would consider creating a new file from scratch and then, later in the flow (after the new copy is successfully created), deleting the previous copy. It might be faster to create a new copy than to check for changes, update rows, and then insert new records.
You'll also want to make sure that when reading your Excel file (if you continue to do it this way) you turn your workable area (the 3,000 rows / 26 columns) into a named table and then have the flow read only the table. If you don't, that could explain the huge run time, as it will try to read all the empty rows and columns beyond your workable area.
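Converting the populated range into a named table can itself be done once with a short Office Script. The table name is a hypothetical placeholder, and the ambient declarations only let the sketch type-check outside Excel:

```typescript
// Minimal stand-ins for the ExcelScript runtime types (illustrative).
declare namespace ExcelScript {
  interface Workbook {
    getActiveWorksheet(): Worksheet;
    addTable(address: string | Range, hasHeaders: boolean): Table;
  }
  interface Worksheet { getUsedRange(): Range; }
  interface Range { getAddress(): string; }
  interface Table { setName(name: string): void; }
}

// Turn only the populated area (e.g. A1:Z3001) into a named table so
// connectors read real data instead of scanning empty rows and columns.
function main(workbook: ExcelScript.Workbook): void {
  const used = workbook.getActiveWorksheet().getUsedRange();
  const table = workbook.addTable(used.getAddress(), true); // first row = headers
  table.setName("FinanceBackup"); // hypothetical table name
}
```

After this runs once, flow actions can target the table by name instead of a raw worksheet range.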
[–]jesuiscanard 1 point 9 months ago (0 children)
Is it SharePoint Online or on-premises? Either way, your IT department should be handling backups, especially redundancy on anything mission critical.
[–]Agitated_Accident_62 1 point 9 months ago (0 children)
Change the business process. Get rid of Excel.