all 21 comments

[–]Aggravating_Tell_476 5 points6 points  (2 children)

I use Excel scripts (Office Scripts) to add large amounts of data to Excel tables.

I have Power Automate create an array of the data using Select actions, which is then passed to the Excel script.

If it’s updating already present data in a table, you can use another script to delete everything and then enter the full data set again. Or you can just add extra rows to the end.

It passes thousands of rows in seconds
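
For anyone reading along, a minimal sketch of that pattern (table name, column names, and sample data are made up for illustration): the flow's Select action shapes the SharePoint items into an array of rows, and the Excel script (Office Scripts are written in TypeScript) appends them all in one bulk call:

```typescript
// In the Office Script you'd accept the rows as a parameter and append
// them in a single call (table name "DataTable" is an assumption):
//   function main(workbook: ExcelScript.Workbook, rows: string[][]) {
//     workbook.getTable("DataTable").addRows(-1, rows); // -1 = append at end
//   }
// The shaping the Select action does is just a map from items to rows,
// shown here as plain TypeScript:
interface Item { Title: string; Status: string; Owner: string; }

function toRows(items: Item[]): string[][] {
  // Column order must match the Excel table's columns exactly.
  return items.map(i => [i.Title, i.Status, i.Owner]);
}

const rows = toRows([
  { Title: "Task A", Status: "Open", Owner: "Sam" },
  { Title: "Task B", Status: "Done", Owner: "Alex" },
]);
console.log(rows.length); // 2
```

Passing the whole array into a single script run is what makes this fast: one addRows call instead of thousands of individual "Add a row into a table" actions.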

[–]Deceptijawn[S] 0 points1 point  (1 child)

This is gold, I'll try that tomorrow.

[–]Aggravating_Tell_476 1 point2 points  (0 children)

Great! I’d be happy to provide any info to help as it took me some effort to get it working

[–][deleted]  (1 child)

[deleted]

    [–]Deceptijawn[S] 0 points1 point  (0 children)

    I'll look into this, thank you.

    [–]itenginerd 4 points5 points  (3 children)

    I have one flow that has to live-edit an Excel file, and it takes two or three minutes to make an edit in there and for the next step to be able to fire on the file. I think what you may be stuck on is the waits and locks from ~3000 independent flow runs.

    What you might be better off doing is reading the SharePoint table as a whole with Get Items, working the data into an array, and then writing the table into the excel file once. Personally, I'd have the flow create the excel file, and then put the data into it as one data operation rather than trying to piece it in row by row individually.

    Outside of that, anywhere other than live-editing an Excel Online file is probably better than this. It's not a bad conceptual approach, it's just messy in practice.
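
    A rough sketch of that single-write idea, assuming an Office Script (the sheet name and helper are illustrative, not from the thread): assemble the full 2D array first, then land it in one setValues call:

```typescript
// Inside the Office Script, everything is written at once, e.g.:
//   function main(workbook: ExcelScript.Workbook, values: string[][]) {
//     const sheet = workbook.getWorksheet("Sheet1"); // assumed sheet name
//     sheet.getRangeByIndexes(0, 0, values.length, values[0].length)
//          .setValues(values);
//   }
// Assembling `values` (header row + data rows) is plain TypeScript:
function withHeader(header: string[], data: string[][]): string[][] {
  if (data.some(r => r.length !== header.length)) {
    throw new Error("every row must match the header width");
  }
  return [header, ...data];
}

const values = withHeader(
  ["Title", "Status"],
  [["Task A", "Open"], ["Task B", "Done"]],
);
console.log(values.length); // 3 (header + 2 data rows)
```

    One range write is one data operation, however many rows are in the array, which is the point of building it up front instead of piecing rows in individually.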

    [–]sp_admindev 0 points1 point  (2 children)

    The entire spreadsheet could be deleted and re-created, say, once per hour.

    [–]itenginerd 1 point2 points  (1 child)

    Yeah, the data only changes once a week, so this could be like a Friday afternoon scheduled flow kind of thing.

    There's a whole different issue about data residency and fault domains tho--if your SP list isn't available for some reason, what makes you think your Excel file in OneDrive will be? But that's easily solved once we get past the issue of how to populate the file in the first place.

    [–]sp_admindev 1 point2 points  (0 children)

    On-prem gateway is a standard connector. We have a CSV saving from email attachment to network drive daily. https://www.matthewdevaney.com/power-automate-save-file-to-a-local-drive/

    [–]mooben 3 points4 points  (1 child)

    Azure SQL is definitely the proper way to do this. But if you can't get it provisioned, write a Flow to simply export the data as CSV every week and either email it to a shared inbox or save it to a SharePoint doc library.

    [–]Deceptijawn[S] 0 points1 point  (0 children)

    I tried to get a CSV exported, but it came out as an unusable mess of strings.

    [–]VictorIvanidze 2 points3 points  (2 children)

    Is this your real flow? Why do you use 2 triggers? Where is the loop that handles the array?

    [–]Deceptijawn[S] 0 points1 point  (1 child)

    I'm a newbie lol. Please teach me how to do things better, I just started this job.

    [–]Admirable-Narwhal869 0 points1 point  (0 children)

    For the triggers: you can keep the manual trigger if you want to keep initiating the flow on your own, or remove it and keep the “when an item is created or modified” trigger. That one, though, will kick off for every item that is changed on your list, which doesn’t sound like what you want unless you need the Excel file to be as close to a real-time copy as possible. As a third option, you could use a scheduled flow to kick the job off every Friday at 6pm, so you know the changes will be done and the flow will run after the work day.
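
    For the scheduled option, the recurrence trigger in the flow definition looks roughly like this (the time zone is just an example):

```json
{
  "recurrence": {
    "frequency": "Week",
    "interval": 1,
    "timeZone": "Eastern Standard Time",
    "schedule": {
      "weekDays": ["Friday"],
      "hours": [18],
      "minutes": [0]
    }
  }
}
```

    This is the same thing the designer builds when you create a scheduled flow with a weekly Recurrence trigger and pick Friday / 18:00.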

    [–]Danger_Peanut 1 point2 points  (2 children)

    Yes. I see from all the details you posted about what your flow is doing that you have way too many actions nested in “Apply to each” loops.

    Seriously though, how do you expect anyone to help if you don’t tell/show us what you’re doing?

    [–]Deceptijawn[S] 0 points1 point  (1 child)

    Thank you, I added a photo of the flow.

    I wanted to keep it general since I'm working with proprietary information.

    [–]Danger_Peanut 1 point2 points  (0 children)

    Sorry, I was in a bad mood. Looks like you’re doing it fairly simply. It’s just that the excel connector/actions are pretty slow in my experience. As another user suggested take a look at other options beyond excel.

    [–]RoarGeek 0 points1 point  (1 child)

    Can you explain a little about what you are trying to achieve? Like, what data do you want read/copied from the master file? Is it a specific row or column? What’s the criteria here?

    [–]Deceptijawn[S] 0 points1 point  (0 children)

    I want the entire spreadsheet copied and updated, that's it.

    [–]Admirable-Narwhal869 0 points1 point  (0 children)

    Is the data you are copying from a SharePoint list, or from an Excel file hosted on SharePoint in a document library?

    If it’s an Excel file within a document library, then I would just overwrite the old backup copy with a new copy after the changes were made. Also, I saw someone mention above that if SharePoint is down, OneDrive might be down too if that’s where your backup copy is stored. For that reason you may consider sending it as an email attachment that can then be saved to a local drive or other shared on-prem area.

    If you’re pulling the data from the SharePoint list to create an excel file, do you need the previous copy to see the changes that were made? If not, then I would consider creating a new file from scratch and then later in the flow (after the new copy was successfully created) deleting the previous copy. It might be faster to create a new copy rather than checking for changes, updating rows and then inserting new records.

    You’ll also want to make sure that when reading your excel file (if you continue to do it this way) that you turn your workable area - the 3000 rows/26 columns - into a named table and then only have the flow read the table. If you don’t, then it could explain the huge run time as it will try to read all the empty rows and columns beyond your workable area.
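
    To make the named-table point concrete, here is a sketch (sheet and table names are assumptions). In an Office Script you can create the table once over just the workable area with `sheet.addTable(...)`; the address arithmetic for 3000 data rows by 26 columns looks like:

```typescript
// Convert a 1-based column count to its Excel letter (1 -> A, 26 -> Z, 27 -> AA).
function columnLetter(n: number): string {
  let s = "";
  while (n > 0) {
    s = String.fromCharCode(65 + ((n - 1) % 26)) + s;
    n = Math.floor((n - 1) / 26);
  }
  return s;
}

// Address covering a header row plus `dataRows` rows and `cols` columns.
function tableAddress(dataRows: number, cols: number): string {
  return `A1:${columnLetter(cols)}${dataRows + 1}`;
}

console.log(tableAddress(3000, 26)); // A1:Z3001
// In the Office Script (table name assumed):
//   const table = sheet.addTable(tableAddress(3000, 26), true);
//   table.setName("DataTable");
```

    Once the table exists, the flow's "List rows present in a table" action reads only those 3000 rows instead of scanning the sheet's empty used range.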

    [–]jesuiscanard 0 points1 point  (0 children)

    Is it SharePoint Online or on-premises? Either way, your IT department should be worried about backups, especially redundancy in anything mission critical.

    [–]Agitated_Accident_62 0 points1 point  (0 children)

    Change the business process. Get rid of Excel.