1 min read · from Microsoft Excel | Help & Support with your Formula, Macro, and VBA problems | A Reddit Community
Best way to automate data refresh for multiple power queries pulling from web sources?
Our take
Automating data refresh for multiple Power Queries can significantly streamline your workflow, especially when the queries pull from several web sources on different schedules. Manually refreshing 15 queries is tedious, and a VBA macro that refreshes on open just trades that tedium for slow startup times. A scheduler-based approach, such as Power Automate or a script launched outside the workbook, can refresh on a fixed cadence without slowing down every open. Stable data sources make this kind of hands-off setup especially attractive, since the schedule rarely needs adjusting once it works.
I've got a workbook with about 15 Power Queries pulling data from different web sources. Some need to refresh daily, others weekly. Right now I'm just clicking Refresh All manually, but it's getting tedious. I tried setting up a VBA macro to refresh them on open, but it slows down startup a ton. Anyone found a solid way to schedule these refreshes or optimize the process? Considering Power Automate but not sure if it's overkill. Curious how others handle this since my data sources are pretty stable at this point.
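One pattern for the daily-vs-weekly split is to move the refresh out of `Workbook_Open` entirely and run it on a schedule from outside Excel (e.g. Windows Task Scheduler), so normal opens stay fast. The sketch below uses Python with the pywin32 package to drive Excel over COM. It assumes a hypothetical naming convention (connection names prefixed `daily_` / `weekly_`) and an illustrative Monday rule for weeklies; treat it as a starting point to adapt, not a drop-in solution.

```python
from datetime import date

# Hypothetical naming convention: prefix each query/connection name
# with its cadence, e.g. "daily_fx_rates" or "weekly_competitor_prices".
def is_due(connection_name: str, today: date) -> bool:
    """Decide whether a connection should be refreshed today."""
    if connection_name.startswith("daily_"):
        return True
    if connection_name.startswith("weekly_"):
        return today.weekday() == 0  # refresh weeklies on Mondays
    return False

def refresh_due_queries(workbook_path: str) -> None:
    """Open the workbook via COM, refresh only the connections that
    are due, save, and quit. Requires Excel and pywin32 on Windows;
    run it from Task Scheduler instead of a Workbook_Open macro."""
    import win32com.client
    xl = win32com.client.Dispatch("Excel.Application")
    xl.Visible = False
    wb = xl.Workbooks.Open(workbook_path)
    try:
        for conn in wb.Connections:
            if is_due(conn.Name, date.today()):
                # Note: Refresh() may return before a background query
                # finishes; for reliability, consider disabling background
                # refresh on each connection so the call blocks.
                conn.Refresh()
        wb.Save()
    finally:
        wb.Close(SaveChanges=False)
        xl.Quit()
```

The naming-prefix trick keeps the schedule logic in one place; renaming a query from `weekly_` to `daily_` changes its cadence with no code edits.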
Related Articles
- I have a Power Query in SharePoint that consolidates a folder with many other Excel files. How do I make it refresh automatically without opening it? Hi all, as my title states. The total rows add up to around 100k, up to 500k by the end of the year. It takes me roughly 5 to 10 minutes for the Power Query to refresh the entire dataset manually each time. This consolidated file is also used by many others, hence I was wondering if I can have it auto-refreshed, let's say first thing in the morning. submitted by /u/iTakoyaki
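If a Windows machine with Excel installed can reach the file (for example via a synced SharePoint folder), one low-tech route for "first thing in the morning" is a scheduled script that opens the workbook, refreshes, saves, and quits unattended. The sketch below is a minimal Python/pywin32 version; the task name, script path, and time are illustrative placeholders. For a fully serverless option with no machine involved, Power Automate with Office Scripts is the usual alternative.

```python
def schtasks_command(task_name: str, script_path: str, time_hhmm: str) -> str:
    """Build the Windows `schtasks` command that registers the refresh
    script as a daily job. All arguments here are illustrative."""
    return (
        f'schtasks /Create /SC DAILY /TN "{task_name}" '
        f'/TR "python {script_path}" /ST {time_hhmm}'
    )

def refresh_and_save(workbook_path: str) -> None:
    """Open the consolidated workbook, refresh everything, wait for the
    asynchronous Power Query refreshes to finish, then save and quit.
    Requires Excel and pywin32 on Windows."""
    import win32com.client
    xl = win32com.client.Dispatch("Excel.Application")
    xl.Visible = False
    wb = xl.Workbooks.Open(workbook_path)
    try:
        wb.RefreshAll()
        xl.CalculateUntilAsyncQueriesDone()  # block until queries complete
        wb.Save()
    finally:
        wb.Close(SaveChanges=False)
        xl.Quit()
```

Registering the job once (e.g. `schtasks_command("RefreshConsolidated", r"C:\jobs\refresh.py", "06:30")`, run in an elevated prompt) means nobody has to open the file for it to stay current.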
- How to refresh a table that is tied to a SharePoint form without going to the table every time? Hello! I have a couple of tables that are populated from forms that various team members fill out, and I bring this into a sales report via Power Query. For quite a long time, when I would refresh the PQ query that brings in the new data, the table would auto-refresh, but a while ago a change broke this automatic refresh (I think an MS update from a while ago?) and now I have to go to each table that has a form, open the SharePoint link, let it update, and then refresh my query to bring in the data. This is an improvement over a shared file that had too many issues with users entering wrong data, and has been the best solution for a while, but having to manually go to each table and refresh it daily is driving me crazy. What are my options to get this to refresh automatically if it's within a form? I tried to dip my toe into Power Automate, but for some reason these forms/tables do not show up within it and I have not had success using that (but I could also be doing it wrong). submitted by /u/Moudy90
- Power Query refresh speed with multiple users. Hello all. I have a report that has multiple queries with data sources saved in SharePoint. "Enable fast data load" is checked in all the queries. There are about 10 users, with reported run times of around 5-10 minutes, but for me and one other user it takes 30-60 minutes to finish refreshing. I have already made some improvements to the initial code and cleaned up old raw data files in SharePoint, but this resulted in only minimal improvement. Given that all users have the same laptop configuration and are connected to the same internet connection, what steps can I take to improve the speed of the queries? submitted by /u/SYSTEMOFADAMN
- How to batch process and refresh multiple Excel files in parallel? Hi all, I'm looking for an efficient way to automate refreshing 116 Excel files located in a single directory. Each of the 116 files runs a data query to an ERP that takes 40-60 seconds to complete. My current scripts (in Python, PowerShell, and VBA) process the files sequentially, so the total time is roughly 116 files × 1 minute/file ≈ 2 hours, which is too slow. My manual process is much faster (20-40 minutes total) because I process files in batches: I open a batch of about 14 files at once, trigger "Refresh All" on each of them, and since the queries run in the background, by the time I've triggered the last file the first ones are nearly done. I then go through the batch, saving and closing each file, and repeat for the next batch until all 116 files are done. How can I create a script (ideally in Python or PowerShell) that mimics this parallel, batch-based approach? I need a solution that can manage multiple files concurrently to be faster than my manual method, instead of processing them one by one. The script must wait for all data queries to finish refreshing before it saves and closes the files in a batch. submitted by /u/Specific-Channel-287
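The manual batching described above maps fairly directly onto a Python/pywin32 script: open a batch, fire `RefreshAll` on each workbook so the background queries overlap, wait once for all async queries in the Excel instance to finish, then save and close the batch. The sketch below assumes the ERP queries refresh as background (asynchronous) connections, which is what lets `CalculateUntilAsyncQueriesDone()` act as the "wait for the whole batch" step; connection types that refresh synchronously would need different handling.

```python
from typing import List

def batched(items: List[str], size: int) -> List[List[str]]:
    """Split the file list into batches (e.g. 116 files -> batches of 14)."""
    return [items[i:i + size] for i in range(0, len(items), size)]

def refresh_in_batches(files: List[str], batch_size: int = 14) -> None:
    """Mimic the manual process: open a batch, trigger RefreshAll on
    each workbook (background queries then run concurrently), wait
    once for all async queries to finish, then save and close the
    batch. Requires Excel and pywin32 on Windows."""
    import win32com.client
    xl = win32com.client.Dispatch("Excel.Application")
    xl.Visible = False
    xl.DisplayAlerts = False  # suppress save/overwrite prompts
    try:
        for batch in batched(files, batch_size):
            books = [xl.Workbooks.Open(path) for path in batch]
            for wb in books:
                wb.RefreshAll()              # queries start in the background
            xl.CalculateUntilAsyncQueriesDone()  # wait for the whole batch
            for wb in books:
                wb.Save()
                wb.Close(SaveChanges=False)
    finally:
        xl.Quit()
```

With 40-60 second queries and batches of 14, the wall-clock time is driven by the slowest query in each batch rather than the sum, which is exactly why the manual batched workflow beats the sequential scripts.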
Tagged with
#data refresh #power queries #VBA macro #scheduled refreshes