Excel performance optimisation tips!
Our take
Working in demand planning, I've got to the point where I'm making some pretty advanced files using a suite of techniques.
My files often have lots of rows, with lots of columns of complex formulas including SUMIFS, XLOOKUP, IFS & LET. I've not advanced to using Tables regularly though, as I find the constraints & syntax annoying, but I'm trying to get there & have started using Power Query to blend data for output analysis.
The problem I'm encountering is that I filter a lot, drilling down into product groups etc., & Excel tends to hang frequently with 'Not Responding'.
Now I'm not sure whether it's due to an underpowered machine (an Intel Core i7 HP EliteBook) or, more likely, lots of complex formulas referencing ranges or tables.
My question to the hive brain: share your optimisation tips & tricks!
- Can LAMBDA combined with LET speed things up?
- Are Tables vital to speeding up complex SUMIFS & lookups?
- Are MATCH helper columns combined with INDEX leaner & faster than XLOOKUP?
Hit me with best tips & tricks!
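To make the LET question concrete, here's a minimal sketch of the pattern; the table and column names (tblSales, [Product], [Qty]) are made up for illustration:

```excel
' Without LET, the same SUMIFS is evaluated three times per cell:
=IF(SUMIFS(tblSales[Qty], tblSales[Product], A2) > 100,
    SUMIFS(tblSales[Qty], tblSales[Product], A2) * 0.9,
    SUMIFS(tblSales[Qty], tblSales[Product], A2))

' With LET, it is evaluated once and reused by name:
=LET(qty, SUMIFS(tblSales[Qty], tblSales[Product], A2),
     IF(qty > 100, qty * 0.9, qty))
```

LET doesn't change what a formula computes, only how many times the shared sub-expression is evaluated, so the gain scales with how often the same lookup is repeated inside one formula; LAMBDA then lets you name and reuse the whole pattern across cells.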
Related Articles
- I've been using Excel more lately and I'm trying to understand some of its more advanced features without making everything overly complicated. When working with data that has multiple conditions or needs to update automatically, what are the most efficient functions or tools to use? For example, is it better to rely on formulas like XLOOKUP and FILTER, or are there built-in tools that handle this more cleanly? Also, how does Excel handle performance when formulas start getting longer or more complex? Is there a point where using too many formulas slows things down significantly? What are the best built-in features in Excel for handling complex data in a simple way? submitted by /u/icepix
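A hedged sketch of the XLOOKUP-versus-FILTER distinction the post asks about, with a hypothetical Orders table ([OrderID], [Amount], [Region], [Year]):

```excel
' XLOOKUP: returns the first match on a single key
=XLOOKUP(A2, Orders[OrderID], Orders[Amount], "not found")

' FILTER: returns every row satisfying multiple conditions,
' combined by multiplying the Boolean arrays
=FILTER(Orders[Amount],
        (Orders[Region] = "West") * (Orders[Year] = 2026),
        "no rows")
```

Roughly: XLOOKUP for one value per key, FILTER for all matching rows; both recalculate automatically when their source ranges change.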
- How to deal with a bulky spreadsheet that is starting to hit the limits of Excel? Hello all, I have been venturing on quite the Excel journey the past year or so. I made a corporate spreadsheet that is approaching 500k formulas and is starting to have serious speed issues at this point. It is 2026, so I conversed with ChatGPT several times regarding the speed issue, but realised I am way better off asking the experts here anyway.
What is the problem? My spreadsheet imports flat databases with specific information regarding objects that need further analysing. The imported flat databases run from, say, A to CC or so, from which I draw about 12-15 data fields that are used for further analysis. It may be more in the future. Afterwards, said data gets 'enriched' (manually) with things that aren't in the database, also because said data needs a human eye and cannot be automated. So far, so good.
Right now, each object gets analysed from several different angles. As it stands, my spreadsheet runs from A until NA or so on the Formula Page. Many columns receive data from preceding columns, which are in turn the result of many (slightly complex) logical IF or IFS tests, many of which are nested 3 or 4 deep. Often they work in conjunction with XLOOKUP to retrieve values, as the columns on the Formula Page are not equal. For example: A until BC on the Formula Page may analyse 150 objects, BD until DD may analyse 100 objects (from the same dataset, so narrower), and so forth. Thus a lot of XLOOKUP is required, also because the first 'block' comes up with values that need to be found with XLOOKUP. Also, values need to be retrieved from the flat database 'import' page with XLOOKUP. Finally, XLOOKUP is an insurance compared to FILTER, as I am not fully convinced that empty values in the flat database always contain a space (" ").
To get to the point: I use many IF, IFS, AND and, if need be, OR formulas. Think tens of thousands, probably in excess of 100k. These are compounded with XLOOKUP, or XLOOKUP gets used copiously on its own. Here too, think tens of thousands. These formulas are, as much as possible, in array format, even though I find that controversial, considering how it can create a chain of updates throughout the spreadsheet. 'Dependencies' is the name of the game, with one object receiving many possible alterations / adjustments due to manual input data, for which the spreadsheet needs to provide. Right now, when I update a value, it may take up to 4 seconds to recalculate, which is already beyond the annoyance point for me.
This leads me to these (hopefully) simple questions: Is it smart to use array formulas, knowing that each change should only impact that one object line (for example, row 488) and none other? It is important to mention that object 1 does not influence object 488, or any other; any manual data field only affects the object in the row it is in. In my mind, array formulas do not make sense in that regard, as they can result in a cascade of updates, but apparently array formulas are 'way more efficient'. Is use of a VBA library the way to go to reduce lag and make the spreadsheet feel instant again? I am not able to code in VBA yet, but I am slowly learning it regardless. Alternatively: should I use LET whenever a repeated lookup is needed in the same formula? Really looking forward to your answers! submitted by /u/EvolvedRevolution
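One pattern worth considering for the repeated-lookup question (all names here are hypothetical): do the expensive MATCH once in a helper column, then read multiple fields with cheap INDEX calls, instead of repeating a full XLOOKUP per column:

```excel
' Helper column E: locate the object's row in the import table once
E2: =MATCH($A2, Import[ObjectID], 0)

' Downstream columns reuse that row number; each INDEX is a direct read
F2: =INDEX(Import[Status], $E2)
G2: =INDEX(Import[Owner], $E2)
H2: =INDEX(Import[Value], $E2)
```

Each separate XLOOKUP repeats the search over the lookup column; the helper-column version searches once per row regardless of how many fields are pulled from it.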
- Slow spreadsheet - need troubleshooting. Hi, I have a spreadsheet that has two tabs: one is essentially the original data, which is YTD-driven for a particular GL account. The company has a smaller volume of transactions, so by December we are talking about maybe 3-5k rows of transactions for the account total. The main tab being utilized has about 30 columns of lookup and SUMIFS formulas referencing the source data, and in total approx maybe 500 rows by year end? To me it doesn't seem excessive. I've dealt with way heavier spreadsheets that have more oomph and run faster, but for some reason this one is slow as all hell to work in. I've even tried hardcoding some data and not seen any improvement. I'm not too techy about what else could be slowing it down. Any ideas on what to troubleshoot from here? submitted by /u/SlideTemporary1526
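One common culprit worth checking in a file like this is whole-column references inside SUMIFS, which can make every call scan the entire 1,048,576-row column. A sketch, with hypothetical sheet and table names:

```excel
' Slow: whole-column references cover every row in the sheet
=SUMIFS(Source!C:C, Source!A:A, $A2, Source!B:B, $B2)

' Faster: a Table (or bounded range) covers only rows that hold data
=SUMIFS(tblSource[Amount], tblSource[Account], $A2, tblSource[Month], $B2)
```

Other usual suspects: volatile functions (OFFSET, INDIRECT, TODAY, NOW) anywhere in the dependency chain, and conditional formatting rules applied to very large ranges.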
- How do I Maximize File Efficiency? I work with data sets that I typically look at as a forecast by year. Currently, when I look at 2026 and 2027 it is roughly 1.4M lines of data, so I have to put these in two separate data pulls and tables. Then I have six different customers included in this data, so I have to create 6 tabs with six different pivot tables for them to look at. This has created a massive file that lags just to open, save or close, so I really have two questions and am open to suggestions. Would it be better to store the data in one worksheet and then link a second worksheet that just has the pivot tables and separated look? If so, how would I create that link? Can you explain to me like I am 5 how I would use Power Query to combine the '26 and '27 tables so that they could be in the same pivot table? Every column in both is identical. submitted by /u/dcal69
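For the combine question: in Power Query the operation is called Append (load both tables with Data > From Table/Range, then Home > Append Queries in the editor), and since every column is identical it needs no extra configuration. If a formula route is acceptable instead, VSTACK can do the same stacking; the table names here (tblFY26, tblFY27) are made up for illustration:

```excel
' Stack the headers once, then the data rows of both tables,
' into one spill range that a pivot (via a named range) can read
=VSTACK(tblFY26[#Headers], tblFY26, tblFY27)
```

At 1.4M rows the Power Query route is generally the better fit, since an appended query can feed a PivotTable through the Data Model (load as connection only, add to Data Model) without the combined data ever landing on a worksheet.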