How to check and improve performance on a workbook
So last week my colleague was working on some very simple report files, but her laptop was getting unusually slow. We assumed it was suddenly her laptop having issues, and it didn't help that she also had a few workbooks open at the same time.
Now this week I started using the same files and I'm noticing a similar slowdown, so I'm wondering what checks and debugging we can do to diagnose performance.
These are old report files; my first assumption is that they have a few connections that aren't in use anymore, or named ranges that overlap. Overall each file is simple and self-contained, but the issue could arise when we add tabs from other similar reports that keep the same named range names but still reference the other file. (Think of a file that calculates a country's cost and revenue, where we add another country's tab to do a comparison.)
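Since .xlsx files are just zip archives, one way to audit leftover connections and external named ranges without even opening Excel is to read `xl/workbook.xml` directly. A minimal stdlib sketch (the function name is mine; external links show up in a defined name's reference as a bracketed workbook index or file name):

```python
import re
import zipfile
import xml.etree.ElementTree as ET

NS = "{http://schemas.openxmlformats.org/spreadsheetml/2006/main}"

def audit_workbook(path):
    """List defined names in an .xlsx, flagging ones that reference
    another workbook, and report whether any data connections exist."""
    names = []
    with zipfile.ZipFile(path) as zf:
        root = ET.fromstring(zf.read("xl/workbook.xml"))
        for dn in root.iter(f"{NS}definedName"):
            ref = dn.text or ""
            # External links look like [1]Sheet1!$A$1 or '[Other.xlsx]Sheet1'!...
            external = bool(re.search(r"\[[^\]]+\]", ref))
            names.append((dn.get("name"), ref, external))
        # This part is only present when the workbook carries data connections
        has_connections = "xl/connections.xml" in zf.namelist()
    return names, has_connections
```

Running `audit_workbook("report.xlsx")` (file name hypothetical) returns `(name, refers_to, is_external)` tuples, so names pointing at another file stand out immediately.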
I guess I will start by deleting those connections and named ranges, but I'm wondering what other checks are commonly done and how you profile performance (i.e. find which sheet or formula is taking the most compute).
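On "which sheet is taking the most compute": a crude but quick proxy is to count formula cells per worksheet part inside the .xlsx, since a sheet with tens of thousands of `<f>` elements is usually the recalculation hotspot. A stdlib sketch (mapping part names like `sheet1.xml` back to tab names takes an extra lookup in the workbook relationships, omitted here; for real timings, switch Excel to manual calculation and time a per-sheet recalc):

```python
import zipfile
import xml.etree.ElementTree as ET

NS = "{http://schemas.openxmlformats.org/spreadsheetml/2006/main}"

def formula_counts(path):
    """Count <f> (formula) cells in each worksheet part of an .xlsx.

    Returns {worksheet part name: formula count}, heaviest first,
    as a rough proxy for recalculation cost per sheet.
    """
    counts = {}
    with zipfile.ZipFile(path) as zf:
        for part in zf.namelist():
            if part.startswith("xl/worksheets/") and part.endswith(".xml"):
                root = ET.fromstring(zf.read(part))
                counts[part] = sum(1 for _ in root.iter(f"{NS}f"))
    return dict(sorted(counts.items(), key=lambda kv: -kv[1]))
```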
Related Articles
- Slow spreadsheet - need troubleshooting: Hi, I have a spreadsheet that has two tabs; one is essentially the original data, which is YTD driven for a particular GL account. The company has a smaller volume of transactions, so by December we are talking about maybe 3-5k rows of transactions for the account in total. The main tab being utilized has about 30 columns of lookup and SUMIFS formulas referencing the source data, and in total approximately 500 rows by year end. To me it doesn’t seem excessive; I’ve dealt with way heavier spreadsheets that have more oomph and run faster. But for some reason this one is slow as all hell to work in. I’ve even tried hardcoding some data and not seen any improvement. I’m not too techy about what else could be slowing it down. Any ideas on what to troubleshoot from here? submitted by /u/SlideTemporary1526
- How to deal with a bulky spreadsheet that is starting to hit the limits of Excel? Hello all, I have been venturing on quite the Excel journey the past year or so. I made a corporate spreadsheet that is approaching 500k formulas and is starting to have serious speed issues at this point. It is 2026, so I conversed with ChatGPT several times regarding the speed issue, but realized I am way better off asking the experts here anyway. What is the problem: my spreadsheet imports flat databases with specific information regarding objects that need further analysing. The imported flat databases run from, say, A to CC or so, from which I probably draw about 12-15 data fields that are used for further analysis. It may be more in the future. Afterwards, said data gets 'enriched' (manually) with things that aren't in the database, also because said data needs a human eye and cannot be automated. So far, so good. Right now, each object gets analysed from several different angles. As it stands, my spreadsheet runs from A until NA or so on the Formula Page. Many columns receive data from preceding columns, which are in turn the result of many (slightly complex) logical IF or IFS tests, many of which are nested 3 or 4 deep. Often they work in conjunction with XLOOKUP to retrieve values, as the columns on the formula page are not equal. For example: A until BC on the Formula Page may analyse 150 objects, BD until DD may analyse 100 objects (from the same dataset, so narrower), and so forth. Thus a lot of XLOOKUP is required, also because the first 'block' comes up with values that need to be found with XLOOKUP. Also, values need to be retrieved from the flat database 'import' page with XLOOKUP. Finally, XLOOKUP is an insurance compared to FILTER, as I am not fully convinced that empty values in the flat database always contain a space (" "). To get to the point: I use many IF, IFS, AND, and if need be, OR formulas.
Think: tens of thousands, probably in excess of 100k. These are compounded with XLOOKUP, or XLOOKUP gets used copiously on its own. Here too, think tens of thousands. These formulas are, as much as possible, in array format, even though I find that controversial, considering how it can create a chain of updates throughout the spreadsheet. 'Dependencies' is the name of the game, with one object receiving many possible alterations/adjustments due to manual input data, for which the spreadsheet needs to provide. Right now, when I update a value, it may take up to 4 seconds for the spreadsheet to recalculate, which is already beyond the annoyance point for me. This leads me to these (hopefully) simple questions: Is it smart to use array formulas, knowing that each change should only impact one object line (for example, row 488) and no others? It is important to mention that object 1 does not influence object 488, or any other; any manual data field only affects the object in the row it is in. In my mind, array formulas do not make sense in that regard, as they can result in a cascade of updates, but apparently array formulas are 'way more efficient'. Is use of a VBA library the way to go to reduce lag and make the spreadsheet feel instant again? I am not able to code in VBA yet, but I am slowly learning it regardless. Alternatively: should I use LET whenever a repeated lookup is needed in the same formula? Really looking forward to your answers! submitted by /u/EvolvedRevolution
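To ground the array-formula question above in data: in the sheet XML, each `<f>` element carries a `t` attribute when it is an array (`t="array"`) or shared (`t="shared"`) formula, so you can tally how heavily a workbook actually leans on each kind before restructuring anything. A hedged stdlib sketch along the same zip-inspection lines (function name is mine):

```python
import zipfile
import xml.etree.ElementTree as ET
from collections import Counter

NS = "{http://schemas.openxmlformats.org/spreadsheetml/2006/main}"

def formula_types(path):
    """Tally formula cells across all worksheets of an .xlsx by type:
    'array', 'shared', or 'normal' when no t attribute is present."""
    tally = Counter()
    with zipfile.ZipFile(path) as zf:
        for part in zf.namelist():
            if part.startswith("xl/worksheets/") and part.endswith(".xml"):
                for f in ET.fromstring(zf.read(part)).iter(f"{NS}f"):
                    tally[f.get("t", "normal")] += 1
    return dict(tally)
```

A skew toward tens of thousands of `array` entries would support the suspicion that cascading array recalculations, rather than the lookups themselves, dominate the 4-second lag.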