Huge workbook, lots of tabs & macros--should I use something other than Excel?
Our take
Managing extensive data in Excel can become cumbersome, especially with a workbook featuring numerous tabs and macros. If you're finding your current setup overwhelming, it may be time to explore alternatives that enhance efficiency and organization. Consider solutions like SharePoint and Power BI, which can help transition your data management to a more scalable, web-based platform. For practical guidance on optimizing your data handling, check out our article on "Using a separate table to split records to fields in Power Query."
The growing complexity of spreadsheets in business environments is a common challenge many professionals face. In the case of the purchasing manager who has crafted a robust Excel tool for order and logistics data, we see a classic example of how initially effective solutions can become unwieldy over time. With nearly 20 tabs and 30 macros, it's no wonder that questions arise about scalability and usability. The reality is that as data needs evolve, so too must the tools we use. This scenario highlights the limitations of traditional spreadsheet software, especially when it becomes a crutch rather than a catalyst for productivity.
The user's reliance on macros and conditional formatting underscores a critical point: while Excel provides powerful functions, it can also lead to cumbersome and inefficient workflows. The XLOOKUP columns and macros demonstrate an inventive approach to managing data, yet the resulting complexity can ultimately hinder rather than help. As we explore alternatives, it's worth recognizing that tools like SharePoint and Power BI, as the AI suggested, can offer a more structured and scalable approach. These platforms not only facilitate collaboration but also let users build interactive dashboards that simplify decision-making and make data exploration more intuitive. For those grappling with similar challenges, understanding how to leverage such tools is vital; consider reading our article "Using a separate table to split records to fields in Power Query" for insights on data management strategies.
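As a rough illustration of the kind of transformation that article covers, splitting a delimited record column into separate fields, here is a minimal pandas sketch. The delimiter and column names are invented for the demo, not taken from the article.

```python
# Illustrative sketch: split a delimited "record" column into named
# fields, analogous to Power Query's Split Column by Delimiter step.
# The "|" delimiter and the column names are assumptions for the demo.
import pandas as pd

df = pd.DataFrame({"record": ["PO-1001|WidgetA|25", "PO-1002|WidgetB|10"]})

# Split each record on "|" and expand the pieces into separate columns.
fields = df["record"].str.split("|", expand=True)
fields.columns = ["po_number", "item", "qty"]
fields["qty"] = fields["qty"].astype(int)

print(fields)
```

In Power Query the equivalent is a one-click transform; the sketch just shows the shape of the operation for readers more comfortable in code.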
The hesitance to trust AI recommendations is understandable, especially in a space where the stakes are high, and data integrity is paramount. However, the evolution of data management tools reflects a broader trend towards embracing technology that empowers users rather than complicating their tasks. It’s essential to foster a mindset that welcomes innovation while remaining cautious of its implementation. By doing so, professionals can navigate the transition from legacy systems to more advanced, integrated solutions that enhance productivity without sacrificing control or simplicity.
Ultimately, the conversation around the limitations of Excel is a microcosm of the broader challenges faced in data management today. As businesses accumulate more data, the need for efficient, user-friendly tools becomes increasingly critical. The shift from traditional spreadsheets to web-based solutions like SharePoint and Power BI not only addresses issues of scale but also promotes a more collaborative and dynamic environment for data handling. As we look to the future, the question arises: how can organizations better equip their teams to embrace these technological advancements without overwhelming them? The answer lies in providing the right guidance and education to ease the transition, ensuring that users feel empowered rather than daunted by new possibilities.
In conclusion, the journey from a complex Excel workbook to a more streamlined data management solution presents an opportunity for growth and innovation. As we watch this space evolve, our focus should remain on how to empower users to harness the full potential of their data without being bogged down by the tools they use. Embracing a future-focused approach will not only improve individual workflows but also contribute to a more agile and responsive business environment. For more on balancing traditional methods with modern solutions, consider exploring our piece "Conditional formatting based on a checkbox," which delves into practical applications of Excel in creative ways.
I work in purchasing for a large company and I have been using an Excel-based tool to manage order and logistics data for years. I started writing macros to help with my tasks and now the workbook has nearly 20 tabs and 30-odd macros... things are getting way too beefy and I am wondering if I need to use something else to manage it. The general layout is:
- 2 tabs with order data (one at PO level, one at line item level) using conditional formatting, XLOOKUP columns that reference other related data on other tabs, and macros for importing/processing new data.
- 2 tabs to track receiving--one is 2 years of history and the other is a working sheet w/macros to pull receiving for a given number of days, look at the line items against one of the first 2 tabs, and mark lines that are not full order qty (among other things). Also has a macro that 'steps' through each unique PO# in receiving record and filters the first 2 tabs + receiving history tab for that PO# so I can enter things into purchasing system more efficiently.
- Several tabs for reference data (logistics reports, truck schedules, etc.) which are either referenced manually as needed or actually feed into an XLOOKUP on the first tab.
- 2 tabs with a macro to pull certain PO data based on several parameters, which I manually review and then use another macro to automatically generate emails to vendors.
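As a thought experiment (not part of the original workbook), the core of the receiving reconciliation described above, matching each receiving line to its order line and flagging lines that are not full order quantity, can be sketched outside Excel. Here it is in pandas, with all column names invented for the demo:

```python
# Hedged sketch, not the poster's actual macro: join receiving lines
# to order line items (the XLOOKUP step) and flag short receipts
# (the conditional-formatting/"mark lines" step).
# Column names (po, line, ordered_qty, received_qty) are assumptions.
import pandas as pd

orders = pd.DataFrame({
    "po": ["P1", "P1", "P2"],
    "line": [1, 2, 1],
    "ordered_qty": [100, 50, 30],
})
receiving = pd.DataFrame({
    "po": ["P1", "P1", "P2"],
    "line": [1, 2, 1],
    "received_qty": [100, 40, 30],
})

# Join on PO number + line number, keeping every receiving line.
merged = receiving.merge(orders, on=["po", "line"], how="left")

# Flag any line received short of the full order quantity.
merged["short"] = merged["received_qty"] < merged["ordered_qty"]

print(merged[merged["short"]])
```

The point of the sketch is that the lookup-and-flag logic is a single join plus a comparison once the data lives in tables, which is roughly what a Power Query or database-backed rebuild would give you.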
An AI told me I could use Sharepoint and Power BI to recreate the tool to be web-based and able to handle the volume of data I am working with, but I don't trust AIs lol.
Anyone have recommendations of programs or strategies? I have no help from my company so I am doing this on my own to help myself.
Related Articles
- 12-year analyst feeling like a dinosaur. Need advice on moving away from massive flat files without forcing Power BI on my team.
  I've been a Reddit user for a long time, but I've recently hunted and scoured the internet for communities to help me with this problem, and I heard this was possibly the place to be. I work as an analyst in the CPG space (NWA area) and have been doing this for over a decade. I know my way around a spreadsheet pretty well, but my technical skills kind of froze in time around 2020. Before COVID hit, I was finally learning proper data modeling and Power Pivot from a mentor, but then we got sent home, things changed, and I've basically been surviving on VLOOKUPs and brute force ever since. I actually inherited some pretty advanced VBA tools from that mentor back in the day, but they were built for the old legacy system (DSS/Retail Link). When the retailer migrated everything over to the new platforms (Luminate/Scintilla and Madrid), all that old automation effectively died. The new export formats and cell limits broke the old code, so I never really rebuilt it, and I'm back to manual stitching.
  I realized about six months ago that I am falling behind. I'm still building massive flat-file reports the hard way, and it's killing me. The situation is basically this: I pull data from a vendor portal (think a Unify-style system) that has a hard export limit of about 3M cells. If I want to pull 52 weeks of history for 500+ items across hundreds of locations, I get the "cell count exceeds limit" error. So I have to pull it in chunks--Dollars, Units, PODs separately--and stitch them together manually. It is a massive pain. To make it worse, my stakeholders are super old school. They want Excel files they can touch, pivot, and scribble on. If I send them a Power BI link, they won't even open it. So I need the flexibility of Excel, but the data volume is getting too big for the standard sheets I'm building.
  My goal is to build a "set it and forget it" system: I want to drop those raw data exports into a folder and have Excel just "eat" them. I know Power Query is probably the answer for the stitching, but is Power Pivot still the best way to handle the data-model part? I need it to handle the heavy lifting (52 weeks, store-level data) but output into standard pivot tables that my buyers can still play with. Also, a side note on AI: everyone tells me to just use ChatGPT to write Python scripts, but that's not really what I need. I'm trying to get my data structure clean enough that I can feed the final pivot tables into an LLM to help me write the recaps/insights. Has anyone had luck with that workflow? Any advice on where to start, or a specific course to bridge the gap between "VLOOKUP guy" and "Data Model guy," would be awesome. Thanks. submitted by /u/Excel_Dino_2026
- Workbook from Microsoft Form encountering very long load times from excessively complex formulas
  Good evening. I work in shipping and receiving at a food production plant. We have Microsoft Forms for entering daily cases produced and cases shipped, and a separate form for doing time studies on trucks that come in: how long it takes to load or unload the truck, and when it leaves. I have had a manual workbook to fill in all of this data basically again (the same information gets entered into the daily reports we fill out in Microsoft Forms), but organized into an easy daily report giving us truck in-to-out averages, loading-time averages, cases produced vs. what was scheduled to produce, etc.
  A big issue I have had with this manual data-entry workbook, which is done month by month, is the number of formulas in it: multiplying cases by item number to give us weight and how many skids, calculating our scheduled amount to produce against what's actually produced, giving percentages, many conditionally formatted cells to easily show whether we are in the green or the red, and so on. My boss has always wanted a workbook that does what my manual workbook does but grabs the data from the Excel workbook that these Microsoft Forms load the data into. The problem before was that we had two separate forms, one for daily cases produced/shipped and one for our time studies, but I went ahead and made one form that does both. I was able to copy many sheets and formulas from my manual workbook into the spreadsheet that loads in the data from this form. My boss really wants it to work indefinitely.
  The problem I am encountering, which I was afraid of, is that the number of formulas in this one workbook is far too much for a computer to handle. Changing one thing results in it calculating a thread for 20-30 minutes (as with the manual spreadsheet, the number of calculation processors has been set to 1). Am I just going about this all wrong? Is there a better way to grab the data from this form that isn't going to overload a computer? Do I make separate workbooks pulling from this form's workbook and just keep the daily report with the initial form workbook (but then would those workbooks update automatically as well)? I imagine there is a way to achieve what my boss wants, but my experience with Excel is only so advanced. I'm aware there are other programs and other Excel tools, and that is why I came to this subreddit for advice. Please help me 🙇🏻♂️ submitted by /u/maverickrose
- How did you improve your workplace's legacy VBA macros?
  I recently transitioned to a non-clinical role in a public health care system. Part of the onboarding was a 12-page, 20-step tutorial on how to 'do the macros'. The workflow, simplified:
  - Get source data from EHR/BI.
  - Open the Excel Online (Microsoft 365) "Daily Review" workbook in desktop Excel (hopes and prayers it doesn't crash).
  - Copy data (columns of patient ID, demographics, medications... you get the idea) from the EHR, paste into the Daily Review.
  - Run a macro (click a button) which cleans, filters, and, I think, applies conditional formatting.
  - Save.
  - Go back to Excel Online and resume editing there.
  The VBA code was created (not sure if it was hand-written, since it has no documentation) by a colleague who is on extended mat leave. I can see a lot of 'modules' but can't tell which is active, and there are probably lots of historical decisions baked in. The Daily Review file, with its many, many sheets, is saved in multiple locations in case newcomers like me or others break it by accident. I am told we can't change anything, not even move a column closer to the beginning, because... well, we can't. I don't know VBA but could probably figure it out if I watch a tutorial. I am a Linux user, know the basics of C and Python, and make good use of Claude Code with the Pro subscription, but I have never really worked with spreadsheets. I am wondering if anyone was in a similar situation and how you managed it. Is moving to Office Scripts (TypeScript) a viable alternative? Any other life-improving tips would be appreciated. Or maybe I should just give up and focus the energy elsewhere? submitted by /u/Neat-Badger-5939
- How to deal with a bulky spreadsheet that is starting to hit the limits of Excel?
  Hello all, I have been venturing on quite the Excel journey the past year or so. I made a corporate spreadsheet that is approaching 500k formulas and is starting to have serious speed issues at this point. It is 2026, so I conversed with ChatGPT several times regarding the speed issue, but realized I am way better off asking the experts here anyway.
  What is the problem? My spreadsheet imports flat databases with specific information regarding objects that need further analysing. The imported flat databases run from, say, A to CC or so, from which I probably draw about 12-15 data fields that are used for further analysis. It may be more in the future. Afterwards, said data gets 'enriched' manually with things that aren't in the database, also because said data needs a human eye and cannot be automated. So far, so good.
  Right now, each object gets analysed from several different angles. As it stands, my spreadsheet runs from A to NA or so on the formula page. Many columns receive data from preceding columns, which are in turn the result of many (slightly complex) logical IF or IFS tests, many of which are nested 3 or 4 deep. Often they work in conjunction with XLOOKUP to retrieve values, as the columns on the formula page are not equal. For example: A to BC on the formula page may analyse 150 objects, BD to DD may analyse 100 objects (from the same dataset, so narrower), and so forth. Thus a lot of XLOOKUP is required, also because the first 'block' comes up with values that need to be found with XLOOKUP, and values need to be retrieved from the flat-database 'import' page with XLOOKUP. Finally, XLOOKUP is an insurance compared to FILTER, as I am not fully convinced that empty values in the flat database always contain a space (" ").
  To get to the point: I use many IF, IFS, AND and, if need be, OR formulas. Think tens of thousands, probably in excess of 100k. These are compounded with XLOOKUP, or XLOOKUP gets used copiously on its own; here too, think tens of thousands. These formulas are, as much as possible, in array format, even though I find that controversial, considering how it can create a chain of updates throughout the spreadsheet. 'Dependencies' is the name of the game, with one object receiving many possible alterations/adjustments due to manual input data, for which the spreadsheet needs to provide. Right now, when I update a value, it may take up to 4 seconds to update the spreadsheet, which is already beyond the annoyance point for me.
  This leads me to these (hopefully) simple questions: Is it smart to use array formulas, knowing that each thing I change should only impact that one object line (for example, row 488) and none other? It is important to mention that object 1 does not influence object 488, or any other; any manual data field only affects the object in the row it is in. In my mind, array formulas do not make sense in that regard, as they can result in a cascade of updates, but apparently array formulas are 'way more efficient'. Is use of a VBA library the way to go to reduce lag and create more of an instant spreadsheet again? I am not able to code in VBA yet, but I am in the slow process of learning it regardless. Alternatively: should I use LET whenever a repeated lookup is needed in the same formula? Really looking forward to your answers! submitted by /u/EvolvedRevolution
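On the LET question in that last post: the gain comes from resolving a lookup once and reusing the result, instead of repeating the same lookup in every branch of a formula. The same principle, sketched in Python with invented data and field names:

```python
# Illustrative sketch of the LET / precomputed-lookup idea: resolve a
# key once, then reuse the result, rather than re-scanning the whole
# range for every formula. Data and field names are invented.
objects = [
    {"id": 1, "value": 10},
    {"id": 488, "value": 75},
]

# Build the lookup table once (the "LET x, XLOOKUP(...)" step).
by_id = {row["id"]: row for row in objects}

# Every downstream calculation reuses the O(1) lookup instead of
# repeating the search.
def enrich(object_id):
    row = by_id[object_id]
    return {"id": row["id"], "double": row["value"] * 2}

print(enrich(488))
```

In Excel terms, LET gives a single formula this one-resolution-many-uses shape; the dictionary here plays the role of the named intermediate value.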