New accounting job - massive databases in Excel!
Our take
Congratulations on your new role as a Finance Business Partner! It sounds like you’re stepping into an exciting opportunity to transform outdated practices. The issues with oversized Excel files and slow data refreshes are common in many organizations. To tackle this, consider starting with Power Query to filter your data extraction from Sage X3, focusing on relevant fields and date ranges. Additionally, exploring Power Pivot and Power BI can streamline your reporting processes.
In the evolving landscape of finance and data management, the challenges faced by professionals like Rob, a new Finance Business Partner, underscore a critical juncture for many organizations still relying on outdated practices. His experience highlights the inefficiencies born from traditional spreadsheet usage, particularly when it comes to managing extensive databases in Excel. As Rob points out, legacy systems not only burden users with cumbersome files filled with redundant formulas but also hinder productivity through slow data retrieval and processing times. This scenario is all too common in businesses that have yet to embrace modern data management strategies.
The reliance on Sage X3 for accounting, coupled with ODBC data extraction into Excel, often leads to a chaotic data environment. Rob's frustration with slow refresh rates when pulling extensive datasets reflects a broader issue: many organizations extract far more data than they need, piling unnecessary complexity onto their operations. The same pain surfaces in discussions of conditional formatting for large datasets or of stock prices failing to update reliably in spreadsheets, as seen in articles like "Conditional formatting for specific character count" and "Does anyone have issue of stock prices stopped updating?". These discussions not only resonate with Rob's experience but also serve as a clarion call for organizations to reassess their data practices.
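For extracts like Rob's, the usual fix is to push the filter into the query itself, so the ODBC driver returns only the rows and columns needed rather than the whole ledger. A minimal sketch in Power Query's M language — the DSN, table, and column names (GACCENTRYD, ACC_0, ACCDAT_0, AMTCUR_0) are illustrative assumptions, not a verified Sage X3 schema:

```powerquery
// Push filtering to the server via a native ODBC query so the refresh
// transfers only the filtered rows, not every field for all time.
let
    Source = Odbc.Query(
        "dsn=SageX3",  // DSN name is an assumption -- use the one IT configured
        "SELECT ACC_0, ACCDAT_0, AMTCUR_0
         FROM GACCENTRYD
         WHERE ACC_0 BETWEEN '40000' AND '49999'
           AND ACCDAT_0 >= '2024-01-01'"
    )
in
    Source
```

Because the WHERE clause runs on the database side, refresh time scales with the rows actually needed, which is exactly the difference Rob is feeling.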
For finance professionals, the journey towards efficiency often begins with the adoption of tools like Power Query and Power Pivot. These technologies empower users to create more manageable datasets and streamline reporting processes, allowing for tailored data extraction that meets specific needs without overwhelming the system. By focusing on essential fields and filtering data appropriately, finance teams can transform their workflows, making them more agile and responsive. Rob's awareness of these tools signals a crucial first step, yet his hesitation reflects a common sentiment among users who may feel overwhelmed by the learning curve of adopting new technologies.
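Even when the raw extract has already landed in a workbook, the same trimming can be done after the fact in Power Query: keep only the columns the report needs, type them, and drop rows outside the period. A minimal sketch, assuming a workbook range named RawExtract and hypothetical column names:

```powerquery
// Shape an already-loaded extract: select columns, set types, filter rows.
let
    Source = Excel.CurrentWorkbook(){[Name = "RawExtract"]}[Content],
    // Keep only the fields the report actually uses
    KeepColumns = Table.SelectColumns(Source, {"NominalCode", "PostingDate", "Amount"}),
    // Set proper types so date filtering and aggregation behave correctly
    Typed = Table.TransformColumnTypes(
        KeepColumns,
        {{"PostingDate", type date}, {"Amount", type number}}
    ),
    // Drop rows outside the reporting period
    InPeriod = Table.SelectRows(Typed, each [PostingDate] >= #date(2024, 1, 1))
in
    InPeriod
```

Each step removes work downstream: fewer columns and rows mean a smaller data model and faster pivot refreshes, which is the practical payoff of the "essential fields only" habit.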
The future of finance and data management lies in empowering individuals like Rob to take charge of their data environments. As organizations continue to grapple with legacy systems, the need for innovative solutions becomes increasingly apparent. Investing in training for tools like Power BI can help finance professionals not only to visualize their data effectively but also to derive actionable insights that can drive business decisions. This shift towards a more data-driven culture is vital as companies strive to remain competitive in a rapidly changing landscape.
Looking ahead, we must ask ourselves how we can better support professionals transitioning from outdated methods to more efficient, future-focused solutions. Will organizations prioritize investing in training and resources that foster innovation? The implications of this shift are profound, impacting not just individual productivity but also the overall agility and resilience of businesses in an increasingly data-centric world. As we continue to explore these transformative solutions, it's clear that the future of finance will be defined by those who are willing to embrace change and drive progress.
I’ve just started a new job as a Finance Business Partner at a relatively small business. They do A LOT of things the old fashioned way and there’s plenty of opportunities to improve things.
One of my biggest bugbears at the moment is the size of some of their files. Old, redundant formulae, and ranges in formulae spanning 10,000+ rows when they only need 100, are making the files huge and slow to open and run.
They use Sage X3 as their accounting software and extract data into Excel via ODBC - I haven’t seen how this is done yet, but it’s as if it’s extracting all data, in all fields, for all time. And to refresh takes a fucking age.
I want to create something new for querying the database to give me only certain fields, from certain nominal codes, filtered by date ranges.
I know I should be looking into creating power pivot and power query files and also looking at reporting using Power BI, but I haven’t looked into that fully yet. (I know I’m falling behind here!)
Where’s the best place to start unpicking this mess?
Related Articles
- 12 year analyst feeling like a dinosaur. Need advice on moving away from massive flat files without forcing Power BI on my team.
  I've been a Reddit user for a long time, but I've recently hunted and scoured the internet for communities to help me with this problem and I heard that this was possibly the place to be. I work as an Analyst in the CPG space (NWA area) and have been doing this for over a decade now. I know my way around a spreadsheet pretty well, but I feel like my technical skills kinda froze in time around 2020. Before COVID hit, I was finally learning proper Data Modeling and Power Pivot from a mentor, but then we got sent home, things changed and I've basically been surviving on VLOOKUPs and brute force ever since. I actually inherited some pretty advanced VBA tools from that mentor back in the day, but they were built for the old legacy system (DSS/Retail Link). When the retailer migrated everything over to the new platforms (Luminate/Scintilla and Madrid), all that old automation effectively died. The new export formats and cell limits broke the old code, so I never really rebuilt them and I'm back to manual stitching.
  I realized about 6 months ago that I am falling behind. I'm still building massive flat-file reports the hard way and it's killing me. The situation is basically this: I pull data from a vendor portal (think Unify style system) that has a hard export limit of like 3M cells. If I want to pull 52 weeks of history for 500+ items across hundreds of locations, I get the "cell count exceeds limit" error. So I have to pull it in chunks (Dollars, Units, PODs separately) and stitch them together manually. It is a massive pain. To make it worse, my stakeholders are super old school. They want Excel files they can touch, pivot, and scribble on. If I send them a Power BI link, they won't even open it. So I need the flexibility of Excel, but the data volume is getting too big for the standard sheets I'm building.
  My goal is to build a "set it and forget it" system. I want to be able to just drop those raw data exports into a folder and have Excel just "eat" them. I know Power Query is probably the answer for the stitching, but is Power Pivot still the best way to handle the data model part? I need it to handle the heavy lifting (52 weeks, store-level data) but output into standard Pivot Tables that my buyers can still play with. Also, a side note on AI: everyone tells me to just use ChatGPT to write Python scripts, but that's not really what I need. I'm trying to get my data structure clean enough that I can feed the final Pivot Tables into an LLM to help me write the recaps/insights. Has anyone had luck with that workflow? Any advice on where to start, or a specific course to bridge the gap between "VLOOKUP guy" and "Data Model guy", would be awesome. Thanks. submitted by /u/Excel_Dino_2026
- Accounting specific advice for creating workpapers?
  I'm currently working as an accountant at a company where many of the workpapers are a patchwork mess. I went to school for both accounting and data analytics, so I understand some basic data design principles. I also have a basic understanding of Excel and Power Query. The main issue I have is that, in my experience, accounting files often have different requirements than a data model used for FP&A. Our files need to be reviewable, auditable, etc., since our job involves regular audits and compliance checks. We need to maintain the support and basically have monthly snapshots of our work. Currently I have been designing the workpapers with tabs specifically for pasting in the report data and then referencing that with Power Query. While this is okay, it can be a little clunky and sometimes involves an intermediate step. For instance, instead of manually inputting invoice data into a workpaper, I created a separate Excel file which uses Power Query to pull information from several folders where I dump copies of invoices based on format. Then I copy and paste that data into our actual workpaper. What is some advice you would give to someone recreating workpapers, and how should data be treated in an accounting environment? submitted by /u/En_Esta_Economia
- Tools limited. How to automate multiple SQL Server queries -> Excel workflow at work?
  Hi everyone. The initial process was to use a macro-enabled Excel template for data cleaning and reconciliation (we can still use macros, but this process alone takes a long time to get through thousands of accounts because each account needs to be reconciled). I would: run a couple of different queries in SQL Server -> copy & paste results into the Excel template -> clean and reconcile debit/credit -> color code and mark tabs to be sent to my manager for approval along with a SOX template. I need this entire process automated somehow. My permissions are limited, so at this point I can only work with SQL, Excel & Power Query based on my research (I don't have prior experience with Power Query). Has anyone here done something similar before? I could use some advice. I am trying to see how to integrate the many queries into this, as well as what the end product should look like. I just want to create a more efficient process so that I can show my managers, and perhaps they can incorporate it on a bigger scale if applicable. Thanks in advance! submitted by /u/Acrobatic_Sample_552
- Excel Performance optimisation tips!
  Working in demand planning, I have got to the point where I am making some pretty advanced files using a suite of techniques. My files often have lots of rows, with lots of columns of complex formulae, including SUMIFS, XLOOKUP, IFS & LET. I've not advanced to using tables regularly, though, as I find the constraints & syntax annoying, but am trying to get there & have started using Power Query to blend data for output analysis. The problem I am encountering is that I filter a lot, drilling down into product groups etc., & Excel tends to 'hang' a lot with 'Not Responding'. Now I'm not sure if it's due to an underpowered machine (Intel Core i7 HP EliteBook) or, more likely, lots of complex formulae referencing ranges or tables. My question to the hive brain: share your optimisation tips & tricks! Can LAMBDA combined with LET speed things up? Are tables vital to speeding up complex SUMIFS & lookups? Are MATCH helper columns combined with INDEX leaner & faster than XLOOKUP? Hit me with best tips & tricks! submitted by /u/NZGRAVELDAD