From Microsoft Excel | Help & Support with your Formula, Macro, and VBA problems | A Reddit Community
Any way of mass data dumps in excel? Or Automation?
Our take
When you're dealing with massive data entries, like the 500,000 lines per day from this poster's sales company, finding an efficient way to combine reports is crucial, especially under an IRS audit deadline. Instead of manually compiling and sorting each extract, consider automation: Excel's Power Query (Get & Transform) can append and consolidate data from multiple reports in a single refreshable step, and for files of this size it is generally much faster than worksheet formulas. That saves time and lets you focus on analyzing the data rather than just compiling it.
I am pulling general ledger data for my company for an IRS audit. Our sales company has almost 500k lines of entries each day, and I've pulled all of that out of SAP, but the problem is I had to pull two separate reports for each, and now I need to combine them.
I've done it for the smaller companies, and just compiling and sorting with a query takes an hour each. I have no other idea how to get it quicker. Looking for any tips.
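If stepping outside Excel is an option, the same append-and-sort job can be done in Python with pandas, which handles 500k-row files comfortably. This is a sketch under the assumption that both SAP extracts were saved as CSVs with a shared column layout; the file paths and column names are hypothetical, so adjust them to your actual exports.

```python
# Sketch: combine several SAP GL extracts and sort once at the end.
# Assumes the reports were exported as CSVs with identical columns.
import pandas as pd

def combine_reports(paths, sort_cols):
    """Concatenate report extracts, then sort the combined result."""
    frames = [pd.read_csv(p, dtype=str) for p in paths]
    combined = pd.concat(frames, ignore_index=True)
    return combined.sort_values(sort_cols, ignore_index=True)
```

Reading with `dtype=str` stops pandas from guessing types on document numbers, and sorting once after the concatenation is far cheaper than sorting each report separately.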
Related Articles
- I'm in search of a way to batch extract data from PDFs into Excel. Right now I have about 300 invoices sitting in a folder, and typing these into a spreadsheet manually would definitely take a lot of my time. Most of them share the same layout, but there are a few outliers. I'm thinking there may be a way to automate this directly in Excel, or a tool that isn't going to cost me a fortune; I really don't want to spend my entire weekend on data entry. Thanks in advance. (submitted by /u/justfortodaymyguy)
- Any way to automate removal of older rows of data? I have a spreadsheet with 4 columns: first name, last name, score, and date. Some people are duplicated with the same date, which I can easily remove with "remove duplicates", but there are also people with multiple rows because they took the test again a few years later. I'm trying to trim this spreadsheet to a single row per user showing only their most recent score. The date column is dd/mm/yyyy plus a 24-hour time stamp, and I can't think of a good way to handle that since it covers multiple years; there's no consistency between the old date and the most recent one. I imagine Excel has some way of pruning older data. At least I hope so, or I'll have to check 50,000+ rows manually to remove old results 😭 (submitted by /u/Skellyhell2)
- Converting VBA scripts to Office Scripts for easier automation. Hey everyone, currently I have a VBA script that I manually run every time a certain file is sent from a particular email address. For context, the flow of the automation is: file received through email -> automatic download -> manually trigger the VBA automation -> data from the sent file is transformed and kept in multiple different Excel files. The automation itself is slightly lengthy and complicated: it converts a bunch of data from the file sent and breaks it into different files with the required pieces of data for upload. What I'm trying to find out is whether there is any way to eliminate the need to manually trigger this VBA automation myself. I get this report multiple times a month, and as convenient as my VBA automation has made this process, it's still a hassle 😂. I've heard of Office Scripts being an option, but can anyone please let me know if it's possible? Any resources would also be beneficial. Thank you. (submitted by /u/unlucky_ko)
- Slow spreadsheet - need troubleshooting. Hi, I have a spreadsheet with two tabs. One is essentially the original data, which is YTD-driven for a particular GL account; the company has a smaller volume of transactions, so by December we are talking about maybe 3-5k rows for the account in total. The main tab has about 30 columns of lookup and SUMIFS formulas referencing the source data, maybe 500 rows in total by year end. To me that doesn't seem excessive; I've dealt with far heavier spreadsheets that run faster. But for some reason this one is slow as all hell to work in. I've even tried hard-coding some data and not seen any improvement. I'm not too techy about what else could be slowing it down. Any ideas on what to troubleshoot from here? (submitted by /u/SlideTemporary1526)
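On the PDF batch-extraction question above: once the text has been pulled out of each PDF (libraries such as pdfplumber can do the extraction step), the repetitive part is parsing that text into spreadsheet rows. This sketch assumes text extraction has already happened; the field labels and regular expressions are hypothetical and would need adjusting to the real invoice layout.

```python
# Sketch: turn one invoice's extracted text into a row of fields.
# The labels ("Invoice #", "Date:", "Total:") are assumptions about
# the layout -- adjust the patterns to match your actual invoices.
import re

def parse_invoice(text):
    """Pull invoice number, date, and total from raw invoice text."""
    fields = {
        "invoice_no": r"Invoice\s*#?\s*(\S+)",
        "date": r"Date:\s*([\d/]+)",
        "total": r"Total:\s*\$?([\d,.]+)",
    }
    row = {}
    for name, pattern in fields.items():
        m = re.search(pattern, text)
        row[name] = m.group(1) if m else None  # None flags layout outliers
    return row
```

Rows that come back with `None` values identify the outlier layouts for manual review, so the few nonstandard invoices don't silently produce bad data.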
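On removing older duplicate rows: the standard pattern is to parse the date column into a real timestamp, sort by it, and keep only the last row per person. A sketch in pandas, assuming the "dd/mm/yyyy + 24-hour time" format described in the post and using its column names:

```python
# Sketch: keep only each person's most recent test result.
# Assumes dates look like "15/06/2023 14:30" (dd/mm/yyyy HH:MM).
import pandas as pd

def latest_scores(df):
    df = df.copy()
    df["_when"] = pd.to_datetime(df["date"], format="%d/%m/%Y %H:%M")
    df = df.sort_values("_when")
    # keep="last" retains the newest row for each first+last name pair
    return (df.drop_duplicates(subset=["first name", "last name"], keep="last")
              .drop(columns="_when")
              .reset_index(drop=True))
```

The same idea works inside Excel with Power Query: sort descending by date, then Remove Duplicates on the name columns, which also keeps the first (newest) row per person.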
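On the VBA-to-Office-Scripts question: yes, Office Scripts can be triggered automatically via Power Automate, which is the Microsoft-native answer to eliminating the manual step. As an alternative that keeps the existing VBA, a small watcher can poll the download folder and kick off processing for any file it hasn't seen. This is a sketch, not a full solution; `process` is a placeholder for whatever the current macro does.

```python
# Sketch: poll a folder and hand any newly arrived file to process().
# process() stands in for the existing transformation logic.
import os
import time

def new_files(folder, seen):
    """Return files in `folder` not yet in `seen`, and record them."""
    current = set(os.listdir(folder))
    fresh = sorted(current - seen)
    seen |= current
    return fresh

def watch(folder, process, poll_seconds=60, cycles=None):
    """Poll forever (or for `cycles` rounds) and process new arrivals."""
    seen = set()
    n = 0
    while cycles is None or n < cycles:
        for name in new_files(folder, seen):
            process(os.path.join(folder, name))
        time.sleep(poll_seconds)
        n += 1
```

Polling is the simplest approach; on Windows a scheduled task can run the watcher, though for email-triggered files a Power Automate flow that runs an Office Script on arrival is the cleaner long-term route.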
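On the slow spreadsheet: thousands of lookup and SUMIFS cells each rescanning the source tab on every recalculation is a common culprit for this kind of sluggishness. One workaround is to compute the aggregates once, outside the formula layer, and paste or load the result back. A pandas sketch of the SUMIFS-equivalent step, with hypothetical column names:

```python
# Sketch: the batch equivalent of a SUMIFS column -- one total per
# key combination, computed once instead of per-cell.
import pandas as pd

def summarize(source, keys, value_col):
    """Sum value_col for each combination of the key columns."""
    return source.groupby(keys, as_index=False)[value_col].sum()
```

Inside Excel itself, the analogous moves are replacing per-row SUMIFS with a single PivotTable or Power Query group-by, and checking for volatile functions (OFFSET, INDIRECT, TODAY) that force full recalculation.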
Tagged with
Excel alternatives for data analysis, generative AI for data analysis, natural language processing for spreadsheets, big data management in spreadsheets, conversational data analysis, Excel compatibility, rows.com, real-time data collaboration, intelligent data visualization, data visualization tools, enterprise data management, big data performance, Excel alternatives, data analysis tools, data cleaning solutions, financial modeling with spreadsheets, automation in spreadsheet workflows, generative AI automation, workflow automation, cognitive automation