Any advice welcome: work-document instructions followed, but the result is still not up to par. Data goes missing or gets duplicated; copy-paste doesn't capture all the data selected.
I'm at my wit's end, and I can't tell if the problem is me, the person who wrote the instructions, or Excel.
So part of my job is to take an Excel file provided by corporate, transform the data into a table, then split that information across four separate sheets while keeping the table unchanged.
I've been here for 2 months. When I first started the task, I was given a video and written instructions on how to do it. Great. I love that.
Except I follow the instructions and the notes from the video to a T, and there is still a plethora of micro-issues; it's like playing whack-a-mole.
Either data is missing that I never deleted, or the file suddenly loses a massive amount of data after being turned into a table, or any number of other errors that have occurred in the last 8 submissions.
My direct supervisor says not to be too hard on myself because it's an intricate process that even she isn't fully trained on, so it's sort of the blind leading the blind, but that doesn't stop the angry calls from the person down the line who receives the newly formatted sheets.
Any advice is welcome.
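The split-into-four-sheets step is exactly the kind of thing that is safer in code than in copy-paste, because nothing gets selected by hand. A minimal pandas sketch; the column name "department" and the sample data are assumptions for illustration, not the poster's actual file:

```python
# Hypothetical sketch: split one master table into per-category frames
# without ever mutating the master, so no rows can be lost or duplicated.
import pandas as pd

master = pd.DataFrame({
    "department": ["A", "B", "A", "C", "D", "B"],
    "value": [10, 20, 30, 40, 50, 60],
})

# One frame per department; the master itself stays untouched.
splits = {dept: grp.copy() for dept, grp in master.groupby("department")}

# Sanity check: every master row lands in exactly one split, none duplicated.
total = sum(len(g) for g in splits.values())
assert total == len(master)

# Writing the master plus each split to its own sheet would look like:
# with pd.ExcelWriter("report.xlsx") as xl:
#     master.to_excel(xl, sheet_name="Master", index=False)
#     for dept, grp in splits.items():
#         grp.to_excel(xl, sheet_name=dept, index=False)
```

The row-count assertion is the point: it catches the "data silently vanished" failure mode before anything is sent downstream.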
Related Articles
- Issue of irregular data from multiple sources I have no control over: In my working environment, I'm usually tasked with making sheets for specific needs that take as input data from up to 10 different sheets managed by different people in different ways: all manual formatting, different everything, albeit very basic with close to no actual calculations and, of course, no use of advanced tools. My issue is that I always have to make uniform data from all that mess before doing my task, and that first part actually takes 80% of the time, so I wonder how I can automate it. Example issues: data written in different formats, the same fields for one person holding slightly different data in nature or form across sheets, naming not up to standard, etc. I absolutely can't influence the other actors in any way. I'm not sure how to tackle this, mainly because I'll need to implement safeguards that somehow track data conflicts, overlaps, duplication and such, and then somehow account for all the slightly different ways one type of data gets represented. I'd be grateful for any insight into how you deal with such chaos without the ability to influence other people's work processes. submitted by /u/Independent_Salary84
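For the harmonization step described above, a small normalization pass can absorb most per-sheet quirks (stray whitespace, inconsistent header casing, duplicates) before any real work starts. A hedged pandas sketch; the column names and rules are illustrative assumptions, not the poster's actual schema:

```python
import pandas as pd

def normalize(df, date_col="date"):
    """Coerce a messy sheet toward one convention: trimmed lowercase
    headers, stripped string cells, parsed dates, and a duplicate report."""
    out = df.copy()
    out.columns = [str(c).strip().lower() for c in out.columns]
    for col in out.select_dtypes(include="object"):
        out[col] = out[col].str.strip()
    if date_col in out.columns:
        out[date_col] = pd.to_datetime(out[date_col], errors="coerce")
    dupes = out[out.duplicated(keep=False)]  # conflicts to review by hand
    return out.drop_duplicates(), dupes

# Invented sample: inconsistent header padding and trailing spaces
sheet = pd.DataFrame({
    " Name ": ["Ann ", "Bob", "Ann "],
    "Date": ["2024-01-05", "2024-01-06", "2024-01-05"],
})
clean, dupes = normalize(sheet)
```

Running one such function over every incoming sheet gives a single choke point where conflicts and duplicates surface, instead of ten ad-hoc fixes.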
- Need Excel workflow advice for multi-region data cleanup and tracking progress: Hi Excel pros, I work for a company with about 20k employees, and I've got a spreadsheet of roughly 2,000 people who are missing data in two required info columns. These employees are spread across different regions, and then further down to individual locations/teams. What I need to do is send each region only its portion of the data, have them push it out to their locations to fix, and then somehow track what's been completed and pull everything back together into one clean file. In the past I've filtered the data, saved separate files, emailed them out, then tried to keep track of who's done what and combine everything back together. I'm worried I'll run into version-control issues or miss updates. It's also very cumbersome, and it has ended up being a big stressful mess in the past. I feel like there has to be a better way to handle this, but I'm not sure if I'm overcomplicating it or missing something obvious in Excel. I'm very much a basic user and not super familiar with more advanced features, but I'm willing to learn. Has anyone set up a process like this before? Appreciate any advice or ideas; even just "here's how I'd approach it" would be super helpful. submitted by /u/Magnolia05
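The split/track/recombine loop described above can be made mechanical so that completion is tracked by a key rather than by filename. A minimal pandas sketch with invented column names ("region", "employee_id", "job_code") standing in for the real ones:

```python
import pandas as pd

# Invented stand-in for the 2,000-row "missing data" sheet.
todo = pd.DataFrame({
    "employee_id": [1, 2, 3, 4],
    "region": ["East", "East", "West", "West"],
    "job_code": [None, None, None, None],
})

# 1) One frame per region (the write step is shown as a comment
#    so this sketch stays file-free).
per_region = {r: g for r, g in todo.groupby("region")}
# for r, g in per_region.items():
#     g.to_excel(f"missing_{r}.xlsx", index=False)

# 2) When files come back, stack them and check completion by key,
#    not by filename, so versions can't silently drift.
returned = pd.concat([
    per_region["East"].assign(job_code="ENG"),  # East filled theirs in
    per_region["West"],                         # West still blank
])
status = todo[["employee_id"]].merge(
    returned[["employee_id", "job_code"]], on="employee_id", how="left"
)
still_missing = status[status["job_code"].isna()]
```

The `still_missing` frame is the progress tracker: rerun the merge each time a file comes back and it always reflects who is left, regardless of how many versions were emailed around.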
- How Do You Tame a Messy Excel Spreadsheet? Yesterday I opened a huge Excel file from a coworker that was full of inconsistent formulas, missing data, and messy formatting, and I spent hours just trying to make sense of it before I could even start my analysis. Some formulas broke, charts didn't update, and tracking changes was a nightmare. How do you all handle cleaning up or organizing complicated spreadsheets efficiently without losing your mind? submitted by /u/throwawayaasyr
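Before cleaning a workbook like that by hand, a quick programmatic audit can show where the damage actually is. A hedged pandas sketch; the column names and the "mixed types" heuristic are illustrative assumptions:

```python
import pandas as pd

def audit(df):
    """Quick health report before analysis: per-column missing counts
    and columns holding more than one Python type, the two usual
    hiding spots for spreadsheet breakage."""
    missing = df.isna().sum()
    type_counts = {
        col: df[col].dropna().map(type).nunique()
        for col in df.columns
    }
    mixed = {c: n for c, n in type_counts.items() if n > 1}
    return missing, mixed

# Invented sample: numbers stored three different ways in one column.
messy = pd.DataFrame({
    "amount": [10, "10", 12.5, None],
    "label": ["a", "b", "c", "d"],
})
missing, mixed_cols = audit(messy)
```

A column like `amount` above looks fine in Excel but mixes text and numbers, which is exactly what breaks formulas and charts downstream; the audit surfaces it in one pass.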
- What would you do with this task, and how long would it take you to do it? I'm going to describe a situation as specifically as I can. I'm curious what people would do in it; I worry that I complicate things for myself. I'm describing the whole task as it was described to me and then as I discovered it. Ultimately I'm here to ask: what would you do, and how long would it take you?

  I started a new role this month. I'm new to advertising modeling methods like MMM, so I'm reading a lot about how to apply those methods in R and Python. I use VS Code; I don't have a GitHub Copilot license, but I get Copilot through the Windows Office license. This task didn't involve modeling, but I do want to ask about that kind of task another day if this goes over well.

  The task: 5 Excel workbooks are provided. You're told this is a client's data that was given to another party for some other analysis and augmentation. This is a quality-assurance task on the previous process.

  The data structure:
  - 1 workbook per industry, for 5 industries
  - 4 workbooks have 1 tab; 1 workbook has 3 tabs
  - each tab has a table with a date column in days, 2 categorical columns (advertising_partner, line_of_business), and at least 2 numeric columns per workbook
  - sometimes data is updated on our side and the partner has to re-download, reprocess, and share it again

  The process (done once per client per quarter, though it's just this client for now): open each workbook and navigate to each tab. The data sits in a "controllable" summary table where partner and line of business are driven by dropdown toggles with a combination of 3-4 categories each, roughly:

      partner (dropdown):           bing         bing
      line of business (dropdown):  home         home
                                    impressions  spend

  Then compare with data downloaded from a Tableau dashboard. End state: the comparison of the metrics in Tableau to the Excel tables, to ensure that "the numbers are the same" and that the categories map 1-to-1 with the data downloaded from Tableau; aggregate the data in a pivot table, select the matching categories, and make sure the values match.

  Additional info about the files: the summary table is a complicated SUMPRODUCT lookup against an extremely wide table hidden to its left, and the summary table can start as early as column AK and as late as FE. There are 2 broadly different formats of underlying data across the 5 workbooks, with small structural differences within the group of 3.

  In the group of 3, the structure of the wide table is similar to the summary table, with categories in the column headers describing the metric below them, but with additional categories like region (the same value for every column header). One of these tables has 1 more header category than the other 2. The leftmost columns have 1 category each, and there are date columns for day and quarter:

      REGION                USA          USA    USA
      PARTNER               bing         bing   google
      LOB                   home         home   auto
      date        quarter   impressions  spend  ...etc
      2023-01-01  q1        1            2      ...etc
      2023-01-02  q1        3            4      ...etc

  In the group of 2, the leftmost columns are the categories that were column headers in the group of 3, plus the metric names, and the values in each category match; the dates are now the headers of this very wide table. The header labels are separated from the start of the values by 1 column, and there is an empty row immediately below the final row of column headers:

      date                             2023-01-01  2023-01-02
      year                             2023        2023
      quarter                          q1          q1
      (blank row)
      REGION  PARTNER  LOB   measure
      (blank row)
      US      bing     home  impressions   1       3
      US      bing     home  spend         2       4
      US      google   auto  ...etc        ...etc  ...etc

  The question is: what do you do, and how long does it take you to do it?

  To be honest, I wrote this explanation in roughly the order I was introduced to the information and discovered it ("oh, it's easy if it's all the same format, even if it's weird"... "oh, there are 2-ish differently formatted files"). The meeting for this task ended at 11:00 AM. I saw this copy-paste manual ETL project and I simply didn't want to do it. So I outlined the task by identifying the elements of each table (column-name ranges, value ranges, stacked/pivoted column ranges, etc.) for an R script to extract the data, passing those ranges as arguments, e.g. make_clean_table(left_columns="B4:E4", header_dims=c(..etc)), with functions that convert each Excel range into the correct position in the table. The data is then transformed into a tidy long table. The function runs once per workbook, extracting the data from each worksheet and building a single table with columns for workbook industry, tab category, partner, line of business, spend, impressions, etc. IMO, ideally (if I have to check their data in Excel at all), I'd like the partner to redo their report so that I receive a workbook with the underlying data in traditionally tabular form, and a reporting page that uses Power Query and table references rather than cell ranges and formulas. submitted by /u/TheTresStateArea
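The wide-to-long reshape the poster scripted in R can be sketched in pandas as well. The frame below is a minimal, invented stand-in for the described "group of 2" layout (category columns on the left, one column per date), not the client's actual data:

```python
import pandas as pd

# Stand-in for the "group of 2" layout: categories and the metric name
# on the left, dates as column headers. All values invented.
wide = pd.DataFrame({
    "REGION":  ["US", "US"],
    "PARTNER": ["bing", "bing"],
    "LOB":     ["home", "home"],
    "measure": ["impressions", "spend"],
    "2023-01-01": [1, 2],
    "2023-01-02": [3, 4],
})

# Melt the date columns into rows, then pivot the measures back out,
# yielding one tidy row per (region, partner, lob, date).
long = wide.melt(
    id_vars=["REGION", "PARTNER", "LOB", "measure"],
    var_name="date", value_name="value",
)
tidy = (
    long.pivot_table(
        index=["REGION", "PARTNER", "LOB", "date"],
        columns="measure", values="value",
    )
    .reset_index()
)
```

Once every workbook is funneled into this tidy shape, the Tableau comparison becomes a straight join on the category columns instead of a manual dropdown-and-eyeball exercise.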