ESG Bulk Data Input

The ESG bulk data management feature streamlines the entry of large datasets.

The problem: Single-entry inputs were time-consuming, error-prone, and inefficient for managing large volumes of ESG data.

The solve: We developed a bulk data input feature that let users upload large volumes of ESG data at once, improving both efficiency and accuracy.

My role: I was the sole product designer, working with a remote team of engineers and a product manager.

A skeleton load of the bulk input table.

Exploratory Research

We spoke with 5 users to understand their frustrations with the existing system and how they currently handled large datasets.

Four key findings: Excel-based workflows, tedious single-record entry, challenges with historical data, and error-prone validation.

Design Process

We explored several options for improving data entry, including examining existing solutions in the market. Based on user feedback and initial testing with lo-fi wireframes, we shortlisted our options.

Five shortlisted options were evaluated across two decision points:

1. Excel upload — kept. Pros: users own the data and can edit it as needed. Cons: data validation via Excel is cumbersome.
2. Invoice capture — eliminated at decision point 1 due to budget constraints.
3. In-app entry — kept. Pros: more interactivity and easier error handling. Cons: updates within the app aren't saved until submission.
4. API integration — eliminated at decision point 2 due to minimal coverage.
5. Voice input — eliminated at decision point 1 due to budget constraints.

We opted for a combination of Excel Upload and In-app Data Entry to balance familiarity and real-time validation.
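To make the hybrid approach concrete, here is a minimal sketch of how an uploaded template export might be parsed before in-app validation. The CSV format and the column names ("Month", "Metric", "Value") are illustrative assumptions, not the product's actual schema:

```python
import csv
import io

# Hypothetical template columns, assumed for illustration only.
EXPECTED_COLUMNS = ["Month", "Metric", "Value"]

def parse_template(file_obj):
    """Parse an uploaded CSV export of the Excel template into row dicts,
    rejecting files whose header doesn't match the expected template."""
    reader = csv.DictReader(file_obj)
    if reader.fieldnames != EXPECTED_COLUMNS:
        raise ValueError(f"Unexpected columns: {reader.fieldnames}")
    return list(reader)

# Example upload: a two-row template filled out by a user.
sample = io.StringIO(
    "Month,Metric,Value\n"
    "2024-01,Electricity (kWh),1200\n"
    "2024-01,Water (m3),85\n"
)
rows = parse_template(sample)
```

Rejecting a mismatched header up front keeps the familiar Excel workflow while moving validation into the app, where errors can be surfaced interactively.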

Once the concept was clearer, we created a user story map to define the key tasks users needed to complete, prioritize features, and ensure we aligned with user goals. This allowed us to understand which parts of the workflow needed the most attention.

Screenshot of a user story map on Miro for bulk input.

With the team aligned on what was needed, I created a user flow to detail how users would upload and validate data. This ensured that users could correct errors and flag anomalies (such as unusual values) before finalizing their inputs.

Title: Bulk Input Records User Flow. Scenario: Simon wants to input his monthly environmental data. With about 30 records to enter, he uses the bulk input function. Below is a flow diagram made of boxes and arrows. The flow:

1. Simon goes to Data Management and opens the bulk input record page.
2. If he already has the template file, he fills it out with the month's data records; otherwise, he downloads the template first.
3. He uploads the file and clicks the Validate button so the system runs data validation.
4. If no errors are found, Simon submits the data record rows.
5. If errors are detected, Simon reviews the flagged row(s) along with a summary, then clicks Validate again, which returns him to step 3. This loop repeats until validation passes.
6. On submission, a toast notification confirms that the data submission was successful.
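The validate, review, and resubmit loop above could be sketched as follows. The record fields and the anomaly threshold are hypothetical, chosen only to illustrate the error and unusual-value checks described in the flow:

```python
from dataclasses import dataclass

@dataclass
class Record:
    site: str     # assumed field names, for illustration only
    metric: str
    value: float

def validate(records):
    """Return a list of (row_index, message) for rows that fail basic checks,
    mirroring the 'flagged rows plus summary' step in the user flow."""
    errors = []
    for i, rec in enumerate(records):
        if not rec.site:
            errors.append((i, "missing site"))
        if rec.value < 0:
            errors.append((i, "value cannot be negative"))
        elif rec.value > 1_000_000:
            # Anomaly flag: value is plausible but unusually large.
            errors.append((i, "unusually large value, please confirm"))
    return errors

def submit(records):
    """Placeholder for the real submission call after validation passes."""
    return f"Submitted {len(records)} records"
```

In the real feature, the user would fix flagged rows and revalidate until `validate` returns no errors, at which point `submit` runs and a success toast is shown.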

Usability Testing

After building a prototype of the bulk data input feature, we tested it with real users who frequently handle large datasets. We ran through scenarios where users had to upload and validate data, focusing on error handling and workflow efficiency.

After synthesizing our findings, we prioritized and assigned action items to address the key usability issues.

Three key usability problems and the action items taken to address them.

Feature Demonstration

This is a demonstration of the feature using a mock hotel and sample data.

Impact

The average time for monthly data entry was reduced from over an hour to just 10 minutes, by streamlining the process of uploading, validating, correcting, and submitting a template. Time savings increase with larger datasets.