Applying AI in sustainability reporting

In this case study, I share my experience working as a founding product designer developing sustainability reporting software for an ESG start-up.

The goal was to create a web app that automates answering sustainability questionnaires and ESG ratings for corporate businesses. Users struggled to answer these questionnaires, especially the qualitative questions. With the emergence of generative AI, we developed a solution that generates human-like text responses tailored specifically to sustainability teams. Throughout the process, I focused on creating an intuitive user interface that would enhance the overall user experience and encourage engagement with the app.

Client
Codio Impact
Year
2023-2024
Role
Product Design, PM, UX Research, Strategy

Problem

Many users had difficulties answering the questions

The introduction of the European Sustainability Reporting Standards (ESRS) gave our customers a great deal of freedom in how sustainability data is collected and how qualitative questions are answered. Because the ESRS are new, there are no established best practices. Answering qualitative questions is difficult and time-consuming without prior knowledge or motivation (intrinsic and extrinsic), as these questions require a broader understanding of sustainability in general, in the market and in the corporate context.

Discover

Understanding Users' Needs

I interviewed five users from our current customer base. They consistently reported problems understanding and completing qualitative questions. Many also mentioned that they didn't know help material was available for answering the questions, or that they didn't have the time or motivation to read it. This confirms the paradox of the active user: users do not read manuals but start using the software immediately.

Clarifying the Core Problem

Difficulties with open questions left users uncertain and without direction when answering them.

Help material goes unread, and additional information does not help users complete tasks effectively.

Questions are hard to understand, and users don't know what to enter.

Users guess, entering very short answers or filler words.

Project image

From our ideation workshop during our yearly team offsite

Iterate

Ideation Workshop

During a team offsite, I moderated an ideation workshop. After an icebreaker exercise, I shared excerpts from our user interviews, which we summarised as a group. We then brainstormed solutions, presented them, and prioritised all the ideas together on an impact vs. effort matrix. From the prioritisation, we created an action plan with three buckets: (1) Do Now, (2) Research Now, (3) Backlog.

Wireframing and Flow Mapping

Based on our ideation workshop, I created the first wireframes. Together with the CTO and the developers, I developed a user flow that was both user-friendly and technically feasible. We also agreed to integrate a feedback system for the AI suggestions and to experiment with uploaded documents as an additional data source.

Project image

User flow map for our solution

Solution

The solution offers AI-generated answer suggestions with customisation options, a transparent presentation of how each answer is created, a feedback system to improve AI quality, and the option to upload documents as an additional data source.
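To make the moving parts concrete, here is a minimal sketch of how the suggestion and feedback flow could be wired on the front end. This is an illustration only, not the production implementation: the endpoint paths, data shapes, and TypeScript types are assumptions made for the sketch.

    // Minimal sketch of the suggestion + feedback flow (assumed shapes and endpoints).

    interface SuggestionRequest {
      questionId: string;          // qualitative question being answered
      questionText: string;
      companyContext: string;      // e.g. industry, size, previous answers
      documentExcerpts: string[];  // relevant passages from uploaded documents
    }

    interface Suggestion {
      questionId: string;
      draftAnswer: string;         // AI-generated draft the user can edit
      sources: string[];           // excerpts shown to keep the creation process transparent
    }

    type FeedbackRating = "accepted" | "edited" | "rejected";

    // Request a draft answer from a (hypothetical) backend endpoint.
    async function requestSuggestion(req: SuggestionRequest): Promise<Suggestion> {
      const res = await fetch("/api/suggestions", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(req),
      });
      if (!res.ok) throw new Error(`Suggestion request failed: ${res.status}`);
      return (await res.json()) as Suggestion;
    }

    // Record how the user handled the suggestion so AI quality can be measured and improved.
    async function sendFeedback(
      questionId: string,
      rating: FeedbackRating,
      finalAnswer: string
    ): Promise<void> {
      await fetch("/api/suggestions/feedback", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ questionId, rating, finalAnswer }),
      });
    }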

Outcome

Challenges

Realisation proved difficult and required a lot of experimentation.

Positive Feedback

Initial internal feedback was positive, and usability and concept tests with customers were successful.

Next Steps

Implement the improvements from the usability tests and continue iterating.