MindGym’s Digital Integration



Role

Product Designer

Team

Product Manager; Lead Product Designer; Product Designer; Content Designer; Illustrator; Solutions Designer (Behavioural Psychologist); User Researcher; Engineers (x3)

Duration

December 2021 - January 2022 (6 weeks)


INTRODUCTION

CREATING A NEW PRODUCT

At the beginning of December, I, along with a cross-functional team, began exploring how we could offer a digital touch point that integrated with and enhanced MindGym’s current offering. I took this greenfield product from the initial research phases, through multiple rounds of design iteration and testing, to release.

A quick bit of background: MindGym has existed since 2000. It operates by selling workplace learning packages, centred on behavioural change, to its clients. At the core of these packages are small-group sessions led by a MindGym coach.

RESEARCH

(Plotting our research findings on Miro)

As a team, we had previously not been in close touch with this side of MindGym, since our mission had been to create a self-directed learning platform for new managers. So, our starting point was to research as much as possible about how MindGym operated. We reached out to people across the company for interviews and links to documents, performed competitor reviews, and sat in on MindGym coaching sessions.

(Interview questions and insights)

(Insights from existing research)

(Coach session survey feedback)

A big finding from the data collected in feedback forms (filled out by participants after each coaching session) was that participants love the sessions and the coaches. We therefore knew the coaching sessions themselves were not something we should seek to replace; instead, we aimed to build up and enhance the experience surrounding them.

(MindGym’s document plan structure)


SERVICE MAP

In order to visualise MindGym’s current system, I put together a service map (seen below in the black boxes). This plotted the key steps for each of the players involved in MindGym’s system: MindGym’s client team, the client (buyer), MindGym’s coach, and the participants. It allowed me to understand the participants’ touch points from beginning to end. We then used this service map to list out the opportunities we saw to improve the process with a digital product.

(Opportunities plotted on top of MindGym’s delivery service map)


PROBLEMS

PARTICIPANTS

One of the biggest problems we identified was that we knew very little about our participants. We had no interaction with them outside of the session and therefore relied heavily on our clients to manage communication and to convey the needs of the group. We had no visibility of what information participants actually received before and after sessions, and, as a business, no way of tracking what actually happened within sessions.

Although participants had reported enjoying the sessions, they also reported that there was not enough time to delve deeper into areas of interest. Some had expectations of what they wanted out of sessions, and those expectations were not always met. Others were unsure what they’d get out of sessions, which led to low occupancy and poor engagement during the session itself.

MINDGYM COACH

Coaches had no visibility of the group’s needs or level of expertise ahead of sessions. This made it difficult to prepare and meant they had to improvise sessions.

CLIENT

Low occupancy, poor engagement and unmet expectations negatively impacted the ROI and general success of MindGym programmes. We had no way of tracking or quantifying our success, which made our product harder to sell to clients.

MINDGYM AS A BUSINESS

The lack of data meant we operated blind – we couldn’t effectively measure the success of the programmes or identify opportunities for growth and improvement if we didn’t know who our audience was and what did or didn’t work for them.


HMWs

The three problems we decided to tackle first were participant engagement, data collection and session enhancement. To address these pain points, the user researcher, the other product designer and I came up with ‘How Might We’s’. We then used the HMWs to list out ideas.

(Using the problem statements to plot ‘How might we’s’ and ideas)


USER JOURNEY MAPPING

(Quickly experimenting with user flows)

Using these ideas, I started to experiment with various user flows alongside my fellow designer. The steps we decided were essential to include were:

  • Send participants an email to inform and excite them about their upcoming session, along with account set-up information.

  • Give users an introduction to the product.

  • Introduce the coach (since this was found to be the best part of the experience for our users).

  • Gather information from our users to learn:

    • What motivates them

    • What they find challenging

    • How confident they are with a particular skill

  • Provide a dashboard to give users information about their session:

    • Details about when and where

    • How to attend

    • Further information surrounding the session (to build excitement and engagement)

We then plotted these steps against the user and business needs we had found in our research, to ensure we were covering all of the important issues:

(User journey mapping)


EVENT MODELLING

As we began to experiment with the various wireframes, we took time with the rest of the team to hold an event modelling session. I opened by presenting the flow back to the team to ensure everyone was aligned on the direction we were heading in. Our PM then took the lead, ensuring the devs and analytics plotted all the touch points needed along the flow.

(Event modelling on Miro)
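
To make the output of that session concrete, here is a minimal sketch (in TypeScript) of how the touch points along the flow might be typed and tracked. The event names and payload fields are my own illustrative assumptions, not the team’s actual tracking schema.

```ts
// Illustrative analytics events for the onboarding flow.
// Names and fields are assumptions for the sake of the sketch.
type FlowEvent =
  | { name: "invite_email_opened"; participantId: string }
  | { name: "account_created"; participantId: string }
  | { name: "coach_profile_viewed"; participantId: string; coachId: string }
  | { name: "question_answered"; participantId: string; question: 1 | 2 | 3 }
  | { name: "dashboard_viewed"; participantId: string };

// Each step of the flow emits one event, so analytics can see
// where participants drop off before their session.
function track(event: FlowEvent): void {
  // In production this would post to the analytics pipeline;
  // logging stands in for that here.
  console.log(JSON.stringify(event));
}

track({ name: "account_created", participantId: "p-001" });
```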


INCREASING THE FIDELITY

(Initial wireframes testing different flows)

Once we had agreed as a team on the flow, I worked with the other product designer to experiment with our wireframes a little further, steadily iterating on them based on feedback from the team and stakeholders across the company.

We knew we should keep the flow as short as possible, since we didn’t want to lose our users’ engagement and we wanted them to be as excited as possible going into the coaching session. That meant keeping the content and the number of pages to a minimum.

As we iterated and made design decisions, we were constantly feeding our designs back to the devs. They were building this new product by repurposing as much of our old project as possible, which meant certain decisions, such as using radio buttons without an input field, were made to ship the product to real users as quickly as possible.

(UI steadily getting cemented & the flow solidified)


UI DECISIONS

As the flow solidified, the other designer and I branched off and tackled the UI of different parts of the flow. We were working with a design system that included our colour, font and button guidelines, and we communicated with designers on other teams to align wherever possible.

HIGHLIGHTING CONTENT

(Experimenting with highlighting sections of content)

One of the aspects I looked at was highlighting certain sections of copy on the page. I experimented with different layouts, sizes, spacing, background colours and styles of illustration.

The bottom section of copy was there to set up what came next, so I also tested different font sizes and background colours for it, to see if they changed the way the page was viewed as a whole.

GIVING A QUESTION CONTEXT

(Experimenting with designs for question context)

During the onboarding flow we ask the users three questions. We thought, and it was confirmed in testing, that the questions by themselves were a little unclear. We therefore decided to add a context section to the questions in order to steer users in the right direction. I designed multiple options, eventually landing on a solution that added an emoji to the copy, along with a slightly darker background in order to set it back from the rest of the page.

(Experimenting with the flow of contextualising a question)

Another option I experimented with was priming users with the context section first, before having them click to reveal the question it was associated with. However, we decided not to take this idea forward, as it added extra clicks to the user’s journey; we had decided the progress bar at the top should represent one click per page.


CREATING A USER TIMELINE

(Timeline of communication with participants)

Whilst talking to the MindGym client team, we found out that there was no standardised timeline we worked to. Participants might be contacted months in advance of their MindGym sessions, or perhaps just weeks before. I therefore designed the emails and made a plan for a timeline to release them, based on whether or not users had signed up yet.

(Email campaign)
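
As a rough illustration of the send rules behind that timeline, here is a hedged TypeScript sketch; the template names and day thresholds are my own assumptions rather than the final plan.

```ts
// Which email a participant receives depends on whether they have signed
// up and how far away their session is. Thresholds are invented.
interface Participant {
  signedUp: boolean;
  daysUntilSession: number;
}

type EmailTemplate = "invite" | "sign_up_reminder" | "final_nudge" | "session_details";

function nextEmail(p: Participant): EmailTemplate | null {
  if (!p.signedUp) {
    if (p.daysUntilSession <= 2) return "final_nudge";      // last push to create an account
    if (p.daysUntilSession <= 7) return "sign_up_reminder"; // follow-up for non-responders
    return "invite";                                        // first contact, weeks or months out
  }
  // Signed-up participants only need logistics close to the day itself.
  return p.daysUntilSession <= 2 ? "session_details" : null;
}
```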


TESTING (Userlytics)

PLAN

Testing was led by our UX Researcher. She tested a prototype I put together with six users on Userlytics: all managers, aged 28-42, who had previously experienced workplace soft-skill training. She tested the prototype to gather feedback on the concept and to identify any usability issues, asking probing interview questions to explore user needs and having users answer survey questions in order to quantify aspects of the user experience.

The main aim was to evaluate the information we provided to users (e.g. the coach profile, the three questions, the dashboard), to further understand what people wanted and needed to know before attending a training session. The hope was that we would receive quick wins to action ahead of the upcoming internal testing; notice emerging themes to ideate on and explore in future rounds of testing; and gain insights to inform next steps.

FINDINGS

Loved that my motivation was on the dashboard at the end. So simple but feels so much more personalised.
— Test User

The general consensus was very positive. Participants felt the process was quick and easy. They responded positively to the coach, the motivational question and the data playback on the dashboard — which they felt created a “more personalised” experience. They understood the value of answering the questions, which was to get them in the right mindset and acknowledge the things they needed to work on ahead of the session.

Two participants were confused by the two CTAs on the email, querying which would be “better” to click on. The coach profile was popular, but users wanted more information; this was something we thought too, though we had hit a stumbling block in gathering the information from our coaches. One user wanted an option to ‘add their own’ challenge; however, this was something we had forgone for the time being, since our engineers had not built that functionality. Users reported that the onboarding experience looked longer than it ended up being (based on the progress bar) and suggested simplifying it into sections; we decided to wait for further trial feedback on this.


LIVE BUILD

Having tested the prototype, we felt confident going into our first round of internal testing. We made some content changes in the CMS and were ready to go.


QUALITY ASSURANCE

The build looked fantastic, but I went through and ‘inspected’ it to ensure it was built to the highest quality and that nothing had been lost in translation along the way. Only minor updates were needed, which I relayed to the engineering team.

(Example of build QA)


INTERNAL TRIAL TEST FINDINGS

Internal trials involved 56 MindGym employees from across the business, none of whom had any involvement in the making of the product.

Overall satisfaction rating: 4.5 out of 5 stars

  • Highest score: 5 (63%)

  • Lowest score: 3 (11%)

Level of excitement for the session after the flow: 73% average

  • Interested (46%)

  • Enthusiastic (38%)

  • Very enthusiastic (15%)

(N.B. 64% of participants knew they were not going to the actual session; their enthusiasm is likely to have been affected by this. Monitoring sentiment in client trials will give us a much more accurate picture.)

Overall, a really great experience. There is just enough content on each of the pages to easily digest and understand, and I think this is fantastic integration with ‘Workouts’. The data captured here will allow us to measure behavioural change for the first time for our Live sessions, which is very exciting!
— Test User

Feedback was generally very positive, with testers noting they could see value both for clients and for MindGym. Participants were particularly fond of the motivational questions and the playback on the dashboard, voted the two most useful features, followed closely by the coach profile, though they also noted a desire for more information about a coach’s experience.

Users had issues finding the product’s URL when trying to access the product again. They went back to their emails; however, the link in the email contained an expiring token (for security reasons). I therefore designed a new pathway for users to sign in through the token-expiration screen.
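
As a rough sketch of that pathway (the route names and the shape of the token check are assumptions, not the production code):

```ts
// If the tokenised email link has expired, route the user to a sign-in
// screen rather than a dead end.
interface TokenCheck {
  valid: boolean;
  email?: string;
}

function resolveLanding(token: TokenCheck): string {
  if (token.valid) return "/dashboard";
  // Expired token: send the user to sign in, pre-filling the email we
  // already know so returning users aren't stuck without a fresh link.
  const query = token.email ? `?email=${encodeURIComponent(token.email)}` : "";
  return `/sign-in${query}`;
}
```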

Users hesitated on the free-text field associated with the challenge question, struggling to think of a good response; adding some prompts and examples would perhaps help. A few users wanted to edit their responses to the questions on the dashboard; however, this functionality was deemed out of scope due to the way the system had been built.

Really neat journey, style, design, layout and speed to read.
— Test User

INTERACTIVE PROTOTYPE

Please have a read and click through the experience. I suggest expanding the viewer to full screen.

Please note, we designed everything at four breakpoints (1440, 1024, 768 and 360px), but since the data suggested most users would be on desktop, we used that breakpoint for our prototype.
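
For reference, those four breakpoints as they might sit in a shared constants file (the file itself is hypothetical; only the pixel values come from the project):

```ts
// Design breakpoints used across the four artboard widths.
const BREAKPOINTS = {
  desktop: 1440,
  laptop: 1024,
  tablet: 768,
  mobile: 360,
} as const;
```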


NEXT STEPS

I iterated on the designs based on the feedback we received whilst trialling, but, as a team, we decided to wait for our external testing (scheduled to start March 25th) before implementing any further changes. Our focus then shifted to the post-session experience. The questions being: What do we do with participants after their coaching session? How do we follow up and ensure their expectations have been met? How do we continue our participants’ learning journey and help them overcome their blockers?