How human-centred design supported digital transformation of the student experience


CQUniversity Australia had a wicked problem that it needed help solving.

The higher-education institution had identified that a fractured digital ecosystem was making it difficult for students to make sense of everything they needed to do to be prepared and stay organised, undermining their ability to engage with their studies.

What CQUniversity was looking for was help with digging into the details of the problem and designing a solution that would turn a fragmented experience into a seamless and personalised one.

What Liquid brings to the table is deep expertise in human-centred design that extends all the way through to delivery. The collaboration with CQU focused on reimagining the student portal, from the original goal of helping students get the information they need through to launching the new MyCQU digital student experience.

HUMAN-CENTRED DIGITAL TRANSFORMATION

Learn more about what it takes to create a seamless digital experience for university students by downloading our whitepaper: Human-Centred Digital Transformation.


Getting to know the users

The first step in human-centred design is, unsurprisingly, getting to know the humans at the centre. With CQU, we knew this would be a challenge. The university has nearly 30 campuses around Australia, offering everything from TAFE qualifications to Research Higher Degrees. Every program represents a different set of needs, priorities, and resources, and then there are the differences between students themselves.

With such a vast range of experiences, it was vital to speak to as large a cross-section of the student body as we could, building a rich portrait as efficiently as possible. We organised a three-day workshop bringing together dozens of people from a range of student representative bodies, as well as the student services frontline, who were experienced with many student journeys and could speak to a wide range of needs.

DON’T LET THE PERFECT BE A BARRIER TO THE GOOD

HCD has ideals, but it is fundamentally pragmatic. While the gold standard might be talking to users, frontline workers can make an excellent proxy, especially if they have daily contact with diverse users. You’re effectively getting to sample a large population, but with more richness than a survey and more scale than individual user interviews.
Also, we can often find other sources of data created by the target users. In this instance, we used a list of the top 300 questions that students asked support services to help corroborate the pain points we found in the workshop.

Each day of the workshop had a different theme. The first day was for team-building, alignment, and onboarding. The following two days were dedicated to learning about the student body from various angles.

First, we focused on the raw details: Who were they? Where did they come from? What did they need? Then we turned to process: What did students do month-by-month, year-by-year? What were the key phases and activities they went through? How could a student platform assist over time?

From these discussions, two themes emerged:

  • Students struggled with getting organised — knowing where to go, what to do, and when to do it to ensure they got the best start to their studies
  • Not every student cohort felt like they were represented equally in the current systems. For example, VET students were receiving orientation emails about HE studies, and were finding it much harder to sift through what, if anything, was relevant to them

We had an opportunity to use the human-centred design process to give these groups a more prominent voice so that the new platform worked for them, too.

Making sense and setting a direction

The discovery workshops made it clear that a successful solution would boost student confidence by helping them feel prepared, engaged, and supported.

This wasn’t a huge surprise; CQU had already formulated a set of design criteria based on their previous work. The latest workshops confirmed that existing work but also suggested two new criteria: guided and timely. These new criteria captured the need for the solution to proactively help students make sense of the university ecosystem and provide the right information at the right time.

Design criteria can often feel like common sense: after all, who doesn’t want their solution to be easy to use, relevant, and understandable? But there will always be trade-offs between what you want and what you’re able to deliver. An explicit set of criteria based on user needs and preferences can help guide those priorities.

As well as the design criteria, we created student journey maps to articulate all the process information we had collected, along with relevant pain points and opportunities, and began to write user stories to capture student needs. Over the course of the project, we wrote more than 100 user stories, but we started off with about a dozen fairly high-level stories, such as:

  • As a student, I want to be able to see all of the calendar information in one place
  • As a student, I want to be able to easily find what kind of support resources are available to me
  • As a student, I want to have an easy way to understand what I need to do to get ready for studying so that I feel prepared and confident in anticipation of my first day

As well as the high-level stories, we started to narrow in on some behavioural archetypes: “As a student who is planning on the go… ”, or “As a student who is planning ahead… ”.

Needs that only applied to specific cohorts (“As a VET student…”, “As an online student…” etc) would come along further down the track as we narrowed in on the student experience.

At every stage, we checked back in with our group of student representatives. This helped ensure that our insights were accurate, as well as prioritise our work and manage the process effectively.

Closing in on the solution

As we explored the problem space, we also brainstormed features for our solution that were aligned with the pain points and user stories we had gathered.

We sketched ideas for wayfinding, scheduling tools, better system integrations, how to serve content, notifications, and more. The first sketches were very rough — just enough detail for someone to have a naive understanding of what we intended. This meant that when we put these sketches in front of student focus groups, the students could tell us whether such a feature would be useful in general, but also what specifically would make it useful for them.

The focus groups were the first time we engaged directly with students. We hosted fortnightly one-hour sessions with four students at a time, each covering one to four features, with separate sessions for each program type: Higher Education (HE), Vocational Education and Training (VET), and Research.

We had to revise some of the assumptions we had formed from our work with the student representative groups. For example:

Assumption: Research students were underserved by current student support systems.
Reality: Research students don’t need many support systems, as they are used to being independent learners and managing their own studies. This meant we could focus our efforts on supporting VET and HE students.

Assumption: Wayfinding on campus was a key issue.
Reality: The onset of COVID-19 lockdowns made wayfinding a non-issue, and it was unclear whether students had found it a major difficulty even before the shutdowns. That meant we could put wayfinding features aside for the first iteration and focus on other features.

Assumption: Students would be most keen for smart, personalised calendars and timely notifications.
Reality: The feature that got overwhelmingly positive feedback was a simple-looking checklist that guided students through the steps they needed to get ready for study. This checklist became a priority for design and development.

THE MASS SHIFT TO DIGITAL

COVID-19 restrictions brought an unexpected boon to user testing for this project. The normalisation of digital tools, particularly video conferencing and real-time collaboration tools like Miro, meant it was possible to engage with students from all over the country more directly.

Diving into the details

Armed with a good idea of what to work on, it was time to dive into the gritty details of how to make our features functional.

To help students make sense of all the tasks they needed to do after enrolment, we saw an opportunity to pull information from across multiple systems into a single checklist. This needed to be personalised to every student and integrated so that it could accurately reflect the status of each task. If a student had already created a student ID card, we didn’t want to show that as a task still to be done, causing them to lose faith in the accuracy or relevance of the checklist.

We worked with business analysts to figure out all the steps students needed to complete to be ready for their first day of study, and how those steps varied between populations and situations: VET students, HE students, different programs, different campuses, online, on-site, and so on.
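As a rough illustration, the aggregation and personalisation logic described above can be sketched as follows. This is a minimal sketch under stated assumptions, not CQU’s actual implementation: the source adapters, task names, and student attributes are all hypothetical, and a real build would call each university system’s API.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Callable, List

class TaskStatus(Enum):
    TODO = "to do"
    DONE = "done"

@dataclass
class Task:
    title: str
    status: TaskStatus

# Hypothetical source-system adapters; in practice each would query a
# real system (enrolment, ID cards, orientation, and so on).
def id_card_tasks(student: dict) -> List[Task]:
    status = TaskStatus.DONE if student.get("has_id_card") else TaskStatus.TODO
    return [Task("Get your student ID card", status)]

def cohort_tasks(student: dict) -> List[Task]:
    # Illustrative cohort rule: on-campus students get an extra task.
    tasks = [Task("Activate your student account", TaskStatus.TODO)]
    if student.get("mode") == "on-campus":
        tasks.append(Task("Find your campus map", TaskStatus.TODO))
    return tasks

def build_checklist(student: dict, sources: List[Callable]) -> List[Task]:
    """Aggregate tasks from every source system, then hide anything the
    student has already completed so the list stays trustworthy."""
    tasks = [t for src in sources for t in src(student)]
    return [t for t in tasks if t.status is not TaskStatus.DONE]

# A student who already has an ID card never sees that task again.
student = {"has_id_card": True, "mode": "online"}
checklist = build_checklist(student, [id_card_tasks, cohort_tasks])
```

The key design point is the final filter: completed tasks are dropped rather than rendered, so the checklist never contradicts what the student has actually done.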

The checklist was a complicated piece of development, requiring high levels of personalisation and detail. From a design perspective, we needed to narrow in. We continued to pursue unknowns, such as:

  • What tasks would students prefer to do on mobile vs desktop devices?
  • How should due dates be expressed?
  • How long should completed checklist items stay on the Today screen?

Over the next few months, our rough sketches evolved into more complex wireframes, and eventually clickable prototypes, testing more and more of the actual functionality and usability of each feature.

Another design challenge for the checklist was managing the differences in digital maturity between the systems used by students from different programs. HE and VET students interacted with highly mature digital systems, which meant we could automate the entire administrative process for them. But this was not true for research students at the time of development. Where research students needed to submit paper forms or wait for manual validation, we had to focus on making the experience clearer and more transparent (for example, by marking tasks as ‘in progress’).
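One way to picture that difference: a task’s visible status can be derived from whichever backend signals each process exposes. The function below is a hypothetical sketch of that mapping, not the actual MyCQU logic.

```python
from enum import Enum

class TaskStatus(Enum):
    TODO = "to do"
    IN_PROGRESS = "in progress"
    DONE = "done"

def checklist_status(submitted: bool, validated: bool) -> TaskStatus:
    """Map a task's backend state to what the student sees.

    Fully integrated (HE/VET) processes flip straight from 'to do' to
    'done' because completion is detected automatically. For manual
    research processes, a paper form that has been submitted but not yet
    validated shows as 'in progress' rather than appearing stuck, keeping
    the checklist honest about where things stand."""
    if validated:
        return TaskStatus.DONE
    if submitted:
        return TaskStatus.IN_PROGRESS
    return TaskStatus.TODO
```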

It wasn’t ideal; after all, one of the desires that came out of the workshops was to bring the student experience across all programs in line with that of HE. But it was an achievable start.

The human-centred design process was able to highlight the improvements needed so they could be added to CQU’s broader organisational roadmap.

 

ANALYSING THE WHOLE PICTURE

Human-centred design processes produce all kinds of data — quantitative, qualitative, structured, unstructured — and it’s important to think about what is being collected, and how to interpret it. No single source can give the One True Answer.
For example, looking only at web analytics data, the checklist appears to be barely used. With context, this makes sense; it only appears for the short window of time in which it is relevant. But was it worth the huge time investment in design and development for something that’s barely even seen? Interviews with students since launch suggest yes. The checklist is often brought up as the most important feature, and students will even say they can’t imagine the platform without it.

Launch is not the end

The MyCQU Digital Student Experience launched in October 2020. But that was not the end for us. With students using the platform for real, we’ve had new opportunities to learn and improve.

As new students start at CQU, we gain insights from people with no experience of the old CQU student portal. This is not only exciting from the standpoint of getting to help a new cohort of students confidently prepare for their studies, it also gives us a way to tease out the platform’s merits and opportunities for improvement that aren’t anchored in students’ experiences of the old platform.

For example, a question we were unable to answer in a satisfying way with second- and third-year students was what to do with quick links. From the start, CQU wanted to keep away from the connotations of a ‘portal’: somewhere you go just to click through to somewhere else. We agreed. We needed to surface the value of the MyCQU platform as a guided, sense-making tool, not just a fancy set of bookmarks. But students who have been studying for two or three years already know their way around the university systems, and a lot of the time they just want to get to those systems quickly.

If we only listened to these students and put the quick links up front, we would run the risk of constraining how new students used the platform and burying its true value. But removing the links completely would frustrate those more experienced students.

By working with the new cohort of students, we hope to find a solution that works for both groups and fulfils the organisational goal to create a rich digital student experience. It is a journey we are excited to continue with CQU.

Closing thoughts

Discovery activities and synthesis can be costly, and because the outputs are usually complex, it can be difficult to articulate their value until further down the track, when you can look back and see how far you've come. Planning can feel loose and hand-wavy because of the need to remain open, flexible, and adaptive to new information.

There’s an argument that using a human-centred design approach is more efficient than heavily planned-in-advance approaches. You could even argue that the very things that make the human-centred design approach feel overwhelming are what make it efficient.

Spending quality time and resources on discovery means greater efficiency when it comes to drafting and building features that work for users, and regular check-ins and course corrections mean that drastic changes in priorities or goals are unlikely, saving time and resources in the long run.

To counter the sense of overwhelm and uncertainty that comes with discovery activities and loose planning, we find it important to take note of milestones and other markers of progress.

We are aware of our progression from broad knowledge and understanding to putting together the highly specific, gritty details that form the actual solution. Common outputs — like the creation of user journeys, user stories, and wireframes — help define our understanding, show progress, and reveal where the next milestone might be.

We are hunters, using the tools and strategies encompassed by human-centred design philosophy to find the solutions that make people’s lives better.

If you want to know more about this collaboration or what it takes to create a successful digital experience for students, download our whitepaper.