Turns out, it wasn’t a “Tech” problem

Aishvarya
5 min read · Jun 4, 2021


I worked on a large government project where the requirement was to design a procurement platform that would completely digitize the buying and selling of goods within state government offices. The scope of work included the design and development of the platform, and crafting a marketing communications strategy to increase traffic to the site.

Naturally, the first step was understanding the demographics of our users.

Once we had collected some basic information about our users, we were ready to run a card sorting exercise with thirty participants who were potential users of the platform. We then created a wireframe keeping in mind three essential qualities we had identified as important to this group of users: simplicity, usability, and accessibility.

[Image: card sorting exercise]

The next stage was designing low-fi prototypes of the screens. These were then tested internally, keeping in mind the demographics of our target users. We got the screens signed off and finally proceeded to the development stage.

The challenge: The platform went live. A couple of months down the line, despite the government emphasizing the importance of going digital with the procurement of goods, only 21% of users were buying goods through the platform. Around 500 sellers were registered on the platform, but many of them started withdrawing since their goods were not selling online. The larger plan was to increase the number of sellers by at least 40% in the next quarter, which was proving to be challenging.

The client contacted us, complaining that the website was not “working properly” and that users did not find it “usable”. Isn’t that usually the case? When something doesn’t work, feel free to blame the technology. Just like everyone else, we too believed it could be a technical glitch or a massive problem with the interface. So we conducted another round of research and testing:

  1. We evaluated the interface against Nielsen’s 10 usability heuristics. We ticked all the boxes.
  2. We conducted a task analysis with 10 participants and measured task completion rates and time on task.

When no obvious problems emerged from either exercise, we realized something important: maybe the platform was not the problem, but rather the way users were using it. We sought approval to conduct contextual inquiry on-site, and once it finally came through, we set out to explore the behaviors and attitudes of our users.

The first week or so on-site was spent observing our users in their natural habitat:

  1. How they interacted with their computers;
  2. How often they used a mouse and how often they used their keyboards;
  3. How comfortable they were navigating through systems;
  4. How much time they spent each day working on their computers; and
  5. What other devices they used day-to-day at work.

Below are the observations (O) and assumptions (A) we came away with.

  1. O: They used a wired mouse predominantly. A: They did not know keyboard shortcuts.
  2. O: In addition to a computer, they used ledgers and books to record all transactions and purchases; they often reached for a book and pen before the computer. A: They trusted writing things down more than typing them on their computer.
  3. O: They spent around 3.5 hours a day, on average, using computers. A: They were not relying heavily on computers for their everyday tasks.
  4. O: Most of the older users often asked the younger users for technical support. A: Older users were not comfortable using computers.
  5. O: They used only basic features on smartphones, such as messaging and making and receiving calls. A: They were not very tech-savvy, and were not too keen to learn either.

We then mapped the user journey from the time they arrived at work to the time they left for the day. We specifically looked at how they were using the procurement platform.

[Image: user journey map]

We also started looking closely at the professional relationships between people in the office, and how collaboratively or independently they worked. Based on some of our observations and findings, we organized one-on-one interviews with a few employees and had meaningful conversations with them.

We identified two main challenges that were potentially contributing to the low traffic on the website.

  1. We understood that before the digital procurement platform was introduced, all buying and selling took place via middlemen. The relationships the employees had formed with these middlemen were long-lasting and built on trust, and they could not be easily replaced by technology.
  2. Additionally, the users were intimidated by the technology itself. There was a fear of, “What if I make a wrong purchase? What if my payment goes through twice?”

Addressing the trust factor: It was unfair to ask users to shift their trust from middlemen to a digitally driven platform. Instead, we decided to overshadow the trust factor with something exciting: discounts. We proposed a marketing plan where sellers would offer discounts on products during months considered “auspicious” for buying goods. We also began collecting video testimonials from existing users and circulating them across offices. Additionally, we organized workshops where existing users would talk about their buying experiences and the percentage decrease in their purchasing costs.

Addressing fears associated with digital buying: We created a simple manual in video, audio and text-based formats that employees could go through before they started using the platform. They were also given cards that walked them through the procurement process step by step, from selecting a product to making the final payment.

Both of these measures significantly increased interest in the platform, and confidence grew as people began using it. The first quarter after these measures were introduced saw a whopping 65% increase in traffic. Everyone was happy in the end :)

Due to the nature of this project, I have been unable to share some of the artefacts we developed along the way.

I also wanted to share my favorite and least favorite moments from this journey. Favorite: when we received approval to conduct contextual research in no time! I guess we were in the right place at the right time. Least favorite: sitting in a city miles away from the users, conducting research remotely. That changed quickly, of course.

Thanks for reading!


Written by Aishvarya

Experience Researcher at Commonwealth Bank of Australia | Views are my own | Strategist, Content Creator | Food, wine and everything nice
