Laura Zuk


Improving web and mobile native features, leading to a 65% increase in user activity

Background

Timeline: 9 months (2023-2024)
Team: Niley Barros (UX/UI Design), Christine St. Pierre (Product Owner), BJ Collins (Front-end Developer), Rory Gibson (Back-end Developer)
My role: Senior Product Designer & Researcher
Tools: Figma, FigJam, Testmo, Pendo

While working on Community Co (CCO)’s Product team, we had been refining and expanding our user analytics systems. On the product team, we mainly used Pendo and worked with the Business Intelligence team for user outreach and additional data. Through the two teams’ collaboration, we found a clear opportunity for improvement that, we believed, would significantly increase user retention: improving Expert Panels.

Now, what are Expert Panels? One of the main draws of our product, Expert Panels, or EPs for short, are collaborative articles: a question is posted on our platform, and users can submit answers to it. From there, they have a chance to be selected if an editor decides that their answer is a good fit for the question and article topic. It’s an easy and relatively low-effort way for users to get published with our clients such as Forbes, Rolling Stone, or Fast Company.

We began this project in December of 2023, with an estimated delivery to users in June 2024. It would be a complete overhaul of the most trafficked area of the product, so we needed to do our due diligence and make sure this would be something that our users would love and easily understand. With two product designers on the project, Niley Barros and myself, we got to work.

Planning

Our analytics showed that most users, around 70%, contributed to EPs on the day new questions were launched. Our users were excited about them, but the current user flow left much to be desired.

A simple but dated flow, it started by dropping the user on the Publishing page, where they needed to identify and click into the EPs section. From there, they could select a question to answer. However, there was no additional information about the questions: no date posted, no indication of how long a question had been open, and no warning that it was about to close and users would be unable to submit their input. Once the user clicked into a question, they could view other users’ responses and answer themselves, but as soon as they started answering, the other responses would disappear - our data found that many users dropped off at this point, as they wanted to read their peers’ input.

Once their answer was submitted, the question would disappear completely, responses and all. This was a major pain point: many users reached out and requested a way to see their peers’ responses after their own submission, and some waited until what they thought was the last day to submit because of this. Users were under the impression that questions operated on a monthly release and close cycle, but questions often closed when they reached a certain number of responses, too.

Lastly, the submission process was vague, with no indication of what users should expect after submission or whether their submission had been accepted. Because the opportunity to get published by one of our clients was so coveted, users dealt with this frustrating flow anyway.

Research

We started with the analytics data that kicked off this whole project. What did users like? What did they dislike? What were we missing? A lot of these answers were admittedly very obvious, and users were happy to tell us where we could improve.

We began with our own review of the current flow, which had been designed before either of us joined the company, and then dug into user testimonies to see their thoughts. For the most part, the users agreed with our identified pain points and even gave us some ideas on where to improve further. They suggested adding clarity to the process, letting them see the questions they had already answered, and giving them access to their peers’ answers, since users wanted to interact with and support the answers they liked.

In addition to the users themselves, we did a lot of external research on similar platforms and how their user flows worked. This project started right as LinkedIn’s collaborative articles were released to the public, so we of course looked into their designs. Additionally, sites like Quora, Reddit, and others that dealt with question-and-answer or publishing flows were added to our giant research board. We went through all the flows with a fine-toothed comb, inspecting what worked, what didn't, and what was missing from our own flows. Once we felt we had enough to get started, we reviewed our findings with our stakeholders, got approval, and then jumped into the wireframing process.

Wireframes

After all the research, we had a much clearer idea of what we should do. We got to work, combining the information we had researched with our own brand styles and converting it into low-fidelity but understandable visuals. The wireframing process involved tons of brainstorming sessions, messy scribbles to convey ideas, and dozens of reviews with our stakeholders, product owner, and the development team. We especially wanted the developers’ feedback, as they would let us know how much time and effort it would take to build out our designs: this version was going to be much more intensive than the previous one, so their input was vital in keeping us on track for our launch date. After about two months of reviews and refining our work, we got the go-ahead to start on our prototypes, and the engineers began to set up for the backend changes.

Prototypes

The EPs were about to have a huge facelift. First, we separated the articles section of the site from EPs. Articles were another huge benefit for users, and we wanted both parts of the product to have their place to shine. We compiled them under an expandable section of our main navigation - the left nav, as we call it - now called Publishing. The articles page had all references to EPs removed, as they were now living in their own space. To catch the eye of users who wanted to answer EPs as soon as they were launched, we included new indicators in the navigation as well as a banner on the homepage showcasing the new questions that were available to answer.

The new EPs page now has three distinct sections: Open, Closed, and About. The Open section houses the EP questions users can answer, while the Closed section is an archive of questions that are either under editorial review or have already been published. We hoped this clear distinction would increase clarity for users, as well as give them a way to see those highly coveted questions their peers had answered, regardless of whether a question was still available to answer.

Both sections house a new tagging system, where editors can add descriptive tags to help users quickly scan for questions they might be interested in answering. We also flagged to the engineers our goal of making these tags available for sorting and filtering in future versions, since the product didn’t have those capabilities yet. On top of the descriptive tags, we added several status tags to the Open section, plus an “Answered” tag in both Closed and Open so users could quickly identify what they had already answered. Other tags included “New”, “Closing soon”, and a few that indicated the number of answers, to encourage users to answer questions that would give them a higher chance of being published - something we found had a direct correlation with user retention.
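To make the tag behavior concrete, here is a minimal sketch of how status tags like these might be derived on the front end. The data shape, field names, and thresholds below are my own illustrative assumptions, not CCO’s actual implementation.

```typescript
// Hypothetical data shape and tag rules for illustration only:
// the type, field names, and thresholds are assumptions, not CCO's actual implementation.
type EPStatus = "open" | "closed";

interface EPQuestion {
  id: string;
  title: string;
  topicTags: string[];       // editor-added descriptive tags
  status: EPStatus;
  publishedAt: Date;
  closesAt: Date;
  answerCount: number;
  answeredByMe: boolean;
}

const DAY_MS = 24 * 60 * 60 * 1000;
const NEW_WINDOW_DAYS = 3;           // assumed: how long a question counts as “New”
const CLOSING_SOON_DAYS = 2;         // assumed: warning window before the close date
const LOW_COMPETITION_ANSWERS = 25;  // assumed: below this, odds of being published are better

function statusTags(q: EPQuestion, now: Date = new Date()): string[] {
  const tags: string[] = [];

  // “Answered” appears in both the Open and Closed sections.
  if (q.answeredByMe) tags.push("Answered");

  if (q.status === "open") {
    if (now.getTime() - q.publishedAt.getTime() < NEW_WINDOW_DAYS * DAY_MS) {
      tags.push("New");
    }
    if (q.closesAt.getTime() - now.getTime() < CLOSING_SOON_DAYS * DAY_MS) {
      tags.push("Closing soon");
    }
    if (q.answerCount < LOW_COMPETITION_ANSWERS) {
      tags.push("Few answers"); // nudges users toward questions with better publishing odds
    }
  }

  return tags;
}
```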

When users clicked on a question, a drawer would open up, letting them view the full question, its tags and publish date, the option to submit or view their own answer, and previous replies from other users. Open questions allowed users to enter their own submission, whereas Closed questions had this feature disabled. When time was up, the question would close and our team of editors would pick the best responses.
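As a rough sketch of the drawer’s submit gating under the same assumptions, reusing the hypothetical EPQuestion shape from above; the response cap is also an assumed value, reflecting that questions could close once they received enough answers.

```typescript
// Minimal sketch of the submit gating, reusing the hypothetical EPQuestion
// shape from the earlier example. The response cap and field names are
// illustrative assumptions, not CCO's actual rules.
interface DrawerState {
  question: EPQuestion;
  myAnswer?: string;     // present once the user has submitted their own answer
  peerReplies: string[]; // visible for both Open and Closed questions
}

const RESPONSE_CAP = 500; // assumed: questions may also close once enough answers come in

function isEffectivelyClosed(q: EPQuestion, now: Date = new Date()): boolean {
  return (
    q.status === "closed" ||
    now.getTime() >= q.closesAt.getTime() ||
    q.answerCount >= RESPONSE_CAP
  );
}

function canSubmit(state: DrawerState, now: Date = new Date()): boolean {
  // Users submit once per question, and only while it remains open.
  return !isEffectivelyClosed(state.question, now) && state.myAnswer === undefined;
}
```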

If a user's response was published, they would receive a notification that their submission was live with their respective community. Due to concerns that new users would not know what Expert Panels are, I included a brief intro text on both the Open and Closed tabs with a link to the About page. The About page included an introduction to what EPs are, why they are a huge benefit to users, and how users can contribute to them, as well as disclaimers that not everyone will be selected for each one (as many received hundreds of responses every month). Additionally, a link to our help center sat at the bottom for users who might have additional questions about the site.

Conclusion

Overall, the new updates wowed users. According to the platform analytics, we saw a 65% increase in EP activity in the weeks after launch, with 55% of users submitting answers to questions in the first week: a massive turnout. We also received a lot of feedback from users about how eye-catching and easy to use the new experience was, and how significantly it lowered the barrier to entry for both responding to and viewing EP questions and responses.

This project was not without challenges, however. It needed the involvement of multiple teams, as it touched publishing, editing, member outreach, and of course the product team’s development and design. There was a lot of back and forth to accommodate busy schedules, and a lot of “wait, I forgot this!” moments from members of teams who weren't always involved in product development. Because of this, we had a lot of eleventh-hour edits, to the frustration of both the developers and those of us on the design team.

To remedy this, we created several small workshops for stakeholders and other teams to better understand our processes and workflow, to learn how best to reach out to us, and to educate us in turn on their own schedules and workflows. This helped foster a more inclusive environment at CCO and encouraged teams to simply reach out if they had questions, since the environment before had teams mostly keeping to themselves and only communicating through managers. With a more collaborative environment established, we moved on to our next projects!