Wayfinding

Project Focus

Ecommerce + Wayfinding + Critical paths + B2C

Helping people find more of what they want faster.

Generative research

Multivariate experiment

Data analysis

End-to-end design

Challenge

Understand how folks are navigating the site to complete top tasks. Then identify areas of opportunity for both small wins and big bets. Measure these new experiences via A/B testing against company-identified success metrics.

Outcomes

Impact analysis one month after launch, comparing adopters vs. non-adopters.

Nav engagements: +1.3M
Order Completed: +200%
Traffic to surfaces: +238%
Listings clicked: +280%
Product item clicked: +256%

Strategy

Stakeholder interviews

Critical Paths

Tree Testing

Data Analysis

Collect signals, or evidence, that illuminate the problem space from both a business and user perspective. Once collected, use these signals to paint a clear picture of the problem and recommend a path forward.

Tree Testing

Removing the interface to deeply understand how people make a decision based on categorization of content.

This was a first at Weedmaps. I had conducted these tests before, so I was able to share some learnings with the team and eventually provide enough rationale to get stakeholder buy-in. I mention this because it required a bit of patience and advocacy to adopt this tool, as with most things in research. Once that hurdle had been crossed, we were up and running.


A BRIEF HISTORY OF NAVIGATION AT WEEDMAPS

10+ years of evolving the product and navigation to support user needs.

Weedmaps has been around for 10+ years, and during that time it has created new surfaces (pages) to satisfy user needs. This meant we had very similar labels in our navigation that directed folks to the same place, just filtered differently. I'm not going to say redundancy is always a bad thing; many navigations have success with poly-parenting of items. But that's usually at the subcategory level, not at the top of the hierarchy, so I had my concerns.
COLLECTING A BASELINE
In my experience, writing unbiased, non-leading tasks is both the hardest part of this method and the easiest to get wrong. I leaned on SMEs across different functions within the org to help draft tasks for the test. Afterward, I pared down the list of tasks based on stakeholder feedback and prepared the test for launch.
I then released the first test to 50 participants. The sample size was larger than we'd use for a qualitative study since tree testing is a quantitative method, and the larger sample also helped ease the initial skepticism some folks had about the approach.

The similarity in labels caused hesitation and confusion.

The baseline test highlighted what we all kind of knew but had never been able to quantify. We now had a clearer sense of which labels were causing the most confusion and could move forward with a proposal.

Data Analysis

Armed with newfound insights from our users, I looked to the data to add another signal to the research. I wanted to understand how people navigate across platforms and surfaces.

The team at WM has a very sophisticated data stack, and I would only be scratching the surface of it. My goal was to go wide and look for anything that corroborated the confusion observed in the tree test.


PROJECTING TRAFFIC DISTRIBUTION

What happens if we direct more traffic to a particular destination?

The new proposed tree would be doing just this, but what would that actually mean for the business? I set out to understand the implications of redistributing traffic from one destination on the site to another. I partnered with the data team to check my logic and ended up using a simple formula:
projected nav clicks = current nav clicks + (% change in tree test × current nav clicks)
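
As a rough sketch of that projection (with made-up numbers, since the real figures are internal), the calculation looks like this:

```python
# Hypothetical figures for illustration only (not actual Weedmaps data).
current_nav_clicks = 100_000   # monthly nav clicks to a given destination today
tree_test_delta = 0.25         # +25% relative change for that label in the new tree test

# Projected clicks if live behavior shifts the way the tree test suggests
projected_nav_clicks = current_nav_clicks + (tree_test_delta * current_nav_clicks)
# equivalently: current_nav_clicks * (1 + tree_test_delta)
print(projected_nav_clicks)    # 125000.0
```

Running this per destination gave a rough picture of how traffic might redistribute under the new tree.
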
IDENTIFYING A BIG BET
This was a great conversation starter and a reminder of how valuable wayfinding is. I shared these insights in various meetings and channels until we aligned on a path forward: consolidating certain pages. This was the "big bet," and it would require several more months of planning before we could test.
MOVING ON A SMALL WIN
The consolidation effort wouldn't yield any returns for a while since folks had to align on a few business goals before making a decision. I was still eager to demonstrate how valuable navigation can be, so I proposed a smaller, quicker experiment. There was one data point that stuck out to me.

Desktop and mobile navigation engagement volumes were very similar.

This seemed very odd given that mobile sees about 90% of our traffic volume (9M-11M) and desktop accounts for only about 10%, yet both saw around 600K navigation engagements per month. This suggested to me that navigation is valuable when exposed and not hidden away in a hamburger menu.
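
A quick back-of-the-envelope comparison using those approximate figures shows how lopsided the per-platform engagement rates were; the numbers below are rounded estimates, not reported metrics:

```python
# Approximate monthly figures from the case study, rounded for illustration.
mobile_traffic = 10_000_000        # ~90% of traffic (midpoint of the 9M-11M range)
desktop_traffic = 1_100_000        # the remaining ~10%
nav_engagements = 600_000          # roughly the same count on both platforms

mobile_rate = nav_engagements / mobile_traffic      # ~6% of mobile traffic engages the nav
desktop_rate = nav_engagements / desktop_traffic    # ~55% of desktop traffic engages the nav
print(f"mobile ~{mobile_rate:.0%}, desktop ~{desktop_rate:.0%}")  # mobile ~6%, desktop ~55%
```

In other words, visitors engaged the always-visible desktop navigation at roughly ten times the rate of the mobile navigation tucked behind a hamburger menu.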

EXECUTION

By exposing the navigation on mobile web, we believe wayfinding will drastically improve and lead to more people finding what they want on Weedmaps. We will know this is true if we see a lift in Navigation CTR, Listing Page Views, Engagement Rates, and Promoted Engagements.

PROTOTYPE

A simple solution to a sophisticated problem.

MEGA MENUS, DROP-DOWNS, AND MORE
Several treatments were explored, from simple drop-downs to mega menus. Sales really liked the mega menu idea since it could also provide a new surface to monetize. Drop-down navigation had never been implemented at Weedmaps and could complement the newly proposed navigation tree.

EXPOSING THE NAVIGATION
What would happen if we placed critical navigation items from the hamburger menu in a horizontal row near the top of the site? This was attractive for a few reasons. First, it lowered the interaction cost compared to hiding items behind a menu. Second, it provided a clear signpost for people to use during their journey on the site. Lastly, we could move on it almost immediately.

EXPERIMENTATION

Quantifying the value of wayfinding with a hypothesis grounded in research and supported by data.

MEASURING SUCCESS
Navigation plays such a profound role in almost every journey people take on the site and app that we had to scope what we would measure. We decided to keep it high level and then conduct an impact analysis afterward.

EXPERIMENT RESULTS
Due to a hiccup in our data stack, we had to pull the test after only two days. We considered relaunching it, but our Data team determined we already had enough data to reach statistical significance.

The results were positive and supported the initial hypothesis. This gave us enough confidence to roll the feature out to 100% of our audience on web. Once ramped to 100%, we really started to observe how impactful this change was. The post-launch impact analysis is shared above in the "Outcomes" section.

Navigation CTR: +41%
Listing Page Views: +20%
Engagement Rate: +8%
Promoted Engagements: +7%
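
For a sense of why two days at this traffic volume could still reach significance, here is a minimal sketch of a two-proportion z-test on a CTR lift. The session and click counts are invented for illustration; this is not the Data team's actual analysis.

```python
from math import sqrt
from statistics import NormalDist

# Hypothetical counts for control vs. exposed-nav variant (illustrative only).
control_clicks, control_sessions = 30_000, 500_000
variant_clicks, variant_sessions = 42_300, 500_000

p1 = control_clicks / control_sessions
p2 = variant_clicks / variant_sessions
pooled = (control_clicks + variant_clicks) / (control_sessions + variant_sessions)

# Standard two-proportion z-test for a difference in click-through rates
se = sqrt(pooled * (1 - pooled) * (1 / control_sessions + 1 / variant_sessions))
z = (p2 - p1) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))

print(f"lift {(p2 - p1) / p1:+.0%}, z = {z:.1f}, p = {p_value:.2g}")
```

At this scale, even a short window produces samples where a lift of that magnitude is unambiguous.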

Final Thoughts

This is a glimpse into an ongoing effort to improve wayfinding for our users. It has inspired new ideas and harder conversations around all things navigation. Additionally, it has helped mature the design team's research efforts and provided us with a measurable outcome to speak to.