Case Study: Lowe’s Visual Search, Mobile Web
My role: UX Lead and Designer
When I designed it, Lowe’s Visual Search was part of a three-part search bar — text search, image search, and barcode search. We had also started preparing a fourth element — voice search.
In addition to text-to-search, we wanted to offer customers the option of searching with an image (either one they take with their camera or one they upload from their device) or by scanning a barcode, like they could on the Lowe’s app.
Lowe’s apps launched Visual Search a year before the mobile web version, so there was already a lot of data I could dig into to understand what problems customers had and how they wanted to solve them using Visual Search.
- What ratio of Lowe’s customers uses apps vs. mobile web in general? What relevant findings do we have on Lowe’s Visual Search usage from the apps team?
- What are people currently using Lowe’s Visual Search for? Where are they using it? (In-store, at home, on the go, etc.?) How accurate are our results?
- How many people scan barcodes and search by taking photos vs. uploading photos from their machines?
- What technical issues and constraints does our technology have, and what unique limitations will we have for mobile web that we don’t have for apps?
- How are Lowe’s DIY customers using Visual Search vs. “Lowe’s Pros” (contractors and other professionals)?
- What categories, if any, are people using Visual Search for the most? (Where do they find it most valuable)
- What is the Product team’s long-term vision for Visual search?
- What are business expectations around the feature and timeline?
Barcode search
At the time of this design cycle, the power users for barcode scanning were:
- Lowe’s employees — in-store
- Lowe’s customers — in-store
- PRO users — contractors and other professionals. They used barcode search largely when reordering their go-to items.
The work
I designed Lowe’s Visual Search for mobile and tablet web from 0–1 while leading research and content teams through validation and usability testing.
I also worked daily with a fast-paced engineering team through technical constraints, and with Product to prioritize critical UX elements against tight engineering bandwidth and deadlines.
- Researchers on my team: 3
- Content strategists / UX writers: 3
- Product Managers and Leaders: 5
- Engineers: 12+; I worked directly with 4
Impact of work
+2X conversions compared to apps
The search bar — entry to Visual Search
Entry to Visual search can start anywhere on Lowes.com from the masthead and search bar, as pictured above. Access can also happen via Google searches or marketing e-mails.
Understanding customers’ existing awareness of visual search for retail products played a crucial role in testing versions of the barcode and image icons: inside or outside the search bar, and when they’re visible.
Lowes.com had been working hard to drive the tone of our copy across the site toward a more conversational one. I’d been advocating heavily with my teams for copy as an essential UX tool: in-store-like, conversational language that simulates a more human shopping experience and stays on brand for Lowe’s.
That said, the search bar text above was an example where conversational text only complicated things (especially on mobile viewports). So we landed on consolidating the search text to “Search” on mobile and tablet and keeping “What are you looking for today?” on laptop and desktop views.
(Later, I designed and tested a new unified icon, as did the app team. More on that in a bit.)
When customers click the “scan barcode” icon inside the search bar, they’re taken directly to the scan barcode experience.
They can then move between image search and barcode search as needed. The scan happens automatically as soon as we recognize a barcode.
If customers know in advance that a barcode won’t be scannable (maybe it’s partially ripped off or colored on), they can enter the code manually.
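For readers curious how automatic detection can work on mobile web: here is a minimal sketch, not Lowe’s production code, using the browser’s Shape Detection API (BarcodeDetector), which is only available in some Chromium-based browsers. The function and callback names are hypothetical.

```typescript
// BarcodeDetector is not yet in TypeScript's default DOM typings, so declare a minimal shape.
declare class BarcodeDetector {
  constructor(options?: { formats: string[] });
  detect(source: CanvasImageSource): Promise<Array<{ rawValue: string; format: string }>>;
}

// Minimal sketch: automatically detect a UPC from the live camera preview.
// `onBarcodeFound` is a hypothetical callback (e.g. navigate to the product page).
async function watchForBarcode(
  video: HTMLVideoElement,
  onBarcodeFound: (code: string) => void
): Promise<void> {
  const detector = new BarcodeDetector({ formats: ['upc_a', 'upc_e', 'ean_13'] });

  const poll = async () => {
    const barcodes = await detector.detect(video); // inspect the current video frame
    if (barcodes.length > 0) {
      onBarcodeFound(barcodes[0].rawValue); // first hit wins; stop polling
      return;
    }
    requestAnimationFrame(poll); // keep scanning the preview until we find one
  };
  requestAnimationFrame(poll);
}
```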
Setting customers up for success when our tech fails:
1. What happens when a customer tries to scan a barcode, but our tech can’t pick it up?
Messaging was crucial here. Customers get frustrated when things don’t work. To mitigate frustration, we offered context and transparency about why things weren’t working, logical next steps, and easy ways out/back.
- We suggest better lighting or moving closer to the barcode, then a “Try again” CTA.
- We also offer customers the option to enter the code manually instead of scanning it.
2. What happens when we can scan a barcode and recognize it’s a universal UPC — but one that we don’t carry in Lowe’s product catalog?
Version A was a “quick fix” that we implemented in the meantime.
At the same time, I continuously advocated for the engineering bandwidth to integrate an easily accessible search bar, as shown above in version B.
🚨 During usability testing, Version A was the point at which 90% of customers said they were most likely to leave Lowes.com and search for their item at a competitor’s site. Version B offers a more conversational and transparent way of giving customers easy access to text-to-search so they can look for a similar item.
The integrated search bar tested very well during a follow-up usability test — almost 100% of customers tried to use it as a next step to search for an alternate product. (The engineering team was working on integrating the search bar inside this messaging at the time of this writing.)
Image search — security
Security was one of the first things I considered on entry to image search.
It was very important to me to get this first step right because people are getting increasingly wary of allowing access to their devices. We needed a customized message that offered customers transparency about why we needed their permissions for visual search.
While our Product Manager checked the legal requirements of showing a permissions screen, I wanted to understand how important it is for customers.
Of 30 people tested (during a larger general usability test for a complete first version of Visual Search), 21 found the camera permissions request completely trustworthy and expected it before using the feature for the first time. The other nine still found the request somewhat trustworthy but overall had some apprehension about giving a new app or website access to their phone and personal information.
🗣️ “This is something I would expect to see if I’ve never used my camera on the particular site before, for privacy reasons. I guess I’m glad to see it. I have no problems granting access in this case because I understand I need my camera to be able to search.”
🗣️ “It’s what we generally expect using anything like Google Lens or taking a picture to do something like that. It always asks permission to use cameras when we’re doing that. So, I think it’s totally trustworthy.”
(The center-aligned text is in our style guide for these messages. Left-aligned would have been a better choice for readability.)
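For context on what sits underneath our custom permissions message on mobile web, here is a minimal sketch (an assumption, not our production code) of requesting camera access and handling a denial. showPermissionExplainer and showDeniedMessage are hypothetical UI helpers.

```typescript
// Hypothetical helpers: our custom copy explaining why Visual Search needs the camera,
// and the fallback message when the customer declines.
declare function showPermissionExplainer(): void;
declare function showDeniedMessage(): void;

// Sketch of the camera-permission step for image search on mobile web.
async function requestCameraForVisualSearch(video: HTMLVideoElement): Promise<boolean> {
  showPermissionExplainer();

  try {
    // This call is what actually triggers the browser/OS permission prompt.
    const stream = await navigator.mediaDevices.getUserMedia({
      video: { facingMode: 'environment' }, // prefer the rear camera
      audio: false,
    });
    video.srcObject = stream;
    await video.play();
    return true;
  } catch (err) {
    // NotAllowedError and friends: the customer declined, so offer uploading a photo instead.
    showDeniedMessage();
    return false;
  }
}
```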
My oversight — a technical constraint I discovered late
There were multiple problems with the first design of Visual Search. Two of the biggest problems were:
- The usable photo viewport was compromised. The semi-transparent overlay behind the shutter button suggests that whatever a customer can see will be part of the picture they take and search with, but that was not the case. We were only returning what was above the overlay.
- “Your photos” and “Take a photo” both trigger the native iOS and Android components, which prompt customers again to upload or retake a photo. So we were making people make this choice twice in two places on one page (see the sketch after this list). This is a mobile web issue I should have learned of earlier; it was a significantly painful learning experience!
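A minimal sketch of the mobile web constraint behind that second point: any custom “Your photos” or “Take a photo” button still has to route through an `<input type="file">`, and iOS/Android then layer their own chooser on top. The markup and handler below are illustrative, not our production code.

```typescript
// On mobile web we cannot open the camera roll or the camera directly from a custom button;
// an <input type="file"> is the only route, and iOS/Android then show their own
// "Take Photo / Photo Library" sheet on top of whatever choice the page already asked for.
function buildPhotoSources(container: HTMLElement, onFile: (f: File) => void): void {
  const makeSource = (label: string, capture?: string): void => {
    const input = document.createElement('input');
    input.type = 'file';
    input.accept = 'image/*';
    if (capture) input.setAttribute('capture', capture); // hint: open the camera directly
    input.addEventListener('change', () => {
      const file = input.files && input.files[0];
      if (file) onFile(file);
    });

    const button = document.createElement('button');
    button.textContent = label;
    // Even with separate "Your photos" / "Take a photo" buttons, the OS may still
    // present its own chooser: the double prompt described above.
    button.addEventListener('click', () => input.click());
    container.append(button);
  };

  makeSource('Your photos');                 // opens the native picker
  makeSource('Take a photo', 'environment'); // capture hint: prefer the rear camera
}
```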
As we had a tight incremental release schedule, we chose the “quick fix” version so customers wouldn’t need to take a picture twice.
Then, I worked with our Product Manager and Lead Engineer to prioritize new releases vs. fixes. We all worked together and compromised. When our PM wanted to add a tooltip underneath the search bar to attract more usage of Visual Search, I asked that we implement the better fix on the right above ⬆️ first. It was still far from perfect, but much better than what we had; I didn’t want to approve onboarding more Lowe’s customers to a new but broken experience. We also expanded the viewport by improving our header: instead of “closing” Visual Search, we move customers “back” to wherever they were on the lowes.com site before they entered Visual Search.
How do we introduce Visual Search?
Tooltip versions ⬇️
Image cropping tool
I added six touchpoints to the cropping tool to make it easy and intuitive to use and to give customers maximum control over the image they want to search with. (We have plans for automatic image detection, but this is what we’re using in the interim.) After a customer uses the cropping tool, we show them the result so they can choose whether to use it, “retake” their picture, or “redo” an uploaded image.
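Under the hood, a crop-and-preview step like this can be done with a canvas before the image is sent to search. This is a sketch under that assumption; the CropRect shape and function names are illustrative, not our production code.

```typescript
// `CropRect` mirrors whatever the six-touchpoint cropping UI reports; names are illustrative.
interface CropRect { x: number; y: number; width: number; height: number; }

// Sketch: crop the customer's photo to the selected rectangle and return a preview blob.
async function cropForSearch(photo: HTMLImageElement, rect: CropRect): Promise<Blob> {
  const canvas = document.createElement('canvas');
  canvas.width = rect.width;
  canvas.height = rect.height;

  const ctx = canvas.getContext('2d');
  if (!ctx) throw new Error('Canvas 2D not supported');

  // Copy only the selected region of the source photo onto the canvas.
  ctx.drawImage(photo, rect.x, rect.y, rect.width, rect.height, 0, 0, rect.width, rect.height);

  // The resulting blob is what we'd show back to the customer ("use it, retake, or redo")
  // and what would be uploaded to the visual search service.
  return new Promise((resolve, reject) =>
    canvas.toBlob((b) => (b ? resolve(b) : reject(new Error('crop failed'))), 'image/jpeg', 0.9)
  );
}
```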
Setting customers up for success — The “Relevancy Screen” photo tips
🤳 The photo tips on the screens above are based on the thousands of images we parsed through to see which images weren’t making it through and how people could have taken pictures that would improve our ability to return more relevant results.
Our tech is imperfect, and we don’t want customers to think they’re at fault when we can’t show them relevant results.
After a customer takes a picture, and while we “are looking for best matches,” we can recognize whether we can offer results better than 60% relevancy. When we can’t, we show helpful tips for taking a better picture.
We alternated between showing this screen on people’s first entry to Visual search and only after we analyzed their photos. The first option (with examples of images) was something we used for V1 of Visual Search for education purposes. V2 was an improvement — offering more accessible ways for customers to click out and access their intended task.
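The 60% threshold boils down to a simple gate. A hypothetical sketch of that decision (the match shape and field names are assumptions, not Lowe’s actual service response):

```typescript
// Hypothetical shape of a visual search match; `relevancy` is assumed to be in [0, 1].
interface VisualMatch { productId: string; relevancy: number; }

// If our best match scores below 60%, show photo tips instead of weak results.
function resultsOrPhotoTips(matches: VisualMatch[]): 'show-results' | 'show-photo-tips' {
  const best = matches[0]?.relevancy ?? 0;
  return best >= 0.6 ? 'show-results' : 'show-photo-tips';
}
```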
Search results text customized for different product categories.
Our technology couldn’t yet accurately analyze size-specific home improvement products — like nails and bolts. And those details are critical for those categories!
So I worked with our UX writing team to create custom messaging for the results of those products — “Results may vary on size based on product” — and something that works for any other general inaccuracy in other categories — “Search results may vary.”
SKU Search will only return one result, so the message is straightforward: “Here is what we found...”
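One simple way to wire that copy up is a category-to-message lookup with a general fallback. This is an illustrative sketch; the category keys are assumptions, and the real copy deck lives with the UX writing team.

```typescript
// Illustrative mapping from product category to the results disclaimer shown above the grid.
// Category keys are examples, not Lowe's actual taxonomy.
const RESULTS_DISCLAIMER: Record<string, string> = {
  fasteners: 'Results may vary on size based on product', // nails, bolts, screws
  hardware: 'Results may vary on size based on product',
};
const DEFAULT_DISCLAIMER = 'Search results may vary.';

function disclaimerFor(category: string): string {
  return RESULTS_DISCLAIMER[category] ?? DEFAULT_DISCLAIMER;
}
```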
How can we get genuine customer feedback without too much intrusion into a customer's mobile shopping experience?
The Feedback Component
We rely heavily on customer feedback and constantly monitor for bugs and improvements. One of the ways we do this is through a service called Medallia, which has its own technical requirements and constraints. And while we need this feedback, we don’t want it to intrude on the customer’s experience.
🧐 How many products do customers need to see before they know if they’re getting relevant results for their search?
We tested this for multiple categories, and our testing showed that people usually know whether we have what they are looking for after the first two or three results (on mobile). They aren’t scrolling much farther than that.
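One lightweight way to act on that finding, shown here as a hypothetical sketch rather than our actual implementation, is to reveal the feedback prompt only once the third result card scrolls into view:

```typescript
// Sketch: show the feedback component only after the customer has scrolled past
// the first few results (testing showed 2–3 is enough to judge relevancy).
// `showFeedbackComponent` is a hypothetical helper, separate from the Medallia embed.
function armFeedbackPrompt(resultCards: HTMLElement[], showFeedbackComponent: () => void): void {
  const trigger = resultCards[2] ?? resultCards[resultCards.length - 1];
  if (!trigger) return;

  const observer = new IntersectionObserver((entries) => {
    if (entries.some((e) => e.isIntersecting)) {
      showFeedbackComponent(); // "Did we find what you were looking for?" yes / no
      observer.disconnect();   // only ask once
    }
  });
  observer.observe(trigger);
}
```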
There was a technical issue around the “undo” button on the “Thank you for your feedback” message. After a customer clicks on either “yes” or “no” inside the feedback component, that message is sent to Medallia’s system, and there is no way for us to call it back. So technically, we can’t make “undo” work.
I still advocated for adding it anyway, even if it doesn’t work. Customers WILL inevitably tap “yes” or “no” by mistake on a small mobile screen. I didn’t want them to feel frustrated that they may have sent feedback that doesn’t best represent their experience using our product.
Visual Search V2 — Customizing error messages
The more we can understand what is causing an error and customize the error message and corresponding action to fix it, the faster and better people can complete their tasks. “Little” things like this are what build a more trusting relationship between a brand and its customers.
Unified barcode icon
I worked closely with the app team throughout Visual Search, especially on the iconography. While our design system and design patterns are different for apps and mobile web, we wanted to achieve consistency in branding elements.
Visual search for tablet
The next addition to visual search will be QR codes. The product team identified this because customers already use image search to take pictures of QR codes. That’s how they expect QR search to work — based on learned behavior and design patterns in other digital products they use in their day-to-day.
My last contribution to this project at Lowe’s was the UX strategy and prioritization for Visual Search. Measuring the value to customers based on everything we knew, I didn’t see QR code scanning as needing additional UX work.
If people already use image search for QR codes, let’s build that capability technically and meet their expectations. In the meantime, we can prioritize other things for visual search, like merging image and barcode search into one collective experience. That would change everything anyway: it would be a V3 for Visual Search, the next step we were waiting for based on the technical capabilities of the mobile web version.
Thanks for reading! Please contact me on LinkedIn with any inquiries; I love working with mission-driven e-commerce and other interactive brands and products.
-RB