Use your mobile camera or images from your device's photo library to shop on eBay. Visual search is part of a larger effort at eBay to make shopping on the platform simple. Part of that goal is attracting millennial users to the eBay shopping platform.
I worked as the lead designer on this project and brought it from concept to a working released feature. This project required close collaboration between 2 core search engineering teams, 3 project managers, and the lead designer from eBay's core search platform. It took approximately 9 months to design, test, and release this feature.
At the start of the project I needed to define goals before starting in on designs. I needed to understand who the target audience would be, how visual search provides value to customers, and what services would power this feature. (It was determined that we would release this feature on native mobile first.) I was also curious to understand people's current behaviors when shopping. Do people take photos while shopping, do they share images, or do they do research online using photos (e.g., searching within Google Images)? The answers to these questions would have a large impact on the design, so the first step toward creating a user story was a competitive analysis.
Visual search is a fairly common feature found on most of the top e-commerce shopping apps. I downloaded quite a few shopping and non-shopping apps that utilized some form of image recognition. Entry points were a big question for me. I needed to know how users discover the visual search feature.
Here's a small list of apps that have an image search feature.
The competitive analysis clearly showed that most apps surface a small icon inside their search input as the main visual search entry point. The global search input felt like an appropriate place to surface the entry point since visual search is a subset of the textual search UI.
Beyond the entry point into the app, I wanted to see what the overall experience was like to enter the feature, take a photo, and view visually similar results. I also wanted to learn how accurate the results returned from an image search were.
The competitive analysis showed that most of the apps were able to return a result set that matched closely to the items I took pictures of. I also took special note that I encountered very little friction in the experience of entering the feature, taking a picture, and viewing results.
I observed how the camera components looked as well. I wanted to see whether companies were utilizing the standard device camera or had invested in a custom camera component.
Macy's, Home Depot, Amazon, and a few other apps all used a custom camera UI instead of the device's default camera. The custom UI gives the app a feeling of excitement. For first-time users, this is the wow moment an app needs to encourage engagement. A custom camera also helps provide users with the context that they have entered a visual search experience (not to be confused with just taking a picture).
The camera UIs shown above are a few of the ones I liked best while doing research. Amazon allows users to perform smaller subsets of visual search (scanning barcodes and using photos) from inside its camera UI.
After completing the competitor analysis I worked with a member of the eBay research team to define the specific target customers of this feature. We created the following buckets to learn what characteristics we wanted to focus on.
I felt comfortable with where I was in this phase of the project and started to sketch out a few user flows. I wanted to get a feel for how the user would discover the entry point, enter the feature, snap a photo, and view relevant search results.
The team had an extremely short amount of time to get this feature to market. We needed to get designs validated, get a proof of concept feature built for testing backend services, and hash out the general experience of how this feature would return visually similar results based on an image.
After sketching a multitude of flows I narrowed down a few concepts that I could use to start wireframing a basic user flow. These rough wireframes helped communicate a basic high-level idea of how the feature would work.
At the core of the experience, a user would perform the following:
Once I started to move away from low-fidelity designs and onto feasible approaches, I began looking at how existing patterns could support this feature. Shoehorning this experience into an existing component set would not be ideal, but time constraints really forced me to explore this possibility.
The camera component was an area that I heavily explored during the post-wireframing stage. This was the only area of the experience that was new to eBay, with the exception of the camera component customers see when listing items for sale on the platform.
I worked through several variations of the camera to find what would work best for the customer. I also looked for areas where innovation could be pushed, such as auto-detection of items, support for barcode scanning (which would work with an existing feature called Red Laser), and a way for users to access a help screen.
Through frequent meetings with my project managers, engineering team, and design leadership, I got closer to locking down the user flow and camera UI. This was not without its fair share of hurdles.
1. Custom camera component OR default native camera
Due to the time constraint, we were forced to utilize the existing camera component from the flow for listing items for sale on eBay. We could only perform small modifications to support the visual search use case. The custom camera component was not critical to the experience; however, it is a much cleaner interface that allows the customer to focus on an item much more easily than the existing component. One piece of the component that I fought for is the information icon. When a user taps this icon, a help screen appears that gives guidance on how to use the feature and get optimal results.
2. Support for scanning barcodes:
Prior to the development of visual search, eBay built a feature called Red Laser that allowed customers to search for items by scanning a barcode with their phone. The entry point for this feature was also surfaced inside the global search input on native mobile. However, engagement on this feature was less than 2%, which meant that not many customers used it.
During the design of visual search, there were discussions from leadership about possibly removing Red Laser. I did not want to remove this service, as I had an assumption about why engagement with Red Laser was low. My assumption was that the icon used as the entry point was not recognizable enough for customers to tap on it. And even if customers tapped on it, there was not much context about how to use it or what it would do.
So, instead of removing this barcode scanning feature, I decided to fold it into the visual search feature. Search with a picture or scan a barcode to retrieve results... why not.
The entry point for performing a visual search invokes an action sheet menu where customers can choose to perform an image search or scan a barcode label. Ideally, the feature would not need an action sheet menu at all. The best experience would be for the feature to be smart enough to distinguish an ordinary object from a barcode without requiring a menu selection. However, the engineering effort to build autodetection would have been too costly and time-consuming for the initial release.
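The autodetection we deferred can be sketched as a small routing step. This is a hypothetical, platform-agnostic sketch (the function names and the `decode_barcode` callback are my own illustrations, not eBay's actual implementation): attempt a barcode decode on the captured frame first, and fall back to a visual search when no barcode is found, removing the need for the action sheet menu.

```python
def route_capture(frame, decode_barcode):
    """Hypothetical autodetection: try to decode a barcode from the
    captured frame; if none is found, treat the capture as a visual
    (image) search instead of asking the user to choose."""
    barcode = decode_barcode(frame)
    if barcode is not None:
        return ("barcode_search", barcode)
    return ("visual_search", frame)

# Stub decoder standing in for a real barcode library:
mode, payload = route_capture(b"photo-bytes", lambda f: None)
print(mode)  # visual_search
```

With a scheme like this, the action sheet becomes unnecessary; the trade-off is that barcode detection must run on every capture.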
3. Forcing users to crop their image
After a user takes a picture, an automatic crop overlay appears. This makes users crop their subject matter before tapping the Done button to view results.
Why does this happen? For the MVP release, the search science team was concerned that the image recognition service might not be able to properly separate the main subject from background noise. So, to ensure the best quality results from captured images, we force users to crop them. Ouch! (This is not a permanent solution.)
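Mechanically, the crop step is simple: the overlay produces a rectangle, which has to be clamped to the image bounds before the cropped region is handed to the recognition service. A minimal sketch of that clamping (function and coordinate names are illustrative, not the production code):

```python
def clamp_crop(x, y, w, h, img_w, img_h):
    """Clamp a user-drawn crop rect (in pixels) so it stays inside
    the image. Returns the (x, y, w, h) region that would actually
    be sent to the image recognition service."""
    x = max(0, min(x, img_w - 1))
    y = max(0, min(y, img_h - 1))
    w = max(1, min(w, img_w - x))
    h = max(1, min(h, img_h - y))
    return (x, y, w, h)

# A rect dragged partly off a 400x300 image gets pulled back in:
print(clamp_crop(-10, 50, 500, 500, 400, 300))  # (0, 50, 400, 250)
```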
4. Visual searches on items that are in an unsupported category
Visual search was initially only supported in the Clothing, Shoes, and Accessories categories. This means that if a user performed a visual search on an item that falls outside of those supported categories, they would see a null result set. This would severely impact the user's trust in the feature. To combat this potentially disruptive scenario, we were able to at least direct the search results to a category relevant to the unsupported item.
The user has searched for an item that is listed inside an unsupported category. Instead of showing a null result page, we redirect the user to a category page relevant to the item that was captured.
We surface a call-to-action banner to give context why they are on a category page vs. search results.
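The fallback described above boils down to a small routing decision, sketched here with illustrative names (the category set, screen names, and field names are my own placeholders, not eBay's actual API):

```python
SUPPORTED_CATEGORIES = {"Clothing", "Shoes", "Accessories"}

def resolve_destination(predicted_category, results):
    """Route a visual search to the results page when the predicted
    category is supported; otherwise redirect to a relevant category
    page with a call-to-action banner explaining why."""
    if predicted_category in SUPPORTED_CATEGORIES and results:
        return {"screen": "search_results", "items": results}
    return {"screen": "category_page",
            "category": predicted_category,
            "show_context_banner": True}
```

The key design choice is that the user never lands on an empty screen: every capture resolves to something browsable.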
User research was performed using 6 participants. I worked directly with our researcher and helped create a script/criteria for her to use while moderating the test. For this user research session, we focused on:
Questions to answer
*To protect the privacy of the participants, stock photos are used above.
We recruited 6 candidates to participate in this study. There was a mix of eBay and non-eBay users, all of whom are frequent online shoppers who use their mobile phones for shopping.
Here are some of the most interesting things we learned from the study.
Prior to the user research study, we did some intro research on average customers' current behaviors with shopping and images. People get inspiration from all different places: shopping online or in-store, someone's house, social media, Buzzfeed, Pinterest, blogs. All of these are places off eBay.
There were a few comments from the participants during the study that validated an idea we had. We had been thinking about how we could enable users to share an external online image with the eBay app to trigger a visual search. I came up with a quick concept flow for how this could potentially work and was able to build a prototype that could be tested at the end of the main objectives of the study.
The goal of this project is to create a frictionless experience from the moment users find a product they are interested in to actually being able to run an image search on an e-commerce site like eBay.
Use Cases (Send external image to visual search)
Users leverage native iOS and Android 'share' functionality in apps, photos, and mobile websites to share content with friends via text message, email, social networks, etc. Instead of sharing the image to one of these communication platforms, users will be able to 'share' to the eBay app and easily run an image search on the image. This is the key difference: using 'share' for sharing into e-commerce rather than via messaging communications.
As a shopper, I’d like to easily run an image search on any image I come across while browsing the web or another app so that I can quickly see the relevant inventory on eBay.
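At a sketch level, the share flow only needs to accept image payloads handed over by the OS share sheet and route them into the same visual search pipeline the camera uses. This is a hypothetical sketch (the payload shape, MIME types, and function name are my own assumptions; the real implementation would use iOS share extensions and Android `ACTION_SEND` intents):

```python
def handle_shared_payload(payload):
    """Accept an image shared from another app via the OS share
    sheet and route it straight into visual search; anything that
    is not an image is declined."""
    if payload.get("type") not in ("image/jpeg", "image/png"):
        return {"accepted": False}
    return {"accepted": True,
            "action": "visual_search",
            "image": payload["data"]}

print(handle_shared_payload({"type": "text/plain", "data": "hi"}))
# {'accepted': False}
```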
One of the biggest takeaways from this session was a unanimous "Develop this feature now! We need this!" The participants provided a wealth of feedback that will ultimately help improve a few rough edges in the experience. Ultimately, for the MVP release, there was nothing terribly negative that emerged from this study.
We've captured a great amount of initial research data to use for improving the next release of Visual Search. In conclusion, it was great to validate that this visual search feature makes good business sense for the company. eBay has a HUGE inventory that the feature can pull from in order to serve up visually similar results.
This feels like a natural progression for searching with pictures when shopping. A picture is worth 1,000 words!!