Food and Beverage Delivery App Competitive Analysis
2022
Customer: Starbucks
My role: Co-researcher
Methods: Unmoderated usability test and 1:1 moderated interview
Tools used: UserTesting, Figma, Miro, Google Suite
Stakeholders: Starbucks Product Managers
Timeframe: 8 weeks
Challenge
Starbucks was considering integrating a delivery service into their app. To help inform their future strategy, they wanted to better understand customer needs and expectations throughout the ordering and delivery process.
They also wanted to collect customer feedback on differing levels of third-party delivery integration and the delivery fee models used by competitors.
Solution
My co-researcher and I recommended a two-part strategy: 1) an unmoderated test to capture in-the-moment feedback while placing an order, and 2) a follow-up moderated session the following day to dive deeper into pricing and fee sentiment. We split moderation and analysis responsibilities evenly across both touchpoints.
The brands chosen for this study (Peet’s Coffee, Chipotle, Chick-fil-A, and Domino’s) allowed us to evaluate a variety of delivery experiences, including partial and full third-party integration, premium pricing, and fees.
Influence
Based on our findings, we recommended exploring a partial or full third-party integration solution to ensure the best user experience.
A few months after this study, Starbucks announced the decision to offer their products via DoorDash and Uber Eats.
Overview
Research Questions
The purpose of this study was to answer the following research questions:
What are participants’ impressions of each app’s delivery ordering process and premium pricing?
What are participants’ impressions of the checkout process of each app?
How, if at all, do third-party integration and premium pricing for delivery factor into their impressions?
What expectations do participants have regarding order tracking during delivery and how well, if at all, are those expectations met?
What areas of opportunity are there for the company to consider for their own app?
Methodology
Longitudinal study with 12 participants split across 4 competitors (3 per competitor) that offer food and beverage delivery through their own branded apps
Brands tested: Peet’s Coffee, Chipotle, Chick-fil-A, and Domino’s
Participants were a mix of those who were familiar with ordering delivery via a competitor app and those unfamiliar with it
Each participant completed two touchpoint sessions (1 unmoderated, 1 moderated) using their assigned competitor’s app:
Touchpoint 1: 15-minute unmoderated test
Tasks: Participants purchased food and/or a drink using a provided gift card and their designated brand’s mobile app (iOS and Android)
Participants thought out loud as they attempted to complete tasks and answered questions about their experience
Touchpoint 2: 30-minute moderated session
Completed within 24 hours of the unmoderated test
Provided an opportunity to ask follow-up questions regarding their purchase experience
Approach
Below is an overview of key considerations that shaped the research strategy:
Unmoderated tasks informed the moderated interview: The unmoderated test plan focused participants on completing an order and providing usability feedback along the way. I structured the test with multiple question types so we could quickly review key feedback from each session before heading into the moderated session the next day. The moderated session then gave participants an opportunity to reflect further on their experience and gave researchers a chance to ask in-depth questions about cost sentiment.
QC and pilot sessions: Before the full launch, we ran one pilot session for each brand. Once complete, we watched each session to confirm the length was within our target, verify that participants were able to complete the tasks, and identify any task confusion or misunderstanding. Based on the pilot results, we further refined the test plan.
Memory loss between sessions: A portion of the delivery experience occurred after the unmoderated test ended. To help jog participants’ memories in the following day’s moderated session, we asked them to note any thoughts they had about their delivery experience after they placed their order.
Participant communications: This study required more effort and coordination from our participants than most studies. Understanding that communication was key to ensuring success, we provided frequent, proactive, and detailed instructions.
Research Operations assistance: My co-researcher and I enlisted help from a Research Operations Specialist teammate to ensure the project ran smoothly and on time. They focused on recruitment, participant scheduling, and working through technical issues that arose during the unmoderated session, which allowed me to remain focused on research-related tasks.
Planned ahead for check-ins: Session moderation was divided between my co-researcher and me. We preemptively scheduled check-in meetings where we reviewed our progress and themes as they emerged. This approach helped us gradually create a report outline and quickly move into analysis and reporting once we completed the sessions.
Report format: This study had a condensed timeline. To ensure the customer could quickly and easily review and share our findings, my co-researcher and I created a high-level findings report with supporting quotes, clips, and screenshots from each brand’s app.
Technical Issues Encountered
Chick-fil-A (CFA) payment method
Problem: The CFA app offered the choice of delivery fulfilled by the restaurant or by DoorDash. We provided participants with a CFA gift card and instructed them to select delivery via DoorDash. However, the CFA gift card was not a valid payment method once they reached the co-branded DoorDash screen.
Solution: We sent participants a Visa gift card to use in place of the CFA gift card.
Impact: This delayed completion of the CFA sessions by a couple of days while we waited for Starbucks to purchase three Visa gift cards.
McDonald’s inconsistent errors
Problem: When attempting to place orders through the McDonald’s app, participants received errors that my co-researcher and I were unable to replicate. This prevented them from submitting their orders.
Solution: With permission from Starbucks, we canceled the McDonald’s sessions and replaced them with Chipotle.
Impact: This delayed completion of three sessions by a couple of days and required Starbucks to purchase additional gift cards for newly recruited participants.
Result
During the readout, we recommended exploring a partial or full third-party integration solution to ensure customers’ high expectations are met.
A few months after this study, the customer announced the decision to offer their products via DoorDash and Uber Eats.
Reflection
What went well
Our ability to quickly pivot and address technical issues thanks to established contingency plans
Partnership with Research Operations to coordinate recruiting and participant communications
Strong collaboration and regular check-ins amongst the UserTesting team allowed us to quickly identify themes and outline the report
Quick test plan development, thanks to a Figma file provided by Starbucks with screenshots of the order flows
What I’d change
If requiring participants to make a purchase using a gift card, I would verify that the payment method we provide is accepted (when QCing, we stopped at the payment screen)
I would have increased the sample size to at least five participants per brand to increase confidence in our findings. Unfortunately, this wasn’t an option due to Starbucks’ budget constraints
Deliverables
Touchpoint 1: Unmoderated test plan snapshot
High-level findings report snapshot
Touchpoint 2: Moderated discussion guide snapshot
High-level findings opportunities snapshot