Helping Small Businesses Use Automated Ads
Understanding why ease of use doesn’t lead to adoption
Research Abstract
TikTok launched GMV Max, a new advertising tool replacing the old system. The goal was simple: help advertisers create and manage campaigns more efficiently. On paper, the new system was easier to use, especially for beginners. Despite these benefits, many advertisers refused to migrate from the old tools. Research was needed to identify friction points, improve adoption, and support users' transition from the old tools to GMV Max.
Role
UX Researcher, TikTok internship
Timeline
5 weeks
Collaboration
Product, Design, Engineering
My Responsibilities
- End-to-end UX research ownership
- Mixed-method research (35 interviews)
- Insight synthesis → product decisions
- Influenced roadmap priorities via stakeholder alignment
- Presented research recommendations (bilingual format)
- Led feature prioritization workshop
- Built a reusable research framework
Research Context
The Challenge Behind Migrating To GMV Max
TikTok launched GMV Max, a new advertising tool replacing the old system. The goal was simple: help advertisers create and manage ad campaigns more efficiently. On paper, the new system was easier to use, especially for beginners.
Many advertisers refused to migrate from the old tools. If advertisers don’t adopt GMV Max, the company risks wasted product effort, frustrated users, and reduced revenue. Research was needed to identify friction points to improve adoption and support users' transition.
Research Question
"How do advertisers use and evaluate GMV Max, and how does that influence adoption?"
Research Methods
Mixed interview approach
Led semi-structured moderated interviews online and configured unmoderated sessions in Dscout.
Why?
To maximize insight depth and pattern discovery while saving time under tight deadlines.
Comparative study design
Interviewed users of TikTok's tool and a competitor's tool in separate sessions.
Why?
To reduce bias and identify gaps and opportunities, enabling a best-in-class experience for users.
Qual + Quant data collection
Paired quantitative UX scores with qualitative insights to understand pain points, workflows, and mental models.
Why?
To combine depth with scale, pairing behavioral context with measurable patterns.
Timeline & Workflow
Key Findings
Despite GMV Max being easier to set up than legacy tools, adoption was lower than expected. Why?
Beyond simple usability issues, there was a lack of confidence. Advertisers struggled to understand, trust, and feel in control of an automated system managing their money.
Unclear return made automation feel risky
User: “I never really know if the system is helping my sales or just burning through my budget. It's hard to interpret the results.”
Insight
When users can't clearly connect spend to outcomes, automation feels risky. To protect themselves, advertisers limited spend, ignored recommendations, or stayed on legacy tools they felt more familiar with.
Recommendation
Surface all relevant metrics in a unified view, linking spend, ROI, and outcome trends with simple visual cues, so users can quickly see performance patterns and interpret results without guessing.
Recommendations feel irrelevant or confusing
User: “The recommendations don’t make sense for my business. It’s like the system doesn’t even know what I sell, so I just ignore the recommendations.”
Insight
Without context, recommendations felt generic and easy to ignore, weakening trust in the system's intelligence.
Recommendation
Personalize suggestions based on advertiser characteristics, past performance, and industry context to improve adoption.
Fluctuating ad spend creates frustration
User: “Some days my budget jumps or drops and I have no clue why. It feels like a black box.”
Insight
Unexplained budget changes made users anxious and reinforced the feeling that GMV Max was unpredictable. Even when results were fine, the lack of transparency kept them from fully trusting the platform.
Recommendation
Surface clear explanations for spend changes, helping advertisers understand automation decisions and feel in control.
Project Impact
Reframed the problem from a usability issue to a trust-in-automation challenge
Influenced roadmap features
Established a repeatable benchmarking framework to guide future product improvements
Learnings & Reflection
Strong research partnership starts with understanding team goals
Being a good research partner meant deeply understanding product goals, constraints, and timelines, while still advocating for users. Aligning insights with what the team was trying to ship made the outcomes stronger.
Prioritization is as important as discovery
Not all insights are equally valuable. I became more intentional about collaborating with PMs to rank findings by impact, making research easier to act on under tight timelines.
Scalable research creates long-term value
I learned to think beyond one-off studies by building repeatable frameworks that made future research faster and more consistent.