Verizon B2B AI Tools Suite

Designing AI-powered tools that earn trust through context, accuracy, and voice—helping sales reps spend more time selling and less time on admin.

Client Verizon (via Bounteous)
Timeline 2024–Present
Role UX Lead
Team UX, Engineering, Data Science, Sales Ops
[HERO IMAGE: AI interface concept, sales workflow diagram showing AI integration points, or abstract visualization of AI-human collaboration]
The Challenge

AI That Sales Reps Will Actually Trust

Verizon's B2B sales organization wanted to leverage AI to increase rep efficiency—helping them spend more time selling and less time on administrative tasks like email drafting, call prep, contract review, and CRM updates. The vision was ambitious: a full suite of AI-powered tools that could transform how reps work.

But the challenge wasn't just building AI tools—it was building AI tools that sales reps would actually trust and use. Early explorations revealed deep skepticism: reps had seen "AI solutions" before that created more work than they saved. They needed tools that understood their context, matched their voice, and produced outputs they could confidently send to customers without extensive editing.

[IMAGE: Problem space visualization—journey map showing sales rep pain points, or diagram of the administrative burden consuming selling time]

The stakes were high: a failed AI rollout would waste significant investment and, worse, poison the well for future AI initiatives. Reps who had bad experiences would be resistant to trying new tools, even improved ones. We had one shot to get the experience right.

Discovery

Understanding the Real Problem

We started with research, not solutions. I led a comprehensive discovery effort to understand not just what reps do, but why—what they're really trying to accomplish, what success looks like, and where current tools fall short.

  • Jobs-to-be-Done Interviews

    Conducted JTBD interviews with sales reps across segments to understand their goals, struggles, and definitions of success. This revealed that the "job" wasn't just "send emails faster"—it was "maintain relationships and close deals while juggling competing demands on my time."

  • Shadowing and Contextual Inquiry

    Spent time observing reps in their natural workflow, watching how they actually use (and work around) existing tools. This revealed a critical insight: reps' browser bookmark bars were often faster than the systems they were "supposed" to use. They had already optimized around the system's failures.

  • Session and Log Analysis

    Reviewed call recordings and system logs to understand patterns in rep behavior, common questions from customers, and where reps spent the most time on non-selling activities. This quantified the opportunity and identified specific high-impact areas.

  • Concept Testing with Skeptics

    Deliberately recruited reps who were skeptical of AI for concept testing. Their feedback was invaluable: they identified exactly where AI outputs would fail to meet their standards and what would make them trust (or distrust) the system.

Key Insights

What We Learned

The Bookmark Bar Problem

Reps struggled to find information in any system. Their bookmark bars—curated collections of shortcuts to specific tools and data sources—were quicker than the AI solutions we were testing. This set a clear bar: our tools had to be faster and more accurate than a well-organized bookmark bar.

Trust Requires Verifiability

Reps needed to trust the information before it went to customers. That meant AI couldn't be a black box—it had to surface its sources so reps could verify accuracy. Trust is earned through transparency, not magic.

Voice Matters

AI-generated content had to sound like the rep, not like a robot. Generic outputs that required extensive rewriting defeated the purpose. Reps have spent years building relationships with specific communication styles—AI needed to match that, not override it.

Context is Everything

Without understanding customer history, deal stage, and previous conversations, AI suggestions were useless. The most common feedback: "That would make sense for a new prospect, but this customer has been with us for 10 years."

[IMAGE: Insight visualization—quote card with the bookmark bar insight, or research synthesis showing the key themes that emerged]

The Solution

AI Tools Built on Trust

Based on our research, we established design principles centered on trust: AI must surface its sources, match the rep's voice, and understand context before generating anything. We then designed a suite of tools, each solving a specific job-to-be-done while maintaining the human in the loop.

  • Email Agent

    AI-assisted email drafting that pulls context from CRM, previous correspondence, and deal stage to generate responses in the rep's own voice. Reps review and edit before sending—AI accelerates, it doesn't replace—and visible source attribution lets them verify accuracy first.

  • Single Pane of Glass (SPOG)

    Unified interface that aggregates customer information from disparate systems, eliminating the "bookmark bar problem" by bringing everything reps need into one searchable, AI-enhanced view. No more tab-switching and copy-pasting between systems.

  • Call Recording Intelligence

    Interface for reviewing and coaching on customer calls, with AI-generated summaries, key moment flagging, and coaching suggestions based on call patterns. Helps reps and managers identify opportunities for improvement without reviewing full recordings.

  • Contract and Legal Review

    AI-assisted review of contracts and legal documents, flagging areas that need attention and suggesting standard language—while keeping final decisions in human hands. Reduces time to review while maintaining compliance.

  • Content Generation

    Tools for generating customer-facing content (proposals, presentations, follow-up materials) that maintain brand voice and incorporate deal-specific context. Starting points, not final products—always reviewed and refined by reps.
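The product code isn't shown here, but the source-attribution principle behind these tools can be sketched in a few lines: every fact the generator draws on carries a pointer back to its system of record, so a rep can trace and verify each claim before sending. All names below (Source, build_context, the record fields) are hypothetical, and the draft body is a stub standing in for a real LLM call.

```python
from dataclasses import dataclass

@dataclass
class Source:
    system: str     # origin system, e.g. "CRM" or "Email"
    record_id: str  # pointer the rep can follow to verify
    snippet: str    # the grounding fact itself

@dataclass
class Draft:
    body: str
    sources: list   # every Source used to ground the draft

def build_context(customer_notes, deal_stage, last_email):
    """Collect grounding facts, keeping a pointer to each origin."""
    sources = [
        Source("CRM", customer_notes["id"], customer_notes["text"]),
        Source("CRM", deal_stage["id"], deal_stage["text"]),
        Source("Email", last_email["id"], last_email["text"]),
    ]
    context = "\n".join(s.snippet for s in sources)
    return context, sources

def draft_email(context, sources, rep_sign_off):
    # A real system would pass `context` to a language model here;
    # this stub just stitches the grounded facts into a draft so the
    # attribution chain (Draft.sources) is visible end to end.
    body = f"Hi,\n\nFollowing up on our conversation:\n{context}\n\n{rep_sign_off}"
    return Draft(body=body, sources=sources)
```

The key design choice is that Draft never exists without its sources: the UI can render each snippet's origin next to the text it produced, which is what makes the output verifiable rather than a black box.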

[IMAGE: Email Agent UI mockup—showing the context panel with customer history, the draft generation, and the source attribution that lets reps verify accuracy]

Email Agent with context panel and source attribution for verifiable AI assistance.

[IMAGE: Single Pane of Glass interface—showing the unified customer view that replaces the bookmark bar workaround]

Single Pane of Glass: everything a rep needs in one searchable view.

Results

Early Signals

This project is currently in development and early rollout. While we don't have full ROI metrics yet, early signals are promising:

  • Positive reception in concept testing with previously skeptical reps—the "bookmark bar" insight resonated
  • Trust-first design principles adopted as standard for all AI initiatives across the organization
  • Cross-functional alignment between UX, engineering, data science, and sales leadership on the approach
  • Early adopters reporting reduced time on administrative tasks without sacrificing personalization

"[PLACEHOLDER: Add quote from Verizon stakeholder or sales rep about the approach—something about trust, the discovery process, or early experience with the tools]"

[Name] — [Title], Verizon

Reflection

What I Learned

What Worked

Starting with JTBD interviews rather than jumping to solutions was critical. The "bookmark bar" insight gave us a concrete bar to clear: our AI tools had to be faster and more accurate than a curated set of bookmarks. That reframed the success criteria from "does the AI work" to "is the AI better than what reps already do."

Key Takeaway

Trust in AI is earned through transparency, not magic. Reps don't want AI that "just works"—they want to understand why it's suggesting what it's suggesting, so they can confidently put their name on it. Designing for verifiability and human control isn't a limitation—it's the path to adoption.

Connection to AI/ML Design Leadership

This project demonstrates leading discovery for AI-powered tools, partnering with data science teams, and designing for trust and adoption in an enterprise context. The experience shows how to move AI initiatives from "technology looking for a problem" to "user needs enabled by technology"—essential for any organization investing in AI-driven experiences.