Verizon B2B AI Tools Suite

Designing AI-powered tools that earn trust through context, accuracy, and voice. The goal: help sales representatives spend more time selling and less time on administrative tasks.

Client: Verizon (via Bounteous)
Timeline: 2024–Present
Role: UX Lead
Team: UX, Engineering, Data Science, Sales Ops
[HERO IMAGE: AI interface concept, sales workflow diagram showing AI integration points, or abstract visualization of AI-human collaboration]
The Challenge

AI That Sales Representatives Will Actually Trust

Verizon's B2B sales organization wanted to use AI to increase representative efficiency: less time on administrative tasks like email drafting, call preparation, contract review, and CRM updates, and more time selling. The vision was ambitious: a full suite of AI-powered tools that could transform how representatives work.

But the challenge wasn't just building AI tools. It was building AI tools that sales representatives would actually trust and use. Early explorations revealed deep skepticism: representatives had seen "AI solutions" before that created more work than they saved. They needed tools that understood their context, matched their voice, and produced outputs they could confidently send to customers without extensive editing.

[IMAGE: Problem space visualization—journey map showing sales representative pain points, or diagram of the administrative burden consuming selling time]

The stakes were high: a failed AI rollout would waste significant investment and, worse, poison the well for future AI initiatives. Representatives who had bad experiences would be resistant to trying new tools, even improved ones. We had one shot to get the experience right.

Discovery

Understanding the Real Problem

We started with research, not solutions. I led a comprehensive discovery effort to understand not just what representatives do, but why. What are they really trying to accomplish? What does success look like? Where do current tools fall short?

  • Jobs-to-be-Done Interviews

    Conducted JTBD interviews with sales representatives across segments to understand their goals, struggles, and definitions of success. This revealed that the "job" wasn't just "send emails faster." It was "maintain relationships and close deals while juggling competing demands on my time."

  • Shadowing and Contextual Inquiry

    Spent time observing representatives in their natural workflow, watching how they actually use (and work around) existing tools. This revealed a critical insight: representatives' browser bookmark bars were often faster than the systems they were "supposed" to use. They had already optimized around the system's failures.

  • Session and Log Analysis

    Reviewed call recordings and system logs to understand patterns in representative behavior, common questions from customers, and where representatives spent the most time on non-selling activities. This quantified the opportunity and identified specific high-impact areas.

  • Concept Testing with Skeptics

    Deliberately recruited representatives who were skeptical of AI for concept testing. Their feedback was invaluable: they identified exactly where AI outputs would fail to meet their standards and what would make them trust (or distrust) the system.

Key Insights

What We Learned

The Bookmark Bar Problem

Representatives struggled to find information in any system. Their bookmark bars (curated collections of shortcuts to specific tools and data sources) were quicker than the AI solutions we were testing. This set a clear bar: our tools had to be faster and more accurate than a well-organized bookmark bar.

Trust Requires Verifiability

Representatives needed to trust the information before it went to customers. That meant AI couldn't be a black box. It had to surface its sources so representatives could verify accuracy. Trust is earned through transparency, not magic.

Voice Matters

AI-generated content had to sound like the representative, not like a robot. Generic outputs that required extensive rewriting defeated the purpose. Representatives have spent years building relationships with specific communication styles, and AI needed to match that, not override it.

Context is Everything

Without understanding customer history, deal stage, and previous conversations, AI suggestions were useless. The most common feedback: "That would make sense for a new prospect, but this customer has been with us for 10 years."

[IMAGE: Insight visualization—quote card with the bookmark bar insight, or research synthesis showing the key themes that emerged]
The Solution

AI Tools Built on Trust

Based on our research, we established design principles centered on trust: AI must surface its sources, match the representative's voice, and understand context before generating anything. We then designed a suite of tools, each solving a specific job-to-be-done while maintaining the human in the loop.

  • Email Agent

    AI-assisted email drafting that pulls context from CRM, previous correspondence, and deal stage to generate responses that sound like the representative, not a robot. Representatives review and edit before sending. AI accelerates, doesn't replace. Sources are visible so representatives can verify before sending.

  • Single Pane of Glass (SPOG)

    Unified interface that aggregates customer information from disparate systems, eliminating the "bookmark bar problem" by bringing everything representatives need into one searchable, AI-enhanced view. No more tab-switching and copy-pasting between systems.

  • Call Recording Intelligence

    Interface for reviewing and coaching on customer calls, with AI-generated summaries, key moment flagging, and coaching suggestions based on call patterns. Helps representatives and managers identify opportunities for improvement without reviewing full recordings.

  • Contract and Legal Review

    AI-assisted review of contracts and legal documents, flagging areas that need attention and suggesting standard language, while keeping final decisions in human hands. Reduces time to review while maintaining compliance.

  • Content Generation

    Tools for generating customer-facing content (proposals, presentations, follow-up materials) that maintain brand voice and incorporate deal-specific context. These are starting points, not final products; representatives always review and refine them before anything reaches a customer.
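The Email Agent's trust contract described above can be sketched in code: every generated draft carries the source records it was grounded in, so a representative can open and verify each one before sending. This is a minimal illustrative sketch, not Verizon's implementation; all type names, identifiers, and record references are hypothetical, and a stub stands in for the actual model call.

```python
from dataclasses import dataclass

@dataclass
class SourceChunk:
    # A retrieved piece of customer context with a verifiable origin.
    system: str      # e.g. "CRM", "Email thread" (hypothetical labels)
    reference: str   # record id or link the representative can open
    text: str

@dataclass
class Draft:
    body: str
    sources: list    # every chunk the draft was grounded in

def draft_email(chunks: list, rep_signature: str) -> Draft:
    """Assemble a grounded draft for human review before sending.

    A real system would call a language model here; this stub just
    stitches the retrieved context together so the source-attribution
    contract (draft + its evidence) is visible.
    """
    context = "\n".join(c.text for c in chunks)
    body = f"(draft grounded in {len(chunks)} sources)\n{context}\n\n{rep_signature}"
    return Draft(body=body, sources=list(chunks))

chunks = [
    SourceChunk("CRM", "crm://account/1042",
                "Customer renewed a multi-year contract in March."),
    SourceChunk("Email thread", "mail://thread/88",
                "Asked about fixed-wireless pricing last week."),
]
draft = draft_email(chunks, "Best,\nJordan")

# Every claim in the draft traces back to a record the rep can open.
for src in draft.sources:
    print(src.system, src.reference)
```

The design point the sketch encodes: the draft and its sources travel together as one object, so the UI can always render "where this came from" next to the generated text, and the human review step has something concrete to verify.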

[IMAGE: Email Agent UI mockup—showing the context panel with customer history, the draft generation, and the source attribution that lets representatives verify accuracy]

Email Agent with context panel and source attribution for verifiable AI assistance.

[IMAGE: Single Pane of Glass interface—showing the unified customer view that replaces the bookmark bar workaround]

Single Pane of Glass: everything a representative needs in one searchable view.

Results

Early Signals

This project is currently in development and early rollout. While we don't have full ROI metrics yet, early signals are promising:

  • Positive reception in concept testing with previously skeptical representatives; the "bookmark bar" framing resonated with them
  • Trust-first design principles adopted as standard for all AI initiatives across the organization
  • Cross-functional alignment between UX, engineering, data science, and sales leadership on the approach
  • Early adopters reporting reduced time on administrative tasks without sacrificing personalization

"[PLACEHOLDER: Add quote from Verizon stakeholder or sales representative about the approach—something about trust, the discovery process, or early experience with the tools]"

[Name] | [Title], Verizon

Reflection

Looking Back

What Worked

Starting with JTBD interviews rather than jumping to solutions was critical. The "bookmark bar" insight gave us a concrete bar to clear: our AI tools had to be faster and more accurate than a curated set of bookmarks. That reframed the success criteria from "does the AI work" to "is the AI better than what representatives already do."

What I Learned

Trust in AI is earned through transparency, not magic. Representatives don't want AI that "just works." They want to understand why it's suggesting what it's suggesting, so they can confidently put their name on it. Designing for verifiability and human control isn't a limitation. It's the path to adoption.

Connection to AI/ML Design Leadership

This project demonstrates leading discovery for AI-powered tools, partnering with data science teams, and designing for trust and adoption in an enterprise context. The experience shows how to move AI initiatives from "technology looking for a problem" to "user needs enabled by technology," which is essential for any organization investing in AI-driven experiences.