Support Driven Expo 2023: Day 2 Notes

Notes from Support Driven Expo day 2 talks and workshops that I attended.

Empowering Support Teams: Practical AI Tips for Today’s Challenges

Kenji Hayward, Front

  • technology has always transformed support: mail, phone, email, chat
  • audit tools, then consolidate
  • rethink how you measure performance metrics: separate AI (BSAT, bot satisfaction) and human performance
  • shift your focus from transactional to complex: transactional support will be commoditized with AI, need to grow team to handle more complex issues
  • reimagine roles on your team: potential for new career paths; Examples: Head of Scalable Support, Support Program Specialist with focus on content creation, community, and internal docs
  • invest in support operations: focus on delivering better customer experience through docs, tools, and automation. Examples: Support Operations Analyst (data, workflows), Support Operations Engineer (automation/technical builds)
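The split between bot and human performance metrics could be sketched as follows. This is a minimal illustration with made-up data; the handler labels and the "4 or 5 out of 5 counts as satisfied" CSAT definition are assumptions, not anything Front prescribes.

```python
# Compute CSAT separately for bot-handled and human-handled conversations.
# CSAT here = share of ratings that are 4 or 5 on a 1-5 scale (an assumption).

def csat(ratings):
    """Percentage of ratings that are 4 or 5."""
    if not ratings:
        return 0.0
    satisfied = sum(1 for r in ratings if r >= 4)
    return round(100 * satisfied / len(ratings), 1)

# Hypothetical survey responses: (handled_by, rating)
responses = [
    ("bot", 5), ("bot", 2), ("bot", 4),
    ("human", 5), ("human", 4), ("human", 3), ("human", 5),
]

# Group ratings by who handled the conversation
by_handler = {}
for handler, rating in responses:
    by_handler.setdefault(handler, []).append(rating)

for handler, ratings in sorted(by_handler.items()):
    print(f"{handler} CSAT: {csat(ratings)}%  (n={len(ratings)})")
```

Keeping the two populations separate avoids a blended number that hides whether the bot or the humans are driving satisfaction up or down.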

Building Tomorrow’s Support Team: A Glimpse into the Future

Declan Ivory, Intercom

  • customer service transformation: AI is making support faster, better, cheaper, and more powerful
  • in an AI-driven support world, what changes?
  • How should support leaders prepare their team?
  • be very clear about your strategy for AI, be open and transparent about the drivers and goals
  • acknowledge things will change
  • communicate early and often about changes ahead
  • involve support team in planning changes, listen to your team
  • highlight the opportunities that move to AI model presents: less mundane work, new skills, new roles, more fulfilling work
  • support role changing: increasing complexity of issues, stronger problem-solving skills, SME, consultative role, focused on customer value (vs. customer issue), AI tools to augment the role, EQ skills, proactive outreach, continuous improvement mindset/culture, knowledge creation/management, intentional development
  • implications: hiring, onboarding, KPIs, performance management, capacity planning, customer insights
  • new roles: conversation designer, conversation analyst, knowledge manager, prompt engineer, design strategist
  • yesterday: CSAT/NPS/CES/QA; today: customer experience. How do you measure customer experience with an AI bot?
  • questions to think about: role of front line, hiring/training, measuring team experience, customer satisfaction, SME/skills required, resource planning, what best-in-class SLAs/SLOs look like, knowledge/content creation/management, how the economics of CS change, role of the manager, KPIs
  • no playbook yet, but things are moving fast; hopefully in a year we will be able to talk about one
  • embrace the opportunity, be very intentional, think through implications, best practice still evolving but don’t wait for a runbook (help create it)

Navigating Knowledge Evolution: Strategies for under-resourced teams

Michael Brady, RightPage

  • integrating knowledge into products: context specific way
  • smarter search experiences
  • dynamic content: more personalized, customized
  • harnessing analytics: should help identify underperforming content and potential gaps
  • content metrics to watch: exit rate, ticket creation rate, content completion rate (scroll to the end)
  • automatically identify out-of-date content: use summaries of product changes/updates to scan content and identify items that likely need to be changed
  • identify relevant product change “data sources”, use change summaries as inputs to search/identify content changes, review recommended changes
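The stale-content scan described in the last two bullets could look something like this. It is a toy sketch using simple keyword overlap; a real implementation would more likely use embeddings or a search index, and the change summary and article list here are invented for illustration.

```python
# Flag help articles that likely need review after a product change,
# by scoring keyword overlap between a change summary and each article.

import re

STOPWORDS = {"the", "a", "an", "to", "of", "and", "in", "for", "is", "now"}

def keywords(text):
    """Lowercase words in the text, minus common stopwords."""
    words = re.findall(r"[a-z]+", text.lower())
    return {w for w in words if w not in STOPWORDS}

def flag_stale(change_summary, articles, threshold=2):
    """Return (title, matched_keywords) for articles sharing at least
    `threshold` keywords with the change summary."""
    change_kw = keywords(change_summary)
    flagged = []
    for title, body in articles:
        overlap = change_kw & keywords(body)
        if len(overlap) >= threshold:
            flagged.append((title, sorted(overlap)))
    return flagged

# Hypothetical release note and knowledge-base articles
summary = "The export button moved from the sidebar to the report toolbar"
articles = [
    ("Exporting reports", "Click the export button in the sidebar to download."),
    ("Resetting passwords", "Use the password reset link on the login page."),
]

for title, overlap in flag_stale(summary, articles):
    print(f"Review: {title} (matched: {', '.join(overlap)})")
```

The output is a review queue, not an automatic rewrite: a human still confirms which flagged articles actually changed, matching the "review recommended changes" step above.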

How to Troubleshoot Anything

Matt Dale, Moxie CX

  • nine steps to troubleshoot all the things in your life
  • universal troubleshooting principles exist
  • good news: you can learn
  • why bother: makes you better at your job, saves money, time
  • frame of mind: all problems are fixable, just follow the process, “how can I narrow it down?”, emotion vs. intellect, RTFM/Google, ask for (and give) help
  • prepare properly: clean your workspace, get organized, gather resources, understand the system
  • Nine steps:
    1. Make a plan: What are the “danger spots” and how can you avoid them?
    2. Ask good questions: What’s happening? What’s supposed to happen? How can I reproduce it? Understand what they’re saying.
    3. Reproduce the problem: If you can’t repeat it, you won’t know if you’ve fixed it.
    4. Narrow it down: Use binary searches to methodically narrow-in on the problem.
    5. The quadruple tradeoff: ease, likelihood, safety, even divisions
    6. Fix it / find a workaround: You know what’s broken, now fix it! Involve another person if needed.
    7. Test your fix: always check that your fix actually fixed the issue (cf. step 3). Check it yourself if someone else did the work.
    8. Celebrate: Pause and celebrate your win; you fixed it!
    9. Prevent future problems: Make sure you’re not fixing the same problem over and over again
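Step 4's binary-search idea can be illustrated with a small sketch: given an ordered chain of stages where everything works up to some point and fails from there on, check the midpoint repeatedly instead of walking the chain one stage at a time. The pipeline stages here are hypothetical.

```python
# Binary-search a chain of processing stages to find the first failing one.
# works(i) returns True if everything up to and including stage i is OK.
# Assumes at least one stage fails.

def first_failure(stages, works):
    """Return the index of the first stage where `works` reports failure."""
    lo, hi = 0, len(stages) - 1
    while lo < hi:
        mid = (lo + hi) // 2
        if works(mid):
            lo = mid + 1   # failure must be after mid
        else:
            hi = mid       # mid (or something earlier) already fails

    return lo

# Hypothetical pipeline: stages 0-5 are fine, stage 6 onward is broken
stages = ["intake", "parse", "validate", "enrich",
          "route", "assign", "notify", "close"]
broken_from = 6
result = first_failure(stages, lambda i: i < broken_from)
print(f"First failing stage: {stages[result]}")
```

With 8 stages this takes 3 checks instead of up to 8; the savings grow with the size of the system, which is why "narrow it down" beats checking components one by one.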

Creating Reciprocal Feedback Loops: Beyond Product

Tori Mayernick, NerdWallet

  • queue spikes due to other teams’ work
  • when comms fall through the cracks: long wait times, missed SLAs, delayed responses, inadequate info, outdated self-service tools, threats of abandoning the brand
  • examples: policy change, marketing campaign
  • ended up building 12 partnerships with non-product teams
  • why: avoid queue spikes, reach goals; working with these teams allows support to be proactive and helps the support org contribute to revenue and retention
  • partner teams: share user-impacting information such as changes to how we communicate with users, new legal or security requirements; share goals around sales, campaigns
  • what support gains: internal and external success tools (such as templates, help articles), understanding of how to handle changes, opportunities to contribute to revenue and retention efforts, a good way to get the team engaged and excited
  • what partners gain: consumer feedback and trends that individual teams or the company may find valuable
  • feedback loop: updates, feedback, improvements
  • which teams?
  • criteria to consider: does their work impact the consumer experience? (team → support); could they benefit from the consumer perspective? (support → team); does their success depend on a successful consumer experience?
  • Examples: legal, security, product, marketing, sales, business development
  • building these relationships takes a lot of time, but these collaborations are key to the work
  • steps to forging relationships:
    • get buy-in: establish need to collaborate with hard data, identify what success could look like, drivers, frame message around their goals
    • make it easy: lay out the vision with documentation, suggest what info and data might be shared, recommend processes based on previous successes
    • follow through: own support’s aligned role, lead collaboration efforts, drive conversations in the initial stages
    • reach alignment: partnership drivers, information shared, process for sharing info
  • feedback loop template
    • criteria, data/info, processes, closing the loop
    • Example: info that should be shared immediately, high impact blocker, file ticket on partner team’s board, partner to update support at regular sync
    • Example: info that should be shared with support, changes that affect CX, file an issue, support to provide feedback
    • want to make it customizable
  • set up for success: developing a partnership program
    • back in support: partnerships manager role, co-ownership model and back up ownership, develop doc and templates
    • org wide visibility: share successful feedback loops in public spaces/meetings, find opportunities to act as the voice of consumer
    • org wide transparency: one team member acting as eyes and ears, proactive outreach and built in expectations
    • reevaluate program structure: regular check-ins with CFPs, shift priorities as org priorities shift
  • what success looks like: example:
    • sales: communicated sales goals, outlined process to surface consumer trends and blockers
    • user operations: ticket tracking system picked up a trend (target consumer segment blocked from converting) and surfaced it
    • feedback loop: Updates to SMB conversion goals and application process, feedback on updated process surfaced blocker to conversion, improvements to application process to remove blocker

Death of the CSAT survey? Not so fast!

Mart Objartel, Klaus

  • CSAT measure used to quantify the degree to which customers are satisfied with a service.
  • many problems with most feedback surveys: no motivation, don’t remember, wrong timing, survey fatigue, bias, disappointment about something else
  • how to design surveys that work?
  • minimalism is in: choose one question and make it count
  • need to make them passionate about the cause: establish a connection using bio, who is going to read it, what is going to happen with it
  • timing and channel switching: switched from email to end-of-chat surveys; response rate went from 15% to 60%
  • Mixing things up: ask for a reason before rating, experiment
  • new possibilities using AI/ML: adding context to surveys, including beginning snippets/prompts (comment rate increased from 15% to 45%), conversation recap
  • hyper personalization: doesn’t need to be the same question for all customers, may not even want to ask if it was resolved very quickly
  • feedback survey > feedback conversation

Closing panel: Navigating the future with AI

Brittany Ferguson, ?; Mollie Holland, DevRev; Declan Ivory, Intercom; Phil Verghis, Klever

  • the application and practical parts: the key thing is not to over-analyze; find a repeatable, error-prone task. Don't try to solve the world's problems from the beginning, start with what bugs you. Do something 2-3 times, raise your hand, look at how to automate; take a similar approach to AI. Think about small, practical applications.
  • narrow vs. general: focus on a narrow area
  • different types of AI
  • example areas: interfacing with customers, better experience for support, support operations, data-driven insights into CX/feedback
  • seed with lots of data, the more, usually the better, can look at unstructured data from multiple sources
  • take the initiative to test it out, see how it works, teach it, condition it
  • know what your responsibility is in the training, to make AI effective, it has to be continually trained and tuned
  • Example: getting product/engineering to write release notes; use content to auto-generate release notes
  • Example: expensive machinery; use all available data to predict when parts need replacement, order parts ahead of time, schedule maintenance
  • Example: community engagement, gather a lot of the good information, updating it, and supplying it back
  • Example: didn’t get requested headcount, released chatbot and used it in beta, focus on content, prompt content update in knowledge base
  • New role: chatbot manager
  • Be paranoid about your content: structured, up to date, machine can read it and understand it
  • Be paranoid about QA: analyze why the bot couldn't answer a question or answered incorrectly; track a separate CSAT
  • think about automating
  • Work with someone you trust, have a direct line, can help you figure it out
  • Build out internal tools for team first, before public launch
  • watch out for “Highest paid person’s opinion”
  • Concerns: becoming too reliant on AI, data security, legal
  • Focus on improving things for the customer
  • Make clear what the expectations are
  • How to get everyone in the same place: found as many as 10 tools in an org; look at objects across teams, auto-prioritize based on customer value
  • Can help understand process and the customer journey, what is inefficient, then automate
  • consistent, efficient, least friction, customer to have what they need when they need it
  • future: understanding images, generate screenshots, get it to identify near-misses, self-serve idea
  • resources: https://freakonomics.com/podcasts/

End

Hope to see you again next year!