Overview
Incident IQ provides the backbone of daily K–12 operations. The platform powers help desks, asset tracking, and HR workflows for over 12 million students.
The challenge was not adoption; it was visibility. The UX team had no clear way to measure its impact: work was shipping, but there was no evidence of where design was moving the needle or how it was improving the system.
Vision & Context
The goal was to transform UX from a service function into a performance-driven discipline. The team needed structured metrics, predictive research workflows, and visibility into design’s effect on velocity and quality.
The objective was to quantify UX outcomes with the same rigor engineering used for delivery. Design needed to become measurable, operational, and repeatable.
Understanding the Problem
Before the shift, UX had no structured scorecard or data visibility.
Impact was anecdotal. Team efficiency was untracked. Engineering saw design as qualitative and reactive.
Key challenges included:
No structured UX performance data
Limited visibility into contribution and reuse
Inability to forecast impact or measure velocity
Poor backlog alignment and planning predictability
Weak UX-Engineering synchronization in development cycles
The absence of data made it difficult to justify time investments, forecast outcomes, or optimize design debt reduction.
Strategic Approach
The solution began with the UX Performance Scorecard, a data-driven framework for quantifying UX efficiency, velocity, and ROI.
It integrated research ops, Jira tracking, and design system analytics into a unified dashboard.
This allowed the team to measure reuse rates, track late-stage rework, and identify bottlenecks early.
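To make that concrete, here is a minimal sketch of how per-ticket data can roll up into scorecard metrics. Everything in it is illustrative: the DesignTicket fields and the scorecard function are hypothetical stand-ins, not Incident IQ's actual Jira schema or analytics pipeline.

from dataclasses import dataclass

@dataclass
class DesignTicket:
    # Hypothetical fields; the real scorecard pulled comparable
    # signals from Jira and design system analytics.
    uses_system_components: bool   # built from design system parts?
    reopened_after_dev: bool       # rework discovered post-handoff?
    cycle_days: float              # design start to dev-ready

def scorecard(tickets: list[DesignTicket]) -> dict[str, float]:
    """Aggregate per-ticket data into three headline metrics."""
    n = len(tickets)
    return {
        "reuse_rate": sum(t.uses_system_components for t in tickets) / n,
        "rework_rate": sum(t.reopened_after_dev for t in tickets) / n,
        "avg_cycle_days": sum(t.cycle_days for t in tickets) / n,
    }

# Example: three tickets from one sprint
print(scorecard([
    DesignTicket(True, False, 4.0),
    DesignTicket(True, True, 6.5),
    DesignTicket(False, False, 9.0),
]))

Once metrics are defined this way, reuse, rework, and cycle time can be trended sprint over sprint, which is what surfaced bottlenecks early.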
Predictive UX research automation was introduced, accelerating research workflows by 31 percent and reducing rework by 35 percent.
Design system component reuse increased 20 percent in nine months, creating measurable proof of scale.
Accessibility-first standards were embedded across product workflows, achieving WCAG 2.1 AA compliance and improving user confidence in K–12 environments.
System Architecture Thinking
UX was treated as infrastructure. Every metric connected back to operational efficiency: reuse, velocity, and churn. The UX Scorecard visualized design health, forecasted delivery timelines, and allowed the team to prioritize research with quantifiable outcomes.
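The forecasting layer need not be elaborate. A minimal sketch, assuming a naive moving-average model over recent sprint cycle times rather than whatever model the Scorecard actually used:

def forecast_cycle_days(history: list[float], window: int = 3) -> float:
    """Forecast next sprint's average design cycle time as the mean
    of the last `window` sprints (naive moving average)."""
    recent = history[-window:]
    return sum(recent) / len(recent)

# Six sprints of average cycle time, in days (placeholder data)
history = [9.1, 8.4, 8.0, 7.2, 6.8, 6.3]
print(f"Projected next sprint: {forecast_cycle_days(history):.1f} days")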
What began as a measurement framework evolved into a new operating model for design and engineering alignment.
Scaling Up
The system made UX performance transparent across leadership, product, and engineering.
Pre-development validation cycles reduced late-stage churn by 27 percent.
Design-to-dev handoff became a data-backed process, and research velocity improved by one-third.
Cross-functional collaboration strengthened. The UX Scorecard became part of sprint reviews and OKR planning, turning design metrics into a common performance language.
Cross-Functional Collaboration
UX became the connective layer between engineering and product.
Partnership with the VP of Engineering and CTO created shared visibility into quality and process efficiency.
Design reviews evolved into performance reviews, measuring impact through delivery outcomes instead of subjective aesthetics.
The UX team and engineering leads co-owned metrics like velocity, rework, and adoption, creating mutual accountability and operational trust.
“We moved from isolated outputs to a connected UX system. Brandon’s focus on predictive metrics helped us align faster and iterate smarter.”
Product Designer, Incident IQ
Key Initiatives
Designed and implemented the UX Performance Scorecard to measure UX ROI and velocity.
Integrated predictive research automation, improving speed by 31% and cutting rework by 35%.
Scaled the design system, increasing component reuse by 20% in nine months.
Embedded accessibility-first workflows, achieving WCAG 2.1 AA compliance.
Established persona systems, usability validation cycles, and research ops infrastructure.
Reduced late-stage dev churn by 27% through pre-dev validation and usability pass cycles.
Key Results
31% faster UX workflows
35% reduction in rework
27% decrease in late-stage dev churn
20% increase in design system reuse
Full WCAG 2.1 AA compliance across the platform
Financial Impact & Business Enablement
Data-driven UX improved predictability and reduced risk.
31% faster research workflows
35% reduction in rework
27% decrease in late-stage churn
20% higher component reuse
WCAG 2.1 AA accessibility achieved across core product lines
The outcome was a measurable UX ROI model that tied design directly to company growth metrics and investor confidence.
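One common shape for such a model, compressed to a sketch: value the engineering hours avoided through reduced rework against the UX program's cost. The function and every figure below are placeholders, not Incident IQ's numbers.

def ux_roi(hours_saved: float, hourly_cost: float, ux_investment: float) -> float:
    """Return ROI as a ratio: (value of avoided rework - cost) / cost."""
    value = hours_saved * hourly_cost
    return (value - ux_investment) / ux_investment

# Placeholder figures: 1,200 engineering hours avoided at $120/hr
# against a $90,000 UX program spend -> 60% ROI.
print(f"ROI: {ux_roi(1200, 120, 90_000):.0%}")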
The Impact
Design became data. The UX team gained the ability to forecast, prove value, and guide engineering decisions through measurable clarity.
What began as a visibility problem ended as an operational transformation.
Clarity made UX measurable. Measurement made UX essential.
Design pays for itself when it can prove it.
Role
Director of UX, reporting to the VP of Engineering and CTO.
Led UX strategy, design systems, accessibility, and research operations across the Incident IQ platform.
Directed cross-functional UX team spanning design, research, and content.
Built the company’s first UX performance framework and research ops infrastructure.
Partnered with engineering leadership to align design outcomes with delivery metrics.
Established design systems governance and accessibility-first standards across all workflows.