Overview
Incident IQ is a K–12 workflow and service management platform used by more than 2,000 districts and 12 million students. It connects IT help desks, asset management, facilities, and HR workflows into one ecosystem. The company has raised $40M in growth investments from JMI Equity and Cove Hill Partners.
Incident IQ powers the infrastructure behind everyday K–12 operations for districts serving millions of students.
The challenge wasn’t adoption -- it was visibility. UX was moving fast, but its impact wasn’t measurable. Teams couldn’t quantify where design was saving time, reducing churn, or improving developer efficiency. The goal was to transform UX from a service function into a performance system -- something that could measure, forecast, and scale.
Vision & Context
When I joined, UX impact was anecdotal. Engineers shipped features quickly, but without metrics, design was reactive instead of predictive.
The vision was to build a measurable UX organization that mirrored the operational rigor of engineering -- complete with KPIs, automation, and accountability. UX needed structure: performance data, design systems, accessibility compliance, and a way to prove its value through delivery.
The objective was simple -- make UX measurable, repeatable, and indispensable.
Understanding the Problem
Before we introduced performance systems, the UX team operated without consistent visibility into efficiency or outcomes. Work was happening, but impact wasn’t traceable.
Key challenges included:
No structured UX performance metrics or KPIs
Limited visibility into reuse, adoption, and design ROI
Difficulty forecasting design velocity and delivery dependencies
Backlog unpredictability and weak UX-Engineering synchronization
Accessibility handled reactively instead of as a system
Without data, UX couldn’t advocate for resourcing or process improvements. It was time to operationalize design.
Strategic Approach
The solution began with the UX Performance Scorecard, a data-driven framework for quantifying UX efficiency, velocity, and ROI.
It integrated research ops, Jira tracking, and design system analytics into a unified dashboard.
This allowed the team to measure reuse rates, track late-stage rework, and identify bottlenecks early.
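The scorecard's headline rates reduce to simple ratios over tagged work items. A minimal Python sketch, with hypothetical field names (`phase_found`, `is_rework`, `used_system_components`) standing in for whatever the actual tracking data records:

```python
from dataclasses import dataclass

@dataclass
class WorkItem:
    id: str
    phase_found: str            # hypothetical: "design", "pre-dev", or "post-dev"
    is_rework: bool             # flagged when the item redoes earlier work
    used_system_components: bool  # flagged when the design reused system components

def scorecard_metrics(items: list[WorkItem]) -> dict[str, float]:
    """Roll raw work-item data up into the three headline scorecard rates."""
    total = len(items)
    return {
        # Share of work built from design-system components (reuse rate)
        "reuse_rate": sum(i.used_system_components for i in items) / total,
        # Share of work that redid something already shipped (rework rate)
        "rework_rate": sum(i.is_rework for i in items) / total,
        # Share of issues caught after development began (late-stage rate)
        "late_stage_rate": sum(i.phase_found == "post-dev" for i in items) / total,
    }
```

The value is less in the arithmetic than in the tagging discipline: once every ticket carries these fields, the rates fall out of the backlog for free.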
Predictive UX research automation was introduced, improving workflow speed by 31% and reducing rework by 35%.
Design system component reuse increased 20% in nine months, creating measurable proof of scale.
Accessibility-first standards were embedded across product workflows, achieving WCAG 2.1 AA compliance and improving user confidence in K–12 environments.
Design System & Accessibility Foundation
While metrics created visibility, the design system created consistency.
Phase 1 established the foundation: component libraries, naming conventions, and usage tracking. Every component was versioned, documented, and tied to analytics so adoption could be measured across products.
Phase 2 focused on scalability and compliance. Components were rebuilt with accessibility baked in -- color contrast, keyboard navigation, focus states, and ARIA labeling were standardized and tested in partnership with Level Access.
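Part of this is mechanically checkable. The WCAG 2.1 contrast-ratio formula, for instance, is defined precisely enough in the spec to run as an automated gate on component color tokens. A small Python sketch of the spec's relative-luminance and contrast calculation:

```python
def _channel(c: int) -> float:
    """Linearize one sRGB channel per the WCAG 2.1 relative-luminance definition."""
    c = c / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb: tuple[int, int, int]) -> float:
    r, g, b = (_channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: tuple[int, int, int], bg: tuple[int, int, int]) -> float:
    """(L1 + 0.05) / (L2 + 0.05), lighter luminance on top."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

def passes_aa(fg, bg, large_text: bool = False) -> bool:
    """WCAG 2.1 AA: 4.5:1 for normal text, 3:1 for large text."""
    return contrast_ratio(fg, bg) >= (3.0 if large_text else 4.5)
```

Checks like keyboard navigation and ARIA labeling still need audits and assistive-technology testing, which is where the Level Access partnership came in; contrast is simply the piece a build pipeline can enforce on its own.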
This work aligned with the new ADA Title II web accessibility requirements for K–12 institutions, ensuring long-term compliance and reducing legal risk. We introduced an ADA Compliance Impact Assessment, mapping how accessibility influenced district readiness, customer retention, and market expansion.
Accessibility wasn’t treated as a checklist. It was infrastructure.
System Architecture Thinking
UX evolved from aesthetics to architecture.
The Scorecard visualized UX performance. The design system standardized delivery. Together, they created a model for measurable clarity.
Each layer -- metrics, research, accessibility, and systems -- worked in harmony. Reuse became trackable. Rework became predictable. Velocity became a shared language across teams.
UX operated like an internal service, measurable and accountable.
Scaling Up
Once the foundation was in place, we scaled horizontally across functions.
Pre-dev validation cycles were introduced to catch misalignment early, cutting late-stage churn by 27%.
UX and QA collaboration became standard, reducing design-related defects and improving release confidence.
The backlog evolved from reactive requests into a prioritized, strategic roadmap aligned with engineering velocity.
UX representation in sprint planning became the norm, aligning design dependencies before code.
DesignOps practices were formalized -- automated intake systems, clear ownership models, and visibility dashboards for all design work.
UX became a living part of the product pipeline.
Cross-Functional Collaboration
Integration was everything.
UX embedded directly with engineering, product, and QA, operating within the same systems, language, and metrics. The team reported through the VP of Engineering, giving UX shared accountability for delivery speed and quality.
Weekly tactical meetings ensured UX visibility into dependencies and reduced sprint friction.
UX/PM Research Frameworks standardized how research fed into product decisions, aligning insights with business outcomes.
UX and Engineering co-owned metrics for velocity, rework, and adoption -- transforming design from a creative partner into an operational driver.
DesignOps & Operational Clarity
DesignOps connected people, process, and performance.
I built automated dashboards that tracked UX workload, backlog movement, and team contribution rates in real time.
Azure DevOps was integrated with Asana, creating a single source of truth for design progress, dependencies, and outcomes.
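At its core, a single source of truth like this reduces to matching records on a shared key and letting one system's status win. A hypothetical Python sketch of that reconciliation logic (field names invented; the real Azure DevOps and Asana API calls are omitted):

```python
def reconcile(devops_items: list[dict], asana_items: list[dict]) -> list[dict]:
    """Return the Asana status updates needed so Azure DevOps remains the
    source of truth. Records are matched on a hypothetical shared key,
    'external_id', stamped on both sides at intake."""
    by_id = {a["external_id"]: a for a in asana_items}
    updates = []
    for d in devops_items:
        a = by_id.get(d["external_id"])
        # Only emit an update when the two systems disagree
        if a is not None and a["status"] != d["status"]:
            updates.append({"task": a["task_id"], "status": d["status"]})
    return updates
```

Keeping the diff logic separate from the API plumbing also makes the sync auditable: the list of pending updates is itself a report of where the two tools had drifted apart.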
Documentation across Notion and Confluence became living systems -- centralizing personas, usability findings, and design validations.
This operational clarity turned UX into a forecastable function -- one that could show measurable ROI and predict delivery timelines with accuracy.
“We moved from isolated outputs to a connected UX system. Brandon’s focus on predictive metrics helped us align faster and iterate smarter.”
Product Designer, Incident IQ
Key Initiatives
Designed and implemented the UX Performance Scorecard, transforming UX from qualitative to data-driven by tracking efficiency, velocity, and ROI.
Integrated predictive UX research automation, improving workflow speed by 31% and reducing rework by 35%.
Scaled the Design System, increasing component reuse by 20% in nine months and creating a measurable foundation for UI consistency.
Established accessibility-first workflows, achieving WCAG 2.1 AA compliance across all core products and reducing user friction in key flows.
Partnered with Level Access to audit and remediate accessibility gaps, creating a sustainable framework for ongoing ADA Title II compliance.
Created a UX/PM Research Framework aligning research findings with business priorities, improving roadmap accuracy and feature validation.
Built a UX Research Repository centralizing personas, usability studies, and insights for cross-team access and reuse.
Established DesignOps infrastructure, including automated intake, visibility dashboards, and backlog alignment to reduce sprint friction.
Introduced pre-dev validation and usability pass cycles, cutting late-stage development churn by 27%.
Developed QA collaboration workflows, integrating UX validation checkpoints into testing processes to reduce design-related defects.
Implemented UX performance dashboards in Notion and Azure DevOps for real-time reporting on adoption, progress, and contribution metrics.
Integrated UX metrics into leadership reporting, visualizing team efficiency, design impact, and product velocity improvements.
Led Design System Phase 2, embedding accessibility and documentation standards, ensuring scalable governance and maintainability.
Created UX/QA validation checklists to ensure every design met Definition of Ready and Definition of Done criteria before handoff.
Formalized UX tactical alignment with engineering, embedding design discussions in sprint planning and dependency reviews.
Established UX workload and efficiency tracking, balancing team distribution and optimizing throughput.
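Workload and efficiency tracking of the kind listed above can start from something as simple as a balance score. A sketch, assuming open-item counts per designer are available from the intake system:

```python
from statistics import mean, pstdev

def workload_balance(assignments: dict[str, int]) -> float:
    """Coefficient of variation of open items per designer.
    0.0 means perfectly even distribution; higher values flag imbalance
    worth rebalancing at the next sprint planning."""
    counts = list(assignments.values())
    return pstdev(counts) / mean(counts)
```

A single number like this will not capture item complexity, but tracked week over week it makes distribution drift visible before it becomes a throughput problem.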
Financial Impact & Business Enablement
Data-driven UX reduced waste, improved predictability, and expanded usable capacity across product and engineering. Faster workflows, lower churn, and higher component reuse created measurable cost savings without adding headcount. Together, faster UX workflows, reduced rework, fewer late-stage fixes, and increased design system reuse returned hundreds of thousands of dollars in regained engineering and design capacity.
Based on engineering multipliers and average salary load, the program delivered an estimated $360k to $615k in annual operational value.
31% faster UX workflow speed, saving $128k
35% less rework, saving $360k
27% fewer late-cycle fixes, saving $80k
20% higher design system reuse, saving $45k
WCAG 2.1 AA accessibility achieved across core product lines
Estimated annual value created: $360k to $615k in saved engineering hours, avoided rework, and reduced delivery risk
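As a sanity check, the itemized savings above can be summed directly; the total lands at the top of the stated $360k to $615k range:

```python
# Itemized annual savings from the case study (regained engineering and
# design capacity, per the figures above).
savings = {
    "31% faster UX workflow speed": 128_000,
    "35% less rework": 360_000,
    "27% fewer late-cycle fixes": 80_000,
    "20% higher design-system reuse": 45_000,
}

total = sum(savings.values())
print(f"Total itemized annual savings: ${total:,}")
```

The lower bound of the range reflects more conservative engineering multipliers and salary-load assumptions than the itemized figures.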
The Impact
The work made design measurable. The numbers made UX matter.
Once we tracked the real cost of drag, rework, and late fixes, UX stopped being treated like overhead. The Scorecard turned guesswork into facts. The design system turned consistency into speed.
What started as a visibility problem became a structural shift in how the org worked. UX got clearer. Engineering got faster. Decisions got cleaner.
Clarity showed the impact. Measurement proved the value. In the end, design paid for itself.
Role
Director of UX, reporting to the VP of Engineering and CTO.
Led UX strategy, design systems, accessibility, and research operations across the platform.
Directed a cross-functional team spanning design, research, and content. Built the company’s first UX performance framework and DesignOps infrastructure.
Partnered with engineering and product leadership to align UX outcomes with delivery metrics, ensuring consistency, predictability, and measurable ROI.