Industry
Headquarters
Baltimore, MD
Founded
2010
Company Size
Key Markets
1M+ users globally
Growth Stage
ARR $8M by 2023
Website
Use case
Timeline planning, task scheduling, team coordination
Highlights
Improved timeline usability without formal analytics
Reduced confusion in core scheduling workflows
Grounded decisions in real user behavior
Established research patterns that informed product direction
Overview
Early in TeamGantt's life, data was limited.
We lacked strong behavioral analytics, and there was no clear funnel showing where users struggled or dropped off.
At the same time, the product was evolving quickly, and decisions still had to be made.
Understanding the Problem
The lack of data created uncertainty. It was not clear where friction existed or which problems mattered most.
Limited visibility into user behavior
No reliable analytics to guide prioritization
Risk of solving the wrong problems
High dependence on interpretation
The challenge was not the absence of signals. It was knowing which signals to trust.
Strategic approach
The approach shifted from indirect data to direct observation.
Learning from real behavior
Instead of waiting for analytics, I focused on how users actually worked.
I reviewed support tickets, read customer feedback, and observed how teams planned projects in the timeline. Patterns emerged quickly:
where users hesitated, where they made mistakes, and where workflows broke down.
Design decisions were based on what users did, not what we assumed.
Iteration followed the same model: observe, adjust, validate in real use.
Principle: When analytics are weak, behavior is the strongest signal.
Key Initiatives
Direct observation of planning workflows
Analytics did not clearly show where users struggled.
What I did
Watched how teams built and edited plans
Identified friction points in real workflows
Mapped behavior patterns across different use cases
What changed
Clear understanding of where the product broke down
Better prioritization of UX improvements
Design grounded in actual usage
Support and feedback synthesis
Customer signals were fragmented across channels.
What I did
Reviewed support tickets and feedback consistently
Grouped issues by recurring patterns
Used qualitative data to guide design direction
What changed
Faster identification of common problems
Stronger alignment between user needs and product decisions
Reduced reliance on assumptions
Iterative design based on real usage
Decisions needed validation without formal metrics.
What I did
Shipped improvements in small increments
Observed how users responded in real workflows
Adjusted based on behavior, not speculation
What changed
Continuous improvement loop grounded in reality
Higher confidence in design decisions
More stable evolution of the product
Additional improvements
Reduced confusion in scheduling interactions
Improved usability of the timeline surface
Strengthened user mental models
Built early research discipline without formal tooling
Cross-Functional Collaboration
Worked closely with product and engineering to align on observed user behavior.
Shared real examples instead of abstract data.
Used user workflows as the basis for prioritization and tradeoffs.
Financial Impact & Business Enablement
The product improved without relying on traditional analytics.
Decisions became clearer because they were grounded in what users actually did.
The timeline became easier to use, and scheduling felt more predictable.
Reduced support volume through clearer workflows
Improved retention through better usability
More efficient prioritization without heavy analytics investment
Faster iteration cycles based on real feedback
Takeaway
Data is useful. Behavior is decisive.
When you can see how people use a product, you know what to fix.
Role
Head of Product Design and Design Systems
Led UX research and product direction in early-stage conditions. Replaced missing analytics with direct observation, customer feedback, and iterative design to improve usability and guide product evolution.
Related Case Studies
Reducing onboarding friction by restructuring how users reach first value.
Making dependency behavior predictable in a visual planning system.
Scaling a product with limited design resources by building systems, not screens.
Lowering early drop-off by restructuring how users reach first value.
Keep the product focused on the planning model, not expanding features.
Making product decisions that preserve control and predictability.
Designing scheduling behavior that feels powerful without becoming unpredictable.
Using direct observation to guide product decisions when analytics are incomplete.
Turning design from a service into a system tied to delivery outcomes.
Driving adoption and executive buy-in through phased, measurable impact.
Turning design into a decision driver by making its impact visible.
Turning fragmented research into a system that guides product decisions.