Hook
A strategy lead at one of our enterprise clients put it bluntly:
"We have all this data, and zero confidence in what to do next."
That remark shaped everything that followed.
Many organizations face the same struggle: they’re flooded with data but lack a unified source of truth, the infrastructure to process it, or the expertise to extract real value.
The result? Low confidence in decisions that should be data-driven.
Our goal was to design a new kind of insight platform — one that empowers teams to plan, analyze, and act, without the weight of complexity that defines today’s tools.
Discovery & Research
Before jumping into solutions, we conducted a deep discovery phase to understand how strategy and analytics teams actually work — and where they get stuck.
We interviewed 10+ users across strategy, operations, and data roles, and mapped their workflows on existing competitor platforms.
The key insights were clear, and they gave us two guiding principles for the design.
Personas & Journeys
Based on our research, we created distinct personas — from data-savvy strategy leads to operations managers who needed clarity at a glance.
Their journeys highlighted moments of friction: unclear insight paths, overwhelming interfaces, and lack of trust in data outputs.
User flows
We mapped existing workflows and highlighted pain points in extracting insights, especially in moving from question → data → action.
These maps became a foundation for rethinking the interaction model.
Understanding the data
One critical insight: users weren’t just overwhelmed by volume — they didn’t understand how the models worked.
We collaborated with data scientists to unpack key inputs, outputs, and dependencies, and translated them into visual metaphors users could trust.
Mapping the needs
We grouped user needs into functional clusters: explore, compare, simulate.
This informed our modular architecture and helped prioritize which capabilities to surface first.
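As a rough sketch of what "functional clusters informing a modular architecture" can mean in practice, the three clusters can be expressed as independent module interfaces. All names here (`Insight`, `ExploreModule`, etc.) are hypothetical illustrations, not the platform's actual API:

```typescript
// Hypothetical sketch: explore / compare / simulate as separate modules.
interface Insight {
  id: string;
  metric: string;
  value: number;
}

interface ExploreModule {
  query(metric: string): Insight[];
}

interface CompareModule {
  diff(a: Insight, b: Insight): number;
}

interface SimulateModule {
  // Flexible "what-if" modeling: apply a relative change to a baseline.
  whatIf(base: Insight, delta: number): Insight;
}

// A minimal in-memory implementation showing how the clusters compose.
const insights: Insight[] = [
  { id: "q1", metric: "revenue", value: 100 },
  { id: "q2", metric: "revenue", value: 120 },
];

const explore: ExploreModule = {
  query: (metric) => insights.filter((i) => i.metric === metric),
};

const compare: CompareModule = {
  diff: (a, b) => b.value - a.value,
};

const simulate: SimulateModule = {
  whatIf: (base, delta) => ({
    ...base,
    id: `${base.id}-sim`,
    value: base.value * (1 + delta),
  }),
};

const [q1, q2] = explore.query("revenue");
console.log(compare.diff(q1, q2));           // 20
console.log(simulate.whatIf(q1, 0.5).value); // 150
```

Keeping the clusters behind separate interfaces is also what made prioritization tractable: each capability could be surfaced, or deferred, independently.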
Wireframing & Prototyping
We started with low-fidelity wireframes to explore different ways users might navigate and simulate insights.
The goal was to reduce cognitive load, present predictions visually, and allow flexible "what-if" modeling without overwhelming the interface.
Each iteration focused on simplifying navigation and guiding users through layered data views — from high-level trends to model-level assumptions.
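The layered-view idea can be sketched as a nested data structure the UI drills into: the trend layer is what users see first, and the model-level assumptions sit one level down. The shapes and values below are invented for illustration:

```typescript
// Hypothetical sketch: layered data views, from high-level trend
// down to model-level assumptions.
interface Assumption {
  name: string;
  value: number;
}

interface ModelView {
  assumptions: Assumption[];
}

interface TrendView {
  label: string;
  points: number[]; // the high-level trend the user sees first
  detail: ModelView; // the layer beneath it
}

const revenueTrend: TrendView = {
  label: "Quarterly revenue",
  points: [100, 110, 125],
  detail: {
    assumptions: [
      { name: "growthRate", value: 0.08 },
      { name: "churn", value: 0.02 },
    ],
  },
};

// The UI starts at the trend layer; drilling down exposes assumptions
// only when the user asks for them, keeping cognitive load low.
function drillDown(view: TrendView): Assumption[] {
  return view.detail.assumptions;
}

console.log(drillDown(revenueTrend).map((a) => a.name)); // ["growthRate", "churn"]
```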
We tested early prototypes with internal users and domain experts. Their feedback helped us refine terminology, prioritize context over controls, and validate core interaction patterns before committing to visuals.
Usability Testing
We conducted a series of moderated usability tests with data analysts and business decision-makers across three partner organizations. The testing aimed to validate both the navigational flow and the interpretability of data visualizations.
These findings led us to iterate on the data-labeling system, introduce contextual tooltips, and redesign the filter panel to be more intuitive and modular.
Final UI
The final UI reflected a significant evolution from the initial concepts. We focused on delivering clarity, depth, and control — without overwhelming users.
Key design decisions:
The UI was implemented using a design system we created from scratch, aligned with Curve’s visual identity and built to scale across future products.
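As a rough illustration of how a from-scratch design system might be structured to scale across products, here is a minimal token sketch. The token names and values are invented; Curve's actual system is not shown here:

```typescript
// Hypothetical sketch: design tokens as a typed, centralized structure.
type ColorToken = `#${string}`;

interface Tokens {
  color: Record<string, ColorToken>;
  spacing: Record<string, number>; // px
  font: { family: string; sizes: Record<string, number> };
}

const tokens: Tokens = {
  color: { primary: "#1A73E8", surface: "#FFFFFF", textMuted: "#5F6368" },
  spacing: { xs: 4, sm: 8, md: 16, lg: 24 },
  font: { family: "Inter, sans-serif", sizes: { body: 14, h1: 28 } },
};

// Components consume tokens rather than hard-coded values, which is
// what lets the visual language scale across future products.
function spacingVar(name: string): string {
  return `${tokens.spacing[name]}px`;
}

console.log(spacingVar("md")); // "16px"
```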
Impact
Reflection / What I Learned
This project reinforced the importance of aligning technical complexity with user-centered simplicity. Balancing the needs of business and data teams required constant communication and iteration.
It also showed how a strong design narrative, supported by real data and testing, can align stakeholders with different priorities.