Hook

A strategy lead at one of our enterprise clients said:
“We have all this data, and zero confidence in what to do next.”

That insight shaped everything.

Many organizations face the same struggle: they’re flooded with data but lack a unified source of truth, the infrastructure to process it, or the expertise to extract real value.
The result? Low confidence in decisions that should be data-driven.

Our goal was to design a new kind of insight platform — one that empowers teams to plan, analyze, and act, without the weight of complexity that defines today’s tools.

Discovery & Research

Before jumping into solutions, we conducted a deep discovery phase to understand how strategy and analytics teams actually work — and where they get stuck.

We interviewed 10+ users across different roles (strategy, operations, data) and mapped their workflows across competitors’ existing platforms.

The key insights were clear:

  • Most users relied heavily on analysts to extract even basic insights.
  • Tools were perceived as overwhelming and rigid.
  • Strategic planning was often disconnected from real-time operational data.

These findings helped us define two guiding principles for design:

  1. Make data exploration self-serve.
  2. Bridge strategy and execution in one fluid experience.

Personas & Journeys

Based on our research, we created distinct personas — from data-savvy strategy leads to operations managers who needed clarity at a glance.
Their journeys highlighted moments of friction: unclear insight paths, overwhelming interfaces, and lack of trust in data outputs.

User flows

We mapped existing workflows and highlighted pain points in extracting insights, especially in moving from question → data → action.
These maps became a foundation for rethinking the interaction model.

Understanding the data

One critical insight: users weren’t just overwhelmed by volume — they didn’t understand how the models worked.
We collaborated with data scientists to unpack key inputs, outputs, and dependencies, and translated them into visual metaphors users could trust.

Mapping the needs

We grouped user needs into functional clusters: explore, compare, simulate.
This informed our modular architecture and helped prioritize which capabilities to surface first.
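The explore / compare / simulate clusters could map onto a modular architecture along these lines. This is a hypothetical sketch, not the shipped implementation; the module names, `Cluster` type, and `priority` field are illustrative:

```typescript
// Hypothetical sketch: each functional cluster from the research maps to a
// self-contained module that the dashboard shell registers and surfaces.
type Cluster = "explore" | "compare" | "simulate";

interface CapabilityModule {
  id: string;
  cluster: Cluster;
  priority: number; // lower = surfaced earlier in the UI
}

const modules: CapabilityModule[] = [
  { id: "trend-explorer", cluster: "explore", priority: 1 },
  { id: "scenario-diff", cluster: "compare", priority: 2 },
  { id: "what-if-simulator", cluster: "simulate", priority: 3 },
];

// Return the modules for a given cluster, highest priority first,
// so the shell knows which capability to surface at the top.
function surfaceModules(cluster: Cluster): CapabilityModule[] {
  return modules
    .filter((m) => m.cluster === cluster)
    .sort((a, b) => a.priority - b.priority);
}
```

Grouping capabilities this way keeps the prioritization decision ("which capability to surface first") in data rather than layout code.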

Wireframing & Prototyping

We started with low-fidelity wireframes to test different ways users might explore and simulate insights.
The goal was to reduce cognitive load, present predictions visually, and allow flexible "what-if" modeling without overwhelming the interface.

Each iteration focused on simplifying navigation and guiding users through layered data views — from high-level trends to model-level assumptions.

We tested early prototypes with internal users and domain experts. Their feedback helped us refine terminology, prioritize context over controls, and validate core interaction patterns before committing to visuals.

Usability Testing

We conducted a series of moderated usability tests with data analysts and business decision-makers across three partner organizations. The testing aimed to validate both the navigational flow and the interpretability of data visualizations.

Key findings included:

  • Users struggled to understand the predictive confidence scores without contextual hints.
  • Several users misinterpreted the filtering logic in the insight builder.
  • Stakeholders expressed the need for easier ways to compare historical and forecasted data side-by-side.

As a result, we iterated on the data labeling system, introduced contextual tooltips, and redesigned the filter panel to be more intuitive and modular.

Final UI

The final UI reflected a significant evolution from the initial concepts. We focused on delivering clarity, depth, and control — without overwhelming users.

Key design decisions:

  • A modular dashboard layout that allowed users to customize their view depending on their role (e.g., Analyst vs. Strategy Lead).
  • Predictive insights were surfaced prominently with clear confidence indicators and a supporting “Explain this Insight” modal to improve transparency.
  • The insight builder supported natural-language filters and scenario comparisons, making it easier for users to ask "what if?" without relying on SQL.

The UI was implemented using a design system we created from scratch, aligned with Curve’s visual identity and built to scale across future products.
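A design system built to scale typically starts from a single typed source of design tokens. The sketch below shows the general shape; the token names and values are illustrative, not Curve’s actual palette:

```typescript
// Hypothetical sketch of design-system tokens: one typed source of truth
// for color and spacing, consumable by any product built on the system.
const tokens = {
  color: {
    surface: "#ffffff",
    accent: "#2d5bff",
    confidenceHigh: "#1a7f4b",
    confidenceLow: "#b3261e",
  },
  spacing: { xs: 4, sm: 8, md: 16, lg: 24 }, // px
} as const;

// Resolve a dotted token path like "color.accent" at runtime,
// e.g. when themes are configured from data rather than code.
function token(path: string): string | number | undefined {
  return path
    .split(".")
    .reduce<any>((node, key) => (node == null ? undefined : node[key]), tokens);
}
```

Keeping tokens in one `as const` object means components reference semantic names (e.g. a confidence color) instead of raw hex values, which is what lets the system scale across future products.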

Impact

  • Enhanced Decision-Making: Users reported a 40% increase in confidence when making data-driven decisions.
  • Efficiency Gains: Reduced time spent on data analysis by 30%, streamlining business operations.
  • User Adoption: Achieved a 25% increase in user adoption rates within the first quarter post-launch.

Reflection / What I Learned

This project reinforced the importance of aligning technical complexity with user-centered simplicity. Balancing the needs of business and data teams required constant communication and iteration.
It also highlighted how a strong design narrative — supported by real data and testing — can help bring alignment between stakeholders with different priorities.
