ENTERPRISE UX • PRODUCT DESIGN

Turning fragmentation data into actionable insights

Timeline

August - October 2025

Contributions

UX Research, UX Architecture, Wireframing, Rapid Prototyping

Team

Designer (Me), Data Integration Specialist, Director of CLD

Overview

Improving mining operations at Weir Motion Metrics.

Weir Motion Metrics develops AI-powered monitoring systems for mining operations. Working in the Cloud Team, I led user research for the Fragmentation Monitoring module of their cloud-based platform, MotionMetrics Pro ↗, focusing on improving how mining engineers analyze fragmentation patterns and diagnose operational issues.

Understanding The Product

How does Fragmentation Monitoring fit into the mining process?

In open-pit mining, blasting breaks rock into fragments that are then loaded and processed. Fragmentation Monitoring occurs during loading, where sensors on the shovel bucket capture rock size in real time. This data is used to evaluate blast performance and guide downstream decisions.

Blast determines fragmentation

Step 1: Blasting

Engineers use controlled blasting to break rocks into smaller fragments.

Real-time rock size capture

Step 2: Loading

Shovels collect fragmented rock as sensors capture rock size in real time.

Impacts downstream processing

Step 3: Processing

Material moves to the conveyor belt and crusher for further processing.

Illustrations generated using AI

What Was The Problem?

Fragmentation data was difficult to interpret and unreliable.

While the initial goal focused on improving chart readability, research revealed a deeper issue. Fragmentation data was presented as isolated measurements from individual shovel loads, making it difficult for engineers to interpret results within a broader operational context.

Challenge #1

Fragmentation data was siloed at the shovel level.

Fragmentation was analyzed one shovel at a time, limiting visibility into overall performance and broader patterns.

Fragmentation monitoring before redesign

Challenge #2

Raw data visualization obscured meaningful patterns.

Individual shovel loads were plotted as raw data points, creating visual noise, slowing load times, and obscuring meaningful trends or insights.

Particle size distribution timeline chart before redesign

The Solution

Designing a contextual fragmentation monitoring experience.

Trend Analysis

Operational Overview

Visual Validation

The dashboard, the primary entry point for Fragmentation Monitoring, previously displayed image-based shovel cards with limited analytical value. This was redesigned into time-series trend charts, enabling quick interpretation of fragmentation patterns and comparison across shovels.

Before: Image-based

After: Trend-based, comparable

Fragmentation Trends Across Shovels

Dashboard - Fragmentation Monitoring entry point


First Research Step

Understanding the existing user workflows.

I began by interviewing a Customer Success Manager who had direct visibility into how customers used the Fragmentation Monitoring module. This allowed me to understand the existing workflows and recurring pain points of users, as highlighted below:

"The current module supports only single-shovel analysis, requiring manual investigation through time ranges and image scanning. For broader analysis, users rely on external tools to connect fragmentation data with blast context."

— Reid W., Customer Success Manager

Customer Success Manager Interview

Manual Time-Based Investigation

Finding relevant images requires manually scanning time ranges, a slow, error-prone process with no filtering or search.

Isolated Shovel-Level Analysis

Each shovel's fragmentation data is analyzed in isolation, with no way to compare across shovels or tie results to blast context.


Reliance On External Tools

Users export fragmentation data into external tools to gain broader context, analysis that is not available inside the product.

Design Implications

Users need a way to compare fragmentation data across shovels and blast events, without leaving the platform or piecing together external tools. The design should bring a broader context into the module.

Validated by Survey Data

Survey insights reinforced the same limitations.

To validate whether these patterns extended beyond a single perspective, I reviewed prior user survey data, which revealed a consistent gap: fragmentation was presented at a single-shovel level, making it difficult for users to understand broader patterns or evaluate overall performance.

Joe O.

Support Team Member

"Not sure if PSD by bucket will be accurate enough to be valuable to our customers. Location-based data would be more relevant."

Carl T.

Quality Control Manager

"Location information would be best, but without it, users rely on rough estimates based on timestamps."

Gabriel H.

Sales Manager

"Customers would find it valuable to have a joint chart comparing PSD trends across multiple shovels in a single, unified view."

Competitive Analysis

How competing fragmentation analysis tools approach this problem.

I reviewed comparable industrial analytics and mining platforms to understand how they structure fragmentation analysis. Across competitors, fragmentation data is contextual, aggregated, and designed for comparison, highlighting a gap in the current module.

BlastIQ - FragTrack

→ Links fragmentation to spatial blast context

→ Uses spatial overlays to reveal patterns

→ Supports blast-level performance evaluation

Maptek - PointStudio

→ Combines visual overlays with aggregated PSD

→ Links blast design to performance insights

→ Tracks fragmentation blast by blast

Strayos - Fragmentation AI

→ Enables blast-wide performance insights

→ Analyzes fragmentation across the muck pile

→ Generates PSD distributions

Synthesizing Insights

Translating insights into design direction.

Four patterns emerged consistently across the interview and survey data, together pointing to the same underlying gap: fragmentation data exists, but the product doesn't give engineers the context to act on it.

Fragmentation data lacks operational context

Fragmentation data is isolated at the shovel level, limiting understanding of overall performance.

Current workflows support monitoring, not evaluation

Current workflows support quick checks, but not deeper evaluation of blast outcomes.

Users rely on external tools to fill the gap

Some users export data or build custom dashboards using external tools to connect data and gain broader insights.

There is a need for aggregated and comparative views

Users need the ability to view and compare fragmentation trends across shovels to identify variability and anomalies.

This led to one core question

How might we?

Move engineers from isolated shovel snapshots to a contextual, comparative understanding of fragmentation without adding workflow complexity?

Design Process

Collaborating to define the right solution aligned with constraints, goals, and impact.

I worked closely with the Director of the Cloud Team and a Data Integration Specialist through weekly scrum meetings to explore options, navigate technical constraints, and define a solution that delivers value across sites with varying data access and needs.

Exploring data visualizations

I researched and explored multiple data visualization patterns for presenting fragmentation data.

Aligning with technical constraints

I refined my designs and flows based on data availability, integration logic, and performance limitations across sites.

Converging on a scalable solution

I converged on a solution that balances clarity, performance, and flexibility across different site conditions.

What I Designed

Designed for mining engineers monitoring fragmentation performance.

The redesign enables engineers to assess performance, identify patterns, and validate results in one place, shifting fragmentation monitoring from isolated measurements to a contextual, actionable workflow.

User Scenario

An engineer opens the Fragmentation Monitoring module after a blast. They need to know — fast — whether fragmentation is within the target, which shovels are underperforming, and why.

Improvement #1

Dashboard — Quick Assessment

Engineers start by reviewing recent fragmentation performance across shovels. Trend visualizations replace bucket images, letting them spot anomalies and changes over time at a glance.

1. Trend lines replace bucket images

Showing a quick visualization of fragmentation from the last 24 hours of operation by shovel.

2. Consistent KPIs enable cross-shovel scanning

Users can select P-values or target particle sizes and set threshold values to visualize performance.
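The P-values above are percentile sizes of the particle size distribution: P80, for example, is the size below which 80% of fragments fall. A minimal sketch of how a P-value might be computed and checked against a target threshold (function names, field shapes, and the numbers are illustrative, not from the actual product):

```python
# Sketch: computing P-values (percentile particle sizes) from bucket-load
# measurements and flagging a shovel against a target threshold.
# All names and numbers here are hypothetical.

def p_value(sizes_mm, percentile):
    """Size (mm) below which `percentile` percent of fragments fall."""
    ordered = sorted(sizes_mm)
    # nearest-rank percentile, clamped to valid indices
    rank = max(0, min(len(ordered) - 1, round(percentile / 100 * len(ordered)) - 1))
    return ordered[rank]

def flag_shovel(sizes_mm, target_p80_mm):
    """Return (p80, within_target) for one shovel's recent loads."""
    p80 = p_value(sizes_mm, 80)
    return p80, p80 <= target_p80_mm

loads = [120, 95, 210, 80, 150, 60, 310, 140, 90, 175]  # fragment sizes, mm
p80, ok = flag_shovel(loads, target_p80_mm=250)
```

Rendering one such KPI per shovel is what makes the dashboard cards directly comparable at a glance.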

Improvement #2

Overview — Pattern Analysis

To understand broader trends, engineers compare fragmentation across multiple shovels using aggregated views, identifying variability and patterns that would be invisible at the shovel level.

1. Selected Shovel ID list indication

Confirms which shovels are included in the aggregated view, giving engineers context before reading any data.

2. Aggregated cross-shovel data

Fragmentation data is calculated across selected shovels to reveal patterns, variability, and outliers across operations.
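One simple way such cross-shovel aggregation could surface outliers is to compare each shovel's typical fragment size against the fleet-wide value. A sketch under assumed data shapes (the 25% deviation threshold and all identifiers are hypothetical):

```python
# Sketch: aggregating fragmentation measurements across selected shovels
# to surface variability and outliers. Data shapes are hypothetical.
from statistics import median

def aggregate(per_shovel):
    """per_shovel: {shovel_id: [fragment sizes in mm]} -> summary dict."""
    medians = {sid: median(sizes) for sid, sizes in per_shovel.items()}
    fleet_median = median(medians.values())
    # flag shovels whose median deviates more than 25% from the fleet median
    outliers = [sid for sid, m in medians.items()
                if abs(m - fleet_median) / fleet_median > 0.25]
    return {"fleet_median_mm": fleet_median, "outliers": outliers}

summary = aggregate({
    "Shovel A": [90, 110, 130, 120],
    "Shovel B": [100, 95, 115, 105],
    "Shovel C": [220, 260, 240, 210],  # markedly coarser than the fleet
})
```

This is exactly the kind of comparison that was invisible when each shovel was analyzed in isolation.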

Improvement #3

Gallery View — Visual Validation

When anomalies are detected, engineers review captured images to validate results. Images are now grouped by day and filterable by shovel, replacing a flat, unorganized list.

1. Shovel list navigation

A dedicated side panel allows users to filter images by shovel, enabling quicker access to relevant data.

2. Structured image grouping

Images are organized by day instead of a flat chronological list, improving performance and enabling faster navigation.
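The day-grouping and shovel-filter behavior can be sketched in a few lines; record fields and timestamps here are hypothetical, not the product's actual schema:

```python
# Sketch: grouping captured images by day and filtering by shovel,
# replacing a flat chronological list. Record fields are hypothetical.
from collections import defaultdict
from datetime import datetime

def group_by_day(images, shovel=None):
    """images: list of {'shovel': str, 'ts': ISO timestamp} -> {date: [records]}."""
    groups = defaultdict(list)
    for img in images:
        if shovel is not None and img["shovel"] != shovel:
            continue  # side-panel filter: keep only the selected shovel
        day = datetime.fromisoformat(img["ts"]).date()
        groups[day].append(img)
    return dict(groups)

images = [
    {"shovel": "A", "ts": "2025-09-01T08:15:00"},
    {"shovel": "B", "ts": "2025-09-01T09:40:00"},
    {"shovel": "A", "ts": "2025-09-02T07:05:00"},
]
by_day = group_by_day(images, shovel="A")
```

Grouping also helps performance: the UI can lazily load one day's images at a time instead of rendering the entire history.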

My Contributions

My contributions beyond Fragmentation Monitoring.

During my time at Weir Motion Metrics, I contributed to improving MotionMetrics Pro across product, design, and system-level initiatives.

Spanish translation project

Ensured UI consistency across translated strings by identifying and resolving layout, truncation, and formatting issues.

New feature brainstorming

Translated product requirements into early user flows, wireframes, and personas for new feature directions.

Design system components

The team was restructuring the design system, and I supported the effort by creating new reusable components.

Reflection

Growth through ambiguity.

This project and time at Motion Metrics marked an important chapter in my growth as a designer. While my time on the team ended unexpectedly due to a layoff, the experience was both bittersweet and formative. It reinforced how much I enjoy stepping into new learning curves, quickly building context in complex domains, and developing a genuine curiosity for the problems I'm solving.

Design as an evolving process.

This work reminded me that strong design rarely comes from finding the “right” answer upfront. It’s an iterative process shaped by asking better questions, reframing assumptions, and staying open to change. Especially when working with complex data, clarity comes not from simplifying too early, but from continuously refining how information is structured and understood.

Based in Vancouver, BC

© 2026 Min Kang
