Fortune 100 client
2022-2024
Designing trust and actionability in AI-assisted decision systems.
Problem area
Why adoption, not model accuracy, was the real problem
The AI model was working — it flagged anomalies and offered insights — but adoption was low. Engineers didn’t trust the output, didn’t know what to prioritize, and were stuck jumping between tools. It wasn’t an algorithm problem — it was a UX one. We needed to make the AI interpretable, actionable, and actually usable in a real manufacturing environment.
Design goals
Design goals in a safety-critical context
Design decision 01
Making AI output interpretable through feedback loops
Engineers were overwhelmed by flagged anomalies but lacked tools to validate or understand them. I designed clustering by signal pattern to help them spot trends across transmissions and added in-app feedback mechanisms so they could confirm or reject anomalies without needing ML expertise. I also introduced warranty correlation views to show when AI predictions matched real-world failures. These changes reduced manual triage, fed validated labels back into model training, and made the AI's behavior more transparent and trustworthy.
Key Design Highlights
Design decision 02
Workflow redesign for enterprise efficiency
Before our redesign, anomaly investigations were spread across emails, spreadsheets, and side chats. I mapped the real workflows and redesigned the tool to support centralized reviews, hold requests, and status tracking — all in-app. This eliminated context switching, improved throughput, and enabled cross-team collaboration in a way that was visible, structured, and easy to follow.
Key Design Highlights
Design decision 03
Driving product adoption and building trust
Adoption was low because engineers didn't trust the AI or see its impact. I redesigned the homepage to surface the most urgent issues first, added timelines and comparison views to support better judgment, and built feedback views showing where AI predictions matched real failures. We also integrated the tool into the plant's hold system, connecting AI actions to real outcomes. As a result, usage increased, trust improved, and the tool expanded to 8 plants.
Key Design Highlights
Retrospective