175: Deploying Digital Pathology Tools - Challenges and Insights with Dr. Andrew Janowczyk
Why does it take three years to deploy a digital pathology tool that only took three weeks to build? That's the reality no one talks about, but every lab feels it every time they deploy a new tool.

In this episode, I sit down with Andrew Janowczyk, Assistant Professor at Emory University and one of the leading voices in computational pathology, to unpack the practical, messy, real-world truth behind deploying, validating, and accrediting digital pathology tools in the clinic.

We walk through Andrew's experience building and implementing an H. pylori detection algorithm at Geneva University Hospital, a project that exposed every hidden challenge in the transition from research to a clinical-grade tool.

From algorithmic hardening, multidisciplinary roles, usability studies, and ISO 15189 accreditation to the constant tug-of-war between research ambition and clinical reality, this conversation is a roadmap for anyone building digital tools that actually need to work in practice.

Episode Highlights

[00:00–04:20] Why multidisciplinary collaboration is the non-negotiable cornerstone of clinical digital pathology deployment
[04:20–08:30] Real-world insight: the H. pylori detection tool and how it surfaces the "top 20" most likely regions for pathologist review (a toy ranking sketch appears after the takeaways)
[08:30–12:50] The painful truth: algorithms take weeks to build but years to deploy, validate, and accredit
[12:50–17:40] Why curated research datasets fail in the real world (and how to fix it with unbiased data collection)
[17:40–23:00] Algorithmic hardening: turning fragile research code into production-ready clinical software
[23:00–28:10] Why every hospital is a snowflake: no standard workflows, no copy-paste deployments
[28:10–33:00] The 12 validation and accreditation roles every lab needs to define (EP, DE, QE, IT, etc.)
[33:00–38:15] Validation vs. accreditation: what they are, how they differ, and when each matters
[38:15–43:40] Version locking, drift prevention, and why monitoring is as important as deployment (a version-pinning and drift-check sketch appears after the takeaways)
[43:40–48:55] Deskilling concerns: how AI changes perception and what pathologists need before adoption
[48:55–55:00] Usability testing: why naive users reveal the truth about your UI
[55:00–61:00] Scaling to dozens of algorithms: bottlenecks, documentation, and the future of clinical digital pathology and AI workflows

Resources From This Episode

- Janowczyk & Ferrari: Guide to Deploying Clinical Digital Pathology Tools (discussed)
- Sectra Image Management System (IMS)
- Endoscopist deskilling risk after exposure to artificial intelligence in colonoscopy: a multicentre, observational study (PubMed)
- Digital Pathology 101 (Aleksandra Zuraw)

Key Takeaways

- Algorithm creation is the easy part; deployment is the mountain.
- Clinical algorithms require multidisciplinary ownership across 12 institutional roles.
- Real-world data is messy, and that is exactly why algorithms must be trained on it.
- No two hospitals are alike; every deployment requires local adaptation.
- Usability matters as much as accuracy; naive users expose real workflow constraints.
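To make the "top 20 regions" idea from the H. pylori segment concrete, here is a minimal sketch of how such a triage step could work: tile the slide, score each tile with a classifier, and keep the 20 highest-scoring regions for the pathologist to review first. The tile format, the score function, and the cutoff of 20 are illustrative assumptions on my part, not the actual Geneva implementation.

```python
# Hypothetical sketch: rank whole-slide-image tiles by classifier score
# and surface the top 20 for pathologist review. The tile format and
# score function are illustrative, not the tool discussed in the episode.
import heapq
from dataclasses import dataclass
from typing import Any, Callable, Iterable, List, Tuple

@dataclass
class Region:
    x: int        # tile origin on the slide, in pixels
    y: int
    score: float  # model-estimated likelihood of H. pylori in this tile

def top_k_regions(
    tiles: Iterable[Tuple[int, int, Any]],  # (x, y, tile pixel data)
    score_fn: Callable[[Any], float],       # e.g. a trained classifier
    k: int = 20,
) -> List[Region]:
    """Score every tile, keep only the k best in a min-heap, and return
    them sorted from most to least suspicious."""
    heap: List[Tuple[float, int, int]] = []
    for x, y, pixels in tiles:
        s = score_fn(pixels)
        if len(heap) < k:
            heapq.heappush(heap, (s, x, y))
        elif s > heap[0][0]:
            heapq.heapreplace(heap, (s, x, y))  # evict the current minimum
    return [Region(x=x, y=y, score=s) for s, x, y in sorted(heap, reverse=True)]
```

In a real deployment the ranked regions would presumably be pushed into the image management system's viewer for review; here they are simply returned as a sorted list.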
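The version-locking and drift segment also lends itself to a small sketch. One generic pattern (an assumption for illustration, not the hospital's actual monitoring stack) is to pin the exact model artifact by hash at validation time and to compare live score statistics against the validation baseline.

```python
# Hypothetical sketch of version locking plus a crude drift check.
# The hash pinning and mean-shift comparison are generic patterns,
# not the monitoring setup described in the episode.
import hashlib
import statistics
from typing import Sequence

def verify_model_version(model_path: str, expected_sha256: str) -> None:
    """Refuse to run unless the deployed model file matches the
    hash that was locked at validation time."""
    with open(model_path, "rb") as f:
        actual = hashlib.sha256(f.read()).hexdigest()
    if actual != expected_sha256:
        raise RuntimeError(
            f"Model differs from validated version: {actual} != {expected_sha256}"
        )

def score_drift(baseline: Sequence[float], live: Sequence[float],
                max_mean_shift: float = 0.05) -> bool:
    """Crude drift alarm: flag if the mean model score in production
    moves more than max_mean_shift away from the validation baseline."""
    return abs(statistics.mean(live) - statistics.mean(baseline)) > max_mean_shift
```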