The skinny

Issues with deployment and scaling are the real barriers holding back healthcare AI from delivering value, according to leaders at AI companies Nvidia and Hoppr. That's why they're shifting their focus away from building standalone models and zeroing in on the infrastructure needed for those models to actually be used in clinical practice. Hoppr has built an AI foundry that uses Nvidia's computing and foundation models, an offering the partners say gives developers access to tools for launching medical imaging AI more easily at scale.

Making AI deployment easier

"We're providing the platform where health systems, radiology practices and med device companies can now build their fine-tuned models very quickly and deploy them very quickly in their practice or in their product," said Hoppr CEO Khan Siddiqui.

Hospitals no longer need huge amounts of data or infrastructure to create their own models because Hoppr and Nvidia pre-train their foundation models on massive datasets, he pointed out. In the past, providers needed to purchase massive datasets containing about 100,000 patient records to train AI models, but pre-trained foundation models allow hospitals to shape models using much smaller datasets, sometimes containing just hundreds of records, Siddiqui stated.

Providers' internal model development

David Niewolny, global head of business development at Nvidia, said the AI foundry is a sign of a broader shift moving healthcare AI from isolated model development to a full ecosystem of tools that can be deployed directly into clinical workflows. He said Hoppr solves the "last mile" problem.

"Nvidia is providing the tools and the raw performance. Hoppr is taking that, and through the use of open models and the fine-tuning that they're doing, turns it into a much more turnkey clinical-grade AI that is designed to be run inside of hospitals," Niewolny remarked.
The partners' effort reflects a push to turn healthcare AI into something closer to a software development ecosystem than a collection of point solutions. They're betting that as foundation models and deployment platforms mature, providers will increasingly shift from purchasing AI applications to building and iterating on them internally.

— By Katie Adams