Intel and Philips Partner to Speed Up Imaging Analysis Using AI

By HospiMedica International staff writers
Posted on 22 Aug 2018
Intel Corporation (Santa Clara, CA, USA) and Royal Philips (Amsterdam, Netherlands) have tested two healthcare use cases for deep learning inference models: one on X-rays of bones for bone-age-prediction modeling and the other on CT scans of lungs for lung segmentation.
In these tests, which were conducted using Intel Xeon Scalable processors and the OpenVINO toolkit, the researchers achieved a speed improvement of 188 times for the bone-age-prediction model and 38 times for the lung-segmentation model over the baseline measurements. These tests show that healthcare organizations can implement artificial intelligence (AI) workloads without expensive hardware investments.
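For context, the sketch below shows the general shape of running such an inference model on a CPU with OpenVINO's Python API (the current openvino.runtime interface, which postdates the 2018 tests). The model file name, input shape, and preprocessing are illustrative assumptions, not details reported by Intel or Philips.

```python
# Minimal sketch, not Philips' actual pipeline: load a converted IR model and
# run single-image inference on a CPU with OpenVINO.
import numpy as np
from openvino.runtime import Core

core = Core()
model = core.read_model("lung_segmentation.xml")   # hypothetical IR model file
compiled = core.compile_model(model, "CPU")        # target an Intel Xeon CPU

# Stand-in for a preprocessed CT slice (batch of 1, single channel, 512x512).
ct_slice = np.random.rand(1, 1, 512, 512).astype(np.float32)
output = compiled([ct_slice])[compiled.output(0)]  # run inference, take the first output
print(output.shape)
```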

Medical image files are growing along with improvements in image resolution, with most images now 1 GB or larger in size. More healthcare organizations are using deep learning inference to review patient images more quickly and accurately. AI techniques such as object detection and segmentation can help radiologists identify issues faster and more accurately, which can translate into better prioritization of cases, better outcomes for more patients, and reduced costs for hospitals. Deep learning inference applications typically process workloads in small batches or in a streaming manner, so they do not exhibit large batch sizes. Until recently, graphics processing units (GPUs) were the predominant hardware choice for accelerating deep learning. By design, GPUs work well with images, but they also have inherent memory constraints that data scientists have had to work around when building some models.
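As a rough illustration of that small-batch, streaming pattern, the sketch below processes studies one at a time with a batch size of 1. The function names and preprocessing are placeholders; `infer` stands in for any compiled model callable (for example, the OpenVINO sketch above wrapped as `lambda x: compiled([x])`).

```python
# Illustrative streaming loop: one image per inference request, no batching.
import numpy as np

def preprocess(path: str) -> np.ndarray:
    # Placeholder: load and normalize one study into a (1, C, H, W) tensor.
    return np.zeros((1, 1, 512, 512), dtype=np.float32)

def stream_inference(paths, infer):
    for path in paths:
        yield path, infer(preprocess(path))  # batch dimension stays at 1
```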

Central processing units (CPUs), such as Intel Xeon Scalable processors, do not have such memory constraints and can accelerate complex, hybrid workloads, including the larger, memory-intensive models typical of medical imaging. For a large subset of AI workloads, CPUs can therefore meet the needs of data scientists better than GPU-based systems. Running healthcare deep learning workloads on CPU-based hardware also benefits companies such as Philips directly, because it lets them offer AI-based services without driving up costs for their end customers.

“Intel Xeon Scalable processors appear to be the right solution for this type of AI workload. Our customers can use their existing hardware to its maximum potential, while still aiming to achieve quality output resolution at exceptional speeds,” said Vijayananda J., chief architect and fellow, Data Science and AI at Philips HealthSuite Insights.

