Med Tech Standards: Why DICOM is Stuck in the 90s and What Needs to Change

You probably don’t think much about medical scan data. But it’s everywhere.

If you’ve ever had an X-ray or an MRI, your images were almost certainly stored and shared using DICOM (Digital Imaging and Communications in Medicine). Since the late 80s and early 90s, it has been the globally accepted standard for exchanging medical imaging data, such as X-rays, MRIs, and CT scans, between hospitals, clinics, and research institutions.

But there’s a problem: while medical technology has made incredible leaps in the last 30 years, DICOM hasn’t kept up.

What is DICOM anyway?

DICOM still operates in ways that feel more suited to a 1990s environment of local networks and limited computing power. Despite updates, it doesn’t meet the demands of cloud computing, AI-driven diagnostics, and real-time collaboration: it lacks cloud-native support, relies on rigid file structures, and behaves inconsistently from one manufacturer to the next.

If your doctor still hands you a CD with your scan on it in 2025 (!), DICOM is a big part of that story.


The DICOM Legacy

How DICOM Came to Be

When DICOM was developed in the 1980s, the focus was on solving some big problems in medical imaging, and honestly, it did the job brilliantly for its time.

The initial idea was to create a universal language for different hardware and software platforms to communicate with each other, sort of like building a shared language for technology. They also had to make sure it was compatible with older devices already in use.

At that time, the most practical option was to rely on local networks since cloud-based solutions simply didn’t exist yet.

These decisions helped DICOM become the go-to standard, but they also locked it into an outdated framework that’s now tough to update.

Why It’s Hard to Change DICOM

Medical standards don’t evolve as fast as consumer technology like phones or computers. Changing something like DICOM doesn’t happen overnight. It’s a slow and complicated process, muddled by layers of regulatory approvals and opinions from a tangled web of organizations and stakeholders.

What’s more, hospitals have decades of patient data tied to these systems, and making big changes that may break compatibility isn’t easy.

And to top it all off, device manufacturers have different ways of interpreting and implementing DICOM, so it’s nearly impossible to enforce consistency.

The Trouble With Staying Backwards Compatible

DICOM’s focus on working perfectly with old systems was smart at the time, but it’s created some long-term problems.

Technology has moved on to AI, cloud storage, and tools for real-time diagnostics, and these innovations quickly exposed how hard it is for DICOM to keep up. On top of that, vendor-specific implementations have created quirks that make devices less compatible with one another than they should be.

And don’t even get us started on trying to link DICOM with modern healthcare systems like electronic health records or telemedicine platforms. It’s like trying to plug a 1980s gadget into a smart technology ecosystem: not impossible, but far from seamless.


Why Your CT Scanner and MRI Machine Aren’t Speaking the Same Language

Interoperability in medical imaging sounds great in theory: everything just works, no matter the device or manufacturer. In practice, though, things get messy. Some of these issues sound abstract, but for doctors and hospitals they mean delays, misinterpretations, and extra work. So, why don’t devices always play nice?

The Problem With “Standards” That Aren’t Very Standard

You’d think having a universal standard like DICOM would ensure easy interoperability because everybody follows the same rules.

Not exactly. Device manufacturers implement it differently, and this leads to:

  • Private tags. These are proprietary pieces of data that only specific software can understand. If your software doesn’t understand them, you’re out of luck.
  • Missing or vague fields. Some devices leave out crucial metadata or define it differently.
  • File structure issues. Small differences in how data is formatted sometimes make files unreadable.

The idea of a universal standard is nice, but the way it’s applied leaves a lot to be desired.
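
To make this concrete, here is a minimal, hypothetical sketch of the defensive reading most integration code ends up doing. It uses the open-source pydicom library; the file name and the private tag shown are placeholders chosen purely for illustration.

    import pydicom

    # Placeholder file name; any single-slice DICOM file would do.
    ds = pydicom.dcmread("ct_slice_0001.dcm")

    # Standard attributes are easy when they exist...
    print("Modality:", ds.get("Modality", "unknown"))

    # ...but crucial metadata is sometimes simply absent, so defensive access is the norm.
    if ds.get("SliceThickness") is None:
        print("This vendor did not record SliceThickness")

    # Private tags live in odd-numbered groups and only mean something to the
    # vendor's own software. (0x0029, 0x1010) is just an illustrative example;
    # without the vendor's private dictionary it is effectively opaque data.
    private_elem = ds.get((0x0029, 0x1010))
    if private_elem is not None:
        print("Found a vendor-specific private element:", private_elem.tag)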

Metadata and Tag Interpretation Issues

DICOM images contain extensive metadata describing details like how the patient was positioned during the scan or how the images fit together. But when that metadata isn’t recorded consistently, interpretation problems follow.

For example, inconsistencies in slice spacing or image order can throw off 3D reconstructions, leaving scans misaligned. As a result, when doctors try to compare scans over time or across different systems, they often have to deal with mismatched or incomplete data.

These inconsistencies make what should be straightforward tasks unnecessarily complicated and create challenges for accurate diagnoses and proper patient care.
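
As a practical illustration, a quick consistency check like the sketch below is often the only way to catch these problems before they reach a 3D reconstruction. It is a hedged example using pydicom and NumPy, with placeholder file names and the assumption of a roughly axial series.

    import numpy as np
    import pydicom

    # Placeholder file names for one series stored as single-slice files.
    slices = [pydicom.dcmread(f"slice_{i:03d}.dcm") for i in range(1, 101)]

    # ImagePositionPatient holds each slice's origin in patient coordinates;
    # for a roughly axial series the third component is the head-to-feet axis.
    z_positions = sorted(float(ds.ImagePositionPatient[2]) for ds in slices)
    gaps = np.diff(z_positions)

    # SliceThickness describes the slab thickness, not necessarily the distance
    # between slices, and the two are frequently confused across vendors.
    thickness = float(slices[0].SliceThickness)

    if not np.allclose(gaps, gaps[0], atol=0.01):
        print("Warning: non-uniform slice spacing; 3D reconstruction may be skewed")
    if abs(gaps[0] - thickness) > 0.01:
        print(f"Spacing {gaps[0]:.2f} mm differs from SliceThickness {thickness:.2f} mm")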

File Structure and Storage Inconsistencies

The way images are stored varies so much between devices that it often causes problems.

Some scanners save each image slice separately. Others put them together in one file. Then there are slight differences in DICOM implementations that make it difficult to read images on some systems. Compression adds another layer of complexity — it’s not the same across the board. File sizes and levels of quality vary widely.

All these mismatches and inconsistencies make everything harder for hospitals and doctors trying to work together.
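
Here is a small sketch of the kind of probing this forces on receiving systems: checking whether a file is compressed and whether it carries one frame or a whole stack. It uses pydicom with a placeholder file name.

    import pydicom

    ds = pydicom.dcmread("image.dcm")  # placeholder file name

    # The transfer syntax says how the pixel data is encoded; compression schemes
    # and their support vary widely between vendors and viewers.
    transfer_syntax = ds.file_meta.TransferSyntaxUID
    print("Transfer syntax:", transfer_syntax.name)
    print("Compressed:", transfer_syntax.is_compressed)

    # Some devices write one slice per file, others pack an entire stack into a
    # single multi-frame object; receiving code has to handle both.
    frames = int(ds.get("NumberOfFrames", 1))
    print("Frames in this file:", frames)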


Orientation and Interpretation Issues

Medical imaging is incredible, but working with scans can slow things down when time matters most and make it harder to get accurate insights for patient care.

There are several reasons for this.

Different Coordinate Systems

DICOM also permits the use of different coordinate systems, which causes confusion.

For instance, patient-based coordinates relate to the patient’s body, like top-to-bottom (head-to-feet) or side-to-side (left-to-right). Scanner-based coordinates, on the other hand, are based on the imaging device itself.

When these systems don’t match up, it creates misalignment issues in multi-modal imaging studies, where scans from different devices need to work together.
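
The usual fix is to translate everything into the patient frame using the mapping DICOM itself defines from pixel indices to patient coordinates, built from ImagePositionPatient, ImageOrientationPatient, and PixelSpacing. A minimal sketch with pydicom and NumPy, using a placeholder file name:

    import numpy as np
    import pydicom

    ds = pydicom.dcmread("mr_slice.dcm")  # placeholder file name

    # ImageOrientationPatient holds two direction cosines in patient (LPS) space:
    # the first three values point along increasing columns, the last three along
    # increasing rows.
    iop = np.array(ds.ImageOrientationPatient, dtype=float)
    col_dir, row_dir = iop[:3], iop[3:]

    origin = np.array(ds.ImagePositionPatient, dtype=float)  # centre of the top-left pixel
    row_spacing, col_spacing = (float(v) for v in ds.PixelSpacing)

    def pixel_to_patient(row, col):
        # Map a (row, col) pixel index to millimetre coordinates in the patient frame.
        return origin + col * col_spacing * col_dir + row * row_spacing * row_dir

    print(pixel_to_patient(0, 0))      # should equal ImagePositionPatient
    print(pixel_to_patient(255, 255))  # opposite corner of a 256 x 256 slice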

Slice Ordering Problems

Scans like MRIs and CTs are made up of thin cross-sectional images called slices. But not every scanner orders or numbers these slices in the same way.

Some scanners store slices top-to-bottom, others bottom-to-top. If the order isn’t clear, reconstructing 3D models becomes harder. Certain scanners also use inconsistent slice numbering, which makes volume alignment challenging.
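
One widely used workaround, sketched below under the assumption that all slices in the series share the same orientation, is to ignore the scanner’s numbering and sort slices by their physical position along the slice normal (pydicom plus NumPy, placeholder file names).

    import numpy as np
    import pydicom

    slices = [pydicom.dcmread(f"slice_{i:03d}.dcm") for i in range(1, 201)]  # placeholders

    # The slice normal is the cross product of the two in-plane direction cosines.
    iop = np.array(slices[0].ImageOrientationPatient, dtype=float)
    normal = np.cross(iop[:3], iop[3:])

    def position_along_normal(ds):
        # Project each slice's origin onto the normal: a physical ordering that
        # does not depend on how the scanner chose to number the slices.
        return float(np.dot(normal, np.array(ds.ImagePositionPatient, dtype=float)))

    slices.sort(key=position_along_normal)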

Display Inconsistencies Across Viewers

It’s strange that a medical scan can look completely different depending on which viewer you open it with, but that’s exactly what happens with DICOM viewers: there’s no universal approach to how images should be presented.

For example, brightness and contrast look perfect in one viewer but totally off in another because each interprets presets differently. Images can be flipped or rotated because one system handles orientation metadata in a different way. There are also cross-platform compatibility issues, where a scan looks fine in one viewer but appears distorted or altered when opened on another platform.
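
Much of that brightness and contrast drift comes down to window presets. Here is a hedged sketch of applying WindowCenter/WindowWidth by hand (pydicom plus NumPy, placeholder file name, assuming a CT-style image where the rescale and window tags are present), which shows how much room for interpretation each viewer has.

    import numpy as np
    import pydicom

    ds = pydicom.dcmread("ct_slice.dcm")  # placeholder file name
    pixels = ds.pixel_array.astype(float)

    # Convert stored values to modality units (e.g. Hounsfield units for CT).
    hu = pixels * float(ds.get("RescaleSlope", 1)) + float(ds.get("RescaleIntercept", 0))

    def first(value):
        # WindowCenter/WindowWidth may hold a single value or a list of presets;
        # viewers differ on which preset they pick by default.
        try:
            return float(value[0])
        except TypeError:
            return float(value)

    center, width = first(ds.WindowCenter), first(ds.WindowWidth)
    lo, hi = center - width / 2, center + width / 2
    display = np.clip((hu - lo) / (hi - lo), 0.0, 1.0)  # normalized grey levels for display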

All these inconsistencies add up, and interpretation becomes more complicated than it should be.


Interoperability: Why It’s Breaking Down

When you think about healthcare, the last thing you want is for technology to get in the way of patient care. However, that’s exactly what happens when systems can’t talk to each other.

Interoperability challenges slow down workflows, add stress to healthcare systems, and impact how quickly patients get the care they need.

Interoperability Isn’t Optional Anymore

Interoperability is absolutely critical in healthcare, and it’s not hard to see why.

Hospitals use equipment from different manufacturers, and everything should work seamlessly together. Doctors at different facilities need to share images for second opinions and collaboration. Finally, AI tools and cloud-based services only work well when they have clean data to analyze.

These connections break down without interoperability, and it becomes harder for healthcare teams to give their patients the proper care.

Common Stumbling Blocks

When each vendor configures the DICOM standard to suit their devices, you get broken compatibility between systems. Moreover, trying to connect DICOM systems to modern cloud platforms is a pain because there aren’t enough standard APIs to make it simple.

And for hospitals, it’s even worse. They often feel stuck with a specific PACS vendor. The software is so locked down and proprietary that switching to something else feels almost impossible.

Problems With Cloud and AI Integration

Large file sizes and inefficient compression drag down cloud-based workflows and make everything slower than it needs to be.

Then there’s real-time remote diagnostics. It becomes harder to manage without native streaming support, and delays are almost guaranteed.

Then there’s AI. It’s one of the most powerful tools we have, but it relies on clean, consistent metadata to really perform. The problem is that DICOM data is often inconsistent. So instead of letting AI do what it’s designed to do, like analyzing images and automating tasks, it hits a wall because the metadata isn’t aligned, and you end up spending more time just making the data compatible.
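
In practice, teams often end up writing pre-flight checks like the hypothetical sketch below just to find out whether a series is even usable before it reaches an AI pipeline. The helper name audit_series is made up for illustration, and the tag list assumes a CT-style series.

    import pydicom

    # Tags an AI pipeline for a CT-style series typically cannot do without.
    REQUIRED = ["PixelSpacing", "ImageOrientationPatient", "ImagePositionPatient",
                "RescaleSlope", "RescaleIntercept"]

    def audit_series(paths):
        # Report missing or inconsistent metadata before any analysis runs.
        problems, reference = [], None
        for path in paths:
            ds = pydicom.dcmread(path, stop_before_pixels=True)  # headers only, faster
            missing = [tag for tag in REQUIRED if tag not in ds]
            if missing:
                problems.append((path, f"missing {missing}"))
            if reference is None:
                reference = ds
            elif ds.get("PixelSpacing") != reference.get("PixelSpacing"):
                problems.append((path, "PixelSpacing differs from the first slice"))
        return problems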

These are real challenges that get in the way of what could be a more efficient system. The sooner we address these issues, the sooner the system flows like it’s meant to.


Efforts to Update Medical Imaging Standards

Medical imaging is going through some much-needed upgrades, and honestly, it’s about time. Systems that have been around forever, like DICOM, are finally getting the updates they need to keep up with the pace of healthcare.

For example, with DICOMweb, you can now pull up imaging files using RESTful APIs. FHIR (Fast Healthcare Interoperability Resources) helps DICOM work better with newer healthcare systems.

Of course, DICOM isn’t the only option. The NIfTI format works well for 3D volumetric imaging and is a favorite in neuroscience research. FHIR-based imaging workflows offer cloud-native alternatives. The whole point is to give people alternatives that fit the way the world works now, not the way it worked 20 years ago.
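
To give a flavor of what DICOMweb changes, a QIDO-RS study search is just an HTTP request. The sketch below uses Python’s requests library against a hypothetical server; the endpoint path and JSON layout follow the DICOMweb conventions, but the base URL and patient identifier are placeholders.

    import requests

    BASE = "https://pacs.example.org/dicomweb"  # hypothetical DICOMweb endpoint

    # QIDO-RS: search for a patient's studies and get the answer as DICOM JSON,
    # no proprietary PACS client required.
    resp = requests.get(
        f"{BASE}/studies",
        params={"PatientID": "12345"},            # placeholder identifier
        headers={"Accept": "application/dicom+json"},
        timeout=30,
    )
    resp.raise_for_status()

    for study in resp.json():
        # DICOM JSON keys are hexadecimal tags; 0020000D is StudyInstanceUID.
        uid = study["0020000D"]["Value"][0]
        print("Found study:", uid)
        # The images themselves could then be retrieved with WADO-RS, e.g.
        # GET {BASE}/studies/{uid} with Accept: multipart/related; type="application/dicom"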

AI and cloud computing are also making an impact, but they have some big requirements. AI is powerful, no doubt, but it needs well-organized image data to produce accurate results; it’s simply held back when the data isn’t consistent. On the cloud side of things, PACS systems depend on strong DICOM support to run properly. Metadata is the key element that ties all this together: it powers automation, speeds up workflows, and ensures precision in every process.

Piece by piece, all these changes are helping medical imaging to keep up with what modern healthcare actually needs.


What’s next for medical imaging?

DICOM was never built for today’s needs like real-time collaboration or AI-powered analysis. They weren’t even on the radar when DICOM was created. It’s an old system trying to function in a new reality.

What medical imaging systems need is a fresh approach. There should be clearer rules to enforce standardization and stop vendors from going rogue with custom modifications.

Modern systems would also benefit from cloud-native imaging formats that handle modern needs and don’t break old data. Moreover, smart APIs can simplify bridging imaging systems with EHRs and the many other tools healthcare depends on.

If imaging systems can make these changes, they could finally start working the way modern medicine really needs them to.

Final Thoughts

Medical imaging deserves better. DICOM had its time, but modern medicine needs systems that keep pace with advancements.

Change is possible, but it’s going to take teamwork between healthcare professionals, software developers, and policymakers to move on from the 90s and build something that works for the challenges of today and tomorrow.

References & Resources

Latest Articles

Digital Twins for Digital Transformation Strategy in the Industrial Sector
April 22, 2026
Digital Twins for Industry 5.0 Transformation Strategy

Industrial digital transformation is no longer just about automation or collecting data. More and more, it comes down to having a live, accurate digital representation of what is actually happening across physical operations. That is what a digital twin does: it creates a virtual model of a machine, a production line, or an entire facility, and keeps it synchronized with real-world data in real time. This makes it more than a visualization tool. It becomes a working instrument for a variety of industrial applications: simulations, predictive maintenance, monitoring and analytics, process and operational optimization, quality control, worker enablement, EHS solutions, and faster decision-making. Industrial Extended Reality (XR) and immersive technologies are entering their second wave of adoption. While the first wave was shaped mainly by experimentation with XR, the current stage is enabled by mature hardware and significantly stronger digital capabilities, allowing organizations to realize the true value of VR and AR in practical, scalable ways. In parallel, digital transformation is shifting from the automation-led, low-human-involvement logic of Industry 4.0 toward a human-centric model built on human-machine collaboration and co-piloting in Industry 5.0. Industry is adopting Extended Reality (XR) faster than any other sector. Manufacturing and industrial operations accounted for 35.1% of the global digital twin market in 2025. More than half of companies using digital twins report profitability increases of over 20%, and Gartner predicts that by 2027, 40% of large industrial companies will use the technology, resulting in increased revenue. The market overall is projected to grow from $49.2 billion in 2026 to $228.46 billion by 2031. These numbers show that digital twins become a core part of how industrial companies compete and operate. In this article, we look at the specific areas where digital twins create the most value in the industrial sector today, walk through real-world cases from companies already using them at scale, and discuss where the technology is headed next. Why Digital Twins are more than virtual models The role of digital twins has broadened significantly, now covering simulation, planning, operations, and essential 3D visualization needs. As a strategic capability, the digital twin helps organizations understand the present state of assets and systems, anticipate what comes next, and make more precise, informed decisions. This is what separates them from the technologies they are often confused with. A 3D model is static and disconnected from physical reality. A simulation runs defined scenarios but doesn’t update as circumstances change. BIM captures asset properties at a point in time—valuable, but not dynamic. A digital twin does all three, continuously. Let’s look at how this works from a technological perspective. The technology stack behind the intelligence Within the virtual model, three interconnected layers work together.  The first is the data storage and processing layer, responsible for ingesting, organizing, and structuring incoming data streams. IoT sensors and edge devices form the foundation of data acquisition, continuously capturing physical parameters: temperature, vibration, pressure, energy consumption, throughput. This data moves through real-time pipelines into processing environments. 
The second is the analytics and AI layer, which interprets this data by detecting anomalies, identifying patterns, generating forecasts, and providing recommendations to guide operational decisions.  The third is the visualization and interface layer, translating these insights into clear, actionable formats: dashboards, alerts, or interactive simulations, that engineers, operators, and executives can easily use. A digital twin also integrates with the broader enterprise ecosystem, including engineering documentation, GIS platforms, maintenance systems, financial tools, and business networks. The result is a closed loop of intelligence. Physical reality continuously updates the virtual mode → the model generates insights → and those insights guide decisions that impact the physical system. Types of digital twins Depending on the level of detail and the specific operational goals, a digital twin can focus on a single component, a complete asset, an entire system, or even a full process. Recognizing these distinctions helps organizations select the right model for each use case. A component twin represents a single element (a pump, a bearing, a sensor) and is primarily used for granular condition monitoring and early failure detection.  An asset twin integrates multiple components into a unified model of a complete physical asset, such as a machine or a turbine, enabling a more comprehensive view of performance and interdependencies.  A system twin extends this further, representing how multiple assets interact within a broader operational environment (a production line, a power grid, or a supply chain node).  A process twin models entire workflows and decision sequences, making it possible to trace how disruptions, inefficiencies, or interventions propagate across an organization. In real-world deployments, these levels are layered: component twins feed into asset twins, which feed into system and process twins. This nested setup mirrors actual operational complexity and enables insights at any level, from individual parts to entire workflows. Where digital twins create the most industrial value Below, we break down the use cases where digital twins are generating the most value in the industrial sector today. Predictive maintenance and asset reliability Unplanned equipment downtime remains one of the most costly scenarios for any industrial enterprise. When a critical asset fails unexpectedly, the company loses not only on repairs but also on production chain disruptions, logistical failures, and reputational risks. This is why predictive maintenance powered by digital twins has become one of the most mature and economically justified applications of the technology. The traditional approach to maintenance operates on two models: reactive (repair after failure) or scheduled preventive (servicing on a fixed schedule, regardless of the actual condition of the equipment). Both models are inefficient. The first leads to emergency shutdowns, while the second results in excessive spending on servicing components that still have significant remaining life. The digital twin changes this paradigm. It creates a virtual copy of a physical asset that continuously receives sensor data and updates in real time. Through machine learning algorithms, the system analyzes wear patterns, compares current conditions against historical data, and predicts the moment when a component will reach a critical state. This enables maintenance to…

April 2, 2026
Quality and Security You Can Trust, Proven Again: Qualium Renews ISO 27001 and 9001 Certifications

More than 2 years ago, we initiated a focused effort to elevate our security and quality frameworks. Our objective wasn’t just to satisfy standards – it was to make security an integral part of our operations, from daily workflows to strategic decisions. Leading the initiative, Dmytro Stetsenko, Co-founder and CTO at Qualium Systems, stepped up to lead the audit internally, ensuring completion of formal ISO 9001 & 27001 auditor training and reinforcing our internal capabilities. In the months that followed, he partnered with compliance experts and process owners to enhance key operational workflows – from asset management and physical security to HR governance, risk management and business continuity. As Dmytro highlights: “The most significant transformation is in risk awareness. We didn’t just offer new controls, we fundamentally redefined how risks are identified, evaluated and addressed across a company.” Last month we successfully renewed both certifications, involving three-phase audits: an internal review, followed by evaluations from both our ISO 9001 auditor and a dedicated ISO/IEC 27001 audit team, with oversight from an accreditation officer to ensure additional scrutiny. Turning Security into Resilience: How We Built Stronger Quality and Security Frameworks As regulatory pressure intensifies across healthcare, finance and other data-sensitive industries, organizations are expected to demonstrate not only innovation but also measurable control over quality, security, and risk. This year we successfully reaffirmed its compliance with ISO 9001 and ISO/IEC 27001 standards, reinforcing our position as a trusted technology partner operating at the highest levels of operational excellence and information security. As Dmytro Stetsenko explains: “Regulatory pressure from frameworks like DORA and NIS2 continues to grow and compliance is becoming increasingly complex, demanding more resources. Our ISO 27001 certification in particular simplifies that landscape for our clients – reducing audit friction, accelerating approvals, and ensuring a consistently high standard of security.” Global frameworks such as DORA and NIS2 are reshaping expectations around cybersecurity, resilience, and governance. For companies operating in regulated environments, compliance is no longer optional – it is foundational. Qualium Systems ISO certifications provide a structured, internationally recognized framework that directly supports these evolving requirements: ISO/IEC 27001 ensures a mature Information Security Management System (ISMS), safeguarding data confidentiality, integrity, and availability ISO 9001 establishes a robust Quality Management System (QMS), focused on consistency, performance, and continuous improvement Together, these standards create a unified operating model where security and quality are embedded into every process, not treated as separate functions. Coded Harder, Built Better, Run Faster, Secured Stronger: What ISO Means for Everyday Quality and Security Rather than treating certification as a one-time milestone, Qualium Systems approaches ISO standards as a continuous discipline. 
The 2026 renewal reflects a deeper evolution of internal systems, including: ● Advanced risk management practices integrated across delivery, infrastructure, and operations ● Role-based access controls and data governance models aligned with modern security expectations ● Enhanced business continuity and resilience planning, ensuring stability under disruption ● Process optimization frameworks that improve delivery speed without compromising quality This systemic approach allows clients to operate with greater confidence, reducing audit friction, accelerating approvals, and ensuring readiness for increasingly complex regulatory environments. What It Means for our Clients For organizations in healthcare, fintech, and other compliance-driven sectors, working with a certified partner is no longer a preference — it is a requirement. Qualium Systems ISO 9001 and ISO/IEC 27001 certifications translate into tangible business value: ● Reduced compliance burden across regulatory frameworks ● Lower operational and cybersecurity risk exposure ● Predictable, high-quality delivery outcomes ● Faster alignment with enterprise procurement and audit requirements In practice, this means clients can focus on innovation and growth – while relying on a partner whose processes are already aligned with global best practices. What Comes Next: Beyond Compliance The 2026 certification milestone is not an endpoint, but part of a broader strategy to continuously elevate standards across delivery. As regulatory expectations continue to evolve, we are actively expanding our compliance framework to better support clients in highly regulated industries, particularly healthcare. This includes advancing our alignment with GDPR requirements and progressing toward HIPAA readiness, further strengthening our ability to manage sensitive data in complex regulatory environments. By combining deep technical expertise with certified operational frameworks, the company continues to bridge the gap between cutting-edge technology and enterprise-grade reliability. As Dmytro notes: “This certification reflects our long-term commitment to helping clients navigate the most demanding regulatory environments with confidence. While we continue to expand our compliance capabilities, advancing toward GDPR and HIPAA readiness for healthcare-focused solutions.”

How Extended Reality Is Reshaping Modern Marketing
March 31, 2026
How Extended Reality Is Reshaping Modern Marketing

The global extended reality market (including VR, AR and MR) is expected to reach $84.86 billion by 2029, growing at an estimated annual rate of 28%. But the bigger point isn’t just that the market is expanding, it’s that XR is already proving its value in the places marketers care about most: engagement, conversion, and customer confidence. In ecommerce, interacting with products via AR leads to a 94% higher conversion rate compared to products without AR. That makes sense: when people can better understand what they’re buying, they’re more likely to move forward and less likely to regret the purchase later.  XR also gives brands something that’s getting harder to win online: attention. VR campaigns generate about 46% higher engagement than traditional digital campaigns. People who interact with AR content spend around 2.7 times longer on product pages.  XR is now showing up in real results. That is why marketing is moving beyond static content toward immersive experiences. In the following sections, we will share how these technologies can be applied to marketing strategies and explore what the future of immersive experiences might look like. How XR is transforming modern marketing: 4 use cases that prove it works With XR, businesses can turn traditional campaigns into fully immersive experiences, where customers can explore products, interact with brands, and connect with content in memorable ways. Its value goes far beyond visual appeal, directly impacting the business growth and customer journey itself. And while this may not be immediately obvious, XR can also save significant resources, reducing the need for physical prototypes, showrooms, or large-scale events, making marketing more efficient. This is why more businesses are integrating immersive technologies into their marketing strategies, even despite certain challenges, such as development and VR hardware costs, as well as complex technology integration. Below, we highlight several successful use cases of immersive technologies in marketing. Virtual try-ons One of the most persistent barriers to online purchasing is uncertainty. Will these glasses suit my face shape? Will this sofa fit in my living room? Will this shade of lipstick actually complement my skin tone? These are questions that traditionally required a physical store visit. Virtual try-on eliminates that leap entirely. The technology behind this falls into a few distinct forms. The most accessible is smartphone-based AR. Customers point their phone at themselves or their surroundings, and the app overlays a true-to-scale digital product in real time. A striking example is the FindYourGlasses app developed by Qualium Systems. A step further are dedicated AR headsets and glasses, which immerse the customer in a mixed-reality environment where products can be explored in even greater depth and spatial accuracy.  These technologies help customers understand what they are buying before making a purchase, enabling them to make decisions based on accurate, personalized visualization rather than guesswork. Real-world example: IKEA Place AR App IKEA Place AR app lets shoppers visualize furniture in their own physical spaces before buying. Customers simply point their phone camera at a room, select a piece of furniture, and see it rendered in realistic scale within their actual environment. This removes the biggest friction point in furniture shopping: not knowing whether a sofa or shelf will actually fit or match the existing interior design. 
Results: After launch, the app was downloaded millions of times and became one of the most widely adopted retail AR experiences globally. IKEA reported increased customer engagement and reduced returns because customers could see how items fit before purchase. The company reported also that customers who use the IKEA Place app are 11% more likely to complete a purchase compared to those who do not use the app. Virtual showrooms & Tours Some purchases simply feel too significant to make without experiencing the space or context first. Traditionally, that meant showing up in person. Virtual showrooms and immersive tours remove that requirement. The technology here ranges from 360° web-based tours (viewable in any browser without additional hardware) to fully immersive VR experiences delivered through headsets. Visitors can walk through a branded space, interact with products, and access information on demand, without leaving their couch or office. Automotive brands use virtual showrooms to let buyers explore vehicle interiors, switch trims and colors, and get a feel for the cabin before visiting a dealership. Real estate platforms offer immersive property walkthroughs that let buyers shortlist homes remotely. Hotels and resorts use virtual tours to sell the experience upfront.  The value is especially pronounced in the machinery and heavy equipment sector, where physically demonstrating a product has always been costly: shipping industrial equipment to trade shows, organizing on-site demos, and flying prospects to manufacturing facilities all consume significant budgets. VR removes that overhead entirely: a potential buyer can step inside a virtual factory floor, operate a machine in a simulated environment, and evaluate complex equipment in full detail. Real-world example: Virtual showroom for MAKEEN Energy industrial equipment MAKEEN Energy, a global corporation delivering industrial gas solutions and heavy infrastructure equipment, built a true-to-scale virtual showroom. Using 3D models of their equipment in a virtual environment, they were able to pack their sprawling machinery into a portable VR headset and bring it to any trade fair.  Results: By no longer shipping heavy equipment around the world and reducing travel with virtual product demonstrations, MAKEEN Energy was able to cut logistics costs significantly. The virtual showroom also accelerated complex, multi-stakeholder sales by giving engineers, technicians, and purchase managers across different countries a shared, detailed view of the product. What began as a trade fair tool evolved into a company-wide asset for sales, training, and communications. For industrial businesses looking to adopt XR, Qualium Systems serves as a trusted technology partner, delivering VR and Web3D solutions that simplify the presentation of complex equipment, enhance product understanding, and support more effective digital engagement. Immersive brand storytelling XR gives brands the ability to place customers at the center of a narrative, transforming passive content consumption into a first-person experience that is far harder to forget. A VR film or AR…



Let's discuss your ideas

Contact us