How Virtual Reality Affects Storytelling in Journalism

Today, we’re still in the early stages of virtual reality development. Nevertheless, VR is already being applied in many different fields, including journalism.

Many popular media outlets, including The Guardian, The New York Times, and The Huffington Post, have started using VR and 360° video as a new storytelling format that helps audiences immerse themselves in a digital version of a specific event.

Evolution of VR Journalism

VR journalism is an innovative way to tell a story in digital reality, where a real event or problem is recreated. Modern VR journalism is generally divided into two main categories: 360° videos and VR movies.

The Pioneer of VR Journalism

Nonny de la Peña is considered “the godmother of modern virtual reality”. In 2007, the former New York Times journalist founded Emblematic Group, a company that produces VR content with definite linear storylines.

Hunger in Los Angeles is considered the very first VR documentary; it premiered at the Sundance Film Festival in 2012. Visitors could watch the piece in a VR headset prototype designed by Palmer Luckey, the future founder of Oculus VR. Audiences were immersed in a virtual Los Angeles and watched a man in diabetic shock lose consciousness while standing in line.

“People broke down in tears as they handed back the goggles,” said de la Peña. “That’s when I knew this tool could let viewers experience and understand an event in a completely new way.”

After the movie’s success, Emblematic Group released a number of acclaimed VR documentaries. Greenland Melting, released in 2017, is one of them. In it, the headset user boards a survey ship and witnesses the scale of Greenland’s ice melt firsthand. Greenland Melting also became one of the first VR movies shown at the Venice Film Festival that same year.

360° Video Journalism in Mass Media

In the second half of the 2010s, The New York Times and The Huffington Post were among the first media outlets to start complementing their articles with 360° videos. These videos can deepen the audience’s perception of an event. For example, the three-part 360° miniseries Out of Sight by The Huffington Post follows people in Congo and Nigeria who suffer from three neglected tropical diseases: elephantiasis, river blindness, and sleeping sickness.

360° video can also recreate historically important places that have changed over the years. In the documentary featurette Remembering Emmett Till by The New York Times, locations from old photographs were assembled alongside modern footage using this technology. Its purpose is to show the site of the brutal murder of Emmett Till, a 14-year-old Black boy, which happened more than 60 years ago. The teenager had been falsely accused of offending a white woman.

The Main Advantages of Using VR in Journalism

One of the main advantages of VR journalism is the ability to plunge into a true story with a deeper understanding of the subject. Sir David Attenborough’s most recent VR documentary, First Life, is available to watch on Oculus TV. The 11-minute VR featurette tells the story of the beginning of life on Earth and the very first creatures that appeared in the world’s oceans. The original 2D movie was released in 2010; with the help of virtual reality, headset users can literally step into a digital version of the prehistoric period.

With the help of VR, a headset user becomes an active participant in the story rather than a passive viewer. Anthony Geffen, film producer and founder of Atlantic Productions, described his experience creating Sir David Attenborough’s VR movies during his TED Talk.

“We wanted to take you, guys, on a completely submerged journey, with David as your guide. The beauty of – when you’re building these kinds of stories, and we have these kinds of situations – is that you can see how the camera can move around. And we realized we were going to have to do this “on a rails” experience. On a rails means we’re pushing you through the experience, but you’re able to look where you want to,” said Geffen.

Virtual reality can also enhance viewers’ empathy and improve their understanding of a problem. The Guardian’s Oscar-winning documentary Colette was converted into VR and made available to Oculus headset users. Thanks to this format, the viewer can soak in the story of Colette Marin-Catherine, who fought the Nazis in France during World War II.

What You Should Pay Attention to When Creating a VR Story

VR is a relatively new storytelling tool, and its effects are not yet fully researched, as professional journalists only started working with it in the first half of the 2010s. Therefore, you should be aware of the possible risks of using VR in journalism:

  • There is no definite ethical code regulating the demonstration of materials that may cause witness trauma in people with fragile mental health. In North America alone, approximately 30% of individuals who witnessed traumatic events developed PTSD;
  • Virtual reality could serve as a weapon for spreading misinformation and fake news in potential information wars, much like deepfake technology. Russia has already used deepfakes in its war against Ukraine, circulating a fake video of Ukrainian president Volodymyr Zelensky supposedly surrendering.

Conclusion

Virtual reality is a new digital tool that many journalists use to tell stories thoroughly and convincingly. With the help of VR, you can engage viewers and evoke empathy in them. At the same time, the technology can also be used as a platform for spreading fake news and misinformation.
