How Virtual Reality Affects Storytelling in Journalism

Today, virtual reality is still in the very early stages of its development. Nevertheless, VR is already being applied in many different fields, including journalism.

Many popular media outlets, such as The Guardian, The New York Times, and The Huffington Post, have started using VR and 360° video as a new storytelling format that helps audiences immerse themselves in a digital version of a specific event.

Evolution of VR Journalism

VR journalism is an innovative way of telling a story in digital reality, where a real event or problem is recreated. Modern VR journalism generally falls into two main categories: 360° videos and VR films.

Pioneer of VR Journalism

Nonny de la Peña is considered “the godmother of modern virtual reality”. In 2007, the former New York Times journalist founded Emblematic Group, a company that produces VR content with clearly defined linear storylines.

Hunger in Los Angeles, which premiered at the Sundance Film Festival in 2012, is considered the very first VR documentary. Festival visitors could watch the film in a prototype VR headset designed by Palmer Luckey, the future founder of Oculus VR. Audiences were immersed in a virtual Los Angeles, where they saw a man in diabetic shock lose consciousness while standing in line.

“People broke down in tears as they handed back the goggles,” said de la Peña. “That’s when I knew this tool could let viewers experience and understand an event in a completely new way.”

After the film’s success, Emblematic Group released a number of acclaimed VR documentaries, including Greenland Melting (2017). In it, the headset user boards a research vessel and watches Greenland’s ice melting, getting a sense of its scale. Greenland Melting also became one of the first VR films shown at the Venice Film Festival that same year.

360° Video Journalism in Mass Media

In the second half of the 2010s, The New York Times and The Huffington Post were among the first media outlets to complement their articles with 360° videos. These videos can broaden the audience’s perception of an event, as in The Huffington Post’s 360° series Out of Sight. The three-part miniseries shows the lives of people in Congo and Nigeria who suffer from three neglected tropical diseases: elephantiasis, river blindness, and sleeping sickness.
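For readers curious about what the 360° format involves technically, here is a minimal sketch of how an equirectangular clip can be rendered in a browser with three.js. The video file name and player setup are illustrative assumptions, not the actual players these newsrooms use.

```typescript
import * as THREE from 'three';
import { OrbitControls } from 'three/examples/jsm/controls/OrbitControls.js';

// Minimal 360° video viewer: the equirectangular clip is mapped onto the
// inside of a sphere, and the camera sits near its centre, so dragging
// (or turning a headset) reveals a different part of the footage.
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(75, innerWidth / innerHeight, 0.1, 100);
camera.position.set(0, 0, 0.1); // slight offset so OrbitControls can rotate around the centre

const renderer = new THREE.WebGLRenderer();
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);

const controls = new OrbitControls(camera, renderer.domElement);
controls.enableZoom = false;

// Hypothetical source file; any equirectangular 360° clip would work.
const video = document.createElement('video');
video.src = 'out-of-sight-360.mp4';
video.loop = true;
video.muted = true; // browser autoplay policies usually require muted playback
video.play();

// Invert the sphere so its inner surface faces the camera.
const geometry = new THREE.SphereGeometry(5, 60, 40);
geometry.scale(-1, 1, 1);
const texture = new THREE.VideoTexture(video);
const sphere = new THREE.Mesh(geometry, new THREE.MeshBasicMaterial({ map: texture }));
scene.add(sphere);

renderer.setAnimationLoop(() => renderer.render(scene, camera));
```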

360° video can also be used to revisit historically important places that changed many years ago. In The New York Times’ documentary short Remembering Emmett Till, locations from old photographs are combined with footage of the same places today. The film shows the site of the brutal murder of Emmett Till, a 14-year-old Black boy who was killed more than 60 years ago after being falsely accused of offending a white woman.

The Main Advantages of Using VR in Journalism

One of the main advantages of VR journalism is the ability to plunge into a true story and gain a better understanding of the subject. Sir David Attenborough’s recent VR documentary First Life, for example, is available to watch on Oculus TV. The 11-minute featurette tells the story of the beginning of life on Earth and the very first creatures that appeared in the ocean. The original 2D film was released in 2010; with virtual reality, headset users can step into a digital version of the prehistoric world.

With VR, a headset user becomes an active participant in the story rather than a passive viewer. Anthony Geffen, film producer and founder of Atlantic Productions, described his experience creating Sir David Attenborough’s VR films during his TED Talk.

“We wanted to take you, guys, on a completely submerged journey, with David as your guide. The beauty of – when you’re building these kinds of stories, and we have these kinds of situations – is that you can see how the camera can move around. And we realized we were going to have to do this “on a rails” experience. On a rails means we’re pushing you through the experience, but you’re able to look where you want to,” said Geffen.
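Geffen’s “on rails” description corresponds to a common implementation pattern: the viewer’s position is animated along a pre-authored path while head rotation stays free. Below is a minimal sketch of that idea using three.js and WebXR; the waypoints and timing are illustrative assumptions, not details of Atlantic Productions’ actual pipeline.

```typescript
import * as THREE from 'three';
import { VRButton } from 'three/examples/jsm/webxr/VRButton.js';

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(70, innerWidth / innerHeight, 0.1, 1000);
const renderer = new THREE.WebGLRenderer();
renderer.setSize(innerWidth, innerHeight);
renderer.xr.enabled = true;
document.body.appendChild(renderer.domElement);
document.body.appendChild(VRButton.createButton(renderer));

// The camera rides inside a "dolly" group. Moving the group pushes the
// viewer along the rail, while the headset still controls where they look.
const dolly = new THREE.Group();
dolly.add(camera);
scene.add(dolly);

// Illustrative rail: a smooth curve through a few hand-picked waypoints.
const rail = new THREE.CatmullRomCurve3([
  new THREE.Vector3(0, 1.6, 0),
  new THREE.Vector3(5, 1.6, -10),
  new THREE.Vector3(-3, 2.0, -25),
  new THREE.Vector3(0, 1.6, -40),
]);

const rideDuration = 60; // seconds for the whole journey (assumed)
const clock = new THREE.Clock();

renderer.setAnimationLoop(() => {
  // Progress from 0 to 1 along the rail, clamped at the end of the experience.
  const t = Math.min(clock.getElapsedTime() / rideDuration, 1);
  dolly.position.copy(rail.getPointAt(t));
  renderer.render(scene, camera);
});
```

The key design choice is that only the position is scripted; orientation always comes from the headset, which preserves the “look where you want to” freedom Geffen describes.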

Virtual reality can also enhance viewers’ empathy and deepen their understanding of a problem. The Guardian’s Oscar-winning documentary Colette was converted into VR and made available to Oculus headset users. Thanks to this format, the viewer can immerse themselves in the story of Colette Marin-Catherine, who fought the Nazis in France during World War II.

What You Should Pay Attention to When Creating a VR Story

VR is a relatively new storytelling tool and has not yet been fully studied, as professional journalists only began working with it in the first half of the 2010s. Therefore, you should be aware of the possible risks of using VR in journalism:

  • There is no established ethical code regulating the depiction of material that may cause witness trauma in people with fragile mental health. In North America alone, approximately 30% of individuals who witnessed traumatic events developed PTSD;
  • Virtual reality could serve as a weapon for spreading misinformation and fake news in information wars, much like deepfake technology, which Russia has already used in its war against Ukraine to create a fake video of Ukrainian President Volodymyr Zelensky supposedly surrendering.

Conclusion

Virtual reality is a new digital tool that many journalists use to tell stories vividly and convincingly. With VR, you can engage viewers and evoke empathy in them. At the same time, however, the technology can be used as a platform for spreading fake news and misinformation.
