Implementing smooth drawing in Unity for VR app with lots of miscalculations
June 24, 2022
Implementing smooth drawing in Unity for VR app with lots of miscalculations

While working on a virtual reality application for painting graffiti on walls, our team ran into difficulties implementing smooth painting with spray paint. The drawing itself is calculated mathematically using linear interpolation: we cast a ray from the spray can at the wall texture we are painting on, calculate the affected pixels, and transfer the paint to the wall. The main difficulty occurred when the can was moved quickly. The paint did not fill all the intermediate points, only separate ones, so voids appeared in the colour fill. Naturally, the drawing process lost its realism and smoothness and became uncomfortable. The reason was the use of Unity's built-in math, which could not keep up with such a large number of calculations. To solve this problem, we decided to use third-party libraries:

Hybrid Renderer — provides systems and components for rendering ECS entities. Hybrid Renderer is not a render pipeline: it is a system that collects the data necessary for rendering ECS entities and sends this data to Unity's existing rendering architecture.

Burst Compiler — translates IL/.NET bytecode into highly optimized native code using LLVM. It is released as a Unity package and integrated into Unity via the Unity Package Manager.

Unity C# Job System — lets you write simple and safe multithreaded code that interacts with the Unity Engine for better game performance.

In the video below, you can see the difference in performance before and after using these third-party libraries. After implementing them, the performance of the application improved by 90-100%.
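To give an idea of how this approach fits together, here is a minimal, hypothetical sketch of a Burst-compiled job that fills the gap between the previous and current spray hit points by linear interpolation. The struct, field names and sample count are illustrative assumptions, not the project's actual code.

using Unity.Burst;
using Unity.Collections;
using Unity.Jobs;
using Unity.Mathematics;
using UnityEngine;

// Hypothetical job: paints interpolated points between the previous and
// current raycast hits so that fast strokes leave no voids in the texture.
[BurstCompile]
struct FillStrokeJob : IJobParallelFor
{
    public int TexWidth;
    public int TexHeight;
    public float2 PreviousUV;   // raycast hit UV from the previous frame
    public float2 CurrentUV;    // raycast hit UV from the current frame
    public int Samples;         // number of interpolated points along the stroke
    public int Radius;          // spray radius in pixels
    public Color32 Paint;

    // Samples may touch overlapping pixels, so the per-index restriction is
    // disabled; every write stores the same colour, so races are harmless here.
    [NativeDisableParallelForRestriction]
    public NativeArray<Color32> Pixels;

    public void Execute(int step)
    {
        // Linear interpolation between the two hit points.
        float t = Samples > 1 ? step / (float)(Samples - 1) : 0f;
        float2 uv = math.lerp(PreviousUV, CurrentUV, t);
        int cx = (int)(uv.x * TexWidth);
        int cy = (int)(uv.y * TexHeight);

        for (int y = -Radius; y <= Radius; y++)
        for (int x = -Radius; x <= Radius; x++)
        {
            if (x * x + y * y > Radius * Radius) continue;   // keep the spray dot round
            int px = cx + x;
            int py = cy + y;
            if (px < 0 || py < 0 || px >= TexWidth || py >= TexHeight) continue;
            Pixels[py * TexWidth + px] = Paint;
        }
    }
}

In this sketch the pixel buffer would come from Texture2D.GetRawTextureData<Color32>(), the job would be scheduled with Schedule(Samples, 16), and the texture updated with Apply() once the job completes.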

Sound processing for NFT project
June 1, 2022
Sound processing for NFT project

We are working on an interesting NFT project related to 3D avatars, video, and sound recording. We decided to briefly describe the solutions we use and share how the project works under the hood.

About the essence of the NFT project
In the application you buy unique 3D avatars. For them, you can record video with sound using the front camera of your smartphone. The main feature is that the avatar pronounces the recorded audio, repeating the facial expressions and emotions of the speaker. There are currently more than 2,000 avatars in the application, which is only available for iPhone at the moment. In the future, the purchased NFT avatar can be grown in popularity and recognition, and then monetized: used for advertising, featured in cartoons, animated videos, etc.

About the complexities
While working on the project, it became necessary to apply various effects to the recorded sound. This can enhance the authenticity of the avatar, make the video more impactful with interesting sound effects, and diversify what users can do in the application. The main requirement was to use the kind of effects found in professional audio editors. We started looking for a suitable solution and considered various options; some did not fit for objective reasons, and others were not compatible with the mobile platform. In the end, we settled on one well-known tool that met all our requirements.

Tools for working with sound and video
Since our project needs to record videos with sound and work with both at the same time, we chose FFmpeg, a complete, cross-platform solution to record, convert, and stream audio and video. We had already used this library for video processing: trimming and inserting background music. We also knew about its sound-processing capabilities but had not used all of its advanced features. And those capabilities are wide: it is arguably the most complete library for audio and video processing, covering compression, timbre, frequency, playback speed, echo, and over 100 different filter combinations.

For future clients
If you have ideas that you would like to implement, please contact us. We will conduct research, choose the best options, and build an effective and simple solution.
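As a rough illustration of how such effects can be applied with FFmpeg, here is a hypothetical C# helper that invokes the ffmpeg command-line tool with an echo filter (aecho) and a tempo change (atempo). The file paths, method name and filter parameters are placeholders, not the project's actual processing pipeline.

using System.Diagnostics;

static class AudioEffects
{
    // Hypothetical helper: applies an echo and a slight tempo change to a
    // recorded track by running the ffmpeg CLI.
    public static void ApplyEchoAndTempo(string inputPath, string outputPath)
    {
        // aecho = in_gain : out_gain : delay_ms : decay, atempo = playback speed factor
        const string filters = "aecho=0.8:0.9:500:0.3,atempo=1.1";

        var psi = new ProcessStartInfo
        {
            FileName = "ffmpeg",
            Arguments = $"-y -i \"{inputPath}\" -af \"{filters}\" \"{outputPath}\"",
            UseShellExecute = false,
            RedirectStandardError = true   // ffmpeg writes its progress to stderr
        };

        using (var process = Process.Start(psi))
        {
            process.WaitForExit();
        }
    }
}

Combining several filters in one -af chain like this is what makes it possible to stack compression, echo, speed and other effects in a single pass.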

Oculus Interaction SDK
April 7, 2022
Switching to the Oculus Interaction SDK on a major education project

Within one of our big VR projects, we migrated from the old version of item interaction to the new Oculus Interaction SDK.

Multiplayer virtual reality project in the field of education
Qualium Systems has been developing its own multiplayer virtual reality project in Chemistry education for a long time: virtual classrooms. The VR Chemistry Lessons App offers a large number of interactions with objects: take and hold them, move them, resize them, etc. For this type of application, the interaction and manipulation of objects, and their convenience and simplicity, play a key role, since this is what shapes the student's experience while using the application. Check out a small video on the work of our VR Chemistry Lessons App.

Before the Oculus Interaction SDK was released, the older version of the SDK didn't allow effective object manipulation. For example, in the old version, it was impossible to take an object with both hands and change its size. It was a problem.

Development of a custom solution for object manipulation
To get around these limitations, we developed our own custom solution for manipulating objects in the application. Besides its advantages, it also had downsides: it led to code complexity, implied additional work when updating the Oculus SDK on the project, etc. A simplified sketch of this kind of two-handed resizing is shown at the end of this post.

Benefits and adoption of the Oculus Interaction SDK
The new Oculus Interaction SDK allowed us to implement almost all the object interaction features we needed. It is now possible to abandon the custom solution, although the integration required considerable time.

What is the Oculus Interaction SDK
Interaction SDK Experimental is a library component that allows developers to use standardized interactions for controllers and hands. Interaction SDK also includes tooling to help developers build their own hand poses. We filmed a short video while testing the Oculus Interaction SDK.

The experience we have gained
Our development team has the flexibility to change its approach: develop custom solutions when needed, and adopt ready-made libraries and frameworks when that speeds up development without reducing quality.
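The sketch below shows the basic idea behind two-handed resizing: while both hands grab the object, its scale follows the ratio of the current to the initial distance between the grab points. The grab-point transforms and class are placeholder assumptions, not part of the Oculus Interaction SDK API or our production code.

using UnityEngine;

// Minimal sketch of two-handed resizing. 'leftGrab' and 'rightGrab' are
// placeholder transforms representing the two hands' grab points.
public class TwoHandResize : MonoBehaviour
{
    public Transform leftGrab;
    public Transform rightGrab;

    private float _initialDistance;
    private Vector3 _initialScale;

    // Call when both hands start grabbing the object.
    public void BeginResize()
    {
        _initialDistance = Vector3.Distance(leftGrab.position, rightGrab.position);
        _initialScale = transform.localScale;
    }

    // Call every frame while both hands keep grabbing.
    public void UpdateResize()
    {
        float currentDistance = Vector3.Distance(leftGrab.position, rightGrab.position);
        float factor = currentDistance / _initialDistance;
        transform.localScale = _initialScale * factor;
    }
}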

Proof of concept for software
February 17, 2022
Proof of Concept for software. Risk reduction in the development of AR / VR / MR applications

The development of AR / VR / MR applications is an innovative and dynamically developing area. Active development implies the emergence of many innovative ideas and concepts, technological solutions, and equipment. Under these conditions, it is not always clear how a particular feature can be implemented, or whether it is possible at all.

What is the purpose of Proof of Concept
To reduce the risk of our clients losing money and time, we use Proof of Concept. It helps to clarify the technical feasibility and characteristics of the result before proceeding to the implementation of the whole product.

Why Proof of Concept is important
PoC is for you if you: have non-trivial tasks whose feasibility is uncertain; have a limited budget and want to test the idea. In addition to reducing risks, another benefit of PoC is that you can use it as a pilot project: to evaluate the knowledge and skills of the company's developers, the level of communication, and the quality of the organization of the development process. This is especially true if your idea is feasible, has good prospects, and implies a long development stage in the future.

Proof of Concept vs MVP
Comparing PoC and MVP, a PoC is bare functionality whose implementation saves money at the initial stage, since it does not require spending on UX / UI or features that are unnecessary at this point. It is not released to the public but is tested by a limited number of users. An MVP is a complete finished product, only with limited features. It is released to a wide audience and requires more cost and time than a PoC. If you have an idea, please contact us: we will advise you on all issues and share our experience and expertise in AR / VR / MR development.

Implementation of the Oculus Interaction SDK
February 8, 2022
Implementation of the Oculus Interaction SDK

Interaction SDK Experimental is a library component that allows developers to use standardized interactions for controllers and hands. Interaction SDK also includes tooling to help developers build their own hand poses. Technology stack: Unity SDK, Oculus SDK.

3D virtual models in Augmented Reality for retail
February 2, 2022
3D virtual models in Augmented Reality for retail

Today's retail business is all about getting customers to interact with the products and the platform so that they feel engaged. Of the variety of tools available to achieve this, among the most interesting are 3D models in Augmented Reality. 3D virtual models can show clothes in a 3D environment or serve as virtual assistants for consultations and for increasing sales. Technology: AR Foundation, Unity SDK.
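For context, placing a 3D product model in the user's space with AR Foundation can be as simple as the sketch below: on a screen tap, the scene is raycast against detected planes and the model is spawned or moved to the hit pose. The class name and prefab wiring are assumptions for illustration.

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Minimal AR Foundation sketch: tap the screen to place a 3D product model
// on a detected plane.
[RequireComponent(typeof(ARRaycastManager))]
public class ProductPlacer : MonoBehaviour
{
    public GameObject productPrefab;          // e.g. a 3D garment or virtual assistant

    private ARRaycastManager _raycastManager;
    private GameObject _spawned;
    private static readonly List<ARRaycastHit> Hits = new List<ARRaycastHit>();

    void Awake()
    {
        _raycastManager = GetComponent<ARRaycastManager>();
    }

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Raycast against detected planes and place or move the model.
        if (_raycastManager.Raycast(touch.position, Hits, TrackableType.PlaneWithinPolygon))
        {
            Pose hitPose = Hits[0].pose;
            if (_spawned == null)
                _spawned = Instantiate(productPrefab, hitPose.position, hitPose.rotation);
            else
                _spawned.transform.SetPositionAndRotation(hitPose.position, hitPose.rotation);
        }
    }
}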

R&D department in extended reality: several examples of experiments and research
December 9, 2021
Extended Reality R&D department: several examples of experiments and research

The field of augmented and virtual reality development is growing by leaps and bounds: Facebook announced its plans to build a metaverse, major market players release new AR / VR / MR devices, and new technologies and tools appear regularly. These factors represent both opportunities and challenges. It is very easy to lag behind new directions and concentrate only on the familiar and convenient.

Research & Development in VR/AR
In order to stay at the cutting edge of new solutions, we at Qualium Systems decided about a year ago to create an R&D department in virtual and augmented reality. As part of this department, we master and test new features, tools, and capabilities. We use everything that has the potential to solve the business problems of our clients.

Passthrough API
For example, we were one of the first to test a then-new feature for the Oculus Quest 2: the Passthrough API. This technology enables mixed reality experiences that combine the real and virtual worlds into one. Immediately after an update became available, we started testing the API. It is an exciting promise by Oculus and has a lot of potential. We are already brainstorming ideas for using the Passthrough API to solve our clients' business issues. To test the Passthrough API, we used the following technology stack: Unity SDK, Oculus SDK 31-34, Passthrough API. Our experiment with the Passthrough API was done on v31, when the technology was still in experimental mode. In the later v34 version of the Oculus software, the Passthrough API ceased to be an experimental function and no longer needs to be activated by the user via the console. You can watch our experiment in the video.

Comparison of the Oculus Quest 2 with Passthrough API and HoloLens 2
In continuation of our experiments with the Passthrough API, we decided to test a hypothesis: can the Oculus Quest 2 with the Passthrough API take the place of the HoloLens 2? After the MRTK library had been adapted to the Oculus Quest 2 and the pass-through feature arrived in version 30 of the operating system, we decided to check whether the Oculus Quest 2 could be used as an alternative to the expensive HoloLens 2. To validate the idea, we made two applications with the same features, one on the Oculus Quest 2 and one on the HoloLens 2. You can see the result in the article http://ad88-46-150-10-153.eu.ngrok.io/blog/ar-vr/can-the-oculus-2-with-passthrough-api-take-the-place-of-hololens-2-checking-the-hypothesis. As part of our experiments, we try to highlight the industries that can benefit most from XR technologies, especially during a pandemic and its aftermath.

A standalone application with computer vision based on Barracuda and a TF model
We understood that during the COVID-19 pandemic, many people lost the opportunity to train in the gym with a trainer. A lack of coaching support can lead to performing exercises improperly, which in turn can lead to injury. But thanks to modern technology, it has become possible to train correctly. We are creating a standalone application with computer vision based on Barracuda and a TF model. You watch a recording of an exercise with a trainer and repeat it in real time. The computer vision, based on Barracuda and a TF model, compares your movements and shows how well you perform the exercise. You can see the result in the video.

TensorFlow Lite body recognition
As part of testing hypotheses that can be applied to sports and fitness, we compared TensorFlow Lite body recognition based on the PoseNet model with Unity Barracuda, using tfjs-to-tf and tf2onnx for model conversion.

Face tracking on PC with Unity SDK and UltraFaceBarracuda
One industry that can benefit from XR adoption is retail. As part of this direction, we tested face tracking on PC with Unity SDK and UltraFaceBarracuda: depending on the position of the face, the picture that a person is looking at moves, creating the effect of a hologram. It is an interesting solution for marketing purposes, with the ability to track people's actions and test various marketing activities. Technology stack: Unity SDK, UltraFaceBarracuda. See the result in the video. This is a small part of our experiments. We continue to research, hypothesize, and help our clients.
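To show how such Barracuda experiments are typically wired up, here is a minimal, hypothetical sketch: it loads an ONNX model (for example, one converted from TensorFlow with tf2onnx) and runs inference on a camera frame. The class, model asset, input size and output layout are assumptions for illustration, not the actual project code.

using Unity.Barracuda;
using UnityEngine;

// Minimal Barracuda sketch: run an imported ONNX pose-estimation model
// on a camera frame and return the raw output values.
public class PoseRunner : MonoBehaviour
{
    public NNModel modelAsset;                // ONNX model imported as an NNModel asset
    private IWorker _worker;

    void Start()
    {
        var model = ModelLoader.Load(modelAsset);
        _worker = WorkerFactory.CreateWorker(WorkerFactory.Type.ComputePrecompiled, model);
    }

    public float[] Run(Texture2D frame)
    {
        // Convert the camera frame to a 3-channel input tensor.
        using (var input = new Tensor(frame, 3))
        {
            _worker.Execute(input);
            Tensor output = _worker.PeekOutput();   // still owned by the worker
            return output.ToReadOnlyArray();        // e.g. keypoint heatmap values
        }
    }

    void OnDestroy()
    {
        _worker?.Dispose();
    }
}

Comparing the user's movements against the trainer's recording would then be a matter of post-processing these outputs, which is outside the scope of this sketch.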

VRium
November 5, 2021
VRium study program big update: added 7 important interactive visualizations

VRium is an engaging, interactive VR chemistry simulator that helps students get the most from chemistry theory and practice, developing the needed level of knowledge and honing their skills. Take part in beta testing the VRium App: to do this, please fill out the following form. To provide a real-life experience, we have developed a study programme for VRium using typical curricula of General and Inorganic Chemistry courses used in Ukrainian schools and universities. The first part of this programme is "Atomic Structure and Periodic Table", which starts with Topic 1, "Atomic Orbitals in the Hydrogen Atom". For this topic we have prepared the following interactive visualizations.

1.1. Rutherford's Model of the Atom
The first model of atomic structure, by Thomson, postulated that electrons are located in a positively charged medium. Famous experiments by Rutherford (1911) disproved this model. In Rutherford's new model of the atom, a light electron moves along a circular orbit around a heavy and small nucleus, which in the case of the Hydrogen atom is just a proton. Here the model is presented simply as an animation of an electron moving in a circle around a proton. This model forms a common perception of atomic structure and thus has to be discussed in some more detail.

1.2. Rutherford's Model and Classical Electrodynamics
Rutherford's model was unsatisfactory. Although it correctly described the experimental distributions of charge and mass in atoms, Rutherford's atom is unstable in classical electrodynamics: an electron moving on a circular orbit is constantly accelerating, and with no external energy supply it should lose energy by emitting electromagnetic radiation. As the electron's energy decreases, the radius of its orbit shrinks, and after a very short time (estimated at about 10⁻¹¹ s) the electron should fall onto the proton and Rutherford's atom would collapse. This consideration is illustrated pictorially: an electron orbiting the proton emits light and quickly falls onto it. The illustration gives a hint: given that in reality atoms are stable, either the electronic orbits in atoms are more complex, or the usual laws of physics are somehow not applicable to atoms.

1.3. Unstable Equilibrium in Classical Electrostatics
Another problem with the classical description of atoms is its complete inability to explain the stability of molecules. Atoms in molecules are held together by chemical bonds, where electrons are located between nuclei. But according to Earnshaw's theorem, a stable static equilibrium cannot exist in any system of point charges. When an electron is located exactly between two protons, the attraction forces are balanced and equilibrium is achieved. But it is not stable: if the electron is shifted by an arbitrarily small distance towards either nucleus, the forces become unbalanced, the electron falls onto that nucleus, and the system collapses.

1.4. Bohr's Model
The classical picture also could not describe atomic emission spectra. In that model an electron could have any energy, so a photon with any wavelength could be emitted or absorbed. This contradicts experimental data: atomic spectra consist of discrete narrow lines. In 1913 Niels Bohr fixed this problem in an ad-hoc manner by letting the electron move only on a discrete set of circular orbits. An electron on each such stable orbit has a definite energy, so moving between orbits is accompanied by the absorption or emission of a photon with a specific wavelength. This simple picture reproduces the lines of atomic spectra, but it is self-contradictory and cannot be extended to molecules or even multi-electron atoms. This development is depicted by an electron moving on a second circular orbit around the proton; after the electron emits a photon, it moves to the lower orbit. The model is valid for atoms and ions with a single electron: Hydrogen, He⁺, Li²⁺, Be³⁺, but the theory cannot account for electron interactions: there is no place for the second electron.

1.5. The Quantum Mechanical Hydrogen Atom
These problems were solved when, in 1925, Schrödinger and other scientists developed modern quantum mechanics. Instead of a classical electron that moves on some definite trajectory, it describes an electron that can appear anywhere in space. The probability of finding it, and all physical values that can be measured, are governed by a so-called wave function. This wave function is not arbitrary: it must satisfy the Schrödinger equation. This is illustrated as an electron that loses its orbit around the nucleus. In experiments where the position of the electron is measured repeatedly, the results do not lie on some line but can appear anywhere in space. Still, the probability of finding the electron near the nucleus is much higher than of finding it far away.

1.6. What is the Electronic Cloud?
This interactive visualization shows the 1s orbital of the Hydrogen atom. Here the user observes measured positions of the electron and can change the radius of a sphere around the nucleus. Increasing the sphere size increases the chance of finding the electron inside it, but this chance never reaches 100%. This accustoms the user to the nature of atomic orbitals as a kind of probability distribution for finding the electron.

1.7. Structure of s-orbitals
This interactive visualization shows the inner structure of ns atomic orbitals. It turns out that for orbitals with a higher principal quantum number n, there are places where the electron cannot be located. The number of such nodes grows as n − 1, and these zones separate regions with different signs of the wave function. The user takes a short quiz after this visualization to memorize the connection between the structure of an ns orbital and the principal quantum number n.

After introducing the concept of atomic orbitals (applicable to the Hydrogen atom and atomic ions with a single electron), the next logical step is to describe multi-electron atoms. The next topic will therefore include the concepts of electronic configurations, the rules used to construct them, and their relevance to the periodic table, chemical properties, and chemical bonding.
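As a reference for section 1.4, the standard textbook Bohr-model relations can be written out explicitly (these are general physics results, not formulas taken from the app itself):

E_n = -\frac{13.6\ \text{eV}}{n^2}, \qquad n = 1, 2, 3, \ldots

\frac{1}{\lambda} = R_H \left( \frac{1}{n_1^2} - \frac{1}{n_2^2} \right), \qquad n_2 > n_1, \qquad R_H \approx 1.097 \times 10^7\ \text{m}^{-1}

A transition between two allowed orbits therefore emits or absorbs a photon whose wavelength is fixed by the two quantum numbers, which is exactly why the spectrum consists of discrete lines.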

AR portal, which opens a passage to an unknown and scary place
October 29, 2021
We have made a couple of interesting features to make you feel the Halloween atmosphere even more

Our guys from the Unity department have done a great job. Halloween is coming soon, and we have made a couple of interesting features to make you feel the holiday atmosphere even more. Try on scary masks and share the pictures with your friends, or enemies, to look even scarier. Follow the link https://webxr.run/nOGnww05b3JX5 and see for yourself how creepy it all looks; your knees will shake. Be sure to watch our atmospheric video with the AR portal, which opens a passage to an unknown and scary place. Just a few seconds will be enough for you to feel a chill of quiet horror running down your spine. Be sure to turn up the volume, and do not watch this video alone or in the dark)