Blurring the Line Between Virtual and Real Worlds

Younite-AI
6 min read · Sep 14, 2023

As 3D technology evolves, the dividing line between the real world and the virtual world has blurred to the point that it is now arguably gone. The driving force behind this convergence is data: its growing accessibility and the speed at which it can be delivered. The result will be more and more applications and experiences that blend the benefits of both worlds.

3D = An intuitive way to view data

3D technologies allow us to replicate places, products, and essentially anything in the real world. These virtual representations can be crafted to look and behave as they do in the real world. This gives us a powerful, intuitive interface to the data being viewed, as well as a safety layer for trying out future scenarios in simulated digital-twin replications without real-world consequences.

At Younite-AI, we’re always exploring the possibilities that these technologies can unlock, particularly the combination of 3D, data, and AI. In doing so we’ve created solutions that allow our clients to look back, view real-time activity, and look forward at possible future scenarios.

“One of the most challenging components of data visualization is that oftentimes the story you need to tell is complex, dynamic, and multidimensional. However, the standard tools we have are flat, static, and designed for paper. Decomposing the dynamic nature of the narrative embedded within your data into a storyboard format is one of the best ways to ensure your key points are effectively received by your intended audience.”

— Thomas Rhodes

Historical: Data-driven reflection

The foundation of visualizing scenarios is the data that drives them. Data sources surround us and generate information constantly, and every specific scenario leaves a data trail in its wake. The problem is that this data usually arrives stripped of the context that generated it, which makes it harder to view and challenging for our brains to unpack. But what was once a progressive set of latitude and longitude strings can now trace a visual path on a map. Elevation data broadcast alongside those movements lets us view that path in three dimensions, and the velocity and distance between waypoints let us reconstruct ever truer movement patterns.
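As a minimal sketch of this idea, the snippet below turns a list of raw waypoints (timestamp, latitude, longitude, altitude) into 3D path segments with distance and speed. The `Waypoint` structure and field names are illustrative assumptions, not a real telemetry schema; the haversine formula gives the ground distance, and the broadcast elevation supplies the third dimension.

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000  # mean Earth radius, metres

@dataclass
class Waypoint:
    t: float    # seconds since start of recording
    lat: float  # degrees
    lon: float  # degrees
    alt: float  # metres above sea level

def haversine_m(a: Waypoint, b: Waypoint) -> float:
    """Great-circle ground distance between two waypoints, in metres."""
    dlat = radians(b.lat - a.lat)
    dlon = radians(b.lon - a.lon)
    h = sin(dlat / 2) ** 2 + cos(radians(a.lat)) * cos(radians(b.lat)) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(h))

def path_segments(points: list[Waypoint]) -> list[dict]:
    """Turn raw waypoints into 3D path segments with climb and speed."""
    segments = []
    for a, b in zip(points, points[1:]):
        ground = haversine_m(a, b)
        climb = b.alt - a.alt                       # elevation adds the third dimension
        dist_3d = sqrt(ground ** 2 + climb ** 2)
        dt = b.t - a.t
        segments.append({
            "distance_m": dist_3d,
            "climb_m": climb,
            "speed_mps": dist_3d / dt if dt > 0 else 0.0,
        })
    return segments
```

Each segment can then be handed straight to a 3D renderer as a polyline with per-segment speed for colouring or animation.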

You get the idea. Computers can crunch combinations of these data inputs into ever more realistic scenarios. With more and more data sources available, we can also look past the objects, people, or products we want to visualize and pull in environmental data to build even greater accuracy into scenarios.

For example, we may want to replicate a military scenario to analyze a mission undertaken by a squad of marines, leveraging the data generated by the individuals involved: everything from their locations and movements in physical space, to biometric data gathered from worn devices, to the data created by any equipment they used within the timeframe we are replicating. Knowing location, we can map the mission onto a replica of the physical terrain. Knowing the time, we can derive lighting from the sun's position, and weather data tells us whether the area was overcast, clear, rainy, or windy. The scenario becomes ever more real and allows us to step into the eyes of the marines who undertook the mission, giving a far more contextual, and human, understanding of why decisions were made and why the mission played out the way it did. That understanding is valuable for improving tactics and strategies in the future.
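The "knowing the time tells us the lighting" step can be made concrete with the standard declination/hour-angle approximation for the sun's elevation. This is a textbook formula, accurate to roughly a degree, which is plenty for setting scene lighting; it is a sketch, not a full astronomical model.

```python
from math import radians, degrees, sin, cos, asin

def solar_elevation_deg(lat_deg: float, day_of_year: int, solar_hour: float) -> float:
    """Approximate sun elevation (degrees above the horizon) from latitude,
    day of year (1-365), and local solar time (hours, 0-24).
    Uses the common declination / hour-angle approximation."""
    declination = radians(-23.44 * cos(radians(360.0 / 365.0 * (day_of_year + 10))))
    hour_angle = radians(15.0 * (solar_hour - 12.0))  # Earth rotates 15 deg per hour
    lat = radians(lat_deg)
    return degrees(asin(
        sin(lat) * sin(declination)
        + cos(lat) * cos(declination) * cos(hour_angle)
    ))
```

Feed the result into the scene's directional light and the replayed mission is lit the way the marines actually saw it.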

Present: Data-driven remote viewing

With our understanding of how to crunch and combine complex, disparate data sources, we can now leverage the swaths of data being broadcast as they are created, in real time, to watch scenarios unfold from remote locations as they happen.

This opens up opportunities to influence a scenario in real time, using technology to control objects and characters in the real world through our virtual scenario. Simply put: remote control, made possible by vehicles and products that are constantly connected to networks, broadcasting data back and forth. Instead of only taking data in and interpreting it, we flip the paradigm and broadcast data back to our actors in the real world.
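That flipped paradigm can be sketched as two message channels: one carrying telemetry from the vehicle into the virtual twin, and one carrying commands back out. The channel names, message fields, and the speed-limit rule below are all hypothetical; in production these queues would typically be MQTT or WebSocket topics.

```python
import json
import queue

# Hypothetical channels; plain in-process queues are enough to show the flow.
telemetry_in: "queue.Queue[str]" = queue.Queue()   # real world -> virtual scene
commands_out: "queue.Queue[str]" = queue.Queue()   # virtual scene -> real world

twin_state: dict[str, tuple] = {}   # latest known (lat, lon, speed) per vehicle
SPEED_LIMIT_MPS = 25.0              # illustrative rule for the control direction

def on_telemetry(raw: str) -> None:
    """Ingest one telemetry frame and, if warranted, broadcast a command back."""
    frame = json.loads(raw)
    # Viewing direction: update the virtual twin from real-world data.
    twin_state[frame["vehicle_id"]] = (frame["lat"], frame["lon"], frame["speed"])
    # Control direction: the flipped paradigm — data flows back to the vehicle.
    if frame["speed"] > SPEED_LIMIT_MPS:
        commands_out.put(json.dumps({
            "vehicle_id": frame["vehicle_id"],
            "cmd": "slow_down",
        }))
```

The same handler structure works whether the scenario is merely being watched or actively steered; only the contents of `commands_out` change.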

Real-time geo-contextual visualization of transportation data

Future: Real-time AI-driven dynamic scenarios

By infusing AI into our scenarios we add a deeper level of dynamism that opens up even more valuable opportunities. The scenarios themselves become intelligent, reacting to user input and dynamically determining responses. In the case of military scenarios, we can set the scene using historical data but then let those scenarios play out in new ways based on new interactions. In short, we can see how new decisions lead to different outcomes, and learn as a result.

We can also create intelligent characters within our scenes, allowing real-world players to engage with them directly and have them react believably. Military scenarios, again, can be infused with intelligent combatants to make them far more realistic and force players to think and act intelligently in real time, an extremely important skill to develop. The result is ever more realistic training for any dynamic situation in which humans must build the instinctual responses that are best learned through experience.

Introduction to Younite Virtual Grounds, a DIS-enhanced scenario platform.

In Conclusion

The possibilities for any of these views into scenarios are vast. At Younite-AI, we've been actively exploring these technologies and all the forms in which they can be consumed, today and in the future. We keep pushing the limits of what is achievable, working with our clients to apply these technologies to their problems and create valuable applications across a variety of industries.

Potential applications include:

  1. Virtual Training for heavy machinery
  2. Virtual Training for Extreme Working Environments
  3. Real-time monitoring of transportation networks (land, sea, and air)
  4. Real-time monitoring of logistics across virtualized physical operations
  5. Recreated military scenarios to re-experience them and test and learn new strategies
  6. Recreated physical working environments to test new workflows and engagement before implementation
  7. And this list can go on… and on…

Essentially, they apply any time we want to better understand what happened, improve or change results, or train ourselves to handle real-world physical challenges without risk. If your organization could benefit from any of these, give us a shout and we can talk more directly about how and where this technology can help.

Dave Papworth is the Creative Cultivator and Product Leader for Younite-AI. His career has taken him through multimedia, development, design, innovation, and ultimately Younite-AI.

While working in the advertising industry he led teams focused on technical innovation, exploring how it could be leveraged for marketing campaigns and brand-building platforms. This resulted in forward-thinking projects showcased at events such as Google Sandbox, and one even recognized by Time Magazine as an "Invention of the Year".

At Younite-AI, his focus is on building a team that can tackle any challenge, look beyond its boundaries, and grow the collaborative relationships we desire with our clients.
