
Konvoy’s Weekly Newsletter

Newsletter | Jun 13, 2025

Realism in Games

A framework for games to continue to move toward more extreme and realistic experiences


When we talk about “realistic” experiences in games, we are often talking about the degree of immersion. Games that pursue realism in their experiences are continually implementing new technologies to deepen that immersion. This includes building realistic avatars, developing advanced physics engines, and preparing for the future with AI-powered NPCs and adaptive, emergent game environments.

This move toward realism has also played out in other areas of entertainment such as film (Dawn of the Planet of the Apes, the Avatar series). However, these mediums have limits. Film is something we watch, not something that “happens to us” or a reality we can actively engage with. Audiobook dubs, on the other hand, are becoming more adaptive at scale with the help of AI but are still limited to passive consumption (for now; imagine a podcast you could interrupt and that would talk back to you). Gaming, as a form of interactive entertainment, does not share these limits.

It is clear that most people want more and increasingly realistic experiences, whether or not those experiences are scripted and fictional:

  • 8 of the top 10 highest-grossing films of the last decade relied on heavy utilization of CGI or motion capture for visuals, major sequences, or characters (Box Office Mojo)
  • WWE, a fully scripted professional wrestling league, has a market cap ($8.8bn) comparable to that of the top combat sports league in the world, the Ultimate Fighting Championship ($12.3bn). Even more impressively, WWE has done this with a fanbase that is ~86% smaller than the UFC’s
  • Spend on live music events ($35bn in 2024) continues to rival and exceed that of digital-only recorded music ($21bn in 2024) despite the latter’s rapid growth over the last 2 decades

However, there are areas of gaming, such as VR and virtual sports/music viewership, that have been much slower to gain adoption, indicating that there are still technological limits on the degree of immersion gaming has achieved to date. This week, we attempt to create a framework for realism and immersion, discuss which technologies over the last decade have moved us forward, and identify what else is needed to cross the chasm of realism.

Crossing the Chasm of Immersion: A Framework

The question of what makes something feel “real” to a user does not have a simple answer. It is not only high visual fidelity or a certain degree of sensory input; rather, it is a complex combination of mimicking active, predictive processes involving multiple neural systems, sensory modalities, and psychological mechanisms. We put together a draft framework to better think through the different aspects of realism in games, made up of five components:

  1. Narrative Plausibility - “Does the story seem real?”: The events and interactions within an experience must follow logical, consistent rules that match our expectations about how the world works.
  2. Affective Resonance - “How does it make you feel?”: Emotional engagement and the activation of appropriate affective responses enhance the sense of reality. Experiences that engage our emotions feel more real than those that remain emotionally neutral.
  3. Embodied Presence - “Do I believe that I am physically there?”: The sense of having a body in the environment, with appropriate agency and control, contributes significantly to reality perception. This includes both the presence of a virtual body (in virtual environments) and the responsiveness of that body to intentions and actions.
  4. Sensorimotor Engagement - “Does it feel like I am physically there?”: Realistic experiences must provide appropriate sensorimotor feedback that matches our expectations about how actions should affect our senses. This includes both actual feedback (when we move, the world changes appropriately) and counterfactual feedback (we implicitly know how the world would change if we took various actions).
  5. Predictive Consistency - “Does this match what I expected?”: The brain's predictive models must generate accurate predictions about incoming sensory data. When predictions align well with actual input, experiences feel real and natural. Mismatches between predictions and sensory input create prediction errors that can disrupt the sense of reality.

Narrative Plausibility and Affective Resonance

Narrative plausibility in games has historically been the “baseline” level of immersion set by creators, as it is the easiest to limit and control. For example, game developers typically limit character dialogue to a set of fixed choices which can only lead to a controlled set of outcomes. Narrative plausibility could also be described as the “suspension of disbelief” that game developers foster by building worlds, narratives, and characters that allow players to buy into the implausible or fantastical elements of a game as if they were real.

Affective realism refers to the phenomenon where our feelings create a sense of realism. Research has shown that feelings are integral to the brain's internal model and thus to perception itself. Rather than simply influencing judgment after perception occurs, emotions actually influence the content of what we perceive. This crucial dimension is often overlooked in purely technological approaches (e.g., good graphics, spatial audio) to understanding realism. This affective component helps explain why emotionally engaging content often feels more real and memorable than emotionally neutral content. The integration of emotional and sensory processing creates what researchers call "unified conscious experience", where sensation from both inside and outside the body is processed together with emotional circuitry.

Technologies that have improved this over the last decade:

  • Motion capture (or mocap) technology has enabled real actor performances to translate to digital characters, capturing subtle gestures, expressions, and body language. This results in more nuanced and relatable characters, as seen in games such as God of War and The Last of Us, where emotional storytelling is central to the experience.
  • Dynamic audio such as custom soundtracks or adaptive music are used to set the mood and reinforce the emotional tone of the game.
  • Technologies that enable players to choose their own paths in a game can create a stronger emotional connection with the storyline. This is present in Telltale’s The Walking Dead, where the engine tracked player choices across episodes, influencing character relationships and certain plot outcomes.

Emerging technologies:

  • AI NPCs and AI-driven speech adaptation give players the ability to interact with characters that understand their place in the world that is being built. Rather than fixed dialogue, the promise of this technology enables much more dynamic and emotionally realistic conversations with in-game characters.
  • Biometric sensors (heart rate monitors, sweat sensors, etc.) are not integrated into popular games today but could hypothetically allow games to respond to a player’s physiological state and emotions.
  • Similarly, player sentiment tools can leverage AI and natural language processing (NLP) and facial recognition to analyze player emotions in real time. This allows games to adapt narratives, challenges, and social interactions based on the player’s emotional state.

Embodied Presence and Sensorimotor Engagement

Mel Slater's work in VR research has provided one of the most comprehensive frameworks for understanding what makes experiences feel real. Slater proposed that the sense of presence - the subjective feeling of "being there" - depends on two components:

  1. Place Illusion (PI): corresponds to the traditional conception of presence as “being there” which relies heavily on sensory immersion and the quality of a virtual environment’s ability to provide convincing sensory input.
  2. Plausibility Illusion (Psi): is a more sophisticated aspect of perception. Psi is defined as "the illusion that what is apparently happening is really happening, in spite of the sure knowledge that it is not". This illusion depends on the coherence and consistency of events within the virtual environment, including logical cause-and-effect relationships, realistic character behaviors, and consistency with user expectations.

The concept of “embodied presence” (the feeling of “being there”) as a definition of reality is distinct from sensorimotor engagement (the alignment of actions and sensory feedback). The former suggests that our sense of reality depends on our implicit knowledge of how our actions would change our sensory input (“I believe that if I pushed a cup of coffee off a desk, it would fall and hit the ground”).

On the other hand, sensorimotor contingency theory puts this into explicit action, suggesting that your sense of reality is only shaped if you were to actually try to push the cup over - only by feeling the cup against your hand, hearing the cup hit the ground, and smelling the spilled coffee would you then have a strong sense of reality.

Technologies that have improved this over the last decade:

  • More games are utilizing flexible avatar creation systems that allow players to precisely mold the way their avatar looks. Games like Bonelab have also developed systems that translate these avatars' specs to directly affect the in-game physics and player experience. For example, a larger, heavier avatar will have more mass, can move objects more easily but may move slower, and differences in limb length and body proportions can affect the player’s reach, climbing ability, and weapon handling.
  • Spatial audio enabled by 3D audio engines (e.g., Dolby Atmos) simulates sound direction/distance, helping players locate threats or events spatially and strengthens place illusion by anchoring audio cues to virtual environments.
  • Improvements in VR headset hardware (inside-out tracking, 120Hz refresh rates, and 4K displays) have continuously improved head/hand tracking, fields of view, and low latency, which has enhanced place illusion by simulating realistic spatial navigation and environmental interactions.
  • Haptic hardware has introduced tactile sensations, such as vibrations or temperature changes, which provide structural isomorphism (physical actions that mirror virtual responses) to deepen embodiment.
  • Advanced engine physics like those of NVIDIA PhysX and Unreal Engine 5’s Chaos simulate realistic object interactions, while ray tracing creates lifelike lighting/shadow dynamics which reinforce environmental plausibility by ensuring virtual worlds behave predictably.

Emerging technologies:

  • Haptic systems, such as Afference (a Konvoy portfolio company), take reactive tactile sensations one step further by enabling two-way interaction with digital objects: simulating touch and the “feel” of virtual objects. Meanwhile, other multi-modal haptic devices such as full-body suits are being developed to deliver sensations more complex than pressure alone, such as stretching, sliding, and even temperature changes.
  • Next-generation VR systems are integrating multiple sensors (“sensor fusion”, combining IMUs, cameras, LiDAR) to achieve millimeter-accurate, low-latency tracking of the entire body - not just hands or head. This enables avatars to mirror user posture, gait, and even subtle gestures, dramatically increasing the sense of embodiment and natural interaction.

Predictive Consistency

Modern neuroscience has fundamentally transformed our understanding of how the brain creates our sense of reality through a process called predictive coding. This framework posits that our brain operates as a "Bayesian inference engine," continuously updating probabilities about the external world based on incoming evidence.

When we encounter sensory input, higher brain regions generate models or beliefs about what information should be received from lower levels, creating predictions that get sent down to lower-level sensory processing areas (in other words, our brains are continuously trying to create some prediction on a version of the future). The brain then compares these predictions with actual sensory input, and the resulting prediction error drives learning and perception.
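As an illustrative sketch (not from the newsletter, and a deliberate simplification of the neuroscience), the loop described above can be written as a toy predictive-coding update: an internal estimate stands in for the brain's model, and each new sensory sample nudges that estimate in proportion to the prediction error.

```python
def predictive_coding_update(estimate, sensory_input, learning_rate=0.1):
    """One step of a toy predictive-coding loop.

    The internal model predicts the sensory input, compares the
    prediction with what actually arrives, and shifts its belief
    in proportion to the prediction error.
    """
    prediction_error = sensory_input - estimate  # mismatch drives learning
    return estimate + learning_rate * prediction_error

# In a stable "world," repeated samples near 5.0 pull the estimate toward
# 5.0 and the prediction error shrinks: the input "feels consistent."
estimate = 0.0
for sample in [5.0, 5.2, 4.9, 5.1, 5.0]:
    estimate = predictive_coding_update(estimate, sample)
```

In game terms, a world whose feedback keeps producing small prediction errors feels stable and real; a sudden large error (a physics glitch, a laggy response) is exactly the kind of mismatch that breaks the illusion.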

Creating believable predictive consistency in games is about ensuring the brain’s expectations are met by the game’s feedback. It is distinct from narrative logic, emotional engagement, embodiment, or sensorimotor alignment (though it underpins and supports all of these). This is what makes a game feel “real” at the most fundamental, subconscious level.

Technologies that have improved this over the last decade: Given that predictive consistency is an underpinning concept, many of the technologies mentioned earlier also apply here. Specifically, game engines that are consistent, predictable, and responsive, and systems that are low-latency and high-fidelity, all contribute to predictive consistency.

Emerging technologies:

  • Predictive rendering and AI precalculation can support systems that predict what the player will do next and render or simulate those outcomes ahead of time, reducing lag and making interactions feel more natural.
  • Neural interfaces could one day allow games to directly read or anticipate player intentions, further reducing the gap between expectation and feedback.

Takeaway: The scientific understanding of what makes something feel real shows that reality perception is a sophisticated, multi-dimensional process that extends far beyond simple sensory input. Rather than being a passive recording of the external world, our sense of reality emerges from the active integration of predictive processing, sensorimotor contingencies, multimodal sensory integration, embodied presence, emotional engagement, and narrative coherence.

This framework demonstrates that creating realistic experiences requires careful attention to psychological and neurological principles rather than just technological capabilities. As our understanding of these mechanisms continues to evolve, we can expect more sophisticated and effective approaches to creating experiences that feel genuinely real to players.
