About 5D/XR

The idea of enhancing human perception through computer-mediated reality dates back to the 1960s [1]. Since then, the concept has undergone various transformations, producing a proliferation of terms that can be bewildering [2]. Extended Reality (XR) serves as an umbrella term covering a range of immersive technologies, such as virtual reality (VR), augmented reality (AR), and mixed reality (MR), along with the input mechanisms used to interact with them. XR denotes a spectrum of experiences that blur the boundaries between the physical world and virtual environments [3]. Milgram conceptualized this spectrum as the reality–virtuality continuum, which encompasses all possible combinations of real and virtual objects [4,5]. Figure 1 illustrates the relationship between the different XR technologies, depicting the transition from the real environment to the fully virtual environment.

Figure 1. Relation between XR technologies, input mechanisms, and environment

While VR submerges the user in a fully immersive environment, AR overlays relevant virtual elements on the observed reality. The XR spectrum also includes MR, Augmented Virtuality (AV), and Diminished Reality, each offering a distinct experience [6,7,8]. The prevailing trend, however, is a shift from VR headsets to MR headsets, exemplified by recent devices such as the Meta Quest 3 and Meta Quest Pro.

A. Augmented Reality

AR is a technology designed to enhance the user's visual field by overlaying the information needed for the current task [9,10]. It exhibits several key properties, including the integration of real and augmented objects, real-time interactivity, and accurate alignment of real and augmented elements [11]. These dynamics complement human associative information processing and memory, facilitating a seamless transition between reality and augmentation [12]. AR finds extensive applications in remote guidance, instructional visualization for complex assembly, disruptive tasks, and training within existing environments [13]. AR can be divided into three types: hand-held (smartphone or tablet), head-worn (glasses), and spatial (projector or hologram) [14]. In essence, AR enriches reality with computer-generated content, commonly accessed through smartphones, tablets, or glasses [15,16]. This digital content overlays the user's real-world view, providing a blend of real and virtual experiences that can include 3D stereoscopic or 2D imagery [17]. The essence of AR lies in its ability to deliver real-time information or data within the context of the user's immediate surroundings, enhancing situational awareness and facilitating informed decision-making [15]. Table 1 showcases the most recent AR devices, detailing their specifications and offering a thorough comparative analysis.
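The "accurate alignment" property above is, at its core, a projection problem: the system must map a 3D point anchored in the world onto the 2D pixel where its augmentation should be drawn. The sketch below illustrates this registration step with plain pinhole-camera math; the pose, intrinsics, and anchor values are illustrative assumptions, not parameters of any particular AR device.

```python
import numpy as np

# Minimal sketch of AR registration: projecting a world-anchored virtual
# point into the camera image. Pose and intrinsics are placeholders.

# Camera intrinsics (focal lengths fx, fy and principal point cx, cy).
K = np.array([[800.0,   0.0, 640.0],
              [  0.0, 800.0, 360.0],
              [  0.0,   0.0,   1.0]])

# Camera pose from the tracker: rotation R and translation t map
# world coordinates into the camera frame (world -> camera).
R = np.eye(3)                        # identity: camera axis-aligned with world
t = np.array([0.0, 0.0, 0.0])

def project_to_pixel(p_world, R, t, K):
    """Project a 3D world point to 2D pixel coordinates."""
    p_cam = R @ p_world + t          # world -> camera frame
    if p_cam[2] <= 0:
        return None                  # behind the camera, not visible
    uvw = K @ p_cam                  # perspective projection
    return uvw[:2] / uvw[2]          # normalize by depth

# A virtual label anchored 2 m in front of the camera.
anchor = np.array([0.0, 0.0, 2.0])
print(project_to_pixel(anchor, R, t, K))   # ~ (640, 360): image center
```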

Table 1. Specifications of latest AR devices


B. Mixed Reality

AV represents a higher level of virtuality than augmented reality and is primarily used for visualizing new products and procedures, such as picking processes; in AV, a significant portion of the elements involved are synthetic [9,24]. MR is characterized by the blending of real and virtual environments, where each augments the other [9,25]. MR closely resembles AR but offers a more immersive experience through richer interaction between virtual and real-world elements, enhancing realism for users [26]. Unlike in AR, users in MR not only see computer-generated content integrated with reality but also interact with it, creating a seamless fusion of virtual and physical interaction. This is exemplified by the interactive heads-up displays of science fiction, where users can touch virtual content and perceive its scale as if it were real [15]. Achieving the MR experience requires specialized headsets equipped with integrated computers, translucent optics, and sensors. These systems map the real-world environment in real time, enabling virtual objects to interact seamlessly with the physical environment and with users. In essence, MR offers a more immersive and interactive version of AR [2]. Table 2 features notable examples of MR headsets widely utilized in reported MR applications, along with their specifications and a comparative analysis.
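The real-time environment mapping described above is what lets MR content respond to physical surroundings. A common primitive built on that map is the raycast: casting a ray from the user's gaze or controller against the reconstructed geometry to decide where a virtual object should be anchored. The toy sketch below reduces the scanned mesh to a single floor plane to show the idea; all values are illustrative assumptions.

```python
import numpy as np

# Toy sketch of the placement step in MR: cast a ray from the user's
# gaze against the reconstructed environment and anchor a virtual object
# at the hit point. Real headsets raycast against a scanned triangle
# mesh; here the "environment" is a single horizontal floor plane.

FLOOR_Y = 0.0   # scanned floor height in metres (illustrative)

def raycast_floor(origin, direction):
    """Return the point where the gaze ray hits the floor, or None."""
    d = direction / np.linalg.norm(direction)
    if abs(d[1]) < 1e-9:
        return None                      # ray parallel to the floor
    s = (FLOOR_Y - origin[1]) / d[1]     # distance along the ray
    return origin + s * d if s > 0 else None

# Head at 1.6 m, looking forward and 30 degrees downward.
head = np.array([0.0, 1.6, 0.0])
gaze = np.array([0.0, -np.sin(np.radians(30)), np.cos(np.radians(30))])

hit = raycast_floor(head, gaze)
if hit is not None:
    print(f"Anchor virtual object at {hit}")   # ~2.77 m in front of the user
```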

Table 2. Specifications of widely used and latest MR headsets


C. Virtual Reality

VR refers to the use of real-time computing, together with specialized hardware and software, to create simulations of alternate worlds or environments that are convincingly realistic to users [34,35]. Positioned at the far end of the reality–virtuality continuum, VR systems generate entirely computer-generated content, immersing users fully in virtual environments with no interaction with the real world. The immersive nature and high level of presence of VR systems offer significant flexibility for exploring "what-if" scenarios [22]. VR immerses users in a synthetic environment that replicates real-world properties using technologies such as high-resolution displays, high refresh rates, head-mounted displays, stereo headphones, and motion-tracking systems [36,37].

In VR, users become completely absorbed in a virtual world, unable to perceive the real world around them. The technology enables users to step into a three-dimensional world that they can explore, interact with, and move around in as if it were real [38,39]. VR finds applications across different phases of manufacturing and is commonly used for product development, marketing, training, ergonomics [40], and visualizing digital factories during the design phase, whether for greenfield or brownfield applications [41]. When using VR, users don glasses, earphones, and other devices to immerse themselves in a simulated environment, shutting out the real world entirely.

Everything within a VR environment, including the user's avatar, is virtual content. These environments can consist of 360-degree photos, 360-degree videos, or n-D models, capturing scenarios or situations for users to experience firsthand. Interactivity is a key feature of VR, allowing users to interact with objects, props, or the environment itself. Typically, VR experiences are delivered through enclosed head-mounted displays with surround audio, providing a fully immersive sensory experience [42,43]. VR systems typically come in three setups [22]:

  1. The entry-level setup involves standalone headsets, which may use a smartphone paired with a cardboard viewer or an integrated solution to deliver the virtual experience.
  2. Another setup is the CAVE (Cave Automatic Virtual Environment), featuring multiple large projection screens forming the walls and floor of a room, providing users with complete immersion.
  3. The third setup involves head-mounted displays (HMDs) connected to dedicated computers. This setup has gained dominance in recent years due to its increasing affordability while maintaining a high-quality VR experience [22]; a schematic of the head-tracked, per-eye render loop such headsets run is sketched below.
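As a rough illustration of what distinguishes the HMD setups, the sketch below shows the per-frame step every head-mounted system performs in some form: turning one tracked head pose into two eye cameras separated by the interpupillary distance. It is a schematic under assumed names and values, not any vendor's SDK.

```python
import numpy as np

# Schematic of the core HMD render loop: each frame, the tracked head
# pose drives two eye cameras offset by half the interpupillary
# distance (IPD). Names and values are illustrative, not a vendor API.

IPD = 0.063   # typical adult interpupillary distance in metres

def eye_view_matrices(head_rotation, head_position):
    """Build left/right eye view matrices from the tracked head pose."""
    views = []
    for sign in (-1.0, +1.0):                      # left eye, right eye
        offset = head_rotation @ np.array([sign * IPD / 2, 0.0, 0.0])
        eye_pos = head_position + offset
        view = np.eye(4)
        view[:3, :3] = head_rotation.T             # inverse of rotation
        view[:3, 3] = -head_rotation.T @ eye_pos   # inverse of translation
        views.append(view)
    return views

# One frame: pretend the tracker reports an un-rotated head at 1.7 m.
left, right = eye_view_matrices(np.eye(3), np.array([0.0, 1.7, 0.0]))
# render(scene, left); render(scene, right)   # one pass per eye
print(left[:3, 3], right[:3, 3])
```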

Table 3 presents prominent examples of VR headsets extensively utilized in various reported VR applications. The table includes their specifications and provides a comprehensive comparative analysis.

Table 3. Specifications of widely used and latest VR headsets


D. Haptics

The term "Haptics" was introduced by Revesz in 1950, inspired by observations of blind individuals' performance. It denotes an unconventional sensory experience distinct from traditional touch and kinesthesis. Specifically, haptics refers to "active touch" rather than passive touching [51]. Human haptic perception encompasses both kinesthetic and cutaneous (tactile) feedback. Kinesthetic feedback involves sensing the position and movement of one's body, mediated by receptors located in the skin, joints, skeletal muscles, and tendons [52]. Conversely, cutaneous feedback relates to stimuli detected by low-threshold mechanoreceptors beneath the skin within the contact area [53]. Devices designed to stimulate kinesthesia are typically grounded, bulky, mechanically complex, and expensive, with a limited workspace. Traditionally, these kinesthetic devices can provide distinct forces or torques to move the user's hand or resist motion, offering high-quality interactions but facing constraints in terms of cost and portability [54]. To circumvent the limitations associated with grounded kinesthetic devices, haptic feedback can be delivered through cutaneous devices. Although cutaneous feedback can theoretically be provided for the entire body, it is predominantly delivered through fingertips, which are commonly utilized for grasping and manipulation and are rich in mechanoreceptors [55]. Research has demonstrated that, to some extent, it is feasible to compensate for the absence of kinesthesia using the modulated cutaneous force technique without experiencing significant performance degradation [56,57].

Despite its significance, the haptic sense remains relatively underexplored compared to sight and hearing [58]. The prevailing trend in haptic device development is to bring the base of the device closer to the area of stimulation: from grounded devices, which cannot be worn on the user's body, to hand-held devices, and onwards to partially and fully wearable devices [59]. Accordingly, haptic systems can be classified by wearability level into four categories, as depicted in Figure 5.

  1. Grounded: Non-wearable devices, further divided into graspable and touchable systems.
  2. Hand-held: Devices considered "partially" wearable, distinguished by the type of actuation (direct or indirect) relative to the user's limb.
  3. Gloves: Exoskeletons, gloves, finger-worn devices, and arm-worn devices, designed to be worn directly on the hand or arm.
  4. Suits: Full-body wearable devices, typically consisting of a garment equipped with an array of actuators or vibrators strategically placed throughout the suit.

Grounded haptic devices, also referred to as "tabletop" haptic devices, are those that cannot be worn on any part of the user's body due to their size and/or functional features, such as the presence of air reservoirs or compressors. As a result, the workspace of such devices is typically limited. Grounded haptic systems can be further categorized into two types: graspable and touchable devices (as depicted in Figure 5a). Because grounded devices are less constrained in size and weight than hand-held and wearable devices, they often employ pneumatic actuation, with bulky reservoirs and pumps, or magnetic actuation, involving platforms and large electric coils [40].

Hand-held devices, as outlined in Figure 5b, are devices that can be held in the hands without the need for straps. Compared to grounded devices, they are typically lighter, impose fewer constraints on movement, and offer a larger workspace; however, they cannot be worn, thus limiting complete freedom of movement. Hand-held devices can render kinesthetic feedback, tactile feedback, or both simultaneously. Such controllers significantly enhance the user experience while held in hand during operation. Traditional controllers commonly provide vibrotactile feedback to emphasize certain events occurring on the screen [40].

The most common types of wearable haptic devices are haptic gloves and exoskeleton systems, often referred to simply as exoskeletons. The primary distinction between them is that not all haptic gloves have an exoskeletal structure, and conversely, not all exoskeletal systems are designed in the form of gloves. These devices are primarily intended to provide kinesthetic haptic feedback while being worn directly on the user's body [42]. Table 4 showcases notable examples of haptic gloves widely utilized in reported XR applications, with detailed specifications for each glove and a comprehensive comparative analysis.

Haptic suits are wearable devices tailored to provide haptic feedback to the upper body as users interact with elements of the virtual environment [60,61,62]. Most commercially available haptic suits employ vibration actuators strategically positioned throughout the torso and arms, operating in predetermined sequences. This configuration enables users to experience haptic feedback resembling various sensations, such as the impact of a bullet or a fist [63]; a sketch of such a sequence follows. Haptic suits have previously been utilized in movies [64], military training simulations [65,66], and games [43,44,67].
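The sketch below illustrates one such predetermined sequence: a simulated bullet hit rendered as a strong burst on one actuator of a torso grid, followed by a weaker ripple on its neighbours. The grid layout, intensities, and timings are invented for illustration; a real suit would drive its actuators through the vendor's own interface.

```python
import time

# Sketch of the "predetermined sequence" idea behind haptic-suit effects:
# an impact is a short burst on one actuator plus a weaker ripple on its
# neighbours. The 4x4 torso grid and timings are illustrative only.

GRID = [(r, c) for r in range(4) for c in range(4)]   # torso actuators

def fire(actuator, intensity):
    """Stand-in for the hardware call that drives one actuator."""
    print(f"actuator {actuator}: intensity {intensity:.2f}")

def impact(center, strength=1.0):
    """Play a bullet-hit style pattern centred on one actuator."""
    fire(center, strength)                       # sharp initial hit
    time.sleep(0.05)
    for r, c in GRID:                            # ripple to neighbours
        dist = abs(r - center[0]) + abs(c - center[1])
        if dist == 1:
            fire((r, c), strength * 0.4)
    time.sleep(0.1)

impact((1, 2))   # hit on the upper-right chest
```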

5D Concept

In the realm of advanced medical technology, the 5D concept revolutionizes patient care by seamlessly integrating multiple dimensions with the power of extended reality (XR) capabilities. First, we consider the 3D aspect: the spatial X, Y, Z coordinates that capture precise location and depth within the operating room (OR). This forms the foundation for detailed anatomical mapping and spatial orientation.

Moving into the 4D domain, we add the critical element of time, enabling real-time movements through volumetric video streaming. This allows surgeons to visualize and interact with dynamic, three-dimensional images of patient anatomy as they operate, providing a live, immersive view of the surgical site. The incorporation of XR, including augmented reality (AR) and virtual reality (VR), enhances this experience by overlaying vital information directly onto the surgeon's field of view, ensuring they have immediate access to critical data without diverting their attention.

The 5D paradigm further enhances this by incorporating context—an amalgamation of information, assets, and AI-analytics. This contextual layer includes comprehensive patient data, AI-driven insights, and additional resources such as medical histories, imaging results, and surgical plans. XR capabilities enable these data points to be seamlessly integrated and visualized within the surgeon’s XR headset or AR display. For instance, AR can project a patient’s medical history and real-time analytics onto a virtual screen in the surgeon’s view, allowing for quick reference and enhanced decision-making.
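One way to make these three layers concrete is to think of each streamed sample as a record carrying all five dimensions at once: geometry, capture time, and context. The sketch below shows such a record; the field names and values are illustrative assumptions, not a published schema.

```python
from dataclasses import dataclass, field
import time

# Sketch of how the layers described above might be packaged per streamed
# sample: 3D geometry, a 4th temporal dimension, and a 5th contextual
# layer of patient data and AI outputs. Names are illustrative only.

@dataclass
class Frame5D:
    # 3D: spatial geometry of the surgical site
    points_xyz: list            # volumetric point/mesh data for this frame
    # 4D: time of capture, so frames replay as live volumetric video
    timestamp: float = field(default_factory=time.time)
    # 5D: context — records, imaging, and AI-analytics for this moment
    patient_record_id: str = ""
    vitals: dict = field(default_factory=dict)      # e.g. {"hr": 72}
    ai_alerts: list = field(default_factory=list)   # contextual warnings

frame = Frame5D(
    points_xyz=[(0.01, 0.02, 0.15)],        # placeholder geometry
    patient_record_id="case-0421",
    vitals={"hr": 72, "spo2": 98},
    ai_alerts=["bleeding risk: low"],
)
print(frame.timestamp, frame.ai_alerts)
```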

AI-analytics play a crucial role in this 5D environment by providing predictive analytics and real-time decision support. These AI tools analyze patient data, monitor vital signs, and predict potential complications, offering suggestions and alerts that are contextually relevant to the ongoing procedure. Surgeons receive these insights directly through their XR devices, ensuring they are constantly informed and able to make data-driven decisions on the fly.

The integration of these five dimensions with XR capabilities facilitates an unprecedented level of precision and efficiency in surgical procedures. Surgeons are equipped with a comprehensive, immersive view of the patient's anatomy, real-time movements, and contextual information, all enhanced by AI-analytics. This holistic, data-enriched approach not only optimizes patient outcomes but also transforms the surgical experience, making it more intuitive and information-rich. Through the combination of the 5D concept and XR technology, the future of surgery is poised to be more precise, informed, and effective than ever before.


References


  • [1]. Sutherland, I.E., 1968, December. A head-mounted three-dimensional display. In Proceedings of the December 9-11, 1968, fall joint computer conference, part I, pp. 757-764.
  • [2]. Gong, L., Fast-Berglund, Å. and Johansson, B., 2021. A framework for extended reality system development in manufacturing. IEEE Access, 9, pp.24796-24813, doi: 10.1109/ACCESS.2021.3056752.
  • [3]. Alizadehsalehi, S., Hadavi, A. and Huang, J.C., 2020. From BIM to extended reality in AEC industry. Automation in Construction, 116, p.103254, doi: 10.1016/j.autcon.2020.103254.
  • [4]. Milgram, P. and Kishino, F., 1994. A taxonomy of mixed reality visual displays. IEICE TRANSACTIONS on Information and Systems, 77(12), pp.1321-1329.
  • [5]. Milgram, P., Takemura, H., Utsumi, A. and Kishino, F., 1995, December. Augmented reality: A class of displays on the reality-virtuality continuum. In Telemanipulator and telepresence technologies (Vol. 2351, pp. 282-292). SPIE, doi: 10.1117/12.197321.
  • [6]. Trindade, N.V., Ferreira, A., Pereira, J.M. and Oliveira, S., 2023. Extended reality in AEC. Automation in Construction, 154, p.105018, doi: 10.1016/j.autcon.2023.105018.
  • [7]. Chalmers, D.J., 2022. Reality+: Virtual Worlds and the Problems of Philosophy. WW Norton & Company.
  • [8]. Craig, A.B., 2013. Understanding augmented reality: Concepts and applications.
  • [9]. Fast-Berglund, Å., Gong, L. and Li, D., 2018. Testing and validating Extended Reality (xR) technologies in manufacturing. Procedia Manufacturing, 25, pp.31-38, doi: 10.1016/j.promfg.2018.06.054.
  • [10]. Caudell, T.P. and Mizell, D.W., 1992, January. Augmented reality: An application of heads-up display technology to manual manufacturing processes. In Hawaii international conference on system sciences (Vol. 2). ACM SIGCHI Bulletin.
  • [11]. Azuma, R.T., 1997. A survey of augmented reality. Presence: teleoperators & virtual environments, 6(4), pp.355-385, doi: 10.1162/pres.1997.6.4.355.
  • [12]. Neumann, U. and Majoros, A., 1998, March. Cognitive, performance, and systems issues for augmented reality applications in manufacturing and maintenance. In Proceedings. IEEE 1998 Virtual Reality Annual International Symposium (Cat. No. 98CB36180) (pp. 4-11). IEEE, doi: 10.1109/VRAIS.1998.658416.
  • [13]. Regenbrecht, H., Baratoff, G. and Wilke, W., 2005. Augmented reality projects in the automotive and aerospace industries. IEEE computer graphics and applications, 25(6), pp.48-56, doi: 10.1109/MCG.2005.124.
  • [14]. Syberfeldt, A., Danielsson, O. and Gustavsson, P., 2017. Augmented reality smart glasses in the smart factory: Product evaluation guidelines and review of available products. IEEE Access, 5, pp.9118-9130, doi: 10.1109/ACCESS.2017.2703952.
  • [15]. Alizadehsalehi, S. and Yitmen, I., 2023. Digital twin-based progress monitoring management model through reality capture to extended reality technologies (DRX). Smart and Sustainable Built Environment, 12(1), pp.200-236, doi: 10.1108/SASBE-01-2021-0016.
  • [16]. Machado, R.L. and Vilela, C., 2020. Conceptual framework for integrating BIM and augmented reality in construction management. Journal of civil engineering and management, 26(1), pp.83-94, doi: 10.3846/jcem.2020.11803.
  • [17]. Garbett, J., Hartley, T. and Heesom, D., 2021. A multi-user collaborative BIM-AR system to support design and construction. Automation in Construction, 122, p.103487, doi: 10.1016/j.autcon.2020.103487.
  • [18]. Magic Leap, Magic Leap 2, Available at: https://www.magicleap.com/devices-ml2, Accessed April 2024.
  • [19]. EPSON, Moverio BT-45C AR Smart Glasses, Available at: https://epson.ca/For-Work/Wearables/SmartGlasses/Moverio-BT-45C-AR-Smart-Glasses/p/V11H970020, Accessed April 2024.
  • [20]. VUZIX, VUZIX M4000 SMART GLASSES, Available at: https://www.vuzix.com/en-ca/products/m4000-smart-glasses, Accessed April 2024.
  • [21]. Lenovo, ThinkReality A3
  • [22]. RealWear, RealWear Navigator Z1, Available at: https://www.realwear.com/devices/navigator-Z1, Accessed April 2024.
  • [23]. XREAL, XREAL Air 2 Ultra, Available at: https://us.shop.xreal.com/products/xreal-air-2-ultra, Accessed April 2024.
  • [24]. Khan, W.A., Raouf, A. and Cheng, K., 2011. Augmented reality for manufacturing. Virtual Manufacturing, pp.1-56, doi: 10.1007/978-0-85729-186-8_1.
  • [25]. Barfield, W. ed., 2016. Fundamentals of wearable computers and augmented reality. Second Edition, CRC press.
  • [26]. Alizadehsalehi, S., Hadavi, A. and Huang, J.C., 2019. Virtual reality for design and construction education environment. AEI 2019, pp.193-203, doi: 10.1061/9780784482261.023.
  • [27]. Apple, Apple Vision Pro, Available at: https://www.apple.com/apple-vision-pro/, Accessed April 2024.
  • [28]. Microsoft, Microsoft HoloLens 2, Available at: https://www.microsoft.com/enca/hololens/hardware#document-experiences, Accessed April 2024.
  • [29]. Varjo, Varjo XR-3, Available at: https://varjo.com/products/varjo-xr-3/, Accessed April 2024.
  • [30]. Varjo, Varjo XR-4, Available at: https://varjo.com/products/xr-4/, Accessed April 2024.
  • [31]. Samsung, HMD Odyssey, Available at: https://www.samsung.com/us/support/troubleshooting/TSG01111314/, Accessed April 2024.
  • [32]. Meta, Meta Quest Pro, Available at: https://www.meta.com/ca/quest/quest-pro/, Accessed April 2024.
  • [33]. Meta, Meta Quest 3, Available at: https://www.meta.com/ca/quest/quest-3/, Accessed April 2024.
  • [34]. S. C.-Y. Lu, M. Shpitalni, and R. Gadh, "Virtual and augmented reality technologies for product realization," CIRP Ann., vol. 48, no. 2, pp. 471-495, 1999.
  • [35]. L. Gong, Å. Fast-Berglund, B. Johansson, A Framework for Extended Reality System Development in Manufacturing, IEEE Access. 9 (2021) 24796–24813. https://doi.org/10.1109/ACCESS.2021.3056752.
  • [36]. L. Taylor, T. Dyer, M. Al-Azzawi, C. Smith, O. Nzeako, Z. Shah, Extended reality anatomy undergraduate teaching: A literature review on an alternative method of learning, Ann. Anat. 239 (2022) 151817. https://doi.org/10.1016/j.aanat.2021.151817.
  • [37]. Moro, C., Stromberga, Z., Raikos, A., Stirling, A., 2017. The effectiveness of virtual and augmented reality in health sciences and medical anatomy. Anat. Sci. Educ. 10 (6), 549–559. https://doi.org/10.1002/ase.1696
  • [38]. Å. Fast-Berglund, L. Gong, D. Li, Testing and validating Extended Reality (xR) technologies in manufacturing, Procedia Manuf. 25 (2018) 31–38. https://doi.org/10.1016/j.promfg.2018.06.054.
  • [39]. T. S. Mujber, T. Szecsi, and M. S. J. Hashmi, "Virtual reality applications in manufacturing process simulation," Journal of Materials Processing Technology, vol. 155-156, pp. 1834-1838, 2004.
  • [40]. G. Lawson, P. Herriotts, L. Malcolm, K. Gabrecht, and S. Hermawati, "The use of virtual reality and physical tools in the development and validation of ease of entry and exit in passenger vehicles," Appl Ergon, vol. 48, pp.240-51, May 2015.
  • [41]. L. P. Berg and J. M. Vance, "An Industry Case Study: Investigating Early Design Decision Making in Virtual Reality," Journal of Computing and Information Science in Engineering, vol. 17, pp. 011001-011001-7, 2016.
  • [42]. S. Alizadehsalehi, I. Yitmen, Digital twin-based progress monitoring management model through reality capture to extended reality technologies (DRX), Smart Sustain. Built Environ. 12 (2023) 200–236. https://doi.org/10.1108/SASBE-01-2021-0016.
  • [43]. A. Khalili, An XML-based approach for geo-semantic data exchange from BIM to VR applications, Autom. Constr. 121 (2021) 103425. https://doi.org/10.1016/j.autcon.2020.103425.
  • [44]. Meta, Meta Quest 2, Available at: https://www.meta.com/ca/quest/products/quest-2/tech-specs/#tech-specs, Accessed April 2024.
  • [45]. Varjo, Varjo Aero, Available at: https://varjo.com/products/aero/, Accessed April 2024.
  • [46]. HTC, VIVE Focus 3, Available at: https://www.vive.com/ca/product/vive-focus3/specs/, Accessed April 2024.
  • [47]. Valve, Valve Index, Available at: https://www.valvesoftware.com/en/index/headset, Accessed April 2024.
  • [48]. Sony, PlayStation VR2, Available at: https://www.playstation.com/en-ca/ps-vr2/ps-vr2-tech-specs/, Accessed April 2024.
  • [49]. HP, HP Reverb G2, Available at: https://support.hp.com/us-en/document/c06938191, Accessed April 2024.
  • [50]. Pimax, Crystal, Available at: https://pimax.com/crystal/, Accessed April 2024.
  • [51]. Srinivasan, M. A. (1995). Haptic Interfaces, In Virtual Reality: Scientific and Technical Challenges, Report of the Committee on Virtual Reality Research and Development.
  • [52]. V. Hayward, O.R. Astley, M. Cruz-Hernandez, D. Grant, G. Robles-De-La-Torre, Haptic interfaces and devices, Sens. Rev. 24 (2004) 16–29. https://doi.org/10.1108/02602280410515770.
  • [53]. C. Pacchierotti, D. Prattichizzo, K.J. Kuchenbecker, Cutaneous feedback of fingertip deformation and vibration for palpation in robotic surgery, IEEE Trans. Biomed. Eng. 63 (2016) 278–287. https://doi.org/10.1109/TBME.2015.2455932.
  • [54]. H. Culbertson, S.B. Schorr, A.M. Okamura, Haptics: The Present and Future of Artificial Touch Sensation, Annu. Rev. Control. Robot. Auton. Syst. 1 (2018) 385–409. https://doi.org/10.1146/annurev-control-060117-105043.
  • [55]. G. Westling and R. S. Johansson, "Responses in glabrous skin mechanoreceptors during precision grip in humans," Exp. Brain Res., vol. 66, no. 1, pp. 128–140, Mar. 1987.
  • [56]. C. Pacchierotti, A. Tirmizi, and D. Prattichizzo, "Improving transparency in teleoperation by means of cutaneous tactile force feedback," ACM Trans. Appl. Perception, vol. 11, no. 4, pp. 1–16, Apr. 2014.
  • [57]. A. Adilkhanov, M. Rubagotti, Z. Kappassov, Haptic Devices: Wearability-Based Taxonomy and Literature Review, IEEE Access. 10 (2022) 91923–91947. https://doi.org/10.1109/ACCESS.2022.3202986.
  • [58]. Robles-De-La-Torre, G. (2006). The importance of the sense of touch in virtual and real environments, IEEE Multimedia 13 (3): 24-34.
  • [59]. C. Pacchierotti, S. Sinclair, M. Solazzi, A. Frisoli, V. Hayward, D. Prattichizzo, Wearable haptic systems for the fingertip and the hand: Taxonomy, review, and perspectives, IEEE Trans. Haptics. 10 (2017) 580–600. https://doi.org/10.1109/TOH.2017.2689006.
  • [60]. G. Garcia-Valle, M. Ferre, J. Brenosa, D. Vargas, Evaluation of Presence in Virtual Environments: Haptic Vest and User’s Haptic Skills, IEEE Access. 6 (2017) 7224–7233. https://doi.org/10.1109/ACCESS.2017.2782254.
  • [61]. A. Delazio, K. Nakagaki, J.F. Lehman, R.L. Klatzky, A.P. Sample, S.E. Hudson, Force jacket: Pneumatically-actuated jacket for embodied haptic experiences, Conf. Hum. Factors Comput. Syst. - Proc. 2018-April (2018). https://doi.org/10.1145/3173574.3173894.
  • [62]. T.H. Yang, J.R. Kim, H. Jin, H. Gil, J.H. Koo, H.J. Kim, Recent Advances and Opportunities of Active Materials for Haptic Technologies in Virtual and Augmented Reality, Adv. Funct. Mater. 31 (2021). https://doi.org/10.1002/adfm.202008831.
  • [63]. D. Kang, C.G. Lee, O. Kwon, Pneumatic and acoustic suit: multimodal haptic suit for enhanced virtual reality simulation, Virtual Real. 27 (2023) 1647–1669. https://doi.org/10.1007/s10055-023-00756-5.
  • [64]. P. Lemmens, F. Crompvoets, D. Brokken, J. van den Eerenbeemd and G. -J. de Vries, A body-conforming tactile jacket to enrich movie viewing, World Haptics 2009 - Third Joint EuroHaptics conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, Salt Lake City, UT, USA, 2009, pp. 7-12, doi: 10.1109/WHC.2009.4810832.
  • [65]. R. W. Lindeman, Y. Yanagida, H. Noma, K. Hosaka, Wearable vibrotactile systems for virtual contact and information display. Virtual Reality, 9 (2006), 203–213. https://doi.org/10.1007/s10055-005-0010-6
  • [66]. S.M. Ko, K. Lee, D. Kim, Y.G. Ji, Vibrotactile perception assessment for a haptic interface on an antigravity suit, Appl. Ergon. 58 (2017) 198–207. https://doi.org/10.1016/j.apergo.2016.06.013.
  • [67]. M. Al-Sada, K. Jiang, S. Ranade, M. Kalkattawi, T. Nakajima, HapticSnakes: multi-haptic feedback wearable robots for immersive virtual reality, Virtual Real. 24 (2020) 191–209. https://doi.org/10.1007/s10055-019-00404-x.
  • [68]. Teslasuit, Teslasuit Glove, Available at: https://teslasuit.io/products/teslaglove/, Accessed April 2024.
  • [69]. Senseglove, SenseGlove Nova 2, Available at: https://www.senseglove.com/product/nova-2/, Accessed April 2024.
  • [70]. HaptX, HaptX Gloves, Available at: https://g1.haptx.com/learnabout#unrivaled-haptic-fidelity, Accessed April 2024.