The Effects of Encountered-Type Haptics on Immersion, Presence, User Experience and Task Performance in Virtual Control Panel Environments

Kiren K. Padayachee

A dissertation submitted to the Faculty of Engineering and the Built Environment, University of the Witwatersrand, Johannesburg, in fulfilment of the requirements for the degree of Master of Science in Engineering.

Johannesburg, January 2023

Declaration

I declare that this dissertation is my own, unaided work, except where otherwise acknowledged. It is being submitted for the degree of Master of Science in Engineering to the University of the Witwatersrand, Johannesburg. It has not been submitted before for any degree or examination to any other university.

Signed this 30th day of January 2023

Kiren K. Padayachee

Abstract

Virtual Reality (VR) aims to immerse users in an experience that feels real. Passive haptic feedback enhances immersion by letting users interact with real-world analogues of the virtual objects they encounter. Encountered-Type Haptic Displays (ETHDs) track users to provide this feedback dynamically, but typically cannot keep up with human movements, forcing users to slow down and degrading the experience. Four experiments were conducted with the custom-built, high-speed Fast ETHD system to quantify its effects on aspects of the VR experience.

The VR Alien Shooter Game Experiment had 25 participants shooting aliens and answering modified versions of the Presence Questionnaire (PQ) and User Experience Questionnaire (UEQ), to answer "What effects on immersion, presence and User Experience (UX) are observed in users as ETHD speed increases?" Sentiment scores improved for the PQ aspects of Possibility To Act and Quality Of Interface and the UEQ aspect of Efficiency. Game performance improved in the Total Aliens Hit, Aliens Hit by Normal and Double Shots, Aliens Hit by Laser and Big Shots, Alien Bullets Hit and Spaceship Damage categories. The UEQ aspect of Attractiveness showed no effect on scores, while a decrease was observed for Dependability. Participants had positive sentiments toward the system and experience, with average scores for the Perspicuity, Stimulation and Novelty scales of the UEQ of 1.79, 2.27 and 1.33 respectively, where the maximum of 3.00 indicates strongly positive sentiment. Users were relatively insensitive to the range of speed profiles tested. Exposing users to more repetitions per speed, on a faster rig, would aid future studies.

The VR Fitts's Test Experiment had novice users performing the ISO/TS 9241-411 multidirectional tapping task under different haptic conditions, to answer "What task performance is achieved with a high-speed ETHD and how does it compare to zero-delay conditions and previous studies?" The ETHD Haptics Condition resulted in a throughput of 2.23 bps for 27 subjects, the Table Haptics Condition 3.53 bps for 26 subjects, and the No Haptics Condition 3.70 bps for 23 subjects. The ETHD's lower throughput is attributed to the latency in the Fast ETHD system. A faster microcontroller, more optimised Computer Numeric Control (CNC) firmware and predictive models for positioning are suggested to reduce latency, along with training users to expert levels to improve throughput.

Acknowledgements

I would like to thank Dr Stephen P. Levitt and Prof. Kenneth J. Nixon for their unending guidance and support throughout this project.
This project would not have sustained its momentum without your help, from organising participants to reserving a room for experiments, even during the tough times of the Covid-19 lockdown. Thank you once again for taking me on as your student, for guiding me through this journey and for all your brilliant ideas that pushed this project to become its best version.

I would also like to thank Prof. Ivan Hofsajer for his excellent classes on writing theses and dissertations. I did not think I could ever write this much in my life and I appreciate the extra effort you put into those classes, as it paved the way for me to get this far.

To my family and friends, thank you for all your motivation and support throughout these many years. I can now finally fulfil my never-ending promises of a braai and smoked ribs "once I'm finished". To my family, Mum, Dad, Pry, Aryan, Leerusha, Vidhya and Naren, thank you all for supporting me on this lengthy journey which has turned the word master's into a swear word. I owe you all big time and I am going to make up for all the quality time sacrificed for studies. I love you all lots and I thank you all for everything you have done for me and for being my guinea pigs.

Thank you to all my colleagues from the School of Electrical and Information Engineering at the University of the Witwatersrand and the participants who took part in this study; I could not have done this without you. I enjoyed all the discussions that came up from your interactions with this project.

Finally, I would like to thank all the people that brought Virtual Reality to where it is today. This project was the push I needed to dive into this field and it was the best decision I've ever made. Besides the new skills I've gained, this project allowed me, my family, friends, work colleagues and many strangers to finally try out this cool new thing called VR. I now see the power of this technology to inspire people in completely new ways. My kids had the chance to be chased by dinosaurs, wield lightsabers, be artistic and even fish to rock music. The adults got to feel like kids again, with the video proof to show how funny they looked while doing so. I got the chance to explore the International Space Station and to play Half-Life: Alyx, two things that are extremely rare for anyone to experience. I'm glad I got the chance to contribute even a tiny bit more to this awesome technology, and I cannot wait to see what it evolves into in the future.

Contents

Declaration
Abstract
Acknowledgements
Contents
List of Figures
List of Tables
List of Symbols
List of Publications
List of Videos
Nomenclature

1 Introduction
1.1 Research Problem
1.2 Research Aim
1.3 Research Scope
1.4 Research Question
1.5 Dissertation Structure
1.6 Contributions Made
1.7 Theoretical Framework
1.7.1 Haptics
1.7.2 Immersion and Presence
1.7.3 User Experience
1.7.4 Task Performance
1.8 Conceptual Framework
1.8.1 Factor Overview
1.8.2 Factors of Interest for this Study
1.9 Significance of the study

2 Literature Review: Virtual Reality and Encountered-Type Haptic Displays
2.1 Introduction
2.2 Virtual Reality and Training
2.3 Haptics
2.4 Passive Haptics
2.5 Encountered-Type Haptic Devices
2.6 Desired Speed Capabilities of Encountered-Type Haptic Devices
2.7 Hand Tracking in Virtual Reality
2.8 Chapter Summary

3 The Fast ETHD System
3.1 Introduction
3.2 Design Requirements
3.3 System Components
3.3.1 Encountered-Type Haptic Display CNC Rig
3.3.2 Arduino Due g2core Motor Controller
3.3.3 Button Panel
3.3.4 QuestG2CoreBridge Java Application
3.3.5 Oculus Quest Headset
3.3.6 VR Test Application
3.3.7 WiFi Communications
3.3.8 Lighting
3.3.9 Bricks / Table
3.3.10 Emergency Stop Button
3.3.11 Proximity Sensors
3.3.12 Buttons and 3D Models
3.4 Costs
3.5 Design Decisions
3.5.1 Motors
3.5.2 Microcontrollers
3.5.3 Momentum
3.5.4 Button Model
3.5.5 Finger Model Accuracy
3.5.6 Wiring
3.5.7 Networking
3.6 System Features and Operation
3.6.1 System Initialisation
3.6.2 Coordinate System Alignment
3.6.3 Hand Tracking
3.6.4 Scenes
3.6.5 Encountered-Type Haptic Feedback
3.6.6 Safety Height
3.6.7 Control Schemes
3.7 System Performance
3.8 Conclusions
3.9 Chapter Summary

4 Literature Review: Immersion, Presence and User Experience
4.1 Introduction
4.2 Immersion and Presence
4.3 User Experience
4.4 Chapter Summary

5 VR Alien Shooter Game Experiment
5.1 Introduction
5.2 Experimental Design
5.2.1 Modifications to the Presence and User Experience Questionnaires
5.2.2 Measurement Considerations
5.2.3 Sample Size Estimation
5.3 Experimental Procedure
5.3.1 Equipment
5.3.2 User Interface
5.3.3 Participants
5.3.4 Test Setup
5.3.5 Haptic Feedback System Training Phase
5.3.6 Game Practise Phase
5.3.7 Playing the Game
5.3.8 Data Capturing
5.4 Data Analysis
5.4.1 Data Transformations
5.4.2 Speed Variation Analysis
5.4.3 Overall Sentiment Analysis
5.5 Conclusions
5.6 Chapter Summary

6 Literature Review: Task Performance
6.1 Introduction
6.2 Measuring Task Performance
6.3 Fitts's Law
6.4 Standards for Fitts's Law HCI studies
6.5 Fitts's Law in Virtual Environments
6.6 Chapter Summary

7 VR Fitts's Test Experiment Methodology
7.1 Introduction
7.2 Experimental Design
7.3 Equipment
7.4 User Interface
7.5 Data Capturing
7.6 Data Transformations
7.7 Sample Size Estimation
7.8 Experimental Procedure
7.9 Participants
7.10 Errors and Data Reduction
7.11 Data Analysis
7.12 Chapter Summary

8 VR Fitts's Law Experiment Results
8.1 Introduction
8.2 No Haptics Condition
8.2.1 Equipment
8.2.2 User Interface
8.2.3 Experimental Design
8.2.4 Experimental Procedure
8.2.5 Participants
8.2.6 Data Reduction
8.2.7 Data Analysis
8.2.8 Throughput Analysis
8.2.9 Movement Trajectory Analysis
8.3 Table Haptics Condition
8.3.1 Equipment
8.3.2 User Interface
8.3.3 Experimental Design
8.3.4 Experimental Procedure
8.3.5 Participants
8.3.6 Data Reduction
8.3.7 Data Analysis
8.3.8 Throughput Analysis
8.3.9 Movement Trajectory Analysis
8.4 ETHD Haptics Condition
8.4.1 Equipment
8.4.2 User Interface
8.4.3 Experimental Design
8.4.4 Experimental Procedure
8.4.5 Participants
8.4.6 Data Reduction
8.4.7 Data Analysis
8.4.8 Throughput Analysis
8.4.9 Movement Trajectory Analysis
8.5 Inter Experiment Comparison
8.5.1 Differences in experimental results
8.5.2 Difference in means for Table Haptics and ETHD Haptics Conditions
8.5.3 Difference in means for No Haptics, Table Haptics and ETHD Haptics Conditions
8.5.4 Comparisons to previous studies
8.6 Conclusions
8.7 Chapter Summary

9 Discussion
9.1 Introduction
9.2 Threats to validity
9.2.1 Construct Validity
9.2.2 Internal Validity
9.2.3 External Validity
9.3 Interpretation of Results
9.3.1 VR Alien Shooter Game Experiment
9.3.2 VR Fitts's Test Experiment
9.4 Chapter Summary

10 Conclusion
10.1 Summary of Research Findings
10.2 Contributions
10.3 Recommendations for Future Research
10.3.1 Latency
10.3.2 Safety System
10.3.3 Modifications to the PQ and UEQ
10.3.4 Hand tracking
10.3.5 Virtual Space
10.3.6 Replicating the study
10.4 Concluding Remarks

References

A Hardware Wiring
B Witmer-Singer Presence Questionnaire
C User Experience Questionnaire
D Participant Information Sheets and Consent Forms
E Speed Variation Analysis Results for the VR Alien Shooter Game Experiment
E.1 Immersion and Presence - Possibility To Act
E.2 Immersion and Presence - Quality Of Interface
E.3 User Experience - Attractiveness
E.4 User Experience - Efficiency
E.5 User Experience - Dependability
E.6 Game Performance - Total Aliens Hit
E.7 Game Performance - Aliens Hit by Normal and Double Shots
E.8 Game Performance - Aliens Hit by Laser and Big Shots
E.9 Game Performance - Alien Bullets Hit
E.10 Game Performance - Spaceship Damage
F Code Samples

List of Figures

1.1 Inputs, outputs and process of an interaction with a VR scene complemented by an Encountered-Type Haptic Display (ETHD).
1.2 A typical scenario where the user reaches out to touch a virtual object in the Virtual Environment (VE) and the ETHD positions the correct real object for haptic feedback.
3.1 The Fast ETHD system used for the experiments of this study.
3.2 System overview.
3.3 The button control panel.
3.4 The button control panel when opened.
3.5 The QuestG2CoreBridge application interface.
3.6 The VR Test application interface, with the age entry 'scene'.
3.7 The GX-H8B proximity sensor located on the left of the upper linear actuator.
3.8 Real and Three-dimensional (3D) buttons.
3.9 System initialisation process to align real and virtual world coordinate systems.
3.10 Examples of optimal (a), decent (b) and poor (c) hand tracking for different hand orientations.
3.11 A depiction of the ETHD operation.
3.12 Control diagram depicting the negative feedback loop to achieve < 0.1 mm positional accuracy.
3.13 Safety Height indications for when it is safe to click (a), when the rig is in motion (b) and when the user is prompted to raise their hand (c).
3.14 Observed speeds for each speed profile.
5.1 Sample Size Estimation for One-Way Analysis of Variance (ANOVA) between all four ETHD speed profiles
5.2 Sample Size Estimation for Relative Standard Error
5.3 The VR interface for the alien shooting game.
5.4 An example of the VR representation of a Presence Questionnaire item.
5.5 An example of the VR representation of a User Experience Questionnaire item.
5.6 Final score frequencies for User Experience Questionnaire (UEQ) scales of Perspicuity, Stimulation and Novelty.
5.7 Score averages and distributions for the UEQ scales of Perspicuity, Stimulation and Novelty
5.8 Probability density plot of average scores for Perspicuity, Stimulation and Novelty
6.1 The ISO/TS 9241-411 multidirectional tapping task. Targets are arranged in a circle and must be clicked in a predefined order from 1-9. D is the distance between targets and W is the target width.
6.2 The geometry governing a movement from one target to the next
7.1 The VR Test application user interface for the No Haptics Condition angled at 45°.
7.2 The VR Test application user interface for the Table Haptics and ETHD Haptics Conditions angled at 0°.
7.3 Sample size estimation for a t-test between the Table Haptics and ETHD Haptics Conditions.
7.4 Sample Size Estimation for a One-Way ANOVA between No Haptics, Table Haptics and ETHD Haptics Conditions.
7.5 Clicking on the control panel or a transparent button registers as a miss. The table flashes red and the application moves on to the next target to visually indicate the miss to the user.
8.1 Finger trajectory over time with no haptics.
8.2 The multidirectional tapping task application user interface for the No Haptics Condition.
8.3 Average movement times, error rates and accuracy per ID for the No Haptics Condition.
8.4 Q-Q and density plots for IDe and MT data for the No Haptics Condition
8.5 Plots on residuals for the No Haptics Condition
8.6 Average Movement Time vs Effective Index Of Difficulty IDe for the No Haptics Condition
8.7 An example of movement analysis for the No Haptics Condition.
8.8 Finger trajectory over time with passive flat haptics.
8.9 The table used for the Table Haptics Condition.
8.10 The VR Test application user interface for the Table Haptics Condition, with this calibration scene being used to align the virtual table height to the physical table height.
8.11 Average movement times, error rates and accuracy per ID for the Table Haptics Condition.
8.12 Q-Q and density plots for IDe and MT data for the Table Haptics Condition
8.13 Plots on residuals for the Table Haptics Condition
8.14 Average Movement Time vs Effective Index Of Difficulty IDe for the Table Haptics Condition
8.15 An example of movement analysis for the Table Haptics Condition.
8.16 Finger trajectory over time with ETHD button haptics
8.17 Average movement times, error rates and accuracy per ID for the ETHD Haptics Condition.
8.18 Q-Q and density plots for IDe and MT data for the ETHD Haptics Condition
8.19 Plots on residuals for the ETHD Haptics Condition
8.20 Average Movement Time vs Effective Index Of Difficulty IDe for the ETHD Haptics Condition
8.21 An example of movement analysis for the ETHD Haptics Condition.
8.22 Q-Q and density plots for throughput TP across all conditions
8.23 Density plots for throughput TP across all conditions
8.24 Throughput per device setup
E.1 Possibility To Act score frequencies for each speed profile
E.2 Score averages and distributions for the Possibility To Act from the Presence Questionnaire (PQ)
E.3 Probabilities of Possibility To Act average scores for each speed profile
E.4 Quality Of Interface score frequencies for each speed profile
E.5 Score averages and distributions for the Quality Of Interface from the PQ
E.6 Probabilities of Quality Of Interface average scores for each speed profile
E.7 Attractiveness score frequencies for each speed profile
E.8 Score averages and distributions for the Attractiveness from the UEQ
E.9 Probabilities of Attractiveness average scores for each speed profile
E.10 Efficiency score frequencies for each speed profile
E.11 Score averages and distributions for the Efficiency from the UEQ
E.12 Probabilities of Efficiency average scores for each speed profile
E.13 Dependability score frequencies for each speed profile
E.14 Score averages and distributions for the Dependability from the UEQ
E.15 Probabilities of Dependability average scores for each speed profile
E.16 Total Aliens Hit score frequencies for each speed profile
E.17 Score averages and distributions for Total Aliens Hit
E.18 Probabilities of Total Aliens Hit average scores for each speed profile
E.19 Aliens Hit by Normal and Double Shots score frequencies for each speed profile
E.20 Score averages and distributions for Aliens Hit by Normal and Double Shots
E.21 The percentage of players who were able to achieve the Double Shot upgrade after 10 aliens were shot down.
E.22 Probabilities of Aliens Hit by Normal and Double Shots average scores for each speed profile
E.23 Aliens Hit by Laser and Big Shots score frequencies for each speed profile
E.24 Score averages and distributions for Aliens Hit by Laser and Big Shots
E.25 The percentage of players who were able to achieve the Laser and Big Shot upgrades after 20 aliens were shot down.
E.26 Probabilities of Aliens Hit by Laser and Big Shots average scores for each speed profile
E.27 Alien Bullets Hit score frequencies for each speed profile
E.28 Score averages and distributions for Alien Bullets Hit
E.29 Probabilities of Alien Bullets Hit average scores for each speed profile
E.30 Spaceship Damage score frequencies for each speed profile
E.31 Score averages and distributions for Spaceship Damage
E.32 Probabilities of Spaceship Damage average scores for each speed profile

List of Tables

2.1 Summary of literature review
3.1 Summary of system costs.
3.2 Target, average and max speeds and standard deviation results for 10%, 40%, 70% and 100% ETHD speed profiles
5.1 The modified Presence Questionnaire (PQ version 3.0) used in the VR Alien Shooter Game Experiment.
5.2 The modified User Experience Questionnaire used in the VR Alien Shooter Game Experiment.
5.3 The design for levels of ETHD speed profile (% of max) resulting in the dependent variables for Immersion, Presence and User Experience scores.
5.4 The Randomised Complete Block Design for treatments of ETHD Speed over all trials. This table shows an example of the randomised order across three different users.
5.5 Summary of Speed Variation Analysis Results
5.6 Mean scores and standard deviation results for Perspicuity, Stimulation and Novelty
5.7 Summary of Results
7.1 The j × k factorial design for levels of D and W resulting in the dependent variables for movement time, error rate and throughput.
7.2 The simplified single factor design for ID treatments arising from D and W levels.
7.3 The Randomised Block Design used for the different configurations of ID.
7.4 Conditions leading to accept or reject the null hypothesis H0.
8.1 Circle configurations of the multidirectional tapping task for the No Haptics Condition
8.2 Metrics per ID before data cleaning for the No Haptics Condition
8.3 Circle configurations for the Table Haptics Condition
8.4 Metrics per ID before data cleaning for the Table Haptics Condition
8.5 Metrics per ID before data cleaning for the ETHD Haptics Condition
8.6 Mean and standard deviation results for No Haptics, Table Haptics and ETHD Haptics Conditions
8.7 Multiple pairwise-comparison results from Tukey's Honest Significant Differences test for No Haptics, Table Haptics and ETHD Haptics Conditions throughput data
8.8 Throughput per device setup
E.1 Mean scores and standard deviation results for 10%, 40%, 70% and 100% ETHD speed profiles on Possibility To Act
E.2 Multiple pairwise-comparison results from Dunn's test for 10%, 40%, 70% and 100% ETHD speed profiles on Possibility To Act
E.3 Ordinal Logistic Regression (OLR) coefficients for 40%, 70% and 100% speed profiles for Possibility To Act (in relation to the 10% profile)
E.4 Mean scores and standard deviation results for 10%, 40%, 70% and 100% ETHD speed profiles on Quality Of Interface
E.5 Multiple pairwise-comparison results from Dunn's test for 10%, 40%, 70% and 100% ETHD speed profiles on Quality Of Interface
E.6 OLR coefficients for 40%, 70% and 100% speed profiles for Quality Of Interface (in relation to the 10% profile)
E.7 Mean scores and standard deviation results for 10%, 40%, 70% and 100% ETHD speed profiles on Attractiveness
E.8 Multiple pairwise-comparison results from Dunn's test for 10%, 40%, 70% and 100% ETHD speed profiles on Attractiveness
E.9 OLR coefficients for 40%, 70% and 100% speed profiles for Attractiveness (in relation to the 10% profile)
E.10 Mean scores and standard deviation results for 10%, 40%, 70% and 100% ETHD speed profiles on Efficiency
E.11 Multiple pairwise-comparison results from Dunn's test for 10%, 40%, 70% and 100% ETHD speed profiles on Efficiency
E.12 OLR coefficients for 40%, 70% and 100% speed profiles for Efficiency (in relation to the 10% profile)
E.13 Mean scores and standard deviation results for 10%, 40%, 70% and 100% ETHD speed profiles on Dependability
E.14 Multiple pairwise-comparison results from Dunn's test for 10%, 40%, 70% and 100% ETHD speed profiles on Dependability
E.15 OLR coefficients for 40%, 70% and 100% speed profiles for Dependability (in relation to the 10% profile)
E.16 Mean scores and standard deviation results for 10%, 40%, 70% and 100% ETHD speed profiles on Total Aliens Hit
E.17 Multiple pairwise-comparison results from Dunn's test for 10%, 40%, 70% and 100% ETHD speed profiles on Total Aliens Hit
E.18 OLR coefficients for 40%, 70% and 100% speed profiles for Total Aliens Hit (in relation to the 10% profile)
E.19 Mean scores and standard deviation results for 10%, 40%, 70% and 100% ETHD speed profiles on Aliens Hit by Normal and Double Shots
E.20 Multiple pairwise-comparison results from Dunn's test for 10%, 40%, 70% and 100% ETHD speed profiles on Aliens Hit by Normal and Double Shots
E.21 OLR coefficients for 40%, 70% and 100% speed profiles for Aliens Hit by Normal and Double Shots (in relation to the 10% profile)
E.22 Mean scores and standard deviation results for 10%, 40%, 70% and 100% ETHD speed profiles on Aliens Hit by Laser and Big Shots
E.23 Multiple pairwise-comparison results from Dunn's test for 10%, 40%, 70% and 100% ETHD speed profiles on Aliens Hit by Laser and Big Shots
E.24 OLR coefficients for 40%, 70% and 100% speed profiles for Aliens Hit by Laser and Big Shots (in relation to the 10% profile)
E.25 Mean scores and standard deviation results for 10%, 40%, 70% and 100% ETHD speed profiles on Alien Bullets Hit
E.26 Multiple pairwise-comparison results from Dunn's test for 10%, 40%, 70% and 100% ETHD speed profiles on Alien Bullets Hit
E.27 OLR coefficients for 40%, 70% and 100% speed profiles for Alien Bullets Hit (in relation to the 10% profile)
E.28 Mean scores and standard deviation results for 10%, 40%, 70% and 100% ETHD speed profiles on Spaceship Damage
E.29 Multiple pairwise-comparison results from Dunn's test for 10%, 40%, 70% and 100% ETHD speed profiles on Spaceship Damage
E.30 OLR coefficients for 40%, 70% and 100% speed profiles for Spaceship Damage (in relation to the 10% profile)

List of Symbols

The principal symbols used in this dissertation are summarised below, together with the first equation in which each symbol appears.

f          Cohen's f (effect size for One-Way ANOVA), Equation (5.1)
p_i        n_i / N, Equation (5.1)
n_i        Number of observations in group i, Equation (5.1)
N          Total number of observations, Equation (5.1)
µ_i        Mean of group i, Equation (5.1)
µ          Grand mean, Equation (5.1)
σ²         Error variance within groups, Equation (5.1)
f²         Effect size, Equation (7.1)
R²         Goodness of fit, Equation (7.1)
R          Correlation coefficient, Equation (7.1)
m_1        Mean of group 1, Equation (7.2)
m_2        Mean of group 2, Equation (7.2)
σ_1        Standard deviation of group 1, Equation (7.2)
σ_2        Standard deviation of group 2, Equation (7.2)
|m_2 − m_1| Delta of means, Equation (7.3)
d          Effect size, Equation (7.3)
σ          Standard deviation, Equation (7.3)
p_i        n_i / N, Equation (7.4)
n_i        Number of observations in group i, Equation (7.4)
N          Total number of observations, Equation (7.4)
µ_i        Mean of group i, Equation (7.4)
µ          Grand mean, Equation (7.4)
σ²         Error variance within groups, Equation (7.4)

List of Publications

1. K. K. Padayachee, S. P. Levitt, and K. J. Nixon, "Determining Human Hand Performance with the Oculus Quest in Virtual Reality Using Fitts's Law," in SAICSIT 2021: Proceedings of the South African Institute of Computer Scientists and Information Technologists, September 2021, pp. 1–24. Pretoria: Unisa Press and Unisa School of Computing, October 2021.
https://uir.unisa.ac.za/bitstream/handle/10500/28972/SAICSIT%202021%20PROCEEDINGS.pdf

List of Videos

1. Encountered-Type Haptics for the Oculus Quest - Control Schemes: https://www.youtube.com/watch?v=nQJ4EV1IXak
2. Encountered-Type Haptics for the Oculus Quest - Detailed Explanation: https://www.youtube.com/watch?v=LxSI_9MnHPg

Nomenclature

Acronyms are defined at their first occurrence in each chapter.

2D       Two-dimensional
3D       Three-dimensional
ANOVA    Analysis of Variance
CAVE     Cave Automatic Virtual Environment
CNC      Computer Numeric Control
DAQ      Device Assessment Questionnaire
DOF      Degrees of Freedom
ETHD     Encountered-Type Haptic Display
HCI      Human-Computer Interaction
HMD      Head-Mounted Display
IPUXTP   Immersion, Presence, User Experience and Task Performance
JSON     JavaScript Object Notation
LED      Light-Emitting Diode
MRF      Magnetorheological Fluids
OLR      Ordinal Logistic Regression
PQ       Presence Questionnaire
RSD      Robotic Shape Display
RSE      Relative Standard Error
SUS      Slater-Usoh-Steed
UEQ      User Experience Questionnaire
UAV      Unmanned Air Vehicle
UDP      User Datagram Protocol
UX       User Experience
VE       Virtual Environment
VET      Virtual Experience Test
VR       Virtual Reality
VRFS     Virtual Reality Flight Simulator
WYSIWYF  What You can See Is What You can Feel

Chapter 1

Introduction

The importance of Virtual Reality (VR) systems and the benefits they bring to humanity are becoming increasingly evident in the modern world. VR technology is advancing swiftly, and while many are familiar with its entertainment value, it is also often used in training applications across a variety of disciplines, including training for medical procedures, construction, Computer Numeric Control (CNC) milling machines, mining, aircraft and space missions [1–7]. The quality of the training from these applications, and the resulting task performance, is affected by the degree of immersion and presence provided by the virtual training environment [8, 9].

Immersion is the extent to which a computer display can deliver the illusion of reality to the user's senses, with a highly immersed user not questioning whether their Virtual Environment (VE) is real or fake [10, 11]. Presence refers to a user's sense of belonging in the VE, with a highly present user being more engaged with, and concerned about, the virtual world they are in rather than the physical world surrounding them. The ideal VE provides a user with a high level of immersion and a strong sense of presence to create the perception that they are part of the virtual world [10]. This is achieved by stimulating the user's senses with high-quality information from the VR devices used. While the audiovisual aspect of VR systems has become increasingly mature, the touch component has been researched using many strategies but is yet to reach the level required for full mental immersion. Haptics refers to the ability to perceive the environment via the sense of touch, with an Encountered-Type Haptic Display (ETHD) being one of the approaches that provides high-quality haptic feedback in VR [12, 13].
Users interacting with a VE assume that the laws of physics in the real world still apply; their state of immersion is immediately broken when they reach out to touch a virtual object, only to find that their hand moves directly through it [11]. Haptic feedback can be provided by vibration, movement restriction, air vortices, and magnetic forces of repulsion and attraction, while acoustic approaches allow users to feel the presence of virtual objects even though their hands still pass through them [14–18]. Passive haptics is a specific type of haptic feedback that uses real-world replicas of virtual objects in the VE to provide the realistic, solid feel that the user expects [19]. These physical replicas allow the user to feel real properties of an object, such as position, orientation, shape, texture, weight, temperature and rigidity. ETHDs improve haptic feedback by using devices with a wide range of motion to track the user's movement and position passive haptic objects so that they correspond to their virtual representations in the VE [13]. The device moves to meet the user's interaction with a virtual object in order to provide haptic feedback. ETHDs come in many forms and have a variety of uses, including as industrial prototyping tools and for remote tele-operation activities such as surgical procedures [20]. Virtual experiences can be made more realistic, as ETHDs allow users to interact physically with the virtual objects that they see in the VE. When this added realism is applied to real-world scenarios, ETHDs are useful aids in training simulations. More realistic simulations leave trainees better equipped to handle real-life situations, such as medical procedures and mining operations, compared to other methods such as manuals and training videos [21]. These types of realistic training simulations are reminiscent of the iconic scene from the science fiction motion picture The Matrix [22], in which the protagonist, Neo, is plugged into a VR simulation via a neural interface to learn martial arts. After a few seconds, Neo exclaims "I know Kung Fu." While current technology is certainly far from this vision of the future, the use of VR and ETHDs in training applications provides improvements in the level of knowledge passed on to trainees. These visions of future technology depicted in works of science fiction stoke the imagination, which drives us to push technology forward to provide more realistic experiences.

1.1 Research Problem

Latency in a system refers to the delay between an input and the desired response [23]. In the case of ETHDs, latency comprises the processing delay and the time taken for the device to move from receiving a new positioning instruction to coming to rest at the new position. Purely passive haptic feedback objects provide zero-latency interactions, as they are not in motion: when a user touches one of these objects, the interaction is immediate and realistic. ETHDs, being mechanical devices, cannot provide zero-latency interactions, and need to reposition objects rapidly in 3D space with as little latency as possible to provide the sensation of interacting with a solid, immovable virtual object [24]. A wide space in which to position these objects corresponds to a large virtual space to explore, and ETHDs provide the touch feedback needed for immersive VR, making them an attractive solution.
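To make the latency definition above concrete, the sketch below (not code from this study) shows one way command-to-rest latency could be measured for an ETHD: time-stamp the moment a new position instruction is issued and the moment the actuators report that they have settled. The RigInterface methods are hypothetical placeholders; the actual Fast ETHD system described in Chapter 3 communicates with a g2core motor controller, so the real interface, protocol and units differ.

```java
// Minimal sketch: measuring ETHD command-to-rest latency.
// RigInterface is a hypothetical stand-in; the real system talks to a
// g2core motor controller, so method names and behaviour are assumptions.
public class LatencyProbe {

    interface RigInterface {
        void sendMoveCommand(double xMillimetres, double yMillimetres);
        boolean isMoving(); // true until the actuators have settled
    }

    // Returns the delay, in milliseconds, between issuing a new position
    // instruction and the display coming to rest at that position.
    static double measureLatencyMs(RigInterface rig, double x, double y)
            throws InterruptedException {
        long start = System.nanoTime();
        rig.sendMoveCommand(x, y);   // processing delay begins here
        while (rig.isMoving()) {     // poll until motion has stopped
            Thread.sleep(1);
        }
        long end = System.nanoTime();
        return (end - start) / 1.0e6;
    }
}
```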
However, ETHD capabilities vary widely. Only industrial-quality robot arms are capable of fast movement, have the strong counterforces required to simulate solid virtual objects and have the wide reach needed for an effective display, but their cost makes them unaffordable for consumer applications. Smaller robot arms have far slower speeds, shorter reach and smaller counterforces, such that a user pushing on one would move the device itself. Yokokohji et al. developed a VR system with an ETHD that worked as intended, but noted that the Puma 560 robot used as the display was not able to keep up with users' quick and erratic movements [25]. Gruenbaum et al. implemented an ETHD system using the four Degrees of Freedom (DOF) Adept 1 Manipulator robot arm and stated that their system was not adequate for VR training, as users were easily able to move faster than it could track [24]. Araujo et al. developed the Snake Charmer ETHD system, which used the small Robai Cyton Gamma 300 robot arm [26]. As a commodity robot arm, it had limited velocity and spatial resolution, and was unable to provide the necessary counterforce when users pushed on it to simulate solid, immovable objects. Other devices used as ETHDs, such as aerial drones, share the same trade-offs [27]. Creating a good ETHD system is therefore a balance between tracking speed, counterforce capability, reach and cost.

One of the most common issues with ETHD systems is slow tracking of user movements, with users needing to wait for the device to reach their position or being forced to match the slower speed of the device. This forced, unnatural movement results in reduced immersion and presence and an overall poor experience [24]. If the user is required to perform tasks within the VE, as in training applications, the reduction in quality of these factors results in reduced task performance.

There are many studies on VEs, VR and haptics that measure Immersion, Presence, User Experience and Task Performance (IPUXTP) to evaluate the effectiveness of virtual experiences [8, 9, 28–37]. UX refers to a person's perceptions and responses from using a product, system or service [38], while user peripherals are evaluated on task performance to determine their usefulness to the user [39, 40]. This task performance is expressed as throughput, a metric that combines a user's speed and accuracy with a device (the standard computation is sketched below). ETHD studies, by contrast, have focused on measuring the physical capabilities of the devices, such as force feedback, speed, latency and tracking accuracy [24–27, 41]; studies applying these user-centred measurements to investigate the effects of ETHDs on IPUXTP in VEs are lacking.

1.2 Research Aim

ETHDs can be improved to provide better experiences by understanding the relationship between ETHD speed and IPUXTP, with empirical data aiding the design of new displays and the comparison and improvement of existing designs. Measuring the user's experience is therefore vital to determine how effective an ETHD is. The relationships between the physical capabilities of ETHDs and the user's perception and interactions in the virtual world need to be examined to address the gap in the literature regarding these relationships.
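As a point of reference for how throughput is typically computed (a generic illustration, not code from this dissertation), the ISO 9241-9 / ISO/TS 9241-411 approach derives an effective index of difficulty from the observed endpoint scatter and divides it by the mean movement time. The sketch below follows the widely used Soukoreff and MacKenzie convention, where the effective width is 4.133 times the standard deviation of the endpoint deviations along the task axis; the trial values in main are made up purely for illustration, and a grand throughput is then typically taken as the mean of such per-condition values across participants.

```java
// Generic sketch of ISO 9241-9 style throughput for one D-W condition.
// Inputs are per-trial movement distances (mm), signed endpoint
// deviations along the task axis (mm) and movement times (s).
public class FittsThroughput {

    static double mean(double[] v) {
        double s = 0;
        for (double x : v) s += x;
        return s / v.length;
    }

    static double stdDev(double[] v) {
        double m = mean(v), s = 0;
        for (double x : v) s += (x - m) * (x - m);
        return Math.sqrt(s / (v.length - 1));
    }

    // Throughput in bits per second for a single condition.
    static double throughput(double[] distances, double[] deviations, double[] times) {
        double de = mean(distances);                            // effective distance
        double we = 4.133 * stdDev(deviations);                 // effective width
        double ide = Math.log(de / we + 1.0) / Math.log(2.0);   // effective ID in bits
        double mt = mean(times);                                // mean movement time (s)
        return ide / mt;
    }

    public static void main(String[] args) {
        // Hypothetical trial data for illustration only.
        double[] d  = {148.2, 152.7, 150.1, 149.4};
        double[] dx = {2.1, -3.4, 1.2, -0.8};
        double[] t  = {0.82, 0.79, 0.91, 0.85};
        System.out.printf("TP = %.2f bps%n", throughput(d, dx, t));
    }
}
```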
The premise of this study is that ETHD speed directly influences the IPUXTP of users in VEs, and that empirical data on the speed required to match hand movement for nominal IPUXTP is needed to build optimal displays. The aims of this study are:

• as a primary goal, to investigate the effects of ETHD speed on IPUXTP in VEs and,

• as a secondary goal, to design and build a low-cost, grounded, fixed-platform ETHD with a focus on speed and user safety.

Statistical analysis was used to determine whether users were able to perceive differences in IPUXTP in VEs for varying speed profiles of the ETHD, with the results potentially contributing to the advancement of ETHD technology to improve the VR experience.

1.3 Research Scope

An ideal ETHD is able to provide users with interactions that are indistinguishable from those experienced in real life. There would be zero latency for haptic feedback, the user would be able to completely explore an object's shape, and could potentially manipulate an object's position and orientation. They would also be able to explore a virtual space of unlimited size in 3D with both hands or even their entire body. Some limitations were implemented to simplify the study:

• interactions are limited to the index finger of the user's dominant hand;
• interactions are limited to 2D space;
• objects in the scene are fixed in position and orientation; and
• the user can explore the objects' shapes but cannot move them around.

The ETHD developed was intended as a proof-of-concept that attempted to minimise latency and maximise user safety within the budget constraints of the project.

This study required human volunteers, with the number of participants determined using power analyses and reviewed for feasibility with regard to time. Two tests were conducted: the VR Alien Shooter Game Experiment had 25 participants, while the VR Fitts's Test Experiment had 27 participants for the ETHD Haptics Condition sub-experiment, 26 for the Table Haptics Condition and 23 for the No Haptics Condition.

1.4 Research Question

The research question is:

What is the effect of Encountered-Type Haptic Display speed on Immersion, Presence, User Experience and Task Performance in VEs?

The main research question can be broken down as follows:

RQ1 What effects on immersion, presence and UX are observed in users as ETHD speed increases?

RQ2 What task performance is achieved with a high-speed ETHD and how does it compare to zero-delay conditions and previous studies?

Two experiments were conducted to answer these questions: the VR Alien Shooter Game Experiment was conducted to answer RQ1, and the VR Fitts's Test Experiment was conducted to answer RQ2. The Fast ETHD system that was custom built for these experiments consisted of a high-speed hardware ETHD and VR software implementations of the ISO/TS 9241-411 multidirectional tapping task and a game reminiscent of the classic arcade shooter Space Invaders [42]. The jerk and acceleration configurations of the ETHD were kept constant to ensure that the focus of the research questions was on its speed profile configurations, which were set as percentages of its maximum speed.
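To illustrate what a speed profile expressed as a percentage of maximum speed means in practice, the sketch below builds a standard G-code linear move whose feed rate is that fraction of a maximum traverse rate, leaving jerk and acceleration settings untouched. The maximum feed value here is a placeholder, and the exact command set, units and configuration accepted by the rig's g2core controller are described in Chapter 3, so treat this only as an assumption-laden sketch.

```java
import java.util.Locale;

// Sketch: expressing an ETHD speed profile as a percentage of maximum speed.
// MAX_FEED_MM_PER_MIN is a placeholder value, not the real rig's maximum.
public class SpeedProfile {

    static final double MAX_FEED_MM_PER_MIN = 30000.0; // hypothetical maximum

    // Builds a standard G-code linear move (G1) whose feed rate is the given
    // percentage of the maximum. Jerk and acceleration settings are left at
    // their controller defaults so that only speed varies between profiles.
    static String moveCommand(double xMm, double yMm, double profilePercent) {
        double feed = MAX_FEED_MM_PER_MIN * (profilePercent / 100.0);
        return String.format(Locale.ROOT, "G1 X%.3f Y%.3f F%.0f", xMm, yMm, feed);
    }

    public static void main(String[] args) {
        for (double p : new double[] {10, 40, 70, 100}) {
            System.out.println(p + "% profile -> " + moveCommand(120.0, 85.0, p));
        }
    }
}
```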
Videos

Videos on these experiments can be viewed at the following URLs:

Encountered-Type Haptics for the Oculus Quest - Control Schemes
https://www.youtube.com/watch?v=nQJ4EV1IXak

Encountered-Type Haptics for the Oculus Quest - Detailed Explanation
https://www.youtube.com/watch?v=LxSI_9MnHPg

1.5 Dissertation Structure

This dissertation is structured in the following manner:

• Chapter 2. Literature Review: Virtual Reality and Encountered-Type Haptic Displays: A literature review is presented on VR technology, its use in training simulations and haptic feedback for VR. It also focuses on passive haptics and a special case of passive haptics called encountered-type haptics, which makes use of ETHDs. Research is presented on the speed capabilities required of ETHDs to keep up with human hand movements.

• Chapter 3. The Fast ETHD System: The design of the Fast ETHD system that supports the two experiments is presented, with the requirements and design decisions being detailed, as well as the obstacles encountered during the development phase. The system features and operation are given, followed by quantitative performance observations.

• Chapter 4. Literature Review: Immersion, Presence and User Experience: A literature review is presented on the concepts of immersion, presence and UX experienced by users in VEs, with existing tools for evaluating these concepts quantitatively being discussed. The Presence Questionnaire and User Experience Questionnaire were the tools chosen to measure these concepts, their selection being based on their construct validity and internal consistency.

• Chapter 5. VR Alien Shooter Game Experiment: A game developed for this study, similar to the classic arcade shooter Space Invaders, is presented to answer RQ1. Participants were required to play this game while receiving haptic feedback from the Fast ETHD system. The ETHD speed was varied as the treatment, and participants answered modified versions of the PQ and UEQ for each speed profile. The results were used to determine the effect that ETHD speed has on immersion, presence and UX. The effect on game performance was also analysed. The experimental design, setup, methodology and procedures are presented, followed by the results, analysis and discussion of the observations.

• Chapter 6. Literature Review: Task Performance: A literature review on human performance using Human-Computer Interaction (HCI) devices is presented. The focus is on Fitts's Law, which states that the time taken to move a pointer to a target is determined by the distance to the target and the size of the target. The ISO 9241-9 and ISO/TS 9241-411 multidirectional tapping tasks used to verify Fitts's Law are discussed, along with previous studies that used these tests to determine the throughput of devices, with a focus on Fitts's Law studies in VEs.

• Chapter 7. VR Fitts's Test Experiment Methodology: A second experiment was designed to answer RQ2, the VR Fitts's Test Experiment making use of the ISO/TS 9241-411 multidirectional tapping task in a VE, with the Fast ETHD system providing haptic feedback. Throughput using Fitts's Law is determined for different haptic conditions.
The experimental methodology, equipment, software, user interfaces, data capturing, data transformations, sample size estimations, participant selection, experimental procedures and data analysis applicable to each of these haptic conditions are provided.

• Chapter 8. VR Fitts's Test Experiment Results: Three haptic conditions were used as treatments for the VR Fitts's Test Experiment: the No Haptics, Table Haptics and ETHD Haptics Conditions. The methodology discussed in Chapter 7 is applied to each of these conditions, with the final results and discussion presented for each haptic condition, as well as their respective final grand throughputs. Comparisons between the three conditions, as well as to other devices and VE setups from previous Fitts's Law studies, are discussed.

• Chapter 9. Discussion: Threats to the construct, internal and external validity of both experiments are discussed, along with a high-level interpretation of the experimental results.

• Chapter 10. Conclusion: This chapter reviews the contributions made by this study, provides recommendations for future research, and makes final concluding remarks on the state of VR, as well as the reasons it is necessary and beneficial to continue developing this technology along with haptic feedback technology.

1.6 Contributions Made

The following contributions have resulted from this study:

• The design of the 2D tabletop Fast ETHD system that provides encountered-type haptic feedback in a VE, with a focus on speed and user safety.

• At the time of writing, this appears to be the first study that investigates how the speed of an ETHD affects user immersion, presence and UX.

• At the time of writing, this also appears to be the first study that aims to evaluate user task performance using an ETHD. Fitts's Law and the ISO/TS 9241-411 multidirectional tapping task are used to determine the throughput of the Fast ETHD, which is compared to the zero-latency haptic conditions and other VE setups with haptic feedback from previous studies.

• A paper titled Determining Human Hand Performance with the Oculus Quest in Virtual Reality using Fitts's Law was submitted and accepted for the SAICSIT 2021 conference [43]. This paper presented the results of a preliminary study on the hand tracking feature of the Oculus Quest, the feature's throughput being evaluated using the ISO/TS 9241-411 multidirectional tapping task.

A more detailed account of these contributions can be found in Section 10.2.

1.7 Theoretical Framework

VR is effective at providing the user with a good experience when it delivers high-quality information to their senses, an important component of which is touch. There are many methods of integrating the haptic component of touch into the VR experience. A good VR experience presents users with a high level of immersion, a strong sense of presence and an overall excellent UX, which will, in turn, result in better task performance within the VE. These concepts are explored briefly here, with more detailed literature reviews presented in Chapters 2, 4 and 6. The theoretical framework presented takes guidance from Osanloo and Grant [44].

1.7.1 Haptics

Many solutions have been established to enable haptics in the VR experience, each with its own advantages and disadvantages.
There is currently no perfect haptic technology capable of completely fooling a user immersed in a VE into believing that they are able to physically interact with virtual objects while still having the natural sensations they are accustomed to in real-world interactions. The most common haptic technologies available are severely limited in providing realistic interactions with virtual objects. VR gloves, the most common haptic technology for VR, even with vibrational capabilities or movement restriction techniques, merely alert users that they are colliding with a virtual object; when users attempt to push through the object in the VE, they move directly through it, breaking the immersion. Passive haptics remedies this problem by providing users with real-world replicas of virtual objects to interact with, which allow the user to feel real-world properties of an object such as position, orientation, shape, texture, weight, temperature and rigidity [19]. However, passive haptics is limited because it can only provide a fixed number of objects in the VE.

McNeely noted the problem of VR users passing through virtual objects and the need to provide realistic feedback through "solid" virtual objects. McNeely proposed that an external robot arm, unattached to the user's body, could provide users with the necessary forces when interacting with virtual objects on a just-in-time basis, a concept termed Robotic Graphics [13]. Robotic Shape Displays (RSDs) would employ these robot arms to track the user's arm position and move to the corresponding real-world position of a virtual object that the system anticipates the user will reach out to. The robot arm then waits at this position until the user encounters it, hence the term Encountered-Type Haptics or Robotic Graphics, which brings a much-needed dynamic component to VR to provide a more realistic experience [45].

A common flaw in previous ETHD designs using robot arms is that they were not able to track the user's erratic motions quickly enough [24–26, 46]. This problem of the device not following the user fast enough forces the user to move more slowly and unnaturally so that the device can catch up [24]. This is not ideal, as maximum immersion in a VE requires the user to be able to move naturally without having to focus on their movements. A detailed literature review on VR, haptics and ETHDs is presented in Chapter 2.

1.7.2 Immersion and Presence

Immersion in a VE is defined as the extent to which a computer display can deliver an inclusive, extensive, surrounding and vivid illusion of reality to the user's senses. Presence refers to the state of consciousness of the user as a feeling of being a part of the VE [10]. Mihelj and Podobnik state that mental immersion requires the user to be preoccupied with their existence within the VE to such an extent that they stop questioning whether it is real or fake [11].

Two measurement tools frequently cited in VR literature are the Slater-Usoh-Steed (SUS) Questionnaire by Slater et al. [35] and the Presence Questionnaire (PQ) by Witmer and Singer [32]. The SUS Questionnaire focuses on the psychological aspect of presence and has questions that evaluate users' subjective experience of presence in VEs. The SUS Questionnaire queries the user about their sense of being a part of the virtual world, to the extent that the virtual world became their 'new reality'.
The PQ measures the degree to which users experience presence in a VE, with both questionnaires having been used in many VR studies to measure the component aspects of immersion and presence [8, 9, 33]. A detailed literature review on immersion and presence, as well as the SUS Questionnaire and PQ measurement tools, is presented in Chapter 4.

1.7.3 User Experience

The definition of UX according to the ISO 9241-210 specification is "a person's perceptions and responses resulting from the use and/or anticipated use of a product, system or service" [38]. Providing a good UX is important to ensure positive responses from users. Being able to measure UX quantitatively in an efficient and reliable manner is therefore important when evaluating a product, system or service, with the data being used to make improvements if required. Questionnaires have been developed to evaluate UX quantitatively, with Laugwitz et al. having developed the User Experience Questionnaire (UEQ), whose questions are grouped into the categories of attractiveness, perspicuity, dependability, efficiency, novelty and stimulation [37]. Chertoff et al. developed the Virtual Experience Test (VET), which measures the holistic experiences of users in VEs based on five elements of experience: sensory, cognitive, affective, active (personal) and relational (social) [47]. A detailed literature review on UX, as well as the UEQ and VET measurement tools, is presented in Chapter 4.

1.7.4 Task Performance

Measuring task performance quantitatively is one way to effectively evaluate the usefulness of a device. The ISO 9241-9 specification and its follow-up, ISO/TS 9241-411, focus on evaluating interactive computer pointing devices, such as mice and joysticks, by measuring task performance when using these devices [40, 48]. The ISO/TS 9241-411 2D multidirectional tapping task is a test that makes use of Fitts's Law to quantitatively evaluate task performance on the metric of throughput. Fitts's Law is a model that is used to predict human movement towards a target and has been widely used in HCI research. Specifically, it states that the time taken to move towards a targeted area is related to the size of the target and the distance to it. The ISO/TS 9241-411 2D multidirectional tapping task has been widely used to evaluate pointing devices and tasks. A detailed literature review on task performance, as well as the ISO/TS 9241-411 multidirectional tapping task, is presented in Chapter 6.

1.8 Conceptual Framework

The conceptual framework presented takes guidance from Maxwell, with the process of a user interacting with a VE complemented by an ETHD being depicted in Figure 1.1 [49]. The user reaches out to touch a virtual object in the VE and the ETHD positions a real-world analogue in the corresponding position for the user to physically feel, as depicted in Figure 1.2. The ETHD needs to move fast enough to position the real-world object for the user to encounter, so as to prevent a noticeable delay. Therefore, the speed of the ETHD has an effect on the quality of the haptic feedback delivered. The audiovisual feedback from the VR headset worn by the user, combined with the haptic feedback component delivered by the ETHD, directly affects the immersion, presence and UX experienced by the user in the VE. If the user is performing a task within the VE, as is normally done in games and training applications, the task performance is affected by the previous components in the process flow.
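Since task performance in this framework is ultimately quantified as Fitts's Law throughput, a sketch of the standard Shannon formulation, as typically applied with the ISO 9241-9 family of tests, is given here for reference; the coefficients $a$ and $b$ are fitted empirically, $D$ is the movement distance to the target, $W$ is the target width, and the subscript $e$ denotes "effective" values computed from the observed selection endpoints:

\[
ID = \log_2\left(\frac{D}{W} + 1\right), \qquad MT = a + b \cdot ID, \qquad TP = \frac{ID_e}{MT}, \qquad ID_e = \log_2\left(\frac{D_e}{W_e} + 1\right),
\]

where $W_e = 4.133\,\sigma$, with $\sigma$ the standard deviation of the selection endpoints along the task axis, and $D_e$ the mean observed movement distance. The detailed treatment, including how these quantities are computed for the VR Fitts's Test Experiment, is presented in Chapter 6.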
Figure 1.1: Inputs, outputs and process of an interaction with a VR scene complemented by an ETHD.

Figure 1.2: A typical scenario where the user reaches out to touch a virtual object in the VE and the ETHD positions the correct real object for haptic feedback.

1.8.1 Factor Overview

Starting with the user, the touch interaction is the first input factor in the process, being multifaceted in that it is affected by many variables. Large variations between users can occur in terms of perception of visual and touch feedback as well as hand-eye coordination abilities, with age, health, and mental and physical abilities affecting hand movement speed. Previous experience with computers, VR and video games would have an effect on how comfortable users are in interacting with VEs and performing tasks within them. Owing to the large variability affecting touch interaction, these variables will be considered confounding variables, with an attempt to reduce them, as well as learning and order effects, being made in this study. To reduce these effects, participants were trained to use the equipment and allowed to use it freely for a short while to become comfortable with it, with randomisation of the treatment order being implemented to further reduce the effects of confounding variables. Breaks were held intermittently during the experiments to prevent fatigue, which would otherwise affect the results [50].

The audiovisual quality of the virtual scene has a direct impact on the experience of the user, with more realistic graphics and sounds improving the experience in the virtual world. A high-quality display to impart this information was necessary to provide the best experience; this study therefore aimed to produce a simple, high-quality virtual scene presented on a high-quality VR headset, with their quality remaining constant and thus not regarded as a factor.

The quality of the touch feedback delivered by the ETHD is determined by its speed in responding to changes in user movement, its force feedback capability, and its physical size and reach, which correspond to the virtual volume that the user can explore. For this study, the force feedback, size and reach of the ETHD are fixed, with the ETHD speed being the main variable determining the quality of the display, as it is directly involved in responding to changes in user movement and repositioning objects for them to encounter. Therefore, the ETHD speed will be regarded as an independent factor and, owing to the mechanical nature of the system, is controlled to be as accurate as possible. It can be assumed that there will be small fluctuations in the intended speed as well as in the accuracy of positioning real objects corresponding to their virtual counterparts, with these errors in speed and accuracy being regarded as confounding factors.

User hand movement speed must be considered, as the ETHD must be able to keep up with the user's movements. Based on previous research, the minimum speed at which the ETHD must operate to provide an adequate experience in the VE is 0.5 m/s, with no limits being imposed on the maximum speed [51–56]. However, for practicality and cost reasons, 1 m/s should be acceptable based on maximum human movement speed, with the acceleration used to reach intended speeds being as high as possible within practical limitations.
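To make the role of the speed and acceleration limits concrete, the short Python sketch below estimates the time needed to reposition a proxy object under a symmetric trapezoidal velocity profile. The 0.3 m travel distance and 10 m/s² acceleration used in the example are illustrative assumptions only, not measured parameters of the Fast ETHD:

    def reposition_time(distance, v_max, accel):
        """Travel time for a point-to-point move with a symmetric
        accelerate-cruise-decelerate (trapezoidal) velocity profile."""
        d_ramp = v_max ** 2 / accel  # distance spent accelerating plus decelerating
        if distance <= d_ramp:
            # Triangular profile: v_max is never reached over this distance.
            return 2.0 * (distance / accel) ** 0.5
        return 2.0 * v_max / accel + (distance - d_ramp) / v_max

    # Illustrative values only: a 0.3 m repositioning move at 10 m/s^2.
    print(reposition_time(0.3, 0.5, 10.0))  # approximately 0.65 s at 0.5 m/s
    print(reposition_time(0.3, 1.0, 10.0))  # approximately 0.40 s at 1.0 m/s

Under these assumed values, raising the peak speed from 0.5 m/s to 1 m/s shortens the move by roughly a quarter of a second, illustrating why both the peak speed and the acceleration matter for keeping repositioning delays below what a reaching user will notice.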
The immersion, presence and overall UX experienced by the user depend on the audiovisual quality of the virtual scene and the quality of the touch feedback from the ETHD. These are important factors to consider in evaluating the system as a whole, with immersion, presence and UX being regarded as dependent factors. The PQ and UEQ were chosen as the most appropriate tests for this study since these were specifically designed to evaluate these concepts.

Users were asked to perform a task and their performance evaluated, this being necessary to evaluate the usefulness and usability of the ETHD. This is also an important metric that can be used as a target to aim for in designing ETHDs. Task performance depends on the factors of immersion, presence, UX, ETHD speed, and audiovisual and touch feedback quality, and was regarded as a dependent factor. An appropriate test is required that can evaluate task performance correctly, with the ISO/TS 9241-411 multidirectional tapping task being used. This requires users to interact with targets, arranged in a circle, that vary in size and distance from each other, with target size and distance being regarded as independent factors. The ETHD ran at its maximum stable speed, with results being compared to zero-latency conditions, as done in previous studies. Teather et al., and Joyce and Robinson, used conditions for mid-air interactions with no haptics and with a flat surface, with the haptics condition being considered an independent factor [29, 31].

1.8.2 Factors of Interest for this Study

The following factors are chosen to be investigated in this study:

• Independent factors
– ETHD Speed: the main independent factor of interest, as it directly affects the user experience.
– Target Size: essential for the ISO/TS 9241-411 multidirectional tapping task.
– Target Distance: essential for the ISO/TS 9241-411 multidirectional tapping task.
– Haptics Condition: essential to compare the ETHD results to zero-latency conditions.

• Dependent factors
– Immersion: a rating of the quality of the information displayed to the user to create the illusion of the virtual world [10].
– Presence: a rating of the ability of the system as a whole to make the user feel like they are a part of the virtual world [10].
– User Experience: a rating of the user's overall perception of, and response to, the VE they are interacting with [38].
– Task Performance: measured as throughput in bits per second, this is crucial in determining the usefulness and usability of the ETHD and is also used to quantitatively compare it to other pointing devices [40, 48].

The other factors of perception of visual and touch feedback, hand-eye coordination abilities, age, health, mental and physical abilities, previous experience with VR and video games, as well as errors in ETHD speed and positional accuracy, were regarded as confounding factors. The experiments for this study implement a Randomised Complete Block Design to reduce the effects of these factors and any others that were unknown. Two experiments were conducted:

• The VR Alien Shooter Game Experiment, which varies the ETHD speed to determine the relationship of ETHD speed to immersion, presence and UX.

• The VR Fitts's Test Experiment, which keeps the ETHD speed at maximum and varies the haptic condition to determine the task performance of each condition and how they compare to each other.

1.9 Significance of the Study

The findings of this study will assist in ensuring a more realistic VR experience.
With VR becoming increasingly accessible to consumers and used more extensively in training applications, among others, effective haptic technology will be the next step to ensuring a realistic experience. The ETHD design proposed for this study is intended to provide effective tracking speeds that allow the user to move naturally without having to adapt to the limitations of the device.

ETHDs have the potential to provide a more dynamic experience by allowing users to interact with solid-feeling virtual objects. VEs could change often, as the ETHD would be able to accommodate these changes as well as multiple VEs with different layouts. To achieve the solid feeling of virtual objects, the device should quickly track user movements, provide effective counterforce to simulate solid objects, and provide enough reach so that the user has a wide virtual space to explore. VR training applications would greatly benefit from a reliable ETHD that can keep up with users for a seamless experience, which would improve the quality of the training.

Questionnaires for rating a user's immersion, presence and UX, and tests for measuring task performance on a device, are readily available and have been used in other VR studies. Therefore, these tools should also provide great insight into the use of ETHDs [32, 37]. This study should show that evaluating ETHDs on these concepts empirically will be beneficial in the design of future ETHDs and in improvements to existing designs. Having empirical data on how users interact with ETHDs and how these devices affect their overall experience gives far more insight into the use of these devices than their physical capabilities alone.

Another factor limiting the widespread use of ETHDs is the cost of the devices. The design of the ETHD proposed for this study incorporates linear actuators and stepper motors instead of the commonly used robot arms. While not able to provide as many DOFs as robot arms, this design is focused on speed and positional precision, the intention being to provide an effective ETHD experience while potentially being far more cost effective than expensive robot arms. The results are intended to improve the performance of ETHDs, which will enable VEs with high-quality haptics to provide higher levels of immersion and presence. ETHDs with fast tracking speeds can provide high-quality haptics, while a low-cost, high-speed ETHD would enhance VEs, especially in training applications. Therefore, a low-cost, high-speed ETHD would increase the immersion, presence and UX experienced in VR training applications and improve the task performance of trainees.

Chapter 2

Literature Review: Virtual Reality and Encountered-Type Haptic Displays

This chapter provides a literature review on the concepts of Virtual Reality (VR) and how effective it is in aiding training simulations. Haptic feedback, and how it improves the VR experience, is then discussed and divided into its specialised forms of passive haptics and encountered-type haptics, the latter making use of Encountered-Type Haptic Displays (ETHDs). These devices should be able to keep up with human hand movements to provide haptic feedback. Studies on human hand movements are evaluated to determine what speeds are acceptable for ETHDs to accomplish this goal. VR setups that use ETHDs require tracking of human hand movements, with different setups being discussed to determine which will be the most appropriate for this study.
2.1 Introduction

Virtual Reality (VR) is a technology that, until recently, was limited to well-funded research groups, with the general public mainly regarding it as belonging in the realm of science fiction. Today it is readily available for entertainment in homes and through cellphones, its versatility enabling its use in training, design and tele-operation. It is increasingly being used in training applications, which now span a variety of disciplines, from operating factory machinery to preparing astronauts for the International Space Station [57, 58].

This study recognises the importance of haptic feedback in improving the immersion and overall experience of the user interacting with a Virtual Environment (VE). Touch feedback brings a new element to the virtual experience and is vital for pushing the technology forward towards the ultimate goal of the user not being able to distinguish the experience from reality. However, there are many different methods of providing this touch feedback, with varying degrees of effectiveness. In choosing the type of haptic technology to build and study, it is important to compare the different methods of touch feedback available, their advantages and disadvantages, and, importantly, how effectively the technology is able to convey this feedback to the user.

The following review of literature will look at what qualities this touch feedback should ideally have to be effective for the user's experience. It will then review a few different implementations of haptic feedback, and their effectiveness in conveying touch feedback. The passive haptics implementation in particular will be evaluated, and evidence presented as to why this was chosen as the best type to explore. Focus will be given to Encountered-Type Haptic Displays (ETHDs) due to this implementation's ability to bring a much-needed dynamic element to the otherwise static behaviour of passive haptics.

One of the goals of this study is to produce a low-cost, high-speed ETHD that is capable of keeping up with normal user hand motions, and to investigate its effects on the VR experience. An understanding of the speeds a human hand is capable of achieving must first be established to effectively determine how fast the ETHD needs to be. This study also aims to make the virtual experience as natural as possible for participants, one way being to allow the user to interact with the VE with their hands rather than with controllers. Many methods are available to allow this type of interaction, with varying degrees of success, these being evaluated to determine which would be the most appropriate for this study.

2.2 Virtual Reality and Training

Advanced systems with VR Head-Mounted Displays (HMDs) that were once only found in research initiatives such as the NASA Ames VIEWlab Project can now be found in people's homes [59]. The Sony PlayStation VR, Oculus Rift and Quest, and HTC VIVE are examples of currently available commercial VR products that allow users to be immersed in VEs [60–63]. These are often used for entertainment purposes in the form of video games, with the many other applications including tele-presence, prototyping, music and the tele-rehabilitation of patients [64–67]. VR is also used in training applications, its extensive use being due to it sometimes being more effective than traditional training methods. Chao et al. noted that ideal training performance can be observed from trainees under conditions of proper mental workload [21].
They observed that VR is a more effective training tool than traditional methods such as technical manuals and multimedia films, while performance measurements from trainees showed that it was the best training method, especially for complex tasks. Chao et al. attribute this to the lower mental workload observed with VR compared to technical manuals and multimedia films. Lin et al. investigated training operators of Computer Numeric Control (CNC) milling machines with VR, and found a significant reduction in training time in VR versus reading the manual for the machine [1]. The number of errors on real CNC machines after the training was much lower in the VR training group than in the group that read the manual, and the transfer of skills in the former was significant, particularly regarding procedural skill tasks.

Sacks et al. compared safety training of construction workers with traditional methods versus VR and found that there was a significant advantage to being trained with the latter [2]. They noted that workers in the VR training group were able to maintain full focus during the training, compared to the traditional-method group, which tended to lose concentration and needed breaks. Van Wyk and de Villiers assessed the use of VR training for safety in the mining environment and stated that it is far cheaper than building real full-scale training environments [3]. It can be used to expose users to situations that would normally be too dangerous to interact with in real life, with this type of training also allowing for a wide variety of scenarios to expose trainees to.

A laparoscopy is a surgical procedure that is done to inspect the organs within the abdomen by inserting a tiny camera. A study by Grantcharov et al. showed that trainees who had VR training performed laparoscopic cholecystectomy faster, with fewer errors and fewer unnecessary movements, than those in the control group who had no extra training [4]. Larsen et al. had similar results with trainees being trained in VR and then performing laparoscopic salpingectomy [5]. They also noted that, based on a rating scale, the VR trainees displayed scores equivalent to the experience gained after 20–50 laparoscopic procedures, compared to the control group, whose scores were equivalent to the experience gained from fewer than five procedures.

Training for human spaceflight with VR goes back decades, with Loftin and Kenney using VEs to train the ground-support flight team on the extravehicular activities planned for the Hubble Space Telescope repairs after its initial launch [7]. The survey results from the trainees showed that, by being initially exposed to the objects, environments and procedures that they would be working with, the training had a positive effect on their job performance during the actual mission. During the first few days of spaceflight, many astronauts experience the medical conditions known as Space Motion Sickness and Spatial Disorientation, which are caused by conflicts between the senses in a microgravity environment (under normal gravity conditions these senses are in agreement) and usually result in reduced activity. Stroud et al. found that those who had variable orientation training in a VR simulator not only displayed faster task completion times in a follow-up session than those with fixed orientation, but also reported far less motion sickness than the fixed-orientation group [68].
The results suggested that this type of training in a VR simulation can be an effective tool for spaceflight. Aslandere et al. created a Virtual Reality Flight Simulator (VRFS) by integrating the commercial flight simulator software X-Plane, with the VRFS showing promise for use in pilot training in the future [6]. Task performance in this system was reduced due to the users not knowing when they were touching the virtual buttons, thereby increasing the miss rate. Aslandere et al. concluded that the solution to this problem would be to use a physical mockup of the virtual cockpit, but that this would be expensive and make the VRFS dependent on a particular aircraft type. In a follow-up study, in which the volume of the target buttons was varied to accommodate this problem, the authors again reported that a physical mockup of the virtual cockpit would be a solution [69].

The positive effects that VR can have on training are evident from these studies, with further advances having many more potential benefits for training and many other applications in VR.

2.3 Haptics

As noted in the studies conducted by Aslandere et al., task performance in their systems could have been improved by including a physical mockup of the virtual cockpit, implying that the touch component in a VE is also important. The sense of touch is important as it provides the information that is needed to interact with the environment and others, and is therefore essential for human survival [70]. Being able to interact with objects in a VE, combined with the ability to actually feel their presence, brings in a level of immersion that feels far more natural to the user [71].
Some haptic systems such as Ultrahaptics, Haptomime, Haptoclone and Mistable eliminate the need to use a VR glove by using 2D phased arrays of ultrasound transducers to create force using acoustic radiation [76]. This creates many points in the air that are individually perceivable, thus providing tactile feedback, this method having been expanded to create discernable 3D shapes in mid-air. The FingerFlux project is focused on touchscreen haptics and expands on the ultrasound haptic feedback methods by using a 2D array of electromagnets. The user’s finger has a narrow, cylindrical, permanent magnet attached to it that enables interaction with the grid of electromagnets [17]. When they interact, the user is able to feel attraction, repulsion, vibration and directional haptic feedback. When an electromagnet’s polarity is set to repulse the fingertip’s permanent magnet, the user can immediately feel the counterforce at that position. Butterfly Haptic’s Maglev 200 System is essentially a magnetically levitated mouse that allows six Degrees of 23 Freedom (DOF) movement to manipulate objects in a VE, the base of the mouse that provides the electromagnetic repulsion being desk mounted to limit the range of movement [77]. The above methods of haptics presented are examples of active haptics, as the actuators in these devices controlling the feedback is done by computer, based on the interactions between the user and the VE [30]. However, there are clear limitations to the effectiveness of these methods in providing haptic feedback, as they still allow users to pass directly through virtual objects. This is not an accurate representation of real life where objects are mainly solid and cannot be passed through in a ghost-like fashion. Worn-type devices such as VR gloves feel unnatural to wear and those that project forces on the user do so too weakly to feel realistic. 2.4 Passive Haptics When users interact with a VE, they automatically assume that it will obey the laws of physics as experienced in the real world. However, one of the most common unnatural property of VEs is not being able to actually feel virtual objects via sensory touch, enabling users to pass directly through them [33]. Users merely have to reach out to grab a virtual object and feel nothing but air to quickly re-ascertain that they are in a simulation [26]. This unnatural property is the same reason why users could not reliably tell when their hands were touching the virtual buttons in the study done by Aslandere et al., who concluded that a physical mockup of the virtual cockpit would have solved this problem, this being an example of passive haptics [6]. Passive haptics is a special type of haptic feedback that remedies this issue by using real world objects that are either exact replicas of the virtual objects or low fidelity versions for the user to interact with. Passive-haptic devices provide feedback to the user due to their shape and texture, as well as other physical properties such as rigidity, mass and temperature [19]. A common implementation of passive haptics is to recreate a real world tactile version of the VE that users will interact with such as the physical cockpit mockup suggested by Aslandere et al. Joyce and Robinson investigated user interactions with a passive haptics setup they called the “Rapidly Configurable Research Cockpit”, which was used to simulate cockpits for spacecraft and aircraft [31]. 
This setup consisted of a rectangular panel with 3D-printed buttons attached, with users interacting with it both with and without the passive haptics, and being measured on task-performance throughput as well as reported presence and fatigue. The results from this study showed that including passive haptics for feedback in VR simulations improved the users' feeling of presence, which in turn reduced training duration and improved overall task performance.

Insko conducted a study to measure user presence with passive haptics in two different VEs: the first had users stand on the edge of a board overlooking a pit below, with a real elevated wooden board providing haptic feedback [33]. The second VE had users navigate a maze with passive haptic objects made out of polystyrene bricks and cardboard. The results from the first VE with the pit showed that users felt an improved sense of presence with the haptic feedback, with even a significant rise in heart rate reported, while the second VE with the maze investigated their feeling of presence as well as training transfer. Users who were initially trained with the passive haptics setup displayed faster navigation times and fewer collisions than those trained without the haptic feedback, showing that spatial knowledge transfer was better when using haptic feedback.

Cheng et al. developed a "Sparse Haptic Proxy", which was a passive haptics setup that consisted of a sparse set of primitive geometric shapes arranged in a section of a sphere, with a range of 120° [78]. Instead of the usual passive haptics scenario of modelling the real-world setup to mimic the VE exactly, this setup was designed to provide haptic feedback for many different VEs varying in size and shape. For their study, the VEs were a regular table with objects and a spaceship cockpit, the idea being to use a position remapping technique called Haptic Retargeting that manipulates the user's hand-eye coordination while reaching out to a target in the VE. The software could redirect the user's hand to a physical proxy of the target on the Sparse Haptic Proxy setup, despite this not corresponding to a one-to-one mapping of position between the VE and the real-world setup. The scenario enabled users to perform actions that are common in many virtual experiences, with users reporting that they preferred less redirection in their movements when moving toward targets in the VE.

Han et al. also investigated the remapping techniques of Translational Shift and Interpolated Reach, the focus being on using a single passive haptic object to represent many virtual objects and altering user position perceptions when reaching out to these virtual objects [79]. They noted that unless a tactile passive haptics setup is custom-made to match a VE, the ability to match physical and virtual objects perfectly is limited. The remapping techniques stem from studies that indicated that a perfect physical representation of a virtual object is not completely essential to maximise the sense of presence for users interacting with a VE. Studies have shown that, although not perfect, the visual sense can override the proprioceptive senses (the sense of body position), and, in the case of physical passive haptic objects not perfectly representing the corresponding virtual objects, this allows some spatial warping in the virtual space to go unnoticed. Han et al.
note that there will be cases where the mismatch between the physical and virtual object is so large that these techniques are no longer viable for realistic interactions. In their study employing these remapping techniques, with users reaching out to virtual objects and their corresponding real-world objects, it was noted that, especially when using the Interpolated Reach technique, reach time and errors increased significantly after an observed limit to the distance offset was reached.

Zenner and Krüger created Shifty, a dynamic passive haptic feedback device that changes its rotational inertia to give users the feeling of different weights and forms [30]. It does this by using a motor-driven linear actuator inside a hollow cylinder to move a weight along its length. When the device is held, this allows the user to perceive the object changing over time with regard to its mass profile, such as an extending telescope, and also to perceive objects with a fixed mass corresponding to the internal weight's position, such as a cube at the end of a handle. Zenner and Krüger showed that Shifty could be used as a VR controller to greatly enhance a user's perception of mass and inertia in a VE.

While the haptic retargeting techniques mentioned do offer a decent alternative for providing haptic feedback, it was observed that users produce more errors when they are required to move larger distances, preferring shorter distances when using these techniques, thus limiting their use. From the other passive haptics studies mentioned, it appears as though a one-to-one mapping from the VE to the haptic display is the ideal situation. Users prefer these setups, with the results showing that training and task performance are improved using these one-to-one passive haptic displays. However, such setups still have limitations: they are static, would be expensive to build for each unique VE, and need to be designed to fit a particular VE accurately, which limits their variations; a much-needed dynamic element is required for these displays to fit changing VEs.

2.5 Encountered-Type Haptic Devices

Many of the studies on haptics mention the importance of McNeely's proposal of "Robotic Graphics", with the limitations of VR, where users pass through virtual objects, emphasising the need for realistic haptic feedback [13]. McNeely proposed the Robotic Shape Display (RSD), where a robot would simulate hard, immovable objects, as seen in a VE, by moving a corresponding shape to the correct position and locking its brakes, with viscosity and elasticity also being simulated this way. This is called encountered-type haptics, as the RSD waits for the user to "encounter" its objects. McNeely stated that this display would work best in VEs where objects of fixed size appear repeatedly, such as layouts with many buttons, knobs and switches. The RSD requires the robot to track users quickly and accurately, which means an ideal implementation would require expensive equipment. Errors in robot movement or slow speeds could also result in users getting hurt due to collisions, with the solution requiring safety systems to protect users. McNeely proposed that the robots used should exert only enough force for RSD purposes and that the objects presented for interaction should be made from light material such as polystyrene. Software for the VE should employ accident-avoiding techniques, but must only be regarded as a secondary line of defence, as software can be unreliable.
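As an illustration of the control flow McNeely describes, a minimal Python sketch of an encountered-type haptics loop is given below. All names used here (tracker, actuator, nearest_target and so on) are hypothetical placeholders intended only to show the track-anticipate-position-wait structure; they do not correspond to the Fast ETHD's actual firmware or software interfaces:

    import time

    def nearest_target(hand_pos, virtual_objects):
        """Crude stand-in for intent prediction: assume the user is reaching
        for whichever virtual object the hand is currently closest to."""
        return min(virtual_objects,
                   key=lambda obj: sum((h - o) ** 2
                                       for h, o in zip(hand_pos, obj["position"])))

    def encounter_loop(tracker, actuator, virtual_objects, update_hz=100):
        """Track the hand, anticipate the target and park the matching physical
        proxy at the target's real-world position before contact occurs."""
        period = 1.0 / update_hz
        while True:
            hand = tracker.hand_position()        # e.g. hand pose from the HMD tracking
            target = nearest_target(hand, virtual_objects)
            actuator.move_to(target["position"])  # reposition the physical proxy
            # Contact itself is passive: the proxy simply waits to be "encountered",
            # so no active force rendering happens in this loop.
            time.sleep(period)

In practice, the anticipation step would require a proper prediction model, and the move command would be limited by the speed and acceleration constraints discussed in Section 1.8.1, which is exactly where the latency examined in this dissertation arises.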
RSDs, also known as Encountered-Type Haptic Displays (ETHDs), come in a variety of forms, with Mercado et al. categorising them as grounded and ungrounded types [20]. Grounded-type ETHDs are fixed to the floor or a base, and can be robotic arms or fixed platforms, which are constrained to a limited workspace that users can explore and interact with. The actuators of these grounded types move accordingly to meet a user's interaction with a virtual object and provide haptic feedback. Ungrounded ETHDs comprise Unmanned Air Vehicles (UAVs), such as drones, mobile platforms that move on surfaces, and wearables that are usually attached to the user's hand. These are unconstrained and offer a potentially limitless workspace to interact with.

Yokokohji et al. implemented an ETHD called the What You can See Is What You can Feel (WYSIWYF) display [25]. They noted that McNeely classified haptic displays as worn-type (e.g. VR gloves), held-type (e.g. VR game controllers) and encountered-type, with the latter not requiring the user to have the haptic device on their body at all times [13]. They adopted the encountered-type approach and used a Puma 560 robot to provide the haptic feedback for their VE. This robot had a single square panel that had its orientation changed based on how the user decided to interact with a virtual cube. The square panel was repositioned so that the user co