Groundbreaking astronaut training: VR and robotics bring weightlessness to life

Maties Claesen

Imagine floating in space, having to repair a complicated piece of equipment. Every move you make sends your tool spinning off in all directions. Without gravity, even the simplest action becomes a tricky and sometimes dangerous job. Astronauts must first learn to handle these floating objects before they start their mission. Traditionally this happens in expensive training facilities such as neutral buoyancy pools, where they train underwater to approximate the feeling of weightlessness. But there is now a breakthrough: with virtual reality (VR) and robotics, astronauts can practise with space-like conditions and weightless objects in a far cheaper and easier way, without having to go underwater.


The ZeroTraining system and how it integrates VR and robotics: the user both sees the tool in VR and feels it at that same position.

For more visualisations, see the ZeroTraining Prezi (https://prezi.com/view/gBCyICHb10f5wa1Ww9PQ/).

How virtual reality brings the space experience down to Earth

In this research, we developed a clever combination of virtual reality (VR) and robotics to simulate the feeling of weightlessness. This lets astronauts practise on Earth with simulated weightless objects. The European Space Agency (ESA) collaborated on this system, called ZeroTraining, which consists of two main components: ZeroPGT and ZeroArm.

ZeroPGT is a VR app that recreates spacewalks. You can handle the Pistol Grip Tool (PGT), a tool that looks like a power screwdriver, as if you were really in space. In VR this object floats as if it were weightless, and you, as the user, can touch it and push it away.
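To give a feel for what "weightless" means computationally, here is a minimal, hypothetical Python sketch (not the actual ZeroPGT code, which runs inside a game engine): after a push, an object in zero gravity simply keeps drifting, because no force slows it down.

```python
from dataclasses import dataclass

@dataclass
class FloatingTool:
    pos: float = 0.0   # position along one axis (m)
    vel: float = 0.0   # linear velocity (m/s)
    mass: float = 2.0  # made-up tool mass (kg)

    def push(self, impulse: float) -> None:
        # An impulse (N*s) instantly changes velocity: dv = J / m
        self.vel += impulse / self.mass

    def step(self, dt: float) -> None:
        # No gravity, no drag: velocity stays constant between pushes
        self.pos += self.vel * dt

tool = FloatingTool()
tool.push(1.0)          # one gentle shove of 1 N*s
for _ in range(100):
    tool.step(0.01)     # simulate 1 second
print(round(tool.pos, 3), tool.vel)  # 0.5 0.5: drifted half a metre, still moving
```

On Earth friction and gravity would have stopped the tool; here it only stops when the astronaut catches it again, which is exactly what makes handling tools in space so tricky.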

But VR alone is not enough to reproduce the real sensations of weightlessness. That is where ZeroArm comes in. This robotic arm makes sure you not only see the virtual tool but can actually grab it. While you reach for an object in the VR world, the robotic arm places a real object in your hand at exactly the right moment and in exactly the right place. This gives you a realistic sense of touch, which helps you prepare better for real spacewalks.

Creating a realistic spacewalk in VR

The ZeroPGT app offers a remarkably lifelike spacewalk. Using a detailed model of the International Space Station (ISS) provided by NASA, users are immersed in a realistic space environment. The breathtaking Milky Way, full of twinkling stars and distant planets, forms the backdrop and adds to the authenticity. In addition, the system uses advanced weightless tether simulations, so that tools, which are tethered during a spacewalk, behave exactly as they would in the weightlessness of space. Pulling such a virtual tether attached to the PGT back and forth makes it feel as if you are really working in space.
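The core idea behind such a tether simulation can be illustrated with a toy position-based model: treat the rope as a chain of points joined by fixed-length segments, and repeatedly nudge neighbouring points back to their rest distance. This is only a sketch of the principle, not the app's actual rope engine; all numbers are invented.

```python
segment = 0.1                                   # rest length between points (m)
n = 10
pts = [[i * segment, 0.0] for i in range(n)]    # rope laid out along the x-axis

def solve(pts, iterations=30):
    """Repeatedly restore each segment to its rest length (no gravity term)."""
    for _ in range(iterations):
        for i in range(len(pts) - 1):
            dx = pts[i + 1][0] - pts[i][0]
            dy = pts[i + 1][1] - pts[i][1]
            dist = (dx * dx + dy * dy) ** 0.5
            corr = 0.5 * (dist - segment) / dist
            # move both endpoints half the error towards each other
            pts[i][0] += corr * dx
            pts[i][1] += corr * dy
            pts[i + 1][0] -= corr * dx
            pts[i + 1][1] -= corr * dy

pts[-1] = [1.0, 0.2]     # yank the free end outward and sideways
com_before = (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)
solve(pts)
com_after = (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)
print(pts[0] != [0.0, 0.0])   # True: the pull propagated to the far end
```

Because there is no gravity term, a tug on one end travels along the whole rope instead of the rope sagging, and the symmetric corrections leave the rope's centre of mass untouched, just as momentum is conserved for a free-floating tether in orbit.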

A robot partner that plays its role perfectly

How does the robotic arm know where to move? The secret lies in inverse kinematics. This process calculates exactly how each part of the arm must move to match the position of the virtual objects. It is similar to how your brain works out how your shoulder, elbow and wrist must cooperate to pick up a coffee cup. ZeroArm does exactly the same, ensuring that the real object is always in the right place, just as in the virtual world.
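For a two-joint arm the idea can even be written down in closed form. The sketch below is purely illustrative (ZeroArm has more joints, and the link lengths here are invented): given a target hand position, the law of cosines yields the elbow angle, and the shoulder angle follows from the direction to the target.

```python
from math import acos, atan2, cos, sin, hypot

upper_arm, forearm = 0.30, 0.25   # link lengths in metres (made-up values)

def inverse_kinematics(x, y):
    """Joint angles (shoulder, elbow) that place the hand at (x, y)."""
    d = hypot(x, y)
    assert d <= upper_arm + forearm, "target out of reach"
    # Law of cosines gives the elbow bend for this hand distance
    cos_elbow = (d * d - upper_arm**2 - forearm**2) / (2 * upper_arm * forearm)
    elbow = acos(cos_elbow)
    # Shoulder: aim at the target, then correct for the bent elbow
    shoulder = atan2(y, x) - atan2(forearm * sin(elbow),
                                   upper_arm + forearm * cos(elbow))
    return shoulder, elbow

def forward_kinematics(shoulder, elbow):
    """Where the hand ends up for the given joint angles."""
    x = upper_arm * cos(shoulder) + forearm * cos(shoulder + elbow)
    y = upper_arm * sin(shoulder) + forearm * sin(shoulder + elbow)
    return x, y

reached = forward_kinematics(*inverse_kinematics(0.35, 0.20))
print(round(reached[0], 6), round(reached[1], 6))  # 0.35 0.2
```

Feeding the computed angles back through the forward kinematics lands the hand exactly on the requested target, which is the whole point: the virtual object's position goes in, joint angles come out.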

To make the interaction even smoother, the robotic arm is fitted with 3D-printed parts that can hold a VR controller. If needed, other tools can also be attached to the arm. Magnets make it easy to quickly attach the controller to, or detach it from, the robotic arm during the simulation. As a result, the feel of the tool you are using is reproduced realistically, since you can trace the contours of the tool with your own hand.

Safety first

An important aspect of the ZeroTraining system is its emphasis on safety. The robotic arm is carefully programmed to move slowly and cautiously, reducing the risk of accidents. Built-in safety protocols ensure that the robot can stop immediately if anything goes wrong. This means that trainee astronauts can fully immerse themselves in the virtual environment, confident that the system poses no danger to them.
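The two safety ideas, cautious motion and an immediate stop, can be sketched in a few lines. The names and the limit below are invented for illustration, not taken from the actual controller.

```python
MAX_STEP_DEG = 2.0   # hypothetical largest joint change allowed per control tick

def safe_command(current_deg, target_deg, emergency_stop=False):
    """Return the next joint angle, limited to a slow, cautious motion."""
    if emergency_stop:
        return current_deg                         # freeze in place
    step = target_deg - current_deg
    step = max(-MAX_STEP_DEG, min(MAX_STEP_DEG, step))  # clamp the step size
    return current_deg + step

print(safe_command(10.0, 90.0))                       # 12.0: moves only 2 degrees
print(safe_command(10.0, 90.0, emergency_stop=True))  # 10.0: stays put
```

However far away the commanded target is, the arm only ever creeps towards it one small step per tick, and raising the stop flag halts it instantly.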

Testing, testing, testing

So far, the ZeroTraining system has been tested with non-experts, and although it still needs refinement, the feedback has been overwhelmingly positive. Participants reported that the experience was remarkably immersive and gave a realistic sense of interacting with floating objects in weightlessness. ESA is also keen to develop a full training system that could train astronauts in a similar way.

“This is very impressive and cool” - Engineer at ESA


The future of training

Although this research was initially developed for spaceflight training, its potential applications reach beyond the stars. Imagine surgeons training in VR to perform complex operations, with robotic systems giving them real-time feedback. Engineers could handle delicate equipment in hazardous environments without ever being on site. The VR gaming industry could also use this robotic arm to make games even more immersive. As VR and robotics continue to evolve, the possibilities are endless.
 

With ZeroTraining, VR and robotics take on a groundbreaking role in astronaut training. This innovative system offers a cost-effective and realistic alternative to traditional methods, marking a major step forward for space exploration and for other sectors that involve complex tasks in challenging environments. As the technology matures, this powerful combination of VR and robotics could well become a game changer for industries worldwide, and we may soon see spectacular changes in how we learn and train.


 


University or university college
Universiteit Hasselt
Thesis year
2024
Supervisor(s)
Prof. Dr. Kris Luyten, Prof. Dr. Raf Ramakers