SHS Web Conf., Volume 139, 2022
The 4th ETLTC International Conference on ICT Integration in Technical Education (ETLTC2022)
Article Number 03022, Section Topics in Computer Science, 8 pages (Open Access)
DOI: https://doi.org/10.1051/shsconf/202213903022
Published online 13 May 2022
  1. Respawn Entertainment. “Apex Legends.” PC [Game] (Update 1.1.3) (2019)
  2. S. Tachi, M. Inami, and Y. Uema, “The transparent cockpit,” IEEE Spectrum, vol. 51, no. 11, pp. 52–56, (2014)
  3. M. Skaff, “F-35 Lightning II Cockpit Vision,” SAE Technical Paper 2010-01-2330, (2010)
  4. R. Bane and T. Hollerer, “Interactive tools for virtual x-ray vision in mobile augmented reality,” Third IEEE and ACM Int. Symp. on Mixed and Augmented Reality, pp. 231–239, (2004)
  5. L. olde Scholtenhuis, S. Zlatanova, X. den Duijn, A.M. Ntarladima, and E. Theocharous, “Spying the underground: visualizing subsurface utilities’ location uncertainties with fuzzy 3D,” SPOOL, vol. 4, no. 2, pp. 61–64, (2017)
  6. S. Ortega, J. Wendel, J.M. Santana, S.M. Murshed, I. Boates, A. Trujillo, A. Nichersu, and J.P. Suárez, “Making the invisible visible – strategies for visualizing underground infrastructures in immersive environments,” ISPRS Int. J. of Geo-Information, vol. 8, no. 3, p. 152, (2019)
  7. S. Zhang, W. He, S. Wang, S. Feng, Z. Hou, and Y. Hu, “An AR-Enabled See-Through System for Vision Blind Areas,” Int. Conf. on Human Computer Interaction, pp. 206–213, Springer, (2021)
  8. S. Treuillet and E. Royer, “Outdoor/indoor vision based localization for blind pedestrian navigation assistance,” Int. J. Image Graphics, vol. 10, pp. 481–496, (2010)
  9. “Apple Inc. iPad Pro (2nd generation).” https://support.apple.com/kb/SP814/ (Accessed on 01/21/2022)
  10. S. Schuon, C. Theobalt, J. Davis, and S. Thrun, “High-quality scanning using time-of-flight depth superresolution,” IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, pp. 1–7, (2008)
  11. “NReal Light.” https://www.nreal.ai/light/ (Accessed on 01/21/2022)
  12. “Samsung Galaxy Note20 Ultra 5G.” https://www.samsung.com/us/smartphones/galaxy-note20-5g/specs/ (Accessed on 01/21/2022)
  13. “Unity.” https://unity.com/ (Accessed on 01/21/2022)
  14. “ARFoundation.” https://unity.com/unity/features/arfoundation/ (Accessed on 01/21/2022)
  15. “ARKit.” https://developer.apple.com/augmented-reality/arkit/ (Accessed on 01/21/2022)
  16. “ARCore.” https://developers.google.com/ar/ (Accessed on 01/21/2022)
  17. “Immersal.” https://immersal.com/ (Accessed on 01/21/2022)
  18. “Immersal’s tutorials how-to-map.” https://immersal.gitbook.io/sdk/tutorials/how-to-map/ (Accessed on 01/21/2022)
  19. “Mirror.” https://github.com/vis2k/Mirror/ (Accessed on 01/21/2022)
