Ezequiel Zorzal
  • Welcome
  • Research
  • Publications
  • Courses
  • Students
  • Videos
  • CV Lattes
  • Contact

Hi, my name is Ezequiel Zorzal. I am an associate professor at the Institute of Science and Technology of the Federal University of São Paulo (ICT/Unifesp). I am also a research collaborator at INESC-ID at the Instituto Superior Técnico of the University of Lisbon in Portugal (INESC-ID/IST/ULisboa). My areas of interest are interactive technologies such as Virtual Reality and Augmented Reality. I hold a degree in Computer Science and a PhD in Electrical Engineering. I love my family, and programming is my hobby. I also like creative things, helping people, making new friends, and happy environments. You are welcome here! CV Lattes

Gamified learning can increase memory capacity, develop strategic thinking, and improve hand-eye coordination.

Upcoming Conferences

  • ACM Symposium on Virtual Reality Software and Technology (VRST 2021)

    December 8-10, 2021
  • International Conference on Computer Graphics and Interactive Techniques (SIGGRAPH 2021)

    August 9-13, 2021
  • International Conference on Medical Image Computing and Computer Assisted Intervention (MICCAI 2021)

    September 27 - October 1, 2021
  • Brazilian Symposium on Human Factors in Computing Systems (IHC 2021)

    October 18-22, 2021
  • ACM CHI Conference on Human Factors in Computing Systems (CHI 2022)

    April 30 - May 6, 2022, New Orleans, LA
  • Symposium on Virtual and Augmented Reality (SVR 2021)

    October 18-21, 2021
  • IEEE Conference on Virtual Reality and 3D User Interfaces (IEEE VR 2022)

    March 12-16, 2022
  • IEEE International Symposium on Mixed and Augmented Reality (ISMAR 2021)

    October 4-8, 2021, Bari, Italy
  • Brazilian Symposium on Computer Games and Digital Entertainment (SBGames 2021)

    October 18-21, 2021

Twitter

  • RT @jjorge: #IEEE Access published our paper on VR conferences in #VR: Towards VR in VR: Assessing Engagement and Social Interaction in a V…

"Creativity is just connecting things."

Steve Jobs

Research

Most of the research I have developed has focused on interactive technologies and how they improve education, teaching, and training. My main research interests are in Extended Reality (XR), and in exploring innovative solutions to improve the user experience in several areas of knowledge.

The areas I have been working on over the years are:

  • XR: Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR).
  • Mobile learning, game-based learning, and gamification.

I am also one of the heads of the Interactive Applications Laboratory (Interapps) at UNIFESP. Interapps is a multidisciplinary group exploring interactive applications in areas such as health, industry, and education. We promote teaching, research, and outreach in science and technology, developing applied research projects that use new interactive technologies to improve the user experience.

If you’re interested in any of these areas, please contact me!

Extended Reality (XR)

XR is an umbrella term encapsulating Augmented Reality (AR), Virtual Reality (VR), and Mixed Reality (MR) technologies [1,2]. These technologies extend the reality we experience by either blending the virtual and real environments or by creating a fully immersive experience.

In AR, digital information and virtual objects are overlaid on the real world [3]. The overlay is achieved by a pose-estimation process using either markers, visual cues trained beforehand to be recognized later in the camera stream [4], or characteristic points of the scene, an approach referred to as markerless tracking [5]. Users experience AR through headsets or handheld devices and can still see and interact with what is going on in front of them.
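
As a toy illustration of the marker-based idea described above (the function and coordinates here are hypothetical, not taken from any AR library), once a detector such as OpenCV's ArUco module returns a marker's four image corners, anchoring a virtual object can be as simple as computing the marker's centre and apparent size in the image:

```python
def overlay_anchor(corners):
    """Return (x, y, scale) for a virtual object anchored to a marker.

    corners: four (x, y) image points of a detected square marker,
    e.g. as returned by a marker detector. Kept dependency-free so the
    sketch runs without a camera or computer-vision library.
    """
    cx = sum(x for x, _ in corners) / len(corners)
    cy = sum(y for _, y in corners) / len(corners)
    # Mean distance from the centre to the corners is a crude size
    # estimate: the virtual object grows as the camera nears the marker.
    scale = sum(((x - cx) ** 2 + (y - cy) ** 2) ** 0.5
                for x, y in corners) / len(corners)
    return cx, cy, scale

# A square marker seen head-on, centred at image position (320, 240):
print(overlay_anchor([(300, 220), (340, 220), (340, 260), (300, 260)]))
```

A real AR pipeline would go further and estimate the full 6-DoF camera pose from those corners, but the sketch captures why tracked visual cues are enough to register virtual content with the real scene.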
MR, on the other hand, merges the real and virtual worlds. Rather than simply overlaying images as AR does, MR lets virtual and real objects co-exist and interact with one another in real time. MR is a "reality spectrum" that combines the best aspects of VR and AR [2]. It requires an MR headset (e.g. Microsoft's HoloLens) and more processing power than VR or AR.

In contrast to AR and MR, in a VR experience users are fully immersed in a simulated digital environment [6]. Individuals must put on a VR headset to view the virtual environment. In AR and MR, most visual sensations and sensory stimuli come from the real world and the virtual elements contribute less: the virtual object is placed as if it were part of the real world. In VR, most sensory information is computer-generated, and the virtual environment is presented as if the user were part of it.
The figure on the left shows the XR taxonomy used in this text.

1. Irlitti, A., Piumsomboon, T., Jackson, D., and Thomas, B. H. (2019). Conveying spatial awareness cues in xR collaborations. IEEE Transactions on Visualization and Computer Graphics, 25(11):3178–3189.
2. Alizadehsalehi, S., Hadavi, A., and Huang, J. C. (2020). From BIM to extended reality in AEC industry. Automation in Construction, 116:103254. https://doi.org/10.1016/j.autcon.2020.103254
3. Azuma, R. (1997). A Survey of Augmented Reality. Presence: Teleoperators and Virtual Environments, 6(4):355–385.
4. Khan, D., Ullah, S., and Rabbi, I. (2015). Factors affecting the design and tracking of ARToolKit markers. Computer Standards and Interfaces, 41:56–66.
5. Fraga-Lamas, P., Fernández-Caramés, T. M., Blanco-Novoa, O., and Vilar-Montesinos, M. (2018). A Review on Industrial Augmented Reality Systems for the Industry 4.0 Shipyard. IEEE Access, 6:13358–13375.
6. Yung, R. and Khoo-Lattimore, C. (2017). New realities: a systematic literature review on virtual reality and augmented reality in tourism research. Current Issues in Tourism, 22(17):2056–2081.

XR allows diverse ways of learning, which can motivate students and enhance their understanding.

Publications

Please find a complete list of publications on my Google Scholar profile or in my CV Lattes.

Here are just a few of my selected publications.

A Tool for Collaborative Anatomical Dissection.

Abstract: 3D reconstruction from anatomical slices permits anatomists to create three-dimensional depictions of real structures by tracing organs from sequences of cryosections. A wide variety of tools for 3D reconstruction from anatomical slices are becoming available for use in training and study. In this chapter, we present Anatomy Studio, a collaborative Mixed Reality tool for virtual dissection that combines tablets with styli and see-through head-mounted displays to assist anatomists by easing manual tracing and exploring cryosection images. By using mid-air interactions and interactive surfaces, anatomists can easily access any cryosection and edit contours while following other users' contributions. A user study including experienced anatomists and medical professionals, conducted in real working sessions, demonstrates that Anatomy Studio is appropriate and useful for 3D reconstruction. Results indicate that Anatomy Studio encourages closely coupled collaborations and group discussion to achieve deeper insights.

ZORZAL, E. R.; SOUSA, M.; MENDES, D.; FIGUEIREDO, P. S.; RODRIGUES, P.; JORGE, J.; LOPES, D. S. A Tool for Collaborative Anatomical Dissection. Human-Computer Interaction Series. 1ed.: Springer International Publishing, 2021, v., p. 41-58. (http://dx.doi.org/10.1007/978-3-030-61905-3_3)


An immersive educational tool for dental implant placement: A study on user acceptance.

Abstract: Tools for training and education of dental students can improve their ability to perform technical procedures such as dental implant placement. Shortage of training can negatively affect dental implantologists' performance during intraoperative procedures, resulting in lack of surgical precision and, consequently, inadequate implant placement, which may lead to unsuccessful implant-supported restorations or other complications. We designed and developed IMMPLANT, a virtual reality educational tool to assist implant placement learning, which allows users to freely manipulate 3D dental models (e.g., a simulated patient's mandible and implant) with their dominant hand while operating a touchscreen device to assist 3D manipulation. The proposed virtual reality tool combines an immersive head-mounted display, a small hand-tracking device, and a smartphone, all connected to a laptop. The operator's dominant hand is tracked to quickly and coarsely manipulate either the 3D dental model or the virtual implant, while the non-dominant hand holds a smartphone converted into a controller to enable button activation and greater input precision for 3D implant positioning and inclination. We evaluated IMMPLANT's usability and acceptance during training sessions with 16 dental professionals. The conducted user acceptance study revealed that IMMPLANT constitutes a versatile, portable, and complementary tool to assist implant placement learning, as it promotes immersive visualization and spatial manipulation of 3D dental anatomy. IMMPLANT is a promising virtual reality tool to assist student learning and 3D dental visualization for implant placement education. IMMPLANT may also be easily incorporated into training programs for dental students.

ZORZAL, E. R.; PAULO, S. F; RODRIGUES, P.; MENDES, J. J.; LOPES, D. S. An immersive educational tool for dental implant placement: A study on user acceptance. INTERNATIONAL JOURNAL OF MEDICAL INFORMATICS, v. 146, p. 104342, 2021. (http://dx.doi.org/10.1016/j.ijmedinf.2020.104342)


Towards augmented Reality for corporate training.

Abstract: Studies indicate that Augmented Reality (AR) is quickly becoming an effective technology for training programs. This systematic literature review (SLR) aims to screen works published on AR for corporate training. We describe AR training applications and discuss current challenges, literature gaps, opportunities, and tendencies of corporate AR solutions. We structured a protocol to define the keywords, search semantics, and databases used as sources of this SLR. From a primary analysis, we considered 1952 articles in the review for qualitative synthesis, of which we selected 60 for this study. The survey shows that 41.7% of the applications focus on automotive and medical training. Additionally, 20% of the selected publications use a camera-display with a tablet device, while 40% refer to head-mounted displays, and many surveyed approaches (45%) adopt marker-based tracking. Results indicate that publications on AR for corporate training have increased significantly in recent years. AR has been used in many areas, exhibiting high quality and providing viable approaches to on-the-job training. Finally, we discuss future research issues related to increasing the relevance of AR for corporate training.

MARTINS, B. R.; JORGE, J.; Zorzal, E. R. Towards augmented Reality for corporate training. INTERACTIVE LEARNING ENVIRONMENTS, v. 29, p. 1-19, 2021. (http://dx.doi.org/10.1080/10494820.2021.1879872)


Mobile apps for the treatment of diabetes patients: a systematic review

Abstract: The incidence of diabetes mellitus (DM) is a significant and growing health problem for all countries, regardless of their development level. Mobile computing solutions that assist patients in their daily treatment routine can be an excellent ally. This systematic review aims to assess how mobile apps for DM treatment follow-up can contribute to improving patients' quality of life. We searched the ACM Digital Library, PubMed, Springer Link, IEEE Xplore, and Science Direct databases. We obtained 1608 primary studies; 1485 were used in the screening phase and 791 in the full screening phase, resulting in 106 articles for data extraction from 2013 to 2018. We identified the main functionalities and characteristics necessary for an application for managing DM, such as automation and ease of data collection. We found that apps for type 1 DM help treat patients, facilitate the collection of and access to data, and allow better daily monitoring of treatment.

DE ARAÚJO, W. R. V. C.; MARTINS, L. E. G.; ZORZAL, E. R. Mobile apps for the treatment of diabetes patients: a systematic review. RESEARCH ON BIOMEDICAL ENGINEERING, v. 1, p. 1, 2021. (http://dx.doi.org/10.1007/s42600-021-00150-7)


A survey of industrial augmented reality

Abstract: This article aims to evaluate the impact of Augmented Reality (AR) applicability and usefulness on real industrial processes by employing a systematic literature review (SLR). The SLR was performed in five digital libraries to identify articles and reviews concerning the AR applicability from 2012 to 2018. A patent search in Google’s patents database was also conducted, for the same period. This paper describes how AR has been applied, which industries are most interested in the technology, how the technology has been developed to meet industry needs, as well as the benefits and challenges of AR. This survey concludes by providing a starting point for companies interested in integrating AR into their processes and proposing future directions for AR developers and researchers.

DE SOUZA CARDOSO, L. F.; MARIANO, F. C. M. Q.; ZORZAL, E. R. A survey of industrial augmented reality. COMPUTERS & INDUSTRIAL ENGINEERING, v. 139, p. 106159, 2020. (http://dx.doi.org/10.1016/j.cie.2019.106159)


Laparoscopy with Augmented Reality Adaptations

Abstract: One of the most promising applications of Optical See-Through Augmented Reality is minimally invasive laparoscopic surgery, which currently suffers from problems such as surgeon discomfort and fatigue caused by looking at a display positioned outside the surgeon's visual field, made worse by the length of the procedure. This fatigue is especially felt in the surgeon's neck, which is strained from adopting unnatural postures in order to visualise the laparoscopic video feed. Throughout this paper, we present work in Augmented Reality, as well as developments in surgery and Augmented Reality applied to both surgery in general and laparoscopy in particular, to address these issues. We applied user and task analysis methods to learn about practices performed in the operating room by observing surgeons in their working environment in order to understand, in detail, how they performed their tasks and achieved their intended goals. Drawing on observations and analysis of video recordings of laparoscopic surgeries, we identified relevant constraints and design requirements. Besides proposals to approach the ergonomic issues, we present the design and implementation of a multimodal interface to enhance the laparoscopic procedure. Our method makes the procedure more comfortable for surgeons by allowing them to keep the laparoscopic video in their viewing area regardless of neck posture. Our interface also makes it possible to access patient imaging data without interrupting the operation, and to communicate with team members through a pointing reticle. We evaluated how surgeons perceived the implemented prototype, in terms of usefulness and usability, via a think-aloud protocol used to conduct qualitative evaluation sessions, which we describe in detail in this paper.
In addition to checking the advantages of the prototype as compared to traditional laparoscopic settings, we also conducted a System Usability Scale questionnaire for measuring its usability, and a NASA Task Load Index questionnaire to rate perceived workload and to assess the prototype effectiveness. Our results show that surgeons consider that our prototype can improve surgeon-to-surgeon communication using head pose as a means of pointing. Also, surgeons believe that our approach can afford a more comfortable posture throughout the surgery and enhance hand-eye coordination, as physicians no longer need to twist their necks to look at screens placed outside the field of operation.

ZORZAL, E. R.; CAMPOS GOMES, J. M.; SOUSA, M.; BELCHIOR, P.; DA SILVA, P. G.; FIGUEIREDO, N.; LOPES, D. S.; JORGE, J. Laparoscopy with Augmented Reality Adaptations. JOURNAL OF BIOMEDICAL INFORMATICS, v. 107C, p. 103463, 2020. (http://dx.doi.org/10.1016/j.jbi.2020.103463)


Mobile augmented Reality to support fuselage assembly

Abstract: One major manufacturing challenge of the fourth industrial revolution is the quality of product development information used in the assembly line. One of the most studied information visualization technologies is Augmented Reality (AR), whose deployment via mobile devices is currently being promoted. However, the industrial sector still has doubts regarding the technology’s maturity and the costs incurred by such production processes, restricting its deployment in production environments. Thus, in the present study, we statistically analyze a markerless AR application, using a hand-held mobile device, that addresses the requirements of the structural assemblies used in aeronautical industries comparing them to those of traditional assembly methods. The main technological benefits and limitations of such applications are identified. To perform this analysis, we conducted a field evaluation where qualified professionals marked the positions of small brackets in an aeronautical structure by using AR. After collecting the time and measurement data for each bracket’s position, a statistical analysis was performed to compare these positionings against those achieved using a traditional, metric scale-based method. The results indicate that AR offers a faster solution for highly complex assemblies, despite several limitations regarding the positioning tolerance when using the technology. As a future study, several application modifications are suggested to reduce markup errors and overall usage time.

DE SOUZA CARDOSO, L. F.; MARIANO, F. C. M. Q.; ZORZAL, E. R. Mobile augmented Reality to support fuselage assembly. COMPUTERS & INDUSTRIAL ENGINEERING, v. 148, p. 106712, 2020. (http://dx.doi.org/10.1016/j.cie.2020.106712)


Anatomy Studio: A tool for virtual dissection through augmented 3D reconstruction

Abstract: 3D reconstruction from anatomical slices allows anatomists to create three-dimensional depictions of real structures by tracing organs from sequences of cryosections. However, conventional user interfaces rely on single-user experiences and mouse-based input to create content for education or training purposes. In this work, we present Anatomy Studio, a collaborative Mixed Reality tool for virtual dissection that combines tablets with styli and see-through head-mounted displays to assist anatomists by easing manual tracing and exploring cryosection images. We contribute novel interaction techniques intended to promote spatial understanding and expedite manual segmentation. By using mid-air interactions and interactive surfaces, anatomists can easily access any cryosection and edit contours while following other users' contributions. A user study including experienced anatomists and medical professionals, conducted in real working sessions, demonstrates that Anatomy Studio is appropriate and useful for 3D reconstruction. Results indicate that Anatomy Studio encourages closely-coupled collaborations and group discussion to achieve deeper insights.

ZORZAL, E. R.; SOUSA, M.; MENDES, D.; DOS ANJOS, R. K.; MEDEIROS, D.; PAULO, S. F.; RODRIGUES, P.; MENDES, J. J.; DELMAS, V.; UHL, J-F; MOGORRÓN, J.; JORGE, J.; LOPES, D. S. Anatomy Studio: A tool for virtual dissection through augmented 3D reconstruction. COMPUTERS & GRAPHICS-UK, v. 85, p. 74-84, 2019. (http://dx.doi.org/10.1016/j.cag.2019.09.006)


 

Extended Reality has been used in many areas, exhibiting high quality and providing viable approaches to on-the-job training.

Courses

Realidade Virtual e Aumentada 

  • Undergraduate: BSc in Computer Science – BCC/UNIFESP, Computer Engineering – EC/UNIFESP, and Science and Technology – BCT/UNIFESP (2011, 2013, 2014, 2015, 2016, 2018, 2020, 2021)
  • Graduate: PhD and MSc students in Technology Innovation – PPG-IT/UNIFESP (2018, 2020, 2021), and Computer Science – PPG-CC/UNIFESP (2021).

Lógica de Programação

  • Undergraduate: BSc in Science and Technology – BCT/UNIFESP (2020, 2021)

Tópicos em Tecnologia da Computação IV (Programação Web)

  • Undergraduate: BSc in Computer Science – BCC/UNIFESP, Computer Engineering – EC/UNIFESP, and Science and Technology – BCT/UNIFESP (2020)
  • Graduate: MSc students in Technology Innovation – PPG-IT/UNIFESP (2020, 2021)

Algoritmos e estruturas de dados I

  • Undergraduate: BSc in Computer Science – BCC/UNIFESP, Computer Engineering – EC/UNIFESP, and Science and Technology – BCT/UNIFESP (2010, 2012, 2013, 2014, 2018)

Interação Humano-Computador

  • Undergraduate: BSc in Computer Science – BCC/UNIFESP (2018)

Multimídia

  • Undergraduate: BSc in Computer Science – BCC/UNIFESP (2017)

Sistemas Operacionais

  • Undergraduate: BSc in Computer Science – BCC/UNIFESP, and Computer Engineering – EC/UNIFESP (2011, 2012, 2013, 2014, 2015, 2016, 2017)

Tópicos em Tecnologia da Computação I (Programação Mobile)

  • Undergraduate: BSc in Computer Science – BCC/UNIFESP, and Computer Engineering – EC/UNIFESP (2016)

Udemy: Online courses

Curso Completo de Lógica e Programação – Ezequiel Zorzal. A fast and complete course for anyone who wants to learn algorithms, logic, and C programming once and for all. 4.7 (88 ratings)

"If you think education is expensive, try ignorance."

Derek Bok

Students

Teamwork makes the dream work. 😉

This is my current fantastic Team:

    • Gabriel de Oliveira Cruz do Prado. (PhD student in Computer Science, PPG-CC/UNIFESP).
    • Luis Fernando de Souza Cardoso. Advisor. (PhD student in Computer Science, PPG-CC/UNIFESP).
    • Wellington José das Dores. Advisor. (PhD student in Computer Science, PPG-CC/UNIFESP).
    • Bruno Rodrigo Martins. Advisor. (MSc student in Technology Innovation, PPG-IT/UNIFESP).
    • Caio Cigagna de Godoy. Advisor. (MSc student in Technology Innovation, PPG-IT/UNIFESP).
    • Gabriel Riva. (MSc student in Technology Innovation, PPG-IT/UNIFESP).
    • Rafael Acioli de Andrade. Advisor. (MSc student in Technology Innovation, PPG-IT/UNIFESP).
    • Rodrigo de Almeida Silveira. Advisor. (MSc student in Technology Innovation, PPG-IT/UNIFESP).

My previous (also fantastic) MSc students:

    • Orlando Rosa Junior. Advisor, 2020. (MSc student in Technology Innovation, PPG-IT/UNIFESP).
    • Patrícia Kyoko Onishi. Advisor, 2020. (MSc student in Technology Innovation, PPG-IT/UNIFESP).
    • Francisney Nascimento da Silva. Co-advisor with Iraci de Souza João, 2020. (MSc student in Technology Innovation, PPG-IT/UNIFESP).
    • Walfran Rocha Vieira Carvalho de Araujo. Co-advisor with Luiz Eduardo Galvão Martins, 2020. (MSc student in Technology Innovation, PPG-IT/UNIFESP).
    • Luís Fernando de Souza Cardoso. Advisor, 2019. (MSc student in Technology Innovation, PPG-IT/UNIFESP).
    • Wilson Francisco Moreira de Souza Seron. Co-advisor with Marcos Gonçalves Quiles, 2015. (MSc student in Computer Science, PPG-CC/UNIFESP).

 For students:

    • Top Conferences on XR, Computer Graphics, and HCI.
    • Learn LaTeX in 30 minutes.
    • LaTeX Templates – (undergraduate dissertation: <BCC/UNIFESP> / Graduate dissertation and thesis: <PPG-IT/UNIFESP>,<PPG-CC/UNIFESP>)
    • Scientific Journal Rankings – SJR
    • Webqualis – Qualis / CAPES

Due to its high interaction capability, XR can provide more effective applications and improve human-computer interaction.

Videos

Studies indicate that XR is quickly becoming a cornerstone technology for training applications, especially in corporate environments, in the wake of the Covid-19 pandemic.

Contact

Contact information

Email: ezorzal [ at ] unifesp.br

Address: Instituto de Ciência e Tecnologia, Universidade Federal de São Paulo (ICT-UNIFESP)
Av. Cesare Monsueto Giulio Lattes, 1201, Room: 117.
Eugênio de Melo, São José dos Campos – SP, 12247-014, Brazil.


Copyright 2021 Ezequiel Zorzal | All Rights Reserved