Research on Multimodal Perceptual Navigational Virtual and Real Fusion Intelligent Experiment Equipment and Algorithm
Jie Yuan, Zhiquan Feng, Di Dong, Xin Meng, Junhong Meng, Dan Kong
Published 2020 in IEEE Access
ABSTRACT
Virtual experimentation is an important field of human-computer interaction. As more and more virtual laboratories emerge, several problems have become apparent. First, human-computer interaction during virtual experiments is inefficient: the computer cannot understand the user's intention, which leads to incorrect operations. Second, erroneous behavior during experiments is rarely detected. Third, virtual laboratories offer only a weak sense of operation and realism. To solve these problems, this paper designs and implements the multimodal sensing navigation virtual and real fusion laboratory (MSNVRFL). We design a new set of experimental equipment with cognitive functions and study a multimodal fusion model and algorithm for chemical experiments, both of which are verified and applied in MSNVRFL. The multimodal fusion perception algorithm allows the user's true intentions to be understood, improving the efficiency of human-computer interaction. Carrying out virtual experiments in a virtual-real fusion mode avoids the waste of resources and the dangers that can occur in physical experiments, and improves the user's sense of operation and realism. In addition, teaching navigation and reminders of incorrect operations are provided for users. Experimental results show that our method improves the efficiency of human-computer interaction, reduces the user's cognitive load, strengthens the user's sense of reality and operation, and stimulates students' interest in learning.
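The abstract does not detail how the multimodal fusion perception algorithm combines modalities. As a generic illustration only (not the authors' published method), the sketch below shows weighted late fusion of per-modality intent probabilities, a common way to resolve a user's intended operation from, e.g., speech and gesture channels; all names, weights, and example scores are assumptions.

```python
# Hypothetical sketch of late multimodal fusion for intent recognition.
# This is NOT the paper's algorithm; it only illustrates the general
# technique of combining per-modality intent scores.

def fuse_intents(modality_scores, weights):
    """Combine per-modality intent probabilities with a weighted sum.

    modality_scores: dict mapping modality name -> {intent: probability}
    weights: dict mapping modality name -> relative confidence weight
    Returns (best_intent, fused_score_dict).
    """
    fused = {}
    for modality, scores in modality_scores.items():
        w = weights.get(modality, 0.0)
        for intent, p in scores.items():
            fused[intent] = fused.get(intent, 0.0) + w * p
    # Normalize by the total weight so fused scores stay comparable.
    total = sum(weights.get(m, 0.0) for m in modality_scores)
    fused = {intent: s / total for intent, s in fused.items()}
    return max(fused, key=fused.get), fused

# Example: speech strongly suggests "pour", gesture slightly favors "stir".
speech = {"pour": 0.7, "stir": 0.3}
gesture = {"pour": 0.4, "stir": 0.6}
intent, scores = fuse_intents(
    {"speech": speech, "gesture": gesture},
    {"speech": 0.6, "gesture": 0.4},
)
# With these assumed weights the fused decision is "pour".
```

In practice a system like the one described would feed such a fused decision into its error-detection layer, so that an intent inconsistent with the experiment's expected step could trigger a wrong-operation reminder.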
PUBLICATION RECORD
- Publication year
2020
- Venue
IEEE Access
- Fields of study
Computer Science, Engineering
- Source metadata
Semantic Scholar
REFERENCES
39 references
CITED BY
9 citing papers