NSF grant to help FIU develop games that train workers for jobs in architecture, engineering and construction | FIU Community News

With robotics automation rapidly transforming jobs in the architecture, engineering and construction (AEC) industries, there is growing demand for workers skilled in advanced technologies and robotics. The National Science Foundation has awarded FIU a grant to address this demand by developing a platform that will use virtual reality games to train workers for the new world of work.

FIU faculty from the College of Communication, Architecture + The Arts (CARTA) School of Architecture and the College of Engineering and Computing's Knight Foundation School of Computing and Information Sciences will work together to develop a personalized learning program for AEC industry workers. The project is one of 20 nationwide funded through the NSF Research on Emerging Technologies for Teaching and Learning (RETTL) program.

FIU researchers Shahin Vassigh (principal investigator), Mark Finlayson (co-principal investigator), Biayna Bogosian (co-principal investigator) and teaching professor Eric Peterson, along with RDF lab researcher Madeline Gannon, will collaborate with learning scientist Seth Corrigan from the University of California-Irvine and Shu-Ching Chen, data scientist at the University of Missouri-Kansas City and former FIU professor, to develop the advanced virtual reality training platform. The platform will take the form of a game that teaches AEC students and professionals how to operate industrial robots.

"This grant provides an opportunity to develop a personalized learning tool that tailors robotics lessons and their delivery sequences for differences in ability, experience, and sociocultural backgrounds," said Shahin Vassigh, director of FIU's Robotics and Digital Fabrication (RDF) lab and co-director of the Integrated Computer Augmented Virtual Environment (I-CAVE).

As users perform tasks in the system, they will be asked to talk through their decisions. Artificial intelligence (AI) software will then track their language to determine which concepts they are acquiring. With this data, the system will then recommend specific lessons to the user. For example, if a participant is struggling with how to use a robot’s pivot points to reach an object, the system will go back to the words and actions of the user to provide the most relevant lesson.
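The recommendation idea described above can be sketched in simplified form. In this hypothetical example, the system scans a user's think-aloud transcript for keywords tied to robotics concepts and recommends lessons on concepts the user never mentioned. The concept names, keyword lists, and lesson titles are invented for illustration; the actual platform will rely on far more sophisticated AI language analysis.

```python
# Hypothetical sketch: detect which robotics concepts a user talks about,
# then recommend lessons for the concepts that never come up.
# All concept names, keywords, and lesson titles below are illustrative.

CONCEPT_KEYWORDS = {
    "pivot_points": {"pivot", "joint", "axis", "rotate"},
    "reach_envelope": {"reach", "range", "envelope", "extend"},
    "end_effector": {"gripper", "tool", "end effector"},
}

LESSONS = {
    "pivot_points": "Lesson 3: Using a robot's pivot points to reach an object",
    "reach_envelope": "Lesson 5: Understanding the robot's working envelope",
    "end_effector": "Lesson 7: Controlling the end effector",
}

def concepts_mentioned(transcript: str) -> set[str]:
    """Return the concepts whose keywords appear in the transcript."""
    text = transcript.lower()
    return {
        concept
        for concept, keywords in CONCEPT_KEYWORDS.items()
        if any(kw in text for kw in keywords)
    }

def recommend_lessons(transcript: str) -> list[str]:
    """Recommend lessons for concepts the user never talked about."""
    missing = set(CONCEPT_KEYWORDS) - concepts_mentioned(transcript)
    return sorted(LESSONS[c] for c in missing)
```

For instance, a user who says "I rotated the second joint but the gripper still missed" shows awareness of pivot points and the end effector, so the sketch would suggest only the working-envelope lesson. A real system would replace keyword matching with natural language processing models that assess how well each concept is actually understood.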

“This method is similar to how teachers and professors grade essay questions,” said Finlayson, eminent scholar chaired associate professor of computer science.

Teachers don't look solely for right answers in students' papers. They look for the thinking, the qualitative evidence, that led to those answers. By analyzing this evidence of concept acquisition, a teacher can figure out which concepts a student isn't understanding and offer better feedback. The platform will work the same way.

“This project will help us drive forward an area of critical importance, namely, the application of AI and Natural Language Processing technologies to the analysis of learning data,” Finlayson said. “This kind of work will be an important step in understanding how to bring AI benefits to education research and give education researchers a model to follow in the use of cutting-edge NLP analyses.”