Discipline: Computer Sciences and Information Management
Subcategory: Computer Science & Information Systems
Session: 1
Room: Private Dining
Kyle Schultz - Fayetteville State University
Co-Author(s): Chris Arsenault, North Carolina State University, NC; Laura De'Santis, Fayetteville State University
This project concerns the development of AI-designed heat exchangers through machine learning and predictive neural networks, with the goal of accelerating the rapid advances in electric aerospace technologies. The need to reduce payload mass and to enable new aerodynamic efficiencies with multifunctional, lightweight thermal-management structures can be met and surpassed by using AI to alleviate the countless labor hours spent in initial design phases. To satisfy this demand, AI-based natural language analysis can be used to mine the existing literature and suggest new technologies.

Current efforts begin with a web scraper that crawls scholarly sites and retrieves publications relevant to the research objectives. These publications are compiled into a zip archive for further processing. Once a sufficient amount of reference material is collected, it is run through a natural language processor that combs the existing literature to extract usable data. The processor uses pretrained BERT cross-encoders to curate the data in real time and output relevant material in response to questions entered into the program. The extracted data is returned to the user with a relevance score indicating how useful each passage is to the question asked.

The curated data is then divided into two non-overlapping subsets: a training set and a test set. The predictive neural network is trained on the training set, with a loss function serving as the measure of how well the network has learned to predict outputs from inputs. As the loss on the training set decreases, the network's predictive ability is also measured on the held-out test set. Decreasing loss on both the training and test data suggests that our choice of neural network and curated dataset has begun to produce trained models.
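The scraping and bundling stage can be sketched with Python standard-library tools. The function names and the PDF-link heuristic below are illustrative assumptions, not the project's actual crawler:

```python
import io
import zipfile
from html.parser import HTMLParser


class PDFLinkExtractor(HTMLParser):
    """Collects href values on <a> tags that point at PDF files."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value and value.lower().endswith(".pdf"):
                    self.links.append(value)


def extract_pdf_links(html: str) -> list:
    """Return publication links found in one page of crawl results."""
    parser = PDFLinkExtractor()
    parser.feed(html)
    return parser.links


def bundle_documents(docs: dict) -> bytes:
    """Compile downloaded documents ({filename: bytes}) into a zip archive
    held in memory, ready to hand off to the NLP stage."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        for name, data in docs.items():
            zf.writestr(name, data)
    return buf.getvalue()
```

In practice the crawler would fetch pages over HTTP and respect each site's access policies; the sketch isolates only the parse-and-bundle logic.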
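The project's question answering relies on pretrained BERT cross-encoders (via Haystack); as a self-contained stand-in, the relevance-scoring idea can be illustrated with a simple bag-of-words cosine similarity. The scoring function below is a simplified assumption for illustration, not the cross-encoder itself:

```python
import math
from collections import Counter


def _term_counts(text: str) -> Counter:
    """Lowercased whitespace tokenization into term counts."""
    return Counter(text.lower().split())


def relevance_score(question: str, passage: str) -> float:
    """Cosine similarity between term-count vectors, in [0.0, 1.0].
    A cross-encoder would instead score the (question, passage) pair jointly."""
    q, p = _term_counts(question), _term_counts(passage)
    dot = sum(q[t] * p[t] for t in q)
    norm = math.sqrt(sum(v * v for v in q.values())) * math.sqrt(
        sum(v * v for v in p.values())
    )
    return dot / norm if norm else 0.0


def rank_passages(question: str, passages: list) -> list:
    """Return (score, passage) pairs, most relevant first."""
    scored = [(relevance_score(question, p), p) for p in passages]
    return sorted(scored, key=lambda pair: pair[0], reverse=True)
```

The ranked output mirrors what the user sees in the described pipeline: each retrieved passage paired with a score reflecting its usefulness to the question asked.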
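The split-and-train procedure can be sketched as follows, substituting a one-parameter linear model for the actual predictive neural network so the loss bookkeeping on the training and held-out test sets stays visible. All names and hyperparameters here are illustrative assumptions:

```python
import random


def train_test_split(pairs, test_fraction=0.2, seed=0):
    """Shuffle (input, output) pairs and split into non-overlapping
    training and test subsets."""
    rng = random.Random(seed)
    shuffled = pairs[:]
    rng.shuffle(shuffled)
    n_test = int(len(shuffled) * test_fraction)
    return shuffled[n_test:], shuffled[:n_test]


def mse(w, b, data):
    """Mean squared error: the loss measuring how well outputs are
    predicted from inputs."""
    return sum((w * x + b - y) ** 2 for x, y in data) / len(data)


def fit(train, test, lr=0.1, epochs=500):
    """Gradient descent on the training set, recording loss on both the
    training set and the held-out test set each epoch."""
    w, b = 0.0, 0.0
    history = []
    for _ in range(epochs):
        gw = sum(2 * (w * x + b - y) * x for x, y in train) / len(train)
        gb = sum(2 * (w * x + b - y) for x, y in train) / len(train)
        w -= lr * gw
        b -= lr * gb
        history.append((mse(w, b, train), mse(w, b, test)))
    return w, b, history
```

Loss decreasing on both subsets, as the abstract describes, is the signal that the model generalizes rather than merely memorizing the training set.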
With recent advancements in AI technologies, it is now feasible to automate many initial research processes, reducing overall industry costs and significantly decreasing design time. Future work includes structuring the data retrieved from the NLP stage into a collection of parameters more readable to users, and synthesizing data based on initial samples. These steps would further reduce timeline requirements and allow faster, more efficient model training. Lastly, other NLP and neural network techniques may be explored to further streamline the process and reduce timelines.

References: Mackinnon J. Poulson and Ezra O. McNichols, "Data Driven Design and Analysis of Biologically Inspired Heat Sinks" (2020), NASA Glenn Research Center. Luay Wadie, "Regression Based Neural Network for Computational Fluid Dynamics Predictions" (2020), NASA Glenn Research Center. https://haystack.deepset.ai/overview/intro. https://www.deepset.ai/blog/haystack-question-answering-at-scale. https://www.sbert.net/docs/pretrained_cross-encoders.html. COMSOL Multiphysics v. 5.4, www.comsol.com, COMSOL AB, Stockholm, Sweden.
Funder Acknowledgement(s): This work was supported in part by the NASA MUREP Space Technology Artemis Research (M-STAR) Implementation Grant. The authors would like to thank Mackinnon Poulson and Luay Wadie for the previous research that aided in understanding machine learning, neural network implementation, and objective-related information. Lastly, the authors would like to thank Dr. Ezra McNichols, Dr. Mark Kankam, and Coy Bush of NASA Glenn Research Center for their advice on this task.
Faculty Advisor: Dr. Sambit Bhattacharya, sbhattac@uncfsu.edu
Role: My efforts in this project included leadership and natural language processor development. As the leader of the group, I applied to the project, delegated tasks to group members, ensured deadlines were met on time, and handled communications with our NASA collaborators. After data was scraped from the web, my role was to develop the question-answering program that supplies relevant data to the next stage of the process.