Date/Time: Monday, September 12th, 4:10 p.m. - 5:00 p.m. in Barnard Hall 108
Speaker: Philip Brittan
Brief Bio: TBA
Date/Time: Monday, August 29th, 4:10 p.m. - 5:00 p.m. in Barnard Hall 108
Facilitator: Dr. John Paxton
Date/Time: Monday, May 2, 4:10 p.m. - 5:00 p.m. in Barnard Hall 108
Speaker: Dr. John Paxton
Abstract: At the end of every academic year, we celebrate the accomplishments of members of the Gianforte School of Computing. Join us for this year's celebration where we will reflect on our accomplishments and present awards.
CISA’s Efforts with Partners to Build Secure and Resilient Critical Infrastructure
Date/Time: Monday, May 2, 2:00 p.m. - 3:00 p.m.
Speaker: Dr. Alethea Duhon
Location: Norm Asbjornson Inspiration Hall
Note: This seminar is optional for CS graduate students; attendance will not be taken.
Abstract: The Cybersecurity and Infrastructure Security Agency (CISA) leads the national effort to understand, manage, and reduce risk to our cyber and physical infrastructure. The responsibility of this mission is becoming increasingly important, because in today’s globally interconnected world, our critical infrastructure and American way of life face a wide array of serious cyber risks. This seminar will highlight the work at CISA and how the agency is working with partners to defend against today’s threats and collaborating with industry to build more secure and resilient infrastructure for the future. We will also discuss how students and faculty can get engaged with CISA through partnerships or careers to help protect the homeland from cyber and physical threats. Additionally, the speaker will discuss the butterfly effect, showing how small actions can have life-changing impacts, through the telling of her personal and professional story.
Bio: Dr. Alethea Duhon, a member of the Senior Executive Service, is the Associate Director for Analysis, National Risk Management Center (NRMC) within the Cybersecurity and Infrastructure Security Agency (CISA) at the Department of Homeland Security. Dr. Duhon’s portfolio includes leading the NRMC’s efforts to take the next step in realizing the vision of the Risk Architecture (backed by the Modeling Capability Transition Environment (MCTE)); building data analysis capabilities to support the architecture via government and commercial solutions; and applying data, models, and technology to develop risk analysis and support risk management decisions around topics such as supply chain security, foreign investment risk, and systemic risk to critical infrastructure from cyberattacks as well as other significant man-made and natural hazard risks. Prior to this assignment, Dr. Duhon was dual-hatted as the Chief Technology Officer (CTO) to the Department of the Air Force’s Chief Modeling and Simulation Officer (CMSO) and Technical Director of the Air Force Agency for Modeling and Simulation (AFAMS). As the CTO, she served as the Department of the Air Force’s key scientific authority in the Modeling and Simulation (M&S) field of endeavor. As the AFAMS TD, she was responsible for the planning, direction, management, coordination, reporting, and evaluation of all technical aspects of AFAMS’ mission and programs. Preceding that assignment, Dr. Duhon was the Senior Technical Advisor in the Office of the Under Secretary of Defense (OUSD) Policy, Defense Technology Security Administration (DTSA). In this role, she provided technical insight, advice, and analysis on international transfers of defense-related items and other matters of national security interest. Previously she served as an Acquisition Program Manager at the Assistant Secretary of the Air Force (Acquisition), Space Programs, Budget, Congressional and Program Integration Division.
She also served as the Chief, Intelligence, Surveillance, and Reconnaissance (ISR) and Special Operations Forces (SOF) Programs, and Air Force Scientific Test and Analysis Techniques (STAT) Lead for the Headquarters United States Air Force, Directorate of Test and Evaluation, and was instrumental in establishing the STAT Center of Excellence. She has held previous flight test and flight dynamics positions at the Air Force Test Center, Edwards AFB, CA; Northrop Grumman Corporation, Palmdale, CA; and Parker Hannifin (Aerospace), Irvine, CA. Dr. Duhon received her B.S. and M.S. degrees in Aeronautical & Astronautical Engineering from Purdue University. She received her Ph.D. in Systems Engineering from The George Washington University and was a Massachusetts Institute of Technology (MIT) Seminar XXI Fellow.
AutoML for Classification of Human Movement and Biomechanics
Date/Time: Monday, April 25, 4:10 p.m. - 5:00 p.m. in Barnard Hall 108
Speaker: Dr. Corey Pew
Abstract: Classification and prediction of human movement is an important research topic in biomechanics and health. Classifying the motion of users in real time is necessary for controlling robotic prosthetics, orthotics, and exoskeletons. In addition, the ability to identify abnormal gait or an adverse event, such as a fall, would provide prompt, objective data that allows researchers and clinicians to better understand a patient’s needs accurately. To achieve this goal, researchers utilize body-worn sensors (inertial measurement units (IMUs), surface electromyography, load cells, etc.) combined with machine learning classifiers to identify various walking modes and user intent. Classifiers take input from sensors by collecting raw data on body movement, which is then translated into classifications of activity such as walking, sitting, standing, stair climbing, and more. Because researchers in the biomechanics field are often not deeply familiar with best practices in machine learning, there is a tendency to inappropriately use premade classifiers and boast success through reports of high classification accuracy. The goal of our project is to create an AutoML pipeline that addresses the specific challenges of biomechanics data, helps to mitigate classifier misuse, and educates the user on best practices when implementing machine learning classifiers with their data. In addition, we seek to utilize AutoML to facilitate the implementation of personalized classifiers at the clinical level to help bridge the gap between theory and application in the field.
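The classifier misuse the abstract warns about often comes from evaluating a model on windows drawn from the same subjects it was trained on. A minimal sketch of the safer alternative, subject-wise cross-validation with scikit-learn (the data here are synthetic stand-ins, not the project's sensor pipeline):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GroupKFold, cross_val_score

rng = np.random.default_rng(0)
# Synthetic stand-in for windowed IMU features: 200 windows x 12 features
# from 10 subjects, labeled walk/sit/stand (0/1/2).
X = rng.normal(size=(200, 12))
y = rng.integers(0, 3, size=200)
subjects = rng.integers(0, 10, size=200)

# Subject-wise splits: all windows from one subject stay in the same fold,
# so accuracy reflects generalization to unseen people, not memorized gait.
cv = GroupKFold(n_splits=5)
scores = cross_val_score(
    RandomForestClassifier(n_estimators=100, random_state=0),
    X, y, cv=cv, groups=subjects,
)
print(scores.mean())
```

On this random data the score hovers near chance, which is exactly the honest answer a naive shuffled split would overstate.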
Bio: Dr. Pew is an Assistant Professor in the Mechanical and Industrial Engineering Department at Montana State University. His research interests are focused on biomechanics and human-machine interactions with applications to the advancement of lower limb amputee technologies. This includes the design of new amputee devices as well as sensing and control schemes to facilitate two-way communication between the user and the prosthetic device. Control employs the use of body-worn sensors such as inertial measurement units and electromyography as well as the development of machine learning techniques to translate that motion information into identifiable control signals. He also has interests related to running and human performance evaluation and improvement.
Assessing Forecast Performance of Empirical Crop Response Models using Precision Agriculture and On-Farm Precision Experimentation
Date/Time: Monday, April 18, 4:10 p.m. - 5:00 p.m. in Barnard Hall 108
Speaker: Paul Hegedus
Abstract: Precision agroecology leverages the data derived from precision agriculture technology to characterize ecological relationships between crop responses, the environment, and agronomic inputs. Decision support systems utilize models that describe these relationships to generate management recommendations for agronomic inputs, such as nitrogen fertilizer. Yet there is uncertainty in the literature on the best model forms to characterize crop responses to agricultural inputs, likely due to the variability in crop responses to inputs between fields and across years. Seven fields with at least three years of experimentally varied nitrogen fertilizer rates were used to compare the ability of five different model types to forecast crop responses and net returns. The five model types for each field were investigated using all permutations of the three years of data, where two years were used for training and a third was held out to represent a “future” year. The five models tested were a frequentist-based non-linear sigmoid function, a generalized additive model, a non-linear Bayesian regression model, a Bayesian multiple linear regression model, and a random forest regression model. The random forest regression typically produced the most accurate forecasts of crop responses and net returns across most fields. However, in some cases the model type that produced the most accurate forecast of grain yield was not the same as the model producing the most accurate forecast of grain protein concentration. Models performed best when the data used for training were collected from years with similar weather conditions to the forecasted year. The results are important to developers of decision support tools to minimize assumptions when selecting models used for simulating management outcomes and deriving economically and ecologically optimized nitrogen fertilizer rates.
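The leave-one-year-out design described above can be sketched as follows. The data are synthetic stand-ins, and only two of the five model types are shown; the point is the evaluation structure, not the speaker's actual models:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(1)
# Synthetic stand-in: nitrogen rate plus two covariates -> yield, three years.
years = np.repeat([2018, 2019, 2020], 100)
X = rng.uniform(0, 1, size=(300, 3))
y_obs = 3.0 * X[:, 0] - 1.5 * X[:, 0] ** 2 + 0.5 * X[:, 1] \
        + rng.normal(0, 0.1, 300)

# Leave-one-year-out: train on two years, forecast the held-out "future" year.
for held_out in (2018, 2019, 2020):
    train, test = years != held_out, years == held_out
    for model in (LinearRegression(),
                  RandomForestRegressor(n_estimators=200, random_state=0)):
        pred = model.fit(X[train], y_obs[train]).predict(X[test])
        print(held_out, type(model).__name__,
              round(mean_absolute_error(y_obs[test], pred), 3))
```

Holding out a whole year, rather than random rows, is what makes the error estimate a genuine forecast of an unseen season.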
Bio: Paul Hegedus is a Ph.D. candidate in Ecology and Environmental Sciences at Montana State University (MSU) in the Land Resources and Environmental Sciences Department, where he has also completed a Certificate of Applied Statistics. Planning to graduate in 2022, Paul began his Ph.D. in 2018 after receiving a B.S. in Land Rehabilitation with a minor in Soil Science from MSU in 2017. Paul works as the Research Associate for the Agroecology Lab at MSU and as Field Trial Supervisor for the Data-Intensive Farm Management project at the University of Illinois. His graduate research is focused on data-intensive agroecological approaches that harness precision agriculture technology and data science to optimize nitrogen fertilizer management. Paul was awarded an Undergraduate Research Award from the WSSA in 2016 for research on herbicide resistance. In 2019, Paul was awarded a graduate fellowship from the USDA WSRE program to fund his graduate work. He is a member of the ISPA and ESA.
Understanding and Addressing Public Distrust of AI & Autonomous Systems
Date/Time: Monday, April 11, 4:10 p.m. - 5:00 p.m. in Barnard Hall 108
Speaker: Dr. Kristen Intemann
Abstract: According to a 2017 Pew Research poll, twice as many U.S. adults say they are “more concerned than excited” about the increased use of intelligent and autonomous systems, and nearly half say that they are “as concerned as they are excited about AI.” This is worrisome given that such systems are increasingly being used to develop tools for virtually every aspect of human life, including healthcare, agriculture, marketing, social media, transportation, and policing. This talk will present a variety of reasons that segments of the public have for distrusting both particular applications and artificial and intelligent systems more generally. Understanding the sources of distrust is vital for identifying strategies that would increase trust and produce responsible and fair intelligent and autonomous systems.
Bio: Dr. Kristen Intemann is a Professor of Philosophy in the Department of History & Philosophy and the Director of the Center for Science, Technology, Ethics, and Society at MSU. Her research lies at the intersection of science and ethics, looking at questions such as the responsibilities of scientists, public trust in science and technology, and public engagement with science. She teaches courses on environmental ethics, biomedical ethics, technology ethics, and philosophy of science, receiving the President’s Excellence in Teaching Award in 2009. She has published over 30 peer-reviewed journal articles and book chapters as well as a book, The Fight Against Doubt: How to Bridge the Gap between Scientists and the Public, which was published by Oxford University Press in 2018.
Composing With Code—Because an 88-Key Keyboard Just Isn’t Enough
Date/Time: Monday, April 4, 4:10 p.m. - 5:00 p.m. in Barnard Hall 108
Speaker: Dr. Linda Antas
Abstract: A grand piano has 88 keys and is a helpful tool for composing music—but a computer has more keys and opens up even more possibilities. This talk will explore the merging of computer science and composition for acoustic instruments, electronically generated sounds, and combinations of the two. Examples will be drawn from the presenter’s works involving code-based sound synthesis, algorithmic composition, and real-time signal processing. Also discussed will be creating audio via real-time data-mapping of brain waves, and compositions based on mapping GPS data from trips on Montana’s wonderful trails and rivers.
Bio: Dr. Linda Antas is an Associate Professor of Music Technology in the School of Music at Montana State University. Linda received her DMA in computer music composition from the University of Washington in 2002. Her research interests include code- and GUI-based sound synthesis, algorithmic composition, sonification, multimedia production, and music cognition.
What if Algorithms Weren’t the Ghost in the Machine? Using Explainable AI (XAI) Methods to Turn Algorithmic User Experiences into Research Data Objects
Date/Time: Monday, March 28, 4:10 p.m. - 5:00 p.m. in Barnard Hall 108
Speaker: Jason Clark
Abstract: Jason A. Clark, Professor and Lead for Research Optimization, Analytics, and Data Services (ROADS) at MSU Library, will be discussing his research developing software and a curriculum to support the teaching of "Algorithmic Awareness": an understanding of the rules that govern our software and shape our digital experiences. Taking inspiration from investigative data journalists, like The Markup, Jason will introduce a research module for algorithm auditing practices using code, web scraping methods, and structured data formats to uncover proprietary algorithms and turn them into research data objects for analysis. (Code is available in our #AlgorithmicAwareness GitHub repository.) The case study for the module will be the YouTube video recommendation algorithm, which has come under criticism for its tactics in drawing parents’ and children’s attention to videos. The goal will be to show the generic patterns, data points, and scripts one can use to analyze algorithmic user experiences and demonstrate how code can be used to turn algorithms into datasets for analysis. In the end, attendees will be able to identify actionable steps for seeing algorithms as data objects, gain a sense of the first steps one can take to programmatically audit these systems with code, and take away investigative data techniques for applying Explainable AI methods to their own work and teaching.
Bio: Jason is the lead for Research Informatics, where he builds and supports research and data services at the Montana State University (MSU) Library. In his work, he has focused on Semantic Web development, digital library development, metadata and data modeling, web services and APIs, search engine optimization, and interface design. Before coming to MSU, Jason became interested in the intersection between libraries and technology while working as a web developer for the Division of Information Technology at the University of Wisconsin. After two years, he moved on to work as the web services librarian at Williams College Libraries. Jason holds a BA in English and Philosophy from Marquette University, an MA in English from the University of Vermont, and an MLS from the University of Wisconsin-Madison, School of Library & Information Studies.
Statistical Inference in Topological Data Analysis
Date/Time: Monday, March 21, 4:10 p.m. - 5:00 p.m. in Barnard Hall 108
Speaker: Jordan Schupbach
Abstract: Topological data analysis (TDA) is a fairly new interdisciplinary field that seeks to represent the shape of data using tools from algebraic topology. It has the ability to distill complex structural information present in high-dimensional datasets. However, methods for analyzing these representations under non-trivial sampling designs are few to non-existent and as a result are rarely employed in practice. Persistence intensity functions are a common topological descriptor used in the field of TDA. In this talk, novel methods for conducting hypothesis testing with persistence intensity functions in the setting of hierarchical sampling designs will be presented.
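As an illustration only (one common construction, not necessarily the speaker's formulation), a persistence intensity surface can be built by placing a lifetime-weighted Gaussian kernel at each (birth, death) point of a persistence diagram:

```python
import numpy as np

# Toy persistence diagram: (birth, death) pairs, assumed for illustration.
diagram = np.array([[0.1, 0.9], [0.2, 0.4], [0.3, 0.35], [0.5, 0.6]])

def intensity(diagram, grid, bandwidth=0.1):
    """Kernel-smoothed intensity: each diagram point contributes a Gaussian
    bump weighted by its lifetime (death - birth), so long-lived topological
    features dominate the surface."""
    gx, gy = np.meshgrid(grid, grid, indexing="ij")
    z = np.zeros_like(gx)
    for b, d in diagram:
        w = d - b
        z += w * np.exp(-((gx - b) ** 2 + (gy - d) ** 2) / (2 * bandwidth ** 2))
    return z / (2 * np.pi * bandwidth ** 2)

grid = np.linspace(0, 1, 50)
z = intensity(diagram, grid)
print(z.shape)
```

Turning a multiset of diagram points into a function on a fixed grid is what makes standard statistical machinery, such as the hypothesis tests discussed in the talk, applicable.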
Bio: Jordan Schupbach is a PhD student in statistics at Montana State University co-advised by John Borkowski and John Sheppard. His primary research involves conducting statistical and predictive inference for topological data analysis (TDA) using a point process methodology. At MSU, he has been involved in conducting research using a TDA methodology for predicting progression of prostate cancer and in using Bayesian networks for conducting diagnostics and prognostics in systems health management. His general research interests include machine learning, Bayesian statistics, nonparametric statistics, spatial statistics, and functional data analysis.
Solving Industrial Optimization Problems with Quantum Annealing
Date/Time: Monday, March 7, 4:10 p.m. - 5:00 p.m. via MS Teams
Speaker: Dr. Alexander Feldman, PARC
Abstract: Quantum annealing machines are becoming an important tool for solving problems of industrial significance. Experimentation in areas such as circuit diagnostics and Automated Test Pattern Generation (ATPG) indicates that quantum annealers will soon outperform classical methods. Despite this, due to hardware limitations, there will always be problems that are too big for the underlying hardware. One way of approaching these large problems is to preprocess and split them.
In this talk we will discuss hybrid methods for solving circuit diagnosis, ATPG, satisfiability and other problems from combinatorial optimization. The emphasis is on preprocessing and on the tool-chain that converts the input problem to a representation suitable for quantum annealing. We will show several advanced techniques for reducing the number of ancillary variables. These techniques significantly improve the performance of the hybrid optimization process.
We will also discuss a class of more difficult problems related to circuit synthesis. These are in the second level of the polynomial hierarchy. Solving them will answer important questions related to quantum computing and the scalability of this promising technology.
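To make the compilation step concrete, here is a standard QUBO gadget (included for illustration; it is a textbook construction, not code from the talk) of the kind used when circuit problems such as diagnosis or ATPG are mapped onto an annealer. The penalty enforces an AND gate: its ground states are exactly the consistent assignments.

```python
from itertools import product

# QUBO penalty enforcing y = x1 AND x2. Gadgets like this, plus the
# ancillary variables they introduce, are what preprocessing tries to minimize.
def and_penalty(x1, x2, y):
    return x1 * x2 - 2 * (x1 + x2) * y + 3 * y

# Brute-force the energy landscape: the penalty is 0 exactly when
# y = x1 AND x2, and at least 1 otherwise, so an annealer that reaches a
# ground state has computed the gate.
for x1, x2, y in product([0, 1], repeat=3):
    print(x1, x2, y, and_penalty(x1, x2, y))
```

Chaining many such gadgets encodes a whole circuit, which is why every saved ancillary variable matters on size-limited hardware.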
Bio: Alexander Feldman is a researcher at PARC (formerly Xerox PARC). Before that he was a postdoc at University College Cork and a visiting researcher at École Polytechnique Fédérale de Lausanne (EPFL) and Delft University of Technology. He obtained his Ph.D. (cum laude) in computer science/artificial intelligence and M.Sc. (cum laude) in parallel and distributed systems from the Delft University of Technology. He has more than 50 publications in leading conference proceedings and international journals covering topics from artificial intelligence, model-based diagnosis, computer science, and engineering. In cooperation with NASA Ames Research Center and PARC, Alexander Feldman has co-organized the International Diagnostic Competitions (DXC). His interests cover a wide spectrum, including topics such as model-based diagnosis, automated problem solving, software and hardware design, quantum computing, logic design, design of diagnostic space applications, digital signal processing, and localization.
Exploration of Multi-Objective Optimization in the Factored Evolutionary Framework
Date/Time: Monday, February 28, 4:10 p.m. - 5:00 p.m. in Barnard Hall 108
Speaker: Amy Peerlinck
Abstract: Multi-Objective Optimization (MOO) looks at problems with two or more competing objectives. Such problems occur naturally in the real world. For example, many engineering design problems have to deal with competing objectives, such as cost versus quality in product design. How do we handle these competing objectives? To answer this question, meta-heuristic algorithms that find a set of Pareto optimal solutions have become a popular approach. However, as problems grow in complexity, a single-population approach may not be the most efficient way to solve large-scale multi-objective optimization problems. For this reason, cooperative co-evolutionary algorithms (CCEA) are used, which split the population into subpopulations, each optimizing over a subset of variables, so that the subsets can be optimized simultaneously. Factored Evolutionary Algorithms (FEA) extend the CCEA idea by allowing overlap between the subpopulations. So far, FEA has not been applied to the field of MOO, but we believe it could be an effective alternative approach to help solve these types of problems. In this talk, we lay out our plan to research how different ways of creating subpopulations influence large-scale and multi-objective optimization. We intend to look at the influence of overlapping and distinct variable decompositions, as well as objective decomposition approaches for MOO.
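The factored-decomposition idea can be illustrated with a deliberately simplified single-objective toy (hypothetical code; simple hill climbing stands in for the evolutionary subpopulations of real FEA/CCEA):

```python
import numpy as np

# Six variables split into overlapping factors: variables 2 and 4 are shared
# between neighboring factors, the overlap that distinguishes FEA from CCEA.
factors = [[0, 1, 2], [2, 3, 4], [4, 5]]

def objective(x):
    return float(np.sum(x ** 2))   # stand-in objective (sphere function)

rng = np.random.default_rng(2)
x = rng.uniform(-5, 5, size=6)     # shared global solution
start = objective(x)

# One FEA-style sweep: each factor improves only its own variables while the
# rest of the global solution stays fixed; shared variables get refined by
# every factor that contains them.
for f in factors:
    for _ in range(200):
        trial = x.copy()
        trial[f] += rng.normal(0.0, 0.5, size=len(f))
        if objective(trial) < objective(x):
            x = trial

print(round(objective(x), 3))
```

The open research questions in the abstract amount to choosing `factors` well, and extending the accept/reject step to Pareto dominance for multiple objectives.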
Bio: Amy Peerlinck received her MS in computer science from Montana State University, a BA in applied linguistics from the University of Antwerp, and a BS in information science from Karel De Grote College/University. She is currently working towards her PhD in computer science at Montana State University, where she is a research assistant on a precision agriculture grant, optimizing profit for farmers through machine learning techniques.
Authoring Social Interactions Between Humans and Robots
Date/Time: Monday, February 7, 4:10 p.m. - 5:00 p.m. in Barnard 108
Speaker: David Porfirio
Abstract: Robots serve as interaction partners to humans in the workplace, at home, and for leisure activities, but designing social human-robot interactions (HRIs) is non-trivial. Challenges arise from the need to create interaction experiences that are successful with respect to both task and social outcomes. In particular, HRI developers must manage the low-level details of a robot program, such as asynchronously sensing external input while producing concurrent behaviors like speech and locomotion, while manipulating the robot’s higher-level decision making to produce a natural interaction flow. A further challenge includes the differing success criteria for HRIs within separate interaction contexts, in that developers must consider the end-user constraints and preferences specific to each individual context within which the robot will be deployed. In this talk, I will present my past research and plans for future work on how HRI development approaches can help mitigate these challenges. Approaches of interest include software or hardware interfaces and assistive algorithms made specifically for programming robots. I seek to answer how these development tools and techniques can support HRI developers in creating robust interaction designs by (1) filling in gaps in developer knowledge and expertise and (2) eliciting knowledge already possessed by developers and assisting with the integration of this knowledge into robot programs.
Bio: David Porfirio is a Ph.D. candidate at the University of Wisconsin–Madison. His interests lie in investigating and designing human-robot interaction development tools that make the process of programming social robots easy and approachable for experts and non-experts alike. David has received numerous fellowships and awards during his Ph.D., including the NSF Graduate Research Fellowship, the Microsoft Dissertation Grant, and a best paper award for his work on formally verifying social norms in human-robot interaction designs. Prior to his research at UW–Madison, David earned bachelor’s degrees in computer science and human physiology from the University of Arizona.
Automated AI: Aspirations and Perspirations
Date/Time: Monday, January 31, 4:10 p.m. - 5:00 p.m. in Barnard Hall 108
Speaker: Dr. Lars Kotthoff
Abstract: AI and machine learning are ubiquitous, but AI and ML experts are not. Arguably, at least some of the tasks those scarce experts are tackling do not make the best use of their skills and expertise — manually tweaking heuristics and hyperparameter settings is tedious but relatively straightforward. Automating these tasks allows the human experts to focus on the interesting and creative work. In this talk, I will outline the aspirational goal of automating large parts of AI that are currently painstakingly done by human experts, including engineering AI software. I will describe some of the progress that has been made to date, in particular in automated machine learning. The talk will conclude with a broader outlook on how the development of automated AI has positive impacts in other fields, using Materials Science as an example.
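As a minimal illustration of the automated tuning described above (a generic scikit-learn sketch, not the speaker's tooling), a random search can replace the tedious manual tweaking of hyperparameters:

```python
from scipy.stats import randint
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import RandomizedSearchCV

X, y = load_iris(return_X_y=True)

# Randomly sample hyperparameter settings and keep the best by
# cross-validated accuracy: a small-scale stand-in for AutoML systems.
search = RandomizedSearchCV(
    RandomForestClassifier(random_state=0),
    param_distributions={"n_estimators": randint(10, 200),
                         "max_depth": randint(2, 10)},
    n_iter=10, cv=3, random_state=0,
)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```

Full AutoML systems extend this loop with smarter search strategies (e.g. Bayesian optimization) and with selection over entire model pipelines, freeing the human expert for the creative work the abstract describes.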
Bio: Lars Kotthoff is an assistant professor at the University of Wyoming and held post-doctoral appointments at the University of British Columbia, Canada, University College Cork, Ireland, and the University of St Andrews, Scotland. His work in meta-algorithmics, automated machine learning, and applying AI to Materials Science has resulted in more than 80 publications with more than 3333 citations, supported by more than $3M in funding. He is one of the principal developers of the award-winning mlr machine learning software, widely used in academia and industry.
Seminars from 2021.