Three Short Talks from Prof. Daniel Weller's Group
07 Oct 2019
- Date: Friday October 11, 2019
- Time: 12:00PM
- Location: Rice 242
Title: Optimizing regularization parameters of image processing algorithms through machine learning
Speaker: Tanjin Taher Toma
Abstract: In image and video processing problems (e.g., enhancement, reconstruction, segmentation), algorithms often have regularization parameters that must be set appropriately to obtain good results. Existing automatic parameter selection techniques are mostly iterative and often rely on a predetermined image metric (such as image quality or a risk estimate) to estimate the parameter value. In this talk, we discuss our convolutional neural network approach for direct parameter estimation, which demonstrates improved effectiveness over existing methods.
Title: Myocardial T1 mapping with convolutional neural networks
Speaker: Haris Jeelani
Abstract: The longitudinal relaxation time (T1) of the hydrogen protons in the heart wall can be used as an indicator of a variety of pathological conditions. Traditionally, T1 maps are obtained by pixel-wise nonlinear model fitting, which is sensitive to noise. As discussed in my previous AIML talk, to increase noise robustness we have been using a convolutional neural network framework (DeepT1). In this talk, I will discuss the updates we made to our DeepT1 framework. The updated model combines a recurrent model and a U-net model to improve the performance of T1-map estimation. This is joint work with Dr. Michael Salerno and Dr. Christopher Kramer (Cardiology) and Dr. Yang Yang (now at the Mount Sinai School of Medicine).
Title: Examining Working Memory Representations for Neural Networks Trained to Play Games
Speaker: Tyler Spears (supervised by Per Sederberg, Psychology)
Abstract: The current success of deep learning is owed, in no small part, to the field’s roots in cognitive neuroscience. In this work, we examine the properties of several human-based models of working memory (WM) and analyze their computational utility when combined with deep neural networks. We then put forth the Scale-Invariant Temporal History (SITH) model, an applied variant of a WM model recently proposed in the cognitive neuroscience literature. Finally, we discuss future applications of SITH in artificial intelligence, as well as the future of neurally inspired machine learning methods.