AMATH 563 B: Inferring Structure of Complex Systems

Spring 2022
Meeting:
MWF 11:30am - 12:20pm / LOW 202
SLN:
21304
Section Type:
Lecture
Joint Sections:
AMATH 563 A
Instructor:
FOR ACM CAMPUS STUDENTS ONLY
Syllabus Description (from Canvas):

Description

The course AMATH 563 Inferring Structure of Complex Systems aims to provide fundamental skills, concepts, and applications of deep learning and neural networks for the investigation of complex data sets and systems.
We will survey the fundamentals of Artificial Neural Networks (ANNs) and describe the underlying principles that make neural networks generic computing frameworks. We will then build computational skills for training neural networks, understanding and working with techniques such as stochastic gradient descent, Adam, dropout, and initialization, and with different types of ANNs, such as convolutional networks, RNNs, LSTMs, and GANs. We will also cover topics in data embeddings, autoencoders, and latent data representations.
The course will conclude with projects that develop ANN systems providing efficient solutions for a variety of applications and data sets.

Classroom Format

The format of instruction will be divided between lectures (theoretical concepts) and labs (practical aspects). Below is a snapshot of the topics we will cover, organized as a weekly schedule. NOTE: This is an approximate list of the topics to be covered in the course. Particular examples, topics, and their durations are subject to change to provide the best learning experience.

Week 1
Theory: Class Introduction: Brain Networks and Neural Networks Fundamentals
  • Examples of Neural Networks
  • Neural Networks in the Brain (motivation)
  • Machine Learning Basics
  • Neural Networks Basics
Practice: Programming Setup
  • Python Platforms for DL
  • Introduction to NumPy
  • Plotting with Matplotlib
  • Preparing Data for ML
Exercise: Implement an XOR gate with a neural network
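
For orientation, here is a minimal NumPy sketch of this exercise: a two-layer network with hand-chosen (not trained) weights that computes XOR. The specific weights and thresholds are illustrative choices, not course-provided values.

    import numpy as np

    def step(z):
        # Heaviside step activation: outputs 1 where the input is positive.
        return (z > 0).astype(float)

    # Hidden unit 1 computes OR(x1, x2); hidden unit 2 computes AND(x1, x2);
    # the output unit fires for OR-but-not-AND, which is exactly XOR.
    W1 = np.array([[1.0, 1.0],
                   [1.0, 1.0]])
    b1 = np.array([-0.5, -1.5])
    W2 = np.array([1.0, -2.0])
    b2 = -0.5

    def xor(x):
        h = step(W1 @ x + b1)        # hidden layer: [OR, AND]
        return step(W2 @ h + b2)     # output: OR and not AND

    for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
        print(x, "->", xor(np.array(x, dtype=float)))

In a trained network the same weights would instead be found by gradient descent, which is the subject of week 2.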

Week 2
Theory: Learning and Optimization: Training Neural Networks
  • Loss Functions
  • Training/Validation/Testing
  • Gradient Descent
  • Stochastic Gradient Descent
  • Adam
Practice: PyTorch Basics
  • Neural Net Training Workflow
  • PyTorch Data Types
  • Graph Computation
Exercise: MNIST Classification in PyTorch
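
A minimal sketch of this exercise, assuming PyTorch and torchvision are installed; the architecture and hyperparameters are illustrative choices, not a course reference solution.

    import torch
    from torch import nn
    from torch.utils.data import DataLoader
    from torchvision import datasets, transforms

    # Small fully connected classifier: flatten 28x28 images to 784 values.
    model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 128),
                          nn.ReLU(), nn.Linear(128, 10))
    opt = torch.optim.SGD(model.parameters(), lr=0.1)
    loss_fn = nn.CrossEntropyLoss()

    train = datasets.MNIST("data", train=True, download=True,
                           transform=transforms.ToTensor())
    loader = DataLoader(train, batch_size=64, shuffle=True)

    for epoch in range(3):                  # a few passes over the data
        for x, y in loader:
            opt.zero_grad()                 # reset accumulated gradients
            loss = loss_fn(model(x), y)     # forward pass and loss
            loss.backward()                 # backpropagation
            opt.step()                      # stochastic gradient descent update
        print(f"epoch {epoch}: last batch loss {loss.item():.3f}")

Replacing torch.optim.SGD with torch.optim.Adam is a one-line change and connects directly to the Adam lecture.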

Week 3
Theory: Deep Learning Practices: Topics in Constructing and Training Neural Networks
  • Operators
  • Dropout
  • Initialization
  • Normalization
  • Project Cycle with Deep Learning Methodology
  • Introduction to CNNs
Practice: PyTorch Operators and Training Procedures
  • PyTorch Operators
  • Designing Training Procedures
Exercise: Fashion-MNIST Classification with Advanced Optimization
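
An illustrative sketch of how two of the week's topics, dropout and explicit initialization, look in PyTorch; all sizes and rates are example choices.

    import torch
    from torch import nn

    model = nn.Sequential(
        nn.Flatten(),
        nn.Linear(28 * 28, 256),
        nn.ReLU(),
        nn.Dropout(p=0.5),           # randomly zero half the activations during training
        nn.Linear(256, 10),
    )

    def init_weights(m):
        # Kaiming (He) initialization, suited to ReLU activations.
        if isinstance(m, nn.Linear):
            nn.init.kaiming_normal_(m.weight, nonlinearity="relu")
            nn.init.zeros_(m.bias)

    model.apply(init_weights)        # recursively applies to every submodule
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)

Swapping datasets.MNIST for datasets.FashionMNIST in the week 2 loader turns that training loop into this week's Fashion-MNIST exercise.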

Week 4
Theory: Convolutional Neural Networks
  • Motivation (Neuroscience)
  • Convolutional Layers
  • Additional Layers
  • Residual Nets
  • Examples
  • DL Projects and Data Practices
Practice: Introduction to CNNs
  • Introduction to CNNs
  • Image Databases for ML
  • Applications of CNNs
Exercise: MNIST Classification with a CNN
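
A minimal sketch of a CNN of the kind this exercise calls for; the filter counts and kernel sizes are example choices, with the shape of each intermediate tensor noted in comments.

    import torch
    from torch import nn

    cnn = nn.Sequential(
        nn.Conv2d(1, 16, kernel_size=3, padding=1),   # 1x28x28 -> 16x28x28
        nn.ReLU(),
        nn.MaxPool2d(2),                              # -> 16x14x14
        nn.Conv2d(16, 32, kernel_size=3, padding=1),  # -> 32x14x14
        nn.ReLU(),
        nn.MaxPool2d(2),                              # -> 32x7x7
        nn.Flatten(),
        nn.Linear(32 * 7 * 7, 10),                    # 10 class logits
    )
    print(cnn(torch.zeros(1, 1, 28, 28)).shape)       # torch.Size([1, 10])

Dropping this model into the week 2 training loop completes the exercise.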

Week 5
Theory: Architectures and Practices in Convolutional Neural Networks
  • eScience Introduction
Practice: Further Practice with CNNs
  • CNN Architectures
  • Image Segmentation Example
Exercise: Image Classification Using AlexNet
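
A sketch of loading a pretrained AlexNet through torchvision, assuming a recent torchvision with the weights= API; the random tensor below stands in for a real image that has been resized to 224x224 and normalized with the ImageNet statistics.

    import torch
    from torchvision import models

    net = models.alexnet(weights=models.AlexNet_Weights.DEFAULT)
    net.eval()                          # inference mode: fixes dropout behavior

    x = torch.randn(1, 3, 224, 224)     # stand-in for a preprocessed image
    with torch.no_grad():               # no gradients needed for inference
        logits = net(x)
    print(logits.argmax(dim=1))         # index of the predicted ImageNet class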

Week 6
Theory: Recurrent Neural Networks
  • Motivation (Neuroscience)
  • Sequential Processing
  • Stability
  • Gated Nets (LSTM, GRU)
  • Examples
Milestone: Project Proposals
Practice: Intro to RNNs
  • Sequential Data
  • Introduction to RNNs
  • RNN Implementation
  • RNN Challenges
Exercise: Generate a sinusoidal wave with an RNN
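
A minimal sketch of this exercise: a small RNN trained to predict the next sample of a sine wave from the current one. The sizes, learning rate, and step count are illustrative choices.

    import math
    import torch
    from torch import nn

    t = torch.linspace(0, 8 * math.pi, 400)
    wave = torch.sin(t)
    x = wave[:-1].reshape(1, -1, 1)     # (batch, time, features)
    y = wave[1:].reshape(1, -1, 1)      # target: the wave one step ahead

    rnn = nn.RNN(input_size=1, hidden_size=32, batch_first=True)
    head = nn.Linear(32, 1)             # map hidden state -> next sample
    opt = torch.optim.Adam(list(rnn.parameters()) + list(head.parameters()),
                           lr=1e-2)

    for step in range(200):
        out, _ = rnn(x)                 # hidden states at every time step
        loss = nn.functional.mse_loss(head(out), y)
        opt.zero_grad()
        loss.backward()
        opt.step()
    print(f"final MSE: {loss.item():.4f}")

Once trained, the network can generate a wave by feeding its own predictions back in as inputs.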

Week 7
Theory: Architectures and Applications of Recurrent Neural Networks
  • Natural Language Processing Applications
    • Word Embeddings
    • Sentiment Analysis
  • Multivariate Timeseries and Sequence Analysis
    • Prediction
    • Reconstruction
    • Translation
Milestone: Project Pitches
Practice: Advanced RNNs
  • Gated RNN Architectures
  • Multi-layer RNNs
  • Applications of Different RNNs
Exercise: Predict stock prices with an RNN
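
A skeleton of the kind of gated, multi-layer predictor used for price series; the feature count, window length, and layer sizes are placeholders, and data loading is omitted.

    import torch
    from torch import nn

    class SeqPredictor(nn.Module):
        def __init__(self, n_features=5, hidden=64, layers=2):
            super().__init__()
            self.lstm = nn.LSTM(n_features, hidden, num_layers=layers,
                                batch_first=True)
            self.head = nn.Linear(hidden, 1)

        def forward(self, x):              # x: (batch, time, n_features)
            out, _ = self.lstm(x)
            return self.head(out[:, -1])   # predict from the final time step

    model = SeqPredictor()
    x = torch.randn(8, 30, 5)              # 8 windows of 30 days, 5 features each
    print(model(x).shape)                  # torch.Size([8, 1])

Training proceeds as in the earlier exercises, with a mean-squared-error loss against the next day's price.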


Week 8
Theory: Sequential Data Decomposition and Interpretation
  • Embeddings:
    • Lecture 1: PCA (SVD), DMD, POD, Time-Delay Embeddings
    • Lecture 2: Manifold Approximation/Visualization: ISOMAP, t-SNE, UMAP, Force-Directed Graphs
  • Clustering (k-means, k-NN)
Practice: Classical Embeddings/Decompositions
  • POD, SVD, PCA, KLD
  • Dynamic Mode Decomposition (DMD)
  • k-means Clustering and k-NN Classification
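
A minimal NumPy sketch connecting two of the week's topics: PCA computed via the SVD, followed by bare-bones k-means (Lloyd's algorithm) on the projected coordinates. The toy data and all sizes are illustrative, and a robust implementation would also handle empty clusters.

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))            # toy data: 200 samples, 10 features
    Xc = X - X.mean(axis=0)                   # center before taking the SVD
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    Z = Xc @ Vt[:2].T                         # project onto 2 principal components

    k = 3
    centers = Z[rng.choice(len(Z), k, replace=False)]   # initialize from the data
    for _ in range(20):
        # Assign each point to its nearest center, then recompute the centers.
        labels = np.argmin(((Z[:, None] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([Z[labels == j].mean(axis=0) for j in range(k)])
    print(np.bincount(labels))                # cluster sizes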

Week 9
Theory: AutoEncoders and Latent Spaces
  • Structure Inference:
    • Model Inference through Optimization; Probabilistic Graphical Models
  • AutoEncoders, AutoDecoders, Sequence-to-Sequence Learning, Latent Space Representation
Practice: AutoEncoders for Multivariate Timeseries
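
A minimal autoencoder sketch: compress samples to a low-dimensional latent space and reconstruct them. The random data stands in for windows of a multivariate timeseries, and every dimension is an example choice.

    import torch
    from torch import nn

    encoder = nn.Sequential(nn.Linear(20, 8), nn.ReLU(), nn.Linear(8, 2))
    decoder = nn.Sequential(nn.Linear(2, 8), nn.ReLU(), nn.Linear(8, 20))
    opt = torch.optim.Adam([*encoder.parameters(), *decoder.parameters()],
                           lr=1e-3)

    X = torch.randn(256, 20)                 # stand-in for timeseries windows
    for step in range(500):
        z = encoder(X)                       # 2-dimensional latent codes
        X_hat = decoder(z)                   # reconstruction from the codes
        loss = nn.functional.mse_loss(X_hat, X)
        opt.zero_grad()
        loss.backward()
        opt.step()
    print(f"reconstruction MSE: {loss.item():.4f}")

The latent codes z are the latent space representation of the lecture: a compressed coordinate system learned from the data itself.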

Week 10
Theory: Adversarial Approaches to ANNs / Generative Adversarial Networks
  • Adversaries
  • Generator-Discriminator
  • Training Process
  • Stability
  • Unsupervised Learning
Practice: GANs / Style Transfer
  • Training GANs
  • Style Transfer
  • Examples of Image Generation
Milestone: Public Project Presentations: Poster + Demo
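
A skeleton of the alternating GAN training step on toy 2-D data: the generator G maps noise to samples, the discriminator D scores real versus fake, and the two are updated in turn. Every size and rate is an illustrative choice.

    import torch
    from torch import nn

    G = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 2))
    D = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 1))
    opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
    opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
    bce = nn.BCEWithLogitsLoss()

    real = torch.randn(64, 2) + 3.0          # toy "real" distribution
    for step in range(200):
        fake = G(torch.randn(64, 8))
        # Discriminator step: push real toward 1, fake toward 0.
        # fake.detach() keeps this update from touching the generator.
        d_loss = (bce(D(real), torch.ones(64, 1)) +
                  bce(D(fake.detach()), torch.zeros(64, 1)))
        opt_d.zero_grad()
        d_loss.backward()
        opt_d.step()
        # Generator step: try to make D score the fakes as real.
        g_loss = bce(D(fake), torch.ones(64, 1))
        opt_g.zero_grad()
        g_loss.backward()
        opt_g.step()
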
Catalog Description:
Introduces fundamental concepts of network science and graph theory for complex dynamical systems. Merges concepts from model selection, information theory, statistical inference, neural networks, deep learning, and machine learning for building reduced order models of dynamical systems using sparse sampling of high-dimensional data. Prerequisite: AMATH 561 and AMATH 562, or instructor permission. Offered: Sp.
Other Requirements Met:
Honors Course
Credits:
5.0
Status:
Active
Last updated:
April 10, 2024 - 10:36 pm