AMATH 563 B: Inferring Structure of Complex Systems

Spring 2021
Meeting:
MWF 11:30 am - 12:20 pm (remote)
SLN:
10242
Section Type:
Lecture
Joint Sections:
AMATH 563 A
Notes:
For EDGE online students only. CFRM and AMATH students register in Registration Period 1; all others need an add code for Period 2 and beyond. Contact the AMATH advisor, Lauren Lederer, at gpa@amath.washington.edu for an add code, and include your student ID number. Offered via remote learning.
Syllabus Description (from Canvas):

Description

This graduate course, Introduction to Deep Learning Applications and Theory, provides the fundamental skills, concepts, and applications of deep learning and neural networks for investigating complex data sets and systems.
We will survey the fundamentals of artificial neural networks (ANNs) and describe the underlying principles that make neural networks generic computing frameworks. We will then build computational skills for training neural networks, understanding and working with algorithms such as stochastic gradient descent, Adam, dropout, and initialization schemes, and with different types of ANNs, such as convolutional networks, RNNs, LSTMs, and GANs.
We will then introduce deep learning methodology for reinforcement learning and cover practices in hyperparameter optimization and network pruning.
The course will conclude with projects that develop ANN systems to provide efficient solutions for a variety of applications and data sets.

Classroom Format

The format of instruction will be divided between lectures (theoretical concepts) and labs (practical aspects). Below is a snapshot of the topics we will cover, organized in a weekly schedule. NOTE: This is an approximate list of topics; particular examples, topics, and their durations are subject to change to provide the best learning experience.

Week 1

Theory: Class Introduction: Brain Networks and Neural Network Fundamentals

  • Examples of neural networks
  • Neural networks in the brain (motivation)
  • Machine learning basics
  • Neural network basics

Practice: Programming Setup

  • Python platforms for DL
  • Introduction to NumPy
  • Plotting with Matplotlib
  • Preparing data for ML

Exercise: Implement an XOR gate with a neural network
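
The syllabus does not spell out a reference solution for this exercise. As a rough illustration only, here is a minimal NumPy sketch of a two-layer sigmoid network trained on XOR with full-batch gradient descent; the hidden width, seed, and learning rate are arbitrary choices, not the course's prescribed setup.

    # Minimal sketch (not the course's reference solution): a 2-4-1 sigmoid
    # network trained on XOR with full-batch gradient descent in NumPy.
    import numpy as np

    rng = np.random.default_rng(0)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)   # hidden layer
    W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)   # output layer
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

    for step in range(10_000):
        h = sigmoid(X @ W1 + b1)            # forward pass
        p = sigmoid(h @ W2 + b2)
        dp = (p - y) * p * (1 - p)          # backprop of squared loss
        dh = (dp @ W2.T) * h * (1 - h)
        W2 -= h.T @ dp                      # gradient step, learning rate 1.0
        b2 -= dp.sum(axis=0)
        W1 -= X.T @ dh
        b1 -= dh.sum(axis=0)

    print(p.round().ravel())  # should approach [0, 1, 1, 0]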

Week 2

Theory: Learning and Optimization: Training Neural Networks

  • Loss functions
  • Training/validation/testing
  • Gradient descent
  • Stochastic gradient descent
  • Adam

Practice: PyTorch Basics

  • Neural net training workflow
  • PyTorch data types
  • Graph computation

Exercise: MNIST classification in PyTorch
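
A hedged sketch of what this lab might look like, assuming torchvision is installed and MNIST can be downloaded to ./data: a one-hidden-layer classifier trained with SGD. Swapping in torch.optim.Adam is a natural way to compare the optimizers from the theory lecture.

    # A small MLP on MNIST; layer sizes and hyperparameters are illustrative.
    import torch
    from torch import nn
    from torch.utils.data import DataLoader
    from torchvision import datasets, transforms

    train_set = datasets.MNIST("./data", train=True, download=True,
                               transform=transforms.ToTensor())
    loader = DataLoader(train_set, batch_size=64, shuffle=True)

    model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 128),
                          nn.ReLU(), nn.Linear(128, 10))
    opt = torch.optim.SGD(model.parameters(), lr=0.1)
    loss_fn = nn.CrossEntropyLoss()

    for epoch in range(2):
        for images, labels in loader:
            opt.zero_grad()                      # reset accumulated gradients
            loss = loss_fn(model(images), labels)
            loss.backward()                      # autograd computes gradients
            opt.step()                           # gradient-descent update
        print(f"epoch {epoch}: last-batch loss {loss.item():.3f}")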

Week 3

Theory: Deep Learning Practices: Topics in Constructing and Training Neural Networks

  • Operators
  • Dropout
  • Initialization
  • Normalization
  • Project cycle with deep learning methodology
  • Introduction to CNNs

Practice: CNNs and Operators

  • PyTorch operators
  • Designing training procedures
  • Introduction to CNNs

Exercise: MNIST classification with a CNN
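
One way the week's topics (dropout, normalization, initialization, CNNs) could come together in the MNIST exercise; the layer sizes below are illustrative, not prescribed by the syllabus.

    # A small MNIST CNN combining batch normalization, dropout, and
    # explicit Kaiming initialization of the convolutional weights.
    import torch
    from torch import nn

    model = nn.Sequential(
        nn.Conv2d(1, 16, kernel_size=3, padding=1),  # 28x28 -> 28x28
        nn.BatchNorm2d(16),                          # normalization
        nn.ReLU(),
        nn.MaxPool2d(2),                             # 28x28 -> 14x14
        nn.Conv2d(16, 32, kernel_size=3, padding=1),
        nn.ReLU(),
        nn.MaxPool2d(2),                             # 14x14 -> 7x7
        nn.Flatten(),
        nn.Dropout(p=0.5),                           # dropout regularization
        nn.Linear(32 * 7 * 7, 10),
    )

    for m in model.modules():
        if isinstance(m, nn.Conv2d):
            nn.init.kaiming_normal_(m.weight, nonlinearity="relu")

    print(model(torch.randn(8, 1, 28, 28)).shape)    # torch.Size([8, 10])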

Week 4

Theory: Convolutional Neural Networks

  • Motivation (neuroscience)
  • Convolutional layers
  • Additional layers
  • Residual networks
  • Examples

Practice: Advanced CNNs

  • Image databases for ML
  • Applications of CNNs
  • CNN architectures
  • Image segmentation example

Exercise: Image classification using AlexNet
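
A sketch of the exercise using torchvision's pretrained AlexNet. The weights= API assumes torchvision 0.13 or newer (older releases use pretrained=True), and example.jpg is a hypothetical input path.

    # Classify one image with ImageNet-pretrained AlexNet.
    import torch
    from torchvision import models, transforms
    from PIL import Image

    preprocess = transforms.Compose([
        transforms.Resize(256), transforms.CenterCrop(224),
        transforms.ToTensor(),
        transforms.Normalize(mean=[0.485, 0.456, 0.406],   # ImageNet stats
                             std=[0.229, 0.224, 0.225]),
    ])

    model = models.alexnet(weights=models.AlexNet_Weights.DEFAULT).eval()
    image = preprocess(Image.open("example.jpg")).unsqueeze(0)  # batch dim
    with torch.no_grad():
        logits = model(image)
    print(logits.argmax(dim=1))  # index of the predicted ImageNet class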

Week 5

Theory: Architectures and Practices in Convolutional Neural Networks

Practice: Project Pitches

Week 6

Theory: Recurrent Neural Networks

  • Motivation (neuroscience)
  • Sequential processing
  • Stability
  • Gated networks (LSTM, GRU)
  • Examples

Practice: Intro to RNNs

  • Sequential data
  • Introduction to RNNs
  • RNN implementation
  • RNN challenges

Exercise: Generate a sinusoidal wave with an RNN
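
As an illustration of this exercise, a plain nn.RNN trained to predict the next sample of a sine wave; the sequence length, hidden size, and optimizer settings are arbitrary choices rather than the course's reference values.

    # One-step-ahead prediction of a sine wave with a vanilla RNN.
    import torch
    from torch import nn

    t = torch.linspace(0, 8 * torch.pi, 400)
    wave = torch.sin(t)
    x = wave[:-1].reshape(1, -1, 1)   # (batch, time, features)
    y = wave[1:].reshape(1, -1, 1)    # targets shifted by one step

    rnn = nn.RNN(input_size=1, hidden_size=32, batch_first=True)
    head = nn.Linear(32, 1)
    opt = torch.optim.Adam(list(rnn.parameters()) + list(head.parameters()),
                           lr=0.01)

    for step in range(300):
        out, _ = rnn(x)               # out: (1, T, 32) hidden states
        loss = nn.functional.mse_loss(head(out), y)
        opt.zero_grad()
        loss.backward()
        opt.step()

    print(f"final MSE: {loss.item():.5f}")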

Week 7

Theory: Architectures and Applications of Recurrent Neural Networks

  • Natural language processing applications
    • Word embeddings
    • Sentiment analysis
  • Multivariate time series and sequence analysis
    • Prediction
    • Reconstruction
    • Translation

Practice: Advanced RNNs

  • Gated RNN architectures
  • Multi-layer RNNs
  • Applications of different RNNs

Exercise: Predict stock prices with an RNN
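
This exercise needs market data the syllabus does not bundle, so the hedged sketch below substitutes a synthetic random walk for the price series; it shows only the gated-RNN (LSTM) workflow, not a real forecasting pipeline.

    # One-step prediction on a synthetic "price" path (no real market data).
    import torch
    from torch import nn

    torch.manual_seed(0)
    series = torch.randn(500).cumsum(0)               # synthetic random walk
    series = (series - series.mean()) / series.std()  # normalize
    x = series[:-1].reshape(1, -1, 1)
    y = series[1:].reshape(1, -1, 1)

    lstm = nn.LSTM(input_size=1, hidden_size=64, batch_first=True)
    head = nn.Linear(64, 1)
    opt = torch.optim.Adam(list(lstm.parameters()) + list(head.parameters()),
                           lr=0.005)

    for step in range(200):
        out, _ = lstm(x)
        loss = nn.functional.mse_loss(head(out), y)
        opt.zero_grad()
        loss.backward()
        opt.step()

    print(f"one-step MSE: {loss.item():.4f}")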

Week 8

Theory: Adversarial Approaches to ANNs / Generative Adversarial Networks (GANs)

  • Adversaries
  • Generator-discriminator
  • Training process
  • Stability
  • Unsupervised learning

Practice: GANs / Style Transfer

  • Training GANs
  • Style transfer
  • Examples of image generation
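
A toy sketch of the generator-discriminator training loop on a 1-D Gaussian target, just to show the alternating updates; real image GANs use convolutional networks and much more careful training than this.

    # Generator learns to mimic samples from N(2.0, 0.5^2); illustrative only.
    import torch
    from torch import nn

    G = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 1))
    D = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1))
    opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
    opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
    bce = nn.BCEWithLogitsLoss()

    for step in range(2000):
        real = torch.randn(64, 1) * 0.5 + 2.0          # target distribution
        fake = G(torch.randn(64, 8))

        # Discriminator step: push real toward 1, fake toward 0.
        d_loss = (bce(D(real), torch.ones(64, 1))
                  + bce(D(fake.detach()), torch.zeros(64, 1)))
        opt_d.zero_grad()
        d_loss.backward()
        opt_d.step()

        # Generator step: fool the discriminator into predicting 1.
        g_loss = bce(D(fake), torch.ones(64, 1))
        opt_g.zero_grad()
        g_loss.backward()
        opt_g.step()

    print(G(torch.randn(1000, 8)).mean().item())  # should drift toward ~2.0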

Week 9

Theory: Advanced Topics: Sequential Data Decomposition and Interpretation

  • Lectures 1-2: Embeddings
    • Lecture 1: PCA (SVD), DMD, POD, time-delay embeddings
    • Lecture 2: Manifold approximation/visualization: Isomap, t-SNE, UMAP, force-directed graphs
  • Lecture 3: Clustering (k-means, k-NN)

Practice: Classical Embeddings/Decompositions

  • POD, SVD, PCA, KLD
  • Dynamic mode decomposition (DMD)
  • k-means clustering and k-NN classification
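
A compact sketch of two of this week's classical tools on synthetic data: PCA computed via the SVD of the centered data matrix, followed by k-means on the leading principal scores. Using scikit-learn's KMeans is an assumption here; the course labs may implement the algorithm directly.

    # PCA via SVD, then k-means on the 2-D scores (synthetic two-cluster data).
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (100, 5)),
                   rng.normal(4, 1, (100, 5))])        # two synthetic clusters

    Xc = X - X.mean(axis=0)                            # center the data
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)  # PCA = SVD of centered X
    scores = Xc @ Vt[:2].T                             # project onto 2 PCs
    print("variance captured:", (S[:2] ** 2).sum() / (S ** 2).sum())

    labels = KMeans(n_clusters=2, n_init=10).fit_predict(scores)
    print(np.bincount(labels))                         # roughly [100, 100]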

Week 10

Theory: Advanced Topics: Autoencoders and Latent Spaces

  • Lecture 4: Structure inference
    • Model inference through optimization, probabilistic graphical models
  • Lecture 5: Autoencoders, autodecoders, sequence-to-sequence learning, latent space representation

Practice: Autoencoders for Multivariate Time Series

  • Public project presentations: poster + demo
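
A minimal sketch of the autoencoder/latent-space idea, assuming flattened 28x28 inputs such as MNIST digits; the two-dimensional latent space is chosen only to make the representation easy to visualize, not mandated by the course.

    # Encoder compresses to a 2-D latent code; decoder reconstructs the input.
    import torch
    from torch import nn

    class AutoEncoder(nn.Module):
        def __init__(self, latent_dim=2):
            super().__init__()
            self.encoder = nn.Sequential(nn.Linear(784, 128), nn.ReLU(),
                                         nn.Linear(128, latent_dim))
            self.decoder = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(),
                                         nn.Linear(128, 784))

        def forward(self, x):
            z = self.encoder(x)          # compress to the latent space
            return self.decoder(z), z    # reconstruction and latent code

    model = AutoEncoder()
    x = torch.rand(16, 784)              # stand-in batch of flattened images
    recon, z = model(x)
    loss = nn.functional.mse_loss(recon, x)   # reconstruction objective
    print(recon.shape, z.shape)  # torch.Size([16, 784]) torch.Size([16, 2])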
Catalog Description:
Introduces fundamental concepts of network science and graph theory for complex dynamical systems. Merges concepts from model selection, information theory, statistical inference, neural networks, deep learning, and machine learning for building reduced-order models of dynamical systems using sparse sampling of high-dimensional data. Prerequisite: AMATH 561 and AMATH 562, or instructor permission. Offered: Sp.
Credits:
5.0
Status:
Active
Last updated:
July 24, 2024 - 12:30 pm