Coursework


Graduate Reinforcement Learning - Theory and Practice

This graduate-level course delves deeply into the principles and practical applications of Reinforcement Learning (RL), a crucial domain in modern artificial intelligence. It emphasizes both theoretical foundations and real-world implementations, focusing on 1) advanced RL algorithms and their mathematical underpinnings, and 2) practical techniques for training agents in complex environments.

Graduate Natural Language Processing

Graduate-level course focusing on the latest advancements in Natural Language Processing (NLP), particularly the revolutionary role of transformer models in various NLP tasks. This course explores the depths of linguistic structure and the role of deep learning in language modeling, including both theoretical underpinnings and practical applications. Key topics include the study and implementation of transformer architectures, text generation, sentiment analysis, and language understanding. Coursework involves critical analysis of current research, hands-on programming assignments using state-of-the-art technologies, and a significant final project that encourages original research in NLP. The course aims to provide a comprehensive understanding of the complexities and challenges in the evolving field of NLP.

Graduate Research in Computational Linguistics

This graduate-level course, led by Jessy Li, delves into diverse areas such as the belief systems of language models, methods for knowledge distillation and updating in AI, causal abstraction for model interpretation, the logical consistency of reasoning in AI, and approaches to automatic evaluation of large language models. It includes in-depth studies of topics like rapid language learning through Bayesian priors, challenges in propagating knowledge to language models, and evaluating the impacts of knowledge editing in computational models.

Graduate Topics in NLP

Graduate-level course. Focuses on learning objectives of LMs, specialization of LMs, symbolic distillation, specialization of LM pre-training, adapters and domain-specific modeling, human-LLM interaction, knowledge augmentation and editing, knowledge localization, task-level editing, and retrieval-based augmentation.

Graduate Distributed Systems

Graduate-level course at The University of Texas at Austin blending theoretical and practical aspects of distributed systems. Covering state machines, consensus algorithms, distributed storage systems, Byzantine failures, and Lamport clocks, the course delves into both fundamental and advanced topics in distributed computing. It also includes hands-on projects, such as implementing a distributed key-value store and conducting a research project, offering students a holistic understanding of distributed systems.
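As a small illustration of the Lamport clock rules covered in the course, here is a minimal Python sketch (my own example, not course code) of the three update rules: local event, send, and receive.

```python
from dataclasses import dataclass

@dataclass
class LamportClock:
    """Logical clock: if event a happened-before b, then clock(a) < clock(b)."""
    time: int = 0

    def tick(self) -> int:
        # Local event: advance the counter by one.
        self.time += 1
        return self.time

    def send(self) -> int:
        # Sending counts as a local event; the timestamp travels with the message.
        return self.tick()

    def receive(self, msg_time: int) -> int:
        # On receipt, jump strictly past both the local and the sender's timestamp.
        self.time = max(self.time, msg_time) + 1
        return self.time

# Two processes exchanging one message:
a, b = LamportClock(), LamportClock()
t = a.send()          # a.time is now 1
b.tick()              # b.time is now 1
stamp = b.receive(t)  # max(1, 1) + 1 == 2
```

The receive rule is what makes message delivery consistent with the happened-before ordering: the receiver's clock can never lag behind a message it has already seen.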

Graduate Robot Learning

Graduate-level course on modern machine learning and AI algorithms for autonomous robots. It covers advanced topics centered on 1) how robots perceive unstructured environments from raw sensory data, 2) how robots make decisions based on their perception, and 3) how robots actively and continuously learn and adapt in the physical world. Includes an analysis of cutting-edge research, presentations, and a final project that implements a decision transformer using robomimic and the MuJoCo physics engine.

Graduate Grounded Natural Language Processing

Grounded Natural Language Processing is a graduate-level course that delves into the intersection of natural language with perception and action in the world, connecting Natural Language Processing (NLP), computer vision, robotics, and computer graphics. The course structure includes reading and presenting research papers as well as an original research project. During the course, I focused on video diffusion models, particularly studying the paper "Imagen Video: High Definition Video Generation with Diffusion Models." My own research, titled "Learning Temporal Video-Language Grounding for Egocentric Videos," was a significant part of the course, contributing to the understanding of how language and video interact and can be grounded in real-world, first-person perspectives.

Graduate Natural Language Generation

Graduate-level course. Text generation is an exciting area of natural language processing in which modern approaches use large-scale pre-trained models for summarization, dialogue generation, text simplification, machine translation, creative writing, and other applications. However, it is not without its challenges: How do we create text that is both linguistically and factually correct? How do we understand model behavior when everything appears to be a black box? How do we evaluate the quality of generated text? Do models make ethical decisions, and if not, what can we do to change that? In this seminar, I analyzed state-of-the-art methods for subtopics in Natural Language Generation and worked on a novel zero-shot topic-controlled language model.

Graduate Algorithms and Complexity

Graduate-level course. This course is a rigorous exploration of advanced algorithms and computational complexity, widely regarded as one of the most challenging courses in the field. It covers a broad spectrum of topics, including divide-and-conquer strategies, linear-time selection, polynomial multiplication, Fast Fourier Transform (FFT), dynamic programming, greedy algorithms, matroids, amortized analysis, splay trees, augmenting data structures, Fibonacci heaps, maximum flow algorithms, linear programming, P and NP problems, NP-completeness, Cook-Levin theorem, approximate load balancing, weighted vertex cover, weighted set cover, linearity of expectation, Chernoff bounds, and hypercube routing.
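As a small example of the dynamic programming technique listed above (my own illustration, not course material), here is the classic bottom-up solution to the 0/1 knapsack problem:

```python
def knapsack(values, weights, capacity):
    """0/1 knapsack via bottom-up dynamic programming, O(n * capacity) time."""
    # dp[w] = best total value achievable with capacity w using items seen so far.
    dp = [0] * (capacity + 1)
    for value, weight in zip(values, weights):
        # Iterate capacities downward so each item is used at most once.
        for w in range(capacity, weight - 1, -1):
            dp[w] = max(dp[w], dp[w - weight] + value)
    return dp[capacity]

print(knapsack([60, 100, 120], [10, 20, 30], 50))  # 220 (take items 2 and 3)
```

The downward capacity loop is the key design choice: iterating upward would allow the same item to be counted multiple times, which solves the unbounded knapsack instead.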

Graduate Techniques of Machine Learning and Data Science

Graduate-level course covering statistical estimation and optimization algorithms, neural networks, the geometry of high-dimensional spaces, randomized methods, sparse approximation, and dimensionality reduction techniques.

Natural Language Processing

In two ways, this class is intended to be a survey of modern NLP. First, it discusses the most common applications of NLP techniques today, both in academia and industry, along with enough linguistics to contextualize these problems and understand their challenges. Second, it covers classifiers, sequence models, statistical parsers, neural network encoders, and encoder-decoder models in structured prediction and deep learning. We investigate the models themselves, examples of problems to which they are applied, inference methods, parameter estimation, and optimization. My final project used the HuggingFace Trainer to study dataset artifacts.


Neural Networks

This course covers both the theory and practice of deep learning, using hands-on implementations in PyTorch. It covers common applications of deep learning, including computer vision, sequence modeling, deep reinforcement learning, and generative modeling, as well as the history of neural networks and recent improvements in transformers and biological networks. In this class, I used PyTorch to fine-tune large-scale pre-trained computer vision models to predict the worth and origin of musical instruments.


Algorithms and Complexity

This course covers a variety of algorithmic techniques and optimal approaches to them. Graph algorithms, greedy algorithms, divide-and-conquer, dynamic programming, network flow, NP-completeness, undecidability, polynomial-time reduction, approximation algorithms, and randomized algorithms are all covered.


Machine Learning

In this course, we learned how to use machine learning algorithms to find patterns in large datasets. The course covered supervised, unsupervised, and reinforcement learning; we learned and applied regression, classification, clustering, anomaly detection, and association analysis. Work combined hand-written algorithms with Python (sklearn, pandas, numpy) in Jupyter notebooks, and included building decision tree classifiers, exploring sklearn, manipulating and analyzing data, creating neural networks and other classifiers, and a final project investigating an area of interest with ML.


Data Structures

In this course, we learned how to use and implement canonical data structures, including lists, iterators, stacks, queues, priority queues, trees, binary search trees, balanced binary search trees, sets, maps, hash tables, heaps, tries, and graphs. Testing, pre/post conditions, assertions, debugging, data abstraction, basic algorithm analysis, recursion, canonical sorting and searching algorithms, an introduction to object-oriented concepts such as encapsulation, inheritance, and polymorphism, dynamic programming, and functional programming were all covered.
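As a small illustration of one canonical structure from the list above (my own snippet, not course code), Python's heapq module provides a binary-heap-backed priority queue:

```python
import heapq

# Tuples compare element-by-element, so (priority, task) pairs make
# heapq pop the lowest-priority-number task first.
tasks = []
heapq.heappush(tasks, (2, "write report"))
heapq.heappush(tasks, (1, "fix bug"))
heapq.heappush(tasks, (3, "refactor"))

# Popping repeatedly yields tasks in ascending priority order.
order = [heapq.heappop(tasks)[1] for _ in range(len(tasks))]
print(order)  # ['fix bug', 'write report', 'refactor']
```

Under the hood this is an array-based binary min-heap, so push and pop are both O(log n), which is what makes it the standard backing structure for priority queues.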


Computer Architecture

At a low level of abstraction, I learned computer systems from the perspective of a programmer. I created a complex memory manager, a command interpreter, and a Y86-64 system emulator while learning to program in C. I learned how to represent and manipulate data (basic operations on binary encodings, primitive data types, pointers, data structures, floating-point numbers, and so on). The x86-64 instruction set architecture (ISA), processor architectures, pipelines, and implementations were thoroughly examined.
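As an illustration of the floating-point representation work mentioned above (my own sketch, not course material), this short Python snippet unpacks the IEEE-754 bit fields of a double:

```python
import struct

def float_bits(x: float) -> str:
    """Return the 64-bit IEEE-754 double-precision bit pattern of x."""
    # '>d' packs x as a big-endian 64-bit double; '>Q' reinterprets
    # those same bytes as an unsigned 64-bit integer.
    (bits,) = struct.unpack(">Q", struct.pack(">d", x))
    return f"{bits:064b}"

b = float_bits(1.0)
sign, exponent, mantissa = b[0], b[1:12], b[12:]
# 1.0 = (-1)^0 * 1.0 * 2^(1023 - 1023): the exponent field holds the bias 1023.
print(sign, int(exponent, 2), int(mantissa, 2))  # 0 1023 0
```

The pack/unpack round-trip is the Python analogue of the pointer-cast trick used in C to inspect a float's raw bits without converting its value.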


Operating Systems

I learned the intricacies of an operating system in this course, including how the operating system manages a computer's resources to improve the user experience. Virtual addressing, memory protection, concurrent programming, file systems, and scheduling are all central concepts. I implemented priority scheduling, stack argument passing, system calls for user programs, virtual memory, and support for multi-threading in the existing file system.


Software Engineering

This class covers Python, relational algebra, SQL, refactoring, and design patterns. We also study assertions, exceptions, testing, web development, and software engineering tools. I worked with a team of five to create a website containing information about congressional districts and to identify, quantify, and visualize gerrymandering. The project consolidated demographic and political data for each district and quantified gerrymandering based on the shape of each congressional district. Additionally, I created a blog documenting my progress over the semester, available at https://medium.com/@alex.chandler_70689.


Competitive Programming

The class was extremely beneficial in terms of interview preparation and provided an excellent introduction to the world of competitive programming. I made significant progress in answering difficult computer science problems similar to those found on LeetCode.


Linear Algebra

Linear algebra is a branch of mathematics that deals with linear equations and linear maps. I learned how to solve and analyze linear systems, determinants, eigenvalues, eigenvectors, functional analysis, and matrix factorization algorithms.


Probability and Statistics

The first section of the course covered the basics of probability theory: discrete and continuous random variables, limit theorems, multiple random variables, and the application of probability to counting problems. The second section covered statistical inference, including parameter estimation, hypothesis testing, and confidence intervals. Understanding the applications of Bayes' Theorem was my favorite part of the course.
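As a small worked example of Bayes' Theorem (illustrative numbers of my own, not from the course): suppose a disease has 1% prevalence, and a test has 99% sensitivity and a 5% false-positive rate. Exact rational arithmetic makes the computation transparent:

```python
from fractions import Fraction

p_d = Fraction(1, 100)            # P(disease), the prior
p_pos_given_d = Fraction(99, 100) # P(+ | disease), sensitivity
p_pos_given_h = Fraction(5, 100)  # P(+ | healthy), false-positive rate

# Law of total probability: P(+) = P(+|D)P(D) + P(+|H)P(H)
p_pos = p_pos_given_d * p_d + p_pos_given_h * (1 - p_d)

# Bayes' Theorem: P(D | +) = P(+|D) P(D) / P(+)
p_d_given_pos = p_pos_given_d * p_d / p_pos
print(p_d_given_pos, float(p_d_given_pos))  # 1/6 ≈ 0.167
```

Despite the test's 99% sensitivity, a positive result implies only a 1-in-6 chance of disease, because the 1% prior means false positives from the healthy majority dominate.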


Longhorn Startup Seminar

In this seminar, I worked on improving my consulting business. I became really good at giving elevator pitches to large audiences, and got helpful feedback for how to increase revenue with my main client.


Other Coursework

Sequential/Series/Multivariate Calculus, Discrete Mathematics, Engineering Physics, Differential Calculus, Macroeconomics, Microeconomics, Chem 1, Chem 2, Advanced German Grammar, Advanced German Grammar and Composition, and Historical Backgrounds of German Civilization.