My master's degree at Chalmers University of Technology is in "Engineering Mathematics and Computational Science", which, as the title suggests, is fairly broad. The programme therefore offers different profiles, and I focused on courses in machine learning and optimization. I wrote my thesis for the Boeing subsidiary Jeppesen on improving optimization algorithms with deep learning. In spring 2022 I went on exchange to Hong Kong University of Science and Technology, ranked among the top three young universities in the world, where I focused mostly on deep learning.
On this page I give a brief overview of what I've done in my master's degree. Here is a list of the courses I took:
At Chalmers University of Technology:
Linear and Integer Optimization with Applications
Statistical Learning for Big Data
Algorithms
Statistical Inference
Large Scale Optimization
Linear Statistical Models
Nonlinear Optimization
High Performance Computing
At Hong Kong University of Science and Technology:
Big Data Mining
Machine Learning
Advanced Deep Learning Architectures
Data-Driven Portfolio Optimization
Master's Thesis:
The thesis investigated training a deep learning model to predict a degree of compatibility between crew members and tasks, which could be used in the Crew Rostering Problem to increase computational speed. The idea is to reduce the network size of the subproblem known as the pricing problem in column generation, which is frequently modelled as a resource-constrained shortest path problem. Training a deep learning model able to make accurate enough predictions proved very difficult with the techniques and data experimented with. The thesis therefore concluded that further research on this concept is needed in two main directions: feature extraction and model techniques.
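As a toy illustration of the idea (my own sketch, not the thesis code), the snippet below uses hypothetical per-arc compatibility scores from a model to prune a tiny pricing-style network before solving a shortest path on it; every name, score, and cost here is made up:

```python
import heapq

def shortest_path(graph, source, target):
    """Dijkstra's algorithm on a dict-of-dicts graph with non-negative arc costs."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == target:
            return d
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        for nbr, cost in graph.get(node, {}).items():
            nd = d + cost
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                heapq.heappush(heap, (nd, nbr))
    return float("inf")

def prune_network(graph, score, threshold):
    """Drop arcs whose predicted compatibility score falls below the threshold."""
    return {u: {v: c for v, c in nbrs.items() if score[(u, v)] >= threshold}
            for u, nbrs in graph.items()}

# Tiny made-up pricing-style network and made-up model scores per arc.
graph = {"s": {"a": 1.0, "b": 4.0}, "a": {"t": 5.0}, "b": {"t": 1.0}}
score = {("s", "a"): 0.2, ("s", "b"): 0.9, ("a", "t"): 0.8, ("b", "t"): 0.95}

pruned = prune_network(graph, score, threshold=0.5)  # removes the s->a arc
print(shortest_path(pruned, "s", "t"))  # 5.0 via s->b->t, same optimum on a smaller network
```

In the toy case the pruned network happens to keep the optimal path, which is exactly the property an accurate compatibility model would need to have.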
Read the full thesis here: https://odr.chalmers.se/handle/20.500.12380/304493
From course description: "This course focuses on advanced deep learning architectures and their applications in various areas. Specifically, the topics include various deep neural network architectures with applications in computer vision, signal processing, graph analysis, and natural language processing. Different state-of-the-art neural network models will be introduced, including graph neural networks, normalizing flows, point cloud models, sparse convolutions, and neural architecture search. The students have the opportunities to implement deep learning models for some AI related tasks such as visual perception, image processing and generation, graph processing, speech enhancement, sentiment classification, and novel view synthesis. "
Course by: Qifeng Chen (https://cqf.io/, https://scholar.google.com/citations?user=lLMX9hcAAAAJ&hl=en)
From course description: "This course covers core and recent machine learning algorithms. Topics include supervised learning algorithms (linear and logistic regression, generative models for classification, learning theory), deep learning algorithms (feedforward neural networks, convolutional neural networks, recurrent neural networks, adversarial attacks), unsupervised learning algorithms (variational autoencoders, generative adversarial networks, mixture models), and reinforcement learning (classic RL, deep RL). "
Course by: Nevin L. Zhang (https://www.cse.ust.hk/faculty/lzhang/)
From course description: "This course will explore the Markowitz portfolio optimization in its many variations and extensions, with special emphasis on Python programming. All the course material will be complemented with Python code that will be studied in class."
Course by: Daniel Palomar (https://www.danielppalomar.com/)
From course description: "This is a project-oriented course. It will expose students to practical issues of large-scale and real-world data mining. Data mining is a process of extracting implicit, previously unknown, and potentially useful knowledge from data, and it is a critical task in many applications. This course will place emphasis on applications of data mining in areas such as business intelligence, which aims to uncover facts and patterns in large volumes of data for decision support."
Course by: Yangqiu Song (https://scholar.google.com/citations?user=MdQZ-q8AAAAJ&hl=en, https://www.cse.ust.hk/~yqsong/)
In the course projects we got to use Julia's JuMP with the Gurobi solver. Great fun.
Course content:
Linear optimization, Integer optimization, Relaxations, Duality, Simplex method, Branch & Bound algorithm, Cutting planes, Heuristics, Network flows.
The picture is from Wikipedia.
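To give a flavour of one topic from the list, here is a minimal branch & bound for the 0/1 knapsack problem in plain Python, using the LP relaxation (fractional knapsack) as the bound; this is my own sketch, not course material:

```python
def knapsack_bb(values, weights, capacity):
    """0/1 knapsack via branch & bound, bounding with the LP (fractional) relaxation."""
    n = len(values)
    # Branch over items in decreasing value density, which tightens the bound.
    order = sorted(range(n), key=lambda i: values[i] / weights[i], reverse=True)
    v = [values[i] for i in order]
    w = [weights[i] for i in order]
    best = 0

    def bound(k, val, cap):
        # Relaxation: fill remaining capacity greedily, allowing one fractional item.
        for i in range(k, n):
            if w[i] <= cap:
                cap -= w[i]
                val += v[i]
            else:
                return val + v[i] * cap / w[i]
        return val

    def branch(k, val, cap):
        nonlocal best
        best = max(best, val)  # every node is a feasible partial solution
        if k == n or bound(k, val, cap) <= best:
            return  # prune: the relaxation cannot beat the incumbent
        if w[k] <= cap:
            branch(k + 1, val + v[k], cap - w[k])  # take item k
        branch(k + 1, val, cap)                    # skip item k

    branch(0, 0, capacity)
    return best

print(knapsack_bb([60, 100, 120], [10, 20, 30], 50))  # 220
```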
I'm currently studying this course so I'll probably describe it better when I'm done with it.
Course content:
Model-based classification, Model assessment for predictive learning, Model selection through Cross-Validation, Tree based methods, Data representation, Clustering, Penalized regression/classification methods, High-dimensional clustering, Large sample methods.
The picture is a stock image from Google.
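One of the topics, cross-validation, can be sketched in a few lines of plain Python; the "model" here is just a trivial predictor of the training mean, my own illustration rather than course material:

```python
def k_fold_splits(n, k):
    """Yield (train_indices, test_indices) for k roughly equal contiguous folds."""
    start = 0
    for i in range(k):
        size = n // k + (1 if i < n % k else 0)
        test = list(range(start, start + size))
        train = list(range(0, start)) + list(range(start + size, n))
        yield train, test
        start += size

def cross_val_mse(ys, k=5):
    """Cross-validated MSE of a trivial model that predicts the training mean."""
    errors = []
    for train, test in k_fold_splits(len(ys), k):
        pred = sum(ys[i] for i in train) / len(train)  # "fit" on the training fold
        errors.extend((ys[i] - pred) ** 2 for i in test)
    return sum(errors) / len(errors)

print(cross_val_mse(list(range(10)), k=5))  # 12.75
```

Each observation is scored exactly once by a model that never saw it, which is the point of the procedure.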
Great course where most of the focus was on dynamic programming and reductions.
Course content:
Greedy algorithms, Interval scheduling, Dynamic programming, Knapsack problems, Sequence alignment, Divide and Conquer, Recurrences, Graphs, Minimum spanning trees, Complexity classes, Satisfiability problem, Reductions.
The picture is from Wikipedia.
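Dynamic programming and sequence alignment meet in the classic edit-distance table; a minimal sketch of my own (not course material):

```python
def edit_distance(a, b):
    """Levenshtein distance via dynamic programming.

    dp[i][j] = minimum number of edits turning a[:i] into b[:j].
    """
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i  # delete all of a[:i]
    for j in range(n + 1):
        dp[0][j] = j  # insert all of b[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # match / substitution
    return dp[m][n]

print(edit_distance("kitten", "sitting"))  # 3
```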
Course content:
R, Random sampling, Stratified samples, Parametric models, Maximum likelihood, Hypothesis testing, Bayesian inference, Empirical distribution, Comparing two populations, ANOVA 1 and 2, Nonparametric tests, Categorical data, Linear and multiple regression.
The picture is from Google.
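As a small illustration of maximum likelihood: for i.i.d. Exponential(rate) observations the MLE has the closed form rate = 1 / (sample mean). A sketch of my own, not course material:

```python
import math

def exponential_log_likelihood(rate, data):
    """Log-likelihood of i.i.d. Exponential(rate) observations."""
    return len(data) * math.log(rate) - rate * sum(data)

def exponential_mle(data):
    """The maximizer has the closed form rate_hat = 1 / (sample mean)."""
    return len(data) / sum(data)

data = [0.5, 1.0, 1.5, 2.0]
rate_hat = exponential_mle(data)
print(rate_hat)  # 0.8, i.e. 1 / (sample mean of 1.25)
```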
In this course we built a Manhattan channel routing problem solver in MATLAB; one solution is shown in the picture to the left.
Course content:
Linear optimization, Binary optimization, Mixed-binary optimization, Lagrangean Relaxations, Lagrangean duality, Subgradient method, Ergodic sequence, Solution recovery, Column generation, Dantzig-Wolfe Decomposition, Benders Decomposition.
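One item from the list, the subgradient method, can be sketched very compactly: on a nonsmooth function like f(x) = |x - target| a diminishing step size drives the iterates to the minimizer even though no gradient exists at the kink. My own toy sketch, not course material:

```python
def minimize_abs(target, x0=0.0, iterations=200):
    """Minimize the nonsmooth f(x) = |x - target| with a subgradient method."""
    x = x0
    for k in range(1, iterations + 1):
        # A subgradient of |x - target|: the sign of (x - target), 0 at the kink.
        g = 1.0 if x > target else (-1.0 if x < target else 0.0)
        x = x - g / k  # classic diminishing step size 1/k
    return x

print(minimize_abs(3.0))  # approaches the minimizer x* = 3
```

The 1/k steps are the textbook choice: square-summable enough to settle down, but not summable, so the iterates can travel any distance they need to.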
Course content:
Linear and multivariate regression, Bias/variance trade-off, Least squares, Outlier identification, t-test, Adjusted R^2, Confidence and prediction intervals, Cross-validation, AIC, Partial F-test, Generalised linear models.
The picture is from Wikipedia.
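Least squares for a simple line has a closed form via the normal equations; a minimal sketch of my own (not course material):

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x via the closed-form normal equations."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sxx = sum((x - mean_x) ** 2 for x in xs)
    b = sxy / sxx            # slope
    a = mean_y - b * mean_x  # intercept
    return a, b

a, b = fit_line([0, 1, 2, 3], [1, 3, 5, 7])
print(a, b)  # 1.0 2.0 -- the data lie exactly on y = 1 + 2x
```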
Course content:
Convexity, Optimality conditions, Lagrangean duality, Linear programming, Linear programming duality, Convex optimization, Integer programming, Feasible direction methods, Unconstrained optimization, Simplex method.
The picture is from Wikipedia.
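A minimal illustration of unconstrained optimization from the list: fixed-step gradient descent on a convex quadratic. My own sketch, not course material:

```python
def gradient_descent(grad, x0, step=0.1, iterations=100):
    """Fixed-step gradient descent; converges for this smooth convex example."""
    x = x0
    for _ in range(iterations):
        x = x - step * grad(x)  # move against the gradient direction
    return x

# Minimize f(x) = (x - 2)^2 with gradient f'(x) = 2(x - 2); the minimizer is x* = 2.
x_star = gradient_descent(lambda x: 2.0 * (x - 2.0), x0=0.0)
print(x_star)  # very close to 2.0
```

Here each step contracts the error by a factor 0.8, so a hundred iterations land essentially on the optimum.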
In this course we had some great projects, in one of which we created the picture on the left.
Course content:
C, Parallel programming using threads/OpenMP/MPI/OpenCL, Hardware architecture, Code optimization and compiler flags.
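The course itself used C with OpenMP/MPI, but the basic fork-join pattern can be sketched with Python's standard-library thread pool (note that Python threads share one interpreter, so this illustrates the decomposition pattern rather than real speedup):

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    """Work unit executed by one worker: sum one slice of the data."""
    return sum(chunk)

def parallel_sum(data, workers=4):
    """Split the data into chunks, sum them concurrently, then reduce the results."""
    size = (len(data) + workers - 1) // workers
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))

print(parallel_sum(list(range(1_000_000))))  # 499999500000
```

The split-map-reduce shape is the same one an OpenMP parallel-for with a sum reduction expresses in C.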