
Commit

update
jmduarte committed Apr 17, 2024
1 parent 3472321 commit ad98639
Showing 2 changed files with 35 additions and 47 deletions.
28 changes: 8 additions & 20 deletions homeworks/homework_2/homework_2.tex
@@ -9,9 +9,9 @@
Machine Learning in Physics \hfill
UCSD PHYS 139/239 \hfill \\[1pt]
Homework 2\hfill
-Draft version due: Friday, February 3, 2023, 5:00pm\\
+Draft version due: Friday, April 26, 2024, 8:00pm\\
\hfill
-Final version due: Wednesday, February 8, 2023, 5:00pm\\
+Final version due: Wednesday, May 1, 2024, 5:00pm\\
}
}
}
@@ -20,31 +20,19 @@
\pagestyle{fancy}

\section*{Policies}

\begin{itemize}
-\item Draft version due 5:00pm, Friday, February 3 on Gradescope (report and code).
-\item Final version due 5:00pm, Wednesday, February 8 on Gradescope (report and code).
\item You are free to collaborate on all of the problems, subject to the collaboration policy stated in the syllabus.
\item You should submit all code used in the homework.
Please use Python 3 and sklearn version $\geq$0.18 for your code, and comment your code such that the TA can follow along and run it without any issues.
\end{itemize}

\section*{Submission Instructions}
\textbf{PLEASE NOTE} that there are two steps to submitting your Homework.
Both must be submitted by the deadline.

\begin{itemize}
-\item Please submit your report as a single .pdf file to Gradescope under ``Homework 2 Report Draft'' or ``Homework 2 Report Final''.
+\item Please submit your report as a single .pdf file to Gradescope under ``Homework 1'' or ``Homework 1 Corrections''.
\textbf{In the report, include any images generated by your code along with your answers to the questions.}
For instructions specifically pertaining to the Gradescope submission process, see \url{https://www.gradescope.com/get_started#student-submission}.
-\item Please submit your code as a .zip archive to Gradescope under ``Homework 2 Code Draft'' or ``Homework 2 Code Final''.
+\item Please submit your code as a .zip archive to Gradescope under ``Homework 1 Code'' or ``Homework 1 Code Corrections''.
The .zip file should contain your code files.
Submit your code either as Jupyter notebook .ipynb files or .py files.
\end{itemize}

\newpage
\section{Stochastic Gradient Descent [36 Points]}
-\materials{lecture 2}
+% \materials{lecture 2}

Stochastic gradient descent (SGD) is an important optimization method in machine learning, used everywhere from logistic regression to training neural networks.
In this problem, you will be asked to first implement SGD for linear regression using the squared loss function.
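For concreteness, here is a minimal sketch of per-sample SGD for linear regression with the squared loss; the function signature, learning rate, and epoch count are illustrative assumptions, not the interface required by the problem.

\begin{verbatim}
import numpy as np

def sgd_linear_regression(X, y, eta=0.01, n_epochs=20, seed=0):
    # Model: y_hat = X @ w + b. For one example (x_i, y_i) the squared
    # loss is (y_hat_i - y_i)^2, so dL/dw = 2*(y_hat_i - y_i)*x_i and
    # dL/db = 2*(y_hat_i - y_i).
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(n_epochs):
        for i in rng.permutation(n):        # reshuffle every epoch
            residual = X[i] @ w + b - y[i]  # y_hat_i - y_i
            w -= eta * 2 * residual * X[i]
            b -= eta * 2 * residual
    return w, b
\end{verbatim}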
@@ -105,7 +93,7 @@
\end{itemize}
\end{problem}
\begin{solution}
-See code. %This does not need to be changed.
+
\end{solution}

\begin{problem}[2]
@@ -196,7 +184,7 @@


\section{Neural networks vs. boosted decision trees [45 Points]}
-\materials{lectures 4--6}
+% \materials{lectures 4--6}

In this problem, you will compare the performance of neural networks and boosted decision trees for binary classification on a tabular dataset, namely the MiniBooNE dataset: \url{https://archive.ics.uci.edu/ml/datasets/MiniBooNE+particle+identification}.
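As a starting point, a sketch of loading the data might look like the following. It assumes the UCI file \texttt{MiniBooNE\_PID.txt}, whose first line lists the signal and background event counts, with all signal rows preceding the background rows; verify this layout against the dataset description before relying on it.

\begin{verbatim}
import numpy as np
from sklearn.model_selection import train_test_split

# First line: "<n_signal> <n_background>"; the rest is one event per
# row with 50 features. Signal rows come first (assumed layout).
with open("MiniBooNE_PID.txt") as f:
    n_sig, n_bkg = (int(v) for v in f.readline().split())
    X = np.loadtxt(f)

y = np.concatenate([np.ones(n_sig), np.zeros(n_bkg)])  # 1 = signal
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y)
\end{verbatim}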

Expand Down Expand Up @@ -247,7 +235,7 @@ \section{Neural networks vs. boosted decision trees [45 Points]}
Now, we will make two minor changes to the network with ReLU activations: preprocessing and the optimizer.

For the feature preprocessing, use \texttt{sklearn.preprocessing.StandardScaler} to standardize the input features.
-Note you should use fit the standard scaler to the training data \emph{only} and apply it to both the training and testing data.
+Note you should fit the standard scaler to the training data \emph{only} and apply it to both the training and testing data.
For the optimizer, use Adam with a learning rate of 0.001 (which is the default) instead of SGD. Train the model for 50 epochs.
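A minimal Keras sketch of these two changes follows; the two-hidden-layer architecture and layer sizes are illustrative assumptions, so substitute the network specified earlier in the problem.

\begin{verbatim}
from sklearn.preprocessing import StandardScaler
from tensorflow import keras

# Fit the scaler on the training data only; apply it to both splits.
scaler = StandardScaler().fit(X_train)
X_train_s = scaler.transform(X_train)
X_test_s = scaler.transform(X_test)

model = keras.Sequential([
    keras.Input(shape=(X_train_s.shape[1],)),
    keras.layers.Dense(64, activation="relu"),   # sizes illustrative
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer=keras.optimizers.Adam(learning_rate=0.001),
              loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X_train_s, y_train, epochs=50, batch_size=128,
          validation_data=(X_test_s, y_test))
\end{verbatim}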

Plot the receiver operating characteristic (ROC) curve using the testing dataset.
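A short sketch using \texttt{sklearn.metrics.roc\_curve}, assuming \texttt{model} and the scaled test split from the sketch above:

\begin{verbatim}
import matplotlib.pyplot as plt
from sklearn.metrics import roc_curve, auc

y_score = model.predict(X_test_s).ravel()   # predicted probabilities
fpr, tpr, _ = roc_curve(y_test, y_score)
plt.plot(fpr, tpr, label=f"ROC (AUC = {auc(fpr, tpr):.3f})")
plt.xlabel("False positive rate")
plt.ylabel("True positive rate")
plt.legend()
plt.savefig("roc_curve.pdf")
\end{verbatim}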
54 changes: 27 additions & 27 deletions syllabus/syllabus.tex
@@ -65,8 +65,8 @@

\noindent\textbf{Instructors}:\\
Javier Duarte, \href{mailto:jduarte@ucsd.edu}{jduarte@ucsd.edu}, OH M 11:00a--12:00p, MYR-A 5513\\
-Aobo Li, \href{mailto:aol002@ucsd.edu}{aol002@ucsd.edu}, OH TBD\\
-\noindent \textbf{Teaching assistant}: Anthony Aportela, \href{mailto:aaportel@ucsd.edu}{aaportel@ucsd.edu}, OH TBD\\
+Aobo Li, \href{mailto:aol002@ucsd.edu}{aol002@ucsd.edu}, OH W 1:00--2:00p, MYR-A 4571\\
+\noindent \textbf{Teaching assistant}: Anthony Aportela, \href{mailto:aaportel@ucsd.edu}{aaportel@ucsd.edu}, OH F 11:00a--12:00p, MYR-A 5516\\

\noindent\textbf{Course webpage, Zoom link to lectures}:\\
\hspace*{1cm}Canvas: \href{https://canvas.ucsd.edu/courses/56257}{https://canvas.ucsd.edu/courses/56257}\\
@@ -228,71 +228,71 @@

\noindent\textbf{Week 2}

-\emph{Monday 4/8 (Duarte)}: \underline{Lecture 04}: Support vector machine, regularization, logistic regression
+\emph{Monday 4/8 (Duarte)}: \underline{Lecture 04}: Support vector machine

-\emph{Wednesday 4/10 (Duarte)}: \underline{Lecture 05}: (Boosted) decision trees
+\emph{Wednesday 4/10 (Duarte)}: \underline{Lecture 05}: Regularization, logistic regression

-\emph{Friday 4/12 (Duarte)}: Homework 1 due; \underline{Lecture 06}: (Boosted) decision trees (cont.); \underline{Hands-on}: Scikit-learn, XGBoost, classifying Higgs boson events
+\emph{Friday 4/12 (Duarte)}: Homework 1 due; \underline{Lecture 06}: (Boosted) decision trees

\noindent\textbf{Week 3}

-\emph{Monday 4/15 (Li)}: \underline{Lecture 07}: (Deep) neural networks, backpropagation
+\emph{Monday 4/15 (Li/Duarte)}: \underline{Lecture 07}: (Boosted) decision trees (cont.); \underline{Hands-on}: Scikit-learn, XGBoost, classifying Higgs boson events

-\emph{Wednesday 4/17 (Li)}: Homework 1 (corrections) due; Homework 2 released; \underline{Lecture 08}: (Deep) neural networks (cont.), training issues, data standardization
+\emph{Wednesday 4/17 (Li)}: Homework 1 (corrections) due; Homework 2 released; \underline{Lecture 08}: (Deep) neural networks, backpropagation

-\emph{Friday 4/19 (Duarte)}: \underline{Lecture 09}: Optimizers: (Nesterov) momentum, RMSProp, Adam, skip connections, regularization: dropout, early stopping, \underline{Hands-on}: Keras, classifying jets with high-level features
+\emph{Friday 4/19 (Li)}: \underline{Lecture 09}: Classification metrics, confusion matrix, ROC curve, AUC

\noindent\textbf{Week 4}

-\emph{Monday 4/22 (Duarte)}: \underline{Lecture 10}: Types of data, inductive bias, image-like data, convolutional neural networks
+\emph{Monday 4/22 (Duarte)}: \underline{Lecture 10}: (Deep) neural networks (cont.), training issues, data standardization; \underline{Hands-on}: Keras, classifying jets with high-level features

-\emph{Wednesday 4/24 (Duarte)}: \underline{Lecture 11}: Convolutional neural networks (cont.)
+\emph{Wednesday 4/24 (Duarte)}: \underline{Lecture 11}: Optimizers: (Nesterov) momentum, RMSProp, Adam, skip connections, regularization: dropout, early stopping

-\emph{Friday 4/26 (Li)}: Homework 2 due; \underline{Lecture 12}: Spherical convolutional neural networks, \underline{Hands-on}: Keras, classifying astronomical data (images)
+\emph{Friday 4/26 (Li)}: Homework 2 due; \underline{Lecture 12}: Types of data, inductive bias, image-like data, convolutional neural networks

\noindent\textbf{Week 5}

-\emph{Monday 4/29 (Li)}: \underline{Lecture 13}: Time-series data, recurrent neural networks
+\emph{Monday 4/29 (Li)}: \underline{Lecture 13}: Convolutional neural networks (cont.)

-\emph{Wednesday 5/1 (Li)}: Homework 2 (corrections) due; Homework 3 released; \underline{Lecture 14}: Recurrent neural networks (cont.)
+\emph{Wednesday 5/1 (Li)}: Homework 2 (corrections) due; Homework 3 released; \underline{Lecture 14}: Spherical convolutional neural networks

-\emph{Friday 5/3 (Li)}: \underline{Lecture 15}: Recurrent neural networks (cont.); \underline{Hands-on}: Identifying radio signals (time series)
+\emph{Friday 5/3 (Duarte)}: \underline{Hands-on}: Keras, classifying astronomical data (images)

\noindent\textbf{Week 6}

-\emph{Monday 5/6 (Duarte)}: \underline{Lecture 16}: Point cloud and graph-like data, relational inductive bias, permutation invariance/equivariance, graph neural networks
+\emph{Monday 5/6 (Li)}: \underline{Lecture 15}: Time-series data, recurrent neural networks

-\emph{Wednesday 5/8 (Duarte)}: \underline{Lecture 17}: Graph neural networks (cont.)
+\emph{Wednesday 5/8 (Li)}: \underline{Lecture 16}: Recurrent neural networks (cont.)

-\emph{Friday 5/10 (Duarte)}: Homework 3 due; \underline{Lecture 18}: Graph neural networks (cont.), \underline{Hands-on}: Spektral, $N$-body simulations, springs
+\emph{Friday 5/10 (Li)}: Homework 3 due; \underline{Hands-on}: Identifying radio signals (time series)

\noindent\textbf{Week 7}

-\emph{Monday 5/13 (Li)}: \underline{Lecture 19}: Unsupervised learning, clustering
+\emph{Monday 5/13 (Duarte)}: \underline{Lecture 18}: Point cloud and graph-like data, relational inductive bias, permutation invariance/equivariance, graph neural networks

-\emph{Wednesday 5/15 (Li)}: Homework 3 (corrections) due; Homework 4 released; \underline{Lecture 20}: Autoencoders
+\emph{Wednesday 5/15 (Duarte)}: Homework 3 (corrections) due; \underline{Lecture 19}: Graph neural networks (cont.)

-\emph{Friday 5/17 (Li)}: Project proposal due; \underline{Lecture 21}: Variational autoencoders, \underline{Hands-on}: Finding anomalies in LHC/LIGO data
+\emph{Friday 5/17 (Duarte)}: Project proposal due; \underline{Hands-on}: Spektral, $N$-body simulations, springs

\noindent\textbf{Week 8}

-\emph{Monday 5/20 (Duarte)}: \underline{Lecture 22}: Model compression, pruning
+\emph{Monday 5/20 (Li)}: \underline{Lecture 21}: Unsupervised learning, clustering

-\emph{Wednesday 5/22 (Duarte, remote)}: \underline{Lecture 23}: Quantization
+\emph{Wednesday 5/22 (Li)}: \underline{Lecture 22}: Autoencoders, variational autoencoders

-\emph{Friday 5/24 (Duarte)}: Homework 4 due; \underline{Lecture 24}: Knowledge distillation, \underline{Hands-on}: TensorFlow Model Optimization, QKeras
+\emph{Friday 5/24 (Li)}: Homework 4 due; \underline{Lecture 23}: \underline{Hands-on}: Finding anomalies in LHC/LIGO data

\noindent\textbf{Week 9}

-\emph{Wednesday 5/29 (Guest)}: Homework 4 (corrections) due; \underline{Guest lecture}: TBD
+\emph{Wednesday 5/29 (Duarte)}: \underline{Lecture 24}: Model compression, pruning

-\emph{Friday 5/31 (Guest)}: \underline{Guest lecture}: TBD
+\emph{Friday 5/31 (Duarte)}: \underline{Lecture 25}: Quantization

\noindent\textbf{Week 10}

-\emph{Monday 6/3 (Guest)}: \underline{Guest lecture}: TBD
+\emph{Monday 6/3 (Duarte)}: Knowledge distillation

-\emph{Wednesday 6/5 (Guest)}: \underline{Guest lecture}: TBD
+\emph{Wednesday 6/5 (Duarte)}: \underline{Hands-on}: TensorFlow Model Optimization, QKeras

\emph{Friday 6/7 (Guest)}: \underline{Guest lecture}: TBD

