UNIGE 14×050 – Deep Learning


You can find here the slides, recordings,
and a virtual machine
for François Fleuret's deep-learning
course 14×050
of the University of Geneva.


This course is a thorough introduction to deep learning, with
examples in the PyTorch framework:

  • machine learning objectives and main challenges,
  • tensor operations,
  • automatic differentiation, gradient descent,
  • deep-learning specific techniques,
  • generative, recurrent, attention models.
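As a taste of the kind of PyTorch mechanics the course builds on, here is a minimal, illustrative gradient-descent step using automatic differentiation on a toy least-squares problem (a sketch of the general technique, not code from the course materials):

```python
import torch

# Toy least-squares problem: find w minimizing ||x @ w - y||^2
x = torch.tensor([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
y = torch.tensor([[1.0], [2.0], [3.0]])
w = torch.zeros(2, 1, requires_grad=True)

eta = 1e-3  # learning rate, small enough for this problem
for _ in range(100):
    loss = (x @ w - y).pow(2).sum()
    loss.backward()               # autograd computes w.grad
    with torch.no_grad():
        w -= eta * w.grad         # one gradient-descent step
        w.grad.zero_()            # reset the accumulated gradient

print(loss.item())  # the loss decreases toward 0
```

The same pattern (forward pass, `backward()`, parameter update under `torch.no_grad()`) underlies the training loops used throughout the course.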

You can check the prerequisites.

This course was developed initially at
the Idiap Research Institute
in 2018, and taught as EE-559
at École
Polytechnique Fédérale de Lausanne
until 2022. The notes for
the handouts were added with the help
of Olivier.

Thanks to Adam Paszke, Jean-Baptiste Cordonnier, Alexandre
Nanchen, Xavier Glorot, Andreas Steiner, Matus Telgarsky,
Diederik Kingma, Nikolaos Pappas, Soumith Chintala, and Shaojie
Bai for their suggestions or comments.

In addition to the materials available here, I also wrote and
distribute “The Little Book of Deep Learning”, a
phone-formatted short introduction to deep learning for readers with a
STEM background.

The slide pdfs are the ones I use for the lectures. They are in
landscape format with overlays to facilitate the presentation. The
handout pdfs are compiled without these fancy effects in portrait
orientation, with additional notes. The screencasts are available both
as in-browser streaming and downloadable mp4 files.

You can get archives with all the pdf files
(1097 slides):

and subtitles for the screencasts generated automatically
with OpenAI's Whisper,

or the individual lectures:


  • Linear algebra (vectors, matrices, Euclidean spaces),
  • differential calculus (Jacobian, Hessian, chain rule),
  • Python programming,
  • basics in probabilities and statistics (discrete and continuous
    distributions, law of large numbers, conditional probabilities,
    Bayes, PCA),
  • basics in optimization (notion of minima, gradient descent),
  • basics in algorithmics (computational costs),
  • basics in signal processing (Fourier transform, wavelets).


You will have to look at the Python, Jupyter notebook, and PyTorch
documentations.

Practical session prologue

Helper Python prologue for the practical sessions.

Argument parsing

This prologue parses command-line arguments as follows

usage: [-h] [--full] [--tiny] [--seed SEED]
[--cifar] [--data_dir DATA_DIR]

DLC prologue file for practical sessions.

optional arguments:
-h, --help           show this help message and exit
--full               Use the full set, can take ages (default False)
--tiny               Use a very small set for quick checks
(default False)
--seed SEED          Random seed (default 0, < 0 is no seeding)
--cifar              Use the CIFAR data-set and not MNIST
(default False)
--data_dir DATA_DIR  Where are the PyTorch data located (default
$PYTORCH_DATA_DIR or './data')
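Such an interface can be reproduced with Python's argparse module; the following is a hypothetical sketch matching the help text above, not the prologue's actual source:

```python
import argparse
import os

parser = argparse.ArgumentParser(
    description='DLC prologue file for practical sessions.')
parser.add_argument('--full', action='store_true',
                    help='Use the full set, can take ages (default False)')
parser.add_argument('--tiny', action='store_true',
                    help='Use a very small set for quick checks (default False)')
parser.add_argument('--seed', type=int, default=0,
                    help='Random seed (default 0, < 0 is no seeding)')
parser.add_argument('--cifar', action='store_true',
                    help='Use the CIFAR data-set and not MNIST (default False)')
parser.add_argument('--data_dir', type=str,
                    default=os.environ.get('PYTORCH_DATA_DIR', './data'),
                    help="Where are the PyTorch data located "
                         "(default $PYTORCH_DATA_DIR or './data')")

# Parse an empty argument list here just to show the defaults;
# a real script would call parser.parse_args() on sys.argv.
args = parser.parse_args([])
print(args.seed, args.full)
```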

Loading data

The prologue provides the function

load_data(cifar = None, one_hot_labels = False, normalize = False, flatten = True)

which downloads the data when required, reshapes the images to 1d
vectors if flatten
is True, and narrows to a small subset of
samples if --full is not selected.

It returns a tuple of 4 tensors: train_data,
train_target, test_data, and test_target.

If cifar is True, the data-base used is CIFAR10, if it
is False, MNIST is used, and if it is None, the argument
--cifar is taken into account.

If one_hot_labels is True, the targets are converted to a 2d
torch.Tensor with as many columns as there are classes, and
-1 everywhere except the coefficients [n, y_n], equal to 1.
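For illustration, such an encoding could be built as follows (a sketch assuming integer class-index targets; this is not the prologue's actual implementation):

```python
import torch

targets = torch.tensor([0, 2, 1])   # class indices y_n
nb_classes = 3

# Fill with -1, then set coefficient [n, y_n] to 1 for each sample n
one_hot = torch.full((targets.size(0), nb_classes), -1.0)
one_hot[torch.arange(targets.size(0)), targets] = 1.0

print(one_hot)
# rows: (1, -1, -1), (-1, -1, 1), (-1, 1, -1)
```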

If normalize is True, the data tensors are normalized
according to the mean and variance of the training one.
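A sketch of this normalization, under the assumption that a single mean and standard deviation are computed over all training values (random tensors stand in for the real data):

```python
import torch

torch.manual_seed(0)
train_data = torch.rand(1000, 784) * 255   # stand-in for the training tensor
test_data = torch.rand(100, 784) * 255     # stand-in for the test tensor

mu, std = train_data.mean(), train_data.std()
train_data = (train_data - mu) / std   # training data: zero mean, unit variance
test_data = (test_data - mu) / std     # test data: same affine transformation
```

Applying the training statistics to the test tensor, rather than its own, keeps the two sets consistent.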

If flatten is True, the data tensors are flattened
into 2d tensors of dimension N × D, discarding the image structure
of the samples. Otherwise they are 4d tensors of dimension N × C
× H × W.
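The relation between the two shapes amounts to a reshape; for instance with MNIST-sized images (one channel, 28 × 28):

```python
import torch

x = torch.empty(1000, 1, 28, 28)   # 4d tensor of dimension N x C x H x W
flat = x.reshape(x.size(0), -1)    # 2d tensor of dimension N x D, D = C*H*W

print(flat.size())  # torch.Size([1000, 784])
```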


Minimal example

import dlc_practical_prologue as prologue

train_input, train_target, test_input, test_target = prologue.load_data()

print('train_input', train_input.size(), 'train_target', train_target.size())
print('test_input', test_input.size(), 'test_target', test_target.size())


* Using MNIST
** Reduce the data-set (use --full for the full thing)
** Use 1000 train and 1000 test samples
train_input torch.Size([1000, 784]) train_target torch.Size([1000])
test_input torch.Size([1000, 784]) test_target torch.Size([1000])

A Virtual Machine (VM) is a software that simulates a complete
computer. The one we provide here includes a Linux operating
system and all the tools needed to use PyTorch from a web
browser (e.g. Mozilla Firefox
or Google Chrome).
Installation

  1. Download and install Oracle's VirtualBox,
  2. download the virtual machine OVA package (1.68Gb), and
  3. open the latter in VirtualBox with File → Import Appliance.

You should now see an entry in the list of VMs. The first time
it starts, it provides a menu to choose the keyboard layout you
want to use (you can force the configuration later by running
the command sudo set-kbd).

If the VM does not start and VirtualBox complains that
VT-x is not enabled, you should activate the virtualization
capabilities of your CPU in the BIOS of your computer.

Using the VM

The VM automatically starts
a JupyterLab on port 8888 and
exports that port to the host. This means that you can access this
JupyterLab with a web browser on the machine running VirtualBox
and use Python notebooks, view files, start terminals, and edit source
files. Typing !bye in a notebook
or bye in a terminal will shut down the VM.

You can run a terminal and a text editor from inside the Jupyter
notebook for exercises that require more than the notebook
itself. Source files can be executed by running in a terminal the
Python command with the source file name as argument. Both can be done
from the main Jupyter window with:

  • New → Text File to create
    the source code, or selecting the file and
    clicking Edit to edit an existing one,
  • New → Terminal to start a
    shell from which you can run Python.

This VM also exports an ssh port to the port 2022 on the host,
which allows to log in with standard ssh clients on Linux and
OSX, and with applications such
as PuTTY on Windows. The
default login is ‘dave’ with
password ‘dummy’, the same password
as for the root account.


Note that performance for computation will be very poor compared to
running natively on your machine. In particular, the VM does
not take advantage of a GPU even if you have one.

Finally, please also note that this VM is configured in a
convenient but highly non-secured way, with easy-to-guess
passwords, including for root, and network-accessible
unprotected Jupyter notebooks.

This VM is built on
a Linux Debian,
with miniconda,
PyTorch, MNIST,
CIFAR10, and many Python utility packages installed.

My own materials on this page are licensed under the
Creative Commons BY-NC-SA 4.0
International License.

More simply: I am fine with this material being used for
regular academic teaching, but definitely not for a book /
YouTube channel loaded with ads / whatever monetization model I am not
aware of.
