NBIS workshop in Neural Nets and Deep Learning
Date: 20 - 24 May 2024
Timezone: Stockholm
Duration: 5 days
National course open to PhD students, postdocs, researchers and other employees at all Swedish universities who need Neural Network and Deep Learning skills.
Important Dates
Application opens: 2024-01-25
Application closes: 2024-04-10
Confirmation to accepted students: 2024-04-25
Responsible teachers: Claudio Mirabello, Christophe Avenel
Course Fee
A course fee* of 3000 SEK will be invoiced to accepted academic participants. A higher fee applies to applicants from private companies. The fee includes lunches, coffee and snacks.
*Please note that NBIS cannot invoice individuals.
Course content
This course will give an introduction to the concept of Neural Networks (NN) and Deep Learning.
Topics covered will include:
- NN building blocks, including concepts such as neurons, activation functions, loss functions, gradient descent and back-propagation
- Convolutional Neural Networks
- Recurrent Neural Networks
- Transformers and attention-based models
- Autoencoders
- Best practices when designing NNs
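As a taste of the building blocks listed above, the sketch below trains a single sigmoid neuron on one example with one step of gradient descent (forward pass, squared-error loss, backpropagation via the chain rule). All inputs, weights and the learning rate are invented for illustration; the course exercises use Jupyter Notebooks rather than this exact code.

```python
import numpy as np

# A single neuron with a sigmoid activation, updated by one step of
# gradient descent on one training example. All values are illustrative.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = np.array([0.5, -1.0])   # input features
w = np.array([0.1, 0.2])    # weights
b = 0.0                     # bias
y = 1.0                     # target
lr = 0.5                    # learning rate

# Forward pass: weighted sum, activation, squared-error loss
z = w @ x + b
a = sigmoid(z)
loss_before = 0.5 * (a - y) ** 2

# Backward pass (chain rule): dL/dz = dL/da * da/dz
delta = (a - y) * a * (1.0 - a)
grad_w = delta * x          # dL/dw
grad_b = delta              # dL/db

# One gradient-descent update
w = w - lr * grad_w
b = b - lr * grad_b

# After the update, the loss on this example is lower
loss_after = 0.5 * (sigmoid(w @ x + b) - y) ** 2
```

The same forward/backward pattern, repeated over layers and many examples, is exactly what training a deep network amounts to.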
If you do not receive information by the dates above, please contact: edu.neural-nets-deep-learning@nbis.se
Contact: edu.neural-nets-deep-learning@nbis.se, Claudio Mirabello: claudio.mirabello@scilifelab.se, Christophe Avenel: christophe.avenel@it.uu.se
Keywords: bioinformatics, Neural Network, Deep Learning
Venue: SciLifeLab Uppsala, Navet BMC, Husaregatan 3
City: Uppsala
Country: Sweden
Postcode: 75237
Prerequisites:
Entry requirements
Required in order to follow the course and complete the computer exercises:
- Familiarity with Unix/Linux
- Your own laptop with Python and Jupyter Notebooks installed, for the practical exercises
- Programming/scripting experience in Python (e.g. having attended the NBIS workshop in basic Python or equivalent)
- Basic experience of statistics and mathematics (e.g. having attended the NBIS workshop Introduction to Biostatistics and Machine Learning or equivalent)
Desirable:
- Experience of working with Jupyter Notebooks
- A need to work with large datasets (e.g. thousands of samples)
Due to limited space, the course can accommodate a maximum of 25 participants. If we receive more applications, participants will be selected based on several criteria, including meeting the entry requirements, motivation to attend the course, and gender and geographical balance.
Learning objectives:
Upon completion of this course, you will be able to:
- Distinguish the concepts of “Artificial Intelligence”, “Machine Learning”, “Neural Networks” and “Deep Learning”
- Distinguish between different types of learning (e.g. supervised, unsupervised, reinforcement) and recognise which applies to your own problem
- Distinguish between linear and non-linear approaches and recognise which is best suited for your own problem
- Describe what a feed-forward neural network (FFNN) is, along with its components (neurons, layers, weights, bias, activation functions, cost functions)
- Explain how training of a FFNN works from a mathematical point of view (gradient descent, learning rate, backpropagation)
- Execute, with pen and paper, a few training steps of a very simple FFNN model
- Tell the difference between a shallow and a deep network
- Explain broadly how different NN architectures are wired and how they work
- Implement and apply the most appropriate architecture to a given problem/dataset
- Analyze training curves and prediction outputs to evaluate if the training has been successful
- Debug possible issues with the training and suggest changes to fix them
- Explain the difference between training, validation and testing
- Define what overfitting is from a mathematical point of view, and what issues it causes
- Identify good practices of dataset design and avoid introducing information leakage or other biases when building your own datasets
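The last objective, avoiding information leakage, can be illustrated with a short sketch: preprocessing statistics (here, mean and standard deviation for standardisation) are computed on the training split only and then applied to the validation and test splits. The toy data, split sizes and variable names are invented for this example.

```python
import numpy as np

# Good dataset practice: fit preprocessing on the training split only,
# so no information from validation/test data leaks into training.

rng = np.random.default_rng(0)
X = rng.normal(loc=5.0, scale=2.0, size=(100, 3))  # toy dataset

# Shuffle indices, then split into train / validation / test (60/20/20)
idx = rng.permutation(len(X))
train, val, test = idx[:60], idx[60:80], idx[80:]

# Correct: compute standardisation statistics on the training split only...
mu = X[train].mean(axis=0)
sigma = X[train].std(axis=0)

# ...then apply those same statistics to every split
X_train = (X[train] - mu) / sigma
X_val = (X[val] - mu) / sigma
X_test = (X[test] - mu) / sigma
```

Fitting `mu` and `sigma` on the full dataset before splitting would leak test-set information into training, which is one of the biases the course teaches you to recognise.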
Capacity: 25
Event types:
- Workshops and courses