Description

This course will give an introduction to the concept of Neural Networks (NN) and Deep Learning.

Topics covered will include:

  • NN building blocks, including concepts such as neurons, activation functions, loss functions, gradient descent and back-propagation
  • Convolutional Neural Networks
  • Recurrent Neural Networks
  • Transformers and attention-based models
  • Autoencoders
  • Best practices when designing NNs

The latest information about the course can be found on the course website.

The course fee is 3000 SEK for academic users and 15000 SEK for non-academic users.

Event Details

Dates
4 - 8 May 2026
Application deadline
4 April 2026, 23:59
Contact

edu.neural-nets-deep-learning@nbis.se

Venue
Linköping University Hospital Campus (Campus US)
City
Linköping
Country
Sweden
Language
English
Cost
3000 SEK (cost incurred by all)
Timezone
Stockholm

Learning Outcomes

Upon completion of this course, you will be able to:

  • Distinguish the concepts of "Artificial Intelligence", "Machine Learning", "Neural Networks", "Deep Learning"
  • Distinguish between different types of learning (e.g. supervised, unsupervised, reinforcement) and recognise which applies to your own problem
  • Distinguish between linear and non-linear approaches and recognise which is best suited for application to your own problem
  • Describe what a feed-forward neural network (FFNN) is, along with its components (neurons, layers, weights, bias, activation functions, cost functions)
  • Explain how training of a FFNN works from a mathematical point of view (gradient descent, learning rate, backpropagation)
  • Execute with pen and paper a few steps of training of a very simple FFNN model
  • Distinguish between a shallow and a deep network
  • Explain broadly how different NN architectures are wired and how they work
  • Implement and apply the most appropriate architecture to a given problem/dataset
  • Analyze training curves and prediction outputs to evaluate if the training has been successful
  • Debug possible issues with the training and suggest changes to fix them
  • Explain the difference between training, validation and testing
  • Define what overfitting is from a mathematical point of view, and what issues it causes
  • Identify good practices of dataset design and avoid introducing information leakage or other biases when building your own datasets
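To make the "pen and paper" outcome above concrete, here is a minimal sketch (not course material) of one gradient-descent training step for a single sigmoid neuron with a squared-error loss. All values (weight, bias, input, target, learning rate) are arbitrary assumptions chosen for illustration.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

w, b = 0.5, 0.0   # initial weight and bias (arbitrary)
x, y = 1.0, 1.0   # one training example: input and target
lr = 0.1          # learning rate

# Forward pass
z = w * x + b
a = sigmoid(z)              # prediction
loss = 0.5 * (a - y) ** 2   # squared-error loss

# Backward pass (chain rule): dL/da = (a - y), da/dz = a * (1 - a)
delta = (a - y) * a * (1 - a)
grad_w = delta * x
grad_b = delta

# Gradient-descent update
w -= lr * grad_w
b -= lr * grad_b
```

Repeating the forward pass with the updated `w` and `b` gives a slightly lower loss; iterating these two passes is, in essence, what training a feed-forward network does at scale.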

Prerequisites & Technical Requirements

Prerequisites

Required to follow the course and complete the computer exercises:

  • You are familiar with Unix/Linux
  • You are able to bring your own laptop with Python and Jupyter Notebooks installed for the practical exercises
  • You have programming/scripting experience in Python (e.g. having attended the NBIS workshop in basic Python or equivalent)
  • You have basic knowledge of statistics and mathematics (e.g. having attended the NBIS workshop Introduction to Biostatistics and Machine Learning or equivalent)

Desirable:

  • You have experience of working with Jupyter Notebooks
  • You need to work with large datasets (e.g. thousands of samples)

Due to limited space, the course can accommodate a maximum of 25 participants. If we receive more applications, participants will be selected based on meeting the entry requirements, motivation to attend the course, and gender and geographical balance.


Technical requirements

Students will bring their own laptops and install the tools necessary to complete the practical exercises.

Further instructions will be available on the course webpage a few weeks before the course start.

Topics & Tags

Keywords
Neural Network, AI, Machine Learning, Life sciences

Affiliations & Networks

Associated nodes
SciLifeLab
Target audience
PhD students, postdocs, researchers, research support staff, research engineers
