Algorithms in Bioinformatics - #22125

Information for participants

GENERAL SCHEDULE
Lectures will be in the morning from 9.00 to 12.00, and exercises in the afternoon from 13.00 to 17.00.

The morning sessions will consist of lectures and small practical exercises introducing the different algorithms, and the afternoon sessions will consist of programming exercises where the algorithms will be implemented.

The main programming language will be Python, and all program templates provided in the course will be written in Python. Prior detailed knowledge of Python programming is NOT required, but it will make following the course much easier. However, basic programming skills are required to follow the course.

PROGRAMS AND TOOLS


Course Programme
Please note that the programme is updated on a regular basis - click the 'refresh' button once in a while to make sure that you have the most up-to-date information.

LITERATURE:

O Wednesday, 4. June

Introduction to the course. PSSM construction from pre-aligned sequences, including pseudo counts. Python recap
Morten Nielsen

  • BACKGROUND TEXTS Essentials: Additionals:
    9.00 - 9.15
    Introduction to course
    Introduction to course [PDF] (MP4).
    9.15 - 9.30
    Introduction to the immune system [PDF] (MP4).
    9.30 - 11.20 (coffee break included)
Weight matrix construction [PDF]. [PPTX]. Weight matrix construction (MP4).
    Logo Handout
    Answers
    Handout. Estimation of pseudo counts
    Answer
    11.20 - 11.40
    Some notes on sequence alignment [PDF] (MP4)
    11.40 - 12.00
Questions to the morning's lectures and other general issues
Checking that we all have Python and Jupyter notebook installed and running
    12.00 - 13.00
    Lunch
    13.00 - 17.00
    A brief introduction to Python programming and Jupyter-notebooks
    Python intro
    Python Answers
    Implementation of PSSM construction from pre-aligned sequences including pseudo count correction for low counts and sequence clustering
    PSSM construction and evaluation
    PSSM answers
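
The afternoon exercise implements PSSM construction with pseudo-count correction. As a rough illustration of the idea (this is not the course template; the flat background frequencies and the weight `beta` are illustrative assumptions), a minimal sketch in Python:

```python
import math

ALPHABET = "ACDEFGHIKLMNPQRSTVWY"
# Flat background frequencies, purely for illustration; a real PSSM
# would typically use amino acid background frequencies from a database.
BG = {a: 1.0 / len(ALPHABET) for a in ALPHABET}

def pssm(peptides, beta=50.0):
    """Log-odds PSSM from pre-aligned peptides with a simple pseudo-count blend."""
    n = len(peptides)
    length = len(peptides[0])
    matrix = []
    for pos in range(length):
        counts = {a: 0 for a in ALPHABET}
        for pep in peptides:
            counts[pep[pos]] += 1
        col = {}
        for a in ALPHABET:
            # Blend observed frequency with background (pseudo counts),
            # so unobserved amino acids get a small non-zero probability.
            p = (counts[a] + beta * BG[a]) / (n + beta)
            col[a] = math.log2(p / BG[a])
        matrix.append(col)
    return matrix

def score(matrix, peptide):
    """Sum the per-position log-odds scores for a peptide."""
    return sum(col[a] for col, a in zip(matrix, peptide))
```

A peptide matching the alignment scores higher than an unrelated one, and thanks to the pseudo counts no score is ever minus infinity.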


    O Thursday, 5. June. Holiday

    O Friday, 6. June

    Sequence alignment, Dynamic programming, and Psi-Blast
    Morten Nielsen

    BACKGROUND TEXTS
    Essentials: Additionals:
    9.00 - 9.30
Questions to yesterday's lectures and exercises
    9.30 - 11.00
    Blosum matrices [PDF] (MP4)
    Sequence alignment [PDF] . [PPTX] . (MP4)
    Handout (O3)
    Handout (O2)
    Handout answers
    11.00 - 12.00
    Blast alignment heuristics, Psi-Blast, and sequence profiles [PDF] . [PPTX] .
    Psi-Blast handout.
    12.00 - 13.00
    Lunch
    13.00 - 17.00
    Implementation of the Smith-Waterman Dynamic programming algorithm
    Matrix dumps from alignment programs (to be used for debugging)
    Answers to sequence alignment exercise
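
The afternoon exercise implements the Smith-Waterman algorithm. As a bare-bones sketch of the dynamic programming recursion (score only, no traceback; the match/mismatch/gap values are illustrative, not the course's BLOSUM-based scoring):

```python
def smith_waterman(a, b, match=2, mismatch=-1, gap=-2):
    """Best local alignment score via the Smith-Waterman recursion."""
    rows, cols = len(a) + 1, len(b) + 1
    M = [[0] * cols for _ in range(rows)]  # first row/column stay 0
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = M[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            # Local alignment: the max with 0 lets an alignment restart anywhere.
            M[i][j] = max(0, diag, M[i - 1][j] + gap, M[i][j - 1] + gap)
            best = max(best, M[i][j])
    return best
```

The matrix dumps provided for the exercise can be compared cell by cell against `M` when debugging.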

    O Monday, 9. June. Holiday

    O Tuesday, 10. June
    Data redundancy reduction algorithms
    Optimization methods
    Gibbs sampling
    Morten Nielsen

    BACKGROUND TEXTS - Data redundancy reduction and Gibbs sampling
    Essentials: Additionals:
    9.00 - 9.30
    Questions to yesterday's lectures and exercises
    9.30 - 10.00
    Data redundancy reduction algorithms (Hobohm1 and Hobohm2) [PDF]. [PPTX]. (MP4).
    10.00 - 10.45
    Optimization procedures - Gradient descent, Monte Carlo
    Optimization procedures [PDF] [PPTX] (MP4)
    GD handout
    10.45 - 11.00
    Break
    11.00 - 12.00
    Gibbs sampling and Gibbs clustering
    Gibbs sampling [PDF] . [PPTX] . (MP4).
    12.00 - 13.00
    Lunch
    13.00 - 17.00
    Hobohm data redundancy reduction algorithms
    Answers to Hobohm programming exercise
    Implementation of a Gibbs sampling algorithm for prediction of MHC class II binding
    Answers
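
The Hobohm-1 part of the afternoon exercise reduces to a short greedy loop. A minimal sketch (the 75% identity cutoff and the helper names are illustrative assumptions; the course exercise uses alignment-based similarity):

```python
def identity(a, b):
    """Fraction of identical positions between two sequences (toy similarity)."""
    return sum(x == y for x, y in zip(a, b)) / min(len(a), len(b))

def hobohm1(sequences, similar):
    """Hobohm-1: scan a pre-sorted list (best-annotated first) and keep a
    sequence only if it is not similar to any sequence already kept."""
    unique = []
    for seq in sequences:
        if not any(similar(seq, kept) for kept in unique):
            unique.append(seq)
    return unique
```

Because each candidate is compared only against the (short) unique list, Hobohm-1 is much faster than the all-against-all comparison needed by Hobohm-2.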

    O Wednesday, 11. June
    Hidden Markov Models
    Morten Nielsen

  • BACKGROUND TEXTS
    9.00 - 9.30
    Questions to yesterday's lectures and exercises
    9.30 - 11.30
    Hidden Markov models
    Viterbi decoding, Forward/Backward algorithm, Posterior decoding, Baum-Welch learning
    [PDF]. [PPTX] HMM (MP4)
    Viterbi Handout
    Answers
    Forward Handout
    Answers
    11.30 - 12.00
    Profile Hidden Markov Models
    12.00 - 13.00
    Lunch
    13.00 - 17.00
    Implementation of Viterbi and posterior decoding.
    Hidden Markov exercises
    Answer to Hidden Markov exercises
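
The Viterbi part of the afternoon exercise finds the most probable state path through an HMM. A minimal log-space sketch (the dict-based model representation is an illustrative choice, not the course template):

```python
import math

def viterbi(obs, states, start, trans, emit):
    """Most probable hidden state path, computed with the log-space
    Viterbi recursion plus a backpointer traceback."""
    V = [{s: math.log(start[s]) + math.log(emit[s][obs[0]]) for s in states}]
    back = []
    for o in obs[1:]:
        col, ptr = {}, {}
        for s in states:
            # Best predecessor state for s at this position.
            best_prev = max(states, key=lambda p: V[-1][p] + math.log(trans[p][s]))
            ptr[s] = best_prev
            col[s] = V[-1][best_prev] + math.log(trans[best_prev][s]) + math.log(emit[s][o])
        V.append(col)
        back.append(ptr)
    # Trace back from the best final state.
    state = max(states, key=lambda s: V[-1][s])
    path = [state]
    for ptr in reversed(back):
        state = ptr[state]
        path.append(state)
    return path[::-1]
```

On the classic fair/loaded-coin example, a run of heads decodes to the loaded state.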

    O Thursday, 12. June
    Cross validation and training of data driven prediction methods. Stabilization matrix method (SMM)
    Morten Nielsen

  • BACKGROUND TEXTS
    9.00 - 9.30
    Questions to yesterday's lectures and exercises
    9.30 - 10.00
    Cross validation and training of data driven prediction methods
    [PDF] . [PPTX] . [MP4].
    10.00 - 10.45
    Stabilization matrix method (SMM) background
    [PDF] . [PPTX] . [MP4].
    SMM handout
    10.45 - 11.00
    Break
    11.15 - 11.30
    Description of potential projects and formation of groups
    Project suggestions, and descriptions.
    Document for project signup.
    11.30 - 12.00
    Quiz with questions capturing the essential parts of the course so far. We will go over the answers to the quiz tomorrow morning.
    12.00 - 13.00
    Lunch
    13.00 - 17.00
    Implementing and evaluating SMM algorithms using cross-validation
    Answers
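
The SMM exercise trains a linear scoring matrix by gradient descent on the squared error with an L2 penalty on the weights. A minimal stochastic-gradient sketch (the learning rate, epoch count, and function names are illustrative assumptions, not the course template):

```python
import random

def smm_train(X, y, lam=0.01, epsilon=0.01, epochs=500, seed=1):
    """Train linear weights by stochastic gradient descent on the
    squared error, with an SMM-style L2 penalty lam * w^2."""
    rng = random.Random(seed)
    w = [0.0] * len(X[0])
    for _ in range(epochs):
        # Visit the training examples in a random order each epoch.
        for i in rng.sample(range(len(X)), len(X)):
            pred = sum(wi * xi for wi, xi in zip(w, X[i]))
            err = pred - y[i]
            # Gradient of (pred - y)^2 / 2 plus the regularization term.
            w = [wi - epsilon * (err * xi + lam * wi)
                 for wi, xi in zip(w, X[i])]
    return w
```

In cross-validation, `lam` would be tuned on held-out partitions: larger values shrink the weights and reduce overfitting at the cost of some training-set accuracy.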

    O Friday, 13. June
    Artificial neural networks. Sequence encoding, feedforward and backpropagation algorithm
    Morten Nielsen

  • BACKGROUND TEXTS
    9.00 - 9.30
    Questions to yesterday's lectures and exercises
    9.30 - 10.30
    Artificial neural networks [PDF]. [PPTX]. Artificial neural networks Part 1 (MP4). Artificial neural networks Part 2 (MP4).
    Handout
    Answers
    10.30 - 10.40
    Break
    10.40 - 12.00
    Network training - backpropagation
    Training of artificial neural networks [PDF]. [PPTX]. (MP4).
    Handout
    12.00 - 13.00
    Lunch
    13.00 - 17.00
    Artificial neural networks (Feedforward and Backpropagation)
    ANN answers
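
The afternoon exercise implements the feedforward and backpropagation passes. A minimal single-hidden-layer sketch without bias terms (the network shape and names are illustrative assumptions; the course template differs in its encoding and bookkeeping):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(x, W1, W2):
    """Feedforward pass: input -> sigmoid hidden layer -> single sigmoid output."""
    h = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in W1]
    o = sigmoid(sum(w * hi for w, hi in zip(W2, h)))
    return h, o

def backprop_step(x, t, W1, W2, lr=0.5):
    """One online backpropagation update for the squared error 0.5 * (o - t)^2."""
    h, o = forward(x, W1, W2)
    delta_o = (o - t) * o * (1 - o)                        # output error signal
    delta_h = [delta_o * W2[j] * h[j] * (1 - h[j])         # propagated back
               for j in range(len(h))]
    W2 = [W2[j] - lr * delta_o * h[j] for j in range(len(h))]
    W1 = [[W1[j][i] - lr * delta_h[j] * x[i] for i in range(len(x))]
          for j in range(len(h))]
    return W1, W2
```

A single update step should move the output closer to the target, which is a handy sanity check when debugging a backpropagation implementation.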

    O Monday, 16. June
    An introduction to Deep neural network architectures
    Morten Nielsen

  • BACKGROUND TEXTS
    9.00 - 9.30
    Questions to yesterday's lectures and exercises
    9.30 - 10.00
    Trick for ANN training [PDF].
    10.00 - 10.30
    NNAlign, alignment using ANN's [PDF]
    10.30 - 11.00
    Deep Learning using FFNN and NNAlign (PPTX) [PDF]
    11.00 - 12.00
    Exercise: Constructing and training Deep ANN methods (Joakim Noeddeskov Glifford and Jonas Birkelund Nilsson)
    NNdeep exercise
    12.00 - 13.00
    Lunch
    13.00 - 17.00
    Exercise: Constructing and training Deep ANN methods (Joakim Noeddeskov Glifford and Jonas Birkelund Nilsson) cont.
    NNdeep exercise
    Answer Deep FFNN


    O Tuesday, 17. June
    An introduction to Deep neural network architectures
    Introduction to the project work
    Morten Nielsen

  • BACKGROUND TEXTS
    9.00 - 9.30
    Questions to yesterday's lectures and exercises
    9.30 - 10.00
    Doing things in C - A few examples: Alignment and ANN training
    Some code examples
    10.00 - 11.00
    Selection of projects, formation of project groups and start of project work Document for project signup.
    11.00 - 17.00
    Work on project (on your own)

    O Wednesday 18. - Tuesday 24. June. Project work
    No lectures. Project work
    Projects must be submitted (in PDF format) via CampusNet by Tuesday, 24. June, at 11.59 (just before lunch)

    O Wednesday, 25. - Thursday, 26. June. Project evaluation and exam


    Each group has 15 minutes to present their project, including 5 minutes for questions. Note that only the group will be present for the presentation of the individual projects. After the presentation, each member of the group is evaluated in an oral exam covering the complete course curriculum.

    The exam will take place in the usual classroom, building 210, room 042/048.

    Wednesday June 25th
    
    8.00 - 8.50
    Group 1	
    Gibbs sampler approach to the prediction of SGBP binding
    sites, including pseudo counts, sequence weighting, and
    clustering (Hobohm) techniques	
    Natalia (s250668), Maria (s250352), Xiaopeng (s194408)
    
    8.50 - 9.25
    Group 2	
    Comparison of FFNN and NNAlign for MHC binding prediction	
    Marco (s243116), Martina (s243118)
    
    9.25 - 10.10
    Group 3	
    Comparative study of PSSM, ANN, SMM for peptide MHC binding	
    Nicoline (s203530), Mathilde (s215063), Alberte (s215067) and Kristine (s215098)
    
    10.20 - 11.05
    Group 4	
    Compare PSSM, ANN, and SMM - predict MHC binding	
    Annekatrine (s225074), Bunia (s215085), Emma (s215090)
    
    11.05 - 11.50
    Group 5	
    Comparative study using ANN with different sequence encoding schemes	
    Magnus (s204581), Oliver (s204692), Johanna (s204657)
    
    12.45 - 13.40
    Group 6	
    Comparative study of PSSM, ANN, SMM for peptide MHC binding	
    Xavi (s243360), Luis (s243302), Miguel (s243284) and Natalia (s243547)
    
    13.40 - 14.35
    Group 7	
    Comparative study of ANN with sparse, blosum and BERT encoding	
    Astrid (s193254), Jui-Tse (s244401), Dinis (s212484), Sebastian (s233425)
    
    14.35 - 14.50
    Karolina Tudelska (s212846)
    
    14.50 - 15.05
    Saxe í Dali Wagner (s204559)
    
    Thursday June 26th
    
    9.00 - 9.55
    Group 8	
    Comparative study of ANN with different peptide encodings	
    Pablo (s243357), Victor (s243634), Antonio (s243171), Elena (s243312)
    
    9.55 - 10.40
    Group 9	
    HOBOHM-1 redundancy filtering of MHC class I binders in training data	
    Tobias (s215105), Asbjørn (s215045), and Henrik (s215065)
    
    10.40 - 11.25
    Group 10	
    Implementation of a Baum-Welch algorithm and/or Gibbs-sampling
    method for training of an HMM	
    Maximo (s247270), Tomas (s250441), Mikkel (s184240)
    
    11.25 - 12.20
    Group 11	
    Comparative study of FFNN and NNAlign for peptide MHC binding	
    Mei Lin (s194685), Magnus (s224188), Suru (s233406), Anne Sofie (s244028)
    
