Advanced School for Computing and Imaging (ASCI)

ASCI office
Delft University of Technology
Building 28, room 04.E120
Van Mourik Broekmanweg 6
2628 XE – DELFT, The Netherlands

E: asci-office@tudelft.nl
P: +31 15 27 88032

Office visiting hours
Monday, Tuesday, Thursday: 10:00 – 15:00

Directions

The ASCI office is located on the Delft University of Technology campus. It is easily accessible by bicycle, public transport, and car. The building numbers can help you find your way around the campus; make sure you remember the name and building number of your destination.

Contact us at +31 15 278 8032 or send us an email at asci-office@tudelft.nl.

A16 - ASCI Winterschool on Efficient Deep Learning

Dates  November 22-25, 2021
ECTS  5
Registration  Maximum number of participants reached

On Thursday, Nov 25, 2021, we finished our Winter School on Efficient Deep Learning. We look back on a successful winter school with very good presentations and a lot of discussion and interaction. The speakers' presentations can be found via the links below; the missing presentations will be added as soon as possible.

Monday

Henk Corporaal / Opening and intro to the course

Jan van Gemert / Intro DL

Floran de Putter / Optimizing neural networks for inference: quantization from float32 to int8

Floran de Putter / Optimizing neural networks for inference: extreme quantization beyond int8

Tuesday

Pascal Mettes / Hyperbolic deep learning

Henk Corporaal – Floran de Putter / Optimizing neural networks for inference: exploiting data reuse

Floran de Putter / Joint Architecture-Mapping Design Space Exploration for DNN Accelerators

Wednesday

Lydia Chen / Hyperparameter Tuning for Minimum Overhead and Optimal Inference Experience

Willem Sanberg – Sebastian Vogel – Hiram Rodriguez / Challenges of scaling and applying NAS for embedded systems

Sebastian Vogel / Efficient NNs without multipliers and ML-enabled RISC-V for NNs

Damian Podareanu / Gigapixel Semantic Segmentation for Histopathology

Giuseppe Sarda / Analog in-Memory Computing design and integration for deep neural network inference at the edge

Lin Wang / Introduction to Edge AI Systems

Henk Dreuning / Data-, model- and pipeline-parallelism and memory efficiency in DNN training

Thursday

Bojian Yin / Tutorial on spiking neural networks on time-based data

Paul Detterer / Neuromorphic Computing with the μBrain Chip

Julien Dupeyroux / Embedding neuromorphic systems onboard drones for greater performance

Jan van Gemert / Transformers

————————————————————————————————————————

Update: The provisional schedule for the A16 Winter School on Efficient Deep Learning has been released!

Please have a look at it here.

Update: Maximum number of participants reached.

The maximum number of participants has been reached. If you want to be placed on the reserve list, please send an e-mail to asci.office@tudelft.nl. Only ASCI or EDL members can apply.

————————————————————————————————————————

Location: Kasteel Oud-Poelgeest, link to website.

This winter school will be co-organized with, and joined by, EDL (an NWO Perspectief programme on Efficient Deep Learning). Please be advised that only a limited number of places is available.

Course content

Machine learning has numerous important applications in intelligent systems within many areas, such as automotive, avionics, robotics, healthcare, well-being, and security. The recent progress in Machine Learning, and particularly in Deep Learning (DL), has dramatically improved the state of the art in object detection, classification and recognition, and in many other domains. Whether it is superhuman performance in object recognition or beating human players in Go, the astonishing success of DL is achieved by deep neural networks. However, the complexity of DL networks for many practical applications can be huge, and processing them may demand high computing effort and excessive energy consumption. Training them requires huge data sets, making training orders of magnitude more intensive than the already very demanding inference phase. A new development is to move intelligence from the cloud to the IoT edge; this further stresses the need to tame the complexity of DL and Deep Neural Networks.

This joint ASCI-EDL winter school covers various topics aimed at reducing the complexity of DL, including:

  • Architectural and Hardware accelerator support for DL, with emphasis on energy reduction, computation efficiency and/or computation flexibility, both for inference and/or for learning;
  • Spiking and brain-inspired neural networks and their implementation;
  • Efficient mapping of DL applications to target architectures, including many-core, GPGPU, SIMD, FPGA, and HW accelerators;
  • Exploiting temporal and spatial data reuse, sparsity, quantization and approximate computing, dynamic neural networks, and other methods to decrease the complexity and energy demands of DL (see the quantization sketch right after this list);
  • Efficient learning approaches, including data reduction, online learning, and quality of learning;
  • Tools, Frameworks and High-level programming language support for DL;
  • NAS: Neural Architecture Search, including hardware-aware NAS;
  • Advanced applications exploiting DL.
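
As an illustration of the quantization topic above, here is a minimal, hypothetical sketch (not taken from any of the lectures) of symmetric per-tensor post-training quantization of float32 weights to int8, written in plain NumPy. The function and variable names are invented for this example, and real toolchains additionally handle activations, per-channel scales, and zero points.

    import numpy as np

    def quantize_int8(w):
        # Symmetric per-tensor quantization: map the largest weight magnitude onto 127.
        # The small floor avoids division by zero for an all-zero tensor (illustrative guard).
        scale = max(np.max(np.abs(w)) / 127.0, 1e-12)
        q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
        return q, scale

    def dequantize(q, scale):
        # Approximate reconstruction of the original float32 weights.
        return q.astype(np.float32) * scale

    # Quantize a random float32 weight matrix and measure the error this introduces.
    w = np.random.randn(256, 256).astype(np.float32)
    q, scale = quantize_int8(w)
    error = np.mean(np.abs(w - dequantize(q, scale)))
    print(f"mean absolute quantization error: {error:.5f}")

Storing the weights as int8 cuts their memory footprint by roughly a factor of four at the cost of a small reconstruction error; the quantization lectures listed above discuss how such schemes are applied in practice and pushed beyond int8.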

The above topics will be treated by experts from the Netherlands and abroad.

Required background: Basic knowledge of deep learning and computer architecture.

Assessment
ASCI students can get 5 ECTS credits for this course. To obtain these credits, they have to complete a lab/research study related to one or more of the treated topics.

Responsible Lecturer

Henk Corporaal
Andy Pimentel
Henri Bal
Rob van Nieuwpoort
and others

Education Period:

Nov 22-25, 2021

Time schedule:

Will follow soon.

Location: Kasteel Oud-Poelgeest (close to Leiden)