ESWEEK 2021 Education - Spiking Neural Networks

Author: Embedded Systems Week (ESWEEK)

Uploaded: 2021-11-03

Views: 26262

Description: ESWEEK 2021 - Education Class C1, Sunday, October 10, 2021
Instructor: Priyadarshini Panda, Yale

Abstract: Spiking Neural Networks (SNNs) have recently emerged as an alternative to deep learning due to their huge energy efficiency benefits on neuromorphic hardware. In this presentation, we present important techniques for training SNNs that bring large benefits in terms of latency, accuracy, interpretability, and robustness. We will first delve into how training is performed in SNNs. Training SNNs with surrogate gradients offers computational benefits due to short latency and is also considered a more bio-plausible approach. However, due to the non-differentiable nature of spiking neurons, training becomes problematic, and surrogate methods have thus been limited to shallow networks compared to the conversion method. To address this training issue with surrogate gradients, we will also go over a recently proposed method, Batch Normalization Through Time (BNTT), that allows us to target interesting applications with SNNs beyond traditional image classification, such as video segmentation. Another critical limitation of SNNs is the lack of interpretability. While a considerable amount of attention has been given to optimizing SNNs, the development of explainability is still in its infancy. I will talk about our recent work on a bio-plausible visualization tool for SNNs, called Spike Activation Map (SAM), which is compatible with BNTT training. The proposed SAM highlights spikes with short inter-spike intervals, which carry discriminative information for classification. Finally, with the proposed BNTT and SAM, I will highlight the robustness of SNNs with respect to adversarial attacks. In the end, I will talk about interesting prospects of SNNs for non-conventional learning scenarios such as federated and distributed learning.
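To make the training approach described in the abstract concrete, below is a minimal, hypothetical sketch (not the speaker's code) of surrogate-gradient training for a leaky integrate-and-fire layer in PyTorch, with a separate batch-normalization module per timestep in the spirit of BNTT. All names (SpikeFn, LIFLayer) and hyperparameters (the leak factor, threshold, and fast-sigmoid surrogate) are illustrative assumptions, not the exact method presented in the talk.

```python
# Minimal sketch (assumption, not the speaker's implementation) of
# surrogate-gradient training for a leaky integrate-and-fire (LIF) layer,
# with one BatchNorm per timestep in the spirit of BNTT.
import torch
import torch.nn as nn

class SpikeFn(torch.autograd.Function):
    """Heaviside spike in the forward pass, smooth surrogate gradient in backward."""
    @staticmethod
    def forward(ctx, membrane):
        ctx.save_for_backward(membrane)
        return (membrane > 0).float()          # spike if membrane potential crossed threshold

    @staticmethod
    def backward(ctx, grad_out):
        (membrane,) = ctx.saved_tensors
        # Fast-sigmoid style surrogate: smooth stand-in for the Heaviside derivative.
        surrogate = 1.0 / (1.0 + 10.0 * membrane.abs()) ** 2
        return grad_out * surrogate

class LIFLayer(nn.Module):
    def __init__(self, in_features, out_features, timesteps, beta=0.9, threshold=1.0):
        super().__init__()
        self.fc = nn.Linear(in_features, out_features)
        # BNTT-style: one BatchNorm per timestep instead of a single shared one.
        self.bntt = nn.ModuleList(nn.BatchNorm1d(out_features) for _ in range(timesteps))
        self.timesteps, self.beta, self.threshold = timesteps, beta, threshold

    def forward(self, x):                      # x: (batch, in_features), static input repeated over time
        mem = torch.zeros(x.size(0), self.fc.out_features, device=x.device)
        spikes = []
        for t in range(self.timesteps):
            current = self.bntt[t](self.fc(x))         # time-specific normalization
            mem = self.beta * mem + current            # leaky integration
            spk = SpikeFn.apply(mem - self.threshold)  # non-differentiable firing, surrogate gradient
            mem = mem - spk * self.threshold           # soft reset after a spike
            spikes.append(spk)
        return torch.stack(spikes).mean(0)             # rate-coded output averaged over time

# Hypothetical usage on random data, just to show the shape of a training step.
layer = LIFLayer(in_features=20, out_features=5, timesteps=8)
optim = torch.optim.Adam(layer.parameters(), lr=1e-3)
x, y = torch.randn(32, 20), torch.randint(0, 5, (32,))
loss = nn.CrossEntropyLoss()(layer(x), y)
loss.backward()                                        # gradients flow through the surrogate
optim.step()
```

The sketch illustrates two of the ideas mentioned above: the hard threshold is replaced by a smooth surrogate only in the backward pass, and giving each timestep its own normalization statistics (the BNTT idea) lets the input distribution differ across the temporal unrolling.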

Bio: Priya Panda is an assistant professor in the Electrical Engineering department at Yale University, USA. She received her B.E. and Master's degrees from BITS Pilani, India in 2013 and her PhD from Purdue University, USA in 2019. During her PhD, she interned at Intel Labs, where she developed large-scale spiking neural network algorithms for benchmarking the Loihi chip. She is the recipient of the 2019 Amazon Research Award. Her research interests include neuromorphic computing, deep learning, and algorithm-hardware co-design for robust and energy-efficient machine intelligence.

Related videos

ESWEEK 2021 Education - Neural Network Accelerator Design
Spiking Neural Networks for More Efficient AI Algorithms
Hands-On Session with snnTorch - Jason Eshraghian, University of California Santa Cruz
Cosyne 2022 Tutorial on Spiking Neural Networks - Part 1/2
A problem about a strong password | Someone on the internet is wrong again #035 | Boris Trushin and Mathematician Andrey
ACACES 2023: Neuromorphic computing: from theory to applications, Lecture 1 – Yulia Sandamirskaya
Introduction to Next Generation Reservoir Computing
Neuromorphic computing with emerging memory devices
12a: Neural Nets
A memory algorithm inspired by how the brain works
Can AI develop consciousness? - Semikhatov, Anokhin
Training Spiking Neural Networks Using Lessons From Deep Learning
Brain-Like (Neuromorphic) Computing - Computerphile
Intro to Binarized Neural Networks
Dendrites: Why Biological Neurons Are Deep Neural Networks
Mike Davies: Realizing the Promise of Spiking Neuromorphic Hardware
But what is a neural network? | Chapter 1. Deep Learning
8: Spike Trains - Intro to Neural Computation
Neuromorphic Computing Explained | Jeffrey Shainline and Lex Fridman
Garrick Orchard: Spiking Neural Networks for Event-based Vision
