
Technical Workshop Series

Introduction to Transformers (1st Offering)

Presenter: Zahra Taherikhonakdar

Date: Tuesday, August 6th, 2024

Time: 3:00 PM

Location: 4th Floor at 300 Ouellette Avenue (School of Computer Science Advanced Computing Hub)

 

Abstract: 
Transformers and attention mechanisms have revolutionized deep learning, offering a powerful way to process sequential data and capture long-range dependencies. In this workshop, we will explore the basics of transformers and the importance of attention mechanisms in enhancing model performance and coherence. Attention allows a model to dynamically focus on the most relevant parts of its input, much as humans attend to particular aspects of a visual scene or conversation. This selective focus is crucial in tasks where context is key, such as language understanding and image recognition.
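
To make this concrete, here is a minimal NumPy sketch of scaled dot-product attention, the core operation inside transformers. The function name and toy dimensions are illustrative only, not taken from the workshop materials:

    import numpy as np

    def scaled_dot_product_attention(Q, K, V):
        # attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
        d_k = Q.shape[-1]
        scores = Q @ K.T / np.sqrt(d_k)      # (n_queries, n_keys) similarity scores
        # Numerically stable softmax over the key dimension.
        w = np.exp(scores - scores.max(axis=-1, keepdims=True))
        w /= w.sum(axis=-1, keepdims=True)   # each row sums to 1: attention weights
        return w @ V                         # weighted average of the value vectors

    rng = np.random.default_rng(0)
    x = rng.normal(size=(3, 4))                  # 3 tokens, 4-dim embeddings
    out = scaled_dot_product_attention(x, x, x)  # self-attention: Q = K = V
    print(out.shape)                             # (3, 4)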

 

Workshop Outline:

In this workshop, I will introduce large language models, covering:

  • Introduction to LLMs
  • Introducing common models
  • Introducing model techniques
  • Introduction to Transformers
  • Attention vs. self-attention (see the sketch after this list)
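
The contrast in the last outline item can be previewed with a small, hedged example: in self-attention, the queries, keys, and values all come from the same sequence, whereas in (cross-)attention the queries come from one sequence and the keys and values from another. The variable names and shapes below are illustrative only:

    import numpy as np

    def attention(Q, K, V):
        # Same scaled dot-product attention as in the sketch above.
        scores = Q @ K.T / np.sqrt(Q.shape[-1])
        w = np.exp(scores - scores.max(axis=-1, keepdims=True))
        w /= w.sum(axis=-1, keepdims=True)
        return w @ V

    rng = np.random.default_rng(1)
    src = rng.normal(size=(5, 4))  # e.g. 5 "encoder" tokens (hypothetical shapes)
    tgt = rng.normal(size=(3, 4))  # e.g. 3 "decoder" tokens

    # Self-attention: one sequence attends to itself (Q, K, V all from tgt).
    self_out = attention(tgt, tgt, tgt)    # -> (3, 4)

    # Cross-attention: tgt queries attend over src keys and values.
    cross_out = attention(tgt, src, src)   # -> (3, 4)
    print(self_out.shape, cross_out.shape)
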
Prerequisites:

General computer science knowledge

 

Biography: 

Zahra is a PhD student at the University of Windsor. Her research is in the area of Information Retrieval; in particular, she studies how to improve query refinement as a technique for helping search engines retrieve the documents most relevant to a user's initial query.

 

MAC STUDENTS ONLY - Register Here