
Technical Workshop Series

 

Introduction to Transformers (2nd Offering)

Presenter: Zahra Taherikhonakdar

Date: Wednesday, August 7th, 2024

Time: 11:00 AM

Location: 4th Floor at 300 Ouellette Avenue (School of Computer Science Advanced Computing Hub)

 

Abstract:

Transformers and attention mechanisms have revolutionized deep learning, offering a powerful way to process sequential data and capture long-range dependencies. In this workshop, we will explore the basics of transformers and the role attention mechanisms play in improving model performance and coherence. Attention allows a model to dynamically focus on the most relevant parts of its input, much as humans attend to particular aspects of a visual scene or conversation. This selective focus is especially important in tasks where context is key, such as language understanding and image recognition.
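
For a concrete preview of the workshop's core idea, the following is a minimal sketch of scaled dot-product attention written in NumPy. The function names, shapes, and example values are illustrative assumptions for this announcement, not material taken from the workshop itself.

    # A minimal sketch of scaled dot-product attention, the core
    # operation behind transformers (illustrative, not workshop code).
    import numpy as np

    def softmax(x, axis=-1):
        # Subtract the row max for numerical stability before exponentiating.
        e = np.exp(x - x.max(axis=axis, keepdims=True))
        return e / e.sum(axis=axis, keepdims=True)

    def attention(Q, K, V):
        # Q: (seq_q, d_k) queries, K: (seq_k, d_k) keys, V: (seq_k, d_v) values.
        # Each output row is a weighted average of the value vectors, with
        # weights given by how strongly each query matches each key.
        d_k = Q.shape[-1]
        scores = Q @ K.T / np.sqrt(d_k)     # (seq_q, seq_k) similarity scores
        weights = softmax(scores, axis=-1)  # rows sum to 1: where to "look"
        return weights @ V                  # (seq_q, d_v) attended output

    # Self-attention: queries, keys, and values all come from one sequence.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(4, 8))             # 4 tokens, 8-dim embeddings
    out = attention(X, X, X)
    print(out.shape)                        # (4, 8)

The last lines illustrate self-attention, where queries, keys, and values are all derived from the same sequence; in general attention (cross-attention) they may come from different sequences, a distinction the outline's "Attention vs. self-attention" topic addresses.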

 

Workshop Outline:

In this workshop, I will introduce large language models:

  • Introduction to LLMs
  • Introducing models
  • Introducing model techniques
  • Introduction to Transformers
  • Attention vs. self-attention

Prerequisites:

Computer Science knowledge

 

Biography:

Zahra is a PhD student at the University of Windsor. Her research is in the area of Information Retrieval; in particular, it focuses on improving query refinement as a technique for helping search engines retrieve the documents most relevant to a user's initial query.

 

MAC STUDENTS ONLY - Register Here