Date: July 25, 2024
Time: 9:00 – 17:00 (CEST)
Venue: Rome, CNR IAC headquarters, Via dei Taurini 19, Ground Floor room
The H2IOSC project is promoting a one-day training on Building Transformer-Based Natural Language Processing. Organized by CNR IAC and delivered by NVIDIA, the training is open to the wider H2IOSC community as well as to external participants.
Gain an understanding of how transformers are used as the foundation of modern large language models (LLMs). Learn how to use these models for various Natural Language Processing (NLP) tasks, including text classification, named-entity recognition (NER), author attribution, and question-answering. Additionally, understand how to analyze various model features, constraints, and characteristics to determine which model is best suited for a particular use case based on metrics, domain specificity, and available resources.
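To give a flavour of the hands-on material, the following is a minimal sketch of applying pretrained transformer models to two of the NLP tasks listed above, using the Hugging Face transformers library. This is purely illustrative and is an assumption on our part; the toolkit actually used in the course may differ.

```python
# Illustrative sketch (not necessarily the course toolkit): applying
# pretrained transformer models to named-entity recognition and
# question answering with the Hugging Face `transformers` pipelines.
from transformers import pipeline

# Named-entity recognition with a default pretrained model;
# aggregation_strategy="simple" merges sub-word tokens into full entities.
ner = pipeline("ner", aggregation_strategy="simple")
print(ner("CNR IAC hosts the H2IOSC training in Rome."))

# Extractive question answering over a short context passage.
qa = pipeline("question-answering")
print(qa(question="Where is the training held?",
         context="The H2IOSC training takes place at the CNR IAC headquarters in Rome."))
```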
Prerequisites
Participants should have experience with Python programming and with using library functions and parameters. Familiarity with a deep learning framework, such as TensorFlow, PyTorch, or Keras, and a basic understanding of neural networks are also required.
It will also be possible to attend the course online by sending a registration request to the following email addresses: gabriella.bretti@cnr.it; paolo.rughetti@cnr.it
