Brookhaven AIMS Tutorial: Generating molecules via transformer: a brief introduction

US/Eastern
Description

In 2023, we expanded the Brookhaven AIMS series to include hands-on tutorials organized by the BNL Computational Science Initiative, aimed at researchers, educators, and students new to machine learning and artificial intelligence. The tutorial presentations are designed for a general audience, and minimal prior experience is required.

 

    • 12:00 → 13:00
      Generating molecules via transformer: a brief introduction 1h

      Abstract: This workshop tutorial will first discuss the motivation and structure of the transformer, arguably the most popular machine learning architecture in the current literature and in industry practice. In particular, it will explain the attention mechanism, which is the basis of the transformer, and how it satisfies several desirable model properties (e.g., permutation equivariance, size independence, and global all-to-all attention). Next, it will cover the main categories of the transformer (encoder-decoder, encoder-only, and decoder-only) and how they serve different applications. Finally, it will demonstrate a toy example of molecular generation (in text representation, e.g., SMILES/SELFIES) using the transformer. This tutorial is intended for beginners and for chemists who wish to apply transformers to molecular data. (A minimal code sketch of the attention mechanism follows this entry.)

      Speaker Biography: Tim Hsu is a staff data scientist at Lawrence Livermore National Laboratory (LLNL) working on generative models for molecules and materials. He joined LLNL in 2020 as a postdoctoral researcher and studied graph neural networks for atomic structures. Prior to that, he worked on synthesizing 3D microstructure microscopy images with a convolutional neural network generative model. Tim received his PhD in Materials Science and Engineering from Carnegie Mellon University in 2019.

      Speaker: Tim Hsu (Lawrence Livermore National Laboratory)
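
Below is a minimal, illustrative sketch of single-head scaled dot-product attention in NumPy. It is not taken from the tutorial materials; all function and variable names are assumptions. It is included only to make two properties named in the abstract concrete: the same weights work for any number of input tokens (size independence), and permuting the input tokens permutes the output rows identically (permutation equivariance, in the absence of positional encodings).

# A minimal sketch of single-head scaled dot-product attention (NumPy only).
# Illustrative code, not part of the tutorial materials; all names are assumptions.
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)   # subtract max for numerical stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention(X, Wq, Wk, Wv):
    """Self-attention over token embeddings X of shape (n_tokens, d_model)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])   # global all-to-all token interactions
    return softmax(scores, axis=-1) @ V

rng = np.random.default_rng(0)
d = 8
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))

for n in (5, 11):                             # size independence: any sequence length
    X = rng.standard_normal((n, d))
    out = attention(X, Wq, Wk, Wv)

    perm = rng.permutation(n)                 # permutation equivariance: shuffling the
    out_perm = attention(X[perm], Wq, Wk, Wv) # input rows shuffles the output rows the
    assert np.allclose(out[perm], out_perm)   # same way (no positional encoding here)

In the setting described in the abstract, attention layers like this one would typically be stacked inside a decoder-only transformer that generates a SMILES or SELFIES string one token at a time, conditioned on the tokens generated so far.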