
Course Description

Overview 

Technologies like OpenAI's ChatGPT and Google's Bard are changing the way we work, and generative AI is only becoming more widespread. As a data professional or developer, you need to understand how these tools work under the hood and how you can leverage large language models (LLMs) in your own work.

LLMs have revolutionized the field of natural language processing (NLP) and are increasingly being used to solve a wide range of NLP problems across industries. Understanding LLMs can help developers and data scientists like you:

Build better NLP models: LLMs are the state of the art for many NLP tasks, and understanding how they work helps you build stronger models and achieve better performance on your own tasks.

Develop custom NLP applications: LLMs can be fine-tuned for specific NLP tasks, making them highly adaptable to different domains and use cases. Developers and data scientists who understand LLMs can leverage this flexibility to build custom NLP applications for their specific needs.

Optimize model performance: Understanding LLMs helps you choose the appropriate architecture, prompt-engineering approach, fine-tuning strategy, and downstream tasks for your specific use case.

With the recent release of OpenAI's GPT-4 language model, LLMs are already being used by Morgan Stanley Wealth Management to organize its vast knowledge base, by Be My Eyes to transform visual accessibility, by Stripe to streamline user experience and combat fraud, and by the Government of Iceland to preserve its language.

This course will give you a comprehensive understanding of the latest techniques, tools, and applications of LLMs so you can build applications and processes around them and work with large language models more effectively and efficiently.

Who Should Enroll 

  • Software Engineers and Developers
  • Data Analysts and Data Scientists
  • Machine Learning / Artificial Intelligence Engineers
  • Programmers and Developers looking to apply NLP, machine learning, and prompt engineering to their stack

What You Will Learn 

  • How to apply and integrate various LLMs into an organization’s existing data infrastructure and systems. This includes understanding open-source alternatives to ChatGPT, Sydney, and Bard.
  • Understanding the evolution of transformer architectures and the historical context behind ChatGPT, including different types of LLMs and their life cycles (pretraining, fine-tuning, and inference).
  • Familiarity with various machine learning paradigms, including unsupervised, supervised, self-supervised, and in-context learning.
  • Knowledge of different downstream tasks that LLMs can be applied to, such as prediction, extraction, sequence labeling, sequence transformation, and generation.
  • Ability to perform prompt engineering and effective fine-tuning, including prompt construction, effective completions, and understanding the tradeoffs between zero-shot, k-shot, domain/knowledge transfer, in-context learning, and supervised fine-tuning (see the short sketch after this list).
  • Understanding LLMs as components in larger architectures, including their use in embeddings for dense retrieval, recommendations, clustering, synthetic data generation, negative mining, and managing model size through knowledge distillation, pruning, and quantization.
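
To give a flavor of the prompt-construction topic above, here is a minimal, hypothetical sketch (in Python) contrasting a zero-shot prompt with a k-shot (few-shot) prompt for a sentiment-labeling task. The task, wording, and examples are placeholders, not course materials.

    # Hypothetical illustration of zero-shot vs. few-shot (k-shot) prompting
    # for a sentiment-labeling task. Only the prompt construction differs;
    # the model call itself is omitted.

    zero_shot_prompt = (
        "Classify the sentiment of the following review as Positive or Negative.\n"
        "Review: The battery died after two days.\n"
        "Sentiment:"
    )

    few_shot_prompt = (
        "Classify the sentiment of each review as Positive or Negative.\n"
        "Review: I love this phone.\nSentiment: Positive\n"
        "Review: The screen cracked within a week.\nSentiment: Negative\n"
        "Review: The battery died after two days.\n"
        "Sentiment:"
    )

The few-shot version simply prepends k labeled examples so the model can infer the expected format and labels in context.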

Course Details 

This course offers multiple modes of learning that will allow you to get up to speed quickly on the tools, techniques, and applications of LLMs. These include:

  • 90-minute live sessions offered virtually each week. Sessions include live instruction with Q&A and guided demonstrations where learners can follow along in their own lab environment.
  • Hands-on assignments to independently practice and hone skills with programming notebooks.
  • Curated readings and case studies to expand your knowledge of LLMs.

Prerequisites:

  • Proficient in reading and writing code in Python (understanding of different data types and the basics of object-oriented programming).
  • Intermediate to advanced experience with Python data and machine learning libraries such as NumPy, scikit-learn, and pandas.
  • Comfortable working with web applications and big data.
  • Experience working with API endpoints and the ability to write simple Python code to send requests and parse responses from an endpoint (see the sketch after this list).
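
As a rough self-check for the last prerequisite, you should be able to read a snippet like the one below, which sends a request to a JSON endpoint with the requests library and parses the response. The URL and fields are hypothetical placeholders, not part of the course.

    import requests

    # Hypothetical endpoint and payload, shown only as a self-check;
    # the course does not use this URL.
    response = requests.post(
        "https://api.example.com/v1/echo",
        json={"message": "hello"},
        timeout=10,
    )
    response.raise_for_status()  # raise on HTTP error status codes
    data = response.json()       # parse the JSON body into a dict
    print(data.get("message"))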

Course requirements:

  • Google Colab Pro subscription (Gmail account required)
  • OpenAI account and API key (see OpenAI's pricing page for current rates); a quick way to verify your key is sketched below.
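
As a quick check that your OpenAI account and API key are working, something like the sketch below (assuming the openai Python package, version 1.x, and the OPENAI_API_KEY environment variable) should print a short completion. The model name and prompt are placeholders, not course requirements.

    from openai import OpenAI

    # Assumes OPENAI_API_KEY is set in the environment; the model name
    # below is a placeholder, not a course requirement.
    client = OpenAI()
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "Say hello in one short sentence."}],
    )
    print(response.choices[0].message.content)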

Enroll Now

  • Section Title: Foundations of Large Language Models: Tools, Techniques, and Applications
  • Type: Online
  • Dates: May 07, 2024 to Jun 16, 2024
  • Course Fee (non-credit): $1,450.00