Building LLM applications with Python
Overview
Undeniably, large language models (LLMs) are at the centre of a modern gold rush in technology.
Who is this for?
Students, developers, and anyone interested in getting started with the theory and practice of building LLM-based applications with Python.
Who is leading the session?
The session is led by Dr. Stelios Sotiriadis, CEO of Warestack, Associate Professor and MSc Programme Director at Birkbeck, University of London. His expertise includes cloud computing, distributed systems, and AI engineering.
Stelios holds a PhD from the University of Derby, completed a postdoctoral fellowship at the University of Toronto, and has worked with Huawei, IBM, Autodesk, and several startups. Since 2018 he has taught at Birkbeck and, in 2021, founded Warestack, building software for startups globally.
What we’ll cover
A practical introduction to using local models and cloud APIs to build real software systems.
You will learn:
- Introduction to natural language processing
- LLMs theory and intuition
- What agents are and how to build them
- Running local models with Ollama (free and offline)
- Calling local models using Python
- Building a ChatGPT-like chatbot with Python libraries
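As a taste of what "calling local models using Python" looks like, here is a minimal sketch that sends a prompt to a locally running Ollama server over its HTTP API. It assumes Ollama is installed and running on its default port (11434) and that a model such as `llama3` has already been pulled with `ollama pull llama3`; the model name is just an illustrative choice.

```python
# Minimal sketch: prompting a local Ollama model over its HTTP API.
# Assumes Ollama is running locally on its default port and a model
# (here "llama3", an assumed example) has already been pulled.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # default Ollama endpoint


def build_request(model: str, prompt: str) -> dict:
    """Build the JSON payload for a single, non-streaming generation."""
    return {"model": model, "prompt": prompt, "stream": False}


def ask(model: str, prompt: str) -> str:
    """Send a prompt to the local model and return its text response."""
    payload = json.dumps(build_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    print(ask("llama3", "Explain what an LLM is in one sentence."))
```

Because Ollama runs entirely on your machine, this works offline and costs nothing per request, which is why the session uses it for hands-on exercises.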
Requirements
- A laptop with Python (Windows, macOS, or Linux)
- Visual Studio Code installed
- Python pip installed
- At least 10 GB of free disk space
- At least 8 GB of RAM
The disk space and memory are needed to run local models.
You may also use the lab computers if your device doesn’t meet the requirements.
Format
A 1.5-hour live session including:
- Interactive theory
- Hands-on coding
- Step-by-step exercises
The session will run in person, with streaming available for remote attendees.
Prerequisites
You should be comfortable writing Python scripts (basic to intermediate level).
Good to know
Highlights
- 1 hour 30 minutes
- Online
Location
Online event
Organized by
Stelios Sotiriadis