
AI and Python - Ollama for Local LLM AI Usage





Class Notes:


Class Description:

Ollama allows you to run LLMs locally and get the power of AI without having to deal with OpenAI or other API services. Ollama can run at the command line, or, with a Python module, you can interact with it from your Python scripts.



Ollama is a framework that allows you to run a huge number of different LLMs, from tiny ones up to the new 405B-parameter model from Meta. You don't need an expensive new computer: I've run Microsoft's Phi-3 model on a 2012 MacBook Pro with 4GB of RAM. (It's slow... but it works...)
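Once Ollama is installed, the whole command-line workflow boils down to two commands: pull a model's weights, then run it. A minimal sketch, using Phi-3 from the paragraph above as the example model (the guard at the top just keeps the snippet harmless on a machine where Ollama isn't installed yet):

```shell
# Two core Ollama commands: pull downloads model weights, run talks to the model.
if command -v ollama >/dev/null 2>&1; then
  ollama pull phi3                          # one-time download of the weights
  ollama run phi3 "Why is the sky blue?"    # one-shot prompt; prints the reply
else
  echo "ollama not found -- install it from https://ollama.com first"
fi
```

Running `ollama run phi3` with no prompt argument instead opens an interactive chat session in the terminal.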


We'll learn how to install and run Ollama on your system, and then how to connect your Python scripts to it.


The class will go over:

  • What is Ollama

  • What are LLMs

  • Installing Ollama and Pulling Models

  • Running Ollama at the command line

  • Connecting Python Scripts to Ollama
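The last bullet, connecting Python scripts to Ollama, can be sketched with nothing but the standard library, because Ollama serves a local REST API (port 11434 by default). The model name and prompt below are just placeholders; swap in whatever model you've pulled:

```python
import json
import urllib.request

# Ollama's local REST endpoint for one-shot text generation (default port).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model, prompt):
    """Build the JSON body for a single, non-streaming generation request."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask(model, prompt):
    """POST a prompt to a locally running Ollama server and return its reply."""
    data = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    try:
        print(ask("phi3", "Explain what an LLM is in one sentence."))
    except OSError:
        print("Could not reach Ollama -- is the server running?")
```

There is also an official `ollama` package on PyPI that wraps this same API, which we'll cover in class; the point here is that "connecting Python to Ollama" is just HTTP to localhost.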


