AI Learning and Exploration

2/10/2025

Introduction

After discussing AI strategies with friends and colleagues, I realized it would be helpful to document some of the tools I’ve worked with in different capacities. My goal is to share those insights in a convenient format for anyone curious about experimenting with AI, whether that means running models locally or leveraging cloud-based services. While this series focuses on a local macOS setup using Ollama, Docker, and Open WebUI, later entries will expand to other hosting approaches as well. Check out the entire playlist here:

AI – Learning & Exploration YouTube Playlist


Why Start with Local AI?

Running AI services on your own hardware can be incredibly empowering. Not only do you maintain ownership of your data, but you also gain the freedom to swap out models and features at will. It’s a great way to learn the fundamentals without relying solely on external providers. Although we begin with a local-first approach, future videos will tackle various deployment options, so you can choose what best suits your needs.


Ollama & Open WebUI

Ollama

Ollama is an open-source project for running large language models on your own machine. It streamlines downloading and running models through a friendly command-line interface, and once the service is up it also exposes a local HTTP API, including an OpenAI-compatible endpoint. You can switch between models for tasks like telling jokes, generating code snippets, or reasoning through complex prompts.
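As a rough sketch of what that workflow looks like (the model name here is just an example; browse the Ollama library for current options):

    # Download a model, then chat with it interactively
    ollama pull llama3.2
    ollama run llama3.2

    # See which models are installed locally
    ollama list

    # Once the service is running, the same models are reachable over HTTP;
    # the OpenAI-compatible endpoint listens on port 11434 by default
    curl http://localhost:11434/v1/chat/completions \
      -H "Content-Type: application/json" \
      -d '{"model": "llama3.2", "messages": [{"role": "user", "content": "Tell me a joke"}]}'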

Open WebUI

Open WebUI acts as a front-end portal, offering a ChatGPT-like interface on your system. After you configure it, it integrates seamlessly with Ollama, letting you chat with your locally hosted models and switch between them straight from the browser.

In one of the videos, you’ll see how to pull the Docker image for Open WebUI, create an admin account, and generate a JWT token for programmatic access. We assume you already have Docker set up—it’s a prerequisite for these instructions. If you need help installing Docker, a quick online search for Docker Desktop should get you started.
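For reference, the launch command usually looks something like the sketch below, adapted from the Open WebUI documentation at the time of writing; image tags and flags can change, so check the project’s README, and treat the API path in the second command as an assumption that may vary by version:

    # Run Open WebUI, persisting its data in a named volume;
    # host.docker.internal lets the container reach Ollama running on the host
    docker run -d -p 3000:8080 \
      --add-host=host.docker.internal:host-gateway \
      -v open-webui:/app/backend/data \
      --name open-webui \
      ghcr.io/open-webui/open-webui:main

    # Programmatic access using the JWT token generated in the admin settings
    curl http://localhost:3000/api/chat/completions \
      -H "Authorization: Bearer YOUR_JWT_TOKEN" \
      -H "Content-Type: application/json" \
      -d '{"model": "llama3.2", "messages": [{"role": "user", "content": "Hello"}]}'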


Docker for Local AI

While we don’t cover Docker installation in these videos, Docker itself makes managing these AI services much simpler: each one runs in its own container, so it’s easy to start, update, or remove without touching the rest of your system.

Running a few Docker commands will launch both Open WebUI and Ollama, creating a powerful local AI environment that integrates well with tools like code editors.
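As a minimal sketch, assuming you want both services containerized (note that Docker on macOS cannot pass the GPU through to containers, so many people run Ollama natively and containerize only Open WebUI):

    # Run Ollama in a container, persisting downloaded models in a named volume
    docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

    # Pull a model inside the running container
    docker exec -it ollama ollama pull llama3.2

With Open WebUI started as shown earlier, these two containers together form the complete local stack.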


Looking Ahead

Although local AI offers a hands-on way to maintain full control over your data and configurations, it’s just one approach. Future videos in this series will explore other hosting approaches, including cloud-based services.


Conclusion

Running AI services locally can be an excellent introduction to AI deployment. It’s fun, practical, and gives you the flexibility to adapt your setup as your needs evolve. Whether you’re a hobbyist or a professional, understanding local AI environments lays the groundwork for integrating more advanced—or cloud-based—solutions in the future.

Thanks for reading, and be sure to check out my AI – Learning & Exploration YouTube Playlist for in-depth tutorials, demos, and more tips on getting started with AI—both locally and beyond!
