How to Host Local LLMs in a Docker Container on Azure
Have you ever run into a situation where you want to test some local AI models, but your computer doesn't have enough specs to run them? Or maybe you just don't like bloating your computer with a ton of AI models?
You're not alone in this. I've faced this exact issue, and I solved it with the help of a spare VM. So the only thing you'll need is a spare machine somewhere that you can access remotely.
Here, I'm using Azure, but the process should translate easily to other cloud providers. Even if you have a homelab with an old PC or anything else you can SSH into, the only things you'll have to change are the commands that deal with Azure. Everything else will work just fine.
And the best part? We'll be doing everything inside a Docker container. So if you ever want to get rid of all the AI models, just remove the container, and you're all set. Nothing gets installed directly on the VM itself, pure Docker!