Don't Deploy Large Models Locally Carelessly: Major Security Vulnerabilities in Ollama

Learn about the security risks of deploying large models with Ollama and how to protect your system by securing exposed ports and configuring firewalls.

Meng Li
Feb 13, 2025

How to Run DeepSeek R1 Locally using Ollama

Recently, DeepSeek has become very popular, and there are numerous articles and videos on how to deploy large models locally.

Most of these use just one command with Ollama to launch the model.
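For reference, that one-command launch usually looks like the following. This is a minimal sketch; the exact model tag (e.g. deepseek-r1:7b) is an example, and you should pick the size that fits your hardware from the Ollama library:

    # Pull and run DeepSeek R1 in one step; the tag below is an example
    ollama run deepseek-r1:7b

    # Ollama then serves its HTTP API on port 11434 by default
    curl http://localhost:11434/api/tags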

However, it has recently been revealed that launching large models with Ollama this way can pose security risks.

On FOFA, a search engine for internet-exposed assets, it's easy to find many Ollama instances with port 11434 open to the internet.
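Before exposing such an instance, it is worth limiting where the API listens and who can reach it. A minimal sketch, assuming a Linux host with ufw; the trusted subnet below is a placeholder for your own network:

    # Keep Ollama bound to localhost (the default is 127.0.0.1:11434);
    # avoid OLLAMA_HOST=0.0.0.0 unless remote access is truly needed.
    # If Ollama runs as a systemd service, set this via a service override instead.
    export OLLAMA_HOST=127.0.0.1:11434

    # If remote access is needed, allow only a trusted subnet and deny the rest.
    # ufw evaluates rules in order, so the specific allow must come first.
    sudo ufw allow from 192.168.1.0/24 to any port 11434 proto tcp
    sudo ufw deny 11434/tcp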
