We have been tasked with blocking DeepSeek on all campus resources. Unfortunately, keeping users from running DeepSeek models locally on their Macs isn’t as simple as making a Restricted Software title in Jamf Pro because the models aren’t actually apps.
What we can work with, however, is Ollama – the framework under which DeepSeek models currently run. If a user has installed Ollama, why not leverage it to check for the presence of DeepSeek LLMs?
Our first step is identifying affected Macs. The command ollama list shows every LLM a user currently has installed. If we grep those results for deepseek, we can build a little script-based Extension Attribute to flag Macs with DeepSeek LLMs present.
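Here's a minimal sketch of that Extension Attribute. The Homebrew paths, the deepseek_filter helper name, and the exact result strings are my assumptions – adjust them for your fleet. Since Ollama keeps its model library per user, the script queries as the logged-in console user rather than as root.

```shell
#!/bin/bash
# Extension Attribute sketch: report any DeepSeek models installed via Ollama.
# Assumption: ollama was installed to one of the standard Homebrew locations.

# Pull model names containing "deepseek" out of `ollama list` output
# (first column is NAME; NR>1 skips the header row).
deepseek_filter() {
    awk 'NR>1 && tolower($1) ~ /deepseek/ {print $1}'
}

OLLAMA_BIN="/usr/local/bin/ollama"
[ -x /opt/homebrew/bin/ollama ] && OLLAMA_BIN="/opt/homebrew/bin/ollama"

if [ ! -x "$OLLAMA_BIN" ]; then
    echo "<result>Ollama Not Installed</result>"
else
    # Query as the console user so we see their model library, not root's.
    consoleUser=$(stat -f%Su /dev/console)
    models=$(sudo -u "$consoleUser" "$OLLAMA_BIN" list 2>/dev/null | deepseek_filter | tr '\n' ' ')
    if [ -n "$models" ]; then
        echo "<result>DeepSeek Found: $models</result>"
    else
        echo "<result>Not Found</result>"
    fi
fi
```

The "DeepSeek Found" / "Not Found" strings are what the Smart Group criteria below key off of, so keep them consistent with whatever you put in the group.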
Next, make a Smart Computer Group for detections that looks something like this:
(You could probably get away with just one criterion where DeepSeek Check – is like – *deepseek*, or do something with regex, but I did not test those.)
Next up, figure out your action plan. We’ve been asked to do the following:
We’ve got three policies to make this happen, all scoped to our “DeepSeek Detected” Smart Computer Group.
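For the removal policy itself, something like the following sketch could work: loop over `ollama list` and run `ollama rm` on anything matching deepseek. As with the Extension Attribute, the paths and the deepseek_models helper name are my assumptions, and the commands run as the console user so they hit that user's model library.

```shell
#!/bin/bash
# Removal policy sketch: delete every installed Ollama model whose name
# contains "deepseek". Assumes Homebrew-standard install paths.

# Read `ollama list` output on stdin and print matching model names
# (first column is NAME; NR>1 skips the header row).
deepseek_models() {
    awk 'NR>1 && tolower($1) ~ /deepseek/ {print $1}'
}

remove_deepseek_models() {
    local bin="/usr/local/bin/ollama"
    [ -x /opt/homebrew/bin/ollama ] && bin="/opt/homebrew/bin/ollama"
    [ -x "$bin" ] || return 0   # nothing to do if Ollama is gone

    local consoleUser
    consoleUser=$(stat -f%Su /dev/console)

    sudo -u "$consoleUser" "$bin" list 2>/dev/null | deepseek_models |
    while read -r model; do
        sudo -u "$consoleUser" "$bin" rm "$model"
    done
}

remove_deepseek_models
```

A recon afterward (Jamf runs one at the end of the policy, or you can call jamf recon yourself) updates inventory so the Mac drops out of the Smart Group once it's clean.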
JamfHelper is usually my go-to for user alerts, but it only supports two buttons. I wanted three, so I chose osascript to pop up a series of dialogs. Please use whatever tool fits best for your org.
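A three-button osascript dialog can look something like this sketch. The button labels, dialog text, and the handle_choice mapping are illustrative assumptions, not the exact wording from our policy.

```shell
#!/bin/bash
# Sketch: three-button alert via osascript, plus a handler for the response.
# Button labels and dialog text here are placeholders – use your own wording.

prompt_user() {
    /usr/bin/osascript \
        -e 'display dialog "DeepSeek has been detected on this Mac. It is not permitted on our computers and needs to be removed as soon as possible." buttons {"More Info", "Remind Me Later", "Remove Now"} default button "Remove Now" with icon caution with title "IT Security Notice"' \
        -e 'button returned of the result' 2>/dev/null
}

handle_choice() {
    case "$1" in
        "Remove Now")      echo "remove" ;;  # e.g. trigger the removal policy
        "Remind Me Later") echo "defer"  ;;  # exit; the Smart Group re-alerts later
        "More Info")       echo "info"   ;;  # e.g. open your policy documentation
        *)                 echo "defer"  ;;  # dialog dismissed or timed out
    esac
}

# In the policy script you would wire these together with:
#   handle_choice "$(prompt_user)"
```

osascript caps display dialog at three buttons, which is exactly the limit we needed here; anything fancier would mean chaining dialogs or reaching for a tool like swiftDialog.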
Our primary alert informs the user that DeepSeek has been detected on their Mac, is not permitted on our computers, and needs to be removed as soon as possible. It presents three buttons:
I should note this does not keep users from pulling DeepSeek models through Ollama, but it does help us remediate things when they do. It is also only effective while Ollama is the only way to run them. We’ll have to keep an eye out for if – or realistically when – that changes and expand our methods to cover whatever the new hotness is.