Our environment hasn't really had any need for a caching server, but I figured I'd test one out anyway. I took a server we had previously set up as a file server/Time Machine server as the guinea pig. Since this is only a test, it's fine if it only works on its own subnet. The problem is that it doesn't work at all, even with machines confirmed to be on the same subnet.
- Hardware: Mac Mini (mid 2011)
- macOS: 10.12.6
- Server Version: 5.3.1
- IP address: static
- Client OSs: 10.10 through 10.12.6
- Permissions: All Networks + matching this server's network
- Peering: All Networks
- StartupStatus = "OK"
- RegistrationStatus = 1
- CacheStatus = "OK"
- state = RUNNING
- AllowPersonalCaching = yes (temporarily, just to confirm caching works)
- LogClientIdentity = yes
- AllowSharedCaching = yes
- ListenRangesOnly = no
- LocalSubnetsOnly = no
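
For reference, the settings and status values above can be read back on the server with serveradmin. This is just a sketch of the commands as I understand them for the Server 5.x caching service; key names may vary slightly between Server versions:

```
# Dump the caching service's settings (AllowSharedCaching, LocalSubnetsOnly, etc.)
sudo serveradmin settings caching

# Dump live status, including StartupStatus, RegistrationStatus, CacheStatus, and state
sudo serveradmin fullstatus caching
```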
As far as I can tell from Apple's documentation, this setup should be fine for a server that only needs to work on a single subnet. It has always registered successfully with Apple as a viable caching server. Since these things can take some time to filter in, I let it run for about a week. No devices ever connected to it, and it has cached nothing. The server is discoverable, and I can reach it from any Mac on the subnet. It was already serving other roles on the network, so its DNS settings should be correct.
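
To confirm that nothing has actually been cached, I've been looking at the caching service's own data directory and debug log on the server. The paths below are what I believe Server 5.x uses by default; adjust if your cache volume has been moved:

```
# Watch the caching service's log for registration messages and client requests
tail -f /Library/Server/Caching/Logs/Debug.log

# See how much content the service has actually stored
sudo du -sh /Library/Server/Caching/Data
```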
The subnet itself is wired only, so I wouldn't expect iOS devices to reach this server. I would, however, expect the Macs plugged into the same switch to reach it. When I run /usr/bin/assetcachelocatorutil on client machines, every found-server count comes back zero.
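
For completeness, this is the client-side check I've been running on the 10.12 machines (I'm not sure the tool ships with 10.10/10.11). On a working setup I'd expect at least one found caching server to be reported for this subnet:

```
# The locator logs its findings to stderr, so capture both streams
/usr/bin/assetcachelocatorutil 2>&1 | grep -i "caching server"
```

As I understand it, clients locate the server through Apple based on the network's public address, so machines on the same subnet behind the same public IP should be able to find it.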
Any suggestions on how to get this caching server to actually cache literally anything?