Our master distribution point is on-prem. When replicating large packages such as the macOS installers, the sync reports success, but the file is not actually available on the destination.
If I try to replicate again the next day, it starts syncing the same file over from scratch. Is there a fix for this?
@bbot - We have also been seeing this issue for a while with our Jamf Pro setup (an on-prem master distribution point, synced manually through Jamf Admin to Jamf Cloud). We have been working with our Jamf buddy and a member of Jamf Strategic Escalations to determine whether a resolution is possible and what options are available. Here is what we received:
"We have documented issues with Jamf Admin and uploading packages there to any cloud distribution point, not just the JCDS. We have had better success with customer who move solely to the GUI for uploads. That does require switching to a cloud based DP as the master in Jamf Pro. We also have been making changes to the JCDS database to address larger package uploads or failed uploads in general. Right now, there isn't a fix if you will, we are being asked by engineering to have customers who are seeing issues try to upload the package again. We can delete the package from the GUI on a failed attempt and then upload again. We know this doesn't always fix the issue a second time and we may have to try an additional time until it works. Again, this is not a great answer, but it's the one we are being provided right now. There are some open product issues regarding package uploads to the JCDS, so I do expect some patching to be applied in the coming releases. "
I was actually guided to do the exact opposite. We were using the cloud distribution point as the master, and large uploads like the Adobe CC suite were failing through the GUI. The suggestion was to make our local DP the master, which makes more sense for deployment speed and network impact anyway, unless you were already using network segments to direct Macs on a certain IP range to the local distribution point. The issues replicating large files to the cloud continue, but I'm glad we made the switch to local.
Thanks. At this point, I've excluded network segments that are not behind our firewall from any policies whose packages have failed to copy to the cloud distribution point. That seems to be the safest route.
Thanks to Jamf support (!!!), here is what we do for our large packages:
1) Upload the large pkg directly through the AWS GUI - Chrome seems to be the fastest - or, if it is very large, use a multipart upload: https://www.youtube.com/watch?v=G4NrNGhfim8
2) "Manual" insert package into JAMF Pro with:
curl -k -v -u apiuser:apiuserpw -H "Content-Type: text/xml" https://your-jamf-pro.com:8443/JSSResource/packages/id/0 -d "<package><name>LARGE-Package-Name.dmg</name><filename>LARGE-Package-Name.dmg</filename></package>" -X POST
3) Now a little trick to get Jamf Admin to refresh its inventory of the AWS distribution point: upload a tiny package with Jamf Admin. This updates its item inventory.
4) Restart Jamf Admin - the new large package should now show as available.
5) Calculate the checksum for the large package with Jamf Admin.
6) Ready to go.
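The API step above can be sketched as a small script. This is a dry run that builds the XML body and prints the curl command instead of executing it, so it can be reviewed before touching a production server; the hostname, credentials, and package name are placeholders, and POSTing to `.../packages/id/0` follows the Classic API convention of letting Jamf Pro assign the next free id:

```shell
#!/bin/sh
# Sketch of step 2: register an already-uploaded package with the Jamf Pro
# Classic API. All values below are placeholders, not real infrastructure.
JSS_URL="https://your-jamf-pro.com:8443"   # assumption: default 8443 port
PKG_NAME="LARGE-Package-Name.dmg"

# Minimal XML body for a package record (name + filename).
XML_BODY="<package><name>${PKG_NAME}</name><filename>${PKG_NAME}</filename></package>"

# Optional (step 5 equivalent): compute the package's MD5 locally with
# openssl, which works on both macOS and Linux. Commented out because the
# file path is hypothetical.
# CHECKSUM=$(openssl dgst -md5 -r "/path/to/${PKG_NAME}" | awk '{print $1}')

# Dry run: print the curl invocation rather than running it.
echo curl -k -u apiuser:apiuserpw \
  -H "Content-Type: text/xml" \
  -X POST "${JSS_URL}/JSSResource/packages/id/0" \
  -d "${XML_BODY}"
```

Drop the `echo` (and consider dropping `-k`, which disables TLS verification) once you have confirmed the command against your own server.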