Question

How to get list of computers in batch



Hello, I am looking at the API. If an organization has more than 5000 devices registered in Jamf, I need to get the computers in batches, i.e. fetch 1000 computers per batch to reduce network latency. Can someone provide some clues?

13 replies

  • Contributor
  • 475 replies
  • May 29, 2019

When you say "get the computers" what are you referring to? Do you mean the full xml data for each computer record? Do you mean just the computer names?


  • Valued Contributor
  • 375 replies
  • May 29, 2019

Please clarify what you are trying to get. You could use a smart group instead of the API.


mm2270
  • Legendary Contributor
  • 7882 replies
  • May 29, 2019

Agreed with the posts above. If you can be more specific on what you're looking for, it may not even be necessary to use the API for this. There are ways of loosely "grouping" a number of Macs in Smart Groups using their Jamf IDs. I say "loosely" because it's hard to know if you will get exactly 1000 Macs in each group. It all depends on whether there are now unused IDs in the specific range.


  • Author
  • New Contributor
  • 5 replies
  • May 30, 2019

curl -X GET "https://tryitout.jamfcloud.com/JSSResource/computers" -H "accept: application/xml"

or

curl -X GET "https://tryitout.jamfcloud.com/JSSResource/computers/subset/basic" -H "accept: application/xml"

Either one gives me all the computers registered with Jamf, whether there are 50 or 5000. If there are 5000 computers registered, I need to fetch computers in batches of 1000: the first 1000, then the next 1000, then the next 1000, and so on.

Hope this is more clear now.


  • Contributor
  • 475 replies
  • May 30, 2019

@sampathnk There is probably not a way through the API to do what you want.

One option would be to curl down all the IDs of the computers at one time, then loop through all the IDs individually and curl down the information you want for each computer. However, if you have 6000 devices, this would be ~6001 API calls, though it would only be getting the device record XML for one device at a time. I don't know if this would accomplish your goal or not.

You could use the same commands you've provided and parse the resulting XML to output only 1000 computers, but your initial API request would still be fetching all of your devices and just chopping down the output, which does not seem to be what you want.
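The first option (one GET for the full ID list, then one GET per ID) can be sketched like this. Everything here is a stand-in so it runs without a server: `fetch_computer` is a placeholder for the real per-record curl call, and the ID list is hard-coded.

```shell
#!/bin/bash
# Sketch of the two-step approach: one call for all IDs, then one call per ID.
# fetch_computer is a placeholder for the real per-record request, e.g.:
#   curl -s -u "$jssUser:$jssPass" "$jssURL/JSSResource/computers/id/$1"
fetch_computer() {
    echo "<computer><general><id>$1</id></general></computer>"
}

# In practice this array would be parsed out of GET /JSSResource/computers
ids=(1 2 3 5 8)

fetched=0
for id in "${ids[@]}"; do
    xml=$(fetch_computer "$id")
    # ...parse or store "$xml" here...
    fetched=$((fetched + 1))
done
echo "Fetched $fetched computer records"
```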


  • New Contributor
  • 60 replies
  • May 30, 2019

I've only ever accomplished this through artificial delays, like tossing in sleep statements or targeting my script to only get certain ID ranges and then just re-run the script to target different ranges at different times.

What I did with my sleep statements was keep a counter of how many IDs I've gone through, and then do a check to see if how many IDs I've gone through is evenly divisible by 1000. If it is, then I've done my next batch of 1000 IDs, and sleep for 300 seconds.

Something like this:

if (( totalIDs % 1000 == 0 )); then sleep 300; fi

And then continue on with the for loop to get the rest of your computers.
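Putting the counter and the modulo check together, a runnable sketch looks like the following. The ID list is simulated, the batch size is 3 instead of 1000, and the pause is 0 instead of 300 seconds, just so the sketch finishes instantly:

```shell
#!/bin/bash
# Simulated batching with a counter and a modulo check.
batchSize=3      # the post above uses 1000
pauseSeconds=0   # the post above uses 300; 0 here so the sketch runs instantly

totalIDs=0
for id in 1 2 3 4 5 6 7; do
    # ...the per-computer API call would go here...
    totalIDs=$((totalIDs + 1))
    if (( totalIDs % batchSize == 0 )); then
        echo "Finished a batch of $batchSize (at ID $id); sleeping ${pauseSeconds}s"
        sleep "$pauseSeconds"
    fi
done
echo "Processed $totalIDs IDs in total"
```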


  • Author
  • New Contributor
  • 5 replies
  • June 4, 2019

Any other solutions, please?


mm2270
  • Legendary Contributor
  • 7882 replies
  • June 4, 2019

Use Advanced Computer Searches. When setting one up, switch to the advanced criteria and choose Jamf Computer ID, then add the same criteria in a second time.
In the first criteria item use “greater than” and enter 0. In the second one use “less than” and enter 1001. Run your search. This will get you approximately 1000 Macs. Again, as I said earlier, this will likely only be approximate. If you’ve purged a number of lower-ID Macs from your Jamf server, you’ll get significantly fewer than 1000, because purged IDs never get reused.

You can save the search and bring it up, edit it and just change the numeric values to get different ranges, like from 1000 to 2000 for example and so on.

The same method would work for Smart Computer Groups, but I’m unclear what your end goal is here so I don’t know if just searches will suffice.
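If you want to create such a search via the Classic API instead of the GUI, the XML body might look roughly like this. The criterion name "Jamf Computer ID" and the search types come from the description above, but the exact field names should be verified against a search exported from your own Jamf server:

```shell
#!/bin/bash
# Build the XML body for an ID-range advanced computer search.
# The criterion name and search types follow the post above; verify
# the exact fields against a search exported from your own server.
build_search_xml() {
    local min="$1" max="$2"
    cat <<EOF
<advanced_computer_search>
  <name>Batch ${min}-${max}</name>
  <criteria>
    <criterion>
      <name>Jamf Computer ID</name>
      <and_or>and</and_or>
      <search_type>greater than</search_type>
      <value>${min}</value>
    </criterion>
    <criterion>
      <name>Jamf Computer ID</name>
      <and_or>and</and_or>
      <search_type>less than</search_type>
      <value>${max}</value>
    </criterion>
  </criteria>
</advanced_computer_search>
EOF
}

build_search_xml 0 1001
```

The generated XML could then be POSTed to `/JSSResource/advancedcomputersearches/id/0` with a `Content-Type: text/xml` header, though that requires write access.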


  • Author
  • New Contributor
  • 5 replies
  • June 12, 2019

How can we use Advanced Computer Searches as an external API to trigger a specific search? I am just a REST API consumer. I only see APIs to create/update/delete Advanced Computer Searches, as listed below:

GET /advancedcomputersearches
Finds all advanced computer searches

GET /advancedcomputersearches/id/{id}
Finds computer searches by ID

PUT /advancedcomputersearches/id/{id}
Updates an existing advanced computer search by ID

POST /advancedcomputersearches/id/{id}
Creates a new advanced computer search

DELETE /advancedcomputersearches/id/{id}
Deletes a computer search by ID

GET /advancedcomputersearches/name/{name}
Finds advanced computer searches by name

PUT /advancedcomputersearches/name/{name}
Updates an existing advanced computer search by name

DELETE /advancedcomputersearches/name/{name}
Deletes a computer search by name


mm2270
  • Legendary Contributor
  • 7882 replies
  • June 12, 2019

@sampathnk The documentation is a little misleading. If you GET the saved advanced computer search by its ID or name it will actually run the search live and show you the results. I know it doesn't seem like it would do that based on those descriptions, but that's how it works. It will then be up to you to parse the results to get the data that you want.
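In other words, the consumer only needs a GET on the saved search, then parses the response. A minimal parsing sketch, using a canned XML snippet in place of the live response (whose exact shape you should confirm against your own server):

```shell
#!/bin/bash
# Canned response standing in for the live call, e.g.:
#   curl -s -u "$jssUser:$jssPass" "$jssURL/JSSResource/advancedcomputersearches/id/1"
response='<advanced_computer_search><computers>
<computer><id>12</id><name>mac-012</name></computer>
<computer><id>48</id><name>mac-048</name></computer>
</computers></advanced_computer_search>'

# Pull out just the computer names from the search results
names=$(echo "$response" | grep -o '<name>[^<]*</name>' | sed 's/<\/*name>//g')
echo "$names"
```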


  • Author
  • New Contributor
  • 5 replies
  • June 18, 2019

Thanks @mm2270, this is good info, but the data I am getting is not sufficient for each computer. I want the data that comes with the response to /computers/id for batch processing, i.e. get the full computer data for a set of computers in a batch and feed it into parallel processing. If anyone else has any ideas, please let me know.


  • Contributor
  • 475 replies
  • June 18, 2019

Sounds like you'd like somebody to make the script for you. Modify as necessary to do what you want.

#!/bin/bash

jssUser="API_USERNAME_GOES_HERE"
jssPass="API_PASSWORD_GOES_HERE"
jssURL=$(defaults read /Library/Preferences/com.jamfsoftware.jamf.plist jss_url | sed s'/.$//')
computersToGetAtOneTime="1000"
secondsToWaitForNextGet="60"

# Make sure xmlstarlet is installed
if ! command -v xmlstarlet > /dev/null ; then
    echo "You must install xmlstarlet before using this script."
    echo 'Try "brew install xmlstarlet"'
    exit 1
fi

# Get the list of all computer IDs
echo "Downloading list of computers IDs..."
while read -r line; do
    ids+=("$line")
done <<< "$(curl -X GET -k -s -u "$jssUser:$jssPass" "$jssURL/JSSResource/computers" | xmlstarlet sel -t -v '/computers/computer/id')"

# Print out name of each computer
echo "Printing computer names $computersToGetAtOneTime at a time..."
count="$computersToGetAtOneTime"
for id in "${ids[@]}"; do
    computerName=$(curl -X GET -k -s -u "$jssUser:$jssPass" "$jssURL/JSSResource/computers/id/$id" | xmlstarlet sel -t -v '/computer/general/name')
    echo "$((computersToGetAtOneTime - count + 1)). $computerName"
    ((count--))
    if [[ "$count" -le 0 ]]; then
        echo "Waiting $secondsToWaitForNextGet seconds..."
        sleep "$secondsToWaitForNextGet"
        count="$computersToGetAtOneTime"
    fi
done

exit 0

  • Author
  • New Contributor
  • 5 replies
  • August 12, 2019

Hello all, I am still looking for any alternative solution. Assume I have only read access to the REST APIs (no PUT/POST allowed). I need a chunk of devices each time, e.g. in hundreds. An API that supports getting device IDs from 0 to 100, then 101 to 200, etc. would be most useful.
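As the replies above suggest, the Classic API does not appear to offer server-side pagination for /JSSResource/computers, so one read-only workaround is a single GET for the full ID list followed by local slicing into chunks of 100. A sketch with a simulated ID list in place of the parsed API response:

```shell
#!/bin/bash
# Read-only batching: one GET for all IDs, then slice the array locally.
chunkSize=100

# In practice: GET /JSSResource/computers once and parse out the IDs
ids=($(seq 1 250))   # simulated ID list

batches=0
for (( offset=0; offset<${#ids[@]}; offset+=chunkSize )); do
    chunk=("${ids[@]:offset:chunkSize}")
    batches=$((batches + 1))
    last=${chunk[$(( ${#chunk[@]} - 1 ))]}
    echo "Batch $batches: ${#chunk[@]} IDs (${chunk[0]}..${last})"
    # ...GET /JSSResource/computers/id/$id for each id in "${chunk[@]}"...
done
echo "Total batches: $batches"
```

This keeps every request a plain GET; the batching itself happens entirely on the client.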



