The following Python script maps each repository to the storage it uses, using the Get Storage Summary Info REST API command, and exports the results to a CSV file:
import csv
import json

import requests

artifactory_hostname = "<hostname>"
URL = "http://{}/artifactory/api/storageinfo".format(artifactory_hostname)

response = requests.get(URL, auth=('<username>', '<password>'))
response.raise_for_status()
results = json.loads(response.text)

# Map each repository key to the storage it uses, in bytes
repoStorage = {}
for repo in results['repositoriesSummaryList']:
    repoStorage[repo['repoKey']] = repo['usedSpaceInBytes']

# Open the file in write mode
with open('output.csv', 'w', newline='') as csvfile:
    # Create a CSV writer object
    writer = csv.DictWriter(csvfile, fieldnames=["repo", "size"])
    # Write the headers to the CSV file
    writer.writeheader()
    # Write the data to the CSV file
    for repo, size in repoStorage.items():
        writer.writerow({"repo": repo, "size": "{} bytes".format(size)})
        print({"repo": repo, "size": size})
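Raw byte counts can be hard to read for large repositories. As an optional refinement (not part of the original script), a small helper can convert the byte counts to human-readable units before they are written to the CSV:

```python
def format_bytes(num_bytes):
    """Convert a raw byte count to a human-readable string, e.g. 1536 -> '1.50 KB'."""
    units = ["bytes", "KB", "MB", "GB", "TB"]
    size = float(num_bytes)
    for unit in units:
        # Stop once the value fits in the current unit (or we run out of units)
        if size < 1024 or unit == units[-1]:
            return "{:.2f} {}".format(size, unit)
        size /= 1024
```

For example, `writer.writerow({"repo": repo, "size": format_bytes(size)})` would write `1.50 KB` instead of `1536 bytes`.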
The script prints each repository and its size to the console as they are written, and the full mapping is saved to output.csv.
*Before running the script, replace the following placeholders: username, password, and artifactory_hostname.
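Rather than hard-coding the credentials into the script, they can be read from environment variables. This is a minimal sketch; the variable names below (ARTIFACTORY_HOSTNAME, ARTIFACTORY_USER, ARTIFACTORY_PASSWORD) are assumptions, not part of the original script:

```python
import os

# Hypothetical variable names -- export them in your shell before running, e.g.:
#   export ARTIFACTORY_HOSTNAME=my.artifactory.example
#   export ARTIFACTORY_USER=myuser
#   export ARTIFACTORY_PASSWORD=mypassword
# The placeholder defaults mirror the placeholders used in the script above.
artifactory_hostname = os.environ.get("ARTIFACTORY_HOSTNAME", "<hostname>")
username = os.environ.get("ARTIFACTORY_USER", "<username>")
password = os.environ.get("ARTIFACTORY_PASSWORD", "<password>")
```

The `requests.get` call would then use `auth=(username, password)` instead of the literal placeholders.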
Please note that the reported size is the logical size of the artifacts, not the binary size actually stored on disk. You may refer to the checksum-based storage documentation for more information.