Log ONNX Models

JFrog Artifactory Documentation

Log an ONNX model by running the following code:

import frogml

repository = "<REPO_NAME>"
name = "<MODEL_NAME>"
version = "<MODEL_VERSION>" # optional
properties = {"<KEY1>": "<VALUE1>"} # optional
dependencies = ["<MODEL_DEPENDENCIES>"] # optional
code_dir = "<CODE_PATH>" # optional
parameters = {"<HYPERPARAM_NAME>": "<VALUE>"} # optional
metrics = {"<METRIC_NAME>": "<VALUE>"} # optional
predict_file = "<PREDICT_PATH>" # optional
onnx_model = get_onnx_model()

frogml.onnx.log_model(
    model=onnx_model,
    repository=repository,
    model_name=name,
    version=version,
    properties=properties,
    dependencies=dependencies,
    code_dir=code_dir,
    parameters=parameters,
    metrics=metrics,
    predict_file=predict_file,
)

Important

The parameters code_dir, dependencies, and predict_file must be provided as a complete set. You must specify all of them or none at all.
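The all-or-none rule above can be checked before calling log_model. The following helper is not part of frogml; it is a hypothetical sketch showing one way to validate the constraint locally:

```python
def validate_code_bundle(code_dir, dependencies, predict_file):
    # code_dir, dependencies, and predict_file must be supplied together
    # or omitted together; anything in between is rejected.
    provided = [v is not None for v in (code_dir, dependencies, predict_file)]
    if any(provided) and not all(provided):
        raise ValueError(
            "code_dir, dependencies, and predict_file must be provided "
            "as a complete set or not at all"
        )

validate_code_bundle(None, None, None)  # valid: none provided
validate_code_bundle("src/", ["onnx==1.16.0"], "src/predict.py")  # valid: all provided
```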

Where:

  • <REPO_NAME>: The name of the Artifactory repository where the model is stored

  • <MODEL_NAME>: The unique name of the model you want to upload

  • <MODEL_VERSION> (Optional): The version of the model you want to upload. If version is not specified, the current timestamp is used

  • <KEY1> and <VALUE1> (Optional): Key-value pairs that add searchable string-based tags or metadata to the model. Separate multiple pairs with commas

  • <MODEL_DEPENDENCIES> (Optional): Dependencies required to run the model, in one of the following formats:

    • A list of specific package requirements, for example ["catboost==1.2.5", "scikit-learn==1.3.2"]

    • A path to a single requirements.txt, pyproject.toml, or conda.yml package manager file

    • A two-item list containing paths to the pyproject.toml and poetry.lock files

  • <CODE_PATH> (Optional): The path to the directory containing the source code

  • <HYPERPARAM_NAME> and <VALUE> (Optional): Hyperparameters used to train the model

  • <METRIC_NAME> and <VALUE> (Optional): Key-value pairs that add searchable numeric metadata to the model. Separate multiple pairs with commas

  • <PREDICT_PATH> (Optional): The path to a script that defines how to make predictions with the model.
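The three accepted shapes for the dependencies value can be written as plain Python lists. The package names and file paths below are placeholders for illustration:

```python
# 1. A list of pinned package requirements
deps_explicit = ["catboost==1.2.5", "scikit-learn==1.3.2"]

# 2. A path to a single package manager file
deps_single_file = ["requirements.txt"]

# 3. A two-item list pairing pyproject.toml with its poetry.lock
deps_poetry = ["pyproject.toml", "poetry.lock"]
```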

For example:

import frogml

repository = "ml-local"
name = "yolo-object-detector"
version = "1.2.0"
properties = {
    "source_framework": "PyTorch",
    "precision": "FP16"
}
dependencies = ["torch==2.2.0", "onnx==1.16.0"]
code_dir = "src/conversion_scripts/"
predict_file = "src/conversion_scripts/predict.py"
onnx_model = get_onnx_model()

frogml.onnx.log_model(
    model=onnx_model,
    repository=repository,
    model_name=name,
    version=version,
    properties=properties,
    dependencies=dependencies,
    code_dir=code_dir,
    predict_file=predict_file,
)