File-Based Model Type

JFrog Artifactory Documentation


Use the file-based model type to upload, download, or get information on files not supported by one of the format-aware file types in the FrogML SDK.

Upload a File-Based Model to a Machine Learning Repository

You can upload a model to a Machine Learning repository using the frogml.files.log_model() function. A model can be a single file or a directory of files. This function uses checksum upload, assigning a SHA-2 digest to each model file so it can be retrieved from storage. If the checksum upload fails, FrogML falls back to a regular upload. For more information, see Checksum-Based Storage.
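The checksum mechanism itself is handled by the SDK, but the idea can be sketched with Python's standard hashlib module. SHA-256 is one member of the SHA-2 family; the exact variant used by Artifactory is an assumption here, and this is an illustration rather than the SDK's own code:

```python
import hashlib
from pathlib import Path

def sha2_digest(path: Path, chunk_size: int = 1 << 20) -> str:
    """Compute a SHA-256 hex digest of a file, reading in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Two files with identical content produce the same digest, which is
# what allows a checksum-based upload to skip re-sending known content.
```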

After uploading the model, FrogML generates a file named model-manifest.json, which contains the model name and its related files and dependencies.
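The exact schema of model-manifest.json is not documented here; purely as an illustration of the kind of metadata it holds (the field names below are assumptions, not the real schema), it can be read like any JSON file:

```python
import json

# Hypothetical manifest content; the real schema may differ.
manifest_text = """
{
  "model_name": "my-cool-model",
  "artifacts": ["model.pkl"],
  "dependencies": ["numpy>=1.19"]
}
"""

manifest = json.loads(manifest_text)
print(manifest["model_name"])
```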

To upload a file-based model to a Machine Learning repository, use the following function:

import frogml

frogml.files.log_model(
   repository="<REPOSITORY_KEY>",    # The JFrog repository to upload the model to.
   namespace="<NAMESPACE_NAME>",     # Optional. The namespace or organization.
   model_name="<MODEL_NAME>",        # The name of the uploaded model.
   version="<MODEL_VERSION>",        # Optional. The version of the uploaded model.
   source_path="<FILE_PATH>",        # The path to the model file or directory on your machine.
   properties={<PROPERTIES>},        # Optional. Properties that categorize and label the model.
   dependencies=[<DEPENDENCY>, <DEPENDENCY>],  # Optional. Dependency specifiers or requirements files.
   code_path="<CODE_PATH>"           # Optional. Path to a code directory to upload with the model.
)

Note

Make sure to replace the placeholders with your own JFrog repository key, namespace, model name, version, source path, properties, dependencies, and code path. If you do not specify a version, FrogML saves the version as the upload timestamp in UTC.
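The default version string can be reproduced with the standard library. This is a sketch of the documented yyyy-MM-dd-HH-mm-ss pattern, not the SDK's own implementation:

```python
from datetime import datetime, timezone

# Format the current UTC time the way FrogML formats a default version.
default_version = datetime.now(timezone.utc).strftime("%Y-%m-%d-%H-%M-%S")
print(default_version)  # e.g. 2024-11-12-13-25-53
```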

Parameters

<REPOSITORY_KEY>
The name of the JFrog repository you want to upload the model to.
Example: frogml-local

<NAMESPACE_NAME>
(Optional) The name of the namespace or organization. Use this parameter to group models by project or team.
Example: frog-ml-beta

<MODEL_NAME>
The name of the model you want to upload.
Example: my-cool-model

<MODEL_VERSION>
(Optional) The version of the model you want to upload. If you do not provide a version, Artifactory sets the version to the upload timestamp in UTC, in the format yyyy-MM-dd-HH-mm-ss.
Example: 1.0.0

<FILE_PATH>
The path to the model file or directory on your machine.
Example: /root/models/my-cool-model

<PROPERTIES>
(Optional) Properties you can set on the model to categorize and label it, in the format "key": "value".
Example: "model_type": "keras", "experiment": "my-exp"

<DEPENDENCIES>
(Optional) A list of the packages and versions your project depends on. The following dependency types are supported:

  1. requirements.txt file for pip

  2. Poetry requirements file

  3. Conda requirements file

  4. Explicit versions

Note: When providing a path to a requirements file, pass the path as an element in a list.

For explicit versions: dependencies = ["pandas==1.2.3", "numpy==1.2.3"]

<CODE_PATH>
(Optional) The path to a code directory. When provided, the code files in this path are uploaded to the Machine Learning repository together with the model artifact.
Example: ./src

Example

import frogml

frogml.files.log_model(
    repository="frog-ml-local",
    namespace="frog-ml-beta",     # Optional
    model_name="my-cool-model",
    version="1.0.0",              # Optional
    source_path="~/root/models/my-cool-model/",
    properties={"model_type": "keras", "experiment": "my-exp"},  # Optional
    dependencies=["numpy>=1.19"],  # Optional
    code_path="./src"             # Optional
)

Download a File-Based Model from a Machine Learning Repository

You can use the frogml.files.load_model() function to download a file-based model from an Artifactory repository. This method downloads the model files locally and returns the path to the downloaded model.

import frogml

# Download model version
frogml.files.load_model(
   repository="<REPOSITORY_KEY>",  # The JFrog repository to download the model from.
   namespace="<NAMESPACE_NAME>",   # Optional. The namespace or organization.
   model_name="<MODEL_NAME>",      # The name of the model you want to download.
   version="<MODEL_VERSION>",      # The version of the model you want to download.
   target_path="<TARGET_PATH>",    # The local path to download the model to.
)

Note

Make sure to replace the placeholders with your own JFrog repository key, namespace, model name, version, and target path.

Parameters

<REPOSITORY_KEY>
The name of the JFrog repository you want to download the model from.
Example: frogml-local

<NAMESPACE_NAME>
(Optional) The name of the namespace or organization. Use this parameter to group models by project or team.
Example: frog-ml-beta

<MODEL_NAME>
The name of the model you want to download.
Example: my-cool-model

<MODEL_VERSION>
The version of the model you want to download.
Example: 1.0.0

<TARGET_PATH>
The local path to which you want to download the model.
Example: /root/models/

Example

import frogml

# Download model version
frogml.files.load_model(
    repository="frog-ml-local",
    namespace="frog-ml-beta",     # Optional
    model_name="my-cool-model",
    version="1.0.0",
    target_path="~/root/models/",
)

Return Value

The load_model() method returns the path to the downloaded model as a pathlib.Path object.
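Because the return value is a pathlib.Path, you can inspect the downloaded model with ordinary path operations. The snippet below simulates this with a locally created directory rather than a real download, so the directory and file names are illustrative:

```python
from pathlib import Path
import tempfile

# Simulate the directory that load_model() would return.
model_dir = Path(tempfile.mkdtemp())
(model_dir / "model.pkl").write_bytes(b"\x00\x01")

# A Path supports existence checks, iteration, and size queries directly.
files = sorted(p.name for p in model_dir.iterdir())
print(files)
```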

Get Information on a File-Based Model in a Machine Learning Repository

You can use the frogml.files.get_model_info() function to retrieve information on a specific file-based model version without downloading the model files.

import frogml

# Get model information
frogml.files.get_model_info(
   repository="<REPOSITORY_KEY>",  # The JFrog repository that stores the model.
   namespace="<NAMESPACE_NAME>",   # Optional. The namespace or organization.
   model_name="<MODEL_NAME>",      # The name of the model.
   version="<MODEL_VERSION>",      # The version of the model.
)

Note

Make sure to replace the placeholders with your own JFrog repository key, namespace, model name, and version.

Parameters

<REPOSITORY_KEY>
The name of the JFrog repository that stores the model.
Example: frogml-local

<NAMESPACE_NAME>
(Optional) The name of the namespace or organization. Use this parameter to group models by project or team.
Example: frog-ml-beta

<MODEL_NAME>
The name of the model you want to retrieve information on.
Example: my-cool-model

<MODEL_VERSION>
The version of the model you want to retrieve information on.
Example: 1.0.0

Example

import frogml

# Get model information
frogml.files.get_model_info(
    repository="frog-ml-local",
    namespace="frog-ml-beta",     # Optional
    model_name="my-cool-model",
    version="1.0.0",
)

Return Values

model_format
Details about the stored model, such as the framework, framework version, runtime environment, and serialization format.
Example:
{'framework': 'files',
 'framework_version': '',
 'runtime': 'python',
 'runtime_version': '3.9.6',
 'serialization_format': 'pkl'}

model_artifacts
A list of objects with details about the model artifact files.
Example:
[{"artifact_path": "<artifactory_path>",
  "checksum": "<file_checksum>",
  "download_path": "<download_url>"}]

dependency_artifacts
A list of objects with details about the attached dependency and requirements files.
Example:
[{"artifact_path": "<artifactory_path>",
  "checksum": "<file_checksum>",
  "download_path": "<download_url>"}]

code_artifacts
A list of objects with details about the attached code files.
Example:
[{"artifact_path": "<artifactory_path>",
  "checksum": "<file_checksum>",
  "download_path": "<download_url>"}]

created_date
The creation date in ISO 8601 format.
Example: 2024-11-12T13:25:53.20
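Assuming the dictionary shape documented above, the artifact checksums can be collected with a simple comprehension. The sample data here is illustrative, not a real API response:

```python
# Illustrative response mirroring the keys documented above.
info = {
    "model_format": {"framework": "files", "runtime": "python"},
    "model_artifacts": [
        {"artifact_path": "models/my-cool-model/model.pkl",
         "checksum": "<file_checksum>",
         "download_path": "<download_url>"},
    ],
    "created_date": "2024-11-12T13:25:53.20",
}

# Map each artifact path to its checksum for quick lookup.
checksums = {a["artifact_path"]: a["checksum"] for a in info["model_artifacts"]}
print(checksums)
```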