utils
Helper functions
get_single_model_config(name, base_path, versions, model_platform='tensorflow')
Generate a string block for the serving config file.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| name | | Name of the model. Serves as the endpoint address for the servable. | required |
| base_path | | Path to the folder containing the models. Should contain one or more model folders, each named with an integer (e.g. 1, 3, 121541). When serving with the default config, the model in the folder with the largest number is served. | required |
| versions | | Version names. Used to generate the version policy. | required |
| model_platform | | Required field in the config. Can only be 'tensorflow' for now. | 'tensorflow' |
Returns:

| Name | Type | Description |
|---|---|---|
| out | str | Config in protobuf text syntax. |
Source code in conftrainer/modifications/utils.py
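To illustrate the shape of the output, here is a minimal sketch of the kind of protobuf-text `config { ... }` block the function plausibly emits. This is a hypothetical reimplementation based on TensorFlow Serving's standard model config format, not the library's actual source; the `specific` version policy is an assumption.

```python
# Hypothetical sketch -- NOT the actual conftrainer implementation.
def sketch_single_model_config(name, base_path, versions,
                               model_platform="tensorflow"):
    """Build one `config { ... }` entry in TensorFlow Serving's
    protobuf text format, with a `specific` version policy (assumption)."""
    version_lines = "\n".join(f"      versions: {v}" for v in versions)
    return (
        "config {\n"
        f'  name: "{name}"\n'
        f'  base_path: "{base_path}"\n'
        f'  model_platform: "{model_platform}"\n'
        "  model_version_policy {\n"
        "    specific {\n"
        f"{version_lines}\n"
        "    }\n"
        "  }\n"
        "}"
    )

print(sketch_single_model_config("resnet", "/models/resnet", [1, 3]))
```

With the default config (no version policy), TensorFlow Serving loads only the highest-numbered version folder; listing versions explicitly is what allows several versions to be served side by side.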
generate_serving_config(param_list, config_save_path=None)
Generate a config file for serving TensorFlow models.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| param_list | list of tuple | Parameters for each network that should be added to the config file. | required |
| config_save_path | str | Path to save the generated config file. Extension should be .config or .conf. | None |
Returns:

| Name | Type | Description |
|---|---|---|
| out | str | Config for serving the networks. |
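A sketch of how the multi-model config is plausibly assembled and saved. This is a hypothetical reimplementation, not the library's source: the tuple layout `(name, base_path, versions)` and the hard-coded `tensorflow` platform are assumptions.

```python
# Hypothetical sketch -- NOT the actual conftrainer implementation.
def sketch_serving_config(param_list, config_save_path=None):
    """Combine per-model parameter tuples into one TensorFlow Serving
    `model_config_list { ... }` config and optionally write it to disk.
    Tuple layout (name, base_path, versions) is an assumption."""
    blocks = []
    for name, base_path, versions in param_list:
        version_lines = "\n".join(f"        versions: {v}" for v in versions)
        blocks.append(
            "  config {\n"
            f'    name: "{name}"\n'
            f'    base_path: "{base_path}"\n'
            '    model_platform: "tensorflow"\n'
            "    model_version_policy {\n"
            "      specific {\n"
            f"{version_lines}\n"
            "      }\n"
            "    }\n"
            "  }"
        )
    out = "model_config_list {\n" + "\n".join(blocks) + "\n}"
    if config_save_path is not None:
        # The docs state the extension should be .config or .conf.
        if not config_save_path.endswith((".config", ".conf")):
            raise ValueError("config_save_path must end in .config or .conf")
        with open(config_save_path, "w") as f:
            f.write(out)
    return out
```

Such a file is typically passed to the server via `--model_config_file=models.config`, letting one TensorFlow Serving instance expose several models at once.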