oumi.cli#

Submodules#

oumi.cli.alias module#

class oumi.cli.alias.AliasType(value)[source]#

Bases: str, Enum

The type of configs we support with aliases.

EVAL = 'eval'#
INFER = 'infer'#
JOB = 'job'#
JUDGE = 'judge'#
QUANTIZE = 'quantize'#
TRAIN = 'train'#
oumi.cli.alias.try_get_config_name_for_alias(alias: str, alias_type: AliasType) str[source]#

Gets the config path for a given alias.

This function resolves the config path for a given alias and alias type. If the alias is not found, the original alias is returned.

Parameters:
  • alias (str) – The alias to resolve.

  • alias_type (AliasType) – The type of config to resolve.

Returns:

The resolved config path (or the original alias if not found).

Return type:

str
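
A minimal usage sketch for alias resolution (the "llama3-8b" alias below is purely illustrative; which aliases are registered depends on the Oumi release):

    from oumi.cli.alias import AliasType, try_get_config_name_for_alias

    # Resolve a (hypothetical) alias to its config path. If the alias is not
    # registered, the original string is returned unchanged.
    config_path = try_get_config_name_for_alias("llama3-8b", AliasType.TRAIN)
    print(config_path)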

oumi.cli.cache module#

oumi.cli.cache.card(repo_id: str)[source]#

Show repository information for a Hugging Face repository.

oumi.cli.cache.get(repo_id: str, revision: str | None = None)[source]#

Download and cache a Hugging Face repository.

oumi.cli.cache.ls(filter_pattern: str | None = None, sort_by: str = 'size', verbose: bool = False)[source]#

List locally cached Hugging Face items.

oumi.cli.cache.rm(repo_id: str, force: bool = False)[source]#

Remove a cached Hugging Face item.
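
These cache commands are Typer command functions, but since they do not require a Typer context they can also be called directly from Python. A small sketch (the repository id is illustrative):

    from oumi.cli.cache import ls, rm

    # List locally cached Hugging Face items, sorted by size, with details.
    ls(sort_by="size", verbose=True)

    # Remove a cached item without prompting for confirmation.
    rm("oumi-ai/HallOumi-8B", force=True)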

oumi.cli.cli_utils module#

class oumi.cli.cli_utils.LogLevel(value)[source]#

Bases: str, Enum

The available logging levels.

CRITICAL = 'CRITICAL'#
DEBUG = 'DEBUG'#
ERROR = 'ERROR'#
INFO = 'INFO'#
WARNING = 'WARNING'#
oumi.cli.cli_utils.configure_common_env_vars() None[source]#

Sets common environment variables if needed.

oumi.cli.cli_utils.parse_extra_cli_args(ctx: Context) list[str][source]#

Parses extra CLI arguments into a list of strings.

Parameters:

ctx – The Typer context object.

Returns:

The extra CLI arguments

Return type:

list[str]
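
A sketch of how this helper is typically wired into a Typer command that forwards unknown options as overrides (the command definition and the override string below are illustrative, not Oumi's actual CLI wiring):

    import typer
    from oumi.cli.cli_utils import parse_extra_cli_args

    app = typer.Typer()

    # Allow unknown options so they reach the Typer context as extra args.
    @app.command(
        context_settings={"allow_extra_args": True, "ignore_unknown_options": True}
    )
    def train(ctx: typer.Context, config: str = typer.Option(...)):
        overrides = parse_extra_cli_args(ctx)  # e.g. ["training.max_steps=10"]
        print(overrides)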

oumi.cli.cli_utils.resolve_and_fetch_config(config_path: str, output_dir: Path | None = None, force: bool = True) Path[source]#

Resolve oumi:// prefix and fetch config if needed.

Parameters:
  • config_path – Original config path that may contain oumi:// prefix

  • output_dir – Optional override for output directory

  • force – Whether to overwrite an existing config

Returns:

Local path to the config file

Return type:

Path
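
A sketch of resolving an oumi:// path (the config path below is illustrative and may not exist in the repository):

    from pathlib import Path
    from oumi.cli.cli_utils import resolve_and_fetch_config

    # Resolves the oumi:// prefix and downloads the config if needed,
    # returning the local path to use from then on.
    local_path = resolve_and_fetch_config(
        "oumi://configs/recipes/example_train.yaml",  # illustrative path
        output_dir=Path("./fetched_configs"),
        force=True,
    )
    print(local_path)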

oumi.cli.cli_utils.section_header(title, console: Console = <console width=80 None>)[source]#

Print a section header with the given title.

Parameters:
  • title – The title text to display in the header.

  • console – The Console object to use for printing.

oumi.cli.cli_utils.set_log_level(level: LogLevel | None)[source]#

Sets the logging level for the current command.

Parameters:

level (Optional[LogLevel]) – The log level to use.
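
A minimal sketch combining the two helpers above:

    from oumi.cli.cli_utils import LogLevel, configure_common_env_vars, set_log_level

    configure_common_env_vars()    # set common environment variables if unset
    set_log_level(LogLevel.DEBUG)  # raise verbosity for the current command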

oumi.cli.distributed_run module#

oumi.cli.distributed_run.accelerate(ctx: Context, level: LogLevel | None = None) None[source]#

Starts an accelerate subprocess with automatically configured common parameters.

Parameters:
  • ctx – The Typer context object.

  • level – The logging level for the specified command.

oumi.cli.distributed_run.torchrun(ctx: Context, level: LogLevel | None = None) None[source]#

Starts a torchrun subprocess with automatically configured common parameters.

Parameters:
  • ctx – The Typer context object.

  • level – The logging level for the specified command.

oumi.cli.env module#

oumi.cli.env.env()[source]#

Prints information about the current environment.

oumi.cli.evaluate module#

oumi.cli.evaluate.evaluate(ctx: Context, config: str, level: LogLevel | None = None)[source]#

Evaluate a model.

Parameters:
  • ctx – The Typer context object.

  • config – Path to the configuration file for evaluation.

  • level – The logging level for the specified command.

oumi.cli.fetch module#

oumi.cli.fetch.fetch(config_path: str, output_dir: Path | None = None, force: bool = False) None[source]#

Fetch configuration files from GitHub repository.

oumi.cli.infer module#

oumi.cli.infer.infer(ctx: Context, config: str, interactive: bool = False, image: str | None = None, system_prompt: str | None = None, level: LogLevel | None = None)[source]#

Run inference on a model.

If input_filepath is provided in the configuration file, inference will run on those input examples. Otherwise, inference will run interactively with user-provided inputs.

Parameters:
  • ctx – The Typer context object.

  • config – Path to the configuration file for inference.

  • interactive – Whether to run in an interactive session.

  • image – Path to the input image for image+text VLLMs.

  • system_prompt – System prompt for task-specific instructions.

  • level – The logging level for the specified command.

oumi.cli.judge module#

oumi.cli.judge.judge_conversations_file(ctx: Context, judge_config: str, input_file: str, output_file: str | None = None, display_raw_output: bool = False)[source]#

Judge a list of conversations.

oumi.cli.judge.judge_dataset_file(ctx: Context, judge_config: str, input_file: str, output_file: str | None = None, display_raw_output: bool = False)[source]#

Judge a dataset.

oumi.cli.judge.judge_file(ctx: Context, judge_config: str, input_file: str, output_file: str | None = None, display_raw_output: bool = False, judgment_fn: Callable[..., list[Any]] = ...)[source]#

Judge a dataset or list of conversations.

oumi.cli.launch module#

oumi.cli.launch.cancel(cloud: str, cluster: str, id: str, level: LogLevel | None = None) None[source]#

Cancels a job.

Parameters:
  • cloud – Filter results by this cloud.

  • cluster – Filter results by clusters matching this name.

  • id – Filter results by jobs matching this job ID.

  • level – The logging level for the specified command.

oumi.cli.launch.down(cluster: str, cloud: str | None = None, level: LogLevel | None = None) None[source]#

Turns down a cluster.

Parameters:
  • cluster – The cluster to turn down.

  • cloud – If specified, only clusters on this cloud will be affected.

  • level – The logging level for the specified command.

oumi.cli.launch.run(ctx: Context, config: str, cluster: str | None = None, detach: bool = False, level: LogLevel | None = None) None[source]#

Runs a job on the target cluster.

Parameters:
  • ctx – The Typer context object.

  • config – Path to the configuration file for the job.

  • cluster – The cluster to use for this job. If no such cluster exists, a new cluster will be created. If unspecified, a new cluster will be created with a unique name.

  • detach – Run the job in the background.

  • level – The logging level for the specified command.

oumi.cli.launch.status(cloud: str | None = None, cluster: str | None = None, id: str | None = None, level: LogLevel | None = None) None[source]#

Prints the status of jobs launched from Oumi.

Optionally, the caller may specify a job id, cluster, or cloud to further filter results.

Parameters:
  • cloud – Filter results by this cloud.

  • cluster – Filter results by clusters matching this name.

  • id – Filter results by jobs matching this job ID.

  • level – The logging level for the specified command.

oumi.cli.launch.stop(cluster: str, cloud: str | None = None, level: LogLevel | None = None) None[source]#

Stops a cluster.

Parameters:
  • cluster – The cluster to stop.

  • cloud – If specified, only clusters on this cloud will be affected.

  • level – The logging level for the specified command.

oumi.cli.launch.up(ctx: Context, config: str, cluster: str | None = None, detach: bool = False, level: LogLevel | None = None)[source]#

Launches a job.

Parameters:
  • ctx – The Typer context object.

  • config – Path to the configuration file for the job.

  • cluster – The cluster to use for this job. If no such cluster exists, a new cluster will be created. If unspecified, a new cluster will be created with a unique name.

  • detach – Run the job in the background.

  • level – The logging level for the specified command.

oumi.cli.launch.which(level: LogLevel | None = None) None[source]#

Prints the available clouds.
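
Launcher commands that do not take a Typer context can also be invoked directly from Python. A small sketch (the cloud name "gcp" is illustrative):

    from oumi.cli.launch import status, which

    # Print the clouds available to the Oumi launcher.
    which()

    # Print the status of launched jobs, filtered to a single cloud.
    status(cloud="gcp", cluster=None, id=None)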

oumi.cli.main module#

oumi.cli.main.experimental_features_enabled()[source]#

Check if experimental features are enabled.

oumi.cli.main.get_app() Typer[source]#

Create the Typer CLI app.
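
Because get_app() returns a standard Typer application, the CLI can be exercised in-process, for example with Typer's test runner. A sketch (the "env" subcommand name is assumed to correspond to oumi.cli.env.env):

    from typer.testing import CliRunner
    from oumi.cli.main import get_app

    runner = CliRunner()
    # "env" is assumed to be the subcommand registered for oumi.cli.env.env.
    result = runner.invoke(get_app(), ["env"])
    print(result.exit_code, result.output)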

oumi.cli.main.run()[source]#

The entrypoint for the CLI.

oumi.cli.quantize module#

oumi.cli.quantize.quantize(ctx: Context, config: str | None = None, method: str = 'awq_q4_0', model: str = '', output: str = 'quantized_model')[source]#

🚧 DEVELOPMENT: Quantize a model to reduce its size and memory requirements.

Example

oumi quantize --model "oumi-ai/HallOumi-8B" --method awq_q4_0 --output "halloumi-8b-q4.gguf"

Note

The quantization process may require significant memory and time, especially for large models. Ensure sufficient disk space for both the original and quantized models during processing.

oumi.cli.synth module#

oumi.cli.synth.synth(ctx: Context, config: str, level: LogLevel | None = None)[source]#

Synthesize a dataset.

Parameters:
  • ctx – The Typer context object.

  • config – Path to the configuration file for synthesis.

  • level – The logging level for the specified command.

oumi.cli.train module#

oumi.cli.train.train(ctx: Context, config: str, level: LogLevel | None = None)[source]#

Train a model.

Parameters:
  • ctx – The Typer context object.

  • config – Path to the configuration file for training.

  • level – The logging level for the specified command.