Proxy Command

The proxy command in the LLM Client manages proxy servers that sit between the client and language model providers, routing requests through a specified network endpoint.

Overview

The command sets up and manages network proxies: it can filter requests by provider and model, generate API keys for authentication, and bind to a custom host and port, which makes it straightforward to integrate with different network configurations.

Usage

# Start the proxy for a specific provider and model
lc proxy --provider openai --model gpt-3.5-turbo

# Specify a different host and port
lc proxy --host 0.0.0.0 --port 8000

# Use short flags
lc proxy --provider openai -m gpt-3.5-turbo

Subcommands

Name     Alias   Description
add      a       Add a new proxy entry
remove   r       Remove an existing proxy entry
list     l       List all active proxy entries
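The subcommands above can be combined to maintain a set of proxy entries. A minimal sketch follows; the entry name my-proxy is a hypothetical placeholder, and the exact arguments each subcommand accepts may differ in your version, so check the subcommand's help output first.

```shell
# Add an entry, list what is registered, then remove it again.
# "my-proxy" is an illustrative name, not a built-in.
lc proxy add my-proxy        # long form;  short alias: lc proxy a my-proxy
lc proxy list                # long form;  short alias: lc proxy l
lc proxy remove my-proxy     # long form;  short alias: lc proxy r my-proxy
```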

Options

Short   Long             Description                                                Default
        --provider       Filter by provider                                         None
-m      --model          Filter by specific model (can be provider:model or alias)  None
-h      --host           Host to bind to                                            127.0.0.1
-p      --port           Port to listen on                                          6789
-k      --key            API key for authentication                                 None
-g      --generate-key   Generate a random API key                                  False
-h      --help           Print help (⚠️ known issue: conflicts with --host)          False

Examples

Start a Simple Proxy

lc proxy --provider openai --model gpt-3.5-turbo

Generate and Use API Key

lc proxy -g --host 0.0.0.0 --port 8000
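Once the proxy is running with a generated key, clients authenticate by presenting that key. The sketch below assumes the proxy exposes an OpenAI-compatible chat endpoint; the /v1/chat/completions path, the request payload, and the GENERATED_KEY placeholder are all assumptions to adapt to your setup.

```shell
# Send a request through the proxy started above (host 0.0.0.0, port 8000).
# Replace GENERATED_KEY with the key printed when the proxy started.
curl http://0.0.0.0:8000/v1/chat/completions \
  -H "Authorization: Bearer GENERATED_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "Hello"}]}'
```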

Custom Provider and Model

lc proxy --provider custom-provider -m custom-model

Troubleshooting

Common Issues

Conflict with Existing Port

  • Error: Port already in use.
  • Solution: Use the --port flag to specify a different port.
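The check above can be done from the shell before starting the proxy. This is a minimal sketch using bash's /dev/tcp feature (a bash-specific construct, not a real file): a successful connect means something is already listening on the port.

```shell
#!/usr/bin/env bash
# Check whether the proxy's default port (6789) is already taken.
# Opening /dev/tcp/<host>/<port> attempts a TCP connection in bash.
port=6789
if (exec 3<>"/dev/tcp/127.0.0.1/$port") 2>/dev/null; then
  exec 3>&-   # close the probe connection
  echo "port $port is in use; start the proxy with a different --port"
else
  echo "port $port is free"
fi
```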

Short-flag Collision (-h)

  • Issue: Both --host and --help are mapped to the short flag -h, so -h is ambiguous.
  • Solution: Use the full --help flag for help output, and --host (or the default) for binding.