
Library Usage Overview

LC (LLM Client) can be used not only as a command-line tool but also as a Rust library in your own projects. This allows you to integrate LLM capabilities directly into your applications without needing to shell out to the CLI.

What You Get

When you use LC as a library, you get programmatic access to:

  • Configuration Management - Load and manage provider configurations
  • Multiple LLM Providers - OpenAI, Anthropic, Google Gemini, and more
  • Chat Functionality - Send chat requests and handle responses
  • Vector Database - Store and search embeddings
  • Session Management - Maintain conversation history
  • Template Processing - Use dynamic templates for prompts

Library vs CLI

Feature        | CLI Usage            | Library Usage
---------------|----------------------|-----------------------------------------
Execution      | lc -m gpt-4 "Hello"  | client.send_chat_request(request).await
Configuration  | Config files + flags | Programmatic config loading
Output         | Terminal text        | Structured data types
Integration    | Shell scripts        | Native Rust code
Error Handling | Exit codes           | Result types
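
The last row is the practical difference in day-to-day use: instead of inspecting an exit code, a library caller receives a Result it can propagate or match on. A minimal sketch, assuming the chat API shown in the examples below, that the crate's error types convert into Box<dyn std::error::Error>, and that the response type implements Debug:

use lc_cli::{Config, OpenAIClient, ChatRequest, Message};

// Hypothetical helper: any failure surfaces to the caller as an Err value.
async fn summarize(text: &str) -> Result<(), Box<dyn std::error::Error>> {
    let config = Config::load()?;
    let client = OpenAIClient::new(&config)?;
    let request = ChatRequest {
        messages: vec![Message::user(format!("Summarize: {}", text))],
        model: "gpt-4".to_string(),
        ..Default::default()
    };
    let response = client.chat(&request).await?;
    println!("{:?}", response); // assumes the response type implements Debug
    Ok(())
}

// The caller decides how to react -- no process exit involved:
// if let Err(e) = summarize("...").await { eprintln!("chat failed: {e}"); }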

Use Cases

Web Applications

Build web APIs that use LLMs internally:

// In your web handler (`Json` here assumes an axum-style framework)
use lc_cli::{Config, OpenAIClient, ChatRequest, Message};

let config = Config::load()?;
let client = OpenAIClient::new(&config)?;
let request = ChatRequest {
    messages: vec![Message::user("Summarize this text".to_string())],
    model: "gpt-4".to_string(),
    ..Default::default()
};
let response = client.chat(&request).await?;
Json(response)

Desktop Applications

Create GUI applications with LLM features:

// In your desktop app
use lc_cli::{Config, OpenAIClient, ChatRequest, Message};

let config = Config::load()?;
let client = OpenAIClient::new(&config)?;
let request = ChatRequest {
    messages: vec![Message::user(format!("Complete this code: {}", user_code))],
    model: "gpt-4".to_string(),
    ..Default::default()
};
let suggestion = client.chat(&request).await?;
display_suggestion(suggestion);
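
In a GUI, the await above should not run on the UI thread, or the interface will freeze for the duration of the request. One hedged option, assuming a tokio runtime is available and that the client, request, and display_suggestion callback can be moved into a background task:

// Run the request in the background and hand the result back to the UI.
tokio::spawn(async move {
    match client.chat(&request).await {
        Ok(suggestion) => display_suggestion(suggestion),
        Err(e) => eprintln!("completion failed: {e}"),
    }
});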

Automation Scripts

Process files and data with LLM assistance:

// Batch processing
use lc_cli::{Config, OpenAIClient, ChatRequest, Message};

let config = Config::load()?;
let client = OpenAIClient::new(&config)?;

for file in files {
    let content = std::fs::read_to_string(file)?;
    let request = ChatRequest {
        messages: vec![Message::user(format!("Summarize this file: {}", content))],
        model: "gpt-4".to_string(),
        ..Default::default()
    };
    let summary = client.chat(&request).await?;
    save_summary(summary);
}

Integration with Existing Systems

Add LLM capabilities to existing Rust applications:

// In your existing service
use lc_cli::{Config, OpenAIClient, ChatRequest, Message};

let config = Config::load()?;
let client = OpenAIClient::new(&config)?;
let request = ChatRequest {
    messages: vec![Message::user(format!("Analyze this data: {:?}", metrics))],
    model: "gpt-4".to_string(),
    ..Default::default()
};
let analysis = client.chat(&request).await?;
update_dashboard(analysis);

Getting Started

  1. Installation - Add LC to your Cargo.toml
  2. Basic Usage - Your first LC library code (a minimal sketch follows this list)
  3. Configuration - Set up providers and models
  4. Advanced Features - Vector DB, sessions, templates
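
Since every client call on this page is async, the quickest way to see steps 1-3 working together is a small binary with an async runtime. A sketch assuming tokio as that runtime (an assumption; the crate may require a different one) and the API shown throughout this page, after adding the crate to Cargo.toml per step 1:

use lc_cli::{Config, OpenAIClient, ChatRequest, Message};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let config = Config::load()?;           // step 3: providers and models
    let client = OpenAIClient::new(&config)?;
    let request = ChatRequest {
        messages: vec![Message::user("Hello from the library!".to_string())],
        model: "gpt-4".to_string(),
        ..Default::default()
    };
    let response = client.chat(&request).await?;
    println!("{:?}", response);             // assumes the response type is Debug
    Ok(())
}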

Architecture

The LC library is structured as modules that you can import selectively:

use lc_cli::{
    Config,       // Configuration management
    OpenAIClient, // LLM provider clients
    ChatRequest,  // Request/response types
    Database,     // Chat history storage
    VectorDB,     // Vector database operations
};

Each module is designed to work independently, so you can use only what you need for your specific use case.
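
For example, a tool that only needs configuration can depend on that module alone; nothing forces you to construct a client, database, or vector store. A minimal sketch under the same API assumptions as above:

use lc_cli::Config;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Only the configuration module is used; no provider client is created.
    let _config = Config::load()?;
    println!("configuration loaded successfully");
    Ok(())
}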