Commit 124ac58

Merge pull request unclecode#1599 from unclecode/docs-llm-strategies-update
unclecode#1551: Fix casing and variable name consistency for LLMConfig in doc…
2 parents d56b0eb + 2e8f8c9 commit 124ac58

File tree

1 file changed: +5 −5 lines changed


docs/md_v2/extraction/llm-strategies.md

Lines changed: 5 additions & 5 deletions
@@ -20,10 +20,10 @@ In some cases, you need to extract **complex or unstructured** information from
 
 ## 2. Provider-Agnostic via LiteLLM
 
-You can use LlmConfig, to quickly configure multiple variations of LLMs and experiment with them to find the optimal one for your use case. You can read more about LlmConfig [here](/api/parameters).
+You can use LLMConfig, to quickly configure multiple variations of LLMs and experiment with them to find the optimal one for your use case. You can read more about LLMConfig [here](/api/parameters).
 
 ```python
-llmConfig = LlmConfig(provider="openai/gpt-4o-mini", api_token=os.getenv("OPENAI_API_KEY"))
+llm_config = LLMConfig(provider="openai/gpt-4o-mini", api_token=os.getenv("OPENAI_API_KEY"))
 ```
 
 Crawl4AI uses a “provider string” (e.g., `"openai/gpt-4o"`, `"ollama/llama2.0"`, `"aws/titan"`) to identify your LLM. **Any** model that LiteLLM supports is fair game. You just provide:
@@ -58,7 +58,7 @@ For structured data, `"schema"` is recommended. You provide `schema=YourPydantic
 
 Below is an overview of important LLM extraction parameters. All are typically set inside `LLMExtractionStrategy(...)`. You then put that strategy in your `CrawlerRunConfig(..., extraction_strategy=...)`.
 
-1. **`llmConfig`** (LlmConfig): e.g., `"openai/gpt-4"`, `"ollama/llama2"`.
+1. **`llm_config`** (LLMConfig): e.g., `"openai/gpt-4"`, `"ollama/llama2"`.
 2. **`schema`** (dict): A JSON schema describing the fields you want. Usually generated by `YourModel.model_json_schema()`.
 3. **`extraction_type`** (str): `"schema"` or `"block"`.
 4. **`instruction`** (str): Prompt text telling the LLM what you want extracted. E.g., “Extract these fields as a JSON array.”
@@ -112,7 +112,7 @@ async def main():
     # 1. Define the LLM extraction strategy
     llm_strategy = LLMExtractionStrategy(
         llm_config = LLMConfig(provider="openai/gpt-4o-mini", api_token=os.getenv('OPENAI_API_KEY')),
-        schema=Product.schema_json(), # Or use model_json_schema()
+        schema=Product.model_json_schema(), # Or use model_json_schema()
         extraction_type="schema",
         instruction="Extract all product objects with 'name' and 'price' from the content.",
         chunk_token_threshold=1000,
@@ -238,7 +238,7 @@ class KnowledgeGraph(BaseModel):
 async def main():
     # LLM extraction strategy
     llm_strat = LLMExtractionStrategy(
-        llmConfig = LLMConfig(provider="openai/gpt-4", api_token=os.getenv('OPENAI_API_KEY')),
+        llm_config = LLMConfig(provider="openai/gpt-4", api_token=os.getenv('OPENAI_API_KEY')),
         schema=KnowledgeGraph.model_json_schema(),
         extraction_type="schema",
         instruction="Extract entities and relationships from the content. Return valid JSON.",
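Beyond the casing fix, this commit also swaps Pydantic v1's deprecated `schema_json()` for v2's `model_json_schema()`. A minimal sketch of the difference, assuming Pydantic v2 is installed (the `Product` model here mirrors the doc's example but is illustrative, not taken from the commit):

```python
# Sketch: why the diff replaces Product.schema_json() with
# Product.model_json_schema(). Assumes Pydantic v2; the Product
# model below is a hypothetical illustration.
from pydantic import BaseModel


class Product(BaseModel):
    name: str
    price: float


# model_json_schema() returns a plain dict, which is the shape
# LLMExtractionStrategy's `schema` parameter expects in the docs above.
# (v1's schema_json() returned a JSON *string* instead.)
schema = Product.model_json_schema()
```

The dict form can be passed directly as `schema=Product.model_json_schema()` without any intermediate `json.loads`.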
