Native support for xAI Grok3
- Model shorthand mapping related fixes
- Comprehensive auto-mode related tests
@@ -3,7 +3,7 @@
https://github.com/user-attachments/assets/8097e18e-b926-4d8b-ba14-a979e4c58bda
<div align="center">
-<b>🤖 Claude + [Gemini / O3 / OpenRouter / Ollama / Any Model] = Your Ultimate AI Development Team</b>
+<b>🤖 Claude + [Gemini / O3 / GROK / OpenRouter / Ollama / Any Model] = Your Ultimate AI Development Team</b>
</div>
<br/>
@@ -115,6 +115,7 @@ The final implementation resulted in a 26% improvement in JSON parsing performan
**Option B: Native APIs**
- **Gemini**: Visit [Google AI Studio](https://makersuite.google.com/app/apikey) and generate an API key. For best results with Gemini 2.5 Pro, use a paid API key as the free tier has limited access to the latest models.
- **OpenAI**: Visit [OpenAI Platform](https://platform.openai.com/api-keys) to get an API key for O3 model access.
+- **X.AI**: Visit [X.AI Console](https://console.x.ai/) to get an API key for GROK model access.
**Option C: Custom API Endpoints (Local models like Ollama, vLLM)**
[Please see the setup guide](docs/custom_models.md#option-2-custom-api-setup-ollama-vllm-etc). With a custom API you can use:
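The native-API keys described above are typically supplied to the server via environment variables. A minimal sketch, assuming the variable names follow the common `PROVIDER_API_KEY` convention (the exact names are an assumption; check the project's own configuration docs or `.env` example for the keys it actually reads):

```shell
# Hypothetical variable names -- verify against the project's configuration docs.
export GEMINI_API_KEY="your-gemini-key"   # Google AI Studio key (Gemini)
export OPENAI_API_KEY="your-openai-key"   # OpenAI Platform key (O3)
export XAI_API_KEY="your-xai-key"         # X.AI Console key (GROK)
```

Setting only the keys for the providers you use is fine; unset providers are simply unavailable for model selection.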