# Chimeric
Chimeric is a unified Python interface for multiple LLM providers with automatic provider detection and seamless switching.
## Setup

Chimeric provides a unified interface for seven major AI providers: OpenAI, Anthropic, Google, Cohere, Groq, Cerebras, and Grok.

Each provider can be installed individually, or several at once using extras:
```bash
# Individual providers
pip install "chimeric[openai]"
pip install "chimeric[anthropic]"
pip install "chimeric[google]"
pip install "chimeric[cohere]"
pip install "chimeric[groq]"
pip install "chimeric[cerebras]"
pip install "chimeric[grok]"

# Multiple providers
pip install "chimeric[openai,anthropic,google]"

# All providers
pip install "chimeric[all]"
```
## Quickstart

```python
from chimeric import Chimeric

client = Chimeric()  # Auto-detects API keys from environment

response = client.generate(
    model="gpt-4o",
    messages="Hello!",
)
print(response.content)
```
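`messages` accepts either a plain string, as above, or the list-of-role-dicts chat format used in the later examples. The same call with the explicit format:

```python
# Equivalent call using the chat-message list format instead of a
# bare string; both forms appear in the examples on this page.
response = client.generate(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.content)
```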
## Use Cases

**Multi-Provider Switching:**

```python
# Seamlessly switch between providers - string input
gpt_response = client.generate(model="gpt-4o", messages="Explain quantum computing")
claude_response = client.generate(model="claude-3-5-haiku-latest", messages="Write a poem about AI")
gemini_response = client.generate(model="gemini-2.5-flash", messages="Summarize climate change")
```
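Because every provider is reached through the same `generate` call, comparing models reduces to a plain loop (reusing the client and model names above):

```python
# Ask each provider the same question through the one unified call.
prompt = "Explain quantum computing"
for model in ("gpt-4o", "claude-3-5-haiku-latest", "gemini-2.5-flash"):
    response = client.generate(model=model, messages=prompt)
    print(f"--- {model} ---\n{response.content}\n")
```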
**Flexibility:**

```python
# Mixed usage in the same application
unified = client.generate(model="claude-3-5-haiku-latest", messages="Code review this function")
native = client.generate(model="claude-3-5-haiku-latest", messages="Debug this error", native=True)

# Use unified responses for consistent cross-provider code
print(unified.content)

# Use native responses for provider-specific features
if hasattr(native, "stop_reason"):
    print(f"Claude stop reason: {native.stop_reason}")
```
**Streaming:**

```python
stream = client.generate(
    model="gpt-4o",
    messages="Tell me a story about space exploration",
    stream=True,
)
for chunk in stream:
    print(chunk.content, end="", flush=True)
```
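If the full text is also needed afterwards, the chunks can be collected while streaming; a minimal sketch, assuming each chunk exposes the same `.content` attribute used above:

```python
# Collect the streamed text while printing it.
parts = []
for chunk in client.generate(model="gpt-4o",
                             messages="Tell me a story about space exploration",
                             stream=True):
    parts.append(chunk.content)
    print(chunk.content, end="", flush=True)
story = "".join(parts)
```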
**Function Calling:**

```python
@client.tool()
def get_weather(city: str) -> str:
    """Get current weather for a city."""
    return f"Sunny, 72°F in {city}"

response = client.generate(
    model="gpt-4o",
    messages=[{"role": "user", "content": "What's the weather in NYC?"}],
)
print(response.content)
```
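The decorator appears to describe the tool to the model from its typed signature and docstring, so additional tools follow the same pattern; a hypothetical second tool:

```python
# Hypothetical second tool, registered the same way as get_weather;
# the typed signature and docstring are what describe it to the model.
@client.tool()
def get_time(city: str) -> str:
    """Get the current local time for a city."""
    return f"It is 12:00 PM in {city}"
```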
**Async Support:**

```python
import asyncio

async def main():
    response = await client.agenerate(
        model="claude-3-5-sonnet-latest",
        messages=[
            {"role": "user", "content": "Analyze this data"},
            {"role": "assistant",
             "content": "I'd be happy to help analyze data. What data would you like me to look at?"},
            {"role": "user", "content": "Sales figures from Q4"},
        ],
    )
    print(response.content)

asyncio.run(main())
```
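The async API also makes concurrent fan-out straightforward; a minimal sketch, assuming `agenerate` is awaitable as above, using `asyncio.gather` to issue several prompts at once:

```python
# Run several prompts concurrently; each call is the same awaitable
# agenerate used above.
async def summarize_all(texts: list[str]) -> list[str]:
    responses = await asyncio.gather(
        *(client.agenerate(model="claude-3-5-sonnet-latest", messages=t)
          for t in texts)
    )
    return [r.content for r in responses]
```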
## Known Limitations

- **Beta Status**: The API may change as we refine the interface
- **Provider Dependencies**: Each provider requires a separate installation extra
- **Rate Limits**: Subject to individual provider rate limits and quotas
- **Multimodal Support**: Image and audio support is untested and may vary by provider
- **Model Availability**: Some models may not be available in all regions
## Roadmap

- **Embeddings Support**: Unified interface for text embeddings across providers
- **Multimodal Support**: Enhanced support for images and audio
- **Cost Tracking**: Built-in usage and cost monitoring
- **Advanced Routing**: Load balancing and failover between providers

**License**: Chimeric is licensed under the MIT License.