aydie-genai 1.1.0
```
pip install aydie-genai
```
Project description
`aydie_genai` is a Python library designed to eliminate the complexity of working with multiple Generative AI models. Instead of writing different code for Gemini, OpenAI, Claude, Groq, and others, you can use one simple, unified function to access them all.
Key Features
- Unified Interface: A single `generate()` function for all supported models.
- Simple & Intuitive: Get started in minutes. No need to learn multiple SDKs.
- Provider Agnostic: Switch between models like `gpt-4o` and `gemini-1.5-pro` by changing a single string.
- Built-in Documentation: Rich docstrings provide in-console help via `help(genai.generate)`.
- Robust Error Handling: Custom exceptions for common issues like missing API keys (see the sketch after this list).
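The library's custom exception classes are not named on this page, so the sketch below catches a broad `Exception` as a stand-in; swap in the specific `aydie_genai` exception once you have its name from `help()` or the source.

```python
from aydie_genai import genai

# Hedged sketch: aydie_genai raises custom exceptions for issues such as a
# missing API key, but the exact class names are not listed on this page,
# so Exception is caught broadly here purely for illustration.
try:
    response = genai.generate(
        model='gpt-4o',
        prompt='Say hello in three languages.'
    )
    print(response)
except Exception as err:
    print(f"Generation failed (is OPENAI_API_KEY set?): {err}")
```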
Supported Models
| Provider | Supported Models | Environment Variable |
|---|---|---|
| Google | `gemini-1.5-pro`, `gemini-1.5-flash`, `gemini-1.0-pro`, ... | `GOOGLE_API_KEY` |
| OpenAI | `gpt-4o`, `gpt-4-turbo`, `gpt-3.5-turbo`, ... | `OPENAI_API_KEY` |
| Anthropic | `claude-3-opus-20240229`, `claude-3-sonnet-20240229`, ... | `ANTHROPIC_API_KEY` |
| Groq | `llama3-70b-8192`, `llama3-8b-8192`, `mixtral-8x7b-32768`, ... | `GROQ_API_KEY` |
| DeepSeek | `deepseek-chat`, `deepseek-coder`, `deepseek-v2-chat` | `DEEPSEEK_API_KEY` |
⚡ Quick Start
1. Set Up Your API Keys
The library loads API keys from environment variables. The easiest way to manage these is to create a `.env` file in your project's root directory.
```
# .env file
GOOGLE_API_KEY="your_google_api_key"
OPENAI_API_KEY="your_openai_api_key"
ANTHROPIC_API_KEY="your_anthropic_api_key"
GROQ_API_KEY="your_groq_api_key"
DEEPSEEK_API_KEY="your_deepseek_api_key"
```
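The keys in a `.env` file still need to reach the process environment before `generate()` runs. Whether `aydie_genai` loads the `.env` file automatically is not stated on this page, so a safe, minimal approach is to load it explicitly with `python-dotenv`:

```python
import os
from dotenv import load_dotenv  # pip install python-dotenv

# Read the .env file in the current directory and export its variables
# into the process environment so aydie_genai can pick them up.
load_dotenv()

# Quick sanity check for the key you plan to use.
print("GOOGLE_API_KEY set:", bool(os.getenv("GOOGLE_API_KEY")))
```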
2. Generate a Response
Using the library is incredibly simple. Just import `genai` and call the `generate()` function.
```python
from aydie_genai import genai

# Example 1: Using Google Gemini
response_gemini = genai.generate(
    model='gemini-1.5-pro',
    prompt='Explain the theory of relativity in simple terms for a five-year-old.',
    system_instruction='You are a friendly and patient teacher.'
)
print("Gemini says:", response_gemini)

# Example 2: Switching to OpenAI's GPT-4o is as easy as changing the model name
response_openai = genai.generate(
    model='gpt-4o',
    prompt='Write a short story about a robot who discovers music.'
)
print("\nOpenAI says:", response_openai)
```
Function Parameters
The `genai.generate()` function accepts the following parameters:
| Parameter | Type | Description |
|---|---|---|
| `model` | `str` | Required. The identifier for the model you want to use (e.g., `'gpt-4o'`). |
| `prompt` | `str` | Required. The main user input or question for the model. |
| `system_instruction` | `str` | A directive to guide the model's behavior or personality. |
| `temperature` | `float` | Controls randomness (`0.0` for deterministic, `2.0` for most random). Defaults to `1.0`. |
| `max_tokens` | `int` | The maximum number of tokens to generate in the response. Defaults to `2048`. |
| `top_p` | `float` | Nucleus sampling parameter. Defaults to `1.0`. |
| `api_key` | `str` | Directly pass an API key, overriding the environment variable. Defaults to `None`. |
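For completeness, here is a sketch that passes every optional parameter explicitly; the values are arbitrary examples, and `api_key` simply overrides the `ANTHROPIC_API_KEY` environment variable for this one call.

```python
from aydie_genai import genai

# All optional parameters set explicitly; api_key overrides the
# ANTHROPIC_API_KEY environment variable for this single call.
response = genai.generate(
    model='claude-3-sonnet-20240229',
    prompt='Summarize the plot of Hamlet in two sentences.',
    system_instruction='You are a concise literary assistant.',
    temperature=0.3,
    max_tokens=512,
    top_p=0.9,
    api_key='your_anthropic_api_key'
)
print(response)
```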
Meta
- License: MIT License
- Author: Aydie