
Andrew Ng’s Python AISuite, a Game-Changer for AI Developers
The AI landscape is teeming with exciting innovations, but managing interactions with multiple large language model (LLM) providers can feel like wrangling a digital hydra. Every provider has its own API, SDK, and quirks. This is where AISuite, an open-source Python library introduced by Andrew Ng, steps in. AISuite unifies and simplifies communication with multiple LLMs such as OpenAI, Anthropic, and Meta's Llama, letting you focus on building intelligent solutions without the hassle.
This article introduces AISuite, provides installation and setup instructions, and showcases unique examples to inspire you.
What is Python AISuite?
AISuite is a Python package that acts as a middle layer for interacting with multiple LLMs. With AISuite, you can access various AI models by simply specifying their provider and name, like "openai:gpt-4o". It eliminates the need to learn different SDKs or juggle complex configurations.
Key highlights of AISuite:
- Unified Interface: A single client to interact with multiple LLM providers.
- Ease of Use: Minimal setup required to start experimenting.
- Flexibility: Easily compare outputs across models for research, teaching, or development purposes (a short sketch of this follows the list).
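To make that concrete, here is a minimal sketch that sends one prompt to two different providers through the same client and prints both answers. It assumes you have already installed AISuite and configured the relevant API keys (both covered below); the two model identifiers are simply examples of the provider:model naming convention.
import aisuite as ai

client = ai.Client()

messages = [
    {"role": "user", "content": "Summarize the benefits of solar energy in two sentences."}
]

# Any models in "provider:model" form work here; these two are only examples.
for model in ["openai:gpt-4o", "anthropic:claude-3-5-sonnet-20240620"]:
    response = client.chat.completions.create(model=model, messages=messages)
    print(f"--- {model} ---")
    print(response.choices[0].message.content)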
Installation and Setup – AISuite Python
AISuite is easy to install and configure.
Step 1: Install AISuite – An Open-Source Python Package
Use pip to install AISuite along with optional dependencies for additional functionality:
pip install aisuite[all]
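Note: on some shells (zsh, for example) the square brackets are treated as glob characters, so you may need to quote the package specifier:
pip install "aisuite[all]"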
You may need additional libraries for specific providers, such as openai or anthropic:
pip install openai anthropic
Step 2: Securely Storing API Keys
2.1: Why Should You Store API Keys in a Separate File?
API keys are sensitive credentials that allow you to access third-party services like OpenAI, Anthropic, Groq, and Hugging Face. To avoid exposing these keys in your code, it’s crucial to store them securely. Hardcoding your keys directly into your script increases the risk of accidental exposure, especially if you share your code publicly or collaborate with others.
2.2: Best Practices for Secure Key Management
- Use Environment Variables: Storing API keys as environment variables ensures that sensitive data isn’t exposed within your code (a short sketch follows this list).
- Create a Config File: Another approach is to store your API keys in a separate Python file (e.g., config.py) and import them into your main script.
- Use .gitignore: If you’re using version control (e.g., Git), ensure that your API key file is excluded from being tracked by adding it to .gitignore.
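If you go the environment-variable route, here is a minimal sketch of reading a key at runtime; the variable name matches the one used later in this article, and the export command in the comment is just an example.
import os

# Assumes the key was exported in your shell beforehand, for example:
#   export OPENAI_API_KEY="your_openai_api_key_here"
openai_key = os.getenv("OPENAI_API_KEY")
if openai_key is None:
    raise RuntimeError("OPENAI_API_KEY is not set")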
2.3: Creating a config.py File
Let’s store our API keys in a separate file. Create a new Python file named config.py and add your API keys. Here is an example with placeholders for OpenAI, Anthropic, Groq, and a Hugging Face access token:
# config.py
OPENAI_API_KEY = "your_openai_api_key_here"
ANTHROPIC_API_KEY = "your_anthropic_api_key_here"
GROQ_API_KEY = "your_groq_api_key"
HF_API_KEY = "your_hugging_face_access_token"
2.4: Importing the API Keys into Your Main Script
To use the API keys securely, you’ll import them from config.py into your main Python script. You can also set them as environment variables in the script to ensure they are available when interacting with AISuite.
import os
from config import OPENAI_API_KEY, ANTHROPIC_API_KEY, GROQ_API_KEY, HF_API_KEY

# Set the API keys as environment variables so the AISuite providers can read them
os.environ['OPENAI_API_KEY'] = OPENAI_API_KEY
os.environ['ANTHROPIC_API_KEY'] = ANTHROPIC_API_KEY
os.environ['HUGGINGFACE_TOKEN'] = HF_API_KEY
os.environ['GROQ_API_KEY'] = GROQ_API_KEY
Step 3: Initialize AISuite
Once the keys are in place, initializing AISuite is straightforward:
import aisuite as ai
client = ai.Client()
Step 4: Hands-On Examples with Andrew Ng’s Python AISuite
Now let’s dive into some practical examples. These prompts highlight AISuite’s ability to work seamlessly across multiple providers.
4.1: Example 1- Brainstorming App Ideas with GPT-4o
Let’s use OpenAI’s GPT-4o model to brainstorm innovative app ideas for sustainability.
messages = [
    {"role": "system", "content": "You are a creative thinker."},
    {"role": "user", "content": "Suggest five app ideas that promote sustainable living."}
]

response = client.chat.completions.create(model="openai:gpt-4o", messages=messages)

# Number each line of the model's reply before printing it
print("\n".join([f"{i+1}. {idea}" for i, idea in enumerate(response.choices[0].message.content.splitlines())]))
Output:
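AISuite also forwards extra keyword arguments, such as temperature, to the underlying provider (the project's own examples use temperature=0.75), so you can tune generation without changing the rest of the call. A quick sketch reusing the messages above:
# Same request as above, but with a sampling temperature passed through to the provider
response = client.chat.completions.create(
    model="openai:gpt-4o",
    messages=messages,
    temperature=0.75
)
print(response.choices[0].message.content)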

4.2: Example 2- Asking Questions with Hugging Face Mistral-7B-Instruct-v0.3 – AISuite
Switch to the Mistral-7B-Instruct model to ask a question.
import os
import aisuite as ai
from config import HF_API_KEY

# Make the token available to the Hugging Face provider
# (same environment variable name as in Step 2.4)
os.environ['HUGGINGFACE_TOKEN'] = HF_API_KEY

# Initialize the AISuite client
client = ai.Client()

# Specify the Hugging Face model
hf_model = "huggingface:mistralai/Mistral-7B-Instruct-v0.3"

# Prepare the conversation
messages = [
    {"role": "user", "content": "Can you explain the concept of generative AI in simple terms?"}
]

# Generate a response
response = client.chat.completions.create(
    model=hf_model,
    messages=messages
)
# Print the response
print("AI Response:")
print(response.choices[0].message.content)
Output:

4.3: Example 3- Solving Math Problems with groq:llama3-8b-8192 – Andrew Ng’s AISuite
Switch to groq:llama3-8b-8192 and ask a simple mathematics question.
import os
import aisuite as ai
from config import GROQ_API_KEY

# Make the Groq key available to the Groq provider (as in Step 2.4)
os.environ['GROQ_API_KEY'] = GROQ_API_KEY

# Initialize the AISuite client
client = ai.Client()

messages = [
    {"role": "system", "content": "You are a math tutor."},
    {"role": "user", "content": "Explain how to solve for x in the equation 2x + 5 = 15."}
]
groq_llama3_8b = "groq:llama3-8b-8192"
# groq_llama3_70b = "groq:llama3-70b-8192"
response = client.chat.completions.create(model=groq_llama3_8b, messages=messages)
print(response.choices[0].message.content)
Output:
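Since the commented-out line above already names the larger Llama 3 70B variant, here is a minimal sketch that runs the same question against both Groq-hosted models so you can compare their explanations (model availability on Groq may change over time):
# Compare the 8B and 70B Llama 3 models on the same prompt
for model in ["groq:llama3-8b-8192", "groq:llama3-70b-8192"]:
    response = client.chat.completions.create(model=model, messages=messages)
    print(f"--- {model} ---")
    print(response.choices[0].message.content)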

4.4: Example 4- Sentiment Analysis Using LLMs
Analyze a customer review’s sentiment with OpenAI’s GPT model.
messages = [
    {"role": "system", "content": "You are a sentiment analysis expert."},
    {"role": "user", "content": "Analyze the sentiment of this review: 'The product quality is excellent, but delivery was delayed.'"}
]

response = client.chat.completions.create(model="openai:gpt-4o", messages=messages)
print(response.choices[0].message.content)
Output:
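If you want a label you can use programmatically rather than a free-form explanation, one option (an illustrative variation, not part of the original example) is to ask the model to answer with a single word:
# Ask for a one-word label so the result is easy to use downstream
messages = [
    {"role": "system", "content": "You are a sentiment analysis expert. Reply with exactly one word: Positive, Negative, or Mixed."},
    {"role": "user", "content": "The product quality is excellent, but delivery was delayed."}
]

response = client.chat.completions.create(model="openai:gpt-4o", messages=messages)
sentiment = response.choices[0].message.content.strip()
print(f"Sentiment: {sentiment}")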

Why Choose Andrew Ng’s Python AISuite?
- Effortless Multi-Model Integration: Use a single interface for leading providers.
- Time-Saving: Skip learning multiple APIs; focus on results.
- Flexible Use Cases: Whether it’s education, development, or research, AISuite adapts.
- Open Source: Freely accessible, with growing community support.
Visit here – Official Repository Link
Conclusion
Andrew Ng’s Python AISuite empowers developers and researchers to get the best out of LLMs without wrestling with disparate APIs. It is an ideal tool for anyone eager to innovate, compare models, and build AI solutions efficiently, bringing the leading AI models together behind one simple interface.
GitHub Repository link for the code used in this article.
Whether you’re coding a chatbot, analyzing sentiment, or brainstorming ideas, AISuite simplifies the journey. Install it today and let your creativity flow!
Stay tuned for more hands-on examples!