MonsterAPI is Now Integrated With Portkey - Here’s Everything You Need to Know
MonsterAPI is now integrated with Portkey, making it easier for developers to route LLM text-generation requests directly to our cost-effective, scalable LLM APIs via the Portkey SDK.
We are pleased to announce the integration of MonsterAPI into the Portkey platform, offering developers seamless access to our suite of LLM APIs. These REST APIs, powered by our decentralized GPU cloud, offer scalable, low-cost, and reliable access to large language models such as Mistral 7B, Zephyr 7B, Phi-2, and many more.
Here’s a simple breakdown of what this means for you.
What is Portkey?
Portkey AI is an AI gateway and observability platform that helps companies develop, deploy, and manage their generative AI applications more efficiently.
It simplifies integration with large language models (LLMs) like OpenAI's GPT models, enabling features such as fallback mechanisms, load balancing, automatic retries, and caching.
The platform provides a suite of tools for monitoring performance and managing prompts, which can significantly enhance the cost-effectiveness, performance, and accuracy of AI applications. Portkey AI is particularly valuable for businesses looking to integrate AI functionality quickly and reliably, offering support for multiple AI providers and frameworks through its universal API and SDKs.
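Features such as fallbacks, load balancing, and retries are driven by a Portkey gateway config. As a rough, hedged sketch (the exact field names should be verified against Portkey's config documentation, and the virtual-key values here are placeholders), a config that retries failed requests and falls back from one provider to another might look like:

```python
import json

# Illustrative sketch of a Portkey gateway config. Field names follow
# Portkey's published config schema as I understand it; the virtual keys
# are placeholders, not real credentials.
gateway_config = {
    "strategy": {"mode": "fallback"},  # try targets in order until one succeeds
    "retry": {"attempts": 3},          # retry a failing request up to 3 times
    "targets": [
        {"virtual_key": "monsterapi-virtual-key"},  # primary: MonsterAPI
        {"virtual_key": "openai-virtual-key"},      # fallback provider
    ],
}

print(json.dumps(gateway_config, indent=2))
```

A config like this is typically attached when creating the Portkey client or passed per request, so routing logic stays out of application code.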
What Does MonsterAPI x Portkey Mean?
Our collaboration with Portkey.ai streamlines API integration, letting developers route LLM text-generation requests directly to our cost-effective, scalable LLM APIs through the Portkey SDK.
These APIs support a variety of applications, from chat completions to question answering and virtual support agents, facilitating the development of next-generation AI applications.
Key features of the Portkey integration with MonsterAPI:
- Simplified SDK Installation: Developers can install the Portkey SDK to facilitate easy integration with MonsterAPI's services.
- Virtual Key Initialization: Enhances security by using virtual keys for API interactions.
- Chat Completions: Enables direct invocation of chat completions using MonsterAPI, providing a seamless way to utilize language models.
- Prompt Management: Utilizes Portkey's prompt library to manage and test various prompts tailored for MonsterAPI.
- Support for Multiple Language Models: The integration supports various models including TinyLlama 1.1B, Phi2, Zephyr 7B, and Mistral 7B Instruct.
For further details, you can refer to the integration guide on Portkey's documentation page.
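The prompt-management feature above boils down to storing parameterized, versioned prompt templates that you can test against MonsterAPI models. The real templates live in Portkey's UI and are invoked through its SDK; the snippet below is only an illustrative, stdlib-only sketch of the underlying idea (a template plus named variables filled in at request time), with hypothetical variable names:

```python
from string import Template

# Hypothetical prompt template of the kind you might manage in Portkey's
# prompt library; $role and $question are illustrative variables.
prompt_template = Template(
    "You are a $role. Answer the user's question concisely.\n"
    "Question: $question"
)

rendered = prompt_template.substitute(
    role="virtual support agent",
    question="How do I reset my password?",
)
print(rendered)
```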
Portkey SDK Integration with MonsterAPI Models
Portkey provides a consistent API to interact with models from various providers. To integrate MonsterAPI with Portkey:
- Install the Portkey SDK
Add the Portkey SDK to your application to interact with MonsterAPI's API through Portkey's gateway.
NodeJS SDK
To install the portkey-ai package, use the following npm command:

```sh
npm install --save portkey-ai
```
Python SDK
To install the portkey-ai package, use the following pip command:

```sh
pip install portkey-ai
```
- Initialize Portkey with the Virtual Key
Set up Portkey with your virtual key as part of the initialization configuration. You can create a virtual key for MonsterAPI in the UI.
NodeJS SDK

```js
import Portkey from 'portkey-ai';

const portkey = new Portkey({
  apiKey: "PORTKEY_API_KEY", // defaults to process.env["PORTKEY_API_KEY"]
  virtualKey: "VIRTUAL_KEY"  // your MonsterAPI virtual key
});
```
Python SDK

```python
from portkey_ai import Portkey

portkey = Portkey(
    api_key="PORTKEY_API_KEY",  # Replace with your Portkey API key
    virtual_key="VIRTUAL_KEY"   # Replace with your virtual key for MonsterAPI
)
```
- Invoke Chat Completions with MonsterAPI
Use the Portkey instance to send requests to MonsterAPI. You can also override the virtual key directly in the API call if needed.
NodeJS SDK

```js
const chatCompletion = await portkey.chat.completions.create({
  messages: [{ role: 'user', content: 'Say this is a test' }],
  model: 'TinyLlama/TinyLlama-1.1B-Chat-v1.0',
});

console.log(chatCompletion.choices);
```
Python SDK

```python
completion = portkey.chat.completions.create(
    messages=[{"role": "user", "content": "Say this is a test"}],
    model="TinyLlama/TinyLlama-1.1B-Chat-v1.0"  # any supported MonsterAPI model
)

print(completion)
```
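Portkey returns chat completions in an OpenAI-compatible shape, so extracting the model's reply looks the same whichever MonsterAPI model served the request. As a sketch (the payload below is fabricated for illustration, not a real API response):

```python
import json

# Fabricated example payload in the OpenAI-compatible format that a
# chat-completions call returns through Portkey.
sample_response = json.loads("""
{
  "choices": [
    {
      "index": 0,
      "message": {"role": "assistant", "content": "This is a test."},
      "finish_reason": "stop"
    }
  ],
  "model": "TinyLlama/TinyLlama-1.1B-Chat-v1.0"
}
""")

# The assistant's reply lives at choices[0].message.content.
reply = sample_response["choices"][0]["message"]["content"]
print(reply)  # This is a test.
```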
Conclusion
The MonsterAPI and Portkey integration makes it easier to build powerful AI applications. With simplified integration, built-in observability, and scalability, you can keep your applications performing at their best and stay current with the latest generative AI models, delivering higher-quality responses to both generic and domain-specific queries.