Haystack x MonsterAPI: Powerful SLMs at your fingertips

By integrating MonsterAPI with Haystack, users can tap into powerful small language models (SLMs) to build state-of-the-art RAG pipelines for their chatbots and agents.


In the ever-evolving landscape of natural language processing, the integration of MonsterAPI with Haystack opens up exciting new possibilities for developers and researchers. This blog post will explore how you can leverage MonsterAPI's powerful language models within the Haystack framework to create sophisticated NLP applications.

Overview of MonsterAPI Integration

MonsterAPI provides access to a range of powerful language models designed for various text generation tasks. By integrating MonsterAPI with Haystack, users can tap into these models to enhance their natural language processing capabilities.

Getting Started

To begin using MonsterAPI with Haystack, you'll need to:

  1. Sign up at MonsterAPI.
  2. Set the environment variable MONSTER_API_KEY with your API key (see the snippet below).
  3. Choose a compatible model for your use case.
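For example, you can set the key from your shell or directly in Python before creating any components. This is a minimal sketch; the key value is a placeholder you replace with your own:

import os

# Placeholder: replace with the key from your MonsterAPI dashboard
os.environ["MONSTER_API_KEY"] = "YOUR_MONSTERAPI_KEY"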

Using MonsterAPI in Haystack

Let's look at two primary ways to use MonsterAPI within Haystack:

Using Generator for Question Answering

Here's an example of how to use a MonsterAPI model to perform question answering on a web page:

from haystack import Pipeline
from haystack.utils import Secret
from haystack.components.fetchers import LinkContentFetcher
from haystack.components.converters import HTMLToDocument
from haystack.components.builders import PromptBuilder
from haystack.components.generators import OpenAIGenerator

# Set up components
fetcher = LinkContentFetcher()
converter = HTMLToDocument()
prompt_template = """
According to the contents of this website:
{% for document in documents %}
 {{document.content}}
{% endfor %}
Answer the given question: {{query}}
Answer:
"""
prompt_builder = PromptBuilder(template=prompt_template)
llm = OpenAIGenerator(
    api_key=Secret.from_env_var("MONSTER_API_KEY"),
    api_base_url="https://llm.monsterapi.ai/v1/",
    model="microsoft/Phi-3-mini-4k-instruct",
    generation_kwargs={"max_tokens": 256}
)

# Build and run the pipeline
pipeline = Pipeline()
pipeline.add_component("fetcher", fetcher)
pipeline.add_component("converter", converter)
pipeline.add_component("prompt", prompt_builder)
pipeline.add_component("llm", llm)

pipeline.connect("fetcher.streams", "converter.sources")
pipeline.connect("converter.documents", "prompt.documents")
pipeline.connect("prompt.prompt", "llm.prompt")

result = pipeline.run({
    "fetcher": {"urls": ["https://developer.monsterapi.ai/docs/"]},
    "prompt": {"query": "What are the features of MonsterAPI?"}
})

print(result["llm"]["replies"][0])

Using ChatGenerator for Multi-turn Conversations

For engaging in multi-turn conversations, you can use the OpenAIChatGenerator:

from haystack.dataclasses import ChatMessage
from haystack.utils import Secret
from haystack.components.generators.chat import OpenAIChatGenerator

generator = OpenAIChatGenerator(
    api_key=Secret.from_env_var("MONSTER_API_KEY"),
    api_base_url="https://llm.monsterapi.ai/v1/",
    model="microsoft/Phi-3-mini-4k-instruct",
    generation_kwargs={"max_tokens": 256}
)

messages = []
while True:
    msg = input("Enter your message or Q to exit\n ")
    if msg == "Q":
        break
    messages.append(ChatMessage.from_user(msg))
    response = generator.run(messages=messages)
    assistant_resp = response['replies'][0]
    print(assistant_resp.content)
    messages.append(assistant_resp)

Building a RAG Bot with MonsterAPI and Haystack in Minutes: 2024 Indian Budget Chatbot

To create a Retrieval-Augmented Generation (RAG) bot using MonsterAPI and Haystack, you can combine the power of Haystack's document retrieval capabilities with MonsterAPI's language models. Here's a high-level approach:

  1. Index your documents using Haystack's document stores.
  2. Use Haystack's retriever components to fetch relevant documents based on user queries.
  3. Utilize the PromptBuilder to create a context-aware prompt, including the retrieved documents.
  4. Pass the prompt to the MonsterAPI model via the OpenAIGenerator for generating responses.

This approach allows you to create a bot that can answer questions based on your specific knowledge base while leveraging the language understanding capabilities of MonsterAPI's models.
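Here is a minimal sketch of such an indexing-and-retrieval pipeline. The in-memory document store, BM25 retriever, and toy documents below are illustrative assumptions; any Haystack document store and retriever can be substituted:

from haystack import Document, Pipeline
from haystack.utils import Secret
from haystack.document_stores.in_memory import InMemoryDocumentStore
from haystack.components.retrievers.in_memory import InMemoryBM25Retriever
from haystack.components.builders import PromptBuilder
from haystack.components.generators import OpenAIGenerator

# 1. Index documents (here, a toy in-memory store with hand-written documents)
document_store = InMemoryDocumentStore()
document_store.write_documents([
    Document(content="MonsterAPI exposes an OpenAI-compatible endpoint for SLMs."),
    Document(content="Haystack pipelines connect retrievers, prompt builders, and generators."),
])

# 2. Retrieve relevant documents for the user query
retriever = InMemoryBM25Retriever(document_store=document_store)

# 3. Build a context-aware prompt from the retrieved documents
prompt_builder = PromptBuilder(template="""
Given these documents:
{% for document in documents %}
{{ document.content }}
{% endfor %}
Answer the question: {{ query }}
Answer:
""")

# 4. Generate the answer with a MonsterAPI-hosted model
llm = OpenAIGenerator(
    api_key=Secret.from_env_var("MONSTER_API_KEY"),
    api_base_url="https://llm.monsterapi.ai/v1/",
    model="microsoft/Phi-3-mini-4k-instruct",
    generation_kwargs={"max_tokens": 256},
)

rag = Pipeline()
rag.add_component("retriever", retriever)
rag.add_component("prompt", prompt_builder)
rag.add_component("llm", llm)
rag.connect("retriever.documents", "prompt.documents")
rag.connect("prompt.prompt", "llm.prompt")

query = "What does MonsterAPI provide?"
result = rag.run({"retriever": {"query": query}, "prompt": {"query": query}})
print(result["llm"]["replies"][0])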

Potential Use Cases and Utilities

The integration of MonsterAPI with Haystack opens up a wide range of possibilities:

  1. Customized chatbots for customer support
  2. Document summarization and analysis tools
  3. Content generation for marketing and SEO
  4. Automated report generation from structured data
  5. Language translation and localization services
  6. Sentiment analysis and opinion mining applications

For more information and updates, be sure to check out the MonsterAPI documentation and the Haystack GitHub repository.

Indian 2024 Budget Chatbot

pip install haystack-ai pypdf

import os
from haystack import Pipeline
from haystack.utils import Secret
from haystack.components.fetchers import LinkContentFetcher
from haystack.components.converters import PyPDFToDocument
from haystack.components.builders import PromptBuilder
from haystack.components.generators import OpenAIGenerator
from haystack.dataclasses import ChatMessage

# Set up the MonsterAPI key
os.environ["MONSTER_API_KEY"] = "YOUR_MONSTERAPI_KEY"

# Initialize components
fetcher = LinkContentFetcher()
converter = PyPDFToDocument()  # the budget speech URL points to a PDF, not an HTML page

prompt_template = """
Based on the following information about the Indian Budget 2024:
{% for document in documents %}
{{document.content}}
{% endfor %}

Answer the user's question: {{query}}

Answer:
"""

prompt_builder = PromptBuilder(template=prompt_template)

llm = OpenAIGenerator(
    api_key=Secret.from_env_var("MONSTER_API_KEY"),
    api_base_url="https://llm.monsterapi.ai/v1/",
    model="microsoft/Phi-3-mini-4k-instruct",
    generation_kwargs={"max_tokens": 256}
)

# Set up the pipeline
pipeline = Pipeline()
pipeline.add_component("fetcher", fetcher)
pipeline.add_component("converter", converter)
pipeline.add_component("prompt", prompt_builder)
pipeline.add_component("llm", llm)

pipeline.connect("fetcher.streams", "converter.sources")
pipeline.connect("converter.documents", "prompt.documents")
pipeline.connect("prompt.prompt", "llm.prompt")

# Function to fetch and process budget information
def fetch_budget_info():
    result = pipeline.run({
        "fetcher": {"urls": ["https://www.indiabudget.gov.in/doc/budget_speech.pdf"]},
        "prompt": {"query": "Summarize the key points of the Indian Budget 2024"}
    })
    return result["llm"]["replies"][0]

# Chat function
def chat():
    print("Welcome to the Indian 2024 Budget Chatbot!")
    print("Ask me anything about the Indian Budget for 2024.")
    print("Type 'exit' to end the conversation.")

    # Fetch and process budget information
    budget_info = fetch_budget_info()

    messages = []
    while True:
        user_input = input("\nYou: ")
        if user_input.lower() == 'exit':
            print("Thank you for using the Indian 2024 Budget Chatbot. Goodbye!")
            break

        messages.append(ChatMessage.from_user(user_input))

        # Prepare the query with context (the budget summary fetched above)
        query = f"Given the Indian Budget 2024 information: {budget_info}\n\nUser question: {user_input}"

        # The fetcher is a required pipeline input, so the source URL is passed on every run
        response = pipeline.run({
            "fetcher": {"urls": ["https://www.indiabudget.gov.in/doc/budget_speech.pdf"]},
            "prompt": {"query": query}
        })

        assistant_resp = response["llm"]["replies"][0]
        print(f"Chatbot: {assistant_resp}")
        messages.append(ChatMessage.from_assistant(assistant_resp))

if __name__ == "__main__":
    chat()

Create and add a Hugging Face Space:

To try it yourself, visit here.
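One straightforward way to ship the bot as a Space is a small Gradio app that wraps the chat logic. The sketch below is an assumption about how such an app.py could look, not the exact app behind the demo above; it reuses the OpenAIChatGenerator from the earlier multi-turn example and Gradio's gr.ChatInterface:

import gradio as gr
from haystack.dataclasses import ChatMessage
from haystack.utils import Secret
from haystack.components.generators.chat import OpenAIChatGenerator

# Same MonsterAPI-backed chat generator as in the multi-turn example above
generator = OpenAIChatGenerator(
    api_key=Secret.from_env_var("MONSTER_API_KEY"),
    api_base_url="https://llm.monsterapi.ai/v1/",
    model="microsoft/Phi-3-mini-4k-instruct",
    generation_kwargs={"max_tokens": 256},
)

def respond(message, history):
    # Rebuild the conversation from Gradio's (user, assistant) history pairs
    messages = []
    for user_turn, assistant_turn in history:
        messages.append(ChatMessage.from_user(user_turn))
        messages.append(ChatMessage.from_assistant(assistant_turn))
    messages.append(ChatMessage.from_user(message))
    reply = generator.run(messages=messages)["replies"][0]
    return reply.content

gr.ChatInterface(respond, title="Indian 2024 Budget Chatbot").launch()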

Conclusion

The integration of MonsterAPI with Haystack provides developers with a powerful toolkit for building sophisticated NLP applications. By combining Haystack's flexible pipeline architecture with MonsterAPI's advanced language models, you can create tailored solutions for a wide range of text processing and generation tasks. We encourage you to explore this integration and discover new ways to leverage these tools in your projects.