Mira API - MCP Server

MCP (Model Context Protocol) is quickly becoming the recognized standard for integrating remote tools and resources into LLMs. You can read more about this standard on the official MCP site.

In this guide we’ll explain the new MCP Server integration option for the Mira API by showing how to integrate Mira with OpenAI.

This guide is largely based on the OpenAI Remote MCP guide, so we won’t repeat the Responses API options and descriptions here. Please refer to the original guide for full details and for code samples in other programming languages.

Before you start

To run through this tutorial, you will need:

  • Your Meltwater API token
  • Access to the Mira API Beta program
  • Access to OpenAI, and your OpenAI API token

Meltwater API MCP Server Specification

The Meltwater API MCP Server exposes the following description and tools to LLMs:

{
    "jsonrpc": "2.0",
    "id": 2,
    "result": {
        "tools": [
            {
                "annotations": {
                    "readOnlyHint": true,
                    "destructiveHint": false,
                    "idempotentHint": false,
                    "openWorldHint": true
                },
                "description": "Ask Meltwater's MIRA assistant about stories, trends, and topics in news or social media. \n\nUse natural language queries like 'What were the top stories for Microsoft last week?' or 'How is climate change being discussed on social media?'\n\nAccess current news stories, social media trends, and emerging topics across global media sources.\nGet insights on brand sentiment, competitive analysis, and industry developments.\nUnderstand social media conversations, influencer discussions, and public opinion trends.\nAnalyze breaking news, story development, and media coverage patterns.",
                "inputSchema": {
                    "properties": {
                        "prompt": {
                            "description": "Prompt to send to the conversational chat",
                            "type": "string"
                        }
                    },
                    "required": [
                        "prompt"
                    ],
                    "type": "object"
                },
                "name": "chat"
            }
        ]
    }
}

Note the descriptions provided in the specification: these are critical to how an LLM selects the exposed tools, and how it forms prompts when calling them.
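If you want to inspect this tool list yourself, you can send the server a JSON-RPC `tools/list` request. The sketch below builds that request and shows how it could be POSTed; the transport details (a plain HTTP POST with your `apikey` header) are an assumption inferred from the spec output above, so check the MCP transport documentation before relying on them.

```python
import json
import urllib.request

MCP_URL = "https://api.meltwater.com/mcp"

def build_tools_list_request(request_id: int = 1) -> dict:
    """Build a JSON-RPC 2.0 tools/list request body."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/list",
        "params": {},
    }

def list_tools(api_key: str) -> dict:
    """POST the tools/list request to the MCP server (assumed transport; not called here)."""
    req = urllib.request.Request(
        MCP_URL,
        data=json.dumps(build_tools_list_request()).encode(),
        headers={"Content-Type": "application/json", "apikey": api_key},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

print(build_tools_list_request())
```

The response to this request is the specification shown above.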

Specifying an MCP Server in OpenAI

The OpenAI Responses API uses the following structure to reference remote MCP servers:

{
    "type": "mcp",
    "server_label": "meltwater-api",
    "server_url": "https://api.meltwater.com/mcp",
    "require_approval": "never",
    "headers": {
        "apikey": <YOUR MELTWATER API KEY>
    }
}

Here the critical fields are:

  • server_url: The URL for the Meltwater MCP server.
  • apikey: Your Meltwater API key, which will be sent as a header in requests to the server by OpenAI.
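Because the same tool entry appears in every Responses API call, it can help to build it with a small helper. The function below is a sketch (the helper name is our own invention); it simply reproduces the structure shown above.

```python
def meltwater_mcp_tool(api_key: str, require_approval: str = "never") -> dict:
    """Build the Responses API tool entry for the Meltwater MCP server."""
    return {
        "type": "mcp",
        "server_label": "meltwater-api",
        "server_url": "https://api.meltwater.com/mcp",
        "require_approval": require_approval,
        "headers": {"apikey": api_key},
    }

# Example usage with a placeholder key:
tool = meltwater_mcp_tool("<YOUR MELTWATER API KEY>")
print(tool["server_url"])
```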

Calling the Responses API

With the MCP approach, calling the Responses API with the MCP server tools attached requires very little code.

This example code calls the Responses API, and provides a simple prompt for a news brief. Because of the description provided by the Meltwater MCP server, the LLM should know to direct questions regarding news and social insights to the Mira API.

from openai import OpenAI

client = OpenAI(api_key="<YOUR OPENAI API KEY>")

resp = client.responses.create(
    model="gpt-4.1",
    tools=[
        {
            "type": "mcp",
            "server_label": "meltwater-api",
            "server_url": "https://api.meltwater.com/mcp",
            "require_approval": "never",
            "headers": {
                "apikey": <YOUR MELTWATER API KEY>
              }
        },
    ],
    input="What is going on in the news with fast fashion in the last 7 days?",
)

print(resp.output_text)

The output from the Responses API call will be returned in Markdown format. For our example prompt the output was:

1. **Shein Faces Backlash Amid Pop-Up Store Launch in Dijon**
   - Summary: Shein's pop-up store in Dijon attracted hundreds of customers but also faced protests accusing the brand of exploitation and environmental harm. Critics tagged the store with messages like "Shein kills" and "pollution," highlighting the controversies surrounding ultra-fast fashion[^14][^15].

2. **France Targets Ultra-Fast Fashion with New Climate Bill**
   - Summary: France has introduced amendments to its climate bill, imposing penalties on ultra-fast fashion brands like Shein and Temu. The law aims to curb hyperproduction and promote sustainability, marking a significant regulatory step in the global fight against disposable fashion[^25][^26][^27].

3. **Sustainability in Fashion: Shein's Climate Goals Under Scrutiny**
   - Summary: Shein's climate goals, validated by the Science Based Targets initiative, have been criticized as greenwashing. Despite pledges to cut emissions and adopt renewable energy, the brand's reliance on polyester and allegations of unethical practices raise doubts about its commitment to sustainability[^22][^23].

4. **Comptoir des Cotonniers and Princesse Tam Tam Seek Judicial Recovery**
   - Summary: French fashion brands Comptoir des Cotonniers and Princesse Tam Tam have filed for judicial recovery due to financial struggles. The brands, owned by Fast Retailing, have faced challenges from mid-range fashion competition and market shifts[^17][^18][^19].

...

For fuller examples and explanation of the Responses API, see the official documentation.
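Beyond the final text, the Responses API output also records the tool interactions themselves as `mcp_list_tools` and `mcp_call` items, as described in the OpenAI Remote MCP guide. The sketch below pulls out the calls the model made; the sample data is hypothetical, and real output items are SDK objects rather than plain dicts, so adapt the attribute access accordingly.

```python
def extract_mcp_calls(output_items: list) -> list:
    """Collect (tool name, arguments) pairs for each MCP tool call in a response."""
    return [
        (item["name"], item["arguments"])
        for item in output_items
        if item.get("type") == "mcp_call"
    ]

# Hypothetical output items, shaped like the examples in the OpenAI Remote MCP guide.
sample_output = [
    {"type": "mcp_list_tools", "tools": [{"name": "chat"}]},
    {
        "type": "mcp_call",
        "name": "chat",
        "arguments": '{"prompt": "What is going on in the news with fast fashion?"}',
    },
]

print(extract_mcp_calls(sample_output))
```

Inspecting these items is useful for verifying that the model actually routed your question to the Mira `chat` tool, and for debugging the prompt it generated.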

Limits

When using the MCP server, you are subject to the rate limits of the underlying Meltwater API endpoints being used.

In this case, the Mira API Chat Completion endpoint limit applies.