MCP Server

We have an official MCP Server implementation that runs in the cloud or can be run locally using Docker or NPX.

Tools

  1. scrape_url_html

    • Scrape a website by URL using the ScrAPI service and retrieve the result as HTML. Use this for website content that is difficult to access because of bot detection, captchas, or geolocation restrictions. The HTML result is preferable when advanced parsing of the page structure is required.
    • Input: url (string)
    • Returns: HTML content of the URL
  2. scrape_url_markdown

    • Scrape a website by URL using the ScrAPI service and retrieve the result as Markdown. Use this for website content that is difficult to access because of bot detection, captchas, or geolocation restrictions. The Markdown result is preferable when the text content of the page matters more than its structure. A minimal example call is sketched after this list.
    • Input: url (string)
    • Returns: Markdown content of the URL
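
Both tools are invoked with a standard MCP tool call. The snippet below is a minimal, untested sketch of what such a call could look like from a custom TypeScript client using the official MCP SDK (@modelcontextprotocol/sdk), launching the server locally with the NPX command from the Setup section; the client name, example URL, and placeholder API key are illustrative only.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Start the ScrAPI MCP server locally over stdio (same command as the NPX setup below).
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["-y", "@deventerprisesoftware/scrapi-mcp"],
    // Keep the parent environment and add the API key (illustrative placeholder value).
    env: { ...process.env, SCRAPI_API_KEY: "<YOUR_API_KEY>" } as Record<string, string>,
  });

  const client = new Client({ name: "scrapi-example", version: "1.0.0" }, { capabilities: {} });
  await client.connect(transport);

  // Ask for the Markdown version of a page; swap in scrape_url_html for raw HTML.
  const result = await client.callTool({
    name: "scrape_url_markdown",
    arguments: { url: "https://example.com" },
  });

  // The response content is a list of content items; text items carry the Markdown.
  console.log(result.content);

  await client.close();
}

main().catch(console.error);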

Setup

Cloud Server

The ScrAPI MCP Server is also available in the cloud over SSE at https://api.scrapi.dev/sse

Cloud MCP servers are not widely supported yet, but you can access this one directly from your own custom clients or test it with MCP Inspector. There is currently no facility to pass your API key through when connecting to the cloud MCP server.
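
To try the cloud server without writing any code, MCP Inspector can be pointed at the SSE URL (it is typically launched with npx @modelcontextprotocol/inspector). For a custom client, the snippet below is a rough sketch of connecting over SSE with the TypeScript MCP SDK and listing the available tools; no API key is passed because the cloud server does not currently accept one.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

async function main() {
  // Connect to the hosted ScrAPI MCP server over SSE.
  const transport = new SSEClientTransport(new URL("https://api.scrapi.dev/sse"));
  const client = new Client({ name: "scrapi-cloud-test", version: "1.0.0" }, { capabilities: {} });
  await client.connect(transport);

  // Expect scrape_url_html and scrape_url_markdown in the tool list.
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));

  await client.close();
}

main().catch(console.error);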

(Screenshot: MCP Inspector)

Usage with Claude Desktop

Add the following to your claude_desktop_config.json:

Docker

{
  "mcpServers": {
    "scrapi": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e",
        "SCRAPI_API_KEY",
        "deventerprisesoftware/scrapi-mcp"
      ],
      "env": {
        "SCRAPI_API_KEY": "<YOUR_API_KEY>"
      }
    }
  }
}
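
To verify the image and API key before wiring it into Claude Desktop, the same container can be started by hand from a terminal; it will wait for MCP messages on stdin (press Ctrl+C to stop it):

docker run -i --rm -e SCRAPI_API_KEY=<YOUR_API_KEY> deventerprisesoftware/scrapi-mcp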

NPX

{
  "mcpServers": {
    "scrapi": {
      "command": "npx",
      "args": [
        "-y",
        "@deventerprisesoftware/scrapi-mcp"
      ],
      "env": {
        "SCRAPI_API_KEY": "<YOUR_API_KEY>"
      }
    }
  }
}
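
Likewise, the NPX package can be run directly from a terminal for a quick check (environment variable shown inline using macOS/Linux shell syntax):

SCRAPI_API_KEY=<YOUR_API_KEY> npx -y @deventerprisesoftware/scrapi-mcp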

(Screenshot: Claude Desktop)