
LeoAPI REST API Documentation

// Overview

LeoAPI provides REST endpoints for web crawling over HTTP. All endpoints require a valid API Key for authentication.

Base URL: https://leoapi.fun

// API URL Structure

LeoAPI uses a hierarchical URL structure to organize different crawling tools and their features:

/api/{tool}/{target_type}

Examples:
  - /api/crawl4ai/infinite-scroll
  - /api/crawl4ai/standard-fetch
  - /api/crawl  (legacy endpoint, supports all tools)

Recommended: Use tool-specific endpoints for better clarity and type safety.

The generic /api/crawl endpoint is maintained for backward compatibility.

// Crawl4AI: Infinite Scroll

POST /api/crawl4ai/infinite-scroll

Advanced web crawler for pages with infinite scroll functionality. Automatically scrolls and extracts content until no new content is found or max scrolls is reached.

Request Parameters

Parameter     Type     Required  Description
api_key       string   Yes       API Key
url           string   Yes       Target URL to crawl
scroll_mode   string   No        Scroll mode: "auto" (default) or "manual"
max_scrolls   integer  No        Maximum scroll count (default: 20; only used when scroll_mode is "manual")

Scroll Modes

  - "auto" (default): keeps scrolling until no new content is found.
  - "manual": scrolls a fixed number of times, up to max_scrolls.

Request Examples

cURL - Auto Mode

curl -X POST https://leoapi.fun/api/crawl4ai/infinite-scroll \
  -H "Content-Type: application/json" \
  -d '{
    "api_key": "sk-your-api-key-here",
    "url": "https://example.com",
    "scroll_mode": "auto"
  }'

cURL - Manual Mode

curl -X POST https://leoapi.fun/api/crawl4ai/infinite-scroll \
  -H "Content-Type: application/json" \
  -d '{
    "api_key": "sk-your-api-key-here",
    "url": "https://example.com",
    "scroll_mode": "manual",
    "max_scrolls": 10
  }'

Python

import requests

response = requests.post(
    "https://leoapi.fun/api/crawl4ai/infinite-scroll",
    json={
        "api_key": "sk-your-api-key-here",
        "url": "https://example.com",
        "scroll_mode": "auto"  # or "manual" with "max_scrolls": 10
    }
)

result = response.json()
if result["success"]:
    print(f"Download URL: {result['data']['download_url']}")
    print(f"File size: {result['data']['file_size']} bytes")
    print(f"Total scrolls: {result['data']['crawl_stats']['total_scrolls']}")
else:
    print(f"Error: {result['message']}")

JavaScript/Node.js

const response = await fetch('https://leoapi.fun/api/crawl4ai/infinite-scroll', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json'
  },
  body: JSON.stringify({
    api_key: 'sk-your-api-key-here',
    url: 'https://example.com',
    scroll_mode: 'auto'
  })
});

const result = await response.json();
if (result.success) {
  console.log('Download URL:', result.data.download_url);
  console.log('Crawl stats:', result.data.crawl_stats);
}

Success Response

{
  "success": true,
  "task_id": "crawl_20251127_101530",
  "status": "completed",
  "data": {
    "filename": "crawl_20251127_101530.md",
    "download_url": "https://leoapi.fun/download/crawl_20251127_101530.md",
    "file_size": 12345,
    "crawl_stats": {
      "total_scrolls": 10,
      "new_content_chars": 50000,
      "initial_length": 10000,
      "final_length": 60000,
      "duration_seconds": 45.23
    }
  },
  "message": "Crawl completed successfully",
  "timestamp": "2025-11-27T10:15:30Z"
}

Error Response

{
  "success": false,
  "error": "API_KEY_ERROR",
  "message": "Invalid API Key",
  "timestamp": "2025-11-27T10:15:30Z"
}

// Crawl4AI: Standard Fetch

POST /api/crawl4ai/standard-fetch

Status: Coming Soon

Standard page fetching without infinite scroll support. This endpoint is currently under development.

// Legacy Crawl Endpoint

POST /api/crawl

Generic crawling endpoint (maintained for backward compatibility). For new integrations, use tool-specific endpoints above.

Request Parameters

Parameter     Type     Required  Description
api_key       string   Yes       API Key
url           string   Yes       Target URL to crawl
api_tool      string   No        Crawling tool (default: "crawl4ai")
target_type   string   No        Target type (default: "INFINITE SCROLL")
scroll_mode   string   No        Scroll mode: "auto" (default) or "manual"
max_scrolls   integer  No        Maximum scroll count (default: 20)

Request Example

cURL

curl -X POST https://leoapi.fun/api/crawl \
  -H "Content-Type: application/json" \
  -d '{
    "api_key": "sk-your-api-key-here",
    "url": "https://example.com",
    "api_tool": "crawl4ai",
    "target_type": "INFINITE SCROLL",
    "scroll_mode": "manual",
    "max_scrolls": 10
  }'
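Python

A Python equivalent of the cURL request above, following the document's existing `requests` style. This is a sketch: the helper names (`build_legacy_payload`, `crawl_legacy`) are illustrative, and you should substitute your own API key and target URL.

```python
import requests

def build_legacy_payload(api_key, target_url, max_scrolls=10):
    """Build the request body for the legacy /api/crawl endpoint."""
    return {
        "api_key": api_key,
        "url": target_url,
        "api_tool": "crawl4ai",
        "target_type": "INFINITE SCROLL",
        "scroll_mode": "manual",
        "max_scrolls": max_scrolls,
    }

def crawl_legacy(api_key, target_url, max_scrolls=10):
    """POST to the legacy endpoint and return the parsed JSON response."""
    payload = build_legacy_payload(api_key, target_url, max_scrolls)
    response = requests.post("https://leoapi.fun/api/crawl", json=payload)
    return response.json()
```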

Error Codes (Common to All Endpoints)

Error Code        HTTP Status  Description
MISSING_API_KEY   400          Missing API Key
MISSING_URL       400          Missing URL parameter
API_KEY_ERROR     401          Invalid API Key
UNSUPPORTED_TOOL  400          Unsupported tool or type
CRAWL_FAILED      500          Crawl task failed
INTERNAL_ERROR    500          Internal server error
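One way to consume this table is to retry only server-side failures. The sketch below assumes the two 500-level error codes are transient, which is a policy choice on the client's side, not something the API guarantees; adjust the retry set and delay to taste.

```python
import time
import requests

# Error codes the API documents with HTTP status 500; treating them as
# transient (and everything else as permanent) is an assumption.
RETRYABLE = {"CRAWL_FAILED", "INTERNAL_ERROR"}

def is_retryable(error_code):
    """Return True for error codes worth retrying."""
    return error_code in RETRYABLE

def crawl_with_retry(payload, attempts=3, delay=5.0):
    """POST to the infinite-scroll endpoint, retrying transient failures."""
    result = None
    for _ in range(attempts):
        resp = requests.post(
            "https://leoapi.fun/api/crawl4ai/infinite-scroll", json=payload
        )
        result = resp.json()
        if result["success"] or not is_retryable(result.get("error")):
            return result
        time.sleep(delay)
    return result
```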

// n8n Workflow Integration

The LeoAPI response format is optimized for n8n workflow automation.

n8n Node Configuration Examples

1. HTTP Request Node Configuration (Recommended: Use Hierarchical Endpoint)

{
  "method": "POST",
  "url": "https://leoapi.fun/api/crawl4ai/infinite-scroll",
  "authentication": "none",
  "requestFormat": "json",
  "body": {
    "api_key": "={{$credentials.leoapi.api_key}}",
    "url": "={{$json.target_url}}",
    "scroll_mode": "auto"
  }
}

Or using legacy endpoint:

{
  "method": "POST",
  "url": "https://leoapi.fun/api/crawl",
  "authentication": "none",
  "requestFormat": "json",
  "body": {
    "api_key": "={{$credentials.leoapi.api_key}}",
    "url": "={{$json.target_url}}",
    "api_tool": "crawl4ai",
    "target_type": "INFINITE SCROLL",
    "scroll_mode": "auto"
  }
}

2. Conditional Node

Check if crawl was successful:

Use expression: {{$json.success}}

If true, continue processing; if false, trigger error handling.

3. Extract Data

// Get download URL
{{$json.data.download_url}}

// Get filename
{{$json.data.filename}}

// Get file size
{{$json.data.file_size}}

// Get crawl statistics
{{$json.data.crawl_stats.total_scrolls}}
{{$json.data.crawl_stats.new_content_chars}}
{{$json.data.crawl_stats.duration_seconds}}

4. Download File Node

{
  "method": "GET",
  "url": "={{$json.data.download_url}}",
  "responseFormat": "file"
}

Complete n8n Workflow Example

{
  "nodes": [
    {
      "name": "Trigger",
      "type": "n8n-nodes-base.webhook",
      "parameters": {
        "path": "crawl-trigger"
      }
    },
    {
      "name": "Call LeoAPI",
      "type": "n8n-nodes-base.httpRequest",
      "parameters": {
        "method": "POST",
        "url": "https://leoapi.fun/api/crawl4ai/infinite-scroll",
        "jsonParameters": true,
        "options": {},
        "bodyParametersJson": "={{ JSON.stringify({\n  api_key: $credentials.leoapi.api_key,\n  url: $json.url,\n  scroll_mode: 'auto'\n}) }}"
      }
    },
    {
      "name": "Check Result",
      "type": "n8n-nodes-base.if",
      "parameters": {
        "conditions": {
          "boolean": [
            {
              "value1": "={{$json.success}}",
              "value2": true
            }
          ]
        }
      }
    },
    {
      "name": "Download File",
      "type": "n8n-nodes-base.httpRequest",
      "parameters": {
        "method": "GET",
        "url": "={{$json.data.download_url}}",
        "responseFormat": "file"
      }
    },
    {
      "name": "Save to Cloud",
      "type": "n8n-nodes-base.googleDrive",
      "parameters": {
        "operation": "upload",
        "name": "={{$json.data.filename}}"
      }
    }
  ]
}

// Other Endpoints

Validate API Key

POST /api/validate-key

Check whether an API Key is valid.

// Request
{
  "apiKey": "sk-your-api-key-here"
}

// Response
{
  "valid": true
}
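A small Python wrapper for this endpoint (a sketch; the function name is illustrative). Note that the request field is camelCase ("apiKey"), unlike the snake_case "api_key" used by the crawl endpoints.

```python
import requests

def validate_key(api_key):
    """POST to /api/validate-key and return True if the key is valid."""
    resp = requests.post(
        "https://leoapi.fun/api/validate-key", json={"apiKey": api_key}
    )
    return resp.json().get("valid", False)
```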

Get Available Tools

GET /api/tools

Get list of all available crawling tools.

// Response
[
  {
    "name": "Crawl4AI",
    "value": "crawl4ai"
  },
  {
    "name": "Standard API",
    "value": "standard"
  }
]
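The "value" field of each entry is what the legacy endpoint expects in its api_tool parameter. A minimal sketch for fetching the list and extracting those values (helper names are illustrative):

```python
import requests

def tool_values(tools):
    """Extract the machine-readable "value" field from a tools list."""
    return [tool["value"] for tool in tools]

def fetch_tool_values(base_url="https://leoapi.fun"):
    """GET /api/tools and return the list of tool values."""
    return tool_values(requests.get(f"{base_url}/api/tools").json())
```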

Download File

GET /download/{filename}

Download the Markdown file generated by a crawl task.
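Since the crawl response already contains a full download_url, the file can be streamed straight to disk. A minimal sketch (the helper name is illustrative):

```python
import requests

def download_result(download_url, dest_path):
    """Stream the generated Markdown file from download_url to dest_path."""
    with requests.get(download_url, stream=True) as resp:
        resp.raise_for_status()
        with open(dest_path, "wb") as fh:
            for chunk in resp.iter_content(chunk_size=8192):
                fh.write(chunk)
    return dest_path
```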

// Rate Limits & Notes

// Contact & Support

For questions or suggestions, please contact us: