Overview

Maxun robots are automated tools that help you collect data from websites without writing any code. Think of them as your personal web assistants that can navigate websites, extract information, and organize data just like you would manually - but faster and more efficiently.

There are two types of robots, each designed for a different job.

1. Extract Robots

Extract robots emulate real user behavior and capture structured data at scale.

  • Built for automation and structured data
  • Point-and-click interface
  • Extract from any website, including behind logins
  • Record user actions (clicks, scrolls, form fills, pagination, etc.)
  • Convert sites into APIs, spreadsheets, and workflows
  • Scale extractions and run on schedules or via API
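
Because extract robots can be triggered over HTTP, a scheduled job or an external service can kick off runs and pick up the results. The sketch below shows the general shape of such a call in TypeScript; the base URL, endpoint path, auth header, and response shape are assumptions for illustration, not Maxun's documented API, so check your instance's API reference for the real routes.

```ts
// Hypothetical sketch of triggering a robot run over HTTP (Node 18+, built-in fetch).
// The endpoint path, auth scheme, and response shape are assumptions -- consult the
// API reference of your Maxun instance for the actual contract.
const MAXUN_URL = "http://localhost:8080";        // assumed self-hosted instance
const API_KEY = process.env.MAXUN_API_KEY ?? "";  // assumed API-key auth

async function runRobot(robotId: string): Promise<unknown> {
  const res = await fetch(`${MAXUN_URL}/api/robots/${robotId}/runs`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${API_KEY}`,
    },
  });
  if (!res.ok) throw new Error(`Run request failed with status ${res.status}`);
  return res.json(); // extracted rows, or a run id to poll, depending on the API
}

runRobot("my-robot-id").then((data) => console.log(data));
```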

2. Scrape Robots

Scrape robots are built for clean content.

  • Get clean HTML and LLM-ready Markdown
  • No scripts, styling, ads, or clutter
  • Ideal for AI workflows, RAG, summarization, embeddings, and content pipelines
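
Because the output is already plain Markdown, it drops straight into typical AI pipelines. The sketch below fetches a scrape robot's Markdown (the endpoint is an assumption for illustration) and splits it into paragraph-aligned chunks, a common preprocessing step before computing embeddings for RAG.

```ts
// Hypothetical sketch: preparing a scrape robot's Markdown output for embeddings.
// The fetchMarkdown() endpoint is an assumption; only the chunking step is generic.
async function fetchMarkdown(robotId: string): Promise<string> {
  // Assumed route -- replace with your instance's actual API endpoint.
  const res = await fetch(`http://localhost:8080/api/robots/${robotId}/markdown`);
  return res.text();
}

// Split Markdown into roughly fixed-size chunks along paragraph boundaries.
function chunkMarkdown(markdown: string, maxChars = 1000): string[] {
  const paragraphs = markdown.split(/\n{2,}/);
  const chunks: string[] = [];
  let current = "";
  for (const p of paragraphs) {
    if (current && current.length + p.length > maxChars) {
      chunks.push(current.trim());
      current = "";
    }
    current += p + "\n\n";
  }
  if (current.trim()) chunks.push(current.trim());
  return chunks;
}

fetchMarkdown("my-scrape-robot").then((md) => {
  const chunks = chunkMarkdown(md);
  console.log(`Prepared ${chunks.length} chunks for embedding`);
});
```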

What Can Robots Do?

  • ✨ Open a webpage
  • ✨ Log in
  • ✨ Click on buttons
  • ✨ Fill out a form
  • ✨ Select from dropdown menus, radio buttons, checkboxes, date pickers, time pickers, and more
  • ✨ Take screenshots
  • ✨ Gather web data without writing a single line of code - just point, click, and collect
  • ✨ Handle infinite scrolling and pagination
  • ✨ Auto-adapt to website layout & structural changes
  • ✨ Run on a specific schedule
  • ✨ Run via APIs for third-party integrations
  • ✨ Extract data behind logins
  • ✨ Integrate with your favorite applications: n8n, Google Sheets, Airtable, and more
  • ✨ Send data to webhooks (see the receiver sketch at the end of this page)
  • ✨ Get clean HTML from websites
  • ✨ Turn websites into LLM-ready Markdown for AI applications
  • ✨ Talk to your LLM with MCP (Model Context Protocol)

... and much more!
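
As a concrete example of the webhook integration listed above, here is a minimal receiver sketch using Node's built-in http module. It assumes the robot delivers its results as a JSON POST body; inspect an actual delivery from your instance to see the real payload fields.

```ts
// Minimal webhook receiver sketch (Node built-in http module, no frameworks).
// The payload shape is an assumption -- log a real delivery to see the actual fields.
import { createServer } from "node:http";

const server = createServer((req, res) => {
  if (req.method !== "POST" || req.url !== "/maxun-webhook") {
    res.writeHead(404).end();
    return;
  }
  let body = "";
  req.on("data", (chunk) => (body += chunk));
  req.on("end", () => {
    const payload = JSON.parse(body); // assumed: JSON body containing the extracted data
    console.log("Received robot data:", payload);
    res.writeHead(200, { "Content-Type": "application/json" });
    res.end(JSON.stringify({ ok: true }));
  });
});

server.listen(3000, () => console.log("Listening for webhook deliveries on :3000"));
```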