Getting started
This guide walks you through installing and running Dead Simple Search on your own machine or server. You'll need Python 3.13 and a MySQL 8+ database.
What you'll need
Before we begin, make sure you have these installed:
- Python 3.13 — the programming language Dead Simple Search is written in. You can download it from python.org.
- MySQL 8+ — the database that stores your crawled pages and search index. MySQL Community Edition is free.
- Git — to download the source code. Most systems have it pre-installed.
What about older Python versions?
Dead Simple Search is developed and tested with Python 3.13. Older versions may work but aren't officially supported.
Step 1: Get the code
Clone the repository from Codeberg:
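For example (the repository path below is a placeholder; substitute the project's actual Codeberg URL):

```shell
# Replace <owner> with the actual account that hosts the repository
git clone https://codeberg.org/<owner>/deadsimplesearch.git
cd deadsimplesearch
```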
Step 2: Set up Python
We recommend using uv for managing your Python environment. It's fast and simple:
# Install uv if you don't have it yet
# See https://docs.astral.sh/uv/ for other installation methods
curl -LsSf https://astral.sh/uv/install.sh | sh
# Create a virtual environment and install dependencies
uv venv --python 3.13
source .venv/bin/activate
uv pip install -r requirements.txt
Alternatively, you can use Python's built-in venv:
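A minimal equivalent using the standard library's venv module looks like this:

```shell
# Create a virtual environment with the stdlib venv module
python3.13 -m venv .venv

# Activate it and install the project's dependencies
source .venv/bin/activate
pip install -r requirements.txt
```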
Step 3: Set up MySQL
Create the database and a user for Dead Simple Search:
CREATE DATABASE IF NOT EXISTS deadsimplesearch
CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;
CREATE USER IF NOT EXISTS 'deadsimplesearch'@'localhost'
IDENTIFIED BY 'choose-a-strong-password';
GRANT ALL PRIVILEGES ON deadsimplesearch.*
TO 'deadsimplesearch'@'localhost';
FLUSH PRIVILEGES;
You can run this in the MySQL command-line client:
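For example, if you save the statements above to a file (the filename here is just an illustration), you can feed it to the client as the MySQL root user; you'll be prompted for the root password:

```shell
# Run the setup statements as root (setup.sql is whatever name you saved them under)
mysql -u root -p < setup.sql
```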
You don't need to create any tables yourself: the application creates them automatically on first run.
Step 4: Configure
Dead Simple Search uses environment variables for configuration. The most important one is your MySQL password:
See the configuration reference for all available settings.
Step 5: Run
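With your virtual environment active and configuration exported, start the server. The entry point below is an assumption; check the repository for the actual command:

```shell
# Assumed entry point -- the actual script or module name may differ
python app.py
```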
The server starts on http://localhost:5555. You're up and running!
Step 6: Add your first site
Register a website to crawl:
curl -X POST http://localhost:5555/api/sites \
-H "Content-Type: application/json" \
-d '{"domain": "yourwebsite.com", "start_url": "https://yourwebsite.com/"}'
Then trigger a crawl:
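The crawl endpoint isn't shown in this guide; a plausible shape, mirroring the site-registration call above (the path is an assumption, so check the API reference):

```shell
# Hypothetical endpoint -- verify the real path in the API reference
curl -X POST http://localhost:5555/api/sites/yourwebsite.com/crawl
```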
The crawler runs in the background. You can check its progress:
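Again, the status endpoint here is an assumed path for illustration:

```shell
# Hypothetical endpoint -- verify the real path in the API reference
curl http://localhost:5555/api/sites/yourwebsite.com
```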
Once the crawl finishes, search your site:
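A search request might look like the following; the endpoint and query parameter name are assumptions, and note that spaces in the query must be URL-encoded (`+` or `%20`):

```shell
# Hypothetical search endpoint and "q" parameter -- check the API reference
curl 'http://localhost:5555/api/search?q=hello+world'
```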
That's it! You now have a working search engine for your website.
Next steps
- Read the API reference for the full list of endpoints
- Check the configuration guide to fine-tune crawling and search behavior
- Learn how it works under the hood