A lightning-fast pseudo Web Search Engine API written in Rust — also works as an MCP server for AI agents. This project emulates popular APIs like SerpAPI or Tavily without needing expensive official API keys, by multiplexing requests directly to popular engines and scraping the results concurrently.
- **Performance:** Powered by `tokio` for async concurrent I/O.
- **Bot Bypass:** Leverages `rquest` with TLS impersonation (e.g., simulating a Chrome 124 browser fingerprint at the TLS/HTTP2 layer) to minimize blocking compared with standard HTTP clients (the Rust equivalent of `curl_cffi`).
- **Standardized:** Normalizes DuckDuckGo, Yahoo, and Brave HTML results into a unified `SearchResult` JSON array.
- DuckDuckGo (Primary standard search)
- Yahoo (Powered by Bing)
- Brave Search (Independent index)
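The normalization described above can be sketched in plain Rust. This is an illustrative sketch only: the field names are assumptions (the real definitions live in `models.rs`, which likely derives serde's `Serialize` for JSON output):

```rust
// Hypothetical shape of one normalized result (field names assumed;
// see models.rs for the real SearchResultItem definition).
#[derive(Debug, Clone, PartialEq)]
pub struct SearchResultItem {
    pub title: String,
    pub url: String,
    pub snippet: String,
    pub engine: String, // which backend produced this hit
}

// Each engine scraper maps its raw HTML hits into the shared shape,
// so callers see one schema regardless of which backend answered.
fn normalize(engine: &str, title: &str, url: &str, snippet: &str) -> SearchResultItem {
    SearchResultItem {
        title: title.trim().to_string(),
        url: url.to_string(),
        snippet: snippet.trim().to_string(),
        engine: engine.to_string(),
    }
}

fn main() {
    let item = normalize("duckduckgo", "  Hello World  ", "https://example.com", "greeting page");
    assert_eq!(item.title, "Hello World");
    println!("{:?}", item);
}
```

Because every scraper funnels into the same struct, adding a new engine only requires writing the HTML-to-struct mapping.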
src/
├── main.rs # Entry point: HTTP server or MCP mode (--mcp)
├── search.rs # Shared concurrent search logic
├── mcp.rs # MCP stdio server (JSON-RPC 2.0)
├── models.rs # SearchResultItem, SearchResponse structs
└── engines/
├── mod.rs # SearchEngine enum + trait dispatch
├── duckduckgo.rs # DuckDuckGo scraper
├── yahoo.rs # Yahoo scraper (Bing-powered)
└── brave.rs # Brave Search scraper
examples/
├── fetch_html.rs # Download raw HTML for offline debugging
└── test_parser.rs # Offline CSS selector iteration
.gemini/ # Gemini CLI agent config
├── GEMINI.md # Project-level system prompt
├── settings.json # MCP server configuration
└── skills/ # Project-level agent skills
├── sosearch-engine-dev/ # Scraper development workflow
└── sosearch-api-ops/ # API operations & deployment
.agents/ # Generic agent config (compatible with multiple AI tools)
└── skills/ # Same skills, alternative discovery path
├── sosearch-engine-dev/
└── sosearch-api-ops/
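The `engines/mod.rs` layer above can be pictured as an enum whose variants dispatch to per-engine scrapers behind a trait object. This is an illustrative sketch only: the trait and method names are assumptions, and the real scrapers fetch and parse HTML rather than just building URLs:

```rust
// Hypothetical sketch of the enum + trait dispatch in engines/mod.rs.
trait Scraper {
    fn name(&self) -> &'static str;
    fn search_url(&self, query: &str) -> String;
}

enum SearchEngine {
    DuckDuckGo,
    Yahoo,
    Brave,
}

impl SearchEngine {
    // Dispatch to the concrete scraper behind a trait object.
    fn scraper(&self) -> Box<dyn Scraper> {
        match self {
            SearchEngine::DuckDuckGo => Box::new(DuckDuckGo),
            SearchEngine::Yahoo => Box::new(Yahoo),
            SearchEngine::Brave => Box::new(Brave),
        }
    }
}

struct DuckDuckGo;
impl Scraper for DuckDuckGo {
    fn name(&self) -> &'static str { "duckduckgo" }
    fn search_url(&self, q: &str) -> String {
        format!("https://html.duckduckgo.com/html/?q={q}")
    }
}

struct Yahoo;
impl Scraper for Yahoo {
    fn name(&self) -> &'static str { "yahoo" }
    fn search_url(&self, q: &str) -> String {
        format!("https://search.yahoo.com/search?p={q}")
    }
}

struct Brave;
impl Scraper for Brave {
    fn name(&self) -> &'static str { "brave" }
    fn search_url(&self, q: &str) -> String {
        format!("https://search.brave.com/search?q={q}")
    }
}

fn main() {
    let s = SearchEngine::Brave.scraper();
    println!("{} -> {}", s.name(), s.search_url("rust"));
}
```

The enum gives the HTTP and MCP layers one type to route on, while each engine's parsing quirks stay isolated in its own file.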
This project includes built-in AI agent support for Gemini CLI and for other tools that follow the `.agents/` convention.
| Skill | Description |
|---|---|
| `sosearch-engine-dev` | Full workflow for adding/debugging search engine scrapers: fetch HTML → test selectors offline → decode URLs → integrate |
| `sosearch-api-ops` | Operations guide: build, run, test, deploy (local + Docker), troubleshoot |
Configured in `.gemini/settings.json`:

| Server | Package | Purpose |
|---|---|---|
| `filesystem` | `@modelcontextprotocol/server-filesystem` | Scoped file access to the project directory |
```bash
cd /path/to/SoSearch
gemini
# Skills are auto-discovered. Ask: "How do I add a new search engine?"
```

Run SoSearch as an MCP server for AI agents (Claude, Gemini, Cursor, etc.):
```bash
./SoSearch --mcp
```

Claude Desktop (`claude_desktop_config.json`):

```json
{
  "mcpServers": {
    "sosearch": {
      "command": "/path/to/SoSearch",
      "args": ["--mcp"]
    }
  }
}
```

Gemini CLI (`.gemini/settings.json`):
```json
{
  "mcpServers": {
    "sosearch": {
      "command": "/path/to/SoSearch",
      "args": ["--mcp"]
    }
  }
}
```

This exposes a `web_search` tool that AI agents can call to search the web.
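Under the hood, a client invokes the tool with a standard MCP `tools/call` JSON-RPC request over stdio. A minimal sketch of building that message in Rust (the `query` argument name is an assumption about SoSearch's tool schema):

```rust
// Hypothetical example of the JSON-RPC 2.0 message an MCP client writes to
// SoSearch's stdin to invoke the web_search tool.
fn web_search_request(id: u64, query: &str) -> String {
    format!(
        concat!(
            r#"{{"jsonrpc":"2.0","id":{},"method":"tools/call","#,
            r#""params":{{"name":"web_search","arguments":{{"query":"{}"}}}}}}"#
        ),
        id, query
    )
}

fn main() {
    // Each request is sent as a single line over the server's stdin.
    println!("{}", web_search_request(1, "rust async runtime"));
}
```

A real client would first perform the MCP `initialize` handshake and then read the tool's result back from the server's stdout.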
Claude Desktop (`%APPDATA%\Claude\claude_desktop_config.json`):

```json
{
  "mcpServers": {
    "sosearch": {
      "command": "C:\\path\\to\\SoSearch.exe",
      "args": ["--mcp"]
    }
  }
}
```

```bash
# Download a pre-built binary or build from source
cargo run --release
curl "http://localhost:10080/search?q=hello+world"
```

Using the pre-built binary:
Download `SoSearch-windows-amd64.zip` from GitHub Releases, extract it, then:

```powershell
# Start the HTTP service
.\SoSearch.exe

# Test from another terminal
Invoke-RestMethod "http://localhost:10080/search?q=hello+world" | ConvertTo-Json

# Or use curl
curl.exe "http://localhost:10080/search?q=hello+world"
```

Build from source (requires Rust, CMake, NASM, and LLVM/Clang):
```powershell
# Install dependencies (using Chocolatey)
choco install cmake nasm llvm -y

# Build and run
cargo run --release
```

MCP mode:

```powershell
.\SoSearch.exe --mcp
```

Refer to QUICK_START.md for full instructions.
CC BY-NC 4.0: free to use, modify, and distribute for non-commercial purposes.
First published in the LINUX DO community. Stars ⭐ and PRs welcome!