
Downloads websites recursively using wget, converting links for local use. Requires `wget` to be installed and lets you specify the download depth and output path.
This server is moderately safe if used responsibly. The risk comes from the underlying `wget` command and the potential for writing to the file system. Ensure the URL is trusted and the output path is carefully managed to mitigate risks. Limiting the download depth is also recommended.
Performance depends on the size and complexity of the website being downloaded, as well as the network connection speed. Large websites may take a significant amount of time to download.
The primary cost is disk space for storing downloaded websites. Network bandwidth usage may also incur costs depending on the hosting environment.
Install wget (on macOS, via Homebrew):

```shell
brew install wget
```

Add the server to your MCP configuration:

```json
{
  "mcpServers": {
    "website-downloader": {
      "command": "node",
      "args": ["/path/to/website-downloader/build/index.js"]
    }
  }
}
```

`download_website` — Downloads a website to a specified output path using wget.
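A call to `download_website` might look like the following sketch. The argument names `url`, `outputPath`, and `depth` are inferred from the parameter descriptions on this page; the exact schema may differ.

```json
{
  "name": "download_website",
  "arguments": {
    "url": "https://example.com",
    "outputPath": "./example-site",
    "depth": 1
  }
}
```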
Writes files to the file system and executes an external command (wget).
None
The server operates with the permissions of the user running the MCP server. Ensure appropriate permissions are set to prevent unauthorized access or modification of the file system.
Production Tip
Monitor disk space usage to prevent filling up the file system with large website downloads. Implement rate limiting to avoid overloading the server.
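A minimal disk-usage check along these lines can be run before and after large downloads; the `./downloaded-sites` path is illustrative, not part of the server.

```shell
# Create the (illustrative) download directory if it does not exist yet.
mkdir -p ./downloaded-sites

# Total size of downloaded content so far.
du -sh ./downloaded-sites

# Free space remaining on the filesystem containing the current directory.
df -h .
```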
Use the `depth` parameter in the `download_website` tool. Set it to 0 for only the specified page, 1 for direct links, and so on.
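The `depth` parameter presumably maps onto wget's recursion options. The flag set below is an assumption based on the tool description, not taken from the server's source; note that wget's own `--level=0` means *infinite* depth, so the tool's `depth=0` ("only the specified page") most likely corresponds to a non-recursive fetch.

```shell
# Approximate underlying wget invocation for depth=1 (assumed flag set).
depth=1
cmd="wget --recursive --level=$depth --convert-links --page-requisites https://example.com/"
printf '%s\n' "$cmd"
```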
By default, files are stored in the current directory. You can specify a different directory using the `outputPath` parameter.
The files will be downloaded to the current working directory of the MCP server process.
The README provides instructions for installing wget on macOS, Linux, and Windows. Use the appropriate package manager or download the binary.
No, this tool downloads the entire website recursively. You can limit the depth, but not specific sections.
wget will attempt to follow all links; broken links produce errors, and the downloaded website may have incomplete content as a result.
The current implementation does not support resuming failed downloads. You would need to restart the download from the beginning.