
cURL by Example: Limit Download Speed

Throttle download bandwidth. Simulate slow connections.

Code

# Limit download to 200 kilobytes per second
curl --limit-rate 200k -O https://example.com/large-file.zip

# Limit to 1000 bytes per second (very slow)
curl --limit-rate 1000 https://example.com

Explanation

Just as with uploads, the --limit-rate flag controls download speed with the same syntax and behavior. Download throttling is particularly useful when testing how an application handles slow network conditions: verifying timeout handling, implementing accurate progress bars, exercising streaming buffer behavior, and confirming that large file downloads don't degrade the user experience. By artificially constraining download speed during development and testing, you can catch issues that would otherwise only surface for users on slow connections before deploying to production.
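
To confirm the throttle is actually in effect, cURL's --write-out variables can report the measured average speed. A minimal sketch, using a placeholder URL:

# Download to /dev/null and report the measured average speed
curl --limit-rate 200k -o /dev/null -s \
  -w 'average speed: %{speed_download} bytes/sec\n' \
  https://example.com/large-file.zip

The reported average should hover near 204800 bytes/sec (200k), allowing for the short spikes above the limit discussed below.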

Testing Network Resilience: Different download speeds reveal different application behaviors. At --limit-rate 1000 (1000 bytes/second, extremely slow), you can test extreme patience scenarios and ensure your application doesn't time out prematurely. At --limit-rate 100k (typical slow mobile data), test realistic poor-connection scenarios. At --limit-rate 1M (moderate broadband), verify expected performance on average connections. Systematically testing across this range ensures your application provides good UX across the full spectrum of real-world connectivity, which is especially critical for mobile apps, progressive web apps, and any application serving global audiences with varying infrastructure quality.
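
One way to cover that range systematically is a small shell loop that sweeps the rates and records total transfer time. A sketch, again with a placeholder URL:

# Sweep three representative rates and compare total transfer times
for rate in 1000 100k 1M; do
  curl --limit-rate "$rate" -o /dev/null -s \
    -w "rate=$rate  time=%{time_total}s  speed=%{speed_download} B/s\n" \
    https://example.com/large-file.zip
done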

Being a Good Network Citizen: When downloading large files from shared public servers (open-source mirrors, scientific datasets, public APIs), using --limit-rate demonstrates good internet citizenship by not monopolizing bandwidth that others may need. Some servers actively monitor for aggressive downloaders and may throttle or ban clients that consume too much bandwidth. A self-imposed limit like --limit-rate 500k ensures you get the file while leaving capacity for others. This is particularly important for community-run infrastructure like Linux package repositories or academic paper archives that operate on limited budgets.
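
For long polite downloads, it also helps to make the transfer resumable, so an interruption doesn't force a full re-fetch from the shared server. A sketch with a placeholder URL; -C - tells cURL to continue from where a previous partial download left off:

# Cap bandwidth at 500 KB/s and resume a partial download if one exists
curl --limit-rate 500k -C - -O https://example.com/dataset.tar.gz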

Understanding Rate Limiting Mechanisms: It's crucial to understand that cURL's rate limiting is an average over time, not a hard instantaneous ceiling. Short spikes above the limit are normal and expected, followed by periods of reduced speed that bring the average back down; cURL enforces the limit by strategically pausing and resuming the transfer. The default unit is bytes per second when no suffix is provided (--limit-rate 102400 = 100 KB/s). For very precise rate control, you might need to combine this with other techniques or use dedicated traffic-shaping tools at the OS level (tc on Linux, pfctl on macOS). For downloads where the file size is predictable, you can calculate the required timeout: a 100 MB file at 200 KB/s takes roughly 500 seconds, so set --max-time accordingly with some buffer: curl --limit-rate 200k --max-time 600 -O file.zip.
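
That timeout arithmetic can also be scripted so the budget tracks the chosen rate. A minimal sketch, assuming the file size is known in advance and using a placeholder URL (all values are illustrative):

# 100 MB at 200 KB/s takes about 512 seconds; add a buffer before giving up
size_kb=$((100 * 1024))              # expected file size in KB
rate_kb=200                          # limit in KB/s
budget=$(( size_kb / rate_kb + 90 )) # ~512 s expected + ~90 s buffer
curl --limit-rate "${rate_kb}k" --max-time "$budget" -O https://example.com/large-file.zip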

Code Breakdown

Line 2: --limit-rate 200k restricts bandwidth consumption.
Line 5: The default unit is bytes per second if no suffix is provided.
All lines: cURL pauses and resumes reading to maintain the average speed.