I run a single t3a.medium EC2 instance as a personal development environment.
A Caddy web server on the instance serves small static sites (total content: 88 KB).
On March 19, 2026, after a server reboot, Caddy started with the `file_server browse`
directive enabled, which exposes directory listings as HTML pages. Automated bots discovered these listings
and began scraping them continuously, generating ~440 GB/day of outbound data transfer.
The actual content being served was only 88 KB; the bots were recursively requesting directory-listing pages, not transferring real data. This sustained traffic ran for 14 days before I identified and resolved it on April 2, 2026.
| Item | Monthly cost |
|---|---|
| t3a.medium compute | $27.00 |
| 80 GB gp3 EBS | $6.40 |
| Public IPv4 | $3.60 |
| VPC, SSM, misc | $1.50 |
| Service | Charge |
|---|---|
| EC2 (incl. data transfer) | $508.22 |
| Tax | $110.08 |
| EC2 Other (EBS, IPs) | $14.40 |
| VPC | $1.60 |
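As a consistency check, the billed line items add up to the March total that appears in the period breakdown below; a quick sketch:

```python
# March line items from the bill above
line_items = {
    "EC2 (incl. data transfer)": 508.22,
    "Tax": 110.08,
    "EC2 Other (EBS, IPs)": 14.40,
    "VPC": 1.60,
}
total = sum(line_items.values())
print(f"{total:.2f}")  # 634.30, the full March bill
```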
Each bar represents one day. Data transfer is billed at $0.09/GB; the normal daily data-transfer cost is ~$0.
| Period | Data Transfer Cost | Volume | Total Bill |
|---|---|---|---|
| March 19–31 (13 days) | ~$475 | ~5,280 GB | $634.30 |
| April 1–2 (2 days) | $40.85 | 553.87 GB | $55.48 |
| April 3+ (post-fix) | $0.00 | 0.05 GB | $1.08/day |
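The April figure lines up with the $0.09/GB rate once AWS's 100 GB/month free data-transfer-out allowance is taken into account (the free-tier allowance is my assumption; the bill itself only shows totals):

```python
RATE_PER_GB = 0.09   # EU data transfer out, per the chart note above
FREE_TIER_GB = 100   # AWS free tier: first 100 GB/month of egress (assumption)

april_gb = 553.87    # Apr 1-2 volume from the table above
cost = (april_gb - FREE_TIER_GB) * RATE_PER_GB
print(f"{cost:.2f}")  # 40.85, matching the April line item
```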
The `sites/` directory on this instance is 88 KB. The 5,800+ GB transferred was bots repeatedly requesting auto-generated directory listing HTML pages, not actual content.

| Time | txkB/s | Notes |
|---|---|---|
| 06:10:06 | 5407.80 | Sustained bot traffic |
| 06:20:06 | 4247.67 | Starting to drop (Caddy reload in progress) |
| 06:30:02 | 15.27 | Fix applied at 06:24; traffic collapses |
| 06:40:06 | 2.74 | Normal idle |
| ... | ... | |
| 23:50:02 | 0.70 | Remained idle |
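As a sanity check, the sustained transmit rate in the first row of the table above extrapolates to roughly the observed daily volume (assuming the txkB/s column means KiB sent per second):

```python
rate_kib_s = 5407.80         # sustained txkB/s from the table above
seconds_per_day = 86_400
gib_per_day = rate_kib_s * seconds_per_day / 1024**2
print(f"{gib_per_day:.0f}")  # ~446 GiB/day, consistent with the ~440 GB/day billed
```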
Daily cost by usage type during the incident:

```
$30.66 (440.67 GB)  EU-DataTransfer-Out-Bytes   97% of cost
$ 0.97 ( 19.33 hr)  EU-CPUCredits:t3a
$ 0.96 ( 23.57 hr)  EU-BoxUsage:t3a.medium
$ 0.23 (  2.67 GB)  EU-EBS:VolumeUsage.gp3
$ 0.12 ( 24.00 hr)  EU-PublicIPv4:InUseAddress
```

And after the fix:

```
$ 0.80 (19.69 hr)   EU-BoxUsage:t3a.medium
$ 0.17 ( 1.89 GB)   EU-EBS:VolumeUsage.gp3
$ 0.10 (21.00 hr)   EU-PublicIPv4:InUseAddress
$ 0.00 ( 0.05 GB)   EU-DataTransfer-Out-Bytes   back to zero
```
The Caddy web server configuration included the `file_server browse` directive on two virtual hosts,
which auto-generates an HTML directory listing for any directory path that lacks an index file. Internet bots
discovered these endpoints and scraped them continuously, generating massive outbound traffic from the auto-generated HTML.
```
dev.liztem.com {
    root * /home/ubuntu/projects/personal-os/sites
    file_server browse   # generates directory listings
}
```
```shell
sudo sed -i 's/file_server browse/file_server/' /etc/caddy/Caddyfile
sudo systemctl reload caddy
```
Removing `browse` disables directory-listing generation. The web server now returns 404 for
directories without an explicit `index.html` file. Traffic dropped to near-zero immediately.
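After the sed edit and reload, the vhost block shown earlier reduces to:

```
dev.liztem.com {
    root * /home/ubuntu/projects/personal-os/sites
    file_server
}
```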
| Item | Amount | Notes |
|---|---|---|
| March 2026 anomalous data transfer | ~$475 | EU-DataTransfer-Out-Bytes, Mar 19–31 |
| March 2026 associated tax on data transfer | ~$100 | Pro-rated from $110.08 total tax |
| April 2026 anomalous data transfer | $40.85 | EU-DataTransfer-Out-Bytes, Apr 1–2 |
| April 2026 associated tax | ~$8 | Pro-rated |
| CPU credit overage (serving bot traffic) | ~$15 | EU-CPUCredits:t3a, above baseline |
| Total requested credit | ~$639 | Total bill ($689) minus expected normal cost (~$50 for the period) |
This is my first time requesting a billing adjustment. The issue was caused by a web server misconfiguration on my end, but the resulting charges were entirely from automated bot traffic, not legitimate data transfer. The total static content on the server is 88 KB — the 5,800 GB transferred represents bots requesting auto-generated directory listing pages millions of times.
I have resolved the issue, confirmed that data transfer has returned to normal levels (~0 GB/day), and am implementing monitoring to prevent this from happening again.
I appreciate any credit AWS is able to provide. Thank you for your time.
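One way to implement the monitoring mentioned above (a sketch; the instance ID and SNS topic ARN are placeholders) is a CloudWatch alarm on the instance's NetworkOut metric, which at ~18 GB/hour of bot traffic would have fired within the first hour of this incident:

```shell
# Alert if the instance sends more than 5 GB in any hour
# (normal traffic on this instance is effectively zero)
aws cloudwatch put-metric-alarm \
  --alarm-name dev-instance-egress-spike \
  --namespace AWS/EC2 \
  --metric-name NetworkOut \
  --dimensions Name=InstanceId,Value=i-0123456789abcdef0 \
  --statistic Sum \
  --period 3600 \
  --evaluation-periods 1 \
  --threshold 5000000000 \
  --comparison-operator GreaterThanThreshold \
  --alarm-actions arn:aws:sns:eu-west-1:123456789012:billing-alerts
```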