
Getting around the Amazon S3 rewrite rule limit


I mentioned a few weeks ago that, whilst moving my blog to S3, I hit a hard limit of 50 redirect rules, which meant the move would break some of my URLs.

Back then, I was using the XML-based redirect rules, and there is no way past that limit.

But there is another way to set up redirects, one without any limit: upload an object with the x-amz-website-redirect-location property set. When someone browses to that object, S3 redirects to the value of that property, which can be a relative path or an absolute URL.
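In boto3, this property is set via the WebsiteRedirectLocation parameter of put_object. Here is a minimal sketch of uploading a batch of redirect objects this way; the bucket name and the redirect mapping are placeholders, not taken from my actual setup.

```python
# Sketch: create one zero-byte object per old path, carrying only the
# x-amz-website-redirect-location metadata that S3 website hosting
# serves as a 301 redirect. Bucket and paths are hypothetical.

BUCKET = "example.com"  # placeholder website bucket name

# old object key -> new location (relative path or absolute URL)
REDIRECTS = {
    "old/post-1/index.html": "/new/post-1/",
    "old/post-2/index.html": "https://example.com/new/post-2/",
}


def upload_redirects(bucket, redirects):
    """Upload an empty object per old path with a redirect target."""
    import boto3  # imported lazily so the mapping can be inspected offline

    s3 = boto3.client("s3")
    for key, target in redirects.items():
        s3.put_object(
            Bucket=bucket,
            Key=key,
            Body=b"",  # body is irrelevant; the redirect takes effect first
            WebsiteRedirectLocation=target,  # becomes x-amz-website-redirect-location
        )


# upload_redirects(BUCKET, REDIRECTS)  # would perform the uploads
```

Because each redirect is just object metadata rather than a routing rule, there is no 50-entry ceiling.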

I wrote a couple of scripts to set up a few hundred redirects like this and put them on GitHub alongside my automated tests, which now all pass.
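A test for redirects like these just needs to request each old URL without following the redirect and check the Location header. The sketch below is a hypothetical version of such a check, not my actual test suite, using only the standard library:

```python
# Hypothetical redirect check: fetch a URL, refuse to follow redirects,
# and return the Location header of a 301/302 response.
import urllib.error
import urllib.request


class _NoRedirect(urllib.request.HTTPRedirectHandler):
    # Returning None tells urllib not to handle the redirect, so the
    # 301/302 surfaces as an HTTPError we can inspect.
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None


def redirect_target(url):
    """Return the Location header for a 301/302 response, else None."""
    opener = urllib.request.build_opener(_NoRedirect)
    try:
        opener.open(url, timeout=10)
    except urllib.error.HTTPError as err:
        if err.code in (301, 302):
            return err.headers.get("Location")
        raise
    return None  # the URL did not redirect
```

A test can then assert that redirect_target("https://example.com/old/post-1/") equals the expected new path for each entry in the redirect mapping.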

That was the last thing holding up the move to S3. As of today, this site is fully hosted by S3.
