A red robot hand drawn by a child

While building a website is easy, making sure it works smoothly is not. One often-neglected area is client-side caching and proxies: loading resources takes time, so ensuring that as much data as possible can be cached in the network improves performance. I just found a good tool for this: RED.

RED is a robot that checks HTTP resources to see how they will behave, pointing out common problems and suggesting improvements. Although it is not an HTTP conformance tester, it can find a number of HTTP-related issues.

Thanks to this tool, I fixed the following issues on the server hosting this blog:

  • Added missing MIME types for images; this was highlighted by testing my image test page.
  • Added compression for certain MIME types (e.g. SVG).
  • Added expiration times for some image types, as well as for HTML.
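The post does not say which web server hosts the blog; assuming an Apache setup, fixes along these lines might look roughly like the following sketch, using directives from mod_mime, mod_deflate, and mod_expires (the specific types and durations are illustrative, not the actual values used):

```apache
# Hypothetical Apache sketch -- the real server and settings are not stated in the post.

# Declare MIME types that may be missing (mod_mime)
AddType image/svg+xml  .svg
AddType image/webp     .webp

# Compress certain MIME types, including SVG (mod_deflate)
AddOutputFilterByType DEFLATE image/svg+xml text/html text/css

# Set expiration times for image types and for HTML (mod_expires)
ExpiresActive On
ExpiresByType image/png      "access plus 1 month"
ExpiresByType image/jpeg     "access plus 1 month"
ExpiresByType image/svg+xml  "access plus 1 month"
ExpiresByType text/html      "access plus 1 hour"
```

After changes like these, re-running RED against the affected URLs confirms whether the new headers are actually being served.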
