All TypePad feeds are failing

As mentioned last time, there are always a few blogs whose feeds we have problems fetching; we ultimately give up on them until one day we find a solution. Usually they’re either 403 Forbidden errors, or difficulties connecting to the domain, despite the feeds being visible/downloadable in a web browser. But for more than a week, all the feeds we track that are hosted at TypePad have been returning 403 errors. This means that, along with the other failing blogs, we can no longer display the latest posts from TypePad blogs.

Some of these TypePad feeds are directly served from typepad.com, and others redirect from a typepad.com URL to a feed hosted elsewhere (such as Feedburner or Feedblitz). But all of them return a 403 error when accessed by our crawler, or when using curl on the command line. They work OK in a web browser. It looks like they’re now “protected” by Cloudflare, preventing automated systems from accessing them.
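For illustration, this is roughly the kind of plain, non-browser request our crawler (or curl) makes; the feed URL is a made-up example, not one of the real feeds we track:

```python
import requests

# A hypothetical TypePad feed URL, for illustration only.
url = "https://example.typepad.com/my_blog/atom.xml"

# A plain request with no browser-like headers, much like our crawler
# or `curl` would send. At the moment requests like this get a 403 back.
response = requests.get(url, timeout=30)
print(response.status_code)
```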

To check how this compares with other blogging services, we now have a table on our dashboard showing how many feeds are returning each status code. It currently looks like this:

Domain          200   302   304   403
blogspot.com     95    25   146     0
medium.com        6     0     0     0
tumblr.com       30     0     0     0
typepad.com       0     0     0    12
weebly.com        2     0     0     0
wordpress.com     8     0   115     0
write.as          0     0     2     0

(200 means it’s OK, 302 is a temporary redirect, 304 means nothing’s changed.)

This only counts feeds using blogging services’ domains, not those that might be hosted on those services but have custom domains.

As you can see, typepad.com is not looking good. It’s only 12 feeds, but it’s also 100% of the TypePad feeds.
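For anyone curious how that tally is put together, here’s a minimal sketch of the idea, using a made-up list of (domain, status) results from the most recent fetches rather than our real data:

```python
from collections import Counter

# Hypothetical fetch results: (feed domain, HTTP status from the last crawl),
# one entry per feed hosted on a blogging service's own domain.
fetch_results = [
    ("typepad.com", 403),
    ("blogspot.com", 200),
    ("wordpress.com", 304),
    # ...
]

# Tally status codes per domain, as in the dashboard table above.
counts = Counter(fetch_results)
for (domain, status), total in sorted(counts.items()):
    print(f"{domain}\t{status}\t{total}")
```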

I’ve also noticed that the TypePad-hosted blogs I follow using Feedbin have stopped updating in the feed reader this week, despite them having new posts. Coincidence?

It’s possible there are ways around this. Requesting feeds through some kind of proxy? Altering how we request them? Using a curl replacement that fakes a conventional browser?
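To sketch the last of those, here’s roughly what faking a conventional browser might look like. The header values are just plausible examples, and we haven’t tested whether Cloudflare would actually let requests like this through:

```python
import requests

# Hypothetical workaround: present browser-like headers instead of a
# crawler's defaults. Untested; it may well not be enough on its own.
BROWSER_HEADERS = {
    "User-Agent": (
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
        "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0 Safari/537.36"
    ),
    "Accept": "application/rss+xml, application/atom+xml, text/xml;q=0.9, */*;q=0.8",
    "Accept-Language": "en-GB,en;q=0.9",
}

def fetch_feed(url: str) -> requests.Response:
    """Fetch a feed URL while pretending to be a conventional browser."""
    return requests.get(url, headers=BROWSER_HEADERS, timeout=30)

# Example, with a made-up feed URL:
# response = fetch_feed("https://example.typepad.com/my_blog/atom.xml")
# print(response.status_code)
```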

This kind of thing is such a shame. Feeds are designed to be fetched by automated systems – humans aren’t in the habit of reading feeds in their web browsers – and putting up barriers to prevent that from happening breaks so much.