
RSS Feeds & Syndication

What is RSS?

RSS, or Really Simple Syndication, is a means by which websites publish details of newly posted stories or articles. These feeds may be syndicated to other websites, or read by users with feed reader software or RSS-enabled browsers such as Mozilla Firefox, Opera, Internet Explorer 7, or Safari.

TtH provides an RSS feed on most pages that contain a list of stories or challenges, as well as a feed for news stories.

Subscribing to RSS feeds

TtH publishes details of a page's feed in the HTML header. If your feed reader software integrates with your browser or your browser supports RSS natively, it should detect the feed automatically.
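Autodiscovery relies on a `<link>` element in the page's HTML header; a reader locates the feed by scanning for `rel="alternate"` with an RSS MIME type. The following is a minimal sketch in Python using the standard-library HTML parser; the sample markup is an illustration of the autodiscovery convention, not a copy of TtH's actual header.

```python
from html.parser import HTMLParser

class FeedLinkFinder(HTMLParser):
    """Collect RSS/Atom feed URLs advertised in a page's <head>."""
    def __init__(self):
        super().__init__()
        self.feeds = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if (tag == "link" and a.get("rel") == "alternate"
                and a.get("type") in ("application/rss+xml",
                                      "application/atom+xml")):
            self.feeds.append(a.get("href"))

# Illustrative page header (not TtH's real markup)
html = """<html><head>
<link rel="alternate" type="application/rss+xml"
      title="Latest stories" href="/rss.php?type=latest">
</head><body></body></html>"""

finder = FeedLinkFinder()
finder.feed(html)
print(finder.feeds)  # -> ['/rss.php?type=latest']
```

Browsers with native RSS support perform essentially this scan to show their feed icon.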

You can also use internet-based services to subscribe to feeds.

Other feed reader software may indicate the existence of a feed in a different way. If your feed reader does not detect feeds automatically, or you wish to syndicate the content to another website, you may use the URLs listed below.

Feed URLs

If you wish to manually subscribe to feeds, you can use the following URLs:

You can find the "number" to use by looking at the URL of the respective page. You may also append &items=<number> to the end of any of these URLs to limit the maximum number of items returned (up to 100). The default is 30, except for the feed of chapters in a story, which returns all chapters by default; the 100-item limit does not apply to that feed.
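Appending the items parameter is ordinary query-string manipulation. A hedged sketch of a helper that does this safely (the base URL below is a placeholder, not one of the site's real feed URLs; substitute a URL from the list):

```python
from urllib.parse import urlencode, urlparse, parse_qsl, urlunparse

def with_item_limit(feed_url: str, items: int) -> str:
    """Append or override the items parameter, capped at the 100-item maximum."""
    parts = urlparse(feed_url)
    query = dict(parse_qsl(parts.query))
    query["items"] = str(min(items, 100))
    return urlunparse(parts._replace(query=urlencode(query)))

# Placeholder URL -- substitute one of the feed URLs listed above.
print(with_item_limit("https://example.com/rss.php?type=latest", 10))
# -> https://example.com/rss.php?type=latest&items=10
```

Building the query string with `urlencode` avoids accidentally emitting a bare `&items=` when the URL already ends in a query.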


To retrieve the latest 30 Harry Potter stories:

To retrieve the latest 10 CSI stories:

To retrieve the latest 5 responses to Jinni's "Woke up in vegas" challenge:

To retrieve the latest 20 stories from the whole site:

Publishing Syndicated Content

We encourage you to include our feeds on other websites, for example to list your stories archived here on your personal website. However, please use a method of incorporating the feed that respects the "Time to Live" (TTL) published within the feed. Also, in keeping with standards for web robots, set a user agent string that identifies the URL of a page on your site explaining the purpose of your bot and its policies.
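Setting such a user agent is a one-liner in most HTTP clients. A Python sketch using the standard library (both URLs below are hypothetical examples; point the policy URL at a real page on your own site):

```python
import urllib.request

# Both URLs are hypothetical -- substitute a real feed URL and a real
# bot-policy page on your own site.
req = urllib.request.Request(
    "https://example.com/rss.php?type=latest",
    headers={"User-Agent": "MySiteBot/1.0 (+https://mysite.example/bot-policy)"},
)
print(req.get_header("User-agent"))
# -> MySiteBot/1.0 (+https://mysite.example/bot-policy)
```

The `(+URL)` convention mirrors what established crawlers use, so a site operator reviewing their logs can find your policy page without guesswork.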

For example, if the feed has a TTL of one hour, a second hit on your website within one hour of the first should be served from a cached copy of the feed. Do not use a method where every single hit on your site triggers a fresh download of the feed from ours.

We may block the IP addresses of servers not respecting the time to live if they generate excessive traffic.
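A minimal sketch of TTL-respecting caching, assuming the RSS 2.0 convention that `<ttl>` is expressed in minutes; the fetch callable and URLs here are stand-ins, not the site's real API:

```python
import time

class TTLFeedCache:
    """Serve a cached copy of a feed until its TTL (in minutes) expires.

    `fetch` is any callable returning (feed_body, ttl_minutes); in a real
    deployment it would download the feed over HTTP with a descriptive
    User-Agent pointing at your bot-policy page.
    """
    def __init__(self, fetch):
        self.fetch = fetch
        self.cache = {}  # url -> (expires_at_epoch_seconds, body)

    def get(self, url, now=None):
        now = time.time() if now is None else now
        entry = self.cache.get(url)
        if entry and now < entry[0]:
            return entry[1]  # still fresh: no hit on the origin server
        body, ttl_minutes = self.fetch(url)
        self.cache[url] = (now + ttl_minutes * 60, body)
        return body

# Demo with a stand-in fetch that counts origin hits.
calls = []

def fetch_feed(url):
    """Stand-in for a real HTTP fetch; pretends the feed declares a 60-minute TTL."""
    calls.append(url)
    return ('<rss version="2.0"/>', 60)

cache = TTLFeedCache(fetch_feed)
cache.get("feed-url", now=0)     # first hit: downloads the feed
cache.get("feed-url", now=1800)  # 30 minutes later: served from cache
print(len(calls))  # -> 1
```

With this pattern, traffic to the origin is bounded by the TTL no matter how busy your own site is, which is exactly the behaviour the policy above asks for.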

If your site uses PHP, we suggest using the open-source MagpieRSS library to embed feeds.