Moving to, part 4

And finally, we need to actually display the links on the front page: fetch them hourly, write out a document snippet, and then include that snippet in the page.

First, how to fetch them: a trivial Python script which uses the REST API to get recent posts:

import xmltramp, urllib2, cgi

def e(s):
    # cgi.escape handles &, < and > but leaves double quotes alone,
    # so escape those by hand for use in attribute values
    return cgi.escape(s).replace('"', '&quot;')

authinfo = urllib2.HTTPBasicAuthHandler()
authinfo.add_password(' API', '', '', '')  # realm, uri, username, password
opener = urllib2.build_opener(authinfo)
urllib2.install_opener(opener)
data = urllib2.urlopen('').read()
dom = xmltramp.parse(data)

out = []
for p in dom['post':]:
    try:
        ext = p('extended')
    except KeyError:
        ext = ''
    out.append('<a href="%s" title="%s">%s</a>' %
               (e(p('href')), e(ext), e(p('description'))))
fp = open('/var/www/', 'w')
fp.write('\n'.join(out))
fp.close()
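(As an aside: both urllib2 and cgi.escape are gone on Python 3, so if you ever port this, the escaping helper collapses into the standard html module, which escapes quotes for you. A minimal sketch, not the script above:)

```python
import html

def e(s):
    # html.escape escapes &, < and >, and with quote=True (the default)
    # also double and single quotes -- everything the cgi-based helper
    # above did by hand.
    return html.escape(s, quote=True)

print(e('<a href="x">'))  # &lt;a href=&quot;x&quot;&gt;
```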

Then throw a line in crontab to actually run it, with crontab -e:

@hourly python /var/www/

And finally a brief snippet to include it in the index page, which is a Castalian page:

CACHED_COPY = '../scripts/index.curlies.cached'
pfp = open(CACHED_COPY)
print pfp.read()
pfp.close()
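(One wrinkle: before the first cron run the cached file won't exist, and a bare open() will raise and take the whole page down with it. A slightly more defensive version of the include might look like this; the cached_links name is just for illustration:)

```python
CACHED_COPY = '../scripts/index.curlies.cached'

def cached_links(path=CACHED_COPY):
    # Return the cached snippet, or an empty string if the cache
    # hasn't been written yet, so the page still renders.
    try:
        with open(path) as pfp:
            return pfp.read()
    except IOError:  # alias of OSError on Python 3
        return ''
```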

and that’s it. Move complete. No more maintaining my own linklog. :)

(Updated: changed the crontab call and the script so that if there’s a failure it doesn’t just blank out index.curlies.cached!)
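(One way to get that failure behaviour, sketched rather than the exact change I made: do all the fetching first, write to a temporary file, and only rename it over the cached copy once everything has succeeded, so a crash part-way through leaves the old snippet intact. fetch_links here is a placeholder for whatever builds the HTML:)

```python
import os, tempfile

def write_snippet_atomically(path, fetch_links):
    # fetch_links is a placeholder callable that returns the HTML;
    # if it raises, the existing file at `path` is left untouched.
    snippet = fetch_links()
    fd, tmppath = tempfile.mkstemp(dir=os.path.dirname(path) or '.')
    try:
        with os.fdopen(fd, 'w') as tmp:
            tmp.write(snippet)
        os.rename(tmppath, path)  # atomic on POSIX filesystems
    except Exception:
        os.unlink(tmppath)
        raise
```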
