Moving to del.icio.us, part 4

And finally, we need to actually display links from del.icio.us on the front page. The plan: fetch them hourly, write out a document snippet, and then include that snippet in the page.

First, how to fetch them: a trivial Python script which uses the del.icio.us REST API to get recent posts:

import xmltramp, urllib2, cgi

def e(s):
  # escape &, <, > and double quotes, since the values end up
  # inside HTML attribute values (cgi.escape leaves quotes alone)
  return cgi.escape(s).replace('"', '&quot;')

# authenticate to the del.icio.us API with HTTP Basic auth
authinfo = urllib2.HTTPBasicAuthHandler()
authinfo.add_password('del.icio.us API', 'http://del.icio.us',
  'USERNAME', 'PASSWORD')
opener = urllib2.build_opener(authinfo)
urllib2.install_opener(opener)
data = urllib2.urlopen('http://del.icio.us/api/posts/recent').read()
dom = xmltramp.parse(data)

# one <a> per post; the extended description is optional, so fall
# back to an empty title if a post doesn't have one
out = []
for p in dom['post':]:
  try:
    ext = p('extended')
  except:
    ext = ''
  out.append('<a href="%s" title="%s">%s</a>' %
    (e(p('href')), e(ext), e(p('description'))))
fp = open('/var/www/kryogenix.org/scripts/index.curlies.cached', 'w')
fp.write('\n'.join(out))
fp.close()
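
For reference, the cached file ends up holding one anchor per line: href, the extended description as the title, and the short description as the link text. With a couple of made-up posts it would look like:

<a href="http://example.com/article" title="A longer note I wrote about it">An interesting article</a>
<a href="http://example.org/tool" title="">A handy tool</a>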

Then throw a line in crontab to actually run it, with crontab -e:

@hourly python /var/www/kryogenix.org/scripts/get-delicious-recent.py
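
(@hourly is a Vixie cron shorthand that not every cron supports; the equivalent traditional five-field entry, running at the top of each hour, is:)

0 * * * * python /var/www/kryogenix.org/scripts/get-delicious-recent.py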

And finally, a brief snippet to include it in the index page, which is a Castalian page:

<?cas
CACHED_COPY = '../scripts/index.curlies.cached'
pfp = open(CACHED_COPY)
response.write(pfp.read())
pfp.close()
?>
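
If the cron job hasn't run yet, that open() will fail and take the page down with it. A guarded version costs a couple of lines; this is a sketch rather than what the live page does:

<?cas
CACHED_COPY = '../scripts/index.curlies.cached'
try:
  pfp = open(CACHED_COPY)
  response.write(pfp.read())
  pfp.close()
except IOError:
  pass  # no cached snippet yet, so show no links rather than an error
?>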

and that’s it. Move complete. No more maintaining my own linklog. :)

(Updated: changed the crontab call and the script so that if there’s a failure it doesn’t just blank out index.curlies.cached!)
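
(The updated code isn't shown above, but the usual belt-and-braces shape of that fix is to write somewhere temporary and only replace the cached copy once the write has definitely finished, letting a failed fetch kill the script before it touches anything. A minimal sketch, using the same path as the script; write_cache is a hypothetical helper, not from the original:)

import os

CACHE = '/var/www/kryogenix.org/scripts/index.curlies.cached'

def write_cache(snippet, cache=CACHE):
  # Write to a temporary file first, then rename it into place.
  # If the script dies before the rename (failed fetch, crash
  # mid-write), the existing cache is left untouched; os.rename
  # is atomic when both paths are on the same filesystem.
  tmp = cache + '.tmp'
  fp = open(tmp, 'w')
  fp.write(snippet)
  fp.close()
  os.rename(tmp, cache)

# e.g. at the end of the script: write_cache('\n'.join(out))

Either way, a failed hourly run just leaves the previous links in place.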
