TL;DR: Automatically generate a JSON content API for Jekyll-based posts and pages. Uses Heroku, works with GitHub Pages.
After making the move to Jekyll, one thing I lost was the ability to generate machine-readable representations of content. That may sound trivial, but it's actually significant considering the RESTful direction we're heading, and it's crucial for manipulating content on the frontend (for example, with Backbone.js). The idea being, for any post or page, if you simply replace the trailing slash with .json, you should get an API.
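For instance, a request for a post's .json URL might return something like the following (the field names here are illustrative; the actual schema depends on the plugin):

```json
{
  "title": "Towards a Jekyll content API",
  "url": "/2013/01/01/towards-a-jekyll-content-api/",
  "date": "2013-01-01 00:00:00",
  "content": "<p>Rendered post body…</p>"
}
```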
I had previously written a short Jekyll plugin to generate JSON representations of posts, but GitHub Pages (where this site is hosted), doesn’t allow plugins for security reasons. Enter JekyllBot. JekyllBot lives on a (free) Heroku instance, and following any push to GitHub, silently generates JSON files for each post, pushing the changes back to GitHub. No need to do a thing. This allows you to continue to use web-based editors like Prose, and alleviates the need to ever touch the command-line.
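The heart of such a plugin is simply serializing each post's front matter and rendered content to JSON, and writing the result alongside the source file. A minimal, framework-free sketch of that logic (the function and field names are my own for illustration, not the actual plugin's):

```ruby
require 'json'

# Map a post's pretty URL to the path of its JSON twin,
# e.g. "/2013/01/some-post/" -> "/2013/01/some-post.json".
def json_path(url)
  url.sub(%r{/+\z}, '') + '.json'
end

# Serialize a post's front matter plus its rendered content.
# In a real Jekyll generator, the front matter would come from the
# post's data hash and the content from the rendered post body.
def post_to_json(front_matter, content)
  JSON.generate(front_matter.merge('content' => content))
end

# Inside a Jekyll plugin, the output would be written into the
# *source* directory (not _site) so that git tracks it, e.g.:
#   File.write(File.join(site.source, json_path(post.url)), json)
```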
How does it work? I added my Heroku instance as a post-receive hook in the repository's settings page, and have JekyllBot running within Sinatra to listen for the payload. After a push, it simply runs Jekyll (with the plugin), commits to git, and pushes the repository back to GitHub if there are any changes. The secret lies in the JSON-generating plugin, which places the resulting files in the source directory (rather than _site), allowing the files to be tracked by git, and thus silently passed through as static files by GitHub's Pages servers when the site is built.
With a little customization, there's no reason this process couldn't work with most Jekyll plugins (for example, sitemaps or tag archives), allowing GitHub Pages to support much more robust Jekyll implementations from within its existing security restrictions. Feel free to give JekyllBot a try on your own site, or simply add .json to the URL above to see it in action.
Ben Balter is the Director of Engineering Operations and Culture at GitHub, the world’s largest software development platform. Previously, as Chief of Staff for Security, he managed the office of the Chief Security Officer, improving overall business effectiveness of the Security organization through portfolio management, strategy, planning, culture, and values. As a Staff Technical Program manager for Enterprise and Compliance, Ben managed GitHub’s on-premises and SaaS enterprise offerings, and as the Senior Product Manager overseeing the platform’s Trust and Safety efforts, Ben shipped more than 500 features in support of community management, privacy, compliance, content moderation, product security, platform health, and open source workflows to ensure the GitHub community and platform remained safe, secure, and welcoming for all software developers. Before joining GitHub’s Product team, Ben served as GitHub’s Government Evangelist, leading the efforts to encourage more than 2,000 government organizations across 75 countries to adopt open source philosophies for code, data, and policy development. More about the author →