Qubyte Codes

Dispatching Webmentions with a Netlify build plugin


Last updated

Until recently I relied on a Glitch app to act as a webhook dispatcher. When a post was published, an on-success hook request would be sent from Netlify (which hosts this site) to the Glitch app to trigger it. The app then checked the sitemap against a cached copy to determine new URLs, and dispatched Webmentions for them. While this worked, it had some issues:

  • The Glitch app had to be a server in order to receive webhook requests.
  • The server had to hash the body content and check a webhook signature header.
  • Glitch can take a while to wake up an app.
  • It was difficult to know when there was a problem. You have to remember to visit the Glitch app and hope the logs have been kept.
  • The code was kept in a different place to the static site generator.
  • It was difficult to add automated tests and linting.
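
That signature check is the sort of thing the server had to get right. As a generic sketch of what it involves (the header format, algorithm, and secret handling here are illustrative assumptions, not Netlify's actual scheme), verifying an HMAC-signed webhook body looks something like:

```javascript
import { createHmac, timingSafeEqual } from 'node:crypto';

// Verify that a signature header matches an HMAC-SHA256
// digest of the raw request body, computed with a shared
// secret. The scheme here is a generic illustration.
function verifySignature(rawBody, signatureHeader, secret) {
  const expected = createHmac('sha256', secret).update(rawBody).digest('hex');
  const a = Buffer.from(expected);
  const b = Buffer.from(signatureHeader);

  // timingSafeEqual throws when lengths differ, so compare
  // lengths first. Using it avoids leaking timing information.
  return a.length === b.length && timingSafeEqual(a, b);
}
```

Getting this wrong (or skipping it) would let anyone trigger a dispatch, which is exactly the kind of incidental responsibility the build plugin removes.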

Netlify now has a build plugin feature, which allows custom code to be run at particular points in the build lifecycle. In particular, the onSuccess hook means that the dispatch of mentions happens strictly after the deployment, so nothing happens if the build fails, and any automated checks by the recipient get the latest version of a page to work with.
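
Dispatching a single mention is itself straightforward: per the Webmention protocol, the sender fetches the target page, discovers the endpoint it advertises, and POSTs the source and target URLs to that endpoint as a form submission. A sketch of the idea (function names are mine; a real implementation should use a proper HTML parser and also check the HTTP Link header):

```javascript
// Find a Webmention endpoint advertised in a page's HTML.
// This regex is a sketch that assumes rel appears before
// href in the element.
function discoverEndpoint(html, baseUrl) {
  const match = html.match(/<(?:link|a)\s[^>]*rel=["']?webmention["']?[^>]*href=["']([^"']*)["']/i);

  // Endpoints may be relative, so resolve against the page URL.
  return match ? new URL(match[1], baseUrl).href : null;
}

// Dispatch one mention: fetch the target, discover its
// endpoint, and POST source and target as form data.
async function dispatchWebmention(source, target) {
  const res = await fetch(target);
  const endpoint = discoverEndpoint(await res.text(), target);

  if (endpoint) {
    await fetch(endpoint, {
      method: 'POST',
      headers: { 'content-type': 'application/x-www-form-urlencoded' },
      body: new URLSearchParams({ source, target })
    });
  }
}
```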

I wrote a build plugin inside the repository for this site. It makes a request for the atom feed of my site before the build completes and new pages are published, and compares it to the newly generated atom feed afterward. A collection of new URLs is derived from the before and after atom feeds, and new mentions are determined and dispatched. The plugin resolves all the above issues:

  • No server (as in HTTP endpoint) code needed.
  • No webhook used, so no signature checking needed.
  • No wake-up delay (well, some, but not as big as that of a free Glitch app).
  • Logs are kept with other deploy logs in Netlify.
  • Code can be kept in the same repo as the static site generator and content.
  • It can use the same testing and linting setup as the rest of the repo.
  • Fewer hard-coded things like URLs. Netlify provides them as context and through the environment.
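
For reference, a local build plugin lives in a directory alongside the site source and is registered in netlify.toml (the directory name here is illustrative):

```toml
# netlify.toml — point Netlify at the local plugin directory.
[[plugins]]
  package = "./plugins/webmentions"
```

The plugin directory itself contains an index.js exporting the lifecycle hooks, and a manifest.yml declaring the plugin's name.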

The real thing is a bit more complex, but the gist of it is:

import getOldFeedUrls from './get-old-feed-urls.js';
import readNewFeedUrls from './read-new-feed-urls.js';
import determineChangedUrls from './determine-changed-urls.js';
import getMentionsForPage from './get-mentions-for-page.js';

// Mentions are determined *before* the deploy completes, but
// dispatched *after*.
const allMentions = new Map();

// This hook is executed after the site is built, but before
// it is deployed, so I can grab the old feed using an HTTPS
// request and the new feed from the file system.
export async function onPostBuild() {
  // The old feed is a set of URLs derived from an HTTPS
  // request for the atom feed of the live site. The new
  // feed is read from the freshly built files.
  const [oldFeed, newFeed] = await Promise.all([
    getOldFeedUrls(),
    readNewFeedUrls()
  ]);

  // Use both feeds to determine which pages are removed,
  // added, or updated. Only unchanged pages are discarded.
  const urls = determineChangedUrls(oldFeed, newFeed);

  for (const url of urls) {
    // Gets mentions for the old and the new page. The URL
    // has enough information to locate the new version
    // from the build directory.
    allMentions.set(url, await getMentionsForPage(url));
  }
}

// Mentions are dispatched once the deployment completes, so
// that any automated checks by the recipient don't happen
// too soon!
export async function onSuccess() {
  // Dispatch mentions for each changed URL in sequence.
  for (const mentions of allMentions.values()) {
    for (const mention of mentions) {
      // A mention is the same regardless of whether the
      // mention is new, existing, or removed. The recipient
      // must determine what has happened.
      await mention.dispatch();
    }
  }
}
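
The interesting step is determineChangedUrls. A sketch of what it could look like, assuming each feed has been parsed to a Map of URL to updated timestamp (the real module's shapes may differ):

```javascript
// Given old and new feeds as Maps of URL → updated
// timestamp, collect every URL that was added, updated, or
// removed. Unchanged URLs are discarded.
function determineChangedUrls(oldFeed, newFeed) {
  const urls = new Set();

  // Added or updated: in the new feed with no old entry, or
  // with a different updated timestamp.
  for (const [url, updated] of newFeed) {
    if (oldFeed.get(url) !== updated) {
      urls.add(url);
    }
  }

  // Removed: in the old feed but not the new one.
  for (const url of oldFeed.keys()) {
    if (!newFeed.has(url)) {
      urls.add(url);
    }
  }

  return urls;
}
```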

For the real thing, I encourage you to check out the source. Some limitations remain. Only new articles are checked for mentions; when an article is updated it'll be ignored, even though new links may have been added or old ones removed. One way to resolve this would be to collect both new and updated URLs when comparing the atom feeds. A database associating each URL with its dispatched mentions could then be used to know when links have been added or removed. That database could be stored as a file in the Netlify cache, which, from the documentation, seems to store a file indefinitely.
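
The association could be as simple as a diff between the targets mentions were last dispatched for and the targets the rebuilt page links to now (a hypothetical helper, not part of the plugin):

```javascript
// Compare the targets mentions were previously dispatched
// for against the targets a rebuilt page links to now.
// Added and removed targets both need a dispatch, since a
// mention looks the same either way and the recipient
// works out what changed.
function diffTargets(previousTargets, currentTargets) {
  const prev = new Set(previousTargets);
  const curr = new Set(currentTargets);

  return {
    added: [...curr].filter(target => !prev.has(target)),
    removed: [...prev].filter(target => !curr.has(target))
  };
}
```

After dispatching, the current target list would be written back to the cached database for the next build.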

This is just the beginning! I plan to port other capabilities over to Netlify build plugins in time.
