
A Simple SEO Content System Using n8n and Supabase

AI & SEO · Updated Nov 15, 2025 · Estimated reading time: 8–10 minutes

Most small businesses know they “should be doing SEO,” but the idea of publishing 3–5 good articles per week feels impossible. In this guide, you’ll build a light SEO content machine that runs on: 1) n8n workflows, 2) Supabase as your content brain, and 3) one short review session per day.

Who this system is for

This setup is designed for small teams and solo founders who:

  - know they should publish SEO content consistently, but can't write 3–5 articles a week by hand
  - are comfortable with (or willing to learn) simple automation tools like n8n and Supabase
  - can spare one short review session per day instead of long writing sessions

Mindset shift: The goal is not “perfect SEO content.” The goal is a repeatable pipeline that turns ideas into useful pages with as little friction as possible.

The high-level architecture

At a high level, your SEO system looks like this:

  1. Capture topics into Supabase (manually, or from tools like Google Search Console, Reddit, or “People Also Ask”).
  2. Use n8n to pull one topic at a time and send it to GPT with a strict blog template prompt.
  3. Save the AI-generated article back into Supabase (HTML + metadata).
  4. Publish the article by pushing the HTML to your `/blog` folder via SFTP (a plain static-hosting setup is all you need).
  5. Mark the row as processed so the next run picks up the next topic.

Once this is set up, your daily job becomes: “Open the draft, skim, tweak, and hit publish.”

Step 1 – Design your content model in Supabase

First, you need a clean structure for how a “topic” turns into a full blog post. A simple Supabase table called `seo_topics` is enough:

Suggested columns

  - `id` – auto-incrementing primary key (the workflow processes rows in `id` order)
  - `topic` – the question or idea the article should answer
  - `keyword` – the primary keyword to target
  - `slug` – URL-friendly name used for the final `{slug}.html` file
  - `status` – `idea` → `drafted` → `published`
  - `html` – the generated article body
  - `published_at` – timestamp set when the file goes live

You can manually seed 20–50 topics into this table from:

  - real questions your customers ask you
  - Google Search Console queries your site already appears for
  - Reddit threads in your niche
  - Google's “People Also Ask” boxes
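If you seed topics with a script rather than by hand, a small helper keeps rows consistent. This is a sketch, not a finished tool: the column names match the ones this article's workflow assumes, and `slugify` is a hypothetical helper you'd adapt to your own conventions.

```javascript
// Sketch: turn a raw question into a row ready for the seo_topics table.
// Column names (topic, keyword, slug, status) match the schema above;
// adjust to your actual table.

function slugify(text) {
  return text
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, "-") // non-alphanumerics become hyphens
    .replace(/^-+|-+$/g, "");    // trim leading/trailing hyphens
}

function makeTopicRow(topic, keyword) {
  return {
    topic,
    keyword,
    slug: slugify(topic),
    status: "idea", // the n8n flow only picks up rows in this state
  };
}

// Example:
// makeTopicRow("How do I automate SEO content?", "automate seo content")
// → slug: "how-do-i-automate-seo-content", status: "idea"
```

From here you can bulk-insert an array of these rows via the Supabase dashboard, a SQL `INSERT`, or the REST API.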

Step 2 – A strict blog template prompt for GPT

The biggest mistake with AI content is letting it “freestyle.” Instead, enforce a consistent structure: a fixed outline of headings, a target word-count range, and a requirement to return valid HTML only.

Store this template prompt in n8n or a separate “prompts” file so every article follows the same layout and is easy to skim.
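One possible shape for such a template prompt, written as a small builder function you could drop into an n8n Code node. The specific outline and word counts are illustrative assumptions — tune them to your niche:

```javascript
// Sketch of a strict blog-template prompt builder. The outline and
// word counts below are assumptions, not requirements -- adjust freely.

function buildArticlePrompt({ topic, keyword, slug }) {
  return [
    `Write a blog article answering: "${topic}".`,
    `Primary keyword: "${keyword}". URL slug: "${slug}".`,
    "Follow this structure exactly:",
    "- one <h1> containing the primary keyword",
    "- a 2-3 sentence intro that answers the question directly",
    "- 3-5 <h2> sections with short paragraphs",
    "- a closing <h2> FAQ with 2-3 common questions",
    "Return valid HTML only, nothing outside the <body> content.",
    "Total length: 800-1200 words.",
  ].join("\n");
}
```

Because the structure lives in one function, every article follows the same layout, and a prompt tweak propagates to all future drafts.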

Step 3 – Build the n8n workflow

Core nodes (conceptual)

  1. HTTP node – Fetch one unprocessed topic
    GET from Supabase: ?select=*&status=eq.idea&order=id.asc&limit=1
  2. AI node – Generate the article HTML
    Send the topic, keyword, and slug with your blog template prompt. Ask for valid HTML only inside the body tag.
  3. HTTP node – Save the HTML back into Supabase
    PATCH the row: set status = 'drafted', html = <generated html>.
  4. Code + SFTP – Publish to your /blog folder
    Convert HTML to a file named {slug}.html and upload via SFTP to public_html/blog/.
  5. HTTP node – Mark as published
    PATCH again: status = 'published', published_at = now().

This sounds like a lot, but once the first article runs end-to-end, the rest are just data and small prompt tweaks.
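The HTTP nodes above map onto plain Supabase REST (PostgREST) calls. Here is a hedged sketch of that wiring in JavaScript — `baseUrl`, `serviceKey`, and `generateHtml` are placeholders you would supply, and error handling is omitted for brevity:

```javascript
// Sketch of the Supabase REST calls behind the n8n HTTP nodes.
// Table and column names match the seo_topics schema used earlier.

const TABLE = "seo_topics";

// Step 1: URL that fetches exactly one unprocessed topic.
function nextTopicUrl(baseUrl) {
  return `${baseUrl}/rest/v1/${TABLE}?select=*&status=eq.idea&order=id.asc&limit=1`;
}

// Steps 3 and 5: body for PATCHing a row's status (plus extra fields).
function statusPatch(status, extra = {}) {
  return { status, ...extra };
}

// Wiring it together (requires real credentials to run):
async function draftNextTopic(baseUrl, serviceKey, generateHtml) {
  const headers = {
    apikey: serviceKey,
    Authorization: `Bearer ${serviceKey}`,
    "Content-Type": "application/json",
  };
  const [topic] = await (await fetch(nextTopicUrl(baseUrl), { headers })).json();
  if (!topic) return null; // queue is empty -- nothing to draft

  const html = await generateHtml(topic); // your AI node's job
  await fetch(`${baseUrl}/rest/v1/${TABLE}?id=eq.${topic.id}`, {
    method: "PATCH",
    headers,
    body: JSON.stringify(statusPatch("drafted", { html })),
  });
  return topic;
}
```

In n8n you would split this across separate HTTP Request nodes rather than one script, but the requests themselves are the same.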

Step 4 – On-page SEO basics baked into the template

To avoid manually “doing SEO” for each article, bake these rules into your GPT prompt and HTML template:

  - exactly one H1, containing the primary keyword
  - a title tag and meta description generated from the keyword (roughly 60 and 155 characters)
  - descriptive H2 subheadings instead of generic ones
  - the keyword in the slug and in the first paragraph
  - at least one internal link to a related pillar post

Over time, this consistency matters far more than stressing over every single meta tag.
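You can even lint drafts against these rules before publishing. The checker below is a rough sketch under the assumption that the AI returns body-only HTML (as requested in Step 3); the checks and regexes are illustrative, not exhaustive:

```javascript
// Rough pre-publish lint for the on-page rules above.
// Returns a list of problems; an empty list means the draft passes.

function lintArticleHtml(html, keyword) {
  const problems = [];
  const kw = keyword.toLowerCase();

  // Rule: exactly one <h1>.
  const h1s = html.match(/<h1[\s>]/gi) || [];
  if (h1s.length !== 1) problems.push(`expected exactly one <h1>, found ${h1s.length}`);

  // Rule: primary keyword in the <h1>.
  const h1Text = ((html.match(/<h1[^>]*>([\s\S]*?)<\/h1>/i) || [])[1] || "");
  if (!h1Text.toLowerCase().includes(kw)) problems.push("keyword missing from <h1>");

  // Rule: primary keyword in the first paragraph.
  const firstP = ((html.match(/<p[^>]*>([\s\S]*?)<\/p>/i) || [])[1] || "");
  if (!firstP.toLowerCase().includes(kw)) problems.push("keyword missing from first paragraph");

  return problems;
}
```

Run it between the AI node and the SFTP upload, and route any draft with problems to a “needs review” state instead of publishing it.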

Step 5 – Connect this to your existing blog page

You’re already using a static `/blog/index.html` with cards. To connect this SEO system:

  1. Decide which posts are “pillar” posts and which are “supporting” posts.
  2. Manually add new pillar posts to your blog cards and sidebar links once per week.
  3. Let smaller, long-tail posts live as “quiet” pages that mostly pull search traffic.

You don’t need every single article on the homepage. Think of your main blog page as a curated menu, not a full archive.
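When you do promote a pillar post to the blog page, a tiny helper keeps the cards uniform. The markup here is a hypothetical placeholder — match it to whatever card HTML your `/blog/index.html` already uses:

```javascript
// Hypothetical card renderer for the curated blog index.
// Replace the markup with your site's actual card structure.

function renderCard({ title, slug, summary }) {
  return [
    `<a class="blog-card" href="/blog/${slug}.html">`,
    `  <h3>${title}</h3>`,
    `  <p>${summary}</p>`,
    `</a>`,
  ].join("\n");
}
```

During your weekly review, generate a card for each new pillar post and paste it into the index page by hand (or automate that too, once the basics work).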

What a realistic weekly schedule looks like

Here is a simple, honest schedule that most people can actually follow:

  - Monday (5–10 min): top up the `seo_topics` table, then trigger the n8n flow for 2–3 new drafts
  - Tuesday–Thursday (5–10 min each): open one draft, skim, tweak, and publish it
  - Friday (15 min): add any new pillar post to your blog cards and sidebar links

In one month you could easily ship 8–12 useful articles with under 3 hours of human effort.


If you do nothing else from this article:

  1. Create the `seo_topics` table in Supabase.
  2. Feed it 20 real questions from your customers.
  3. Build a tiny n8n flow that turns exactly one row into an HTML article.

Once you see your first article appear live on your blog without you writing it from scratch, you’ll understand the power of a simple SEO system built on automation.