In the second entry of the series, I'll cover how I structured the project and how automated deployments to Netlify are set up. Some of the code samples refer to the original project name (TweetYah), so please excuse those references where they appear.

General project structure

I'm a big fan of the monorepo approach on solo projects. It keeps everything related to the project in one place instead of making me hop between multiple repositories for the frontend, backend, API, etc. That said, there are realistically three distinct components in play here. The first is the front end, built with Svelte. Next is the API, written in Go and deployed as Netlify Functions. The third is the scheduler, which scans the database every minute for posts to dispatch. You might consider the front end and API to be the same project, but the code bases are completely different.

Directory breakdown

Below is an in-depth breakdown of each component, as well as some of the reasoning behind my choices.


App

This directory contains both the front-end Svelte code and the Go Netlify Functions. They live together primarily because of the way Netlify handles deployments and local testing. Running npm run dev triggers the Netlify CLI to build and monitor the front end, AND monitor the functions directory for changes.

This lets me hit a URL in my browser and point any API calls from the front end at /.netlify/functions/{handler}, a path that stays consistent even when deployed to Netlify. Since I'm using a path on the same origin instead of a completely separate domain, CORS never becomes an issue with this approach.


Database

This directory contains the structure of my PlanetScale database schema in an HCL file that the Atlas CLI (a new favorite tool of mine) can use to deploy schema changes to the database. Essentially, it lets me version control the database structure and store it alongside the rest of the repository. There is also a simple script that calls the Atlas CLI to deploy those changes, something I'll be utilizing more down the line.


atlas schema apply -u mysql://localhost/tweetyah -f schema.hcl
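For illustration, a minimal schema.hcl could look something like the sketch below. The table and column names here are my own guesses based on the posts table described later in this article, not the actual schema:

```hcl
schema "tweetyah" {}

table "posts" {
  schema = schema.tweetyah
  column "id" {
    type = int
  }
  column "id_user" {
    type = int
  }
  column "status" {
    type = int
  }
  column "send_at" {
    type = datetime
  }
  primary_key {
    columns = [column.id]
  }
}
```

Atlas diffs this declarative file against the live database and generates the migration statements for you, which is exactly what makes it so nice to keep in version control.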

I haven't fully set up all of the pipelines I want, but the end goal will be to use something like GitHub Actions to automate the process of updating the database schema for me. When the file changes in the dev branch, I'll have an action automatically apply those changes to the dev branch of my PlanetScale database. When a PR goes to the main branch in GitHub, another Action will be set up to merge the database changes in PlanetScale. This helps me keep a nice history of the changes that can be applied directly to my database. This will be an article all in itself once I get to that point.
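As a rough sketch of where I'm headed, a GitHub Actions workflow for the dev branch could look something like this. None of it exists yet; the directory path, branch name, and secret name are all placeholders:

```yaml
name: Deploy dev schema
on:
  push:
    branches: [dev]
    paths: ["database/**"]

jobs:
  apply:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: ariga/setup-atlas@v0
      # Apply the versioned schema to the dev branch of the database
      - run: |
          atlas schema apply \
            -u "${{ secrets.PLANETSCALE_DEV_URL }}" \
            -f database/schema.hcl \
            --auto-approve
```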


Lib

Lib is a shared Go module used by both the Scheduler and the Netlify Functions. If you're familiar with decoupled application structures (Onion, Hexagonal, etc.), this is the "business logic" portion: essentially the core of the backend code. Using Go's replace directive in the go.mod files, I can tell the toolchain to resolve that dependency from this directory instead of fetching it from a public URL, like so:


go 1.17

replace => ../lib // <= here

require (
    v1.6.0
    v1.4.0
)

require (
    v1.34.1 // indirect
    v1.44.131 // indirect
    v1.0.1 // indirect
    v3.2.2+incompatible // indirect
    v0.4.0 // indirect
    v0.9.1 // indirect
    v0.0.0-20221105205022-03b042823748 // indirect
)


Scheduler

The Scheduler is a Go project that compiles to a single binary and is deployed via Docker to my home lab for the time being. For that deployment, I use Azure DevOps with a makefile to handle the logic of deploying the app. I'll eventually move this workflow to GitHub Actions, but I already have a deploy agent running on my home lab, so this was just a shortcut.

# makefile
build:
    GOOS=linux GOARCH=amd64 go build -o dist/scheduler .
    docker build --tag tweetyah-scheduler:latest .

start: build
    docker rm -f tweetyah-scheduler
    docker run -d --name tweetyah-scheduler tweetyah-scheduler:latest
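The Dockerfile itself isn't shown here, but for a single Go binary it can be tiny. A sketch along these lines would work, assuming the binary is statically linked (no CGO dependencies):

```dockerfile
FROM alpine:latest
# The makefile's build target places the compiled binary in dist/
COPY dist/scheduler /usr/local/bin/scheduler
ENTRYPOINT ["/usr/local/bin/scheduler"]
```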

Its job is to check the posts table in the database every minute for posts that need to be sent. It filters for posts that are in a "scheduled" status with a send_at value earlier than the current time, running in an infinite loop with a 1-minute delay between iterations.

func main() {
    for {
        // do stuff..
        time.Sleep(1 * time.Minute)
    }
}

The query that's run also joins out to a few other tables to grab user tokens and the Mastodon domain where the post was scheduled.

query := `
    select
        ...
    from
        posts p
    left join
        user_tokens ut on p.id_user = ut.user_id
    where
        status = 0 and send_at < NOW()`


Website

The Website directory holds a SvelteKit project that's essentially just a single landing page, deployed via Netlify. There's not a whole lot of complexity going on here; structurally it's a reiteration of the App directory. The key takeaway is that I don't have it hooked up to Netlify's automatic deployment system. Instead, I use an NPM script to trigger deployments manually.
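That script is just a thin wrapper around the Netlify CLI. Something along these lines would do it; the script name is my own choice, but netlify deploy --build --prod is the real command for building and pushing a production deploy:

```json
{
  "scripts": {
    "deploy": "netlify deploy --build --prod"
  }
}
```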


Next up, I think I'll break down the database structure in detail and show my methods for working with SQL in Go without an ORM.