I have some functionality on my app up and running, so I wanted to start figuring out the deployment process. The app consists of a Golang executable running Gin and SQLite. I want to be able to deploy the application as a Docker/OCI image so that I have maximum flexibility in where I can deploy.

Building the initial image

The first thing I had to do was create the initial Dockerfile and build the initial image. I used the official Golang image based on Debian Bookworm. Here is what the Dockerfile looks like:

# Go 1.24 image on Debian Bookworm
FROM golang:1.24.1-bookworm

# Work directory
WORKDIR /build

# Copy the module files
COPY go.mod go.sum ./

# Installs Go dependencies

RUN go mod download

# Copying all the files
COPY . .

# Starting our application
CMD ["go", "run", "main.go"]

# Exposing server port
EXPOSE 8080

I built the initial image using this command:

docker build -f Dockerfile -t mygoapp:latest .

I built the initial image and was able to run the application. There was a problem, though: the image was almost 2 GB in size. The Go code wasn’t substantial at this point, so most of the 2 GB was the base image itself.

Reducing the image size

To reduce the size, I decided to do a multi-stage build and compile the application statically. In order to do this, I had to compile the Go code using the previous image (golang:1.24.1-bookworm) and copy the executable to a small base image, in this case scratch. Here is what the Dockerfile looks like:

# Use Go 1.24 bookworm as base image
FROM golang:1.24.1-bookworm AS builder

# Create a build directory to hold the app’s source code
WORKDIR /build
 
# Copy the module files into /build
COPY go.mod go.sum ./
 
# Installs Go dependencies
RUN go mod download

# Copy the entire source code into the container
COPY . .
 
# Build a statically linked binary (CGO disabled)
RUN CGO_ENABLED=0 GOOS=linux go build -o /app .
 
# Runtime stage: start from an empty (scratch) image

FROM scratch

# Copy the executable to the scratch image
COPY --from=builder /app /app

# Expose the port for Gin
EXPOSE 8080

ENTRYPOINT ["/app"]

I built the image using this command:

docker build -f Dockerfile -t mygoapp:latest .

I ran the Docker image and got this error:

[error] failed to initialize database, got error Binary was compiled with 'CGO_ENABLED=0', go-sqlite3 requires cgo to work.

Golang and SQLite Present an Interesting Problem

It seems that in order to use SQLite and Golang together, you have to set CGO_ENABLED to 1 and use additional linker options to create a completely static executable. Here are the options I used:

RUN CGO_ENABLED=1 GOOS=linux go build -o /app -a -ldflags '-linkmode external -extldflags "-static"' .
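
For context, here is a minimal sketch of the kind of database setup that triggers this requirement. The startup error shows the app is using the go-sqlite3 driver through database/sql; the file name below is just illustrative.

package main

import (
	"database/sql"
	"log"

	_ "github.com/mattn/go-sqlite3" // cgo-based driver; this is why CGO_ENABLED=1 is needed
)

func main() {
	// Hypothetical database path, for illustration only.
	db, err := sql.Open("sqlite3", "./app.db")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	// Ping opens a real connection; with a binary built using CGO_ENABLED=0,
	// this is where the "requires cgo" error surfaces.
	if err := db.Ping(); err != nil {
		log.Fatal(err)
	}
}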

I rebuilt the image and reran it. The container started without an issue, and the image size was around 70 megabytes. I ran some initial tests against the exposed endpoints and ran into this error:

x509: certificate signed by unknown authority

The service hits a 3rd-party API, and scratch doesn’t have any SSL certificates installed. I could add the certificates manually, but I’m lazy and don’t want to complicate the build process and Dockerfile.
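
To illustrate what was failing: any outbound HTTPS call from the app fails certificate verification because the scratch image has no CA bundle. A rough sketch (the URL is a placeholder, not the real API):

package main

import (
	"log"
	"net/http"
)

func main() {
	// With no CA certificates in the image, crypto/x509 can't verify the
	// server's certificate chain, so the request fails.
	resp, err := http.Get("https://api.example.com/data")
	if err != nil {
		log.Fatal(err) // x509: certificate signed by unknown authority
	}
	defer resp.Body.Close()
	log.Println(resp.Status)
}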

Google Distroless to the Rescue

Google actually provides minimal Docker images that are a little larger than scratch but contain things like libc, libssl, ca-certificates, etc. I’m going to use the base image (gcr.io/distroless/base). Here is what my new Dockerfile looks like:

# Use Go 1.24 bookworm as base image
FROM golang:1.24.1-bookworm AS builder

# Create a build directory to hold the app’s source code
WORKDIR /build
 
# Copy the module files into /build
COPY go.mod go.sum ./
 
# Installs Go dependencies
RUN go mod download

# Copy the entire source code into the container
COPY . .
 
# Build a fully static binary (cgo enabled for go-sqlite3)
RUN CGO_ENABLED=1 GOOS=linux go build -o /app -a -ldflags '-linkmode external -extldflags "-static"' .
 
# Runtime stage: distroless base image

FROM gcr.io/distroless/base
COPY --from=builder /app /app
COPY --from=builder /build/templates/* ./templates/
EXPOSE 8080

ENTRYPOINT ["/app"]

Everything worked locally, and the Docker image was about 100 megabytes. To deploy it to a cloud provider, I chose to build a multi-platform image that will run on my M4 Mac (arm64) and in the cloud on non-ARM hardware (amd64). Here is the command I used to build the image:

docker buildx build --platform linux/amd64,linux/arm64 . -t mygoapp:latest

Questions or comments? Feel free to reach out to me on Bluesky (https://bsky.app/profile/rooseveltanderson.bsky.social)

I’m working on a project with a Golang backend and a React frontend. I built the backend using Gin and was able to create a basic API. I wanted to not only test out the API but also figure out what additional API calls I would need for the frontend. I’m primarily a backend engineer who can write frontend code when I have to. I didn’t feel like creating an entire React app and going through the process of figuring out its deployment, so I decided to be lazy and serve HTML via Gin.

Using HTML Templates in Go

Writing an HTML template in Go is not that difficult. It’s basically a static HTML file with action pipelines. A few examples of action pipelines:

  • {{/* a comment */}} defines a comment
  • {{.}} renders the root element
  • {{.Title}} renders the “Title” field of the current element
  • {{if .Done}} {{else}} {{end}} defines an if statement
  • {{range .Todos}} {{.}} {{end}} loops over all “Todos” and renders each one using {{.}}
  • {{block "content" .}} {{end}} defines a block with the name “content”
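
If you want to experiment with these actions outside of Gin, here is a small sketch using the standard html/template package; the template and data are made up for illustration:

package main

import (
	"html/template"
	"log"
	"os"
)

func main() {
	const page = `<ul>{{range .Todos}}<li>{{.}}</li>{{end}}</ul>`

	// Parse the template, then render it against a struct.
	t := template.Must(template.New("todos").Parse(page))
	data := struct{ Todos []string }{Todos: []string{"write post", "publish"}}
	if err := t.Execute(os.Stdout, data); err != nil {
		log.Fatal(err)
	}
	// Prints: <ul><li>write post</li><li>publish</li></ul>
}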

Here is an example template:

<html>
    <head>
        <title>My Template</title>
    </head>
    <body>
            {{range .Players}}
                <div id="playerList">
                    {{.Rank}} - {{.Name}}
                </div>
            {{end}}
    </body>
</html>

Serving the Pages Via Gin

Now that I have a template, how do I serve it via Gin? First, create a directory named templates and save the above example as index.tmpl. Data for the template is passed in as a Go value (in the example below, a slice of structs wrapped in gin.H), and the template files must be loaded using LoadHTMLGlob() or LoadHTMLFiles().

Here is an example of how to serve an HTML file using Gin:

package main

import (
	"net/http"

	"github.com/gin-gonic/gin"
)

func main() {
	router := gin.Default()
	router.LoadHTMLGlob("templates/*")

	data := []struct {
		Rank string
		Name string
	}{
		{"1", "Michael Jordan"},
		{"2", "Larry Bird"},
	}

	router.GET("/index", func(c *gin.Context) {
		c.HTML(http.StatusOK, "index.tmpl", gin.H{
			"Players": data,
		})
	})

	router.Run(":8080")
}

This code reads the templates from the templates directory, creates a slice with the required data, and renders the template with that data. When Gin runs in debug mode, it re-parses the templates on every request, so template changes show up without restarting the server. Templates can also use CSS and JavaScript if you want to get fancier.

This is a simple workaround to avoid having to create a full-blown React app and shouldn’t be seen as a long-term solution.

I’m working on a new project. I can’t really disclose a lot, but I can tell you it’s going to be based on Golang/React and I’m hoping to scale it to handle at least 100k hits per day. Because I’m a cheap bastard, I’m going to try to spend as little money as possible. I plan to document the entire process here.

Technologies Used

  • Golang
  • Docker
  • SQLite
  • React

3rd Party Services Used

  • Oracle Cloud Infrastructure/Google Cloud
    • Hosting the Golang backend
  • OCI NoSQL database/Firebase
    • Hosting user generated data
  • Supabase
    • User authentication
  • PostHog
    • Analytics
  • Vercel
    • Hosting the React Frontend

Why SQLite?

The project that I’m working on relies on a lot of static data. By storing this data in SQLite, I’m decreasing the need to make additional network calls to a remote database. The initial data would consist of about 50,000 rows in the SQLite database, which would take up about 5-6 megabytes. Each database would be deployed in a Docker container alongside the Golang application.

SQLite can easily handle 100k HTTP requests per day (the SQLite website has a good page on appropriate uses for it). I’m enabling WAL mode (Write-Ahead Logging) to make the database more performant, since the database will only be accessed from a single host.
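
Here is a minimal sketch of how WAL mode can be enabled at startup, again assuming the go-sqlite3 driver; the database file name is illustrative:

package main

import (
	"database/sql"
	"log"

	_ "github.com/mattn/go-sqlite3"
)

func main() {
	db, err := sql.Open("sqlite3", "./static-data.db")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	// Switch the journal to Write-Ahead Logging for better read concurrency
	// while a writer is active.
	if _, err := db.Exec("PRAGMA journal_mode=WAL;"); err != nil {
		log.Fatal(err)
	}
}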

Feel free to comment or reach out to me on Bluesky (https://bsky.app/profile/rooseveltanderson.bsky.social)

Anyone who knows me knows that I’m not a fan of Agile. Most people ask me why, and the answer is always the same: most companies don’t do it correctly. After talking to people in the industry, that seems to be the general consensus. Most people don’t realize that Agile is not agile at all in its implementation. For Agile to work, you have to conform very strictly to its core principles. You would not believe how many times I’ve heard people say “We’re going to do Agile, but we’re going to do it our way.” The moment that phrase is uttered is the moment you lose your software developers/engineers.

I’ve been speaking in generalities, but here are the three big reasons why Agile development goes off the rails.

Most Companies Don’t Employ Proper QA Resources

Do you know the difference between a QA engineer and a software tester? I wouldn’t worry if you don’t, because most people don’t know. QA engineers are the people who write test plans for software, monitor and audit the entire SDLC to make sure processes and standards are being upheld, and make sure that the software meets customer requirements. A software tester implements the test plan that the QA engineer develops. Most people who are QA engineers actually studied computer/information science/engineering in college. Most QA engineers actually write code. They are the people who are supposed to write the integration tests between components authored by different teams.

So what does this have to do with Agile? Without a proper QA engineer and, most importantly, a test plan, software testers have no focus (I know; I was a software tester in college). You wouldn’t believe how many tickets I’ve gotten about text or a button being misaligned while a major part of the software doesn’t work because a scenario wasn’t documented or tested. The general attitude I’ve gotten from software testers is that they don’t take even partial ownership of the software. They think that if software is deployed with errors, it is the sole fault of the developers/engineers. At one company I worked for, the situation got so bad that the development team had to not only write the test plans but also conduct all the testing, because the software testers were so bad.

Most Software Engineers Make Better Scrum Masters Than the Guy You Sent to Training

Most of the time, companies will send a project manager to Scrum Master training. These people study Scrum and Agile for a week and take a test, while most people who studied computer/information science/engineering in college studied the theory of software design for a semester or more. Non-software people don’t understand how software is written. When tasks are being written for a sprint, questions like “How do we test this?” or “What does this task mean?” come up from the Scrum Master a lot. Companies are essentially making people who have no idea about software a key part of the software development process.

Want an easy way to know if the Scrum Master knows what they’re talking about? Pay attention during scrum poker. If the people writing the software come to a consensus and the Scrum Master doesn’t go with it, that’s a big warning bell.

Agile Is a Software Engineering Process; Most People Think It Is a Business Process

Like any other software development methodology, you have to have clearly defined business scope and requirements. The people who come up with the scope and requirements think the “rapid and flexible response to change” or “welcome changing requirements, even late in development” parts of Agile mean they can change the scope of a project in the middle of a sprint or start a sprint without clearly defined scope. Most also don’t recognize when a new business requirement actually changes the entire scope of the project. This will drive your developers/engineers crazy.

Do you feel differently? Let me know in the comments.