Kubernetes at home, next generation, part 1/N: Hardware

I have been running Kubernetes at home since October 2024. That exercise was single-node though, using a (relatively small) part of the Frankenrouter's resources. This post is about the next Kubernetes iteration, or rather its hardware choice. Why did I not want to stick with the kind setup forever? The Frankenrouter hardware (Intel N305), at least officially, supports only 32 GB of RAM. In addition to an OpenWrt LXC container and some native Debian processes, it is packing about 49 containers at the time of writing (give or take a few; this Grafana graph is only an approximation based on unique images on the podman side and pods on the Kubernetes side): ...
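
For the curious, that kind of approximation can be reproduced with something like the script below. This is only an illustrative back-of-the-envelope sketch of "unique images on the podman side plus pods on the Kubernetes side", not the actual Grafana query:

```python
import json
import subprocess


def podman_unique_images() -> int:
    # Count distinct container images among running podman containers.
    out = subprocess.run(
        ["podman", "ps", "--format", "json"],
        capture_output=True, text=True, check=True,
    ).stdout
    return len({c["Image"] for c in json.loads(out)})


def kubernetes_pods() -> int:
    # Count pods across all namespaces on the Kubernetes side.
    out = subprocess.run(
        ["kubectl", "get", "pods", "--all-namespaces", "-o", "json"],
        capture_output=True, text=True, check=True,
    ).stdout
    return len(json.loads(out)["items"])


print(podman_unique_images() + kubernetes_pods())
```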

11.6.2025 · 4 min · 746 words · Markus Stenberg

IPv6 or lack of it (by default), 2025

Our startup (Time Atlas Labs) has had more (physical) addresses than it really should have - counting the pre-company era, we are now in our 3rd office within a year. The networking, in general, has been universally pretty bad, until today. As this is a rant and I don't particularly want to blame any specific ISP, the ISPs are left anonymous. Office 1: Landlord-provided internet. It was slow and unreliable; ‘reset the router’ was the approach to dealing with it. ...

3.6.2025 · 4 min · 690 words · Markus Stenberg

Beer consumption analysis using LLMs

I have been working on a life tracking app since last year. To analyze the data I have logged using it, I queried it for ‘beer in 2025’ and analyzed the results. The dataset itself I will not publish here, but there are three types of relevant data in it (in parentheses, how they are encoded in the Markdown output that I pass to the LLMs):

- Place visits involving beer (e.g. `* 2 hours spent in <insert pub here>`)
- Journal entries mentioning beer (e.g. `I had beer and pizza for lunch`)
- Explicitly counted beer logging (e.g. `- 3 count beer`)

Baseline - shell:

```
egrep 'count beer$' 20250528-beer.md | cut -d ' ' -f 2 | awk '{sum += $1} END {print sum}'
17
```

So the expectation is that the number should be at least 17 beers, but ideally more, as there are some journal entries which mention beer. ...
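
The baseline above is pure shell; the LLM side of the comparison boils down to handing the same Markdown to a model and asking it to count. As an illustration of the shape of such a query (a minimal sketch - the client library, model name and prompt wording here are assumptions, not the exact setup used in the post):

```python
from pathlib import Path

from openai import OpenAI  # assumption: any chat-completion client would do

client = OpenAI()  # expects OPENAI_API_KEY in the environment
markdown = Path("20250528-beer.md").read_text(encoding="utf-8")

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[
        {
            "role": "user",
            "content": (
                "Count how many beers I drank in 2025 based on the log below. "
                "Explicit '- N count beer' lines, pub visits and journal "
                "mentions of beer all count. Reply with a single number.\n\n"
                + markdown
            ),
        }
    ],
)
print(response.choices[0].message.content)  # compare against the baseline of 17
```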

28.5.2025 · 4 min · 726 words · Markus Stenberg

April vibe coding summary

This will be the last post on vibe coding for now, I promise (at least about Google Gemini 2.5 Pro Exp). I did some vibe coding every weekend in April, just to get a change of pace from work (and for science), starting with a ‘what if I could not code’ experiment (not a great success) and finishing with two probably useful tools that I wanted. Last week Google made Gemini 2.5 Pro Exp flash available commercially, and reduced the free input token rate limit per day quite a lot. The new limits are (as of now) a million input tokens and 25 requests per day (no idea about output tokens). The single-request maximum size is probably still 250k tokens (I hit it a couple of times earlier; not sure if it was reduced, as the most recent project was smaller and I didn't get beyond 100k-token requests). ...

28.4.2025 · 5 min · 862 words · Markus Stenberg

Vibe coding try 2: feat. Gemini 2.5 pro exp

I was not particularly satisfied with my experience of doing fully hands-off vibe coding, but I also wanted to see what I could do if I spent a bit more time thinking and instructing the LLM before hitting the ‘send’ button. So another Sunday spent ‘usefully’. Gemini 2.5 Pro Exp is free(!) (for now). The shocking part is that Gemini 2.5 Pro is currently available in the free tier of Google AI Studio (and to chat with at Gemini). The quota is quite generous - you can do essentially up to 25M tokens per day (a 25-request limit per day with 1M context size - I did not get quite that far, as my requests were <= 100k tokens of context). ...

13.4.2025 · 4 min · 699 words · Markus Stenberg

Aider 0.8.1 and me

I have been using Aider on and off for a couple of months now. I have found its defaults to be pretty bad (at least for me), so I decided to write up how I use it and the configuration I use with it. Note: ‘model’ in this text refers to large language models (LLMs), and more specifically those that are reasonably good at reasoning/coding tasks. Currently I am mainly using Claude 3.7 Sonnet, but the model I use seems to change every month (o3-mini with high reasoning was the one I used last month), and the recent DeepCoder release makes it possible that I will soon try using a local model as my main model again. ...

10.4.2025 · 7 min · 1395 words · Markus Stenberg

Vibe coding try 1 .. spoiler: not great success

Vibe coding has been frequently touted on the internet, and not wanting to feel left out, I spent half a day working on ‘something’ I picked from the depths of my todo list: a Python utility to convert from format X to format Y (the particular formats are not relevant, so they are omitted here - nested data structures with tags and keyword-values). The vision: I decided I wanted to pretend I don't know how to code. So, for the most part, I chose not to write any code myself, but instead to guide (a set of) LLMs to produce what I wanted, mostly just specifying which files I wanted touched and what should be done to them. ...

6.4.2025 · 6 min · 1119 words · Markus Stenberg

Why structured logging is the thing

When I wrote the first iteration of the Lixie tool about a year ago (early 2024), my idea was to identify which logs were boring (most of them), interesting (very few of them) and unknown (not yet classified). At the time I chose not to use ‘AI’ (LLMs) for it, and I am still not that convinced they are the best way to approach that particular problem. Ultimately it boils down to this: human judgment of what is useful is much more realistic (at least in my context) than what the LLMs ‘know’ (absent fine-tuning and/or extensive example sets, which by definition I do not have for my personal logs). Once I had chosen not to use LLMs for it, it became just a matching exercise: structured log messages against an ordered set of rules. ...
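
To make the idea concrete, the matching boils down to something like the following (a minimal illustrative sketch of ordered rules over structured log fields, not Lixie's actual code; the field names and rule format are made up):

```python
from dataclasses import dataclass


@dataclass
class Rule:
    match: dict[str, str]  # field values that must all be present in the record
    verdict: str           # "boring" or "interesting"


def classify(record: dict[str, str], rules: list[Rule]) -> str:
    # Rules are evaluated in order; the first matching rule wins.
    for rule in rules:
        if all(record.get(key) == value for key, value in rule.match.items()):
            return rule.verdict
    # Anything no rule matches is left for a human to classify later.
    return "unknown"


rules = [
    Rule({"app": "sshd", "msg": "Connection closed"}, "boring"),
    Rule({"level": "error"}, "interesting"),
]

print(classify({"app": "sshd", "msg": "Connection closed"}, rules))  # boring
print(classify({"app": "nginx", "level": "info"}, rules))            # unknown
```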

1.4.2025 · 4 min · 772 words · Markus Stenberg

How I write notes.. and why?

Over time, my ways of writing notes have evolved. I think writing things down helps me both retain an extended working memory of things I have done over time and process them (sometimes much, much later). I write this blog mainly just to organize my own thoughts too, as opposed to actually writing for an audience (by design, I keep no statistics on readers, and I do not particularly want to interact with the hypothetical reader or two who might stumble here, sorry - I believe most of the visitors are AI scraper bots anyway). ...

30.3.2025 · 6 min · 1123 words · Markus Stenberg

From Hue (back) to Home Assistant

Background: I think I wrote about this in some earlier blog post too, but I have used various home automation solutions for a while now. I started out with very, very early Home Assistant builds - not quite sure when, but I contributed a little to it in 2014 at least (based on git log). Later in 2014 I started developing my own solution with a somewhat different, decentralized model (GitHub - fingon/kodin-henki: ‘Spirit of home’ - my home automation project written in Python 2/3), which I used for about 5 years and then switched to the much less featureful, but also much less maintenance-requiring, Philips Hue system. ...

8.2.2025 · 6 min · 1154 words · Markus Stenberg