Monday, August 11, 2025

Idea for later: write packages of agent prompts.

https://github.com/parruda/claude-swarm?tab=readme-ov-file#multi-level-swarm-example + pre-made prompts describing the context of what an agent does.

bundle add 'accessibility_prompts'? To generate templates/agents.md?
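Roughly what I have in mind, as a minimal sketch: a hypothetical 'accessibility_prompts' gem that ships pre-made prompts and appends them to agents.md, so a swarm config (e.g. claude-swarm's YAML) can point at agents whose context is already written. The module, prompt names and file layout here are all made up.

```ruby
# Hypothetical sketch only: a gem that ships pre-made agent prompts and
# writes them into a project's agents.md. Not a real gem or API.
module AccessibilityPrompts
  # Pre-baked prompt fragments describing what each agent does (examples only).
  PROMPTS = {
    "axe_auditor"   => "You audit rendered HTML for WCAG 2.2 AA issues and report them as a checklist.",
    "copy_reviewer" => "You rewrite UI copy to a plain-language reading level and flag ambiguous labels."
  }.freeze

  # Append the packaged prompts to agents.md so a swarm definition can reference them by name.
  def self.install(into: "agents.md")
    File.open(into, "a") do |f|
      PROMPTS.each do |name, prompt|
        f.puts("## #{name}", prompt, "")
      end
    end
  end
end

AccessibilityPrompts.install if $PROGRAM_NAME == __FILE__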

Saturday, August 09, 2025

Things to go vibe code: Food Crawl Agent

Feature Idea: “Cafe Crawl” Web App with Reminders

A simple web app that pulls cafe, brunch spot, and restaurant info from OpenStreetMap, then turns it into a running list of places to try. The app keeps the list fresh, sends you “hey, try this place!” reminders, and lets you track whether a spot is worth going back to or not.

Saturday morning: you get a ping – "Your cafe pick this week: The Green Bean Cafe – 2km away." You head there, try it out, and tap "Loved it" when you get home. Next week, the app suggests somewhere new.


I officially regret throwing that idea into ChatGPT to flesh out.
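The data half is small, at least. A minimal sketch, assuming the Overpass API and a hard-coded home location; the reminder scheduling and the "Loved it" tracking are left out.

```ruby
# Sketch: pull nearby cafes/restaurants from OpenStreetMap via Overpass
# and pick one to suggest. Home coordinates are just an example.
require "net/http"
require "json"
require "uri"

HOME = { lat: -34.9285, lon: 138.6007 } # example: Adelaide CBD

query = <<~OVERPASS
  [out:json][timeout:25];
  node["amenity"~"cafe|restaurant"](around:2000,#{HOME[:lat]},#{HOME[:lon]});
  out body;
OVERPASS

uri = URI("https://overpass-api.de/api/interpreter")
response = Net::HTTP.post_form(uri, "data" => query)
places = JSON.parse(response.body)["elements"]
             .map { |e| e.dig("tags", "name") }
             .compact

# The reminder half would live in a scheduler; here we just pick this week's spot.
puts "Your cafe pick this week: #{places.sample}" unless places.empty?
```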

Recipes: the Gantt chart

Mealie's current 'cook mode' leaves a lot of room for improvement - they removed timers just because they wouldn't work on some devices, and they can't be stuffed reviewing UX improvements to it.



My partner's idea for overhauling this - fix recipes by plugging them into a Gantt chart structure, so dependencies are mapped between steps.
Add timing.
Have a big iPad or tablet showing the various cooking timers as they run, reading out steps.
To configure, use a UX similar to music programming software.


Recipe steps are often badly written. Use mealie to extract the JSON-LD, then use AI to edit the recipe and split any steps where multiple activities are occurring. Use AI to then attempt to infer the dependencies, so there's a lot less manual labour.
Where timing is unknown, fall back to a repository of techniques and steps with average times - e.g. cook steak for 7 minutes, or cook a roast at 45 minutes per 500g at 220°C. Add a confidence scale to each step for your environment.
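To make the Gantt idea concrete, a small sketch with made-up steps, durations and dependencies standing in for the extracted and AI-annotated data:

```ruby
# Sketch only: recipe steps as a tiny dependency graph (the Gantt idea).
# Names, minutes and deps are invented; in practice they'd come from
# mealie's JSON-LD plus the AI splitting/dependency pass described above.
Step = Struct.new(:name, :minutes, :deps)

steps = [
  Step.new("boil water",        10, []),
  Step.new("chop vegetables",    8, []),
  Step.new("cook pasta",        11, ["boil water"]),
  Step.new("fry vegetables",     9, ["chop vegetables"]),
  Step.new("combine and serve",  3, ["cook pasta", "fry vegetables"])
]

# Forward pass: each step starts once all of its dependencies have finished.
finish = {}
steps.each do |s|
  start = s.deps.map { |d| finish.fetch(d) }.max || 0
  finish[s.name] = start + s.minutes
  puts format("%-18s start %2d min, done %2d min", s.name, start, finish[s.name])
end
puts "Dinner ready at #{finish.values.max} minutes"
```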

Add references for common questions like "what should my eggs look like?"
Video timestamps from YouTube based on transcript?

Anti Stalebot

I remember lazyweb, and was looking for a stream of inspiration.
I found a more modern version in https://github.com/h5bp/lazyweb-requests

For some stupid reason, they switched on Stalebot. This nuked the utility of the project by stifling conversation and ideas. Then they archived the project, closing off all conversation entirely.

It is as though they did not understand the point of the very thing they created.

Talking this through with my partner; she hit on the idea of a summarisation tool that just vibe codes the idea as a last gasp, whenever Stalebot comments.

Some things will be terrible, with useless input. But for the good ideas in the world that are about to die because some maintainer didn't think about how people work, or how to have an attention span more than a few weeks wide, it could refresh a project or provide fertile ground for a fork.
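A loose sketch of the bot half using Octokit; the repo is just the example above, and summarise is a stub where the model call would go:

```ruby
# Loose sketch of the "anti-Stalebot": when an issue gets marked stale, leave a
# last-gasp summary instead of letting the idea die quietly.
require "octokit"

REPO = "h5bp/lazyweb-requests" # example repo from above

def summarise(issue)
  # Placeholder: this is where a summarisation / "vibe code it" model call would sit.
  "Before this goes stale: \"#{issue.title}\" in one line, plus a starting point for a fork."
end

client = Octokit::Client.new(access_token: ENV.fetch("GITHUB_TOKEN"))

client.list_issues(REPO, labels: "stale", state: "open").each do |issue|
  client.add_comment(REPO, issue.number, summarise(issue))
end
```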


Friday, June 20, 2025

AI persona capybara tests

What does it look like when we move from scripted, mostly deterministic capybara/selenium tests to a fuzzing mindset coupled with AI personas?

The idea of a user persona as a design tool has existed for some time, but I have rarely found them to be durable, long-term aids beyond "job role" in a BDD context.

But we have had game "AI" for some time, notably in things like Rimworld where a given persona has agency - it may still be limited, but in any given situation you can rely on it to maximise X or minimise Y.

What does it look like for A/B testing with synthetic personas? What does it look like when your end to end test has an AI agent that is pretending it has the cognition of a 70 year old?

Is there value here?
Or is this, like the hand crafted personas, just fluff?

For the most part - for the deterministic, repeatable results - you don't need AI so much as you just want a set of words with enough weighting.

Picture these basic agents.
The optimist:
Just keeps clicking yes, proceed, next, I agree!
Never fills in a form until an error demands it.
If the text has a positive sentiment, they are on board.

The pessimist:
The inverse of the above 

The cheapskate:
Looks for the second lowest 'price' on a page and focuses on that for purchase.


All seem buildable.
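A rough cut of the optimist as a Capybara driver - no model at all, just the weighted word list idea from above. Selectors and weights are purely illustrative.

```ruby
# Sketch of "the optimist" persona: scores clickable elements by a positive
# word list and always takes the most agreeable option on the page.
require "capybara/dsl"

class OptimistPersona
  include Capybara::DSL

  POSITIVE = { "yes" => 3, "agree" => 3, "proceed" => 2, "next" => 2, "continue" => 2 }

  def step!
    candidates = all("button, a")
    scored = candidates.map { |el| [el, score(el.text.to_s.downcase)] }
    best = scored.max_by { |_el, s| s }
    best.first.click if best && best.last.positive?
  end

  private

  # Crude sentiment: sum the weights of any positive words in the label.
  def score(label)
    POSITIVE.sum { |word, weight| label.include?(word) ? weight : 0 }
  end
end

# Usage, inside an existing Capybara session:
#   persona = OptimistPersona.new
#   visit "/signup"
#   10.times { persona.step! }
```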

Monday, May 26, 2025

Anarcho-Guerrilla Tactics for Home Improvement

This weekend represented a crucial high point of trust with my partner.
She put down the feed of furniture restoration shorts.
She... asked to touch my chop saw.

What followed was my clumsy, clumsy "don't put your fingers in that", and some precision-cut shelf spacers executed by her while maintaining direct eye contact.

Our spice shelf has never been so organised.

Dear readers, where do we go to next?

Wednesday, April 16, 2025

FHIR, Smart Forms and Data Collection

The AEHRC's smart forms are pretty neat: https://github.com/aehrc/smart-forms


My day gig - a product called CareRight - does a lot of pretty special things with customisable forms and in-clinic follow-up automations based on your answers. But right now we are an island, reaching out to others to communicate - email, SMS, secure messaging - any channel we can find.

What excites me more than the status quo over the next few years is the idea that you can visit a doctor at location A using System B, upload a StructuredDataCapture FHIR questionnaire to your health record or secure messaging, and specialist X with system Y gets an instant understanding of who you are and what you need.

While it may take a little while for these ideas to fully gain traction in the Australian PAS/EMR ecosystem, it feels like it's just around the corner.
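The moving part is pleasingly small. A minimal sketch that POSTs a bare-bones FHIR Questionnaire to a placeholder server; the endpoint and items are made up.

```ruby
# Sketch: build a bare-bones FHIR Questionnaire and POST it to a FHIR server,
# the way an SDC-style form could travel from System B to system Y.
require "net/http"
require "json"
require "uri"

questionnaire = {
  resourceType: "Questionnaire",
  status: "active",
  title: "Pre-visit intake",
  item: [
    { linkId: "1", text: "What brings you in today?",                type: "string"  },
    { linkId: "2", text: "Are you currently taking any medication?", type: "boolean" }
  ]
}

uri = URI("https://fhir.example.org/Questionnaire") # placeholder FHIR base
req = Net::HTTP::Post.new(uri, "Content-Type" => "application/fhir+json")
req.body = questionnaire.to_json

res = Net::HTTP.start(uri.host, uri.port, use_ssl: true) { |http| http.request(req) }
puts res.code
```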

Assembling all of the prices

Did you know: there are only around 800 supermarket chains in the world (notable enough to get into Wikidata), and only about 200-300 of them offer some kind of online shopping capability? Of those, 110 offer schema.org terms.

Sadly, very few of those publish a GTIN (barcode).

Side project for the week has been scraping all of the grocery sites to build an open index of prices; complementary to OpenFoodFacts' open-prices data.
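The scraping core is mostly JSON-LD plumbing. A sketch assuming nokogiri and a placeholder product URL:

```ruby
# Sketch: pull schema.org Product/Offer data out of a product page's JSON-LD
# blocks, keeping the price and GTIN when they're actually published.
require "json"
require "open-uri"
require "nokogiri"

def extract_offers(html)
  doc = Nokogiri::HTML(html)
  doc.css('script[type="application/ld+json"]').flat_map do |node|
    begin
      data = JSON.parse(node.text)
    rescue JSON::ParserError
      next []
    end
    items = data.is_a?(Array) ? data : [data]
    items.select { |item| item["@type"] == "Product" }.map do |product|
      offer = product["offers"]
      offer = offer.first if offer.is_a?(Array)
      offer ||= {}
      {
        name:     product["name"],
        gtin:     product["gtin13"] || product["gtin"],
        price:    offer["price"],
        currency: offer["priceCurrency"]
      }
    end
  end
end

html = URI.open("https://shop.example.com/product/123").read # placeholder URL
pp extract_offers(html)
```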

The ultimate intent is to try and glue it together with ODNC APIs and build a plugin for mealie/home assistant's shopping lists.

Friday, February 21, 2025

Hedwig and the Angry Inch (2025, Adelaide)

My partner and I went off to see Hedwig and the Angry Inch (2025, Adelaide), and we both left fairly sad. It was a surreal experience: there was literally a standing ovation from the crowd, but I've never felt more disengaged or out of touch with a story - I did not understand the gulf between the "more normie" audience experience and mine at all.

It was weird, in that it didn't seem to understand the show's grungey queer burlesque origins; it disneyified the main character, but not as a villain (who I now understand is meant to be relatable only in that they are just as broken as you or I); and I completely missed half of the narrative, having forgotten the salient points of the movie version. But it was a "packed" show with a high ticket price.

I feel very mixed thinking about this versus the raw, authentic feel of seeing, say, sewer rat girl (https://www.gogosiobhan.com/srg) or the many talented baby drag performers doing their craft at now-defunct places like my lover cindi. I don't know what to do with this, because I'm broadly a cis male attempting not to be a dick in the world. But there's fuck all reviews apart from press releases ("wickedly funny", "raw energy"), and I have a vague apprehension this is sucking attention and $ away from far better folx doing way better stuff, still with broad appeal.

If you are looking for something a bit weird but with story and emotion, I strongly suggest you look elsewhere this Fringe; e.g. https://adelaidefringe.com.au/fringetix/mel-mcglensey-is-motorboat-af2025 or

Saturday, February 01, 2025

Tinkering with the SpeechSynthesis API in Mealie, Vue2

I'm fairly happy with Mealie, though every time I put down VueJS for a hot minute I forget about the pain of fighting the data binding behaviours.

Knocked together: https://github.com/mealie-recipes/mealie/pull/4997; and it seems to be one of the few times this API is actually useful. The UX needs a bit more tinkering to allow auto-play or pausing, and potentially speech recognition for various commands.