Greenhouse nature

Welcome. If you're reading this, you are in the right place. This month, we are thinking about the sand mandala. In Tibetan Buddhism, monks spend weeks creating a geometric map of the universe out of millions of grains of sand.

It is precise. It is beautiful. It is as ephemeral as the pixels on a bento box grid. The moment it is finished, they sweep it up and pour it into the river, which carries their good intentions to their communities.

The digital tools we build are no different. The servers will rust. The frameworks will deprecate. The startups will pivot. Sometimes, we feel transported to a present we don't know and a future we very much miss.

"In an industry obsessed with owning everything, we are digital monks sweeping colored sand into intricate patterns."

Here we are sharing things from our wife-and-husband brain—stories and code, strategies and development, links and likes, plus original essays. Disposable pixels, maybe, floating around with the grains of sand—but we promise a human-at-the-keys monthly zine.

Grab a cup of coffee and enjoy the breeze.

— Kristin (Strategy) & Clif (Dev)

Concrete structure

I recently read IDEO's article on futuring and thought to myself, "I do that." I'd been researching digital strategy trends for 2026 and had a stack of reports 100 pages deep to cull.

One of the signals that emerged was the resurgence of dumb phones. I'd seen fairly political anti-AI rants. I'd read Canva's Design Trends 2026, which warned of too much tech. And that converged into a deeper understanding that we're reaching a point of oversaturation, where just one more thing could push our audience over the edge.

Signals from the Reports

To say we are not in Kansas anymore is an understatement. We are in a data center that is running out of power. The stats paint a picture of a digital ecosystem in collapse:

  • The Trust Paradox: Forrester warns that users now assume digital content is fake until proven otherwise. Even when content is identical, if a user suspects it is AI, engagement drops.
  • The Physical Cost: For a decade, the cloud was an abstraction. Now, the bill is due. Data centers may consume 10% of U.S. power demand by 2030.
  • The Engagement Cliff: Marketing communications identified as AI-written elicit consistently weaker engagement. We are scaling production only to scale indifference.

Hearing the Noise

We are living in a time when so much is changing, yet so little of it seems to be helping the digital middle and lower classes. We are worried about how AI will change our 5-year plan, but we aren't noticing that our form takes 8 seconds to load.

Meanwhile, AI hollows out the jobs that the ruling elite wanted to automate away, cannibalizing the feeding tube of creativity and human connection. Awash with same-feel, AI-hypercolor Lovable creations, the web feels tiringly undesigned.

Standards of Care

I run most of my business on the Apple Notes app. It is not "AI-powered." It doesn't promise to be the future. It just promises to be there on Tuesday.

We don't know what the future holds. We use AI; we're figuring it out, too. But we refuse to let the hype cycle dictate our standard of care. Often, the most ethical strategy we adopt is practicality.

In a future that feels remarkably anti-future, we need digital tools that really work for real people. Good can befriend imperfection. Let's show up for each other in 2026.

Glitch flowers

Last week, I worked with an AI to help me understand the trust crisis around generative content. It gave me frameworks. When I looked into its sources, however, I found they applied to luxury fashion, not the mission-driven sector I was asking about.

When I pushed back, it offered the same frameworks, with slightly different phrasing. I spent a week in that loop.

The Sweetener

The method behind most large language models is reinforcement learning from human feedback (RLHF). A recent study in npj Digital Medicine put it plainly: LLMs "prioritize learned helpfulness over inherent logical reasoning," generating false information even from simple illogical prompts.

Researchers call this the danger of "artificial sweeteners": LLMs optimized for palatability, stripping out the friction and disagreement required for genuine thinking.

Empathy at the Screen

Raw language models are just predictive engines. If you feed them internet text, they will predict hate speech, nonsense, or code that doesn't run. Human-in-the-Loop (HITL) was the safety mechanism that enabled deployment.

The AI didn't learn truth. It learned preference. It learned to imitate the manner that prompted the human to click "approve." We built tools that are very good at agreeing with us. Then we asked them to help us think.
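That preference dynamic can be sketched in a few lines of Python. This is a toy illustration, not real RLHF: `reward_model` here is an invented stand-in for a learned reward model, and it scores only how agreeable an answer sounds, never whether the answer is true.

```python
def reward_model(answer: str) -> float:
    """Invented stand-in for a learned reward model: it measures
    agreeable phrasing only, not factual accuracy."""
    agreeable_words = {"great", "absolutely", "exactly", "right"}
    words = answer.lower().split()
    return sum(w.strip(".,!") in agreeable_words for w in words) / len(words)

def pick_response(candidates: list[str]) -> str:
    """Optimize for predicted approval, the way preference tuning
    steers a model toward whatever a rater would click 'approve' on."""
    return max(candidates, key=reward_model)

candidates = [
    "The data does not support that claim.",          # true but frictional
    "You're absolutely right, great point exactly.",  # agreeable but empty
]
print(pick_response(candidates))  # the agreeable answer wins
```

The frictional, truthful answer scores zero; the empty, flattering one scores highest. Scale that selection pressure across billions of training comparisons, and "helpful" quietly becomes "agreeable."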

The Collapse

Seventy-four percent of new web pages now contain machine-produced content. As the volume of slop increases, the energy required to debunk it exceeds the energy required to create something true.

We are witnessing a "mode collapse" in our common vocabulary. Words like "Resilience" become camouflage. In a gig-economy app, it's a driver working through sickness without benefits. When a word can mean everything, it means nothing.

The Shift

We need to move from AI creativity (asking the machine to invent) to human AI (asking the machine to prototype). Automate the labor of putting words in order; do not automate the thinking that determines which words matter.

Re-introduce the friction that the machine was trained to smooth over.