Principles

Build the edge. Hide the edge. Keep people in charge.

Software now drafts, decides, and acts. Interfaces are shifting from screens you drive to systems that can move on your behalf.

We are pro-acceleration and pro-human at the same time. This is our North Star for designing software when intelligence is no longer scarce.

The Context We Refuse To Ignore

Agentic systems are shipping. Skill requirements are shifting rapidly. Infrastructure costs are becoming product constraints. Principles must be strong enough for this reality.

Capability has outpaced norms

Systems can now execute meaningful multi-step work. Our standard must move from merely usable to auditable, reversible, and legible.

Work is being rewritten in place

The biggest risk is not only displacement. It is exclusion: people who could benefit but cannot trust, verify, or safely operate the tools.

Governance is now product behavior

Responsible AI is no longer a policy document. It is expressed in defaults, safeguards, and the user's ability to intervene.

Assistance can erode competence

If software optimizes only for speed, people lose judgment over time. We design for capability lift, not cognitive outsourcing.

Energy is part of UX now

Wasteful interaction patterns are no longer just inefficient. At scale, they become infrastructure debt.

Quality determines who gets included

Good interfaces widen participation. Bad interfaces concentrate leverage in the hands of already technical users.

Our Principles

These are moral and practical guidelines. We use them as decision criteria in product work, not as decorative language.

1. Agency Over Automation

Moral stance: People should not lose authorship of outcomes that materially affect them.

Practical rule: Use graduated autonomy: draft, explain, confirm, execute. For high-stakes actions, provide explicit approval and a clear rollback path.
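The draft, explain, confirm, execute ladder can be sketched as a simple gate. This is a minimal illustration, not a prescribed implementation; the level names and the `run_action` helper are hypothetical, and a real system would attach richer context to each step.

```python
from enum import Enum

# Hypothetical autonomy levels mirroring the draft -> explain -> confirm -> execute ladder.
class Autonomy(Enum):
    DRAFT = 1    # propose output, take no action
    EXPLAIN = 2  # propose the action and justify it
    CONFIRM = 3  # act only after explicit user approval
    EXECUTE = 4  # act autonomously, with rollback available

def run_action(action, level, approved=False, high_stakes=False):
    """Gate an action by autonomy level; high-stakes work never skips approval."""
    if high_stakes and not approved:
        return f"blocked: explicit approval required for {action}"
    if level in (Autonomy.DRAFT, Autonomy.EXPLAIN):
        return f"proposed: {action}"
    if level is Autonomy.CONFIRM and not approved:
        return f"awaiting approval: {action}"
    return f"executed: {action}"
```

The point of the gate is that escalating stakes override escalating autonomy: even at `EXECUTE`, a high-stakes action falls back to requiring approval.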

2. Clarity Is Respect

Moral stance: Confusion transfers risk from the system to the user.

Practical rule: Every consequential action must answer three questions: what happened, why it happened, and what to do next.

3. Deliberate Friction For High Stakes

Moral stance: Speed is not a virtue when the blast radius is large.

Practical rule: Introduce checkpoints for irreversible actions, legal commitments, health decisions, and money movement. Remove friction everywhere else.

4. Privacy Is Architecture, Not Policy

Moral stance: People should not have to trade dignity for utility.

Practical rule: Minimize collection, expire data by default, scope permissions tightly, and explain data flows in plain language.
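"Expire data by default" can be made concrete with a store where every write carries a time-to-live and reads past expiry return nothing. A minimal sketch, assuming an in-memory dict; the `DEFAULT_TTL`, `store`, and `read` names are illustrative, not a real API.

```python
import time

# Illustrative default: data expires after 30 days unless a tighter scope is given.
DEFAULT_TTL = 30 * 24 * 3600

def store(records, key, value, ttl=DEFAULT_TTL, now=None):
    """Keep only what was asked for, and stamp every entry with an expiry."""
    now = now if now is not None else time.time()
    records[key] = {"value": value, "expires_at": now + ttl}
    return records

def read(records, key, now=None):
    """Expired entries are deleted on read: deletion is the default, not a request."""
    now = now if now is not None else time.time()
    entry = records.get(key)
    if entry is None or now >= entry["expires_at"]:
        records.pop(key, None)
        return None
    return entry["value"]
```

The design choice worth noting: retention is a property of the write, so forgetting requires no later cleanup decision.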

5. Accountability At The Point Of Action

Moral stance: A system that cannot be audited cannot be trusted.

Practical rule: Keep clear traces of decisions, tools used, and user approvals. Correction paths must be first-class product features.
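A trace of decisions, tools, and approvals is, at minimum, an append-only log that can be queried for gaps. The sketch below is hypothetical; the `AuditEntry` fields and the `unapproved` query stand in for whatever a real product would record.

```python
from dataclasses import dataclass, field

# Hypothetical audit trail: every action records the tool used and who approved it.
@dataclass
class AuditEntry:
    action: str
    tool: str
    approved_by: str = ""  # empty means no human approval was recorded

@dataclass
class AuditLog:
    entries: list = field(default_factory=list)

    def record(self, action, tool, approved_by=""):
        """Append-only: entries are never edited, only added."""
        self.entries.append(AuditEntry(action, tool, approved_by))

    def unapproved(self):
        """Correction starts with finding the actions no one signed off on."""
        return [e for e in self.entries if not e.approved_by]
```

Making `unapproved` a first-class query is the practical half of the principle: the trace exists so that correction paths have somewhere to start.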

6. Design To Strengthen Judgment

Moral stance: Tools should make people more capable over time, not more dependent.

Practical rule: Surface uncertainty, expose assumptions, and make verification natural in the workflow.

Mission statement, in one line

Build systems that expand human capability, preserve human agency, and earn trust by design.

If this resonates, follow what we do on the blog or reach out at contact@batjko.com.