This document describes the ethical position of the ASH project.
Its purpose is to:
- clearly state intent
- acknowledge risks and misuse potential
- set boundaries for development decisions
- communicate responsibility to users and contributors
ASH is a technical project, but it exists in a social and moral context.
ASH is designed to help individuals communicate securely and deliberately in situations where confidentiality matters and trust in infrastructure is limited.
Examples include:
- sensitive personal conversations
- journalistic source protection
- whistleblowing with trusted parties
- human rights work
- personal safety scenarios
ASH is not designed for casual or mass communication.
We acknowledge that any strong privacy or security tool can be misused.
Potential misuse includes, but is not limited to:
- coordination of criminal activity
- harassment or abuse
- concealment of harmful behavior
These risks are real and must be stated explicitly.
ASH deliberately sets boundaries to reduce misuse and harm:
- No anonymity guarantees at the network level
- No mass communication features
- No group chats
- No broadcast channels
- No automation or background syncing
- No stealth or hidden operation modes
- No plausible deniability guarantees
ASH favors deliberate, visible, human-driven interaction over convenience or scale.
ASH operates under the following responsibility model:

The project authors are responsible for:
- honest documentation
- accurate security claims
- clear communication of limitations
- refusing features that increase harm

Users are responsible for:
- lawful and ethical use
- understanding the tool’s limitations
- the behavior of conversation participants
ASH does not claim to prevent misuse by malicious participants.
Several design decisions in ASH are motivated explicitly by ethical considerations:

Manual ceremony:
- Prevents casual or large-scale deployment
- Encourages reflection and intent

Ephemeral messaging:
- Reduces long-term data accumulation
- Limits accidental harm from data retention

Human-verifiable security:
- Avoids “black box” security theater
- Empowers users to understand what is happening

Minimal backend trust:
- Reduces central points of control or abuse
The ASH project will not:
- add features designed to evade law enforcement
- provide instructions for illegal activity
- market itself as “untraceable” or “anonymous”
- optimize for scale, virality, or growth
- integrate surveillance evasion techniques
- accept funding that compromises these principles
If a requested feature meaningfully increases the risk of harm, it will not be implemented.
Contributors (human or AI-assisted) are expected to:
- respect the documented scope and threat model
- consider ethical implications of changes
- avoid adding features that increase misuse potential
- raise concerns when ethical boundaries may be crossed
Ethical concerns are treated as first-class design issues, not afterthoughts.
ASH does not claim that technology alone can solve ethical problems.
- Tools can reduce harm, but not eliminate it
- Responsibility ultimately lies with people
- Transparency is more ethical than false guarantees
ASH prefers honest limits over misleading promises.
ASH exists to make careful communication possible, not effortless secrecy.
If at any point the project’s direction conflicts with this ethical position,
the correct action is to stop, reconsider, and possibly not proceed.
Ethics are not a feature; they are a constraint.