Welcome to APL - the world's first hardware-native neurosymbolic programming language! This guide will get you up and running in minutes.
Install via npm:

```bash
npm install @aevov/apl
```

Or add this to your HTML:

```html
<script src="https://cdn.aevov.ai/apl/v1.0.0/apl.bundle.js"></script>
```

Or download the bundle from GitHub Releases.
Create a file hello.js:
```javascript
const APL = require('@aevov/apl');

async function main() {
  const apl = new APL();
  const result = await apl.run(`
    print("Hello from APL!")
  `);
  console.log('Success:', result.success);
}

main();
```

Run it:

```bash
node hello.js
```

For the browser, create hello.html:
```html
<!DOCTYPE html>
<html>
<head>
  <title>APL Hello World</title>
  <script src="apl.bundle.js"></script>
</head>
<body>
  <h1>APL Demo</h1>
  <button onclick="runCode()">Run Code</button>
  <pre id="output"></pre>
  <script>
    async function runCode() {
      const apl = new APL();
      const result = await apl.run(`
        print("Hello from APL!")
      `);
      document.getElementById('output').textContent =
        JSON.stringify(result, null, 2);
    }
  </script>
</body>
</html>
```

APL supports both ASCII and runic characters:
ASCII Version:

```apl
q = Q.super(2)
Q.gate(q, "hadamard")
```

Runic Version:

```apl
q = ᛩ(2)
ᛜ(q, "hadamard")
```

Both compile to identical bytecode!
APL operations map directly to hardware units:

```apl
// Quantum operations → QFU (Quantum Functional Unit)
q = Q.super(4)

// Neural operations → NPU (Neural Processing Unit)
net = N.net(100)

// Genetic operations → GEU (Genetic Evolution Unit)
pop = G.fitness(solutions)
```

Standard programming constructs work as expected:
```apl
// Variables
x = 10
y = 20
z = x + y

// Functions
function add(a, b) {
  return a + b
}
result = add(5, 3)
```

First create an APL instance:

```javascript
const apl = new APL();
```
Quantum example:

```javascript
await apl.run(`
  function create_bell_state() {
    // Create 2-qubit system
    q = Q.super(2)
    // Hadamard on qubit 0
    Q.gate(q, "hadamard", 0)
    // CNOT to entangle
    Q.entangle(q, 0, 1)
    return q
  }

  bell = create_bell_state()
  print("Bell state created!")
`);
```

Neural example:

```javascript
await apl.run(`
  function learn_patterns(data) {
    // Create neural network
    net = N.net(100)
    // Train on each pattern
    for pattern in data {
      match = N.match(net, pattern)
      N.learn(net, match, 0.01)
    }
    return net
  }

  trained = learn_patterns([1, 2, 3, 4, 5])
  print("Network trained!")
`);
```

Genetic example:

```javascript
await apl.run(`
  function evolve(pop, gens) {
    for i in 0..gens {
      fit = G.fitness(pop)
      best = D.dist(fit)
      kids = G.cross(best)
      G.mutate(kids, 0.1)
      pop = D.unify(best, kids)
    }
    return pop
  }

  solution = evolve([1,2,3,4,5], 100)
  print("Evolution complete!")
`);
```

Combining multiple paradigms:
```javascript
await apl.run(`
  function ai_system(input) {
    // Step 1: Quantum preprocessing
    q = Q.super(input.size)
    Q.gate(q, "hadamard")

    // Step 2: Neural processing
    net = N.net(1000)
    patterns = N.match(net, q)

    // Step 3: Symbolic reasoning
    graph = S.graph(patterns)
    inference = S.reason(graph)

    // Step 4: Unify results
    result = D.unify(patterns, inference)
    return result
  }

  output = ai_system({ size: 100 })
  print("AI processing complete!")
`);
```

Create an instance with:

```javascript
const apl = new APL(options)
```

Options:
- `mode`: `'ascii'`, `'runic'`, or `'auto'` (default: `'auto'`)
- `debug`: enable debug logging (default: `false`)
- `hardwareAcceleration`: enable .aevQG∞ hardware (default: `false`, requires license)
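To make the defaults concrete, here is a hypothetical sketch of how user-supplied options might be merged with the documented defaults. The helper `resolveOptions` and its validation rules are illustrative assumptions, not the library's actual internals:

```javascript
// Hypothetical sketch of option handling -- NOT the actual @aevov/apl internals.
// Shows how the documented defaults would combine with user-supplied options.
const DEFAULTS = {
  mode: 'auto',                // 'ascii', 'runic', or 'auto'
  debug: false,                // debug logging off by default
  hardwareAcceleration: false, // requires a .aevQG∞ license when true
};

function resolveOptions(userOptions = {}) {
  // Later properties win, so user options override the defaults.
  const opts = { ...DEFAULTS, ...userOptions };
  if (!['ascii', 'runic', 'auto'].includes(opts.mode)) {
    throw new Error(`Unknown mode: ${opts.mode}`);
  }
  if (opts.hardwareAcceleration && !opts.licenseKey) {
    throw new Error('hardwareAcceleration requires a licenseKey');
  }
  return opts;
}

console.log(resolveOptions({ mode: 'runic' }));
// → { mode: 'runic', debug: false, hardwareAcceleration: false }
```

The spread-merge pattern means any option you omit silently falls back to its default.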
Compile APL source code to bytecode:

```javascript
const compiled = apl.compile(`
  q = Q.super(2)
`);

console.log(compiled.success); // true
console.log(compiled.code);    // bytecode
```

Execute compiled bytecode:

```javascript
const result = await apl.execute(compiled);
```

Compile and execute in one step:
```javascript
const result = await apl.run(`
  print("Hello!")
`);
```

Register a native JavaScript function:

```javascript
apl.registerNative('myFunc', (x, y) => {
  return x * y;
});

await apl.run(`
  result = myFunc(5, 3)
  print(result) // 15
`);
```

Convert between ASCII and runic:
```javascript
const runic = apl.toRunic('Q.super(2)');
console.log(runic); // 'ᛩ(2)'

const ascii = apl.toAscii('ᛩ(2)');
console.log(ascii); // 'Q.super(2)'
```

| Category | ASCII | Runic | Description |
|---|---|---|---|
| **Quantum** | | | |
| Superposition | Q.super | ᛩ | Create quantum state |
| Gate | Q.gate | ᛜ | Apply quantum gate |
| Entangle | Q.entangle | ᙠ | Entangle qubits |
| **Neural** | | | |
| Network | N.net | ᚾ | Create neural net |
| Match | N.match | ᛈ | Pattern match |
| Learn | N.learn | ᚻ | Learning rule |
| **Genetic** | | | |
| Fitness | G.fitness | ᚠ | Evaluate fitness |
| Crossover | G.cross | ᚴ | Crossover operation |
| Mutate | G.mutate | ᚥ | Apply mutation |
| **Symbolic** | | | |
| Graph | S.graph | ᛕ | Knowledge graph |
| Reason | S.reason | ᛊ | Logical reasoning |
| **Coordination** | | | |
| Distribute | D.dist | ᛞ | Distribute work |
| Unify | D.unify | ᚢ | Unify results |
| Bind | D.bind | ᛂ | Bind values |
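Since the two dialects are a token-for-token correspondence, the table can be captured as a simple lookup. Here is a hypothetical plain-JavaScript sketch of what `apl.toRunic` / `apl.toAscii` might do under the hood; the real implementation may well differ:

```javascript
// Hypothetical sketch of the ASCII <-> runic mapping from the table above.
// The real apl.toRunic / apl.toAscii implementation may differ.
const ASCII_TO_RUNIC = {
  'Q.super': 'ᛩ', 'Q.gate': 'ᛜ', 'Q.entangle': 'ᙠ',
  'N.net': 'ᚾ', 'N.match': 'ᛈ', 'N.learn': 'ᚻ',
  'G.fitness': 'ᚠ', 'G.cross': 'ᚴ', 'G.mutate': 'ᚥ',
  'S.graph': 'ᛕ', 'S.reason': 'ᛊ',
  'D.dist': 'ᛞ', 'D.unify': 'ᚢ', 'D.bind': 'ᛂ',
};
const RUNIC_TO_ASCII = Object.fromEntries(
  Object.entries(ASCII_TO_RUNIC).map(([ascii, rune]) => [rune, ascii])
);

function toRunic(source) {
  // Replace longer operator names first so e.g. "Q.gate" is matched whole.
  return Object.keys(ASCII_TO_RUNIC)
    .sort((a, b) => b.length - a.length)
    .reduce((src, op) => src.split(op).join(ASCII_TO_RUNIC[op]), source);
}

function toAscii(source) {
  // Each rune is unique, so a direct substitution suffices.
  return Object.keys(RUNIC_TO_ASCII)
    .reduce((src, rune) => src.split(rune).join(RUNIC_TO_ASCII[rune]), source);
}

console.log(toRunic('q = Q.super(2)'));    // → 'q = ᛩ(2)'
console.log(toAscii('ᛜ(q, "hadamard")')); // → 'Q.gate(q, "hadamard")'
```

Because each ASCII operator maps to exactly one rune, the conversion round-trips cleanly in both directions.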
- Explore Examples: Check the examples/ directory
- Read Documentation: Visit docs.apl.aevov.ai
- Join Discord: Get help at discord.gg/apl
- Try Hardware: License .aevQG∞ for production performance
APL runs on any standard hardware (x86, ARM, RISC-V) using software simulation. For production workloads requiring 100-1000x speedup, license the .aevQG∞ hardware:
```javascript
const apl = new APL({
  hardwareAcceleration: true,
  licenseKey: 'your-license-key'
});
```

Contact hardware@aevov.ai for licensing.
Make sure you've installed APL:

```bash
npm install @aevov/apl
```

If runic characters don't display, ensure your editor/terminal supports UTF-8 encoding.

Hardware operations are simulated in software by default. For actual hardware acceleration, a .aevQG∞ license is required.
- Check FAQ
- Ask on Discord
- Open GitHub Issue
Happy Coding! ⚡