Faris Allafi

ā˜€ļø 13 y/o indie researcher | Non-autoregressive models & blockchains šŸ’» ✨

About

I'm Faris, a 13-year-old researcher building AI and blockchains from first principles. I created DIMBA, a diffusion language model built on Mamba-2 that generates text in parallel rather than token by token. I'm also building Ghost, a Substrate blockchain with hybrid PoW/PoS consensus.

I believe current AI architectures aren't the endgame — they're just where we started. My work focuses on O(n) alternatives to attention and questioning fundamentals.

Based in Abu Dhabi. Currently exploring Hamiltonian mechanics, vintage computing, and why parallel generation is the future.

Current Focus

DIMBA (Diffusion Mamba)

DIMBA is a non-autoregressive architecture that fuses a cosine-scheduled diffusion process with a Mamba-2 state-space backbone to generate whole sequences in parallel. Replacing sequential decoding with iterative refinement cuts inference latency while preserving semantic coherence, and because each refinement pass is a single linear-time Mamba-2 forward, the number of passes gives a controllable trade-off between inference speed and output quality.
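
For intuition, here is a minimal Python/PyTorch sketch of cosine-scheduled iterative refinement in the MaskGIT style. It illustrates the general technique, not DIMBA's actual code: the denoiser stand-in, step count, and mask token are all illustrative assumptions.

    import math
    import torch

    def cosine_mask_ratio(t: float) -> float:
        """Fraction of tokens still masked at progress t in [0, 1] (cosine schedule)."""
        return math.cos(0.5 * math.pi * t)

    @torch.no_grad()
    def generate(denoiser, seq_len: int, steps: int = 8, mask_id: int = 0):
        """Parallel generation by iterative unmasking.

        `denoiser` is any model mapping token ids (B, L) -> logits (B, L, V);
        with a Mamba-2 backbone each call is O(L) rather than attention's O(L^2).
        """
        x = torch.full((1, seq_len), mask_id, dtype=torch.long)  # start fully masked
        for i in range(steps):
            logits = denoiser(x)                    # predict every position at once
            conf, pred = logits.softmax(-1).max(-1)
            x = pred                                # commit all predictions...
            n_mask = int(cosine_mask_ratio((i + 1) / steps) * seq_len)
            if n_mask > 0:                          # ...then re-mask the least confident,
                worst = conf.topk(n_mask, largest=False).indices
                x[0, worst[0]] = mask_id            # which get refined next step
        return x

    # Toy usage with a random stand-in for a trained model:
    toy = lambda ids: torch.randn(ids.shape[0], ids.shape[1], 1000)
    print(generate(toy, seq_len=16, steps=4))

Each pass predicts the entire sequence at once; the cosine schedule only controls how aggressively tokens are finalized, which is where the speed/quality dial comes from.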

Ghost Blockchain

Ghost is a custom Layer-1 blockchain built from scratch on Substrate. It runs hybrid PoW/PoS consensus with Entropy-Steered Consensus (ESC).
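
The page doesn't spell out how ESC works, so as background only, here is a toy Python sketch of a generic hybrid PoW/PoS acceptance rule (in the spirit of designs like Decred): a block must carry both a valid proof-of-work and a quorum of stake votes. All names and thresholds below are hypothetical.

    import hashlib
    from dataclasses import dataclass

    @dataclass
    class Block:
        parent: bytes
        payload: bytes
        nonce: int
        stake_votes: int   # approving votes from staked validators

    POW_TARGET = 2 ** 240  # difficulty threshold (illustrative)
    QUORUM = 3             # stake votes needed to accept a block

    def pow_valid(block: Block) -> bool:
        """Miner side: the block hash must fall below the difficulty target."""
        h = hashlib.sha256(block.parent + block.payload + block.nonce.to_bytes(8, "big"))
        return int.from_bytes(h.digest(), "big") < POW_TARGET

    def accept(block: Block) -> bool:
        """Hybrid rule: proof-of-work alone is not enough, and neither is stake."""
        return pow_valid(block) and block.stake_votes >= QUORUM

    b = Block(parent=b"\x00" * 32, payload=b"tx-batch", nonce=42, stake_votes=3)
    print(accept(b))  # almost certainly False: mining means searching for a valid nonce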

Philosophy

01

Architecture First

I question the frameworks themselves. If attention is O(n²), maybe we don't need attention at all.

02

Build to Deploy

Research means nothing if it stays in a notebook. Target real hardware constraints.

03

Deep Over Wide

Sit with a problem until the structure reveals itself. Understand one thing completely.

The Journey

2024 — The Realization

Asking why language models generate one token at a time. The start of the non-autoregressive rabbit hole.

2024 — Discovering Mamba

State Space Models as the missing piece for efficient, long-context AI architecture.