Manifund
Faris Allafi

@farisallafi

Independent researcher building non-autoregressive language models and custom blockchains. 13 years old, Dubai-based. Published work on diffusion-based LLMs with Mamba-2 SSMs. Shipping code since age 6.

https://linktr.ee/farisallafi
$0 total balance
$0 charity balance
$0 cash balance

$0 in pending offers

About Me

I'm Faris, a 13-year-old Jordanian-American independent researcher based in the UAE, working at the intersection of deep learning and systems design.

What I do:

I build things that shouldn't work but do. My current focus is DIMBA, a diffusion-based language model that generates text in parallel using Mamba-2 state space models instead of the usual autoregressive transformer approach. The goal is 10-100x faster inference on consumer hardware.
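The parallel-generation idea can be illustrated with a toy iterative-unmasking loop, a common scheme in non-autoregressive diffusion language models. This is only a sketch under assumptions, not DIMBA's actual code: a real model would score each position with Mamba-2 logits, while here random token choices and random confidences stand in.

```python
import random

MASK = "<mask>"
VOCAB = ["the", "cat", "sat", "on", "mat"]  # toy vocabulary

def denoise_step(tokens):
    """One parallel denoising step: predict every masked position at once,
    then commit only the most confident half of the predictions."""
    preds = [(i, random.choice(VOCAB), random.random())  # (pos, token, confidence)
             for i, t in enumerate(tokens) if t == MASK]
    preds.sort(key=lambda p: p[2], reverse=True)
    keep = max(1, len(preds) // 2)  # unmask the top half each round
    out = list(tokens)
    for i, tok, _ in preds[:keep]:
        out[i] = tok
    return out

def generate(length, steps=8):
    """Start fully masked, then iteratively denoise all positions in parallel."""
    tokens = [MASK] * length
    for _ in range(steps):
        if MASK not in tokens:
            break
        tokens = denoise_step(tokens)
    return tokens
```

Because each step fills in the most confident half of the remaining masks, a length-n sequence finishes in roughly log2(n) parallel steps rather than n sequential decoding steps, which is where the speedup over autoregressive generation comes from.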

Background:

- Started coding at 6, got serious about ML/blockchain a few years ago

- Building the Ghost blockchain in Rust on Substrate, with a custom hybrid PoW/PoS consensus and post-quantum signatures

- Published research on DIMBA architecture and theoretical framework for ultra-fast inference

- Active in ML research communities, contribute to open source tooling
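To make the hybrid PoW/PoS idea mentioned above concrete, here is a minimal toy sketch: even slots require a valid proof-of-work nonce, odd slots require the block author to be the stake-weighted elected validator. Ghost's actual consensus is written in Rust on Substrate; every name and rule below is a hypothetical illustration of the general pattern, not the project's real protocol.

```python
import hashlib
import random

POW_DIFFICULTY = 2  # required number of leading zero hex digits (toy setting)

def pow_valid(header: bytes, nonce: int, difficulty: int = POW_DIFFICULTY) -> bool:
    """Check a proof-of-work solution: sha256(header || nonce) must start
    with `difficulty` zero hex digits."""
    digest = hashlib.sha256(header + nonce.to_bytes(8, "big")).hexdigest()
    return digest.startswith("0" * difficulty)

def mine(header: bytes, difficulty: int = POW_DIFFICULTY) -> int:
    """Brute-force search for a nonce that satisfies the PoW check."""
    nonce = 0
    while not pow_valid(header, nonce, difficulty):
        nonce += 1
    return nonce

def select_pos_validator(stakes: dict, seed: int) -> str:
    """Stake-weighted pseudo-random election (a toy stand-in for a VRF)."""
    rng = random.Random(seed)
    pick = rng.randrange(sum(stakes.values()))
    for validator, stake in sorted(stakes.items()):
        pick -= stake
        if pick < 0:
            return validator
    raise RuntimeError("unreachable: pick is always within total stake")

def accept_block(slot: int, header: bytes, nonce: int,
                 stakes: dict, author: str) -> bool:
    """Toy hybrid rule: PoW on even slots, PoS election on odd slots."""
    if slot % 2 == 0:
        return pow_valid(header, nonce)
    return author == select_pos_validator(stakes, seed=slot)
```

Interleaving the two mechanisms is one simple way to combine them; real designs (including, presumably, Ghost's) make finer-grained trade-offs between work and stake per block.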

Philosophy:

I believe the most interesting work happens when you question fundamental assumptions. "Language models must be autoregressive" and "you need attention for long-range dependencies" are assumptions worth challenging.

Current status:

Solo researcher, pre-funding, building on nights and weekends. Contributing to open source.

Reach me:

faris.qais.allafi@gmail.com

github.com/devnull37

https://linktr.ee/farisallafi

Projects

DIMBA: Non-Autoregressive Diffusion Language Model with Mamba-2 SSMs

pending admin approval