First diffusion language model inference on real quantum hardware. 3 qubits, 4 CNOTs, 2648 parameters.
A tiny MLP learns language patterns, gets converted to a quantum circuit, and runs on real hardware.
An MLP with two hidden layers (61 → 32 → 16 → 8) learns masked token prediction over a tiny 8-word vocabulary, trained with the Adam optimizer for 5000 steps.
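A minimal sketch of what such a model could look like, assuming a PyTorch implementation. Only the layer sizes (61 → 32 → 16 → 8), the optimizer (Adam), and the parameter count (2648) come from the description above; the class name, the ReLU activations, and the cross-entropy training objective are assumptions:

```python
import torch
import torch.nn as nn

class MaskedTokenMLP(nn.Module):
    """Hypothetical reconstruction of the described model.

    Input: a 61-dim feature vector (assumed encoding of the masked
    sentence). Output: logits over the 8-word vocabulary. The shapes
    reproduce the stated parameter count:
    (61*32 + 32) + (32*16 + 16) + (16*8 + 8) = 2648.
    """
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(61, 32), nn.ReLU(),
            nn.Linear(32, 16), nn.ReLU(),
            nn.Linear(16, 8),  # logits over the 8-word vocabulary
        )

    def forward(self, x):
        return self.net(x)

model = MaskedTokenMLP()
assert sum(p.numel() for p in model.parameters()) == 2648

# Masked token prediction treated as 8-way classification (assumed setup).
optimizer = torch.optim.Adam(model.parameters())
loss_fn = nn.CrossEntropyLoss()
```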
The MLP's output probability distribution is encoded into rotation angles on 3 qubits. Each prediction becomes a quantum circuit with ~4 CNOTs.
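The exact encoding scheme isn't spelled out here, but the textbook way to load an 8-outcome probability distribution onto 3 qubits is a binary tree of RY rotation angles: the root angle splits the probability mass between the two halves of the vocabulary, and each deeper angle splits a half again. The NumPy sketch below (the function name is ours) computes the 7 angles of that tree; on hardware, the controlled rotations below the root are what decompose into CNOTs, and the project's ~4-CNOT circuit presumably uses an approximate or hardware-tailored decomposition rather than the full tree:

```python
import numpy as np

def ry_angle_tree(probs):
    """Compute the RY angles that prepare sqrt(probs) on 3 qubits.

    One angle per node of the binary splitting tree (7 angles for
    8 outcomes). RY(theta)|0> = cos(theta/2)|0> + sin(theta/2)|1>,
    so sin(theta/2) carries the square root of the right branch's
    conditional probability.
    """
    probs = np.asarray(probs, dtype=float)
    probs = probs / probs.sum()
    n = len(probs)
    angles = []
    level = 1
    while 2 ** level <= n:
        block = n // (2 ** (level - 1))
        for start in range(0, n, block):
            left = probs[start : start + block // 2].sum()
            right = probs[start + block // 2 : start + block].sum()
            total = left + right
            if total == 0:
                angles.append(0.0)  # empty branch: no rotation needed
            else:
                angles.append(2 * np.arcsin(np.sqrt(right / total)))
        level += 1
    return angles

# Example: a peaked distribution over 8 tokens.
print(ry_angle_tree([0.02, 0.03, 0.05, 0.6, 0.1, 0.1, 0.05, 0.05]))
```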
Circuits run on the QuTech Tuna-9 superconducting processor. 4096 shots per circuit. Measurement outcomes decoded back to token predictions.
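Decoding is the straightforward inverse step. A sketch under assumptions: measurement results arrive as a `{bitstring: occurrences}` dict, as most quantum SDKs return them, and each 3-bit string indexes one vocabulary entry. The vocabulary order below is taken from the 8 words in the test table but the bitstring-to-token mapping is an assumption; the real mapping depends on the encoding circuit:

```python
import numpy as np

VOCAB = ["the", "cat", "dog", "sat", "ran", "on", "a", "mat"]  # assumed order

def decode_counts(counts, shots=4096):
    """Map 3-qubit measurement counts back to a token prediction.

    Each bitstring is read as a binary index into the vocabulary;
    relative frequencies over the 4096 shots recover the encoded
    probability distribution.
    """
    probs = np.zeros(len(VOCAB))
    for bits, n in counts.items():
        probs[int(bits, 2)] = n / shots
    return VOCAB[int(np.argmax(probs))], probs

# Example with made-up counts:
token, probs = decode_counts({"011": 2600, "100": 900, "000": 596})
print(token)  # -> "sat" under the assumed ordering
```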
Build a sentence, mask one word, and see the model's prediction. This runs the actual trained MLP weights in your browser.
8 test cases run on the QuTech Tuna-9 superconducting quantum computer, 4096 shots each.
| Test case | Expected | Classical (MLP) | Quantum (Tuna-9) |
|---|---|---|---|
| the cat ___ | sat | ran | ran |
| the dog ___ | ran | ran | cat |
| ___ cat sat | the | the | the |
| the ___ sat on a mat | cat | cat | cat |
| cat ___ | ran | ran | sat |
| dog ___ | sat | ran | sat |
| ___ mat | the | cat | cat |
| the cat sat on ___ | mat | mat | mat |
This is a proof of concept. Here is what it cannot do.