Trusted Camera Attestation on Open Hardware: A Pathway to Production-Ready Open TEEs


Overview

A project to build an open hardware camera attestation system using Keystone on FPGA — a device that cryptographically proves video frames are real, unmodified sensor output. The deeper motivation: this is a concrete pathway toward making Keystone production-ready as a VM-capable open TEE framework.

Looking for collaborators, architecture feedback, and anyone working in adjacent areas.


The Problem This Solves

Existing solutions sign too late or trust the wrong root:

Software signing (C2PA, Adobe Content Credentials) — signs after frames pass through untrusted software. Proves the software, not the sensor. Trust root is manufacturer PKI.

Proprietary TEE signing (SGX/TrustZone) — signs closer to hardware but the trust root is still a manufacturer with closed microcode and nation-state relationships.

What’s needed: signing at the sensor boundary, key generated inside an attested open enclave, trust root decentralized and verifiable against auditable code.


Why Keystone

Keystone is the only genuinely open TEE framework — the others are implementations or specs:

  • OP-TEE — open source implementation, ARM’s design philosophy baked in
  • Penglai — open source implementation, Chinese state-adjacent provenance
  • CoVE/AP-TEE — open specification, no clean implementation
  • TDX / SEV — proprietary, manufacturer is the trust root

Keystone’s modular architecture, BSD-3 license, and RISC-V foundation are right. The honest gap is VM-level isolation — PMP-based process enclaves limit scalability and don’t support confidential VM workloads. This project builds the platform layer that VM extension sits on top of.


The Prototype

A $310 FPGA board with a $10 camera module, running a custom open source SoC that signs every frame in hardware before any software touches it.

Step 1 — Build the SoC with LiteX

LiteX is an open source Python framework for describing hardware. You describe the system — RISC-V core, memory controller, camera interface, DMA controller — and LiteX generates the Verilog and wires it together. It has first-class Arty A7 and VexRiscv support.

The generated SoC contains a VexRiscv core with PMP enabled, DDR3 memory, UART, camera input via PMOD headers, and a custom DMA controller. LiteX synthesizes this to a bitstream and flashes it — the FPGA physically becomes a RISC-V computer.

Step 2 — Boot Keystone on the SoC

Keystone’s security monitor runs at M-mode — the highest RISC-V privilege level — and owns the PMP registers, which define which memory regions each privilege level can access. LiteX generates a device tree describing the hardware; a Keystone platform layer consumes it and initializes correctly.
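The PMP configuration the monitor owns can be made concrete. In the RISC-V privileged spec, a naturally aligned power-of-two (NAPOT) region packs into a single pmpaddr register: the base shifted right by two, with low bits set to encode the size. A minimal sketch in plain Python (the helper name is illustrative; the encoding itself comes from the spec):

```python
def pmpaddr_napot(base: int, size: int) -> int:
    """Encode a naturally aligned power-of-two (NAPOT) region into a
    RISC-V pmpaddr value: (base >> 2) with trailing ones marking size.
    For size 2^n bytes there are n-3 trailing ones, i.e. (size >> 3) - 1."""
    assert size >= 8 and (size & (size - 1)) == 0, "size must be a power of two >= 8"
    assert base % size == 0, "region must be naturally aligned to its size"
    return (base >> 2) | ((size >> 3) - 1)

# A 64 KiB enclave-private region at 0x8020_0000:
print(hex(pmpaddr_napot(0x8020_0000, 0x10000)))  # 0x20081fff
```

The monitor writes values like this into the pmpaddr/pmpcfg CSRs when it carves out the enclave-private region.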

This port — Keystone on a LiteX SoC — is the first novel contribution. It doesn’t exist yet as a documented reusable artifact.

Step 3 — Create the Enclave and Generate an Attestation Key

The security monitor derives an Ed25519 keypair by combining the device secret (generated from a ring-oscillator entropy source) with the enclave’s code measurement, then loads it into the PMP-protected enclave region before sealing. The private key never exists outside enclave-private memory.
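The binding step can be sketched with stdlib hashing. This is illustrative only: Keystone uses its own KDF and Ed25519 implementation, and the function name here is hypothetical. The point is that the derived seed depends on both the device secret and the enclave measurement:

```python
import hashlib

def derive_attestation_seed(device_secret: bytes, measurement: bytes) -> bytes:
    """Illustrative KDF: bind the device secret to the enclave's code
    measurement, so a different enclave binary yields a different key.
    (Keystone's actual derivation differs; this shows the binding idea.)
    The 32-byte result would serve as the Ed25519 private-key seed."""
    return hashlib.sha512(device_secret + measurement).digest()[:32]

seed_a = derive_attestation_seed(b"\x01" * 32, hashlib.sha256(b"enclave-v1").digest())
seed_b = derive_attestation_seed(b"\x01" * 32, hashlib.sha256(b"enclave-v2").digest())
assert seed_a != seed_b  # changing the code measurement changes the key
```

This is why the attestation quote is meaningful: a tampered enclave binary cannot reproduce the original key.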

The security monitor produces an attestation quote linking the enclave’s code measurement to the public key. This quote is posted on-chain to a decentralized attestation registry.

Step 4 — The Trusted DMA Controller

The core engineering contribution. A standard DMA controller can be reconfigured by software to write camera frames to attacker-controlled memory before signing. The prototype implements a PMP-aware DMA controller in FPGA fabric — before writing any frame, it checks the PMP configuration to verify the destination is inside enclave-private memory. If not, the write doesn’t happen.
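The fabric-side check reduces to a bounds comparison. A minimal software model of that logic (illustrative; in the prototype the comparison is combinational hardware, not a function call):

```python
def dma_write_allowed(dest: int, length: int,
                      enclave_base: int, enclave_size: int) -> bool:
    """Model of the PMP-aware DMA check: the entire destination buffer
    must fall inside the enclave-private region, otherwise the write
    is dropped. A buffer straddling the boundary is rejected too."""
    return enclave_base <= dest and dest + length <= enclave_base + enclave_size

BASE, SIZE = 0x8020_0000, 0x10000
assert dma_write_allowed(BASE + 0x100, 0x4B00, BASE, SIZE)      # frame fits inside
assert not dma_write_allowed(BASE + SIZE - 16, 64, BASE, SIZE)  # straddles the boundary
assert not dma_write_allowed(0x8000_0000, 0x1000, BASE, SIZE)   # outside the region
```

Because software cannot reconfigure this comparison, redirecting frames to attacker-controlled memory fails at the fabric level.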

This check is in hardware, not software. No bypass path exists. The FPGA netlist is open source and verifiable by inspection.

Result: sensor → DMA → enclave-private memory. The enclave is the first software that touches every frame.

Step 5 — The Enclave Signs Each Frame

On each frame arrival the enclave: hashes the raw frame data, reads a monotonic counter (prevents rollback), increments a sequence number (gaps are detectable), bundles these with device identity, signs with the private key, and embeds the signature before releasing the frame to userspace.
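The per-frame record the enclave signs can be sketched with stdlib packing. The exact layout here is an assumption for illustration, not the prototype's wire format; the real record would then be signed with the enclave's Ed25519 key:

```python
import hashlib
import struct

def frame_record(frame: bytes, counter: int, seq: int, device_id: bytes) -> bytes:
    """Illustrative to-be-signed message for one frame:
    SHA-256(frame) || monotonic counter (u64) || sequence number (u64) || device id.
    The enclave signs this record, so any field change breaks verification."""
    return hashlib.sha256(frame).digest() + struct.pack(">QQ", counter, seq) + device_id

rec = frame_record(b"\x00" * (640 * 480), counter=7, seq=42, device_id=b"\xAA" * 8)
assert len(rec) == 32 + 8 + 8 + 8
```

Including the counter and sequence number in the signed bytes is what makes rollback and frame deletion detectable, not just pixel edits.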

No moment exists between sensor output and signature where untrusted code could modify the frame.

Step 6 — Verification

Any recipient can: extract the signature, look up the public key on-chain, verify the signature against the frame hash, verify the attestation quote confirms the key came from an open Keystone enclave with a known measurement, and check sequence numbers for continuity.
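The last check, sequence continuity, is simple enough to show directly. A minimal sketch of gap detection over the sequence numbers recovered from a run of verified frames:

```python
def check_sequence(seqs):
    """Return the positions where the per-frame sequence number jumps,
    i.e. where frames were dropped or removed (gaps are detectable)."""
    return [i for i in range(1, len(seqs)) if seqs[i] != seqs[i - 1] + 1]

assert check_sequence([5, 6, 7, 8]) == []    # continuous capture
assert check_sequence([5, 6, 9, 10]) == [2]  # frames 7 and 8 are missing
```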

No certificate authority. No manufacturer. Trust root is the on-chain enclave measurement verifiable against the open source codebase.


Why This Is a Pathway to Open TEE Production Readiness

This project produces four reusable artifacts beyond the camera application:

1. Keystone port to LiteX — makes Keystone accessible to a much wider developer audience and establishes the pattern for future platform ports.

2. PMP-aware DMA controller — reusable for any TEE system attesting hardware peripherals. The pattern applies to network interfaces, storage controllers, any peripheral inside a trust boundary.

3. Real hardware validation of Keystone’s modularity thesis — exercises the framework in a deployment scenario it hasn’t been tested against. Issues found here improve the framework for everyone.

4. Foundation for VM extension — the next step is adding H-extension support for VM-level isolation, merging Keystone’s open framework with CoVE’s VM model. That makes Keystone viable for cloud/server workloads and directly competitive with TDX — while being verifiably superior on trust minimization.

The trust minimization gradient runs from manufacturer-rooted proprietary hardware toward formally verified open silicon. Keystone is the right foundation. This is the next step.


Hardware budget is ~$310.


On the Broader Context

Every serious confidential computing deployment today is rooted in trust of Intel, AMD, or ARM. For most applications that’s a reasonable tradeoff. For privacy infrastructure designed to resist nation-state adversaries, it’s a structural weakness.

The open TEE problem has been described before. What’s been missing is an incremental path. This is an attempt to be that — doing the next right thing in a way that makes the step after easier.


Feedback on the architecture or interested in collaborating — reply here.

I went with the iCESugar-Pro FPGA development board (Lattice ECP5, RISC-V-capable Linux SODIMM module with extension board) instead of the Arty A7 because of its open, non-proprietary toolchain.

A TEE is the wrong design here: a TEE is built to protect trusted software from untrusted software running on the same chip. A PUF is the better choice to generate the chip’s unique keys. Open hardware and open manufacturing make a PUF verifiable by directly burning the software and identity with NVCM.

A PUF is a Physically Unclonable Function. The implementation here uses ring oscillators to generate randomness and seal the key.
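The ring-oscillator PUF idea can be modeled in a few lines. This is a toy model, not the RTL: pair up oscillators and take one bit per pair from which of the two runs faster. On real silicon the frequencies come from uncontrollable process variation, so the bit string is unique per chip yet repeatable on the same chip:

```python
def puf_bits(freqs):
    """Toy ring-oscillator PUF model: one bit per oscillator pair,
    set by which oscillator of the pair counted more cycles."""
    return [1 if a > b else 0 for a, b in zip(freqs[0::2], freqs[1::2])]

# Hypothetical cycle counts from two chips built from the same design:
chip_a = [1003, 998, 1011, 1014, 990, 1002, 1007, 1001]
chip_b = [997, 1005, 1009, 1002, 1001, 995, 1000, 1006]
assert puf_bits(chip_a) != puf_bits(chip_b)  # same design, different identity
```

A real design also needs error correction (a fuzzy extractor) before the bits are stable enough to derive a key, which this sketch omits.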

So Keystone is out for a single-purpose chip: with no untrusted software on it, there is no need for a TEE. And a PUF is in, to generate the chip identity.

The software is physically written to the chip along with the PUF derived identity with NVCM.

Open hardware gives you an auditable design. Open manufacturing gives you an auditable enrollment moment. NVCM gives you immutability after that moment. Together, all three make the trust chain verifiable down to physics.

Update: Working Prototype

The camera attestation prototype is working end-to-end. (Screenshots in the original post show the prototype in action and the hardware setup; the sensor is a really cheap camera module from 2005.)

The idea

An open-source FPGA captures video from a camera sensor, hashes every raw frame in hardware, and signs the hash with a key derived from the chip’s unique physical fingerprint (PUF). The signature is embedded in the HDMI output as a binary barcode. A host program reads the HDMI feed and verifies the signature in real time.

The result: cryptographic proof that a video came from a specific physical camera device — not generated by AI, not edited after the fact. The entire design is open source and built with open tools. No proprietary silicon, no black-box firmware.
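The barcode step of the pipeline above can be illustrated directly. The block size and white-for-1 convention here are assumptions for the sketch, not necessarily the prototype's overlay layout:

```python
def encode_barcode(sig: bytes, block: int = 4) -> list:
    """Illustrative pixel-row encoding: each signature bit becomes a run
    of `block` pixels, white (255) for 1 and black (0) for 0, so a host
    program can read it back out of the HDMI frame."""
    bits = [(byte >> (7 - i)) & 1 for byte in sig for i in range(8)]
    return [255 if bit else 0 for bit in bits for _ in range(block)]

def decode_barcode(row: list, block: int = 4) -> bytes:
    """Sample one pixel per block, threshold it, and repack into bytes."""
    bits = [1 if row[i] > 127 else 0 for i in range(0, len(row), block)]
    return bytes(sum(b << (7 - j) for j, b in enumerate(bits[k:k + 8]))
                 for k in range(0, len(bits), 8))

sig = bytes(range(64))  # stand-in for a 64-byte Ed25519 signature
assert decode_barcode(encode_barcode(sig)) == sig  # survives the round trip
```

Embedding the signature in the video itself means verification needs no side channel: the HDMI feed carries both the frames and their proofs.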

What’s working

  • Live camera feed over HDMI with signature barcode overlay
  • Ed25519 signatures verified in real time by the host program
  • PUF-derived keys that are unique to each physical chip
  • Full open-source toolchain (Yosys, nextpnr, Project Trellis)

Source

Everything is on GitHub: zenopie/TEE-camera

All RTL, firmware, and host code is there. The whole point is that anyone can audit the design.

Thoughts

AI writing Verilog is doable with many iterations; you can tell it’s more frontier knowledge, like early ChatGPT and Rust. Verilog from natural language is a huge unlock though! I can’t wait to see open source hardware rock the establishment!

If you found this project interesting and want to support my continued work, delegate to the erth.network validator (airdrops every week)! Thanks to everyone who has delegated!


see you next time! :globe_showing_americas: