Exploring DID + IPFS as a Trust Layer for AI Agents and Immutable Memory

When AI systems begin to live beyond a single session, the next question becomes unavoidable:
How do we give an AI a persistent identity, and how do we make its memory verifiable across different platforms?

That’s where three simple primitives come into play:

DID → who the AI is
IPFS → where its truth lives
CID → the immutable reference to that truth

A DID gives an AI Agent a stable identity that doesn’t depend on one company’s database.
IPFS provides the neutral, content-addressed space where the agent’s metadata, rules, and knowledge can be stored without relying on centralized servers.
And the CID generated from that data becomes a permanent, verifiable memory address: immutable, tamper-evident, and readable by any agent from any ecosystem.

1. Why AI Identity Matters (More Than We Think)

As AI systems evolve from “query-response tools” into long-lived agents, one fundamental question keeps emerging:

How do we know which AI is which?

  • If an AI model executes a workflow, how is its identity verified?

  • How can different organizations trust the same agent?

  • How do we ensure continuity when an agent is upgraded or changed?

In human systems, we use:

  • passports

  • certificates

  • digital signatures

  • authentication layers

But for AI agents, identity is still highly fragmented.
Every platform issues its own identity mechanism, and none of them extend across ecosystems.

That’s where DID becomes extremely relevant.

A DID gives an AI:

  • a persistent identifier

  • a verifiable proof of origin

  • a cross-platform identity anchor

  • a structure compatible with decentralized infrastructure

With DID, an AI Agent can “exist” across environments in a way that is recognizable and consistent.


2. Why Immutable Memory Matters for AI Agents

AI without memory becomes unreliable in long-term tasks.

Most LLMs today suffer from:

  • disappearing context

  • session resets

  • loss of history

  • inability to maintain consistent meaning over time

This causes what people call semantic drift — the gradual deviation of meaning, facts, or interpretations.

To solve that, we need an external, verifiable, shared memory layer.

And that is exactly what IPFS + CID provides:

A CID is essentially an immutable memory address.

  • If content changes → the CID changes.

  • If the CID stays the same → the content is guaranteed the same.

  • Any agent from any organization can read it without permission or dependency.

This simple property gives AI a public memory ledger that behaves like:

  • an immutable notebook

  • a cross-agent truth reference

  • a decentralized memory card

  • a stable semantic anchor

In CFE, I use IPFS/CID as the “single source of truth” that multiple agents can reference without drifting apart.
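
To make this concrete, here is a minimal sketch (illustrative only, not CFE’s actual schema) using the JavaScript `multiformats` library. The CID is derived from the bytes of the content, so changing even a single character of the metadata produces a different CID:

```ts
// Minimal sketch: derive a CID from agent metadata and show that any change
// to the content yields a different CID. Field names are illustrative only.
import { CID } from 'multiformats/cid'
import * as json from 'multiformats/codecs/json'
import { sha256 } from 'multiformats/hashes/sha2'

async function cidFor(metadata: unknown): Promise<CID> {
  const bytes = json.encode(metadata)        // canonical bytes of the metadata
  const digest = await sha256.digest(bytes)  // cryptographic hash of those bytes
  return CID.create(1, json.code, digest)    // CIDv1 using the JSON codec
}

const a = await cidFor({ agent: 'example-agent', rule: 'never overwrite memory' })
const b = await cidFor({ agent: 'example-agent', rule: 'never overwrite memory!' })

console.log(a.toString())  // a stable address for this exact content
console.log(a.equals(b))   // false: one character changed, so the CID changed
```

Because the address is a function of the content itself, any agent can verify that a reference has not been silently modified.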


3. Where CFE Fits In (and What It Is Not)

CFE is not a blockchain, not a token, and not a commercial system.
It is an evolving framework exploring how to anchor:

  • AI identity (via DID)

  • AI memory (via CID/IPFS)

  • AI semantics (via shared references)

It proposes that the next generation of AI systems will require:

  1. Identity that does not depend on one company

  2. Memory that cannot be silently modified

  3. Semantic meaning that remains stable across systems

  4. Publicly verifiable references instead of isolated databases

IPFS fits naturally into this vision because:

  • it is content-addressed

  • it is open

  • it is permissionless

  • it scales across ecosystems

  • and it already works today

CFE treats IPFS as the trust layer infrastructure for multi-agent cooperation — not something “superior,” but something useful and complementary.


4. Practical Experiments I’m Currently Running

Right now, my technical exploration includes:

✅ DID registration for AI Agents

Using decentralized identifiers to let an AI agent “own” a persistent identity.
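
As a rough illustration of what an agent identity document looks like, here is a minimal DID document following the W3C DID Core data model. The did:key value and key material below are placeholders, not my actual registration:

```ts
// Illustrative only: a minimal W3C DID Core document for an agent identity.
// Every value below is a placeholder, not a real key or registered DID.
const agentDidDocument = {
  '@context': 'https://www.w3.org/ns/did/v1',
  id: 'did:key:z6Mk...placeholder',
  verificationMethod: [
    {
      id: 'did:key:z6Mk...placeholder#key-1',
      type: 'Ed25519VerificationKey2020',
      controller: 'did:key:z6Mk...placeholder',
      publicKeyMultibase: 'z6Mk...placeholder'
    }
  ],
  authentication: ['did:key:z6Mk...placeholder#key-1']
}
```

Anyone who resolves the DID gets this document and can verify signatures made by the agent’s key.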

✅ Publishing agent metadata to IPFS

Each metadata file produces a CID that acts as a stable reference to:

  • configuration

  • role definition

  • capabilities

  • behavioral rules

  • memory logs

  • verification records
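
A simplified sketch of this step, assuming a local Helia node (the metadata fields and DID are placeholders, and in practice the result also needs to be pinned so it stays available):

```ts
// Sketch: publish agent metadata to IPFS with Helia and get back its CID.
// The metadata fields and DID value below are illustrative placeholders.
import { createHelia } from 'helia'
import { unixfs } from '@helia/unixfs'

const helia = await createHelia()
const fs = unixfs(helia)

const metadata = {
  did: 'did:key:z6Mk...placeholder',
  role: 'verification-agent',
  capabilities: ['read-cid', 'verify-did'],
  rules: ['never overwrite memory', 'log every verification']
}

// Add the metadata bytes; the returned CID is the stable reference other agents use.
const bytes = new TextEncoder().encode(JSON.stringify(metadata, null, 2))
const cid = await fs.addBytes(bytes)
console.log('metadata CID:', cid.toString())
```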

✅ Testing multi-agent reads from the same CID

Agents from different AI platforms (Codex, Antigravity, Claude Code, etc.) can read the same CID and maintain consistent meaning anchored to the same file.
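
One way to reproduce this check without any platform-specific tooling is a plain HTTPS fetch through a public gateway. Any gateway or node that serves the CID must return byte-identical content (the CID below is a placeholder, and the gateway choice is arbitrary):

```ts
// Sketch: any agent, on any platform, can resolve the same CID over HTTP
// and read exactly the same metadata. The CID is a placeholder.
const cid = 'bafy...placeholder'
const res = await fetch(`https://ipfs.io/ipfs/${cid}`)
if (!res.ok) throw new Error(`gateway returned ${res.status}`)
const sharedMetadata = await res.json()
console.log(sharedMetadata.did, sharedMetadata.rules)
```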

✅ Experimenting with “agent resumes”

A DID-linked data profile that any agent can fetch from IPFS to verify another agent.
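
The “resume” itself is just a small, DID-linked JSON profile. A hypothetical shape (these field names are my own working sketch, not a standard) could be:

```ts
// Hypothetical "agent resume" structure linking a DID to its published CIDs.
// None of these field names are a standard; they are a working sketch.
interface AgentResume {
  did: string              // the agent's decentralized identifier
  displayName: string
  capabilities: string[]   // what the agent claims it can do
  metadataCid: string      // CID of the full metadata file on IPFS
  memoryLogCids: string[]  // CIDs of append-only memory snapshots
  issuedAt: string         // ISO 8601 timestamp
}

const resume: AgentResume = {
  did: 'did:key:z6Mk...placeholder',
  displayName: 'cfe-verification-agent',
  capabilities: ['verify-did', 'read-cid'],
  metadataCid: 'bafy...placeholder',
  memoryLogCids: ['bafy...placeholder'],
  issuedAt: new Date().toISOString()
}
```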

✅ Cross-chain anchoring (Ethereum, Polygon, Avalanche)

Not for speculation — but for timestamping, auditability, and long-term verifiability.
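
One simple anchoring pattern (a rough sketch, not a finished design) is to embed the CID bytes in the calldata of an ordinary transaction on a testnet, so the block timestamp becomes an independent proof that the CID existed at that time. An ethers.js sketch, assuming an RPC URL and a funded test key in environment variables:

```ts
// Sketch: anchor a CID on an EVM testnet by embedding it in transaction calldata.
// RPC_URL and PRIVATE_KEY are assumed environment variables; the CID is a placeholder.
import { JsonRpcProvider, Wallet, hexlify, toUtf8Bytes } from 'ethers'

const provider = new JsonRpcProvider(process.env.RPC_URL)
const wallet = new Wallet(process.env.PRIVATE_KEY!, provider)

const cid = 'bafy...placeholder'
const tx = await wallet.sendTransaction({
  to: wallet.address,               // self-send; only the calldata matters here
  value: 0n,
  data: hexlify(toUtf8Bytes(cid))   // the CID string, recoverable from the transaction
})
await tx.wait()
console.log('anchored in tx:', tx.hash)
```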

All of this is still in development.
I am sharing the process openly so others can critique, improve, or suggest better structures.


5. Why I Joined the IPFS Forum

Because I believe IPFS is one of the most important layers in the future of AI systems — especially when those systems grow beyond single vendors.

I want to:

  • learn from people who deeply understand CID/IPLD/IPNS

  • understand best practices for storing metadata

  • learn about pinning strategies and gateway behaviors

  • explore UCAN, IPLD schemas, and agent-friendly structures

  • get feedback on how to design DID–CID structures more effectively

  • collaborate

  • build alongside the community

I’m here simply as a contributor who wants to build better tools for the AI and decentralized future.


6. Closing Thoughts

AI is evolving much faster than human governance structures.
Before long, a world with:

  • millions of agents

  • executing thousands of tasks

  • across multiple organizations

  • with different security requirements

will require something stable, neutral, verifiable, and permissionless.

IPFS is uniquely positioned to become the memory layer for that future.
DID is uniquely positioned to become the identity layer for that future.
CFE is simply one exploration of how these pieces fit together.

I’m grateful to join this community and look forward to learning from all of you.
If anyone is working on similar ideas, I’d love to exchange perspectives.
GitHub (CIDs Archive):

Thanks for reading — happy to discuss, ask questions, and contribute where I can.

Canonical Funnel Verification Layer
Owner: Nattapol Horrakangthong (WARIPHAT Digital Holding)
Master DID: z6MknPNCcUaoLYzHyTMsbdrrvD4FRCA4k15yofsJ8DWVVUDK
Root CID: bafybeigt4mkbgrnp4ef7oltj6fpbd46a5kjjgpjq6pnq5hktqdm374r4xq
Anchor Network: IPFS / Public Web2 / Public AI Index / Cross-Chain Registry

I am 100% in agreement with you about IPFS being vital to the AI work brewing, especially for data/compute lineage, agent identity, memory, proofs, etc.

Just FYI: you don’t “store things on IPFS”, but you can store things on Filecoin and make them available via IPFS, or run your own IPFS nodes (Kubo/Helia/Boxo/etc.).

Filecoin Onchain Cloud recently launched on calibration (testnet) and is marching towards mainnet.

https://filecoin.cloud/

Thank you so much for sharing this; the information you provided is extremely valuable.

To be honest, I’m also quite confused because I started all of this with zero background in these tools. I’ve been learning everything on my own, step by step, and it has been very challenging for me.

But your explanation and the Filecoin resources you shared give me a much clearer direction on what I should explore next. I’ll take my time to study Filecoin and understand how I can integrate it properly.

Really appreciate your help. Thank you again. 🙏

Filecoin Pin CLI + ERC-8004 — Stuck at the Final Step

I’m currently trying to deploy a JSON file to Filecoin (the Calibration testnet) using the Filecoin Pin CLI in the macOS Terminal.

Here’s the command I’m running:

filecoin-pin add <file>.json --auto-fund

Everything before the final step runs successfully:

✅ File validated
✅ Connected to CalibrationNet
✅ Minimum payment setup verified
✅ File packed → root CID generated
✅ IPFS content loaded
✅ Funding requirements met
✅ Storage context ready

But at the last step — StorageContext / uploadPiece — it fails with this error:

Add failed: StorageContext uploadPiece failed:
Failed to upload piece to service provider – Invalid PieceCID: undefined

I’ve retried this 3–4 times, and the same error occurs every time.

Now I’m wondering:

  • Could this be an issue with ERC-8004 or the Calibration provider itself?

  • Or is there something I may have misconfigured on my side?

📌 If you have encountered this issue or know how to fix it, I would really appreciate your guidance. 🙏
I’ve been trying repeatedly but keep getting stuck at this exact step.

Thank you in advance!

There were some changes made to synapse-sdk, and I believe we fixed this last week. I recently released filecoin-pin v0.14.0; can you make sure you’re on the latest version?


Hi! Thanks for your previous reply. I updated to filecoin-pin v0.14.0 and retried the process.
The CLI now reports that the pin operation completed successfully — I can see both the root CID and the piece CID generated during the upload.

However, I have a follow-up question:

  • When I try to open the root CID through public IPFS gateways (ipfs.io, Cloudflare, dweb.link, w3s), the file cannot be retrieved. All gateways return “no providers found for the CID”.

  • But the piece CID can be downloaded using the direct link provided in the pin process (from the calibration storage provider).

So my question is:

👉 Is the file currently pinned to the IPFS network, or is it only stored on the Filecoin storage provider at this stage?
I want to confirm whether this behaviour is expected — that the piece is stored but not yet announced/discoverable on the public IPFS DHT — or if I might still be missing a step.

Thank you very much for your help! I’m trying to understand the expected retrieval flow so I can verify everything correctly.

You should be able to check filecoinpin.contact to see if the file is available via the IPFS network. It should be, as filecoin-pin validates the IPNI advertisement (and outputs info explaining success or failure).

filecoin-pin and FOC are in alpha now, but we are moving fast towards GA launch.

The expected flow is: once uploaded to Filecoin via filecoin-pin/FOC, it should be announced and retrievable from any IPFS node (centralized gateways, the inbrowser.link service worker gateway, any IPFS client, etc.).

It’s odd that your screenshot doesn’t show IPNI or IPFS availability output. What version of filecoin-pin was that?

Hi Russell, thanks for the follow-up.

To clarify:

  • I was originally running filecoin-pin v0.13.x

  • I later upgraded to v0.14.0, and the successful upload screenshot was from that version.

Also, I previously could download one of my earlier pieces from this URL:

https://caliberation-pdp.infrafolio.com/piece/bafkzcibew6dagdhuge3pqyek35xjqkksc7wi4glkadv3yu2gh2ua446cmv37gsrkfe

I still have the file locally, so the link definitely worked at that time.


But now the same link returns HTTP Error 500 every time I check.

My question is:

👉 Does this indicate a temporary provider-side issue, or does it mean I need to re-pin the file on Filecoin?
I’m not sure whether the piece is still stored or whether the service endpoint is temporarily failing.

Any guidance would be greatly appreciated.

Thank you

I’m running all of my tests on macOS Sequoia 15.3.1, using the default macOS Terminal environment.
Just mentioning this in case OS-level behavior or compatibility might affect the output differences (such as IPNI/IPFS availability not appearing) or the HTTP 500 error on the provider endpoint.