Out of curiosity, why can’t files be encrypted by default? Or keys encrypted by default? Where all content needs to be accessed by both its hash and its decryption key? I understand that we can encrypt content rather easily before uploading to the network, but wouldn’t it enforce best practices if everything were encrypted by default? Maybe for public content (Wikipedia kind of stuff, for example) the key could be left empty? Or maybe given some sort of standardized key?
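To make the question concrete, here is a rough sketch of what “addressed by both hash and key” could look like. Everything here is hypothetical (the function names, the toy XOR cipher keyed by a SHA-256 counter keystream); it is not how IPFS works today, just an illustration of the proposed scheme where the content address is the hash of the ciphertext:

```python
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    """Toy SHA-256 counter-mode keystream. Illustration only, not a vetted cipher."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR data with the keystream; applying it twice round-trips (encrypt == decrypt)."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

def add_encrypted(data: bytes) -> tuple[str, bytes]:
    """Encrypt with a fresh random key and address the block by the hash of the
    ciphertext, so retrieving usable content needs both the hash and the key."""
    key = secrets.token_bytes(32)
    ciphertext = xor_cipher(data, key)
    return hashlib.sha256(ciphertext).hexdigest(), key
```

Under this sketch, the network only ever sees ciphertext and its hash; the key travels out of band, and a “public” file would just be one whose key is published (or empty/standardized, as suggested above).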
What I’m worried about is that files stored on your node seem (correct me if I’m wrong) to be indexed and available just by looking at the IPFS GUI. For repressive regimes it seems that if the authorities can look at my node and see a list of encrypted files, it puts me in a sticky situation, no?
I’m using IPFS in a sensitive project now, and while it works fine, it would be nice if no host got in trouble for what others upload.
It wouldn’t necessarily be a best practice to encrypt everything indiscriminately if you’re building a protocol that’s trying to prioritize performance. It’s unclear to me why you think that IPFS should encrypt everything by default rather than leave it up to the user, or applications that are built on it. The only benefit I can think of would be plausible deniability.
How would this be any different than IPFS encrypting everything by default? If the repressive regime has banned encryption, then I’m not sure why they’d care whether it was IPFS that encrypted the files or something else like veracrypt or PGP. Or are you saying that in this hypothetical scenario you’ve only chosen to encrypt stuff that you think the repressive regime wouldn’t like and they’d wonder why only some of it is encrypted?
The GUI shouldn’t be exposed to the world by default. You have to expose the web UI to the internet for the internet to see it.
I think it’s more about quasi-repressive regimes like the United States. It’s a free-ish country, but it can occasionally act against its own stated values. Of course in a truly repressive regime, yes, encryption would be outlawed, but in places like the US or Europe you can by and large get away with it, since encryption doesn’t equate to guilt by default.
Well, the performance of the network shouldn’t depend on whether the data is encrypted, no? Plaintext versus encrypted plaintext shouldn’t change anything in terms of throughput, should it?
I’m thinking more about the example of messaging providers like Telegram, Facebook Messenger, WhatsApp, Signal, etc. Some, like Facebook Messenger, do offer end-to-end encryption, but since it’s not on by default (and not so easy to turn on in the first place), people generally don’t end up using it. WhatsApp, however, has encryption by default, so everyone uses it. Maybe it’s not the protocol’s business to cajole lazy developers, but I could certainly imagine a level of confusion down the line where end users are uncertain whether this or that developer has properly implemented its own style of encryption.
Well, sure. If we ignore the initial encryption step which only happens once per immutable block, it shouldn’t affect throughput if everyone is just passing around the encrypted data and nobody is actually using/decrypting it. I think it’s more likely typical internet users would be using the data that’s getting sent to their node rather than just storing it (without decrypting it first), and decryption to turn the ciphertext into something they care about isn’t computationally free.
It’s unclear how the key for each block is supposed to be shared and stored in this hypothetical scenario, but I can imagine storing and transferring those would come with some overhead as well. There are a lot of blocks that would need keys and key mappings stored for them, especially if each block has its own key – which I’m not really seeing a way around.
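A back-of-envelope estimate of that key-storage overhead, assuming IPFS’s default 256 KiB chunk size and one 256-bit key per block (both assumptions; the per-block key scheme itself is the hypothetical being discussed):

```python
BLOCK_SIZE = 256 * 1024   # assumed: IPFS's default 256 KiB chunk size
KEY_SIZE = 32             # assumed: one 256-bit key per block

blocks_per_gib = (1024 ** 3) // BLOCK_SIZE   # blocks needed to store 1 GiB
key_overhead = blocks_per_gib * KEY_SIZE     # raw key material per GiB stored

print(blocks_per_gib)  # 4096
print(key_overhead)    # 131072 bytes, i.e. 128 KiB of keys per GiB
```

The raw bytes are small, but that’s 4096 key-to-block mappings per GiB that have to be stored, shared with anyone you grant access, and never lost, since losing a key loses that block.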
Encrypting all blocks by default would also destroy block-level deduplication as a feature for IPFS, unless everyone encrypted everything with the same key by default (which makes no sense).
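The dedup point can be demonstrated directly. In the sketch below (a toy XOR cipher keyed by a SHA-256 keystream, for illustration only), two users storing the identical block get the same content address when it’s unencrypted, but different addresses once each encrypts with their own random key:

```python
import hashlib
import secrets

def toy_encrypt(data: bytes, key: bytes) -> bytes:
    """Toy XOR cipher keyed by a SHA-256 counter keystream; illustration only."""
    ks = b""
    counter = 0
    while len(ks) < len(data):
        ks += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, ks))

block = b"identical block stored by two different users"

# Unencrypted: both copies hash to the same address, so only one copy is stored.
assert hashlib.sha256(block).hexdigest() == hashlib.sha256(block).hexdigest()

# Encrypted under independent random keys: the ciphertexts (and therefore the
# content addresses) differ, so the network must store both copies.
addr_a = hashlib.sha256(toy_encrypt(block, secrets.token_bytes(32))).hexdigest()
addr_b = hashlib.sha256(toy_encrypt(block, secrets.token_bytes(32))).hexdigest()
assert addr_a != addr_b
```

One known middle ground is convergent encryption, where the key is derived from the plaintext’s own hash: identical blocks then encrypt to identical ciphertexts, preserving dedup, at the cost of letting anyone who already has a file confirm that you’re storing it.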