 🤔 Will there ever be a "delete" button on @damus

Turns out, the "delete post" function is not even possible, even though some clients "allow" it.
@jb55 and @vrod address this question head-on and explain why there are no second thoughts on #Nostr 🧐

https://m.primal.net/HTCj.mp4 
“Delete” in Nostr means asking a relay, “please delete this.” But the relay is not required to do that.
Delete is and has always been a core part of the nostr protocol. Essentially, Nostr delete works just like on centralized platforms: not all copies of content are deleted when you request that servers delete something.

I fundamentally don’t understand why nostr app developers keep insisting on disabling delete and pretending that nostr doesn’t support it. Delete works in nostr and it is a large part of why we switched from secure scuttlebutt to nostr. Delete can’t work in scuttlebutt, but does work in nostr. 

Users should be informed that there is no way to force relays to honor your delete requests. And i think dishonest relays should be something we can rate and track in nostr clients. But to say we don’t have delete or edit in nostr is both incorrect and undermining the adoption of the protocol. 
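For concreteness, a deletion request in Nostr is itself just a signed event, kind 5 per NIP-09, that points at the ids of the events it wants removed. A minimal sketch of its shape (the ids and pubkey below are placeholders, and a real event also carries "id" and "sig" fields):

```python
# Sketch of a NIP-09 deletion request (kind 5). Ids and pubkey are
# placeholders; a real event is also hashed into "id" and signed into "sig".
delete_request = {
    "kind": 5,
    "pubkey": "<author-pubkey-hex>",
    "created_at": 1700000000,
    "tags": [
        ["e", "<id-of-event-to-delete>"],  # one "e" tag per targeted event
    ],
    "content": "typo, reposting a corrected version",  # optional reason
}
```

A relay that honors the request drops the referenced events from its database; a relay that doesn't simply keeps serving them, which is exactly the limitation discussed above.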
Because there will be clients that cache notes locally and don’t pull delete notes. There will be so many clients and caches that the idea of delete will become more of a lie over time. Pointing out that relays implement it ignores the bigger picture.
 Would relays be incentivized to delete posts because they pay for space? 

Although for popular notes, they may actually want to keep them and ignore the delete request. 
 It is not about storage cost. 

It is about getting thousands of disparate actors controlling clients and relays to do the same thing.

Calling it “delete” is effectively (mis-)implying deletion. 

Request to delete is not delete. 
Clients not pulling delete notes is a bug imo, but I think your point still stands. If you request that your subscription relays delete your note, the best case is that they delete it from their own databases. There could still be backups of that note somewhere, in caches or in other people's backups, which means the event isn't deleted from the world at all (as I expect most people to think note deletion means). The note can still be shared around out-of-band; we can still see the contents and we can still verify the signature.
Clients should try their best to support delete, but users need to understand the limits. How many people got burned believing in Snapchat's ephemeral messages?
 Beyond this, reposts can encode deleted notes directly. So all it would take to defeat deletion is for someone to write a bot that listens for delete notes and reposts them. 
The point of a delete isn’t exactly to make sure that we can algorithmically prove the content is gone forever for everyone. The point is to indicate that the person who created it now wants it retracted. Requesting delete is a social signal that the creator no longer wants to endorse authorship of this content and retracts it. Most of the time we should honor that social request by removing the content from our servers, databases, and clients. That way new people won’t find the content and think the author stands by it. Maybe you’re deleting a typo, or a drunken post, or some revenge porn, but being able to say, oops, take that down, is important.

The confusion comes when nerds get pedantic and argue that you can’t actually delete anything or be absolutely sure it’s gone. It’s true, but that’s also true of a conversation you have in person: the people there remember what you said. But you can apologize after and say “it is forgotten,” which doesn’t mean it’s actually forgotten or deleted from people’s memories. Rather, there is an agreement that it’s been retracted. Someone could have recorded that inappropriate conversation and refuse to honor the retraction, but being able to signal that it’s “deleted” is very important.

What’s more, most clients and relays should honor a user’s request over their own content. If we believe in individual autonomy and self sovereignty then we should respect when users want to retract something they’ve published. 

There are edge cases where we have content reports and reposts which include the original content. I think the solution is to simply include a reference to the content rather than embedding it. There are also cases where clients retain a cache of the content and would then need to honor the delete request. In some cases that might be hard, but it was easy to implement in Nos and I suspect it’s easy if you’re not using an immutable db.

Perhaps folks who are concerned about the completeness of “delete” should update their UI to say “retract” or something similar. But then in your FAQ explain that retraction requests are a kind of delete, but because it’s a distributed system there is no way to ensure it is deleted everywhere.
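The repost edge case mentioned above can be handled by pointing instead of embedding. A hedged sketch, using the NIP-18 kind-6 convention with an empty content field (NIP-18 allows either embedding the reposted note's JSON in content or leaving it empty); all ids and pubkeys here are placeholders:

```python
# Repost that references the original by id rather than embedding it
# (NIP-18 style, empty-content variant). If the original is later deleted,
# this repost has nothing left to render.
repost_by_reference = {
    "kind": 6,
    "pubkey": "<reposter-pubkey-hex>",
    "created_at": 1700000000,
    "tags": [
        ["e", "<original-event-id>", "<relay-url-hint>"],
        ["p", "<original-author-pubkey>"],
    ],
    "content": "",  # empty: nothing embedded, clients fetch by id
}
```

With this shape, honoring a delete request on the original automatically "deletes" every repost of it too, since they carry only pointers.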
 I agree with this note 
 Great framing 💯 
 😎🤙 
“Remove” would be OK; then a reminder message can say that it’s only removed from the current client. One thing about bad memory is that you actually forget and forgive; in that case it’s a blessing.
 This is why I don't actually delete renderable events, I just mark the message with strikethrough, put a DELETED message on it, and show the deleted reason (if that was given). This way users can get that social signal of "I retract this", but also users don't feel like they just missed out on something that suddenly was censored before their eyes and come bitching to the developer about censorship.

As it turns out, not actually deleting non-renderable events is a nightmare, so I do actually delete those from the client's local database.
 So call it something else and explain it? 
 I propose “retract” for folks who don’t want to call it delete. 
 How about erased? As in erased from my timeline and conscience 
 Wut?

Not being able to delete is a core tenet we rely on.

Delete is for losers. 
 Wouldn't be surprised if deletion requests top the list of spam requests 😂 
 The delete requests need to be signed and only work for the npub’s own events. I’m not quite sure of the way spammers would use them.  
 They create and delete events 
I mean, relays already need to have spam detection and intervention to decide which events to host. A rapid flow of create and delete messages is just another kind of spam. Also, if someone wants their messages to disappear, they can just use the expiration timestamp when posting.
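The point that delete requests only work for the npub's own events can be sketched as a relay-side check, roughly like this (event dicts in the usual Nostr shape; signature verification is assumed to have already happened when the events were accepted):

```python
def should_delete(delete_request: dict, target_event: dict) -> bool:
    """Relay-side sketch: honor a deletion request only if it comes
    from the author of the event it targets (NIP-09 semantics)."""
    if delete_request.get("kind") != 5:  # kind 5 = deletion request
        return False
    # Collect the event ids the request asks to delete ("e" tags).
    targeted = {tag[1] for tag in delete_request.get("tags", [])
                if len(tag) >= 2 and tag[0] == "e"}
    # The target must be named in the request AND authored by the requester.
    return (target_event.get("id") in targeted
            and target_event.get("pubkey") == delete_request.get("pubkey"))
```

Because the pubkey comparison fails for anyone else's events, a spammer's delete requests can only ever remove the spammer's own notes.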
 delete as much as you can, eventually it won't be enough  
 Delete on nostr works like this, but without the weapon.

https://image.nostr.build/b8e917998f5fe69dc8e72628d4b4bff5a0665ca8ed6aed5f0628e9224e39f05e.jpg 
 😂 
 Make people pay to delete? 😆 
 This is correct 😂 
 External audits of functionality of relays could be automated. Would be good. 
 cc  nostr:nprofile1qqswjxp3lsjywa4p7zr90m4mxyfv2ghfxckxxg8jgfqtcsht3jwew0qzpwn8k  
 I like your argument about how the situation is anyways similar to current centralized platforms. Someone may have screenshot your tweet before deletion.

On the other hand I’d like digital content to be storable forever canonically, e.g. by having a way-back relay which is allowed to store things forever. 
 everyone insists on using blaster and then complains they can't request a delete? 乁⁠༼⁠☯⁠‿⁠☯⁠✿⁠༽⁠ㄏ 
Pretty good to note that in centralized platforms:
- Deleted note might still be shown to other people for an extended period of time
- Centralized service provider will still probably keep a copy of your deleted note
- Someone else might still keep a local copy of your deleted note (a screenshot at least) and re-broadcast it
 Cause nostr developers are fucking lazy asses who don't give a fuck.  Isn't this obvious? Look how shitty all the apps are 
In fact, the delete option is necessary. We only talk about censorship resistance and assume that's privacy, but privacy also means having the capacity to delete your content, for your security too. There are too many malicious people and repressive parties. We should be able to protect our data from them. We should decide over our data. Otherwise, what's the difference between Nostr and big tech?
 It’s impossible to have a reliable control of data at the same time as decentralized sharing. These two paradigms do not go together. You either have a tight control of the data or you have a decentralized network where no single party controls it. The only alternative I can think of, is if every note is strongly encrypted and only you have the decryption key that you share to specific individuals and have control over how they access your notes.

For all the shit we give big tech, they have a much tighter control and are able to delete “notes” if they choose to do so. This ability comes from them retaining the control and legal agreements with the partners that have access to the same data. 

You can’t have your cake and eat it too 🐶🐾🫡
 I hope your idea would be implemented in the future

note1h7udsq3utnz0nv59z8yc7yx3rvthehzk9vge9uzk3s7zyf4srqjq8psrtz 
 I understand the desire to have control of your data. But fundamentally, the way the universe works, data about you is not actually your data - it belongs to and is controlled by whoever is in possession of it.  And that could be a hell of a lot of people. To delete properly, we would have to control everybody's computers and even confiscate the brain cells of the people who saw the note before you pressed delete.

IMHO we need to make people more aware of the fact that anything you put out into the universe cannot really ever be taken back or undone. There is no time travel to go back in time. There is no way to unsay something (and believe me I have said things that ended relationships that I really wanted to unsay). Be careful what you say because in reality, as it is in nostr, there is no delete.

Email has no delete. And email didn't fail because of it.

In many situations, more speech is the answer. Add more information to clarify whatever you said that was wrong. That is essentially what nostr delete actually is - a message indicating that you retract something. 
 Exactly the right take. 
 One day -- probably one day very soon -- a relay owner will be served with a subpoena to turn over all messages on that relay -- even deleted messages (We all know that "delete" does not mean erase).  That will be an interesting day.

CC: Rabble
CC: Genri 🇨🇺
CC: damus
CC: jb55
CC: btcviv
CC: vrod 
 I don't see why a relay wouldn't actually delete (unlike a client). My future relay will. Of course they don't have to and can't be forced to so we don't rely on it, but practically speaking deleting is in the relay's best interest in many ways: saves space, saves liability, etc.

If I were law enforcement, I'd just pull all the events over nostr and keep my investigation on the down low. Relays are (generally speaking) willing to serve everything to anybody. I guess DMs have changed that somewhat. 
The sad part of it is that we probably won’t know. Between 1999 and 2005, and less actively after, I helped run the indymedia.org servers. We got tons of subpoenas and worked with the EFF to fight them. Our servers were configured to never log IP addresses or any other PII we could find. Indymedia was something of a cross between an anarchist 4chan, a news site, and Twitter.

One thing the court orders often include is language which prevents the recipient from stating publicly they’ve been served. Organizations tried warrant canaries as a way of getting around those restrictions but that only works until the first time you get served. 

We should encourage and enable the easy use of Tor in clients and for relay operators. The server part is easy: we should expose onion services for all relays, plus document and promote them. Getting client support is harder; right now you can use the web apps from a Tor browser or set up Amethyst to use Tor. As far as I know, none of the iOS apps support it directly yet. Doing this will give nostr:npub1sn0wdenkukak0d9dfczzeacvhkrgz92ak56egt7vdgzn8pv2wfqqhrjdv9 a better user experience, as onion services are both faster and more private than connecting to a public gateway from Tor.

It would be great if all relays had clearly accessible privacy policies and they followed riseup’s as a model. They care about privacy. 

https://riseup.net/en/privacy-policy

We should make changes to the relay software so it logs no PII, like ip addresses, by default. Admins can enable those logs for debugging or dealing with a security issue, but they should be off by default. 

We should encourage nostr client developers to use tools and libraries that don’t connect to or log on third party services. I think this is more important for when people are publishing vs consuming. The trade off of using a CDN is probably worth it, but your mileage may vary. 

And when you do get a knock at the door call the EFF or other legal group for representation. Don’t assume your normal lawyer knows the law in this case. 

The EFF has released a legal guide for people running fediverse servers and almost all of it applies to people running nostr relays. 

https://www.eff.org/deeplinks/2022/12/user-generated-content-and-fediverse-legal-primer 
 Tor is the way. I'm casually working on new relay software - it won't log any PII.  Thanks for the links. Great stuff. 
 I understand your point, but I am referring to not leaving physical evidence that can be used against you. We all have the right to our security and to remain silent. I don't think e-mails are a good example because they are not designed for the use that is given to social networks. I already said it: the only enemy is not the big corporations because a single person, a political party, can be on your trail to use your data against you, and everything that happens in Nostr is public.

It's very nice that users have to think well before posting because it points more to the responsibility of each one and such, ok, but that's not what I'm talking about. I'm talking about an ethical dimension of the right to silence, to really have my data completely at my disposal, because otherwise I really don't see any difference with big tech because at least they do delete your data at customer level. Something has to be done in Nostr and in web3 in general, some technological solution because the philosophical vision is not enough, people want more practical and precise solutions. 
 There is no technical solution in a decentralized protocol. I think that is provably so. I'm sorry that nostr cannot provide you this "right". 
If anything posted online is impossible to erase permanently, then it might be a good approach to advise users of that fact (if it is one), and let them decide for themselves what kind of content, writings, or posts are appropriate. After all, digital social media is not exempt from “lethal bugs.” Responsibility is for everyone: platforms, users, regulators, anyone in this ecosystem.
Interesting point of view
 This is why I’ve largely stopped interacting on NOSTR. The ability to edit or delete prior posts is crucial to a platform that prioritizes privacy and expressive freedom. 
I have never felt the need to delete or edit anything and I’ve been on here for 2 years now. I will never understand this. nostr (in the social network variety) is not a private network and there is no expectation of privacy. All fuckups are public and that’s ok. People are held accountable that way.
 Yes 🙌 
In real life we can’t delete so why should we delete here 💜 
 I think it’s a question of what is Nostr for. And the protocol is designed for uncensorable speech. Unstoppable broadcasting in the face of totalitarian governments or social platforms. 

Delete-able notes are a bug in this context. 

That’s game changing for journalists, human rights advocates, and political dissidents. 

Nostr is just soooo close to perfect for many other social media use cases that sometimes it’s tempting to want comfortable features. 
 I’ve not felt the need to edit or delete any of my posts here either, but our comfort with that is not really the point. People are choosing decentralized platforms because they desire more self-governance on the internet, and when they’re told that they can’t even delete a post here, which is a fundamental feature pretty much anywhere else, they’re going to sour on NOSTR’s flavor of decentralization. 

Not to mention some people will inevitably say things that they later realize were incorrect or inappropriate or even hateful, and knowing that they can never fully disengage from their previous posts will make them less likely to use NOSTR long-term, especially if they have lots of followers. 
 Nostr is not private.  Yeah thats a problem, anonymity is useful and important for many. 
 It really is a 'request delete' function. I've done this where some relays will honor the request, and others will not. As long as the one relay still holds the note, it will still appear in most clients. 
 The clients really need to handle the delete events also I think, assuming that's what the user wants. They could also have an option to show events that have previously been deleted. 
Sometimes relays do not allow you to send two identical messages in a row. It would be more convenient if this practice became more popular, because nobody wants duplicates in their feed.
Anything that relies on client developers implementing things 100% correctly across the entire ecosystem will not be reliable. Implementing delete as a client dev is not simple either. I’ve already done it on Damus Web and it was the trickiest piece of code.
Delete is tricky because not only do you have to delete the event when the delete message comes in, but you also need to reject the deleted note if it comes in again later. So for every note that comes in you have to check "do I have a deletion request for this note?", which isn't cheap to do.
 You could dump the ids of deleted events in to a bloomfilter which is fast to check and small to store. 

At least in @nos.social, when you view a post it makes a request to relays to get replies, labels, and reports. Adding a request to see if the author deleted it isn’t hard. There would be a gap where the user might see the event, only to have the client update it with “the author of this post has deleted it.”

Or clients can just rely on the relays a user is connected to being well behaved. We, for example, won’t post a note with an expiration timestamp to relays that don’t announce their support for the relevant NIP.
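The Bloom-filter idea for tracking deleted event ids can be sketched in a few lines; the bit-array size and hash count here are illustrative, not tuned:

```python
import hashlib

class BloomFilter:
    """Tiny Bloom filter for deleted event ids (hex strings).

    Sketch only: size and hash count are illustrative, not tuned
    for any particular deletion volume.
    """

    def __init__(self, size_bits: int = 1 << 16, hashes: int = 4):
        self.size = size_bits
        self.hashes = hashes
        self.bits = bytearray(size_bits // 8)

    def _positions(self, event_id: str):
        # Derive k bit positions from one SHA-256 of the id,
        # 4 bytes of the digest per position.
        digest = hashlib.sha256(event_id.encode()).digest()
        for i in range(self.hashes):
            chunk = digest[i * 4:(i + 1) * 4]
            yield int.from_bytes(chunk, "big") % self.size

    def add(self, event_id: str) -> None:
        for pos in self._positions(event_id):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def might_contain(self, event_id: str) -> bool:
        # False means definitely not deleted; True means probably deleted.
        return all(self.bits[pos // 8] & (1 << (pos % 8))
                   for pos in self._positions(event_id))
```

A negative answer is definite, so the common case (note not deleted) costs one cheap check per incoming event; a positive answer should fall back to an exact lookup before rejecting the note, since Bloom filters can return false positives.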
 nostr:nevent1qqsyrtx9tlyh2az3z7xdm6d7l2hl09ez7x3fua89mj60wt9qu3m8g5qpp4mhxue69uhkummn9ekx7mqzyzqn5t8a5jdgsq0zr0kncguqu7r05n9dw57azkc67xdr02q4272hjqcyqqqqqqg6gw6vl
This might help