
Notes by Rabble

  To put a close on the nostr:npub14h23jzlyvumks4rvrz6ktk36dxfyru8qdf679k7q8uvxv0gm0vnsyqe2sh topi... 
 Thanks for taking the time. This needs to be a lot clearer and there needs to be better documentation about what’s going on. Until the app UI makes sense and we have proper docs, we’ll continue to see this kind of confusion and conflict.  
 What nostr clients create and / or use kind 1984 report events? 

Nos creates them and uses them as content warnings only if the report is from someone you follow. 

nostr:note10j8wwpj330xeahgq4uassfeskuuw2c29ewsgv6m4q5hut0e70vuq52vjye  
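For anyone wondering what these report events actually contain: a NIP-56 report is just a regular Nostr event of kind 1984, with "e" / "p" tags pointing at what's being reported. A rough sketch of the shape, with placeholder values only:

```python
# Illustrative shape of a kind 1984 report event per NIP-56.
# Every value below is a placeholder, not a real key or id.
report_event = {
    "kind": 1984,
    "pubkey": "<hex pubkey of the reporter>",
    "created_at": 1713200000,
    "tags": [
        # reporting one specific note: the "e" tag carries the report type
        ["e", "<id of the reported note>", "spam"],
        # the author of that note
        ["p", "<hex pubkey of the reported author>"],
    ],
    "content": "optional human-readable reason for the report",
    # "id" and "sig" are filled in when the event is signed
}
```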
 That is how some custom feeds work in BlueSky. And given the way custom feeds work there (they ar... 
 We’ve got custom feed support via DVMs, but they’re often not responsive enough and we need better integration into the apps; only a couple support the custom DVM feeds. They work so well for Bluesky, and they work for Nostr too, and we even have the ability to have both paid and free custom feed algorithms.  
 Doesn’t even need to be negative. Just something you don’t care for. Because majority content... 
 Yeah that’s what we found, even folks who liked or were neutral about bitcoin wanted to see posts about other stuff. We don’t need the enthusiastic bitcoiners to stop doing their thing. We need other folks and easy ways to discover a diversity of content and people.  
 Is there a reason we couldn't have #channels in nostr, where if a note contains some word, it is ... 
 I think it might be as simple as we don’t encourage it in the apps. Look at how Warpcast works. Simply putting it there in the post UI does a lot to change how people post. At the moment groups / channels / communities are a kind of neglected part of Nostr. 

https://image.nostr.build/b409fa439133fc19822cb8381becb15b5d4e6dfa2a574118dc70170de821b37b.jpg

https://image.nostr.build/e7a1da793fabb501da3272647879824669fbb44cd7105758fb456ab43f573c3d.jpg 
 Is this a feature specific to nostr:npub1pu3vqm4vzqpxsnhuc684dp2qaq6z69sf65yte4p39spcucv5lzmqswtf... 
 The mobile apps are required to have reporting functionality by Apple and Google. Some web apps have it and some don’t. Each app also decides what to do with reports, some with settings and others are just hard coded. 

The reports are for people who want them and relay operators to decide what content they want to see / host. Some people want to be trolls but then get upset when other people troll them. They want freedom for themselves but not other people. 

Just because there is a Nostr report doesn’t mean there is any centralized authority which could take action on these reports. That’s not the way Nostr works. 
 nostr:npub1wmr34t36fy03m8hvgl96zl3znndyzyaqhwmwdtshwmtkg03fetaqhjg240 you have a bot that reports... 
 We’ve thought about a bot that looks for reports and then lets both an AI and a person double-check them. The ones it agrees with could be republished with some more info.  
 If you can’t figure out how to handle an open source bot which clearly identifies itself and follows the Nostr specs then how are you going to handle a malicious actor that is intentionally deceptive?


nostr:note1j74qxdud2tcryvnhmhl6crp8qv7hl3duw8jghayjwcx5pejz9v8suydlha  
 I stood up on stage at both Nostr conferences and quite literally pointed out examples of offensive content and harassment on Nostr. Almost every Nostr developer saw it in person or watched the video. Everyone was very polite, even when we didn’t all agree, which is fine. Nobody assaulted me. I’m going to be speaking at Nostriga in August. I expect the same reaction then: some agreement, some people who disagree, and a lot of productive debate. 

I’m sorry to hear that you’re such a snowflake about what some stranger says that you’ll threaten violence and go to jail. Personally I’m not so easily offended. You’re welcome to run our open source nostr bot and publish a report on every one of my notes. I don’t threaten people with violence to silence their speech like you’re doing.  
 He was the one saying he would happily go to jail… I think it’s silly, and I was pointing out that I’ve safely talked about this, showing problematic Nostr posts to a room full of Nostr users. Nobody got so upset as to use violence.  
 Bots need a human hand behind them. Are humans infalible? I think not. Will the bots do dumb shit... 
 We talked to about a hundred current and former users of Nostr. There were many themes that emerged. One was that people who didn’t know about bitcoin kind of liked being able to do zaps. Another was that they wanted to see content that wasn’t about bitcoin; using it was cool, but constantly talking about it wasn’t their jam. Many users felt like the branding of Nostr as being anti-censorship and radically pro-free speech would draw a crowd they weren’t interested in being part of; that was a perception issue. Many users, and in particular women, had been repeatedly harassed and stalked. They were fine with people saying the most offensive stuff, but they didn’t want to see it in their notifications, DMs, on a global feed, in trending posts, or as replies to their posts. 

The content labeling and report functionality that people are complaining about are ways for Nostr users to control their own experience. We created content warnings and client filtering options because it solved a problem that was driving people away from Nostr. Now, one important thing: Nos and most Nostr apps only display and use reports from people you follow. To use and display all reports and labels from any relay and any npub is a bad idea. It opens you up to anyone who wants to spam you. The only folks who should see or care about @Reportinator are the people who follow it. Just like DMs are divided up between people you follow and randos. 
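That follow rule is simple enough to sketch in a few lines. This is just an illustration of the idea, not Nos’s actual code:

```python
# Illustration only (not Nos's real implementation): apply the
# "only use reports from people you follow" rule described above.
def reports_to_apply(report_events, followed_pubkeys):
    """Keep only kind 1984 reports authored by someone the user follows."""
    return [
        ev for ev in report_events
        if ev.get("kind") == 1984 and ev.get("pubkey") in followed_pubkeys
    ]
```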

Nostr is about you having sovereignty over what you publish and what your relays host. You choose which client you want to use, and that decides what data you see and which algorithms are used to sort the posts. You do not get a say over what other people publish or what they host in their relays. Some people want content labeling and reporting. Some people want to use a bot which does AI labeling. Is it perfect? Far from it. But it is part of the design of Nostr that there isn’t some centralized system of control. That means if other people want a bot that labels content, all you can do is choose not to use that content. You cannot censor the bot off of the network.  
 Google's fired 28 employees because they held a protest asking the company to change their policy around Israel. 

note1ukazeq7qfyzqyruaghtjcaw5rm6cfpm499cx6z5nps20xygr92xq72evya 
 "I don't want to see that note on my feed." - Scroll on.

"I don't want to see that npub on my fe... 
 Why is it that you think content reports like this are a centralized approach? Is it perhaps because you've chosen an app which centralizes things, so it feels centralized to you? This is a design decision from Amethyst that shows you all reports as posts and replies in your feed. You're choosing to pull this in. Nobody's pushing these notes to a big public relay, or into your client. There are many bots out there; you're choosing to care about them. No other apps on Nostr show random people's reports. 

What is centralized about a bot which looks at content it finds in its part of the nostr network and makes notes with content warnings when it finds offensive language? What about it is censorship? Do I need to send you a dictionary? 
 It’s open source and all the published report events are public. The code is here: https://github.com/planetary-social/cleanstr

You can get the reports with any of the nostr CLI tools or report viewer web apps. 

This is transparent and decentralized. If you have freedom to publish on nostr then others have freedom to publish as well. Unless what you want is freedom for yourself and not other people.  
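If you want to look at the raw reports yourself, fetching them from a relay is just a normal NIP-01 subscription filtered to kind 1984. A minimal sketch in Python; the relay URL is a placeholder and it assumes the `websockets` package:

```python
import asyncio
import json

import websockets  # pip install websockets

RELAY = "wss://relay.example.com"  # placeholder: use the relay that hosts the reports

async def fetch_reports(limit=20):
    async with websockets.connect(RELAY) as ws:
        # NIP-01 REQ: a subscription id plus a filter for kind 1984 report events
        await ws.send(json.dumps(["REQ", "reports", {"kinds": [1984], "limit": limit}]))
        events = []
        while True:
            msg = json.loads(await ws.recv())
            if msg[0] == "EVENT":
                events.append(msg[2])
            elif msg[0] == "EOSE":  # relay has sent all stored events
                break
        return events

if __name__ == "__main__":
    for ev in asyncio.run(fetch_reports()):
        print(ev["id"], [t for t in ev["tags"] if t[0] in ("e", "p")])
```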
 Sounds like you’re using a buggy Nostr client. Try a different one or change which relays you’re using. Nobody but you can change your contact list. Maybe find the bug in the software you’re using and fix it. The vast majority of nostr apps are open source and welcome contributions.  
 I don’t think that we should practice implementing censorship tools on nostr. It’s kind like ... 
 I believe some Nostr apps support this. We're planning on adding it to  @nos.social.  You can also mute, follow, or ignore any nostr feed, including those like  @Reportinator who are making reports you're not interested in seeing.  
 There's been a lot of discussion on Nostr of folks who are walking away from Nostr and why. 

note14nnde5lzqh0dgs2scywtys5h0fdpyldr4jg77dsnk3msm9x2ch5qjm8kap 
 Oh you do, and please do.  Please build your filters and promote them. But don't expect people wh... 
 The US Constitution's 1st Amendment is about what the US Congress does; it covers actions by the US government with regard to Americans. Nostr was not created by Americans, it's not run by Americans, and it's definitely not a project of the American government.  
 joey.nos.social 
 Welcome to nos! We’d love your feedback on your experience with the app.  
 Free speech does not always make you feel good.  

This is a product of living in a bubble for so... 
 So why are you all upset about my ‘crass’ labeling of your speech? Why do you get free speech to say what you want but somehow I shouldn’t have the same free speech rights to comment on it? 
 👋 So my first question: is it pronounced noss or nose? 
 Nose... it's based on the Portuguese / Spanish word for "us" which is "nos", and Nostr is a word play for 'our-str' or 'us-str' like napstr....  
 What I wanna know is who and/or what is this reporting to? Is this just a douche bag, or are the ... 
 It’s this code: https://github.com/planetary-social/cleanstr which publishes to its own relay. Nobody asked you to host these reports. Nobody told you to download them. Nobody told you to display them. Your client is connecting to my relay and searching out these reports. Don’t like it? Choose a client that doesn’t do that. Or learn to code and change it. It’s not my problem that you don’t understand how Nostr works.  
 The Nostr client requested these events and chose to display them. Most clients don’t do that, but you chose the one that does. I suggest trying a different client or making your own if you’re unhappy with the one you’ve got.  
 Hey nostr:npub1wmr34t36fy03m8hvgl96zl3znndyzyaqhwmwdtshwmtkg03fetaqhjg240, you're a fucking faggo... 
 Where exactly is the control? Your client is connecting to my relay and specifically requesting these reports. Don’t want them? Stop downloading them from my relay. Nobody here is telling you to display them, nobody is telling you to host them on a relay; this is your choice. 

Don’t like it? Change your app or mute nostr accounts you don’t like. Stop demanding that I be blocked or shadowbanned because you don’t like what I’m saying. That’s some centralized platform shit.  
 Perhaps you should consider meditation as it seems like you’ve got a lot of angry feelings.  
 Repeat after me: any attempt at globally imposed moderation service is going to be a failure, all... 
 All @Reportinator does is provide an opt-in way of choosing one kind of content labeling. Hell, at this point we don't even opt in the @nos.social users. It's got 24 people who follow it! 

I think there should be many different ways of doing moderation. All that this bot does is provide one take, which users can choose to use or not. There is no centralization in a bot with 24 followers posting to a single relay. 

Do you really think that all the Nostr developers would do all this work only to say, oh hey, let's have centralization after all? 
 I would say we have to let him do what he wants on his relay. I’m assuming other relays can jus... 
 Exactly, this is one nostr bot which posts to our own relay… 

https://github.com/nostr-protocol/nips/blob/master/56.md

Read the spec. Nostr has no magical centralized authority which is going to come somehow remove content from your app or relays. 

We’re labeling content so users can make choices about their own experience on Nostr. Apple and Google require the label process be called a “report” so that’s why it’s called that. That’s also why it’s called a 1984 event. 

But nobody’s forcing you to use apps which pay attention to these events. Nobody’s asking you to connect to the relay that hosts them.  
 I'm reporting nostr:npub14h23jzlyvumks4rvrz6ktk36dxfyru8qdf679k7q8uvxv0gm0vnsyqe2sh for SPAM. 

I... 
 We only publish the @Reportinator events to one relay. If you don't like it, don't connect to the relay and mute the account. These reports aren't being spammed; your client is seeking them out. We do not publish these to anybody else's relays. 

A major problem is that Amethyst is not using the Web of Trust model; instead it is pulling in reports from anybody, regardless of your connection to them, and highlighting the labels both as replies and as new posts. These are kind 1984 events; they're labels, not replies. 
 Here's the spec... it exists because Apple and Google require it.  

https://github.com/nostr-protocol/nips/blob/master/56.md 
 This comes from @jb55 who got in a big fight with Apple about it when they blocked Damus from updating until they added reporting. Nostr is open source: use a non-mobile or self-compiled app and have it simply not request or display any kind 1984 events. Don't like it? Build software which doesn't support it. If you've got freedom to say whatever the hell you want, I've got the freedom to reply how the hell I want. 

nevent1qqsp5rfrpn7llcky7js3ppm85tl9pmznkjzy56l3fdcw69nzytlextqzyqewrqnkx4zsaweutf739s0cu7et29zrntqs5elw70vlm8zudr3y2s35r7n 
 Here's my talk from the last Nostr conference about why we built  @Reportinator. 

https://www.youtube.com/watch?v=9pGZ2epF8ZY

 
 In our user research, we found that most people joining Nostr get turned off and leave when somebody harasses them or when they see harassment. So a way to protect free speech and also address these issues is to give users a way to choose labels and filters to shape their experience. It empowers relay operators to choose what they want to host, and users to decide what they want to see in their clients. 

Nobody's telling you that you can't say whatever the hell you want on Nostr, but it is giving people tools to shape their social experience. 

We see lots of women join Nostr, love what it offers, and then be harassed until they leave. This has happened many times. While you may want a platform which feels like 4chan, lots of others don't. You should be able to have your flamewar fun time while others don't have to deal with it. 

Nostr is held back because of the reputation of the content on the platform, not the quality of the apps and the features we're building. I'm saying we should be able to use the protocol with communities that have many different norms. 

Look at how many people are accusing Nostr of being a pro-nazi platform... 

https://www.google.com/search?q=nostr+nazi+bar 
 What makes you think that one nostr bot posting labeling notes only to its own relay is a centralized claim of authority? The problem is that @Vitor Pamplona has decided in his app that all report notes should be shown in the global feed and as replies, even if they're not from somebody you follow or care about. Every other app just uses these if they're relevant to the user. Switch apps, because this IS handled by nostr clients and you're choosing a client which throws it in your face. 

To be clear, @Vitor Pamplona is totally within his rights as a nostr client developer to decide how he wants to display reports to his users. Amethyst is open source; he's happy for people to fork it and make versions that work differently, or better yet, provide PRs to change its functionality. 

I think "report" is a terrible name for content labeling because it keeps people thinking that there's somehow some magical secret centralized authority which is getting those reports and taking action on them. That's not how Nostr works. I think we should change the NIP to say 'content labeling' and 'flagging users' / 'flagging content', because that's what's happening in Nostr. The apps in the App Store / Google Play will still need to say Report in the UI, but that's their platform monopoly requirements. Relay operators have the power to choose what content they'll host. Almost all do filtering and drop content based on their own decisions. Nostr isn't a 'you have to listen to me' protocol, nor is it a 'you have to host my content' protocol. You get the ability to say what you want, but so does everybody else. If you want to make sure your content is online, run a relay. That's what we did with @Reportinator, and we don't put its content on anybody else's relays at all. 

And because it's all open source, you can compile your own version without any content reporting or labeling.  
 I’m not the snowflake demanding somebody else stop using Nostr how they like. You’re the one demanding I stop posting stuff on Nostr that you don’t like.  
 I write open source software on an open source permissionless protocol. Relays are only centralized if you choose to make them so; if you want to only use the Damus relay, then you're choosing to give @jb55 authority over what content you can publish or see. 

If you're so concerned about this issue, I strongly recommend you set up a relay. I recommend strfry: https://github.com/hoytech/strfry

And develop a strfry plugin that rejects all kind 1984 events.

Here are the instructions: https://github.com/hoytech/strfry/blob/master/docs/plugins.md
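A rough sketch of what such a plugin could look like, assuming the line-delimited JSON stdin/stdout protocol described in those plugin docs (check them for the exact field names):

```python
#!/usr/bin/env python3
# Sketch of a strfry write-policy plugin that rejects kind 1984 report events.
import json
import sys

for line in sys.stdin:
    req = json.loads(line)       # one JSON object per incoming event
    event = req["event"]
    if event.get("kind") == 1984:
        res = {"id": event["id"], "action": "reject",
               "msg": "blocked: this relay does not accept kind 1984 events"}
    else:
        res = {"id": event["id"], "action": "accept"}
    print(json.dumps(res), flush=True)   # one JSON decision per line back to strfry
```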

And if you want to run a service that reports all of my content, you're welcome to do so. Here's the code which is open source and you're free to use and modify as you see fit. https://github.com/planetary-social/reportinator_server

See, I'm not telling you what to say or do. Go for it. All I ask is that you realize free speech only works if it is for everybody.  
 Censorship is about suppressing speech, but somehow you're saying that my speech, which you're choosing to see, is suppressing your speech? Just use an app that doesn't support kind 1984 events. Use relays that don't support kind 1984 events. Mute bots you don't like. 

You also might want to google me before you think that my projects are doomed or that I've just appeared on the scene. ;-D 
 Look, if somehow I managed to convince all the nostr devs and folks managing the nostr NIP process to do this, then you could just make versions of the apps and spec which didn't do that. We're an open source permissionless decentralized protocol. 

Why is it you get to reply to me but I can't reply back to you? Do you not like the @Reportinator events? Why not make a bot that reports all the reports? Or use a client that doesn't choose to read them. You're choosing to see this stuff you're upset with. Just don't.  
 Exactly. 
 You've spent more time replying to me on here than it would have taken to watch the talk. But again, it's your choice about what media you consume. Do you want the bot to be taken offline? Do you think that content reports are inappropriate speech on Nostr? 

Do you have any idea how Nostr works? @Reportinator is not a centralized service. It's a single nostr identity which posts content reports on what it sees on the nostr network to its own relay. 

Where in the Nostr spec does it say you must go and read content from Reportinator? Or that you must remove content from your relay because of what's published by some random bot? Where does it say that your open source nostr client should request Reportinator events from the Reportinator relay and use them to shape what you see?

How is any of that centralization? 

The code for this is open source, running on our servers, nobody has to pay attention to it. 

https://github.com/planetary-social/cleanstr

In fact, most clients and relays completely ignore 1984 events. But why don't you make a version that labels any content you feel is woke so that you can filter that out of your feed? I think that'd be a good thing; it's not what I want to run, but I've created the tools for you to do that.  
 Nobody's asking you to use this bot. Nobody's asking you to follow it. Just like nobody on Nostr has any obligation to host or view your content.  
 I'm not sure, but it sounds like what you're seeing is a bug. The spec says that all 1984 events should have a tag for the person and, if the report is about specific content, also a tag for the event being labeled.

https://github.com/nostr-protocol/nips/blob/master/56.md

The thing is, some clients have a bug where the client sees the person tag and the event tag and displays that event label for everything the person posts. This is not how the NIP is supposed to work. 

The @Reportinator bot is only labeling specific events; you should only see it on a single note, if that particular note has an issue it wants to label. Some clients are labeling EVERYTHING a person posts because a single piece of content got a report. 
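To spell out the difference, here's a small illustration (not any client's actual code) of how a client could tell an event-scoped report from a profile-scoped one:

```python
# Illustration of the distinction above: an "e" tag means the report targets
# specific notes; only a "p" tag means it targets the profile as a whole.
# Applying an event-scoped report to everything the person posts is the bug.
def report_scope(report_event):
    tags = report_event.get("tags", [])
    event_ids = [t[1] for t in tags if t and t[0] == "e"]
    authors = [t[1] for t in tags if t and t[0] == "p"]
    if event_ids:
        return {"scope": "events", "event_ids": event_ids, "authors": authors}
    return {"scope": "profile", "authors": authors}
```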

There also needs to be some way to either 'confirm' or 'reject' a report. Maybe Bob makes a report, but Alice thinks it's BS... If I follow both Bob and Alice then I should be able to say: show both reports, but make Alice's judgement take precedence. Or say, 'stop showing all reports from Bob'; he's got good content but terrible judgement when it comes to reports.  
 I got my kids to stop saying “fuck” when they were little by explaining what it means. Kind of like I’m explaining how Nostr works to folks freaking out about someone else on Nostr publishing report and label events that reference their content.  
 Grammar users not need either!  
 Yeah, the OpenAI model isn’t great. We’re working on an update that sends all of its reports to a person to review, and which will delete them if the person decides the bot made a mistake. 
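At the protocol level, “delete them” would presumably be a NIP-09 deletion request (kind 5) pointing at the mistaken report. A minimal sketch of the event, with signing and publishing left out:

```python
# Sketch only: build a NIP-09 deletion request (kind 5) for a mistaken report.
def deletion_request_for(report_event_id, reason="report withdrawn after human review"):
    return {
        "kind": 5,
        "tags": [["e", report_event_id]],  # the report event being retracted
        "content": reason,
        # created_at, pubkey, id, and sig are added when the event is signed
    }
```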

It’d be great to have a self hosted and trained bot which did a better job. 

This is an experimental bot followed by a couple dozen people and posting events to a single otherwise read only relay. 

We don’t even suggest Nos users follow it. I think we should suggest it to users but there are people on our team who also think it’s not accurate enough yet and we need human intervention. 

I want there to be many bots like this which all make different judgements about content. You can choose to follow none of them, one, or several. 
 Exactly, nobody’s saying you need to follow a report bot, or that somehow my report bot should be the one and only source for reports. If you don’t want it, mute it. If you don’t like reports at all, use an app that lets you disable them entirely. 

I also see the need for folks to be able to send the report to somebody else and have them check it and publish it. Sometimes people don’t want to face the backlash that can come with posting reports. Obviously a reposting bot would need a person, or a person and an AI, checking the content. Something which was just a pass-through would work, but is probably a vector for abusive behavior.  
 Yes it'll go out to the fediverse, we want your fediverse address to be bostonabrams@nos.social but right now it's this much less pretty: @2536db06a9c7f1fd21f87e97c799e4dedf9c71f556395b21d2b2be821bc70957@mostr.pub

Some fediverse servers show comments from people on third party servers who they're not following, others don't. I'm not sure what  @ee9c46e5 does.  
 Are there any other #nostr clients that could work with Tor? nostr:note1mz5j32l49kg0l94x9kk2shz2q... 
 I know that amethyst has some support for this. One thing is we would benefit from more onion service relays.  
 At least a quarter of my thoughts that I consider posting never get sent... because I decide they... 
 The need for extra intense privacy seems to be a downside to non-custodial wallets and crypto… not saying it’s not worth it, but I don’t hear folks mentioning that as a tradeoff.  
 I smashed my toe and now I’ve got an official medical report on my injury as a chair leg vs toe fight. The chair leg won and the toe is losing its nail.  

https://image.nostr.build/12a8bd48946b929ee70dcf342d8cacfb1feb9d9c6db09738e91ea1996411ed5c.jpg 
 Yes apparently it will. But it might take a while. Sometimes a year or two! 
 We should build good tools for connecting Ghost and Nostr. I think this would be a killer feature which would drive creators to publish to the nostr paid long form content apps like highlighter.com, letr.co, yakihonne, and creatr.nostr.wine. 

One thing which has come up in our user research is that most journalists and creators are using tools to manage their publishing to social media platforms as opposed to the apps that people use to read / reply / share content. We need to work with those tools. 

nostr:note1a6av5puatevd9exvt5ahdlu4yddcjla06t69kazjpj4gtgjd38cqptwy6q 
 
 Ghost is kind of like a substack meets wordpress open source cms. 

It’s really good. 

https://ghost.org 
 The goal of Reportinator is to provide a service that people can opt in to using for content warnings. It’s not perfect and we’re not telling anyone they have to use it. Don’t like it? Mute it. Don’t host its events on your relays. 

It doesn’t always get it right, but we’re tweaking it. We’re also working on adding an easier way for people to review and edit or delete reports. 

Our goal is that there should be many different Reportinator-like services with different takes. The Thank God For Nostr folks can do one consistent with Christian values. The sex workers can do one from the perspective of embracing sexual content but helping protect those same creators from harassment. A bitcoiner might make one that labels all non-bitcoin crypto content as “shitcoin propaganda”. 

Relays can then decide what content and users they want to host, having content labeled makes that process a lot easier. Most relays already have algorithms to decide what content they’ll host and what they reject. This just gives them better tools. If you don’t like the policies of relays then run your own!  
 @Ronin I appreciate your concerns, and what Reportinator is right now is a flawed experiment. Sometimes it helps, sometimes it gets things wrong. Let’s talk. Also, I think there are some straight-up bugs where it should be reporting a post and not a person, or where it flags people talking about being harassed as harassment. I also think this is about content labeling so users can be empowered to shape their own experience; the name “report” comes from app store requirements and is causing a lot of confusion. This can be better.  
 What app are you even seeing these in? Kind 1984 & 1985 events are labels and shouldn’t be displayed as if they’re notes. Some apps are confused about how these events are supposed to work.  
 The sentiment is good, and we need a way of protecting user privacy and data. The problem is with the way Apple decides whether or not they’re collecting data and what counts as third party. 

If an app developer doesn’t look at how their app is used then it won’t improve as fast as the apps that collect a lot of analytics. Where are we losing users? Does this new design work better? Which of these two ways of doing something is better? That is all user data collection, even if it is never used for ads or commercially. 

The other issue is third party data sharing. Under Apple’s rules, using a SaaS analytics service is considered exactly the same thing as selling all user data to data brokers for resale. It’s insane. The cost of running everything without any SaaS services is really high. And for what? Posthog and others have committed to not using or selling their customers’ data. 

With Nos we run our own notification system which talks directly to Apple, but all apps that use a service for notifications are sharing data with third parties. 

I think most developers lie or ignore Apple’s rules about hosted SaaS services. But that just means they’re vulnerable to being taken down. 

So yes. Don’t collect what you don’t need for bug / crash reporting, don’t sell data or give it away. But current appstore rules don’t really cover those cases at all.  
 Does anybody know who's responsible for this dumbass retarded bot?
nostr:nevent1qqsg4n85rt75nl4ju... 
 If you don’t like the reports and content warnings just don’t use them. Mute @Reportinator and then it’s not part of your experience. This is entirely an opt-in permissionless system for reporting, content warnings, and moderation. If you don’t want it, don’t use it. 

Freedom of speech means you can say the shit you want and I can choose to say what I want. That includes publishing labels about you and your content. Feel free to publish content warning labels about me. Or are you a snowflake? Do you want free speech for yourself and only the people you agree with?  
 Because it’s posting content label and report notes kind 1984 & 1985. If your client doesn’t support those event kinds you won’t see anything. There’s a lot of nostr beyond kind 1. 

And again somehow people have magical thinking about content labeling and reporting. There is no centralized authority which takes action based on these reports. Some clients use them to help users decide what content they want to see. Some relays use them to decide what content they want to host. If you don’t like one bot’s reports then mute it. If you don’t like the policies of a relay, run your own. A free open network doesn’t mean anyone is obliged to see or host your content.  
 That’s fine, as long as you are satisfied with free offering, no need. 🐶🐾🫂 
 Generous but actually we need to make sure a media host for nostr is something which can be a sustainable business.  
 Any idea when? This is new to me. 
 The press release went out April 10th! It’s totally new.  
 I have a bone to grind with you, Nostr. I’ve been slaving away trying to build a completely new... 
 This stuff is hard. As the primary media host you’re also taking on a bunch of unsexy work and responsibility.  
 Is there a Nostr equivalent of substack?

#AskNostr 
 Habla.news is kind of being merged into highlighter.com. There’s also Yakihonne.  
 Corruption indexes are inherently subjective, but that doesn’t mean they’re useless. When people think everyone else is honest and trustworthy, they’ll act better themselves. It’s a story here in New Zealand because for the first time ever the country isn’t ranked first or second least corrupt. 

https://www.transparency.org.nz/blog/new-zealands-score-slips-in-latest-corruption-perceptions-index-now-ranked-third

https://image.nostr.build/d6db92937c1239128ce233e955f6a0744a6afab3c9758cc9206417d7f28c50c2.jpg 
 Uruguayans like to complain but they don’t realize how good things are in comparison to the region.  
 It didn’t even occur to me to look at twitter to figure out what’s going on with those Iran drones.  
 Cool, looking forward to it.  
 Are the IIW sessions only in person, or are there videos or notes? 
 We're fixing bugs and streamlining the app as we get ready for the Creator Residency and Journalism Accelerator. 

nostr:note10ngad926aazdd82vna252a9862du74t909vej2p430v7kaz4zf5s5nuge8