BitOneZero

joined 1 year ago
[–] BitOneZero@lemmy.world 26 points 1 year ago* (last edited 1 year ago) (6 children)

It is not even a mistake; it's pretty messed up on the part of @bamboo@lemmy.blahaj.zone to jump to such a conclusion.

[–] BitOneZero@lemmy.world 2 points 1 year ago

I think timestamps of files would be one of the easier things to work with: try to trace them back to the postings and comments that reference the upload, and ideally to the logged-in account (in a standard Lemmy install, only logged-in users can upload to pictrs).
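As a minimal sketch of the first step, assuming the pictrs media directory is on local disk (the directory path and function name here are hypothetical, not part of any Lemmy tooling), file modification times can be collected and sorted so they can be matched against post/comment timestamps:

```python
import os
from datetime import datetime, timezone

def list_uploads_by_mtime(media_dir):
    """Return (timestamp, path) pairs for every file under media_dir,
    oldest first, so uploads can be matched against post/comment times."""
    entries = []
    for root, _dirs, files in os.walk(media_dir):
        for name in files:
            path = os.path.join(root, name)
            mtime = os.path.getmtime(path)
            entries.append((datetime.fromtimestamp(mtime, tz=timezone.utc), path))
    entries.sort()
    return entries
```

From there, each timestamp can be compared against the created/updated columns on posts and comments to narrow down which account made the upload.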

[–] BitOneZero@lemmy.world 12 points 1 year ago* (last edited 1 year ago) (1 children)

Yes. Odd how people think sharing CSAM is why people would post here, instead of actually tracking down and prosecuting those sharing CSAM. Details about the users who shared CSAM content, such as timestamps, would help identify the offenders for prosecution.

[–] BitOneZero@lemmy.world 7 points 1 year ago

> It sounds like you’re encouraging people to share CSAM images found, which is obviously not the intent of this tool.

Yes, that is in fact the context. Context: "which is obviously not the intent of this tool."

It is not my intent to share the images, nor is it the context of the tool. Sharing details about the users, such as timestamps, would be the obvious context.

[–] BitOneZero@lemmy.world 61 points 1 year ago* (last edited 1 year ago) (18 children)

I hope people share the positive hits of CSAM and see how widespread the problem is...

DRAMATIC EDIT: the records lemmy_safety_local_storage.py identifies, not the images! @bamboo@lemmy.blahaj.zone seems to think it "sounds like" I am ACTIVELY encouraging the spreading of child pornography images... NO! I mean audit data, such as timestamps, the account that uploaded, etc. Once you have the timestamp, the nginx logs from a Lemmy server should help identify the IP address.
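To illustrate the nginx side, here is a rough sketch assuming the default "combined" access log format (the upload path and window size are assumptions for illustration): pull out lines whose request time falls within a small window around an upload's timestamp, surfacing candidate IPs.

```python
import re
from datetime import datetime, timedelta

# Matches the leading fields of nginx's default "combined" log format, e.g.:
# 203.0.113.7 - - [10/Aug/2023:14:03:22 +0000] "POST /pictrs/image HTTP/1.1" ...
LOG_RE = re.compile(r'^(\S+) \S+ \S+ \[([^\]]+)\] "([^"]*)"')

def candidate_ips(log_lines, upload_time, window_seconds=60, path="/pictrs/image"):
    """Yield (ip, request) pairs for requests to `path` made within
    +/- window_seconds of upload_time."""
    window = timedelta(seconds=window_seconds)
    for line in log_lines:
        m = LOG_RE.match(line)
        if not m:
            continue
        ip, stamp, request = m.groups()
        when = datetime.strptime(stamp, "%d/%b/%Y:%H:%M:%S %z")
        if path in request and abs(when - upload_time) <= window:
            yield ip, request
```

This only narrows candidates; correlating with the pictrs database and the session that made the request would be needed to tie an upload to a specific account.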

[–] BitOneZero@lemmy.world 1 points 1 year ago* (last edited 1 year ago)

> and avoiding link rot

Lemmy seems built to destroy information and rot links. Unlike Reddit, which has preserved content for 15 years, when a person deletes their account Lemmy removes all of their posts and comments, creating a black hole.

Not only do the deleting person's comments disappear; all the comments made by other users on those posts and comments disappear too.

Right now, a single user deleting one comment causes the entire branch of comment replies to disappear.
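The reply-branch behavior can be illustrated with a toy model (this is just an illustration of the visible effect, not Lemmy's actual schema or deletion logic): hiding one node hides every descendant under it.

```python
def visible_comments(parents, deleted):
    """Given parents[child_id] = parent_id (None for top level) and a set
    of deleted comment ids, return the ids still visible: a comment is
    hidden if it, or any ancestor in the reply chain, was deleted."""
    visible = set()
    for cid in parents:
        node, hidden = cid, False
        while node is not None:
            if node in deleted:
                hidden = True
                break
            node = parents[node]
        if not hidden:
            visible.add(cid)
    return visible
```

With a thread 1 ← 2 ← 3 and a sibling 4 under 1, deleting only comment 2 also hides 3, even though 3's author deleted nothing.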

Installing an instance can be done pretty quickly... over 1000 new instances went online in June because of the Reddit API change. But once an instance goes offline, all the communities hosted there are orphaned, and no cleanup code really exists to salvage any of it, because the whole system was built around deleting comments and posts; in the minds of the designers, deleting an instance is pretty much a purging of everything it ever created.

[–] BitOneZero@lemmy.world 5 points 1 year ago

> But lemmy.world should primarily communicate via lemmy imo…

I find the same attitude holds for developers who like to hang out in real-time Matrix chat and don't seem to use Lemmy itself very much, so things like code blocks mangling greater-than and less-than characters slip right into a release without much concern.

[–] BitOneZero@lemmy.world 16 points 1 year ago

I've found there is a culture among Lemmy developers and long-time operators of discussing in Discord or Matrix chat instead of "eating their own dogfood" and using Lemmy itself to openly discuss Lemmy technical and project issues. These chat services are legendary for keeping things away from search engines and from newcomers getting up to speed. Lemmy itself isn't nearly as search-engine friendly as Reddit traditionally was, and it seems like feedback needs to be given about how important it is to keep discussion about Lemmy in front of the people who actually use Lemmy...

[–] BitOneZero@lemmy.world 3 points 1 year ago

You mean "comment context" links? It's been that way for 10 days that I've noticed. There are previous posts about it, from 4 days ago: https://lemmy.world/post/2697806

[–] BitOneZero@lemmy.world 12 points 1 year ago* (last edited 1 year ago) (2 children)

Some people seem to be interpreting this to mean 11 million comments per day. I think it means the numbers are updated daily.

The numbers also don't make a lot of sense to me. The front page of lemmy.world says 620,000 (local origin) comments, yet Lemmy numbers comments sequentially per instance, mixing local and federated, and recent IDs look like 2,122,067. Lemmy.ml says 253,000 on its front page, while its index is showing 2,321,959 for a comment made today. I have to imagine these two servers are subscribed to a lot of stuff (including each other). I'd be surprised if there were more than 4 million unique comments in Lemmy. And there would be some kbin messages in the lemmy.world index as well.
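A quick back-of-envelope using the numbers quoted above (treating the front-page local count and the max sequential ID as exact, which they aren't quite) shows how much of each instance's ID space is federated content:

```python
def federated_fraction(local_count, max_id):
    """Estimate what share of an instance's sequential comment ids
    arrived via federation rather than from local users."""
    return 1 - local_count / max_id

# Figures quoted in the comment above (approximate):
lw = federated_fraction(620_000, 2_122_067)   # lemmy.world
ml = federated_fraction(253_000, 2_321_959)   # lemmy.ml
```

By this rough estimate, roughly 70% of lemmy.world's comment IDs and roughly 89% of lemmy.ml's are federated copies, which is consistent with the two big servers subscribing to most of each other's content.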

[–] BitOneZero@lemmy.world 1 points 1 year ago

> Thoughts?

I haven't tested 0.18.3 to see if new features were added to the front-end lemmy-ui, but based on my experience with earlier 0.18 releases... the "Sign Up" page of Lemmy needs a custom message for each instance, basically letting the admins introduce the instance. The current experience is pretty bad: on my instance I have registration closed, and lemmy-ui still presents "Sign Up" links and even the form. I think it's pretty important to get this into the back-end now so that the evolving independent front-ends all support a custom message shown above/below the Sign Up form.

Seems like something that shouldn't take a lot of coding to add (the admin screen already has a place to create custom messages like "Legal"), and it would be a good Lemmy network-wide focus on the newcomer experience.

[–] BitOneZero@lemmy.world 2 points 1 year ago (2 children)

I can confirm the problem; it's been going on all week. It really impacts anyone following a link from another instance: those links will fail.

As I understand the situation, lemmy.world has been suffering from performance problems, and certain comment links were being attacked by distributed clients. So they have basically firewalled /comment links for everyone (I assume using nginx, based on the behavior, or maybe the front-end cloud distributor).

Personally I'm interested to know which specific comment links cause the PostgreSQL performance problems as I'm trying to track down and fix those issues. But I haven't seen anyone detail which specific post/comment threads cause the problems... I've just seen the developers reduce loading to 50 and 300 without creating testing scripts to reproduce the issue for other developers to study.

I'm hoping lemmy.world can implement a less-drastic solution than 100% block of comment links from non-local referral origin... such as a rate limit on those links of 3 per 5 seconds or something low like that. Anyway, I hope you are having a good weekend.
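The less-drastic option could look something like the nginx fragment below. This is only a sketch of the "3 per 5 seconds" idea, not lemmy.world's actual config; the zone name and upstream are hypothetical, and nginx expresses sub-1r/s rates in requests per minute, so 3 per 5 seconds becomes 36r/m.

```nginx
# Hypothetical sketch: rate-limit /comment links instead of blocking them.
# ~3 requests per 5 seconds per client IP, with a small burst allowance.
limit_req_zone $binary_remote_addr zone=comment_links:10m rate=36r/m;

location ~ ^/comment/ {
    limit_req zone=comment_links burst=5 nodelay;
    proxy_pass http://lemmy_backend;   # assumed upstream name
}
```

Legitimate readers clicking through from a federated instance would almost never hit a limit like this, while a scripted flood against comment URLs would be throttled to a trickle.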

 

curl 'https://lemmy.world/api/v3/community/list?type_=Local&sort=Old&limit=1&page=250000&show_nsfw=true'

This returns a result for page 250,000, and if you edit that number +1 it still returns the same final item in the list. It does this on every sort I tried. I am not seeing this behavior on lemm.ee or lemmy.ml.

With the page limit at 50, it returns the same final list of 50 no matter how far past the last page you increment. This could be causing some front-end apps to endlessly load the list.
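A client can defend against this server behavior with a guard like the sketch below (the `fetch_page` callable is a stand-in for whatever function wraps the real `community/list` API call): stop when a page repeats or comes back short, instead of trusting the page counter.

```python
def fetch_all_pages(fetch_page, limit=50, max_pages=10_000):
    """Collect paginated results, stopping when a page repeats or comes
    back short -- guards against servers that clamp out-of-range page
    numbers and re-serve the final page forever."""
    previous = None
    items = []
    for page in range(1, max_pages + 1):
        batch = fetch_page(page=page, limit=limit)
        if not batch or batch == previous:
            break          # empty page, or the server re-served the last page
        items.extend(batch)
        previous = batch
        if len(batch) < limit:
            break          # short page: this was the real final page
    return items
```

Against a well-behaved server this terminates on the empty or short final page; against the clamping behavior described above it terminates as soon as the same final page is served twice.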

The lemmy-ui interface seems to have the same problem: https://lemmy.world/communities?listingType=Local&page=250001

 

I found one post from 12 hours ago that describes the problem: https://lemmy.ml/post/2650814 - but that's all I've seen. Content is way down on lemmy.ml, with nothing coming in from .world.

EDIT: it seems lemmy.ml is getting some content delivered to .world, but .world communities seem to be islands on .ml that only get new local posts/comments.

 

lemmy.world had a DDOS in the past 24 hours. Lemmy.ml was showing problems, but now it is entirely unreachable. lemmy.world was showing an "Error!" page (server still reachable), but that seems to have lasted only an hour or so.

 

504 Gateway Time-out nginx error on the home page.
