Meta (lemm.ee)

3467 readers
34 users here now

lemm.ee Meta

This is a community for discussion about this particular Lemmy instance.

News and updates about lemm.ee will be posted here, so if that's something that interests you, make sure to subscribe!


Rules:


If you're a Discord user, you can also join our Discord server: https://discord.gg/XM9nZwUn9K

Discord is only a back-up channel, !meta@lemm.ee will always be the main place for lemm.ee communications.


If you need help with anything, please post in !support instead.

founded 1 year ago
MODERATORS
1
 
 

Hey there! I recently got server banned b/c I was sharing a link in men's communities and I was mistaken for a bot.

I didn't have a way to respond to the ban. I was just logged out. I didn't see a link or even which mod issued the ban so that I could reply to them, and I couldn't get into my account to reply anyway.

It'd be nice to maybe get a "if you think this was in error" email or something.

Although I know that if lemmy was the size of reddit, people would hook that email up to chatgpt, so :/

Anyway, it'd be a nice feature to have. I doubt I'll need it again, but I was quite confused as to what to do.

2
 
 

Hey all!

Upcoming lemm.ee cakeday

Can you believe that lemm.ee is almost 1 year old? In just a couple of weeks (specifically, on the 9th of June), we will be able to celebrate our first instance cakeday.

I am thinking of compiling some stats about how lemm.ee has been used in its first year. If there are any specific stats you would like to see, feel free to comment below - I will try to accommodate any ideas as I start gathering this info!

Infrastructure updates

A few weeks ago, I posted about plans to make some changes to our infrastructure in order to deal with different intermittent networking issues. It took a bit longer than I hoped (I just did not manage to get enough free time between then and now), but I am happy to report that this work has now been completed! Additionally, I have decommissioned our stand-alone pict-rs server.

With the two changes mentioned above, I believe lemm.ee should now be much more resilient going forward, and I expect a significantly lower rate of infrastructure-related issues for the rest of the year!

I'll leave a technical overview of the problem & solution below for those interested, but if these details don't interest you, you can safely skip the rest of this post.


For context, lemm.ee has been hosted on Hetzner servers for most of this year (having migrated from DigitalOcean initially), with everything except our database being hosted on the Hetzner Cloud side, and the database itself living on a powerful dedicated Hetzner server. This mix allows a great amount of flexibility for redeploying and horizontally scaling our application servers, while still allowing a really cost-effective way of hosting a quite resource-hungry database.

In order to facilitate networking between the cloud servers and the dedicated database server (which live in different networks), Hetzner provides a service named "vSwitch". This service basically allows you to connect different servers together in a private network. Unfortunately, I discovered quite quickly that this service is very unreliable. During the short few months that we have been using the vSwitch, we have gone through one extended period of downtime (where the service was just completely broken for several hours), as well as dozens (if not hundreds at this point) of intermittent disconnects, where servers randomly lose their connections over the vSwitch. After such a disconnect, the connection never recovers without manual intervention.

For most lemm.ee users, these vSwitch issues have been largely invisible, as we have redundancy in our servers - if one server loses its connection to the database, other servers will take over the load. Additionally, I have generally been able to respond quite quickly to issues by redeploying the broken servers (or deploying other temporary workarounds). However, in addition to the huge number of issues which lemm.ee users hopefully haven't ever noticed, there have also been a few short periods of downtime this year so far, as well as a few cases of federation delays. These more extreme cases were generally caused by multiple servers losing their vSwitch connections at the same time.

After several attempts to work around these issues, I decided that we need to migrate away from vSwitch.

As of earlier today, lemm.ee is no longer using Hetzner's vSwitch at all!

I finally found enough time earlier today to focus on this migration, and I was able to successfully complete it. None of our networking is relying on the vSwitch anymore.

In the end, I went with quite a simple solution - I configured a host-level firewall (nftables) on our dedicated database server, which denies all connections by default. Whenever any cloud servers are added or removed, their corresponding public IP addresses are added to or removed from the allowlist of our database firewall (a rough sketch of this kind of setup is below). It would have been ideal to implement all of this logic in Hetzner's own firewall, but that one unfortunately has a limit of only 10 rules per server, which is just not enough for our setup.
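For illustration only, here is a minimal nftables sketch of this kind of allowlist-based setup. All names, the Postgres port (5432), and the example address are placeholders - this is not lemm.ee's actual configuration, just the general shape of it:

  # one-time setup: default-deny input chain plus a named set for allowed app servers
  nft add table inet lemmy_db
  nft add set inet lemmy_db app_servers '{ type ipv4_addr; }'
  nft add chain inet lemmy_db input '{ type filter hook input priority 0; policy drop; }'
  nft add rule inet lemmy_db input ct state established,related accept
  nft add rule inet lemmy_db input iif lo accept
  nft add rule inet lemmy_db input tcp dport 22 accept
  nft add rule inet lemmy_db input tcp dport 5432 ip saddr @app_servers accept

  # whenever a cloud server is added or removed, update the allowlist
  nft add element inet lemmy_db app_servers '{ 203.0.113.10 }'
  nft delete element inet lemmy_db app_servers '{ 203.0.113.10 }'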

Bonus: our pict-rs server has been decommissioned!

Pict-rs is the software which Lemmy uses for everything related to media (image storage mostly). Initially, pict-rs required a local filesystem to store both files and metadata about those files. Since the beginning, lemm.ee has used a dedicated server just for pict-rs, in order to ensure we could easily redeploy the rest of our servers without losing any images.

Over the past year, pict-rs has gained the ability to store files in object storage, and metadata in a PostgreSQL database. This meant that the server running pict-rs itself no longer contained any of the important data, so it became possible to redeploy without losing any images. Additionally, this meant that it would be possible to run multiple pict-rs servers in parallel.

While we had already migrated our pict-rs server to use object storage and PostgreSQL several months ago, we still had the single dedicated pict-rs server up until today. I have been planning for a while to decommission this server and start running pict-rs directly on each one of our Lemmy application servers. Earlier today, I was able to complete this plan. This should hopefully mean that pict-rs is less likely to get overloaded, and it also means a tiny reduction in our overall monthly infrastructure bill (due to one less server running).

With the above changes, I think our infrastructure has become more robust, and hopefully we will experience fewer issues with images, federation, and general downtime going forward.


That's all from me for now. Feel free to leave any thoughts or questions in the comments, and as always, I hope you're having a great day!

3
 
 

Hey folks!

This is a quick notice about a change to our moderation policy.

We have had a policy on lemm.ee for administration and federation nearly since the very beginning. This policy has also always included a section about moderator responsibilities. Today, we have made two changes to this policy:

  1. The policy has been renamed to Policy for administration, moderation, federation - this is to make it clear that the policy is also relevant for mods
  2. We have introduced a new responsibility for moderators: they must "Ensure that they only provide accurate and clear reasons for mod actions".

The reason for the addition is that mod log actions federate out to other instances, and are more or less permanent (due to how Lemmy and federation work right now). This means that users currently have no easy way to clarify or defend themselves against inaccurate accusations in the mod log.

As always, I am very grateful to all mods for your efforts in building awesome communities on lemm.ee. I hope you can understand why this new policy is necessary - I do not want to make your lives more difficult; the goal is just to reduce mod log related misunderstandings in the future.

Thank you for reading and have a nice day!

4
 
 

My community is:
https://lemm.ee/c/eurographicnovels

The post I was working on was:
https://lemm.ee/post/2890991

To be clear: I in no way, shape, or form intend to delete my community. I wish the community to remain undeleted, thanks.

In case it matters:

57 users / day
150 users / week
444 users / month
1.53K users / 6 months
635 subscribers
317 Posts
902 Comments

EDIT: Google retains the specific URLs of a bunch of our posts, such as the "Moebius" ones.

5
1
submitted 6 months ago* (last edited 5 months ago) by sunaurus@lemm.ee to c/meta@lemm.ee
 
 

Hey folks!

We unfortunately had about half an hour of unplanned downtime today. This was caused by an issue with our hosting provider. The issue is solved for now, and I am planning to make some changes to prevent similar issues in the future. Sorry for the inconvenience!


Technical details

Our servers are communicating with our database over Hetzner's "vSwitch" service. Unfortunately, this service seems to be quite flaky - over the past few months, I have had to deal with the connection just dropping without recovering many times. Mostly this has not resulted in any noticeable downtime, as we have redundant servers, so even if one of them stops working, it won't affect lemm.ee users. However, in this instance, all of our API servers lost their connection to our database at the same time, which resulted in actual downtime.

I have now decided to migrate our setup away from the vSwitch in the near future to hopefully stop these issues for good. It should be possible to do this migration without any downtime; I just need to set aside some time to actually create an alternative solution for us, most likely over the coming weekend. I will update this post once the migration is complete.

Update: the migration is now complete! You can read more here.

6
 
 

EDIT: I should have posted this in the Support community - as others are also doing.

https://lemm.ee/c/support


Just noticed tonight that when I visit my FullMoviesOnYouTube community, the banner image is not shown.

I tried linking directly to it:

https://lemm.ee/pictrs/image/642b333b-5c37-4d39-af9f-cc876de484fc.webp?format=webp

And got this error:

{"error":"unknown","message":"Request error: error sending request for url (http://10.0.0.3:8080/image/process.webp?src=642b333b-5c37-4d39-af9f-cc876de484fc.webp): operation timed out"}

Any idea what's up? It's definitely been working as of just last week.

7
 
 

Hello!

I noticed that a post I created from my sopuli account to a lemm.ee community federated over here, but then did not federate out to any other instances.

Is lemm.ee dealing with similar federation woes as lemmy.world?

8
 
 

I have been using Imgur to upload pictures, but none of my posts show a thumbnail for me in Jerboa. Thumbnails do show up for at least some posts on lemmy.world, so I assume this may be something lemm.ee-specific? Thumbnails seem to be shown for images hosted on catbox.moe or other instances. I just don't get what the difference is.

What is it that would make thumbnails show up?

9
 
 

Hey folks

This is just a quick heads up that I need to perform some maintenance & upgrades on our database server, which unfortunately will require downtime. I don't expect the downtime to last for longer than 2-3 minutes, but just wanted to give a heads up first so you know not to be concerned.

That's all, hope you have a great week!

Edit: maintenance complete!

10
 
 

There are presently !asklemmy@lemm.ee and !asklemmee@lemm.ee, but both have low activity and appear to be unmoderated. Removing communities is probably reserved as a last resort, and seeking some moderators may be preferable, so I thought I might ask about 'em to raise some attention.

These are the kinds of communities that could get pretty rough if left in their current unattended state, I think.

11
 
 

Hello, friends!

TL;DR: I am working on a new Lemmy frontend in NextJS. There is still much work to be done, but you can already have an early look at https://next.lemm.ee

First of all, quick note to lemm.ee users: I am making this announcement post in !meta@lemm.ee, as this is also a notice that I will be hosting an alternative frontend (lemmy-ui-next) for the first time on lemm.ee. Going forward, I will post updates about lemmy-ui-next in a separate dedicated community: !lemmy_ui_next@lemm.ee. If you're interested in future updates, please subscribe there!

What is lemmy-ui-next?

Lemmy is generally accessed through some kind of frontend UI. By default, Lemmy provides its own web interface (lemmy-ui), which you can find on the front page of most Lemmy instances (including lemm.ee). There are also several other independent frontends, for both the web and different mobile platforms, which I'm sure many of you are familiar with.

Lemmy-ui-next is a brand new alternative frontend, built from the ground up with modern and popular tooling - a framework known as NextJS. Lemmy-ui-next has (or aims to have) the following high-level features:

  • Open source (AGPL)
  • Drop-in replacement for lemmy-ui - same exact URL structure, so all existing links will continue working
  • Very plain & minimalistic UI, strongly inspired by other link aggregator sites (of course including the original lemmy-ui!)
  • Very basic and "typical" NextJS architecture, to encourage open source contributions
  • Fully functional even when JavaScript is disabled (but works better with JS enabled!)
  • Optimized data transfer between your browser and the server (filtering out only relevant data from the Lemmy API, caching, memoization)
  • Strong focus on privacy and security (all authentication with the Lemmy API is done through secure httpOnly cookies, user IP addresses are not leaked to external image hosts, etc)

What is the current status of lemmy-ui-next?

I have mentally split the initial work I want to complete into 3 milestones:

  1. Lurk - All read-only features of Lemmy
  2. Participate - Voting/posting/commenting/DMs/reports, etc
  3. Moderate - Handling reports, creating & managing communities, etc

I am now nearing completion of the first milestone. It's not 100% there yet, but you can already log in, browse, subscribe to communities and even vote. Some things may still look a bit wonky, and some features are still missing, but the core experience is getting there.

In terms of code contributions, I would ask anybody who is interested in getting involved to contact me first before working on anything. I am not looking for PRs just yet - the code structure is still a bit loose, and I am redefining it as I add more stuff. I would ideally really like to complete the first 3 milestones before opening things up for external contributors.

Who can use lemmy-ui-next?

At the moment, it is only hosted on this instance, at https://next.lemm.ee. I do not yet have any formal instructions for running it on other instances, but generally speaking, it is a simple NextJS app - to deploy it, you just need to do: npm install, npm run build and LEMMY_BACKEND=https://<your lemmy api here> npm run start.
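As a rough sketch, the full sequence looks something like this (run from a checkout of the lemmy-ui-next repository; the LEMMY_BACKEND placeholder is the URL of whichever Lemmy API you want the frontend to talk to):

  # install dependencies and create a production build
  npm install
  npm run build

  # start the server, pointing it at your instance's Lemmy API
  LEMMY_BACKEND=https://<your lemmy api here> npm run start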

Why not just improve lemmy-ui instead?

Lemmy-ui is an extremely important and valuable project. There has been a significant amount of hard effort put into it so far, and nobody can refute that it is the frontend which has really carried Lemmy to this point.

Unfortunately, there are some architectural problems with lemmy-ui (mostly related to how data is fetched and how sessions are stored in memory), all of which would require quite a significant rewrite to fix. Additionally, I think that the core technical solution used for lemmy-ui is just a bit too obscure, which has been an obstacle to my own contributions, as well as to contributions by others. If a rewrite is on the table anyway, then I believe a different technology is the best way forward.

Why not work on lemmy-ui-leptos instead?

Lemmy-ui-leptos is another rewrite of lemmy-ui, which is being led by the Lemmy maintainers. It is based around a Rust web framework called Leptos. I think this is really cool tech, and I will be happy to host lemmy-ui-leptos on lemm.ee in the future as well.

There are two key reasons why I personally decided to start working on another alternative, though:

  • I have heard from several people on Lemmy that they feel like Leptos is a big barrier to entry in terms of them contributing
  • Even for myself personally, I am very comfortable (and think I can move very fast) when working on something like NextJS, but with Leptos, I think the learning curve would be quite big and I would get much less done with any time I invest into it

My hope is that by providing a very vanilla alternative, I can provide an outlet for potential open source contributors who would like to work on Lemmy, but aren't prepared to do it with Leptos.

Does this mean that lemm.ee will change in ways I don't like?

First, let me be clear: lemm.ee will always host the default Lemmy frontend. This means lemmy-ui for now, and most likely lemmy-ui-leptos in the future.

I am however considering the possibility of switching things around at some point in the future, so that lemmy-ui-next will be hosted directly on lemm.ee, and lemmy-ui will be accessible on a different subdomain (like ui.lemm.ee). This would only happen once I have completed all 3 milestones for lemmy-ui-next. The main reason I am considering this is that I feel like I will always be in the best position to offer technical support to users on the frontend which I am myself maintaining. If you have any thoughts about this potential change, please let me know in the comments below!

That's about it for now!

This is something I've been thinking of doing for a while now, and I'm very excited to finally get the ball rolling! If you have a chance, please feel free to check out what https://next.lemm.ee looks like so far, and please let me know if you have any thoughts or feedback!

12
 
 

You know we all escaped Reddit for greener pastures. But I have increasingly noticed that greener pastures are not to be found in the Fediverse either.

I undeniably witnessed some vote fuzzing this morning and I feel a bit psychologically violated whenever I notice people or companies or platforms engaging in psychological manipulation.

It does make me feel livid.


Update - case closed because:


Wisdom from gregorum@lemm.ee

There is no vote fuzzing in Lemmy. The software doesn’t support it. Period.

I’ve been paying attention to the Lemmy mod tools and Lemmy admin Matrix rooms, and it’s been the subject of major discussion over the past week between one or two users who keep bringing it up and, like, everyone else (including the Lemmy devs) who are all like, “no, not now, not ever.”

It’s not a thing.

SO, what could explain what you’re seeing?

Possible explanations include:

Vote brigading

Alt account abuse

A genuine statistical fluke where there are genuine votes that are producing this bizarre pattern, and hopefully it will subside soon along with your reasonably justified paranoia.

The solution to all of these possibilities, and possibly others that I did not enumerate here, is to just ignore the problem and move on with your life. Remember: nothing that happens here is really important.

I really hope this makes you feel better. Because it should.

13
1
submitted 8 months ago* (last edited 7 months ago) by sunaurus@lemm.ee to c/meta@lemm.ee
 
 

Hey

This is just a quick heads up that our host, Hetzner, has been experiencing networking issues today, which have caused some downtime for lemm.ee.

I have a workaround in place for now, so we should (fingers crossed) be recovering at the moment, but I am still waiting on the proper solution from Hetzner. You can track their issue here: https://status.hetzner.com/incident/9406c500-9c8b-48be-9591-a73691134096

Also, this is a good opportunity to remind everybody about https://status.lemm.ee - you can be sure that I will provide updates on that page as soon as I am aware of & dealing with any issues. I have been posting status updates for the current issue there as well.

Sorry for the inconvenience and I hope you have an otherwise great day!

UPDATE: Hetzner claims they have fixed the issue, but the problems have not been resolved for lemm.ee servers yet, so I am keeping my temporary workaround active for now. Will continue troubleshooting this tomorrow.

UPDATE 2: Hetzner has now fixed their issue, and our network has been restored to its original optimized state.

14
 
 

cross-posted from: https://jlai.lu/post/4905287

Join Lemmyvision, a Eurovision-like Song Contest for Lemmy communities around the world!

Crossposted from https://lemmy.world/post/12717592

TLDR

  • From right now until April 1st, discuss with your country's community on Lemmy which song to send and share with the Fediverse.
  • On April 1st, voting will begin, where you will rank your favourite songs. Any song not submitted by this date will not be featured.
  • On April 8th, results of everyone's favourite songs will be published.
  • Join us at !lemmyvision@jlai.lu for any questions; this will be the community for updates and results, so make sure to subscribe if you'd like to stay in the loop.

Hey everyone!

I'm trying to launch Lemmyvision, a Eurovision-like Song Contest for Lemmy communities around the world! I'd love for people across Lemmy to participate, and I hope it will bring people together through our diverse tastes in music, so join us if you'd like to discover new music and culture from around the world!

What is Lemmyvision?

  • Lemmyvision is inspired by Eureddision (itself a reenactment of the Eurovision Song Contest), which was held on r/europe some years ago, and is based on the participation of national communities / instances and the delicate musical taste of their members (you!).
  • Every country is welcome to participate! The contest follows the rule of “national languages only”. The aim is to promote different languages and cultures from around the world, to share more between our online communities across Lemmy, and to discover songs from lesser-known artists. I hope to make it a regular event, so hopefully this works well for the first edition!

I'm going to try and spread the word across Lemmy - don't hesitate to crosspost this to your country's community!

You can find more information on !lemmyvision@jlai.lu, don't hesitate if you have questions or suggestions, or would like to help!

See you soon! ❤️🧡💛💚💙💜🤎🖤🤍

15
1
Hexbear? (lemm.ee)
submitted 8 months ago by JakenVeina@lemm.ee to c/meta@lemm.ee
 
 

So, I thought Hexbear defederated from us a little while back, and we, in turn, defederated from them. Why do I keep seeing occasional (new) Hexbear posts in the "All" feed lately? Did the defederation get reversed? Is it somehow a bug?

16
 
 

I just joined lemm.ee from lemmy.world. I was drawn to lemm.ee bc it federates with all the right instances.

But my feed is behaving oddly. (I view Lemmy through the Eternity app on Android.)

  • I'm seeing posts from communities I haven't subscribed to, even when I view my "Subscribed" feed
  • I get a weird error when I try to comment on some posts (in Firefox@lemmy.ml, most recently)

These issues didn't happen previously. Anyone know how to fix?

17
1
submitted 8 months ago* (last edited 8 months ago) by sunaurus@lemm.ee to c/meta@lemm.ee
 
 

Hey folks

Some of you may have noticed comments complaining about spam and lack of moderation within the past day or so. Maybe you've even noticed a few spam posts yourself (hopefully not too many, as we have automations in place on lemm.ee to remove the spam as soon as it is posted).

I just wanted to write a quick post with some context about the attack, what we are doing about it, and how you can help.

Context

Allegedly, a group of kids in Japan have created a bot, which signs up on different Fediverse instances and posts spam into different communities. The spam generally consists of Japanese text and/or an image and/or a bunch of random @mentions into different communities. You can check a post on Mastodon with more information here: https://mastodon.de/@ErikUden/111940301222380638

What we are doing about it

Many instances are actively working to limit this spam-wave, and lemm.ee is no different. Thankfully, we have not had to deal with any bot sign-ups on our instance (potentially as a result of different protections we have implemented for sign-ups), but we still suffer the effects of the spam, even if it's posted from other instances. To help us quickly eliminate most of the spam for lemm.ee users, I am continually tuning our @adminbot to automatically detect and remove content posted in this current spam-wave.

We cannot remove content from the wider Fediverse if it's not posted there by a lemm.ee user, so our automated removals won't help users on other instances, but we are at least improving the experience for our own users. For example, you can compare how /c/opensource@lemmy.ml currently looks on lemm.ee to how it looks in this screenshot I took from another, smaller instance:

How you can help

First and foremost, please continue reporting any spam you find, so that relevant mods and admins can deal with it. I am very grateful to users who help us identify spam through reports, and your reports are precisely what allow me to implement automated content removal for more extreme spam-waves such as this current one.

Secondly, I am seeking a few volunteers to grow the lemm.ee admin team. I am purposely burying this at the bottom of the post, to hopefully pre-filter out some candidates who would want to join for the wrong reasons. If you have read this far into the post, then I assume you are already quite interested in improving the experience on lemm.ee, so if you feel like you could contribute to the admin team, please read on.

First, I will say a few words about who we are looking for, then I will describe what kind of tasks you would have as an admin, and finally, I will cover some significant downsides of joining the admin team.

We are looking for folks who more or less match the following profile:

  • You have already been active on the Fediverse for several months (not necessarily on lemm.ee)
  • Previous mod experience would be a huge plus
  • You should feel a strong agreement with our basic instance rules and our administration & federation policy
  • You should be prepared to be exposed to some vile content through reports
  • You are OK with using Discord as the main method of admin communication (that is what we have settled on and will continue using for the foreseeable future)

As volunteers, we don't expect admins to be available 24/7, but as our instance grows, I do think it would be quite important to achieve a state of pretty good timezone coverage with our admin team, so please only consider applying if you are already regularly active on Lemmy.

As for what tasks admins are responsible for: it's mostly covered in the administration policy post linked above. But in short, you should be prepared to regularly check the report queue, contact users with friendly messages to de-escalate conflicts, issue bans, remove content, and monitor the activity of @adminbot. Additionally, if you're interested in taking a more hands-on approach to any kind of community-building on lemm.ee, then this would be totally welcome as well, but it is not strictly considered a core responsibility for admins.

Please note that the lemm.ee admin team has an absolute zero tolerance policy against any kind of abuse towards minority communities. If you do not share this mindset, then please do not consider applying.

Finally, let me share some negative aspects of joining the admin team. I think this will probably reduce the number of potential candidates, but I still feel it's important to be honest and upfront about this:

Through the report queue, you will regularly see absolutely vile content which you might otherwise never even notice on Lemmy. Many users come to Lemmy to spread hate, post disturbing images, etc, and in order to clean such content up for other users, mods and admins need to actually be exposed to this content in much larger amounts than regular users.

Additionally, while Lemmy is constantly being improved by the developers, the moderation tools are still quite rough around the edges. Lemmy is not at 1.0 yet, and that will most likely become even more obvious to you as you work on admin tasks.

Maybe this is the most important one: no matter what you do, there will always be people unhappy with how you apply our rules. I have seen countless comments complaining about lemm.ee admins specifically. I have been told by complete strangers that they hate me. I have seen many complaints about us moderating too harshly. I have seen complaints about us not moderating enough. I have seen users on Lemmy make up wild stories about our admin team, and share them as facts. There are of course plenty of supportive users, but the negative experiences tend to leave a much more lasting impression.

If after reading all of the above, you are still motivated to help make lemm.ee a better place through offering your help in the admin team, please contact me on Discord (@sunaurus)!

That's all from me for now. Thank you very much to anybody who went through this whole wall of text, and I hope you are all having a good weekend!

18
1
submitted 9 months ago* (last edited 9 months ago) by mdd@lemm.ee to c/meta@lemm.ee
 
 

edit - the original title was "Are Picture Uploads Allowed or Not?"

I know pic uploads weren't allowed and a file size message would appear.

I forgot and made a post earlier today with a pic (PNG). There was no error but the pic didn't appear. I updated the post with a link to the picture hosted elsewhere (pixelfed.social).

Before this post I created a test post with another picture (JPG) and it uploaded and displayed. (edit - I deleted the post).

I added the same test picture to this post.

19
 
 

When did this happen?! It makes me so happy!

It's still not quite as good as old Reddit with RES was, where you started with everything normal and then clicked a button to expand (or contract) media. Being able to contract everything is almost as useful, particularly if you want to scroll further down. However, it's still a great improvement, as well as proof that Lemmy just keeps getting better.

20
 
 

I've searched but couldn't find any discussion on this topic, so I apologize if it's been answered before.

Would the admins consider hosting the Alexandrite and Photon front-ends, e.g. at a.lemm.ee and p.lemm.ee?

21
 
 

Hey folks!

Just a quick update: we now have a dedicated status page for lemm.ee.

You can find it at status.lemm.ee. It currently contains three sections:

  1. A web status section, which I will update manually to communicate issues about lemm.ee
  2. A financial status section, which I will update monthly to give an overview of how we're doing financially
  3. A federation section, which automatically checks the current federation status, both incoming and outgoing, between lemm.ee and other instances. By default it shows 3 large instances, but you can also search for any specific instance you are interested in.

This status page is hosted completely separately from our main servers, so if there is any trouble with our servers, you can expect the status page to still be available!

If you have any issues with this page, or any other thoughts, feel free to comment.

22
 
 

I’m subscribed to a couple of communities where those bots post regularly. I noticed a drop in content recently and realized I haven’t seen those bots for the past week. I’m trying to figure out where they got blocked and was wondering if our instance blocked them. Anyone have an idea?

23
1
submitted 9 months ago* (last edited 9 months ago) by SKLC@lemm.ee to c/meta@lemm.ee
 
 

If I close all the lemm.ee tabs I have open for a session, it always logs me out and I have to log back in, which is so annoying, especially with 2FA enabled. Is this only me, or has nobody else reported this issue?

~~Also, why is there no "Prev" button after clicking on "Next" at the bottom of a page? I get using the back button, or shortcut but sometimes it's convenient to click it, or even just to know if you've scrolled to another page.~~ OK so apparently this is an intentional Lemmy change that removed it. Hard disagree, but oh well.

24
 
 

Hey folks

This is a heads up that I will be performing some maintenance and hardware upgrades on our database this Saturday.

We are currently experiencing several spikes throughout the day which cause our database to become overloaded - this results in degraded performance for many users. The spikes are happening due to a combination of continued growth of the database, some expensive periodic scheduled tasks which Lemmy runs, and fluctuating traffic patterns. Some of this can be optimized on the code level in the future, but it seems that the best way to deal with it right now is to add some additional resources to our database server.

I am intending to switch to slightly different hardware in this upgrade, and will be unable to make this switch without downtime, so unfortunately lemm.ee will be unavailable for the duration.

As our database has grown quite a bit, cloning it will most likely take a few hours, so I expect the downtime to last 2-3 hours. Sorry for the inconvenience, I am hopeful that it will be worth it and that this upgrade will significantly reduce some of our recent long page load times!


Edit: upgrade complete!

I have now migrated the lemm.ee database from the original DigitalOcean managed database service to a dedicated server on Hetzner.

As part of this migration, I have also moved all of our Lemmy servers from the DigitalOcean cloud to Hetzner's Cloud. I always want the servers to be as close as possible to the database, in order to keep latencies low. At the same time, I am very interested in having the ability to dynamically spin up and down servers as needed, so a cloud-type solution is really ideal for that. Fortunately, Hetzner allows connecting cloud servers to their dedicated servers through a private network, so we are able to take advantage of a powerful dedicated server for the database, while retaining the flexibility of the cloud approach for the rest of our servers. I'm really happy with the solution now.

In terms of results, I am already seeing far better page load times and far less resource use on the new hardware, so I think the migration has been a success. I will keep monitoring things and tuning as necessary.

25
1
submitted 10 months ago* (last edited 10 months ago) by sunaurus@lemm.ee to c/meta@lemm.ee
 
 

Happy new year!

Hi folks! I hope everybody had a good holiday period and I wish you all the best for 2024. I have some quick updates to share about lemm.ee:

Image uploads

Image uploads are now enabled for all lemm.ee users 4 weeks after account creation. The upload size limit is currently set to 500kb.

The 4 week account age requirement is in place to discourage spam and abuse. It is of course not a fool-proof solution, but let's give it a go and see what the results are.

Please note that lemm.ee is not intended to be an image hosting service! Feel free to upload avatars and banners for your profile and communities, but please be aware that we reserve the right to modify the upload limits going forward, as well as to delete old images if storage costs become too high.

For image posts and comments, it would still be preferable for you to use an external image hosting service.

Federation delays

Over the holidays, our outgoing federation workers began experiencing some significant delays. I have been working on this problem for the past few days, and after updating to 0.19.1, applying some additional patches to the code, and changing our infrastructure a bit, I believe the issue has been resolved.

The good news is that now that we are on 0.19, problems such as this do not cause Lemmy to completely drop federated activities, as we now retain a persistent queue of federation activities for all linked instances. This means that after the issue was resolved, our federation workers started going through the backlog of likes, comments, and posts which you had made over the past several days, and sending these out to other instances. Essentially, all of your activities did end up reaching their target servers, just with some additional delay.

One quick side-note here: while we are now federating your activities in real-time again to most big instances, there is still a bit of a backlog left on the lemm.ee -> lemmy.world federation (it is a few days behind). I expect this to also catch up by tomorrow.

Performance

The new persistent federation queue is still quite a new feature in Lemmy, so it's a bit rough around the edges - after resolving the federation issues, our federation workers started going through the queue at extreme speed, which caused intense additional load on our database. This was one of the reasons for some performance degradation many of you noticed over the past few days.

Additionally, since updating to 0.19, there have been regular performance issues for many users. I have managed to solve a few of these by making some changes in our infrastructure, but I am also aware of a few more issues which I will continue to monitor and hopefully improve in the near future. Sorry for the inconvenience, I hope that the changes I have made so far will help make it a bit smoother already!

That's all from me for now, as always, feel free to comment if you have any thoughts, and have a nice day!
