thundermoose

joined 1 year ago
[–] thundermoose@lemmy.world 2 points 2 months ago (1 children)

It's really more of a proxy setup that I'm looking for. With Thunderbird, you can get what I'm describing for a single client. But if I want access to those emails from several clients, there needs to be a shared server they can all reach.

docker-mbsync might be a component I could use, but it doesn't sound like there's a ready-made solution for this today.

[–] thundermoose@lemmy.world 1 points 2 months ago

Yeah, they are ideally the same mailbox. I'd like a similar experience to Gmail, but with all the emails rehomed to my server.

 

Not sure if there's a pre-existing solution to this, so I figured I'd just ask to save myself some trouble. I'm running out of space in my Gmail account and switching email providers isn't something I'm interested in. I don't want to pay for Google Drive and I already self-host a ton of other things, so I'm wondering if there is a way to basically offload the storage for the account.

It's been like 2 decades since I set up an email server, but it's possible to have an email client download all the messages from Gmail and remove them from the server. I would like to set up a service on my servers to do that and then act as the mail server for my clients. Gmail would still be the outgoing relay and the always-on remote mailbox, but emails would eventually be stored locally where I have plenty of space.
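For anyone sketching the Gmail-to-local half of this: mbsync (isync) can mirror the account into a local Maildir, which something like Dovecot could then serve to all the clients. A rough sketch only; the account name, paths, and password command are placeholders, and on older isync releases the `TLSType` keyword is spelled `SSLType`:

```
IMAPAccount gmail
Host imap.gmail.com
User yourname@gmail.com
PassCmd "pass show gmail-app-password"
TLSType IMAPS

IMAPStore gmail-remote
Account gmail

MaildirStore gmail-local
Path ~/mail/gmail/
Inbox ~/mail/gmail/INBOX
SubFolders Verbatim

Channel gmail
Far :gmail-remote:
Near :gmail-local:
Patterns *
Create Near
SyncState *
```

Caveat: mbsync mirrors rather than moves, so actually reclaiming Gmail space would still mean archiving or deleting on the Gmail side after the sync, which is exactly the part that would need some custom glue.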

All my clients are VPN'd together with Tailscale, so the lack of external access is not an issue. I'm sure I could slap something roughshod together with Linux packages but if there's a good application for doing this out there already, I'd rather use it and save some time.

Any suggestions? I run all my other stuff in Kubernetes, so if there's one with a Helm chart already I'd prefer it. Not opposed to rolling my own image if needed though.

[–] thundermoose@lemmy.world 20 points 3 months ago (1 children)

Steam + Proton works for most games, but there are still rough edges that you need to be prepared to deal with. In my experience, it's typically older titles and games that use anti-cheat that have the most trouble. Most of the time it just works; I even ran the Battle.net installer as an external Steam game with Proton enabled and was able to play Blizzard titles right away.

The biggest gap IMO is VR. If you have a VR headset that you use on your desktop and it's important to you, stay on Windows. There is no realistic solution for VR integration in Linux yet. There are ways you can kinda get something to work with ALVR, but it's incredibly janky and no dev will support it. There are rumors that Steam Link is being ported to Linux, but nothing official yet.

On balance, I'm incredibly happy with Mint since I switched last year. However, I do a decent amount of personal software development, and I've used Linux for 2 decades as a professional developer. I wouldn't say the average Windows gamer would be happy dealing with the rough spots quite yet, but it's like 95% of the way there these days. Linux has really grown up a lot in the last few years.

[–] thundermoose@lemmy.world 1 points 5 months ago

Yeah, I don't fully understand why Nvidia cards have this problem on first setup with so many distros. On Windows, the default display driver can at least boot with reduced resolution on most cards made in the last 15 years until you install proper drivers. It seems like the Linux kernel and common desktop environments ought to be able to do the same.

Maybe this is better in the 6.x kernel, I haven't tried it. I'm not too much of a tinkerer, so the bleeding edge doesn't interest me. I just want a good shell, POSIX for personal coding projects, and the ability to play games on Steam. Mint is great for that once you get past the initial display driver issues.

[–] thundermoose@lemmy.world 4 points 5 months ago (2 children)

I've been using Mint for about 6 months now and it works with Nvidia just fine BUT the new user experience isn't great. You have to use the nomodeset kernel option and install Nvidia drivers; otherwise you'll boot to a black screen.
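For reference, the usual workaround is to add nomodeset to the kernel command line in GRUB (this assumes the stock Mint/Ubuntu GRUB setup; your defaults line may differ):

```
# /etc/default/grub -- add nomodeset to the default kernel command line,
# then run `sudo update-grub` and reboot
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash nomodeset"
```

Once the proprietary driver is installed (e.g. via Mint's Driver Manager), nomodeset can be removed again the same way.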

Helpful guide: https://forums.linuxmint.com/viewtopic.php?t=421550

[–] thundermoose@lemmy.world 2 points 7 months ago

I do quite like the stability of Cinnamon/Debian, and I think this problem is solvable (even if I have to solve it myself). I generally do not want to spend a lot of time futzing around with my desktop environment, but this is one thing I need to have.

[–] thundermoose@lemmy.world 0 points 7 months ago (4 children)

I saw that and tried it pretty early on. That just moves the screen, it doesn't fill the quadrant.

[–] thundermoose@lemmy.world 8 points 7 months ago (15 children)

Updated to be specific, I'm using Cinnamon. Muffin is the builtin tiling window manager for Cinnamon and it does exactly what you're describing. The problem is that it moves tiles; it doesn't absolutely position them. You have to keep moving tiles around to get them where you want them. Rectangle, by contrast, has hotkeys that immediately place and resize the active window for each region it supports:

  • ctrl+cmd+left: top left quadrant
  • ctrl+cmd+right: top right quadrant
  • shift+ctrl+cmd+left: bottom left quadrant
  • shift+ctrl+cmd+right: bottom right quadrant
  • alt+cmd+left: left half
  • alt+cmd+right: right half
  • alt+cmd+up: top half
  • alt+cmd+down: bottom half
  • alt+cmd+f: full screen

It's hard to express how natural that feels after using it for a bit, and I'm still using a Macbook for work so the muscle memory is not going away.

73
submitted 7 months ago* (last edited 7 months ago) by thundermoose@lemmy.world to c/linux@lemmy.ml
 

To preface this, I've used Linux from the CLI for the better part of 15 years. I'm a software engineer and my personal projects are almost always something that runs in a Linux VM or a Docker container somewhere, but I've always used a Mac to work on personal and professional projects. I have a Windows desktop that I use exclusively for gaming and my personal Macbook is finally giving out after about 10 years, so I'm trying out Linux Mint with Cinnamon on my desktop.

So far, it works shockingly well and I absolutely love being able to reach for a real Linux shell anytime I want, with no weird quirks from MacOS or WSL. The fact that Steam works at all on a Linux environment is still a little magical to me.

There are a couple things I really miss from MacOS and Rectangle is one of them. I've spent a couple hours searching and trying out various solutions, but none of them do the specific thing Rectangle did for me. You input something like ctrl+cmd+right and Rectangle fits your current window to the top right quadrant of your screen.

Before I dive into the weeds and make my own Cinnamon Spice, I figured I should just ask: is there an app/extension that functions like Rectangle for Linux? Here are the things I can say do not work:

  • Muffin hotkeys: Muffin only supports moving tiles, not absolutely positioning them. You can kind of mimic Rectangle behavior, but only with multiple keystrokes to move the windows around on the grid.
  • gTile: This is a Cinnamon Spice that I'm pretty sure has the bones of what I want in it, but the UI is the opposite of what I want.
  • gSnap: Very similar to gTile, but for Gnome. The UI for it is actually quite a bit worse, IMO; you are expected to use a mouse to drag windows.
  • zentile: On top of only working for XFCE, it doesn't actually let me position windows with a keystroke.

To be super clear: Rectangle is explicitly not a tiling window manager. It lets you set hotkeys to move/resize windows; it does not reflow your entire screen to a grid. I've found a dozen tiling tools/window managers out there, and I've begun to think the Linux community has a weird preoccupation with them. Like, they're cool and all, but all I want is to move the current window to specific areas of my screen with a single keystroke. I don't need every window squished into frame at once or some weird artsy layout.
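The placement logic itself is just arithmetic over the screen rectangle, which is part of why a Spice feels feasible. A sketch of the idea (function names are mine, not from any real Cinnamon or wmctrl API), computing the geometry a hotkey handler could hand to `wmctrl -r :ACTIVE: -e`:

```python
# Sketch: map Rectangle-style placements to absolute window geometries.
# Names here are illustrative, not from any real Cinnamon/wmctrl API.

def placement(screen_w, screen_h, where):
    """Return (x, y, width, height) for a named screen region."""
    half_w, half_h = screen_w // 2, screen_h // 2
    regions = {
        "top-left":     (0,      0,      half_w,   half_h),
        "top-right":    (half_w, 0,      half_w,   half_h),
        "bottom-left":  (0,      half_h, half_w,   half_h),
        "bottom-right": (half_w, half_h, half_w,   half_h),
        "left-half":    (0,      0,      half_w,   screen_h),
        "right-half":   (half_w, 0,      half_w,   screen_h),
        "top-half":     (0,      0,      screen_w, half_h),
        "bottom-half":  (0,      half_h, screen_w, half_h),
        "full":         (0,      0,      screen_w, screen_h),
    }
    return regions[where]

def wmctrl_args(where, screen_w=2560, screen_h=1440):
    """Build wmctrl arguments to move/resize the active window.

    wmctrl's -e option takes gravity,x,y,width,height.
    """
    x, y, w, h = placement(screen_w, screen_h, where)
    return f"-r :ACTIVE: -e 0,{x},{y},{w},{h}"
```

Bound to keystrokes (e.g. Cinnamon's custom keyboard shortcuts each running a one-line `wmctrl` command), this gives one-keystroke absolute placement without a full tiling WM. Panel/strut offsets are ignored here for simplicity.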

[–] thundermoose@lemmy.world 1 points 7 months ago* (last edited 7 months ago) (2 children)

I didn't say it wasn't amazing nor that it couldn't be a component in a larger solution, but I don't think LLMs work like our brains, and I think the current trend of throwing more tokens/parameters/training at LLMs is a dead end. They're simulating the language area of human brains, sure, but there's no reasoning or understanding in an LLM.

In most cases, the responses from well-trained models are great, but you can pretty easily see the cracks when you spend extended time with them on a topic. You'll start to get oddly inconsistent answers the longer the conversation goes and the more branches you take. The best-fit line (it's a crude metaphor, but I don't think it's wrong) starts fitting less and less well until the conversation completely falls apart. That's generally called "hallucination," but I'm not a fan of that term because it implies a lot about the model that isn't really true.

You may have already read this, but if you haven't: Stephen Wolfram wrote a great overview of how GPT works that isn't too technical. There's also a great sci-fi novel from 2006 called Blindsight that explores how facsimiles of intelligence can be had without consciousness or even understanding, and I've found it to be a really interesting way to think about LLMs.

It's possible to build a really good Chinese room that can pass the Turing test, and I think LLMs are exactly that. More tokens/parameters/training aren't going to change that, they'll just make them better Chinese rooms.

[–] thundermoose@lemmy.world 19 points 7 months ago (5 children)

Maybe this comment will age poorly, but I think AGI is a long way off. LLMs are a dead-end, IMO. They are easy to improve with the tech we have today and they can be very useful, so there's a ton of hype around them. They're also easy to build tools around, so everyone in tech is trying to get their piece of AI now.

However, LLMs are chat interfaces for searching a large dataset, and that's about it. Even the image generators are doing this; the dataset just happens to be visual. All of the results you get from a prompt are just queries into that data, even when you get a result that makes it seem intelligent. The model is finding a best-fit response based on billions of parameters, like a hyperdimensional regression analysis. In other words, it's pattern-matching.

A lot of people will say that's intelligence, but it's different; the LLM isn't capable of understanding anything new, it can only generate a response from something in its training set. More parameters, better training, and larger context windows just refine the search results, they don't make the LLM smarter.

AGI needs something new, we aren't going to get there with any of the approaches used today. RemindMe! 5 years to see if this aged like wine or milk.

[–] thundermoose@lemmy.world 3 points 8 months ago (1 children)

Hyperfixating on producing performant code by using Rust (which only pays off when you code in a very particular way) makes applications worse. Good API and system design are a lot easier when you aren't constantly having to think about memory allocations and reference counting. Rust puts that dead-center of the developer experience with pointers/ownership/Arcs/mutexes/etc., and for most webapps it just doesn't matter how memory is allocated. It's cognitive load for no reason.

The actual code running for the majority of webapps (including Lemmy) is not that complicated, you're just applying some business logic and doing CRUD operations with datastores. It's a lot more important to consider how your app interacts with your dependencies than how to get your business logic to be hyper-efficient. Your code is going to be waiting on network I/O and DB operations most of the time anyway.

Hindsight is 20/20 and I'm not faulting anyone for not thinking through a personal project, but I don't think Rust did Lemmy any favors. At the end of the day, it doesn't matter how performant your code is if you make bad design and dependency choices. Rust makes it harder to see these bad choices because you have to spend so much time in the weeds.

To be clear, I'm not shitting on Rust. I've used it for a few projects and it's great for apps where processing performance is important. It's just not a good choice for most webapps; you'd be far better off in a higher-level language.

[–] thundermoose@lemmy.world 4 points 8 months ago (3 children)

I wouldn't shortchange how much making the barrier to entry lower can help. You have to fight Rust a lot to build anything complex, and that can have a chilling effect on contributions. This is not a dig at Rust; it has to force you to build things in a particular way because it has to guarantee memory safety at compile time. That isn't to say that Rust's approach is the only way to be sure your code is safe, mind you, just that Rust's insistence on memory safety at compile time is constraining.

To be frank, this isn't necessary most of the time, and Rust will force you to spend ages worrying about problems that may not apply to your project. Java gets a bad rap but it's second only to Python in ease-of-use. When you're working on an API-driven webapp, you really don't need Rust's efficiency as much as you need a well-defined architecture that people can easily contribute to.

I doubt it'll magically fix everything on its own, but a combo of good contribution policies and a more approachable codebase might.
