I use Arch for personal use and gaming, Debian for self-hosting and hacking, and Alpine for containerized cloud deployments.
Pretty much the same for me: bleeding-edge Arch for my workstation, rock-stable Debian for my server.
I maintain a rule that all files above the repo (i.e., at the top of the project folder) must live inside a subfolder, with one exception: a README file. Including the code folder, this typically results in no more than 5 folders, which keeps the project folder organized and uncluttered.
Don't forget: entrepreneur, playboy, philanthropist.
They are the project's subfolders (outside of the Git repo):

- code: contains the source code; version-controlled with Git.
- wiki: contains the documentation; also version-controlled.
- designs: contains GIMP, Inkscape, or Krita save files.

This structure works for me since software projects involve more things than just the code, and you can add more subfolders to your liking, such as notes, pkgbuild (for Arch Linux), or releases.
I tend to follow this structure:
Projects
├── personal
│   └── project-name
│       ├── code
│       ├── designs
│       └── wiki
└── work
    └── project-name
        ├── code
        ├── designs
        └── wiki
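If you want to scaffold that layout in one go, a one-liner with brace expansion does it (project-name is a placeholder):

    # Create the whole tree at once (bash/zsh brace expansion)
    mkdir -p Projects/{personal,work}/project-name/{code,designs,wiki}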
From a time when websites used <table> or position: absolute; to place elements on the screen. That website is just one big table.
And pretty much the rest of the FSF and GNU websites.
Where the dotfiles at?
I recommend Peer Calls as an alternative. Peer Calls uses peer-to-peer communication, similar to Jami. You can check out Peer Calls on GitHub for more info.
So, in short, the things I really like about it:
I'm in the same boat and also looking for a privacy-respecting platform for communicating with family and friends, so I'd like to add a few options that haven't been mentioned yet:
I wonder sometimes if the advice against pointing DNS records at your own residential IP amounts to a big scare. Like you say, if it's just a static page served by an up-to-date, minimal web server, there's less leverage for an attacker to abuse.
That advice is a bit old-fashioned in my opinion. There are many tools nowadays that will get you a very secure setup without much effort:
And of course, besides all these tools, the simplest way of securing public services is to keep them updated.
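On Debian, for example, keeping up with security updates can be automated; a minimal sketch using the stock unattended-upgrades package:

    # Install automatic security updates
    sudo apt install unattended-upgrades

    # Enable them via the low-priority debconf prompt
    sudo dpkg-reconfigure -plow unattended-upgrades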
I've found that ISPs too often block ports 80 and 443. Did you luck out with a decent one?
Rogers has been my ISP for several years, and I have no issue receiving HTTP/S traffic. The only issue, as with most providers, is that they block port 25 (SMTP). That's the only thing keeping me from self-hosting my own email server, so I have to rely on a VPS.
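If you want to verify the block yourself, a quick check with netcat shows whether outbound port 25 is reachable (the Gmail MX host here is just a convenient test target):

    # A timeout here usually means the ISP filters port 25
    nc -vz -w 5 gmail-smtp-in.l.google.com 25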
I'd also like to add that you can save an image to a local file using docker image save and load it back using docker image load. So, along with the options mentioned above, you have plenty of ways to back up images for offline use.
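For example (nginx:latest and the tarball name are just placeholders):

    # Write the image to a tarball for offline storage
    docker image save -o nginx.tar nginx:latest

    # Later, restore it into the local image store
    docker image load -i nginx.tar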