this post was submitted on 14 Jan 2026
28 points (88.9% liked)
Technology
Trust the experts bro
Well, if we relax and look at this from a different angle, for much of humanity's history advantageous knowledge was hidden or limited from competition, and in many things it still is.
Except that advantageous knowledge of chemistry for early cannons, for example, could be confirmed: better gunpowder.
This can't be.
Still, if the service is supposed to be security- and privacy-oriented, how about making the source code available so users can verify that for themselves?
Well, again, taking an unpopular but valid point of view: how useful is having the source code, really, for finding vulnerabilities? Is it actually harder to hide an intentional backdoor in plain sight in the source than in something distributed only as binaries? I have no relevant experience, but I've listened to a lecture by someone from Kaspersky Lab arguing exactly that.
Having the source commonly available is good for development and for learning how something works, but security flaws include that particular subgroup: deliberate backdoors, which are written to survive review.
If it were open source, many more eyes could be on it, so vulnerabilities intentionally implemented by Threema itself would be more likely to be noticed before they could be exploited, whether by hackers or by Threema('s partners).
More eyes on the source code, sure. But exactly the same number of eyes on the binary.
Anyway, there's a joke (by Linus Torvalds, I think, but maybe I'm wrong) that most of the eyes that could be looking at the code are attached to hands busy typing the thing about "more eyes".
Source code being available is obviously beneficial for learning how a program works as a whole, or for participating in its development, but for finding deliberately hidden things I'm not so sure.
Ah sorry, it seems I read over that part. Unless programmers have the exceptional skills and time required to effectively reverse engineer these complex algorithms, nobody will bother to do so, especially when it's required again after each update. If the source code were available, on the other hand, the bar of entry would be significantly lower and far less specialized skill would be needed. So it's safe to say most programmers won't even bother inspecting a binary unless there's absolutely no way around it, or they have time to burn. Whereas if you opened up the source, there would be a lot more people, let's say C programmers, able to inspect the algorithm. Really, have a look at what it takes to write binary code, let alone reverse engineer complicated code somebody else wrote.
I agree with Linus' statement though: I rarely inspect source code myself, but I find it comforting to know that package maintainers, for instance, could theoretically check the source before distribution. I stand by my opinion that it's a bad look for a privacy- and security-oriented piece of software to restrict non-"experts" from inspecting the very thing that is supposed to ensure that.
Once again, you are talking about programmers in general and not security researchers.
I have had a look. I've also solved some simple crackmes and such. I'm definitely not competent, but to find a well-hidden security backdoor you have to examine behavior, which requires certain skills, and then look at the executable code. Having the source is good, of course, but less so if the backdoor is deliberately made to look normal.
I think I'm mistaken about that attribution; OpenBSD's Theo de Raadt is more likely the author.
Yes, I agree it's better when the source is available. But if you overvalue the effect, things can get worse. Take Linux again: plenty of people use thousands of pieces of FOSS software and trust the resulting system far more than Windows. If the level of trust people placed in each were exactly warranted, one could say Linux is safer. But people sometimes do things with Linux they would never do with Windows, because they overvalue the effect of it being FOSS. It's FOSS, but you'd still better not store 10 years of home video unencrypted on the laptop you carry around, things like that.
Yes, because they constitute a significant portion of the eyes traditionally involved in verifying software. You can allow a potentially cherry-picked group of researchers to do the verification on behalf of the user base, but that hinges on a "trust me bro" basis. I appreciate that you've looked into the process in practice, but please understand that these pieces of software are anything but simple. Also, if a state actor were to deliberately implement an exploit, it wouldn't necessarily be obvious at all, even with the source code available; they're state-backed, top-of-their-game security researchers themselves. Even higher-tier consumer-grade computer viruses won't execute in a virtualized environment, precisely to avoid being detected. They won't reveal themselves when unnecessary, and might only trigger the exploit when absolutely required, again to avoid suspicion.
I fully agree with the last paragraph though, and believe there is an over-reliance on digital systems overall. With FOSS software, you have to rely on many, many different contributors handling maintenance, packaging and distribution in good faith; sometimes all it takes is one package for the whole system to be compromised. Even so, I'm more comfortable knowing that the majority of software running on my machines is open source than relying on a single entity like Microsoft, with its abysmal track record on privacy, operating in the dark. Of course you could restrict access to Microsoft servers with network filtering, but it's not just that: it's also not having to deal with Microsoft's increasingly restrictive experience, primarily serving their perverse dark patterns. I do believe people should handle sensitive files with care. For instance: put Tails on a live USB, keep it off the internet, put the files on an encrypted drive, unmount and physically disconnect the drive, and store it somewhere safe.