Weakness Observatory

Jan. 17, 2021 [technology] [libre]

I do a lot of chest-beating here championing free software for its successes, but let’s take a look at some areas where free software has met with some failure.

Gaming

The trajectory of professionally developed computer games for Linux appeared to be on the upswing back in 2013, when Valve announced their Steam Machines and had already been porting the Steam client and Valve titles natively to Linux. Initially, this prompted a number of studios and other distributors to take on Linux efforts as well. But that momentum quickly evaporated after Steam Machines failed to take off commercially, with Valve later capitulating and officially integrating WINE-wrapped games as “Steam Play” titles, further entrenching the Windows environment imposed on gamecraft.

Since then, we’ve seen a domino effect of developers dropping Linux support in favor of simply directing Linux players to WINE (Proton). Numerous other challenges plague Linux releases, such as anticheat engines failing to support Linux as a target. A portion of the games that were “ported” in the short-lived fury of the post-Steam Machine era were done so through commercial translation layers like Eon, which come at a cost of performance and accuracy, leading to a false perception among uninformed players that the issues were somehow the fault of Linux. These porting houses have also largely thrown in the towel.

But on the libre side of things, games just tend to be awful, to be blunt. They often look as though they were developed for a school project. If there is multiplayer functionality, the servers are often barren wastelands. One of the staple genres, the first-person shooter, is disproportionately dominated by parkour twitch arena shooters, half of which are just some twist on the Quake engine. If that’s your kind of thing then maybe it’s not a mark against libre FPSes, but their active player counts tell a different kind of story.

Over the years, I have arrived at the conclusion that gaming is fundamentally incompatible with the goals of free software. In order to make a compelling game, the author needs to set rules and boundaries; otherwise there is no challenge to overcome. But in order to enforce those boundaries, it must first be possible to prevent the player from exercising some of the four basic freedoms on their device, especially in multiplayer settings. So it is no wonder that libre and open source games tend to be relegated to academic curiosities and dead projects.

I’m not hopeful that the situation is ever going to turn around. But if you’re interested in tracking free and open source gaming, have a look at:

Otherwise, one might settle for games which are just DRM-free, native Linux releases but still proprietary:

Application Aware Firewalling (Resolved)

The firewall situation on libre operating systems is still developing. End users can opt for tools such as firewalld or (g)ufw, or work directly with iptables or nftables. But these are what I call whack-a-mole firewalls: they only care about allowing connections by port or address and are completely application blind. This leaves users constantly playing whack-a-mole to find which connections are trying to go outbound, then opening those ports one by one, all the while praying that only the intended application will use them. Desktop users run a multitude of applications which may be reaching out over the network, sometimes unsolicited or through dynamic port ranges. Opening port 443 for web browsers also allows any other application to connect on that port, with no way to discriminate between individual applications.
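
To make this concrete, here is a toy sketch in Python (not a real firewall; the program paths and addresses are invented for illustration) of what a port-based rule can and cannot see. A rule keyed only on the destination port treats the browser and an unrelated phone-home binary identically, while an application-aware rule can also match on which program opened the socket.

    # Toy model, not a real firewall: illustrates why port/address rules are
    # "application blind". The binary paths and IPs below are made up.
    from dataclasses import dataclass

    @dataclass
    class Connection:
        app_path: str    # which program opened the socket (invisible to port rules)
        dest_addr: str
        dest_port: int

    # Whack-a-mole rule: allow anything headed to TCP 443.
    def port_based_allow(conn: Connection) -> bool:
        return conn.dest_port == 443

    # What an application-aware firewall can do: also match the originating program.
    def app_aware_allow(conn: Connection) -> bool:
        return conn.app_path == "/usr/bin/firefox" and conn.dest_port == 443

    browser   = Connection("/usr/bin/firefox",       "203.0.113.10", 443)
    telemetry = Connection("/opt/vendor/phone-home", "203.0.113.20", 443)

    print(port_based_allow(browser), port_based_allow(telemetry))  # True True
    print(app_aware_allow(browser), app_aware_allow(telemetry))    # True False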

It has taken all too long for free and open OSes, often lauded for their handling of security and privacy, to devise a proper application-aware firewall solution. And we now seem to have a winner:

OpenSnitch: Was a one-man show for some time until development halted. It was eventually picked back up by another author and has begun to reach maturity. OpenSnitch is rapidly shaping up to become the uMatrix for your entire OS. And it is now widely available in distro repositories, beating the others to the punch!

The prognosis: I will be covering OpenSnitch in greater depth at some point.

HiDPI Scaling

Resolutions exceeding full HD are actually not in terribly bad shape, as most desktop environments seem to have implemented DPI scaling. In my experience, this scaling extends to anything using Qt and GTK. Problems remain, however, for individual programs: any fixed icon in an interface will be roughly 1/4th the size it was designed for, and form boxes and text in fields can sometimes be cut off or shifted out of view.

Games have also been slow on the uptake. In many free games it is possible to set the resolution to 2560x1440 or 3840x2160, but the actual FoV and in-game UI might remain at sub-1920x1080 scale. Despite the DE-wide scaling implementation, it is clear that the teams behind these individual projects need to adjust their UI options for high resolution displays themselves. This invariably means that some programs may simply never play nice on your new ultrawide 4K monitor.
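
As a rough sketch of what each project has to do for itself, the Python snippet below reads the scale factor that a session commonly exposes through QT_SCALE_FACTOR or GDK_SCALE and multiplies a fixed pixel size by it. A real application should query its toolkit rather than the environment, so treat this only as an illustration of why unscaled assets end up tiny.

    # Illustration only: a program that hard-codes pixel sizes must multiply them
    # by the desktop's scale factor itself, or its icons render at 1x on a 2x display.
    # QT_SCALE_FACTOR / GDK_SCALE are common ways a session exposes that factor;
    # a real application would ask its toolkit instead.
    import os

    def desktop_scale(default: float = 1.0) -> float:
        for var in ("QT_SCALE_FACTOR", "GDK_SCALE"):
            value = os.environ.get(var)
            if value:
                try:
                    return float(value)
                except ValueError:
                    pass
        return default

    BASE_ICON_PX = 24                    # size the artwork was designed for
    icon_px = round(BASE_ICON_PX * desktop_scale())
    print(f"drawing icon at {icon_px}px")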

Video Calling

2020 has drawn a lot of attention to this shortcoming, which may end up being a positive thing, as video chat/video conferencing has traditionally been a weak area for free software. The best contenders we seem to have at the moment are:

  1. Tox: Works well; I have actually not found any technical issues with it, but good luck getting friends, acquaintances and family to try it. P2P may be a limiting factor in holding a video conference with many peers, depending on bandwidth.

  2. Jami: Formerly Ring. All users must be on the same release version; I have run into forward/backward compatibility problems in my testing. Otherwise it is fine, except for the same potential P2P scaling limitations mentioned above for Tox. UPDATE 2023: Jami now implements swarm messaging, so offline users will still receive messages once they return.

  3. Telegram: As of September 2020, they have implemented one-to-one video chat as an alpha feature. I have not tried it; Telegram demands a phone number, relies on a third-party man in the middle, and cannot be self-hosted.

  4. Jitsi: Worked well once when I tried it in the past, but it seems to have been removed from distribution repositories (fell out of development?). The Jitsi site seems to indicate that it is now just a web application.

  5. XMPP (Protocol): Statuses and invitations were a bit slow to update, but I was pleasantly surprised at the call quality. The only drawbacks I see are that laypersons may struggle to conceptualize creating accounts with different servers and varying clients, and that not all servers fully comply with XMPP standards. I would be extra careful introducing colleagues to this. Maybe use the analogy of email.

A scattered offering, but these, as well as others I did not name, have been receiving some much-needed development attention in response to the events of 2020.

Web Search

It might be a bit harsh to call libre web search a weak area when the proprietary counterparts are an absolute dumpster fire that even normies have begun to deride. There are a few open source or copyleft engines that can be self-hosted. The problems with each:

  1. Yacy: Written in Java, which causes massive issues once the index grows beyond a certain size. I have experienced Yacy silently crashing, a screaming CPU fan due to an out-of-memory JVM, the entire settings of my Yacy node being purged after a daemon restart, and other memory-induced issues. If you run Yacy, have lots of RAM and assign as much of it as possible to Yacy’s JVM. How much RAM is enough for Yacy? Who knows? The point of a searchable index is to grow it; what was enough memory a week ago is entirely consumed a week later. And capping the size of your index is an incredibly limiting “fix”.

  2. Marginalia: Java. What is it with open source search projects and Java? Programs that handle comparatively massive amounts of data… running on a memory-hungry runtime with dubious garbage collection.

  3. Gigablast: Dead project? Yup, dead. Gigablast’s official instance was shut down a while ago, and it looks like the last actual code contributed to the codebase was back in 2022. It also only builds for x86 targets, so it’s not very useful for me.

  4. Spyglass: Has pivoted into more of a local index role. Indexing websites is still possible, but the crawler will probably never be developed into a robust web crawler as a result. From their README:

Not all websites & not all data can be crawled by Spyglass.

There are also the metasearch engines, which do not maintain a local index of their own:

  1. SearX/SearX-NG: Cell division into two separate projects because of some lover’s quarrel between developers. (Actually, I don’t really know why and don’t really care; it’s just more needless fracturing.) Metasearch is good in theory. But, in practice, it leads to a lot of

Engines cannot retrieve results:
engineA (unexpected crash), engineB (unexpected crash)
engineC (unexpected crash), engineD (unexpected crash)

Commercial engines really don’t like answering POST requests, and especially not requests originating from known Tor exit nodes. Search results in SearX/NG are so unreliable, in fact, that requesting a second page of results is a roll of the dice as to whether anything actually gets displayed or whether you’re met with more “(unexpected crash)” errors. And the longer and more complex a search query is, the less likely it is to yield results; I find that searches have to be limited to one or two terms at a time.
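
For a rough picture of why a metasearch page degrades like this, here is a toy Python sketch (not SearX’s actual code; the upstream URLs are placeholders) of the fan-out-and-aggregate pattern: the query goes to every upstream engine, and any upstream that refuses the request, rate limits, or serves a CAPTCHA simply contributes nothing, which is roughly what each “(unexpected crash)” line represents.

    # Toy fan-out metasearch, not SearX's code. Upstream URLs are placeholders.
    from concurrent.futures import ThreadPoolExecutor
    import urllib.parse
    import urllib.request

    UPSTREAMS = {
        "engineA": "https://search-a.example/search?q=",
        "engineB": "https://search-b.example/search?q=",
    }

    def query_upstream(name, base_url, terms, timeout=5.0):
        url = base_url + urllib.parse.quote(terms)
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                return name, resp.read()      # a real engine would parse results here
        except Exception as exc:              # blocked, timed out, CAPTCHA wall...
            return name, exc

    def metasearch(terms):
        results, failures = {}, {}
        with ThreadPoolExecutor() as pool:
            futures = [pool.submit(query_upstream, n, u, terms) for n, u in UPSTREAMS.items()]
            for fut in futures:
                name, outcome = fut.result()
                (failures if isinstance(outcome, Exception) else results)[name] = outcome
        return results, failures

    ok, failed = metasearch("libre metasearch")
    print("results from:", list(ok), "| failed:", {n: type(e).__name__ for n, e in failed.items()})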

FOSS Web Browsers

Okay, not weak, but not doing well. There are very few options, and I won’t belabor the point, as the situation has been written about extensively by others. Just a quick overview of the battlefield:

  1. Firefox
    Is the sole independent, not-so-independent browser with its own rendering engine that is functionally complete and used by more than twelve people. Mozilla are constantly harassing their users and have recently pivoted to go hard as an advertising company: “We know that not everyone in our community will embrace our entrance into this market”. Using Firefox first requires a meticulous configuration procedure to make it respect user privacy, an extra step that is almost unheard of in the world of free and open source software.

  2. Firefox derivatives
    Some are actually quite good, but all suffer from being downstream of Firefox. Maintainership has lapses in activity, manpower is spread thin, and security patches inevitably take longer to reach package distribution. And, often, the changes these teams make to Firefox are simply things you can do yourself through user.js, about:config and addons.

  3. Chromium
    Ungoogled Chromium, which I haven’t really used and so cannot speak on, is a reductive effort. Using it still contributes to a web browser monoculture, further entrenching the root of the problem. And if you use Chromium as-is, you might as well just be using Chrome, which is the worst-case scenario.

  4. Chromium derivatives
    Suffer from the same problems described under 2. Firefox derivatives.

  5. The indie guys
    I am hopeful for projects like Servo Browser, Otter Browser and others that are trying to go their own way. But they seem to be perpetually in a state of alpha testing and development, so using them leaves one without the option to do anything advanced, such as running granular web blockers.

Wouldn’t it be nice if we could look back five or ten years from now and say “Remember when there was just Google Chrome and Mozilla Firefox and almost everything else was just based on those?”

*Article last revised on 10/10/24