I am a bit of a sucker for Disney. They constantly provide experiences that delight and entertain. While I know they’re a large, monopolistic company on an ambitious quest to dominate the entertainment industry… I just can’t seem to dislike or even distrust them as much as I would, say, Amazon. My family seems to be much the same.

That being said, we bought into Disney+ from the get-go. It offered a chance to see all of the classic Disney movies, the Marvel smash hits and some NatGeo content, as well as content created exclusively for Disney+. Honestly, so long as they kept up with the content, I didn’t see the problem.

Well, now we’re two months in and I’m already bored with it. The Mandalorian is already over, and Forky with it. There are some new shorts pulled from the various Pixar movies, and some Star Wars content I haven’t seen in a while. The problem is that the vast majority of the content is either trash or content I didn’t care about before the service launched.

This is a real problem, and not just for Disney: Netflix, Hulu (also Disney-owned), HBO, Amazon and Apple face it too. There is a voracious desire for content, but it doesn’t come fast enough. People then look elsewhere for new stuff, or drop their subscription until the content is fully loaded. I wish I could just buy the content à la carte. We’re back to the old television-channels problem: there is so much crap I don’t want or need, but I’m still paying for it.

I know à la carte probably isn’t a sustainable business model for such expensive material, but tying me to a subscription I only use part of the time also seems like theft. It’s the main reason I keep waffling between Apple Music/Spotify/Amazon Music Prime. I don’t listen to that much music, but I’d like to listen occasionally without any advertising. It’s certainly a thorny problem.

I’m probably also being a bit of a curmudgeon.


Performance Enhancers

With a bit of tweaking and some fixing up of the default nginx configuration on my host, I’ve finally hit very good performance metrics, though I can’t imagine it’d behave this way under any serious load.
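For the curious, the tweaks were along these lines: a sketch of common nginx performance directives (the values and file types are illustrative, not my exact configuration):

```nginx
# Illustrative sketch, not my actual config: common nginx performance knobs.
worker_processes auto;              # one worker per CPU core

events {
    worker_connections 1024;
}

http {
    sendfile   on;                  # kernel-side file transfers
    tcp_nopush on;                  # batch headers with the first data

    gzip       on;                  # compress text responses
    gzip_types text/css application/javascript application/json image/svg+xml;

    server {
        listen 80;

        # let browsers cache static assets for a while
        location ~* \.(css|js|png|jpg|svg|woff2)$ {
            expires 30d;
            add_header Cache-Control "public";
        }
    }
}
```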




I sometimes get to a point where I question a lot of what I do and what matters to me. Sometimes it’s because of simple things, like a broken tool or a frustrating problem. Sometimes it’s more of a philosophical problem, where I question how I feel about something and whether I should change the ideas or habits behind some action.

Lately, it’s been a little of both. I use an iPhone, Apple Music, iCloud (storage, etc.), Office 365, and probably other tools and services that would be considered “non-free” in the “libre” sense. This has led me to think about what I actually want out of computing. It’s such a thorny question, because I’m settled into so many comfortable choices that upending them would probably throw my life (and my family’s) into temporary turmoil.

For instance: Office 365. I pay for just the Exchange Online component, because I don’t need the actual Office suite; we (my family) get it free from my school and other organizations we’re associated with. Exchange Online has been fine from an end-user perspective. Very rarely do I have any real issues to speak of, other than paying for it. My real problem is that it’s a very proprietary platform, which makes it moderately difficult to get out of, and to connect to with free-software tools. The IMAP support is… functional, but the contacts and calendars are tied down.

Similarly, I have lots of Apple devices and services. Like Office 365, I don’t have any complaints, per se; it’s just that they’re extremely proprietary, which means getting out of the ecosystem is difficult and, as above, connecting with free-software tools is straight-up impossible.

The reason I have these things in the first place is that my family, who don’t hold my free-software ideals, want and need access to reliable tools they can use from multiple places. This is not an unreasonable request, and it’s one that could be solved with enough time, free software, capital and expertise. Unfortunately, I’m not willing or able to host all of that, or even administer it. I don’t have the time, and I certainly don’t have the specific expertise to do all of it. Hence the current implementation.

The balance between freedom, convenience and cost is a tricky one. While I’d personally like to model my computing more in line with the FSF’s ideals, the problem is that I have family members to support as well. I’d also like to move them toward more free-software systems. While I recognize that their choice in platforms is their own, I also get to say what I will and will not support (kind of; family is so complicated).

I don’t have any kind of resolution for this, I’m still trying to figure out what kind of path I want to take. Is pragmatism the smarter choice? Idealism feels right, but is massively more difficult to implement. There is probably a good middle of the road, but I don’t know if I’m on it. Maybe it doesn’t matter?

What should I do?

Featured Image

“Ugh, a Mac”, by Joe Wilcox – License: CC BY-NC-SA 2.0


Historical Function

Today, I delved into a tiny bit of X11/Xorg plumbing to see if I could move this damnable .xsession-errors file someplace more out of the way. For those of you not in the know about Linux or X11, it’s basically a holding tank for any GUI application errors that aren’t otherwise logged. To me, it seems like a vestige of a time before centralized system logging via systemd, but who am I to question our forefathers?

Well, damn it, it’s open source! Let’s see if I can fuck with it and make it do what I want it to do!

Turns out no. At least not easily.

Inside /etc/X11/Xsession (on Debian, anyway) is the following bit of code, which creates an error log file (or a temporary fallback):


# attempt to create an error file; abort if we cannot
if (umask 077 && touch "$ERRFILE") 2> /dev/null && [ -w "$ERRFILE" ] &&
  [ ! -L "$ERRFILE" ]; then
  chmod 600 "$ERRFILE"
elif ERRFILE=$(tempfile 2> /dev/null); then
  if ! ln -sf "$ERRFILE" "${TMPDIR:=/tmp}/xsession-$USER"; then
    message "warning: unable to symlink \"$TMPDIR/xsession-$USER\" to" \
             "\"$ERRFILE\"; look for session log/errors in" \
             "\"$TMPDIR/xsession-$USER\"."
  fi
else
  errormsg "unable to create X session log/error file; aborting."
fi

exec >>"$ERRFILE" 2>&1

The gist (haha) here is: if the log file doesn’t exist, create it; if you can’t create it, make a temporary one in the system temp folder; and if that fails, just give up. All well and good, right? Right.

So, what happens if we, say, modify the ERRFILE path to something else? Well, I tried that and got an unexpected result: the file was created correctly in the new location (yay), but an empty file also appeared in the original location (boo).

Excuse me, what the actual fuck?

Right now, I’m kind of at an impasse. It seems like changing the code above to do what I want works, but something else, probably hard-coded, just goes “DUMP IT IN $HOME/.xsession-errors“, which is frustrating. I don’t know where else to look. If anyone has any ideas, let me know; I’d be glad to credit you.


I thought I had a smoking gun. I manually redirected everything in my .xprofile to two different logs. That logged correctly, but I still got a leftover file. Let’s see who has it open:
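(For anyone following along at home: lsof is the quick way to check which process holds a file open. lsof isn’t always installed, though, so here’s a /proc-based sketch for Linux, demonstrated against a temp file rather than the real log:)

```shell
# The one-liner for the real file would be:
#   lsof "$HOME/.xsession-errors"
# Without lsof, scan /proc for open file descriptors (Linux-specific).
# Demonstrated against a temp file this shell holds open itself:
tmpf=$(mktemp)
exec 3>>"$tmpf"                          # keep the file open on fd 3
found=""
for fd in /proc/$$/fd/*; do              # walk this process's open fds
    if [ "$(readlink "$fd" 2>/dev/null)" = "$tmpf" ]; then
        found="$fd"
        echo "open in PID $$ ($fd)"
    fi
done
exec 3>&-                                # close fd 3 again
rm -f "$tmpf"
```

The real culprit turns up the same way: whatever PID lsof (or the /proc scan) reports is the process keeping the file alive.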

Oh, hello i3bar….

So, I go and re-introduce my changes to Xsession, like so:

# Original
# ERRFILE=$HOME/.xsession-errors

# attempt to create an error file; abort if we cannot
# if (umask 077 && touch "$ERRFILE") 2> /dev/null && [ -w "$ERRFILE" ] &&
#   [ ! -L "$ERRFILE" ]; then
#   chmod 600 "$ERRFILE"
# elif ERRFILE=$(tempfile 2> /dev/null); then
#   if ! ln -sf "$ERRFILE" "${TMPDIR:=/tmp}/xsession-$USER"; then
#     message "warning: unable to symlink \"$TMPDIR/xsession-$USER\" to" \
#              "\"$ERRFILE\"; look for session log/errors in" \
#              "\"$TMPDIR/xsession-$USER\"."
#   fi
# else
#   errormsg "unable to create X session log/error file; aborting."
# fi

exec >>"$ERRFILE" 2>&1

I checked, and it dumps the standard rigamarole from X starting up. That’s good™. Problem is, I still get an (empty) .xsession-errors file.

Time to investigate sway/wayland.


Link Expiry

We all know it. You search for an issue or a topic you’re interested in, click a few links and boom: dead end. The page no longer lives there, the domain is gone, or the server ended up at the bottom of a river. Even my website is no exception.

While hypertext documents shouldn’t change, we all know they can, and will do so often. Which is why we have such interest in tools like the Wayback Machine. These tools regularly scrape, or let users submit, interesting material for archiving. They’re frequently used to ensure a particular version of a page or site is preserved.

I started thinking about this because I read an article about strategies for linking to obsolete websites (thanks Beko Pharm). One was to run a periodic link checker to find stale or broken links on your site, optionally swapping outdated references for fresh ones, or for links into the Wayback Machine. While this is all well and good, I think it might be more useful to self-archive sites: use something like wget to pull down the document and its associated resources and host them yourself (statically), or at least provide an archive for people to download and inspect.
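A sketch of what that self-archiving could look like with wget (all the flags are standard wget options; the URL is hypothetical, and the fetch is behind a dry-run guard since it hits the network):

```shell
# Self-archive a page plus the assets it needs. Flags (all standard wget):
#   --page-requisites   also fetch the CSS/JS/images the page uses
#   --convert-links     rewrite links to point at the local copies
#   --adjust-extension  append .html etc. so the copy browses statically
#   --no-parent         don't wander up the site
url="https://example.com/some-article/"   # hypothetical URL
cmd="wget --page-requisites --convert-links --adjust-extension --no-parent -P archive"
if [ "${ARCHIVE_RUN:-0}" = "1" ]; then
    ran="real"
    $cmd "$url"                           # actually fetch (needs network)
else
    ran="dry"
    echo "would run: $cmd $url"           # dry run by default
fi
```

The result in the archive/ directory is a static copy you can host alongside the post that links to it.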

Has anyone given this any further thought? It doesn’t sound like a technically complicated project, but I’m sure someone has already trodden this path and come to some sort of outcome, or a reason it’s not worth it.