Tag Archives: linux

Running Real’s Rhapsody in Linux

Every once in a while I have to put something actually useful up here. I just spent a while trying to get Rhapsody to work in Firefox 3 under Ubuntu 8.10. Having an entire music store at my disposal is one of the things I miss most from my Windows machine when I need to run something under Linux. There is a web version of Rhapsody that should run under Linux, but I kept getting “Technical Issue” errors in the Rhapsody plugin every time I tried to play a song. It would tell me to restart, but it never worked, even after a restart. It turns out I had an old version of the Rhapsody plug-in lying around, and didn’t have a proper version of Adobe’s Flash player installed (the new version of the web-based Rhapsody client just uses Flash). Apparently, Rhapsody either doesn’t, or can’t, check the version of the plug-in installed on your machine, and if it’s the old version it will try to use it and fail. So, to get Rhapsody working, first go to the following two directories:

~/.mozilla/plugins
/usr/lib/firefox/plugins

and make sure you delete any file with the name nprhapengine.so. Then, in Synaptic (or whatever package manager you like) make sure you’re running the actual official Adobe version of Flash 10. Ubuntu, being produced by ideologues (didn’t think I’d be able to write about linux without taking a shot, did you?), makes it hard to install software built by evil monolithic companies. So, if you have Flash installed, it’s probably a buggy “free-as-in-doesn’t-work” version. Do a search for “adobe flash” in Synaptic and uninstall any currently installed free versions. Then, install the flashplugin-nonfree package. (If you don’t find that package, leave a comment and I’ll help you make it available to Synaptic.)
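If you prefer the terminal, the whole fix sketches out to a few commands. The plugin paths are the two directories above; the package names for the free Flash implementations (gnash, swfdec-mozilla) are my assumption about what your distro shipped, so adjust them to whatever Synaptic shows as installed:

```shell
# Delete any stale copy of the old Rhapsody plugin from both plugin
# directories; -f keeps rm quiet if a file isn't there.
rm -f "$HOME/.mozilla/plugins/nprhapengine.so"
sudo rm -f /usr/lib/firefox/plugins/nprhapengine.so

# Swap the "free-as-in-doesn't-work" Flash for Adobe's official plugin.
# gnash/swfdec-mozilla are the usual free versions on Ubuntu of this era.
sudo apt-get remove gnash swfdec-mozilla
sudo apt-get install flashplugin-nonfree
```

Then restart Firefox as described below.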

After that, restart Firefox. Rhapsody should now work using their excellent Flash-based client.

(By the way, I was able to get the full Rhapsody client running in WINE, and was even able to play 30 second clips. However, it wouldn’t let me login, saying that cookies were being blocked. If anybody has been able to overcome that, I’d love to hear about it.)

Major bug in Ubuntu 8.10 networking for static IP addresses

There is a bug in the new version of Ubuntu (8.10, or Intrepid Ibex), where static IP network settings are lost after every reboot. Kind of makes it hard to connect to your box remotely with ssh.

Pretty big bug, huh? You’d think it would be rather embarrassing when your latest operating system release breaks the internet for a large proportion of your users. You’d assume this would be high priority, right? Nope. Until recently, the bug was considered only Medium priority since there were workarounds, even though the workarounds were completely nonintuitive and nothing a basic user could ever figure out on their own. Worse, Ubuntu 8.10 has been out for months now, this still hasn’t been fixed, and it probably won’t be for a long time. (In fact, it will end up being fixed in the next version of Ubuntu before they backport the solution to 8.10.)

This bug exposes a fundamental flaw in the Linux distribution development model, wherein the people releasing the operating system don’t actually write, or even understand, the various components they are packaging. If this bug came up in Vista, Microsoft could have it fixed in less than a week, because the guy that wrote their network manager actually works there. Of course, Microsoft tests their products before sending them out the door, so it wouldn’t have happened to begin with.

A lot of the people using Ubuntu 8.10 are going to turn tail and run back to Daddy Gates when they encounter this bug. They aren’t going to check bug tracking sites to figure out what’s going wrong, or look into work-arounds. So, you can ratchet the Linux market share down just a little bit more. At this point, the idea that Linux will take over the computing world is actually becoming downright laughable from the perspective of anybody who hasn’t been drinking a lot of kool-aid.

I’ve asked this question before, but why spend so much time developing Linux if you’re not going to bother putting out a quality product? If these folks are so inclined to spend their time programming a mediocre operating system for free, why not just volunteer at Microsoft? At least that might actually help people.

(In case you found this page looking for a solution to the problem, instead of a pointless rant about it, I offer the following two solutions, in order of decreasing utility. The most obvious solution, assuming you value your time at something north of minimum wage, is to simply install a usable OS built by professionals who actually have something to lose were they to release an operating system with such an astoundingly egregious bug. If you’re too cheap for that, or suffer from Linux Masochistic Personality Disorder and absolutely must get this working, you can always edit your network settings by hand.)
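For the hand-editing route, the usual workaround is to define the interface statically in /etc/network/interfaces, which takes it out of NetworkManager’s hands so the settings survive a reboot. The addresses below are placeholders for illustration; substitute your own network’s values and interface name:

```
# /etc/network/interfaces — static address for eth0 (example values)
auto eth0
iface eth0 inet static
    address 192.168.1.50
    netmask 255.255.255.0
    gateway 192.168.1.1
```

After saving, `sudo /etc/init.d/networking restart` (or a reboot) should bring the interface up with the static address.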

Zen and the Art of Linux Maintenance

As I sat watching the Ubuntu upgrade work its way through the packages, at some point the computer became unresponsive to mouse clicks. I ended up having to do a hot shutdown in the middle. As you might imagine, this completely and utterly hosed my Linux partition.

You might wonder why I keep banging my head against the wall of Linux, despite my rantings about it. So did I. As I sat staring at the kernel panic message, however, I realized something:

As much as I complain, part of me enjoys putting up with this stupid operating system, even though it long ago exhausted its utility, consuming so much of my time that it was no longer worth any amount of avoided software cost.

As an engineer, I like to tinker and fix things, and Linux gave me the opportunity (or rather, forced me) to delve into the workings of the OS in order to manage it. Linux provided me with the illusion of feeling useful and productive on a regular basis, as it required me to put my knowledge to work fixing a never-ending litany of problems.

But as I sat looking at a hosed partition, I had the embarrassed, hollow feeling that I’d really wasted an extraordinary amount of time focused on my computer as an object of inherent interest, as opposed to an expedient for actual useful work. My Linux machine had become a reflexive endeavor, largely existing for its own purpose, like a little bonsai garden that I tended with weary patience.

And now what do I have for it? I have some profoundly uninteresting knowledge of the particulars of one operating system, and a munged disk that’s about as practically useful as a bonsai tree. (Yes, my actual work is backed up, but it’s never trivial getting everything exactly the way you had it with a new system install, no matter how much you backed up.)

This was all good, though, because it ripped from my hands something I didn’t have the good sense to throw away. Rather than huddle down with an install CD and try to fix my little Linux partition, I just let it go and started to get back to work, actual work in the outside world, using Windows.*

It feels good. I’m done with operating systems as a hobby, tired of indulging technology for its own sake. One must not get too attached to things.

*I’m not trying to insult OS X, which I think is probably better than Windows. I just don’t have a Mac at work. (I can only fight one holy war at a time.)

Why Linux is failing on the desktop

I should’ve known better. I wrote a post a few days ago detailing my frustration with Linux, and suggested (admittedly in very indelicate terms) that the global effort to develop Linux into an alternative to general use desktop OSes such as Windows and OS X was a waste of resources. I have absolutely no idea how 400 people (most of them apparently angry Linux fans, extrapolating from the comments) managed to find their way to the article within hours of my posting it. I think they must have a phone tree or something. Nonetheless, I should’ve been more diplomatic. So, as penance, I will here attempt to write a more reasonable post better explaining my skepticism of desktop Linux, and will even try to offer some constructive suggestions. I’m sure this post will get no comments, in keeping with the universal rule of the Internet that the amount of attention a post receives is inversely proportional to the thought that went into it.

Before starting, let’s just stipulate something supported by the facts of the marketplace. Desktop Linux has been a miserable failure in the OS market. If you’ve consumed so much of the purple kool-aid prose of the desktop Linux community that you can’t accept that, you might as well quit reading now. Every year for the past decade or so has been declared “The Year Linux Takes Off.” Yet its market share is actually declining at this point.

As I pointed out in my first post (perhaps a bit rudely), this isn’t just a bad performance, it’s a tragic waste of energy. Can you imagine the good that could’ve been done for the world if these legions of programmers hadn’t spent over a decade applying their expertise (often for free) to a failure like desktop Linux? For one, they could’ve made a lot of money with that time and donated it to their favorite charities, assuming they were as hell-bent on not making money as they appear to have been. And two, it might have been nice to see what useful things they would’ve produced had they done something somebody was actually willing to pay for, as opposed to trying to ram desktop Linux down the collective throat of the world. You know, sometimes the evil capitalistic market does useful things, like keeping people from wasting their time.

Open Source community projects put innovation and scale at odds. If an Open Source project is to be large, it must rely on the input of a huge distributed network of individuals and businesses. How can a coherent vision arise for the project in such a situation? The vacuum left by having no centralized vision is usually filled by the safe and bland decision to just copy existing work. Thus, most large scale Open Source efforts are aimed at offering an open alternative to something, like Office or Windows, because no vision is required, just a common model to follow. This is not to say that innovation is not found in the Open Source community, but it is usually on the smaller scale of single applications, like Emacs or WordPress, that can grow from the initial seed of a small group’s efforts. The Linux kernel is a thing of beauty, and is actually a small, self-contained project. But the larger distribution of a desktop OS is another matter, and here we find mostly derivative efforts.

An OS is only as good as the software written for it. One of the great things about Open Source is that there is a tremendous power in being able to take an existing project and spawn off a new one that fixes a few things you didn’t like. While this is fine for an application, it’s problematic for a piece of infrastructure software expected to serve as a reliable, standard substrate for other software. Any Linux application requiring low-level access to the OS will have to be produced in numerous versions to match all the possible distros and their various revisions. See OpenAFS for an example of how ridiculously messy this can get. For apps, do you support GNOME or KDE or both, or just go, as many do, for the lowest common denominator? And supporting hardware accelerated 3D graphics or video is the very definition of a moving target. There are multiple competing sound systems, none of which is nearly as clean or simple as what’s available on Windows or the Mac. The result is usually a substandard product relative to what can be done on a more standardized operating system. Compare the Linux version of Google Earth or Skype to the Windows version of the same to see what I’m talking about. (That is, if you can even get them working at all with your graphics and sound configuration.)

Pointing the finger at developers doesn’t solve the problem. To begin with, for the reasons I’m explaining in this essay, some of the poor quality of Linux software is the fault of the Linux community diluting its effective market share with too many competing APIs. Even without this aggravating factor, developers just can’t justify spending significant time and money maintaining a branch of their software for an OS with less than 3% market share. Because of this, the Linux version of commercial software is often of much lower quality than the Windows or Mac version. A prime example of this is the ubiquitous and required FlashPlayer. It consistently crashes Firefox. Is it Adobe’s fault? Maybe. But when this happens to somebody, do you think they know to blame Adobe or Firefox or just Linux? Does it even matter? It’s just one more reason for them to switch back to Windows or the Mac. And for the record, why should Adobe bother to make a good version of FlashPlayer for a platform with no stability and few users?

The solution to all of this is easy to state, but hard to enforce. (There are downsides to Freedom.) Somehow the fractious desktop Linux community must balance the ability to innovate freely with the adoption of, and adherence to, standards. Given the low market share, Linux has to be BETTER than Windows as a development target, not just as good or worse. However, one of the problems with Linux seems to be a certain arrogance on the part of its developers, who consider applications as serving Linux, and not the other way around. An OS is only as good as the programs written for it, and perhaps the worst thing about Linux is that it hinders the development of applications by constantly presenting a moving target, and requiring developers to spend too much time programming and testing for so many variants.

It’s downright laughable that an OS with single digit market share would further dilute its market share by having two competing desktops. Yeah, I know KDE and GNOME are supposedly cooperating these days. But (a) it’s probably too late, (b) it’s not perfect, and (c) even if you disagree that it dilutes effective market share, it still dilutes the development effort to maintain two desktop code bases. For God’s sake, somebody kill KDE and make GNOME suck less! Yeah, I know that’s never going to happen. That’s why the title of this essay is what it is.

For all Microsoft gets wrong, they do understand one crucial thing: developers are the primary customer of an operating system. They may advertise to consumers, but they know that at the end of the day it is developers whom they serve. The Linux community just doesn’t get this.

Unfortunately, I don’t have much hope of desktop Linux ever becoming sufficiently standardized. If the focus shifts to making Linux friendly to applications and their developers, the distributions must become so standardized as to be effectively consolidated, and the desktop frameworks so static as to negate much of their Open Source character. For Linux to become more developer friendly, it would have to essentially become a normal operating system with a really weird economic model.

OS development actually requires physical resources. The FOSS movement is based on the idea that information is cheap to distribute, and thus represents a tremendous leverage of the human capital that develops it. People write some software, and the world perpetually benefits at little marginal cost. That works beautifully for applications, but OS development, especially desktop OS development, requires tremendous continuous resources to do correctly. For one, you need tons of machines of different configurations and vintages on which to test, which costs money. And you need a large team of people to run all those tests and maintain the machines. Any respectable software company dedicates a tremendous amount of its budget to QA, fixing stupid little bugs that happen to come up on obscure hardware configurations. Linux just can’t achieve the quality control of a commercial OS. And that’s probably why, when I “upgraded” from Gutsy to Hardy, my machine no longer completes a restart on its own. Maybe this will get fixed when the planets align and somebody with the same motherboard as me, who also knows how the hell to debug this, runs into the same problem. But I’m starting to get weary of this, and apparently I’m not alone, judging by the declining desktop Linux market share.


The results of my annual desktop Linux survey are in: It still sucks!

Note: If you are a member of the Orthodox Church of Linux and you suffer from high blood pressure, you might want to consult a physician before reading this. In fact, you may just want to skip to my follow up article, which presents my criticisms of Linux in a much more explanatory form.

I’m a sucker for a good story, and that Linux certainly is: millions of programmers working out of the sheer goodness of their hearts on a project to benefit humanity by providing a free operating system. Never mind that a commercial OS only costs about $100 anyway, and represents less than 10% of the cost of a new computer. Microsoft just makes us all so angry that if we have to spend billions of person hours so that we can all save $100 every few years, so be it. Time well spent.

So, it’s with heady optimism and hope for the future that once a year I anxiously download and install the latest consumer desktop incarnation of Linux, my eyes watering with the promise of life without Microsoft. For the past six years, I have installed Linux at some point during the year with the hope of never having to go back. And for the past six years I have used Linux for a week or so, only to inevitably capitulate after tiring of all the little things that go wrong and which require hours searching on the web for the right incantation to type in /etc/screwme.conf. While every year it gets a little bit more reliable, I am always guiltily relieved to finally get back to Windows, where there are no xorg.conf files to get corrupted or fstab files to edit.

This year, I decided to try Ubuntu 7.10. Given the hype, I had very high hopes. It installed without a hitch, and came up working fine for the most part. Just a small problem with the screen resolution being off and my second monitor not being recognized. I thought, “That should be easy to take care of. This could be the year!”
