Why I Don't Use Linux

Intel recently removed Mir support from its video drivers.  I could get into the politics behind it - and I will - but the larger point here is: this is why I don't use Linux.

Well, actually I do use Linux.  I've been using Linux pretty much 100% of the time since 2005.  I made it through law school on Linux and have gotten friends and family to use it.  I even briefly practiced law on Linux.  (Writing briefs is hard but I found a way.)

But when I started working at SFO, I had to switch back to Windows.  The rest of the office used Windows, and I wanted my dev environment to match theirs (never mind that the web servers run Linux).  It's also hard to get 100% compatibility with Microsoft Outlook in an enterprise environment.  But the real sticking point for me was the video drivers.

See, I've got three monitors at work, all plugged into my laptop dock.  The graphics card is an Nvidia Optimus card - similar to the one in my home laptop, which I never got working perfectly either.  (Well enough to do dual-monitor output with the Compiz spinny cube, but I can't tell you how many times I broke Linux and had to reinstall it before I got even that working.)  At work, I could get dual monitors going, although it was sketchy.  As soon as I tried for a third monitor, X simply refused to start.  I think the dock was part of the problem, but without the dock I can't plug two external monitors into the laptop at all.  Sure, I could have managed with two monitors instead of three, but intentionally crippling my setup for Linux's sake didn't seem worth it.
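For the curious, the dual-monitor arrangement that did work boiled down to something like the sketch below.  The output names (LVDS1, DP1, DP2) are just examples from a typical Intel/Optimus laptop of that era; yours will differ, so treat this as a sketch rather than a recipe.

    # List the outputs the driver actually sees
    xrandr --query

    # Dual-monitor setup: laptop panel plus one external display
    xrandr --output LVDS1 --auto \
           --output DP1 --auto --right-of LVDS1

    # The triple-monitor equivalent is where it fell apart: adding a
    # second external output through the dock made X refuse to start.
    # xrandr --output DP2 --auto --right-of DP1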

Besides, Windows support for web design isn't that bad.  Well, that's not true either - it's terrible.  But I found it relatively easy to install all the tools I needed.  That's a topic for another blog post, but with Aptana, Drush, and the Acquia Development Desktop I'm pretty much there.

That's worth noting.  With those three apps, I get syntax highlighting, doxygen support, bash, drush, ssh with key support, git, and a full WAMP stack.  In Windows.
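To give a flavor of it: once the drush and git binaries are on the PATH, a typical day in that Windows bash shell looks just like the workflow I'd use on Linux.  (The module and hostname below are made up for illustration.)

    # Check that Drush can see the local Drupal site
    drush status

    # Grab a contrib module and enable it
    drush dl views
    drush en views -y

    # Clear Drupal's caches after a change
    drush cc all

    # Plain git and ssh, same as on Linux
    git pull origin master
    ssh user@example.com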

Of course, I'd still much rather be using Linux.  It's a better OS for many reasons, and it's a far better OS for web development in particular.  I've never seen a web design team that's all-Windows, and I probably never will.  But hey, my monitors work.

And that's the real point here: graphics support.  For the last 10 years or so, hardware support on Linux has been excellent.  Better than Windows, in fact, because pretty much all the drivers come pre-installed in the kernel.  On Windows you still have to install drivers for every piece of hardware you add; Linux offers true plug-and-play.
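You can see the difference from a shell.  On Linux, the driver for, say, an Intel GPU is just another kernel module that's already there.  (i915 is the usual Intel graphics module; substitute whatever your hardware uses.)

    # Is the Intel graphics driver loaded?  It ships with the kernel.
    lsmod | grep i915

    # Details about the module: version, author, parameters
    modinfo i915

    # Watch the kernel pick up newly plugged-in hardware
    dmesg | tail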

Lately, however, it seems like the graphics card companies want to send Linux back to the dark ages.  NVidia still refuses to open source its drivers, which was tolerable back when the open source community had reverse engineered pretty much everything, but now NVidia is shipping baffling new hardware like Optimus that simply doesn't work in Linux.  They refuse to release Optimus support in their own Linux drivers, and they refuse to let the community do it either by opening the specs.  And now Intel has decided to screw us over by removing Mir support.

I understand there are politics involved: Intel has been pushing Wayland, and has even hired full-time employees to write open-source code for it, so I'm sure they saw it as a slap in the face when Canonical decided to ignore all that work and back Mir instead.  Especially since Intel has, historically, been much friendlier to open source than any other graphics card maker.  (Read: NVidia and AMD.)  It does seem rather tone-deaf for Canonical to make such a decision so unilaterally.  When you rely on alliances with large manufacturers, you should make sure you never piss them off.  Something like this should never come as a surprise.

Bottom line: I have to use Windows at work because Linux can't figure out how to run graphics cards.  In 2013.