One of the nice things about buying a new pc is that you have an old pc to mess with. Having backed up the most important stuff, my old machine is now the victim of some random linux installer abuse. Right now I’m installing ubuntu, a debian derived linux distribution. It’s been a few years since I ran linux outside of vmware (basically since I bought the machine I am now using for my linux install). I used to have more time to mess with trying out this kind of stuff. I know from experience that getting linux to work is easy and getting it to work properly is very difficult. Presumably, ubuntu should make this easier but let’s see what we end up with. I actually typed most of this review during the install; there was plenty of time for that.
If you came here to read how beautiful ubuntu is, move on because the stuff below is probably a bit painful for you.
I opted for the bittorrent version of the 5.10 release. It’s a 2.8GB download so bittorrent is mandatory. I burned it to a dvd with my new drive.
Insert the dvd in the drive, make sure bios is configured to boot from cd (on most systems the default is wrong) and reset.
Here it gets interesting. I can select install, server and live dvd. Install seems a good plan. Up comes the text based installer. I was half expecting a graphical installer so that is disappointing. Worse, the installer seems of the intimidating, piss off the end user variety. Luckily, I’ve seen worse (I installed slackware on a 486 SX 25 mhz once). Navigating is pretty straightforward if you’ve dealt with ms dos or similarly clumsy uis in the past. The only severe usability issue is going back. There’s a back option on some screens but you can’t get to it using the arrow keys. You have to use the backspace, doh!
Progress bars, or lack thereof.
Another odd thing in the installer is that in between the screens where you are supposed to select stuff you get these nice blue screens without any content whatsoever. For example, I’m currently running the disk partition tool and the screen has been very blue for the past ten minutes (of the ms bsod variety). I mean, at least display some text telling me that everything is fine and I should be patient.
My network cards are detected and configured using dhcp. Bonus points for that; nothing is worse than trying to fix linux problems offline. The usb mouse seems to work as well (its led is on) but I can’t use it in the commandline ui.
This tool, aside from the aforementioned HUGE usability problem, seems to behave rather nicely. The default is to resize my hdb1 partition, which supposedly makes it possible to leave my windows partitions alone. That’s nice but it takes a loooooong time. A warning might have been nice. Anyway, I remember the old days of manually partitioning using all sorts of obscure tools including the commandline fdisk tools of both windows and linux. Again usability rears its ugly head. After resizing, the UI reappears with some worrying information about new partitions it is about to write on the (supposedly?) freed space. What’s worrying is that it doesn’t say how large each partition will be or what happened to the resized partition. Some confirmation that resizing worked as expected would have been nice. After some hesitation I select yes to indicate that it can do its thing. Had there been any important stuff on the partition I would probably have ejected the ubuntu disc at this point. This is bad. This is a crucial phase in the installation and if something goes wrong, it will likely be here. Bonus points for functionality but the usability totally sucks here. Partitioning is scary, especially with a tool you’ve never used before. I’ve seen it go wrong in the past.
Installing the base systems and copying remaining packages.
Finally some progress bars. But no package selection at this point. That’s probably good as debian package selection is not something you want to put in front of users at this point. More on this later.
Timezone and user configuration, configuring apt.
I suppose this is necessary but I’d prefer a real user interface. Also there’s some odd stuff here, like having to know whether the hardware clock is set to gmt or not (it’s not, I happen to know this). Ntp plus me telling it what timezone I’m in provides the same information. Finally it offers to configure a bootloader (grub) so I can choose to boot into linux or windows xp. That’s a nice touch. Debian got this wrong last time I tried it and I had to fix LILO manually to get back to windows.
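For the record, the gmt question boils down to a single setting. Assuming ubuntu 5.10 follows plain debian here, the installer’s answer ends up in /etc/default/rcS, so it can be changed later without rerunning anything:

```
# /etc/default/rcS -- "yes" if the hardware clock is set to GMT/UTC,
# "no" if it is set to local time (the usual case on dual boot
# machines, since windows keeps the hardware clock in local time)
UTC=no
```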
Time for a reboot.
The boot screen. Pretty, if you like brown. And then back to the commandline UI in stylish bsod blue. It’s now doing its post installation routine which appears to involve configuring a user (no root, ubuntu has no root!), installing all the debian packages and downloading a few new ones. I know how debian works so this is not unexpected, but it is definitely not very user friendly. It involves lots of cryptic messages about various obscure packages being prepared, configured etc.
Halfway through it comes up with a question about screen size. I select 1280×1024. I can’t select a refresh rate and indeed this proves to be configured wrong after the installation (60Hz instead of 85Hz). Then the install continues, no more questions.
Then suddenly it is done and the login screen appears. This is linux: no further reboots necessary, the installer finished without much ceremony and X was launched. I log in with my user/password. Gnome appears to be configured apple style (menu bar at the top, taskbar at the bottom) and a popup informs me that 48 updates are available. Installing them seems to work fine, which proves that the network is indeed configured properly.
Configuring the screen properly.
60 hz will give me a headache so that needs to be changed. Up front I’m not very hopeful that tools have improved to the point where this can be done without manually editing X configuration files. But let’s see how things have improved in the past few years.
Not much apparently. The good news is that there is a resolution tool in the system->preferences. It even has a dropdown for the refreshrate. Only one item is in it: 60HZ. Doh!
This is linux at its worst. It’s not working and the provided tools are too crappy to solve the problem at hand. A search on the ubuntu site confirms that monitor configuration is undocumented. In other words, I’m on my own. Google brings up the solution which indeed involves the commandline and hoping that the autorecognition will magically work when tried again.
Of course it doesn’t. Worse, I now understand why the installer tries to hide the inevitable sudo dpkg-reconfigure xserver-xorg. This basically is the good old XF86Config wizard. I have fond memories of toying with it in 1995 (slackware). It has gotten worse since. At the time it asked a few difficult but straightforward questions. The modern version presents you with a whole array of bullshit options and autorecognition features which half work. Let’s face it, if they worked you wouldn’t be running the reconfigure. Forget about autoconfiguration. Everything the installer figured out is now forgotten (with no obvious way to redo that part other than putting the backup back).
Essentially this horrible tool brings together everything that sucks about X in one convenient place. Mere mortals are guaranteed to be totally confused by this beautiful piece of shit that after all these years still survives in linux. The inability of the linux community to fix this once and for all is illustrative of the hopelessness of the whole concept of desktop linux. The linux solution to display configuration is to hide this tool instead of implementing an alternative. On the first go I did not manage to get the optimal refresh rate. On the second go I screwed up the display configuration. Copying back the backed up configuration did not fix the problem.
Ahem, reboot seems to ‘magically’ fix the problem. At least, I’m back where I started (1280×1024 @ 60 Hz).
Ok, so much for wizards. I knew in advance that I was going to end up manually editing the display settings. For the record, this is where normal users either go back to windows or accept the headache. I know from experience that editing X configuration is a matter of trial and error. In my case five reboots and the documentation for my plug and play m990 monitor did the trick. Ubuntu failed to set up my monitor’s horizontal and vertical refresh rates, something it should have figured out from the plug and play information. OK, shit happens. The next problem is that the tool for fixing this is reconfiguring the package, and doing that undoes most of the good work the ubuntu installer did (so it makes things worse). Solution: copy back the backup of the ubuntu configuration and edit it manually to fix the refresh rates (30-96 and 50-160 in my case). Then reboot, because misconfiguring X really screws things up to the point that a reboot is required to make X start again after you fix the configuration. Been there, done that before. At least the bloody wheel mouse works out of the box nowadays.
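For anyone hitting the same wall: the manual edit amounts to putting the monitor’s sync ranges into the Monitor section of /etc/X11/xorg.conf. The identifier is whatever your existing config already uses; the ranges below are the ones from my m990’s documentation, yours will be in your monitor’s manual:

```
Section "Monitor"
	Identifier   "m990"
	HorizSync    30-96     # kHz, from the monitor's manual
	VertRefresh  50-160    # Hz, from the monitor's manual
EndSection
```

After fixing the ranges and restarting X, the resolution tool should finally offer more than 60Hz.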
Conclusions for the installer
Usability sucks but the installer gets the job done anyway, except for configuring the screen (important). However there are several major pitfalls you have to know how to avoid. The installer is not particularly informative about what it is doing and needlessly verbose at the same time. However, the defaults are sane and a strategy of going with the obvious choices will work most of the time (if in doubt, hit enter).
The default theme is ugly. There’s no other word for it. It looks like shit. Damn this is ugly. Yes you can fix it. There are hundreds of shitty themes to select from, but the default is unbelievably ugly. It leaves no other conclusion than that the ubuntu people are (color) blind. Menu layout seems ok. I have the feeling stuff is being hidden from me.
Configuring the screen properly is back to the commandline. There is no excuse for this in 2006 and I knew this was going to happen. The provided (ubuntu forum, the official documentation is of no use here) solution corrupted my configuration to the point where X just wouldn’t start anymore. Unbelievable, inexcusable.
It’s 2006, ten years after my first slackware install, and I’m still messing with the X configuration the same way as ten years ago. X continues to be the weak spot of desktop linux.
And of course the installer fails to install the commercial nvidia driver (or even point me in the right direction). Amusingly the documentation is full of helpful stuff you can do manually that IMHO the installer should do for me. What the fuck do I care about ideological issues with commercial stuff? I’m not a GPL communist. Give me the choice to install the driver that I likely want. Why would I spend 400 euro on a video card and then opt not to run the software that is required to access the more interesting features of this card? Exactly, that’s a very rare user.
OK on to the rest of the system.
Read only ntfs has been possible for years and these days even some experimental rw capabilities exist. Not in ubuntu. Both my ntfs partitions are nowhere to be found. The system->administration->disks tool is as useless as the resolution tool. It fails to ‘enable’ the partitions. Yes, I know how to mount stuff from the commandline. But Joe average can’t get to his ntfs files with ubuntu. Bad, but I can probably fix this.
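For reference, the commandline fix is a couple of lines in /etc/fstab. The device names below are just what my disks happen to be called (check yours with sudo fdisk -l), so treat this as a sketch rather than a recipe:

```
# read-only mounts for the two windows ntfs partitions
# (create /media/windows and /media/data first)
/dev/hda1  /media/windows  ntfs  ro,umask=0222,nls=utf8  0  0
/dev/hdb1  /media/data     ntfs  ro,umask=0222,nls=utf8  0  0
```

A sudo mount -a afterwards makes them show up without a reboot.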
Let’s see about the sound card. It’s a soundblaster audigy. But there’s also a motherboard sound card (I actually use both under windows). Pleasant surprise: ubuntu seems to have configured this correctly, succeeding where, so far, every version of knoppix has failed.
Good. So far I’ve been sceptical but let’s be positive. I have a working system, ubuntu has configured itself properly, my windows install still works and I have not lost any data.
Installing kde using synaptic.
Wow, this is easy. There’s a separate distribution called kubuntu which is just ubuntu with kde instead of gnome. If you install ubuntu, like I did, you get only Gnome. Installing kde is as simple as installing the kubuntu-desktop package. This involves installing more packages from the dvd and downloading a few new ones. Altogether, including the downloading, this takes about 20 minutes (120 KB/s). I don’t understand why the kde packages are not on the dvd though, there’s plenty of room. Anyway, I now have the latest kde and gnome on one machine. The KDE theme looks much better even though it is not the default KDE theme.
The menus in both kde and gnome are a mess. This is a linux problem in general and it’s not fair to blame this on ubuntu. But still, 90% of the crap in the menus probably shouldn’t be there.
The installer has lots of usability issues. Aside from not being graphical, it is confusing and asks a lot of stupid stuff. The partitioning tool has good functionality but also does a good job of scaring you with misleading information.
Configuring X still is an issue. Probably it’s slightly better if you have an LCD screen (60 hz is ok then).
Hardware support is pretty decent: it failed to detect the monitor but the rest seems to work fine. It doesn’t install the commercial nvidia driver that most users will want to use.
The ubuntu gnome theme is ugly.
Kde install went smoothly and the ubuntu kde theme is not that bad.
Update 30/07/2009: I just bought an imac and moved the same, but now consolidated, library over to it. Check out the instructions here.
Whoohoo! My new hardware arrived last week. I’ve been busy playing with it, which explains the small delay in posting.
Right now I am still going through the tedious procedure of getting everything the way I want it. I have a local network so I can access my old PC. However, dragging my external HD between the two machines is much faster.
Tediousness includes copying my itunes library. Tricking itunes into accepting the old library is somewhat of a challenge. But that’s what google is for. Since I found google’s answers a bit disappointing (lots of “drag this folder there” type of stuff from Apple users), I’ll post some detailed instructions for real users who do not “consolidate” to the itunes folder but choose to keep their music organized manually. To add some difficulty, my new machine has no second harddrive so the paths are different after copying.
If all goes well everything is moved (music, playlists, play statistics, ratings) AND I can sync my ipod with the new pc without that requiring it to be wiped and refilled with the moved library. I’m moving the library, not recreating it.
The itunes library consists of only two files, plus its own itunes music folder and whatever external directories you imported (two in my case). One of the two files is a binary file, the other one is an xml file with data on all your songs, including path names, statistics, ratings, etc. Essentially, the xml file contains everything we want to migrate except for the mp3s. Unfortunately, moving the itunes library is not as simple as copying the files to the new machine. Apple deliberately made it hard to do what you are about to do. So here’s a step by step guide (windows specific, though Apple probably is about the same):
- At all times, keep at least one intact backup of all files mentioned in this post. Never work on the originals. Preferably, leave the original library untouched, you can always go back to that.
- Start by copying your mp3 folders to your new machine. That may take a while. Make sure they are where you want them to be. It took 20 minutes for my folders using an external HD, not counting the time it took to create the backup from scratch on the external hd (basically I used my incremental backup). Also copy both itunes files (xml and itl) and the itunes mp3 folder (if not empty) onto the external hd.
- Now download, install, run & close itunes. It will create an itunes directory for you the first time it starts; that’s where it will look for its files. Replace the stuff inside this directory (My Documents\My Music\iTunes) with the backups on your external hd (including the itunes music folder). Now here comes the tricky part. Thanks to this post for putting me on the right track! DO NOT start itunes again until after the steps below.
- First fix the pathnames in the xml file. They still point to the old location. Open the file in a capable editor, the thing to look for is search and replace functionality. Search and replace the parts of the path names that are now different: your itunes music folder and any other folders you imported in your old library. Save the file.
- Now this is important: iTunes will ignore whatever path info is in the xml file! Unless the itl file becomes corrupted. We can fix that! Open the itl file in an editor, delete the gibberish inside, save. Your itl file is now corrupted, normally this is a bad thing. You still have the xml file though (and a backup of the itl).
- Start itunes, it will ‘import’ your music and afterwards complain that the itl file is corrupted, let it fix it.
- Check if everything is there. In my case I messed up with the search and replace and some files were missing. Just go back a few steps, copy your backups and retry.
- Done. Everything now is on the new PC. What about the ipod? Just plug it in! You already installed itunes on the new machine so you have the drivers for your ipod. The key or whatever itunes uses to recognize your ipod is in the xml file, and now also in the recreated itl. Apparently the xml file is sort of a backup of the itl. I suspect the itl is a bit more efficient to manipulate programmatically. I have no idea if this preserves any itunes store stuff you purchased. Presumably, that involves deauthorizing your old machine and authorizing the new one. I never used the itunes store so it’s not an issue for me.
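The path fix in the search and replace step is mechanical enough to script. Here’s a sketch in python of what the editor is doing for you; the two prefixes are made-up examples, not my actual paths, so substitute your own old and new locations:

```python
# Sketch of the path fix: bulk-replace the old location prefix with the
# new one in the text of "iTunes Music Library.xml". The prefixes below
# are invented examples -- substitute your own old and new music paths.
OLD_PREFIX = "file://localhost/D:/Music/"
NEW_PREFIX = "file://localhost/C:/My%20Documents/My%20Music/"

def rewrite_paths(xml_text, old=OLD_PREFIX, new=NEW_PREFIX):
    # iTunes stores one <string>file://...</string> location entry per
    # track, so a plain textual replace does the same job as an
    # editor's search and replace.
    return xml_text.replace(old, new)

# Against the real file this would be:
#   text = open("iTunes Music Library.xml", encoding="utf-8").read()
#   open("iTunes Music Library.xml", "w",
#        encoding="utf-8").write(rewrite_paths(text))
entry = "<string>file://localhost/D:/Music/album/track01.mp3</string>"
print(rewrite_paths(entry))
```

Mangle one character in the prefix and the replace silently does nothing, which is exactly the failure mode I hit, so double check the result before corrupting the itl.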
The only thing I lost in the transition is some iTunes preferences that are easy to restore. For example I had some of my playlists set to shuffle. The imported playlists no longer had the shuffle enabled. Big deal. The preferences probably aren’t part of the library. I noticed that the shuffle settings do not sync to the ipod either. This is annoying actually because the shuffle settings is deep down in some menu on the ipod and I only want to shuffle playlists. I like my album songs served up in the order that they were put on the album.
I’ve used winamp for most of the past decade (I think since 1996?). Only when I got my ipod a few months ago did I start using iTunes, by choice. There is an excellent winamp plugin which will allow you to sync winamp with your ipod. Presumably, moving a winamp library is a lot easier since winamp uses a file based library rather than a database. However, the main developer has left AOL, so winamp development seems a lot less interesting these days. AOL seems to just pile on commercial crap with every release. So I’ve given up on it for now.
After the zillionth crash I’ve given up. I managed to get the freezing down to once every few hours but it’s not good enough. I just got the good old cat 5 cable out of the closet and connected the pc to the cable modem the old fashioned way. That seems to work very nicely. I can still use the wireless connection with my laptop from work.
Naturally, I’ll take the siemens gigaset usb stick 108 crap back to the store. See what happens. If you’ve googled for that piece of shit and found my blog posts: don’t buy it, or if you bought it, trade it in for a different brand. I’ve updated everything: drivers (kt400 chipset, wlan), bios. Nothing seems to work and the pc consistently does not crash without the usb stick. It might be a compatibility problem with my motherboard, it might be faulty hardware, it might be crappy siemens drivers. My guess is the latter.
Now the only problem is the cable connection itself. That too is not working too well. I’ve had several disconnects already. The online led just goes off and the modem tries to reconnect for a while, which, so far, it manages to do after fifteen minutes or so. I suspect the cable signal isn’t too good. This is an old building after all.
I almost ordered a new PC last friday. Unfortunately the video card I wanted had to be ordered and the guy from bt-mikro (right across the street) couldn’t guarantee he’d have it within a reasonable time frame, so he recommended that I not order right away. Otherwise the setup seems nice: dual core amd 4400+ with an octek motherboard, 2 GB of memory, a nice 300GB disk, a samsung 204TS 20″ lcd screen and a logitech mouse and keyboard (wired, I’ve had enough of this wireless shit). I’ll buy it when the video card is available again: a nvidia 7800 GT. This card (and its slightly heavier brother, the gtx) seems to be the card to have at the moment.
That should keep me happy for a few years. My current geforce 4200 still works very nicely despite its age. I recently played doom3 on it with reasonable framerates and some of the eyecandy turned off.
Update: hardware has been ordered now 🙂
The story below has gotten a little long so I’ll summarize for the benefit of those with similar problems:
I applied various solutions to the connectivity and stability problems I encountered:
- My PC froze up every half hour. This required a reboot and then unplug/replug the usb stick and then repair the connection.
- I noticed gaim reconnected every few minutes. Connection was marked as good and then it was suddenly dropped.
- Sometimes my router could not be found at all and then one minute later it had an ‘excellent’ connection
Here are the solutions I applied:
- Look for firmware upgrade. In my case there wasn’t any but if there is this may be your solution
- Look for driver updates for your wireless card/usb stick. Generally the software on the cd is obsolete. This applies to any hardware you buy.
- See if you can live without the wireless monitor thing that was installed along with the driver.
- Uninstall the software and driver
- make sure you have the (downloaded) driver somewhere on disk
- windows may recognize your usb stick on boot, opt to select the driver yourself and then to locate the driver yourself
- If windows does not detect your hardware, go to control panel-systems-device manager. Likely there is an exclamation mark at the icon of the device. You can fix the problem by rightclicking, properties and then install the new driver. Alternatively you can use the find new hardware wizard.
- If all goes well, you can now configure your wireless connection through the default windows wizard. It is less feature rich than your monitor app, which is just fine if all you want is a network connection.
- Some of the more advanced/non-standard features may not be available to you, most notably non standard speeds such as 108mbps. This does not really matter as much as you might think since those speeds are based on compression rather than actual bits. Hard to compress data like multimedia, photos and compressed binaries does not really benefit that much from this network layer compression.
- You may have disabled upnp in the past. In itself this is a good thing because you probably don’t need it. Unfortunately, in my case the driver depends on this service. So, go to control panel-administrative tools-services and set the upnp service from disabled to automatic (this means it will start if it is required).
- If connectivity problems remain, the reason may be interference from other devices. Most wifi cards default to channel 1 (a busy channel). Depending on where you live you may select from 11 or 13 channels. Some of these channels have overlapping frequencies; channels 1, 6 and 11 do not overlap. You can configure the channel on your router and may need to configure it elsewhere too. I set the channel to 11 and the number of disconnects has dropped substantially.
- Figure out type and brand of the faulty hardware (and possibly of the chipset in that hardware) and google for your problem. Likely you will find this site if you have the same hardware as me: a motorola sbg900e cable modem and a siemens gigaset 108 usb wlan stick. You may find it useful anyway even if you don’t have the same hardware.
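The channel advice above is easy to sanity check with a little arithmetic: 2.4 GHz wifi channels sit 5 MHz apart, but each signal is roughly 22 MHz wide, so any two channels fewer than five apart overlap. A quick sketch (the 22 MHz width is the nominal 802.11b figure):

```python
# 2.4 GHz wifi: channel n has its center frequency at 2407 + 5*n MHz,
# and each 802.11b/g signal occupies roughly 22 MHz of spectrum.
def center_mhz(channel):
    return 2407 + 5 * channel

def overlap(a, b, width_mhz=22):
    # Two channels interfere when their center frequencies are closer
    # together than the signal width.
    return abs(center_mhz(a) - center_mhz(b)) < width_mhz

# 1, 6 and 11 are the classic non-overlapping trio:
print(overlap(1, 6))    # False
print(overlap(1, 3))    # True
```

Which is why moving from the default channel 1 to 11 puts the maximum possible distance between you and the neighbours who never touched their settings.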
This more or less solved my problems. I still encounter the occasional disconnect. But it now reconnects immediately, by itself. A smart download manager, gaim and most media players should survive the brief disconnect without intervention.
If the above does not solve your problem, bring the hardware back to the store and change it for a different brand. Seriously, having to go through all of the above constitutes a really bad end user experience. Things are definitely not working as advertised and you are entitled to complain, loudly! Most shop people are totally incompetent so be understanding of their ignorance but insist on a replacement or refund. Don’t accept the same brand unless you are sure you had a hardware failure. If you do so anyway, make a deal with the shop that if the replacement doesn’t work either, you get a refund.
If someone from Siemens reads this: you people suck. Your software is absolute crap and you know it.
Since last year, April the 1st, Google is regularly releasing interesting new products and services. Last year they kicked off with gmail, a free email service with some interesting technical characteristics.
So gmail uses XMLHttpRequest to implement some interesting things using ajax:
- Spelling checker
- Address completion
- Auto save of new messages
- Fetching content without refreshing the page
Plus they allow their users to store gigabytes worth of email. The entire collection of email I’ve collected and kept over the past ten years is less than that!
Recently I’ve abandoned thunderbird, which is a nice mail client, and started using gmail for all my mail. I find it works very well and includes a number of features that I have not seen elsewhere so far. The most important feature for me is that it groups related messages (and replies) into conversations on one webpage. This allows you to keep track of long running conversations easily and is also very convenient for mailinglists.
Another thing that I like is that the mail sits on a server. It doesn’t matter where I am, I can always access my full mail archive. I’ve been messing with pop based accounts for years. Inevitably you answer some important mail via some webbased account while traveling. Then you need to forward the reply to yourself for archiving, which is error prone and generally forgotten.
Yesterday I posted a story on x-plane and mostly commented on its superior scenery engine as opposed to the lack of content that uses it. Later that same evening I installed google earth, a new software package that downloads satellite imagery and annotates it with roads, places, hotels, restaurants etc.
What a brilliant program. It looks awesome (well, some places do, mostly inside the US). For example, you can zoom in to about 500 feet altitude, hover a bit over phoenix arizona (and drag it around) and find back the very same pool I swam in six years ago. Mind you, all I had was a vague recollection of what the surroundings looked like and the name of the hotel chain (holiday inn, which has many hotels in phoenix). So I zoomed to Scottsdale and looked for some green areas. Found a label called holiday inn, with a building shaped like the one I spent about five days in and a correctly shaped pool beside it. Impressive.
My point: why don’t these guys from google spend some time talking to Austin Meyer (the guy who makes x-plane)? Google earth beats x-plane (and any other simulator) hands down in displaying realistic scenery; all it lacks is a flightsimulator. It currently streams data over the internet. Stream it from harddisk and you stream terrain faster than you can fly.
Of course the engine is not optimized for flightsimulation. It might require a bit more elevation info, some more accurate texture positioning. The quality of the textures however is excellent and better than any satellite based scenery for any flightsimulator I have ever seen. It looks excellent at low altitudes (300 feet).
Co-linux is a custom linux kernel that can run as a windows application. It is bundled together with a debian linux base distro. On a whim I tried it today and I have to say that I am impressed. It boots very fast. Once booted you have what is known as the debian base image. 2d graphics are not implemented in colinux. But since linux guis can be served over a network that is not a problem. So rather than emulating some crappy display driver you just do apt-get install vncserver, download a vnc client for windows and tada, graphics.
The rest is just straight debian configuration. For the average windows user that is pretty hard of course. But been there done that so no problem for me. I’ve been at it for a bit over an hour now.
The hard part was convincing windows to do internet connection sharing and remembering how to configure networks in debian (it’s been a while so it took me a few google attempts). After that it’s apt-get this and that. Woody was obsolete the day it was released years ago, so I fixed sources.list and did an apt-get dist-upgrade to upgrade to testing. Then a few apt-get install commands to get an xserver, kdebase and vncserver (this is all explained in the co-linux documentation). Then I started a vncserver, connected to it using tightvnc (a nice vnc client for windows) and I am now looking at a kde 3.2.2 desktop. It’s actually running at native speeds. The only bottleneck is vnc so graphics performance basically sucks. I’m going to try the cygwin xserver as well.
Apparently the pornsites have found my site and are now unleashing their stupid tools on it in order to get their links into my referrer thingy. I’ve been getting thousands of hits from domains with words such as ‘rape’ and ‘perversion’ in them. Presumably this is intended to improve their google rankings. Besides, I don’t feel like monitoring referrer stats anyway.
So bye bye referrer thingy.