Google Android

Update: a slightly updated version of this article has been published in the Javalobby weekly newsletter and on the Javalobby site itself after Matthew Schmidt invited me to do so.

Update 2: TheServerSide has linked here as well. For readers coming from there: the version on Javalobby linked above is the latest and also has some discussion attached.

About an hour ago, Google released some additional information on the SDK for Android, its new mobile platform. Since I work for Nokia (whom I of course do not represent when writing things on my personal blog, usual disclaimers apply), I'm naturally interested in new software platforms for mobile phones. And since I'm a Java developer, I'm particularly interested in this one.

I spent the past half hour glancing through the API documentation, just to see what is there. That is not enough for a really detailed review, but it does allow me to extract some highlights that in my view will matter enormously for platform adoption:

  • The SDK is Java based. No surprise since they announced that earlier, but it is nice to see that this doesn't mean J2ME; instead Java is the core implementation platform for all applications on the platform.
  • The Linux kernel and native libraries are just there to run applications on top of Google's custom virtual machine, Dalvik, which is optimized for running on embedded hardware.
  • There is no mention of any native applications or of the ability to write and install native applications.
  • Particularly, there's no mention of a browser application. Given Google's involvement in Firefox and their recent announcement of a mobile Firefox, this is somewhat surprising. Browsers are increasingly important for high end phones; without a good, modern browser, Android is doomed to competing with low end feature phones. (Update: the browser seems to be WebKit, the same engine that powers the iPhone browser and the S60 browser.)
  • Google has chosen not to implement full Java or any of the ME variants. This is, in my view, very bad and unnecessary.
  • Instead a small subset of the Java API is implemented. Probably the closest match is the J2ME CDC profile (so why not go all the way and save us developers a few headaches?).
  • Additionally Google has bundled a few external libraries (httpclient, junit and a few others). That’s nice since they are quite good libraries. I’m especially fond of httpclient, which I miss very much when doing J2ME CLDC development.
  • The bulk of the library consists of android.* packages that control everything from power management and SMS to the user interface.
  • I did not spot any OSGi implementation in the package; Google seems to intend to reinvent components and package management. This is disappointing since OSGi is very popular across the Java spectrum, including J2ME, where it is already shipping in some products (e.g. the Nokia E90).

In my opinion this is all a bit disappointing. Not aligning with an existing profile of Java is a regrettable design choice. It makes Android incompatible with everything else out there, which is unnecessary in my view. Additionally, Android seems to duplicate a lot of existing functionality from full Java, J2ME and various open source projects. I'm sure that in each case there is some reason for it, but the net result seems to be the reinvention of a lot of wheels. Overall, I doubt that the Android APIs are significantly faster, more flexible or more usable than what is already out there.

On the other hand the platform seems to be open, so not all is lost. This openness comes, however, with a few strings attached. Basically, it relies on Java's security system. You know, the same one that is used by operators and phone vendors to completely lock down J2ME and restrict access to interesting features (e.g. making phone calls, installing applications). I'm not saying that Google will do this, but they certainly enable operators and phone vendors to do it for them. This is not surprising since in the current market operators insist on this, especially in the US. The likely result is that Android application developers will have to deal with locked down phones just like J2ME developers have to deal with them today.

The choice of the Apache 2.0 license is a wise one since it is a very liberal license that will make it easy for telecom companies to integrate Android with their existing products. Provided that the Android APIs are reasonably well designed, it may be possible to port some or all of it to other platforms. The Apache license ensures that doing so poses minimal risk for underlying proprietary platforms.

Additionally, the Apache license allows for some other interesting things to happen. For example, there's the Apache Harmony project, which is still working on a full implementation of Java. Reusing this work might of course also make much of android.* redundant. There is also a lot of interesting mobile Java code under Eclipse's EPL, which is similar to the Apache license. This includes eSWT, a mobile version of the Eclipse user interface framework SWT. Eclipse also provides a popular OSGi implementation called Equinox. Again, the lack of OSGi is a missed opportunity and I don't care what they put in its place.

Frankly, I don't understand why Google intends to ignore the vast amount of existing implementation out there. It seems like a bad case of not-invented-here to me. Ultimately this will slow adoption. There are already too many Java platforms for the mobile world and this is yet another one. The opportunity was to align with mainstream Java, as Sun is planning to do over the next few years. Instead Google has chosen to reinvent the wheel. We'll just have to see how good a job they did. Luckily, the Apache license will allow people to rip this thing apart and do something more productive with it. OpenMoko plus some Apache-licensed Java code might be nice. Also, our Nokia Maemo platform can probably benefit from some components. Especially the lower level stuff they've done with the VM and kernel might be interesting.

KDiff3 to the rescue

I was struggling this evening with the default merge tool that ships with TortoiseSVN. It's not bad and quite user friendly. However, I ran into trouble with it when trying to review changes in a LaTeX file (don't ask, I still hate the concept of debugging and compiling stuff I would normally type in Word). The problem was that it doesn't support word wrapping and that the LaTeX file in question used one line per paragraph (which works great in combination with an editor that does soft word wrapping, e.g. jEdit).

A little googling revealed that the problem had been discussed on the TortoiseSVN mailing list and dismissed by one of the developers (for reasons of complexity). Figuring that surely somebody must have scratched this itch, I looked on and struck gold in the form of this blog post: KDiff3 – a new favorite.

The name suggests that this is a Linux tool. Luckily there is a Windows port as well, so no problem there. I installed it and noticed that by default it replaces the diff editor in TortoiseSVN (good in this case, but I would have liked the opportunity to say no here). Anyway, problem solved :-). A new favorite indeed.

Update:

Nice little KDiff3 moment. I did an update from svn and it reported a python file was in a conflicted state. So I dutifully right clicked and selected edit conflicts. This launched KDiff3, which reported: 4 conflicts found; 4 conflicts automatically resolved. It then opens into a four pane view (mine, base, theirs and merged) allowing you to easily see what the merged result looks like and what the conflicts were. OMFG! Where were you all this time, KDiff3!! Damn that is useful. The resolutions look good too. I remember using TortoiseSVN to do merges on a very large source base in my previous job and this is exactly what made them suck so much.

More ubuntu

I’ve given up on feisty. I’ve blogged several times now about my failure to install it properly. Today I gave it another try and partially succeeded before failing again.

I read somewhere that you can bypass the scanning-the-mirrors problem by disconnecting the network cable. You see, running ifdown eth0 is not good enough because ubuntu tries to outsmart you with its network manager. It's becoming more and more like windows: the assumption that the user is a complete idiot now dominates the whole installer. Anyway, unplugging the network forces ubuntu to acknowledge some hard reality.

So I ended up with a bootable ubuntu system this time (misconfigured and all). Great, only the network still didn't work properly. For some reason some sites load in the browser (e.g. google) but most do not. So I was in the weird situation that I could google for my problem and get a lot of hits, but be unable to access any of them. So I spent the whole morning booting windows and ubuntu repeatedly, each time trying to get some answers in windows (no network trouble there) and then messing with the ubuntu networking subsystems in linux.

I failed. After the fifth time I just decided not to bother anymore. Obviously ubuntu has some weird bugs in its network layer that I do not encounter with older versions or with windows. From what I googled I learned that there are many different networking problems in ubuntu. I now strongly suspect the DNS-related behaviour of my cable modem is the cause. Replacing the modem might solve my problems. But then again it might not. It's a motorola cable modem and the quality of the firmware is unquestionably sucky. I have to reboot this sucker quite often and have already decided to never buy anything with motorola firmware embedded again. Without a working network, ubuntu is basically not very interesting: I can't run updates or install software. If anyone has a proper solution, I'd be interested to hear it.
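
For what it's worth, a quick way to test the DNS theory (an assumption on my part; the resolver address below is OpenDNS, used purely as an example of a known-good public DNS server) would be something like:

  # if pinging an IP address works but pinging a hostname does not, DNS is the culprit
  ping -c 3 208.67.222.222
  ping -c 3 www.google.com
  # temporarily bypass the modem's DNS proxy by pointing straight at a public resolver
  sudo cp /etc/resolv.conf /etc/resolv.conf.backup
  echo "nameserver 208.67.222.222" | sudo tee /etc/resolv.conf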

Part two of my misery started when I began messing around with gparted. Basically I removed the linux partitions and then decided to give the space back to the ntfs partition. At this point gparted started throwing very scary messages at me about my ntfs partition halfway through resizing it. For about 15 minutes I assumed it had foobarred the partition table (very depressing, even though I have backups). Finally, after a reboot, the ntfs partition showed up fine in gparted (unmodified) and it was even mounted without being asked (a really annoying feature, but welcome in this case). The next problem was booting it. I relearned a few lessons about the MBR there (fdisk /mbr helped me out a few times when I was playing with slackware ten years ago). Basically the fix in windows xp is running fixmbr from the Windows Recovery Console. Until you do that, you are stuck with a broken grub that points to a deleted partition. For legal reasons (I assume) gparted and grub lack the feature of undoing the damage to the mbr. It took me about half an hour to figure that out and I'm now running my old windows desktop again.
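
(For reference, the rough recipe: boot from the windows xp cd, press R to enter the Recovery Console, select the windows installation, and run fixmbr to rewrite the master boot record; fixboot additionally repairs the partition boot sector if that got damaged too. Consider this a sketch, not an authoritative guide.)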

So I have three conclusions to add to my review a few weeks ago:

  • The networking subsystem has huge compatibility issues that likely affect many users. From what I encountered while googling, I learned that there are many different issues. The fixes I tried didn't work, which suggests that those issues are different from my specific one. Not good. The same modem and pc have booted older ubuntu releases (dapper), so it is a regression! My modem is quite common, so likely thousands of users are affected.
  • Gparted has some nasty issues and should not be used to resize ntfs partitions. You will risk losing all your data. I guess any partition resizer is better than none but this crap should not be put in front of end users.
  • In this form, ubuntu is guaranteed to cause lots of users to have a very negative introduction to linux.

I will try a future version again. For now, I’ve had enough.

Feisty fawn

I tried (again) to install ubuntu and ran into a critical bug that would have left my system unbootable if it weren’t for the fact that I know how to undo the damage the installer does. Since this will no doubt piss off the average linux fanboy, let me elaborate.

  • I ran into the “scanning the mirrors” bug. Google for “scanning the mirrors” + ubuntu and you will find plenty of material.
  • This means the installer hangs indefinitely trying to scan remote mirrors of apt repositories.
  • Predictably the servers are under quite a bit of load currently: this is extremely likely to not work for a lot of people right now. I recall running into the same issue a month ago with edgy when there was no such excuse.
  • The bug is that the installer should detect that things are not working and fail gracefully.
  • Gracefully in the sense that it should
    • Allow the user to skip this step (the close button was the only way I had to interrupt the scanning-the-mirrors procedure)
    • Never ever, ever, ever let the user exit the installer AFTER removing the boot flag on the ntfs partition but before installing an alternative bootloader.
    • Recover from the fact that the network times out/servers are down. There’s no excuse for not handling something as common as network failure. Retry is generally a stupid strategy after the second or third attempt.
    • I actually ran ifdown to shut the network down (to force it into detecting there was no connection) and it still didn’t detect network failure!

The scanning-the-mirrors bug is a strange thing. Ubuntu actually configured my network correctly and I could, for example, browse to Google. However, for some reason crucial server side stuff was unreachable. Since ubuntu never gave an error, I can't tell you what went wrong there. This in itself is a bug, since Murphy's law pretty damn much guarantees that potential network unreliability translates into users experiencing network problems during installation.

Could I have fixed things? Probably. Will I? Probably not; my main reason for trying out 7.04 was to verify that not much has changed in terms of installer friendliness since 6.10. All my suspicions were confirmed. In short, the thing is still a usability nightmare. The core problem is that the installer assumes the underlying technology works properly. In light of my 10+ years of experience with installing linux, this is extremely misguided. The installer merely pretends everything is OK. Sometimes it isn't, and that is where a usable system distinguishes itself from an unusable one by offering meaningful ways out. For example, display configuration failed (again, see my earlier post on the edgy installation), which means I was looking at a nice, blurry 1024×768 screen on a monitor with a different native resolution. I suspect the nvidia + samsung LCD combo is quite popular, so that probably means lots of users end up with misconfigured ubuntu setups. The only way to fix it is after the installation, using various commandline tools. Been there, done that. The resolution change dialog is totally inadequate because it mistakenly assumes the screen was configured correctly and only offers the resolutions it detected (i.e. 640×480 to 1024×768 @ 60Hz; no hardware has shipped this decade with that as its actual maximum specs).

I found the two main new features in the installer misguided and confusing. The installer offered to migrate my settings and create an account. The next screen then asks me who I am. Eh… didn't I just provide a user/password combo? And BTW, what does it mean to migrate My Documents? Does it mean the installer will go ahead and actually copy everything (which I don't want, it's about 80GB) or will it merely mount the ntfs disk (which would be useful)? I need a little more info to make an informed decision here.

The other main new feature is that the installer actually advertises the binary drivers that most end users would probably want installed by default. That's good. The problem is that the dialog designed to offer them is very confusing (using terms such as "unsupported") and also that the drivers are not actually on the cd. In other words, I couldn't actually install them because of the same mysterious network problem outlined above. Similarly, the dialog doesn't seem to have a good solution for network failure. The reality is that these drivers are the only thing the hardware vendors support (i.e. they offer better support for the hardware, and from the actual vendor that provided it). The problem of course is that they are ideologically incompatible with some elements of the OSS community. Which probably led to the (no doubt hotly debated) blob of text explaining to the user that it is not recommended to install the unsupported software, which happens to be the only way to get your $300 video card working as advertised. The dialog does not do a good job of explaining this, which is its primary job.

Using rsync for backup

As you may recall, I had a nice incident recently which made me really appreciate the fact that I was able to restore my data from a backup. Over the years I've sort of cobbled together my own backup solution using rsync (I use the cygwin port to windows).

First a little about hardware. Forget about using CDs or DVDs. They are just too unreliable. I'm currently recovering data from a whole bunch of CDs I had and am horrified to discover that approximately one third have CRC errors on them. Basically, the light sensitive layer has deteriorated to the point that the disc becomes unreadable, sometimes within as little as two years. I've used various brands of CDs over the years and some of them have higher failure rates than others, but no brand seems to be 100% OK. In other words, I've lost data stored on pretty much every CD brand I've ever tried. Particularly Fujifilm (1-48x) and unbranded CDs are bad (well over 50% failure rate); on the other hand, most of my Imation CDs seem fine so far. Luckily I didn't lose anything valuable/irreplaceable. But it has made it clear to me not to trust this medium for backups.

So, I've started putting money into external harddrives. External drives have several advantages: they are cheap, they are big and they are much more convenient. So far I have two usb external harddrives: a 300GB Maxtor drive and the 500GB Lacie Porsche drive I bought a few weeks back. Also I have a 300 GB drive in my PC. Yes, that's 1.1 TB altogether :-).

The goal of my backup procedures is to be 'reasonably' safe. Technically, if my apartment burns down, I'll probably lose all three drives and all data on them. Moving them offsite is the obvious solution but this also makes backups a bit harder. Reasonably safe, in my view, means that my backed up data survives total media failure of one of the drives and gives me an opportunity to get back to the reasonably safe state again. When I say my data, I'm referring to the data that really matters to me: anything I create, movies, music, photos, bookmarks, etc.

This data is stored in specific directories on my C drive and also in a directory on my big Lacie drive. I use the Maxtor drive to back up that directory and use the remaining 200GB on the Lacie drive for backing up stuff from my C drive.

All this is done using commands like this:

rsync -i -v -a --delete ~/photos/ /cygdrive/e/backup/photos >> /cygdrive/e/backup/photos-rsync.txt

This probably looks a bit alien to a windows user. I use cygwin, a port of much of the gnu/linux toolchain that layers a more linux-like filesystem on top of the windows filesystem. So /cygdrive/c is just the equivalent of good old c:\. One of the ported tools is ln, which I've used to make symbolic links in my cygwin home directory to the stuff I want to back up. So ~/photos actually points to the familiar My Pictures directory.
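
For example (the username and exact windows path below are placeholders for wherever your My Pictures folder lives):

  ln -s "/cygdrive/c/Documents and Settings/yourname/My Documents/My Pictures" ~/photos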

Basically the command tries to synchronize the first directory to the second directory. The flags ensure that the content of the second directory is identical to that of the first directory after execution. The --delete flag allows rsync to remove stuff that isn't in the first directory. Rsync is nice because it works incrementally, i.e. it doesn't copy data that's already there.

The bit after the >> just redirects the output of rsync to a text file so that afterwards, you can verify what has actually been backed up. I use the -v flag to let rsync tell me exactly what it is doing.

Of course typing this command is both error-prone and tedious. For that reason I've collected all my backup related commands in a nice script which I execute frequently. I just turn on the drives, type ./backup.sh and go get some coffee. I also use rsync to back up my remote website, which is easy because rsync also works over ssh.
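
For what it's worth, a minimal sketch of what such a script might look like (the directory names and the remote host below are just placeholders, not my actual setup):

  #!/bin/sh
  # backup.sh - sync the directories that matter to the external drive (run from cygwin)
  BACKUP=/cygdrive/e/backup

  # local directories, symlinked into the cygwin home directory as described above
  for dir in photos music documents; do
      rsync -i -v -a --delete ~/$dir/ $BACKUP/$dir >> $BACKUP/$dir-rsync.txt
  done

  # the remote website, over ssh (host and path are placeholders)
  rsync -a -v --delete example.org:public_html/ $BACKUP/website >> $BACKUP/website-rsync.txt

  # plus the subversion dump described below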

Part of my backup script is also creating a dump of my subversion repository. I store a lot of stuff in a subversion repository these days: my google earth placemarks, photos, documents and also some source code. The subversion work directories are spread across my harddrive but the repository itself sits in a single directory on my C drive. Technically I could just back that up using rsync. However, using

svnadmin dump c:/svnrepo | gzip > /cygdrive/e/backup/svnrepo.gz

to dump the repository allows me to actually recreate the repository in any version of subversion from the dump. Also, the dump file tends to be nicely compressed compared to either the work directory or the repository directory. Actually, the work directory is the largest because it contains two copies of each file (the working file plus a pristine base copy). In the repository everything is stored incrementally, and in the dump gzip squeezes it even further. The nice thing about a version repository is of course that you also preserve the version history.
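
Restoring from such a dump is the mirror image (the target path below is just an example):

  # recreate the repository from the gzipped dump
  svnadmin create c:/svnrepo-restored
  gunzip -c /cygdrive/e/backup/svnrepo.gz | svnadmin load c:/svnrepo-restored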

porsche gets a good testdrive

It was only two days ago that I bought myself a Lacie Porsche 0.5 TB usb drive. Yesterday evening, after a reboot caused by an apple security update, weird shit started to happen. Basically windows informed me that "you have 3 days to activate windows". WTF! So I dutifully clicked "activate now", only to watch a product key being generated and the dialog closing itself, rather than letting me review the screen and opt for an internet or telephone based activation. After that it informed me that I had three more days to activate. Very weird and disturbing! A few reboots and BSODs later (which had now also started to appear on pretty much every reboot), I took a deep breath and decided that the machine was foobarred and I needed to reinstall windows. I suspect the root cause of my problems was a reset a while back which resulted in a corrupt registry and repeated attempts by windows to repair it before booting normally. I thought the problem was fixed but apparently the damage was more extensive than I originally thought.

Considering I had a few more days to reactivate, which despite my attempts I could not do, I decided to back up everything I could think of. I had about 100 GB left on the external drive; bought it just in time :-). Copying that amount takes shitloads of time. Basically most of the backup ran overnight with the assistance of the cygwin port of rsync. After reinstalling windows earlier this evening (which activated fine, to my surprise), I got to work reinstalling everything (I have a few dozen applications I just need to have) and moving back all my data. Some interesting things:

  • Luckily I thought of backing up my c:\drivers dir, in which I had stored various system level drivers for my motherboard and other stuff that I downloaded when I installed the machine a year ago. This included the essential driver for the lan, without which I would have had no network after the install and no way to get the driver onto the machine (or to activate windows). Phew.
  • I reapplied the itunes library migration procedure I described last year (and which still gets me loads of hits on the blog). It still works and my library, including playlists, ratings and play counts, imported fine into my new itunes install. It would be nice if Apple were a bit more supportive of recovering your stuff after a reinstall.
  • After installing firefox 2, I copied back my old profile folder and firefox launched as if nothing had happened. Bookmarks, cookies, passwords, extensions all there :-). Since I practically live in this thing, that pleased me a lot.
  • Then I reinstalled gaim and copied back the .gaim directory to my user directory. Launched it and it just worked. Great!
  • Same with jedit.
  • Then I installed steam, logged in and ran the restore tool on the 13GB backup it had created earlier. It seems to work fine and I'm glad that I don't have to wait a few weeks for the download to finish. OK, the restore was not fast either, but it got the job done.

Lesson learned: backups are important. I had the opportunity to create them when it turned out I needed them, but I should have been backing up more regularly. A more catastrophic event would have caused me data loss and much more annoyance.

So a big thank you to Bill Gates et al. for wasting my precious spare time with their rude and offensive activation crap. Fucking assholes! I'm a paying customer and very pissed. I will remember this waste of my time and genuine disregard for my rights when making any future microsoft purchasing decisions. And yes, that probably means lost revenue for you guys in Redmond. I've adopted open source for most of my desktop apps by now. There are only two reasons for me to boot windows on my PC: games and photoshop. I understand the latter is now supported by wine and I'm much less active with gaming than I used to be. Everything else I use either runs on linux or has great alternatives. But for the moment, I'll keep using windows because I'm lazy.

another ubuntu installation test

Currently ubuntu is my favourite linux distro. I don't use it, but I do install it and other distros from time to time to see where linux is going. I've been doing this a couple of times per year since about 1995, when I was biking home from university with 30 or so slackware floppy disks in my bag. So, I am not a newbie!

Things have been getting a lot easier since then. I bought a new PC last year and recently wiped everything from the old one so I can sell it, eventually. It's an AMD 2200+ with 1 GB of ram, two hard drives (80 and 120 GB), an nvidia 4400, a cd burner, a dvd burner and some minor heat-related instability (it works fine with the cover off). The stability issue is the main reason I still have this machine; I just feel bad about selling something with issues. If you are interested in taking over the hardware, let me know. I promise full disclosure of any hardware issues I'm aware of and you'll inherit some fine components at a bargain price (negotiable). It should still make an excellent linux desktop or server. I suspect the issue is either the power supply or the cpu fan. Should be easy to fix by somebody with more hardware tinkering experience than me.

In short, it’s ideal for one of my linux tests. So I downloaded the ubuntu 6.10 desktop iso, burned it and popped in the cd. Below is a chronological account of what I did and how that worked out.

  • It boots into a graphical desktop from the install cd. This is nice but it also takes about five minutes for it to properly boot.
  • On the desktop sits a big install icon. I guess I’m supposed to click it even though I already selected install in a previous menu.
  • The wizard launches and leads me through new and improved dialogs for setting the usual stuff (keyboard, language, user and password) and then presents me with a menu with three options: use the first hard drive, use the second one, or partition manually. A "use both" option seems to be missing, so I go manual. This seems to work nicely, although I now find out that I should have picked a different cd if I had wanted soft raid 0. Ah, too bad.
  • I partition it to have root and swap on drive 1 and home on drive 2 (the 120 GB one).
  • I click next and the installer goes to work. This takes about 20 minutes, during which I have the opportunity to use the fully functional desktop that runs the installer. It seems to include a couple of games, a browser and some other stuff. Fun, I guess, but cdroms don't respond nicely to multiple processes doing stuff at the same time, so I leave it alone.
  • Done. Reboot!
  • Oops, it's 2007 and ubuntu still fucks up display configuration. I smelled a rat when I wasn't presented with an option to configure the screen, and indeed ubuntu botched it completely. It sort of works, but 1024×768 at 60Hz sort of sucks on a CRT that can do 1280×1024 at 85Hz. It hurts my eyes! And there's no option to fix it in the menu. There is a resolution dialog but it only allows me to select even worse resolutions. Naturally, the excellent binary driver nvidia provides is not installed, so I can't use any of the high end features of my once very expensive nvidia card. Come on guys, fix this. Both monitor and videocard are plug and play; you have no good excuse not to recognize either. I guess removing the, admittedly very crappy, monitor and videocard configuration from the installer was a bad idea, because now I'll have to do things manually and find out how much X configuration still sucks.
  • It informs me there's 160 MB worth of updates, so I do the autoupdate. Apparently there were quite a few issues with 6.10 when it was released, causing some people to recommend the previous version. Let's see how good things are after the update.
  • Wow! A linux update that requires me to reboot. That's a first. I guess it is getting better at reproducing the windows experience :-). OK, let's do this.
  • Yikes! So I click around a bit and suddenly the desktop freezes. The mouse still works but clicking anywhere doesn't do anything. I can switch to the text console using the ctrl+alt+f1 keybinding. So a default install, with no custom software installed, produces a freeze just from clicking around in the menus a bit.
  • Ctrl+alt+del has not been implemented, so I use the good old reboot command from one of the consoles: "sudo reboot now". Life is great: zero things installed or configured and I need a command line already. This had better not happen again!
  • OK, rebooted. Let's fix the fucking screen first. Synaptic package manager sounds right; let's search for nvidia. OK, clickety, nvidia-glx sounds right. Apply means install, I guess. So much for usability :-).
  • Great, the video driver is installed but X was not restarted. I guess that requires another manual override. I know there are ways to do this without a restart, but the option is not provided in the shutdown screen. So here goes another required reboot. This is feeling more and more like windows by the minute. Random freezing, check. Required reboots (three so far), check!
  • Oops, I still can’t fix the default resolution. This is bad! Effectively I am now stuck with a broken desktop that gives me headaches. I will need to get under the hood to fix it. I know how to do it but it just pisses me off that I have to do this on what is supposedly the most user friendly linux distro.
  • As explained here, you need to fix things from the commandline (a sketch of the commands involved follows after this list). The good news: whoohoo, 85 Hz. The bad news: still no 1280×1024 in the screen resolution thingy.
  • This page has the instructions for launching the good old X configuration tool. This sucker hasn't changed substantially in ten years and insists on leading you through a whole bunch of screens for mouse and keyboard (which were detected just fine). It brings tears to my eyes that this tool actually wants me to enable the "emulate three buttons" option. Anyway, hitting enter works as advertised and I now run in the right resolution and refresh rate.
  • Let's launch firefox… seems to work and it's 2.01! Good!
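
For the record, the commandline fixing referred to in the list above boiled down to roughly the following (a sketch; the exact values depend on your monitor):

  # the nvidia-glx package was already installed via synaptic; X just wasn't using it yet
  # back up the current configuration before touching it
  sudo cp /etc/X11/xorg.conf /etc/X11/xorg.conf.backup
  # re-run the X configuration wizard and answer its many questions
  sudo dpkg-reconfigure xserver-xorg
  # if the refresh rate is still wrong, edit the Monitor section of /etc/X11/xorg.conf
  # by hand (HorizSync / VertRefresh) and set Driver "nvidia" in the Device section
  sudo nano /etc/X11/xorg.conf
  # restart X afterwards (log out and back in, or ctrl+alt+backspace)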

So much for the default install. Overall, this is not good. X configuration, after ten years of linux distributions, is still a mess. The basic job of configuring generic plug and play monitors, as well as the installation of essential drivers, is still a mess. Ubuntu has made this more "userfriendly" by removing the tools needed to fix it from the setup. This just makes them harder to find.

On the other hand, once installed, ubuntu is a pretty nice distribution. Of course the cd install is a bit limited but I’m fixing this right now by installing a few hundred MB worth of add on packages which are now being downloaded over the network.

Some things I noticed good and bad:

  • GOOD Graphical install seems more user friendly.
  • NOT SO GOOD The graphical install takes ages to launch. Effectively it adds five to ten minutes to the whole installation procedure.
  • VERY GOOD The graphical install boots into a fully functional live desktop with network connectivity and a browser. Though I did not need it this time, I have had to abort installations in the past in order to boot into windows to be able to access online documentation. So this is great!
  • EXTREMELY BAD Screen configuration is not part of the setup procedure and the automated routines fail miserably. Predictably, my monitor and graphics card are misconfigured and the only fix involves entering commands on a commandline. I think it is safe to say that this part of the installation is beyond most end users' capabilities. I don't see how a non technical user could fix their screen settings without somebody holding their hand while they execute cryptic commands on the commandline. This is a regression from the last time I installed ubuntu.
  • BAD I don’t like that the screen resolution tool provides no obvious references for fixing the extremely likely case of a misconfigured X desktop. Both the binary nvidia driver and a properly configured CRT are things most users would want to fix.
  • BAD Installing nvidia-glx via synaptic does not trigger the additional X configuration needed to actually use it, which is obviously implied by this action.
  • BAD X froze on me after the updates but before I had configured anything. I had no obvious way to fix this other than going to the commandline and executing a command that no end user should have to know. If I were an end user, I would have reset the pc.
  • NOT SO GOOD The shutdown screen does not contain an option to restart X. True, you shouldn't need it, but I needed it anyway during screen configuration, so it should be there.
  • BAD Synaptic is hopelessly complicated and lacks a few simple coarse-grained options, like just installing the bloody KDE desktop without me micromanaging all the details.

BTW. I still think the shitty brown sucks.

UPDATE: installing additional software went fine, but hibernating the pc resulted in an obscure error. If it doesn't work, don't put it in the fucking shutdown screen!

reinstalling windows sucks

Luckily my parents' laptop came with a windows xp pro cd and a valid product key. So, after backing up the important files using ubuntu, I proceeded to install windows. I had forgotten how annoying installing windows can be.

After having seen ubuntu, the experience is, well, extremely user unfriendly and extremely likely to end in disaster unless you know what to do. For the record, the ubuntu live dvd recognized all hardware out of the box without any intervention. Impressive.
The easy, but tedious, part is putting the windows cd in the tray. Then the installer gives you a few seconds to trigger it by hitting enter. Then you are confronted with a slowly loading, text-only wizard where you have to do something very risky: remove a partition and create a new one. The provided tool for this is outdated by the standards set by linux distributions. The point of this step is to start with a clean disk, so getting it right is important. 99% of the user population would at this point likely make the wrong choice and either end up installing a very old xp over their existing installation or installing it in some weird location on a harddrive that is still full of spyware. OK, been there, done that, so clickety and a nice clean drive.

Then it starts copying. It reboots. More copying, and then some questions about where I am. More copying. Product key. More copying. Modem & network settings. This fails, but I end up being online anyway (the network cable is plugged in, so how hard can it be?). Another reboot. Then, during the first boot, a wizard with annoyingly loud, shitty music in the background. At this point my laptop's volume buttons are not yet operational (you have to be logged in?). Activation, and finally done. It takes about one and a half hours and requires constant fiddling with wizards, so you can't just leave it alone to do its thing.
Oh wait, we've just started. This is an old windows cd, so I do not yet have the billion security updates that are the whole point of this operation and that do not install themselves without manual intervention. OK, two hours later I'm patched all the way up to two years ago (sp2). Now the auto update finally kicks in. Another hour, three reboots and 50+ security updates (not kidding) later, it is finally over. I think.

Oh wait. Much of the hardware is actually not working correctly. Stuff like my usb ports, pc card slot and video graphics chip are all not working correctly. Nor is my wireless card (an smc pcmcia card; plugging it in hangs the system). And damn, this internet explorer default page is annoying. And thanks for showing me this unavoidable "welcome to windows" promo again (you have to watch it to get rid of the annoying traybar icon). And no thanks for the hotmail/msn thingy.
Turns out the compaq support site has a nice list of stuff you could install. The problem with this site is twofold: there are multiple things to choose from without any clear guidance on what to install and what not to install, and all downloads (and there are a lot) are conveniently named sp4325345345.exe, where the number is the only thing that varies. The idiot who invented that naming scheme should be taken out and shot. Anyway, I installed half a dozen of their driver packs and everything seems to work now. At least several of the downloads there are crucial to the correct operation of the laptop.

No way my parents could have done all of this on their own. The sad truth with windows is that if it is not configured correctly out of the box, it is really difficult to get everything working correctly. My mother was asking me this weekend if she could do it herself or go to a shop and have somebody do it. If only that were possible :-). In terms of labour cost, just waiting for all of the above to complete is worth the price of a nice low end PC. And there is plenty of opportunity for mistakes, so realistically it is probably cheaper to just go to dell and buy new hardware & software.

ubuntu: debian still sucks, nothing to see here

One of the nice things about buying a new pc is that you have an old pc to mess with. Having backed up the most important stuff, my old machine is now the victim of some random linux installer abuse. Right now I'm installing ubuntu, a debian-derived linux distribution. It's been a few years since I ran linux outside of vmware (basically since I bought the machine I am now using for my linux install). I used to have more time to mess with trying out this kind of stuff. I know from experience that getting linux to work is easy and getting it to work properly is very difficult. Presumably, ubuntu should make this easier, but let's see what we end up with. I actually typed most of this review during the install; plenty of time for that.

If you came here to read how beautiful ubuntu is, move on because the stuff below is probably a bit painful for you.

The download.

I opted for the bittorrent download of the 5.10 release. It's a 2.8GB download, so bittorrent is mandatory. I burned it to a dvd with my new drive.

Booting.

Insert the dvd in the drive, make sure bios is configured to boot from cd (on most systems the default is wrong) and reset.

The installer.

Here it gets interesting. I can select install, server or live dvd. Install seems a good plan. Up comes the text based installer. I was half expecting a graphical installer, so that is disappointing. Worse, the installer seems to be of the intimidating, piss-off-the-end-user variety. Luckily, I've seen worse (I installed slackware on a 486 SX 25 MHz once). Navigating is pretty straightforward if you've dealt with ms dos or similarly clumsy uis in the past. The only severe usability issue is going back: there's a back option on some screens but you can't get to it using the arrow keys. You have to use backspace, doh!

Progress bars, or lack thereof.

Another odd thing in the installer is that in between the screens where you are supposed to select stuff, you get these nice blue screens without any content whatsoever. For example, I'm currently running the disk partition tool and the screen has been very blue for the past ten minutes (of the ms bsod variety). I mean, at least display some text telling me that everything is fine and I should be patient.

Hardware detection.

My network cards are detected and configured using dhcp. Bonus points for that; nothing is worse than trying to fix linux problems offline. The usb mouse seems to work as well (its led is on), but I can't use it in the commandline ui.

Disk partitioning.

This tool, aside from the aforementioned HUGE usability problem, seems to behave rather nicely. The default is to resize my hdb1 partition, which supposedly makes it possible to leave my windows partitions alone. That's nice, but it takes a loooooong time. A warning might have been nice. Anyway, I remember the old days of manually partitioning using all sorts of obscure tools, including the commandline fdisk tools of both windows and linux. Again usability rears its ugly head. After resizing, the UI reappears with some worrying information about new partitions it is about to write on the (supposedly?) freed space. What's worrying is that it doesn't say how large each partition will be or what happened to the resized partition. Some confirmation that resizing worked as expected would have been nice. After some hesitation I select yes to indicate that it can do its thing. Had there been any important stuff on the partition, I would probably have ejected the ubuntu disc at this point. This is bad. This is a crucial phase in the installation and if something goes wrong, it will likely be here. Bonus points for functionality, but the usability totally sucks here. Partitioning is scary, especially with a tool you've never used before. I've seen it go wrong in the past.

Installing the base systems and copying remaining packages.

Finally some scrollbars. But no package selection at this point. That’s probably good as debian package selection is not something you want to put in front of users at this point. More on this later.

Timezone and user configuration, configuring apt.

I suppose this is necessary, but I'd prefer a real user interface. Also there's some odd stuff here, like having to know whether the hardware clock is set to gmt or not (it's not, I happen to know this). Ntp plus me telling it what timezone I'm in provides the same information. Finally it offers to configure a bootloader (grub) so I can choose to boot into linux or windows xp. That's a nice touch. Debian got this wrong last time I tried it and I had to fix LILO manually to get back to windows.

Time for a reboot.

The boot screen: pretty, if you like brown. And then back to the commandline UI in stylish bsod blue. It's now doing its post-installation routine, which appears to involve configuring a user (no root, ubuntu has no root!), installing all the debian packages and downloading a few new ones. I know how debian works, so this is not unexpected, but it is definitely not very user friendly. It involves lots of cryptic messages about various obscure packages being prepared, configured, etc.

Halfway through it comes up with a question about screen size. I select 1280×1024. I can't select a refresh rate, and indeed this proves to be configured wrong after the installation (60 Hz instead of 85 Hz). Then the install continues; no more questions.

Done!

Then suddenly it is done and the login screen appears. This is linux: no further reboots necessary, the installer finished without much ceremony and X was launched. The boot screen is nice, if you like brown. I log in with my user/password. Gnome appears to be configured apple style (menu bar at the top, taskbar at the bottom) and a popup informs me that 48 updates are available. Installing them seems to work fine, which proves that the network is indeed configured properly.

Configuring the screen properly.

60 Hz will give me a headache, so that needs to be changed. Upfront I'm not very hopeful that the tools have improved to the point where this can be done without manually editing X configuration files, but let's see how things have improved in the past few years.

Not much, apparently. The good news is that there is a resolution tool in system -> preferences. It even has a dropdown for the refresh rate. Only one item is in it: 60 Hz. Doh!

This is linux at its worst. It's not working and the provided tools are too crappy to solve the problem at hand. A search on the ubuntu site confirms that monitor configuration is undocumented. In other words, I'm on my own. Google brings up the solution, which indeed involves the commandline and hoping that the autodetection will magically work when tried again.

Of course it doesn't. Worse, I now understand why the installer tries to hide the inevitable sudo dpkg-reconfigure xserver-xorg. This basically is the good old XF86Config wizard. I have fond memories of toying with it in 1995 (slackware). It has gotten worse since. At the time it asked a few difficult but straightforward questions. The modern version presents you with a whole array of bullshit options and autodetection features that half work. Let's face it, if they worked you wouldn't be running the reconfigure in the first place. Forget about autoconfiguration: everything the installer figured out is now forgotten (with no obvious way to redo that part other than putting the backup back).

Essentially this horrible tool brings together everything that sucks about X in one convenient place. Mere mortals are guaranteed to be totally confused by this beautiful piece of shit that, after all these years, still survives in linux. The inability of the linux community to fix this once and for all is illustrative of the hopelessness of the whole concept of desktop linux. The linux solution to display configuration is to hide this tool instead of implementing an alternative. On the first go I did not manage to get the optimal refresh rate. On the second go I screwed up the display configuration. Copying back the backed up configuration did not fix the problem.

Ahem, a reboot seems to 'magically' fix the problem. At least, I'm back where I started (1280×1024 @ 60 Hz).

Ok, so much for wizards. I knew in advance that I was going to end up manually editing the display settings. For the record, this is where normal users either go back to windows or accept the headache. I know from experience that editing X configuration is a matter of trial and error. In my case, five reboots and the documentation for my plug and play m990 monitor did the trick. Ubuntu failed to set up my monitor's horizontal and vertical refresh rates, something it should have figured out from the plug and play information. OK, shit happens. The next problem is that the tool to fix this is reconfiguring the package, and doing that undoes most of the good work the ubuntu installer did (so it makes things worse). Solution: copy back the backup of the ubuntu configuration and edit it manually to fix the refresh rates (30-96 and 50-160 in my case). Then reboot, because misconfiguring X really screws things up to the point that a reboot is required to make X start again after you fix the configuration. Been there, done that before. At least the bloody wheel mouse works out of the box nowadays.
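
Concretely, the manual fix amounts to something like this (the sync ranges are the ones from my monitor's documentation, and the backup file name is just whatever you called your copy earlier):

  # put the installer-generated configuration back, then edit it by hand
  sudo cp /etc/X11/xorg.conf.backup /etc/X11/xorg.conf
  sudo nano /etc/X11/xorg.conf
  # in the Monitor section set the ranges from the monitor's manual:
  #   HorizSync   30-96
  #   VertRefresh 50-160
  # then restart X (or reboot) to pick up the new configuration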

Conclusions for the installer

Usability sucks, but the installer gets the job done anyway, except for configuring the screen (important). However, there are several major pitfalls you have to know how to avoid. The installer is not particularly informative about what it is doing, and needlessly verbose at the same time. Still, the defaults are sane, and a strategy of going with the obvious choices will work most of the time (if in doubt, hit enter).

The default theme is ugly. There's no other word for it. It looks like shit. Damn this is ugly. Yes, you can fix it; there are hundreds of shitty themes to select from, but the default is unbelievably ugly. It leaves no other conclusion than that the ubuntu people are (color) blind. Menu layout seems ok, although I have the feeling stuff is being hidden from me.

Configuring the screen properly means going back to the commandline. There is no excuse for this in 2006 and I knew this was going to happen. The provided solution (from the ubuntu forum; the official documentation is of no use here) corrupted my configuration to the point where X just wouldn't start anymore. Unbelievable, inexcusable.

It's 2006, ten years after my first slackware install, and I'm still messing with the X configuration the same way as ten years ago. X continues to suck.

And of course the installer fails to install the commercial nvidia driver (or even point me in the right direction). Amusingly, the documentation is full of helpful stuff you can do manually that IMHO the installer should do for me. What the fuck do I care about ideological issues with commercial stuff? I'm not a GPL communist. Give me the choice to install the driver that I most likely want. Why would I spend 400 euro on a video card and then opt not to run the software that is required to access the more interesting features of that card? Exactly: that's a very rare user.

OK on to the rest of the system.

Read-only ntfs support has been possible for years and even some experimental read/write capability exists these days. Not in ubuntu. Both my ntfs partitions are nowhere to be found. The system -> administration -> disks tool is as useless as the resolution tool; it fails to 'enable' the partitions. Yes, I know how to mount stuff from the commandline (roughly as sketched below), but Joe Average can't get to his ntfs files with ubuntu. Bad, but I can probably fix this.
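
For completeness, mounting an ntfs partition read-only by hand looks roughly like this (hda1 is just an example; use whatever device your windows partition actually is):

  # create a mount point and mount the windows partition read-only
  sudo mkdir /media/windows
  sudo mount -t ntfs -o ro,umask=0222 /dev/hda1 /media/windows
  # add a corresponding line to /etc/fstab to make it permanent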

Let's see about the sound card. It's a soundblaster audigy, but there's also an onboard sound card on the motherboard (I actually use both under windows). Pleasant surprise: ubuntu seems to have configured this correctly, succeeding where, so far, every version of knoppix has failed.

Good. So far I've been sceptical, but let's be positive. I have a working system, ubuntu has configured itself properly, my windows install still works and I have not lost any data.

Installing kde using synaptic.

Wow, this is easy. There's a separate distribution called kubuntu, which is just ubuntu with kde instead of gnome. If you install ubuntu, like I did, you only get gnome. Installing kde is as simple as installing the kubuntu-desktop package. This involves installing more packages from the dvd and downloading a few new ones. Altogether, including the downloading, this takes about 20 minutes (at 120 KB/second). I don't understand why the kde packages are not on the dvd, though; there's plenty of room. Anyway, I now have the latest kde and gnome on one machine. The KDE theme looks much better, even though it is not the default KDE theme.
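
In other words, either tick kubuntu-desktop in synaptic or do the equivalent from a terminal:

  sudo apt-get install kubuntu-desktop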

The menus in both kde and gnome are a mess. This is a linux problem in general and it’s not fair to blame this on ubuntu. But still, 90% of the crap in the menus probably shouldn’t be there.

Conclusions

The installer has lots of usability issues. Aside from not being graphical, it is confusing, misleading and asks a lot of stupid stuff. The partitioning tool has good functionality but also does a good job of scaring you with some misleading information.

Configuring X is still an issue. It's probably slightly better if you have an LCD screen (60 Hz is ok then).

Hardware support is pretty decent: it failed to detect the monitor, but the rest seems to work fine. It doesn't install the commercial nvidia driver that most users will want to use.

The ubuntu gnome theme is ugly.
The kde install went smoothly and the ubuntu kde theme is not that bad.