There’s some filthy stuff out there on the internet but this beats everything.
This article presents an elaborate and IMHO misguided approach to handling exceptions: ONJava.com: An Exception Handling Framework for J2EE Applications
The author poses the problem that handling exceptions is tedious and leads to lots of boilerplate code. His proposed solution is to use unchecked, runtime exceptions. His reasoning is flawed for a number of reasons:
- Most exceptions come from external components. When bad stuff happens, you’re supposed to do something about it (other than just logging). Thinking that bad stuff won’t happen is naive; it will. In most cases, the reason you get an exception is either that your assumptions were wrong (add some if statements to check) or that there is a real problem (something is misconfigured, the db is down, the network is down, …). In some poorly designed code there may be a third reason: the software is wrapping state information in an exception. Don’t do this, ever.
- You shouldn’t create new exception types if you can reuse existing ones. That leads to less boilerplate code and more clarity. Nothing is worse than having to figure out the cause of the cause of the cause of the exception that Tomcat logged.
- A good IDE makes handling exceptions really easy (in Eclipse, Ctrl+1 gives you handy quick fixes like “add throws declaration” and “add catch clause for exception”). If you’re typing all this stuff manually, you’re doing something wrong. That leaves the problem of code readability. Poorly written code tends to be unreadable; lots of exception handling code is a symptom, not a cause. If it’s unreadable, refactor it. In general, if your methods don’t fit on a 1600×1200 screen, you might want to start thinking about refactoring. If your classes regularly exceed 500 lines of code, you have design issues. What really makes code unreadable is excessive coupling and lack of cohesion. Refactoring is the solution.
- Unhandled exceptions end up in front of the user, in the log, or both, and can leave your application in an unexpected state. All of these things are bad. Users should never see a stack trace and should always get some kind of response from the application. Nothing is worse than clicking next and ending up on the same screen because some runtime exception prevented the server from doing anything useful with the request (I see this a lot).
So in short, use a decent IDE (generate the boilerplate code) and handle the exception instead of throwing it to the caller if you can. If your code is still unreadable, don’t make it worse by throwing unchecked exceptions.
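To make the point concrete, here is a minimal sketch of what I mean by handling the exception instead of throwing it to the caller: deal with the checked exception where you have the context to do something sensible, and don’t invent a custom unchecked wrapper. The class and method names here are made up for illustration, not taken from the article.

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class GreetingLoader {
    static final String DEFAULT_GREETING = "hello";

    // A missing or unreadable file is an expected failure mode here, so
    // we recover with a sane default instead of wrapping the IOException
    // in some custom runtime exception and hoping for the best.
    public static String loadGreeting(String path) {
        try (BufferedReader in = new BufferedReader(new FileReader(path))) {
            String line = in.readLine();
            return line != null ? line : DEFAULT_GREETING;
        } catch (IOException e) {
            // Log it and do something sensible rather than letting it
            // bubble up to the user as a stack trace.
            return DEFAULT_GREETING;
        }
    }
}
```

If the caller genuinely needs to know about the failure, rethrowing an existing type (say, IllegalArgumentException for a bad path) still beats creating YetAnotherException with a cause chain three levels deep.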
This article does a good job of explaining how to deal with dependency conflicts (trying to use multiple versions of the same library).
I use this nice browser extension called forecastfox that notifies you of the weather outside. Today it came up with an icon I had never seen before and that people in most parts of the world will never see. It’s really cold outside (-22°C) and forecastfox has an icon for that :-).
One of the nice things about buying a new pc is that you have an old pc to mess with. Having backed up the most important stuff, my old machine is now the victim of some random linux installer abuse. Right now I’m installing ubuntu, a debian-derived linux distribution. It’s been a few years since I ran linux outside of vmware (basically since I bought the machine I am now using for my linux install). I used to have more time to mess around with this kind of stuff. I know from experience that getting linux to work is easy and getting it to work properly is very difficult. Presumably, ubuntu should make this easier, but let’s see what we end up with. I actually typed most of this review during the install; plenty of time for that.
If you came here to read how beautiful ubuntu is, move on because the stuff below is probably a bit painful for you.
I opted for the bittorrent download of the 5.10 release. It’s a 2.8GB download, so bittorrent is mandatory. I burned it to a dvd with my new drive.
Insert the dvd in the drive, make sure the BIOS is configured to boot from cd (on most systems the default is wrong) and reset.
Here it gets interesting. I can select install, server and live dvd. Install seems a good plan. Up comes the text-based installer. I was half expecting a graphical installer, so that is disappointing. Worse, the installer seems of the intimidating, piss-off-the-end-user variety. Luckily, I’ve seen worse (I once installed slackware on a 25 MHz 486 SX). Navigating is pretty straightforward if you’ve dealt with MS-DOS or similarly clumsy UIs in the past. The only severe usability issue is going back. There’s a back option on some screens, but you can’t get to it using the arrow keys. You have to use backspace, doh!
Progress bars, or lack thereof.
Another odd thing in the installer is that in between the screens where you are supposed to select stuff, you get these nice blue screens (of the ms bsod variety) without any content whatsoever. For example, I’m currently running the disk partition tool and the screen has been very blue for the past ten minutes. I mean, at least display some text telling me that everything is fine and I should be patient.
My network cards are detected and configured using dhcp. Bonus points for that; nothing is worse than trying to fix linux problems offline. The usb mouse seems to work as well (its led is on), but I can’t use it in the command-line UI.
This tool, aside from the aforementioned HUGE usability problem, seems to behave rather nicely. The default is to resize my hdb1 partition, which supposedly makes it possible to leave my windows partitions alone. That’s nice, but it takes a loooooong time. A warning might have been nice. Anyway, I remember the old days of manually partitioning using all sorts of obscure tools, including the command-line fdisk tools of both windows and linux. Again usability rears its ugly head. After resizing, the UI reappears with some worrying information about new partitions it is about to write to the (supposedly?) freed space. What’s worrying is that it doesn’t say how large each partition will be or what happened to the resized partition. Some confirmation that resizing worked as expected would have been nice. After some hesitation I select yes to indicate that it can do its thing. Had there been any important stuff on the partition, I would probably have ejected the ubuntu disc at this point. This is bad: this is a crucial phase in the installation, and if something goes wrong, it will likely be here. Bonus points for functionality, but the usability totally sucks here. Partitioning is scary, especially with a tool you’ve never used before. I’ve seen it go wrong in the past.
Installing the base systems and copying remaining packages.
Finally some scrollbars. But no package selection at this point. That’s probably good, as debian package selection is not something you want to put in front of users here. More on this later.
Timezone and user configuration, configuring apt.
I suppose this is necessary, but I’d prefer a real user interface. Also there’s some odd stuff here, like having to know whether the hardware clock is set to gmt or not (it’s not; I happen to know this). NTP plus telling it what timezone I’m in provides the same information. Finally, it offers to configure a bootloader (grub) so I can choose to boot into linux or windows xp. That’s a nice touch. Debian got this wrong last time I tried it, and I had to fix LILO manually to get back to windows.
Time for a reboot.
The boot screen. Pretty, if you like brown. And then back to the command-line UI in stylish bsod blue. It’s now doing its post-installation routine, which appears to involve configuring a user (no root, ubuntu has no root!), installing all the debian packages and downloading a few new ones. I know how debian works, so this is not unexpected, but it is definitely not very user friendly. It involves lots of cryptic messages about various obscure packages being prepared, configured, etc.
It comes up with a question about screen size halfway through. I select 1280×1024. I can’t select a refresh rate, and indeed this proves to be configured wrong after the installation (60 Hz instead of 85 Hz). Then the install continues, no more questions.
Then suddenly it is done and the login screen appears. This is linux: no further reboots necessary, the installer finished without much ceremony and X was launched. I log in with my user/password. Gnome appears to be configured apple style (menu bar at the top, taskbar at the bottom) and a popup informs me that 48 updates are available. Installing them seems to work fine, which proves that the network is indeed configured properly.
Configuring the screen properly.
60 Hz will give me a headache, so that needs to be changed. Up front, I’m not very hopeful that the tools have improved to the point where this can be done without manually editing X configuration files. But let’s see how things have improved in the past few years.
Not much, apparently. The good news is that there is a resolution tool under system->preferences. It even has a dropdown for the refresh rate. Only one item is in it: 60 Hz. Doh!
This is linux at its worst. It’s not working, and the provided tools are too crappy to solve the problem at hand. A search on the ubuntu site confirms that monitor configuration is undocumented. In other words, I’m on my own. Google brings up the solution, which indeed involves the command line and hoping that the autorecognition will magically work when tried again.
Of course it doesn’t. Worse, I now understand why the installer tries to hide the inevitable sudo dpkg-reconfigure xserver-xorg. This is basically the good old XF86Config wizard. I have fond memories of toying with it in 1995 (slackware). It has gotten worse since. At the time it asked a few difficult but straightforward questions. The modern version presents you with a whole array of bullshit options and autorecognition features that half work. Let’s face it, if they worked you wouldn’t be running the reconfigure. Forget about autoconfiguration. Everything the installer figured out is now forgotten (with no obvious way to redo that part other than putting the backup back).
Essentially this horrible tool brings together everything that sucks about X in one convenient place. Mere mortals are guaranteed to be totally confused by this beautiful piece of shit that after all these years still survives in linux. The inability of the linux community to fix this once and for all is illustrative of the hopelessness of the whole concept of desktop linux. The linux solution to display configuration is to hide this tool instead of implementing an alternative. On the first go I did not manage to get the optimal refresh rate. On the second go I screwed up the display configuration. Copying back the backed-up configuration did not fix the problem.
Ahem, reboot seems to ‘magically’ fix the problem. At least, I’m back where I started (1280×1024 @ 60 Hz).
Ok, so much for wizards. I knew in advance that I was going to end up manually editing the display settings. For the record, this is where normal users either go back to windows or accept the headache. I know from experience that editing X configuration is a matter of trial and error. In my case, five reboots and the documentation for my plug-and-play M990 monitor did the trick. Ubuntu failed to set up my monitor’s horizontal and vertical refresh rates, something it should have figured out from the plug-and-play information. OK, shit happens. The next problem is that the tool to fix this is reconfiguring the package, and doing that undoes most of the good work the ubuntu installer did (so it makes things worse). Solution: copy back the backup of the ubuntu configuration and edit it manually to fix the refresh rates (30-96 and 50-160 in my case). Then reboot, because misconfiguring X really screws things up to the point that a reboot is required to make X start again after you fix the configuration. Been there, done that before. At least the bloody wheel mouse works out of the box nowadays.
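For reference, the manual fix amounts to editing the Monitor section of /etc/X11/xorg.conf. A sketch with my monitor’s ranges (the identifier will differ per system, and the sync and refresh ranges must come from your own monitor’s documentation, not mine):

```
Section "Monitor"
        Identifier      "Generic Monitor"
        HorizSync       30-96
        VertRefresh     50-160
EndSection
```

After saving, restart X (or just reboot, given how fragile a misconfigured X is) to pick up the new ranges.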
Conclusions for the installer
Usability sucks, but the installer gets the job done anyway, except for configuring the screen (important). However, there are several major pitfalls you have to know how to avoid. The installer is not particularly informative about what it is doing and needlessly verbose at the same time. However, the defaults are sane, and a strategy of going with the obvious choices will work most of the time (if in doubt, hit enter).
The default theme is ugly. There’s no other word for it. It looks like shit. Damn, this is ugly. Yes, you can fix it; there are hundreds of shitty themes to select from, but the default is unbelievably ugly. It leaves no other conclusion than that the ubuntu people are (color) blind. Menu layout seems ok, though I have the feeling stuff is being hidden from me.
Configuring the screen properly means going back to the command line. There is no excuse for this in 2006, and I knew this was going to happen. The provided solution (from the ubuntu forum; the official documentation is of no use here) corrupted my configuration to the point where X just wouldn’t start anymore. Unbelievable, inexcusable.
It’s 2006, ten years after my first slackware install, and I’m still messing with the X configuration the same way as ten years ago. X continues to be a pain to configure.
And of course the installer fails to install the commercial nvidia driver (or even point me in the right direction). Amusingly, the documentation is full of helpful stuff you can do manually that IMHO the installer should do for me. What the fuck do I care about ideological issues with commercial stuff? I’m not a GPL communist. Give me the choice to install the driver that I likely want. Why would I spend 400 euro on a video card and then opt not to run the software that is required to access the more interesting features of that card? Exactly: that’s a very rare user.
OK, on to the rest of the system.
Read-only ntfs has been possible for years, and even some experimental rw capabilities are available these days. Not in ubuntu. Both my ntfs partitions are nowhere to be found. The system->administration->disks tool is as useless as the resolution tool: it fails to ‘enable’ the partitions. Yes, I know how to mount stuff from the command line. But as for Joe Average, he can’t get to his ntfs files with ubuntu. Bad, but I can probably fix this.
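The command-line fix is a one-line addition to /etc/fstab. A sketch, assuming the windows partition is /dev/hda1 and mounting it read-only (the device name and mount point are examples; check your own partitions with fdisk -l first):

```
# example /etc/fstab entry: mount an ntfs partition read-only at boot
/dev/hda1   /mnt/windows   ntfs   ro,umask=0222   0   0
```

The mount point has to exist first (sudo mkdir /mnt/windows), after which sudo mount -a makes the files visible without a reboot.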
Let’s see about the sound card. It’s a SoundBlaster Audigy, but there’s also a motherboard sound card (I actually use both under windows). Pleasant surprise: ubuntu seems to have configured this correctly, succeeding where, so far, every version of knoppix has failed.
Good. So far I’ve been sceptical, but let’s be positive. I have a working system, ubuntu has configured itself properly, my windows install still works and I have not lost any data.
Installing kde using Synaptic.
Wow, this is easy. There’s a separate distribution called kubuntu, which is just ubuntu with kde instead of gnome. If you install ubuntu, like I did, you get only Gnome. Installing kde is as simple as installing the kubuntu-desktop package. This involves installing more packages from the dvd and downloading a few new ones. Altogether, including the downloading, this takes about 20 minutes (at 120 KB/s). I don’t understand why the kde packages are not on the dvd though; there’s plenty of room. Anyway, I now have the latest kde and gnome on one machine. The KDE theme looks much better, even though it is not the default KDE theme.
The menus in both kde and gnome are a mess. This is a linux problem in general and it’s not fair to blame this on ubuntu. But still, 90% of the crap in the menus probably shouldn’t be there.
The installer has lots of usability issues. Aside from not being graphical, it is confusing, misleading and asks a lot of stupid stuff. The partitioning tool has good functionality but also does a good job of scaring you with misleading information.
Configuring X is still an issue. It’s probably slightly better if you have an LCD screen (60 Hz is ok then).
Hardware support is pretty decent: it failed to detect the monitor, but the rest seems to work fine. It doesn’t install the commercial nvidia driver that most users will want to use.
The ubuntu gnome theme is ugly.
The kde install went smoothly, and the ubuntu kde theme is not that bad.
With living in Finland comes the inevitable sauna experience (apparently, Finland has an estimated 2 million saunas, one for every three persons). Today I had a work-related meeting which finished with sauna and dinner. That was quite an experience. The facilities at Båtvik included a pool, an electric sauna and, most importantly, an old-fashioned smoke sauna. Very nice! I’m sold; I had no idea what I had been missing out on. The concept of getting naked with your (male) colleagues and sitting together in a dark, hot room full of smoke may seem unappealing, but it’s really nice. I had the complete experience, including whipping myself with birch leaves. Standing naked outside afterwards is surreal. Strangely enough, it takes several minutes before your body notices that it is actually pretty cold outside. Sadly, while the facilities were on the seaside, the sea was frozen solid, so we couldn’t skinny dip, but I’m sure that would have added to the experience.
Sadly my apartment has no sauna (as is common for newer apartments here), nor does the building have shared facilities but we do have a sauna at work that I might try. Though going outside naked is probably not really socially acceptable in the middle of Helsinki.
My new PC has an smc wlan pci card. It seems to work very nicely, with none of the problems my previous siemens usb stick had. However, I appear to be not connected, currently. Which is strange, since I seem to have no problem browsing the web, downloading stuff, etc. But the windows wireless icon insists on displaying a red cross (usually that means trouble). Weird. If I double-click it, it will confirm that it isn’t connected, along with the text “you are connected to this network” below the text “not connected”. It also displays four out of five green bars, which tell me the (dis)connection is excellent. Ah well, it’s only service pack 2.
Update 30/07/2009: I just bought an imac and moved the same, but now consolidated, library over to it. Check out the instructions here.
Whoohoo! My new hardware arrived last week. I’ve been busy playing with it, which explains the small delay in posting.
Right now I am still going through the tedious procedure of getting everything the way I want it. I have a local network so I can access my old PC. However, dragging my external HD between the two machines is much faster.
Tediousness includes copying my itunes library. Tricking itunes into accepting the old library is somewhat of a challenge, but that’s what google is for. Since I found google’s answers a bit disappointing (lots of “drag this folder there” type of stuff from Apple users), I’ll post some detailed instructions for real users who do not “consolidate” to the itunes folder but choose to keep their music organized manually. To add some difficulty, my new machine has no second harddrive, so the paths are different after copying.
If all goes well everything is moved (music, playlists, play statistics, ratings) AND I can sync my ipod with the new pc without that requiring it to be wiped and refilled with the moved library. I’m moving the library, not recreating it.
The iTunes library consists of just two files, plus its own itunes music folder and whatever external directories you imported (two in my case). One of the two files is a binary file; the other is an xml file with data on all your songs, including path names, statistics, ratings, etc. Essentially, the xml file contains everything we want to migrate except for the mp3s. Unfortunately, moving the itunes library is not as simple as copying the files to the new machine. Sadly, Apple deliberately made it hard to do what you are about to do. So here’s a step-by-step guide (windows specific, though the Apple version is probably about the same):
- At all times, keep at least one intact backup of all files mentioned in this post. Never work on the originals. Preferably, leave the original library untouched; you can always go back to that.
- Start by copying your mp3 folders to your new machine. That may take a while. Make sure they are where you want them to be. It took 20 minutes for my folders using an external HD, not counting the time it took to create the backup from scratch on the external hd (basically I used my incremental backup). Also copy both iTunes files (xml and itl) and the itunes mp3 folder (if not empty) onto the external hd.
- Now download, install, run and close itunes. It will create an itunes directory for you the first time it starts; that’s where it will look for its files. Replace the stuff inside this directory (My Documents\My Music\iTunes) with the backups on your external hd (including the itunes music folder). Now here comes the tricky part. Thanks to this post for putting me on the right track! DO NOT start itunes again until after the steps below.
- First fix the pathnames in the xml file: they still point to the old location. Open the file in a capable editor; the thing to look for is search-and-replace functionality. Search and replace the parts of the path names that are now different: your itunes music folder and any other folders you imported in your old library. Save the file.
- Now this is important: iTunes will ignore whatever path info is in the xml file, unless the itl file is corrupted. We can arrange that! Open the itl file in an editor, delete the gibberish inside, and save. Your itl file is now corrupted; normally this is a bad thing, but you still have the xml file (and a backup of the itl).
- Start itunes. It will ‘import’ your music and afterwards complain that the itl file is corrupted; let it fix it.
- Check if everything is there. In my case I messed up the search and replace and some files were missing. Just go back a few steps, copy your backups again and retry.
- Done. Everything is now on the new PC. What about the ipod? Just plug it in! You already installed iTunes on the new machine, so you have the drivers for your ipod. The key or whatever itunes uses to recognize your ipod is in the xml file, and now also in the recreated itl. Apparently the xml file is sort of a backup of the itl; I suspect the itl is a bit more efficient to manipulate programmatically. I have no idea if this preserves any itunes store stuff you purchased. Presumably, that involves deauthorizing your old machine and authorizing the new one. I never used the itunes store, so it’s not an issue for me.
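If you’d rather script the path fix-up from the steps above than trust your editor’s search-and-replace, a tiny program does the same thing. This is a minimal sketch: the file name, old prefix and new prefix are examples, not the actual values from my setup, so substitute your own.

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class FixLibraryPaths {
    // iTunes stores track locations as paths inside the xml file, so a
    // plain textual replace of the old prefix is enough to repoint them.
    public static String rewrite(String xml, String oldPrefix, String newPrefix) {
        return xml.replace(oldPrefix, newPrefix);
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical library location and prefixes -- adjust to your setup.
        Path library = Paths.get("iTunes Music Library.xml");
        if (Files.exists(library)) {
            String xml = new String(Files.readAllBytes(library), "UTF-8");
            Files.write(library, rewrite(xml, "D:/Music", "C:/Music").getBytes("UTF-8"));
        }
    }
}
```

Run it against a copy of the xml file, never the original, per the backup rule at the top of the list.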
The only thing I lost in the transition is some iTunes preferences that are easy to restore. For example, I had some of my playlists set to shuffle; the imported playlists no longer had shuffle enabled. Big deal. The preferences probably aren’t part of the library. I noticed that the shuffle settings do not sync to the ipod either. This is actually annoying, because the shuffle setting is deep down in some menu on the ipod and I only want to shuffle playlists. I like my album songs served up in the order that they were put on the album.
I’ve used winamp for most of the past decade (since 1996, I think). Only when I got my ipod a few months ago did I start using iTunes, by choice. There is an excellent winamp plugin that will sync winamp with your ipod. Presumably, moving a winamp library is a lot easier, since winamp uses a file-based library rather than a database. However, the main developer has left AOL, so winamp development seems a lot less interesting these days. AOL seems to just pile on commercial crap with every release. So I’ve given up on it for now.
Since moving to Finland I have a dynamic ip address. In NL I had a nice static address which had its own domain name: jilles.xs4all.nl. Useful for running servers such as ftp and ssh. With a dynamic ip that’s still possible but you have to keep track of the ip address somehow.
Or you can use dyndns. Dyndns offers a nice dynamic dns service, for free. So now my pc is reachable as jillesvangurp.mine.nu. They also have a nice tool that updates the ip address whenever it changes, also free. I like free goodies, and these certainly work very nicely :-).
Now I just have to reconfigure my ssh server to use port 443 (ssl) so I can reach it from work (proxy blocks port 22 :-().
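The change itself is small, assuming the stock OpenSSH server: sshd accepts multiple Port directives, so it can keep answering on 22 while also listening on 443. A sketch of the relevant lines in /etc/ssh/sshd_config:

```
# listen on the default port and on 443 (which proxies tend to let through)
Port 22
Port 443
```

Restart the ssh daemon afterwards for the change to take effect, and note this only works if nothing else (like an https server) is already bound to 443.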