ubuntu – the story continues

If you’ve been following my ubuntu rants (one, two, three + latest comment), you’ll know that so far the experience has been not as FUCKING advertised (excuse the expletive). Well, here’s another rant:

After tracking down the kernel driver issue (mind you, all my notes on installer usability still apply) that prevented my network from working, I had a way to get a working install. Since I no longer trust gparted to resize my ntfs partition (see post three) I opted for Wubi. Wubi is a great idea: just install everything in a disk image file on your windows C:\ drive and add an item to the default windows xp bootloader to boot ubuntu. Brilliant. The installer works as advertised:

  • You fill in some details
  • It downloads a custom ubuntu iso image for you (it would be better if I weren’t forced to download it with the installer)
  • It adds an item to the bootloader
  • It reboots
  • It boots into a loopback filesystem on the disk image on your windows drive

Here Wubi’s job ends and Ubuntu’s text installer takes over (so bye bye usability, welcome to the wonderful world of text based installers). Unlike the normal installation you have zero control, so naturally all the same things go wrong; i.e. it got stuck at “scanning the mirrors” again. This time I unplugged the network cable and killed a few processes in one of the other terminals (ctrl+alt+f2, a nice trick I remember from my slackware days) hoping the installer would pick up the hint. Surprisingly it did, although it did mess up the apt sources.list in the process. Anyway, the installer completed, I rebooted and configured the WLAN, which does work, unlike on many other people’s hardware. One reboot later (sounds like windows, doesn’t it? :-) I was looking at the ubuntu desktop.
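
For the record, the console dance went roughly like this; the installer shell is a minimal busybox environment and the exact process to kill varies, so treat it as a sketch rather than a recipe:

# ctrl+alt+f2 drops you into a second console during the install
ps            # find the pid of whatever the installer is stuck on
kill <pid>    # kill -9 <pid> if it refuses to die
# ctrl+alt+f1 switches back to the installer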

Fixing what’s wrong.

As I know from previous attempts, ubuntu does not do a good job of setting up my video card + monitor or my sound card. The video card + monitor took a few tries and some obscure commands to get right; apparently the x.org 7.3 in the next release of Ubuntu will do a better job. The sound card issue is due to the fact that I have a modern PC with multiple sound devices. Since Ubuntu prefers to guess where it should ask me, I end up with all the sound going to my USB headset instead of the soundblaster and no obvious way of fixing it. The problem is that the tools to fix this are not installed. That’s right, it is assumed that Ubuntu is always right and if not you are on your own. This is true for network; this is true for video; this is true for sound.
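
For what it’s worth, one way to force the issue without any extra tools is a couple of lines in ~/.asoundrc telling ALSA which card should be the default. The card index below is just an example for a setup like mine; aplay -l lists what you actually have:

# ~/.asoundrc: make card 1 (the soundblaster rather than the USB headset) the default
defaults.pcm.card 1
defaults.ctl.card 1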

It gets worse.

Then (after fixing sources.list, which curiously had all entries twice?!) I did an update with synaptic: 260 MB. And promptly ran into this bug. Oops, broken pipe, bla bla bla, upgrade failed, here’s a bunch of really obscure errors. Isn’t synaptic great? Pasting the first line of where things went wrong into google brought me straight to this bug (lucky me). This was another of those moments where ordinary users would probably give up. A rather obscure fix in the bug report helped (basically touch all the failed files and re-run apt-get upgrade). For the record, I did not install anything before running into this bug. Just installing ubuntu in Finland + upgrading is enough to trigger it. Known since April apparently and related to timezones.
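
I don’t remember the exact paths involved, but the gist of the fix was something along these lines, assuming the files apt chokes on are the downloaded packages in /var/cache/apt/archives:

# give the files apt complained about a fresh timestamp, then retry
sudo touch /var/cache/apt/archives/*.deb
sudo apt-get upgrade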

More later. The good news is that I have a bootable system and can probably resolve most remaining issues. The bad news is that so far the experience has been really, really, really bad. I’ve been struggling with things that should just work, or at least fail more gracefully.

websites and stupid assumptions

I just went to a blog and wanted to leave a comment. The site redirected me to blogger.com to do so. Blogger correctly detects that I am located in Finland. Very good! That’s so clever! Amazing what you can do these days!

The only problem is that like most of this world (barring around 4-5 million native speakers) I don’t speak Finnish. Not a word. Really, I have a coffee mug that lists pretty much all knowledge I have of this beautiful but incomprehensible language. I haven’t even managed to memorize most of that. And somehow I don’t believe “maksa makkara” (pay the sausage?) is going to help me here.

So, no problem, lots of sites make this mistake. Just click the little “gimme the english version” link that surely must be hidden somewhere … missing in action. So I check the url … no obvious way to change fi to en there either. Maybe on the frontpage … nope, www.blogger.com insists on Finnish as well. So www.blogger.com is unusable for me. Let’s just hope it doesn’t spread to the rest of the world. That would be really annoying.

Anyway, this assumption of setting the language based on IP address is just stupid and wrong. First of all, the site should respect my browser settings, which don’t list Finnish at all. Neither does my OS. The browser sends this information as part of the http headers, so the site can know that my preferred language is en-US. Secondly, Finland is bilingual and for some 500,000 people the right language would have been Swedish; I happen to speak at least some Swedish. And finally, any modern country like Finland has a large number of travellers, tourists and migrant workers like me. So not offering a way out is just stupid. Confronting hundreds of thousands of users (just in Finland) with the wrong language, when each of them is telling you their preferred language, is even more stupid. And not even offering a link to an English version tops it all off.
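
To be concrete, every request my browser sends already carries a header along these lines (the exact languages and q values here are just an example):

Accept-Language: en-US,en;q=0.8,sv;q=0.5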

xampp, skype and port 80

For some time I’ve been considering setting up a php development environment. Not that I like php, but I want to play with some php stuff nevertheless (e.g. Drupal seems interesting). So I downloaded one of the popular all-in-one packages that combine apache, mysql and php: xampp. I have actually set up apache, mysql and php manually once on windows and know that it is A) doable and B) very tedious, hence the integrated package this time.

Xampp sure makes it really easy. Download, install, run the xampp configuration tool, start mysql … green, start apache … ???!??!!! WTF, it won’t start. So I go to localhost with the browser: a blank page instead of the expected connection error, so something is definitely answering on port 80, yet my process list shows no sign of httpd. So I run netstat to find out who is guilty of this crime. It turns out that skype is actually listening on port 80 for some stupid reason. That just sucks. Luckily there’s an option in the skype preferences to turn it off but still, don’t open port 80 if you are not a web server.
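
For anyone hitting the same thing, finding the culprit only takes a line or two; this is roughly what it looks like from a cygwin shell (the PID will obviously differ on your machine):

netstat -ano | grep ':80 '    # shows the PID of whatever owns port 80
tasklist | grep <pid>         # maps that PID to a process name (skype.exe in this case)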

Anyway, problem fixed and 2 minutes later I’ve created a database using phpmyadmin and installed drupal 5.2 and configured it. That’s just what I wanted: 2 minutes of work and *poof* instant website.

In case you are wondering, yes I am considering dumping wordpress. The reason is the lack of clear progress in getting proper openid, atompub and microformats support in wordpress. You can sort of bolt all of this onto a wordpress install, but not without editing php and default templates (and this tends to break during upgrades, i.e. every 2-3 months). Drupal seems much more feature rich and configurable than wordpress and it sure is tempting. Concerns I have include import/export of data (including e.g. uploads); openid support; comment & referral spam blocking; etc.

Update.

After playing with drupal 5.2 and a development snapshot of 6.0, I’ve decided not to migrate, simply because the migration is currently too hard. There is only a seriously outdated module for drupal 4.7 which can only migrate wordpress version 2.0. In other words, this is unlikely to work for my blog without a lot of tinkering. Additionally, moving from drupal to something else is likely not exactly trivial either. I migrated from pivot to wordpress early in 2006. That was quite painless since wordpress has an excellent import feature. Drupal lacks such features and wordpress has no drupal import as far as I know (it would be hard to build due to the generic node datastructure in drupal).

BTW. I’ve spent some time researching the topic. This link here is the most informative I was able to find: http://drupal.org/node/69706. Be sure to also check the comments.

I’ve taken a brief look at joomla too. Interesting product, but not really designed for blogs. Overall, I’m pretty happy with wordpress. It’s just that I want proper openid support.

More ubuntu

I’ve given up on feisty. I’ve blogged several times now about my failure to install it properly. Today I gave it another try and partially succeeded before failing again.

I read somewhere that you can bypass the “scanning the mirrors” problem by disconnecting the network cable. You see, running ifdown eth0 is not good enough because ubuntu tries to outsmart you with its network manager. It’s becoming more and more like windows: the assumption that the user is a complete idiot now dominates the whole installer. Anyway, unplugging the network forces ubuntu to acknowledge some hard reality.

So I ended up with a bootable ubuntu system this time (misconfigured and all). Great, only the network still didn’t work properly. For some reason some stuff loads in the browser (e.g. google) but most stuff does not. So I was in the weird situation that I could google for my problem and get a lot of hits, but was unable to access any of them. So I spent the whole morning booting windows and ubuntu repeatedly. Each time, in windows I tried to get some answers (no network trouble there) and in linux I tried to mess with the ubuntu networking subsystems.

I failed. After the fifth time I just decided not to bother anymore. Obviously ubuntu has some weird bugs in its network layer that I do not encounter with older versions or with windows. From what I googled I learned that there are many unrelated networking problems in ubuntu. I now strongly suspect the dns related behaviour of my cable modem is the cause. Replacing the modem might solve my problems. But then again it might not. It’s a motorola cable modem and there is no question that the firmware is particularly sucky. I have to reboot this sucker quite often and have already decided to never ever buy anything with motorola firmware embedded again. Without a working network, ubuntu is basically not very interesting: I can’t run updates or install software. If anyone has a proper solution, I’d be interested to hear it.

Part two of my misery began when I started messing around with gparted. Basically I removed the linux partitions and then decided to give the space back to the ntfs partition. At that point it started to throw very scary messages at me about my ntfs partition halfway through resizing it. For about 15 minutes I was assuming it had foobarred the partition table (very depressing even though I have backups). Finally, after a reboot, the ntfs partition showed up fine in gparted (unmodified) and it even mounted it without being asked (a really annoying feature, but welcome in this case). The next problem was booting it. I relearned a few lessons about the mbr there (fdisk /mbr helped me out a few times when I was playing with slackware ten years ago). Basically the fix in windows xp is running fixmbr from the windows rescue console. Until you do that, you are stuck with a broken grub that points to a deleted partition. For legal reasons (I assume) gparted and grub lack the feature of undoing the damage to the mbr.
It took me about half an hour to figure that out and I’m now running my old windows desktop again.

So I have three conclusions to add to the review I wrote a few weeks ago:

  • The networking subsystem has huge compatibility issues that likely affect many users. From what I encountered while googling, there are many different issues, and the fixes I tried didn’t work, which suggests that those issues are different from my specific one. Not good. The same modem and pc have booted older ubuntu releases (dapper) so it is a regression! My modem is quite common, so likely thousands of users are affected.
  • Gparted has some nasty issues and should not be used to resize ntfs partitions. You will risk losing all your data. I guess any partition resizer is better than none but this crap should not be put in front of end users.
  • In this form, ubuntu is guaranteed to cause lots of users to have a very negative introduction to linux.

I will try a future version again. For now, I’ve had enough.

Broken pc

The Easter Bunny killed my power supply. So, I’m posting this with my phone. I’ll be back once it is fixed.

Using rsync for backup

As you may recall, I had a nice incident recently which made me really appreciate the fact that I was able to restore my data from a backup. Over the years I’ve sort of cobbled together my own backup solution using rsync (I use the cygwin port on windows).

First a little about hardware. Forget about using CDs or DVDs. They are just too unreliable. I’m currently recovering data from a whole bunch of CDs I had and am horrified to discover that approximately one third of them have CRC errors. Basically, the light sensitive layer deteriorates to the point that the disc becomes unreadable, sometimes within as little as two years. I’ve used various brands of CDs over the years and some of them have higher failure rates than others, but no brand seems to be 100% OK. In other words, I’ve lost data stored on pretty much every CD brand I’ve ever tried. Particularly Fujifilm (1-48x) and unbranded CDs are bad (well over 50% failure rate); on the other hand, most of my Imation CDs seem fine so far. Luckily I didn’t lose anything valuable/irreplaceable. But it has made it clear to me not to trust this medium for backups.

So, I’ve started putting money into external harddrives. External drives have several advantages: they are cheap, they are big and they are much more convenient. So far I have two usb external harddrives: a 300GB Maxtor drive and the 500GB Lacie Porsche drive I bought a few weeks back. Also I have a 300GB drive in my PC. Yes, that’s 1.1 TB altogether :-).

The goal of my backup procedure is to be ‘reasonably’ safe. Technically, if my apartment burns down, I’ll probably lose all three drives and all data on them. Moving them offsite is the obvious solution, but that also makes backups a bit harder. Reasonably safe, in my view, means that my backed up data survives total media failure of one of the drives and that I get the opportunity to get back to a reasonably safe state afterwards. When I say my data, I’m referring to the data that really matters to me: anything I create, movies, music, photos, bookmarks, etc.

This data is stored in specific directories on my C drive and also in a directory on my big Lacie drive. I use the Maxtor drive to back up that directory and use the remaining 200GB on the Lacie drive for backing up stuff from my C drive.

All this is done using commands like this:

rsync -i -v -a --delete ~/photos/ /cygdrive/e/backup/photos >> /cygdrive/e/backup/photos-rsync.txt

This probably looks a bit alien to a windows user. I use cygwin, a port of much of the gnu/linux tool chain that layers a more linux-like filesystem on top of the windows filesystem. So /cygdrive/c is just the equivalent of good old c:\. One of the ported tools is ln, which I’ve used to make symbolic links in my cygwin home directory to the stuff I want to back up. So ~/photos actually points to the familiar My Pictures directory.
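
Setting up those links is a one-off job; mine look roughly like this (the actual My Pictures path depends on your windows user name, so treat the path below as an example):

# link the windows folder into the cygwin home directory once
ln -s "/cygdrive/c/Documents and Settings/<user>/My Documents/My Pictures" ~/photos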

Basically the command tries to synchronize the first directory to the second directory. The flags ensure that the content of the second directory is identical to that of the first directory after execution. The --delete flag allows it to remove stuff that isn’t in the first directory. Rsync is nice because it works incrementally, i.e. it doesn’t copy data that’s already there.

The bit after the >> just appends the output of rsync to a text file so that afterwards you can verify what has actually been backed up. I use the -v flag to let rsync tell me exactly what it is doing.

Of course typing this command is both error-prone and tedious. For that reason I’ve collected all my backup related commands in a nice script which I execute frequently. I just turn on the drives, type ./backup.sh and go get some coffee. I also use rsync to back up my remote website, which is easy because rsync also works over ssh.
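
For the curious, backup.sh is nothing fancy. Stripped down it looks more or less like this; the directories are examples and the real script has a few more of them:

#!/bin/sh
# backup.sh: sync the important directories to the external drives
BACKUP=/cygdrive/e/backup

rsync -i -v -a --delete ~/photos/    $BACKUP/photos    >> $BACKUP/photos-rsync.txt
rsync -i -v -a --delete ~/documents/ $BACKUP/documents >> $BACKUP/documents-rsync.txt

# dump the subversion repository as well (see below)
svnadmin dump c:/svnrepo | gzip > $BACKUP/svnrepo.gz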

Part of my backup script is also creating a dump of my subversion repository. I store a lot of stuff in a subversion repository these days: my google earth placemarks, photos, documents and also some source code. The subversion work directories are spread across my harddrive but the repository itself sits in a single directory on my C drive. Technically I could just back that up using rsync. However, using

svnadmin dump c:/svnrepo | gzip > /cygdrive/e/backup/svnrepo.gz

to dump the repository allows me to recreate the repository in any version of subversion from the dump. Also, the dump file tends to be nicely compressed compared to either the work directory or the repository directory. Actually, the work directory is the largest because it contains 3 copies of each file. In the repository everything is stored incrementally, and in the dump gzip squeezes it even further. The nice thing about a version repository is of course that you also preserve the version history.
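
Restoring from such a dump is just as simple. A sketch, with the target path as an example:

# recreate the repository from the gzipped dump
svnadmin create c:/svnrepo-restored
gunzip -c /cygdrive/e/backup/svnrepo.gz | svnadmin load c:/svnrepo-restored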

Doh! and phewwwww

This morning I had some more fun marking locations in Google Earth. I documented more holidays. Then I managed to lose all my additions by crashing google earth: it reset to the version I saved last evening. It turns out it doesn’t actually save anything until you exit properly. :-(
Next time, I’ll be saving and exporting a lot more often. Luckily I did some exporting just before it crashed and the exported file imported perfectly. But I came very close to losing a few hours of work. :-)
Anyway, I now have placemarks for all my summer holidays since 1998. The Turkey and Sweden vacations already have a lot of text on my photo site, so I did not bother to write much text for them. Both Greece vacations will get some more attention later on.

Update 07-12-2006: Google introduced a feature that allows you to display KMZ files on Google Maps. This has now been integrated into the xsl, so you can view the kmz in either Google Earth or Google Maps. Also, I created a nice zip file distribution with some documentation (download here).

Nvidia tv out aspect ratio trouble & workaround

For a few months I’ve been putting off upgrading my nvidia video card drivers from 80.21 to the 9x.xx series because of an issue with tv out. Anyway, there are a few other bugs in the 80.xx series that have been annoying me and several nice features in the new 9x.xx series, so I upgraded anyway.

It seems nvidia introduced a bug/feature in the 9x.xx series that fucks up the aspect ratio of tv out. Essentially it detects when the overlay used for the full screen signal on the tv does not have a 4:3 aspect ratio (e.g. a dvd movie would be 16:9 or even wider) and then ‘compensates’ by scaling it to 4:3, unless the resolution of the overlay is less than 640×480. Why is this wrong? The s-video signal is always 4:3, even if you have a widescreen tv. If the movie is widescreen (e.g. 16:9 or wider, as is common with dvds) black bars should be included in the signal rather than stretching the image to 4:3. This used to work correctly but is now broken in the new driver.
There seem to be a lot of people complaining in the various nvidia forums about the aspect ratio of the tv out signal. As usual most of the suggestions are useless. Essentially this is an unfixed bug in the driver that the 80.xx series did not have, and so far nvidia has not fixed it. There is no magic combination of ‘correct’ settings in the new control panel that fixes the issue. However, one of the suggested solutions provides a usable workaround that involves using ffdshow to add the black bars so that the resulting image is always 4:3. Ffdshow is a popular mpeg4 decoder available on sourceforge that comes with a bunch of useful filters you can use to postprocess the output of the decoder. This works around the problem by ensuring that the aspect ratio of the overlay is always 4:3, so the driver thinks it does not need to be scaled.
To do this:

  • open ffdshow configuration
  • enable resize & aspect ratio filter
  • configure it as follows
    • toggle resize to on
    • select ‘specify size’ and fill in 1024×768. In the various forum posts everybody seems to recommend 640×480. However, most nvidia cards have a tv out chip that supports 1024×768 and this seems to be the default resolution used. Both resolutions have a 4:3 aspect ratio and that seems to be the main point. Also, the native resolution of 1024×768 is what the image will be scaled to anyway, so scaling it to 640×480 means it gets scaled twice before being sent to the tv!
    • set aspect ratio to keep original aspect ratio. This ensures that ffdshow adds black bars rather than stretching the image to fill the 1024×768.
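
For the arithmetic minded: a 16:9 movie resized into that 1024×768 frame keeps the full 1024 pixel width, ends up 1024 × 9/16 = 576 pixels high, and gets a (768 - 576) / 2 = 96 pixel black bar above and below. The driver then sees a plain 4:3 overlay and leaves it alone.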

OK, this addresses the issue, sort of. A limitation is of course that it only works for content actually passing through ffdshow, which excludes for example dvds. However, dvd playing software probably exists that can do similar things. This trick is also compatible with another little trick I apply to get a little extra resolution out of my widescreen (16:9) tv. This one requires media player classic (some other players probably have a similar feature).

  • Open a movie, right click on the picture and select pan & scan – scale to 16:9.
  • This scales the movie to an anamorphic aspect ratio, i.e. it stretches the picture vertically. Due to the ffdshow config above this means the black bars are smaller. In other words there are fewer pixels wasted on black bars and more spent on the actual picture.
  • This looks bad on both your monitor and tv, don’t worry.
  • Usually a widescreen tv has an option to properly display anamorphic dvds by ‘unstretching’ them; on my tv it is called something like ‘wide’. It does the opposite of scale to 16:9. Use it to ‘fix’ the image on your tv.
  • Why do this? You get a slightly better resolution due to the fact that a larger portion of the 4:3 signal sent to your tv consists of actual movie pixels rather than black bars. The slight detail loss due to stretching and unstretching is more than compensated by the extra pixels you gain.

An obvious improvement would be to actually scale to a 16:9 resolution in ffdshow and then let the driver scale it to 4:3. I tried this of course but then the driver compensates by cutting away a bit on both sides to make the image scale to 4:3. DOH!

motorola cable modem & bittorrent

I’ve blogged several times already about my problems connecting my pc to the internet:

  • Getting a cable modem was easy.
  • I mistakenly bought a Siemens wireless network USB stick. Solution: don’t buy crap, use a decent brand. Currently I’m using an smc pci card, my ibm/lenovo laptop’s built-in network card and my Nokia e70 with its wlan support.
  • The driver software going paranoid from time to time.

A remaining problem that has been annoying me for months is that my cable modem, a Motorola sbg900e, has some issues. Most of the time it works fine, except when applications like bittorrent run: then it just resets more or less continuously. Motorola apparently does not believe it is important to support its customers with useful advice or firmware updates, so that basically meant no bittorrent for the past few months. Bittorrent is a resource intensive protocol and it probably represents a worst case scenario for the modem in terms of number of connections, bandwidth consumed, etc.
Some googling (“motorola modem reset bittorrent”), once again, brought me a workaround. It is not the first time I’ve found out the hard way that solving a technical problem is just a matter of providing google with the right keywords. Believe me, I’ve searched many times using the type number of my modem in the query, which brought up nothing but unrelated problems and advertisement material.
Anyway, one of the people in this forum was kind enough to explain that the problem is the number of connections the bittorrent client tries to open simultaneously. If this exceeds a certain number, the modem firmware crashes and the modem resets (apparently earlier models just crashed and did not reset, lucky me :-). The workaround consists of telling your bittorrent client not to open too many connections at the same time. It’s no problem having, say, 50-100 connections open at the same time, but opening them all at once is a problem. True, most bittorrent clients do not have such a setting, but recent versions of my favourite one (Azureus) do. It’s called “max simultaneous outbound connection attempts” and by default it is set to 16. You can find it under connection->advanced network settings. I find that, so far, limiting it to 8 prevents the modem from crashing.

Problem solved 🙂

jedit plugin manager

I tried to install some plugins in jEdit, my favourite programming editor (for things other than Java; for that I use eclipse of course), and got some IO errors in the plugin manager. Since this has happened before, I looked into it and found a solution to the problem:

  • Go to Utilities->Global Options
  • Select Plugin Manager
  • Click ‘Update mirror list’
  • Select one of the alternatives

Apparently the problem is that the default repository url in jEdit is no longer valid. Changing it to another one fixes the problem. Since the whole point of jEdit is using the many plugins that are available, this is a pretty critical thing to fix.

Anyway, I’m glad to see that development of jEdit seems to be picking up again. I noticed that the 4.3pre4 release is fairly recent, and the sourceforge page shows healthy activity on the core jEdit source code. It had started to feel like this project was more or less dead. jEdit is pretty unique; all the other editors have (much) fewer features.