Mobile Linux

A lot has been written about mobile and embedded device platforms lately (a.k.a. 'phone' platforms). Usually the articles are about the usual incumbent platforms: Android, iOS, and Windows Phone, plus the handful of alternatives from e.g. RIM and others. Most of the debate seems to revolve around the question of whether iOS will crush Android, or the other way around. It's kind of a boring debate that generally involves a lot of fanboys from either camp highlighting this or that feature, the beautiful design, and other stuff.

Recently this three-way battle (or two-way battle really, depending on your views regarding Windows Phone) has gotten a lot more interesting. However, in my view this 'war' was actually concluded nearly a decade ago, before it even started, and mobile Linux won in a very unambiguous way. What is really interesting is how this is changing the market right now.


N900 & Slashdot

I just unleashed the stuff below in a Slashdot thread. Ten years ago I was a regular there (posting multiple times per day), and today I realized that I hadn't actually bothered to sign into Slashdot since buying a Mac a few months ago. Anyway, since I spent time writing this I might as well repost it here. On a side note, they support OpenID for login now! Cool!

…. The next-gen Nokia phone [arstechnica.com], on the other hand (the successor to the N900), will get all the hardware features of the iPhone, but with the openness of a Linux software stack. Want to make an app that downloads podcasts? Fine! Want to use your phone as a modem? No problem! In fact, there's no corporation enforcing its moral or business rules on how you use your phone, and no alienation of talented developers [macworld.com]!

You might make the case that the N900 already has the better hardware when you compare it to the iPhone. And for all the people dismissing Nokia as just a hardware company: there's tons of non-trivial Nokia IPR in the software stack as well (not all of it OSS, admittedly) that provides lots of advantages in the performance and energy-efficiency domains: excellent multimedia support (something a lot of smart phones are really bad at), hardware acceleration, etc. Essentially most vendors ship different combinations of chips coming from a very small range of companies, so from that point of view it doesn't really matter what you buy. The software on top makes all the difference, and the immaturity of newer platforms such as Android can be a real deal breaker when it comes to e.g. battery life, multimedia support, support for peripherals, etc. There's a difference between running Linux on a phone and running it well. Nokia has invested heavily in the latter and employs masses of people specialized in tweaking hardware and software to get the most out of the hardware.

But the real beauty of the N900 for the Slashdot crowd is simply the fact that it doesn't require hacks or cracks: Nokia actively supports & encourages hackers with features, open source developer tools, websites, documentation, sponsoring, etc. Google does that to some extent with Android, but the OS is off limits for normal users. Apple actively tries to stop people from bypassing the App Store and is pretty hostile to attempts to modify the OS in ways it doesn't like. Forget about other platforms. Palm technically uses Linux, but they are still keeping even the JavaScript + HTML API they have away from users. It might as well be completely closed source; you wouldn't know the difference.

On the other hand, the OS on the N900 (Maemo) is Debian-based. Like on Debian, the package manager is configured in /etc/apt/sources.list, which is used by apt-get (with dpkg handling the actual package installs), and both work just as you would expect on any decent Debian-based distribution. You have root access, therefore you can modify any file, including sources.list. Much of Ubuntu actually compiles with little or no modification, and most of the problems you are likely to encounter relate to the small screen size. All it takes to get to that software is pointing your phone at the appropriate repositories. At some point there was even a Nokia-sponsored Ubuntu port to ARM, so there is no lack of stuff that you can install. Including stuff that is pretty pointless on a smart phone (like large parts of KDE). But hey, you can do it! Games, productivity tools, you name it, and there probably is some geek out there who managed to get it to build for Maemo. If you can write software, package it as a Debian package, and cross-compile it to ARM (using the excellent OSS tooling of course), there's a good chance it will just work.
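
To make that concrete: pointing the device at an extra repository is just a matter of editing a text file and running apt. The repository line below is purely illustrative (the exact catalogue names and components depend on the firmware release, so check the Maemo documentation rather than copy-pasting), but the shape is plain Debian:

    # add a line like this to /etc/apt/sources.list (illustrative example only;
    # the real Maemo catalogue names may differ per release)
    deb http://repository.maemo.org/extras/ fremantle free non-free

    # then, as root in the terminal:
    apt-get update
    apt-get install some-package    # any package published in that repository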

So, you can modify the device to your liking at a level no other mainstream vendor allows. Having a modifiable, Debian-style Linux system with free access to all of the OS, on top of what is essentially a very compact touch screen device complete with multiple radios (Bluetooth, 3G, WLAN), sensors (GPS, motion, light, sound), graphics, and a DSP, should be enough to make any self-respecting geek drool.

Now with the N900 you get all of that, shipped as a fully functional smart phone with all of the features Nokia phones are popular for, such as excellent voice quality and phone features, decent battery life (of course with all the radios turned on and video & audio playing non-stop, your mileage may vary), great build quality and form factor, good support for Bluetooth and other accessories, etc. It doesn't get more open in the current phone market, and this is still the largest mobile phone manufacturer in the world.

In other words, Nokia is sticking its neck out for you by developing and launching this device & platform while proclaiming it to be the future of Nokia smart phones. It's risking a lot here, because there are lots of parties in the market that are in the business of denying developers freedom and securing exclusive access to mobile phone software. If you care about stuff like this, vote with your feet and buy this or similarly open (suggestions anyone?) devices from operators that support you instead of preventing you from doing so. If Nokia succeeds here, that's a big win for the OSS community.

Disclaimer: I work for Nokia and I’m merely expressing my own views and not representing my employer in any way. That being said, I rarely actively promote any of our products and I choose to do so with this one for one reason: I believe every single word of it.

Migrating iTunes from Windows to Mac

With my recent iMac purchase came the chore of moving stuff over from my old hard drive, which thankfully survived my old PC's premature demise. Part of the stuff that I naturally wanted to migrate is of course my iTunes library.

The last time I moved my iTunes library was 3.5 years ago, when I bought my then new and now broken PC. At the time I wrote up a little howto on iTunes migration that over the years has gotten me tens of thousands of hits (dozens per day). Some people were kind enough to leave a message, and apparently I helped several of them out with those instructions.

So, here’s the follow up.

Circumstances are slightly different now. I "consolidated" my library ages ago, which means iTunes moved all the files into its own directory. Additionally, I reorganized my library, fixed tagging issues, etc. In other words, I put a lot of time into organizing my music and keeping it organized, and I would hate to start from scratch. Secondly, we're several major iTunes versions down the road and they've added album art fetching, Genius, and loads of other features.

So, I started out by following my original instructions from three years ago. This nearly works, except the music is not playable from iTunes for some vague reason. So those instructions are no longer up to date, despite a comment from someone who successfully managed a Windows-to-Mac migration the "old" way.

Next I tried something that I didn’t believe would work but which worked anyway:

  • Copy the library from the former C: drive to where it is supposed to be on your Mac (all files & dirs)
  • Don't modify a single file; they're fine as they are
  • Start iTunes

Right after it starts, it "updates" the library and that's it. That's a lot faster & easier. Play counts, playlists, ratings, you name it: it's all there. Thanks Apple, this is a lot better than three years ago and exactly the way it is supposed to work.

iMac 24″

Saturday morning I turned on my PC and basically the screen did not come on (I keep it on standby). Suspicious, but it has happened before. So I pressed the power button to do a reset, and it did a single beep followed by 8 rapid beeps. That's the BIOS telling you: this PC is FOOBARRED, please try installing a new motherboard. Or something. Good luck with that. Anyway, eight beeps and nothing.

After the predictable "godverdomme, kutzooi", which needs no translation here, I calmed down and did what I was planning to do anyway (by coincidence), which was visiting the local Apple store. Or rather the Gravis M&S store on Ernst-Reuter-Platz here in Berlin, a nice big store specialized in reselling all the nice Apple gadgets, along with a helpdesk and good support options (I hate putting expensive hardware in the mail). I had basically already decided to go for an iMac. Question: which one? Eh … why go for anything less than the biggest one? Sure it costs money, but I'll be spending the next few years glued to its screen. So: 24″, 3GHz dual core CPU, 4GB RAM, a 1TB disk drive, and a nice NVIDIA chipset with 512MB (which will no doubt run X-Plane just fine).

Anyway, they didn't have one in store with an English keyboard, so they placed the order for me and told me "one and a half weeks". To my pleasant surprise, I already got the call today that my new iMac was ready. So I fetched it, plugged it in, and enjoyed the famous Apple out-of-the-box experience, which is excellent.

Then I went to work installing the basics: Firefox, Adium, Skype, and some other essentials. I haven't gotten around to applying all the tweaks I have on my work laptop, but I will of course be doing that to fix e.g. the annoying Home/End key behavior and a couple of other things.

I am now in need of:

  • A proper USB mouse (sorry, but the Mighty Mouse will join its brother in a drawer)
  • A USB2-to-SATA converter to read both internal drives from my old PC with all my music, photos, and other essentials. I have a pretty recent backup on an external drive but had gotten a bit sloppy about backing up over the last few months. BTW, I noticed that NTFS is read-only on Macs, so any tips for fixing that are welcome. MacFUSE seems to be one option; any alternatives?
  • To fix that, I will need a nice big new external drive to hook up to Time Machine.

Not all is great though:

  • The keyboard sucks compared to the one that came with my MacBook Pro last year. WTF is up with the weirdly small Enter key and the weird symbols where it used to say Page Down, Page Up, etc.?
  • The Mighty Mouse still stinks
  • All the MobileMe and .Mac spam on first launch is kind of annoying

Anyway, happy to be online again.

Java Profiling

One of the fun aspects of being a programmer is the constant stream of little technical problems that require digging into. This can sometimes be frustrating, but it's pretty cool if you suddenly get it and make the problem go away. Anyway, since starting my new job in February, I've had lots of fun like this. Last week we had a bit of Java that was obviously out of line performance-wise. My initial go at the problem was to focus on the part that had been annoying me to begin with: the way XML parsing was handled. There are many ways to do XML parsing in Java; we use JAXB. JAXB is nice if you don't have enough time to do the job properly with XPath, but the trade-off is that it can be slow and that there are a few gotchas, like the fact that creating marshallers and unmarshallers is way more expensive than actually using them. So when processing a shitload of XML files, you spend a lot of time creating and destroying marshallers, especially if you break down the big XML files into little blobs that are parsed individually. Some simple pooling using ThreadLocal improved things quite a bit, but it was still slow in a way that I could not explain with just XML parsing. All helpful, but it still felt unreasonably slow in one particular class.
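
For what it's worth, the gist of that ThreadLocal pooling looks roughly like the sketch below. This is not the actual code from work, just a minimal illustration; the Order class and everything else in it is made up.

    import javax.xml.bind.JAXBContext;
    import javax.xml.bind.JAXBException;
    import javax.xml.bind.Unmarshaller;
    import javax.xml.bind.annotation.XmlRootElement;
    import java.io.StringReader;

    // Stand-in payload class; in reality this would be whatever JAXB-annotated
    // type the XML maps to.
    @XmlRootElement
    class Order {
        public String id;
    }

    public class OrderParser {

        // JAXBContext is thread-safe and expensive to create: build it once.
        private static final JAXBContext CONTEXT;
        static {
            try {
                CONTEXT = JAXBContext.newInstance(Order.class);
            } catch (JAXBException e) {
                throw new ExceptionInInitializerError(e);
            }
        }

        // Unmarshallers are cheap to use but not thread-safe, so keep one per
        // thread instead of creating a fresh one for every little XML blob.
        private static final ThreadLocal<Unmarshaller> UNMARSHALLER =
            new ThreadLocal<Unmarshaller>() {
                @Override
                protected Unmarshaller initialValue() {
                    try {
                        return CONTEXT.createUnmarshaller();
                    } catch (JAXBException e) {
                        throw new IllegalStateException(e);
                    }
                }
            };

        public static Order parse(String xml) throws JAXBException {
            return (Order) UNMARSHALLER.get().unmarshal(new StringReader(xml));
        }
    }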

So I spent two days setting up a profiler to measure what was going on. Two days? Shouldn't this be easy? Yes, except there are a few gotchas.

  1. The Eclipse TPTP project has a nice profiler. Except it doesn't work on Macs, or worse, on Macs with JDK 1.6. That's really an Eclipse problem: the UI is tied to 1.5 because Apple stopped supporting the Cocoa integration in 1.6.
  2. So I fired up VMware, installed the latest Ubuntu 9.04 (nice), and spent several hours making that behave nicely (file sharing is broken and needs a patch). Sadly, no OpenGL eye candy in VMware.
  3. Then I installed Java, Eclipse, TPTP, and some other stuff.
  4. Only to find out that TPTP with JDK 1.6 is basically unusable. First, it comes with a native library compiled against a system library that is no longer installed by default. Solution: install that library.
  5. Then at every turn there's some error about agent controllers. If you search for this you will find plenty of advice telling you to use the right controller, but none whatsoever as to how you would go about doing so. Alternatively, people tell you to just not use JDK 1.6. I know, because I spent several hours on this before joining the gang of "TPTP just doesn't work, use NetBeans for profiling".
  6. So, still in Ubuntu, I installed NetBeans 6.5, imported my Eclipse projects (generated using Maven's eclipse:eclipse), and to my surprise this actually worked fine (no errors, tests seem to run).
  7. Great, so I right-clicked a test and chose "Profile File". Success! After some fiddling with the UI (quite nerdy and full of usability issues) I managed to get exactly what I wanted.
  8. Great! So I exited VMware to install NetBeans properly on my Mac. Figuring out how to run it with JDK 1.6 turned out to be easy.
  9. Since I had used VMware file sharing, all the project files were still there, so importing was easy.
  10. I fired up the profiler and it had remembered the settings I last used in Linux. Cool.
  11. Then NetBeans crashed. Poof! Window gone.
  12. That took some more fiddling to fix. The release notes indeed mention two cases of profiling-related crashes which you can fix with some command-line options.
  13. After doing that, I finally managed to get down to analyzing what the hell was going on. It turned out that my little test was somehow triggering 4.5 million calls to String.replaceAll. WTF!
  14. The nice thing about inheriting code that has been around for some time is that you tend to ignore the parts that look ugly and don't seem to be in need of your immediate attention. This was one of those parts.
  15. Using replaceAll is a huge code smell. Using it in a triple-nested for loop is insane.
  16. So, some more pooling, this time of the compiled regular expression objects (see the sketch after this list); Pattern.compile is expensive.
  17. I re-ran the profiler and … problem gone. XML parsing is now the bottleneck, as it should be in code like this.
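
To illustrate what that regex fix boils down to, here is a minimal sketch (not the actual code, and the pattern is just an example): String.replaceAll compiles its regular expression on every call, so hoisting a precompiled Pattern out of the loops removes millions of Pattern.compile calls.

    import java.util.regex.Pattern;

    public class Normalizer {

        // Compile the regex once and reuse it; the actual pattern here is just
        // an illustrative stand-in, not the one from the real code.
        private static final Pattern WHITESPACE = Pattern.compile("\\s+");

        // Before (the smell): value.replaceAll("\\s+", " ") called inside a
        // triple-nested loop, which recompiles the regex on every invocation.
        public static String normalize(String value) {
            return WHITESPACE.matcher(value).replaceAll(" ");
        }
    }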

But, shouldn't this just be easy? It took me two days of running from one problem to the next just to get a profiler running. I had to deal with crashing virtual machines, missing libraries, cryptic error messages about Agent Controllers, and several unrelated issues. I hope somebody in the TPTP project reads this: your stuff is unusable. If there's a magic combination of settings that makes this shit work as it should, I missed it; your documentation was useless, and the most useful suggestion I found was to not use TPTP. No, I don't want to fiddle with cryptic VM command-line parameters, manually compile C shit, or fiddle with well-hidden settings pages. All I wanted was: right click, profile.

So am I now a NetBeans user? No way! I can't stand how tedious it is for coding. Run the profiler in NetBeans, go "aha", Alt-Tab to Eclipse and fix it. Works for me.

Localization rant

I’ve been living outside the Netherlands for a while and have noticed that quite many web sites are handling localization and internationalization pretty damn poorly. In general I hate the poor translations unleashed on Dutch users and generally prefer the US English version of UIs whenever available.

I just visited YouTube. I've had an account there for over two years and have always had it set to English. So, surprise, surprise, it asked me for the second time in a few weeks, in German, whether I would like to keep my now fully Germanified YouTube set to German. Eehhhhh?!?!?! Nein (no). Abbrechen (cancel)! At least they ask, even though it's in the wrong language. Most websites don't even bother with this.

But stop and think about this. You've detected that somebody who has always had his profile set to English is apparently in Germany. Shit happens, so now what? Do you think it is a bright idea to ask this person, in German, whether he/she no longer wants the website presented in whatever it was set to earlier? Eh, no, of course not. Chances are good people won't even understand the question. Luckily I speak enough German to know that Abbrechen is the right choice for me. When I was living in Finland, convincing websites that I don't speak Finnish was way more challenging. I recall fighting with Blogger (another Google-owned site) on several occasions. It defaulted to Finnish despite the fact that I was signed in to Google and had every possible setting Google provides for this set to English. Additionally, the link for switching to English was three clicks away from the main page. Impossible to do unless you know the Finnish words for preferences, language, and OK (in which case you might pass for a native speaker). I guess I'm lucky not to live in e.g. China, where I would stand no chance whatsoever of guessing the meaning of buttons and links.

The point here is that most websites seem to be drawing the wrong conclusions based on a few stupid IP checks. My German colleagues are constantly complaining about Google defaulting to Dutch (i.e. my native language, which is quite different from Deutsch). The reason: the nearest Nokia proxy is in Amsterdam, so Google assumes we all speak Dutch.

So, cool, you can guesstimate (roughly) where I am in the world, but don't jump to conclusions. People travel and move around all the time. Mostly they don't change their preferred language, and when they do it's only after a lot of hard work. I mean, how hard can it be? I'm already signed in, right? Cookies set and everything. In short, you know who I am (or you bloody well should, given the information I've been sharing with you for several years). Somewhere in my profile it says that my preferred language is English, right? I've had that profile for over four years, right? So why the hell would I suddenly want to switch language to something that I might not even speak? A: I wouldn't. No fucking way that this is even likely to occur.

It’s of course unfair to single out Google here. Other examples are iTunes which has a full English UI in Finland but made me accept the terms of use in Finnish (my knowledge of Finnish is extremely limited, to put it mildly). Finland is of course bilingual and 10 percent of its population are Swedish speaking Finns, most of which probably don’t handle Finnish that well. Additionally there are tens of thousands of immigrants, tourists and travelers, like me. Now that I live in Germany, I’m stuck with the Finnish itunes version, because I happened to sign up while I was in Finland. Switching to the German store is impossible. I.e. I can’t access the German TV shows for sale on iTunes Germany. Never mind the US English ones I’m actually interested in accessing and spending real $$$/€€€ on. Similarly, I’ve had encounters with Facebook asking me to help localize Facebook to Finnish (eh, definitely talking to the wrong guy here) and recently to German (still wrong).

So, this is madness. A series of broken assumptions leads to Apple losing revenue and Google and others annoying the hell out of people.

So here’s a localization guideline for dummies:

  • Offer a way out. A large percentage of your guesses as to what language your users speak is likely going to be wrong. The smaller the number of native speakers, the more likely you are to get it wrong. Languages like Finnish or Chinese are notoriously hard to learn, so design your localized sites such that a non-native speaker of such languages can get your fully localized site set to something more reasonable.
  • Respect people's preferences. Profile settings override anything you might detect (see the sketch after this list). People move around, so your assumptions are likely broken if they deviate from the profile settings.
  • Language is not location. People travel around and generally don't unlearn the languages they already speak. Additionally, most countries have sizable populations of non-native speakers as well as hordes of tourists and travelers.
  • If people managed to sign up, that's a strong clue that whatever the language of the UI was at the time is a language the user has mastered well enough to understand the UI (otherwise you'd have blind monkeys signing up all the time). So there's no valid use case for suggesting an alternative language here, never mind defaulting to one.
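
To put the precedence argued above into code, here is a minimal, hypothetical sketch (not any real site's implementation; all names are made up) in which an explicit profile setting always wins, the browser's Accept-Language header comes second, and a geo-IP guess is only a last resort:

    import java.util.Locale;

    public class LocaleResolver {

        // Hypothetical profile abstraction; a real site would read this from
        // whatever account settings the user explicitly saved.
        public interface UserProfile {
            Locale getPreferredLocale();
        }

        public Locale resolve(UserProfile profile, String acceptLanguage, Locale geoIpGuess) {
            // 1. The user told us explicitly: never second-guess this.
            if (profile != null && profile.getPreferredLocale() != null) {
                return profile.getPreferredLocale();
            }
            // 2. The browser says what the user configured, e.g. "en-US,en;q=0.8".
            if (acceptLanguage != null && !acceptLanguage.isEmpty()) {
                String first = acceptLanguage.split(",")[0].split(";")[0].trim();
                return Locale.forLanguageTag(first);
            }
            // 3. Last resort: the geo-IP guess, with English as a sane fallback,
            //    and an easy way for the user to change it afterwards.
            return geoIpGuess != null ? geoIpGuess : Locale.ENGLISH;
        }
    }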

Anyway, end of rant.

Mac at home

Since I’m quite happy with the mac at work. I’m half convinced that I might want one at home. One problem though. The imacs on sale are crap. They’re nice configurations, for 2006. But why the hell would I sink money in a configuration with a maximum of 4GB and a video card that is the 2006 budget model? I don’t want a noisy mac pro. I want a silent iMac. But at the prices they are selling at, I also want specs that are worthy of the current market. My 3 year old AMD more than matches the specs of what Apple is selling as new today.

I looked at the upgrade options. For $150 I can upgrade from 2 to 4GB; I know for a fact that these modules sell for around 30€ here in Finland, including 22% tax. For $250 I can upgrade the HD to 1TB (from 0.5TB); 1TB drives are going for around 100-120€ here, including taxes. I can upgrade to a 512MB NVIDIA 8600GS. Gee, do they even still sell these at NVIDIA? These are not upgrade options but an excessive price for what should be the standard, minimum configuration of a 2000€ PC.

The point here is that for this kind of money, I want more upgrade options. I'm not buying a PC that can't handle the memory modules on sale today. 4GB and 8GB modules are commonplace, and 16 & 32GB modules are on the market as well. A decent PC should have 2-4 slots, so a maximum of at least 16GB is what I expect today. Given current memory prices, I'd max the memory out too.

1TB drives should be standard in a premium PC at the current prices.

Regarding the video card: both ATI and NVIDIA are two generations ahead with their chip architectures. A recent card from the 9×00 series should be standard, and the upgrade option should be one from the current GTX range. I'd say 512MB of video memory is sort of the minimum for an acceptable 3D experience right now; Apple ships MacBooks with this now.

So, my plan is to wait until Apple upgrades their product line (any month now?). At that moment I will either decide that it is overpriced crap or sink good money into a maxed-out version of the iMac. Basically, the current specs are not worth my money. If I sink 2000+ euro into a PC, it needs to significantly outperform my three-year-old AMD.