Joost

I spent some time playing with Joost, the streaming video/online TV thingy from the makers of Skype and Kazaa. There’s a lot to like about it conceptually. I live in Helsinki, where public television basically sucks and the commercial cable packages suck slightly less. So to me Joost is pretty compelling: major content publishers lining up to join, and hassle-free streaming straight to my screen. So I signed up via the “send me an invite” bullshit feature on the website. It’s just an ordinary web registration form, except in this case you have to fill it in twice: once to get the invite and then again to get a Joost username (“duh! didn’t I just tell you who I was?!”). I reactivated my good old Yahoo mail address for the occasion. No way I’m giving my primary email address to these guys.

The user interface seems quite nice even though it is a bit unconventional. It launches into full screen and has a menu system with which you can select channels and programs. The program has no network settings menu: it either works or it doesn’t. This is probably the kind of simplicity end users need. However, from experience I know that things sometimes don’t work, and when that happens it is nice if the UI tells you what the hell is going on. Things like whether it is actually connected, or how many bytes are flowing to and from my machine, are not represented in the UI at all. You could stare at a black screen forever (with the default wobbly screensaver thingy in the background) if you were disconnected for some reason. This happened a couple of times and it’s quite annoying. Pretending everything is OK is not the same as a usable experience.

Aside from this, the UI is pretty easy to figure out. I’m not sure I like the whole channels-and-programs paradigm. I guess this is what couch potatoes understand. The problem is that there is a huge and growing number of channels, each of which can have numerous programs. You have to first select and add a channel, and only then can you browse its programs and select one. Since there’s a lot to choose from, it takes some time to find something, and scrolling through huge lists gets old pretty quickly. The UI doesn’t really help you with that. I would prefer a UI more like an RSS reader.

Technically the experience is not as smooth as it should/could be. But then, this is a beta, and the experience will probably improve as the number of users skyrockets, as it no doubt will when this goes public. Anyway, I found that a great number of channels simply don’t play at all, or seem to have a problem getting the bytes to my machine. It seems that everything except the most popular stuff is likely to not work, no doubt because there are fewer peers with that content. I have a 1 Mbit connection (roughly 100 KB/s download and 30-40 KB/s upload), which should be enough, although it is probably the bare minimum. My impression is that most content is not streamed at that rate anyway. An additional problem is that Welho, my Finnish cable provider, sucks big time, and that the Motorola modem I use to connect to them is complete crap as well. So I’m not ready to point the finger at Joost for all the network problems I have. Nevertheless, my network situation can’t explain why some stuff works just fine while other stuff doesn’t work at all.

The most annoying thing about Joost is no doubt the fact that it inserts commercials before, after and even during the streaming. Commercials suck, and there seems to be no way around them other than not using Joost. If Joost fails, it will be because users reject commercial content interrupting the crappy other content. They’ll have to be very careful with this. There’s just no way I’m going to sit through commercials when my browser is one alt+tab away. And if the content is crap, I might end up not alt-tabbing back. I’ve explored the channels on offer a bit and am not really impressed so far. It seems to be on a par with Zune, which doesn’t have that much good content either.

More Ubuntu

I’ve given up on Feisty. I’ve blogged several times now about my failure to install it properly. Today I gave it another try and partially succeeded before failing again.

I read somewhere that you can bypass the “scanning the mirrors” problem by disconnecting the network cable. You see, running ifdown eth0 is not good enough, because Ubuntu tries to outsmart you with its network manager. It’s becoming more and more like Windows: the assumption that the user is a complete idiot now dominates the whole installer. Anyway, unplugging the network cable forces Ubuntu to acknowledge some hard reality.

So this time I ended up with a bootable Ubuntu system (misconfigured and all). Great, only the network still didn’t work properly. For some reason some sites load in the browser (e.g. Google) but most do not. So I was in the weird situation that I could google for my problem and get a lot of hits, but was unable to access any of them. I spent the whole morning booting Windows and Ubuntu repeatedly: each time in Windows I tried to get some answers (no network trouble there), and in Linux I tried to mess with Ubuntu’s networking subsystems.

I failed. After the fifth time I just decided not to bother anymore. Obviously Ubuntu has some weird bugs in its network layer that I do not encounter with older versions or with Windows. From what I googled I learned that there are many unrelated networking problems in Ubuntu. I now strongly suspect the DNS-related behaviour of my cable modem is the cause. Replacing the modem might solve my problems. But then again it might not. It’s a Motorola cable modem, and the firmware is unquestionably sucky: I have to reboot the thing quite often and have already decided never to buy anything with Motorola firmware embedded again. Without a working network, Ubuntu is basically not very interesting; you can’t run updates or install software. If anyone has a proper solution, I’d be interested to hear it.
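If the modem’s DNS handling is indeed the culprit, one workaround worth trying (I haven’t verified that it fixes this particular setup) is to bypass the modem’s DNS proxy by listing resolvers directly in /etc/resolv.conf. The addresses below are placeholders; substitute your ISP’s actual DNS servers:

```
# /etc/resolv.conf -- placeholder addresses, substitute your ISP's real DNS servers
nameserver 192.0.2.1
nameserver 192.0.2.2
```

Note that the network manager tends to rewrite this file, so the change may not survive a reconnect.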

Part two of my misery started when I began messing around with GParted. Basically I removed the Linux partitions and then decided to give the space back to the NTFS partition. Halfway through resizing it, GParted started throwing very scary messages at me about my NTFS partition. For about 15 minutes I assumed it had foobarred the partition table (very depressing, even though I have backups). Finally, after a reboot, the NTFS partition showed up fine in GParted (unmodified), and it even mounted it without being asked (normally an annoying feature, but welcome in this case). The next problem was booting it. I relearned a few lessons about the MBR there (fdisk /mbr helped me out a few times when I was playing with Slackware ten years ago). In Windows XP the fix is running fixmbr from the recovery console. Until you do that, you are stuck with a broken GRUB that points to a deleted partition. For legal reasons (I assume), GParted and GRUB lack the feature of undoing the damage to the MBR.
It took me about half an hour to figure that out, and I’m now running my old Windows desktop again.
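For the record, the recovery sequence that worked for me, assuming a Windows XP install CD is at hand:

```
1. Boot from the Windows XP CD and press R to enter the Recovery Console.
2. Select the Windows installation and log in as Administrator.
3. Run: fixmbr   (and confirm the warning about overwriting the MBR)
4. Run: exit     (to reboot)
```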

So I have three conclusions to add to my review a few weeks ago:

  • The networking subsystem has huge compatibility issues that likely affect many users. From what I encountered while googling, there are many different issues, and the fixes I tried didn’t work, which suggests that my specific issue is yet another one. Not good. The same modem and PC have booted older Ubuntu releases (Dapper), so this is a regression! My modem is quite common, so likely thousands of users are affected.
  • GParted has some nasty issues and should not be used to resize NTFS partitions. You risk losing all your data. I guess any partition resizer is better than none, but this crap should not be put in front of end users.
  • In this form, Ubuntu is guaranteed to give lots of users a very negative introduction to Linux.

I will try a future version again. For now, I’ve had enough.

OSGi: some criticism

Over the past few weeks, I’ve dived into the wonderful world called OSGi. OSGi is a set of Java interfaces and specifications, standardized by a consortium (and soon also through the JCP), that effectively layers a component model on top of Java. By component I don’t mean that it replaces JavaBeans with something else, but that it provides a much improved way of modularizing Java software into blobs that can be deployed independently.

OSGi is currently the closest thing to having support for the stuff that is commonly modeled using architecture description languages. ADLs have been used in industry to manage large software bases. Many ADLs are homegrown systems (e.g. Philips’ Koala) or simply experimental tools created in a university context (e.g. xADL). OSGi is similar to these languages because:

  • It has a notion of a component (the bundle)
  • Dependencies between components
  • Provided and required APIs
  • API versioning

Making such things explicit, first-class citizens of a software system is a good thing. It improves manageability and consistency. Over the past few weeks I’ve certainly enjoyed exploring the OSGi framework and its concepts while working on actual code. However, it struck me that a lot of things are needlessly complicated or difficult.

Managing dependencies on other bundles is more cumbersome than handling imports in plain Java. Normally when I want to use some library, I download it, put it on the classpath, type a few letters and then ctrl+space my way through whatever API it exposes. In OSGi it’s more work: you download the bundle (presuming there is one) and then have to decide which of the packages it exposes you want to import.
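For what it’s worth, the manifest entries involved look roughly like this (the bundle and package names are invented for illustration, and the file must end with a newline):

```
Manifest-Version: 1.0
Bundle-ManifestVersion: 2
Bundle-SymbolicName: com.example.myapp
Bundle-Version: 1.0.0
Import-Package: org.apache.commons.logging;version="[1.1,2.0)"
Export-Package: com.example.myapp.api;version="1.0.0"
```

Every package you want to use from another bundle has to appear in Import-Package; nothing is visible by default.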

I’m a big fan of the organize imports feature in Eclipse, which seems to not understand OSGi imports and exports at all. That means that for a single library bundle you may find yourself going back and forth to your bundle’s manifest file to manually add the packages you need. Eclipse PDE doesn’t seem to be that clever. For me that is a step backwards.

Also, most libraries don’t actually ship as bundles. Bundles are a new concept that is not backwards compatible with good old jar files (the form most third-party libraries come in). This is an unnecessary limitation. A more reasonable default would be to treat a non-OSGi jar file as a bundle that simply exports everything in it and puts everything it imports on the import path. It can’t be that hard to fish that information out of a jar file. At the very least, I’d like a tool to do that for me. Alternatively, and this is the solution I would prefer, it should be possible to add the library to the OSGi boot classpath. That would let all bundles access non-OSGi libraries without requiring any modifications to those libraries.
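Until something like that exists, the manual workaround is to wrap the jar in a bundle yourself: put the unmodified jar inside a new jar whose manifest exports its packages. Something along these lines, for a hypothetical commons-foo library (all names invented):

```
Manifest-Version: 1.0
Bundle-ManifestVersion: 2
Bundle-SymbolicName: org.apache.commons.foo
Bundle-Version: 1.2.0
Bundle-ClassPath: commons-foo-1.2.jar
Export-Package: org.apache.commons.foo;version="1.2.0"
```

Bundle-ClassPath points at the embedded jar, so the original library ships untouched inside the wrapper.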

Finally, I just hate having to deal with this retarded manifest file concept. I noticed that the bug requiring the manifest to end with an empty line still exists (weird stuff happens if it is missing). This is as annoying as having to use tabs instead of spaces in makefiles; I was banging my head against the wall over newline stuff back in 1997. The PDE adds a nice frontend for editing the manifest (including a friendly warning if you accidentally remove the trailing newline). But the fact remains that it is a kludge that superimposes stuff on Java which is not part of Java.

Of course, since version 1.5 there is a nicer way to do this using annotations. Understandably, OSGi needs to be backwards compatible with older versions (hence the kludge is excused), but the way forward is obviously to deprecate this mechanism in newer editions of Java. Basically, I want to be able to specify import and export constraints at the class and method level. An additional problem is that packages don’t really have a first-class representation in Java: they are referred to by name in classes (in the package declaration) but don’t have a specification of their own. That makes it difficult to add package-level annotations (you can work around this with a package-info.java file).
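To make the idea concrete, here is a sketch of what annotation-based export metadata could look like. The @ExportedApi annotation is invented purely for illustration; nothing like it exists in OSGi or the JDK:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Hypothetical annotation: marks a class as part of a bundle's exported API,
// with a version, instead of listing its package in MANIFEST.MF.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.TYPE)
@interface ExportedApi {
    String version();
}

@ExportedApi(version = "1.0.0")
public class MetadataSketch {
    public static void main(String[] args) {
        // A framework could discover export metadata via reflection at load time.
        ExportedApi api = MetadataSketch.class.getAnnotation(ExportedApi.class);
        System.out.println("exported at version " + api.version());
        // prints: exported at version 1.0.0
    }
}
```

The point is that the metadata lives next to the code it describes and is checked by the compiler, instead of sitting in a whitespace-sensitive text file.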