Random thoughts on being late

Don’t know if you’ve heard, but Microsoft has had to delay the launch of Vista until 2007.

Apparently this caused a lot of fuss, and even some rather dodgy reports of a 60% rewrite of the codebase, predictions of the demise of Microsoft, an opportunity for Apple and/or Linux, and so on.

Scoble had an interesting series of posts where he got increasingly irate at some of the bad journalism and then collected all the tellings-off he got.

To me the whole thing is a quite bizarre storm in a teacup – is it really so shocking that a Microsoft operating system schedule slipped, given that in their entire history they’ve never shipped an OS on the original ship date? I’m not just trying to pick on MS here – it’s extremely common for software projects to miss their ship date, especially extremely large projects whose release is expected to have an earthquake-like effect. Even Ubuntu have had to slip Dapper by six weeks – although that just goes to show what a great job the Ubuntu team have done releasing the three previous versions every six months on the dot (and the whole six-month release cycle starts to look quite clever).

I did find the mini-microsoft response interesting, especially all the kvetching from Microsoft employees (or bored geeks trying to cause trouble?). This has got to be the most worrying aspect for Microsoft – not the peed-off OEMs, the vulture-like media pundits, or the effect on the stock price. If the staff have such low morale, and so many bureaucratic hoops to jump through to get their code checked in, is it any wonder that the beast is shipping late? And what does that bode for the quality, and especially the security, of the system?

The other amazing aspect is that all this fuss is over an operating system. This is just the stuff that allows you to run the applications you want (or need) to use. The comparison to Ubuntu above is a little disingenuous, because Ubuntu is more than just an operating system – it includes a whole range of additional software.

To be fair, the big 200lb weight that Microsoft have round their neck is that they feel they have to keep backwards compatibility. Joel Spolsky’s article on how Microsoft lost the API war seems so prescient; for example, he said in June 2004:

Even if Longhorn ships when it’s supposed to, in 2006, which I don’t believe for a minute, it will take a couple of years before enough people have it that it’s even worth considering as a development platform.

(and in the article he explains the obsession with backwards compatibility)

The weird thing is that when Vista does come out (and I wouldn’t bet against another delay), despite what anyone might say – whether it’s good or not so good, whether it requires only the fastest new hardware or will run just fine on a bottom-of-the-range Dell – it’ll probably be a huge success; the juggernaut is just that big. Every OEM will want to ship it (to make up for the lost Xmas sales), and many large enterprises will start to roll it out, so the CIO can justify the large Software Assurance licence they bought.

Please, please me

Good to see Jono is on board with my plea for someone (or many people) to help the Xorg project with the simple, no-eye-candy missing features related to rotation, mirroring, and dual head.

I think that perhaps I need to clarify what’s wrong and give some ideas of how I’d like things to be (although, unfortunately, I have no idea where or how to start coding X or the relevant drivers).

Rotation

Why is rotation important?

  • LCDs. Most LCDs have the ability to rotate from landscape to portrait mode. There are some situations where this might be useful, for example DTP page layout. Also, if you happen to watch it, in Twelve Weeks with Geeks check out Joel Spolsky’s desk – two Dell widescreens (I assume 21’’ 2005FPWs) rotated and in dual screen mode (talk about double hell to try and configure in Linux: dual head and rotation).
  • Tablets. While the jury is still out on whether tablet PCs (or the new Origami form factor) will ever take off in the general market, I think they will be relevant in certain vertical markets, for example education and design.

There is some support for rotation in the xrandr (X Resize and Rotate) extension. This is the extension used by the screen size applets in Gnome and KDE that allow dynamic resizing of the display – which is great. However, in my experience there aren’t any open-source drivers that support rotation, except the specialised ones for the iPAQ and Nokia 770.
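For the curious, the user-facing end of xrandr is just a small command-line client. On a driver that does support the rotation calls (and, as I say, the common desktop drivers don’t yet), rotating the whole display from a terminal looks roughly like this – a sketch only, and on most hardware the second command will simply report that rotation is unsupported:

xrandr -q          # list the sizes, refresh rates and rotations the X server offers
xrandr -o left     # rotate the whole display 90 degrees anti-clockwise, if the driver allows it
xrandr -o normal   # and back to normal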

So the missing features are:

  • consistent rotation support across the most common drivers e.g. ati, radeon, nv, i810 (anyone for any others?).
  • ability to individually rotate screens in dual head setup

Dual head

Why is dual head important?

  • Hardware wise it’s becoming easy and cheap – even bottom end cards have two outputs, and the price of monitors is always falling. Most laptops have graphics chipsets that can drive two screens.
  • Many developers that I know either use laptops with an external monitor, or a desktop with two screens, in dual head mode. Again, look at 12 weeks with geeks – all the interns had dual screen desktop machines.

The good news is that most of the drivers I’ve played with support dual head. The bad news is that it’s tricky to set up and requires a restart of X to switch from single head to dual head and back to single head – see the sketch after the list below for the sort of xorg.conf hackery involved.

So the missing features are:

  • Ability to dynamically switch (i.e. without restarting X) between single and dual head mode.
  • Ability to change the resolution of each head independently – of course without needing to restart X
  • Autodetection of a monitor as it’s attached – I realise this is a big ask. Windows doesn’t always deal with this well (probably a driver problem); OS X does well in my experience, but it should, as Apple have control of all the hardware.
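To give a flavour of the hackery involved in a static dual head setup today, this is roughly the shape of the xorg.conf you end up writing – a sketch only: the identifiers are made up, the radeon driver and BusID are just examples for one particular laptop chipset, and you still have to restart X for any of it to take effect.

Section "Device"
    Identifier   "Card0"
    Driver       "radeon"       # example driver - use whatever your chipset needs
    BusID        "PCI:1:0:0"
    Screen       0              # first head of the chip
EndSection

Section "Device"
    Identifier   "Card1"
    Driver       "radeon"
    BusID        "PCI:1:0:0"
    Screen       1              # second head of the same chip
EndSection

Section "Monitor"
    Identifier   "LaptopPanel"
EndSection

Section "Monitor"
    Identifier   "ExternalMonitor"
EndSection

Section "Screen"
    Identifier   "Screen0"
    Device       "Card0"
    Monitor      "LaptopPanel"
    DefaultDepth 24
EndSection

Section "Screen"
    Identifier   "Screen1"
    Device       "Card1"
    Monitor      "ExternalMonitor"
    DefaultDepth 24
EndSection

Section "ServerLayout"
    Identifier   "DualHead"
    Screen       0 "Screen0"
    Screen       1 "Screen1" RightOf "Screen0"
    Option       "Xinerama" "on"   # glue the two screens into one big desktop
EndSection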

Mirroring

Why is mirroring important?

Do I really need to explain? Go to any conference, go to any university, go to any corporate office, go to any sales meeting, and you will see laptops and projectors.

So the missing features are:

  • Ability to turn mirroring on and off
  • Auto negotiation of the best compatible resolution between the laptop screen and the projector.
  • Just like dual head – autodetection of a monitor as it’s attached. The ability to choose between mirror or dual head setup.

Final stuff

I imagine all of this is difficult, but is it more difficult than Xgl or AIGLX? My guess is that it probably is, simply because of the reverse engineering of the hardware needed to drive these features of the chipset. However, I would point out that for dual head the basic functionality is already there – it just doesn’t happen in a Just Works way.

Less eye candy please

Forgot to mention that a few weeks ago Jono and I got Xgl working – or rather, what I should say is that Jono spent two days failing to get it working on his Radeon-based laptop, and I learnt from his mistakes and got it working on an Intel-based machine in a couple of hours.

I have to say that the visual effects were at the same time cool and a bit underwhelming. Part of it was probably down to the slowness, since it was running on a mere Intel 915 graphics chip (although bear in mind that the Intel 8xx/9xx family are the most widely deployed graphics chips), and of course this was just-released stuff that didn’t have much, if any, optimisation yet (or at least I hope that optimisation is to come). Of all the effects I’d probably use the Exposé-like function the most.

Of course, a few days later RedHat / Fedora released AIGLX (warning: the Fedora wiki is slow today – FC5 was released). So it would seem that Novell and RedHat are now competing to see who has the best / most eye-candy.

All of this is great and I don’t want to discourage anyone from hacking on Xgl/AIGLX or whatever offspring comes of those two. However, I would like to point out to Novell, RedHat, Ubuntu and the Xorg project that I still can’t easily do simple things like plug in a projector and mirror the display, or plug in an external monitor and expand my desktop, or rotate my LCD and then rotate X to match.

Most of this is possible, which is the first problem – we need all of it to be possible – but the second problem is that it requires knowledge and hackery of xorg.conf and then a RESTART OF X! The closest I’ve seen to a user-facing tool for any of this was on Fedora Core 2 (and it’s presumably still there – I’ll check when I’ve finished downloading FC5). That tool allowed you to configure desktop mirroring or dual head configurations – but then you had to log out (so that X could, you guessed it, restart).

This is madness. We already know that laptops have overtaken desktops in sales; plugging a laptop into a projector to give a presentation or training course (or even just to watch a DVD) is becoming at least a weekly occurrence for many people (and think of the influence of the people for whom this is an everyday thing), and more and more developers I know are using their laptop in a dual head configuration when it’s on their desk.

Given Jono’s success in invoking the lazyweb to build an audio editor to meet his needs, I’m calling out to anyone involved with X to help sort this out. I’m not good at hardware driver coding, but I’ll help with testing, documentation, encouragement, whatever I can. If you ever visit Birmingham or I meet you at a conference then I will buy you beer (or another beverage of your preference). But only if rotation, mirroring, and dual head setup are easy for normal users and don’t require an X restart.

Aside: Xinerama as a name has such great marketing potential, but no one uses it because it’s so damn hard to set up. Just think: YourFaveDistro, brought to you by Xinerama dual head display.

When people talk about Linux desktop adoption, what that really means from now on (in fact from about 2004 on) is Linux laptop adoption – and laptops are all about, hardware wise, plugging and unplugging external devices, whether that’s usb keys, external hard discs, dvd burners, mp3 players, digital cameras, scanners, printers, pcmcia cards, keyboards, mice, docking stations, and monitors / projectors. Nearly all that stuff behaves really well and ‘Just Works’ when it’s dynamically plugged and unplugged from the machine, all except for external monitors and projectors. That stuff requires wizardry.

I don’t know if 2006 will be the year of the Linux laptop (or desktop) or the year of Linux desktop eye-candy, but I would love it if 2006 was the year of Just Works rotation, dual head, and mirroring.

Napsterize your Knowledge

Don’t just take it from me that shared ideas are more valuable – so say Ben McConnell and Jackie Huba of the Church of the Customer Blog

The primary lesson: The more that a company shares its knowledge, the more valuable it becomes.

Companies that share their intellectual property and business processes with customers and partners are more likely to have their knowledge (or products) passed along to prospective customers. People tend to evangelize products and services they love, admire or find valuable, so Napsterizing one’s knowledge allows for the grassroots effect of distributed marketing. The network is the channel.

Companies that Napsterize their knowledge in the marketplace also tend to have the marketplace respond with help and improvements to its intellectual capital.

from the article Napsterize your Knowledge (see BugMeNot for a pass).

Saw this quote in the Getting Real book and had to get it in here.

Shock news

The CEO of a large IT company says that large IT companies are the key to the success of Linux and open source, and not communities.

It’s all very amusing because in the end Linus and Andrew Morton don’t ask IBM, Intel, or Oracle what to allow into the kernel; rather it’s IBM, Intel, and Oracle engineers who ask Linus and Andrew to let their stuff in. In fact it’s even more complicated than that, because probably someone from Oracle is the maintainer of one subsystem, someone from Intel another subsystem, IBM another, and so on, so they are all asking each other to accept their patches, with Linus and Andrew the ultimate arbiters. Which all sounds terribly political, except it isn’t, because the fact that it’s GPL means that code flows in all directions, and engineers talk to engineers in the open and avoid all of the legal and contractual issues that cross-corporation work would normally entail. (Anyone remember the debacle that was OS/2?)

It also amuses me how these guys never learn from history – IBM lost control of the PC market on two fronts: the open architecture of the hardware meant they lost that to Compaq, Dell, and others, and the decision to outsource the operating system to Microsoft (not once but twice!) eventually led to a flipover point in the late 80s or early 90s where IBM needed Microsoft more than Microsoft needed IBM (a situation which has irked the corporate psyche of IBM ever since OS/2 lost and this became obvious).

If you took away the corporate involvement with Linux (and other healthy open-source projects) then the community would change, but would carry on mostly unaffected. However, if you took the community out of Linux development – took away the openness, the independent leadership, the ‘no single company in control’ – then would Oracle et al want to, or even be able to, continue? My guess is that already Oracle and IBM need Linux more than Linux needs them.

 

Mini Review: John Battelle’s The Search

In general I tend to devour books – the ones that I like, anyway. I just get so into them that I can’t put them down, except maybe for food. This approach to reading is unrealistic with a young child in the house, so I’ve had to adapt and read things in a more piecemeal fashion. The Search is ideal in that John Battelle has broken the text into nice bite-sized chunks, as if he had anticipated my dilemma.

What I like about the book is that it isn’t just the story of Google, and it doesn’t show too much reverence to the world’s favorite search engine. JB rightly points out that search was important before Google, even if both the early pioneers and users didn’t quite realise it. Plenty of credit, and commiseration, goes to companies like AltaVista and GoTo.com (now Overture), the companies that could have been King. There’s also a deserved focus on the companies that rely on their place in search results and have got caught up in the crossfire of the war against search spam (and are therefore forced to buy AdWords to get their traffic back).

Of course like all the best reviewers I haven’t actually read the whole book yet – it’ll probably take me another couple of weeks to find some time to finish it, and I just couldn’t wait. If I find that Battelle completely screws it up in the last quarter of the book, I’ll come back and update this.

Howto give a Web2.0 talk

I’m at the Carson Workshops Web2.0 Summit learning all about web2.0 goodness. One thing that’s become obvious is that there are certain rules to giving presentations to this kind of audience.

  1. Find yourself a funky font to use – not one that is hard to read, one that is clean and readable but non-standard
  2. Try to have as little text as possible on the slide – you are the best if you have just one word on the page.
  3. Either put your single words centered on a white background or find relevant metaphorical photos to use as a slide background and stick the text in the corners.
  4. If you absolutely have to have slides with more than one word on them use colour and different font weights to highlight words (so the slide is really about that smaller collection of words)

Just a note.

Update: Other alternatives for number 3; white text on a black background, or white text on coloured backgrounds (with a new colour for each theme in your talk).

Getting started with Linux Admin – Part 2

This is the second part of my occasional blog dumps of stuff from my two FastTrack Intro to Linux Admin courses. Part 1 outlined the approach and what we needed to get started, which can be boiled down to: get a PC and a LiveCD – we’ll be using the Ubuntu LiveCD.

Getting started with the command line

As Linux sysadmins we do the majority of our work at the command line, or shell. There is a wide variety of shells – BASH, tcsh, KornShell and Zsh are some of the common examples. On most Linux systems the default shell is BASH. BASH stands for Bourne Again SHell – one in the long line of witty word jokes in a Unix system: one of the original shells for Unix was written by Stephen Bourne and called the Bourne Shell. BASH is more than just a shell – it also has a built-in scripting language (which is boxy but useful).
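To show what I mean by a scripting language, here’s a minimal sketch and nothing more – the file name greet.sh is made up for the example, and don’t worry if none of it means anything yet:

#!/bin/bash
# greet.sh - a trivial taste of BASH as a scripting language:
# a variable, a loop, and the echo command we'll meet in a moment
name="world"
for i in 1 2 3
do
    echo "Hello, $name (greeting number $i)"
done

Save it, make it executable with chmod +x greet.sh, and run it with ./greet.sh.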

Aside: One of the things you have to get used to in the world of Linux and open-source software is that there is almost never just one example of any class or type of software. We need a shell to do our work, so of course there’s never just one type of shell. Later on we’ll need an editor, and the number of open-source editors is approaching infinity. This theme will come back again and again.

Once our machine has booted the LiveCD we’ll have a graphical desktop like the picture below (I’ve rather presumptuously assumed you’ll have opened a browser to this page).

[Screenshot: the desktop, showing the Terminal entry in the Applications menu]

Within the GUI environment the type of program we need to run to get a command line is called a terminal – and of course there are loads of terminals. Thankfully on the Ubuntu LiveCD there is only one, which we can find under Applications->Accessories->Terminal (as shown in the picture above).

[Screenshot: the Terminal window]

Now we have the terminal we can get going. The first command we’re going to use is echo, which is the simplest command going: it simply repeats back to us whatever we give it.


ubuntu@ubuntu:~$ echo "Hello there I'm the echo command"
Hello there I'm the echo command

Once you’ve had a play with that, the next command is ls, which lists the files in the current directory.


ubuntu@ubuntu:~$ ls
Desktop

Aside: Those of you with experience of DOS might be expecting to type dir to get a list of the files in the current directory. Go ahead and try it. On Ubuntu it works, but it’s just a copy of ls with a different name; on other Linux systems it might be an alias (we’ll cover those later), it might give an error message, or it might not be there at all.
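If you do try it on the LiveCD, you should see exactly the same output as plain ls (on this LiveCD session there’s only the Desktop directory to list):

ubuntu@ubuntu:~$ dir
Desktop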

There aren’t any files to look at – we’ll fix that in the next section – so the next thing we might want to do is change directory. This time it is the same command as in DOS.


ubuntu@ubuntu:~$ cd Desktop
ubuntu@ubuntu:~/Desktop$ ls
ubuntu@ubuntu:~/Desktop$ cd ..
ubuntu@ubuntu:~$

That last line introduces one of the two hidden directories present in every directory. The double dot represents the parent of the current directory, and a single dot represents the current directory itself. So why haven’t we seen these directories in either of the previous runs of ls? Well, there is a convention of hiding files that start with a full stop (aka dot, making those files dotfiles). There is an argument to ls that turns this convention off and shows all files.

ubuntu@ubuntu:~$ ls -a
.              Desktop    .gnome2_private    .mozilla          .xsession-errors
..             .esd_auth  .gstreamer-0.8     .nautilus
.bash_history  .gconf     .gtkrc-1.2-gnome2  .Trash
.bash_profile  .gconfd    .ICEauthority      .update-notifier
.bashrc        .gnome2    .metacity          .Xauthority
ubuntu@ubuntu:~$

There’s a whole heap of hidden files and directories in the user’s home directory, most of them storing configuration and metadata. There is a whole load of other arguments we can give to ls to output info we may be interested in. To find out more about ls we can type

ubuntu@ubuntu:~$ man ls

When we hit enter, the terminal is taken over by the manual viewing application.

[Screenshot: the ls man page displayed in the terminal]

You can now page through the manual with the space bar, or move up and down a line at a time with the cursor keys. When you’re done, type q to quit the manual. All commands on the system (should) have a manual page, even if it’s very terse and assumes that you know how to use the command in the first place.
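As a small taster of what the man page describes, the -l argument gives a long listing showing permissions, owner, group, size and modification date for each entry – the size and date below are only illustrative, yours will differ:

ubuntu@ubuntu:~$ ls -l
total 4
drwxr-xr-x 2 ubuntu ubuntu 4096 Mar 24 10:15 Desktop
ubuntu@ubuntu:~$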

That’ll do for this blog entry. Next time (if there is a next time) we’ll look at input and output redirection and shell variables. If you want to get ahead then all of this and more is covered in the ‘The Command Line’ chapter of the free LPI Manuals, which starts on page 52 (in my copy).