Wednesday, 27 August 2008

Is 'Splashtop' a Trojan horse?

I think it is every technologist's dream to develop a “killer app” or disruptive technology. However, I suspect they are often not the result of deliberate endeavour, but the by-product of trying to address a deep need, or of having faith in a solution that isn't yet recognised or adopted elsewhere. Sometimes I don't think the creators even know they have a killer app until much later on. Perhaps that's not surprising, since it can't be obvious; otherwise everyone would have developed it!

However, this hasn't stopped Netscape, Oracle, Sun, IBM, Linux and latterly Google and Apple – and probably many others – from trying to develop a solution that would compete with or undermine the Microsoft monopoly and/or give them a competitive edge. Microsoft have become – rightly – paranoid about almost anything and anyone appearing with a silver bullet or wooden stake (not that I'm comparing Microsoft to anything evil :) ).

Windows gives Microsoft a competitive edge, market dominance, and a channel and relationship with what must be close to a billion people. Microsoft will do almost anything to protect this franchise.

And of course Windows has had to compete with Larry Ellison's 'network computer', Netscape's 'the browser is the computer', Linus Torvalds' free operating system, Google's 'Internet-centric computing' and, more recently, Apple's 'Leopard'.

With Microsoft's latest Vista release receiving much negative publicity, and with Apple's market share increasing, many would see Apple as a potential rival to Microsoft.

However, Apple's operating system is not disruptive at all; it is simply a better product. A disruptive technology is, in my view, something that often goes under the radar – like a Trojan horse – and is initially purchased/considered for purpose 'A' but eventually (and sometimes rapidly) is adopted for purpose 'B'. It is the latter scenario which unseats the established players and disrupts the market.

Whilst Apple may not have the disruptive technology, I suspect Splashtop and a number of other offerings (see Engadget) may deliver a significant blow to Microsoft.

Splashtop – and other similar offerings – is effectively an “instant ON”, cut-down operating system embedded in the hardware, with browser, email, calendar, contacts etc.

And the benefits?

  • You do not have to wait ages for Windows to load or shut down.
  • You can be on the internet in seconds.
  • There are fewer attack vectors for viruses/hackers.
  • It requires little or no patching/hotfixes.
  • Battery life is measured in DAYS not hours. Yes, DAYS!
  • Browsing is almost certainly faster than via Windows.

I am sure these points are an obvious attraction to anyone with a laptop. Note the 'instant ON' offering is NOT a replacement for Windows but an additional solution. You can still boot up your 'full fat' Windows whenever you want.

But Windows isn't needed now if you simply want to surf the web or pick up/send email. You will also be able to view Office documents, just not author them in Word etc. But you can use the various online word-processing/spreadsheet solutions that Google and others provide.

So hang on: you will actually be able to use the web, email, book meetings, check your calendar, and write documents and spreadsheets online. You will be able to do this instantly and quickly, with a battery life that is measured in days....

You will still load up Windows, but won't you do so less and less, as you find more ways of doing stuff online and/or Splashtop and others provide more instant functionality? Sure, you will still have Windows – as a comfort blanket – for a while at least, but only for a while.

And gradually Windows would die a death from ever-increasing numbers of these small, instant silver bullets – not for everyone, but for the many people who simply use their computer for these activities. After all, we have the PlayStation 3 and Xbox for games, don't we?

So whilst Splashtop is not currently seen as a challenge to Windows, but simply as complementary (purpose 'A'), I can see it developing into a real challenger and even a replacement (purpose 'B'). I predict this will be one of many slimOS products released in the next 12 months.

My question is: have Microsoft seen this coming, and is Midori their response?

But perhaps the most important question is, will they be too late?

Tuesday, 19 August 2008

The Three Legged Dog

One of my bugbears – and everyone in IT seems to have at least one – is performance.

Badly performing applications and operating systems really annoy me – often more than other bugs, due to their persistence – and, more importantly, reduce my productivity. Yet such behaviour seems commonplace, and I for one do not buy the “it's the hardware” argument any more than I would buy the “it's the road” argument for a badly performing car.

I should distinguish between apps run on recommended hardware and those run on an old desktop next to the broom cupboard in an office far, far away – a not uncommon experience.

I am sure many of us can catalogue the list of applications and operating systems that fail to perform everyday functions. I have previously referred to Vista as poorly performing (and in my view too fat to live) and even my brand new HTC PDA – sporting Windows Mobile 6.1 – runs like a snail.

As well as not buying into the “it would work well on a CRAY supercomputer” argument, I don't buy into the “it's part of testing” argument either. Sure, testing should report the issue and indeed FAIL the release because of it, but testers are hardly responsible for the performance (though I have to question why they do not exercise this veto more often).

Finally, I do not buy into the “it's a bug” argument. Sure, it is a bug, but not in the sense that performance is normally discussed – i.e. that performance issues will become apparent during performance testing which, according to the V-model, occurs during system testing.

Of course, this is a reasonable place for verification, BUT it should not be the first instance of this testing, in the same way that you would not expect it to be the first place to discover bugs that would otherwise be found during unit testing.

I also do not subscribe to the idea that code performs by default and it is only the random occurrence of bugs that interferes with this natural state of affairs. I assert that the reverse is true: unless performance is architected and coded in from the start, software will always perform badly. I would also assert that this statement better explains the reality of our experience of applications.

In my experience, more often than not, the functionality has had to be re-written from scratch – rather than bug-fixed – due to its poor performance. It has not been a “bug that crept in” but simply the absence of performance-architected/designed code. This also explains why poor performance often isn't rectified pre-release, given the cost of re-writing code and the effect on the release date.

However, this is all the more reason why code needs to be architected and designed for performance and not just functionality. It would appear that “non-functional requirements” – such as performance – are treated as “soft” requirements rather than “hard” requirements, the implication being that they are optional extras.

I do not buy into this, or into the “ran out of time” argument either, since performance has to be considered up front, not last. If you leave it until last, you have left it too late.

The example I have had quoted to me on several occasions is the cash machine. Accepting all the functionality of the cash machine, if it took 10–15 minutes to dispense the cash, just how often would you use it?

I also understand that performance (and also scalability) is harder to achieve these days due to “The Stack”. In the “good ol' days” you coded in machine code/assembler direct to the CPU. Now you go through numerous stacks, APIs, classes etc. which you didn't write, and you have no idea how they perform under different circumstances. But that only sustains my previous assertion that code does not perform by default, but only through careful design, architecture and unit testing.

It is also true that code may have different performance profiles given the tasks required. For instance, a piece of code relating to train schedules may be very quick to list all the trains departing on a given day, but horrendously slow to list the trains on a given line. This dimensional performance characteristic can be present in many different areas.
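As a minimal sketch of this dimensional effect (the data and names below are invented for illustration): the same schedule answers a by-day query instantly because that dimension was indexed, while a by-line query falls back to a full scan unless the second dimension is designed in from the start.

```python
from collections import defaultdict

# Hypothetical train records: (train_id, day, line)
schedule = [
    ("IC101", "Monday", "East Coast"),
    ("IC102", "Monday", "West Coast"),
    ("IC201", "Tuesday", "East Coast"),
    # ...imagine many thousands more
]

# The schema was designed around one question: "what departs on day X?"
by_day = defaultdict(list)
for train_id, day, line in schedule:
    by_day[day].append(train_id)

def trains_on_day(day):
    return by_day[day]                        # indexed: one dictionary lookup

def trains_on_line(line):
    # No index exists for this dimension, so every record is scanned: O(n)
    return [t for t, _d, l in schedule if l == line]

# The remedy is to architect the second dimension in from the start:
by_line = defaultdict(list)
for train_id, day, line in schedule:
    by_line[line].append(train_id)

print(trains_on_day("Monday"))                # ['IC101', 'IC102']
print(trains_on_line("East Coast"))           # ['IC101', 'IC201'] - but slowly
```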

Another mistake concerns data processing: developers responsible for data-entry logic can skew the schema towards data-entry performance at the expense of data access, yet reads often account for 80% of database access in the wild. Thus data entry is fast (since it dictated the initial design) but data access is slow. The net effect is poor performance overall.
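Here is a minimal sketch of that trade-off, using an in-memory SQLite database with invented table and column names: the bare table is cheap to insert into, but the dominant read path does a full scan until an index is added for it (the EXPLAIN output in the comments is approximate).

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")

# Entry-optimised: bare table, no secondary indexes, so inserts are cheap.
db.executemany("INSERT INTO orders (customer, total) VALUES (?, ?)",
               [(f"cust{i % 100}", i * 1.5) for i in range(10_000)])

# But the dominant workload is reads; without an index this is a full scan.
plan = db.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(total) FROM orders WHERE customer = ?",
    ("cust42",)).fetchall()
print(plan)  # detail column reads roughly: SCAN orders

# Designing for the 80% read path: one index trades slightly slower
# inserts for indexed lookups on the hot query.
db.execute("CREATE INDEX idx_orders_customer ON orders (customer)")
plan = db.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(total) FROM orders WHERE customer = ?",
    ("cust42",)).fetchall()
print(plan)  # detail column reads roughly: SEARCH orders USING INDEX idx_orders_customer
```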

So my TEN principles for improving performance are:

1. Use cases must include performance profiles/response times.
2. Design must include reporting, scope and response times.
3. DBA must be involved in database design and focus on data access/reporting performance.
5. Architecture must be involved in development start-up meetings, with performance as a key agenda item.
5. Developers need regular performance training.
6. Unit Testing MUST include performance profiling, including use cases which affect data dimensions/reporting etc. (see the sketch after this list).
7. Developers/architects need to include performance strategies as part of their training/coding approach.
8. Unit Testing must have success/failure criteria including performance.
9. Architecture must be involved in reviewing code, not just in terms of structure/patterns/quality but also performance.
10. QA Departments are given veto over poor performing releases.
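As an illustrative sketch of principle 6 (the function, data volumes and 50 ms budget below are all invented, and timing assertions are environment-sensitive, so treat this as a pattern rather than a recipe): a unit test can assert a response-time budget alongside correctness, so a performance regression fails the build just as a functional bug would.

```python
import time
import unittest

def list_trains_on_line(schedule, line):
    # Hypothetical function under test - the awkward data dimension
    # from the train example above.
    return [t for t, _d, l in schedule if l == line]

class PerformanceProfileTest(unittest.TestCase):
    def test_line_query_meets_response_budget(self):
        # Exercise the awkward dimension at realistic volume, not a toy size.
        schedule = [(f"T{i}", "Monday", f"Line{i % 50}") for i in range(100_000)]
        start = time.perf_counter()
        result = list_trains_on_line(schedule, "Line7")
        elapsed = time.perf_counter() - start
        self.assertEqual(len(result), 2_000)   # functional success criterion
        self.assertLess(elapsed, 0.05)         # performance budget: 50 ms

if __name__ == "__main__":
    unittest.main()
```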

Thursday, 14 August 2008

Will Midori have Brains as well as Brawn?

I recently blogged about the arrival of Vista. The gist was that Vista was suffering from the bloated consequences of Microsoft's golden rule of compatibility. The “new hope” is Midori, which apparently drops this mantra in favour of a fat-free operating system embracing the latest technology and paradigms, such as “manycore” computing.

Great! This will definitely have a positive impact on security, performance, scalability etc. But in looking for a new operating system I still want something more than just a young, fat-free athlete. I want it to have brains as well as brawn. I know I'm demanding, but then Microsoft does have the resources to deliver.

What do I mean by brains? Well, to be honest, I don't have a nice one-line explanation. But let me start by giving you some context....

In January, Bill Gates was over in the UK doing what I'd describe as his “goodbye” tour, and I was lucky enough to be invited to one of these events. Luckier still, I submitted a question which was chosen to be put to him. This was my question....


Is the pace of innovation slowing down?

"There have been many technological and software innovations over the years with the Internet, Client/Server, Web applications and .NET etc., but is the pace of innovation slowing down? .net was a major innovation but since then many would characterise Microsoft releases as user experience enhancements? Are we likely to experience this consolidation until the next leap, perhaps to AI platforms? What work is Microsoft doing in this area?
"

Bill's answer wasn't a one-line answer either, but the gist was that Microsoft can only move as fast as the market will allow. True of course, but who do you choose as the pacemaker? Some organisations are still running DOS and Windows 3.1, whilst others are more than happy to lead with the latest (and yes, of course, a lot of organisations sit in between). The point is that you need to cater for both, with the leaders helping to mature the new products that the followers later upgrade to. So whilst I accept Bill had a point, I don't accept it fully.

So back to my point about the next operating system. I guess some would argue that Microsoft breaking compatibility is, in itself, a momentous decision (if they do make it!). However, it won't be that dramatic, since either Midori will run in a VM under Windows, or Windows will run under a Midori VM, or both will run concurrently under a VM. So not really as momentous as we might think.

So what does a Windows operating system with brains look like? Well here is just a taster (perhaps you can add some of your own)...
  • It should be fully integrated with the cloud, with security implemented as standard.
  • Data/Apps stored online with backups to the pc/laptop rather than the other way round.
  • Solutions – not simply software – should be delivered jointly through install and services adding greater value.
  • Removal of data silos through XML standards and online data hubs.
  • Improved meta-data (as well as standards/hubs) to aid search.
  • Not having to rebuild your PC/Laptop every 6-12 months to restore performance.
  • Application installs that do not pollute the Operating System: integration rather than invasion.
  • Ability to restore your environment online to any laptop/pc.
  • Software that adds to the whole rather than simply a point solution (this requires the Operating System to have the right architecture).
  • An Operating System capable (really capable) of self repair and that can diagnose (and fix) automatically and not come up with dumb suggestions like "the cable isn’t plugged in" when you are trying to connect via wifi.
  • An Operating System that supports a degree of artificial intelligence which can be leveraged by applications - and I don’t mean pathetic “office assistant” type gimmicks.

If pushed, I would define the new O/S as being: secure, fully integrated with the cloud, AI-powered, and data and user centric – i.e. a long way from where we are today.

For the majority of users out there who don't understand firewalls, service packs, antivirus databases, backups etc. (and quite frankly don't want to), this would Rock their World. Oh, and yes, they wouldn't go out and buy a new PC each year because they don't know how to fix the one they have, which each year becomes infested, slow and unusable.

Like I said, Microsoft could deliver on this. If not, I can see both Apple and Google eyeing up the possibilities of giving birth to this new type of computing.

Monday, 11 August 2008

The Vista Saga

There has been much noise on the arrival of Vista in our lives, with deep emotion and drama. Having used Microsoft's latest operating system for over 6 months, I can relate to this emotional experience!

To me, an operating system is much like a car; it’s a means to an end. We expect it to do the business; start first time, get us to where we want to go safely and quickly, and without breaking down etc. We don’t celebrate that it achieves this, but get very emotional when it fails, as we rely on it, day in day out, to get us where we want to go.

However, emotions aside, it’s time to take a dispassionate view of the Vista Saga. I’m not one who views Vista as a great product marred by poor perceptions (as Microsoft would suggest) or that Vista has transitioned overnight from a Skoda to a Rolls Royce with the arrival of Service Pack 1. But nor do I believe Microsoft was invaded by gremlins that had too much food after midnight.

What I do believe is that Microsoft are finding it harder and harder to build on top of Windows. Vista, of course, is not a single operating system, but just one more incarnation of NT, which had to inherit legacy Windows, which itself was built on DOS (and in many bit-widths: 16, 32 and 64). It has also had to cope with many file systems, from FAT and FAT32 to NTFS (and, until it was jettisoned from the plans, WinFS), and with a myriad of configuration architectures, from .ini to .reg to the metabase to .xml. The list is not exhaustive but you get the drift.

The Microsoft Windows dynasty, whilst hugely successful, has got fatter and fatter and is now not just clinically obese, but morbidly so. This overweight geriatric, whilst sporting beautiful clothes, is about to croak.



Vista struggling to delete a 443 byte shortcut in under a minute

Sorry, I said I’d stay away from emotions.

Microsoft's mantra of compatibility has led to Windows inheriting everything from the past. This has resulted in the bloating of Windows and, critically, three increasingly impossible missions: performance, quality and security.

Windows development and testing are constrained by the many "personalities" that exist within it and the many compromises that Microsoft have had to make to ensure compatibility.

Trying to ensure performance, quality and security whilst not impacting compatibility must be like trying to tap-dance through a minefield. And of course testing all the different components, personalities, configurations and file systems is a gargantuan undertaking that in reality could never be completed fully.

So the real problem isn't the developers or testers but the impossible mission that Microsoft set itself in the name of backward compatibility. Fortunately, it looks like Microsoft have an alternative plan: to develop a compatibility-free, fat-free replacement called Midori.

The only question is whether Microsoft will try to put Windows on an Atkins diet to extend its life, or accelerate Midori to usher in a new dawn.

Whatever they do, I just hope it does the business.

Wednesday, 6 August 2008

Product Development Balanced Scorecard

The biggest challenge for anyone responsible for software product development in an ISV is getting the right functionality developed at the right cost in the right timeframe.

In a later post I may blog on the blunt tools of “de-scope or de-lay” to manage releases, but for now I'll focus on the functionality side.

This is always a hot topic, as there is never enough time to do everything and you can easily be pulled from pillar to post by almost every area of the business and beyond (from marketing to presales, support to projects, customers to prospects).

It's almost impossible to please everybody, particularly with a single release, but here is one way to almost get there.... the Product Development Balanced Scorecard. I developed this specifically to address the area of product management, in an attempt to keep as many stakeholders happy as possible.

The concept is based on the Balanced Scorecard developed by Dr Robert Kaplan and Dr David Norton as a business performance measurement framework. The key is achieving balance across the four quadrants.

I have categorised functionality based on its key benefit to the business, aligned to the stakeholders/end users. All functionality should fit into one of the quadrants below. Some functionality can fit into more than one; in that case you would allocate it to all those that are relevant.

Q1 – MARKET:
Functionality which is entirely new (rather than an enhancement) and/or targeted at a new market (geographical, vertical, horizontal or commercial) comes under this quadrant, along with market trends/forces (e.g. technology, standards). Such work often comes from market analysis and is considered strategic – an investment in future revenue.

Q2 – PRESALES:
Once you are in the market, you still need to win the business. Developing some unique selling points and “Sex 'n Sizzle” in the product can make the difference between winning and losing. Often many suppliers can deliver similar functionality, so winning is about having something that stands out from the crowd. Working closely with presales will inform you of what is needed in this quadrant. This is not to be confused with core functionality (i.e. what is expected); it is about demo'ing and delivering something eXtra.

Q3 – SUPPORT:
All software products need to be implemented, deployed and supported. How much effort is required in each of these phases is as much to do with product design as product functionality. By careful design (and, if necessary, redesign and further development) you can reduce the time to implement a solution (improving revenue recognition) as well as significantly reducing support effort, either by designing out the potential for issues and/or by fully building in diagnostic capability. The net result will be happier customers, reduced support costs, improved revenue delivery and profit.

Q4 – CUSTOMER BASE:
Everybody agrees that your customer base is your life blood and, treated well, will do all the presales and marketing you need. You need to keep customers happy to retain them and reference them. Every customer will have some niggle or wish list, however minor, that if developed in (or in some cases out) via an enhancement will turn them into a raving fan. Make sure you listen to your customer base and identify these gems. As well as turning the customer into a fan and a reference, you will also be able to recognise upgrade revenue as they move to that release, and ensure your customers are on the latest codebase. In many cases customers won't upgrade for any other reason (including Q1 and technology refresh!).





OK, so these are your quadrants. Now you need to balance them. Every development plan should categorise each feature against the scorecard. You can then add up (in both cost/effort and number) the items in each quadrant to ensure it is balanced.




It is important to look, not just at the number of days, but also the number of items. For instance if you have developed 150 days of functionality for the customer base, but this represents only 1 distinct item of functionality you may have problems getting your customer base to upgrade unless you can be very sure that they all will really benefit from it (which is rare in my experience). So in the example above (and graphically below), whilst the cost is largely balanced, the number of items is out of balance.
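As a hypothetical sketch of the balancing arithmetic (all figures invented purely for illustration): cost can look evenly spread while the item count is badly skewed, which is exactly the trap described above.

```python
# Hypothetical development plan: invented figures, purely to illustrate
# how quadrant totals are checked by cost AND by item count.
plan = {
    "Q1 Market":        {"days": 140, "items": 6},
    "Q2 Presales":      {"days": 130, "items": 5},
    "Q3 Support":       {"days": 135, "items": 7},
    "Q4 Customer Base": {"days": 150, "items": 1},   # one big feature
}

total_days = sum(q["days"] for q in plan.values())
total_items = sum(q["items"] for q in plan.values())

for name, q in plan.items():
    print(f"{name:17} {q['days']:3} days ({q['days'] / total_days:5.1%})  "
          f"{q['items']:2} items ({q['items'] / total_items:5.1%})")

# Cost looks balanced (roughly a quarter per quadrant), but Q4 holds only
# 1 of 19 items - so the customer base may see little reason to upgrade.
```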



As with all statistics, analysis and scorecards, they are a guide and there are exceptions to the rule. However, used properly, this scorecard can help keep all your stakeholders happy. Indeed, it can often be used to show stakeholders how you plan to give them – or have given them – a fair slice of the development budget.

Nick