Wednesday, 23 September 2009

BCS: A lesson in irrelevance or alienation?


The British Computer Society (BCS) was created in 1957 and has about 70,000 members out of more than one million UK IT professionals (figures from E-Skills UK).

A membership of less than 7% after 52 years in existence is not particularly impressive. Given its royal charter and privileged position, you have to question why the BCS fails to attract the majority of IT professionals.

IT training firm Firebrand believes 91% of IT professionals are unaware of the BCS accreditation scheme (CITP). However, I believe the real reason the BCS fails to attract members is not simply its lack of relevance to employees and employers, but its alienation of potential members.

It has tried many schemes to attract members. I remember, many years ago as a teenager, visiting computer shows and seeing the BCS trying to sign up young people to join 'ACE', the Amateur Computer Enthusiasts club for young people who "weren't good enough to be proper BCS members". This rather patronising approach to attracting members - even young ones - unsurprisingly failed!

25 years later, the BCS launched ELITE, "Effective Leadership in Information Technology", but this time open only to members of the BCS. So the journey from patronising to pretentious is complete. The irony is that all you need to be ELITE is to be a member of the BCS and in IT management. Quite frankly, it's harder to become Elite in the classic arcade game (and more cherished, I suspect, even after 25 years) than through this BCS designation.

However, if you want Chartered status, you need to take one more step and sit the test, but given their sample question (see below) I don't think this is much of a barrier either.


So, many IT professionals could be ELITE, MBCS and CITP, yet they shun these titles and prefer instead to seek certifications from Microsoft, Oracle, Cisco, etc. as a means of personal development, recognition and progression.

I am not sure whether the latest revelation from the BCS (trying to create a global IT profession) is another example of corporate delusion, or a recognition of its failure to attract people in the UK and a hope that professionals abroad will be more receptive.

The problem, though, is within the organisation's DNA, which remains elitist and apparently oblivious or impotent in the face of the many IT challenges facing the industry and society. Whilst the BCS purports to be "Enabling the Information Society", the reality is that it has failed to have any significant impact within the industry, even after 50+ years.

Sadly this - more than membership numbers - is the real tragedy. The BCS appears ineffective in making any positive impact on IT in society, government and industry.

As a country needing to diversify from manufacturing to a knowledge economy, there are obvious areas where the BCS - had it achieved a broader membership and greater vision - could have made an impact, including: PCs in every home, internet everywhere, re-platforming schools, UK software independence, open standards, public IT projects and more.

The list is endless. The contribution, it seems, isn't.

Friday, 18 September 2009

"Code Monkeys" and the "Software Eunuchs"


It's odd, given the historical roots of programming - and in many ways its growing complexity - that the term 'Code Monkey' should have arisen.

You don't get 'Lawyer Monkeys' or 'Financial Monkeys', so why has this derogatory term emerged? 

My view is that this is the result of a 'devaluation' of the role of programmers and the increased demarcation of roles that has arisen - at least in part - from the introduction of the Waterfall methodology.

The demeaning term "Code Monkey" suggests that coding is just a basic, simple process of churning out lines of code according to a specification - rather like a copy typist or a translator taking a specification and transposing it onto the computer. The "real brains" lie with the analyst, the designer or the architect (in fact, anyone but the person who built it). No wonder many coders want to become something other than a coder: they don't feel their role is valued.

It may seem a rather minor point, but I believe this has contributed significantly to poor software. Naturally it is demoralising for programmers to be thought of as "Code Monkeys" (either explicitly or implicitly), but more than this, it disengages and divorces them from the analysis, the design, the user - in fact everything except the lines of code they are "cutting" (not sure I like that expression either).

From a coder's perspective, they feel more like a "software eunuch": a coder without involvement, influence or power over their project.

So product development is reduced from a creative team effort to a unidirectional production line ('A' doing what 'B' passed down the line).

This is where you get the typical "swing" characterised so well in the cartoon.

You can just hear the programmer turn round and say "but I'm just the code monkey" - not as a badge of honour, but to excuse their involvement in a project over which they had no influence.

There are many lessons to take away from these experiences, one of which is the need for teamwork and joint ownership of outcomes, not simply the isolated ownership of processes or individual deliverables that might be expected in a production line environment.

The other is the recognition that coding is an intelligent and creative activity not best suited to monkeys of any description, and that intelligent, creative people have more to contribute to the development process than simply cutting code.

This is one reason why I believe the Agile methodology has made a significant positive impact on development, through its much higher involvement and engagement of developers. Certainly this is my experience.

My Three (R)evolution Rules...

1) Don't hire code monkeys.

2) Don't treat coders like monkeys, and

3) (Coders) Don't work like code monkeys.

P.S. They never did get any monkeys to type the works - or even the words - of Shakespeare.

Friday, 28 August 2009

Unit Performance Testing


In my previous post I mentioned the need to test performance early. I always intended to follow up with more detail on how this is achieved, and this entry provides that detail. The fallacy of only looking at performance after system integration testing is now well documented, so there is no need to repeat my earlier points.

Note: I am not saying that system performance testing shouldn't be done, just that Unit Performance Testing should be done as well, to catch problems early on and therefore reduce costs and rework. Unit Performance Testing should be thought of in the same way as unit testing (i.e. not as a replacement for system testing but as a precursor to it).
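As a sketch of what this might look like in practice - the unit name, workload and budget here are purely illustrative, and Python is used only as an example language - a unit performance test can be written just like an ordinary unit test, with a time budget in place of (or alongside) a functional assertion:

```python
import time

# Hypothetical unit under test; the name and workload are illustrative only.
def fetch_widget_summary():
    total = 0
    for i in range(100_000):
        total += i * i
    return total

def assert_unit_performance(func, budget_seconds):
    """Run one unit of code and fail fast if it exceeds its time budget."""
    start = time.perf_counter()
    func()
    elapsed = time.perf_counter() - start
    assert elapsed <= budget_seconds, (
        f"{func.__name__} took {elapsed:.3f}s, budget was {budget_seconds}s"
    )
    return elapsed

# Check the unit against a 2-second budget (e.g. a TAP of 10s split across 5 units).
elapsed = assert_unit_performance(fetch_widget_summary, budget_seconds=2.0)
```

Wired into the normal unit test suite, a check like this flags a slow unit on every build, long before integration.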

However, the next question is how you can measure against TAP (Total Application Performance) at a unit level, before the whole application has been put together. To expand on this, suppose you are developing a functional area of the system which must respond to the user within 10 seconds. You therefore have a TAP metric of 10 seconds for this functional point/use case (this being the typical post-integration performance test target).

However, in this functional area there are five discrete units of code that need development. All of these units must execute to deliver the functionality to the user. They will most likely have different levels of complexity and performance profiles, and could behave slightly differently (in performance terms) depending on how they invoke each other.

The point is, although you can record the performance of each unit of code, how can you possibly derive a target pass/fail mark from the TAP given the variables above? It is this sticking point which has led many to give up pursuing the idea further.

The solution is heuristics and probability. Whilst this doesn't guarantee to catch 100% of all performance bugs, it has a high probability of catching the majority of the really bad ones (and it is these which generally require expensive rewrites).

Heuristics:

Using the previous example of a TAP target of 10 seconds comprising five separate units of code (A, B, C, D, E), I have presented three scenarios above (1-3).

Imagine that only unit "A" has been built so far. What should its pass mark be?

A starting place (without weighting or analysing any of the unit specifications) is 2 seconds. This simply divides the TAP equally across the units (see Scenario 1). If "A" is within this metric then it is probably low risk.

In Scenario 2, unit "A" is taking 5 seconds to execute and should immediately raise a warning flag. It may be that taking 50% of the TAP target is not an issue if this particular unit is doing most of the work (e.g. getting data from the database), but it does flag the need for inspection, and it clearly redefines the acceptable uniform performance target as 1.25 seconds (the remaining 5 seconds shared equally) for each of the four subsequent units (B-E).

Scenario 3 is actually not uncommon. One unit of code - often a significant one - performs exceptionally badly and uses up (and sometimes exceeds) the entire TAP target. The remaining units (B-E) would have to use zero time or the TAP target would be failed. Given that this is unrealistic, unit "A" needs to be profiled and optimised to reduce its execution time.

So, to conclude: a very simple approach, requiring little upfront analysis, provides a mechanism to identify units of code that may contribute to failure when full system performance testing is conducted. The three scenarios are indicative (GREEN: within the allocated unit limit; AMBER: above the allocated unit limit but below the TAP limit; RED: at or exceeding the TAP limit).

It is also worth noting that sometimes "A" has already been written in a previous development cycle and, whilst performant then, now performs badly in the context of its usage/invocation by unit "B". For example, say the purpose of unit "A" was to process thousands of widgets in a day, but unit "B" is asking it to process a single widget over a thousand days. The subtle difference may not have been apparent at the time, but "A" had been optimised to iterate over widgets, not days, and the new use case is causing it to perform badly.

In this example, unit "A" isn't even on the radar (as it already exists), but through the RED/AMBER flagging of unit "B", the subsequent profiling reveals that 80% of unit "B"'s processing time is taken up by "A". In response, it may be necessary to rewrite, enhance or optimise unit "A". This in turn may affect any other units relying on unit "A". Again, all of this is identified long before system testing is conducted.

Weighting:

As shown above, even a simplistic rule-of-thumb approach can identify and capture issues early on, and it is worth piloting to gain further insight into how the approach can be customised within the context of your development processes.

The unit targets can be further refined with upfront involvement from architects on the unit specifications. Upfront analysis should be able to weight the units in terms of their likely performance impact, and these weightings can be used to modify the time allocation across the units (i.e. moving from an equal distribution of time to a weighted one).

For example, an obvious default weighting might be 3 for database activity, 2 for processing/business logic, and 1 for GUI/presentation. The higher the weighting, the longer the predicted execution time.

You would then add up the weightings and divide the TAP target by the total; this gives you the value in seconds (or part thereof) of a single weighting point, which you then apportion to each unit according to its weighting.

For example, say that "A" gets data from the database and is weighted as 3, while the remaining four units (B-E) are simply presentation and are weighted as 1 each. This gives a total weighting of 7; the TAP of 10 divided by 7 gives roughly 1.43 seconds per weighting point.

From this you derive that the acceptable performance for unit "A" is 3 × 1.43 ≈ 4.29 seconds, with each of the other units limited to about 1.43 seconds.
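The weighted allocation just described reduces to a one-line formula; here it is as a small illustrative Python sketch (the function name and the dictionary shape are my own choices, not anything prescribed above):

```python
def weighted_budgets(tap, weights):
    """Split a TAP target across units in proportion to their weights."""
    unit_time = tap / sum(weights.values())  # seconds per weighting point
    return {name: w * unit_time for name, w in weights.items()}

# Weightings from the worked example: database unit A = 3, presentation B-E = 1 each.
budgets = weighted_budgets(10.0, {"A": 3, "B": 1, "C": 1, "D": 1, "E": 1})
print(budgets["A"])   # about 4.29 seconds (3 x 10/7)
print(budgets["B"])   # about 1.43 seconds (1 x 10/7)
```

The budgets always sum back to the TAP target, so no time is silently lost or double-counted in the apportionment.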

Whilst no formula is guaranteed, the use and refinement of these tools will enable you to catch a greater proportion of performance issues early on, reducing significant rework, costs and delays.


Thursday, 9 July 2009

Google "Kill Bill III" & Larry Strikes Back


As predicted in October 2008, Google announced yesterday that they are bringing out a new operating system for laptops, netbooks, etc. As previously thought, Chrome is going to be at the heart of the system; in fact I suspect that - other than providing speed, simplicity and security - the operating system is only going to play a "supporting role", powering the browser to do its job.

This brings Google into the "stack-owning society". Microsoft own the stack from Windows to Internet Explorer, Apple own OS X to Safari (not to mention the hardware), and of course Sun had Solaris and Java. Oracle, finally, was also building its own stack (and now owns Sun's to boot).

So the prediction that Google would go this way was perhaps not the greatest prophecy! Of course, they haven't necessarily got exactly the same building blocks, but it would seem the key ones converging are the OS and the browser (though don't rule out hardware just yet).

So, firstly, how will Google differ from Microsoft? Well, the paradigm is different. Microsoft's "ecosystem" is built around fat (OK, "rich") client software on top of its fat operating system. Google, on the other hand, has its ecosystem firmly in the cloud. The operating system isn't (yet) the destination, but a departure lounge for the web-based applications rendered by Chrome.

It is unlikely that Google will kill Microsoft with its new OS; in fact the OS is potentially quite complementary (though I'm sure Google will not admit it). But what it will do - perhaps not immediately - is start to erode Microsoft's monopoly.

Google's new operating system may well be perfect for netbooks. Windows 7 isn't going to get close to the speed I suspect Google will deliver on a netbook. My guess (having used Windows 7 on a laptop and XP on a netbook) is that Windows 7 on a netbook will be comparable with Windows XP. But that is still slow!

If Google's operating system is as good - and as fast - as Chrome, then I think we could see netbooks migrating towards Google. In addition, I suspect Google's OS will be free versus Microsoft's chargeable Windows, and when applied to low-cost netbooks, users will have more than one reason to choose Google.

Given the power of netbooks and their reduced screen/keyboard, you won't necessarily want to use one for much more than browsing. However, as Microsoft well knows, the power will increase, and as users get acclimatised to the Google OS and third parties start developing for it, the ecosystem will develop and potentially spill over into laptops and PCs. This is Microsoft's greatest fear.

Microsoft is vulnerable. Vista was a disaster. They got "disrupted" by the netbook and had to resurrect XP for netbooks while putting Windows 7 (Vista 1.1) on an Atkins diet to stand any chance of creating an upgrade path from XP. But they are still disrupted: a fat OS isn't actually right for netbooks, and given the price point, the monetisation model is wrong too.

Add to this that Microsoft's Internet Explorer has nose-dived to about 40% usage (second place to Firefox), and you have to conclude that Microsoft is not dead but wounded in the key "platform" space. The platform is where the war is being fought: the OS runs the hardware, and the browser is increasingly the application engine.

Time now to predict Act 2: Larry Strikes Back.

The real author of this drama, of course, is Larry Ellison, who tried to do what Google are doing about 14 years ago. It wasn't called the netbook or Chrome then, but in 1995 he brought out the "network computer", a scaled-down computer for a network-centric world (read: cloud/web). Of course, he was ahead of his time and it failed. But now is the time, and what he tried to do then is becoming a reality now, led not by Oracle but by Google.

However, Larry is not one to be left out of anything. He has the cash and - with the Sun acquisition - quite a bit of the technology to launch a second front against Microsoft. In the previous "Microsoft monopolistic paradigm" it would have been crazy for yet another OS and browser to fight over the crumbs left by Microsoft. But things have moved on. Microsoft doesn't have a monopoly on the browser, and could quite easily be squeezed out of the growing netbook market, leaving Google and Oracle in a three-way fight (I'm assuming here that Apple remain fixed on the high-value "up-market" space).

Now, Oracle have the capability to build the kit (i.e. Sun) as well as to build on top of the kit (Solaris and Java). Given that this was always close to Larry's heart, and that he loves Microsoft just about as much as he loves Google, I think it won't be long before he announces his "OracleTop". Where he may go for one-upmanship is in adding one more item to the stack: whereas Google and Microsoft have to work with third-party generic hardware, Oracle could build their solution from the ground up, from hardware to browser - just like Apple.


Monday, 23 February 2009

GoogleTop II

In my October 2008 post, I stated that Chrome was perhaps just a precursor to Google trying to own the entire Stack (Operating System/Browser/Web app) and that Android was their first stab at this via the mobile phone.

It now looks like Google and/or others will use or extend Android to do just that. Asus is apparently considering deploying Android on their netbook, and in January 2009 it was reported that techies had already achieved this on an Eee PC.

If they do meld Android with Chrome and the Google Apps, then - with the Google marketing machine behind it - you can see a clear strategy and an alternative to Microsoft in some areas.

I still believe Splashtop and its variants could be a real challenger in the mid term, but this news isn't good for Microsoft, particularly in the low-end netbook space.
See the latest news on Freescale's $100 netbook with Android. With Android's reduced computing requirements, netbooks can be even cheaper, further undermining Microsoft, given that Wintel netbooks have to be more powerful and carry an additional licence cost.

Tuesday, 17 February 2009

The Netbook Revolution

I am fortunate to be writing this post on my new Dell Mini 9 Netbook which I received gratefully as a birthday present. Since it was from my wife I had some say in just what netbook was chosen and what operating system loaded.

I agonised over the operating system for quite a while. I did not want a legacy operating system - read Windows XP - loaded. I would certainly NEVER go for Vista, even if it were "possible", and Windows 7 (which would be viable) is not out yet.

There are in fact only two options right now: Windows XP or the Linux variant Ubuntu. I wouldn't mind playing around with Ubuntu, but not fighting with it, and the general feedback is that the return rate (back to the shop) is four times higher for Ubuntu than for Windows.

This is not to say that the problem is solely down to Ubuntu (it is apparently the easiest of the Linux variants to use), or simply culture shock for those used to Windows (although that is also likely a factor). Whilst these are contributory factors, I think the bigger issue is drivers, both in terms of availability and usability. For example, I use Vodafone mobile broadband, which works on Windows or Mac but not Linux; whilst this situation may change, the issue then becomes the ease of installing and configuring drivers on Linux. The latter is not something Linux is known for, and when you think of the myriad devices that you often connect to your laptop (printers, memory sticks, USB drives, cameras, MP3 players, etc.) you start to get a little bit nervous.

This is NOT to say that these problems are insurmountable, just that I spend enough time sorting out IT problems at work and don't really want to create them at home.

So like many techies and non-techies the answer was Windows XP.

This is of course good news for Microsoft. There was a real opportunity for Linux to become a de facto standard on netbooks due to the inability of Vista to run on these low-end devices. In fact, given the huge popularity of these devices - my wife has just eBayed her Apple laptop and ordered a Dell Mini 9 for herself - it could have dealt Microsoft a serious blow. Instead, Microsoft have merely been wounded in terms of lost revenue (XP costs less than Vista) and pride (the fat OS is the couch potato left watching the two-horse race).

So is this the end of the Linux Attack on the Laptop?

I don't think we have seen the last of Linux at all, but as I've previously blogged, I do not believe a "full frontal attack" by Linux will succeed. By this I mean that giving users an ultimatum of Linux vs Windows - even an easier Ubuntu vs legacy Windows XP - will result in users turning to Windows.

Why? Because Linux simply isn't a complete replacement for Windows, and there is enough doubt among the Windows user base to prevent them taking a step into Linux on the off-chance that Linux is enough for their personal needs.

That is why Splashtop is so interesting as a Windows Trojan horse: it doesn't take on Windows directly, nor does it pretend to be a replacement, just an alternative OS for some specific tasks, particularly the web. The web is now a major part of our computing life, and Splashtop (including its variants) offers some unique abilities, like almost "instant on" and hugely extended battery operation.

And whilst this feature hasn't filtered down to netbooks yet, I can see it being another transforming feature of them. Why? Well, netbooks are the perfect size to carry around with you, but you don't want to be waiting longer for the laptop to start up than for the latte you ordered to arrive. Neither do you want to carry a power supply brick with you, or be married to the local power point of the coffee shop.

Given that netbooks have less memory and slower CPUs, a cut-down version of Linux also makes sense. This will address one of the few remaining inhibitors preventing netbooks from becoming as popular and widespread - perhaps even on the street - as mobile phones.

Let's look at the other criteria already met:

Cost. Netbooks are half the price of laptops, and a number of mobile phone resellers are offering them free with a mobile broadband contract. Let's face it, netbooks are the same price as, or cheaper than, many mobile phones!

Internet. Relying on local wifi, and putting up with the charges some places make, has been prohibitive. With mobile broadband you get internet wherever you go (assuming you get a signal).

Data. One of the reasons you didn't take your laptop with you was fear of theft, loss or breakage. This was partly down to the cost of replacing the laptop (see the point on cost above), but also the loss of data (often irreplaceable). However, with Web 2.0 you can work online with data held centrally, or use one of the online storage solutions such as www.box.net to secure the data on your laptop.

With online applications and data, the demise of your laptop is less of an inconvenience than losing your mobile phone. Funnily enough, halfway through writing this post I was evicted from my netbook by my wife. Since I was blogging online, I simply logged into my laptop and continued the blog. If I had been writing a document using an online word processor it would have been a similar story.

The only thing left is the ability to "lock" your netbook in the same way mobile network providers can lock mobile phones when they are lost or stolen.

For those interested in a review of the Top Ten Netbooks click HERE

Tuesday, 3 February 2009

Ballmer's 7


After many months of tolerating and suffering Vista, I could take it NO more. I wanted to downgrade to Windows XP as soon as possible. However, after investigation I was told that XP did not have drivers for my newish laptop, so going backwards was not possible.

Faced with the intolerable present - that would be Vista - and the impossibility of the past, I opted for the only remaining option: "to boldly go where no one has gone before". Well, the future anyhow, in the form of Windows 7 build 7000 with its lovely "for testing purposes only" warning.

So, having ventured into this pre-release version of Windows 7 on my work laptop, I have the following to report after almost two weeks of continuous usage...

Windows 7 is what Vista should have been! Even as a beta it is more stable, more reliable and considerably faster, even compared to XP.

Connecting to wifi with Vista was a black art: it might work instantly, or you could be faffing around for an hour having to reboot the laptop. With Windows 7 it works seamlessly, even coming out of hibernation onto another network. Shutdown (which could take hours on Vista) is perfectly fast, as are startup and hibernation.

It's not all perfect - there have been a few issues with a printer driver and with installing one application - but otherwise everything has installed (including drivers) as it would with Vista. So it looks like they have maintained quite a degree of compatibility.

Searching is also fast, and there are none of the unexplained slowdowns and freezes that would occur with Vista. This, mixed with a mostly improved GUI, is quite impressive given that I am comparing a pre-release beta version of Windows 7 with a post-release, service-packed (SP1) Vista.

7 is certainly lucky for Ballmer, particularly in relation to version 6 (Vista). Funny, though, that Windows 7 isn't really version 7 but actually 6.1.7000, which you could argue makes it a bit of a point release on Vista. I'd agree in this respect: Windows 7's major differentiator from Vista is not functionality but quality and performance. It is Vista, working.

I certainly won't be going back to Vista and am looking forward to the full release of version 7. I believe it just may help settle the nerves of the Microsoft fellowship. It isn't Midori, of course, nor is it anorexic when it comes to size, but it has definitely been on some kind of Atkins diet, with vastly reduced memory requirements.

Microsoft says it will even run on netbooks, which generally have 1GB or less of memory and Intel Atom processors. Will it run well, though? With the performance comparisons I've seen of Windows 7 versus XP, I suspect it may actually pull this one off as well. However, it is not yet clear whether all versions or only specific ones will run on netbooks.