Why is the consumer vs. enterprise technology debate affecting IT departments so much?
I think we have reached an impasse in how the IT organisation maintains and supports IT assets. Over the last 20 years we perfected a system of managing IT assets that was founded on the assumption of predictability. We lost our way in the late 1990s – there was a lot of complexity back then, and IT organisations were desperately trying to deliver on the initial promises that IT had made: automation and standardisation of business processes. Organisations gradually built up broad portfolios of management processes and infrastructure to support their PCs and applications, but the correct functioning of all of it was predicated on knowing exactly what assets employees had on their desks or in their hands.
Now, because most IT management ‘best practice’ is focused on managing the known, it has got to the stage where many IT departments can no longer deal with the new. This, combined with the diminishing returns that come from renewing existing IT assets, has created barriers to technology change and a lack of appetite to invest more than the minimum. Many IT organisations would have been happy to stay just where they were, in technology terms.
Meanwhile, we have all seen the ongoing emergence of consumer technologies that are ‘good enough’ for many of the tasks we do at work. Consumer IT has essentially provided a shop window where individuals can see new capabilities emerge much more quickly – the rate of technology change there has actually accelerated in the last six or seven years, while in enterprise IT it often feels like things have ground to a halt. It’s hard to have new capabilities at home that would make your work easier and more efficient without wanting to take advantage of them at work. Mobility is bringing this all to a head, with personal consumer technology naturally entering the enterprise space as a result.
Has consumer-grade technology overtaken enterprise-grade technology?
That’s a complicated question, and I don’t think you can begin to answer it by just looking at individual pieces of equipment. If you only look at your business processes and systems in terms of the assets they’re built from, then you can only evaluate whether or not each part meets the standard of ‘good enough’; you can never really see where you gain something better, or new – some new capability that you would not have had access to otherwise.
By looking at a business’s IT systems simply in terms of equipment, CIOs are missing a key opportunity to assess better, re-evaluate and invest more wisely. Generally, CIOs don’t invest in the PC refresh cycle with any expectation of a productivity increase; they reinvest because they have to, and they replace with fairly similar equipment. This is where the diminishing returns come in – why would you continue to invest in assets that don’t deliver any additional business benefit?
Now we have reached the point where consumer-grade is good enough to rival enterprise-grade. The additional value we previously got from ‘enterprise class’ equipment has been eroded by those diminishing returns, and with the volumes in the consumer market now so large, economies of scale can make consumer-grade technologies a better business choice. So the mass-market equipment consumers buy can do the same job just as well, but that doesn’t mean it has overtaken enterprise kit – it has just passed the line of good enough.
What is the primary constraint on IT departments that has resulted in the rise of consumer technology in the workplace?
As Europeans, we currently live against a backdrop in which austerity contends with growth – and this is mirrored in the IT organisation. Most IT organisations spend 60-80% of their budget on operations – not on doing new things with technologies, but on keeping existing IT assets and capabilities running. Clearly, most of that spend is a prime target for the austerity of IT budget cuts. That means they don’t have the budget or processes to embrace and manage all the new technology ‘stuff’ that is out there. This is a big problem, because much of the productivity growth in our economies over the last 50 years has been driven and enabled by technology, and there is seemingly no budget available to harness that.
So there’s the dichotomy for IT – budgetary pressure, the belief that existing IT is good enough, plus the natural instinct to resist change and stick with what is known, versus a growing requirement for change in order to take advantage of new technology.
How are IT departments trying to address this dichotomy?
This dichotomy is being addressed to some extent. The Mobile Rebels survey showed us that smart businesses are now looking at how to tap into the innovation of their employees and take advantage of their use of personally owned devices – they are harnessing the technology experimentation of their workforces as a test bed for new capabilities, without having to fund it themselves.
But the wider issue here is a perception one – how the IT department itself is seen within the business. In many cases, IT is viewed pejoratively by the organisation – as a cost centre, not as a centre of innovation. Those now taking advantage of their users’ desire to use the new will be among the first to break that perception.
You can read more about our Mobile Rebels research here [will link to when UK/homepage post is live]. Keep an eye on the blog for more thoughts and advice from Brian.