Adapt to the end of 'imperial IT' or risk becoming irrelevant

Image: iStock

With massive budgets and talk of "legacy" technologies, it's sometimes difficult to remember that corporate IT is still a relatively young business function. Workers nearing retirement age likely recall a time when IT did not exist, and the IT leader, in the guise of CIO or CTO, is an even newer invention. There's still vigorous debate in IT leadership circles about who the CIO or CTO should report to, what their ultimate responsibility is, and whether they're needed at all.

In many cases, IT came to exist as a separate business function out of practical necessity rather than any compelling strategic rationale for splitting it off from the rest of the business. In most companies, IT started out as a component of finance or operations, since these functions were early adopters of technology. As IT became more complex and was deployed across the organization, a dedicated IT department, with the requisite leadership structure, made a great deal of sense. In an environment where users lacked the savvy and skills to make educated technology decisions, a distinct cadre of "IT experts" who did all the decision-making was a sensible business move. It gave rise to "imperial IT," in which the corporate IT function could unilaterally define and enforce technology decisions.

If any single event changed this paradigm, it was the advent of the iPhone. Before the iPhone, employees happily adopted company-issued BlackBerries that were secured, monitored, and configured by IT. A device neutered by IT was a fair trade for freedom from the desk. The iPhone went completely against the BlackBerry model, initially providing no centralized management functions and targeting consumers instead. Employees quickly grew disenchanted with their "managed" devices once they saw the power of apps, richer messaging, and personalization. IT was no longer setting the technology standards but being forced to adapt to employees, with CEOs marching into the CIO's office, iPhone in hand, demanding, "Make this work with our email!"

IT goes back to the people?

If imperial IT came into existence because technology was too complex for individual business units and employees to understand and manage, should it continue to exist as technology becomes less complex and users become more knowledgeable about what these tools can accomplish? Provisioning a new application and its supporting hardware once took a cadre of experts; it can now be done with a web browser and a credit card. Does that still require a unique and distinct management organization?

Opponents of pushing IT "to the people" worry that it will produce a hodgepodge of incompatible systems and applications that cannot communicate. A customer might have completely different information and sales history depending on which division he or she is doing business with, and financials and reporting across the company become a nightmare. That is a disconcerting outcome, but insisting that IT remain the gatekeeper of all technology decisions and selections is equally untenable, and a recipe for business units to bypass IT and make unilateral technology choices anyway.

The two competing models for the future

If users are increasingly empowered to select their technology, two broad models for IT exist. In the first, IT quietly transitions into a low-cost, highly flexible organization of implementers and integrators. This organization might transition expensive in-house applications to a cloud platform, or build integrations between business unit applications and cross-company back office applications. In this capacity, IT will have some ability to define technology standards, but attempts to force platforms by fiat will quickly fail. While it may seem inferior to the imperial IT of days past, a highly effective, low-cost IT organization that maintains and connects business-critical platforms is a capability that's in high demand. Standardized interconnections also mitigate some of the risks once associated with the dozens of custom interfaces required to connect disparate platforms.

Alternatively, IT could transition from a highly efficient support organization to a "federation" of technology experts embedded in various business units. This assumes an organization already capable of maintaining the basic platforms and infrastructure, or one that has outsourced this function completely. Rather than implementing and integrating, this model of IT helps define a technology strategy that accelerates business strategy, and does so at the business unit level. Not only technology "experts," these employees are also experts at pulling the appropriate levers to implement a technology vision, whether via internally supported tools or using external providers. Centralized management might consist of a half-dozen boards and committees, rather than a traditional "IT department."

Adapt or become irrelevant

In nearly every company I've worked with, imperial IT departments are quickly becoming marginalized when it comes to setting strategy and building new products and services. In a competitive marketplace, an internal department with long timelines, inflexible platforms, and convoluted chargebacks simply can't compete with external vendors and cloud-based services. In some cases, IT and its leadership are tolerated as a necessary evil, or actively worked around and ignored.

Start by taking an honest look at the capabilities your IT organization is delivering, and how well they meet the needs of a rapidly changing workforce and business environment. Even accountants and ops managers generally aren't using the techniques of a decade ago, so if you're still practicing imperial IT, you may be headed for extinction.
