Placing ITAM at the centre of IT Governance

29 April 2021
9 minute read
Best practice

IT Asset Managers can be critical players in IT Governance. We have the data & insight needed to support informed decision-making and an estate-wide perspective on current priorities such as eliminating technical debt and improving end-user service. This is the value provided by a strategic approach to ITAM. In this article, I’ll explore how ITAM has evolved and continues to evolve to meet the demands of IT and wider business operations.

The evolution of ITAM

At a recent webinar with ServiceNow I explored how ITAM has evolved alongside what I see as four key shifts in the use of IT assets by businesses. Each shift has made what we do as IT Asset Managers more critical to IT Management as a whole, giving us the opportunity to elevate our profession to becoming a trusted advisor to senior leaders.

From Computation to Information Systems

The first shift in business computing came in the mid-70s when computers became information processors, used to inform qualitative business decisions. Until then, their purpose had been largely to compute or calculate. They were job- or batch-based calculators and little more, with few users, concentrated in large secure rooms. Whilst ITAM as a discipline or job role didn’t exist then, there were still tasks familiar to us as modern ITAM managers. In the mainframe era, particularly in educational and engineering settings, there was a need to manage the use of scarce computing resources. Time-sharing feels very similar to cloud computing in many ways – large, centralised computing resources being used by individuals to process their own workloads. Computing in this era tended to be a managed service, another modern parallel. Mainframes and terminals were often managed by the supplier’s army of field service engineers.

With the arrival of integrated circuitry and microcomputing – driven by companies such as Intel, Apple, & IBM – software such as MS-DOS, Visicalc, and Wordstar ushered in the first wave of personal computing. ITAM was about to become relevant due to the growing importance of software license terms, but largely only as a record-keeper.

Client/Server computing

Bill Gates’ goal of a “computer on every desk” started to become a reality in business settings in the early 1990s. Large organisations transitioned from mainframe/mini computing served by terminals first to standalone PCs and then to networked computing. This shift meant that there were more expensive assets running more software on more desks. When I started my IT career my first computing device was an orange-screen terminal connected to a proprietary computing environment provided by Ford Motor Company to its dealerships. There was little need for ITAM in this setting as the entire environment was owned and managed by Ford. However, by the early ‘90s, I had an IBM PC running Windows for Workgroups, a full Office suite, and accounting and payroll software. PCs were expensive, software was expensive, and license compliance risk was a reality, with software industry trade bodies (forerunners of the BSA) being formed in various jurisdictions. It was also the era of the “dongle” – hardware devices designed to prevent the use of unauthorised copies of licensed programs.

An ITAM manager in this era had a lot to keep track of:

  • Physical media copies
  • Hardware dongles
  • Components
  • Software installations
  • Paper licenses

As such, ITAM was firmly ensconced in the physical world, providing a largely clerical function, not much different from Facilities Managers who kept track of floor plans, desks and other furniture, peripherals, and so on. This focus left limited options for advancement or involvement in key decision-making. Until the advent of tooling such as WMI, SCCM, and desktop management software such as Altiris, people responsible for managing IT assets often conducted physical audits, much like a retail stocktake. Naturally, this was focused more on hardware than software.

In time, the need for better management of hardware and software, driven by compliance risk, high profile court cases, and increasing focus on IT budgets, meant that the first dedicated ITAM tools became available. These evolved to equip ITAM professionals to step up and address the issues presented by the next shift in business computing – virtualisation.

Virtualisation

Virtualisation in mainstream business computing began with the launch of VMware’s x86 virtualisation products in the early 2000s. As evidenced by high-profile software license compliance cases over the intervening years, managing the compliance risk of virtual systems has provided a compelling business case for cost- and risk-management-driven ITAM programmes. We’ve seen license terms evolve to keep pace with on-premises virtualisation and now cloud computing. Per-machine licenses became per-processor and then per-core, and some now include restricted cloud-usage rights.
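To see why each metric shift matters, here is a minimal sketch of how entitlement counts change across the three metrics. The license rules, pack size, and per-host core minimum are illustrative assumptions, not any vendor’s actual terms:

```python
def licenses_required(processors: int, cores_per_processor: int,
                      metric: str, core_pack_size: int = 2,
                      min_cores_per_host: int = 16) -> int:
    """Count licenses needed for one physical host under a given metric.

    The pack size and per-host core minimum are hypothetical values,
    chosen only to illustrate how per-core models are often structured.
    """
    if metric == "per_machine":
        return 1
    if metric == "per_processor":
        return processors
    if metric == "per_core":
        # Many per-core models sell licenses in packs and impose a
        # minimum core count per host; both figures here are assumptions.
        cores = max(processors * cores_per_processor, min_cores_per_host)
        return -(-cores // core_pack_size)  # ceiling division
    raise ValueError(f"unknown metric: {metric}")

# A hypothetical virtualisation host: 2 processors, 16 cores each.
host = {"processors": 2, "cores_per_processor": 16}
for metric in ("per_machine", "per_processor", "per_core"):
    print(metric, licenses_required(**host, metric=metric))
```

The same physical box needs 1, 2, or 16 licenses depending on the metric, which is why a change of metric in a renewal can silently multiply the cost of a virtualised estate.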

These license agreement changes directly impact decision-making in virtual environments. Consider, for example, Microsoft’s approach to licensing desktop applications in virtual desktop infrastructure (VDI) environments. At an organisation where I worked, per-device licensing for Microsoft application software resulted in the cancellation of a VDI rollout solely on cost grounds. I suspect that the promise of VDI, from an endpoint management and employee enablement perspective, was delayed by license restrictions from the major players.

What that means for ITAM Managers is that there is a much closer alignment between what we do and what stakeholders such as IT Architects do. Licensing needs to be considered in the early stages of an architecture refresh to accurately build the business case for the transformation. The VDI project I mentioned above ended up with a multi-million-dollar hole in its business case because it wasn’t possible to configure the environment in a way that complied with several license agreements. The outcome was a recognition that ITAM input was required early in the technology refresh process, which resulted in “gates” being placed in the solutions lifecycle to check licensing impact. As a result, the ITAM manager became a key player, involved in high-level management decisions earlier. Finally, ITAM had a role to play in IT strategy and in shaping architecture decisions.

Everything-as-a-Service

Alongside on-premises virtualisation, the last decade has seen cloud computing – including SaaS – come to the forefront. I recall claims that this would mean the end of the need for ITAM because cloud-first organisations no longer owned licensed software and potentially had less hardware to manage as datacentres moved to the cloud. As we know, that’s not the case. Modern ITAM teams are busier than ever, adding management of cloud computing and SaaS to the on-premises assets we’ve managed for the last two decades.

The cloud-first approach, I would argue, makes ITAM more relevant and strategic. We’re now in a position where modern ITAM systems can provide a vast quantity of accurate, normalised software and hardware inventory data. That data is useful in enabling critical business decisions about an organisation’s use of technology. Cloud-first enables businesses to pivot rapidly, deploying the right technology at the right time to the right customer. ITAM teams have the data needed to enable that pivot and to deliver benefits to multiple IT and business stakeholders.

In a recent case study from ServiceNow customer Covea Insurance, ITAM was deployed alongside an IT Service Management tooling refresh. With both tools working from the same centralised CMDB, a range of services was improved, including employee self-service, automation, & Service Desk productivity. Additionally, license compliance risk was reduced. My key takeaway from this case study is that a wider IT Management tools refresh was driven in part by ITAM requirements. I’ve written about this platform-based approach to IT Management previously and I expect it to be an accelerating trend.

Modern ITAM drives IT decision-making

This article has illustrated how IT Asset Management evolved from being a clerical task to being a stakeholder in key decision-making and ultimately driving IT Management change. At the ITAM Review, we see ITAM increasingly as a governance role and part of this is a recognition that we don’t have to retain our own ITAM data silo to deliver desired business outcomes.

Trustworthy centralised data that’s made available securely to stakeholders enables informed business decisions to be made with consistent, accurate data. The many moving parts of a modern IT estate are documented and continuously updated, meaning that multiple stakeholders can deliver against their objectives using the same data:

  • Application Owners build an Application or Service Catalogue
  • Service Desk identify applications requiring installation for a user role
  • Security identify applications at risk of a particular vulnerability
  • ITAM Managers identify unused software
  • Architects identify technical debt
  • Operations identify hardware warranty expiry dates

All this is delivered from the trustworthy data ITAM teams have painstakingly gathered and maintained for decades. Our ITAM functions are experts at discovery and inventory because to manage costs and license compliance risk we need to peer into every corner of our estate. Now the whole of IT Management can benefit from that rigorous approach.
