Integrating multiple data streams is critical to achieving success with automated IT Asset Management.
The saying “garbage in, garbage out” still rings true, and your data makes the difference between getting the right answer from your system and simply getting any answer.
Here are my top tips for managing those data streams:
GOALS: Understand what outputs you want from the data sets you are merging. Identify which operational and management reports you need, and which third-party systems (finance, service desk) the merged data will feed. You will also need to identify what data users will be updating on a day-to-day basis. There is little point in joining all the company's systems together and duplicating large quantities of data if there is no use for it.
DATA QUALITY: Get the data quality right; you should be aiming for 95%+. Poor-quality data is information that is out of date, badly formatted (e.g. text in date fields) or irrelevant to the project. Don't import data with low integrity; it will only hamper the system's evolution and its adoption across the organisation.
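As a rough illustration (the column names and date format below are assumptions, not from the article), a few lines of Python can act as a simple quality gate before anything is imported:

    # Minimal data-quality gate before import (hypothetical field names).
    from datetime import datetime

    REQUIRED = ["serial_number", "computer_name", "purchase_date"]

    def is_clean(row):
        """Return True only if the row meets basic quality rules."""
        if any(not row.get(field) for field in REQUIRED):          # missing values
            return False
        try:
            datetime.strptime(row["purchase_date"], "%Y-%m-%d")    # catches text in date fields
        except ValueError:
            return False
        return True

    def quality_report(rows):
        clean = [r for r in rows if is_clean(r)]
        pct = 100 * len(clean) / len(rows) if rows else 0
        return clean, pct

    rows = [
        {"serial_number": "ABC123", "computer_name": "LAP-001", "purchase_date": "2023-04-01"},
        {"serial_number": "XYZ789", "computer_name": "LAP-002", "purchase_date": "April 2023"},  # bad date
    ]
    clean, pct = quality_report(rows)
    print(f"{pct:.0f}% clean - import only if this clears your 95% target")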
PRIMARY KEY: Whether it's computer name, serial number or another field, make sure you have a suitable primary key to join the data sets together. Good-quality data and a primary key should allow you to integrate data seamlessly into the target system. I always add a secondary unique key to data being imported from spreadsheets. It can be stored in a spare field in your target system, which means you can always trace a record back to its source, and it lets you add extra data fields later without re-importing the entire data set.
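A minimal sketch of that approach in Python, assuming serial number is the agreed join key and using a generated UUID as the traceable source key (the field names are illustrative, not from the article):

    # Stamp spreadsheet rows with a traceable source key, then join on serial number.
    import uuid

    def add_source_key(rows):
        """Give every spreadsheet row a unique ID to store in a spare field of the target system."""
        for row in rows:
            row["source_key"] = str(uuid.uuid4())
        return rows

    def join_on_serial(discovery_rows, finance_rows):
        """Merge two data sets on the agreed primary key (serial number here)."""
        finance = {r["serial_number"]: r for r in finance_rows}
        return [{**asset, **finance.get(asset["serial_number"], {})} for asset in discovery_rows]

    discovery = add_source_key([{"serial_number": "ABC123", "computer_name": "LAP-001"}])
    costs = [{"serial_number": "ABC123", "purchase_cost": "899.00"}]
    print(join_on_serial(discovery, costs))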
FEDERATION: You can choose systems that let you federate data. Instead of duplicating data between databases, the system looks up data from a second system on the fly. The appeal is that your data is always up to date, rather than relying on scheduled batch transfers that run hourly or overnight.
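A toy Python sketch of the idea: the owner field is resolved from the second system at read time rather than being copied across; the HR lookup below is purely a stand-in for a real API or linked-database call:

    # Federation: resolve fields from the second system at read time, not by batch copy.

    def fetch_owner_from_hr_system(serial_number):
        """Stand-in for a live call to a second system (e.g. an HR or CMDB API)."""
        hr_records = {"ABC123": "Jane Smith"}
        return hr_records.get(serial_number, "unknown")

    class FederatedAsset:
        def __init__(self, serial_number, computer_name):
            self.serial_number = serial_number
            self.computer_name = computer_name

        @property
        def owner(self):
            # Looked up on the fly, so it is always current - no overnight transfer needed.
            return fetch_owner_from_hr_system(self.serial_number)

    asset = FederatedAsset("ABC123", "LAP-001")
    print(asset.owner)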
KEEP IT SIMPLE: Don't over-complicate your system. There are web-services-based integration systems out there that cost around £50K; for that you get bells and whistles, although it still won't make you a cup of tea. Whilst these may be technically excellent systems, providing multiple exception reports and data quality checks, they may be completely over-specified for your requirements. Sometimes a simple SQL extract from one system and an import into another is sufficient. Keeping integrations as simple as possible minimises cost and potential design flaws, and when the integrators walk away, the system should be simple to maintain.
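A minimal sketch of that extract-and-import pattern, using in-memory SQLite databases as stand-ins for the source and target systems (the table and column names are made up for illustration):

    # Simple extract from one system, import into another.
    import sqlite3

    source = sqlite3.connect(":memory:")
    source.execute("CREATE TABLE assets (serial_number TEXT, computer_name TEXT)")
    source.execute("INSERT INTO assets VALUES ('ABC123', 'LAP-001')")

    target = sqlite3.connect(":memory:")
    target.execute("CREATE TABLE imported_assets (serial_number TEXT, computer_name TEXT)")

    # 1. Simple SQL extract from the first system.
    rows = source.execute("SELECT serial_number, computer_name FROM assets").fetchall()

    # 2. Straightforward import into the second system.
    target.executemany("INSERT INTO imported_assets VALUES (?, ?)", rows)
    target.commit()

    print(target.execute("SELECT * FROM imported_assets").fetchall())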
TEST: Test, test and test again. A successful data integration project depends on planning and then testing the system repeatedly with different data, particularly if the integration will be run on a recurring basis. Getting the bugs out during the test phase saves a great deal of time later.
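One lightweight way to do this is to keep a small suite of sample records and expected results that you re-run after every change; the field mapping below is purely illustrative:

    # Re-run the same integration logic against varied test data before going live.

    def transform(row):
        """Example mapping from a source field name to a target field name."""
        return {"SerialNumber": row["serial_number"].strip().upper()}

    test_cases = [
        ({"serial_number": "abc123"}, {"SerialNumber": "ABC123"}),
        ({"serial_number": " abc123 "}, {"SerialNumber": "ABC123"}),  # stray whitespace
    ]

    for source_row, expected in test_cases:
        assert transform(source_row) == expected, f"Failed for {source_row}"
    print("All integration test cases passed")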
DOCUMENTATION: Make sure you get complete integration documentation and copies of any code or scripts. This may cost you a little extra but is worth every penny, especially when it comes to maintaining the integration in the future.

Alasdair Robertson, CTMS