Updated: Nov 24, 2020
This story originally appeared in the CIO Ink blog and is republished here with permission.
In a perfect world, CIOs would have the bandwidth to ensure that every piece of data is perfectly managed and properly handled. But in the real world, no technology team has the time or resources to handle all of the information that bombards us every second of every day. Just how bad is the problem?
According to DOMO, by 2020 an estimated 1.7 MB of data would be created every second for every person on earth. There is absolutely no way for human-centred systems to keep up with that many bits and bytes. On top of the issues that sheer volume creates is the problem of quality: data must be of high quality to be useful. There is little point in collaborating on poor-quality data, but today, enforcing guardrails on metadata and data formats is far too complicated to be done manually. Data governance can only be achieved at scale with automation.
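As a minimal sketch of what an automated guardrail might look like, consider a validator that checks incoming records against declared format rules. The field names and patterns here are purely illustrative assumptions, not drawn from any particular product:

```python
import re

# Hypothetical guardrail rules: required fields mapped to expected formats.
RULES = {
    "customer_id": re.compile(r"^C\d{6}$"),          # e.g. C123456
    "email": re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
    "country": re.compile(r"^[A-Z]{2}$"),            # ISO 3166 alpha-2
}

def validate_record(record: dict) -> list[str]:
    """Return a list of guardrail violations for one record."""
    errors = []
    for field, pattern in RULES.items():
        value = record.get(field)
        if value is None:
            errors.append(f"missing field: {field}")
        elif not pattern.match(str(value)):
            errors.append(f"bad format for {field}: {value!r}")
    return errors

good = {"customer_id": "C123456", "email": "a@b.com", "country": "IN"}
bad = {"customer_id": "123", "email": "not-an-email"}

print(validate_record(good))  # []
print(validate_record(bad))   # three violations
```

A real governance pipeline would run checks like this automatically on every ingest, which is exactly the kind of enforcement that cannot be done by hand at modern data volumes.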
Automation has existed for decades, but today’s environment, which includes high-powered smartphones, remote workers and contractors, and the Internet of Things, has made it impossible to implement a single solution to manage all data. We are past the days when a company firewall could control all information and make sure that it is used in a way that complies with its policies on privacy, storage, and security.
According to Statista, over two million apps are available for download on the Google Play store, while 1.8 million apps are available on the Apple App Store. Mix in the millions of platforms and tools that companies use to maximize the value of their data, and the scope of the problem is apparent. This is where data collaboration comes into the picture.
The problem is not the apps themselves—it is that these apps were not designed to talk to each other. It is possible to move data between them, but that does not amount to real integration.
As a result, a multi-billion dollar industry has grown up around building patches to let systems talk to each other. It is clumsy and clunky, and what it looks like in the real world is downright scary. Every handoff, every data handshake between systems is an opportunity for a security breach, data loss, or a delay that can hurt productivity and profitability. There is a reason why banks have to reconcile all of their account data once a day rather than in real time: there are so many systems to coordinate that synchronising them in real time is almost impossible.
One of the biggest issues that organisations face is that the marketplace for applications in the data space is immature. It is a lot like where we were in the security space 10 years ago, when IT departments were forced to cobble together many systems to actually get what they needed. Over time, of course, the security sector matured, and today there are a number of excellent full-service platforms out there. I think that is where the overall “app economy” is also headed.
A major reason for that is not even a technology problem at all: it is a talent issue. As every CIO knows, it is incredibly difficult to find data experts, so the technology will have to evolve to fill that gap. This is where automation comes into play. Organisations will need tools and automation to master data and then use that mastered data in all of their transactional systems. Once that is done, they will have the ability to leverage commonly defined data for enhanced analytics that can be automated and leveraged in real time.
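To make "mastering data" concrete, here is a toy sketch of the idea: records for the same entity arrive from different systems under slightly different names, and an automated matcher groups them under one canonical key. The normalisation rules and sample records are illustrative assumptions, far simpler than any production master-data tool:

```python
import re
from collections import defaultdict

def normalise(name: str) -> str:
    """Canonicalise a name: lowercase, strip punctuation and common suffixes."""
    name = re.sub(r"[^\w\s]", "", name.lower())
    name = re.sub(r"\b(inc|ltd|llc|corp)\b", "", name)
    return " ".join(name.split())

def master_records(records: list[dict]) -> dict[str, list[dict]]:
    """Group raw records from multiple systems under one canonical key each."""
    groups = defaultdict(list)
    for rec in records:
        groups[normalise(rec["name"])].append(rec)
    return groups

raw = [
    {"name": "Acme, Inc.", "source": "CRM"},
    {"name": "ACME Inc", "source": "Billing"},
    {"name": "Globex Corp.", "source": "CRM"},
]
merged = master_records(raw)
print(len(merged))  # 2 distinct entities
```

Once records are mastered this way, every transactional system can reference the same canonical entity, which is what makes shared, real-time analytics possible.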
We are not there yet, but the evolution of automation in the data governance space has already started. Over the next two to three years, I fully expect automation to become part of the mainstream as organisations realize that they cannot meet their business and technical goals without it.
And 10 years from now it will be inconceivable that we ever lived without it.
The author is SVP & chief information officer at Rocket Software. Views are personal.