Alastair Mair looks at the current state of information management, suggesting that companies need to go back to basics if they are to ensure they are using the right information platforms.

Businesses need information to survive. Technology has provided many tools over the last 30 years in an attempt to satisfy ever-growing information delivery demands. The culmination of these tools and facilities is embodied in the services now emerging on the world wide web, where an enquirer can format their own view of the information being interrogated and have the results delivered by return message. Achieving such results requires appropriately `performant' software, but underlying every system is a need for diligence and care to ensure that the data is consistent, up to date and of guaranteed integrity. The penalties for non-conformance can be immediate and damaging.

Although the internet has flung the door wide open to customer-facing data, the information sources within organisations are similarly challenged. Internal systems are under extreme pressure to deliver access to data at least as fast, and certainly as accurately, as their client-facing counterparts. A lot of time and money is spent each year on information processing systems and supporting technology to meet corporate and customer demands. But on many occasions the investment has been mismanaged, to such an extent that an environment for corporate misinformation has been created. With the advent of an `information on demand' culture, this situation can inadvertently damage or even cripple an organisation.

The pressure on information delivery
The top business drivers at the forefront of commercial strategy are:

  • reducing costs;

  • streamlining processes;

  • increasing productivity;

  • creating new demand;

  • stimulating and generating customer loyalty; and

  • providing information anywhere, anytime.

    Companies constantly re-evaluate how they can achieve business economies. Effective cost control remains one of the most important features of corporate strategy. It drives the introduction of new technology and methods, challenging implementors to demonstrate short- or long-term savings or benefits.

    In an effort to reduce costs and increase shareholder value, control has been gradually distributed to areas of businesses that can best manage it. Departments and business units have been empowered to decide how to operate cost-effectively without reference to higher management. The visible cost of central administrative overheads has been reduced as a result.

    In changing the organisation to become more self-sufficient, a heavy investment has been made in information processing and information management solutions. Staff have been given systems which circumvent the bureaucratic intrusion of central information services (IS) or information technology (IT) departments, and front-line business performance has improved. The financial criteria of improved processes, productivity and cost savings have been adequately matched by the marketing goals of satisfying customers and generating growth.

    But are the solutions as effective as they seem? Are the bottom line savings genuine? What is the effect on the organisation's most valuable corporate asset - information? Has enough thought been applied to all the facets that are affected?

    In many instances, too little has been understood about the full impact of these corporate improvement programmes. This is not to suggest that companies aren't saving money, or that they haven't improved their efficiency. But the routes they have chosen are often short-term, short-sighted `point' solutions, which very likely ignore the full extent of what might be achieved. Worse, they may actually lead to wasted money and time in the very near future. From an organisational perspective, the devolved responsibility can be costing the company more than it thinks, and certainly more than was anticipated in any of the original proposals.

    Corporate information is in grave danger. Although it can be accessed, analysed and delivered promptly and accurately, the control and management disciplines required to ensure its relevance and usefulness are often missing. Information has therefore become open to misuse and abuse.

    Organisations that haven't considered and implemented an information management policy that is sensitive to the modern requisites of accessibility and delivery will fall behind their peers.

    The modern business model
    The challenge for businesses today is not very far removed from that of 100 years ago - how to conduct business faster, cheaper and better than one's competitors. In the early 1900s, voice communications embodied the new technology, and a century on it is the internet. The evolution of the internet as a mainstream medium means that delivery can be much faster - but so can the exposure of mistakes.

    No company that wants to survive in the 21st century can afford to ignore the new business environment created by the internet. It is here to stay, but as well as presenting many competitive opportunities and advantages, it also brings threats and pressures which could force some companies to their knees.

    E-business will become embedded in any surviving organisation's processes. This radical shake-up is happening now. Fundamental to an e-business is information. Businesses can't exist without information - they publish it externally to promote services, and generate it internally to control operations. In the brave new world, the accuracy and timeliness of information will become more important still. And productivity will be determined as much by widespread, accurate access to information as by the efficiency of business processes.

    The internet and its siblings, intranets and extranets, provide the means to deliver information simply and quickly to a large number of users. The emergence of this channel comes at a time when knowledge management has been steadily increasing in visibility in organisations.

    But in parallel to these developments, organisational structures have become flatter. In a lot of industries the role of the middle manager as an information channel is disappearing. The presence of a conduit to control information has been eliminated. So who now pushes the information up and down the hierarchy? Who checks its validity?

    Examples already exist of data posted on websites being embarrassingly wrong - Compaq computers for sale from a website for £1, presumably because someone somewhere put the decimal point in the wrong place. Someone goofed big-time - either the information was flawed, or the process was. By the way, some 50 people who actually took advantage of the mistake are now pursuing legal action against Compaq.
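
    A simple sanity check in the publishing process would catch this whole class of error. The sketch below is purely illustrative - the names, thresholds and logic are hypothetical, not a description of Compaq's systems - but it shows the principle of validating a figure against a plausible range before it ever reaches the website:

        # Purely illustrative: reject implausible prices before publication.
        # The product names and thresholds here are hypothetical.
        class SuspectPriceError(ValueError):
            """Raised when a price falls outside the plausible range."""

        def validate_price(product: str, price: float,
                           floor: float = 50.0, ceiling: float = 10000.0) -> float:
            """Pass a plausible price through; flag anything else for review."""
            if not floor <= price <= ceiling:
                raise SuspectPriceError(
                    f"{product}: £{price:.2f} is outside the plausible range "
                    f"£{floor:.2f}-£{ceiling:.2f}; refer for human review")
            return price

        validate_price("desktop PC", 1999.00)       # passes silently
        try:
            validate_price("desktop PC", 1.00)      # the misplaced decimal point
        except SuspectPriceError as err:
            print(err)

    The check itself costs a few lines; the real work is that someone who understands the information must decide what `plausible' means for each item published.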

    The information age has thrust another technological challenge at businesses. Protecting investment in IT has been one of the main objectives of every organisation for many years. With the rapid take-up of the internet, the profile of IT has been raised yet further. Given the vast investments required for e-business, it follows that a clear and coherent strategy, which builds on existing IT infrastructures, will be a key differentiator of businesses that succeed in the new millennium. Fundamental to any such strategy will be the existence and application of the right information management infrastructure.

    In order to remain competitive, companies must make use of all their assets, be they fixed, human or data. But many are not in a position to, not least because too many are seemingly unwilling to spend the time analysing how best to use and disseminate the information they own.

    So will the panacea that was rife in the 1990s, Business Process Re-engineering (BPR), now be supplanted by something along the lines of Information Management Re-engineering? In fact, it already has been - but no industry guru is needed to craft glitzy terminology for the problem. The solution has been there since businesses started, in the form of information analysis.

    Evolution of technology
    The way computer systems have evolved over the last quarter of a century has had a great impact on the approach taken to understanding and delivering information. Changes in technology have allowed more sophisticated software delivery, but at a penalty in the way corporate data is addressed.

    In the 1960s and 1970s, COBOL programming and mainframe computers were the norm rather than the exception. The high price of electronic storage media rendered any system development a non-starter unless the storage and processing costs could be justified as a viable expense.

    This was a time when database technologies were just emerging, when programs were written on coding pads, and when high-volume processing was conducted with master files held on magnetic tape reels. Programmers undertook months of training before they could become productive. All their work had to be checked by chief programmers before it could be used.

    Information processing systems were crafted only after rigorous discussions and meetings with the prospective user. A lot of the software was developed on a bespoke basis. The packages that did exist were typically cumbersome, expensive and niche in focus. As a result, the developers knew how to manage both the software and the data - they were information system experts, and were looked on with reverence by their user communities.

    Compare that scenario to today. Now, one is more likely to see the implementation of packaged solutions than bespoke systems, and only minor tailoring or customising is required. The developers' time is typically spent building integration bridges to different packaged systems. Knowledge of the package and how it works remains with the vendor.

    Even if bespoke work is undertaken, the modern software development environment is very different. Your average VB or Java developer would no doubt be astounded at the crudity of the punched-card machine that was used to write computer programs and input data `in the old days'. Software engineers now have at their fingertips an array of inexpensive tools that let them quickly create working software solutions, often without being checked with the old rigour. Indeed, advances in software and database technology have been such that people with no computer development training whatsoever are able to produce operational programs that deliver business solutions.

    So the developers have lost their knowledge of software because they didn't author it all. They also have less knowledge of the underlying information because they don't understand the processing. The result is that the users can no longer place them on pedestals as experts, and the corporate entity has less knowledge of the information it holds.

    Systems can be implemented simply and cheaply, and can be operational quickly. The old constraints that arose from high infrastructure costs are typically no longer relevant; the relative cheapness of computer equipment coupled with the high availability of packaged software and the perceived speed of software delivery has seen to that. In addition, more and more software is being sold on a `pick'n'mix' basis, and is capable of being swapped out for other products with minimal fuss.

    In many cases it is now possible to implement the information processing solutions without any need to call on the services of software technicians at all. The corporate knowledge is eroded still further.

    This emergence of software as a commodity item that no longer requires such specialist skills is a concept that has not gone unnoticed at board level. Ever keen to keep costs under control, boards now typically pare software budgets by cutting additional man-time services rather than by trimming functionality. Whilst keen to ensure the software does what the organisation needs, the board reduces the ubiquitous IS involvement to a minimum. But this can be a false economy, as the knowledge needed to get best use of the system is often never acquired.

    Other factors are hitting businesses. Organisational restructuring, especially prevalent in mergers or downsizing, leaves fewer people with long-term corporate knowledge. In a lot of instances this has been to the detriment of business performance. Staff poaching is also taking its toll: employees are departing for seemingly more attractive employment, lured by greater instant rewards, particularly lucrative share options.

    From an information perspective, users have readily embraced the ability to do more for themselves without reference to others. But information management is more than just databases; it now includes EDI, electronic forms, workflow, imaging and web publishing, to name but a few of the relevant technologies. Information policies need to embrace these as well, and to do that effectively you need to understand them.

    The current situation
    There is a trend emerging - corporate insistence on slim budgets and short timescales means software is increasingly installed on a `run and go' basis. But as a business tool, most software bears some degree of complexity that must be properly understood for companies to make best use of it. The richness of features is born of the package solution scenario, which provides a generalised solution attractive to a wider spectrum of potential buyers. Quite often these `added value' options go unrecognised, so no advantage is taken of them.

    Packaged software typically has many options available, but it is rare for all of them to be used. Best use is only obtained with expert advice, but it is this very advice which is typically being cut from budgets, leaving inexperienced implementors and users to go it alone. And those users are increasingly `corporate virgins' - newcomers who know little or nothing of the relevant processes.

    To understand why this situation should exist, take a look at your own systems. Your office systems more than likely include features such as word processing or e-mail. Are all of your users fully trained in the features they use? Do they all know how to deliver your requirements in the same way, wherever they are? If the answer is yes, you are probably in the minority.

    Software implementation has become cheaper and cheaper, and because it is cheaper, inevitably less time is spent getting best use from it. Not many companies that spend £500 on a software licence will also commit three times that amount to courses on how to use it.

    The basics are all that is learnt, and very often some real benefits are never realised because the users don't know about them. Or worse, the users try new things out without understanding what they are really doing.

    There is a further concept that needs consideration - the increasingly widespread access to corporate data that is being provided to users. Moderately-priced software can be acquired, installed and used by staff to go and get their own data, without relying on anyone else. Just give them the access and they can get on with it. They can put the data where they want. They can create new data from it. They can send it to someone. They can report on it. And often they can do all this from within the same software suites that they use for their normal duties. Who is advising them? Who is ensuring the accuracy of their endeavours?

    We can all cite examples where a member of staff has acquired data for themselves that doesn't quite stack up against other sources. Which is wrong? So much corporate time is wasted investigating the differences, and then yet more time is wasted the next time the data is presented, because of the suspicion that it may again be flawed.
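
    A minimal sketch of the reconciliation chore this creates, using entirely hypothetical figures - two extracts of `the same' regional sales, pulled by different staff from different systems:

        # Two hypothetical extracts of 'the same' figures from different systems.
        sales_per_finance = {"north": 120450, "south": 98200, "west": 75310}
        sales_per_marketing = {"north": 120450, "south": 97900, "west": 75310}

        def reconcile(a: dict, b: dict) -> None:
            """Report every region where the two extracts disagree."""
            for region in sorted(set(a) | set(b)):
                if a.get(region) != b.get(region):
                    print(f"{region}: finance says {a.get(region)}, "
                          f"marketing says {b.get(region)} - which is right?")

        reconcile(sales_per_finance, sales_per_marketing)
        # prints: south: finance says 98200, marketing says 97900 - which is right?

    Spotting the difference takes seconds; establishing which figure is right, and why the two systems diverged, is where the corporate time disappears.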

    What has happened here is that organisational efficiency has been displaced in favour of personal efficiency. The user has the tools at hand to do the job, and does so to the best of their ability. There is no big picture view, no management, no control, and a real potential for disaster.

    Even the facilities offered by data warehousing tools can suffer the same fate. A typical problem area for a lot of databases is presenting information as it stood at a point in the past, and being able to repeat the enquiry with the same results ad infinitum. Most databases do not hold data in this way. Data mining facilities in warehousing toolsets have difficulty solving this satisfactorily, so once again the data obtained may actually be flawed, despite the best intentions of the user.
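
    One long-standing remedy, independent of any particular warehousing product, is to hold effective-dated records so that any past state can be reconstructed exactly. The sketch below uses hypothetical data and field names to show the idea:

        # Effective-dated (temporal) records: each row carries the period for
        # which it was valid, so an 'as at' enquiry is repeatable ad infinitum.
        # The data and field names are hypothetical.
        from datetime import date

        price_history = [
            {"product": "widget", "price": 9.99,
             "valid_from": date(1999, 1, 1), "valid_to": date(1999, 6, 30)},
            {"product": "widget", "price": 11.49,
             "valid_from": date(1999, 7, 1), "valid_to": None},  # current row
        ]

        def price_as_at(history: list, product: str, as_at: date):
            """Return the price that was in force on the given date."""
            for row in history:
                if (row["product"] == product
                        and row["valid_from"] <= as_at
                        and (row["valid_to"] is None or as_at <= row["valid_to"])):
                    return row["price"]
            return None

        print(price_as_at(price_history, "widget", date(1999, 3, 15)))  # 9.99
        print(price_as_at(price_history, "widget", date(1999, 8, 1)))   # 11.49

    A database that simply overwrites the old price cannot answer the first enquiry at all, however sophisticated the reporting tool sitting on top of it.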

    Modern information problem
    This is the backdrop to the crux of the corporate information management dilemma. If the software used to store and give access to information is not being put to best use, it follows that the information it manages is unlikely to be either. But this information is the very lifeblood of an organisation.

    This damaging latent problem is pervasive in many organisations, and most are too busy to recognise it. To date they have got away with it, but not for much longer. As suggested earlier, the information age and the internet as a delivery mechanism will quickly expose these shortcomings.

    While many software packages boast `best business practice', in reality they offer only a generic approach or solution. `Out of the box' implementations will rarely deliver a market-leading advantage; at best they help a business play catch-up.

    The delivery of a sustained competitive advantage can only be achieved by striking a balance between the evolution of business processes, the understanding of the information owned, and the manipulation of technology to suit an organisation's needs and methods. And of the three, information management has now become the most important.

    Many software suppliers focus on what they term knowledge management. However, whilst the tools they supply might indeed be easy to use, managing information well with them is frequently difficult. True, they provide tools and capability, but as described earlier these are often woefully under-utilised or misused.

    Alongside this, most organisations are allowing information flows to become confused and blurred. Their steady acquisition of software solutions under the new open systems approach has left a patchwork of systems covering the information base.

    A major stumbling block can be cultural rather than technical. Simply putting existing processes into new tools does not spell success. Too often, implementations fail to improve efficiency because the process is not adapted to the online world. Lengthy approval processes will not be significantly improved by replacing 50 paper communications with 50 electronic ones.

    Because of the way in which computer systems have evolved, organisations have left behind the principles which once governed their information assets. Their focus on the importance of information management has been lost. They need to change. They need to look carefully at how information is used, where it is misused and to whom it is disseminated. If they don't, they won't survive.

    They are now in the position where they have to renew their acquaintance with corporate data.

    To do this they need the help of information experts. But in many organisations this skill has been lost, swept away in the euphoria of software advancement and self-service users. So they have to look elsewhere.

    This does not mean a return to the days of O&M (Organisation & Methods) officers swooping on parts of a business to map its processes, or that time and motion studies should be instigated to root out operational hiccups. But the principles embodied in these activities are as valid in the modern business world as they were in the days when manufacturing companies dominated national productivity.

    By Alastair Mair
    Alastair Mair is director of the information consultancy unit at Eurobase Consultancy,