Hardly any other topic is subject to as many misunderstandings as the use of key figures for corporate management. Some companies prefer to trust the intuition and experience of their managers rather than "academic rules". Others do not even want to engage with the subject because they mentally associate it with additional expense. And those who genuinely want to base their operational planning and control on objective facts and figures often reach their limits: many poke around in the "fog of potentially useful data" in the hope of somehow extracting the information they need.
The reason: truly relevant key indicators are difficult to define, difficult to collect and usually have the "audacity" to change frequently. After all, KPIs are often defined arbitrarily and elaborately (i.e. at great cost) without knowing exactly which ones are really relevant and whether the figures even express what is expected of them.
Given the technical complexity of today's processes - also and especially in document production - many companies are tempted simply to record everything that can be measured or that the IT systems in use readily provide. But what to do with this data jungle?
Another common method is to adopt seemingly universally valid key figures one-to-one (according to the motto "That's how others do it, too", "I read that in a recent technical paper" or "These are the standard reports of our MIS") - regardless of whether they really benefit the company. You do have numbers, but how do they relate to your own processes?
If the data turn out not to be conclusive or meaningful, one is quickly tempted to conclude reflexively that one does not have enough information. Wherever you started, you simply want more - so more numbers are cheerfully added. Mass instead of class becomes the order of the day.
Instead of demanding that the systems produce more and more data - which sometimes leads to costly changes - a different approach is sometimes taken: "Just give me all the numbers that are freely available and I'll figure out what I'm looking for."
In the end, the disappointment is great when no viable decisions for optimizing processes or avoiding recurring problems can be derived from the "data chaos". After all, you are not Google & Co., able to produce the desired search results within seconds using sophisticated algorithms or even artificial intelligence.
The fact is that key figures only make sense if they are clearly oriented toward the company's own strategic goals. In document processing, for example, these can be cost reductions in output management, increased customer satisfaction or compliance with contractually agreed services (service level agreements, SLAs).
Therefore, a basic definition and analysis is crucial:
Instead of collecting information indiscriminately, you should first take a closer look at your existing IT infrastructure. Otherwise there is plenty of data, but little indication of whether and to what extent the defined objectives have been achieved.
In short: the key to useful KPIs lies in the symbiosis of minimal data acquisition and optimal data quality.
KPIs are not just a management issue at the C-level. Rather, they can benefit many levels of management. However, there is no one-size-fits-all KPI. Does a print center manager, for example, need detailed information about response times to customer e-mail requests? More likely, he wants to achieve the best possible machine utilization - so that is the context in which he needs data.
What is the benefit of even the most detailed data if it cannot be evaluated against the target, or only partially? This would be like a doctor sending a patient for an X-ray but being unable to derive a suitable treatment from the findings, perhaps because the cause of the discomfort is not visible on the image. Was an X-ray the appropriate diagnostic procedure at all, or should another method have been used (MRI, computed tomography, etc.)?
So the question remains: what is the aim of the query? What do you want to find out? Not everything that an IT system can offer in the form of figures really needs to be measured. After all, KPIs are usually not identical to raw data - they are typically compiled or calculated from it.
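To illustrate the difference between raw data and a KPI derived from it, the following sketch computes an average response time from raw event logs. The event names, document IDs and timestamps are purely illustrative assumptions, not taken from any real system:

```python
from datetime import datetime

# Hypothetical raw events as an IT system might log them:
# (document_id, event, timestamp) - names and values are illustrative only.
raw_events = [
    ("doc-1", "received", datetime(2023, 5, 2, 9, 0)),
    ("doc-1", "answered", datetime(2023, 5, 2, 9, 40)),
    ("doc-2", "received", datetime(2023, 5, 2, 10, 15)),
    ("doc-2", "answered", datetime(2023, 5, 2, 13, 15)),
]

def average_response_minutes(events):
    """Derive a KPI (average response time) from raw event data."""
    received, answered = {}, {}
    for doc_id, event, ts in events:
        (received if event == "received" else answered)[doc_id] = ts
    durations = [
        (answered[d] - received[d]).total_seconds() / 60
        for d in received if d in answered
    ]
    return sum(durations) / len(durations)

print(average_response_minutes(raw_events))  # 110.0
```

The point is that the KPI (110 minutes on average) does not exist anywhere in the raw data - it has to be calculated, which presupposes that the right events were captured in the first place.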
It is therefore crucial to first define the KPIs relevant for obtaining the desired information, then measure the correct data from the correct sources and finally prepare them in such a way that concrete instructions for the implementation of the associated strategic guidelines can be derived from them. Creating the "X-ray image" alone is not enough.
A number of questions need to be answered in this context: Which goals should the KPIs serve? Which data do they require, and from which sources? How should the results be prepared so that concrete instructions can be derived from them?
It is also important to continuously review the defined KPIs and compare them with the current corporate strategy. Are they still relevant or do they need to be redefined? Do other metrics possibly provide more and better information about the degree of fulfillment?
For KPI systems, this means they must be flexible enough - even if strategic goals suddenly change - to deliver the desired information quickly and reliably, and to prepare it in such a way that concrete instructions for achieving the targets can be derived from it.
Another important aspect: before data is collected, it must be clearly defined where it is collected and stored. In document processing, the focus should be on the events along the document production process chain. All processing steps that generate or change information and formats can be of interest for different KPIs. A typical error in this context: the data is left in the IT system in which it originates - with the result that only the department working with that application has access to the KPI.
The fact is that most companies now see the creation of a central data repository as a feasible solution. It is accessible to everyone, but at the same time "acts" independently of the departments and of the document-generating and -processing applications.
The creation of such a central data instance would lay a technological foundation for complete traceability. Instead of storing data decentrally ("silo architecture"), it is managed and consolidated by a "control center".
But be careful! The idea could arise of also transferring information from resource management or asset management to the same central repository. However, it is crucial to only store the really document-relevant data in the repository. Otherwise, it is very difficult to generate meaningful information from the resulting "data chaos" with the aid of expensive tools.
In addition, the repository must be able to store both data from the entire production chain (e.g. printing/shipping dates, document volumes, postage costs, shipment bundling) and accompanying information (When was the e-mail sent, received and read? When was a link clicked? When was the letter delivered? When was the archive copy viewed?).
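A minimal sketch of such a central event repository might look as follows. It is an in-memory toy, not a real product; all class, field and event names (source_system, event_type, etc.) are illustrative assumptions:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class DocumentEvent:
    document_id: str
    source_system: str   # e.g. "print-line", "email-gateway", "archive"
    event_type: str      # e.g. "printed", "email_read", "link_clicked"
    timestamp: datetime
    attributes: dict = field(default_factory=dict)  # postage costs, volumes, ...

class CentralRepository:
    """Consolidates events from all departments and applications."""

    def __init__(self):
        self._events = []

    def ingest(self, event: DocumentEvent):
        # Any system along the process chain can deliver events here,
        # instead of keeping them locked in its own silo.
        self._events.append(event)

    def trace(self, document_id: str):
        """Complete traceability: all events for one document, in order."""
        return sorted(
            (e for e in self._events if e.document_id == document_id),
            key=lambda e: e.timestamp,
        )

repo = CentralRepository()
repo.ingest(DocumentEvent("doc-7", "print-line", "printed",
                          datetime(2023, 5, 2, 8, 0), {"postage_eur": 0.85}))
repo.ingest(DocumentEvent("doc-7", "email-gateway", "email_read",
                          datetime(2023, 5, 2, 12, 30)))
print([e.event_type for e in repo.trace("doc-7")])  # ['printed', 'email_read']
```

The design point is the decoupling: the repository knows nothing about the applications that feed it, which is what makes cross-departmental KPIs possible in the first place.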
To avoid misunderstandings: it is not necessary to record the data of every single processing step within the process chain if the resulting information is not relevant at a given point in time (the "one never knows" approach). But it should become possible as soon as it becomes necessary. All interfaces of the affected systems must therefore be assessed accordingly.
In addition, the system should have a high degree of flexibility in order to pick up information from "new" media of the future - after all, customer communication is subject to constant change.
The importance of the principles listed here is most evident in companies that operate the entire document processing chain in-house, since no partial aspect can be transferred to the responsibility of another organization. The process chain to be monitored then covers everything - from transactional data generation, document creation, formatting and optimization, through medium selection and the associated channel-specific output optimization, to shipment tracking.
Despite all modern technology, the decisive factor is and remains the determination of the right key figures. And they can be very variable, depending on the corporate strategy and market situation. For example, the world of document and output management today is characterized by fundamental and frequent changes. Among the most important are the following two aspects:
1. Transactional input: Companies today are increasingly confronted with information streams resulting from concrete customer inquiries - documents that arrive not only by e-mail and traditional (paper) mail, but above all via new channels such as portals, chats and messenger services (WhatsApp & Co.). This increasing individualization and digitalization of communication (omnichannel/personalization) means that customers expect fast response times. What is certain is that today the consumer, the insured, the bank customer, etc. determine the communication medium - and companies have to adapt.
The second aspect is closely linked to this:
2. Digital output: Paper is losing importance in customer communication and is developing into a premium product for less frequently used high-quality documents (haptic and visual effect). Instead - as mentioned under 1.) - standard communication increasingly takes place via electronic channels.
As a result of these two trends, for example, an insurer with a still high proportion of paper documents in its output (e.g. insurance policies) could still regard the optimization of the printing line as an important strategic goal. Accordingly, the relevant key figures must be defined, measured and analyzed (e.g. machine utilization, changeover times, toner/ink consumption).
With the spread of new (electronic) media, however, it is also a question of shortening the response times in digital communication, because the percentage of customers who contact the company electronically ("digital natives") is constantly growing. Ultimately, digitization always means acceleration. Capturing response times is therefore a key task.
It would be conceivable, for example, that the insurer's management board requires every transaction initiated by the customer (contact/communication via digital channels) to be answered within a certain period of time (x minutes/hours) - for example in the form of a confirmation of receipt ("Thank you for your message. We will contact you as soon as possible."). The challenge for the clerks is then to respond to each transaction as quickly and competently as possible, no matter how complex the communication structures at the insurer are. And if the response is to be generated automatically by the company's existing systems, expectations regarding response times will be even higher.
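Such an SLA rule is straightforward to express once the receipt and acknowledgment timestamps are captured. The following sketch assumes a hypothetical limit of 15 minutes for the automatic confirmation of receipt; the limit and timestamps are illustrative only:

```python
from datetime import datetime, timedelta

# Hypothetical SLA: confirmation of receipt within 15 minutes.
ACK_DEADLINE = timedelta(minutes=15)

def within_sla(received_at: datetime, acknowledged_at: datetime) -> bool:
    """True if the confirmation of receipt was sent within the SLA window."""
    return acknowledged_at - received_at <= ACK_DEADLINE

print(within_sla(datetime(2023, 5, 2, 9, 0), datetime(2023, 5, 2, 9, 10)))  # True
print(within_sla(datetime(2023, 5, 2, 9, 0), datetime(2023, 5, 2, 9, 30)))  # False
```

The hard part in practice is not this comparison but reliably capturing both timestamps across all inbound channels (e-mail, portal, messenger) in the first place.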
That's the theory. In practice, however, developing relevant indicators remains a challenge. Most companies are well aware of the importance of KPI-supported controlling in document and output management - but as soon as things get concrete, many questions remain open.
Independent institutes as well as specialized ECM providers offer practical assistance in this regard (consulting, analysis, software). They help users obtain the clearest possible picture of all process states and identify optimization potential precisely. The extent to which such assistance is necessary follows from the control requirements of the respective organization.
"KPI systems are not an end in themselves, but a pragmatic instrument for quality and cost management of all processes in document and output management," says Thorsten Meudt, Chief Technology Officer at Compart AG. "It is important to keep the balance between a reasonable effort for the establishment of a KPI system and the actual gain of knowledge, which must ultimately lead to concrete improvements of the processes."
KPIs concern both the quality and the effectiveness of processes. Key figures in documents and output management can often be assigned to the following areas:
1. Lead time and adherence to schedules
This includes, first of all, a look at how smoothly the ECM processes run. The question is answered by determining the real time expenditure, which takes into account both the actual processing time and the delays caused by waiting and idle times. The result is the "lead time" as a key performance indicator (KPI).
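The lead-time calculation described above can be sketched as follows. The step names and minute values are illustrative assumptions, not measurements from any real workflow:

```python
# Lead time = actual processing time + waiting/idle time between steps.
# Step data is purely illustrative.
steps = [
    # (step name, processing minutes, waiting minutes before the step)
    ("scan",      5,  0),
    ("classify",  2, 30),
    ("process",  15, 60),
    ("dispatch",  3, 10),
]

processing = sum(p for _, p, _ in steps)
waiting    = sum(w for _, _, w in steps)
lead_time  = processing + waiting

print(processing, waiting, lead_time)  # 25 100 125
```

Note how the waiting time (100 minutes) dwarfs the actual processing time (25 minutes) - exactly the kind of insight the lead-time KPI is meant to surface.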
If document-based processes also involve time limits - as is the case, for example, with cash discount periods for automatic invoice receipt or defined response times for customer inquiries - a measurement parameter is also required to express the "deadline compliance rate". This key figure shows to what extent the relevant processes are completed within the time limits, or how often delays occur.
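A deadline compliance rate can be computed once start and completion timestamps are available per case. The 48-hour deadline and the case data below are hypothetical, chosen only to illustrate the calculation:

```python
from datetime import datetime, timedelta

# Hypothetical deadline: each inquiry must be completed within 48 hours.
DEADLINE = timedelta(hours=48)

cases = [
    (datetime(2023, 5, 1, 9), datetime(2023, 5, 2, 9)),   # 24 h -> on time
    (datetime(2023, 5, 1, 9), datetime(2023, 5, 4, 9)),   # 72 h -> late
    (datetime(2023, 5, 2, 8), datetime(2023, 5, 3, 20)),  # 36 h -> on time
]

on_time = sum(1 for start, done in cases if done - start <= DEADLINE)
compliance_rate = on_time / len(cases)
print(f"{compliance_rate:.0%}")  # 67%
```

In a real system the cases would of course come from the central repository rather than a hard-coded list.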
2. Reliability and error rate
A further focus is on the "reliability" of the workflow and thus on possible errors - caused, for example, by qualitatively inadequate scans, processing errors during individual process steps or incorrect forwarding of documents. As a rule, such errors generate additional work (post-processing) and thus reduce productivity.
3. Regulatory compliance, availability, customer satisfaction and process handling
For compliance reasons, it is usually necessary for every organization to have a key figure that shows at a glance the extent to which the overall process meets legal and company-specific requirements. Evaluating the "availability" of the ECM infrastructure is also typically part of the obligatory program; it should cover both technical systems and human resources. If required, KPIs can also be generated for "customer satisfaction" in document-related processes and for the satisfaction of ECM users with regard to "process handling" and the solution used.
4. Utilization improvement and setup time minimization
There are no key figures that display these aspects directly and immediately. In fact, there are only "indicators" from which conclusions can be drawn.
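Such indicators might be derived from run, setup and available times, for example. The shift data below is purely illustrative, and the two ratios are meant as hints toward utilization improvement and setup-time minimization, not as direct measures of them:

```python
# Indicators for utilization and setup time, derived from shift logs.
# Shift data is illustrative.
shifts = [
    # (available minutes, productive run minutes, setup minutes)
    (480, 390, 45),
    (480, 410, 30),
]

available = sum(a for a, _, _ in shifts)
running   = sum(r for _, r, _ in shifts)
setup     = sum(s for _, _, s in shifts)

utilization = running / available   # share of available time spent producing
setup_share = setup / available     # indicator for setup-time reduction
print(f"{utilization:.1%} {setup_share:.1%}")  # 83.3% 7.8%
```

Tracking these ratios over time, rather than reading them in isolation, is what turns them into usable indicators.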
All this information can be, but does not have to be, turned into key figures. And finally: when changes are introduced, how do the previously observed patterns change? What can be inferred from that? Have the strategic goals of an optimization been achieved?
These examples are not a manual for standard KPIs in document processing. Even though essentially the same or at least similar processing steps emerge across companies, the actual processes, the systems used and the data sources are highly organization-specific. Ultimately, information can only be obtained from the context of the existing applications.
Carsten Lüdtge, a qualified journalist (university diploma) and specialist editor, is responsible for press and public relations at Compart, an international manufacturer of software for customer communication, and oversees the Compart Group's entire content management. He has more than 20 years of PR experience, with a focus on IT.