by Philippe Magne

A strong trend is underway and set to continue: businesses are adopting an increasing number of standard software packages, often in the cloud, into their information systems. Today’s period of major technological change fosters this trend. But adopting software packages also means dealing with parameters, or configuration data, of all kinds. Indeed, what differentiates software package vendors is the degree of customization they support, allowing the best of them to expand their markets even further. This customization, however, carries a non-negligible cost for the customer: clearly in the implementation phase, but also later in the lifecycle, as the customer must continually adapt parameter data to track enhancements in the software package as well as changes in their own organization. In the past, with bespoke in-house development, developers were asked to make the necessary updates; now, parameter changes are made by the project owners. Change management is therefore naturally evolving from the management of source code to the management of parameters.

These crucial adaptations not only have a cost; they must also be performed under highly secure conditions. A simple parameter error can have disastrous consequences for a company’s data. It is imperative for a business that wants to protect itself against this risk to consolidate and reinforce its change management model.

The techniques needed for this rigorous organization are practically identical to those used to maintain code. They begin with the creation of separate test environments, in which the new changes can be tested extensively by users to guarantee that the results are compliant. A copy of all or part of the data from the production environment is very useful for creating conditions that are as realistic as possible and accurately reflect the existing parameters. Several questions should be asked at this stage:

  • How can the disk space requirements be minimized while continuing to deploy an ever-increasing number of different environments?
  • How can the test data be refreshed periodically?
  • How can the confidentiality of this production data be guaranteed?

It is reassuring to know that the answers to all these questions can now be found through tooling and internal procedures.
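As an illustration of the confidentiality question, here is a minimal sketch in Python of a refresh step that pseudonymizes sensitive columns while copying a production extract into a test environment. The file and column names are hypothetical, and a real masking tool would also handle referential integrity and many more data types:

```python
import csv
import hashlib

# Hypothetical column names; adapt to your own extract format.
SENSITIVE_COLUMNS = {"customer_name", "email"}

def pseudonymize(value: str) -> str:
    """Replace a sensitive value with a stable, irreversible token."""
    return hashlib.sha256(value.encode("utf-8")).hexdigest()[:12]

def mask_extract(src_path: str, dst_path: str) -> None:
    """Copy a CSV extract of production data, masking sensitive columns."""
    with open(src_path, newline="") as src, open(dst_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            for col in SENSITIVE_COLUMNS & set(row):
                row[col] = pseudonymize(row[col])
            writer.writerow(row)

if __name__ == "__main__":
    mask_extract("production_extract.csv", "test_environment.csv")
```

Because the tokens are deterministic, the same customer always maps to the same pseudonym, so test scenarios that join on these fields keep working.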

Once these changes are validated, the question arises of how to transpose them into the production environments. How does this happen? In the best-case scenario, we can rely on import/export functions already provided by the software package. Specific routines can also be developed to automate the transfer, but in most cases the transposition is made by manual re-entry. Since the human error rate in repetitive tasks is roughly one in a hundred, it is easy to see how vital it is to automate this process: it is the only way to be both productive and secure.
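To make that idea concrete, here is a minimal sketch assuming the software package can export its parameters as flat JSON key/value pairs (the file names are hypothetical). Computing an explicit change set makes the transposition reviewable before it is applied, something manual re-entry can never offer:

```python
import json

def load_parameters(path: str) -> dict:
    """Read a parameter set exported from an environment (hypothetical JSON format)."""
    with open(path) as f:
        return json.load(f)

def diff_parameters(validated: dict, production: dict) -> dict:
    """Return only the parameters whose values differ between environments."""
    return {key: value for key, value in validated.items() if production.get(key) != value}

def apply_changes(production: dict, changes: dict) -> dict:
    """Apply the validated change set; a real tool would also write an audit log."""
    updated = dict(production)
    updated.update(changes)
    return updated

if __name__ == "__main__":
    validated = load_parameters("test_parameters.json")
    production = load_parameters("production_parameters.json")
    changes = diff_parameters(validated, production)
    print(f"{len(changes)} parameter(s) to transpose:", sorted(changes))
```

In practice the apply step would go through the package’s own import function rather than a simple dictionary update, but the pattern of export, diff, review, then apply remains the same.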

Automating the testing activity is an intrinsic part of this same process. The level of testing depends on the level of reliability required, and testing is an expensive activity that tends to become a bottleneck in responding to change. A number of tools on the market provide concrete solutions in this field; HP’s purchase of the market leader, Mercury Interactive, shows the extent to which this tooling has become strategic for many companies. Faced with the level of automation on offer, notably in test scenario tools, it is tempting to adopt them without hesitation. But are they really effective? Here again, the successful use of this type of product depends greatly on the robustness of your underlying change management process.
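As a sketch of the principle behind such tools (the file names and result format here are hypothetical), automated regression testing boils down to replaying a fixed set of test cases and flagging any deviation from a validated baseline:

```python
import json

def load_results(path: str) -> dict:
    """Load a snapshot of business results keyed by test case id (hypothetical format)."""
    with open(path) as f:
        return json.load(f)

def regression_report(baseline: dict, candidate: dict) -> list:
    """List the test cases whose results changed after the parameter update."""
    deviations = []
    for case_id, expected in baseline.items():
        actual = candidate.get(case_id)
        if actual != expected:
            deviations.append((case_id, expected, actual))
    return deviations

if __name__ == "__main__":
    baseline = load_results("baseline_results.json")
    candidate = load_results("after_change_results.json")
    for case_id, expected, actual in regression_report(baseline, candidate):
        print(f"case {case_id}: expected {expected!r}, got {actual!r}")
```

The value of the tooling lies less in the comparison itself than in keeping the baseline trustworthy, which is exactly where a robust change management process comes in.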

Above all, implementing a rigorous change management process is a matter of common sense and pragmatism. However, it must also be acknowledged that this approach is not necessarily part of our culture today, which tends to reject constraints that many consider to be too rigid. But our increasingly globalized world demands ever more security and reliability, so we must adapt. Ultimately, a little rigor and structure never hurt anyone.

An organization is built on methodology and proper tooling: tooling that is entirely appropriate to your technical context and accepted by the majority of your team. I use the term ‘majority’ because it is over-optimistic to aim for unanimity in a tool-based approach. You can be certain that somewhere in your organization lurks an incorrigible diehard who will oppose it with arguments of varying solidity. This is human nature.

These organizational changes are exciting: they call on both technical and human factors in pursuit of a single goal. Of course, there are obstacles on the path, but that path leads straight to the future, an era of maximum automation.
