A Long Goodbye
In the last part of this series, I took you through what many consider to be the final phase of the software development lifecycle, demonstrating how software testing helps to reduce bugs in the final deliverable, and how the testing process leads naturally to a release of the software to the customer. Once the software is delivered and all pending payments cleared, it's time to say goodbye to your customer and get ready for the next project.
Or is it?
Now, this might not make much sense...but typically, the successful conclusion of a software project is just the beginning of a longer maintenance cycle, one which many consider to be even more profitable than the initial development effort. Over the next few pages, I'm going to tell you a little more about how this works. So flip the page, and let's get started!
The Real World
Most often, the customer is represented by a small team during the software development process; this team (sometimes just a single person) is responsible for interacting with the software vendor, approving key deliverables, providing feedback on the progress of the project and making course corrections where required. Consequently, most of the features and capabilities that make it into the final release are based on the (largely subjective) decisions of a very small group of people. These decisions may not be accurate, or even representative of the application's user base; however, in the absence of more data, the development team has to take them into account when designing the software.
Now, once the customer's software has been released and installed in the target environment, it will come under the scrutiny of a much larger number of users, many of whom will have suggestions for improvement. If the customer is interested in keeping his or her users happy, these suggestions will need to be taken seriously, and implemented in future versions of the software wherever possible. Additionally, as the software is used on a regular basis in a live environment, bugs hitherto undiscovered by the testing team will surface, and will need to be rectified on a priority basis.
Since the customer already has a pre-existing relationship with the original developers of the software, and since those developers are intimately familiar with the inner mechanics of the application, it makes sense for these change requests and bug fixes to come back to the original development team for implementation. Thus begins a software maintenance cycle, in which released software is upgraded to account for changes, improvements and bugs on a periodic basis.
The initial software development effort is always a focused one, which takes place on a fixed schedule over a specified period of time. Change requests and bug notifications, however, arrive on an ongoing basis after the software has been delivered to the customer, and tend to occur over a much longer time period than the initial development effort. Thus, the post-release phase of a software project can continue for weeks or months after the project has officially concluded, and can even provide the vendor with an additional revenue stream in the form of charges for implementing changes.
Changing Things Around
Change requests may arise on account of problems encountered in the software or documentation, or because of an enhancement to previously-defined requirements. Typically, a change request contains a detailed description of the item to be changed, together with information on the reason for the change, the task priority and an expected date by which the change request should be implemented.
This request is then passed on to the project manager, either during a formal software review, via a written or verbal request, or through the on-site customer representative. The project manager should log each change request, evaluate the impact of the change with the development team, and notify the customer of the time and cost associated with implementing it. When calculating this time and cost, it's important to factor in the effort required to update the documentation delivered in previous stages, and to re-execute all test cases relevant to the change.
Once the customer formally approves the change, the change request log should be updated and the request handed over to the development team for implementation. The updated software then passes through unit and system testing before being released to the customer, together with updated documentation. The original requirements specification, design document and test plan should also be updated by the project manager to incorporate the changes.
An important component of this entire process is the so-called "change request log", functionally similar to the "defect log" maintained during the test phase. The status of each change request (reviewed/canceled/approved/underway/delivered) should be maintained and updated on a daily basis by the project manager, so that an overview of the current status of all change requests related to the project can be obtained quickly at any time. This change request log also serves as a record of the modifications made to the software over time.
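To make this concrete, here's a minimal sketch of how such a change request log might be modelled in code. The field names and class structure are my own illustrative assumptions based on the description above, not a prescribed format; a real team might use a spreadsheet or a tracking tool instead.

```python
from dataclasses import dataclass
from datetime import date

# Allowed states for a change request, as described above
STATUSES = {"reviewed", "canceled", "approved", "underway", "delivered"}

@dataclass
class ChangeRequest:
    request_id: int
    description: str    # detailed description of the item to be changed
    reason: str         # why the change is needed
    priority: str       # e.g. "high", "medium", "low"
    due_date: date      # expected implementation date
    status: str = "reviewed"

class ChangeRequestLog:
    """A simple in-memory change request log (illustrative only)."""

    def __init__(self):
        self._requests = {}

    def add(self, request: ChangeRequest):
        self._requests[request.request_id] = request

    def update_status(self, request_id: int, status: str):
        if status not in STATUSES:
            raise ValueError(f"unknown status: {status}")
        self._requests[request_id].status = status

    def overview(self):
        """Return a count of change requests per status - the
        'quick overview' the project manager needs at any time."""
        summary = {}
        for req in self._requests.values():
            summary[req.status] = summary.get(req.status, 0) + 1
        return summary
```

The `overview()` method is the point of the exercise: because every request carries an explicit status, the project manager can answer "where do we stand?" in one call instead of chasing emails.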
It's also important to ensure that proper version control processes are followed when executing change requests, especially if these requests occur on an ongoing basis. By ensuring that every change to the source code of the application is placed under version control, the development team can revert to an earlier version of the application at any time, in case unexpected problems crop up (or the customer changes his or her mind). As I've said before, this source code repository should be backed up on a regular basis, with all concerned personnel aware of how it may be restored in the event of a disk crash or system failure.
Every released version of the software should also be archived on reliable media (CD-ROM is the cheapest and most effective at this time), and stored in a library, so that it can be accessed at any time by the development or testing team. This archive makes it easier to retrieve a particular version of the software, provides a history of the various software releases, and simplifies the process of testing for, and replicating, errors encountered by the customer with specific versions of the software. Ensure that each archived version is clearly tagged with a version number and a release note detailing the changes that took place in that version.
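As a rough illustration, the release archive described above amounts to a simple index mapping version numbers to media labels and release notes. The sketch below assumes a made-up labelling scheme; it's the bookkeeping idea that matters, not the implementation.

```python
# Illustrative sketch of a release archive index; the fields and
# media-labelling scheme are assumptions, not a standard.

class ReleaseArchive:
    def __init__(self):
        self._releases = []  # ordered history of releases, oldest first

    def add_release(self, version: str, media_label: str, release_note: str):
        """Record a released version, the physical media it is stored
        on, and a note describing the changes in that version."""
        self._releases.append(
            {"version": version, "media": media_label, "note": release_note}
        )

    def find(self, version: str):
        """Retrieve the archive entry for a specific version, so that
        errors reported against it can be replicated and tested."""
        for entry in self._releases:
            if entry["version"] == version:
                return entry
        return None

    def history(self):
        """Return the release history as (version, release note) pairs."""
        return [(e["version"], e["note"]) for e in self._releases]
```

With this in place, "which CD holds version 1.1, and what changed in it?" becomes a lookup rather than a scavenger hunt through the library.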
Playing The Numbers
It doesn't matter how long you've been developing software - you learn something new with every project you execute. And so, after a project has been formally concluded, it's important to spend some time auditing the effort and expense that went into it, both to see if it matched your initial estimates and to locate areas for improvement.
Some of the possible audits are:
Requirements definition: This report measures the number of iterations a requirements specification goes through before it is formally approved, and provides an indication of how rapidly and efficiently the organization can understand and document customer needs.
Estimation accuracy: This report compares the effort and expense estimated by the organization at the beginning of the project with the actual effort and expense at the conclusion, and thereby measures the organization's skill at providing customers with accurate project estimates and the validity of the estimation methods and formulae used.
Test planning and execution: This report compares the tests planned with the tests actually executed in each of the testing phases, in order to measure the integrity of the test process. The time and cost estimates of each test may also be compared against actual time and cost, to verify the accuracy of the organization's estimation formulae.
Error resolution: This report graphs the number of errors reported against the number of errors actually resolved, and thus provides a measure of how well the organization's defect management process actually works. A variant of this report involves graphing the number of errors against the time taken to close them, in order to measure response time.
Error frequency: This report graphs the number of errors against each component of the software architecture, and can be used to identify organizational deficiencies in skill or programming knowledge (by mapping each component to specific developer skill sets or responsibilities). It also provides a measure of the accuracy and effectiveness of the organization's test plan and test cases.
Change request resolution: This report compares the number of change requests against the time taken to execute them, and provides a measure of how well the organization is set up to respond to evolving customer needs.
This is by no means an exhaustive list - every organization has different needs, and must develop different metrics to analyze its own performance over time.
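To show how simple these metrics can be to compute, here's a hedged sketch of two of them, estimation accuracy and error resolution. The function names and input format are my own assumptions; the underlying arithmetic is just the ratios described above.

```python
# Illustrative sketch of two post-project audit metrics; the data
# formats are assumptions, not part of any formal methodology.

def estimation_accuracy(estimated_hours: float, actual_hours: float) -> float:
    """Ratio of actual to estimated effort: 1.0 means a perfect
    estimate, values above 1.0 mean the project overran."""
    return actual_hours / estimated_hours

def error_resolution_rate(reported: int, resolved: int) -> float:
    """Fraction of reported errors actually resolved - a rough
    measure of how well the defect management process works."""
    if reported == 0:
        return 1.0  # nothing reported, nothing outstanding
    return resolved / reported
```

Tracked release over release, even crude numbers like these expose trends: a resolution rate that drifts downward, or estimates that consistently overrun by the same factor, point straight at the process that needs fixing.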
The data that is used to compile these reports must be collected throughout the software development process, and reports based on that data should be generated and provided to the project manager on a regular basis throughout the project lifecycle. This ongoing trend analysis allows managers to identify specific problems in different phases of the project, and take corrective action to resolve them.
In addition to ongoing analysis, once the project has concluded, a comprehensive set of audit reports should be built and analyzed to get a big-picture view of the problems encountered over the project timeline. These reports can be used to identify specific action items for different departments of the organization, and to prevent a recurrence of the same mistakes in subsequent projects.
It is also useful to perform a re-estimation of the entire project once it is formally concluded. At this point, project managers have a very clear idea of what exactly the software requirements are (after all, they just built it!) and they can use this understanding to estimate how long the project should have taken. This estimate should then be compared with the original estimate delivered to the customer, discrepancies should be analyzed to understand their source, and corrections should be made so that future estimates are more accurate.
The ultimate goal of all this analysis: better estimation, better implementation, better quality control. All of which leads to lower costs for the organization and greater value for the customer.
Going The Whole Nine Yards
As a professional software developer, it's important to understand that your relationship with your customer doesn't end with the delivery and installation of the software. Customers today expect their vendor to support them in the deployment of the software as well - which is why you should also consider offering your customers the following value-added services in the post-release phase of a software project:
- Technical support: In the initial days and weeks following the installation of your software, your customer is bound to have questions about the operation of your software. Most often, these questions can be resolved quickly over email or telephone, and they drop in volume as your customer acquires familiarity with the software.
However, for large projects that involve hundreds or thousands of users and administrators - for example, banking software systems - consider working out a commercial arrangement with your customer for dedicated product technical support. Such an arrangement offers advantages to both parties: the customer's comfort level goes up with the knowledge that he or she has the backing of a professional team of engineers who are familiar with the software, and the vendor acquires both a new revenue stream and a toehold into the organization for new product offerings and services.
- Training: If the delivered software is complex or highly specialized, users may require special training in order to make effective use of it. Sure, a manual was probably delivered along with the software - but when was the last time you ever read a manual? Consequently, many large organizations prefer to give their employees one-on-one training on the live system - and, as the developer, you're obviously the best person to deliver this training.
Of course, teaching is a very different skill from software development, and organizing a successful training session, especially if it's at the customer's site, is a fairly complicated affair - which is why you should always hire a professional to take care of it. Provide this professional with all the information your developers have about the system, and then watch closely as he or she magically turns the confused mass of information into an organized syllabus, complete with practical exercises, spot quizzes and a certification examination...all designed to get the relevant information across to end users as effectively as possible.
The results of a good training exercise are always immediately evident: fewer support calls, greater productivity and a happy customer. Which bodes well for your chances when the next contract comes along.
- Updates: If your application needs to be updated on a regular basis with new features or content, consider having your customer contract this task to your organization. Work out regular schedules to review user feedback on the application, make modifications to the software to make complex tasks easier and simpler for the user, and evolve the software to meet new customer requirements over time. This allows your customer to concentrate on other things, secure in the knowledge that the application is being maintained professionally, and provides you with both revenue and new business opportunities, in the form of additional product and service offerings.
Before I go, one final parting shot - a broad list of things you should do (and things you should avoid at all costs!) in your journey over the software development landscape.
Define and adhere to specific goals. Don't begin implementation of a project until you're sure you have all the requirements defined and approved, the priorities clearly mapped, and all the resources, assets and sample data are in place. The absence of clear goals can affect the motivation and productivity of your team, as can shifting priorities and a lack of direction.
Work and manage to a plan. The software development plan you create in the early stages of the project contains detailed information on how the project will be executed, and you should follow it as closely as possible. Equally important, be sure to update the plan on a regular basis with re-estimations, schedule changes and deviations, and ensure that all your team leads have the latest copy at any time.
Perform regular re-estimations. It's important to regularly analyze the progress of the project and re-estimate the time, effort and expense involved; this helps in proper resource (re)allocation and assists in better planning. If a re-estimation results in an unavoidable delay to the committed delivery date, inform your customer in a professional manner, together with the reasons for the delay, and alternatives available (if any exist).
Be wary of feature creep. Every change to previously-defined requirements, no matter how small, has an impact on your development plan and estimates. Don't implement changes without first making a detailed assessment of how they will impact the project schedule and budget, and ensure that each change request (and its impact) is thoroughly reviewed with your customer before it is executed.
Don't compromise on quality. Poorly-written or inadequately-tested code will come back to bite you later. Avoid the temptation to meet a tight deadline by reducing the time spent on product quality assurance, and take your responsibility to deliver zero-defect software seriously. It's the best way to make your customers happy, and make sure they call you first the next time they need a contract executed.
Don't favour processes over creativity. A process is valuable, but only as a guideline; it does more damage than good when it begins to stunt creative thought. Encourage your developers to experiment with new technologies or techniques wherever possible; it's the only way they'll learn new things. Some experiments will fail and some will succeed; all of them will teach you something.
Learn from your mistakes. No process is perfect, and no person infallible. You should audit your project as it passes through different stages, make notes of how and where improvements are possible, and use that learning in subsequent projects. To paraphrase George Santayana, those who fail to learn the lessons of history are condemned to repeat it.
And that's about all I have time for. I hope you enjoyed this series, and that it provided you with some insight into the processes and techniques that make for successful software development. Now...go practise!
Note: Examples are illustrative only, and are not meant for a production environment. Melonfire provides no warranties or support for the source code described in this article. YMMV!

This article was first published on 11 Oct 2002.