On 25 June, for the 15th year running, the 'world’s leading specialist in software quality', Software Quality Systems (SQS), organised its conference on software, quality and testing. Sponsored by various test tool vendors and testing service providers, the conference was attended by over 100 participants eager to follow current trends in software quality management and testing.
The conference featured three keynote speeches by prestigious representatives from the academic community, the Swiss Air Force and Swiss Post, in addition to several other presentations covering many aspects of quality, such as ways to ensure quality in production environments, business and quality, the past and future of quality, and the management of quality initiatives.
The keynote speeches
The first keynote speech, by Philippe Dugerdil, MBA, Professor of Software Engineering and Head of Research at the Geneva School of Business Administration, addressed the need to engineer formal models to assist in the comprehension of program code. According to Professor Dugerdil, program code comprehension accounts for 30 to 40% of software development costs. Although comprehensibility is a fundamental characteristic that facilitates the adequate mapping and communication of requirements, there is a lack of tools aimed at comprehending code, because solution providers have concentrated on development frameworks.
Formal IT system models fill the gap between the business concepts expressed by application users and the program code put together by IT specialists to satisfy business requirements. The only way computer specialists can acquire domain knowledge, suggests Professor Dugerdil, is through an appropriate representation of the solution design in a model that is semantically (range of ideas) and syntactically (how ideas are expressed in language) rich enough to make explicit the implicit functional behaviour and quality attributes.
In the second keynote speech, Colonel Peter Bruns of the Swiss Air Force provided an overview of the structure of the armed forces and the tasks of the Swiss Air Force. He took us through the history of software tool usage in the Swiss Air Force and how the field of application has evolved in three main waves: it started with the introduction of flight control systems, followed by the integration of sensors, with the latest development being data links. Lessons learned from the user perspective highlighted the importance of early testing activities to the Swiss Air Force. The need to clarify actor responsibilities and to engage the final users throughout the entire solution development process also proved important, in order to reduce the gap between them and the solution developers. Software systems in the Swiss Air Force are therefore characterised by large amounts of testing, both during design and during operations. Previous experience also demonstrated the need to formalise and reinforce the regression test phase, to ensure that core functionalities are not compromised by the introduction of new developments.
In the third keynote speech of the day, Christian Zeller, Head of the IT Department of Swiss Post, shared his view that users expect, and take for granted, high-quality services in the four core markets where the company operates: communications, logistics, retail financial services and public passenger transport. This adds to the complexity of introducing new business modules into an application architecture that is constantly evolving and innovating to adapt products and services to changes in technology and customer behaviour.
Considerable effort has been invested in making the Swiss Post IT platform scalable and maintainable in order to simplify the application architecture, which has already been standardised and consolidated. The architecture is now undergoing a virtualisation and automation phase, with the aim of providing a self-service infrastructure in a multi-technology environment and in a secure Swiss Post cloud.
Bringing quality into live environments
In the session “Bringing Quality to Live Environments,” the speakers from Deutsche Bank and Nokia shared their experiences on how to ensure and maintain the quality of IT platforms undergoing continuous transition and change.
Ms Cornelia Friedhoff, Head of Production Management, Private Wealth Management and IT at Deutsche Bank, explained that the Global Technology (GT) division strives to deliver innovative, global solutions in order to offer standard services and results with a low level of specialisation. Such global solutions are delivered through a comprehensive quality program in which different stakeholders participate throughout the entire Software Development Life Cycle (SDLC). Project stakeholders voice their concerns about all aspects of a solution, so as to guarantee the fulfilment of functional and non-functional requirements (the latter also known as quality attributes, such as performance, availability, usability and portability), as well as the operational needs of the solutions delivered.
Solution development projects at Deutsche Bank are structured into a number of phases separated by quality gates which, with the help of standard checklists, serve to evaluate the impact of changes on the production environment. The level of production readiness is measured using scorecards and key performance indicators. This approach guarantees tighter organisational integration around the functional evolution of the IT landscape and ensures a high level of customer satisfaction. According to Friedhoff, solutions delivered following this participative approach will be more reliable, better accepted and better supported. Ultimately, this enables better project and portfolio governance.
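The article does not disclose Deutsche Bank's actual checklists or metrics, but the idea of a quality gate driven by a weighted scorecard can be sketched as follows. All item names, weights and the pass threshold below are invented for illustration:

```python
# Hypothetical quality-gate scorecard: each checklist item carries a
# weight, and a release passes the gate only when the weighted
# production-readiness score reaches a threshold.
CHECKLIST = {
    "functional tests passed": 0.30,
    "performance targets met": 0.25,
    "operational runbook ready": 0.20,
    "security review signed off": 0.15,
    "rollback plan documented": 0.10,
}

def readiness_score(results: dict) -> float:
    """Weighted share of checklist items marked as done."""
    return sum(weight for item, weight in CHECKLIST.items() if results.get(item))

def passes_gate(results: dict, threshold: float = 0.8) -> bool:
    """The gate opens only above the agreed readiness threshold."""
    return readiness_score(results) >= threshold

results = {
    "functional tests passed": True,
    "performance targets met": True,
    "operational runbook ready": True,
    "security review signed off": False,
    "rollback plan documented": True,
}
print(round(readiness_score(results), 2))  # 0.85
print(passes_gate(results))                # True
```

In practice each key performance indicator would be fed from project tooling rather than filled in by hand, but the gate logic stays the same: an objective, repeatable pass/fail decision per phase.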
Siddhart Somasundaram and Stefan Verloeff presented how the “HERE” (formerly Nokia Maps) team has developed a set of best practices to maintain the high quality of its mapping solutions in a continuous delivery environment. This approach involves the extensive automation of test cases and scenarios, as well as continuous integration and deployment. A fully automated process enables the deployment of a new release in minutes. This is quite a departure from the long, painful weekends of step-by-step execution of error-prone, poorly documented manual processes known only to the application deployment gurus.
However, the “magical” deployment at the press of a button does not come for free. It demands not only the skills and effort required to automate test cases and deployment processes, but also a strong collaborative approach and a disciplined development organisation.
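The talk did not detail HERE's actual pipeline, but the "one button" principle described can be sketched as a driver that runs each stage in order and aborts at the first failure, so a release either deploys completely or not at all. The stage names and functions below are hypothetical placeholders:

```python
# Minimal sketch of an automated deployment pipeline: stages run in a
# fixed order and the pipeline stops at the first failing stage.
def run_unit_tests() -> bool:
    return True   # placeholder: would invoke the real unit test suite

def run_integration_tests() -> bool:
    return True   # placeholder: would exercise deployed services

def deploy_release() -> bool:
    return True   # placeholder: would push the build to production

PIPELINE = [
    ("unit tests", run_unit_tests),
    ("integration tests", run_integration_tests),
    ("deploy", deploy_release),
]

def run_pipeline() -> list:
    """Execute stages in order; abort on the first failing stage."""
    completed = []
    for name, stage in PIPELINE:
        if not stage():
            break
        completed.append(name)
    return completed

print(run_pipeline())  # ['unit tests', 'integration tests', 'deploy']
```

Real continuous delivery setups express the same idea declaratively in a CI server rather than in hand-written scripts, but the discipline is identical: no stage is skipped, and nothing reaches production without every automated check passing.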
From these presentations we can conclude that a common understanding of the need to involve end users throughout project development is growing. It is clear that end users need to participate in the verification and validation of requirements throughout the implementation process, and not only during the final test phase that precedes production. The importance of regression testing is now evident to delivery organisations, which are reinforcing the scope and automation of their regression test scenarios.