Our last event was the first one clearly focused on the theme of Quality Engineering, around the question “From QA to QE, where to start?”.
I thank the following participants for their presence and contribution:
- Xavier Pigeon, Quality & IT method and strategy expert, author of gearsoftesting.org and founder of TestAsYouThink
- Florian Fesseler, Head of Engineering at iObeya
- Fatima-Zahrae Abbadi, Lead Quality Assurance Engineer at Odigo
- Nicolas Guyon, Test & QA, CI, DevOps and Agile deployment expert at Murex
In this event, we addressed the following points:
- What is the current vision and perception of Quality Engineering?
- What first actions are the most relevant to initiate an approach?
- What approaches and practices to implement transversally?
Contextualization of the event
We started with an exchange around Quality Engineering to better define its contours.
We also discussed how QA and QE relate to each other before moving on to the practical phase.
Each participant was able to contribute to the list of themes addressed below.
The challenges of Quality Engineering
We started by contextualizing which challenges Quality Engineering should help to solve.
The challenges span from business to technology, balanced and aligned, and counter-intuitively extend beyond pure engineering practice.
Embedding quality structurally requires a transversal view of the system, hence the importance of the first point: communication between the various stakeholders.
The instability of requirements makes it harder for actors to align on a vision, which pushes the discussion towards more stable points; we will come back to this.
The underlying trend of “anywhere, anyplace, anytime” is materialized in the notion of always-on, made more complex by the diversity of deployment models.
The need to deliver a useful, regularly updated quality user experience is one of the key expectations of QE.
Antoine Craske
This need for rapid iteration implies a smooth end-to-end development experience.
The ability to measure the value provided supports the process, without being based solely on performance indicators.
Indicators can indeed be limited to a qualitative analysis of the deliverables.
The foundation also includes the need to focus more broadly on the value delivered rather than on the local technical scope of intervention.
This identification remains a real challenge, as we were able to share during this round table.
What are the differences between QA and QE?
We then turned to how Quality Assurance (QA) and Quality Engineering (QE) relate to each other.
Intuitively, the two practices share the foundations of quality management, which also apply to the field of software.
I found two definitions that bring out the respective perspective of each domain.
Like any definition, they have interpretive biases, are superficial and may lack context.
Nevertheless, I find it interesting to note the words “ensure” and “software development process”, which clarify the transversal perspective.
QA, for its part, is limited in this definition and strongly associated with an a posteriori control process.
As in a factory, QA can end up sitting only at the end of the chain, but through leadership and proactivity it can be included across the entire value chain.
From the various recent studies on software quality, I take the view that QE is an evolving branch of QA and other disciplines, one that still needs to be developed further in this area.
Here is a compilation of perspectives on Quality Engineering.
What criteria for a successful transition?
Nicolas launched this question, noting the need to identify the management objectives behind quality improvement.
We rediscovered a painful step: an organization often has to hit the wall before it truly takes the measure of the situation and really acts.
More precisely: being up against the wall and realizing that an n-th fix by a heroic developer can no longer solve the problem.
I shared a personal experience where we were faced with this situation. Our first goal was to deliver a small change as quickly and as smoothly as possible.
Being able to deliver a simple change the same day, without any special organization, is a success criterion for QE.
Antoine Craske
It is not easy to make the case for this approach from a business point of view, because in the short term it feels like slowing down, regressing, and accumulating pending work.
Does this remind you of LEAN readings, practices at Toyota, or the acronym WIP (Work In Progress)?
It is indeed the case: adding this building block of systems thinking and LEAN is what makes Quality Engineering structurally effective.
As for the Quality Engineering objectives, they must be geared towards acceleration, stability and quality, from the short term to the long term.
The summary of questions to ask for the success criteria of a transition to QE:
- Are there customer and business objectives clearly identified?
- Does the organization appear mature enough to accept an alternative model for measuring delivery value?
- What elements of context can you use to argue the need for change, now?
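To make the same-day criterion measurable, one can track the lead time of each change from commit to production. Below is a minimal sketch, assuming a hypothetical list of change records with commit and deployment timestamps; the field names and the one-day threshold are illustrative, not a prescribed metric.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from statistics import median

@dataclass
class Change:
    """A hypothetical record of a change moving through the delivery chain."""
    committed_at: datetime
    deployed_at: datetime

    @property
    def lead_time(self) -> timedelta:
        return self.deployed_at - self.committed_at

def same_day_ratio(changes: list[Change]) -> float:
    """Share of changes delivered within one day of their commit."""
    if not changes:
        return 0.0
    within_a_day = sum(1 for c in changes if c.lead_time <= timedelta(days=1))
    return within_a_day / len(changes)

if __name__ == "__main__":
    changes = [
        Change(datetime(2021, 3, 1, 9), datetime(2021, 3, 1, 16)),  # delivered the same day
        Change(datetime(2021, 3, 2, 9), datetime(2021, 3, 5, 11)),  # delivered three days later
    ]
    print("median lead time:", median(c.lead_time for c in changes))
    print(f"same-day delivery ratio: {same_day_ratio(changes):.0%}")
```

Tracked over time, such a ratio gives a factual basis to discuss whether the transition is accelerating delivery, rather than relying on the short-term feeling of slowing down.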
What place for CI/CD in QE?
That is a good question: CI/CD is often associated with DevOps, yet it remains linked to quality through the pattern of Quality Gates.
Our discussion led us to identify the CI/CD as one of the necessary building blocks in a QE process.
It remains to be integrated into the overall approach through its multiple impacts: cultural, organizational, process, tools, and maintenance.
A structurally and organizationally functional CI/CD is like moving from craftsmanship to a small factory.
Antoine Craske
On the cultural side, CI/CD requires, in part, that responsibility for software delivery be shared transversally, from design to use.
Approached only from a tooling angle, it risks running out of steam, bringing more constraints and frustration than the business value initially identified.
A CI/CD approach must therefore be framed in its transversality:
- What problems are we trying to solve by automating processes?
- How should the CI/CD be supplemented to resolve the problems identified?
- How does the CI/CD fit into the processes, from end to end, and with what impacts?
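To make the Quality Gate pattern concrete, here is a minimal sketch of a gate script a pipeline could run after its test stage. The metrics, thresholds and the quality-report.json file name are assumptions for the example, to be adapted to the measures the team actually agrees on.

```python
import json
import sys

# Illustrative thresholds; in practice they are agreed transversally, not set by one role.
THRESHOLDS = {
    "coverage_percent": 80.0,  # minimum line coverage
    "failed_tests": 0,         # no failing test may pass the gate
    "critical_issues": 0,      # e.g. blocker findings from static analysis
}

def evaluate(report: dict) -> list[str]:
    """Return the list of threshold violations found in the aggregated report."""
    violations = []
    if report.get("coverage_percent", 0.0) < THRESHOLDS["coverage_percent"]:
        violations.append("coverage below threshold")
    if report.get("failed_tests", 0) > THRESHOLDS["failed_tests"]:
        violations.append("failing tests")
    if report.get("critical_issues", 0) > THRESHOLDS["critical_issues"]:
        violations.append("critical issues still open")
    return violations

if __name__ == "__main__":
    # quality-report.json is a hypothetical file produced by earlier pipeline stages.
    with open("quality-report.json") as fh:
        report = json.load(fh)
    problems = evaluate(report)
    for problem in problems:
        print(f"quality gate: {problem}")
    # A non-zero exit code fails the CI stage and blocks the delivery.
    sys.exit(1 if problems else 0)
```

The point is less the script itself than where it sits: the gate only brings value if the whole chain, from developers to operations, agrees on what it checks and acts on its feedback.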
DoD is not limited to User Stories
Xavier shared with us his vision of Definition of Done (DoD), too often limited to development tasks.
There is no shortage of examples of other tasks: bug fixing, organization of a sprint or a release.
Each of the DoDs therefore deserves to be identified and defined in the various tasks of a team.
The Radar Screen of Definition of Done diagram, linked below, exemplifies the different typologies and levels of deliverables.
We note the usability and measurement criteria of the deliverable, which link back to the need to align with customer issues in the Quality Engineering process.
Concretely for your context:
- What are the most relevant typologies of deliverables in your context?
- Does your DoD incorporate automation, usability and quality requirements?
- How to measure and support its actual use? Are these DoDs co-defined, shared and automatically tracked in a task management tool?
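On the last question, one way to make a DoD automatically trackable is to express it as data per deliverable type and check it against what has actually been completed. A minimal sketch, with hypothetical deliverable types and criteria; a real DoD would be co-defined by the team and synchronized with its task management tool.

```python
# Hypothetical DoD criteria per deliverable type, to be co-defined by the team.
DOD_BY_TYPE = {
    "user_story": ["code reviewed", "automated tests added", "usability checked"],
    "bug_fix": ["root cause documented", "regression test added"],
    "release": ["release notes written", "rollback plan validated"],
}

def missing_criteria(deliverable_type: str, completed: set[str]) -> list[str]:
    """Return the DoD criteria not yet satisfied for a given deliverable."""
    expected = DOD_BY_TYPE.get(deliverable_type, [])
    return [criterion for criterion in expected if criterion not in completed]

if __name__ == "__main__":
    done = {"code reviewed", "automated tests added"}
    print(missing_criteria("user_story", done))  # -> ['usability checked']
```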
How to acculturate developers to testing?
This question emerged with the aim of mainstreaming the adoption of tests outside of QA.
This step seemed key to us in order to integrate quality throughout the software development and delivery cycle.
Changing your habits remains a challenge for human beings.
Sharing the issues, giving meaning, and explaining how this will bring value to the person concerned is necessary.
In the case of a developer, tests must therefore, at a minimum, make it possible to deliver more quickly code whose quality satisfies the developer as much as the client.
At the same time, these new possibilities must limit the negative impacts on what already exists and provide real value, at the risk of being put aside otherwise.
A developer must be able to access, use and view the different types of tests.
Florian Fesseler
We find Developer Experience (DevEx) to be one of the pillars of Quality Engineering, bringing value to the actors involved in the processes.
In return, this notion of platform requires it to have been planned and integrated upstream, following a shift-left pattern.
Guardrails are still necessary on the structural elements of the chain, secured by automation and complemented by well-thought-out access management.
If the team is asking this question once the delivery is underway, it is probably too late for the current iterations.
Possible avenues for extending tests beyond QA to developers:
- What are the expected objectives and how do they contribute to the overall objective?
- What use-cases and experience will be provided to the developer?
- How does this change existing habits? What are the benefits, obstacles and limitations?
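One practical way to give developers access to and visibility over the different types of tests, echoing Florian’s point, is to tag tests by typology so anyone can select and run a given level locally or in the pipeline. A minimal sketch using pytest markers; the marker names and examples are assumptions to be aligned with the team’s own test typology.

```python
# Markers are declared once in pytest.ini (or pyproject.toml), for example:
#   [pytest]
#   markers =
#       unit: fast, isolated tests
#       integration: tests touching an external dependency
import pytest

@pytest.mark.unit
def test_price_rounding():
    assert round(19.994, 2) == 19.99

@pytest.mark.integration
def test_order_is_persisted(tmp_path):
    # Illustrative stand-in for an external dependency (here, the filesystem).
    order_file = tmp_path / "order.txt"
    order_file.write_text("order-42")
    assert order_file.read_text() == "order-42"

# A developer can then choose the typology to run:
#   pytest -m unit           # fast feedback while coding
#   pytest -m integration    # before pushing, or in the CI pipeline
```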
Quality Engineering must provide a platform for the different actors
The value brought by including quality in the whole process also extends to other actors in the system.
Like QA, traditional operations teams must move towards providing a platform for developers.
A concrete example is providing gradual deployment (canary release, A/B testing, …) or supervision as a service.
These changes are widely discussed in the DevOps paradigm; it is interesting to draw the parallel for the other necessary reconciliations.
Quality Engineering aims for transversality between the different players, inspired by DevOps, focused on quality at the team level.
Antoine Craske
As for the Product teams, they must be able to iterate on different functionalities to judge the value for the customers.
This implies, for example, that the developer has planned the management of this gradual activation, and remains concerned with it through to production.
As a foundation, the operations teams must have enabled the use of such tools by development.
We also hope that the application is properly supervised, perhaps supplemented by testing in production?
The questions to ask:
- Which actors collaborating around the Product are likely to act on its quality?
- Which information, use cases and actions could enable them to contribute?
- What organizations, processes and tools could support the identified needs?
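To illustrate the gradual activation discussed above, here is a minimal sketch of a percentage-based rollout check, the kind of capability a feature-flag platform would expose as a service. The flag names, percentages and hashing scheme are illustrative assumptions; a real platform adds targeting, supervision and a kill switch.

```python
import hashlib

# Hypothetical rollout configuration, typically served by a feature-flag platform.
ROLLOUTS = {
    "new-checkout": 10,  # percentage of users exposed to the new behaviour
    "dark-mode": 100,
}

def is_enabled(flag: str, user_id: str) -> bool:
    """Deterministically expose a stable subset of users to a flag."""
    percentage = ROLLOUTS.get(flag, 0)
    digest = hashlib.sha256(f"{flag}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100
    return bucket < percentage

if __name__ == "__main__":
    exposed = sum(is_enabled("new-checkout", f"user-{i}") for i in range(1000))
    print(f"{exposed / 10:.1f}% of the sample sees the new checkout")
```

Because the bucketing is deterministic, a given user keeps the same experience across sessions, which makes the observed behaviour easier to supervise and to test in production.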
How to co-create a testing strategy?
A test strategy is too often defined by a single person (hint: someone from QA), lacking inclusion and transversal alignment.
This remains a real challenge: how to involve product management, architecture and developers in the definition of the test strategy?
Inspired by Moving Motivators from Management 3.0, Florian Zilliox’s “What’s My QA” card game gives us a facilitation approach.
The exercise must be genuinely collaborative and fun in its preparation and facilitation in order to guarantee balanced and engaged contributions.
The process consists of preparing a set of cards per participant, which they will each have to prioritize according to their point of view.
The team then shares its different perspectives and aligns its priorities for its testing strategy.
This approach – if correctly animated and supported – has certain advantages:
- The inclusion of the business and client objectives in the test strategy
- An improvement of collaboration through the empathy created between the participants
- The materialization of a transversal team dynamic
How to prioritize your quality effort?
We shared the observation that test pyramids lack the inclusion of business context and objectives.
They remain useful to frame the typologies of tests and their possible prioritization in a given context, historically that of QA.
Quality Engineering aims to provide structural integration of quality throughout the software engineering process, starting from the customer.
So what models can we use, as much for QA as for QE?
Xavier shared a personal model with us, the Software Testing Quadrants, which overlaps with certain points of Lisa Crispin and Janet Gregory’s approach to Agile Testing.
The model has the advantage of reinforcing the need for balance to structurally address quality.
There is a necessary balance between building the software well, with basic durability, its acceptance, and the response to the business need.
The four quadrants materialize the different objectives of the product by defining the “ready”, “goal”, “success” and “done”.
I find it interesting to note that some practices are not intended to be automated, which seems relevant given the maturity of the current ecosystem.
This model can support the collaborative exercise of test strategy in an agile approach by securing the following points:
- Are we addressing the different prisms of quality in our prioritization?
- Do we have a clear definition for each of the blocks?
- What organizational model can allow us to deliver the different blocks?
Quality Engineering, a collaborative platform approach
Our various discussions raised actionable points to initiate a QE process.
As with QA, there is the need to align the objectives of the organization and the success criteria that will be used.
The performance of the transition must be measured step by step, and deserves an iterative and incremental approach given the complexity of the actors and processes involved.
As with software, an architecture and design phase is necessary for a scalable and maintainable foundation, partly integrating CI/CD.
Since the objective of QE is to provide a real platform, it must be oriented towards the different actors, such as developers and the rest of the team.
Finally, the quality strategy defined collaboratively makes it possible to align the various actors and their objectives, and to materialize the necessary transversality.
Content mentioned
Radar Screen of Definition of Done
https://gearsoftesting.org/tempo-of-testing.html#content4-2b
What’s My QA – collaborative exercise on the test pyramid
https://github.com/FlorianZilliox/whatsmyqa/blob/master/whatsmyqa.pdf
Test typology
https://gearsoftesting.org/test-typology.html#content4-2u
Article Quality Engineering, Accenture
https://www.accenture.com/us-en/insights/technology/quality-engineering-new
Agile Testing, Lisa Crispin & Janet Gregory
https://agiletestingfellow.com/
Moving Motivators, Management 3.0