A Test Manager's Guide - Back to the Basics

Presentation Description

We teach, manage, consult and audit testing. Check us out to see what we can do for your Test Audit, Test Strategy, Test Management or Test Schooling needs. In Oct 2019 we launched the eBook: Test Manager's Guide and started work on an online Test Manager course. For details, visit our website: https://courses.cania-consulting.com

Presentation Transcript

slide 1:

A TEST MANAGER'S GUIDE BY LUCIAN DAN CANIA - ISTQB TEST MANAGER COURSE MATERIAL

slide 2:

A LITTLE BIT ABOUT ME. First of all, there is not a lot to say. I have 10 years of software test management experience with a diversified background in Telecom and Finance IT. Currently I am offering Test Management consulting services across Europe for either project delivery or quality assurance purposes. The advantage that I bring to the table is a cross-view from different perspectives, as I have also covered Delivery, Operations, Service Management, and Release and Deployment Management, which helps me better understand open points for a multitude of stakeholders.

slide 3:

Diverse Experience. Although my background is not in training and I am not a certified training consultant, I managed to get certified along my career as a PRINCE2 Practitioner and ITIL Intermediate for Service Management (Service Transition and Continual Service Improvement), and I am also certified as an AgilePM Practitioner and ISTQB Test Manager. In the past I worked on several transformation programs such as an ERP implementation and Retail Billing, and I have also worked across Europe on major technical upgrades and a global IFRS implementation.

slide 4:

Index: 1. Back to the basics, 2. Reviews, 3. Defect Management, 4. Test Management, 5. Test tools and automation, 6. People skills, 7. Improving the test process. Exam Prep at the END.

slide 5:

https://cania-consulting.com/ This book is based on the ISTQB Advanced Syllabus version 2012 and it also references the ISTQB Foundation Syllabus version 2018. It uses terminology definitions from the ISTQB Glossary version 3.2. This book is intended for testing professionals who have at least 2 years of experience and are interested in taking the ISTQB Advanced Test Manager certification exam. Reading this book does not guarantee a passing grade on the certification exam, and you should also study the ISTQB material provided for FREE at www.istqb.org

slide 6:

Back to the basics. In testing there are 4 Test Levels: Component / Unit Test, Integration Test, System Test, and Acceptance Test. They can be encountered when testing both a single system and a system of systems (multiple dispersed, independent systems placed in context as part of a larger, more complex system).

slide 7:

Component / Unit Test focuses on components that are separately testable. Integration Test focuses on interactions between components or systems. System Test focuses on the behavior and capabilities of a whole system or product, often considering the end-to-end tasks the system can perform and its non-functional behaviors. Acceptance Test focuses on the behavior and capabilities of a whole system or product: confidence, completion, fitness for use and purpose.

slide 8:

The best way to navigate across the Test Levels is to follow a predefined course, to constantly monitor it as you go along, and to apply course corrections whenever you see fit. In other words: start with a Plan that you will Monitor by using Metrics in order to Control.

slide 9:

Test Planning is the activity of establishing or updating a test plan; it starts at the initiation of the test process and is carried out in line with the Test Strategy. Test planning applies to each test level and also includes the methods for monitoring each of them. A Test Plan is a document describing the scope, approach, resources, and schedule of intended test activities. As a record of the test planning process it also covers: test items, features to be tested, testing tasks, who will do each task, the degree of tester independence, the test environment, test design techniques, entry and exit criteria with their rationale, risk assessment based on requirements, contingency planning based on risk assessment, and the integration of reactive test techniques into execution.

slide 10:

During test planning the Test Manager defines the approach for each level: what is tested, goals, objectives, test techniques and tools. In order to have effective planning we need to consider the complex relationships between test phases, but also between development and test. Some examples would be the requirement traceability matrix or the informal transfer of information. Another factor for effective planning would be the proper listing of the testing scope, with each feature associated with a design specification, environment, etc. Contact with all stakeholders has to be initiated at this stage, and all external dependencies have to be identified and service level agreements put in place. In order to properly measure progress, evaluate the entry and exit criteria, and exercise control, we need to put metrics in place starting with Test Planning. In other words, the requirement traceability matrix is a document that maps and traces user requirements to test cases. The main purpose of the Requirement Traceability Matrix is to verify that every requirement is covered by test cases, so that no functionality is missed during software testing.
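As a minimal illustration (in Python, not part of the original course material; the requirement and test case IDs are hypothetical), a traceability matrix can be kept as a simple mapping and queried for coverage gaps:

```python
# Minimal requirement traceability matrix sketch (hypothetical IDs and data).
# Maps requirement IDs to the test cases that cover them and reports gaps.
rtm = {
    "REQ-001": ["TC-101", "TC-102"],   # payment with credit card
    "REQ-002": ["TC-201"],             # payment with voucher
    "REQ-003": [],                     # refund flow - not yet covered
}

uncovered = [req for req, cases in rtm.items() if not cases]
coverage = 100 * (len(rtm) - len(uncovered)) / len(rtm)

print(f"Requirement coverage: {coverage:.1f}%")
print("Requirements without test cases:", uncovered or "none")
```

In practice the same information usually lives in a test management tool or a spreadsheet; the point is simply that every requirement should trace to at least one test case.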

slide 11:

Test Plan content example as per the IEEE 829 standard: test plan identifier, introduction, test items, features to be tested, features not to be tested, approach, item pass/fail criteria, suspension criteria and resumption requirements, test deliverables, testing tasks, environmental needs, responsibilities, staffing and training needs, schedule, risks and contingencies, approvals.

slide 12:

Metrics are a measurement scale and the method used for measurement. It is important that the proper set of metrics is established, as they are mainly used to measure the progress of testing. This will also enable testers to report results in a consistent way and with coherent tracking. Examples: % of test coverage, % of test execution, etc. Although they should be as automated as possible to allow an immediate understanding of where we are, metrics should be defined based on specific objectives that can also be presented to stakeholders at various meetings for various concerns. Project metrics measure progress toward established project exit criteria. Product metrics measure some attribute of the product, such as the extent to which it has been tested or the defect density. Process metrics measure the capability of the testing or development process, such as the percentage of defects detected by testing. People metrics measure the capability of individuals or groups, such as the implementation of test cases within a given schedule.
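As a rough sketch (Python, not from the original slides; all counts are invented), a few of the metric types above can be computed from raw numbers collected during a test cycle:

```python
# Hedged sketch: deriving project, product, and process metrics
# from hypothetical raw counts collected during one test cycle.
executed, planned = 412, 500             # project metric input: progress vs plan
defects_found, ksloc = 37, 12.5          # product metric input: defect density
defects_by_test, defects_total = 30, 37  # process metric input: detection share

project_progress = 100 * executed / planned       # % of planned tests executed
defect_density = defects_found / ksloc            # defects per KLOC
defect_detection_pct = 100 * defects_by_test / defects_total

print(f"Execution progress: {project_progress:.1f}% of plan")
print(f"Defect density: {defect_density:.2f} defects/KLOC")
print(f"Defects detected by testing: {defect_detection_pct:.1f}%")
```

In a real project these counts would come from the test management and defect tracking tools rather than being hard-coded.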

slide 13:

Monitor and Control. A testing schedule and monitoring framework need to be established to track progress versus plan. Because of this, all ongoing activities should have targets which are tracked via ongoing measurements. When I say all ongoing activities, I am also referring to test analysis, test design, and test implementation, not only to test execution. It is important to be able to relate the information and status of the test work products in an understandable and relevant manner based on the audience of those reports; not everyone is looking for the same level of detail. The aim of test control is to compare actual progress versus the plan and implement corrective actions. Common examples (shown as charts on the slide): test condition execution versus plan across the week (Mon to Fri) and a test case status breakdown (No Run 53.2%, Passed 27.5%, In Progress 9.2%, Failed 6.4%).
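A minimal sketch of the monitoring idea, assuming hypothetical cumulative daily targets and a simple 10% tolerance before a control action is flagged (neither the data nor the threshold come from the slides):

```python
# Hypothetical cumulative plan vs actual execution of test conditions per day.
plan =   {"Mon": 10, "Tue": 20, "Wed": 30, "Thu": 40, "Fri": 50}
actual = {"Mon": 9,  "Tue": 17, "Wed": 22, "Thu": None, "Fri": None}  # to date

for day, planned in plan.items():
    done = actual.get(day)
    if done is None:
        continue  # day not reached yet
    behind_pct = 100 * (planned - done) / planned
    status = "OK" if behind_pct <= 10 else "CONTROL ACTION NEEDED"
    print(f"{day}: planned {planned}, executed {done} "
          f"({behind_pct:.0f}% behind) -> {status}")
```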

slide 14:

Test Analysis is the process of analyzing the test basis (all documents from which the requirements of a component or system can be inferred) and defining test objectives. Test Design is the process of transforming general test objectives into tangible test conditions and test cases. Test Implementation is the process of developing and prioritizing test procedures, creating test data and, optionally, preparing test harnesses and writing automated test scripts. Test Execution is the process of running a test on the component or system under test, producing actual results.

slide 15:

Test Analysis is the process of analyzing the test basis (all documents from which the requirements of a component or system can be inferred) and defining test objectives. It covers WHAT is to be tested, in the form of test conditions, and can start as soon as the basis for testing is established for each test level. It can be performed in parallel, integrated, or iteratively with Test Design. It evaluates and reviews the test objectives and product risks while defining detailed measures and targets for success. Deciding on the level of detail should consider: the level of testing; the level of detail and quality of the test basis; system/software complexity and the development lifecycle used; project and product risk; the relationship between the test basis, what is to be tested, and how it is to be tested; the test management tool used; the level of maturity of the test process and the skills and knowledge of the test analysts; the level at which Test Design and other test work products are specified; and the availability of stakeholders for consultation.

slide 16:

A Test Condition is an item or event of a component or system that could be verified by one or more test cases (e.g., a function, transaction, feature, etc.). A test condition may or may not specify values or variables; it all depends on the context at that test level. Some might be generic, like "Test Payment", and others may be specific, like "Test Payment with VISA for 3 items and a cost over 100". Don't forget: if you go specific, then expect a higher number of test conditions. Check what you need at that stage and adapt; it may not be the same for Component Test as for System Test. Advantages of detailed test conditions: more flexibility in relating other test work products; better and more detailed monitoring and control; a contribution to defect prevention by occurring early; relating testing work products to stakeholders in terms that they can understand; influencing and directing other testing activities as well as other development activities; enabling test design, implementation, and execution to be optimized through more efficient coverage; a basis for clearer horizontal traceability within a test level.
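To illustrate how quickly specific conditions multiply compared with one generic condition, here is a small hypothetical sketch; the payment attributes are invented and not part of the course material:

```python
from itertools import product

# One generic condition versus the specific conditions it expands into.
generic = ["Test Payment"]

card_types = ["VISA", "MasterCard", "PayPal"]
basket_sizes = ["1 item", "3 items"]
amounts = ["under 100", "over 100"]
specific = [f"Test Payment with {card}, {basket}, cost {amount}"
            for card, basket, amount in product(card_types, basket_sizes, amounts)]

print(f"Generic conditions: {len(generic)}")
print(f"Specific conditions: {len(specific)}")  # 3 * 2 * 2 = 12
```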

slide 17:

Disadvantages of detailed test conditions: potentially time-consuming; maintainability can become difficult; the level of formality needs to be defined and implemented across the team. GO detailed when: lightweight test design documentation methods are used; there are little or no formal requirements or other development work products; the project is large-scale, complex, or high risk. GO generic when: doing component (unit) level testing; on less complex projects where simple hierarchical relationships exist; in acceptance testing where use cases can be utilized to help define tests.

slide 18:

Test Design is the process of transforming general test objectives into tangible test conditions and test cases. It covers HOW something is to be tested, by identifying test cases through stepwise elaboration of the test conditions from Test Analysis, or from the test basis, using techniques identified in the test strategy or plan. This phase can start for a given Test Level once Test Conditions are identified and enough information is available to enable the production of Test Cases. In other words, a test case is a set of input values, execution preconditions, expected results, and execution postconditions, developed for a particular objective or test condition, such as to exercise a particular program path or to verify compliance with a specific requirement. Although it can be merged with Test Analysis for higher levels of testing, it will remain a separate activity. It is likely that some tasks that normally occur during test implementation will be integrated into the test design process, especially when using an iterative approach. The coverage of test conditions, by creating either low-level or high-level test cases, can be optimized by the creation of test data starting in Test Design.
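A minimal sketch of what a concrete (low-level) test case could look like as a data structure, mirroring the definition above; the payment scenario and field values are hypothetical:

```python
from dataclasses import dataclass, field

# Sketch of a concrete test case: preconditions, inputs, expected result,
# and postconditions, tied back to the test condition it covers.
@dataclass
class TestCase:
    case_id: str
    condition: str
    preconditions: list = field(default_factory=list)
    inputs: dict = field(default_factory=dict)
    expected_result: str = ""
    postconditions: list = field(default_factory=list)

tc = TestCase(
    case_id="TC-101",
    condition="Test Payment with VISA for an order over 100",
    preconditions=["User is logged in", "Cart contains 3 items, total 120"],
    inputs={"card_type": "VISA", "amount": 120},
    expected_result="Payment accepted and order confirmation displayed",
    postconditions=["Order status set to 'Paid'"],
)
print(tc.case_id, "-", tc.condition)
```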

slide 19:

Test Implementation is the process of developing and prioritizing test procedures, creating test data and, optionally, preparing test harnesses and writing automated test scripts. This is when tests are organized and prioritized and when test designs are implemented as test cases, test procedures, and test data. It is of great importance to pick the right tests and run them in the right order; the importance of this grows even more in risk-based strategies, where we prioritize based on the likelihood of risk and problems. During this stage you should ensure: delivery of the test environment; delivery of test data; that constraints, risks, and priorities are checked; that the test team is ready for execution; and that the entry criteria (explicit and implicit) are checked. Some organizations may follow the IEEE 829 standard to define inputs and their associated expected results during testing. Others only have rigorous rules when they need to provide evidence of compliance for regulatory projects or for adherence to standards. In the most common cases the test inputs are usually documented together with expected results, test steps, and stored test data.
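As a hedged illustration of risk-based prioritization during test implementation (the procedures and the simple likelihood-times-impact scoring are hypothetical; a real scheme would likely be richer):

```python
# Order test procedures by a simple risk score (likelihood x impact, 1-5 each).
procedures = [
    {"id": "TP-01", "name": "Checkout end-to-end", "likelihood": 4, "impact": 5},
    {"id": "TP-02", "name": "Profile page layout",  "likelihood": 2, "impact": 2},
    {"id": "TP-03", "name": "Refund processing",    "likelihood": 3, "impact": 5},
]

for tp in sorted(procedures, key=lambda p: p["likelihood"] * p["impact"], reverse=True):
    print(f'{tp["id"]} (risk {tp["likelihood"] * tp["impact"]}): {tp["name"]}')
```

The highest-risk procedures come first in the execution order, which is the intent behind "pick the right tests and run them in the right order".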

slide 20:

Just like with test conditions and test cases, during test implementation we will also face the decision to go into an extensive, detailed stage or to take a light, generic approach. This decision should be driven by your understanding of the development lifecycle and by the predictability of the software features under test. For example, in agile or iterative lifecycles where code changes dramatically from iteration to iteration, the implementation work changes significantly between each stage. Please do not rule out extensive implementation preparation because of the above: concrete test cases provide working examples of how the software behaves; when tests are archived for long-term re-use in regression, these details may become valuable; domain experts are more likely to verify against a concrete test than against an abstract business rule; further weaknesses in the software specification are identified. Some defects can be found only in production-like test environments, which are often expensive to procure and difficult to configure and manage. Similar challenges are faced when using production data or production-like data, which can even lead to data privacy or other headaches.

slide 21:

Test implementation is not all about manual testing: this is the stage where automation scripting takes place, and the stage where automation-versus-manual prioritization and the execution order are established. And I am not talking only about automation; even tool acquisition is done here, especially for the test data generation required to prepare for load, volume, or performance testing. A quick reminder before moving forward: Test Suite - a group of test scripts together with a test execution schedule. Test Case - a set of preconditions, inputs, actions (where applicable), expected results, and postconditions, developed based on test conditions. Test Script - a sequence of instructions for the execution of a test. Test Charter - a statement of test goals and possible test ideas on how to test; documentation of test activities in session-based exploratory testing.

slide 22:

Test Execution is the process of running a test on the component or system under test, producing actual results. Before execution starts, the following should be finished: tests are designed or at least defined; tools are in place for test management, defect management, and test automation (if applicable); standards for test logging and defect reporting are published. Execution begins once the test object is delivered and the entry criteria for test execution are met. During execution, a Test Manager's role is to: monitor progress according to the plan; initiate and carry out control actions to guide testing; and ensure that test logs provide an adequate record of relevant details for tests and events. During execution it is important to keep traceability between the test conditions, the test basis, and the test objectives, and to have the appropriate level of test logging. Time should be reserved for experience-based and defect-based test sessions driven by the testers' findings.
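A small, hypothetical sketch of checking entry criteria before execution starts; the criteria names and their states are placeholders, not a prescribed list:

```python
# Illustrative entry-criteria check gating the start of test execution.
entry_criteria = {
    "test_object_delivered": True,
    "test_environment_ready": True,
    "tests_designed_or_defined": True,
    "defect_management_tool_in_place": False,
}

failed = [name for name, met in entry_criteria.items() if not met]
if failed:
    print("Execution blocked, unmet entry criteria:", ", ".join(failed))
else:
    print("All entry criteria met, execution can begin.")
```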

slide 23:

Entry Criteria: the set of generic and specific conditions for permitting a process to go forward with a defined task. Their purpose is to prevent a task from starting when it would entail more effort than the effort needed to remove the failed entry criteria. Exit Criteria: the set of generic and specific conditions, agreed with the stakeholders, for permitting a process to be completed. They prevent a task from being considered completed when there are still outstanding tasks not finished, and they are used to report progress against the plan and to know when to stop testing. A Test Manager should: ensure that effective processes are in place to provide the necessary information for evaluating entry and exit criteria; make sure that the definition of the information requirements and the methods for collection are part of test planning; and ensure that members of the test team are responsible for providing the required information in an accurate and timely manner. The evaluation of exit criteria and the reporting of results is a test management activity.
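For illustration only, a hedged sketch of evaluating exit criteria against collected metrics; the thresholds (95% pass rate, zero open critical defects, full requirement coverage) are assumptions for the example, not values from the syllabus:

```python
# Hypothetical exit-criteria evaluation based on collected metrics.
metrics = {"pass_rate": 96.5, "open_critical_defects": 1, "requirement_coverage": 100.0}

exit_criteria = [
    ("Pass rate >= 95%",            metrics["pass_rate"] >= 95),
    ("No open critical defects",    metrics["open_critical_defects"] == 0),
    ("Requirement coverage = 100%", metrics["requirement_coverage"] == 100),
]

for description, met in exit_criteria:
    print(f'{"MET    " if met else "NOT MET"} {description}')

print("Testing can be declared complete." if all(m for _, m in exit_criteria)
      else "Outstanding work remains; report progress against the plan.")
```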

slide 24:

We can measure properties of the test execution process, for example: the number of test conditions, test cases, and test procedures (planned versus executed, passed, and failed tests); total defects classified by severity (Medium 53.2%, Low 31.9%, High 10.6%, Critical 4.3%), by priority (Medium 56.8%, Low 22.7%, High 15.9%, Blocker 4.5%), and by status (Closed 58.1%, Open 11.6%, Rejected 11.6%, Verification 9.3%, Clarification 5.8%); change requests (Rejected 50%, Postponed 23.3%, New 16.7%, Accepted 10%); quality risks (Mitigated 65.2%, Deferred 21.7%, Acknowledged 13%); and planned versus actual execution and costs over time (tracked weekly and daily in the slide's charts).
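A tiny sketch of how such a breakdown by severity could be produced from a raw defect list; the defect data below is invented for illustration:

```python
from collections import Counter

# Hypothetical defect list; summarize totals and percentages by severity.
defects = ["Medium"] * 25 + ["Low"] * 15 + ["High"] * 5 + ["Critical"] * 2

counts = Counter(defects)
total = sum(counts.values())
for severity, count in counts.most_common():
    print(f"{severity}: {count} ({100 * count / total:.1f}%)")
```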

slide 25:

Test Closure consists of finalizing and archiving the testware and evaluating the test process, including preparation of a test evaluation report. Once test execution is deemed to be complete, the key outputs should be captured: Test completion check - ensuring that all test work is indeed concluded. Test artifacts handover - delivering valuable work products to those who need them. Lessons learned - performing or participating in retrospective meetings where important lessons are captured. Archiving - storing results, logs, reports, and other documents. These tasks are important, often missed, and should be explicitly included as part of the test plan. A Test Manager should: look for opportunities to reuse test work products; keep in mind that retrospectives should apply to testing as well as to the entire project, and indeed the wider organization. Problems tend to be systemic, not isolated.

slide 26:

Examples of tasks within a project. Waterfall: Requirements, Design, Implementation, Verification, Maintenance. Agile: Analysis, Planning, Design, Testing, Feedback, Deploy.

slide 27:

V-Model: Requirements pairs with Acceptance Test, Specification with System Test, System Design with Integration Test, and Unit Design with Unit Test, with Code (Development) at the base of the V; each test level is governed by a Level Test Plan.

slide 28:

Glossary. Each of the terms specified below is defined as per the ISTQB® Glossary, which is available online at: https://glossary.istqb.org/en/search/
software lifecycle: The period of time that begins when a software product is conceived and ends when the software is no longer available for use. The software lifecycle typically includes a concept phase, requirements phase, design phase, implementation phase, test phase, installation and checkout phase, operation and maintenance phase, and sometimes a retirement phase. Note that these phases may overlap or be performed iteratively.
system of systems: Multiple heterogeneous distributed systems that are embedded in networks at multiple levels and in multiple interconnected domains, addressing large-scale interdisciplinary common problems and purposes, usually without a common management structure.
test basis: The body of knowledge used as the basis for test analysis and design.
test planning: The activity of establishing or updating a test plan.
test plan: Documentation describing the test objectives to be achieved and the means and the schedule for achieving them, organized to coordinate testing activities.
measurement: The process of assigning a number or category to an entity to describe an attribute of that entity.
metric: A measurement scale and the method used for measurement.

slide 29:

test analysis: The activity that identifies test conditions by analyzing the test basis.
test design: The activity of deriving and specifying test cases from test conditions.
test case: A set of preconditions, inputs, actions (where applicable), expected results, and postconditions, developed based on test conditions.
test condition: An aspect of the test basis that is relevant in order to achieve specific test objectives.
test implementation: The activity that prepares the testware needed for test execution based on test analysis and design.
test execution: The process of running a test on the component or system under test, producing actual results.
test control: A test management task that deals with developing and applying a set of corrective actions to get a test project on track when monitoring shows a deviation from what was planned.
test script: A sequence of instructions for the execution of a test.
test procedure: A sequence of test cases in execution order, and any associated actions that may be required to set up the initial preconditions and any wrap-up activities post execution.
test log: A chronological record of relevant details about the execution of tests.
exit criteria: The set of conditions for officially completing a defined task.

slide 30:

test closure: During the test closure phase of a test process, data is collected from completed activities to consolidate experience, testware, facts, and numbers. The test closure phase consists of finalizing and archiving the testware and evaluating the test process, including preparation of a test evaluation report.
test summary report: A test report that provides an evaluation of the corresponding test items against exit criteria.
Test Techniques
boundary value analysis: A black-box test technique in which test cases are designed based on boundary values.
branch testing: A white-box test technique in which test cases are designed to exercise branches.
cause-effect graphing: A black-box test design technique in which test cases are designed from cause-effect graphs.
classification tree method: A black-box test design technique in which test cases, described by means of a classification tree, are designed to execute combinations of representatives of input and/or output domains.
condition testing: A white-box test design technique in which test cases are designed to execute condition outcomes.
condition determination testing: A white-box test technique in which test cases are designed to exercise single condition outcomes that independently affect a decision outcome.
control flow analysis: A form of static analysis based on a control flow graph.
data flow analysis: A form of static analysis based on the definition and usage of variables.

slide 31:

decision table testing: A black-box test technique in which test cases are designed to exercise the combinations of conditions and resulting actions shown in a decision table.
decision testing: A white-box test technique in which test cases are designed to execute decision outcomes.
defect-based test design technique: A procedure to derive and/or select test cases targeted at one or more defect types, with tests being developed from what is known about the specific defect type.
defect taxonomy: A system of hierarchical categories designed to be a useful aid for reproducibly classifying defects.
dynamic analysis: The process of evaluating behavior (e.g., memory performance, CPU usage) of a system or component during execution.
error guessing: A test technique in which tests are derived on the basis of the tester's knowledge of past failures or general knowledge of failure modes.
equivalence partitioning: A black-box test technique in which test cases are designed to exercise equivalence partitions by using one representative member of each partition.
exploratory testing: An approach to testing whereby the testers dynamically design and execute tests based on their knowledge, exploration of the test item, and the results of previous tests.
experience-based test technique: A procedure to derive and/or select test cases based on the tester's experience, knowledge, and intuition.
multiple condition testing: A white-box test design technique in which test cases are designed to execute combinations of single condition outcomes (within one statement).
pairwise testing: A black-box test design technique in which test cases are designed to execute all possible discrete combinations of each pair of input parameters.

slide 32:

path testing: A white-box test design technique in which test cases are designed to execute paths.
requirements-based testing: An approach to testing in which test cases are designed based on test objectives and test conditions derived from requirements, e.g., tests that exercise specific functions or probe non-functional attributes such as reliability or usability.
specification-based technique: A procedure to derive and/or select test cases based on an analysis of the specification, either functional or non-functional, of a component or system without reference to its internal structure.
static analysis: The process of evaluating a component or system without executing it, based on its form, structure, content, or documentation.
statement testing: A white-box test technique in which test cases are designed to execute statements.
state transition testing: A black-box test technique using a state transition diagram or state table to derive test cases to evaluate whether the test item successfully executes valid transitions and blocks invalid transitions.
structure-based technique: A procedure to derive and/or select test cases based on an analysis of the internal structure of a component or system.
test charter: Documentation of test activities in session-based exploratory testing.
use case testing: A black-box test technique in which test cases are designed to execute scenarios of use cases.
wild pointer: A pointer that references a location that is out of scope for that pointer or that does not exist.

slide 33:

Types of Testing
accessibility testing: Testing to determine the ease by which users with disabilities can use a component or system.
accuracy testing: Testing to determine the accuracy of a software product.
black-box testing: Testing, either functional or non-functional, without reference to the internal structure of the component or system.
heuristic evaluation: A usability review technique that targets usability problems in the user interface or user interface design. With this technique, the reviewers examine the interface and judge its compliance with recognized usability principles (the "heuristics").
interoperability testing: Testing to determine the interoperability of a software product.
maintainability testing: Testing to determine the maintainability of a software product.
operational acceptance test: Operational testing in the acceptance test phase, typically performed in a simulated operational environment by operations and/or systems administration staff, focusing on operational aspects, e.g., recoverability, resource-behavior, installability, and technical compliance.
portability testing: Testing to determine the portability of a software product.
recoverability testing: Testing to determine the recoverability of a software product.
reliability testing: Testing to determine the reliability of a software product.
security testing: Testing to determine the security of the software product.

slide 34:

suitability testing: Testing to determine the suitability of a software product.
usability testing: Testing to evaluate the degree to which the system can be used by specified users with effectiveness, efficiency, and satisfaction in a specified context of use.
white-box testing: Testing based on an analysis of the internal structure of the component or system.
Exercises (ISTQB / ASTQB): ISTQB® Foundation 2011: 3; ISTQB® Foundation 2018 Exam A: 7, 8; ISTQB® Foundation 2018 Exam B: 6, 7; ISTQB® Foundation 2018 Exam C: 6; ASTQB® Foundation Exam 1: 9, 10, 11, 12, 13; ASTQB® Foundation Exam 2: 9, 10, 11, 12, 13; ISTQB® Advanced Test Manager: 1, 2, 3, 4, 5, 6, 7, 8, 9; ASTQB® Advanced Test Manager: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14.

slide 35:

Exam Information. Exams are offered by the ISTQB® local Member Boards and Exam Providers; please contact them for information on exam dates, applicable fees, and booking. You can take the exams as many times as necessary to pass them; the fees for re-taking the exams are defined by the local ISTQB® Member Boards and Exam Providers. To take an Advanced Level exam, the candidate must hold the relevant Foundation Level Certificate, obtained by passing the relevant Foundation Level exam. The possession of the Foundation Level Certificate may have to be proven by the candidate. The Advanced Level Test Manager certificate is valid for life. The Advanced Level Test Manager exam is based on multiple-choice questions. Each multiple-choice question consists of a question stem and a list of possible options (solutions) and can be either single-answer or multiple-answer (Pick-N). Advanced Level exams may require the candidate to consider scenarios, and several questions in the exam may be based on one scenario. A single-answer question has 4 suggested options, of which one and only one is correct. For a Pick-N question, the candidate is presented with 5 suggested options, with 2 correct options to be selected. The Test Manager exam has 65 questions and a 180-minute duration (225 minutes for non-native English speakers). The minimum passing score is 75 out of 115 points (65%). Each exam is supervised by an authorized proctor who will distribute printed material or make material available electronically. The candidate cannot use any study material or electronic devices not supplied by the Exam Provider; only plain paper and simple non-programmable calculators are allowed. Candidates must return all notes and papers to the proctor before leaving the exam room.

slide 36:

Exam Preparation. This book is my representation of the ISTQB® Syllabus and Glossary, and I strongly recommend that you use them, as a minimum, besides this book. Other suggested preparation material may be found in the "References" chapter of each syllabus or in the References section of the ISTQB® website. I recommend that you do all the ISTQB® exercises listed below as soon as each chapter is done and cross-check the results against the answer sheet and syllabus. The ASTQB® exercises should be kept aside for a full exam simulation. You can download all the ISTQB® sample questions, answer sheets, and other documentation from www.istqb.org. Another useful website is www.astqb.org, but I would leave the exercises from there for an exam simulation.
Exam questions listed by chapter:
Chapter 1 - ISTQB® Foundation 2011: 3; ISTQB® Foundation 2018 Exam A: 7, 8; ISTQB® Foundation 2018 Exam B: 6, 7; ISTQB® Foundation 2018 Exam C: 6; ASTQB® Foundation Exam 1: 9, 10, 11, 12, 13; ASTQB® Foundation Exam 2: 9, 10, 11, 12, 13; ISTQB® Advanced Test Manager: 1, 2, 3, 4, 5, 6, 7, 8, 9; ASTQB® Advanced Test Manager: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14.
Chapter 2 - ISTQB® Foundation 2011: 14, 15; ISTQB® Foundation 2018 Exam A: 14, 15, 16, 17, 18; ISTQB® Foundation 2018 Exam B: 14, 15, 16, 17, 18; ISTQB® Foundation 2018 Exam C: 14, 15, 16, 17; ASTQB® Foundation Exam 1: 14, 15, 16, 17, 18; ASTQB® Foundation Exam 2: 14, 15, 16, 17, 18; ISTQB® Advanced Test Manager: 31, 32, 33, 34, 35; ASTQB® Advanced Test Manager: 40, 41, 42, 43, 44, 45.
Chapter 3 - ISTQB® Foundation 2011: 36; ISTQB® Foundation 2018 Exam A: 38; ISTQB® Foundation 2018 Exam B: 38; ISTQB® Foundation 2018 Exam C: 30; ASTQB® Foundation Exam 1: 37; ASTQB® Foundation Exam 2: 37; ISTQB® Advanced Test Manager: 36, 37, 38, 39; ASTQB® Advanced Test Manager: 46, 47, 48, 49.

slide 37:

Exam Preparation (continued). GOOD LUCK!
Chapter 4 - ISTQB® Foundation 2011: 29, 30, 31, 32, 33, 34, 35, 36; ISTQB® Foundation 2018 Exam A: 30, 31, 32, 33, 34, 35, 36, 37, 38; ISTQB® Foundation 2018 Exam B: 30, 31, 32, 33, 34, 35, 36, 37, 38; ISTQB® Foundation 2018 Exam C: 30, 31, 32, 33, 34, 35, 36, 37, 38; ASTQB® Foundation Exam 1: 30, 31, 32, 33, 34, 35, 36, 37, 38; ASTQB® Foundation Exam 2: 30, 31, 32, 33, 34, 35, 36, 37, 38; ISTQB® Advanced Test Manager: 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30; ASTQB® Advanced Test Manager: 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39.
Chapter 5 - ISTQB® Foundation 2011: 37, 38, 39, 40; ISTQB® Foundation 2018 Exam A: 39, 40; ISTQB® Foundation 2018 Exam B: 39, 40; ISTQB® Foundation 2018 Exam C: 39, 40; ASTQB® Foundation Exam 1: 39, 40; ASTQB® Foundation Exam 2: 39, 40; ISTQB® Advanced Test Manager: 46, 47, 48, 49, 50; ASTQB® Advanced Test Manager: 50, 51, 52, 53.
Chapter 6 - ISTQB® Advanced Test Manager: 51, 52, 53, 54, 55, 56; ASTQB® Advanced Test Manager: 54, 55, 56, 57, 58.
Chapter 7 - ISTQB® Advanced Test Manager: 40, 41, 42, 43, 44, 45; ASTQB® Advanced Test Manager: 59, 60, 61, 62, 63, 64, 65.
