Friday, April 11, 2014

MVA Course: Software Testing with Visual Studio 2012 Jump Start (Modules 4a & 4b)

Manage Test Execution


These modules mostly covered running tests and managing work items.


Certificate of Completion, go me!



Notes:

*******************************************************************************************
MODULE 4a
*******************************************************************************************

Run Tests
- running tests with options
- creating Fast Forward for Manual Testing recordings
- selecting Fast Forward Playback options (all steps, multiple steps)
- setting Test Run states
- validating expected results (add comments, snapshots)

The problem
- When a tester finds a bug, it is very common practice for the tester to repeat the exact
  same steps, often two or three times, to verify the bug and prove it is reproducible
// important that a bug is reproducible. If a bug can't be reproed, then the developers
// can't address it.
- This consumes a significant amount of a tester's time and often involves repeating large
  numbers of steps

Test Case – Fast Forward
The idea behind fast forwarding test cases is as follows.
- When a tester runs a manual test, the UI actions are recorded in the background
- When the tester wants to re-run the test case to verify the bug, the tester can choose
  to automatically rerun as many of the recorded steps as necessary to take them quickly
  to the area they are most interested in
- Action recordings are attached to the test case allowing automated playback in the future

DEMO
// if doing a test many times, might use lightweight diagnostics. If we believe we have
// found a bug, we can rerun the test with heavier diagnostics.
// explain adding comments and attachments to test run
// can specify the state of the test run (passed, failed, paused, blocked, NA)
// possible to connect to environments or take environment snapshots.
// comments can also be added to a particular step (as opposed to the entire run)

Perform Exploratory Testing
- perform ad hoc Exploratory Testing
- exploring by Work Item
- generating Test Case from Test
- generating bugs from Exploratory Testing
- adding screenshots, video, or audio recording

Exploratory Testing in MTM
- When you want to commence an exploratory testing session, you have the option to link
  any results to a particular requirement or not.
// sometimes referred to as ad hoc testing
// automated tests can "run ruts in your road"... nobody validates that the right things
// are being addressed.
// can automate a test against a bad UI, and because the testers don't really have to
// deal with the UI much, maybe there is no feedback that the UI is bad.
// Devs don't necessarily make good testers; exploratory testing gives very quick feedback
// There is value to approaching the problem in different ways
// Big challenge for exploratory testing is reproducing a bug, since it is a less
// structured approach to testing. Writing down all the steps taken can consume a great
// deal of time
// possible to start an exploratory test by right clicking a requirement and selecting
// "explore"

DEMO
// can just go "explore" without a requirement, or can "explore" a particular work item.
// we can explore with options, which lets us change the build config, test settings, and
// environment.
// With exploratory testing, no test steps (which is the point)
// bugs identified during exploratory testing can fix problems AND lead to test cases
// being created that can be used for regression testing.
// creating a bug will automatically capture the steps that were taken during testing.
// we can save the bug and create a test case. Steps captured for the bug are
// automatically turned into test steps.
// The exploratory testing session is recorded; one session can lead to multiple bugs and
// test cases being created.

EXAM BEST BETS
Practice executing test cases
Become familiar with test case fast forward
Make sure you are familiar with the Test Runner UI and its capabilities
Understand exploratory testing


*******************************************************************************************
MODULE 4b
*******************************************************************************************

Manage bugs
- tracking bug metrics (bug trends, status)
- verifying bugs (create Test from bugs)
- analyzing bug reports
- managing bug workflow
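
// A query-based way to drive bug verification: a minimal WIQL sketch (my own addition,
// not from the course; field names assume the Agile template) listing resolved bugs that
// are awaiting verification:
//
//     SELECT [System.Id], [System.Title], [Microsoft.VSTS.Common.ResolvedBy]
//     FROM WorkItems
//     WHERE [System.TeamProject] = @Project
//       AND [System.WorkItemType] = 'Bug'
//       AND [System.State] = 'Resolved'
//     ORDER BY [System.ChangedDate] DESC
//
// Run it in Visual Studio or Team Web Access, then use "Verify" in MTM on each result
// (see the demo notes below).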

DEMO
// when we find a bug during the test, we can select "create bug" which will create a bug
// in TFS. On the first run Scott used lightweight diagnostics, which didn't include video
// recording.
// The bug will be linked to the test case, and the test case will link to the bug
// If there is not already a test case, we can create a test case from the bug in MTM
// For a resolved bug, we can select "Verify" to confirm that bug was resolved.
// If the failing test step passes, and results in a passing test, the bug can be closed
// as verified. The test case's state remains Ready.
// If resolving the bug resulted in code changes, we can track which build the bug fix was
// integrated into.
// "State" is different for Agile, Scrum, and CMMI

Use Lab Center
- creating new environments
- creating copies of environments
- running Tests on remote environments
- executing Test Case in a Lab Center Environment

LAB MANAGEMENT AUTOMATION
A lab management automation solution reduces the cycle times to manage and provision
the test environments needed to enable continuous acceptance testing. A lab management
automation solution should include capabilities to:
1. Store and manage baseline templates for the needed test environments
2. Provision environments on demand using the environment templates
3. Provision environments to on-premise bare metal, private cloud, and public cloud
   infrastructures
4. Snapshot environments when defects are encountered and need to be reported
5. Provision environments from snapshots to enable defect reproduction and resolution
See the Test lab management scenario deck in the Quality Enablement BOM for more
information on the related capabilities of the Microsoft ALM solution.
// Environments include dev environments, testing environments, and repro environments
// Lab management services include environment templates that we can provision on the fly
// and snapshot.
// Environment provisioning is used to create new environments for development and
// testing. A snapshot of the test environment can then be taken and used for bug
// repro

CONTINUOUS BUILD-DEPLOY-TEST
Modern applications need to be deployed and tested in multiple environments.
Whether you store your test environments on bare metal or in the cloud, Lab Manager
enables automation and increases collaboration, reducing the build-deploy-test cycle time.
Talk Track:
With or without SCVMM, Lab Center lets you create and manage your testing environments
from a single location. Environments can be stored on bare metal, or virtualized on site
or in the cloud. Maintain multiple realistic environments so that your application can
be quickly deployed and tested under the configurations you expect your users to use.
// we can grab VM templates from SCVMM to create labs, and combine them to create
// environments.

AUTOMATION ENGAGE:
When developers and testers collaborate, quality wins.
Developers build the application, deploy it to an environment and run automated test
cases in a single step, preparing the way for testers.
Talk Track:
With Lab Manager, your deployment cycle times can be reduced in step with your
development and testing cycle times. Automating your deployment pipeline will reduce
manual errors, decrease cycle times, and increase quality.

MANAGE ENVIRONMENTS
Testers spin up environments to run manual tests, and test agents record all the data
required to recreate any bugs.
Snapshot environments and attach them to bug reports so developers can easily reproduce
the defect in the exact environment where it was found.
Talk Track:
Whether it is bare metal machines, or multi-part SCVMM environments, Lab Manager helps
you manage your test environments from a single location. Testers can quickly find the
right environment to test on, reducing hand-off times and decreasing manual deployment
defects. The ability to take environment snapshots when bugs are found significantly
reduces the likelihood of “no-repro” errors.

INTELLIGENT ANALYSIS
Enable data collectors and generate detailed test results to ensure any bugs that are
found are actionable.
Team Foundation Server keeps track of all the pieces, keeping your entire team on the
same page and reducing time to repair.
// test runs are attached to a particular build

DEMO
// can run tests on another device remotely, just need the device name and port (:6905 is
// the default), e.g. somemachine:6905

Analyze Recommended Tests
- selecting the build in use
- comparing the current build to a previous build
- viewing Recommended Tests
- analyzing related Work Items

DEMO
// when we select a build in use, we can select "Available builds" and see the work items
// that fall between the two. Just gotta make sure we aren't filtering out all the old
// builds.
// this requires "Test Impact Analysis" to work, plus ASP.NET client proxy for intellitrace
// on the local level.

Perform analysis
- analyzing reports (Requirements-User Stories)
- analyzing by Test Suite
- analyzing by configuration
- identifying areas where quality is low
- identifying Test Plan status
// multiple types of reporting: Excel, MTM, and SQL Server reports

SOFTWARE TESTING WITH VISUAL STUDIO
Software testing with Visual Studio 2012 is focused on testing all aspects of your
application.
1) Test planning and management, manual testing and exploratory testing are handled
   from Microsoft Test Manager.
2) Automated testing, unit testing, performance and load testing, and code analysis
   are handled from the Visual Studio IDE.
3) All aspects of the testing lifecycle are monitored and managed with Team
   Foundation Server, enabling up to date reporting across the entire project.
4) Extensibility points enable 3rd party solutions, customizable build engines and
   complex lab environments.
This module focuses on Lab Management; for more information on other aspects of the
Visual Studio testing solution, see the applicable materials contained within this
package.
// users, solution managers, devs, testers, ops, and stakeholders all feed quality reports
// into a data warehouse. This data warehouse is used to produce cross-project reports on
// Stories Overview, Test Failure Analysis, Bug Reactivation, Bug Trends, and Custom
// SSRS/SSIS
// users - action feedback, exploratory testing
// solution managers - requirements management, acceptance criteria definition
// devs - dev testing, build automation, actionable diagnostics
// testers - test planning and management, test execution and defect reporting, lab
//           management
// operations - production monitoring, integrated incident management
// stakeholders - quality metrics and reporting.

MONITORING QUALITY METRICS
Every aspect of the development process feeds data into a data warehouse, and is then
available for reporting. You can choose from detailed pre-built reports, create ad-hoc
reports with MS Excel, or build fully custom reports with SSRS/SSIS.

VISUALIZE QUALITY
The build report is a barometer of quality and a harbinger of success or failure.
As the development cycle nears the end, defects and failed builds should trend towards
zero. Build quality reports provide teams clear insight into the development process and
can indicate where teams need to focus their efforts.

RIGHT OUT OF THE GATE
Waiting until the end of a cycle to address defects can adversely impact schedule
and resources.
Desired ATDD behavior should see a high spike in failing acceptance tests early in the
cycle with the spike tapering down as the cycle progresses, indicating that tests are
being run early and uncovering defects early on.
Talk Track:
In an agile, acceptance-test-driven development cadence, the desired behavior is a high
spike in failing acceptance tests early in the development cycle, with the spike tapering
down as the cycle progresses. This indicates that the team is practicing ATDD: tests that
validate the desired acceptance criteria and quality outcomes are implemented and run
early (prior to and/or as code is developed), surfacing and resolving unmet acceptance
requirements as early as possible in the app lifecycle. The result is shorter overall
cycle times and mitigation of the higher cost of detecting unmet criteria later in the
lifecycle.
// reports allow you to see progress and project trends

BUG BASHING:
Development teams can use defect tracking reports to estimate levels of deliverable code
quality and to provide insight into a team’s ability to address defects.
A defect should ideally be detected and solved once, with a corresponding test
implemented to ensure the fix addresses the desired behavior. Bug reactivations should
remain close to zero.
Talk Track:
A defect, once detected and resolved, should ideally not resurface. Monitor defect
regressions to find opportunities to optimize practices and keep reactivations as low as
possible. Essentially, when fixing a reported bug, first implement a test to ensure that
the fix addresses the desired behavior, then instate the test as a quality gate to
prevent regression.
The Bug Reactivations report can be used to evaluate how effectively a team is addressing
defects, and the Test Failure Analysis report allows the team to monitor defect
regression based on test case failures.
// do we have regression issues, reactivated bugs? Reactivations are a symptom of a poorly
// described bug (or a lazy dev lol...)

FILLING IN THE GAPS
The only thing constant is change and code that changes must be tested.
To ensure your code base is fully covered by tests, leverage the code coverage report to
identify your testing gaps. A code base fully covered by tests helps ensure new changes
will not break your code!
// line by line code coverage reporting from manual tests

I CAN SEE CLEARLY NOW
A picture is worth a thousand words and a Stories Overview report gives that big picture!
The Stories Overview gives a comprehensive view of each implemented user story including
completion status, test results and bug status. A thousand words, indeed!
Talk Track:
Quality enablement is a full lifecycle undertaking. Teams should be able to trace
progress against the needed implementation and quality outcomes of business value
priorities, across every phase of the software lifecycle.
The stories overview and integrated progress report is a great way to land this storyline.
// stories overview report: tells us hours remaining, how many tests have been run, passed
// or failed, and bug status. If we have failing tests, we should probably have bugs
// reported. Also shows recent hours logged (say, what's been worked on over the last
// three days)

DEMO
// different reports based on the process template used (Agile, Scrum, CMMI).
// test plan progress report on MSDN
// Excel reports: Agile and CMMI are nearly identical; for Scrum there are pretty much
// none. Can take some Agile reports and move them over to Scrum.
// http://msdn.microsoft.com/en-us/library/dd997876.aspx - Excel Reports
// reports are a big part of exam
// MTM Testing Center - Plan - Results -> can filter test results by test suites or test
// configurations

Manage Work Items
- validating requirements
- Work Item relationships (e.g., what it means when a test case is associated with a
  requirement)
- creating Work Item queries
- performing bulk updates in Microsoft Excel

DEMO
// relationship of a bug to a test case to a user story
// the test case is related to both the bug and the user story
// if we open the bug, it is also related to the test case and user story.
// bug related to user story by "Related to"
// bug related to test case by "Tested by"
// in the test case, the user story is related by "Tests"
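// Those relationships can also be queried directly with a work item links query - a
// hedged sketch (my addition; the "Tested By" link type reference name is my best guess
// from the standard templates):
//
//     SELECT [System.Id], [System.Title]
//     FROM WorkItemLinks
//     WHERE [Source].[System.WorkItemType] = 'User Story'
//       AND [System.Links.LinkType] = 'Microsoft.VSTS.Common.TestedBy-Forward'
//       AND [Target].[System.WorkItemType] = 'Test Case'
//     MODE (MustContain)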
// we can create a work item query to tell us our requirements (user story) and bugs
// we can do this in Visual Studio or in the web interface.
// @Project is the current project, @Me is the current user
// Work Item Type = User Story OR Work Item Type = Bug. Need to group the clauses to
// ensure the query behaves as expected.
// @Today is today's date in days. You can subtract to knock off days: @Today - 7 is a
// week ago.
// Queries can be used to filter test cases in Test Case Manager
// Not equal is <>
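// Putting those macros together, a sketch of a query like the one in the demo (my
// reconstruction, not a transcript from the course):
//
//     SELECT [System.Id], [System.WorkItemType], [System.Title], [System.State]
//     FROM WorkItems
//     WHERE [System.TeamProject] = @Project
//       AND ([System.WorkItemType] = 'User Story' OR [System.WorkItemType] = 'Bug')
//       AND [System.ChangedDate] >= @Today - 7
//       AND [System.State] <> 'Closed'
//
// The parentheses are the grouping mentioned above; without them the OR does not bind as
// intended.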
// Bulk updates can be done in Excel. Select all the work items in Visual Studio,
// select "Open in Excel", make the bulk updates, and "Publish"
// MTM and VS can't do bulk edits natively. Web platform can do bulk updates.
// Edit test steps by round tripping to Excel...
// Ctrl-C, Ctrl-V from MTM into Excel, same to go back. Test cases cannot be opened
// natively in Excel
// Work Item Query - filter by test suite or test plan... No, these are not work items.
// However, all those relationships are in the data warehouse and can be pulled from there
// Can you add state or status to defects? You can extend work item types (part of Admin)
// Manage defects across multiple test environments? If you find a bug in one config, you
// have to rerun the test in every other environment. Can have test plans for different
// environments rather than test configurations.
// Clarification: Same test plan, Test and Prod environments. Might use different states to
// record the level of validation
// If you want a carriage return in a test step, use Alt-Enter

EXAM BEST BETS
- Become very familiar with work items, work item queries, and how work items are related
- Understand the lifecycle of a bug and its associated test case
- Know what Excel reports are available for each process template
- Know what SSRS reports are available for each process template
- Understand the basics of the Lab Center tab, including environments and an overview of
  Lab Management
