Friday, March 21, 2014

MVA Course: Software Testing with Visual Studio 2012 Jump Start (Modules 1 & 2)

Create and Configure test plans

Pretty heavy on the demos, which was a bummer because I don't have Hyper-V to run the VM and I don't have the permissions yet to start creating a bunch of stuff locally. Eh, here are my notes:


*******************************************************************************************
MODULE 1 & 2a
*******************************************************************************************

Software testing with Visual Studio 2012 is focused on testing all aspects of your 
application.

- Test planning and management, manual testing and exploratory testing are handled from 
Microsoft Test Manager.
- Automated testing, unit testing, performance and load testing, and code analysis are 
handled from the Visual Studio IDE.
- All aspects of the testing lifecycle are monitored and managed with Team Foundation Server, 
enabling up to date reporting across the entire project.
- Extensibility points enable 3rd party solutions, customizable build engines and complex 
lab environments.
// found out that if you pay attention to the slide notes, they have extra detail for the graphics.

This module focuses on Lab Management, for more information on other aspects of the Visual 
Studio testing solution, see the applicable materials contained within this package.

// this exam is less theory, more demo driven
// this course aimed primarily at testers, somewhat at devs
Create test plan properties.
- selecting test settings
- selecting configurations
- defining name, description, area path, and iteration
- selecting test environments
- assigning Build to Test Plan

Demo 
- selecting test settings
- selecting configurations
- defining name, description, area path, and iteration
- selecting test environments
- assigning Build to Test Plan
// looking at test plan in MTM (Microsoft Test Manager)
// creating another iteration with new test plan, configure with "properties" tab
// state and state transitions are very important. If a test plan is inactive, that 
// test plan will not be executed
// You need to have TFS in order to use MTM
// Anthony compares the relationship of MTM to TFS to the relationship between Outlook
// and Exchange
// Can use different test environments for manual runs and automated runs
// hitting "manage" on the environment takes us to the Lab center tab, where we can 
// create a new environment. Standard environments do not need SCVMM (System Center
// Virtual Machine Manager).
// Can add as many machines as we want, specify roles and tags. Tags are used to determine
// which test cases to run.
// UI tests can be configured with a user agent (browser) to use
// Verify that the test controller is configured. Running tests requires a valid test 
// controller. Start->Run->test controller configuration tool
// Can use this tool to select the account to run the controller with (local or Windows acct)
// Choose the TPC (Team Project Collection). Can also configure load testing
// Only need to set up a test controller once
// Couldn't get the refresh on his test controller, had to start over (just took a minute)
// Environment can't be saved without verifying configuration
// Once configured, it takes 5-10 minutes to create the environment, installing and 
// configuring test agents. In 2010 this was a pain in the butt...
// Once the test plan is built, we select the build to use in the plan (like say we have
// a build deployed, we could use that for one test plan, and compare it to the newest
// development build). Important for testers to know what has changed in the build;
// releasenotes.txt can make many problems go away.
// haha Anthony just absent-mindedly gave everyone the bird
// When assigning a build to the test plan, it pulls in the work items from TFS assigned to
// that build, so the tester will know what has changed (and thus what needs to be most
// tested)
// Can use "test impact analysis" to pick up changes that weren't tracked in TFS. Pops up when a 
// build is assigned to a test plan. (Demo shows how MTM recommends which regression tests
// to run.)
// Won't pick up a change in underlying database. Stored procs and unmanaged code will also
// not be picked up automatically.
// "Test impact analysis" looks at the stack trace. If the methods change, it picks up this 
// change. Not line by line, which is "code coverage"

Configure test settings.
- creating multiple test settings
- selecting data and diagnostics
- setting up roles

Demo
- creating multiple test settings
- selecting data and diagnostics
- setting up roles
// In data and diagnostics, we can specify what we want to collect. Collectors include:
// action log - collects UI actions
// ASP.NET Client proxy for intellitrace and test impact - web apps in IIS and .NET 2
// code coverage
// event log
// intellitrace
// system information
// test impact
// video recorder
// Can configure any of these; (action log) can specify which programs to exclude or include.
// Can capture user delay (ostensibly to see if there are long async calls... or to show
// how lazy the testers are).
// Playback settings helpful if website DOM is being manipulated
// Some things with a configure button don't have anything interesting to configure
// can configure event log to capture application, hardware logs, errors, failure audits, 
// information, success audits, and limit max entries
// system information captures OS patch level, resolution, etc.
// Test impact lets you configure which modules and processes to capture information from
// video recorder captures a video recording as the test runs. Always saved for failed test
// runs, can keep passed test recordings too, but not recommended. Vid quality doesn't
// need to be great.
// pick up code coverage on the web server side.
// IntelliTrace has a lot of options. "IntelliTrace events and call information" gives the
// dev a lot of insight into errors. Select modules, processes, and which IntelliTrace
// events to record. IntelliTrace uses a lot of storage, and you need VS Ultimate to read
// IntelliTrace logs.
// Video on web server probably would be boring...
// Can save test settings, so have a heavy capture test setting and a lightweight setting
// rule of thumb: every dev contributes about half a meg per day; every tester contributes 
// about half a gig per day (so 20 devs and 10 testers add roughly 10 MB + 5 GB a day). 
// Size the TFS server accordingly
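// For reference, on the Visual Studio side a saved test setting is a .testsettings XML file
// with one DataCollector entry per diagnostic adapter. Rough sketch from memory -- the
// setting name is made up and the collector URIs/friendly names are my best guess, so
// treat them as assumptions and check an actual exported file:

  <?xml version="1.0" encoding="UTF-8"?>
  <TestSettings name="HeavyCapture" xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2010">
    <Description>Heavy diagnostic capture for manual runs</Description>
    <Execution>
      <AgentRule name="LocalMachineDefaultRole">
        <DataCollectors>
          <!-- OS patch level, resolution, etc. -->
          <DataCollector uri="datacollector://microsoft/SystemInfo/1.0" friendlyName="System Information" />
          <!-- application/system event log entries, capped at a max entry count -->
          <DataCollector uri="datacollector://microsoft/EventLog/1.0" friendlyName="Event Log" />
          <!-- screen recording of the run; always kept for failed runs -->
          <DataCollector uri="datacollector://microsoft/VideoRecorder/1.0" friendlyName="Screen and Voice Recorder" />
        </DataCollectors>
      </AgentRule>
    </Execution>
  </TestSettings>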

Exam Best Bets
- Create a few different Test Plans
- Practice creating different Test Settings
- Understand the Data Collectors


*******************************************************************************************
MODULE 2b
*******************************************************************************************

// looking at builds in Visual Studio (Build Explorer). They can assign a build quality, which
// sets a flag that MTM can use to filter builds.

Define configurations
- managing configuration variables
- setting default configurations
- adding new configurations
- setting configuration state
- deleting configurations

Why different configurations?
Different configurations exist
- SOE/MOE (Can you proactively test for upcoming releases?)
- Users will often have varying:
   - Browsers and browser versions
   - Operating systems
   - Architectures (x86 and x64)
- Server side software may have different:
   - Database versions (SQL 2005, 2008, 2008 R2)
   - IIS versions (6.0/7.0/7.5)
// don't necessarily want a huge number of configurations
   
Dealing with different configurations
- The traditional solution
   - A room with many PCs around the outer wall with signs above them, e.g. Windows XP 
     Pro/IE7/x86/2 GB RAM
- A better solution
   - Multiple virtual machines running on a tester's local machine so they can easily 
     switch between configurations
- A more ideal solution
   - Microsoft Lab Management running multiple network-isolated test environments with 
     auto refresh and auto deploy of updated releases.
// Lab management takes care of a lot of the environment overhead, easy to maintain 
// environment state with snapshots
 
Tracking different configurations
Traditional approach
Test cases mapped to configurations in a matrix
// might have a matrix in Excel, cross-checking which test cases to run in which 
// configuration. So you wouldn't test Aero support (introduced in Vista) in WinXP 
// configs.
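// A made-up slice of such a matrix (the test names are hypothetical, just to show the shape):
//
//                         XP/IE7/x86    Win7/IE9/x64    Win8/IE10/x64
//   Login succeeds            X              X               X
//   Aero glass renders        -              X               X
//   64-bit install            -              X               X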

Configurations
To manage your configurations, choose Organize in the menu then choose Test Configuration 
Manager

Deleting Test Configurations
- You can delete test configurations that you do not require.
- If a test configuration is referenced in a test result or is a default configuration for a 
  test plan or a test suite, you cannot delete that configuration. 
- Alternatively, you can change the state of the configuration to inactive so that it can no 
  longer be selected as a default configuration for a test suite or a test plan
// changing to inactive is a better choice, makes the test config invisible while maintaining
// historical information

Demo
- managing configuration variables
- setting default configurations
- adding new configurations
- setting configuration state
- deleting configurations
// using the Keller VM, which requires Hyper-V to run... Hyper-V requires a server OS or at
// least Windows 8 on the client.
// "Test Configuration Manager" in the testing center in MTM
// Friendly name of configuration doesn't necessarily need to describe the settings as long
// as it has meaning within your organization.
// "Manage Configuration Variables" lets you change which values show up in the configuration
// editor. So you can add different versions of IE to the browser list, add Windows 8 to the
// OS. We can also add a whole new variable, like "architecture" which might have x86, x84, 
// and ARM, or add a variable for SQL Server.
// A description can also be added, which is a good idea
// once configuration is created, it can be added to a test plan as described in module 2a
// (same screen where you assign builds and change settings).

Create Test Suites
- creating query-based Test Suites
- creating requirement-based Test Suites
- creating static Test Suites
- copying Test Suites
- creating a Test Suite hierarchy
- assigning Test Suite states

// Anthony tries to draw on the screen lol
Different Types of Test Suites
- Requirement-based Test Suites
   - Groups test cases related to a specific requirement
   // name of test suite will probably include a number for a work item
- Static Test Suites
   - Groups test cases based on the tester manually adding test cases
   // arbitrarily choose which test cases to include in the test suite
- Query-based Test Suites
   - Groups test cases that match a filter the tester defines
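   // Illustration only: the filter is an ordinary TFS work item query. The field names
   // below are the standard TFS reference names; the clause itself is just an example I made up:
   //   SELECT [System.Id], [System.Title]
   //   FROM WorkItems
   //   WHERE [System.WorkItemType] = 'Test Case'
   //     AND [Microsoft.VSTS.Common.Priority] <= 2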

Conveying test suite state information
Test Suites can be 
- In planning
- In progress
- Completed
Change state using
Plan Tab | Contents

Copying Test Suites
- Sometimes you need to create a test suite that is the same or similar to one you have 
  created in an earlier test plan.
- Microsoft Test Manager allows you to just copy the test suite to the current plan.
- Once copied you can add or remove test cases from this copied test suite as required, 
  without affecting the original test suite.
- Copying a test suite does not create duplicate copies of the test cases in your team 
  project. The existing test cases in the test suite that you are copying are just added 
  to the new test suite.
  // being able to copy test plans and suites is important when our cycle times are shorter
  // "pull" a copy into the current test plan/suite.
  
Cloning a Test Suite
- Cloning test suites by using tcm.exe creates new test cases in the destination test plan. 
- These new test cases are copies of the test cases in your source test plan. After the 
  copy, you can edit the test cases in either plan without affecting the other. 
- Cloning test suites is useful when you want to work on two differing releases 
  simultaneously. 
- The source and target suites must be in the same team project collection. (MTM RTM)
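  // Roughly what the tcm.exe clone call looks like (the IDs and URL here are made up; run
  // "tcm suites /?" to confirm the exact switches on your version):
  //   tcm suites /clone /suiteid:123 /destinationsuiteid:456
  //       /collection:http://myserver:8080/tfs/DefaultCollection /teamproject:MyProject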

Summary - Copying vs Cloning
Copying Test Suites
Test Plan A              Test Plan B
  Test Suite               Test Suite
    Test Case 1  -copy-->    Same Test Case 1
    Test Case 2  -copy-->    Same Test Case 2
// changes to the original test case will be reflected in the copied test cases (and 
// changing the copies changes the originals)

Cloning Test Suites
Test Plan A              Test Plan B
  Test Suite               Test Suite
    Test Case 1  -clone->    New Test Case 3
    Test Case 2  -clone->    New Test Case 4
// changes to original test cases do NOT carry over to cloned test cases (or vice versa)
// copying points to the same test cases; cloning recreates the test cases

Demo
creating query-based Test Suites
creating requirement-based Test Suites
creating static Test Suites
copying Test Suites
creating a Test Suite hierarchy
assigning Test Suite states
// testing center is blue, lab center is green
// In testing center, we can create a new test suite. Requirements suite is separated from
// static and query suites.
// In the requirements suite, we can add the requirement we want to test. Test cases created
// in this suite will automatically be linked to this requirement.
// Static suite is basically a folder, we drop in whatever test cases we want
// Query based suite lets us build up a query with and/or, fields, operators, and values
// can create a hierarchy of test suites (suites within suites)
// Part of regression tests may be a group of end-to-end tests
// Scott on a soap box on different default test configurations for test suites (as opposed
// to test plan). These are usually inherited from their parent (either a parent suite or
// the test plan).
// we can change the status of test suites to "in planning", "in progress", or "completed"
// we can also create test suites by referencing existing test cases. We select other test
// plans, then select suites from those test plans. So if we create a new iteration, we 
// don't have to recreate all the old test suites from the old test plan, we can copy them
// forward.
// command line tcm suites /? -- powerful command line tool for managing test suites
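//   e.g. to look up the suite IDs you'd feed to /clone -- my best guess at the switches,
//   /? is authoritative, and the URL/ID are made up:
//     tcm suites /list /planid:42 /collection:http://myserver:8080/tfs/DefaultCollection /teamproject:MyProject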
// cloned test cases maintain a "link" to their parent test suite
// also possible to clone requirements
//  - requirements will link to their test cases.

Configure Test Suites
- assigning Testers
- selecting configurations
- ordering Test Cases within a Test Suite
- setting defaults

Assigned to
- The Assigned to field for Test Cases is different to the typical Assigned To for other 
  work item types such as Task.
- The person that writes a test case may not be the person who runs it.

Ordering Test Cases within a Test Suite
- You might want to list and run your test cases in a specific order.
- For example, you might want to run the simplest tests first, because if these tests failed 
  then the subsequent tests in your test suite would be blocked
// you can only order static suites
  
Demo
- assigning Testers
- selecting configurations
- ordering Test Cases within a Test Suite
- setting defaults
// select a test suite, then configure the test cases.
// "assigned to" in a test case is really the cases author
// In the test plan page, we can assign who is assigned to run the test
// test cases in a static suite are listed in the order they were created. The order can
// then be changed with "order"

EXAM BEST BETS
- Make sure you try a few different configurations
- Be mindful of idiosyncrasies of Test Case work items compared to other work item types
- Practice creating different Test Suite types
