# Tool support for test execution and logging

###### tags: `ISTQB` `SQA` `Test tools`

### Test comparators

Test comparators compare the contents of files, databases, XML messages, objects and other electronic data formats. **This allows expected results and actual results to be compared.** ==They can also highlight differences.== They often have functions that **allow specified sections of the file, screen or object to be ignored or masked out.** This means that a date or time stamp on a screen or field can be masked out, because it is expected to be different each time a comparison is performed (see the masking sketch at the end of this section).

:::success
Some items being compared may differ only in the date; in that case the ignore/mask-out feature can be used to exclude this kind of difference.
:::

**Comparators are particularly useful for regression testing, since the contents of output or interface files should usually be the same.** Comparators are **usually included in test execution tools.**

### Test execution tools

Test execution tools allow test scripts to be run automatically (or at least semi-automatically). A test script (written in a programming language or scripting language) is used to navigate through the system under test and to **compare predefined expected outcomes with actual outcomes.** The results of the **test run are written to a test log.** Test scripts can then be **amended and reused to run other or additional scenarios through the same system.**

These tools may include utilities for:

- configuring the script to identify particular GUI objects;
- customising the script to take specified actions when it encounters particular GUI objects or messages;
- parameterising the script to read data from various sources (see the data-driven sketch at the end of this section).

#### Record (or capture playback) tools

Record (or capture playback) tools **can be used to record a test script and then play it back exactly as it was executed.** When the script is played back, however, it may fail because of unexpected results or unrecognised objects. For example:

- When the script was recorded, the customer record did not yet exist. When the script is played back, the system correctly recognises that this customer record already exists and produces a different response, causing the test script to fail.
- When a test script is played back and actual and expected results are compared, a date or time may be displayed that differs from the recorded one. The comparison facility will spot this difference and report a failure.
- Other problems include the inability of test execution tools to recognise some types of GUI control or object. This might be resolved by coding or by reconfiguring the object characteristics (but this can be quite complicated and should be left to experts in the tool).
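
The masking idea from the comparator section can be illustrated with a minimal sketch. This is not any particular tool's API; it assumes (for illustration only) that volatile fields are ISO-style `YYYY-MM-DD HH:MM:SS` timestamps. Real comparators let you configure which regions or patterns to ignore.

```python
import re

# Assumption for this sketch: the volatile fields to ignore are
# ISO-style "YYYY-MM-DD HH:MM:SS" timestamps.
TIMESTAMP = re.compile(r"\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}")

def mask_volatile(text: str) -> str:
    """Replace timestamps with a fixed token so they never cause a mismatch."""
    return TIMESTAMP.sub("<MASKED>", text)

def compare(expected: str, actual: str) -> list[str]:
    """Return a list of differences, ignoring the masked-out fields."""
    diffs = []
    exp_lines = mask_volatile(expected).splitlines()
    act_lines = mask_volatile(actual).splitlines()
    for i, (e, a) in enumerate(zip(exp_lines, act_lines), start=1):
        if e != a:
            diffs.append(f"line {i}: expected {e!r}, got {a!r}")
    if len(exp_lines) != len(act_lines):
        diffs.append(f"line counts differ: {len(exp_lines)} vs {len(act_lines)}")
    return diffs

expected = "Report generated 2024-01-01 09:00:00\nTotal: 42"
actual   = "Report generated 2024-05-30 17:13:05\nTotal: 42"
print(compare(expected, actual))  # [] -- the timestamp difference is masked out
```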
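
The "parameterising the script to read data from various sources" utility is essentially data-driven testing. Below is a minimal sketch, assuming a hypothetical CSV file `login_cases.csv` with columns `username`, `password`, `expected`, and a hypothetical `login()` stand-in for driving the system under test. The point is that one script body is reused for many data rows and each outcome is written to a test log.

```python
import csv
import logging

# Test execution tools write the results of each run to a test log.
logging.basicConfig(filename="test_run.log", level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(message)s")

def login(username: str, password: str) -> str:
    """Stand-in for driving the system under test; returns its response.

    The rule below is hypothetical and exists only to make the sketch runnable.
    """
    return "WELCOME" if password == "secret" else "ACCESS DENIED"

def run_data_driven_tests(data_file: str) -> None:
    """One script body, many data rows: a simple data-driven test loop."""
    with open(data_file, newline="") as f:
        for row in csv.DictReader(f):  # columns: username, password, expected
            actual = login(row["username"], row["password"])
            if actual == row["expected"]:
                logging.info("PASS %s -> %s", row["username"], actual)
            else:
                logging.error("FAIL %s: expected %s, got %s",
                              row["username"], row["expected"], actual)

# run_data_driven_tests("login_cases.csv")  # hypothetical data file
```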