# OBSOLETE - [WebGPU CTS](https://github.com/gpuweb/cts) Test Plan

This document formerly held all of the unimplemented test plans. These have been moved into the gpuweb/cts repository. **Further changes should be made there. The guidelines below are outdated.**

---
---
---
---
---

## Contents of this document

Everything in this document (once it is up-to-date with these guidelines) should be **unimplemented**. Once tests are in the process of being implemented, the plan **moves** to the `description` of a `.spec.ts` file (or to a `README.txt` file) in the CTS. This prevents multiple sources of information from getting out of sync. The contents of the `description`s and `README.txt` files appear as text in the CTS's standalone runner.

The test implementation To-Do list is:

- Anything in this document.
- Any testing labeled TODO in the CTS.
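For illustration, here is a rough sketch of what a plan looks like once it has moved into a `.spec.ts` file. The file path, test name, and import paths are hypothetical (they depend on where the file lives and on the CTS revision); the key point is that the plan text lives in `description`, is displayed by the standalone runner, and unimplemented parts stay marked as TODO.

```ts
// Hypothetical file: src/webgpu/api/operation/example.spec.ts
export const description = `
Plan text moved out of this document goes here; it is displayed in the
standalone runner.

TODO: cases from the plan that are not yet implemented stay listed here.
`;

import { makeTestGroup } from '../../../common/framework/test_group.js';
import { GPUTest } from '../../gpu_test.js';

export const g = makeTestGroup(GPUTest);

g.test('implemented_case')
  .desc('One case from the plan that has been implemented.')
  .fn(t => {
    // Test body: use t.device, t.expect(), etc.
  });
```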
## Test planning and implementation workflow

### Lifetime of new test plans

For anything in the spec/API/ToC that is not currently covered by the test plan.

Note that (the completed portions of) the initial version of this document is based on parts of the planned [Table of Contents](https://github.com/gpuweb/gpuweb/wiki/Table-of-Contents) as of ~2020-08.

1. Test plan is added to this document.
1. Test plan is reviewed (reviewer should be the test implementer, if different from the test planner).
1. Tests are implemented\*.

### Lifetime of test plan changes to match spec changes

For changes that come through the specification process.

TODO: Rename the columns to "Needs Test Plan" and "Specification Done".

1. Spec changes go through the [WebGPU spec project tracker](https://github.com/orgs/gpuweb/projects/1).
1. Once they reach the "Needs Test Plan" column, they can be added to this document (or directly into the CTS) and then moved to the "Specification Done" column.
   - Some features may have tests written before their specification is complete. If they are still in the "Needs Specification" column, just make sure there is a note on the issue that tests are being written, and make sure any spec changes get reflected in the tests.
   - If adding to an unimplemented test plan in this document, just edit the test plan directly. If adding to a test plan that has already been moved into the CTS, either:
     - Leave a TODO in this document describing the changes needed, or
     - Open a PR leaving a TODO in the test description.
1. Test plan is reviewed (if needed).
1. Tests are implemented\*.

### Lifetime of additions to existing test plans

For any new cases or testing found by any test plan author, WebGPU spec author, WebGPU implementer, WebGPU user, etc. For example, inspiration could come from reading an existing test suite (like dEQP or WebGL). These may have been filed as issues against the CTS repository on GitHub.

1. Plan is added to this document (or directly into the CTS). The issue, if any, can be closed so it's tracked in just one place.
1. Test plan is reviewed (if needed).
1. Tests are implemented\*.

### Requesting a review

**To request a review on test plan edits, notify @kainino0x in a HackMD comment, or someone else via GitHub or email.**

If comfortable, feel free to begin on test implementation before review; most review comments will just suggest additional corner cases or generalizations.

### \* Implementing tests

1. When a test implementer picks up any item in this document, they **move** it into the test description as part of their PR, leaving a note that it is in progress.
   - If not implemented in the same PR, the test plan item must be marked as "TODO" in the test description.
1. Once the PR is open (may be draft), update this document to **link to the PR.**

## Conventions

### Terms/Syntax

- `Iff`: If and only if
- `x=` or <code>&times;=</code>: Cartesian product (for combinatorial test coverage)
  - Sometimes this will result in too many test cases; simplify as needed during planning or implementation.
- `{x,y,z}`: list of cases to test
- Control case: a case included to make sure that the rest of the cases aren't missing their target by testing some other error case.

### Validation tests

Validation tests check the validation rules that are (or will be) set by the WebGPU spec. Validation tests try to carefully trigger the individual validation rules in the spec, without simultaneously triggering other rules.

Validation errors *generally* generate WebGPU errors, not exceptions. But check the spec on a case-by-case basis.

Test parameterization can help write many validation tests more succinctly, while making it easier for both authors and reviewers to be confident that an aspect of the API is tested fully. Examples:

- [`webgpu:api,validation,render_pass,resolve:resolve_attachment:*`](https://github.com/gpuweb/cts/blob/ded3b7c8a4680a1a01621a8ac859facefadf32d0/src/webgpu/api/validation/render_pass/resolve.spec.ts#L35)
- [`webgpu:api,validation,createBindGroupLayout:bindingTypeSpecific_optional_members:*`](https://github.com/gpuweb/cts/blob/ded3b7c8a4680a1a01621a8ac859facefadf32d0/src/webgpu/api/validation/createBindGroupLayout.spec.ts#L68)

Use your own discretion when deciding the balance between heavily parameterizing a test and writing multiple separate tests.

### Operation tests

Operation tests test the actual results of using the API. They execute (sometimes significant) code and check that the result is within the expected set of behaviors (which can be quite complex to compute).

Note that operation tests need to test a lot of interactions between different features, and so can become quite complex. Try to reduce the complexity by utilizing combinatorics and helpers, and by splitting/merging test files as needed.

These should be `GPUTest`s, so any validation errors are test failures. These tests should not explicitly try to trigger validation errors, but could choose to do so if it simplifies the combinatorics. When it's easier to write an operation test with invalid cases, use `ParamsBuilder.filter`/`.unless` to avoid invalid cases, or detect and `expect` validation errors in some cases.

**Default value tests** (for arguments and dictionary members) should usually be operation tests: all you have to do is make sure the behavior with `undefined` has the same expected result as when the default value is specified explicitly.
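To make the last two points concrete, here is a hedged sketch of an operation test that uses `.unless` to skip invalid parameter combinations and includes `undefined` alongside the explicit default. The file path, test name, and exact `ParamsBuilder` syntax are illustrative and vary by CTS revision; the behavioral facts it relies on (`mappedAtCreation` defaults to `false` and requires a size that is a multiple of 4) come from the WebGPU spec.

```ts
// Hypothetical file: src/webgpu/api/operation/buffer_defaults.spec.ts
export const description = `Checks that mappedAtCreation: undefined behaves like the default (false).`;

import { makeTestGroup } from '../../../common/framework/test_group.js';
import { GPUTest } from '../../gpu_test.js';

export const g = makeTestGroup(GPUTest);

g.test('mappedAtCreation_default')
  .params(u =>
    u //
      .combine('mappedAtCreation', [undefined, false, true])
      .combine('size', [4, 6])
      // mappedAtCreation: true requires a size that is a multiple of 4; skip that invalid combination.
      .unless(p => p.mappedAtCreation === true && p.size % 4 !== 0)
  )
  .fn(t => {
    const { mappedAtCreation, size } = t.params;
    const buffer = t.device.createBuffer({ size, usage: GPUBufferUsage.COPY_DST, mappedAtCreation });

    if (mappedAtCreation === true) {
      // Mapped at creation: the whole buffer is mappable immediately.
      t.expect(buffer.getMappedRange().byteLength === size);
    } else {
      // Both undefined and the explicit default (false) start unmapped, so getMappedRange() throws.
      let threw = false;
      try {
        buffer.getMappedRange();
      } catch {
        threw = true;
      }
      t.expect(threw, 'getMappedRange() should throw on an unmapped buffer');
    }
  });
```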
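Similarly, for the parameterization style described under "Validation tests" above, a minimal hedged sketch; the rule under test and all names here are illustrative (see the two linked examples for the real style). The CTS provides a `ValidationTest` fixture with an `expectValidationError` helper, though its exact location and signature may differ by revision.

```ts
// Hypothetical file: src/webgpu/api/validation/create_buffer_usage.spec.ts
export const description = `Validation of GPUBufferDescriptor.usage (illustrative).`;

import { makeTestGroup } from '../../../common/framework/test_group.js';
import { ValidationTest } from './validation_test.js';

export const g = makeTestGroup(ValidationTest);

g.test('usage_must_be_nonzero')
  .params(u =>
    u //
      // Cartesian product: 3 usages x 2 sizes = 6 cases, most of which are valid control cases.
      .combine('usage', [0, GPUBufferUsage.COPY_SRC, GPUBufferUsage.MAP_READ])
      .combine('size', [0, 16])
  )
  .fn(t => {
    const { usage, size } = t.params;
    const shouldError = usage === 0; // Zero usage is invalid; every other combination is a control case.
    t.expectValidationError(() => {
      t.device.createBuffer({ size, usage });
    }, shouldError);
  });
```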