# uc-0009: Use Case Testing form
**Reviewer**: Saranya Canchi
**OS (including version)**: macOS Big Sur v11.2.1
**Browser (including version)**: Safari v14.0.3
**Use case**: [uc-0009](https://use-cases.nih-cfde.org/uc-0009/)
**Review type**: manual
**Review Date**: 02/17/2021
**Role groups**: NIH CFDE 4DN Reviewers
# Use Case Test
<details><summary>Instructions</summary>
<p>
I would like each use case checked by at least two people, preferably with a mix of browsers and OSes, so that we have a better chance of spotting potential bugs.
1. Choose a use case that you will validate
2. Copy the text from the next comment into a new document
3. Follow the use case, filling out the document as you go
4. If you encounter one of the Quick Tests, check that it is right and check it off. If you don't encounter it as part of your use case, leave it blank. If it doesn't work, add some text explaining the problem.
5. When you are done with your use case, post your filled form as a comment in this thread.
I recommend starting by looking at the Quick Tests section and seeing which ones will be part of your use case, so you can check them as you go instead of backtracking at the end.
</p>
</details>
## Use Case Description
**1. Evaluate the description.**
- Does this description make sense?
- Yes.
- Does it sound like a useful thing to do?
    - Yes.
- Are there any corrections that should be made (spelling, grammar, etc)?
**2. Try to complete the steps as they are described for the persona in the use case.**
<details><summary>Instructions</summary>
<p>
For each step record:
- the specific action you took, e.g. I clicked on 'leg' in the 'anatomy' filter at [this web address]()
- Whether that action was possible/worked
- Whether the *results* of that action are as described
- and if they are not as described, how they differ
- Any other comments you have, or things you were surprised about. Be specific!
Copy the lines below as many times as needed for your use case
</p>
</details>
---
Action: **[t-0017](https://use-cases.nih-cfde.org/t-0017/)** - accessed the staging portal website (https://app-staging.nih-cfde.org) and signed in using ORCID.
- [x] Worked
- [ ] Results as expected
- If not, why not:
- Other Comments
---
Action: **[t-0024](https://use-cases.nih-cfde.org/t-0024/)** - clicked on the "Data Review" tab, which lists the submitted datapackage entries.
- [x] Worked
- [ ] Results as expected
- If not, why not:
- Other Comments
---
Action: **[t-0025](https://use-cases.nih-cfde.org/t-0025/)**
**[t-0026](https://use-cases.nih-cfde.org/t-0026/)**
**[t-0027](https://use-cases.nih-cfde.org/t-0027/)**
- [ ] Worked
- [ ] Results as expected
- If not, why not:
  - I don't know what this means and cannot find anything labeled "Review Catalog".
- Other Comments
---
Action: **[t-0029](https://use-cases.nih-cfde.org/t-0029/)** - clicked on Summary Charts
- [x] Worked
- [ ] Results as expected
- If not, why not:
- Other Comments
---
Action: **[t-0030](https://use-cases.nih-cfde.org/t-0030/)**
**[t-0031](https://use-cases.nih-cfde.org/t-0031/)**
- [ ] Worked
- [ ] Results as expected
- If not, why not:
  I didn't find any documentation or tutorial for data submission, and assume this is a future feature that was absent at the time of this testing.
- Other Comments
---
Action: **[t-0028](https://use-cases.nih-cfde.org/t-0028/)**
- [ ] Worked
- [ ] Results as expected
- If not, why not:
  I clicked around the enrolled DCCs and groups but couldn't find any option to notify colleagues on the same team, as mentioned in the use case.
- Other Comments
---
## Tasks for this use case:
1. Based on the description you walked through, does this list of tasks make sense? If not, why not? Are there missing tasks? Unused tasks? Task descriptions that don't quite match the workflow? Be specific about which tasks have problems and what those problems are.
2. **OPTIONAL (if not already addressed above):**
Check whether each general task works, regardless of whether the specific instance described in the description works.
<details><summary>Instructions</summary>
<p>
For each task record:
- the specific action you took, e.g. I clicked on 'leg' in the 'anatomy' filter at [this web address]()
- note that tasks are generally broader than the description, so you likely will need to do more than one action to test it
- Whether that action was possible/worked, i.e. was it technically possible to do?
- Whether the *results* of that action are what you expect, i.e. did it 'work' in the way a user would want
- and if they are not as described, how they differ
Copy the lines below as many times as needed for your use case
</p>
</details>
Action:
- [ ] Worked
- [ ] Results as expected
- If not, why not:
- Other Comments
Action:
- [ ] Worked
- [ ] Results as expected
- If not, why not:
- Other Comments
## Requirements for this use case:
1. Based on the description you walked through and its tasks, does this list of requirements make sense? If not, why not? Are there things you needed that are not listed as requirements? Unused requirements? Requirement descriptions that don't quite match the workflow? Be specific about which requirements have problems and what those problems are.
2. **OPTIONAL (if not already addressed above):**
Check whether each requirement works, if possible, regardless of whether the specific instance described in the description works.
<details><summary>Instructions</summary>
<p>
For each requirement record:
- the specific action you took, e.g. I clicked on 'leg' in the 'anatomy' filter at [this web address]()
- note that requirements are very broad, so you may need to do more than one action to test it
- if you can't find a way to test the requirement, record that and why
- Whether that action was possible/worked, i.e. was it technically possible to do?
- Whether the *results* of that action are what you expect, i.e. did it 'work' in the way a user would want
- and if they are not as described, how they differ
</p>
</details>
Action:
- [ ] Was not testable
- [ ] Worked
- [ ] Results as expected
- If not, why not:
- Other Comments
Action:
- [ ] Was not testable
- [ ] Worked
- [ ] Results as expected
- If not, why not:
- Other Comments
# Overall
What difficulties did you encounter while completing your use case?
- Unsure of what to click and what some of the terms mean; for example, what is a Review Catalog?
Did you see any spelling, grammar or similar mistakes on any resource you visited in completing your use case?
What other comments or questions do you have about your use case?
- While the portal is still in development, many of the interface links and pages seem excessive, and clicking through them makes the user feel like they are going in a loop. Some of the summary pages are not useful: they show no new information, yet the user is directed to a new page.
- On the Summary Charts page, all the links in the "Data Review" box lead to the same faceted view of Subject, irrespective of which option is chosen under "Data Breakdown". I am not sure if this is a bug or the default behavior; if the latter, I would suggest making the links open the page in the respective faceted views (Files, Biosamples, etc.) instead of Subject alone.
- On the "Edit Submitted Datapackage" page, when clicking through to the "DCC Approval Status" options, it would be useful for the boxes under the "Select" column to be simple checkboxes instead of icons, consistent with the rest of the portal options. I clicked the blue links under the "ID" column at first, which redirects the user to a new page.
What other comments or questions do you have about any of the resources you visited?
What feedback do you have about this form/testing process?
# Quick Tests
Complete a test if it is encountered as part of your use case.
- If the test works / the work is complete, check the box.
- If you don't encounter the test during your use case, leave it blank.
- If the test does not work / the work is not complete, note the issue. Be specific!
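When aggregating these forms across reviewers, the checked and unchecked items can be tallied automatically. A minimal sketch (not part of the official process; the function name and sample excerpt are illustrative):

```python
import re

def tally_checkboxes(markdown_text):
    """Count checked ('- [x]') and unchecked ('- [ ]') task-list items."""
    checked = len(re.findall(r"^\s*-\s\[[xX]\]", markdown_text, re.MULTILINE))
    unchecked = len(re.findall(r"^\s*-\s\[ \]", markdown_text, re.MULTILINE))
    return {"checked": checked, "unchecked": unchecked}

# Example: a small excerpt in the same format as this form
sample = """\
- [x] Download button style now matches wireframe
- [x] chart in upper right corner reflects data
- [ ] Timestamp for data missing
"""
print(tally_checkboxes(sample))  # {'checked': 2, 'unchecked': 1}
```

A tally like this only counts checked vs. unchecked boxes; unchecked items still need a human read, since "left blank" and "does not work" look the same in the markup.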
[Link to QA screens for reference](https://drive.google.com/file/d/11-SVyGzTsKy5Ke8o6s_lFE6LCF_BrJ2a/view)
Home page
- [x] Download button style now matches wireframe
- [x] chart in upper right corner reflects data
- [x] Color Palette is updated
Dashboard
- [ ] "Select Data view" box present to show which dashboard graphs are available
- [ ] Download button style now matches wireframe
- [ ] Timestamp for data missing
DCC Review
- [x] Numbers have links
- [x] Scroll bar in Data Review table
Registry
- [ ] [Spelling correct](https://github.com/nih-cfde/cfde-deriva/issues/131)
Navbar
- [ ] Bolded option in navbar when page is selected
- [x] Log out button styles
- [x] Locks next to Dashboard and Data Review links missing
- [x] Color Palette
If you are a Reviewer or Submitter, please try the following and document steps with screenshots (including computer clock):
- [x] On the "Submitted Datapackage" page, I can click the pencil icon
![](https://i.imgur.com/6URIgKJ.png)
- [ ] I can edit 1 or more field(s) for the Submitted Datapackage - please specify which field(s)
  - The Description field is editable, but it doesn't let me save the value and returns a 403 error.
![](https://i.imgur.com/3BLIotV.jpg)
- [ ] I can change the "DCC Approval Status"
  - I am able to change the value but unable to save the selection; I get a 403 error as above.
![](https://i.imgur.com/9No0AZh.jpg)