# 2021-01-07 DataONE Community Call
[![hackmd-github-sync-badge](https://hackmd.io/uLOuwR1bTWCZ_rqDsft6aQ/badge)](https://hackmd.io/uLOuwR1bTWCZ_rqDsft6aQ)
7 Jan 2021, 09:00 Pacific Time
## Zoom Link
https://ucsb.zoom.us/j/94309556242
## Participants
*(please add your name, affiliation, and email)*
- Amber Budden, NCEAS/DataONE, aebudden@nceas.ucsb.edu
- Karl Benedict, UNM/DataONE Community Board, kbene@unm.edu
- Lauren Walker, NCEAS
- Matt Jones, NCEAS/DataONE, jones@nceas.ucsb.edu
- Rachel Volentine, University of Tennessee
- Suzie Allard, University of Tennessee
- Amy Forrester, University of Tennessee
- Rob Crystal-Ornelas, Berkeley Lab (ESS-DIVE)
- Bill Corey, University of Virginia, wtc2h@virginia.edu
- Brian Westra, University of Iowa, brian-westra@uiowa.edu
- Carrie Iwema,
- Corinna Gries, iDigBio, cgries@wisc.edu
- Dave Vieglais, DataONE, dave.vieglais@gmail.com
- David A Watts,
- Emily Clark,
- Emmi Felker-Quinn, NPS
- Erin McLean, Arctic Data Center, NCEAS
- Hoo Jung Yoo
- Holly Chandler, USGS National Climate Adaptation Science Center
- Jeffrey Krebs,
- Jeremy Ash,
- Julian Gautier,
- Kay Bjornded,
- Kelly Stathis, Canadian Association of Research Libraries Portage Network
- Kevin Drees,
- Lauren Phengley,
- Michal McCullough,
- Nancy Voorhis, UVM-FEMC
- Nino,
- Paul Lemieux III,
- Randal Greene,
- Sara Katherine Lafia,
- Sarah Poon,
- Sophie Hou,
- Stephanie Ann Schmidt
- Stevan Earl, CAP LTER, Arizona State University
- Susan Ivey,
- Tania Schlatter, Harvard Dataverse, tschlatter@g.harvard.edu
- William Michener
- Nancy Sheehan, Journey North, UW-Madison Arboretum
## [User Experience to Enhance DataONE community](https://github.com/DataONEorg/community-calls/issues/2)
### Agenda
- Welcome and Logistics (Karl)
- Introduction to Community Calls and Future Topics (Karl) - [GitHub Issues Process](https://github.com/DataONEorg/community-calls/issues)
- Conversation about the current state of UX in the DataONE community and how we can increase data usage, improve user satisfaction and trust in repositories, and help data organizations make development decisions that save time and money. (Leaders: Rachel V, Amy F and Suzie A)
### Questions/Discussion Leads
* Poll - What do users think of your product?
    - They're downloading so it must be good (2) 6%
    - They tell us they like it (11) 35%
    - We're their only source of data so it doesn't really matter (3) 10%
    - I don't know (15) 48%
* Poll - What are all the ways in which UX testing can better support your products and mission?
    - Increase data / tool usage (22) 92%
    - Better data discovery (23) 96%
    - Increase data confidence and trust (16) 67%
    - Improve repository branding (18) 75%
    - Reduce costs (15) 63%
    - Support funding and sustainability (17) 71%
* Poll - What is the most important issue that UX testing could help with in your repository or organization?
    - Increase data / tool usage (17) 61%
    - Better data discovery (8) 29%
    - Increase data confidence and trust (2) 7%
    - Improve repository branding (1) 4%
    - Reduce costs (0) 0%
    - Support funding and sustainability (0) 0%
* In your development of Usability Testing approaches did you find that there are different levels of "intensity" - i.e. degree of user engagement, complexity of the interaction - for testing that could meet different needs?
Jamboard link: https://jamboard.google.com/d/1Q4nNlFP5XBKolg5aJaw9sqSLLv6-CaBci7BOrn69YNw/edit?usp=sharing
### Notes
PLIII: Feedback from your own organization is a cheaper option to explore to get that valuable feedback.
EFQ: Users told us the tool isn't being used because it's hard to understand. Learning about UX now, not UX professionals but scientists so learning as we go.
SA: Hearing from the users is good but also useful to hear how other people are reaching out. Every community of users has different needs and different ways to connect. Someone needs to feel like they're part of the tool they're building - if they understand what's going on, they're more comfortable using it.
SH: Building relationships with your users is important, and follow different patterns of engagement with different folks. Not always a homogenous group. Navigating this as a consultant is slightly more difficult because you need familiarity with the users. Strategy will evolve over the long term. More interested in providing support for UX activities over time - engagement can grow over time. A lot of projects don't know how to get started but the UX program can grow over time. Worthwhile to invest time on UX no matter what stage you're at.
KB: Important to know who your diverse users are and what their backgrounds are. Worked on developing web-based map applications; interfaces were built based on developer expectations rather than USER expectations. A significant learning curve to the information is difficult to overcome as a user, especially when the users are coming from vastly different roles. The questions, challenges, and comments they make are very different.
RV: Keep the users at the center of the product. Think of yourself as the user, but you're not - that's why usability testing is so important. Also helps to get loyalty and buy-in on the product. Also helps with stakeholder engagement, folks can see themselves as the user. Identify strengths and weaknesses of your design and helps lower development time and cost, as well as operational time and cost. DataONE offers usability services as part of the membership. Three services:
- Heuristic analysis, a quick way to assess current UX
- UX testing of the site
- Eye tracking studies to provide a deeper understanding of user behavior and interaction with your product
RV: Incorporate long-term / iterative testing to keep up with your product development and user needs. If you have a project to turn into a marketable product, you need to do UX testing - it demonstrates planning for project sustainability. For every $1 in UX, you save $10 in dev and $100 in operational cost.
RV: Also offer other UX services - three previously listed as well as focus groups, interviews, and card sorting.
Poll: How can UX testing better support your products and mission? All options seem useful to folks.
Poll: What's the most important issue UX can help solve? Increase data/tool usage was number 1
SA: What are people's thoughts on trust?
NS: good metadata documentation
EFQ: Data confidence/trust is baked in if people are using the tool.
KB: Trust and better data discovery were both rolled into usage - if you don't have those, then we would not expect to have usage
SH: Turns out that some products are one of the only services available to the users - they're forced to use the service. They may not trust the data but it's better than nothing. May not have other choices.
BW: Not necessarily trust in the data but trust in the repo that's more important. If they get frustrated when they're using it, the trust erodes. Different trust than trust in the data.
SA: Difference between good data and accessible data in this era of misinformation.
BW: More of a trust that the system is FAIR compliant, going to be a better option than others... design features into that rather than trust in the data.
SA: Improving repo brand came up as 3rd most important...that encompasses a lot of different things.
SH: Repository brand is the vague one - do people mean recognition of the name? Or the value of the repository data?
EFQ: What are the indicators of data confidence?
SH: Need to start talking to the users about their usage. What are some of the use cases? Are they able to form conclusions from the data they got from their services? If I have to put a disclaimer on the data, that's an indicator of low confidence in the data.
RV: What features of DataONE were tested and how? A lot of work with the search as it was developed and as different features were added, i.e. what the views look like, how to login, how the metadata pages looked, etc. Things with the website, as well as some other tools like the metadata editor. Methods varied widely: in-person testing with the user group community at conferences/national labs, and remote studies over Zoom and other tools. Can give more info via email.
KB: Pros and cons of the different UX tests?
RV: Want to test multiple ways for the best results. Figuring out what's going to work best for those users or where you are in the dev process or how long the turnaround time will be. The more variety the better.
SA: Reduced cost was the least picked...why were folks not interested in UX testing for reduced cost?
SH: Data confidence and reduced cost were both up there in terms of importance - easier for people supporting the project to rally behind. Costs is more a logistic or administrative problem, perhaps is a siloed concern for a different team. Before cost reduction, some of the other elements were more important.
BW: Would be interesting for DataONE to see what specific things that were changed - maybe that would translate to other systems. Something may apply to other websites or repos, would be useful to have that comparison.
MJ: DataONE went through a lot of iterative versions, and it continues to happen now with different tooling. Would be great to write up a retrospective about how things evolved over a decade - every 9 months or so making changes based on UX testing. Some changes were small or backed out earlier, tradeoffs and timing, not a linear path. Done some presentations that compare versions over time but nothing comprehensive.
RV: Journal article is actually waiting for reviews now!
SA: Sometimes funding is tough for UX since sometimes you're maxed out just running your tool. You can write it into the proposal when you're seeking funding in the first place. Agencies like to see that in proposals - it's a better ROI.
**Discussion of tools from chat:**
- hotjar is a great asynchronous tool that is pretty affordable
- we’ve liked loop11 as well
- it would be interesting if there were a tool / features comparison
- agree! I'd be interested in seeing a comparison of tools
- working for a federal bureau, we have limitations on types of tools we can use / surveys we can conduct etc.