# OSCAL Engagement Metrics Brainstorming
## Key themes and goals
The most important consideration for any metric is why it matters to us and what question it answers for us.
- For OSCAL today and the NIST team, engagement means XYZ.
- For the NIST OSCAL Team, the intended audience for our engagement is XYZ.
  - Developers?
  - Maintaining and growing bought-in developers?
  - Bringing in new developers?
## Wendell's observations and recommendations
Three categories of data; are they measurable?
- OSCAL data flows (especially in comparison to the pre-OSCAL status quo)
- FedRAMP data flows
- Other agency or organization data flows?

Testimonial info?

- OSCAL schema promulgation
  - How well can we tell this from GitHub?
  - Page analytics?
  - OSCAL forks
- Interest in OSCAL
  - Web stats
  - People showing up
  - People asking questions
  - Promotion of products and initiatives by third parties
## Chris's thoughts
- Attendance at different public meetings
- Community opened issues
- Community opened pull requests
- Google/search engine results for non-NIST information about OSCAL (our project, not the calcium supplement)
- Count of non-public calls
- Counts of content downloads (schemas, official catalogs)
- Metrics around the tools, for example `oscal-cli` in the OSS Maven repository (see the sketch after this list)
- Survey audiences (e.g., in conference registration forms) about their experience and knowledge of OSCAL
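
On the Maven point: Maven Central does not publish download counts publicly (publishers can see them in Sonatype's dashboards), but release cadence is retrievable as a rough proxy. A minimal sketch using the public `search.maven.org` API; the groupId below is a guess and must be verified against the real `oscal-cli` coordinates:

```python
# Sketch: list published versions of oscal-cli from Maven Central's search
# API as a release-cadence proxy (public download counts are not available).
# The groupId is an assumption; replace with the actual coordinates.
import requests

resp = requests.get(
    "https://search.maven.org/solrsearch/select",
    params={
        "q": 'g:"gov.nist.secauto.oscal.tools.oscal-cli"',  # assumed groupId
        "core": "gav",  # one document per group:artifact:version
        "rows": 100,
        "wt": "json",
    },
    timeout=30,
)
resp.raise_for_status()
for doc in resp.json()["response"]["docs"]:
    # timestamp is milliseconds since epoch for the version's publish date
    print(doc["g"], doc["a"], doc["v"], doc["timestamp"])
```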
## AJ's thoughts
- Google Analytics (via DAP) for all OSCAL Team products on pages.nist.gov (see the sketch after this list)
- Success stories; are they even a metric?
- Social media posts (LinkedIn, Twitter, etc.) by Michaela and the NIST team, including in the official OSCAL LinkedIn group
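
If the analytics data arrives as a UI export rather than via an API, a minimal sketch of the monthly top-pages aggregation. The file name and the `page`/`pageviews` column names are hypothetical, to be adjusted to the actual export format:

```python
# Sketch: top five pages by monthly pageviews from an assumed Google
# Analytics CSV export. File name and column names are placeholders.
import csv
from collections import Counter

counts: Counter[str] = Counter()
with open("ga-export-2024-01.csv", newline="") as f:
    for row in csv.DictReader(f):
        counts[row["page"]] += int(row["pageviews"])

for page, views in counts.most_common(5):
    print(f"2024-01,{views},{page}")  # month, count, page/URL
```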
## Miscellaneous prerequisite conditions
- If our means and mechanisms cannot generate metrics, we have a problem (for example, GitHub downloads are not trackable)
- Whatever we do, we should commit to collecting consistently for at least a year; that is a simple commitment to keep
- Revisit the metrics the following year for the next iteration of the strategic plan
# Decided: what to start collecting, preparing, and reporting
## Measures
- Google Analytics top five pages by rank as an aggregate monthly count, including month, count, and page/URL (see the sketch under AJ's thoughts above)
- GitHub Issues opened as an aggregate monthly count, including project, count, and month, excluding NIST staff (see the sketch after this list)
- GitHub Pull Requests opened as an aggregate monthly count, including project, count, and month, excluding NIST staff
- Attendance at public meetings as an aggregate maximum (peak) count, including date and title
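
A sketch of how the community issue and pull request counts could be pulled from GitHub's public Search API. The repository list and the excluded staff logins are placeholders, and unauthenticated search is heavily rate-limited, so a token would be needed for recurring collection:

```python
# Sketch: monthly counts of community-opened issues and PRs via the GitHub
# Search API. REPOS and NIST_STAFF are placeholders to be replaced with
# the real project list and staff GitHub logins.
import requests

REPOS = ["usnistgov/OSCAL"]          # projects to report on
NIST_STAFF = ["example-nist-login"]  # hypothetical logins to exclude

def monthly_count(repo: str, kind: str, start: str, end: str) -> int:
    """Count issues ('issue') or pull requests ('pr') opened in [start, end]."""
    exclusions = " ".join(f"-author:{login}" for login in NIST_STAFF)
    query = f"repo:{repo} is:{kind} created:{start}..{end} {exclusions}"
    resp = requests.get(
        "https://api.github.com/search/issues",
        params={"q": query, "per_page": 1},
        headers={"Accept": "application/vnd.github+json"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["total_count"]

for repo in REPOS:
    for kind in ("issue", "pr"):
        count = monthly_count(repo, kind, "2024-01-01", "2024-01-31")
        print(f"{repo} {kind}s opened 2024-01: {count}")
```

The same query syntax works in the GitHub web search UI, which may be enough for manual monthly collection before committing to custom tooling.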
## Methods
- A person will need to collect a count at each public call. Absent a designated location, counts can be emailed to oscal-team@nist.gov.
- Counts are collected for the previous month during the first week of the following month.
- To be determined?
## Long-term risks
- More specific, advanced GitHub data points require custom tool development against GitHub's APIs; do we have the bandwidth and commitment?
- Using GitHub for key artifacts (Markdown files, code, schema files) inhibits analytics, or prevents analysis entirely; how do we mitigate this risk long-term?