# WG Meeting: 2023-10-31

## Agenda

- Joseph Heenan to lead discussion on certification

## Attendees

- Atul Tulshibagwale (SGNL)
- Steve Venema (ForgeRock)
- Phil Hunt (IndependentID)
- Joseph Heenan (Authlete)
- Shayne Miel (Cisco)
- Tim Cappalli (Microsoft)
- Mike Kiser (SailPoint)
- Sean O'Dell (Disney)
- Eric Karlinsky (Okta)
- Apoorva Deshpande (Okta)

## Notes

### Certification / Interop

- Joseph also works for the OIDF, where he leads the certification program
- OIDC has had a certification program for over 8 years
- FAPI also has a certification program. Various ecosystems (e.g. the UK, Brazil, and Australia) have mandated certification; the US and Canada are expected to follow
- All tests are part of the OpenID Test Suite, a Java framework (previously Python)
- The framework provides isolation
- The expectation is that all tests must pass for certification
- Is the expectation in the SSWG that we should require production or development versions? FAPI requires production-level certification
- Q to WG: What do people need to certify? Is there a minimum set of events to demonstrate, etc.?
- (Sean) Are we testing the SETs or the SSF spec? We have unit / implementation tests at Disney. We test how the system responds to receiving an event
- (Steve) Is there traceability in the tests? How do you tie certification to the actual spec?
- (Joseph) We link specific tests back to specific parts of the spec
- (Joseph) OIDC and FAPI have taken different approaches to certification. OIDC has defined different profiles, each of which points back to the spec
- (Steve) We should consider certification when developing the spec in future, e.g. "these three fields must be present"
- (Steve) It would be good to review the spec from a testability perspective, so that the spec becomes more concise
- (Shayne) We've left a couple of things out of the spec, e.g. how the Transmitter and Receiver trust each other. How would we deal with that in a test suite?
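As context for the "are we testing the SETs or the SSF spec?" question above, a minimal sketch of what a test suite would have to validate on the wire: a Security Event Token (RFC 8417) carrying a CAEP event, built here with only the Python standard library. The Transmitter/Receiver URLs, the email subject, and the HS256 shared secret are illustrative assumptions (real deployments typically use asymmetric signing), not anything mandated by the specs.

```python
# Hypothetical sketch of a Security Event Token (SET, RFC 8417) carrying a
# CAEP session-revoked event, signed with HS256 using only the stdlib.
# Issuer, audience, subject, and secret below are made-up example values.
import base64
import hashlib
import hmac
import json
import time


def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as required for compact JWS."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()


def make_set(secret: bytes) -> str:
    header = {"alg": "HS256", "typ": "secevent+jwt"}  # SET media type per RFC 8417
    payload = {
        "iss": "https://transmitter.example.com",  # assumed Transmitter
        "aud": "https://receiver.example.com",     # assumed Receiver
        "iat": int(time.time()),
        "jti": "756E69717565206964",               # unique SET identifier
        "events": {
            "https://schemas.openid.net/secevent/caep/event-type/session-revoked": {
                "subject": {"format": "email", "email": "user@example.com"},
                "event_timestamp": int(time.time()),
            }
        },
    }
    signing_input = (
        b64url(json.dumps(header).encode()) + "." + b64url(json.dumps(payload).encode())
    )
    sig = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    return signing_input + "." + b64url(sig)


token = make_set(b"shared-secret")
```

A certification test in the sense discussed here would assert, for example, that the `typ` header is `secevent+jwt`, that exactly the expected event URI appears under `events`, and that the signature verifies against the Transmitter's advertised key.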
- (Joseph) We could specify it in the certification, or we could treat it as out of scope and provide it offline
- (Tim) As in OIDC, we could keep some things out of scope. People could adopt the pattern used in the cert program, but they don't have to in order to be certified
- (Atul) Should we consider a basic interoperability exercise before getting into certification?
- (Joseph) That could help identify issues in the spec and common pitfalls of implementations, so that we can build the certification requirements along the way
- (Atul) Can we limit the interop to specific use cases?
- (Joseph) OIDC may have gone through such an exercise in the past. We should check with Mike Jones
- (Joseph) UK open banking had a big interop exercise when the ecosystem went live, and everyone found issues
- (Phil) There are 5 specs in play. SSF is different from OIDC. The specs we need interop around are RFC 8935/8936 (Push, Poll); from a certification perspective these are more important. Then we need to figure out the administration with SSF, and then the CAEP / RISC specs. The action an individual endpoint takes when it receives an event should be out of scope
- (Phil) What are the levels of certification from a security perspective? E.g. RFC 8935/8936 have acknowledgement semantics that indicate specific actions
- (Joseph) Any definition of a certification rests with the WG, not the OIDF
- (Joseph) Ask questions on the Slack channel. The OIDF is going through a budget exercise right now
- (Phil) When we certify OIDF specs, they build on other specs. SSF is unique in that it defines the administration of specs that belong to the IETF. The most important things to certify are not the OIDF specs but the RFCs (push and poll). First, do push and poll work as expected; then, can we administer the streams; and then the individual events. Can I manually set up a connection with e.g. Disney and send and receive events?
- (Tim) To me, we need to do this in the context of the OIDF, so we should not worry about certifying the IETF specs. We should certify their use in the SSF / CAEP / RISC context
- (Steve) Agree with Phil (mostly). It's too early to think about certification because the specs aren't stable. I like the idea of starting with some use cases; that will help us get concrete about where the points of friction are. Going through the specs from a certification point of view is appropriate at this time
- (Atul) We need to identify what participants are required to support at minimum to solve a use case
- (Tim) This sounds like part of a certification program
- (Phil) I question whether there is any value in certifying what is in the token without certifying the push / poll part
- (Tim) If you are certified for "SSF-Core" (made-up name) and you support this deployment profile, then you are interoperable
- (Tim) Certification and interop are not divergent. A hackathon can find kinks
- (Steve) Both interop testing and certification are ways to discover limitations of the spec. I agree that interop testing may be the first thing to do
- (Sean) Agree with Steve. This is very important. I don't want an Azure API / GraphQL token. You need vendor interoperability testing first, which could be considered an informal certification. The hardest sell internally at Disney was whether this is vaporware or real. Vendor interop is huge to make this real
- (Shayne) Circling back to the earlier question: say a Transmitter wants to use OAuth and a Receiver wants to use Basic authentication, or some unusual authn. How do we support the different options?
- (Sean) We had to use different implementations of authentication for various vendors
- (Atul) Can we base the decision on how confusing or clear it is to end customers?
- (Sean) We must specify basics, e.g. you cannot do anonymous
- (Phil) How to bootstrap security is going to become an interop pain point. We may need a secondary profile to define this
- (Tim) One high-level decision point: does this need to be 100% discoverable? The answer is probably no. We had this issue in FastFed. We need to agree on some basic parameters and codify them
- (Eric) What is the impetus to make it 100% interoperable? I don't know what it impedes if we don't specify everything
- (Tim) Is it so bad if you do something proprietary? Probably not, but you need to support some minimum set of common authorization, e.g. OAuth for authorization
- (Sean) That's not as prevalent as you think in businesses
- (Tim) Do we expect internal components to get certified? Probably not
- (Sean) I see companies being Transmitters more than vendors, so many more participants may be interested in certification
- (Tim) It's back to the use cases
- (Steve) Use case: two companies with deployments want to exchange events. The certification lets companies know what to buy
- (Eric) How does this relate to whether we need to define how authorization works?
- (Phil) -
- (Sean) I don't see the difference between this and a SAML IdP / SP configuration
- (Eric) OAuth is a high bar to implement in itself. Exchanging a shared secret (e.g. an API key) may be faster
- (Sean) Having multiple authentication mechanisms made things complex
- (Eric) There could be a higher bar for certification
- (Shayne) We said that certification should be on production implementations

## Action Items