Digital Twin Community Pulse Check: Trustworthy & Ethical Development

â„šī¸ About this Document

This document is a space to collect ideas and plan the scoping survey we intend to undertake with the Connected Places DT Hub community as part of our ongoing collaboration.

This survey aims to answer the overarching question: can DT practitioners benefit from the assurance method, and if so, how can TEA best support them?

To answer this question, we probe both the subjective value of and the objective needs related to assurance, to get a clear picture of the DT community's readiness and capacity to adopt and benefit from argument-based assurance approaches.

The survey is divided into four main sections, which answer the following sub-questions:

  1. What is the make-up of the community / sub-set of the community we surveyed?
  2. Are practitioners ready/able to apply argument-based assurance?
  3. Are practitioners willing to apply argument-based assurance?
  4. Where is there a specific need for argument-based assurance?

Section 1: Community Composition

📊 Rationale for Section

This section would serve as a foundation for understanding the unique perspectives and experiences within the community; put another way, it helps us define our "community". It will also allow us to identify patterns or trends across different sub-groups within the community (e.g. attitudes of researchers versus developers), subject to a sufficiently large sample size, especially if we extend the invitation to DTNet+ and TRIC-DT affiliated researchers.

The collected data would allow inference on the following questions:

  • What is the make-up of the community / sub-set of the community we surveyed?
| # | Question | Data Type | Analysis Type | Rationale |
| --- | --- | --- | --- | --- |
| 1.1 | What sector best represents your field of work? | Text / multiple choice | Frequency analysis | Used to support identification of trends or patterns across sub-groups |
| 1.2 | Where is your organisation located? | Multiple choice | Frequency analysis | Used to support identification of trends or patterns across sub-groups; used to filter for workshop invites |
| 1.3 | What is your role within your organisation? | Multiple choice | Frequency analysis | Used to support identification of trends or patterns across sub-groups |
| 1.4 | What are your primary responsibilities? | Checkbox (select all that apply) | Frequency analysis | Used to support identification of trends or patterns across sub-groups |
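As a working note on the analysis side, the sketch below illustrates the frequency analysis and sub-group cross-tabulation described in the table above. It assumes responses are exported to a CSV with hypothetical column names (`sector`, `role`); the actual names will depend on the final survey platform.

```python
# Minimal sketch of the planned frequency analysis and sub-group cross-tabulation.
# Column names ("sector", "role") and the export format are assumptions.
import pandas as pd

responses = pd.read_csv("survey_responses.csv")  # hypothetical export

# Frequency analysis for a single-choice item (e.g. Q1.1)
print(responses["sector"].value_counts())

# Cross-tabulation of role against sector to surface sub-group patterns,
# subject to a sufficiently large sample size
print(pd.crosstab(responses["role"], responses["sector"]))
```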

Answer options

  1. What sector best represents your field of work?
    • DT Hub to complement with existing data on represented sectors
    • Aerospace
    • Architecture
    • Artificial Intelligence
    • Automotive
    • Aviation
    • Construction
    • Consumer Goods
    • Defence
    • Education
    • Electronics
    • Engineering
    • Environment and Conservation
    • Finance
    • Food and Agriculture
    • Freight
    • Healthcare
    • International Government
    • Local Government
    • Manufacturing
    • Maritime
    • Media
    • Mining
    • National Government
    • Nuclear Energy
    • Oil and Gas
    • Place Leadership
    • Rail
    • Renewable Energy
    • Smart Cities
    • Supply Chain and Logistics
    • Technology
    • Telecommunications
    • Transport
    • Utilities
    • Waste and Recycling
    • Water
  2. Where is your organisation located?
    • Find reusable list of countries
  3. What is your role within your organisation?
    • Developer/Engineer
    • Project/Program Manager
    • Executive/Decision-Maker
    • Researcher/Academic
    • Compliance Officer/Regulatory Affairs Manager
    • Technical Manager/Lead Developer
    • Industry Consultant/Advisor
    • Ontology Engineer/Framework Architect
    • Data Scientist/Analyst
    • Middleware/API Developer
    • Governance Specialist
    • Semantic Web Technologist
    • Platform Developer
    • Other (Please specify)
  4. What are your primary responsibilities?
    • Designing and Implementing: I am involved in the technical design, development, and implementation of digital twin systems.
    • Strategizing and Directing: I guide the strategic direction, decision-making, and overall management of digital twin projects.
    • Ensuring Compliance: I focus on ensuring that digital twin projects comply with regulatory, legal, and ethical standards.
    • Advising and Consulting: I provide expert advice, insights, and best practices to enhance the effectiveness and assurance of digital twin projects.
    • Developing Tools and Frameworks: I create or contribute to the development of foundational tools, ontologies, and frameworks that support digital twin functionality.
    • Other (Please Specify)

Section 2: Assurance - Understanding & Current Practices

📊 Rationale for Section

This section would aim to provide a comprehensive overview of the existing skill sets and research and innovation infrastructure (e.g. access to tools, adoption of assurance methods) available within the community. This information is crucial for helping to identify existing strengths and areas of importance, with the assumption that existing capabilities serve as a proxy for areas of importance. However, the answers would also support our own research (e.g. identifying common tools, mechanisms, or processes) and can help identify gaps (e.g. project lifecycle stages with lack of assurance consideration).

The data will speak to the following questions:

  • Are practitioners ready/able to apply argument-based assurance?
  • What is the current status of assurance capabilities within this community, and how consistent are they across it?

Assurance Understanding

| # | Question | Data Type | Analysis Type | Rationale |
| --- | --- | --- | --- | --- |
| 2.1 | What do you understand assurance to mean in the context of developing & applying digital twinning technology? | Free text | Thematic analysis | Identify diverse interpretations and conceptual frameworks of the term "assurance" across various jurisdictions and sectors |

Background & definition of assurance (to be shown after 2.1):
Assurance means providing justified or warranted confidence in a specific property of a digital twin. For example, sharing an impact assessment of a digital twin's operational performance.

Current Assurance Practices

| # | Question | Data Type | Analysis Type | Rationale |
| --- | --- | --- | --- | --- |
| 2.3 | Which of the following assurance methods do you currently implement (if any)? (The options provided below are based on recommendations outlined in the DSIT Practitioner Guide to AI Assurance.) | Select all that apply | Frequency analysis | To identify existing strengths and areas of importance |
| 2.4 | [If 2.3 not empty] For each method selected, indicate all properties of your digital twin that you currently assure with this method. | - | - | - |
| 2.5 | At which stages in the project lifecycle are you implementing assurance techniques? | Check all that apply | Frequency analysis | To gauge the timing and spread of assurance practices across project phases, highlighting overarching patterns and gaps in assurance integration |
| 2.6 | How satisfied are you with the current level of integration between your assurance processes and the actual development lifecycle of your digital twins? | Likert scale: 1 (Very unsatisfied) to 5 (Very satisfied) | Descriptive statistics, cross-tabulation | Assesses the perceived effectiveness of existing assurance practices within the overall development process |
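For the Likert items (e.g. 2.6), the analysis would be along the lines of the sketch below: simple descriptive statistics plus a cross-tabulation against a sub-group variable from Section 1. Column names are assumptions pending the final export format.

```python
# Minimal sketch of descriptive statistics and cross-tabulation for a Likert item.
# Column names ("q2_6_satisfaction", "role") are assumptions.
import pandas as pd

responses = pd.read_csv("survey_responses.csv")

# Descriptive statistics for the 1-5 satisfaction scores (Q2.6)
print(responses["q2_6_satisfaction"].describe())

# Mean satisfaction per respondent role (Q1.3), plus a full cross-tabulation
print(responses.groupby("role")["q2_6_satisfaction"].mean())
print(pd.crosstab(responses["role"], responses["q2_6_satisfaction"], normalize="index"))
```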

Ethics Capabilities

| # | Question | Data Type | Analysis Type | Rationale |
| --- | --- | --- | --- | --- |
| 2.8 | Does your organisation have an established definition or framework for "trustworthy" and "ethical" digital twins? | Multiple choice (Yes / No / Don't know) | Frequency analysis | For readiness assessment of the organisation; the "Don't know" option also probes response quality |
| 2.8b | [Follow-up] If yes, please describe your organisation's definition or framework for "trustworthy" digital twins. | Free text | Thematic analysis, topic modelling | - |
| 2.8c | How was this definition or framework developed (e.g. in-house, through consultancy, collaborative industry efforts)? | Free text | Thematic analysis | Provides valuable context on the adoption and customisation of trustworthy and ethical principles |

Ethics - Attitudes

📊 Rationale for Section

This would be an important section to gauge the community's sentiments regarding assurance methodologies and current offerings, even if it uncovers negative perceptions or attitudes. It should contrast with the previous two sections by allowing a more subjective assessment to be offered. Understanding the prevailing attitudes within the community informs the development of our work so that it resonates with the community and helps establish a shared commitment to trustworthy and ethical practices surrounding digital twins.

Focus Question
How do practitioners perceive the importance and effectiveness of assurance methods, normative principles, and regulatory/governance frameworks in fostering trustworthy and ethical digital twins?

| # | Question | Data Type | Analysis Type | Rationale |
| --- | --- | --- | --- | --- |
| 2.9 | How valuable do you find ethical principles in general, such as the Gemini principles? | Likert scale | Descriptive statistics, cross-tabulation | To understand the perceived value of ethical principles in guiding digital twin development |
| 2.10 | Considering your own digital twin product, please rate the importance of the Gemini principles in assuring its trustworthiness and ethical use. | Likert scale: 1 (Not important) to 5 (Extremely important) | Descriptive statistics, cross-tabulation | To capture attitudes towards normative principles and how they compare to each other |
| 2.11 | Please rate how difficult you find it to operationalize the following principle within your digital twin product. | Likert scale: 1 (Very easy) to 5 (Very difficult) | Descriptive statistics, cross-tabulation | Needs analysis |

Answer options

2.2. (Unsure about this question, as it may lead to us collecting a lot of sector-specific standards.) What legal requirements and/or standards does your organisation adhere to / reference for assurance in digital twin projects?

  • IEEE P7003 - Algorithmic Bias Considerations
  • ISO/IEC TR 24027 - AI — Bias in AI systems and AI aided decision making
  • ISO/IEC 42001 - AI — Management System
  • ISO/IEC 23894 - AI — Risk management
  • ISO/IEC TS 12791 - Treatment of unwanted bias in classification and regression machine learning tasks
  • DNV-RP-A204 Assurance of digital twins
  • DNV-RP-0513 Assurance of simulation models
  • DNV-RP-0317 Assurance of data collection and transmission in sensor systems
  • DNV-RP-0497 Assurance of data quality management
  • DNV-RP-0665 Assurance of machine learning applications
  • DNV-RP-0510 Framework for assurance of data-driven algorithms and models

2.3. Which of the standard assurance methods do you currently implement?
  • Risk Assessment (Bias Risk Analysis, Data Privacy Impact Assessment, Security Vulnerability Assessment, Reputational Risk Evaluation, Risk Assessments)
  • (Algorithmic) Impact Assessment (Environmental Impact Study, Equality and Human Rights Impact Assessment, Data Protection Impact Assessment, Impact Assessments)
  • Bias Audit (Input Data Bias Check, Algorithmic Decision-Making Audit, Model Fairness Evaluation)
  • Compliance Audit (Policy Adherence Review, Regulatory Compliance Check, Legal Framework Alignment, Regulatory Compliance Documentation, Use of Specific Standards)
  • Conformity Assessment (Product Certification, System Performance Testing, Market Readiness Evaluation, Quality Control Measures)
  • Formal Verification (Mathematical Model Checking, Requirement Satisfaction Analysis, Logic-Based Verification)
  • Model Cards
  • None

2.5. At which stages in the project lifecycle are assurance methods used?
  • project planning
  • problem formulation
  • data extraction & procurement
  • data analysis
  • preprocessing & feature engineering
  • model selection & training
  • model testing & validation
  • model reporting
  • system implementation
  • user training
  • system use & monitoring
  • model updating & deprovisioning

2.7. Do you currently use argument-based methods for communicating how your assurance artifacts meet the high-level properties you want to assure in your digital twin?
  • No, I don't know what argument-based assurance methods are
  • Yes, I do

Current Assurance Practices

What is argument-based assurance?
Argument-based assurance is a systematic approach that employs structured argumentation to substantiate claims about a system's properties, based on available evidence. This method is particularly useful in demonstrating compliance with regulatory goals or standards, through the creation of an assurance case. An assurance case can be presented in various formats, including formal documentation, textual descriptions, or visual diagrams, effectively linking assurance artifacts to the overarching claims they support. This process is instrumental in operationalizing principles, such as those outlined in the Gemini principles, by providing a clear and structured framework for connecting specific assurance activities to the broader goal of ensuring trustworthiness and ethical integrity in digital twin projects.

| # | Question | Data Type | Analysis Type | Rationale |
| --- | --- | --- | --- | --- |
| 2.7 | Do you currently use argument-based methods for communicating how your assurance artifacts meet the high-level properties you want to assure in your digital twin? | Multiple choice | Thematic analysis | Identify current practices |
| 2.8 | Are you aware of any existing resources to support your use of assurance techniques and/or technical standards? | Free text | Trend identification | Highlight common tools and frameworks in use, providing a baseline for what is considered industry standard or best practice |
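To make the definition of argument-based assurance above more concrete, the sketch below shows one hypothetical way an assurance case could be represented as a simple data structure: a top-level goal, supported by property claims, each linked to evidence artifacts. This is purely illustrative and is not the TEA platform's actual schema.

```python
# Hypothetical, simplified representation of an argument-based assurance case.
# Illustrative only; not the TEA platform's actual data model.
from dataclasses import dataclass, field


@dataclass
class Evidence:
    name: str       # e.g. "Data Protection Impact Assessment"
    location: str   # reference to the artifact (report, test results, audit)


@dataclass
class PropertyClaim:
    statement: str                                # claim about a property of the digital twin
    evidence: list = field(default_factory=list)  # Evidence items supporting the claim


@dataclass
class AssuranceCase:
    goal: str                                     # top-level goal, e.g. a Gemini principle
    claims: list = field(default_factory=list)    # PropertyClaim items supporting the goal


case = AssuranceCase(
    goal="The digital twin respects the privacy of data subjects",
    claims=[
        PropertyClaim(
            statement="Personal data flows are minimised and documented",
            evidence=[Evidence("Data Protection Impact Assessment", "reports/dpia.pdf")],
        )
    ],
)
```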

Section 4: Argument-Based Assurance - Perceived Value & Needs

📊 Rationale for Section

Here, we delve into the challenges and requirements faced by the community. This section will help us ensure that our work addresses specific needs, overcomes real barriers, and fosters a supportive environment for the trustworthy and ethical design, development, deployment, and use of digital twins. In short, it allows us to improve the usability of our own tools; where feedback falls beyond the scope of our work, it can serve as open information to the community that could allow others to develop solutions.

Types of questions could include:

  • Urgent priorities
  • Types of gaps in skills and capabilities
| # | Question | Data Type | Analysis Type | Rationale |
| --- | --- | --- | --- | --- |
| 4.1 | Do you believe argument-based assurance could offer advantages over current practices? | Yes/No | Frequency analysis | - |
| 4.2 | If yes, what need could argument-based assurance solve in your assurance process? | Free text | Thematic analysis | To understand the perceived value and impact of argument-based assurance |
| 4.3 | If no, is this due to satisfaction with current practices or the presence of another, more valuable option (if the latter, please specify)? | Multiple choice | Frequency analysis | To identify barriers to the adoption of argument-based assurance |
| 4.4 | How prepared do you feel to develop and implement an assurance case for your digital twin project? | Likert scale | Descriptive statistics, cross-tabulation | To assess individual readiness and identify specific skill gaps in creating assurance cases |
| 4.4b | What type of support might help you in creating sound assurance arguments around ethical principles for your digital twin project? | Multiple choice | Frequency analysis | To identify the types of support that could enhance individual capability in developing assurance cases |
| 4.5 | How conducive is your organization's environment for adopting argument-based assurance methods? | Likert scale | Descriptive statistics, cross-tabulation | To evaluate the organizational environment and its influence on the adoption of argument-based assurance methods |
| 4.6 | What factors would most significantly impact the successful adoption of argument-based assurance methods in your organization? | Multiple choice | Frequency analysis | To identify key factors that could facilitate or hinder the adoption of argument-based assurance methods |

| # | Question | Data Type | Analysis Type | Rationale |
| --- | --- | --- | --- | --- |
| 2.6 | Are your assurance methods carried out internally or by an external partner? | Select (Internal / Third-party) | Frequency analysis | To identify current assurance infrastructure and where skills are located |

Likert scale for 4.4 & 4.5:

  • Very prepared / Highly conducive
  • Somewhat prepared / Moderately conducive
  • Neutral
  • Somewhat unprepared / Slightly conducive
  • Not prepared at all / Not conducive

Answer options for 4.4b (support types):

  • Skills/training in assurance methods
  • Awareness programs on assurance techniques
  • Guidance on applying assurance techniques to specific use-cases
  • Best practices in AI assurance
  • Regulatory clarity on assurance compliance
  • Other - please specify
  • Examples of effective digital twin assurance practices
  • Toolkit of digital twin assurance methods and tools
  • Guidance on selecting the right assurance practices for digital twins
  • Guidance on applying technical standards to digital twin projects
  • Learning and development resources on digital twin assurance
  • Development of new assurance practices for digital twins
  • Creation of more technical standards specific to digital twins
  • Clearer understanding of regulatory expectations for digital twins
  • Illustrations of the value added by digital twin assurance
  • More certification programs focused on digital twin assurance

Answer options for 4.6 (factors impacting adoption):

  • Available resources
  • Skills/training within the team
  • Awareness and understanding of argument-based assurance
  • Guidance on selecting appropriate assurance techniques
  • Established argument patterns for digital twin assurance
  • Regulatory support and clarity on required structure of assurance argumentation
  • External recognition (e.g. certification) of assurance argumentation
  • Compatibility with existing systems or standards
  • Other - please specify

These questions aim to provide a more nuanced understanding of the readiness for, and factors influencing, the adoption of argument-based assurance methods, at both an individual and an organisational level.

Section 5: Future Directions

📊 Rationale for Section

This section provides a more open-ended and forward-looking opportunity for the community to articulate their own aspirations and vision for the evolution of assurance methods. It can help minimise the biases imposed by our own framing and gives participants an opportunity to raise issues that our questions did not surface.

Types of questions could include:

  • Novel use cases
  • Areas of opportunity
  • Open-ended questions to promote sharing and community engagement
| # | Question | Data Type | Analysis Type | Rationale |
| --- | --- | --- | --- | --- |
| 1 | Is there anything else pertinent to the assurance of digital twins, which has not been covered, that you think is significant? | Free text | Thematic analysis, topic modelling | To promote sharing and community engagement |
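Several of the free-text items above are earmarked for thematic analysis and topic modelling; a minimal sketch of the latter is included below, assuming scikit-learn and a hypothetical set of responses (the actual analysis pipeline and parameters are still to be decided).

```python
# Minimal topic-modelling sketch for free-text responses (e.g. the Section 5 item).
# Example responses and parameters are hypothetical; the real pipeline is TBD.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

free_text_responses = [
    "We need clearer guidance on applying technical standards to digital twins",
    "Interoperability and data quality remain the biggest assurance gaps",
    # ... remaining responses
]

vectorizer = CountVectorizer(stop_words="english")
doc_term_matrix = vectorizer.fit_transform(free_text_responses)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(doc_term_matrix)

# Top terms per topic, as a starting point for manual thematic coding
terms = vectorizer.get_feature_names_out()
for topic_idx, weights in enumerate(lda.components_):
    top_terms = [terms[i] for i in weights.argsort()[::-1][:5]]
    print(f"Topic {topic_idx}: {', '.join(top_terms)}")
```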

Section 6: Post-Survey

  • Are you interested in follow-ups?
  • Would you like to join a more in-depth workshop to learn about assurance methodology & build your own assurance case under guidance of experts?

List of Questions

Here is a list of the questions extracted from the tables above, organized by section:

Section 1: General Information

  1. What sector best represents your field of work?
  2. Where is your organisation located?
  3. What is your role within your organisation?
  4. What are your primary responsibilities?

Section 2: Assurance - Understanding & Current Practices

Assurance Understanding

  1. What do you understand assurance to mean in the context of developing & applying digital twinning technology?
  2. What governance or legal requirements does your organization adhere to/reference for assurance in digital twin projects (e.g., ISO standards, GDPR)?
  3. Which of the following assurance methods do you currently implement (if any)?
  4. At which stages in the project lifecycle are assurance methods used?
  5. What property of your digital twin is method X at project stage Y assuring?
  6. Are your assurance methods carried out internally or by an external partner?
  7. Do you currently use argument-based methods for communicating how your assurance artifacts meet the high-level properties you want to assure in your digital twin?
  8. Are you aware of any existing resources to support your use of Assurance techniques and/or technical standards?

Ethics Capabilities

  1. Does your organisation have an established definition or framework for "trustworthy" and "ethical" digital twins?
  2. If yes, please describe your organisation's definition or framework for "trustworthy" digital twins.
  3. How was this definition or framework developed (e.g., in-house, through consultancy, collaborative industry efforts)?

Section 3: Attitudes and Perceptions

  1. How valuable do you find ethical principles in general, such as the Gemini principles?
  2. Considering your own digital twin product, please rate the importance of the Gemini principles in assuring its trustworthiness and ethical use.
  3. Please rate how difficult you find it to operationalize the following principle within your digital twin product.
  4. How satisfied are you with the current level of integration between your assurance processes and the actual development lifecycle of your digital twins?

Section 4: Argument-Based Assurance - Perceived Value & Needs

  1. Do you believe argument-based assurance could offer advantages over current practices?
  2. If yes, what need could argument-based assurance solve in your assurance process?
  3. If no, is this due to satisfaction with current practices or the presence of another more valuable option (If the latter, please specify)?
  4. How prepared do you feel to develop and implement an assurance case for your digital twin project?
  5. What type of support might help you in creating sound assurance arguments around ethical principles for your digital twin project?
  6. How conducive is your organization's environment for adopting argument-based assurance methods?
  7. What factors would most significantly impact the successful adoption of argument-based assurance methods in your organization?

Section 5: Future Directions

  1. Is there anything else pertinent to the assurance of digital twins, which has not been covered, that you think is significant?

Scratchpad 📝

â„šī¸ About this Section

Any of the material in this section should be treated as notes, planning, or suggestions (e.g. to be integrated into the draft above).

Timeline survey creation

| Version | Date | Description of Changes | Comments |
| --- | --- | --- | --- |
| 0.1 | 2024-03-08 | Initial draft | Initial creation of survey questions circulated within TEA-DT and DT Hub team |
| 0.2 | 2024-03-18 | First revision | First revision based on internal reviewer feedback |
| 0.3 | 2024-03-25 | Final version (deployed) | Final version to be shared with external reviewers & TPS for pilot |
| 1 | 2024-03-29 | Release version | Survey is implemented and sufficiently tested / ready to be released |

Tasks

  • Collect questions V0.1
  • Team to provide input V0.1
  • Incorporate internal feedback V0.2
  • Project partners give explicit approval V0.3
  • Incorporate feedback V0.3
  • Implement in Streamlit V0.3 (see the sketch after this list)
  • Conduct technical tests V0.3
  • Verify survey protects anonymity V1
  • Provide comms around survey doc V1
  • Launch survey V1
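As a starting point for the "Implement in Streamlit" task above, the sketch below renders one Likert item and one free-text item from Section 2. Widget choices and response persistence are assumptions to be settled at implementation time.

```python
# streamlit_survey_sketch.py - minimal sketch of two survey items in Streamlit.
# Widget choices and response handling are assumptions; the deployed app may differ.
import streamlit as st

st.title("Digital Twin Community Pulse Check")

satisfaction = st.radio(
    "How satisfied are you with the current level of integration between your "
    "assurance processes and the actual development lifecycle of your digital twins?",
    options=[1, 2, 3, 4, 5],
    format_func=lambda x: {1: "1 - Very unsatisfied", 5: "5 - Very satisfied"}.get(x, str(x)),
    horizontal=True,
)

assurance_meaning = st.text_area(
    "What do you understand assurance to mean in the context of developing & "
    "applying digital twinning technology?"
)

if st.button("Submit"):
    # Persistence (e.g. writing to a database) is out of scope for this sketch
    st.success("Thank you - your responses have been recorded.")
```

Running `streamlit run streamlit_survey_sketch.py` previews the two items locally.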

Framing and Comms

Here are some alternative suggestions for framing the survey (please add to the list, and add your vote for your favourite):

  • Community Pulse Check ✅
  • State of the Digital Twins Assurance Ecosystem
  • Digital Twins Assurance Forum

Idea for Structuring Questions

To ensure each item in the questionnaire is salient, we could use the following table to structure our design and review process (illustrative example provided):

| # | Question | Data Type | Analysis Type | Rationale |
| --- | --- | --- | --- | --- |
| 1 | Which methods of assurance are you currently using? | Unstructured text | Frequency analysis, topic modelling | Identify and understand existing capabilities within the community. |

Archived Questions

Other Suggestions

  • What is the key function of your digital twinning technology?

    • Rationale: Provides another axis on which to group responses that is orthogonal to sector
    • Monitoring individual assets (e.g., pumps, machines, buildings)
    • Optimizing the performance of individual components
    • Controlling specific assets for efficiency or safety
    • Managing groups of assets within a system (e.g., manufacturing line, chemical process)
    • Optimizing production planning or resource scheduling
    • Enhancing operational efficiency of interconnected assets
    • Replicating entire systems or environments for simulation purposes
    • Optimizing complex systems, supply chains, or networks
    • Controlling large-scale operations, factories, or sites for strategic outcomes
    • Analyzing business outcomes or customer base behaviors
    • Integrating data from external sources for enhanced situational awareness
    • Collaborating with other organizations to optimize operations or processes
    • Leveraging shared data for improved customer management or service delivery
  • How are you involved in the assurance processes for digital twins?

    • Building solutions that are aligned with assurance requirements
    • Designing/Developing assurance cases
    • Overseeing/Managing assurance processes
    • Reviewing/Evaluating assurance documentation
    • Not (directly) involved
    • My organisation has no formal assurance process
  • Which of the following do you consider important for assuring trustworthy and ethical principles in your digital twin projects (select all that apply):

    • Safety (e.g. the DT does not create unnecessary or avoidable risks to an individual's health and well-being, the economy, the environment, or society more broadly)
    • Transparency (e.g. openness in how the digital twin works and the data it uses)
    • Accountability (e.g. identifying and taking responsibility for the actions and impacts of the digital twin)
    • Fairness (e.g. ensuring the digital twin does not unfairly discriminate or bias its decisions or outputs)
    • Privacy (e.g. protecting the privacy of individuals and organisations whose data is used in the digital twin)
    • Security (e.g. protecting the digital twin from unauthorised access, misuse, and manipulation)
    • Add other principles, including from the Gemini, SAFE-D, OAI, and NDTP
    • Rationale: identifies whether the spirit of the Gemini principles is considered important without explicitly asking about them.
    • Data type: Multiple choice
    • Analysis method: Frequency count
  • How well do you believe current governance frameworks and standards address the ethical considerations of digital twins in your sector?

  • In your experience, what ethical issues or risks do you believe are unique or particularly amplified in the context of digital twin technology, compared to other digital solutions?

  • How would you rate current capabilities within your organisation for assuring adherence to Gemini Principle X? (1 = very poor, 5 = Excellent)

  • How confident are you that your project is adhering to ethical principles? (1 = not confident at all, 5 = extremely confident)