Abstract Knowledge graphs have recently been introduced to the field of verification strategy, successfully representing the complexity of verification in real-life applications. Compared with more traditional verification artifacts such as requirement traceability matrices and verification matrices, this format supports a scale-free analysis of verification strategies: complexities can be observed both visually and numerically, in terms of problem scope as well as entity interdependencies. In this paper, we retrieve information patterns that represent different aspects of verification by tapping into the network properties of knowledge graphs. The graphs are dissected to detect knowledge patterns emerging from different parts of the verification artifacts, and the similarities and differences between two verification strategies are explained both numerically and semantically. Seemingly unrelated requirements and verification activities turn out to be linked through indirect connections, and orthogonalities between independent requirements are analyzed. These findings validate the scalability of knowledge-graph-based verification planning and assessment.
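The indirect connections described above can be illustrated with a toy graph model. This is a minimal sketch, not the paper's method: the requirement (R*) and verification-activity (V*) names are hypothetical, and the paper's actual knowledge graphs are far larger and richer.

```python
from collections import deque

# Toy verification knowledge graph: requirements (R*) and verification
# activities (V*) are nodes; an edge means "is verified by". All names
# are hypothetical, for illustration only.
edges = [
    ("R1", "V1"), ("R2", "V1"),   # R1 and R2 share activity V1
    ("R2", "V2"), ("R3", "V2"),   # R2 and R3 share activity V2
    ("R4", "V3"),                 # R4 is disconnected from R1..R3
]

# Build an undirected adjacency map.
adj = {}
for a, b in edges:
    adj.setdefault(a, set()).add(b)
    adj.setdefault(b, set()).add(a)

def indirect_path(src, dst):
    """Shortest chain of nodes linking two entities (BFS), or None."""
    prev, frontier, seen = {src: None}, deque([src]), {src}
    while frontier:
        node = frontier.popleft()
        if node == dst:
            path = []
            while node is not None:
                path.append(node)
                node = prev[node]
            return path[::-1]
        for nxt in adj.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                prev[nxt] = node
                frontier.append(nxt)
    return None

# R1 and R3 look unrelated, yet are linked through V1, R2, and V2;
# R1 and R4 are orthogonal (no path at all).
print(indirect_path("R1", "R3"))  # ['R1', 'V1', 'R2', 'V2', 'R3']
print(indirect_path("R1", "R4"))  # None
```

Graph traversal like this scales to large strategies because it touches only reachable nodes, which is what makes the scale-free analysis of indirect dependencies tractable.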
This content will become publicly available on July 1, 2026
Hidden Beliefs in Verification Decisions: An Experimental Study with Practitioners
Abstract System verification is used to check that a system has been built in accordance with its requirements. In executing a verification strategy, each verification activity produces information artifacts that are then used as evidence in assessing the system's compliance with its requirements. Reasoning through verification artifacts is a cognitive and subjective process, in which the engineer combines their knowledge and expertise with the information in the evidence collected through verification activities. Through an experimental study with practitioners, this paper shows that engineers use some of this knowledge implicitly in their verification assessments and do not express it explicitly when formally articulating the justification for declaring a system compliant.
- Award ID(s): 2205468
- PAR ID: 10653087
- Publisher / Repository: Wiley
- Date Published:
- Journal Name: INCOSE International Symposium
- Volume: 35
- Issue: 1
- ISSN: 2334-5837
- Page Range / eLocation ID: 554 to 572
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
-
Regulatory documents are complex and lengthy, making full compliance a challenging task for businesses. Similarly, privacy policies provided by vendors frequently fall short of the necessary legal standards due to insufficient detail. To address these issues, we propose a solution that leverages a Large Language Model (LLM) in combination with Semantic Web technology. This approach aims to clarify regulatory requirements and ensure that organizations' privacy policies align with the relevant legal frameworks, ultimately simplifying the compliance process, reducing privacy risks, and improving efficiency. In this paper, we introduce a novel tool, the Privacy Policy Compliance Verification Knowledge Graph, referred to as PrivComp-KG. PrivComp-KG is designed to efficiently store and retrieve comprehensive information related to privacy policies, regulatory frameworks, and domain-specific legal knowledge. By utilizing an LLM with Retrieval-Augmented Generation (RAG), we can accurately identify relevant sections in privacy policies and map them to the corresponding regulatory rules. Our LLM-based retrieval system has demonstrated a high level of accuracy, achieving a correctness score of 0.9 and outperforming other models in privacy policy analysis. The extracted information from individual privacy policies is then integrated into the PrivComp-KG. By combining this data with contextual domain knowledge and regulatory rules, PrivComp-KG can be queried to assess each vendor's compliance with applicable regulations. We demonstrate the practical utility of PrivComp-KG by verifying the compliance of privacy policies across various organizations. This approach not only helps policy writers better understand legal requirements but also enables them to identify gaps in existing policies and update them in response to evolving regulations.
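The compliance query behind a tool like PrivComp-KG can be sketched with a toy in-memory triple store. This is only an illustration of the knowledge-graph idea: the predicates, rule identifiers, and vendor policy below are all hypothetical, and the real system uses a full knowledge graph plus LLM-based retrieval.

```python
# Toy triple store: (subject, predicate, object). All names hypothetical.
kg = [
    ("GDPR:Art13", "requires", "disclosure:data_collection"),
    ("GDPR:Art17", "requires", "disclosure:right_to_erasure"),
    ("VendorA:policy", "discloses", "disclosure:data_collection"),
    # Note: VendorA never discloses the right to erasure.
]

def query(pred, subj=None):
    """Objects of all triples matching a predicate (and optional subject)."""
    return {o for s, p, o in kg if p == pred and (subj is None or s == subj)}

def compliance_gaps(vendor_policy):
    """Regulatory disclosures required but missing from a vendor policy."""
    required = query("requires")
    disclosed = query("discloses", subj=vendor_policy)
    return sorted(required - disclosed)

print(compliance_gaps("VendorA:policy"))  # ['disclosure:right_to_erasure']
```

In practice such queries would be expressed in SPARQL against an RDF store; the set difference between required and disclosed items is the essence of the gap analysis either way.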
-
In many regulated domains, traceability is established across diverse artifacts such as requirements, design, code, test cases, and hazards, either manually or with the help of supporting tools, and the resulting trace links are used to support activities such as impact analysis, compliance verification, and safety inspections. Automated tracing techniques need to leverage the semantics of the underlying artifacts in order to establish more accurate trace links and to explain links that have been created either manually or automatically. To support this, we propose an automated technique that leverages source code, project artifacts, and an external domain corpus to generate a domain-specific concept model. We then use the generated concept model to improve traceability results and to provide explanations of those results. Our approach overcomes existing problems with deep-learning traceability algorithms, as it does not require a training set of existing trace links. Finally, as an initial proof of concept, we apply our semantically guided approach to the Dronology project and show that it improves over tracing techniques that do not use a concept model.
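The idea of using a domain concept model to strengthen trace links can be sketched as follows. The concept dictionary and artifact texts here are invented for illustration; the actual approach derives its concept model from source code, project artifacts, and an external domain corpus.

```python
# Hypothetical domain concept model: maps surface terms to shared concepts.
concept_model = {
    "uav": "drone", "drone": "drone", "quadcopter": "drone",
    "geofence": "airspace_limit", "boundary": "airspace_limit",
    "altitude": "altitude", "height": "altitude",
}

def concepts(text):
    """Bag of domain concepts mentioned in an artifact's text."""
    return {concept_model[w] for w in text.lower().split() if w in concept_model}

def trace_score(artifact_a, artifact_b):
    """Jaccard overlap of the concepts shared by two artifacts."""
    a, b = concepts(artifact_a), concepts(artifact_b)
    return len(a & b) / len(a | b) if a | b else 0.0

req = "The UAV shall not cross the geofence boundary"
comment = "Reject waypoints where the quadcopter height exceeds the boundary"

# "UAV"/"quadcopter" and "geofence"/"boundary" map to the same concepts,
# so the artifacts score well despite sharing almost no literal terms.
print(round(trace_score(req, comment), 2))  # 0.67
```

Scoring at the concept level rather than the term level is what lets seemingly dissimilar artifacts be linked, and the shared concepts themselves double as an explanation of the trace link.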
-
Since the introduction of CompCert, researchers have been refining its language semantics and correctness theorem and using them as components in software verification efforts. Meanwhile, artifacts ranging from CPU designs to network protocols have been successfully verified, and there is interest in making them interoperable to tackle end-to-end verification at an even larger scale. Recent work shows that a synthesis of game semantics, refinement-based methods, and abstraction layers has the potential to serve as a common theory of certified components. Integrating certified compilers into such a theory is a critical goal; however, none of the existing variants of CompCert meets the requirements we have identified for this task. CompCertO extends the correctness theorem of CompCert to characterize compiled program components directly in terms of their interaction with each other. Through a careful and compositional treatment of calling conventions, this is achieved with minimal effort.
-
An essential requirement of any information management system is to protect data and resources against breaches or improper modification while ensuring data access for legitimate users. Systems handling personal data are mandated to track its flow in order to comply with data protection regulations. We have built a novel framework that integrates a semantically rich data-privacy knowledge graph with Hyperledger Fabric blockchain technology to provide an automated access-control and audit mechanism that enforces users' data privacy policies while sharing their data with third parties. Our blockchain-based data-sharing solution addresses two of the most critical challenges: transaction verification and permissioned data obfuscation. It ensures accountability for data sharing in the cloud by incorporating a secure and efficient system for end-to-end provenance. In this paper, we describe this framework along with the comprehensive, semantically rich knowledge graph we have developed to capture rules embedded in data privacy policy documents. Organizations can use our framework to automate compliance of their cloud datasets.
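The policy-enforcement step of such a framework can be illustrated with a toy rule check. The rule structure, roles, and purposes below are hypothetical; the real system encodes policy rules in a knowledge graph and enforces them through Hyperledger Fabric transactions.

```python
# Hypothetical data-sharing rules extracted from a user's privacy policy:
# each rule permits one (role, data category, purpose) combination.
rules = [
    {"role": "insurer", "data": "fitness", "purpose": "premium_calculation"},
    {"role": "researcher", "data": "fitness", "purpose": "aggregate_study"},
]

def access_allowed(role, data, purpose):
    """Grant a third-party request only if some policy rule matches it exactly."""
    return any(
        r["role"] == role and r["data"] == data and r["purpose"] == purpose
        for r in rules
    )

# An insurer may compute premiums from fitness data, but not market with it.
print(access_allowed("insurer", "fitness", "premium_calculation"))  # True
print(access_allowed("insurer", "fitness", "marketing"))            # False
```

Recording each such decision on a ledger is what would give the audit trail and end-to-end provenance the framework describes.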
