Formal Framework for Normative Requirements Elicitation: Avoiding Ambiguity through Disambiguation

In this article, we discuss the importance of normative rules in the development of artificial intelligence (AI) systems. Normative rules are guidelines that specify what an AI system should do in a given situation, based on ethical and moral principles. Such rules are essential for ensuring that AI systems behave responsibly, because they provide a framework for decision-making that takes the potential consequences of an action into account.
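To make the idea concrete, here is a minimal Python sketch (our own illustration, not code from the paper) that represents a normative rule as a triggering condition paired with an obligation. The class name NormativeRule, the fall-detection scenario, and the field names are assumptions made purely for the example.

    from dataclasses import dataclass
    from typing import Callable, Dict, List

    # Hypothetical illustration: a normative rule pairs a triggering
    # condition with the action the system is obliged to perform.
    @dataclass
    class NormativeRule:
        name: str
        condition: Callable[[Dict], bool]  # when does the rule apply?
        obligation: str                    # what should the system do?

    # Made-up example for an assistive robot: if the user has fallen,
    # the robot should notify a caregiver.
    fall_rule = NormativeRule(
        name="call_for_help_on_fall",
        condition=lambda state: state.get("user_has_fallen", False),
        obligation="notify_caregiver",
    )

    def applicable_rules(rules: List[NormativeRule], state: Dict) -> List[NormativeRule]:
        """Return the rules whose conditions hold in the current situation."""
        return [rule for rule in rules if rule.condition(state)]

    for rule in applicable_rules([fall_rule], {"user_has_fallen": True}):
        print(f"Rule '{rule.name}' applies: the system should {rule.obligation}")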
The article highlights the challenges of writing normative rules for AI systems, in particular ambiguity and conflicts with the capabilities of the robot or system. To address these challenges, the framework starts from a preliminary normative rule and identifies conflicts between that rule and the robot's or system's capabilities. It then provides strategies for resolving these conflicts and refining the normative rule, after which the process returns to the conflict-identification step and repeats.
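As a rough picture of that refinement cycle, here is an illustrative Python outline (our own sketch, not the paper's algorithm). The helpers find_conflicts and refine_rule are simplified placeholders standing in for expert judgement and tool support, and the stopping condition is an assumption.

    def find_conflicts(rule, capabilities):
        """Return the requirements of the rule that the robot/system
        cannot actually satisfy (placeholder check)."""
        return [req for req in rule["requires"] if req in capabilities["missing"]]

    def refine_rule(rule, conflicts):
        """Refine the rule so the identified conflicts disappear
        (placeholder: simply drop the unsatisfiable requirements)."""
        return {**rule, "requires": [r for r in rule["requires"] if r not in conflicts]}

    def elicit(preliminary_rule, capabilities, max_rounds=10):
        """Check the rule against the system's capabilities, refine it,
        and go back to the check until no conflicts remain."""
        rule = preliminary_rule
        for _ in range(max_rounds):
            conflicts = find_conflicts(rule, capabilities)
            if not conflicts:
                return rule  # the rule is consistent with what the system can do
            rule = refine_rule(rule, conflicts)
        raise RuntimeError("conflicts remain after the allotted refinement rounds")

    # Made-up example: a rule that asks for physical lifting the robot cannot
    # provide is refined down to the notification it can provide.
    rule = {"name": "help_fallen_user", "requires": ["lift_user", "notify_caregiver"]}
    capabilities = {"missing": ["lift_user"]}
    print(elicit(rule, capabilities))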
The article also emphasizes the need for a clear understanding of the context in which an AI system will be used, since that context can change how a normative rule is interpreted. Throughout, everyday language and familiar metaphors or analogies are used to demystify these concepts and make them accessible to a general reader.
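As a small illustration of how context shifts the reading of a rule, the sketch below (a made-up example, not from the article) maps the same abstract rule to different concrete behaviors in different deployment settings.

    # Made-up illustration: one abstract rule, two context-dependent readings.
    INTERPRETATIONS = {
        ("respect_privacy", "hospital"): "share sensor data only with clinical staff",
        ("respect_privacy", "private_home"): "share sensor data only with the user",
    }

    def interpret(rule_name: str, context: str) -> str:
        """Look up the context-specific reading of a normative rule."""
        return INTERPRETATIONS.get((rule_name, context), "interpretation undefined")

    print(interpret("respect_privacy", "hospital"))
    print(interpret("respect_privacy", "private_home"))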
Overall, the article underscores how normative rules shape the behavior of AI systems and why these rules must be considered and developed carefully to ensure that AI systems act ethically and responsibly.