JY CHEN - Ask Anything, Learn Everything. Logo

In Computers and Technology / High School | 2025-07-08

Which option is an AI system input vulnerability that the company needs to resolve before the chatbot is made available? A. Data leakage B. Prompt injection C. Large language model (LLM) hallucinations D. Concept drift

Asked by quynhnguyen9434

Answer (1)

The input vulnerability that the company needs to resolve before making the chatbot available is B. Prompt injection .
Prompt injection is a type of vulnerability specific to AI systems, particularly those using natural language processing, where an attacker can manipulate the input prompts to elicit unintended behavior from the AI model. This can result in the model outputting unauthorized or sensitive information, behaving unpredictably, or executing malicious commands. The issue arises because AI models like chatbots are designed to respond to user inputs, and if those inputs are crafted cleverly enough, they can override the original intent of the developers and exploit the system.
Let's break down why each option might or might not be a concern for input vulnerability:
A. Data leakage - This is a security concern where sensitive data is unintentionally exposed, but it's not specifically about the input to a system. It can be an output problem rather than an input vulnerability.
C. Large language model (LLM) hallucinations - This refers to when an AI generates information that seems plausible but is incorrect or fabricated. While this is an issue with AI outputs, it's not directly related to input vulnerabilities.
D. Concept drift - This occurs when the statistical properties of the target variable change, making a model less reliable over time. It's more about the model's adaptation and performance than an input vulnerability.
Therefore, to ensure the chatbot's robustness and security, the developers should focus on mitigating prompt injection attacks, which directly relate to how the system processes and responds to inputs.

Answered by BenjaminOwenLewis | 2025-07-22