Evaluating and Mitigating Relationship Hallucinations in Large Vision-Language Models
Hallucination is a prevalent concern in existing Large Vision-Language Models (LVLMs). Previous efforts have primarily focused on investigating object hallucinations, which can be readily mitigated by introducing object detectors. However, these ...