dc.description.abstract | Patient satisfaction is one of the key indicators in assessing the quality of services in
healthcare facilities, including clinics. Satisfaction can be measured through both
verbal and non-verbal approaches. Verbal methods, such as questionnaires or
interviews, may introduce bias and fail to accurately represent a patient's true level of
satisfaction. As an alternative, a non-verbal approach—specifically, real-time facial
expression detection—offers a more objective means of evaluation. This study aims to
detect and classify patient satisfaction levels based on facial expressions by
implementing the Single Shot Multibox Detector (SSD) with a MobileNet V3-Large
backbone. The model was trained using a balanced dataset of 35,000 images consisting
of seven facial expression classes: happy, neutral, angry, sad, fearful, disgusted, and
surprised. These expressions were then grouped into three satisfaction categories: satisfied, neutral, and dissatisfied. Each expression class comprised 5,000 images, split into 3,000 for training, 1,000 for validation, and 1,000 for testing. Training employed data augmentation and fine-tuning to improve performance. Evaluation results
show that the model achieved an accuracy of 94.54% on the test data. These results
indicate that the SSD MobileNet V3-Large-based facial expression recognition model
is effective for automatic, real-time classification of patient satisfaction levels and shows potential for deployment in clinical environments as a service-quality evaluation support system. | en_US |
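
A minimal sketch, for illustration only and not the authors' implementation: one way to instantiate an SSD-style detector with a MobileNet V3-Large backbone in PyTorch/torchvision and map predicted expressions to satisfaction categories. The SSDLite320 variant, the expression-to-category mapping, the confidence threshold, and the dummy input frame are all assumptions; the abstract does not specify the authors' framework, head configuration, or training pipeline.

    # Sketch: SSD-style detector with a MobileNet V3-Large backbone (torchvision's
    # SSDLite320 variant), with the head sized for seven expression classes plus
    # background. In practice the model would be fine-tuned on the expression
    # dataset before inference, as the abstract describes.
    import torch
    from torchvision.models.detection import ssdlite320_mobilenet_v3_large
    from torchvision.models import MobileNet_V3_Large_Weights

    EXPRESSIONS = ["happy", "neutral", "angry", "sad", "fearful", "disgusted", "surprised"]

    # Illustrative grouping into satisfaction categories; the exact mapping used
    # by the authors is not given in the abstract.
    SATISFACTION = {
        "happy": "satisfied",
        "surprised": "satisfied",
        "neutral": "neutral",
        "angry": "dissatisfied",
        "sad": "dissatisfied",
        "fearful": "dissatisfied",
        "disgusted": "dissatisfied",
    }

    model = ssdlite320_mobilenet_v3_large(
        num_classes=len(EXPRESSIONS) + 1,  # +1 for the background class
        weights_backbone=MobileNet_V3_Large_Weights.IMAGENET1K_V1,
    )
    model.eval()

    # Single dummy frame; a real-time system would feed camera frames here.
    frame = torch.rand(3, 320, 320)
    with torch.no_grad():
        detections = model([frame])[0]

    for label, score in zip(detections["labels"], detections["scores"]):
        if score >= 0.5:  # illustrative confidence threshold
            expression = EXPRESSIONS[label.item() - 1]  # labels are 1-indexed (0 = background)
            print(expression, "->", SATISFACTION[expression])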