This interdisciplinary study addresses a critical challenge in animal-assisted therapy (AAT) for individuals with severe disabilities such as spinal cord injury, stroke, and amyotrophic lateral sclerosis. While AAT can alleviate both mental and physical difficulties, effective participation by individuals with limited mobility requires real-time interpretation of animal behavior to mediate interaction.
The authors focus on canine tail language, a highly expressive and visually accessible behavioral signal. Tail movements were captured using two 3-axis accelerometers, enabling continuous sensing of directional patterns and movement frequency. These parameters served as the primary behavioral features for emotional inference.
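The paper does not spell out its signal-processing pipeline here, but the two features it names (movement frequency and directional pattern) can be sketched from one accelerometer axis. The sketch below is a hypothetical feature extractor, not the authors' implementation: it estimates the dominant wag frequency from an FFT peak and a simple left/right bias from the fraction of samples above the mean.

```python
import numpy as np

def tail_features(lateral_acc, fs):
    """Estimate wag frequency (Hz) and directional bias from the
    side-to-side axis of a tail-mounted accelerometer.

    Hypothetical feature extraction; the paper's exact processing
    is not specified in this summary.

    lateral_acc : 1-D array of acceleration samples
    fs          : sampling rate in Hz
    """
    x = np.asarray(lateral_acc, dtype=float)
    x = x - x.mean()                      # remove gravity/DC offset

    # Dominant wag frequency from the FFT magnitude peak.
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    spectrum[0] = 0.0                     # ignore residual DC bin
    wag_hz = float(freqs[np.argmax(spectrum)])

    # Directional bias: fraction of time the tail sits on one side.
    raw = np.asarray(lateral_acc, dtype=float)
    bias = float(np.mean(raw > raw.mean()))
    return wag_hz, bias
```

For a synthetic 4 Hz wag sampled at 100 Hz for 2 s, `tail_features` recovers a frequency near 4 Hz and a bias near 0.5 (symmetric wagging).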
To translate raw sensor data into interpretable emotional states, the study introduced a novel fuzzy logic framework combining Gaussian–Trapezoidal membership functions with center-of-gravity (COG) defuzzification. The model classified canine emotional behavior into four core categories ("agitate", "happy", "scare", and "neutral") while also allowing representation of blended emotional states.
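The mechanics of such a framework can be illustrated with a minimal Mamdani-style sketch: Gaussian membership functions fuzzify an input (here, wag frequency), trapezoidal sets describe the output, rule strengths clip the output sets, and the COG (centroid) of the aggregate yields a crisp score. All centers, spreads, and rules below are invented for illustration; the paper's actual rule base and parameters are not given in this summary.

```python
import numpy as np

def gaussian_mf(x, c, sigma):
    """Gaussian membership: peaks at 1 when x == c, spread sigma."""
    return np.exp(-0.5 * ((x - c) / sigma) ** 2)

def trapezoid_mf(x, a, b, c, d):
    """Trapezoidal membership: rises a..b, flat b..c, falls c..d."""
    x = np.asarray(x, dtype=float)
    rise = np.clip((x - a) / (b - a), 0.0, 1.0)
    fall = np.clip((d - x) / (d - c), 0.0, 1.0)
    return np.minimum(rise, fall)

def cog_defuzzify(universe, membership):
    """Center-of-gravity defuzzification: centroid of the aggregate
    output membership function over the discretized universe."""
    m = np.asarray(membership, dtype=float)
    if m.sum() == 0.0:
        return float(np.mean(universe))
    return float(np.sum(universe * m) / np.sum(m))

def infer_emotion_score(wag_hz):
    """Toy two-rule inference: slow wag -> neutral, fast wag -> happy.
    All parameters are hypothetical."""
    slow = gaussian_mf(wag_hz, c=1.0, sigma=0.8)   # fuzzify input
    fast = gaussian_mf(wag_hz, c=4.0, sigma=1.0)
    u = np.linspace(0, 10, 201)                    # output universe
    neutral = np.minimum(trapezoid_mf(u, -1, 0, 2, 4), slow)
    happy = np.minimum(trapezoid_mf(u, 6, 8, 10, 11), fast)
    agg = np.maximum(neutral, happy)               # Mamdani max-aggregation
    return cog_defuzzify(u, agg)
```

Because both clipped output sets contribute to the centroid, intermediate wag frequencies land between the "neutral" and "happy" centroids, which is how this style of model represents blended emotional states.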
The emotional behavior model was first validated in a simulated dog environment and subsequently evaluated with a real dog. In both settings, the system achieved a perfect recognition rate, which the authors present as evidence of robustness and practical feasibility.
This work highlights the potential of wearable sensing and fuzzy inference systems to bridge communication gaps in human–canine interaction, particularly for people with profound physical limitations. By enabling dogs’ emotional signals to be translated into actionable information, the approach supports more inclusive, responsive, and ethically grounded animal-assisted therapy.
Source: Animal-Assisted Therapy for Persons with Disabilities Based on Canine Tail Language Interpretation via Gaussian-Trapezoidal Fuzzy Emotional Behavior Model. Medicine & Engineering. (Publication details as provided).







