Emotion detection from Electroencephalogram (EEG) signals is an emerging field that provides insight into human emotional states by monitoring brain activity, with applications in music therapy and entertainment. This study aims to establish a connection between brain activity and emotion recognition through music. Its applications include mental health assessment, emotionally intelligent agents, adaptive learning, pain assessment, patient monitoring, security and surveillance, and personalized music recommendation. Traditional EEG-based emotion detection techniques frequently struggle with the complex and noisy nature of the data. Because EEG signals arrive as raw, noisy data, this study proposes an actor-critic algorithm to enable accurate, real-time emotion detection in the presence of musical stimuli. The actor-critic architecture predicts emotional states from EEG features, leveraging the rich, real-time information they provide about brain activity. In this setup, the actor network predicts an individual's emotional state, such as happiness, sadness, or stress, from the EEG signals it processes. The critic network, in turn, evaluates these predictions, assessing how well they align with the actual emotional states and providing the feedback needed to refine and enhance the actor's predictive capability.
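To make the division of labor between the two networks concrete, the sketch below shows one possible way such an actor-critic pair could be wired together for emotion prediction from EEG features; it is a minimal illustration, not the authors' implementation. It assumes PyTorch, pre-extracted 64-dimensional EEG feature vectors, three example emotion classes, and a reward of 1 for a correct prediction; all of these choices are illustrative assumptions rather than details taken from the study.

```python
# Minimal actor-critic sketch for EEG-based emotion prediction.
# Assumptions (not from the paper): PyTorch, 64-dimensional EEG feature
# vectors, three emotion classes, and reward = 1 for a correct prediction.

import torch
import torch.nn as nn

FEATURE_DIM = 64                                    # assumed EEG feature size
EMOTIONS = ["happiness", "sadness", "stress"]       # example classes from the text


class Actor(nn.Module):
    """Predicts a distribution over emotional states from EEG features."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(FEATURE_DIM, 128), nn.ReLU(),
            nn.Linear(128, len(EMOTIONS)),
        )

    def forward(self, x):
        return torch.distributions.Categorical(logits=self.net(x))


class Critic(nn.Module):
    """Estimates the expected reward (prediction quality) for an EEG input."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(FEATURE_DIM, 128), nn.ReLU(),
            nn.Linear(128, 1),
        )

    def forward(self, x):
        return self.net(x).squeeze(-1)


actor, critic = Actor(), Critic()
opt = torch.optim.Adam(list(actor.parameters()) + list(critic.parameters()), lr=1e-3)


def train_step(eeg_features, true_labels):
    """One update: the actor guesses an emotion, the critic scores the guess."""
    dist = actor(eeg_features)                  # actor's belief over emotions
    action = dist.sample()                      # predicted emotion per sample
    reward = (action == true_labels).float()    # 1 if the guess matches the label
    value = critic(eeg_features)                # critic's estimate of expected reward
    advantage = reward - value                  # how much better/worse than expected

    actor_loss = -(dist.log_prob(action) * advantage.detach()).mean()
    critic_loss = advantage.pow(2).mean()       # critic learns to predict the reward
    loss = actor_loss + critic_loss

    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()


if __name__ == "__main__":
    # Synthetic data standing in for real EEG recordings under musical stimuli.
    x = torch.randn(32, FEATURE_DIM)                # fake EEG feature batch
    y = torch.randint(0, len(EMOTIONS), (32,))      # fake emotion labels
    for epoch in range(5):
        print(f"epoch {epoch}: loss = {train_step(x, y):.3f}")
```

In this sketch the critic's value estimate supplies the feedback described above: the advantage (reward minus estimated value) tells the actor whether a prediction was better or worse than expected, which drives the policy-gradient update.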