Assignment series 2

Table of Contents
1. What is Emotional Intelligence
2. Why Emotional Intelligence is Important
3. Human Emotions
4. Machine Emotions
5. Human-System Interfacing
6. Future Emotional Intelligence Applications
7. Bibliography

EMOTIONAL INTELLIGENCE BOTS: A BETTER FUTURE FOR THE HUMAN RACE
ASSIGNMENT CODE: A51906004
DEPARTMENT OF MECHATRONICS
M. BARATHKUMAR, 171MC110

1. WHAT IS EMOTIONAL INTELLIGENCE:
Artificial emotional intelligence is the combination of emotion and artificial intelligence. Emotional intelligence is the ability to recognize emotions in oneself and in others, to regulate and differentiate between emotions, and to let them guide our thinking and behaviour. It is one of the qualities that most defines a human being. Although artificial intelligence is a technology developed to help humans and to perform tasks better, it still lacks this level of emotional cognition.

2. WHY EMOTIONAL INTELLIGENCE IS IMPORTANT:
AI has made many aspects of our lives smoother and simpler. Machines and robots are already used in the manufacturing operations of many companies, and AI is changing the game in areas such as aviation, education, marketing, finance, heavy industry, medicine, media and customer service. However, injecting emotional intelligence into AI systems is a very complicated process. Artificial emotional intelligence will be used to improve productivity in the workplace, to assist physicians and nurses in providing care, and to personalize the learning experience of students. Human communication with artificial emotional intelligence systems is expected to become more conversational and relational.

3. HUMAN EMOTIONS:
Human emotions have served a long evolutionary purpose in our survival as a species. They are reactions to external stimuli or spontaneous manifestations of internal thought processes. Emotions such as fear are often the result of external stimuli, for example the fear felt when crossing a busy road, which puts our evolutionary survival mechanism into practice. In contrast, solving a complex mathematical difference equation can make a person happy through a sense of personal satisfaction; it is a purely introspective act with no external cause, yet resolving it still triggers an emotion.
Fig. 1.1: Human-machine interface.

4. MACHINE EMOTIONS:
One of the founding fathers of AI, Marvin Minsky, once remarked on machine emotions: "The question is not whether intelligent machines can have any emotions, but whether machines can be intelligent without any emotions." For example, a clinician may decide on purely medical grounds that the best treatment option for a very elderly hospital patient is a surgical procedure; however, the physician's emotional empathy for the patient may override this view. Emotional intelligence, as well as technical knowledge, is used to decide between treatment options. Of course, machines may never feel emotions the way humans do. Nevertheless, they could simulate emotions that enable them to interact with humans in more appropriate ways.

5. HUMAN-SYSTEM INTERFACING:

5.1. EMOTION DETECTION - FACE:
Using any optical sensor or a standard webcam, the emotion AI measures unfiltered and unbiased facial expressions of emotion. Deep learning algorithms then analyze the pixels in the detected face regions and categorize the facial expressions, and combinations of these facial expressions are mapped to emotions. In our products, we measure seven emotions: anger, contempt, disgust, fear, happiness, sadness, and surprise. In addition, we offer 20 facial expression measurements.
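The face pipeline described above (camera frame, face detection, deep-learning classification of expressions, mapping to emotions) can be sketched in a few lines of Python. This is a minimal illustration under assumptions, not any vendor's actual implementation: it assumes the opencv-python package for webcam capture and Haar-cascade face detection, and the score_expressions function is a hypothetical stand-in for a trained deep-learning classifier.

```python
# Minimal sketch of a face-based emotion detection pipeline.
# Assumes the opencv-python package; `score_expressions` is a placeholder
# for a trained deep-learning model, not a real product API.
import cv2
import numpy as np

EMOTIONS = ["anger", "contempt", "disgust", "fear", "happiness", "sadness", "surprise"]

def score_expressions(face_pixels: np.ndarray) -> np.ndarray:
    """Placeholder for a CNN mapping a cropped face to expression scores.
    Returns a uniform distribution so the sketch runs end to end."""
    return np.full(len(EMOTIONS), 1.0 / len(EMOTIONS))

def main() -> None:
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cam = cv2.VideoCapture(0)           # any optical sensor / standard webcam
    ok, frame = cam.read()
    cam.release()
    if not ok:
        print("No camera frame available.")
        return
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in detector.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5):
        face = cv2.resize(gray[y:y + h, x:x + w], (48, 48))  # typical classifier input size
        scores = score_expressions(face)
        print("dominant emotion:", EMOTIONS[int(np.argmax(scores))])

if __name__ == "__main__":
    main()
```

In a real system the placeholder classifier would be replaced by a network trained on labelled facial-expression data, and the per-frame scores would be smoothed over time before being reported as an emotion.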
Fig. 1.2: Machine-human interaction.

5.2. EMOTION DETECTION - SPEECH:
The speech capability analyzes changes in tone, tempo, and voice quality to distinguish speech events, emotions, and gender. This low-latency approach is crucial for enabling real-time, emotion-aware applications and devices. A cloud-based application programming interface analyzes a prerecorded audio segment, such as an MP3 file, and the output provides an analysis of the speech events occurring in the audio every few hundred milliseconds, not just at the end of the entire utterance (a minimal sketch of this windowed analysis is given at the end of Section 6). A software development kit for analyzing speech emotion in real time is expected to become available in the future.

6. FUTURE EMOTIONAL INTELLIGENCE APPLICATIONS:

6.1. VIDEO GAMING: Using computer vision, the game console or video game detects emotions from the player's facial expressions during gameplay.

6.2. MEDICAL DIAGNOSIS: Using voice analysis, software can help clinicians diagnose diseases such as depression and dementia.

6.3. EDUCATION: Learning-software prototypes have been developed that respond to children's emotions. When a child shows frustration because a task is too difficult or too simple, the program adapts the task so that it becomes less or more challenging. Another learning application helps autistic children recognize the emotions of other people.

6.4. EMPLOYEE SAFETY: Demand for employee-safety solutions is growing. Emotional AI can help analyze the stress and tension of employees in demanding jobs, such as first responders.

6.5. PATIENT CARE: A nurse bot not only reminds elderly patients on long-term medical plans to take their medication, but also converses with them every day to monitor their overall well-being.

6.6. CAR SAFETY: Automakers can monitor a driver's emotional state using computer vision technology. An extreme emotional state or drowsiness can trigger a warning to the driver.

6.7. AUTONOMOUS CARS: In the future, autonomous cars will carry multiple sensors, including cameras and microphones, to monitor what is happening inside the vehicle and how occupants perceive the driving experience.

6.8. FRAUD DETECTION: Insurance companies use voice analysis to determine whether a client is telling the truth when submitting a claim. According to independent studies, 30% of users admitted to lying to their car insurance company in order to obtain coverage.

6.9. RECRUITING: Software is used during job interviews to assess a candidate's credibility.

6.10. CALL CENTER INTELLIGENT ROUTING: An angry customer can be detected from the beginning of a call and routed to a well-trained agent, who can also monitor how the conversation is going and adjust it accordingly.

6.11. CONNECTED HOME: A smart speaker with a virtual assistant can recognize the mood of the person it is interacting with and respond accordingly.

6.12. PUBLIC SERVICE: Partnerships have emerged between emotional AI technology vendors and surveillance camera providers. Cameras in public places in the United Arab Emirates can detect people's facial expressions and thereby gauge the general mood of the public. This project was launched by the Ministry of Happiness.
Fig. 1.3: Robots used in human safety.

6.13. RETAIL: Retailers have begun to leverage computer vision based emotion AI technology in stores to capture demographic statistics as well as shoppers' moods and reactions.
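The windowed speech analysis described in Section 5.2 can be sketched as follows. This is a minimal illustration under simplifying assumptions: it reads a mono 16-bit WAV file (rather than an MP3) with Python's standard wave module and NumPy, and reports simple loudness and zero-crossing features for each 250 ms window; a real system would pass such features to a trained model to estimate emotion and gender.

```python
# Minimal sketch of windowed speech analysis: a prerecorded audio file is
# scanned in windows of a few hundred milliseconds and simple acoustic
# features are reported per window. Assumes a mono 16-bit WAV input;
# "sample.wav" is a hypothetical file name.
import wave
import numpy as np

WINDOW_SECONDS = 0.25  # "every few hundred milliseconds"

def analyze(path: str) -> None:
    with wave.open(path, "rb") as wav:
        rate = wav.getframerate()
        samples = np.frombuffer(wav.readframes(wav.getnframes()), dtype=np.int16)
    samples = samples.astype(np.float32) / 32768.0
    window = int(rate * WINDOW_SECONDS)
    for start in range(0, len(samples) - window, window):
        chunk = samples[start:start + window]
        energy = float(np.sqrt(np.mean(chunk ** 2)))               # loudness proxy
        zcr = float(np.mean(np.abs(np.diff(np.sign(chunk))) > 0))  # rough tone/pitch proxy
        speaking = energy > 0.02                                   # crude voice-activity check
        print(f"{start / rate:6.2f}s  energy={energy:.3f}  zcr={zcr:.3f}  "
              f"speech={'yes' if speaking else 'no'}")

if __name__ == "__main__":
    analyze("sample.wav")
```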
7. BIBLIOGRAPHY:
Jean-Yves Fiset, Human-Machine Interface Design for Process Control Applications. ISA, 2009.
https://www.experfy.com
https://mitsloan.mit.edu
www.bbvaopenmind.com
www.affectiva.com