Inspirit AI Scholars Day 8
- arimilli5
- May 21, 2023
- 1 min read
We spent our day playing around with pre-trained models and put together an emotion detector. We used OpenCV's Haar Cascade classifier (a classic, pre-trained face detector) to find faces in each webcam frame, then passed each detected face to DeepFace's pre-trained emotion model to label the expression.
Here is our code:
-----------------------------------------------------------------------------------
import cv2
from deepface import DeepFace
# Load the pre-trained model
model = DeepFace.build_model('Emotion')
# Load the Haar Cascade for face detection
face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + 'haarcascade_frontalface_default.xml')
# Initialize the video capture
cap = cv2.VideoCapture(0)
# Main program loop
while True:
    ret, frame = cap.read()
    if not ret:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Perform face detection using the Haar Cascade
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5, minSize=(30, 30))

    # Iterate through the detected faces (still in the main program loop!)
    for (x, y, w, h) in faces:
        # Extract the face region of interest (ROI) from the color frame
        roi = frame[y:y+h, x:x+w]

        # Perform emotion detection using the pre-trained model (in the for loop);
        # DeepFace.analyze accepts a numpy array for img_path
        result = DeepFace.analyze(img_path=roi, actions=['emotion'], enforce_detection=False)
        # Newer deepface versions return a list of results, older ones a single dict
        if isinstance(result, list):
            result = result[0]
        emotion_label = result['dominant_emotion']

        # Draw a rectangle around the face and display the detected emotion
        cv2.rectangle(frame, (x, y), (x+w, y+h), (0, 255, 0), 2)
        cv2.putText(frame, emotion_label, (x, y - 10), cv2.FONT_HERSHEY_SIMPLEX, 0.9, (0, 255, 0), 2)

    # Display the resulting frame (after exiting the for loop)
    cv2.imshow('Emotion Detection', frame)

    # Break the loop when 'q' is pressed
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cap.release()
cv2.destroyAllWindows()
-----------------------------------------------------------------------------------
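If you want to try the emotion model on a single saved photo before running the whole webcam loop, here is a minimal sketch (it assumes deepface and opencv-python are installed with pip, and 'face.jpg' is just a placeholder for any photo that contains a face):
-----------------------------------------------------------------------------------
from deepface import DeepFace

# Analyze one image; 'face.jpg' is a placeholder path
result = DeepFace.analyze(img_path='face.jpg', actions=['emotion'], enforce_detection=False)

# Newer deepface versions return a list of results, older ones a single dict
if isinstance(result, list):
    result = result[0]

print(result['dominant_emotion'])  # e.g. 'happy'
print(result['emotion'])           # confidence scores for each emotion
-----------------------------------------------------------------------------------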
We plan on using this model in our presentation as well! See you next time!