【2024 DigiKey Creative Competition】Smart Nursing Home Care System
Author: Wonbear
1. Introduction
Photos of the work:
Function introduction: The system monitors the behavior and health data of the elderly in real time. Face recognition and gesture recognition raise the level of intelligent monitoring and interaction, dangerous behaviors are detected and alarmed promptly, and caregivers receive comprehensive support for their care work.
BOM and introduction: A Raspberry Pi 5 acts as the control center, processing data and executing the main functions. An ESP32-C6-DEVKITC-1-N8 sends data to the host over UDP/IP, and the host displays health data and system status. 3660 sensors measure the environmental data. Working together, these components create a safe and comfortable living environment for the elderly.
2. System Block Diagram
The system mainly consists of the following key parts:
1. Hardware equipment
ESP32: As a sensor microcontroller, it is responsible for obtaining temperature and humidity information from the temperature and humidity sensor. It has efficient data processing capabilities and stable communication functions, and can accurately collect environmental data. Through the UDP protocol, ESP32 quickly sends temperature and humidity information to the host display, providing real-time environmental monitoring data for caregivers to adjust the environmental conditions in the nursing home in a timely manner to ensure that the elderly are in a comfortable living environment.
Raspberry Pi 5: plays an important role in visual recognition in the system. With its powerful computing power and rich interfaces, Raspberry Pi 5 can efficiently capture image and video information. Through the UDP protocol, Raspberry Pi sends the captured visual information to the host computer for real-time monitoring on the host monitor. This enables caregivers to observe the behavior and status of the elderly at any time, detect abnormal situations in time, and take appropriate measures to ensure the safety of the elderly.
Temperature and humidity sensor: used to measure the temperature and humidity in the nursing home. The sensor has high precision and high stability, and can accurately reflect the changes in temperature and humidity in the environment. It transmits the collected temperature and humidity information to ESP32, providing key environmental data for the system.
Host computer and display: As the core control and display device of the system, the host computer receives information from the Raspberry Pi and ESP32. By installing special monitoring software, the host computer can display visual information and temperature and humidity data on the display in real time, which is convenient for nursing staff to conduct comprehensive monitoring and analysis. At the same time, the host computer can also store and process the received data for subsequent query and statistical analysis.
2. Communication Protocol
The system uses the UDP protocol for data transmission. UDP is fast and has low latency, which matches the system's requirement for real-time data transfer. Both the visual information sent by the Raspberry Pi and the temperature and humidity information sent by the ESP32 are transmitted quickly to the host computer over UDP, so nursing staff always see up-to-date monitoring data.
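For reference, the sketch below shows the two JSON message shapes that travel over UDP in this system. The field names ("N", "T", "H", "name", "warning", "state") are the ones used by the code later in this article; the host address, port and sample values are placeholders only.

import json
import socket

HOST_ADDR = ("192.168.1.100", 8080)  # placeholder host IP and port; the real values are shown in the host UI

# Message sent by the ESP32: one temperature/humidity reading
esp_msg = {"N": "esp", "T": 24.6, "H": 55.2}

# Message sent by the Raspberry Pi: a recognized person and their state
# ("state" is looked up in the host's state table; the exact values are project-specific)
raspi_msg = {"N": "raspi", "name": "Wonbear", "warning": False, "state": 0}

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
for msg in (esp_msg, raspi_msg):
    sock.sendto(json.dumps(msg).encode("utf-8"), HOST_ADDR)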
3. Functional description of each part
1. Face recognition part
Face recognition first captures the face region, as shown in the figure below, then compares the facial features and reports a confidence value, as shown in the second figure:
The powerful computing power of Raspberry Pi is used to perform facial recognition on the captured images. The identity of the elderly can be identified to ensure that only authorized personnel enter specific areas, and the activities of the elderly can be recorded, providing strong protection for the safety management of nursing homes.
The program running process is shown in the figure below:
The host collects the faces and trains the model, then sends the trained data to the Raspberry Pi, which lightens the load on the control core. The Raspberry Pi compares captured faces with the received facial-feature data to determine who is entering and leaving. The information is stored and processed offline, which greatly improves the security of the system's information.
The key codes of face recognition are as follows:
# Dependencies used by this snippet (cv2.face requires opencv-contrib-python)
import os
import time
import json

import cv2
import numpy as np

# cap, img_exposure(), face_detect(), faces_dir, Name, MAX_COUNT, frame_count,
# save_flag, train_names, train_faces and train_labels are set up earlier in the full program.

while True:
    # Read image from camera; exit the loop if the read fails
    ret, frame = cap.read()
    if not ret:
        break
    frame = cv2.flip(frame, 1)  # horizontal flip
    # cv2.imshow('frame0', frame)
    # Increase brightness
    frame = img_exposure(frame, 20, 0)
    # Detect faces
    face, rect = face_detect(frame)
    if face is None:
        cv2.putText(frame, "No face detected", (50, 50), cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 0, 255), 2, cv2.LINE_AA)
    else:
        x, y, w, h = rect
        cv2.rectangle(frame, (x, y), (x + w, y + h), (255, 0, 255), 2, cv2.LINE_AA)
        # Save a face image roughly every 0.5 s (time_now counts tenths of a second)
        time_now = int(time.time() * 10)
        if time_now % 5 == 0:
            if save_flag:
                cv2.imwrite(os.path.join(faces_dir, Name, f"{frame_count}.jpg"), face)
                frame_count += 1
                save_flag = False
        else:
            save_flag = True
    cv2.imshow('frame', frame)
    cv2.waitKey(1)  # let the OpenCV window refresh
    if frame_count > MAX_COUNT:
        break

# Training: build the face and label lists from the saved images
for facename in os.listdir(faces_dir):
    train_names.append(facename)
    train_label = train_names.index(facename)
    for img_name in os.listdir(os.path.join(faces_dir, facename)):
        img_path = os.path.join(faces_dir, facename, img_name)
        img = cv2.imread(img_path)
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        train_faces.append(gray)
        train_labels.append(train_label)

# Save the name list so labels can be mapped back to names during recognition
with open('train_datas.json', 'w') as f:
    json.dump(train_names, f)

train_labels = np.array(train_labels)
recognizer = cv2.face.LBPHFaceRecognizer_create()
recognizer.train(train_faces, train_labels)
recognizer.write("train_res.yml")
This code implements a function to read images from a camera, detect faces, save face images, and perform face recognition training.
- Read an image from the camera and exit the loop if the read fails.
- Flip the image horizontally and increase the brightness.
- Detect faces. If no face is detected, display "No face detected" on the image. If a face is detected, draw a rectangle to mark its position. A face image is saved roughly every 0.5 seconds: time_now is the time in tenths of a second, so time_now % 5 == 0 becomes true every half second; when it is true and save_flag is set, the face image is saved and the related variables are updated, otherwise save_flag is set again to wait for the next save opportunity.
- Display the processed image. If the number of saved images exceeds MAX_COUNT, exit the loop.
- After the loop is completed, the face recognition training part is carried out:
- Traverse the folder that holds the face images, add the name corresponding to each face to the train_names list, and determine the label of each face (that is, the index of the corresponding name in the list).
- For each image under each name, read the image and convert it to grayscale, add the grayscale image to the train_faces list, and the corresponding label to the train_labels list.
- Save the train_names list to a JSON file.
- Convert train_labels to a NumPy array (train_faces can be passed to the trainer as a list of grayscale images, so it does not need to be converted).
- Create a cv2.face.LBPHFaceRecognizer object and train it using train_faces and train_labels.
- Save the training results to a YML file.
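The recognition step that runs on the Raspberry Pi is not listed above. Below is a minimal sketch of how the trained files could be used, assuming the same file names as the training code; the recognize() wrapper is illustrative, not the project's actual function.

import json
import cv2

# Load the label-to-name mapping and the trained LBPH model produced by the training code
with open('train_datas.json') as f:
    train_names = json.load(f)

recognizer = cv2.face.LBPHFaceRecognizer_create()
recognizer.read("train_res.yml")

def recognize(face_img):
    # Return (name, unconfidence) for a cropped face image;
    # LBPH reports a distance, so a smaller unconfidence means a better match.
    gray = cv2.cvtColor(face_img, cv2.COLOR_BGR2GRAY)
    label, unconfidence = recognizer.predict(gray)
    return train_names[label], unconfidence

The gesture code later in the article passes 100 - unconfidence as a confidence score, which matches this convention.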
2. Gesture recognition part
Gesture recognition is shown in the following figure:
The gesture recognition function, also implemented on the Raspberry Pi, brings real convenience and safety to the elderly. It provides a simple, non-contact way to interact: instead of pressing buttons or using a touch screen, residents can trigger actions with simple gestures, such as sending a help signal, switching lights on and off, or adjusting equipment settings. This matters because body flexibility, and finger mobility in particular, tends to decline with age, so gesture control reduces the operating burden and improves quality of life. Non-contact interaction is also safer and more hygienic: in an environment like a nursing home, avoiding shared touch surfaces lowers the risk of spreading bacteria and viruses and reduces the chance of cross-infection.
In short, the gesture recognition function fits the physical characteristics and daily needs of the elderly and noticeably improves both the convenience and the safety of using the system.
The key codes for gesture recognition are as follows:
# results comes from MediaPipe Hands (hands.process(...)); fingers_top = [4, 8, 12, 16, 20]
# holds the fingertip landmark indices; pos, time1, total_time, name and unconfidence
# are initialized earlier in the full program.
if results.multi_hand_landmarks:
    for hand_landmarks in results.multi_hand_landmarks:
        mp_drawing.draw_landmarks(
            frame, hand_landmarks, mp_hands.HAND_CONNECTIONS)
        handLms = hand_landmarks.landmark
        fingers_pos = []
        # Decide whether this is a right or left hand from the thumb/pinky positions
        if handLms[4].x > handLms[20].x:
            hand_dir = 'Right'
        else:
            hand_dir = 'Left'
        # Thumb: compare x coordinates (the test depends on the hand direction)
        if hand_dir == 'Right':
            if handLms[fingers_top[0]].x > handLms[fingers_top[0] - 1].x:
                fingers_pos.append(1)
            else:
                fingers_pos.append(0)
        else:
            if handLms[fingers_top[0]].x < handLms[fingers_top[0] - 1].x:
                fingers_pos.append(1)
            else:
                fingers_pos.append(0)
        # Other four fingers: a fingertip above its middle joint counts as extended
        for finger in range(1, 5):
            if handLms[fingers_top[finger]].y < handLms[fingers_top[finger] - 2].y:
                fingers_pos.append(1)
            else:
                fingers_pos.append(0)
        # Last gesture
        oldFingers = pos
        # Calculate the current gesture
        pos = fingers_pos.copy()
        if pos == [0, 0, 0, 0, 0]:
            cv2.putText(frame, 'Waiting...', (10, 15), cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 0, 255), 2)
            total_time = 0  # all fingers bent: reset the timer
        else:
            cv2.putText(frame, 'posture: %s' % (str(pos)), (10, 15), cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 0, 255), 2)
            total_time += time.time() - time1  # accumulate how long the gesture is held
            time1 = time.time()
        if oldFingers != pos:  # the gesture changed, so start timing again
            total_time = 0
        if 1 < total_time < 1.5:
            total_time = 1.5  # avoid re-triggering while the gesture is still held
            pos_dofunction(pos, hand_dir, name, 100 - unconfidence)
This code detects and processes hand gestures. It first checks whether any hand landmarks were detected (results.multi_hand_landmarks) and, if so, iterates over each detected hand, drawing the landmarks and connections on the frame with mp_drawing.draw_landmarks. It then decides whether the hand is a left or right hand, judges which fingers are extended accordingly, and stores the result in the fingers_pos list. The previous gesture is kept in oldFingers and the current one is copied into pos. If all fingers are bent (pos == [0, 0, 0, 0, 0]), "Waiting..." is shown and total_time is reset to 0; if any finger is extended, the current gesture is displayed and the elapsed time is accumulated into total_time. Whenever the gesture changes, total_time is reset. Once the same gesture has been held for just over a second (1 < total_time < 1.5), pos_dofunction is called with the current gesture, hand direction, recognized name, and recognition confidence, and it carries out the action assigned to that gesture. In this way the code reacts only to gestures that are held stably for a short time, which makes it suitable for a gesture-based interaction system.
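pos_dofunction() itself is not listed in the article. As an illustration only, a dispatcher of this kind might look roughly like the sketch below; the gesture-to-action mapping and the confidence cut-off are invented, while the message fields and the udp_socket/server_addr names follow the Raspberry Pi code shown in the next part.

import json

def pos_dofunction(pos, hand_dir, name, confidence):
    # Hypothetical mapping from a finger pattern to an action.
    # pos is the 5-element finger list, e.g. [0, 1, 0, 0, 0] = only the index finger extended.
    if confidence < 50:            # ignore gestures from a weak face identification
        return
    if pos == [1, 1, 1, 1, 1]:     # open hand: send a help request
        data = {"N": "raspi", "name": name, "warning": True}
    elif pos == [0, 1, 0, 0, 0]:   # index finger only: an "I am fine" confirmation
        data = {"N": "raspi", "name": name, "warning": False, "state": 0}
    else:
        return                     # unmapped gesture (hand_dir could be used to refine this)
    udp_socket.sendto(json.dumps(data).encode("utf-8"), server_addr)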
3. Abnormal situation capture part
The following two pictures show abnormal-situation capture, one taken while the person is moving and one while still. Because the pixel statistics of the frame-difference image differ greatly between the two states, the system can judge whether an elderly person has been still for a long time; in testing, this judgment was correct about 99% of the time.
Through image processing technology, the Raspberry Pi can identify abnormal behaviors of the elderly, such as long periods of inactivity, not returning home, etc. Once an abnormal situation is detected, the system will prompt the caregiver through sound or image to take timely countermeasures.
Part of the code is as follows:
# frame is the current camera image; frame1, frame2, run_count, name,
# udp_socket and server_addr are defined earlier in the full program.
now_time = int(time.time())
if now_time % 20 == 0:
    if run_count == 0:  # run the comparison only once per 20-second point
        run_count += 1
        frame2 = frame1
        frame1 = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if frame2 is not None:
            diff = cv2.absdiff(frame1, frame2)
            # cv2.imshow('diff', diff)
            _, thresh = cv2.threshold(diff, 15, 255, cv2.THRESH_BINARY)
            count = cv2.countNonZero(thresh)
            # print(count)
            if count > 13000:
                pass  # enough pixels changed: the person is moving normally
            else:
                # Little or no movement: warn the host over UDP
                data = {
                    "N": "raspi",
                    "name": name,
                    "warning": True
                }
                msg = json.dumps(data)
                udp_socket.sendto(msg.encode("utf-8"), server_addr)
                print("jinzhi")  # debug output marking the "still" warning
else:
    run_count = 0
This code uses the frame-difference method to judge whether the person is moving. It first gets the current time; when the time in seconds is divisible by 20, the time-interval condition is met. The run_count flag makes sure the comparison runs only once within that second: on the first frame of the interval, the previous grayscale frame is kept in frame2, the current frame is converted to grayscale into frame1, and if a previous frame exists the absolute difference diff between them is computed and binarized into the threshold image thresh. The number of non-zero pixels count in thresh indicates how much the scene has changed. If count is below 13,000, the person is considered to have little or no movement; in that case a dictionary containing the device identification, name and warning flag is serialized to JSON and sent over the UDP socket to the host address. Whenever the time condition is not met, run_count is reset to 0, ready for the next 20-second point.
This method can effectively detect the motion status of people, detect abnormal situations in time and issue warnings, providing a reliable technical means for intelligent care in nursing homes. At the same time, by setting appropriate thresholds and time intervals, it can be adjusted and optimized according to actual application scenarios to improve the accuracy and efficiency of detection.
4. Temperature and humidity data collection part
The ESP32 is programmed in MicroPython and uses both the I2C protocol (wired, to read the sensor) and the UDP protocol (wireless, over Wi-Fi), making good use of the ESP32's strengths.
ESP32 obtains data from the temperature and humidity sensors and monitors the ambient temperature and humidity in the nursing home in real time, ensuring that the indoor temperature and humidity are within the appropriate range, providing a comfortable and safe living environment for the elderly.
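The ESP32 firmware itself is not listed here. Below is a minimal MicroPython sketch of the idea; the Wi-Fi credentials, host address, I2C pins and the read_temp_humi() placeholder are assumptions, and the placeholder must be replaced with the I2C driver for the actual sensor.

# MicroPython sketch for the ESP32 side (illustrative only)
import json
import time
import network
import socket
from machine import I2C, Pin

SSID, PASSWORD = "your-ssid", "your-password"   # placeholders
HOST_ADDR = ("192.168.1.100", 8080)             # placeholder host IP and port

wlan = network.WLAN(network.STA_IF)
wlan.active(True)
wlan.connect(SSID, PASSWORD)
while not wlan.isconnected():
    time.sleep(0.5)

i2c = I2C(0, scl=Pin(7), sda=Pin(6))            # pins depend on the actual wiring

def read_temp_humi():
    # Placeholder: replace with the I2C transaction for the real sensor,
    # e.g. raw = i2c.readfrom(SENSOR_ADDR, 6) followed by decoding.
    return 24.6, 55.2

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
while True:
    t, h = read_temp_humi()
    msg = json.dumps({"N": "esp", "T": t, "H": h})  # field names match the host-side code
    sock.sendto(msg.encode("utf-8"), HOST_ADDR)
    time.sleep(2)                                   # send a new reading periodically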
5. Host data display part
The host's initial display interface is shown below: the top shows the local IP address and the open port number, the left shows the temperature and humidity history curves, and the right shows the abnormal-event information captured by the Raspberry Pi.
When the UDP host receives data from ESP32 and Raspberry Pi, the display result is as shown below:
The host computer receives and displays data from the Raspberry Pi and ESP32, including visual information, temperature and humidity data, and system status, etc. This allows caregivers to check at any time and keep abreast of the health status and environmental conditions of the elderly.
The key part of the data receiving code is as follows:
def rec_udp_data(self):
    while True:
        msg = self.sock.recvfrom(1024)[0]
        if msg:
            data = json.loads(msg.decode('utf-8'))
            if data["N"] == "esp":
                self.ui.label_5.setText(">>ESP data received")
                self.esp_data.append({
                    "time": time.strftime("%Y-%m-%d %H:%M:%S", time.localtime()),
                    "temperature": data["T"],
                    "humidity": data["H"]
                })
            if data["N"] == "raspi":
                show_record = ""
                self.ui.label_5.setText(">> Raspi Data Receive")
                if data["warning"] == False:
                    self.raspi_data.append({
                        "time": time.strftime("%Y-%m-%d %H:%M:%S", time.localtime()),
                        "name": data["name"],
                        "state": state[data["state"]]
                    })
                else:
                    self.raspi_data.append({
                        "time": time.strftime("%Y-%m-%d %H:%M:%S", time.localtime()),
                        "name": data["name"],
                        "state": "Long time static"
                    })
                for i in self.raspi_data:
                    show_record = show_record + f'{i["name"]} {i["time"]} {i["state"]}\n'
                self.signal_emitter.signal1.emit(show_record)
            self.update_draw()  # refresh the temperature/humidity plots
        else:
            self.ui.label_5.setText(">> No data received")
The function rec_udp_data continuously listens on the UDP port for incoming data, processes and records it according to its source, updates the interface display, and emits a signal so that other parts of the program can react.
while True: Start an infinite loop and continue to listen to UDP data.
msg = self.sock.recvfrom(1024)[0]: Receive data from the UDP socket, receive up to 1024 bytes of data, and take the first element returned (the received data content).
if msg: If data is received:
data = json.loads(msg.decode('utf-8')): Decodes the received byte data into a UTF-8 encoded string and then parses it into a JSON object.
Determine the data source based on the value of data["N"]:
If it is "esp":
self.ui.label_5.setText(">> ESP data receiving"): Display the prompt message "ESP data receiving" on the interface.
Add the received temperature and humidity data and the current time to the self.esp_data list in the form of a dictionary containing the fields "time", "temperature", and "humidity".
If it is "raspi":
self.ui.label_5.setText(">> Raspi Data Receiving"): Display the prompt message "Raspi Data Receiving" on the interface.
Determine whether there is a warning based on the value of data["warning"]:
If there is no warning, add the received data (including time, name and status) to the self.raspi_data list, and the status gets the corresponding value from the state dictionary.
If there is a warning, add the received data (time, name and the "Long time static" state) to the self.raspi_data list.
Traverse the self.raspi_data list, build a string show_record in the format of "name time status\n", and trigger other parts to perform corresponding processing through the signal self.signal_emitter.signal1.emit(show_record).
If no data is received, set the prompt message on the interface to "No data received".
self.update_draw(): Calls update_draw (shown below), which refreshes the temperature and humidity plots.
This function plays a key role in processing UDP data reception, recording data from different sources, and updating interfaces and triggering signals, providing real-time data processing and interaction capabilities for the nursing home smart care system.
def update_draw(self):
    Tx = []
    Ty = []
    Hx = []
    Hy = []
    for i in self.esp_data:
        Ty.append(i["temperature"])
        Tx.append(len(Ty) * 2)
        Hy.append(i["humidity"])
        Hx.append(len(Hy) * 2)
    self.draw_temp.draw2d(Tx, Ty, 'r', 'temperature')
    self.draw_humi.draw2d(Hx, Hy, 'g', 'humidity')
This function update_draw is used to update the temperature and humidity graphics.
First, four empty lists Tx, Ty, Hx, and Hy are defined to store the horizontal and vertical coordinate data of temperature and humidity respectively.
Traverse the self.esp_data list (the readings appended by rec_udp_data):
For each entry, the temperature value is appended to Ty and a horizontal coordinate of len(Ty)*2 is appended to Tx, so successive samples are spaced 2 units apart on the time axis.
Similarly, the humidity value is appended to Hy and the corresponding horizontal coordinate to Hx.
Finally, self.draw_temp.draw2d(Tx, Ty, 'r', 'temperature') and self.draw_humi.draw2d(Hx, Hy, 'g', 'humidity') plot the two curves. draw2d comes from the project's own plotting module; its parameters are the x list, the y list, the curve color ('r' red for temperature, 'g' green for humidity) and the label.
This function realizes dynamic graphic update of temperature and humidity data by processing the received data and calling the drawing method, providing users with an intuitive way to visualize data.
4. Source Code
Intelligent nursing system for nursing homes-Download related embedded development materials-EEWORLD Download Center
https://download.eeworld.com.cn/detail/Wonbear/634539
5. Demonstration video of the work’s functions
6. Project Summary
The intelligent nursing system in nursing homes is an innovative project dedicated to improving the efficiency and quality of nursing care for the elderly and ensuring their safety and health.
In terms of system composition, the project integrates a variety of advanced hardware devices. As the core control unit, Raspberry Pi 5, with its powerful processing power, is not only responsible for processing data from sensors and cameras, but also performs monitoring and alarm functions. At the same time, it implements face recognition and gesture recognition functions, greatly enhancing the monitoring and interaction capabilities of the system. ESP32, as the microcontroller of the sensor, obtains accurate temperature and humidity information from the temperature and humidity sensor, and quickly sends the data to the host display through the UDP protocol. The temperature and humidity sensor ensures accurate monitoring of the environment in the nursing home and provides a comfortable living environment for the elderly. As the core control and display device of the system, the host computer receives information from Raspberry Pi and ESP32, and presents visual information and temperature and humidity data on the display in real time through special monitoring software, which is convenient for caregivers to fully monitor and analyze.
The system has rich and diverse functions. In terms of health monitoring, through environmental monitoring, temperature and humidity sensors are used to monitor the environmental data in the nursing home in real time to ensure that the indoor temperature and humidity are appropriate. The abnormal activity detection function uses image processing technology to identify abnormal behaviors of the elderly and provide guarantees for timely measures. The face recognition and gesture recognition functions further enhance the intelligence level and interactive experience of the system. Face recognition is used to confirm the identity of the elderly and prevent unauthorized persons from entering key areas. It can also be used for attendance records and activity tracking of the elderly. Gesture recognition provides the elderly with a simple and contactless interaction method, such as asking for help and confirmation, which improves the convenience and safety of use. The data processing and upload function ensures the storage, preliminary processing and analysis of data, as well as uploading to the host through the network, which is convenient for remote access and further analysis. The real-time display and alarm function allows caregivers to view the current environmental data and health indicators of the elderly at any time, and receive sound or image prompts in time when abnormal conditions are detected.
The optimization goal is clear. By increasing the number of temperature and humidity sensors, we can accurately judge the humidity conditions of various parts of the room based on the temperature and humidity data from all directions, and promptly discover whether items need to be replaced or corresponding measures need to be taken, thus creating a more livable living environment for the elderly.
The project has many advantages: high integration (multiple sensors and hardware devices combine to provide comprehensive monitoring and care), real-time performance (data is collected, processed and displayed in real time, giving timely feedback on the elderly's condition), intelligence (face recognition and gesture recognition raise the level of automation and improve the interactive experience), and good scalability (more sensors and functional modules can be added as needed in the future).
In terms of application prospects, the nursing home smart care system is not only suitable for nursing homes, but can also be promoted and applied to home care, hospital wards and other scenarios to help improve the quality of care, reduce the burden on caregivers, and improve the quality of life of the elderly. In the future, combined with artificial intelligence technology, the system is expected to further improve the level of automation and intelligence, and provide more accurate and personalized care services.
7. Others
Expected optimization goals:
By arranging multiple temperature and humidity sensors in different positions, the temperature and humidity data of each part of the room can be collected from all directions and angles. In this way, the humidity conditions of each area of the room can be judged more accurately. Each sensor is like a keen observer, constantly monitoring the subtle changes in the surrounding environment and accurately transmitting the data to the system.
With rich temperature and humidity data, caregivers can have a clearer understanding of the overall environmental conditions of the room. Based on this data, they can promptly identify which parts of the room may have excessive humidity problems, and accurately determine whether certain items need to be replaced or appropriate moisture-proof measures need to be taken. For example, if the humidity in a certain area is continuously high, it may mean that the area is poorly ventilated or has problems such as water leaks. At this time, caregivers can take quick action to adjust ventilation equipment, check pipes, etc. to ensure that the elderly live in a dry and comfortable environment.
Increasing the number of temperature and humidity sensors will not only help improve the accuracy of judging the humidity in the room, but also provide more reliable protection for the health and safety of the elderly. A humid environment is prone to breeding bacteria and mold, posing a potential threat to the respiratory system and physical health of the elderly. By promptly discovering and solving humidity problems, the risk of illness for the elderly can be effectively reduced, creating a more livable living environment for them.
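As a sketch of how the host could use such multi-point data, the snippet below averages readings per zone and reports zones whose humidity stays above a limit; the zone names and the 65 %RH threshold are examples rather than part of the current implementation.

from collections import defaultdict

HUMIDITY_LIMIT = 65.0  # example threshold in %RH

def find_damp_zones(readings):
    # readings: list of dicts such as {"zone": "bedroom-window", "humidity": 72.1}
    per_zone = defaultdict(list)
    for r in readings:
        per_zone[r["zone"]].append(r["humidity"])
    # Average each zone and keep the ones above the limit
    return [zone for zone, values in per_zone.items()
            if sum(values) / len(values) > HUMIDITY_LIMIT]

# Example: the two sensors near the window report high humidity -> ['bedroom-window']
print(find_damp_zones([
    {"zone": "bedroom-window", "humidity": 72.1},
    {"zone": "bedroom-window", "humidity": 69.8},
    {"zone": "bedroom-door", "humidity": 52.3},
]))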
8. Work Documentation
DigiKey_contest_2024_word.doc
(8.04 MB)