Open Access | Research Article

Hospital-Specific Automated Waste Segregation for High-Accuracy Real-Time Classification

Sultan Akinola Atanda 1, Shuqroh Opeyemi Abdulrasaq 1, Olusola Kunle Akinde 1, Sunday Adeola Ajagbe 2,3, Abraham Kehinde Aworinde 1,4*

1 Department of Electrical and Biomedical Engineering, Abiola Ajimobi Technical University, 200261 Ibadan, Nigeria
2 Department of Computer Engineering, Abiola Ajimobi Technical University, 200261 Ibadan, Nigeria
3 Department of Computer Science, University of Zululand, 3886 Kwadlangezwa, South Africa
4 Department of Mechanical Engineering, Federal University of Agriculture and Technology, 211104 Okeho, Nigeria
Healthcraft Frontiers | Volume 3, Issue 3, 2025 | Pages 128-138
Received: 07-24-2025 | Revised: 09-04-2025 | Accepted: 09-19-2025 | Available online: 09-29-2025

Abstract:

Healthcare facilities generate heterogeneous waste streams that must be accurately segregated at the point of disposal to mitigate occupational exposure risks, reduce downstream treatment costs, and ensure compliance with stringent biomedical waste regulations. However, most existing automated waste segregation systems have been developed for domestic or general-purpose scenarios and are poorly adapted to the operational complexity and safety requirements of hospital environments. In this study, a hospital-specific automated waste segregation system was designed, implemented, and experimentally evaluated for real-time classification of five clinically relevant waste categories: infectious waste, sharps, pharmaceutical waste, recyclable waste, and general waste. The proposed system integrates an ultrasonic sensor with a Raspberry Pi 4B platform executing a lightweight MobileNetV2 model, coupled with a motorised mechanical sorting mechanism. A curated dataset comprising 6,868 labelled hospital-waste images was constructed and used to fine-tune the model to ensure robustness under embedded deployment constraints. Experimental validation under simulated hospital disposal scenarios demonstrated an overall classification accuracy of 97%, with end-to-end segregation cycle times ranging from 8 to 12 seconds per item across repeated trials. These results indicate that high-accuracy, real-time waste classification can be achieved using low-cost embedded hardware and compact deep learning architectures. The proposed approach establishes a practical and scalable foundation for intelligent healthcare waste management at the point of disposal, offering a viable pathway toward safer clinical environments, improved operational efficiency, and the broader adoption of edge AI solutions in resource-constrained healthcare settings.
Keywords: Hospital waste segregation, Deep learning, MobileNetV2, Raspberry Pi, Automated sorting, Edge AI

1. Introduction

Healthcare facilities generate a wide range of waste streams, including infectious materials, sharps, pharmaceutical residues, recyclables, and general refuse (Nwachukwu et al., 2013; Olukanni et al., 2022). Appropriate segregation of these waste categories at the point of generation is necessary to prevent cross-contamination, reduce occupational hazards, and ensure compliance with national and international health regulations such as WHO guidelines and country-specific policies (Ibrahim et al., 2023). In developing nations, poor segregation practices mean that up to 60% of this waste is improperly handled, heightening the risk of infections, injuries, and environmental contamination (Olukanni et al., 2022). In many hospitals, segregation is still performed manually, a process that is error-prone, time-consuming, and hazardous, exposing healthcare staff to injury and infection (Olukanni et al., 2022). Misclassification or mixing of hazardous and non-hazardous waste increases waste treatment costs and complicates downstream disposal (Abdel-Shafy & Mansour, 2018).

Although several automated waste segregation systems already exist, most are designed for household or municipal solid waste and address only a small subset of waste categories (Akinlade et al., 2022; Flores & Tan, 2019; Pučnik et al., 2024). These existing systems are inappropriate for the complex and safety-demanding environment of healthcare facilities (Miamiliotis & Talias, 2023). In addition, many reported approaches rely on cloud-based image processing or bulky industrial equipment, which limits their real-time performance and makes integration difficult in small and medium-sized hospitals and healthcare facilities, especially in resource-constrained settings (Alowais et al., 2023; Taiwo et al., 2025).

In medical waste contexts, several studies have made progress but still fall short of comprehensive hospital waste management. Cuarto et al. (2023) designed a microcontroller-based, sensor-driven system for hospital waste that segregates three categories (electronic, pathological, and sharp waste). While the study addressed hospital needs, it lacked deep learning integration and comprehensive category coverage, particularly for pharmaceutical and recyclable waste. Gan et al. (2024) developed an embedded Convolutional Neural Network (CNN) system using GoogLeNet to classify medical waste into three categories (general infection, dangerous infection, and general garbage), achieving a high accuracy of 99.34% on a curated dataset of 2,025 images. However, their system focused on detection without physical sorting and did not include pharmaceutical or recyclable waste, limiting its applicability in diverse hospital settings (Ajagbe et al., 2024; Balogun et al., 2025). Similarly, Hermawan et al. (2023) implemented a You Only Look Once (YOLO) v5-based system on a Raspberry Pi for hazardous medical waste classification, achieving 85–96% accuracy with a latency of up to 5 seconds. While their use of a lightweight platform aligns with this study, their system’s higher latency and focus on detection rather than sorting make it less efficient for high-throughput hospital environments.

Ahmed & Shanto (2024) conducted a comparative analysis of YOLO variants (YOLOv5, YOLOv7, and YOLOv8-m) for surgical waste detection, reporting a mean Average Precision of 82.4% for YOLOv8-m. Their work set a benchmark for precision in hazardous waste detection but did not incorporate physical sorting or address all hospital waste categories. Bruno et al. (2023) achieved 100% accuracy in offline medical waste classification using a curated dataset, demonstrating the potential of CNNs with high-quality data, but the lack of real-time processing limits the study’s practicality in dynamic hospital settings. Kabilan et al. (2024) proposed a CNN-based system for smart cities, classifying waste into biodegradable, non-biodegradable, biomedical, and electronic types with 91.87% accuracy. While their system supports edge-device deployment, it is not tailored specifically for hospital waste management and does not address physical sorting.

Zhou et al. (2022) proposed one of the pioneering deep learning frameworks for medical waste classification, using a ResNeXt model with transfer learning to classify eight categories of hospital waste. Their dataset of 3,480 labelled images collected from a single hospital achieved an accuracy of 97.2% and an F1-score of 97%. Although their results demonstrated the effectiveness of deep learning for medical waste categorisation, the study was limited to offline classification and did not address real-time deployment, physical sorting, or challenges such as mixed waste scenarios commonly encountered in hospital settings. Building on this direction, Kunwar & Rai (2025) introduced a healthcare waste classification approach aligned with national and WHO colour-coding guidelines. Their work compared multiple architectures, including ResNeXt-50, EfficientNet-B0, MobileNetV3, and YOLOv5-s, achieving a best accuracy of 95.06% using YOLOv5-s. While the study demonstrated the value of aligning AI systems with regulatory frameworks, it did not incorporate mechanical sorting or real-time deployment in clinical environments, and the dataset lacked broader contextual diversity.

Nafiz et al. (2023) designed an innovative prototype named ConvoWaste, which integrated computer vision and automation for real-time waste segregation. The system combined a deep CNN with a conveyor-based sorting mechanism controlled by servo motors. It utilised a camera to capture images of waste items, which were then processed and classified automatically before being mechanically separated into respective bins. A classification accuracy of 98% was reported, illustrating the potential for merging AI with electromechanical systems for automated waste management. However, ConvoWaste primarily targeted general domestic waste rather than medical waste. The lack of biomedical-specific datasets and safety considerations limited its applicability in hospital environments, where materials often include hazardous items such as needles, contaminated gloves, and body-fluid-soaked gauze. Therefore, while the system proved that automation is feasible, it did not meet the rigorous accuracy, safety, and specificity standards required for hospital use.

In a related study, Huang et al. (2023) introduced a capsule-network-based waste classification model called ResMsCapsule to address the limitations of conventional convolutional neural networks in capturing spatial relationships in waste images. By combining residual connections and multi-scale feature extraction with a capsule network architecture, the model improved the recognition of complex and visually similar waste items. When evaluated on the TrashNet dataset, ResMsCapsule achieved a classification accuracy of 91.41% while using only 40% of the parameters of ResNet18, indicating a good balance between performance and model complexity. However, the study focused solely on static image classification and did not consider real-time waste sorting or deployment on embedded or edge-based systems. As a result, although the findings demonstrate the potential of capsule networks for improved visual understanding, their practical application in automated waste segregation systems for hospital environments remains largely unexplored.

Another perspective was presented by Wulandari (2020), who explored a sensor-based rather than a vision-based approach. The system utilised a combination of proximity, moisture, and gas sensors integrated with an Arduino microcontroller to detect and classify waste materials, directing the categorised waste into designated bins using simple automation controls. This approach demonstrated how non-visual data could aid waste segregation while minimising human contact with hazardous materials. However, the study provided no quantitative performance metrics such as accuracy or precision, and it lacked scalability. Moreover, because medical waste varies widely in texture, shape, and chemical composition, relying solely on sensor readings may not be sufficient to achieve high classification accuracy. The system also did not account for visual differentiation between infectious and non-infectious items, a critical requirement in hospital waste management.

In response to the limitations identified in existing studies, this research presents the development of a hospital-specific AI-driven waste segregation system capable of classifying waste in real time into five categories: infectious waste, sharps, pharmaceutical waste, recyclable waste, and general waste. The system integrates a lightweight image-based recognition model with a motorised sorting mechanism coordinated by a microcontroller, enabling automated segregation directly at the point of disposal. Experimental testing demonstrates high classification accuracy and reliable processing suitable for deployment in small- to medium-sized hospitals, outpatient clinics, and medium-traffic departments, particularly within African healthcare settings where resource limitations and inconsistent segregation practices remain major challenges. By focusing on hospital-relevant categories, real-time operation, and low-cost embedded deployment, the system aims to improve segregation accuracy, reduce occupational risk, and support safer waste management practices.

This paper is organised as follows. Section 2 presents the materials and methods, covering system design, equipment selection, model development, the sorting procedure, and the experimental setup. Section 3 describes the experimental results. Section 4 discusses the results, and Section 5 presents the conclusion and directions for future work.

2. Methodology

2.1 System Design

The automated waste segregation system was designed to classify and sort hospital waste into five categories (infectious waste, sharps, pharmaceutical waste, recyclable waste, and general waste) in real time at the point of disposal by patients and healthcare personnel. The architecture shown in Figure 1 integrates three modules: (i) a sensing and image acquisition unit, (ii) an embedded recognition module, and (iii) a motorised sorting mechanism. When waste is detected, the camera captures an image, the embedded classifier processes it, and the corresponding bin is automatically aligned for disposal. This design supports segregation directly at the source and significantly reduces the need for downstream manual sorting in hospital environments.

Figure 1. Block diagram of the automated hospital waste segregation system
2.2 Selection of Equipment

A Raspberry Pi 4 Model B (4 GB RAM) was selected as the embedded controller due to its low power consumption, built-in interfaces, and compatibility with lightweight deep learning models. Image capture was performed using a Raspberry Pi camera module (8 MP), providing sufficient resolution for image recognition of waste items. Waste proximity detection at the input chute was achieved using an HC-SR04 ultrasonic sensor, which triggers lid opening, image capture, and classification into the correct category. The specifications of the major hardware components are summarised in Table 1.
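The HC-SR04 reports distance by timing an ultrasonic echo: the controller raises the trigger pin for roughly 10 µs and measures the width of the returning echo pulse. The paper does not publish its firmware, but a minimal Python sketch of this trigger logic on the Raspberry Pi, using the RPi.GPIO library, might look as follows; the BCM pin numbers and the 15 cm detection threshold are illustrative assumptions, not values from the study.

```python
import time
import RPi.GPIO as GPIO

TRIG_PIN = 23  # assumed BCM pin numbers; the actual wiring follows Figure 3
ECHO_PIN = 24

GPIO.setmode(GPIO.BCM)
GPIO.setup(TRIG_PIN, GPIO.OUT)
GPIO.setup(ECHO_PIN, GPIO.IN)

def read_distance_cm() -> float:
    """Fire a ~10 microsecond trigger pulse and time the echo to estimate distance."""
    GPIO.output(TRIG_PIN, GPIO.HIGH)
    time.sleep(10e-6)
    GPIO.output(TRIG_PIN, GPIO.LOW)

    pulse_start = pulse_end = time.time()
    while GPIO.input(ECHO_PIN) == 0:   # wait for the echo pulse to begin
        pulse_start = time.time()
    while GPIO.input(ECHO_PIN) == 1:   # measure how long the echo stays high
        pulse_end = time.time()

    # Sound travels ~34,300 cm/s; halve the product for the round trip.
    return (pulse_end - pulse_start) * 34300 / 2

# Example trigger: begin the capture-and-classify cycle when an item
# enters the chute, i.e. when the measured distance drops sharply.
if read_distance_cm() < 15:  # detection threshold in cm is an assumption
    print("Waste detected - begin image capture")
```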

Table 1. Specifications of the major hardware components used in the system

| Component Type | Specification/Model | Function |
|---|---|---|
| Embedded controller | Raspberry Pi 4 Model B (4 GB RAM) | System control and image classification |
| Camera module | Raspberry Pi camera (8 MP, CSI interface) | Image acquisition |
| Sensor | Ultrasonic sensor (HC-SR04) | Waste detection and trigger |
| Motors | NEMA 17 stepper motor; MG995 & SG90 servo motors | Bin rotation and lid/chute actuation |
| Drivers & power | TMC2206 stepper driver; 12 V DC (motors); 5 V 3 A (controller) | Motion control and power regulation |
| Supporting hardware | Prototype board, buck converter, connectors, custom frame with five bins | Mechanical and electrical integration |

For actuation, a NEMA-17 stepper motor driven by a TMC2206 stepper motor driver was used to rotate the circular base, which housed the five compartments and allowed the appropriate compartment to be aligned under the chute. An MG995 servo motor controlled the main lid, while an SG90 servo actuated the opening and closing of the chute. The motors were powered by a regulated 12 V direct-current supply (8.5 A), whereas the Raspberry Pi and sensors operated at 5 V. All components were fitted on a fabricated frame whose dimensions, shown in Figure 2, approximate those of a standard hospital waste bin. The electrical wiring of the system is shown in Figure 3.

Figure 2. Dimensions of the assembled prototype
Figure 3. Circuit schematic showing wiring between the Raspberry Pi, driver boards, sensors, and actuators
2.3 Model Development
2.3.1 Data collection

The dataset contained a mixture of custom images and image data obtained from an open-source directory (Pitakaso et al., 2023). Of the 6,868 images used in this study, 5,393 were obtained from Kaggle after removing low-quality samples (e.g., blurred or cut-off images), while an additional 1,475 images were manually captured using the Raspberry Pi camera module, with the different waste categories placed on a variety of surfaces.

The dataset comprised hospital waste images that were organised into the desired five categories: infectious waste, sharps, pharmaceutical waste, recyclable waste, and general waste, as shown in Table 2. The combined dataset included images captured under different lighting conditions and backgrounds, allowing the model to learn more realistic variations. A total of 6,868 images were collected at different angles to approximate real hospital scenarios. Representative sample images from each class are presented in Figure 4.

Table 2. Waste categories used in model training

| Category | Color Code | Examples |
|---|---|---|
| Infectious waste | Yellow | Blood-soaked dressings and lab protective gear |
| Sharps | Red | Needles, scalpels, and broken glass |
| Pharmaceutical waste | Blue | Antibiotics and residual drugs |
| General waste | Black | Food scraps, paper, and packaging |
| Recyclable waste | White/transparent | Plastic bottles and glass containers |

Figure 4. Sample images used in model training
2.3.2 Data pre-processing

To prepare the images for training, each was resized to 224 × 224 pixels, and pixel intensities were normalised to the range [0, 1]. This resolution was selected after preliminary experiments with alternative sizes (160 × 160 and 448 × 448), which resulted in lower accuracy and increased computational cost, respectively. The 224 × 224 input size therefore provided the best balance between performance and efficiency.

To improve generalisation, data augmentation was applied during training, including horizontal and vertical flips, random rotations, and brightness adjustments. Because the prototype includes an internal light source, brightness was relatively stable during real-time use, but augmentation was still applied to improve robustness. Grayscale experiments were also conducted but resulted in poorer performance, indicating the importance of colour features for classification. The dataset was then divided into training, validation, and test subsets, as shown in Table 3.
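The training code is not included in the paper; as a rough sketch of the pipeline just described (224 × 224 resizing, [0, 1] scaling, flips, rotations, and brightness jitter), a Keras ImageDataGenerator could be configured as below. The directory layout, rotation range, and brightness bounds are assumptions for illustration.

```python
import tensorflow as tf

IMG_SIZE = (224, 224)
BATCH_SIZE = 32

# Rescaling maps 8-bit pixel values into [0, 1]; the remaining arguments
# implement the augmentations described above.
datagen = tf.keras.preprocessing.image.ImageDataGenerator(
    rescale=1.0 / 255,
    horizontal_flip=True,         # sideways flips
    vertical_flip=True,           # upside-down flips
    rotation_range=30,            # random rotation in degrees (assumed range)
    brightness_range=(0.8, 1.2),  # brightness jitter (assumed bounds)
)

# Assumed layout: dataset/train/<class_name>/*.jpg, one folder per waste category.
train_gen = datagen.flow_from_directory(
    "dataset/train",
    target_size=IMG_SIZE,
    batch_size=BATCH_SIZE,
    class_mode="categorical",     # five one-hot-encoded waste categories
)
val_gen = tf.keras.preprocessing.image.ImageDataGenerator(
    rescale=1.0 / 255             # validation data is only rescaled, not augmented
).flow_from_directory(
    "dataset/val",
    target_size=IMG_SIZE,
    batch_size=BATCH_SIZE,
    class_mode="categorical",
)
```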

Table 3. Training, validation, and test dataset split across waste categories

| Color Code | Waste Category | Train Images | Validation Images | Test Images |
|---|---|---|---|---|
| Black | General waste | 1,050 | 268 | 331 |
| Blue | Pharmaceutical waste | 156 | 39 | 87 |
| Red | Sharps | 1,423 | 356 | 446 |
| White | Recyclable waste | 357 | 89 | 112 |
| Yellow | Infectious waste | 1,373 | 343 | 438 |
| Total | - | 4,359 | 1,095 | 1,414 |

2.3.3 Model architecture and deployment

A MobileNetV2 CNN was used due to its lightweight depthwise separable convolutions and suitability for embedded deployment. The architecture comprised an input layer (224 × 224 × 3), a sequence of inverted residual blocks, and a global average pooling layer connected to a fully connected softmax classifier for five output categories. Transfer learning was applied using a MobileNetV2 model pretrained on the ImageNet dataset, with the final classification layer replaced to predict the five target waste classes. Several training configurations were explored during development, including adjustments to learning rate, batch size, and training duration, in order to achieve stable convergence and optimal performance on the dataset. To improve generalisation and reduce overfitting, a Dropout layer with a rate of 0.3 was incorporated within the classifier head. The model was trained using the Adam optimizer (learning rate = 0.0001) over 50 epochs with a batch size of 32, employing categorical cross-entropy as the loss function. Model performance was continuously monitored using validation accuracy and loss, and the learning curves showed consistent convergence without signs of severe overfitting, indicating that the selected configuration was appropriate for the task. After training, the model was converted to TensorFlow Lite format for deployment on a Raspberry Pi 4, where real-time inference tests were conducted to evaluate classification accuracy and latency under operational conditions.
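Based on the configuration reported above (ImageNet-pretrained MobileNetV2 backbone, global average pooling, Dropout at 0.3, a five-way softmax head, Adam at 1e-4, 50 epochs, batch size 32, categorical cross-entropy), a Keras reconstruction would look roughly as follows. Whether the backbone was frozen during fine-tuning is not stated in the paper, so that line is an assumption.

```python
import tensorflow as tf

NUM_CLASSES = 5

# ImageNet-pretrained MobileNetV2 backbone without its original classifier.
base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet"
)
base.trainable = False  # assumption: backbone frozen while the new head trains

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.3),  # rate stated in the text
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),
    loss="categorical_crossentropy",
    metrics=["accuracy"],
)

# train_gen / val_gen as built in the pre-processing sketch (Section 2.3.2).
# model.fit(train_gen, validation_data=val_gen, epochs=50)

# Post-training conversion to TensorFlow Lite for the Raspberry Pi 4.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()
with open("waste_classifier.tflite", "wb") as f:
    f.write(tflite_model)
```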

2.4 Sorting Procedure

The sorting mechanism automatically directs each classified waste item into its designated compartment, eliminating manual handling. When the ultrasonic sensor detects waste at the input chute, the camera captures an image that is processed by the embedded MobileNetV2 model for real-time classification. Based on the predicted category, the control algorithm drives a NEMA-17 stepper motor (via a TMC2206 controller) to rotate the circular bin to the appropriate position. Servo motors then open the lid and release the item into the selected compartment before resetting to their default states for the next cycle, as depicted in Figure 5 below, ensuring fast and reliable operation across all waste types.

Figure 5. Flowchart of the sorting procedure showing detection, classification, rotation, and deposition steps
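To make the detection, classification, rotation, and deposition sequence concrete, the sketch below ties the earlier pieces into one cycle, using the tflite_runtime interpreter for inference and OpenCV for capture. The class ordering, the step counts per bin, and the rotate_bin_to/open_and_close_chute motor helpers are hypothetical placeholders standing in for the actual firmware, not the authors' code.

```python
import cv2
import numpy as np
from tflite_runtime.interpreter import Interpreter

# Assumed class order (must match the training generator's class indices).
CLASSES = ["general", "infectious", "pharmaceutical", "recyclable", "sharps"]
# Hypothetical stepper positions (in steps from home) for each bin.
BIN_STEPS = {"general": 0, "infectious": 40, "pharmaceutical": 80,
             "recyclable": 120, "sharps": 160}

interpreter = Interpreter(model_path="waste_classifier.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

def classify(frame: np.ndarray) -> str:
    """Convert to RGB, resize, scale to [0, 1], and run one TFLite inference."""
    img = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)  # OpenCV captures in BGR
    img = cv2.resize(img, (224, 224)).astype(np.float32) / 255.0
    interpreter.set_tensor(inp["index"], img[np.newaxis, ...])
    interpreter.invoke()
    probs = interpreter.get_tensor(out["index"])[0]
    return CLASSES[int(np.argmax(probs))]

def sorting_cycle(camera: cv2.VideoCapture) -> None:
    """One full cycle: detect, capture, classify, rotate, deposit, reset."""
    if read_distance_cm() < 15:             # ultrasonic trigger (earlier sketch)
        ok, frame = camera.read()
        if not ok:
            return
        category = classify(frame)
        rotate_bin_to(BIN_STEPS[category])  # hypothetical NEMA-17 helper
        open_and_close_chute()              # hypothetical servo helper
        rotate_bin_to(0)                    # return the carousel to home
```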
2.5 Experimental Setup

The performance of the hospital-specific automated waste segregation system was evaluated under controlled laboratory conditions simulating a hospital disposal environment. Five waste categories, namely infectious waste, sharps, pharmaceutical waste, recyclable waste, and general waste, were used for testing. The camera was positioned approximately 11 cm from the chute platform, with consistent bright illumination provided by the system's internal light source. The background surface was wood-brown in colour. Waste items were presented at different orientations, although object size was limited by the chute dimensions, with the largest items approximately the size of an IV fluid bottle.

Each item was presented individually at the input chute. Upon detection, the system captured an image, classified the waste using the embedded MobileNetV2 model and directed it into the appropriate compartment using the motorised sorting mechanism. For each trial, classification output, inference time and total cycle time from detection to deposition were recorded. The experiment was repeated 20 times per category to obtain average values. Performance metrics included overall accuracy and per-class accuracy. Timing metrics measured inference time per image (model latency) and total sorting time per item (mechanical response + inference).
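The per-category protocol above (20 single-item trials per class, recording the classification output and timing) reduces to a simple tally. A sketch of how the reported accuracy and cycle-time averages could be computed is given below; evaluate_category is a hypothetical harness around the real rig, not code from the study.

```python
import time
import numpy as np

def evaluate_category(camera, expected: str, trials: int = 20):
    """Run repeated single-item trials for one category.

    Returns (per-class accuracy, mean total cycle time in seconds).
    """
    correct, cycle_times = 0, []
    for _ in range(trials):
        start = time.time()
        ok, frame = camera.read()
        predicted = classify(frame)  # from the control-loop sketch above
        # ... mechanical rotation and deposition happen here on the real rig ...
        cycle_times.append(time.time() - start)
        correct += int(predicted == expected)
    return correct / trials, float(np.mean(cycle_times))

# Overall accuracy is the total correct over all 100 trials,
# e.g. 97/100 = 97.0% as reported in Table 4.
```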

3. Experimental Results

3.1 Dataset Distribution and Training Performance

The dataset consisted of 6,868 labelled images spanning five distinct hospital waste categories. The allocation of images across training, validation, and test sets is detailed in Table 3. The MobileNetV2 model was trained over 50 epochs with a batch size of 32. Both training and validation accuracy exhibited a consistent upward trend, ultimately converging at 98.12%, with negligible signs of overfitting, while the validation loss remained low and stable throughout training, suggesting strong generalisation capability. The learning curves (Figure 6) confirmed steady improvement without evidence of severe overfitting. On the test set, the model achieved an overall classification accuracy of 98.12%. The confusion matrix of predicted versus actual categories is presented in Figure 7, illustrating the model's performance for each class.

Figure 6. Training and validation accuracy and loss curves of the MobileNetV2 model
Figure 7. Confusion matrix of classification results for the five waste categories
3.2 Real-Time Sorting Performance and Functional Testing

The embedded system implemented on the Raspberry Pi 4B achieved an average total cycle time (detection to deposition) of 8 to 12 seconds per item, with slight, unsystematic variation across waste categories. The system maintained consistent performance across repeated cycles, with no missed steps or actuator failures observed over 20 consecutive operations. A functional test confirmed the correct operation of the sensing, classification, and sorting subsystems under repeated use, with real-time testing performance shown in Table 4. Table 5 presents a comparative analysis of automated medical waste classification systems.

Table 4. Real-time test results

| Waste Category | Samples Tested | Correctly Classified | Accuracy (%) |
|---|---|---|---|
| General waste | 20 | 20 | 100.0 |
| Pharmaceutical waste | 20 | 17 | 85.0 |
| Sharps | 20 | 20 | 100.0 |
| Recyclable waste | 20 | 20 | 100.0 |
| Infectious waste | 20 | 20 | 100.0 |
| Overall | 100 | 97 | 97.0 |

Table 5. Comparative analysis of automated medical waste classification systems

| Study | Waste Categories | Approach | Platform | Accuracy | Real-Time Sorting | Deployment Practicality | Key Limitations |
|---|---|---|---|---|---|---|---|
| This study | 5 categories (WHO-aligned: infectious waste, sharps, pharmaceutical waste, recyclable waste, general waste) | MobileNetV2 + mechanical sorter | Raspberry Pi 4B + camera + motors | 97.0% (real-time test) | Yes (8–12 s full cycle) | High (low-cost, embedded, point-of-disposal sorting) | Cannot process multiple items simultaneously; struggles with transparent objects |
| Zhou et al. (2022) | 8 categories (e.g., syringes, gloves, gauze, and bottles) | ResNeXt + transfer learning | PC-based (offline) | 97.2% | No | Medium (classification only, no deployment) | No real-time use, no physical sorting |
| Kunwar & Rai (2025) | Multiple (aligned with WHO color codes) | YOLOv5-s, EfficientNet-B0, etc. | Smartphones/edge devices | 95.06% (best model) | Partial (software-level only) | Medium (no mechanical system) | No automated sorting mechanism |
| Gan et al. (2024) | 3 categories | GoogLeNet CNN | Embedded system | 99.34% | No | Low–medium | Limited categories; no sorting |
| Hermawan et al. (2023) | Hazardous waste only | YOLOv5 | Raspberry Pi | 85–96% | Partial (detection only) | Medium | Focused only on detection |
| Nafiz et al. (2023) | Domestic waste | CNN + conveyor system | Microcontroller + motors | 98% | Yes | Medium | Not designed for medical waste |

4. Discussion

The results of this study demonstrate that a hospital-specific automated waste segregation system, powered by a lightweight CNN, can achieve high classification accuracy and reliable real-time sorting performance on an embedded platform. The MobileNetV2 model achieved an overall classification accuracy of 97%, with a full sorting cycle time ranging between 8 and 12 seconds per item. These results compare favourably with previously reported systems focusing on municipal or household waste (Jouhara et al., 2017). By expanding the classification scope to five hospital waste categories and enabling near-real-time operation on a Raspberry Pi, the system addresses key limitations of earlier approaches, which often supported fewer classes or relied on cloud-based processing. For instance, Zhou et al. (2022) reported 97.2% accuracy using a ResNeXt model on eight waste categories, but their system was limited to offline classification without mechanical sorting or real-time deployment. Similarly, Kunwar & Rai (2025) achieved 95.06% accuracy using YOLOv5-s while aligning classification with WHO colour-coding standards, yet their system lacked physical sorting and broader contextual diversity. Other approaches proposed by Gan et al. (2024) and Hermawan et al. (2023) demonstrated high accuracy on small datasets or lightweight platforms, but often omitted pharmaceutical or recyclable waste categories, and their systems focused on detection rather than real-time sorting. In comparison, the system proposed in this study combines high-accuracy, near-real-time operation on a Raspberry Pi with direct mechanical sorting, addressing the practical limitations of previous methods.

Functional testing showed consistent performance across repeated trials, confirming the system’s suitability for deployment directly at the point of disposal in hospital settings. The absence of actuator failures or missed steps across 20 consecutive operations also indicates that the mechanical design is sufficiently robust for continuous use. When compared to manual segregation, which is often inconsistent, slow, and prone to human error, the system clearly shows efficiency benefits. With a sorting cycle of 8 to 12 seconds per item, it can process around 300 to 450 items per hour, which would normally require several human operators. Automated sorting also lowers the risk of staff being exposed to infectious or hazardous waste, which can reduce safety-related incidents and associated costs. While the initial cost of the hardware is higher than using simple manual bins, the savings from reduced labour, fewer misclassifications, and easy maintenance with features like the enclosed camera and nylon-lined bins make the system more cost-effective over time. Altogether, these points show that automated segregation is both practical and economically beneficial for hospitals.

Practical usability was also considered: the camera module is enclosed to reduce dust exposure, lens cleaning is minimal, and each waste bin is lined with nylon to facilitate easy waste removal and prevent waste from sticking to the container walls. These design choices enhance operational convenience and ensure that the system remains effective in busy hospital environments.

A notable outcome is the model’s ability to maintain high classification accuracy across all categories, despite variations in lighting conditions and object orientation introduced during testing. This highlights the effectiveness of the data augmentation strategies used during training and shows that the model generalises reasonably well. However, occasional misclassifications were observed in the pharmaceutical waste (blue) category. In particular, single-colour tablets were sometimes mistaken for general waste, while some drug containers were confused with recyclable items due to their similar visual appearance. This suggests that visual similarity, rather than model instability, was the main cause of these errors. These observations provide useful direction for further improvement, especially through the collection of more diverse pharmaceutical samples. Future work will focus on expanding the dataset to include a broader diversity of waste types and increasing the representation of underperforming classes, such as pharmaceutical (blue) waste. The integration of additional sensors, such as near-infrared or weight-based sensors, may further improve discrimination between visually similar materials. Design improvements will also target multi-object handling and the incorporation of battery backup to enhance robustness during power interruptions. At the system level, a lightweight monitoring dashboard could be developed to support data logging and deployment management across multiple hospital units. These enhancements would further improve scalability and practical adoption across healthcare facilities.

Despite its effectiveness, the proposed system has several limitations. Transparent materials such as glass and clear plastic packaging can reduce classification reliability due to visual ambiguity. The system is designed to process one waste item at a time; therefore, classification accuracy decreases when multiple objects are deposited simultaneously, which remains a key limitation of the current chute-based design. In addition, the system does not include backup power support and becomes non-operational during power outages.

5. Conclusion

This study successfully developed and evaluated a hospital-specific automated waste segregation system integrating a Raspberry Pi 4B, camera module, stepper motor, servo motors, and an ultrasonic sensor with a MobileNetV2-based image classification model. The prototype was designed to detect incoming waste at the point of disposal, classify it into five standardised hospital waste categories (infectious waste, sharps, pharmaceutical waste, general non-hazardous waste, and recyclable waste), and direct each item into the appropriate compartment using a motorised sorting mechanism. The primary objectives were to design and build a multi-compartment waste bin tailored to hospital requirements, to integrate a lightweight deep learning model capable of high-accuracy classification on embedded hardware, and to test the system's performance under realistic conditions. These objectives were achieved through the combined development of hardware and software modules, resulting in a functional system capable of real-time classification and sorting directly at the source of waste generation.

By focusing specifically on hospital waste and by employing a Raspberry Pi instead of industrial Programmable Logic Controller-based solutions, the system addresses gaps in existing research. It demonstrates that cost-effective, embedded AI solutions can handle multiple waste categories with high precision. With a classification accuracy of 97% and an average sorting time between 8 and 12 seconds per item, the system offers a practical alternative to conventional manual segregation, which is often inconsistent, error-prone and hazardous to waste handlers. Importantly, the system is designed to be extensible to additional waste categories. Scaling can be achieved through targeted dataset expansion to include new waste types, coupled with transfer learning or fine-tuning of the MobileNetV2 model on these augmented datasets. Additional data augmentation strategies and incremental learning approaches could further ensure high classification accuracy as new categories are added without retraining the model from scratch. Overall, the findings confirm the feasibility of deploying lightweight deep learning models for real-time hospital waste classification. These implementation strategies provide a clear technical path for scaling the system to larger facilities, more complex waste streams, or integration with broader hospital waste management infrastructures.

Author Contributions

Conceptualisation, methodology, writing–original draft, and data curation, S.A.Atanda, S.O.A., S.A.Ajagbe and A.K.A.; project administration, resources, methodology, validation, writing–original draft, visualisation, editing and software, A.K.A. and S.A.Ajagbe; editing, review and supervision, A.K.A. and O.K.A. All authors have read and agreed to the published version of the manuscript.

Data Availability

The data used to support the research findings are available from the corresponding author upon request.

Acknowledgments

The authors acknowledge the Department of Electrical and Biomedical Engineering, Abiola Ajimobi Technical University, Ibadan, Nigeria, and the Department of Computer Science, University of Zululand, 3886 Kwadlangezwa, KZN, South Africa.

Conflicts of Interest

The authors declare no conflict of interest.

References
Abdel-Shafy, H. I. & Mansour, M. S. (2018). Solid waste issue: Sources, composition, disposal, recycling, and valorization. Egypt. J. Pet., 27(4), 1275–1290. [Google Scholar] [Crossref]
Ahmed, Z. & Shanto, S. S. (2024). Performance analysis of YOLO architectures for surgical waste detection in post-COVID-19 medical waste management. Malays. J. Sci. Adv. Technol., 4(1), 1–9. [Google Scholar] [Crossref]
Ajagbe, S. A., Mudali, P., & Adigun, M. O. (2024). An empirical assessment of discriminative deep learning models for multiclassification of COVID-19 X-rays. In Joint proceedings of the ICAI 2024 workshops, including the 6th International Workshop on Data Engineering and Analytics (WDEA 2024) (pp. 150–164). CEUR Workshop Proceedings. https://ceur-ws.org/Vol-3795/icaiw_wdea_2.pdf [Google Scholar]
Akinlade, O., Vakaj, E., Dridi, A., Tiwari, S., & Ortiz-Rodriguez, F. (2022). Semantic segmentation of the lung to examine the effect of COVID-19 using UNET model. In International Conference on Applied Machine Learning and Data Analytics (pp. 52–63). Springer, Cham. [Google Scholar] [Crossref]
Alowais, S. A., Alghamdi, S. S., Alsuhebany, N., Alqahtani, T., Alshaya, A. I., Almohareb, S. N., Aldairem, A., Alrashed, M., Bin Saleh, K., & Badreldin, H. A. et al. (2023). Revolutionizing healthcare: The role of artificial intelligence in clinical practice. BMC Med. Educ., 23(1), 689. [Google Scholar] [Crossref]
Balogun, E. O., Ajagbe, S. A., Adeniyi, A. E., Olayinka, T. O., Adeogun, E. A., Taiwo, G. A., Esegbona-Isikeh, O. M., & Mudali, P. (2025). Gender prediction using real-time convolutional neural network. Procedia Comput. Sci., 258, 497–506. [Google Scholar] [Crossref]
Bruno, A., Caudai, C., Leone, G. R., Martinelli, M., Moroni, D., & Crotti, F. (2023). Medical waste sorting: A computer vision approach for assisted primary sorting. In 2023 IEEE International Conference on Acoustics, Speech, and Signal Processing Workshops (ICASSPW) (pp. 1–5). IEEE. [Google Scholar] [Crossref]
Cuarto, R. D., Baterna, A. R., Bulalacao, J. K. Q., Cuajao, P. H. M., Casco, M. T. A., Portento, R. J. T., Juarizo, C. G., Garcia, T. S., & Rubi, R. V. C. (2023). Development of microcontroller-based automated infectious waste segregation and disinfection system: A COVID-19 mitigation and monitoring response. Eng. Proc., 56(1), 139. [Google Scholar] [Crossref]
Flores, M. G. & Tan, J. (2019). Literature review of automated waste segregation system using machine learning: A comprehensive analysis. Int. J. Simul.: Syst. Sci. Technol., 11. [Google Scholar] [Crossref]
Gan, Y. S., Liu, Y. Z., Tseng, B. C., Liong, G. B., & Liong, S. T. (2024). Lightweight deep learning algorithm for sorting medical waste embedded system. J. Internet Technol., 25(5), 683–700. [Google Scholar] [Crossref]
Hermawan, I., Mardiyono, A., Iswara, R. W., Murad, F. A., Ardiawan, M. A., & Puspita, R. (2023). Development of Covid medical waste object classification system using YOLOv5 on Raspberry Pi. In 2023 10th International Conference on Information Technology, Computer, and Electrical Engineering (ICITACEE) (pp. 443–447). IEEE. [Google Scholar] [Crossref]
Huang, L., Li, M., Xu, T., & Dong, S. Q. (2023). A waste classification method based on a capsule network. Environ. Sci. Pollut. Res., 30(36), 86454–86462. [Google Scholar] [Crossref]
Ibrahim, M., Kebede, M., & Mengiste, B. (2023). Healthcare waste segregation practice and associated factors among healthcare professionals working in public and private hospitals, Dire Dawa, Eastern Ethiopia. J. Environ. Public Health, 2023(1), 8015856. [Google Scholar] [Crossref]
Jouhara, H., Czajczyńska, D., Ghazal, H., Krzyżyńska, R., Anguilano, L., Reynolds, A. J., & Spencer, N. (2017). Municipal waste management systems for domestic use. Energy, 139, 485–506. [Google Scholar] [Crossref]
Kabilan, B., Sairam, R., & Praveen, M. (2024). Enhanced CNN architecture for accurate waste classification in smart cities. In 2024 International Conference on IoT Based Control Networks and Intelligent Systems (ICICNIS) (pp. 538–543). IEEE. [Google Scholar] [Crossref]
Kunwar, S. & Rai, P. (2025). Health care waste classification using deep learning aligned with Nepal’s Bin Color Guideline. arXiv Preprint arXiv:2508.07450. [Google Scholar] [Crossref]
Miamiliotis, A. S. & Talias, M. A. (2023). Healthcare workers’ knowledge about the segregation process of infectious medical waste management in a hospital. Healthcare, 12(1), 94. [Google Scholar] [Crossref]
Nafiz, M. S., Das, S. S., Morol, M. K., Al Juabir, A., & Nandi, D. (2023). Convowaste: An automatic waste segregation machine using deep learning. In 2023 3rd International Conference on Robotics, Electrical and Signal Processing Techniques (ICREST) (pp. 181–186). IEEE. [Google Scholar] [Crossref]
Nwachukwu, N. C., Orji, F. A., & Ugbogu, O. C. (2013). Health care waste management—Public health benefits, and the need for effective environmental regulatory surveillance in federal Republic of Nigeria. In Current Topics in Public Health (pp. 149–178). IntechOpen. [Google Scholar] [Crossref]
Olukanni, D. O., Lazarus, J. D., & Fagbenle, E. (2022). Healthcare waste management practices in Nigeria: A review. In Health Care Waste Management and COVID 19 Pandemic (pp. 197–218). Springer. [Google Scholar] [Crossref]
Pitakaso, R., Khonjun, S., Srichok, T., Sethanan, K., Nanthasamroeng, N., Boonmee, C., Gonwirat, S., & Luesak, P. (2023). Pharmaceutical and Biomedical Waste. Kaggle. [Google Scholar] [Crossref]
Pučnik, R., Dokl, M., Fan, Y. V., Vujanović, A., Novak Pintarič, Z., Aviso, K. B., Tan, R. R., Pahor, B., Kravanja, Z., & Čuček, L. (2024). A waste separation system based on sensor technology and deep learning: A simple approach applied to a case study of plastic packaging waste. J. Clean. Prod., 450, 141762. [Google Scholar] [Crossref]
Taiwo, G., Vadera, S., & Alameer, A. (2025). Vision transformers for automated detection of pig interactions in groups. Smart Agric. Technol., 10, 100774. [Google Scholar] [Crossref]
Wulandari, E. W. V. (2020). Automated trash sorting design based microcontroller arduino mega 2560 with LCD display and sound notification. IOP Conf. Ser.: Mater. Sci. Eng., 725(1), 012054. [Google Scholar] [Crossref]
Zhou, H., Yu, X., Alhaskawi, A., Dong, Y., Wang, Z., Jin, Q., Hu, X., Liu, Z., Kota, V. G., & Abdulla, M. H. A. H. et al. (2022). A deep learning approach for medical waste classification. Sci. Rep., 12(1), 2159. [Google Scholar] [Crossref]

Cite this:
Atanda, S. A., Abdulrasaq, S. O., Akinde, O. K., Ajagbe, S. A., & Aworinde, A. K. (2025). Hospital-Specific Automated Waste Segregation for High-Accuracy Real-Time Classification. Healthcraft Front., 3(3), 128-138. https://doi.org/10.56578/hf030302
©2025 by the author(s). Published by Acadlore Publishing Services Limited, Hong Kong. This article is available for free download and can be reused and cited, provided that the original published version is credited, under the CC BY 4.0 license.