{"id":1270,"date":"2025-06-30T10:45:16","date_gmt":"2025-06-30T03:45:16","guid":{"rendered":"https:\/\/filkom.ub.ac.id\/project\/?p=1270"},"modified":"2025-06-30T10:45:16","modified_gmt":"2025-06-30T03:45:16","slug":"sistem-otomasi-quality-control-pada-kemasan-produk-dengan-embedded-ai","status":"publish","type":"post","link":"https:\/\/filkom.ub.ac.id\/project\/2025\/06\/sistem-otomasi-quality-control-pada-kemasan-produk-dengan-embedded-ai\/","title":{"rendered":"Sistem Otomasi Quality Control pada Kemasan Produk dengan Embedded AI"},"content":{"rendered":"<h3><b>Sistem Otomasi Quality Control pada Kemasan Produk dengan Embedded AI<\/b><\/h3>\n<p><img fetchpriority=\"high\" decoding=\"async\" class=\"alignnone size-full wp-image-4069\" src=\"https:\/\/filkom.ub.ac.id\/project\/wp-content\/uploads\/sites\/3\/2025\/06\/IMG20250529123436-2-min.jpg\" alt=\"\" width=\"8160\" height=\"6144\" srcset=\"https:\/\/filkom.ub.ac.id\/project\/wp-content\/uploads\/sites\/3\/2025\/06\/IMG20250529123436-2-min.jpg 8160w, https:\/\/filkom.ub.ac.id\/project\/wp-content\/uploads\/sites\/3\/2025\/06\/IMG20250529123436-2-min-300x226.jpg 300w, https:\/\/filkom.ub.ac.id\/project\/wp-content\/uploads\/sites\/3\/2025\/06\/IMG20250529123436-2-min-1024x771.jpg 1024w, https:\/\/filkom.ub.ac.id\/project\/wp-content\/uploads\/sites\/3\/2025\/06\/IMG20250529123436-2-min-768x578.jpg 768w\" sizes=\"(max-width: 8160px) 100vw, 8160px\" \/><\/p>\n<hr \/>\n<h3>Project Domain<\/h3>\n<p>This project falls under <strong data-start=\"25\" data-end=\"72\">ESP32-based Embedded AI for Quality Control<\/strong>, focusing on reliably identifying and classifying product packaging conditions (e.g., damaged vs. intact) by integrating computer vision, AI inference, and camera modules using an <strong data-start=\"253\" data-end=\"278\">ESP32 microcontroller<\/strong>.<\/p>\n<hr \/>\n<h3>Meet Our Team<\/h3>\n<ul>\n<li>MUHAMMAD DZAKA AUFA F.\u00a0 (225150200111053)<\/li>\n<li>MOHAMMAD AMRIZAL K. 
(225150300111003)<\/li>\n<li>MOUDY LESTARI TULUS WIDODO (225150300111006)<\/li>\n<li>AQNA MUMTAZ `ILMI (225150300111022)<\/li>\n<li>CHANDRA FERDINAN SIHITE (225150300111024)<\/li>\n<\/ul>\n<hr \/>\n<h3>Problem Statements<\/h3>\n<ul>\n<li data-start=\"0\" data-end=\"108\">Packaging damage still occurs <strong>frequently<\/strong> and is difficult to detect <strong>quickly<\/strong> and <strong>consistently<\/strong>.<\/li>\n<li data-start=\"0\" data-end=\"108\">Manual quality control processes are often <strong>inefficient<\/strong>, prone to <strong>human error<\/strong>, and <strong>unreliable<\/strong> at large scale.<\/li>\n<\/ul>\n<hr \/>\n<h3>Goals<\/h3>\n<ul>\n<li data-start=\"582\" data-end=\"700\">\n<p data-start=\"585\" data-end=\"700\"><strong data-start=\"587\" data-end=\"620\">Develop an on-device AI model<\/strong> to classify product packaging as either damaged or intact with high accuracy.<\/p>\n<\/li>\n<li data-start=\"701\" data-end=\"808\">\n<p data-start=\"704\" data-end=\"808\"><strong data-start=\"706\" data-end=\"757\">Deploy the AI model on an ESP32 microcontroller<\/strong>, enabling real-time inference using the ESP32-CAM.<\/p>\n<\/li>\n<li data-start=\"809\" data-end=\"927\">\n<p data-start=\"812\" data-end=\"927\"><strong data-start=\"814\" data-end=\"852\">Improve efficiency and reliability<\/strong> of the quality control process by reducing human intervention and error.<\/p>\n<\/li>\n<li data-start=\"928\" data-end=\"1028\">\n<p data-start=\"931\" data-end=\"1028\"><strong data-start=\"933\" data-end=\"967\">Create a fault-tolerant system<\/strong> using dual ESP32s for redundancy and improved reliability.<\/p>\n<\/li>\n<li data-start=\"1029\" data-end=\"1144\">\n<p data-start=\"1032\" data-end=\"1144\"><strong data-start=\"1034\" data-end=\"1073\">Enable remote access and monitoring<\/strong>, providing real-time feedback and control over the inspection process.<\/p>\n<\/li>\n<\/ul>\n<hr \/>\n<h3>Solution Statements<\/h3>\n<ul>\n<li 
data-start=\"173\" data-end=\"267\">\ud83d\udcf7 <strong data-start=\"176\" data-end=\"193\">Use ESP32-CAM<\/strong> to capture real-time images of product packaging for visual inspection.<\/li>\n<li data-start=\"271\" data-end=\"433\">\ud83e\uddea <strong data-start=\"274\" data-end=\"304\">Run on-device AI inference<\/strong> using a pre-trained MobileNetV2 model to classify packaging as <strong data-start=\"360\" data-end=\"371\">damaged<\/strong> or <strong data-start=\"375\" data-end=\"385\">intact<\/strong> directly on the ESP32 without cloud dependency.<\/li>\n<li data-start=\"435\" data-end=\"553\">\ud83d\udce1 <strong data-start=\"438\" data-end=\"468\">Integrate ESP32 with Wi-Fi<\/strong> to enable remote monitoring, feedback, and system control via the Blynk IoT platform.<\/li>\n<li data-start=\"557\" data-end=\"701\">\ud83d\udd01 <strong data-start=\"560\" data-end=\"597\">Use a dual ESP32 redundant system<\/strong>, where a backup microcontroller takes over automatically in case of failure, ensuring high reliability.<\/li>\n<li data-start=\"703\" data-end=\"827\">\ud83d\udce6 <strong data-start=\"706\" data-end=\"753\">Trigger actuators (servo)<\/strong> to sort out damaged products from intact ones based on inference results.<\/li>\n<li data-start=\"829\" data-end=\"929\">\ud83d\udcfa <strong data-start=\"832\" data-end=\"860\">Display real-time status<\/strong> and classification results on the Blynk IoT platform.<\/li>\n<\/ul>\n<hr \/>\n<h3 data-start=\"1798\" data-end=\"1848\">\ud83e\uddf0\u00a0<strong data-start=\"1805\" data-end=\"1846\">Prerequisites \u2013 Component Preparation<\/strong><\/h3>\n<h4>Hardware<\/h4>\n<ul>\n<li><strong>Main ESP32<\/strong>: Runs the system logic, sends status data to Blynk, and controls the servo.<\/li>\n<li><strong>Backup ESP32<\/strong>: Captures images of the packaging and performs AI inference.<\/li>\n<li><strong>Conveyor Belt<\/strong>: Moves product packaging past the inspection point.<\/li>\n<li><strong>SG90 Servo Motor<\/strong>: Moves a board that separates boxes detected as defective.<\/li>\n<\/ul>\n<h4>Software<\/h4>\n<ul>\n<li><strong>Edge Impulse<\/strong>: Building, training, and testing AI models.<\/li>\n<li><strong>Blynk<\/strong>: QC status monitoring and real-time notifications.<\/li>\n<li><strong>Arduino IDE<\/strong>: Writing program code and setting up QC logic.<\/li>\n<\/ul>\n<hr \/>\n<h3>Dataset<\/h3>\n<p><img decoding=\"async\" src=\"https:\/\/filkom.ub.ac.id\/project\/wp-content\/uploads\/sites\/3\/2025\/06\/Screenshot-2025-06-18-211911.png\" alt=\"\" width=\"1552\" height=\"755\" \/><\/p>\n<p data-start=\"710\" data-end=\"802\">To train the AI model, a custom image dataset was prepared consisting of two main classes:<\/p>\n<ul data-start=\"803\" data-end=\"887\">\n<li data-start=\"803\" data-end=\"827\">\n<p data-start=\"805\" data-end=\"827\"><strong data-start=\"805\" data-end=\"815\">Intact<\/strong> packaging<\/p>\n<\/li>\n<li data-start=\"828\" data-end=\"887\">\n<p data-start=\"830\" data-end=\"887\"><strong data-start=\"830\" data-end=\"841\">Damaged<\/strong> packaging (e.g., torn, dented, or misaligned)<\/p>\n<\/li>\n<\/ul>\n<p data-start=\"889\" data-end=\"1171\">The dataset included <strong data-start=\"910\" data-end=\"939\">over 100 labeled images<\/strong> per class, captured in varying lighting and background conditions to enhance model generalization. Data augmentation techniques such as rotation, flipping, zoom, and brightness adjustment were applied to increase dataset diversity.<\/p>\n<p data-start=\"1173\" data-end=\"1412\">The model was trained using <strong data-start=\"1201\" data-end=\"1216\">MobileNetV2<\/strong> with transfer learning and quantized to <strong data-start=\"1257\" data-end=\"1281\">int8 TensorFlow Lite<\/strong> format for deployment on the ESP32. 
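<\/p>
<p>As an illustration of the <strong>int8 quantization<\/strong> step just mentioned: TensorFlow Lite maps each float value to an 8-bit integer through a scale and zero point chosen during conversion. The sketch below is plain Python with illustrative parameter values (not the converter's actual internals), showing the affine mapping and its bounded round-trip error:<\/p>

```python
def quantize(x, scale, zero_point):
    """Map a float to int8 using TFLite-style affine quantization."""
    q = round(x / scale) + zero_point
    return max(-128, min(127, q))  # clamp to the int8 range

def dequantize(q, scale, zero_point):
    """Recover an approximation of the original float value."""
    return scale * (q - zero_point)

# Illustrative parameters: scale 0.02 covers roughly [-2.56, 2.54].
scale, zp = 0.02, 0
for x in [-1.0, 0.013, 0.5]:
    err = abs(dequantize(quantize(x, scale, zp), scale, zp) - x)
    assert err <= scale / 2  # error is at most half a quantization step
```

<p>The real scale and zero point are derived from calibration data at conversion time; the takeaway is that int8 storage trades a small, bounded precision loss for a model roughly four times smaller than float32, which is what makes on-ESP32 deployment feasible.<\/p>
<p>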
Performance was evaluated using accuracy, precision, and a confusion matrix on a validation set.<\/p>\n<article dir=\"auto\" data-testid=\"conversation-turn-2\" data-scroll-anchor=\"true\">\n<hr \/>\n<h3>System Workflow<\/h3>\n<article dir=\"auto\" data-testid=\"conversation-turn-2\" data-scroll-anchor=\"true\">\n<ol>\n<li data-start=\"0\" data-end=\"173\"><strong data-start=\"0\" data-end=\"25\">System Initialization<\/strong><br data-start=\"25\" data-end=\"28\" \/>The ESP32-WROOM-32 and ESP32-CAM are powered on. The system displays the status &#8220;Ready&#8221; on the OLED and stands by to start the inference process.<\/li>\n<li data-start=\"175\" data-end=\"287\"><strong data-start=\"175\" data-end=\"192\">Image Capture<\/strong><br data-start=\"192\" data-end=\"195\" \/>When a product passes in front of the camera, the ESP32-CAM automatically captures an image.<\/li>\n<li data-start=\"289\" data-end=\"471\"><strong data-start=\"289\" data-end=\"305\">AI Inference<\/strong><br data-start=\"305\" data-end=\"308\" \/>The captured image is sent to the ESP32-WROOM-32 or processed directly on the ESP32-CAM (if the model is lightweight enough). The AI model classifies the image as:\n<ul>\n<li data-start=\"289\" data-end=\"471\"><strong data-start=\"475\" data-end=\"485\">Intact<\/strong> \u2192 the product is passed through<\/li>\n<li data-start=\"289\" data-end=\"471\"><strong data-start=\"522\" data-end=\"533\">Damaged<\/strong> \u2192 the product is diverted to the reject path<\/li>\n<\/ul>\n<\/li>\n<li data-start=\"289\" data-end=\"471\"><strong data-start=\"580\" data-end=\"610\">Decision Making and Action<\/strong><br data-start=\"610\" data-end=\"613\" \/>Based on the classification result:\n<ul>\n<li data-start=\"289\" data-end=\"471\">A servo or actuator directs the product to the appropriate path (accepted or rejected)<\/li>\n<li data-start=\"289\" data-end=\"471\">The OLED display shows the classification result and the count of passed vs. 
rejected products<\/li>\n<\/ul>\n<\/li>\n<li data-start=\"289\" data-end=\"471\"><strong data-start=\"901\" data-end=\"930\">Monitoring and Redundancy<\/strong><br data-start=\"930\" data-end=\"933\" \/>The system can be remotely monitored via a dashboard (web\/mobile app).<br data-start=\"1003\" data-end=\"1006\" \/>If the main ESP32 fails, the backup ESP32 is activated through a watchdog system and takes over the process.<\/li>\n<\/ol>\n<hr \/>\n<\/article>\n<h3 data-start=\"1173\" data-end=\"1412\">Schematic<\/h3>\n<p><img decoding=\"async\" src=\"https:\/\/filkom.ub.ac.id\/project\/wp-content\/uploads\/sites\/3\/2025\/06\/proses.drawio-1.png\" alt=\"\" width=\"571\" height=\"282\" \/><\/p>\n<p data-start=\"181\" data-end=\"389\">This diagram illustrates the flow of data and control in the proposed quality control system using ESP32 microcontrollers. The system is divided into three main stages: <strong data-start=\"350\" data-end=\"359\">Input<\/strong>, <strong data-start=\"361\" data-end=\"372\">Process<\/strong>, and <strong data-start=\"378\" data-end=\"388\">Output<\/strong>.<\/p>\n<ol>\n<li data-start=\"393\" data-end=\"625\"><strong data-start=\"393\" data-end=\"408\">Input Stage<\/strong>:<br data-start=\"409\" data-end=\"412\" \/>The ESP32-CAM module functions as both a <strong data-start=\"455\" data-end=\"465\">sensor<\/strong> and <strong data-start=\"470\" data-end=\"493\">AI inference engine<\/strong>, capturing images of product packaging and classifying them as either <em data-start=\"564\" data-end=\"572\">intact<\/em> or <em data-start=\"576\" data-end=\"585\">damaged<\/em> using an embedded neural network model.<\/li>\n<li data-start=\"629\" data-end=\"757\"><strong data-start=\"629\" data-end=\"649\">Processing Stage<\/strong>:<br data-start=\"650\" data-end=\"653\" \/>A secondary ESP32 microcontroller receives the classification results for decision-making. 
It handles:\n<ul>\n<li data-start=\"629\" data-end=\"757\">Communication with the cloud or mobile interface via <strong data-start=\"815\" data-end=\"824\">Wi-Fi<\/strong>.<\/li>\n<li data-start=\"629\" data-end=\"757\">Logic control to determine actuator behavior.<\/li>\n<li data-start=\"629\" data-end=\"757\">Optional redundancy management in case of system faults.<\/li>\n<\/ul>\n<\/li>\n<li data-start=\"940\" data-end=\"978\"><strong data-start=\"940\" data-end=\"956\">Output Stage<\/strong>:<br data-start=\"957\" data-end=\"960\" \/>The results are:\n<ul>\n<li data-start=\"940\" data-end=\"978\">Sent to the <strong data-start=\"995\" data-end=\"1013\">User Interface<\/strong> (e.g., Blynk) for real-time notifications and system monitoring.<\/li>\n<li data-start=\"940\" data-end=\"978\">Used to control <strong data-start=\"1099\" data-end=\"1112\">actuators<\/strong> (e.g., servo motors) to separate defective products automatically.<\/li>\n<\/ul>\n<\/li>\n<\/ol>\n<p data-start=\"1181\" data-end=\"1357\">This architecture ensures efficient, real-time decision-making and system feedback in a compact, low-power embedded setup, suitable for industrial quality control applications.<\/p>\n<hr \/>\n<h3 data-start=\"2554\" data-end=\"2586\">\ud83c\udfac\u00a0<strong data-start=\"2561\" data-end=\"2584\">Demo and Evaluation<\/strong><\/h3>\n<ul data-start=\"2587\" data-end=\"2872\">\n<li data-start=\"2587\" data-end=\"2663\">\n<p data-start=\"2589\" data-end=\"2663\">\ud83d\udee0\ufe0f\u00a0<strong data-start=\"2593\" data-end=\"2602\">Setup<\/strong>: Assemble all hardware components, including the ESP32-CAM, dual ESP32 boards (MAIN &amp; BACKUP), Arduino Nano, L298N motor driver, DC motor for the conveyor belt, and MG996R servo motor. Upload the developed firmware to each microcontroller using the Arduino IDE. 
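<\/p>
<p>The failover behavior between the MAIN and BACKUP boards relies on a heartbeat timeout. A minimal sketch of that logic in plain Python (the actual firmware is C++ built in the Arduino IDE; the class and method names here are hypothetical, and the 3-second window follows the figure reported under Evaluation):<\/p>

```python
import time

class BackupWatchdog:
    """Decides when the backup unit should take over, based on
    heartbeat messages received from the main unit."""

    def __init__(self, timeout=3.0, clock=time.monotonic):
        self.timeout = timeout   # seconds of silence before takeover
        self.clock = clock       # injectable clock, for testing
        self.last_beat = clock()
        self.active = False      # True once the backup is in control

    def heartbeat(self):
        """Record a heartbeat from the main unit; backup stands down."""
        self.last_beat = self.clock()
        self.active = False

    def poll(self):
        """Return True if the backup should currently be in control."""
        if self.clock() - self.last_beat > self.timeout:
            self.active = True
        return self.active
```

<p>On real hardware the heartbeat would arrive over a serial or GPIO link and <code>poll()<\/code> would run inside the main loop; the structure, not the transport, is the point of the sketch.<\/p>
<p>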
Ensure required libraries such as Blynk, Servo, and WiFi are installed.<\/p>\n<p data-start=\"453\" data-end=\"785\">This project operates in a Wi-Fi-based IoT environment, where the ESP32 connects to a network to send real-time notifications to the <strong data-start=\"586\" data-end=\"595\">Blynk<\/strong> application. Additionally, the AI model for defect detection is deployed on the ESP32-CAM via the <strong data-start=\"694\" data-end=\"710\">Edge Impulse<\/strong> platform, enabling local inference (Embedded AI) without cloud processing.<\/p>\n<\/li>\n<\/ul>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone wp-image-4087\" src=\"https:\/\/filkom.ub.ac.id\/project\/wp-content\/uploads\/sites\/3\/2025\/06\/IMG20250529123628-min.jpg\" alt=\"\" width=\"450\" height=\"339\" srcset=\"https:\/\/filkom.ub.ac.id\/project\/wp-content\/uploads\/sites\/3\/2025\/06\/IMG20250529123628-min.jpg 8160w, https:\/\/filkom.ub.ac.id\/project\/wp-content\/uploads\/sites\/3\/2025\/06\/IMG20250529123628-min-300x226.jpg 300w, https:\/\/filkom.ub.ac.id\/project\/wp-content\/uploads\/sites\/3\/2025\/06\/IMG20250529123628-min-1024x771.jpg 1024w, https:\/\/filkom.ub.ac.id\/project\/wp-content\/uploads\/sites\/3\/2025\/06\/IMG20250529123628-min-768x578.jpg 768w\" sizes=\"(max-width: 450px) 100vw, 450px\" \/> <img loading=\"lazy\" decoding=\"async\" class=\"alignnone wp-image-4086\" src=\"https:\/\/filkom.ub.ac.id\/project\/wp-content\/uploads\/sites\/3\/2025\/06\/IMG20250529123617-min.jpg\" alt=\"\" width=\"445\" height=\"335\" srcset=\"https:\/\/filkom.ub.ac.id\/project\/wp-content\/uploads\/sites\/3\/2025\/06\/IMG20250529123617-min.jpg 8160w, https:\/\/filkom.ub.ac.id\/project\/wp-content\/uploads\/sites\/3\/2025\/06\/IMG20250529123617-min-300x226.jpg 300w, https:\/\/filkom.ub.ac.id\/project\/wp-content\/uploads\/sites\/3\/2025\/06\/IMG20250529123617-min-1024x771.jpg 1024w, 
https:\/\/filkom.ub.ac.id\/project\/wp-content\/uploads\/sites\/3\/2025\/06\/IMG20250529123617-min-768x578.jpg 768w\" sizes=\"(max-width: 445px) 100vw, 445px\" \/> <img loading=\"lazy\" decoding=\"async\" class=\"alignnone wp-image-4088\" src=\"https:\/\/filkom.ub.ac.id\/project\/wp-content\/uploads\/sites\/3\/2025\/06\/WhatsApp-Image-2025-05-18-at-21.13.53_77030941-min.jpg\" alt=\"\" width=\"298\" height=\"635\" srcset=\"https:\/\/filkom.ub.ac.id\/project\/wp-content\/uploads\/sites\/3\/2025\/06\/WhatsApp-Image-2025-05-18-at-21.13.53_77030941-min.jpg 540w, https:\/\/filkom.ub.ac.id\/project\/wp-content\/uploads\/sites\/3\/2025\/06\/WhatsApp-Image-2025-05-18-at-21.13.53_77030941-min-141x300.jpg 141w, https:\/\/filkom.ub.ac.id\/project\/wp-content\/uploads\/sites\/3\/2025\/06\/WhatsApp-Image-2025-05-18-at-21.13.53_77030941-min-481x1024.jpg 481w\" sizes=\"(max-width: 298px) 100vw, 298px\" \/> <img loading=\"lazy\" decoding=\"async\" class=\"alignnone wp-image-4085\" src=\"https:\/\/filkom.ub.ac.id\/project\/wp-content\/uploads\/sites\/3\/2025\/06\/IMG20250529123608-min.jpg\" alt=\"\" width=\"479\" height=\"636\" srcset=\"https:\/\/filkom.ub.ac.id\/project\/wp-content\/uploads\/sites\/3\/2025\/06\/IMG20250529123608-min.jpg 6144w, https:\/\/filkom.ub.ac.id\/project\/wp-content\/uploads\/sites\/3\/2025\/06\/IMG20250529123608-min-226x300.jpg 226w, https:\/\/filkom.ub.ac.id\/project\/wp-content\/uploads\/sites\/3\/2025\/06\/IMG20250529123608-min-771x1024.jpg 771w, https:\/\/filkom.ub.ac.id\/project\/wp-content\/uploads\/sites\/3\/2025\/06\/IMG20250529123608-min-768x1020.jpg 768w\" sizes=\"(max-width: 479px) 100vw, 479px\" \/><\/p>\n<ul data-start=\"2587\" data-end=\"2872\">\n<li data-start=\"2664\" data-end=\"2755\">\n<p data-start=\"2666\" data-end=\"2755\">\ud83d\udc15\u00a0<strong data-start=\"2669\" data-end=\"2677\">Demo<\/strong>: This section demonstrates the full operation of the automated inspection system. 
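<\/p>
<p>The pass\/reject step that follows each inference amounts to a threshold on the model's <em>damaged<\/em> score plus the running counters shown to the operator. A sketch in plain Python (the threshold value, servo angles, and names are illustrative assumptions, not taken from the project firmware):<\/p>

```python
DAMAGED_THRESHOLD = 0.6            # assumed confidence cutoff
PASS_ANGLE, REJECT_ANGLE = 0, 90   # assumed servo positions

counts = {"passed": 0, "rejected": 0}

def sort_product(damaged_confidence):
    """Pick the servo angle for one product and update the counters."""
    if damaged_confidence >= DAMAGED_THRESHOLD:
        counts["rejected"] += 1
        return REJECT_ANGLE
    counts["passed"] += 1
    return PASS_ANGLE
```

<p>In the firmware, the returned angle would be written to the servo and the updated counters pushed to Blynk after each product.<\/p>
<p>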
The ESP32-CAM captures real-time images of products on a conveyor belt and classifies them using an embedded AI model. Based on the classification result, the main ESP32 controls a servo motor to sort the product into the appropriate lane \u2014 either \u201cpass\u201d or \u201creject.\u201d All decisions and system status updates are also sent to the Blynk app for real-time remote monitoring.<\/p>\n<\/li>\n<li data-start=\"2664\" data-end=\"2755\">\n<p data-start=\"2666\" data-end=\"2755\">\ud83c\udfa5 <strong data-start=\"621\" data-end=\"655\">Watch the full demo on YouTube<\/strong> to see the system in action: <a href=\"https:\/\/youtu.be\/mm5qb9Eb6Bw\">https:\/\/youtu.be\/mm5qb9Eb6Bw<\/a><\/p>\n<\/li>\n<\/ul>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone wp-image-4106\" src=\"https:\/\/filkom.ub.ac.id\/project\/wp-content\/uploads\/sites\/3\/2025\/06\/WhatsApp-Image-2025-06-02-at-11.54.02_c845a1b6-scaled.jpg\" alt=\"\" width=\"669\" height=\"892\" srcset=\"https:\/\/filkom.ub.ac.id\/project\/wp-content\/uploads\/sites\/3\/2025\/06\/WhatsApp-Image-2025-06-02-at-11.54.02_c845a1b6-scaled.jpg 1920w, https:\/\/filkom.ub.ac.id\/project\/wp-content\/uploads\/sites\/3\/2025\/06\/WhatsApp-Image-2025-06-02-at-11.54.02_c845a1b6-225x300.jpg 225w, https:\/\/filkom.ub.ac.id\/project\/wp-content\/uploads\/sites\/3\/2025\/06\/WhatsApp-Image-2025-06-02-at-11.54.02_c845a1b6-768x1024.jpg 768w, https:\/\/filkom.ub.ac.id\/project\/wp-content\/uploads\/sites\/3\/2025\/06\/WhatsApp-Image-2025-06-02-at-11.54.02_c845a1b6-1152x1536.jpg 1152w, https:\/\/filkom.ub.ac.id\/project\/wp-content\/uploads\/sites\/3\/2025\/06\/WhatsApp-Image-2025-06-02-at-11.54.02_c845a1b6-1536x2048.jpg 1536w\" sizes=\"(max-width: 669px) 100vw, 669px\" \/><\/p>\n<ul data-start=\"2587\" data-end=\"2872\">\n<li data-start=\"2756\" data-end=\"2872\">\n<p data-start=\"2758\" data-end=\"2872\">\ud83d\udd2c\u00a0<strong data-start=\"2761\" data-end=\"2775\">Evaluation<\/strong>: The system was 
evaluated based on three key aspects:<\/p>\n<ul data-start=\"223\" data-end=\"987\">\n<li data-start=\"223\" data-end=\"504\">\n<p data-start=\"225\" data-end=\"504\"><strong data-start=\"225\" data-end=\"247\">Detection Accuracy<\/strong>: The embedded AI model running on the ESP32-CAM was tested using multiple product samples under different lighting and movement conditions. The model achieved reliable classification performance for distinguishing between defective and non-defective items.<\/p>\n<\/li>\n<li data-start=\"506\" data-end=\"751\">\n<p data-start=\"508\" data-end=\"751\"><strong data-start=\"508\" data-end=\"533\">System Responsiveness<\/strong>: Communication between ESP32-CAM, main ESP32, and backup ESP32 was tested for latency and failover handling. The heartbeat mechanism ensured rapid takeover by the backup ESP32 within 3 seconds if the main unit failed.<\/p>\n<\/li>\n<li data-start=\"753\" data-end=\"987\">\n<p data-start=\"755\" data-end=\"987\"><strong data-start=\"755\" data-end=\"774\">App Integration<\/strong>: The integration with the Blynk IoT platform was tested for real-time updates. Notifications about product status and ESP32 availability were successfully pushed, allowing for remote monitoring from a smartphone.<\/p>\n<\/li>\n<\/ul>\n<p data-start=\"989\" data-end=\"1132\">Overall, the system performed reliably under varying conditions, demonstrating robustness in both defect detection and fault-tolerant response.<\/p>\n<\/li>\n<\/ul>\n<hr \/>\n<h3 data-start=\"2879\" data-end=\"2901\">\u2705\u00a0<strong data-start=\"2885\" data-end=\"2899\">Conclusion<\/strong><\/h3>\n<p data-start=\"2902\" data-end=\"3306\">This project delivers a reliable and intelligent solution for automating quality control of product packaging using <strong data-start=\"347\" data-end=\"374\">ESP32-based embedded AI<\/strong>. 
By combining <strong data-start=\"389\" data-end=\"430\">real-time image capture via ESP32-CAM<\/strong>, <strong data-start=\"432\" data-end=\"490\">on-device classification using a lightweight CNN model<\/strong>, and <strong data-start=\"496\" data-end=\"528\">remote monitoring over Wi-Fi<\/strong>, the system reduces reliance on manual inspection and minimizes human error, while also increasing scalability and consistency in industrial environments.<\/p>\n<hr \/>\n<h3 data-start=\"2902\" data-end=\"3306\">Contact Us<\/h3>\n<p><em>Our Git:\u00a0<a href=\"https:\/\/github.com\/ErkaW\/QualityCheckBox.git\">https:\/\/github.com\/ErkaW\/QualityCheckBox.git<\/a><\/em><\/p>\n<\/article>\n","protected":false},"excerpt":{"rendered":"<p>Sistem Otomasi Quality Control pada Kemasan Produk dengan Embedded AI Project Domain This project falls under ESP32-based Embedded AI for Quality Control, focusing on reliably identifying and classifying product packaging conditions (e.g., damaged vs. intact) by integrating computer vision, AI inference, and camera modules using an ESP32 microcontroller. 
Meet Our Team MUHAMMAD DZAKA AUFA F.\u00a0&#8230;<\/p>\n","protected":false},"author":349,"featured_media":4106,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_kad_post_transparent":"default","_kad_post_title":"default","_kad_post_layout":"default","_kad_post_sidebar_id":"","_kad_post_content_style":"default","_kad_post_vertical_padding":"default","_kad_post_feature":"","_kad_post_feature_position":"","_kad_post_header":false,"_kad_post_footer":false,"_kad_post_classname":"","footnotes":""},"categories":[9,1],"tags":[],"class_list":["post-1270","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-artificial-intelligence-of-thing-aiot","category-capstone"],"_links":{"self":[{"href":"https:\/\/filkom.ub.ac.id\/project\/wp-json\/wp\/v2\/posts\/1270","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/filkom.ub.ac.id\/project\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/filkom.ub.ac.id\/project\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/filkom.ub.ac.id\/project\/wp-json\/wp\/v2\/users\/349"}],"replies":[{"embeddable":true,"href":"https:\/\/filkom.ub.ac.id\/project\/wp-json\/wp\/v2\/comments?post=1270"}],"version-history":[{"count":8,"href":"https:\/\/filkom.ub.ac.id\/project\/wp-json\/wp\/v2\/posts\/1270\/revisions"}],"predecessor-version":[{"id":4188,"href":"https:\/\/filkom.ub.ac.id\/project\/wp-json\/wp\/v2\/posts\/1270\/revisions\/4188"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/filkom.ub.ac.id\/project\/wp-json\/wp\/v2\/media\/4106"}],"wp:attachment":[{"href":"https:\/\/filkom.ub.ac.id\/project\/wp-json\/wp\/v2\/media?parent=1270"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/filkom.ub.ac.id\/project\/wp-json\/wp\/v2\/categories?post=1270"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/filkom.ub.ac.id\/project\/wp-json\/wp\/v2\/tags?post=1270"}],"curi
es":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}