MINISTRY OF EDUCATION AND TRAINING
HO CHI MINH CITY UNIVERSITY OF TECHNOLOGY AND EDUCATION
FACULTY FOR HIGH QUALITY TRAINING

GRADUATION PROJECT
COMPUTER ENGINEERING TECHNOLOGY

LECTURER: PHAM VAN KHOA
STUDENTS: DO MINH QUAN, NGUYEN TRAN DUY KHANH

SKL012539

Ho Chi Minh City, January 2024
HO CHI MINH CITY UNIVERSITY OF TECHNOLOGY AND EDUCATION
FACULTY OF INTERNATIONAL EDUCATION
Ho Chi Minh City, January 2024
THE SOCIALIST REPUBLIC OF VIETNAM
Independence - Freedom - Happiness

Ho Chi Minh City, January 6th, 2024
PROJECT ASSIGNMENT

Student name: NGUYEN TRAN DUY KHANH    Student ID: 19119063
Major: COMPUTER ENGINEERING TECHNOLOGY    Class: 19119CLA1
Date of assignment: September 3rd, 2023    Date of submission: January 6th, 2024

1. Project title: TWO-LAYER SECURITY SYSTEM USING FACE RECOGNITION AND TIME ATTENDANCE INTEGRATION
2. Initial materials provided by the advisor: documents such as papers on the facial feature extraction model ArcFace.
3. Content of the project:
 Analyze the challenges of the project and confirm the problem statement.
 Study the technical specifications and theoretical basis of the hardware components.
 Choose the model and summarize the overall system; design the block diagram, flowcharts, and routing tables.
 Pre-process the data (clean and resize the data; design the database schema).
 Configure the system and design the hardware.
 Test, check, debug, evaluate, and adjust the code; write the report.
4. Final products: a check-in/check-out device that reads RFID cards and extracts facial feature vectors, a web application, a final report, and a demo video.
<b>CHAIR OF THE PROGRAM</b>
<i>(Sign with full name)</i>
<i>(Sign with full name)</i>
EVALUATION SHEET OF SUPERVISOR

Ho Chi Minh City, January , 2024

Student name: NGUYEN TRAN DUY KHANH    Student ID: 19119063
Major: COMPUTER ENGINEERING TECHNOLOGY
Project title: TWO-LAYER SECURITY SYSTEM USING FACE RECOGNITION AND TIME ATTENDANCE INTEGRATION
Supervisor: PHAM VAN KHOA, Ph.D.

(Sign with full name)
PRE-DEFENSE EVALUATION SHEET

Ho Chi Minh City, January , 2024

Student name: NGUYEN TRAN DUY KHANH    Student ID: 19119063
Major: COMPUTER ENGINEERING TECHNOLOGY
Project title: TWO-LAYER SECURITY SYSTEM USING FACE RECOGNITION AND TIME ATTENDANCE INTEGRATION

(Sign with full name)
EVALUATION SHEET OF DEFENSE COMMITTEE MEMBER

Ho Chi Minh City, January , 2024

Student name: NGUYEN TRAN DUY KHANH    Student ID: 19119063
Major: COMPUTER ENGINEERING TECHNOLOGY
Project title: TWO-LAYER SECURITY SYSTEM USING FACE RECOGNITION AND TIME ATTENDANCE INTEGRATION
Name of Defense Committee Member:

COMMITTEE MEMBER
(Sign with full name)
DECLARATION

We hereby confirm that this project is the result of our independent research conducted under the guidance of Dr. Pham Van Khoa. The statements and findings presented herein are the outcome of our meticulous and independent exploration, involving comprehensive examination and analysis of academic materials. As the researchers accountable for this project, we commit to refraining from reproducing or duplicating the content and conclusions of other works. All references used have been appropriately acknowledged and cited in their entirety.

All information, data, and research findings presented in this project are for reference purposes only and do not represent a direct affiliation or association with any specific organization, institution, or individual. This research has been carried out solely for academic purposes and not for commercial or personal gain. The authors and research team do not assume responsibility for any consequences or applications arising from the use of information in this project without validation or support from experts in the relevant field. Any implications or use of information from this project are entirely at the discretion and responsibility of the reader and are independent of the authors or supervisor.
<i>(Sign with full name)</i>
ACKNOWLEDGMENTS

We extend our heartfelt appreciation to the many individuals and institutions who have been instrumental in our academic journey. Foremost, our deepest gratitude goes to Dr. Pham Van Khoa, our advisor, for his unwavering support, patience, insightful guidance, and invaluable contributions to this thesis. His enthusiastic mentorship, wealth of knowledge, and expert advice in artificial intelligence were instrumental in the successful completion of this research. Without his encouragement and direction, this endeavor would not have come to fruition.

We also wish to express our sincere gratitude to Ho Chi Minh City University of Technology and Education for fostering a dynamic and enriching learning environment and for providing opportunities through contests and seminars. Additionally, our heartfelt thanks go to the dedicated teachers who have imparted invaluable knowledge and wisdom during our four-year journey. Their teachings have not only enriched our academic pursuits but have also honed our practical skills and experience.

We are indebted to our seniors and friends for their steadfast support and invaluable assistance throughout this academic endeavor. Their encouragement and collaborative efforts have been immensely beneficial.
TABLE OF CONTENTS

PROJECT ASSIGNMENT
EVALUATION SHEET OF SUPERVISOR
PRE-DEFENSE EVALUATION SHEET
EVALUATION SHEET OF DEFENSE COMMITTEE MEMBER
3.4.4. MISSIONS OF WEB APPLICATION
3.5. COMPRESSED FACIAL FEATURES EXTRACTION MODEL
3.5.1. OVERVIEW OF MODEL
3.5.2. TESTING MODEL ABILITY
4. RESULTS
4.1. WEB APPLICATION INTERFACE RESULTS
4.2. FACIAL FEATURES EXTRACTION MODEL RESULTS
4.2.1. SIMILARITY OF COMPRESSED AND ORIGINAL MODELS
4.2.2. EXTRACTION OF COMPRESSED MODEL
4.3. HARDWARE RESULTS
4.3.1. CONTROL PROGRAMS
4.3.2. AUTHENTICATION METHODS
4.3.3. COMPLETE HARDWARE WITH ENCLOSURE
5. CONCLUSIONS AND RECOMMENDATIONS
5.1. CONCLUSIONS
5.2. RECOMMENDATIONS
REFERENCES
APPENDICES
LIST OF FIGURES

Figure 2.1. The MERN Stack work methodology
Figure 2.2. Training flow supervised by ArcFace loss
Figure 2.3. The difference between residual block (a) and inverted residual block (b)
Figure 2.4. Map of Raspberry Pi 4 Model B
Figure 2.5. GPIO of Raspberry Pi 4 Model B
Figure 2.6. USB camera
Figure 2.7. SPI communication protocol
Figure 2.8. I2C communication protocol
Figure 3.1. Block diagram
Figure 3.2. Schematic circuit of NOTIFICATION block
Figure 3.3. Schematic circuit of LOCK CONTROL block
Figure 3.4. Schematic circuit of hardware
Figure 3.5. Printed circuit board of hardware
Figure 3.6. Implemented circuit board of hardware
Figure 3.7. Flowchart of GET DATA program
Figure 3.8. Flowchart of SEND DATA program
Figure 3.9. Flowchart of ATTENDANCE program
Figure 3.10. Flowchart of FACE ADD sub-program
Figure 3.11. Flowchart of FACE ADD ADD sub-program
Figure 3.12. Flowchart of PIN INPUT sub-program
Figure 3.13. Flowchart of RFID INPUT sub-program
Figure 3.14. Flowchart of FACE SCAN sub-program
Figure 3.15. Flowchart of FACE CHECK sub-program
Figure 3.16. Flowchart of ACCESS HANDLING sub-program
Figure 3.17. Web application's architecture
Figure 3.18. User and admin use case interface
Figure 3.19. Admin routes control
Figure 3.20. Hardware request routes control
Figure 3.21. Clock information flowchart
Figure 3.22. User routes control
Figure 3.23. System database design in MongoDB
Figure 3.24. Workflow of feature extraction model
Figure 3.25. Output facial feature extraction files
Figure 3.26. Output precision of model
Figure 4.1. Sign-in interface
Figure 4.2. Main dashboard interface
Figure 4.3. Clock Information page
Figure 4.4. Manage Users page
Figure 4.5. Detailed User Information page
Figure 4.6. Update Information for User page
Figure 4.7. Create New User page
Figure 4.8. Distribution of vectors - original model on Colab
Figure 4.9. Distribution of vectors - compressed model on Colab
Figure 4.10. Statistics of vectors - original model on Colab
Figure 4.11. Statistics of vectors - compressed model on Colab
Figure 4.12. Distribution of vectors - compressed model on Raspberry Pi
Figure 4.13. Statistics of vectors - compressed model on Raspberry Pi
Figure 4.14. Result of feature extraction verification
Figure 4.15. MAIN window
Figure 4.16. RFID - FACE ADD window
Figure 4.17. FACE ADD - facial extraction window
Figure 4.18. RFID - LEVEL 1 window
Figure 4.19. PIN - LEVEL 1 window
Figure 4.20. PIN INPUT - LEVEL 1 window
Figure 4.21. FACE SCAN - LEVEL 2 window
Figure 4.22. FACE SCAN - facial extraction window
Figure 4.23. Complete hardware with enclosure (1)
Figure 4.24. Complete hardware with enclosure (2)
LIST OF TABLES

Table 3.1. Connection pins between Raspberry Pi 4 and DS1307
Table 3.2. Connection pins between Raspberry Pi 4 and RC522
Table 3.3. Typical current and typical voltage of hardware
Table 3.4. Hardware components list
Table 3.5. Admin endpoint table
Table 3.6. Hardware request endpoint table
Table 3.7. User endpoint table
Table 3.8. Dataset for training the pre-trained model
Table 4.1. Comparison of person1 scan vector with person1 database
Table 4.2. Comparison of person1 scan vector with person2 database
Table 4.3. Comparison of person1 scan vector with person3 database
Table 4.4. Comparison of person1 scan vector with person4 database
LIST OF ABBREVIATIONS

ABBREVIATION    MEANING
ASCII    American Standard Code for Information Interchange
HDMI    High-Definition Multimedia Interface
RDBMS    Relational Database Management System
ABSTRACT

In today's rapidly evolving business landscape, the escalating demand for robust data storage services mirrors the expansive growth of companies across diverse domains. Within this context, the effective management of employee working hours has emerged as a critical aspect, exerting a profound influence on company finances. Despite its pivotal role, many enterprises, ranging from small businesses to some larger counterparts, lack stringent enforcement of attendance tracking, relying on single-layer security measures that inadvertently expose them to unauthorized access and potential financial losses. To address this challenge, this paper introduces a sophisticated two-layer security system that seamlessly integrates embedded computing and artificial intelligence. This solution aims to rectify the vulnerabilities associated with conventional attendance tracking systems, particularly those susceptible to unauthorized access and fraudulent practices such as proxy attendance. The proposed system includes a web application, providing users with a comprehensive platform not only to monitor their attendance data but also to access the first layer of security through a credential PIN code. Users have the flexibility to choose between an access card and a PIN code for the first layer, ensuring that access information is uniquely tied to individual employees. Additionally, the system incorporates an advanced artificial intelligence model to enhance the second layer of security, integrating facial recognition technology. This multi-layered approach significantly diminishes the likelihood of attendance manipulation and unauthorized access, thereby safeguarding the integrity of employee working hours and, consequently, the financial interests of the company.

By embracing cutting-edge technologies, this research and implementation effort contributes to the ongoing discourse on secure attendance tracking systems. Experimental results demonstrate that the system fulfills all specified requirements and objectives. For the recognition task, the system achieved 0.99 precision, 0.25 recall, a 0.4 F1-score, and a 50 MB TFLite model size. For the web application task, the system successfully generates, stores, and retrieves users' vital information.
1. INTRODUCTION

1.1. PROBLEM STATEMENT
In the contemporary business landscape, the effective tracking of employee attendance is a pivotal concern, with many enterprises, including both small and some larger organizations, facing challenges related to fraudulent practices such as proxy attendance. Instances where employees hand over their access cards to colleagues, allowing them to clock out on their behalf and receive full compensation for unworked hours, contribute to financial losses and undermine the integrity of attendance records. Current single-layer security measures, often relying on traditional methods like access cards, are proving insufficient to address these deceptive practices, necessitating the development of a more sophisticated and secure system.
1.2. OBJECTIVES

Prevent Proxy Attendance: Develop a two-layer security system incorporating embedded computing and artificial intelligence to effectively prevent deceptive practices like proxy attendance, where employees misuse access cards to manipulate clock-in and clock-out records.

Ensure Individual Accountability: Implement features that discourage the sharing of access cards by providing a personalized identification method for each employee, reducing the potential for fraudulent activities and enhancing individual accountability.

Improve Overall Attendance Integrity: Establish a secure and adaptable attendance tracking system that not only mitigates proxy attendance but also contributes to maintaining accurate and trustworthy attendance records for financial and organizational transparency.

Integrate Web Application: Develop a user-friendly web application that not only facilitates attendance monitoring but also serves as an additional layer of security, allowing users to access the system through a secure credential PIN code.
1.3. MISSIONS

Implement Anti-Proxy Measures: Utilize embedded computing and artificial intelligence technologies to introduce features that specifically target and prevent instances of proxy attendance, ensuring that only the authorized employee can clock in and out.

Individualized Access Methods: Develop a system that allows employees to choose between access cards and PIN codes for the first layer of security, ensuring a personalized and secure identification process that discourages the sharing of access credentials.

Enhance Data Accuracy: Implement advanced artificial intelligence models to accurately verify employee identities through facial recognition, contributing to the overall integrity of attendance data and reducing the risk of fraudulent activities.

Web Application Integration: Incorporate a user-friendly web application accessible through secure credential PIN codes, providing employees with a convenient platform for attendance monitoring and reinforcing the multi-layered security approach.
1.4. LIMITATIONS

Technology Constraints: The effectiveness of the system may be influenced by the inherent limitations of embedded computing and artificial intelligence technologies, potentially impacting overall performance and accuracy.

Cost Implications: The adoption of sophisticated technologies, particularly embedded computing and artificial intelligence, may entail significant upfront costs. Smaller enterprises with limited budgets may find it challenging to invest in and maintain such systems.

Maintenance and Upkeep: Ensuring the consistent functionality of the system requires regular maintenance and updates. The need for timely software updates, security patches, and hardware maintenance may pose challenges in environments where IT resources are limited.

Scalability Challenges: Adapting the system to accommodate the growing needs of expanding organizations may present scalability challenges. As the number of users increases, the system's ability to maintain performance levels and handle a larger volume of data must be carefully considered.

Environmental Factors: The system's reliance on hardware components makes it susceptible to environmental factors such as temperature fluctuations, humidity, and physical wear and tear; lighting conditions can also affect the accuracy of feature extraction. Ensuring the system's resilience under varying conditions is essential for sustained reliability.
1.5. OUTLINE

The thesis is divided into five chapters as follows:

 Chapter 1. Introduction: This chapter presents the background research, the purpose, the research tasks, the limitations, and the outline of the report.

 Chapter 2. Literature review: This chapter introduces the theoretical background of the methods used in this study, such as the MERN Stack, ArcFace, and MobileNetV2, along with other foundational theories related to the hardware aspect of the system.

 Chapter 3. System design: This chapter illustrates the steps taken to find a solution to the problem and to set requirements for the proposed system. The method is described from an overview down to the details of the algorithms inside.

 Chapter 4. Results: This chapter shows the results of the system implementation.

 Chapter 5. Conclusions and Recommendations: This chapter provides conclusions and future work.
2. LITERATURE REVIEW

2.1. MERN STACK
The MERN Stack is a JavaScript stack used for easier and faster deployment of full-stack web applications. It comprises four technologies: MongoDB, Express, React, and Node.js, and is designed to make the development process smoother and easier.[1]

Figure 2.1. The MERN Stack work methodology

MongoDB, the leading NoSQL database, distinguishes itself as an open-source, document-oriented platform that diverges from traditional relational databases by dispensing with fixed schemas and structured tables. Instead, MongoDB employs BSON as its storage format, encoding data in a binary structure that facilitates swift parsing. This database excels in scalability and flexibility, offering a document structure that is schemaless and adaptable. Unlike relational databases, MongoDB operates without the constraints of table formations or relationships, making it particularly efficient for developers. Its efficiency is further underscored by streamlined storage and indexing techniques, contributing to faster processing compared to a traditional RDBMS. While MongoDB lacks complex join operations and does not support intricate transactions, it compensates by allowing scalability through the addition of servers.[2]

Express, a server-side JavaScript framework that operates within Node.js, stands out as one of the premier frameworks for backend development. Offering developers a powerful platform, Express facilitates the creation and maintenance of resilient servers. Its significance is particularly notable in web and mobile application development, where it streamlines the process of building and designing applications with speed and ease. Express serves a crucial role in providing server-side logic for a myriad of applications, demonstrating its versatility across various domains. Developers leverage Express to effortlessly establish robust APIs and web servers, simplifying the overall development process. The framework excels in organizing application functionality through efficient routing and middleware, enhancing the creation of robust web servers. As a fundamental component of both the MERN and MEAN stacks, Express is instrumental in constructing fast, maintainable, and robust production web applications, cementing its role as a cornerstone of modern web development.[3]

React, a highly popular open-source front-end JavaScript library, is widely employed for building web applications, particularly single-page applications. Prerequisites for working with React include downloading Node packages and possessing a foundational understanding of HTML, CSS, and JavaScript. Developed by Facebook, React is not a framework but a library aimed at addressing challenges in web and mobile application development. It excels in constructing user interfaces and offers reusability through UI components. Originating from the work of software engineer Jordan Walke at Facebook, React made its debut in the Facebook news feed. Key attributes of React include its ability to enable the creation of large, dynamic web applications that update data without requiring page reloads.[4]

Node.js, an open-source server environment, serves as a cross-platform runtime for executing JavaScript code beyond the confines of a web browser. It is neither a programming language nor a framework but a versatile runtime environment. Notably, Node.js finds extensive application in the development of backend services for web and mobile applications, with major corporations like Uber, PayPal, and Netflix leveraging its capabilities in production.[5]
2.2. ARCFACE AND MOBILENETV2 INTEGRATION

Facial recognition technology has witnessed significant advances in recent years, driven by breakthroughs in deep learning and neural network architectures. Two prominent contributors to this progress are ArcFace, an innovative face recognition algorithm, and MobileNetV2, a lightweight neural network designed for mobile and edge devices. MobileNetV2 offers lower precision than the two other models considered, but stands out for its speed and its portability to edge devices such as the Raspberry Pi.[6]
ArcFace, introduced by Deng et al. in the paper "ArcFace: Additive Angular Margin Loss for Deep Face Recognition" (2019), addresses challenges in face recognition, particularly discriminative feature learning. The authors proposed a novel loss function, the additive angular margin loss, which enhances the discriminative power of deep face recognition models. ArcFace introduces a margin-based approach that enforces intra-class compactness and inter-class separability, leading to improved face recognition accuracy.[7]

The central idea of ArcFace is to map the features of each face onto a hypersphere, ensuring that the angular distance between features corresponds to their similarity. The algorithm has demonstrated state-of-the-art performance on benchmark face recognition datasets, making it a compelling choice for applications where accuracy and reliability are paramount.
Looking at the ArcFace loss function in more detail, the ArcFace loss combines the standard softmax loss with an angular margin term.

Softmax Loss:
The softmax loss is a standard classification loss that measures the dissimilarity between the predicted class scores and the ground-truth labels. For a given training sample x_i with ground-truth label y_i, the softmax loss is calculated as Eq. 1, where W_j is the weight vector associated with class j, b_j is the corresponding bias, and N is the total number of classes.

L_1 = -log( exp(W_{y_i}^T x_i + b_{y_i}) / Σ_{j=1..N} exp(W_j^T x_i + b_j) )    (Eq. 1)

Angular Margin Term:
To introduce the angular margin, the cosine of the angle between the input embedding vector x_i and the weight vector W_{y_i} (associated with the true class) is considered.[6] The angular margin m is added inside this cosine term as in Eq. 2, where θ_{y_i} is the angle between x_i and W_{y_i}, and m is the angular margin.

cos(θ_{y_i} + m)    (Eq. 2)
Combined ArcFace Loss:
The ArcFace loss combines the softmax loss with the angular margin term. After normalizing the weights and embeddings (so the bias term is dropped) and scaling by s, it is given by Eq. 3.

L_3 = -log( exp(s·cos(θ_{y_i} + m)) / ( exp(s·cos(θ_{y_i} + m)) + Σ_{j=1..N, j≠y_i} exp(s·cos θ_j) ) )    (Eq. 3)

The symbols are:
 N: the total number of classes, or identities, in the training dataset.
 θ_{y_i}: the angle between the embedding vector x_i and the weight vector W_{y_i} corresponding to the true class of the i-th sample.
 m: the angular margin, a hyperparameter that defines the minimum angular separation between the embeddings of different classes. It encourages the model to learn more discriminative features.
 s: the scaling parameter, another hyperparameter that controls the scale of the embeddings. Adjusting s affects the magnitude of the cosine similarities and is often tuned experimentally for optimal performance.
The main training steps for the pre-trained weights are as follows:
 Normalization: Normalize the weights and feature vectors.
 Cosine Similarity: Calculate cos θ_j for each class j using the normalized feature vectors and the weights of class j.
 Calculate Angle θ: Compute θ_j = cos⁻¹(cos θ_j), the angle between the weight vector and the feature vector x_i; for the true class this angle is θ_{y_i}.
 Calculate Loss Function: Use the term s·cos(θ_{y_i} + m), with θ_{y_i} + m kept within the range [0, π], where m is the margin.
 Softmax and Probability: Apply the softmax function to obtain the probability distribution over the labels.
 Cross-Entropy Loss: Compare the ground-truth vector (one-hot labels) with the predicted probabilities and compute the cross-entropy loss.
In the realm of image processing and facial recognition, cosine similarity between facial feature vectors is a robust way to measure similarity. Throughout this project, cosine similarity is therefore used to compute the similarity between facial feature vectors. The formula for cosine similarity is given in Eq. 4, where A and B are the two feature vectors of length n.

Cosine Similarity = cos θ = (A · B) / (‖A‖ ‖B‖) = Σ_{i=1..n} A_i B_i / ( √(Σ_{i=1..n} A_i²) · √(Σ_{i=1..n} B_i²) )    (Eq. 4)
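Eq. 4 translates directly into a few lines of NumPy. The sketch below is illustrative only; the acceptance threshold is a made-up value, not the one used by the system.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between two feature vectors (Eq. 4)."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Parallel vectors score ~1.0, orthogonal vectors 0.0, opposite vectors -1.0.
same = cosine_similarity([1, 2, 3], [2, 4, 6])
diff = cosine_similarity([1, 0], [0, 1])

# A scanned face vector would be accepted when its similarity to the
# stored vector exceeds a decision threshold (0.6 is purely illustrative).
THRESHOLD = 0.6
accepted = same > THRESHOLD
```

Because the measure depends only on the angle between the vectors, it is insensitive to the overall magnitude of the embeddings, which is exactly the property the hypersphere mapping of ArcFace relies on.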
MobileNetV2, introduced by Sandler et al. in "MobileNetV2: Inverted Residuals and Linear Bottlenecks" (2018), focuses on addressing the computational challenges of deploying deep neural networks on mobile and edge devices. MobileNetV2 builds upon the success of its predecessor, MobileNetV1, by introducing inverted residuals and linear bottlenecks. These architectural enhancements increase efficiency, allowing faster inference while maintaining competitive accuracy.

Figure 2.2. Training flow supervised by ArcFace loss

Figure 2.3. The difference between residual block (a) and inverted residual block (b)
Inverted Residual Blocks:
MobileNetV2 employs inverted residuals, a novel design using lightweight depthwise separable convolutions. This design choice significantly reduces the computational cost by separating spatial filtering (depthwise convolution) from pointwise mixing (1×1 convolution), making the model well suited for resource-constrained devices.

Linear Bottlenecks:
Linear bottlenecks are integrated into each inverted residual block to enhance information flow within the network. The 1×1 convolution in the linear bottleneck increases the non-linearity of the network, promoting feature reuse and mitigating the risk of information loss. This contributes to the overall efficiency of the model.

Efficient Feature Extraction:
The combination of inverted residuals and linear bottlenecks allows MobileNetV2 to strike a balance between computational efficiency and model accuracy. This is particularly advantageous for applications on devices with constrained resources, where efficient feature extraction is crucial.

Mathematical Formulation (Simplified):
Let x_i represent the input feature vector of an inverted residual block, and let F denote the combination of operations within the block. When a skip connection is present, the output of the block, denoted y_i, can be expressed as Eq. 5, where F(x_i) comprises the lightweight depthwise separable convolutions and linear bottlenecks.

y_i = F(x_i) + x_i    (Eq. 5)

MobileNetV2 achieves a harmonious balance between computational efficiency and model accuracy, making it an ideal choice for applications with limited resources. This architecture, with its emphasis on efficiency and feature reuse, has proven effective for various computer vision tasks on devices with restricted computational capabilities.[8]
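The cost saving from the depthwise separable convolutions used inside inverted residual blocks can be verified with simple arithmetic. The sketch below counts multiply-accumulate operations for a standard 3×3 convolution versus its depthwise separable equivalent; the feature-map size and channel counts are illustrative, not taken from MobileNetV2's actual layer table.

```python
def standard_conv_macs(h, w, c_in, c_out, k=3):
    """Multiply-accumulate count of a standard k x k convolution."""
    return h * w * c_in * c_out * k * k

def depthwise_separable_macs(h, w, c_in, c_out, k=3):
    """Depthwise k x k filtering per input channel, then 1 x 1 pointwise mixing."""
    depthwise = h * w * c_in * k * k
    pointwise = h * w * c_in * c_out
    return depthwise + pointwise

# Illustrative layer: 56 x 56 feature map, 64 input and 128 output channels.
std = standard_conv_macs(56, 56, 64, 128)
sep = depthwise_separable_macs(56, 56, 64, 128)
ratio = std / sep  # close to the theoretical 1 / (1/c_out + 1/k^2), about 8.4x here
```

The ratio approaches k² (here 9) as the output channel count grows, which is why depthwise separable convolutions dominate efficient mobile architectures.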
Integration in Facial Recognition Systems:
The combination of ArcFace and MobileNetV2 in facial recognition systems presents a compelling synergy. ArcFace provides a robust solution for discriminative feature learning, enhancing the accuracy of facial recognition. When coupled with MobileNetV2, the overall system gains computational efficiency, making it suitable for deployment on resource-constrained devices such as mobile phones and edge devices.

Recent research and applications have explored this integration, showcasing its potential in real-world scenarios. By leveraging the strengths of both ArcFace and MobileNetV2, facial recognition systems can achieve a balance between accuracy, efficiency, and suitability for diverse deployment environments.

In conclusion, the literature surrounding ArcFace and MobileNetV2 highlights their individual contributions to face recognition and mobile deployment, respectively. As researchers and practitioners continue to explore novel architectures and optimizations, the integration of these technologies remains a promising avenue for advancing the capabilities of facial recognition systems in various domains.
<b>2.3. RASPBERRY PI 4 MODEL B</b>

<b>Overview of Raspberry Pi:</b>
Raspberry Pi is a line of small, affordable computers developed by the Raspberry Pi Foundation in the UK. Introduced in 2012, Raspberry Pi was initially aimed at promoting programming and IT education. However, it has evolved into a versatile and powerful tool across various fields, from IoT to robotics and embedded systems.
The Raspberry Pi 4 Model B offers a significant performance boost, powered by the Broadcom BCM2711 quad-core ARM Cortex-A72 64-bit processor running at 1.5𝐺𝐻𝑧. Its expandable RAM options of 2𝐺𝐵, 4𝐺𝐵, and 8𝐺𝐵 LPDDR4-3200 SDRAM cater to varied project needs.
<b>Figure 2.4. Map of Raspberry Pi 4 Model B</b>
Equipped with Gigabit Ethernet, USB 3.0, USB 2.0, and Micro HDMI for 4K video output, it ensures fast data transfer and diverse connectivity. Wi-Fi 802.11ac and Bluetooth 5.0 enable seamless network connections and remote control.
With adequate cooling, this model maintains stable operation even under heavy loads. In summary, the Raspberry Pi 4 Model B’s versatility and robust performance make it an ideal choice for a wide array of projects, from IoT applications to custom PCs, fostering innovation and experimentation.
The GPIO pins on the Raspberry Pi 4 Model B play a vital role in interacting with peripherals and controlling various functions. With a total of 40 GPIO pins, they offer high flexibility for users to create and manage electronic projects.
Each GPIO pin can be configured to perform several functions:
<b> Digital Input Function: These pins can receive data from peripherals like motion</b>
sensors, buttons, or other signal sources. They are capable of reading digital signals, receiving values of 0 (LOW) or 1 (HIGH) corresponding to the connected device’s state.
<b> Digital Output Function: These pins can also control devices such as LEDs, relays, or</b>
motors by providing electrical signals. By adjusting the signal value (0 or 1), users can manage the operational status of these devices.
<b> Analog Input Function: The Raspberry Pi’s GPIO pins are digital-only and include no</b>

built-in analog-to-digital converter. Reading sensors with analog outputs therefore requires an external ADC, typically connected over SPI or I2C, which allows the Raspberry Pi to collect more detailed data about signal levels, suitable for applications demanding high measurement accuracy.

<b>Figure 2.5. GPIO of Raspberry Pi 4 Model B</b>
<b> Communication Protocol: GPIO pins also support communication protocols like I2C,</b>
SPI, UART. This opens up possibilities for connecting to other devices such as sensors, displays, or expansion boards to extend the functionality of the Raspberry Pi.
<b>Wireless Connectivity and Gigabit Ethernet:</b>
The Raspberry Pi 4 Model B brings Wi-Fi 802.11ac and Bluetooth 5.0 for wireless connectivity, enabling remote access, data transfer, and device communication without physical cables. This enhances flexibility for mobility and remote control in diverse applications, connecting easily to existing wireless networks and a wide array of peripherals via Bluetooth 5.0.
Additionally, it features a Gigabit Ethernet port, offering high-speed and stable wired connections, advantageous for data-intensive tasks, media streaming, or applications requiring low latency. This combination empowers users to select between wired or wireless networking, catering to specific project needs and enhancing adaptability across varied applications.
<b>HDMI and DSI:</b>
The Raspberry Pi 4 Model B offers versatile display options through its HDMI output and DSI. Featuring dual Micro HDMI ports supporting 4K video output, it connects seamlessly to monitors or TVs, delivering high-quality visuals and enabling touchscreen interactions. Additionally, its DSI supports various display types, from non-touch LCDs to high-resolution screens, expanding possibilities for graphics, data visualization, and information display in diverse projects.
This integration of HDMI output and DSI opens avenues for creativity, allowing users to connect touch-enabled LCD displays for interactive applications and leverage diverse display options for projects across IoT, robotics, and embedded systems.
<b>USB Ports:</b>
The USB ports on the Raspberry Pi 4 Model B offer not only versatile peripheral connectivity but also enable various practical applications, including the use of USB-connected cameras.
The Raspberry Pi 4 Model B features two USB 2.0 ports and two USB 3.0 ports. These ports facilitate connections to a wide range of peripheral devices such as keyboards, mice, hard drives, or USB storage devices for convenient data access and sharing. The USB 3.0 ports provide faster data transfer speeds compared to USB 2.0, suitable for handling large datasets and transmitting high-resolution videos.
Raspberry Pi supports multiple types of USB-connected cameras. By connecting a USB camera, users can harness the power of the Pi for imaging and video projects. This opens up applications ranging from security surveillance to capturing imagery for virtual reality or artificial intelligence projects. For those interested in experimenting with image-based projects, connecting a USB camera to the Raspberry Pi 4 Model B via the USB ports offers a flexible and convenient option.
<b>2.4. RADIO FREQUENCY IDENTIFICATION</b>

<b>Overview of RFID:</b>
RFID technology harnesses radio waves to retrieve data from electronic tags, finding wide applications across industries like inventory management, logistics, and access control. Its swift and accurate identification abilities streamline operations, from enhancing retail inventory to optimizing logistics and improving safety in healthcare. RFID’s adaptability with various tag types underscores its scalability and efficiency in diverse operational settings. This technology’s non-contact data retrieval has revolutionized efficiency and automation across industries.
An RFID system comprises essential components:
<b> Tags: Carriers of unique identification data, ranging from passive tags relying on reader</b>
energy to active tags with their own power source for longer-range communication.
<b>Figure 2.6. USB camera</b>
<b> Readers: Transmit and receive signals to communicate with tags, capturing data for</b>
processing in computer systems.
<b> Antennas: Integral for communication between readers and tags, emitting and receiving</b>
radio waves to determine communication range and effectiveness.
<b> Middleware: Software that filters and manages data collected by readers before</b>
transmitting it to central databases or software applications.
<b> Software and Database: Applications managing collected RFID data, integrating it</b>
with existing business processes.
<b> Network Infrastructure: Essential for seamless communication and data transfer</b>
between RFID system components, ensuring efficient functionality across industries.
<b>Communication Protocols of RFID Modules:</b>
RFID modules rely on distinct connection protocols, categorized as passive or active. Passive RFID commonly incorporates simple protocols like UART, SPI, or occasionally I2C. In contrast, active RFID primarily employs wireless protocols such as Bluetooth or Wi-Fi for transmitting data over extended distances.
SPI holds prominence in passive RFID systems due to its efficiency and simplicity in facilitating data exchange between the reader and tag. Operating in a master-slave setup, SPI utilizes key communication lines: MOSI, MISO, SCK, and SS, providing control and data transfer between devices.
The directness and rapid data transmission capabilities of SPI position it as a preferred protocol for RFID systems, particularly in scenarios necessitating swift and straightforward communication between RFID tags and readers.
<b>Figure 2.7. SPI communication protocol</b>
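To make the master-slave exchange concrete, the following pure-Python sketch models a single SPI byte transfer as two shift registers clocked together. It is a simplification that ignores clock polarity and phase (CPOL/CPHA) settings:

```python
def spi_transfer_byte(master_byte, slave_byte):
    """Simulate one 8-clock SPI exchange, MSB first.

    On each clock pulse the master shifts a bit out on MOSI while the
    slave shifts a bit out on MISO, so the two bytes are swapped in a
    single full-duplex transaction.
    """
    master_in = 0
    slave_in = 0
    for i in range(7, -1, -1):          # 8 clock pulses, MSB first
        mosi = (master_byte >> i) & 1   # master drives MOSI
        miso = (slave_byte >> i) & 1    # slave drives MISO
        slave_in = (slave_in << 1) | mosi
        master_in = (master_in << 1) | miso
    return master_in, slave_in

# After 8 clocks the two bytes have been exchanged.
received_by_master, received_by_slave = spi_transfer_byte(0xA5, 0x3C)
assert received_by_master == 0x3C and received_by_slave == 0xA5
```

This full-duplex, one-byte-per-eight-clocks behavior is what makes SPI fast for register-oriented devices such as RFID reader chips.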
<b>2.5. RTC INTEGRATED CIRCUIT</b>
<b>Overview of RTC Integrated Circuit:</b>
An RTC module is an essential electronic component within systems, maintaining and tracking real-time clock and calendar functions. It consists of several key components:
<b> RTC Chip: The core component, responsible for tracking time in various units and</b>
managing leap years and shorter months.
<b> Backup Power Pin: This offers uninterrupted timekeeping during power interruptions</b>
through a secondary power source like a lithium battery.
<b> Interface Connections: Incorporates interfaces (like I2C, SPI, or UART) to enable data</b>
transmission with other system components.
<b> Crystal Oscillator: Ensures high accuracy in time measurement by generating a stable</b>

reference clock signal, typically derived from a 32.768𝑘𝐻𝑧 quartz crystal.
<b>Communication Protocols of RTC Modules:</b>
RTC modules utilize different protocols to connect with microcontrollers or circuits for managing real-time data. Commonly, the I2C protocol enables seamless data exchange
<b>Figure 2.8. I2C communication protocol</b>
through dedicated SDA and SCL lines. Some RTC modules favor the SPI protocol, utilizing specific pins like MOSI, MISO, SCK, and SS for transmission. Additionally, certain modules employ the UART protocol, often using TX and RX pins for communication.
The selection of protocol for RTC modules relies on design and application requirements, prioritizing compatibility and flexibility with connected devices. I2C, popular among these modules, offers a flexible communication method with minimal pins. Its two-pin system enables multiple devices on the same communication bus, with SDA handling data transmission and SCL managing data synchronization.
Operating on a master/slave mechanism, the I2C protocol allows the microcontroller to direct communication as the master, while the RTC device functions as the slave, responding to read or write requests for real-time data. Its flexibility in connecting multiple devices on a single bus makes I2C widely preferred for interfacing microcontrollers with RTC modules in digital clocks, real-time measurement devices, and diverse embedded applications.
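Because the DS1307 stores its time registers in binary-coded decimal (BCD), reading it over I2C involves a small decoding step. The sketch below shows that conversion in plain Python, with the hardware I2C read itself omitted; the register layout (seconds, minutes, hours, with a clock-halt bit in the seconds register) follows the DS1307 convention, and 24-hour mode is assumed:

```python
def bcd_to_int(b):
    """Convert one packed-BCD byte (e.g. 0x47) to its decimal value (47)."""
    return ((b >> 4) & 0x0F) * 10 + (b & 0x0F)

def int_to_bcd(n):
    """Convert 0-99 to a packed-BCD byte (e.g. 59 -> 0x59)."""
    return ((n // 10) << 4) | (n % 10)

def decode_ds1307_time(regs):
    """Decode the first three DS1307 registers: seconds, minutes, hours.

    Bit 7 of the seconds register is the clock-halt flag and is masked
    off; 24-hour clock mode is assumed here for simplicity.
    """
    seconds = bcd_to_int(regs[0] & 0x7F)
    minutes = bcd_to_int(regs[1])
    hours = bcd_to_int(regs[2] & 0x3F)
    return hours, minutes, seconds

# Example raw register bytes, as an I2C read might return them.
assert decode_ds1307_time([0x30, 0x59, 0x14]) == (14, 59, 30)
```

On the Raspberry Pi, the raw bytes would come from an I2C read at the DS1307’s bus address; the decoding step shown here is the same regardless of which I2C library performs the transfer.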
The core of this chapter revolves around the design of a comprehensive system catering to employee identification, attendance tracking, and time management needs. Precision, reliability, and meticulous execution form the bedrock, ensuring the fulfillment of all requisites while averting any discrepancies. The section underscores the significance of detailed planning and precise execution to deliver a solution that not only excels in functionality but also adeptly addresses a spectrum of diverse demands and scenarios.
<b>3.1. REQUIREMENTS AND BLOCK DIAGRAM OF SYSTEM</b>

<b>3.1.1. REQUIREMENTS OF SYSTEM</b>
To meet the design requirements of a dual-layer attendance system using RFID, PIN codes, and facial recognition, it needs to satisfy specifications for both hardware (Raspberry Pi 4 Model B with peripherals) and software (website, server).
<b>Hardware Design Requirements:</b>
<b> Role Limitation: The system is dedicated solely to managing user attendance and does</b>
not encompass administrative functions meant for administrators. This segregation ensures that the system’s functionalities remain focused on attendance tracking and user-related operations.
<b> User-Friendly GUI: The GUI of the system should prioritize user-friendliness. It must</b>
be intuitive, simple to navigate, and visually clear. It must present accurate attendance information and notifications seamlessly. The interface should facilitate easy interaction and understanding for users engaging with the attendance system.
<b> Precise Input Processing: The system must possess the capability to accurately capture</b>
and process various input methods, including RFID, PIN codes, and facial recognition. Each method of input needs to be reliably interpreted and authenticated to ensure precise attendance records for users.
<b> API Data Handling: The system should have the functionality to retrieve data from an</b>
API and transmit data back to the API. This communication should occur over Wi-Fi or the Internet, following a predetermined schedule. This feature allows seamless integration with external systems or databases, enabling data exchange for comprehensive attendance management and record-keeping.
<b>Software Design Requirements:</b>
<b> Responsive Design: This requirement involves structuring the web application’s layout</b>
and functionality to adapt seamlessly to various screen sizes and devices. It ensures that whether accessed from a desktop, tablet, or mobile phone, users experience consistent and optimal usability without sacrificing functionality or clarity.
<b> User-Friendly Interface: The focus is on crafting an interface that is intuitive, easy to</b>
comprehend, and navigable. It involves employing UI components that facilitate a smooth user experience, considering factors like button placement, layout simplicity, and clear information presentation to enhance user engagement and understanding.
<b> Integration of Authentication and User Management: This includes incorporating</b>
multiple authentication methods like RFID, PIN codes, and facial recognition into the web interface for user access. Additionally, the system should provide administrators with tools to manage user accounts effectively, allowing functionalities such as adding, editing, and deleting user profiles. Moreover, it involves designing interfaces to view attendance records and providing straightforward login mechanisms for users.
<b> Communication with API: The system will establish connections with an external API</b>
to fetch attendance data and send it securely to the server. This involves setting up secure data transmission protocols to ensure that data integrity and confidentiality are maintained during interactions with the API.
<b> User Data Security: Security measures will be implemented to protect sensitive user</b>
information. Techniques such as data encryption will safeguard user data against unauthorized access or breaches. Access rights will be strictly regulated, and user authentication mechanisms will be put in place to ensure that only authorized individuals can access, modify, or view specific data, thereby upholding data integrity and ensuring user privacy and security.
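As one way to satisfy the secure-transmission requirement above, the sketch below builds a signed attendance record in Python. The field names and the HMAC-based signing scheme are illustrative assumptions, not the project’s actual API contract:

```python
import hashlib
import hmac
import json

def build_attendance_payload(user_id, rfid_uid, timestamp, secret_key):
    """Build a JSON attendance record with an HMAC-SHA256 signature.

    The schema and signing scheme are illustrative; a real deployment
    would follow the API's own contract and key-management policy.
    """
    record = {
        "user_id": user_id,
        "rfid_uid": rfid_uid,
        "timestamp": timestamp,  # ISO-8601 string
    }
    # Canonical JSON (sorted keys) so both sides hash identical bytes.
    body = json.dumps(record, sort_keys=True).encode("utf-8")
    signature = hmac.new(secret_key, body, hashlib.sha256).hexdigest()
    return {"record": record, "signature": signature}

payload = build_attendance_payload(
    "19119063", "04A1B2C3", "2024-01-06T08:30:00+07:00", b"demo-secret"
)
# The receiver recomputes the HMAC over the same canonical JSON to
# verify the record was not tampered with in transit.
body = json.dumps(payload["record"], sort_keys=True).encode("utf-8")
assert hmac.new(b"demo-secret", body, hashlib.sha256).hexdigest() == payload["signature"]
```

In practice this payload would be sent over HTTPS, which provides confidentiality; the application-level signature adds integrity checking independent of the transport.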
<b>3.1.2. BLOCK DIAGRAM OF SYSTEM</b>
<i>As depicted in Figure 3.1, the project’s system is designed into 9 main blocks:</i>
<b>CENTRAL PROCESSING Block:</b>
The central processing unit of the system uses the Raspberry Pi 4 Model B, playing a pivotal role in receiving and processing data from various sources such as RFID, PIN input, and facial recognition information sent by the RFID, TOUCH SCREEN, and CAMERA blocks. Its crucial task involves processing the data according to pre-programmed control sequences, subsequently sending control signals to the TOUCH SCREEN, NOTIFICATION, and LOCK CONTROL blocks. Simultaneously, the central processing unit is responsible for communicating with an API, transmitting and receiving data from it, as well as storing the information sent down from the API. Its most crucial role lies in establishing and maintaining coherent interaction between the various components within the system, ensuring smooth and efficient operations of the entire system. Furthermore, this block also serves as the power source provider for the REAL TIME, RFID, CAMERA, and TOUCH SCREEN blocks.
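The control flow just described can be sketched as a small event dispatcher. This is an illustrative stub rather than the project’s actual code: the block names mirror the diagram, while the event fields and handler logic are assumptions for demonstration:

```python
def dispatch(event, state):
    """Route an input event from a peripheral block to control actions.

    Returns the list of output-block commands the central processing
    block would issue; real handlers would drive GPIO and the display.
    """
    actions = []
    if event["source"] == "RFID":
        # First security layer: remember the card and ask for a face scan.
        state["pending_uid"] = event["uid"]
        actions.append(("TOUCH_SCREEN", "prompt_face_scan"))
    elif event["source"] == "CAMERA":
        # Second layer: unlock only if the face matched a pending card.
        if event["match"] and state.get("pending_uid"):
            actions.append(("LOCK_CONTROL", "unlock"))
            actions.append(("NOTIFICATION", "success"))
        else:
            actions.append(("NOTIFICATION", "failure"))
    return actions

state = {}
dispatch({"source": "RFID", "uid": "04A1B2C3"}, state)
acts = dispatch({"source": "CAMERA", "match": True}, state)
assert ("LOCK_CONTROL", "unlock") in acts
```

Structuring the central block this way keeps each peripheral handler isolated, which simplifies testing the two-layer sequence without any hardware attached.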
<b>REAL TIME Block:</b>
Utilizing the DS1307 RTC module, it serves as a clock providing real-time information to the CENTRAL PROCESSING block in cases where there’s no internet
<b>Figure 3.1. Block diagram</b>
connection available upon booting up. This block also incorporates a dedicated backup power source (typically a coin-cell battery) so that timekeeping continues in power loss scenarios where the CENTRAL PROCESSING block cannot provide power.
<b>RFID Block:</b>
Utilizing the RC522 RFID module to receive analog signals from RFID cards/tags, subsequently converting them into digital signals and transmitting these signals to the CENTRAL PROCESSING block.
<b>CAMERA Block:</b>
Utilizing the Xiaomi Xiaovv XVV-6320S-USB 1080P webcam to receive analog signals from user faces, subsequently converting them into digital signals and sending them to the CENTRAL PROCESSING block, becoming the input data for the facial extraction and recognition process.
<b>TOUCH SCREEN Block:</b>
Utilizing a 7-inch 1024 × 600𝑝𝑥 touchscreen LCD display to showcase the GUI and essential information from the CENTRAL PROCESSING block, simultaneously receiving and transmitting touch signals from user interactions to the CENTRAL PROCESSING block for processing.
<b>NOTIFICATION Block:</b>
It functions by receiving control signals from the CENTRAL PROCESSING block and displaying visual notifications for users through the use of light from LEDs and sound from buzzers.
<b>LOCK CONTROL Block:</b>
Its function is receiving control signals from the CENTRAL PROCESSING block to lock or unlock according to pre-programmed sequences. This block is essentially a relay module that receives signals to command locking/unlocking actions from the CENTRAL PROCESSING block.
<b>POWER Block:</b>
Comprising two sections: a 12𝑉𝐷𝐶/2𝐴 adapter power supply for the load (a normally closed electromagnetic latch lock), and a 5𝑉𝐷𝐶/3𝐴 adapter power supply for the Raspberry
Pi 4 Model B. Not only does it provide power for the operation of the CENTRAL PROCESSING block and the electromagnetic latch lock in the LOCK CONTROL block, but the POWER block also ensures the stability of the hardware system, including peripheral devices, safeguarding against issues like overload, overheating, overvoltage, short circuits, and low voltage.
<b>SERVER/WEBSITE Block:</b>
The web application’s architecture, parts, and interactions are all included in the system design, which guarantees a reliable and effective solution for the Two-Layer Security System with Face Recognition and Time Attendance Integration. The program is divided into two primary layers, the Front End (Client Layer) and the Back End (API and Database Layer), based on the MVC paradigm.
<b>3.2. HARDWARE</b>
<b>3.2.1. HARDWARE DESIGN</b>
<b>3.2.1.1. CENTRAL PROCESSING BLOCK</b>
The system requires a processing and control unit powerful enough to handle signal processing from peripheral devices, provide control signals based on programmed sequences, store databases, and have internet connectivity for real-time data retrieval from an API and sending data to the API. Additionally, the system needs image processing capabilities, including reading input image data, processing it for facial feature extraction models, and running a suitable facial feature extraction model to meet real-time requirements. Therefore, the system requires an embedded computer with sufficient memory and high processing speed to fulfill the functions and requirements specified for the project.
In addition to memory and processing speed requirements, the project calls for the embedded computer to have specific connectivity ports. These include an Ethernet port for controlling and programming the embedded computer from a PC or laptop, an HDMI port for a touchscreen LCD display, and USB ports to power the touchscreen LCD and a USB camera and to establish a connection between the USB camera and the embedded computer. Additionally, the embedded computer must feature GPIO ports that support the I2C and SPI communication protocols.
The Raspberry Pi 4 Model B 4𝐺𝐵 not only meets the previously mentioned requirements but also demonstrates advantages in energy efficiency during prolonged operation, compact size, ease of accessibility for handling, and cost-effectiveness. Considering these advantages alongside the project requirements, the authors have decided to choose the Raspberry Pi 4 Model B 4𝐺𝐵 as the CENTRAL PROCESSING block.
The CENTRAL PROCESSING block, driven by the Raspberry Pi 4 Model B 4𝐺𝐵, serves as the nerve center of the system. This core unit facilitates seamless communication and data processing, harnessing the Raspberry Pi’s versatile connectivity options. With Ethernet for remote access and programming, HDMI for interfacing with a touchscreen display, USB ports catering to the power needs of peripherals like the touchscreen LCD and USB camera, and GPIO pins accommodating protocols such as I2C and SPI, this block intricately weaves together various connections essential for cohesive system operation. It embodies a powerful yet accessible hub, ensuring efficient processing, control, and interaction among the system’s components.
<b>3.2.1.2. PERIPHERAL BLOCKS</b>

<b>REAL TIME Block:</b>
The RTC DS1307 module fulfills critical requirements: accurate timekeeping, seamless interfacing with the Raspberry Pi, low power consumption, and affordability. Its swift connectivity, coupled with precise timekeeping capabilities even under minimal power usage, makes it an ideal choice. Additionally, its backup power support ensures uninterrupted functionality when disconnected from the Raspberry Pi. All these features are packaged within a compact and cost-effective design, highlighting the DS1307 as an optimal solution for the task at hand.
<b>Table 3.1. Connect pins between Raspberry Pi 4 and DS1307</b>
<b>RFID Block:</b>

Like the REAL TIME block, the RFID block calls for low power consumption, as well as a small form factor and reasonable cost. However, a distinctive requirement for the RFID block is the need for fast signal reception, processing, and transmission to the Raspberry Pi. While various RFID modules exist, the RC522 module closely aligns with these specifications, making it the choice for the RFID block.
<b>Table 3.2. Connect pins between Raspberry Pi 4 and RC522</b>
<b>CAMERA block:</b>
Recognizing and extracting facial features are crucial aspects of the system, and should remain relatively unaffected by the recording device. Hence, the choice of a camera with high resolution and the ability to capture stable, high-quality color images is vital. Ensuring consistent brightness is essential for the CAMERA block to acquire and process images accurately and reliably, safeguarding image quality from external factors.
The Xiaomi Xiaovv XVV-6320S-USB 1080P webcam, boasting Full HD 1080P resolution, enables crisp, clear, and color-accurate image capture. Equipped with a USB 2.0 connection, it allows for easy connectivity and programming with the Raspberry Pi. Given its advantages and alignment with the outlined requirements, the Xiaomi Xiaovv XVV-6320S-USB 1080P webcam stands as a fitting choice for the CAMERA block of the system.
<b>TOUCH SCREEN block:</b>
The TOUCH SCREEN block requires the ability to display crucial system information clearly and intuitively, with a highly responsive touch interface for real-time responsiveness. Additionally, it needs to communicate seamlessly with and be programmable from the Raspberry Pi, consume low power, and be reasonably sized and priced.
Based on these requirements, a 7-inch capacitive touch screen has been chosen. The 7-inch HDMI capacitive touch LCD screen with IPS panel ensures wide and clear viewing angles, providing real-time touch responsiveness. Moreover, this screen exhibits excellent compatibility with the Raspberry Pi, operating reliably while conserving energy.
(Contents of Table 3.2:)

<b>RC522 pin</b> SS | SCK | MOSI | MISO | GND | RST | VCC

<b>Raspberry Pi 4 physical pin</b> 24 | 23 | 19 | 21 | 25 | 15 | 17