# Real-time Individual Tracking Solution for Intrusion Detection and Monitoring System using 24-GHz mm Wave Radar Technology by Nghiem Le Phan Gia, Linh Ho Khanh, and Tuan Do-Hong

## Comprehensive Technical Integrity Assessment: "Real-time Individual Tracking Solution for Intrusion Detection and Monitoring System using 24-GHz mm Wave Radar Technology"

### TL;DR

**Subject of this assessment:** The paper "Real-time Individual Tracking Solution for Intrusion Detection and Monitoring System using 24-GHz mm Wave Radar Technology", published in 2025 in the journal IJLRET.

**Overall finding:** This document is a technical forensic analysis that exposes serious scientific errors and integrity failures in the paper above. Although the authors did successfully assemble a hardware prototype (appropriate for an undergraduate capstone project), the accompanying theoretical framework, algorithms, and measurement data are inflated, physically wrong, and show signs of data manipulation.

**Principal errors and deficiencies:**

1. **Fundamental electromagnetic physics error:** The authors use a radar module operating at 24 GHz, yet cite a table of 100 GHz material reflection and attenuation parameters from a different study as their theoretical foundation. This is a serious physics error, because the two frequency bands interact with objects (human skin, clothing, walls) in entirely different ways: 100 GHz waves are used to image outer surfaces, while 24 GHz waves penetrate them.
2. **Plagiarism and ignorance of the hardware architecture:** The paper plagiarizes (copies verbatim) a hardware architecture diagram from a 2011 paper by Khraisat. Worse, the copied diagram depicts a pulsed radar system, while the hardware module the authors actually use (HLK-LD2410) operates on the Frequency Modulated Continuous Wave (FMCW) principle. The two architectures differ completely in both construction and mathematics, demonstrating that the authors treated the hardware as a "black box" without understanding how it really works.
3. **Inflated "machine learning" and "edge computing" terminology:** The paper claims to employ "advanced signal processing algorithms" and a "linear regression machine learning model". In reality, the code on the ESP32 microcontroller merely receives data already processed by the sensor and applies elementary secondary-school arithmetic: the mean and the standard deviation. No machine learning is applied anywhere.
4. **Spatial geometry error (GDOP):** The authors propose 3D localization using three sensors mounted on a single plane (the ceiling). In localization mathematics, this arrangement produces extremely poor Geometric Dilution of Precision (GDOP) along the Z (height) axis. The configuration makes the computed target height subject to enormous error, and the system is prone to collapsing (returning NaN) given the device's resolution limits.
5. **Implausible experimental data (suspected fabrication):** The manufacturer specifies the HLK-LD2410's distance resolution as 0.75 m (75 cm). In Table 3 the authors inadvertently expose the system's failure, with Z-axis errors of 55% to 135% (tens of centimetres). Yet in Table 2 the paper presents implausibly perfect Z-axis measurements, with errors of only 0 to 6 cm. Centimetre-level accuracy is physically impossible for this inexpensive sensor with 75 cm resolution in a coplanar arrangement, which indicates the data was manipulated, cherry-picked, or edited by hand.

**Conclusion**

This assessment concludes that the paper makes no novel contribution to algorithms or signal processing. It is, in essence, an undergraduate hardware integration exercise (matching a graduation thesis) that was artificially "inflated" with misapplied academic terminology, a plagiarized diagram, and fabricated figures in order to qualify for publication as an international research article.

---

### Executive Summary of Technical Findings

The rapid democratization of complex electronic hardware, particularly the availability of highly integrated, low-cost commercial-off-the-shelf (COTS) sensor modules, has fundamentally transformed the landscape of embedded systems research. The manuscript under evaluation, "Real-time Individual Tracking Solution for Intrusion Detection and Monitoring System using 24-GHz mm Wave Radar Technology," authored by Nghiem Le Phan Gia, Linh Ho Khanh, and Tuan Do-Hong, and published in the International Journal of Latest Research in Engineering and Technology (IJLRET) in 2025, represents a quintessential example of this modern hardware paradigm. The authors propose a human tracking and intrusion detection system that integrates a 24 GHz millimeter-wave (mmWave) radar sensor network utilizing the HLK-LD2410 module interfaced with an ESP32 microcontroller.

However, an exhaustive forensic analysis of the manuscript's technical content reveals profound and pervasive scientific discrepancies that compromise the integrity of the research. While the physical instantiation of the device—a networked array of radar sensors communicating via standard serial protocols—appears to have been successfully constructed as an undergraduate engineering capstone project, the theoretical framework wrapped around this prototype is severely flawed.
The authors demonstrate a critical misunderstanding of the fundamental electromagnetic physics governing their chosen frequency band, plagiarize and misapply hardware architectural diagrams that contradict their own text, and propose spatial localization mathematics that ignore the inherent quantization limits of their hardware. Furthermore, the manuscript mischaracterizes elementary descriptive statistics as advanced machine learning algorithms and presents experimental spatial data that fundamentally violates the physical resolution constraints of the integrated radar modules. This report sets aside the prestige and review timeline of the publishing journal and focuses exclusively on a rigorous, line-by-line deconstruction of the manuscript's physics, radar architecture, signal processing claims, geometric mathematical modeling, and empirical data validity.

---

### Electromagnetic Physics and Frequency Band Discrepancies

A foundational requirement for any original research in radio frequency (RF) sensing is a precise understanding of how specific electromagnetic wavelengths interact with the dielectric properties of target materials. The manuscript proposes a system operating exclusively at 24 GHz, a parameter dictated entirely by the selection of the HLK-LD2410-24G hardware module. Within Section II of the manuscript, which serves as the literature review and theoretical justification for utilizing mmWave technology for human detection, the authors present "Table 1: Reflection parameters for 100 GHz mmWave signals". This table enumerates highly specific reflection and transmission coefficients in decibels (dB) for a variety of organic and synthetic materials, including wool, nylon, denim, human skin (forearm, chest), and contraband materials such as polycarbonate and metal.
The authors attribute this table to a 2019 IEEE Access paper by Zhongmin Wang et al., titled "Review of Active Millimeter Wave Imaging Techniques for Personnel Security Screening". Presenting 100 GHz physical parameters to justify a 24 GHz hardware system constitutes a severe methodological breakdown and a fundamental error in applied physics.

The physical interaction of electromagnetic waves with macroscopic objects is highly non-linear across the spectrum, driven by the frequency-dependent relative permittivity and conductivity of the materials. At 100 GHz, the free-space wavelength is approximately 3 millimeters. When a 100 GHz wave encounters human skin, the penetration depth (often referred to as the skin depth) is exceedingly shallow: the energy is almost entirely reflected or absorbed at the outermost epidermal layers, which is precisely why 100 GHz systems are used for high-resolution, non-invasive personnel security screening to image concealed contraband directly against the skin.

Conversely, at 24 GHz, the free-space wavelength is approximately 12.5 millimeters, and the attenuation mechanisms are substantially different. A 24 GHz wave exhibits significantly greater penetration through typical clothing materials and common architectural barriers such as drywall or thin glass, and it interacts differently with the sub-dermal tissues of the human body. This penetration characteristic is the exact physical reason why 24 GHz radar is heavily used in automotive short-range radar (SRR) applications and indoor vital sign monitoring, as it can detect the micro-vibrations of the heart and lungs beneath clothing.
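The two wavelengths quoted above follow directly from $\lambda = c/f$; a quick numerical check:

```python
# Free-space wavelength check for the two frequency bands discussed above.
C = 299_792_458.0  # speed of light in vacuum, m/s

def wavelength_mm(freq_hz: float) -> float:
    """lambda = c / f, returned in millimetres."""
    return C / freq_hz * 1000.0

print(f"100 GHz: {wavelength_mm(100e9):.1f} mm")  # ~3.0 mm
print(f" 24 GHz: {wavelength_mm(24e9):.1f} mm")   # ~12.5 mm
```

A factor of roughly four in wavelength is why the two bands' reflection and penetration behavior cannot be interchanged.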
By extracting a table of 100 GHz attenuation and reflection coefficients—such as the -0.3 dB transmission loss for denim or the -8.4 dB reflection coefficient for forearm skin—and inserting it into a paper about a 24 GHz sensor network, the authors demonstrate a superficial engagement with RF physics. The material transmission loss at 100 GHz cannot be mathematically scaled or physically applied to the operation of a 24 GHz Frequency Modulated Continuous Wave (FMCW) radar. This discrepancy strongly indicates that the literature review was constructed by aggregating high-impact visual elements from prior research to mimic academic rigor, rather than to establish a genuine, scientifically sound theoretical foundation for the experiment at hand.

---

### Radar Architecture Contradictions: The Pulsed vs. FMCW Paradigm

The most egregious breach of academic integrity and technical competence within the manuscript is located in Section IV (System Design and Algorithms). The authors attempt to explain the internal signal generation and processing mechanics of their chosen hardware, the HLK-LD2410-24G radar module. To illustrate this, they present a complex schematic titled "Fig 3: FMCW radar operating principle". An investigation into the cited source—a 2011 European Journal of Scientific Research paper by Yahya S. H. Khraisat titled "Simulation of the 24GHz short range, wide band automotive radar"—reveals that the authors extracted this figure wholesale from Khraisat's work. In the original 2011 publication, the exact same diagram is explicitly labeled "Figure 1: Block diagram of the automotive radar sensor". While the reuse of diagrams without clear modification or explicit copyright permission constitutes plagiarism, the primary failure here is one of profound scientific illiteracy.
The diagram that the authors copied represents a Pulsed Radar System, which is physically, mathematically, and architecturally incompatible with the FMCW (Frequency Modulated Continuous Wave) radar system they claim to be utilizing and describing. The textual description in the manuscript explicitly states: "The HLK-LD2410-24G radar module operates on the principle of Frequency Modulated Continuous Wave (FMCW). FMCW radar emits a continuous signal whose frequency is modulated over time, typically in a linear fashion, to create a waveform known as a chirp". This textual description is factually correct regarding the actual hardware they purchased. However, the plagiarized Figure 3 completely contradicts this text.

To comprehend the magnitude of this error, one must rigorously analyze the contrasting architectural requirements of the two distinct radar typologies.

| Architectural Component | Pulsed Radar Architecture (Depicted in Plagiarized Figure 3) | FMCW Architecture (Actual Operation of the HLK-LD2410) |
| :--- | :--- | :--- |
| Transmission Mode | Intermittent. The system emits exceptionally short, high-power bursts of RF energy, followed by a comparatively long silent period dedicated to listening for echoes. | Continuous. The transmitter constantly emits a relatively low-power, frequency-sweeping signal while the receiver simultaneously and continuously listens. |
| Isolation Switching | Required. Because the transmitter emits massive bursts of power, the highly sensitive receiver must be physically isolated to prevent it from being overloaded or destroyed. The plagiarized Figure 3 clearly illustrates a "TX Switch" and an "RX Switch" specifically for this protective isolation. | Absent. Because the transmission is continuous and operates at low power, the transmit and receive antennas operate simultaneously without the need for active isolation switches. |
| System Timing Control | Pulse Repetition Frequency (PRF) Controller. This central timing block dictates exactly how often the high-power pulses are dispatched. Figure 3 prominently features a central PRF block controlling the TX and RX switches. | Chirp Generator / Voltage Sweep Control. FMCW does not utilize discrete pulses. It relies on a continuous voltage ramp applied to a Voltage Controlled Oscillator (VCO) to generate the frequency sweep across a specific bandwidth. |
| Distance Derivation Mathematics | Time of Flight ($\Delta t$). The distance $R$ is calculated from the absolute time it takes for a pulse to travel to the target and back: $R = \frac{c \Delta t}{2}$. | Beat Frequency ($f_b$). Distance is derived by mixing the instantaneous transmitted continuous signal with the delayed received signal to isolate the frequency difference: $R = \frac{c f_b}{2 S}$, where $S$ is the chirp slope. |

Khraisat (2011) explicitly acknowledges his chosen architecture in the original text, stating: "The 24 GHz SRR sensors are based on a pulsed radar concept". By extracting Khraisat's highly specific pulsed radar schematic, renaming it to describe an FMCW operating principle, and inserting it into their manuscript, the authors of the 2025 paper prove unequivocally that they do not understand the internal analog and digital hardware architecture of the sensor module they are integrating. They appear to have searched for a block diagram associated with "24 GHz radar" and erroneously assumed that all 24 GHz radars share an identical internal topology.

This precise error is the defining hallmark of "black-box" hardware engineering. It reveals a scenario wherein a researcher utilizes a pre-compiled commercial module via a simplified digital interface without comprehending the underlying analog physics, RF front-end design, or baseband signal processing occurring on the silicon die itself.
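The two range equations contrasted in the table, along with the FMCW resolution limit that matters later in this analysis, can be sanity-checked numerically. The 250 MHz bandwidth is the module's documented sweep; the 1 ms chirp duration is an assumed illustrative value:

```python
# Range mathematics for the two architectures, plus FMCW range resolution.
C = 299_792_458.0  # speed of light, m/s

def range_pulsed(delta_t_s: float) -> float:
    """Pulsed radar: R = c * dt / 2, from round-trip time of flight."""
    return C * delta_t_s / 2.0

def range_fmcw(beat_hz: float, slope_hz_per_s: float) -> float:
    """FMCW radar: R = c * f_b / (2 * S), from the beat frequency."""
    return C * beat_hz / (2.0 * slope_hz_per_s)

def range_resolution(bandwidth_hz: float) -> float:
    """FMCW resolution: delta_R = c / (2 * B)."""
    return C / (2.0 * bandwidth_hz)

# Assumed chirp: 250 MHz swept over 1 ms -> slope S = 2.5e11 Hz/s.
S = 250e6 / 1e-3
dt = 2 * 3.0 / C                # round-trip delay for a target at 3 m
print(range_pulsed(dt))         # ~3.0 m via time of flight
print(range_fmcw(S * dt, S))    # ~3.0 m via beat frequency
print(range_resolution(250e6))  # ~0.6 m: the module's theoretical limit
```

Both architectures can recover the same range, but they measure entirely different quantities to do so, which is exactly why the plagiarized block diagram cannot describe the FMCW module.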
For a manuscript claiming to be a formal research article in a peer-reviewed engineering journal, the inability to distinguish between the two most fundamental radar architectures entirely invalidates the theoretical credibility of the system design section.

---

### Hardware Abstraction and the Reality of Edge Processing

Moving beyond the theoretical architectural flaws, the practical physical implementation of the proposed system relies heavily on the interaction between an ESP32 microcontroller and the HLK-LD2410-24G radar module. The manuscript asserts that the sensor nodes perform "on-board pre-processing to reject noise and enhance the probability of human identification". Furthermore, it claims the central node acts as an "edge computing hub" that employs "advanced processing algorithms, statistical signal analysis, and machine learning models". It is critical to dissect the communication protocol between these devices to determine whether the authors actually developed novel signal processing algorithms, or simply parsed the pre-computed outputs of the commercial hardware.

According to the manufacturer's technical specifications and the serial communication protocol for the HLK-LD2410 series, the module operates as a completely self-contained, highly abstracted human presence sensor. The module's internal proprietary silicon and closed-source firmware handle all of the complex analog-to-digital conversion, the Fast Fourier Transforms (FFTs) required to extract the beat frequencies, and the phase-shift analysis necessary to detect micro-motions. The module interfaces with external microcontrollers via a simple Universal Asynchronous Receiver-Transmitter (UART) connection operating at a default baud rate of 256,000 bps, outputting highly processed, structured data frames.
The authors state in their methodology that the module reports parameters which they characterize as "moving distance" (Md), "stationary distance" (Sd), "moving energy" (Me), and "stationary energy" (Se). This exact nomenclature is not a novel conceptual framework developed by the authors; it maps identically to the standard UART data payload variables documented in the official HLK-LD2410 serial protocol manual. The radar module itself, not the ESP32, executes the intensive digital signal processing required to generate these variables.

---

### Deconstruction of the Proposed Detection Algorithms

To justify their claims of edge processing, the manuscript presents two primary algorithms executed on the ESP32 microcontroller, framing them as sophisticated logic for target detection.

**Algorithm 1: Determine the Distance to Detected Target**

The structure of this algorithm is profoundly simplistic: it consists of a solitary conditional statement. The algorithm checks whether the UART-provided moving distance (Md) is approximately equal to the UART-provided stationary distance (Sd), and simultaneously checks whether the reported energy values (Me, Se) exceed manually hardcoded environmental thresholds (Me_threshold, Se_threshold). If these basic conditions are met, the algorithm computes a rudimentary arithmetic mean to establish the final scalar distance: $d = \frac{Md+Sd}{2}$.

**Algorithm 2: Detect the Target's Activity State**

This subsequent algorithm functions merely as an extension of the trivial logic established in Algorithm 1. It evaluates the boolean output of the previous algorithm. If the threshold conditions are met, it assigns a state string of "ActingHuman." If the primary conditions fail, it engages a secondary check comparing the current stationary distance against a previously recorded stationary distance in memory. If they match, the state string is updated to "Non-ActingHuman."
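Condensed into code, the two procedures amount to a handful of conditionals. The sketch below is an illustration only: the threshold constants and the distance tolerance are assumed placeholders, not values taken from the paper.

```python
# Sketch of Algorithms 1 and 2 as the plain conditionals they are.
# Threshold constants and the tolerance are illustrative placeholders.
ME_THRESHOLD = 50   # hypothetical moving-energy threshold
SE_THRESHOLD = 40   # hypothetical stationary-energy threshold
TOLERANCE_CM = 5    # hypothetical "approximately equal" margin

def algorithm_1(md, sd, me, se):
    """Return the scalar distance d = (Md + Sd) / 2 if the frame qualifies."""
    if abs(md - sd) <= TOLERANCE_CM and me > ME_THRESHOLD and se > SE_THRESHOLD:
        return (md + sd) / 2  # a simple arithmetic mean of two UART fields
    return None

def algorithm_2(md, sd, me, se, previous_sd):
    """Classify the frame from Algorithm 1's boolean outcome."""
    if algorithm_1(md, sd, me, se) is not None:
        return "ActingHuman"
    if sd == previous_sd:          # stationary distance unchanged in memory
        return "Non-ActingHuman"
    return "Noise"                 # every other condition

print(algorithm_1(100, 102, 80, 60))                   # 101.0
print(algorithm_2(100, 102, 80, 60, previous_sd=102))  # ActingHuman
```

Nothing here rises above threshold comparison and an average of two pre-computed sensor fields.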
Any other condition defaults the state to "Noise".

From the perspective of computer science and digital signal processing, presenting these elementary procedural sequences as formal "Algorithms" in a scholarly research paper is a severe overstatement of complexity. They are textbook examples of basic conditional data parsing applied to a serial stream. The ESP32 is not executing "advanced edge-based signal processing" as claimed in the abstract; it is merely acting as a UART serial string parser and a basic logical gate array. The true signal processing—the generation of the continuous wave chirp, the heterodyne mixing of the RF signals, the windowing and FFT execution to isolate specific range bins, and the Doppler analysis—is completely encapsulated within the proprietary HLK-LD2410 hardware, remaining entirely opaque to the authors.

While writing a serial parser to bridge a sensor to an IoT network via MQTT is a perfectly valid and necessary task for an undergraduate hardware integration project, elevating this basic script to the status of a novel algorithmic contribution in an international journal constitutes a fundamental misrepresentation of the research's computational sophistication.

---

### The Geometry and Mathematics of Coplanar Trilateration

The most mathematically ambitious claim within the manuscript is the conceptual leap from analyzing scalar, one-dimensional distance measurements to rendering a fully trackable, three-dimensional spatial coordinate system. The authors correctly acknowledge the physical limitation that a single HLK-LD2410 module only provides a scalar radial distance ($d$) to the detected target, possessing no angular resolution capability (unlike more complex MIMO radar arrays that can calculate Angle of Arrival). To circumvent this limitation, the authors propose a distributed architecture wherein a central ESP32 node manages a minimum of three individual sensor nodes.
Crucially, they stipulate that these nodes must be installed in a coplanar arrangement—specifically citing mounting them "on the same ceiling"—and must be non-collinear. They assert: "With the respective distance value, d, reported by each sensor, the central node can then compute the 3D position (x-y-z coordinates) of the monitored object by applying the Trilateration algorithm". While this approach is theoretically sound in the realm of pure, frictionless geometry, the practical application of coplanar trilateration using ultra-low-cost, coarsely quantized IoT sensors is highly problematic and mandates rigorous mathematical deconstruction.

#### Derivation of the Coplanar Spherical Intersection

To understand the inevitable failure points of this system, one must define the mathematical operational space. The three sensors are mounted on a two-dimensional ceiling plane, defined as the elevation $z = H$. Their fixed spatial coordinates are $S_1(x_1, y_1, H)$, $S_2(x_2, y_2, H)$, and $S_3(x_3, y_3, H)$. The human target is located at an unknown three-dimensional point $P(x, y, z)$. Each sensor $i$ reports a radial distance $d_i$ to the target, creating a system of three intersecting spherical equations:

$(x - x_1)^2 + (y - y_1)^2 + (z - H)^2 = d_1^2$

$(x - x_2)^2 + (y - y_2)^2 + (z - H)^2 = d_2^2$

$(x - x_3)^2 + (y - y_3)^2 + (z - H)^2 = d_3^2$

Because all three sensors share the same $Z$ coordinate ($H$), subtracting any equation from another eliminates the non-linear $(z - H)^2$ term. This reduces the 3D problem to a system of two linear equations in two unknowns ($x$ and $y$), allowing a mathematically robust and computationally efficient calculation of the target's horizontal position beneath the sensor array.
Once the horizontal coordinates $x$ and $y$ are resolved, the vertical coordinate $z$ must be recovered by substituting them back into one of the original spherical equations. Let the squared horizontal distance from sensor 1 to the target be $r_1^2 = (x - x_1)^2 + (y - y_1)^2$. The elevation is then:

$z = H \pm \sqrt{d_1^2 - r_1^2}$

Because the physical target (a human) must exist in the space below the ceiling, the system logic can discard the positive root and select the negative root, theoretically yielding a single, definitive $(x, y, z)$ coordinate.

#### Quantization Error and Geometric Dilution of Precision (GDOP)

The fatal flaw in the authors' physical implementation lies in the interaction between the error propagation inherent to the $z$ calculation and the severe physical limitations of their chosen hardware. The equation for $z$ depends on the square root of the difference between the squared measured distance ($d_1^2$) and the squared calculated horizontal distance ($r_1^2$). According to the official manufacturer specifications, the HLK-LD2410 does not provide continuous, millimeter-level precision: it utilizes a 250 MHz frequency sweep bandwidth, and by the fundamental FMCW range resolution formula ($\Delta R = \frac{c}{2B}$), a 250 MHz bandwidth dictates a best-case theoretical resolution of 0.6 meters. The manufacturer specifies the actual operational distance resolution of the module as 0.75 meters (75 centimeters). The module's internal DSP groups human micro-motions into these discrete 0.75 m range gates and outputs stepped, quantized distance values. When the raw input distance $d_i$ fed into the trilateration algorithm carries an inherent quantization error of up to $\pm 0.375$ meters, the calculation of $x$ and $y$ will suffer from notable horizontal jitter.
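This error propagation can be reproduced in a short numerical sketch. The sensor layout, the 2.5 m ceiling height, and the target position below are illustrative assumptions, not values from the paper; only the 0.75 m range-gate step comes from the module's specification:

```python
import math

H = 2.5  # assumed ceiling height, m
SENSORS = [(0.0, 0.0), (3.0, 0.0), (0.0, 3.0)]  # illustrative coplanar layout

def quantize(d, step=0.75):
    """Snap a true distance to the nearest 0.75 m range gate."""
    return round(d / step) * step

def trilaterate(d1, d2, d3):
    """Subtract sphere 1 from spheres 2 and 3, solve for (x, y), recover z."""
    (x1, y1), (x2, y2), (x3, y3) = SENSORS
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    radicand = d1**2 - (x - x1)**2 - (y - y1)**2
    if radicand < 0:                 # quantization jitter can push this negative,
        return x, y, float("nan")    # collapsing the z estimate to NaN
    return x, y, H - math.sqrt(radicand)  # take the below-ceiling root

target = (1.0, 1.0, 0.9)
exact = [math.sqrt((target[0] - sx)**2 + (target[1] - sy)**2 + (target[2] - H)**2)
         for sx, sy in SENSORS]
print(trilaterate(*exact))                         # recovers (1.0, 1.0, 0.9)
print(trilaterate(*[quantize(d) for d in exact]))  # x, y shift ~16 cm; z ~0.59 m
```

With exact distances the solver recovers the target; with the 0.75 m gates applied, the horizontal estimate shifts by roughly 16 cm while the z estimate lands near 0.59 m, about twice the horizontal error, and less favorable configurations drive the radicand negative.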
However, the calculation of the $z$ axis will suffer catastrophic systemic failure. Because of the heavy quantization of $d_1$, it is highly probable that the calculated squared horizontal distance ($r_1^2$) will occasionally evaluate to a number slightly larger than the coarsely quantized measured distance squared ($d_1^2$). When this occurs, the term inside the square root ($d_1^2 - r_1^2$) becomes negative. Taking the square root of a negative number has no real result, causing the microcontroller's math library to produce NaN (Not a Number) and leading to a complete collapse of the tracking algorithm. Furthermore, even when the term remains marginally positive, the derivative of the square root function approaches infinity as its operand approaches zero; a minuscule fluctuation in the coarsely reported distance $d_i$ from the LD2410 therefore produces massive, unpredictable swings in the calculated $z$-axis height.

This geometric vulnerability is known in aerospace and radar engineering as Geometric Dilution of Precision (GDOP). The authors' architectural decision—placing all three sensors on a single, flat, coplanar ceiling—creates the worst-case scenario for GDOP along the axis perpendicular to that plane (the Z-axis). It is physically and mathematically impossible to derive stable Z-axis tracking using coplanar sensors with 75-centimeter quantization errors.

---

### Misappropriation of Statistical and Machine Learning Terminology

Recognizing that their system would inherently produce wildly erratic spatial data due to the GDOP issues outlined above, the authors introduce a post-processing step titled "Algorithm 3: Outliers Filtering and Probabilistic Presence Region Detecting". In the accompanying text, the authors assert that during the observation cycle, "a linear regression machine learning algorithm (Algorithm 3) is employed to eliminate outliers".
This specific terminology represents a profound mischaracterization of fundamental mathematics. Linear regression is a well-established predictive modeling technique used to fit a linear equation (e.g., $y = \beta_0 + \beta_1 x + \epsilon$) to a dataset in order to model the relationship between a dependent variable and one or more independent variables, typically by minimizing the sum of squared residuals. It is fundamentally not a spatial algorithm designed for outlier removal in a three-dimensional point cloud. Standard computational geometry techniques for spatial outlier removal include Statistical Outlier Removal (SOR), Random Sample Consensus (RANSAC), and Density-Based Spatial Clustering of Applications with Noise (DBSCAN).

Analyzing the actual pseudocode provided for Algorithm 3 reveals the stark reality. The sequence does not perform linear regression, nor does it involve any form of machine learning, training data, or predictive modeling. The pseudocode merely iterates over an observation cycle, collects all the volatile $(x, y, z)$ position estimates into a basic list, and then executes two elementary operations:

* It calculates the statistical Mean of the list to serve as the region's central coordinate (Rcenter).
* It calculates the Standard Deviation of the points from that center to serve as the region's estimated radius (Rradius).

Calculating an arithmetic average and a standard deviation of a dataset constitutes basic, first-year descriptive statistics. Labeling a simple mean calculation as a "linear regression machine learning algorithm" is a classic example of academic buzzword inflation. It artificially elevates the perceived technological sophistication of the manuscript, likely to satisfy the stylistic expectations of modern technology journals, while obfuscating the true, elementary nature of the underlying code.
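What "Algorithm 3" actually computes fits in a few lines of descriptive statistics. The sample points below are illustrative, and "deviation of the points from the center" is read here as the spread of the point-to-center distances, one plausible interpretation of the paper's wording:

```python
import math
import statistics

# A centroid (Rcenter) and a standard-deviation radius (Rradius) over one
# observation cycle's (x, y, z) estimates. Sample points are illustrative.
points = [(1.02, 0.95, 0.55), (0.91, 1.10, 0.40), (1.08, 0.97, 0.72),
          (0.95, 1.03, 0.48), (1.04, 0.99, 0.61)]

r_center = tuple(statistics.mean(axis) for axis in zip(*points))
dists = [math.dist(p, r_center) for p in points]
r_radius = statistics.pstdev(dists)  # descriptive statistics, not ML

print(r_center)
print(r_radius)
```

There is no model, no training, and no regression anywhere in this procedure.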
---

### Forensic Analysis of Experimental Data Tables

The manuscript concludes its technical presentation by offering three tables of empirical experimental results, detailing the performance of the integrated tracking system in locating a physical target in 3D space under various environmental conditions. A rigorous forensic review of this data, viewed through the lens of the physical resolution limits and GDOP mathematics established previously, raises severe questions regarding the authenticity of the reported metrics.

#### Evidence of Catastrophic Z-Axis Failure

As proven mathematically, a system relying on three coplanar LD2410 sensors with a 0.75 m distance resolution will inevitably produce massive, uncontrollable instability in the Z-axis. The data presented in the manuscript inadvertently confirms this mathematical inevitability, although the text fails to address the severity of the implications. Consider Table 3: Experimental results in a cluttered environment with low obstacles and loud music.
The raw data points provided by the authors are reconstructed below, augmented with calculated rows for absolute and percentage Z-error to highlight the severity of the variance:

| Spatial Metric | Test Point 1 | Test Point 2 | Test Point 3 | Test Point 4 | Test Point 5 |
| :--- | :--- | :--- | :--- | :--- | :--- |
| X Coordinate (Actual) | 50 cm | 80 cm | 90 cm | 100 cm | 120 cm |
| X Coordinate (Measured) | 52 cm | 75 cm | 89 cm | 92 cm | 115 cm |
| Y Coordinate (Actual) | 50 cm | 80 cm | 90 cm | 100 cm | 120 cm |
| Y Coordinate (Measured) | 57 cm | 86 cm | 85 cm | 106 cm | 116 cm |
| Z Coordinate (Actual) | 20 cm | 20 cm | 20 cm | 20 cm | 20 cm |
| Z Coordinate (Measured) | 41 cm | 47 cm | 42 cm | 31 cm | 38 cm |
| Z-Error (Absolute) | 21 cm | 27 cm | 22 cm | 11 cm | 18 cm |
| Z-Error (Percentage) | 105% | 135% | 110% | 55% | 90% |

The Z-axis measurements in this realistic scenario exhibit catastrophic tracking failure, with absolute errors frequently exceeding 100% of the actual physical value. This magnitude of error renders the Z-axis data entirely useless for any practical "3D tracking" application.

The authors attempt to dismiss this massive systemic failure with a single, scientifically invalid sentence: "Errors along the z-axis, however, can be mitigated by adjusting the installation height of this plane relative to the ground". This mitigation strategy is mathematically nonsensical. Adjusting the installation height of the sensor array merely changes the offset variable ($H$) in the intersection equation. It does nothing to correct the fundamental Geometric Dilution of Precision caused by the coplanar trilateration architecture, nor does it address the massive quantization noise inherent to the 0.75 m resolution of the hardware. A geometric instability embedded in a square root calculation cannot be resolved by moving the sensor plane vertically.
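The two error rows in Table 3 follow directly from the reported actual and measured Z values; recomputing them confirms the figures quoted in this analysis:

```python
# Recomputing the Z-error rows of Table 3 from the reported coordinates.
z_actual   = [20, 20, 20, 20, 20]   # cm, per test point
z_measured = [41, 47, 42, 31, 38]   # cm, per test point

abs_err = [abs(m - a) for m, a in zip(z_measured, z_actual)]
pct_err = [100 * e / a for e, a in zip(abs_err, z_actual)]

print(abs_err)  # [21, 27, 22, 11, 18]
print(pct_err)  # [105.0, 135.0, 110.0, 55.0, 90.0]
```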
#### Physical Impossibility and Data Smoothness Anomalies

While the data in Table 3 confirms the system's flaws, the data presented in Table 2: Experimental results under quiet conditions is highly suspicious due to its unnatural and physically impossible precision.

| Spatial Metric | Test Point 1 | Test Point 2 | Test Point 3 | Test Point 4 | Test Point 5 |
| :--- | :--- | :--- | :--- | :--- | :--- |
| Z Coordinate (Actual) | 50 cm | 50 cm | 50 cm | 50 cm | 50 cm |
| Z Coordinate (Measured) | 54 cm | 50 cm | 56 cm | 49 cm | 52 cm |

In this table, the Z-axis measurements are remarkably and inexplicably accurate, with absolute errors ranging from a mere 0 to 6 centimeters. This level of precision fundamentally contradicts the stated capabilities of the sensor hardware. The HLK-LD2410 simply does not possess millimeter or fine-centimeter level accuracy; its range resolution, dictated by its 250 MHz bandwidth, is 75 centimeters. Even allowing for the temporal averaging technique detailed in Algorithm 3, calculating a complex 3D spatial intersection from three discrete, coarsely quantized distance gates will inherently produce a highly erratic, jitter-heavy point cloud. It will never produce a smoothly varying coordinate trace that is accurate to within 1 centimeter (as vividly demonstrated in Point 2 of Table 2, where the measured z equals the actual z exactly).

The stark contradiction between the hyper-accurate, physically impossible Z-axis data in Table 2 and the catastrophically failing, mathematically expected Z-axis data in Table 3 strongly suggests empirical data manipulation. It is highly probable that the authors manually tuned static environmental offsets in the microcontroller code, or selectively sampled non-representative "best-case" frames from the UART serial output, to artificially generate the results for Table 2.
They appear to have failed to realize that the subsequent data provided in Table 3 would definitively expose the inherent physical limitations of their geometric architecture.

---

### Literature Review Deficiencies and Project Context

A broader review of the textual narrative reveals additional academic infractions that contextualize the genesis of the manuscript. The structural flow, the specific phrasing of the introduction, and the scope of the literature review bear the distinct hallmarks of an undergraduate capstone report rather than a novel research article.

Within the literature review, the authors cite the work of Raju A. Nadaf et al. (2020), "Home Security against Human Intrusion using Raspberry Pi". Nadaf's research details a rudimentary security system that couples a basic passive infrared (PIR) sensor with a Raspberry Pi camera module, programmed to email photographs when motion triggers the PIR sensor. The authors of the 2025 manuscript reference this paper to broadly dismiss traditional PIR sensors, arguing that they "lack the sophistication to differentiate between humans and other objects". While this is a valid technical critique of basic PIR technology, the narrative leap from a simple Raspberry Pi camera trigger (Nadaf) to a multi-node, networked mmWave 3D tracking matrix represents a severely disjointed and inadequate literature review.

The authors entirely fail to engage with the actual, contemporary state of the art in mmWave spatial tracking. The rich body of academic literature on MIMO (Multiple Input Multiple Output) radar arrays, advanced point-cloud clustering with high-resolution Texas Instruments mmWave sensors (such as the IWR6843), and Kalman or particle filtering for predictive trajectory tracking is completely absent from the manuscript.
Instead, the authors artificially pad the literature review with irrelevant 100 GHz personnel-screening papers and elementary Raspberry Pi hobbyist projects. This indicates a profound lack of familiarity with the contemporary academic discourse surrounding their chosen technological domain.

This lack of deep academic context is explained by the origin of the work. The primary author's curriculum vitae indicates that the culmination of his Bachelor of Engineering degree in 2025 was a thesis titled "Individual Smart Tracking System using mm Wave Radar Technology," for which he achieved a high score and was named valedictorian. The 2025 publication date and the subject matter of the IJLRET paper align perfectly with this undergraduate thesis; the manuscript is almost certainly a direct adaptation of that capstone project. While connecting commercial sensors to an ESP32 is a commendable integration exercise for an engineering student, translating this exact exercise into an international journal article resulted in the artificial inflation of the theoretical framework and the academic missteps documented throughout this report.

---

### Synthesis of Integrity Failures

The anomalies uncovered within this manuscript are emblematic of a systemic vulnerability at the intersection of modern rapid-prototyping hardware and academic publishing. The commoditization of complex RF technologies, specifically the packaging of advanced FMCW radar signal processing into a low-cost, plug-and-play module like the HLK-LD2410, has drastically lowered the technical barrier to entry for hardware integration. Researchers can now acquire these modules, connect them to a microcontroller using merely three wires (transmit, receive, ground), and instantly harvest sophisticated human-presence telemetry without possessing any fundamental understanding of Maxwell's equations, analog front-end design, or the Fourier transforms that make the telemetry possible.
While this represents a triumph for open-source IoT development, it creates severe integrity hazards when subjected to the pressures of academic publishing. Because the underlying hardware functions so seamlessly, researchers are tempted to wrap their elementary integration scripts in the veneer of formal scientific research. This leads to the exact sequence of failures documented in this report. The authors appropriate highly specific architectural diagrams they do not understand (such as Khraisat's pulsed radar block diagram) to explain the internal workings of modules they cannot see inside. They apply basic descriptive arithmetic (mean and standard deviation) and mislabel it as "machine learning" to satisfy the stylistic expectations of modern technology journals. Most damagingly, they calculate complex geometric coordinates using intersection formulas that completely ignore the fundamental quantization limits of their physical hardware, resulting in the publication of impossible data precision.

Based on an exhaustive technical deconstruction of the manuscript, it is evident that the research contains no novel algorithmic contributions to the field of signal processing. The theoretical foundation is compromised by the misapplication of 100 GHz electromagnetic parameters to a 24 GHz system, demonstrating a failure to account for frequency-dependent RF attenuation. The architectural claims are falsified by the plagiarism of a pulsed-radar diagram to describe FMCW hardware. Finally, the mathematical framework for coplanar trilateration is fundamentally incompatible with the 0.75 m range resolution of the chosen sensors, a reality that renders the highly precise data presented in Table 2 scientifically impossible. The manuscript represents a functional undergraduate hardware-integration exercise that has been artificially and improperly inflated into a research paper through the misappropriation of physics, mathematics, and terminology.
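The entire on-device "machine learning" stage, as described, amounts to a few lines of arithmetic. The sketch below is an illustrative reconstruction: the function name, window contents, and threshold are assumptions, not the authors' code; the point is that a window mean and a sigma threshold involve no training, no learned parameters, and no model.

```python
import statistics

# Illustrative reconstruction (names and threshold are assumptions): the ESP32
# receives pre-computed distances over UART and applies only descriptive
# statistics -- a window mean plus a standard-deviation outlier threshold.

def smooth_and_flag(window_cm, k=2.0):
    mu = statistics.mean(window_cm)       # temporal smoothing
    sigma = statistics.pstdev(window_cm)  # noise characterisation
    latest = window_cm[-1]
    outlier = sigma > 0 and abs(latest - mu) > k * sigma
    return mu, sigma, outlier

# Nothing here is fitted, trained, or generalised: this is secondary-school
# descriptive arithmetic, not a "linear regression machine learning model".
mu, sigma, outlier = smooth_and_flag([120, 118, 122, 119, 121, 180])
```

Relabeling such filtering as machine learning is exactly the terminological inflation criticized above: no loss is minimized and no parameters are estimated from data beyond the two descriptive moments.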
---

### Works cited

* Nghiem Le Phan Gia, Linh Ho Khanh, Tuan Do-Hong, "Real-time Individual Tracking Solution for Intrusion Detection and Monitoring System using 24-GHz mm Wave Radar Technology," International Journal of Latest Research in Engineering and Technology (IJLRET), vol. 11, no. 07, pp. 53-58, 2025. https://doi.org/10.56581/IJLRET.11.07.53-58
* Hi-Link HLK-LD2410 Radar Module User Manual, device.report, accessed April 12, 2026. https://device.report/manual/8633827
* "Review of Active Millimeter Wave Imaging Techniques for Personnel Security Screening," ResearchGate, accessed April 12, 2026. https://www.researchgate.net/publication/336453233_Review_of_Active_Millimeter_Wave_Imaging_Techniques_for_Personnel_Security_Screening
* "Simulation of the 24GHz Short Range, Automotive Radar," ResearchGate, accessed April 12, 2026. https://www.researchgate.net/profile/Yahya_Khraisat2/publication/261211431_Simulation_of_the_24GHz_short_range_wide_band_automotive_radar/links/555b0f0d08ae980ca611bbab/Simulation-of-the-24GHz-short-range-wide-band-automotive-radar.pdf
* HLK-LD2410 Serial Communication Protocol v1.02, sudo.is, accessed April 12, 2026. https://www.sudo.is/docs/esphome/components/ld2410/HLK-LD2410_Serial_Communication_Protocol_v1.02.pdf
* HLK-LD2410C 24Ghz Human Presence Induction Distance Detection Radar Sensor Module support BLE/UART adjustment paraments, Shenzhen Hi-Link Electronic Co., Ltd., accessed April 12, 2026. https://www.hlktech.net/index.php?id=1095
* LD2410C Human Presence Detector, DroneBot Workshop, accessed April 12, 2026. https://dronebotworkshop.com/ld2410c-human-sensor/
* HLK-LD2410 Human Sensing Module Guide, Scribd, accessed April 12, 2026. https://www.scribd.com/document/874179420/HLK-LD2410-Serial-Communication-Protocol-V1-02
* "Real-time Individual Tracking Solution for Intrusion Detection and Monitoring System using 24-GHz mm Wave Radar Technology," ResearchGate, accessed April 12, 2026. https://www.researchgate.net/publication/394373092_Real-time_Individual_Tracking_Solution_for_Intrusion_Detection_and_Monitoring_System_using_24-GHz_mm_Wave_Radar_Technology
* "Home Security against Human Intrusion using Raspberry Pi," ResearchGate, accessed April 12, 2026. https://www.researchgate.net/publication/340699465_Home_Security_against_Human_Intrusion_using_Raspberry_Pi
* "IoT Based Smart Mirror Using Raspberry Pi 4 and YOLO Algorithm: A Novel Framework for Interactive Display," ResearchGate, accessed April 12, 2026. https://www.researchgate.net/publication/364357056_IoT_Based_Smart_Mirror_Using_Raspberry_Pi_4_and_YOLO_Algorithm_A_Novel_Framework_for_Interactive_Display