Zabala‐Blanco, David (ORCID: https://orcid.org/0000-0002-5692-5673), Azurdia‐Meza, Cesar A., Lobos Soto, Benjamín, Soto, Ismael, Játiva, Pablo Palacios, Ahumada‐García, Roberto and Ijaz, Muhammad (ORCID: https://orcid.org/0000-0002-0050-9435)
(2025) Extreme Learning Machine Models for Classifying the LED Source in a 2D Visible Light Positioning Database. IET Optoelectronics. ISSN 1751-8776
Published Version. Available under License Creative Commons Attribution.
Abstract
In recent years, there has been a surge in interest in indoor positioning systems that use visible light communication (VLC) technology combined with light‐emitting diodes (LEDs). These systems have gained attention because of their high bandwidth, precise localisation, and potential to extend wireless communication into the visible light spectrum, making VLC a notable candidate for future networks. Furthermore, the visible light spectrum proves advantageous in industrial internet of things settings, as it does not suffer from electromagnetic interference, unlike the radio frequency (RF) spectrum. This paper analyses a database of approximately 356 image samples obtained from a CMOS sensor. The database encompasses eight distinct classes, each corresponding to a modulation frequency (bit rate) ranging from 1 to 4.5 kHz in 500 Hz increments. The aim is to use this database, as a first stage, for classification with several neural networks based on extreme learning machines (ELMs) in various forms: (1) standard ELM, (2) regularised ELM, (3) weighted ELM in two configurations, and (4) multilayer ELM with 2 and 3 hidden layers. The findings of this study reveal that the standard ELM is particularly promising, achieving more than 99% in both accuracy and G‐mean while maintaining low computational complexity (training times measured in tenths of seconds), compared with convolutional neural networks and multilayer perceptrons, which offer superior performance but at the cost of significantly greater computational demands.
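To make the approach concrete, below is a minimal sketch of a standard ELM classifier of the kind the abstract describes, written in Python with NumPy. The image dimensions, hidden-layer size, activation function, and randomly generated data are illustrative assumptions only, not the paper's actual preprocessing or configuration.

```python
import numpy as np

def train_elm(X, T, n_hidden=500, seed=0):
    """Standard ELM: random, untrained hidden layer; output weights fit by least squares."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights (never updated)
    b = rng.standard_normal(n_hidden)                # random hidden biases
    H = np.tanh(X @ W + b)                           # hidden-layer activations
    beta = np.linalg.pinv(H) @ T                     # Moore-Penrose pseudoinverse solution
    return W, b, beta

def predict_elm(X, W, b, beta):
    """Predicted class = argmax over the linear output layer."""
    return np.argmax(np.tanh(X @ W + b) @ beta, axis=1)

# Illustrative data shaped like the paper's database: 356 samples, 8 classes.
rng = np.random.default_rng(1)
X = rng.standard_normal((356, 32 * 32))  # hypothetical flattened CMOS images
y = rng.integers(0, 8, size=356)         # hypothetical LED-frequency class labels
T = np.eye(8)[y]                         # one-hot targets for the least-squares fit
W, b, beta = train_elm(X, T)
print("train accuracy:", (predict_elm(X, W, b, beta) == y).mean())
```

The regularised variant mentioned in the abstract typically replaces the pseudoinverse step with a ridge solution, beta = (H^T H + lambda I)^(-1) H^T T, trading a small bias for better numerical conditioning on small datasets.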