Abstract
We propose and present the first proof-of-concept demonstration of an angle-of-arrival (AOA)-based visible-light-positioning (VLP) system using a long short-term memory neural network (LSTMNN) model. Only a single LED lamp and a silicon-based solar-cell receiver (Rx) are needed. The LSTM cell can process and analyze time-series sequential data, reducing the influence of time-dependent fluctuations and noise during AOA data acquisition and hence enhancing the positioning accuracy. We compare different machine-learning (ML) models, including linear regression (LR) and an artificial neural network (ANN), as well as different AOA data-sequence lengths and additional LSTM layers. The experimental results reveal that the LSTMNN model outperforms the other models under evaluation. When a sequence length of 20 and one LSTM layer are employed, the mean positioning error is 1.78 cm, and 90% of the experimental data have positioning errors within ∼2.9 cm. Compared with the LR and ANN models, the mean positioning errors are reduced by 49% and 40%, respectively, and the 90% cumulative distribution function (CDF) values are reduced by 3 cm and 1.3 cm, respectively, by utilizing the proposed LSTMNN model.
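The paper does not publish code; the following is a minimal, hypothetical sketch of the kind of LSTM regressor the abstract describes (one LSTM layer consuming a sequence of 20 AOA samples and regressing a 2-D receiver position). The input dimension, hidden size, and class/variable names are assumptions for illustration only, not the authors' implementation.

```python
import torch
import torch.nn as nn

# Illustrative sketch (not the authors' code): an LSTM that maps a
# time-series of AOA measurements to a 2-D position estimate.
class AOAPositionLSTM(nn.Module):
    def __init__(self, input_dim=2, hidden_dim=64, num_lstm_layers=1):
        super().__init__()
        # One LSTM layer processes the sequential AOA data,
        # mitigating time-dependent fluctuations and noise.
        self.lstm = nn.LSTM(input_dim, hidden_dim,
                            num_layers=num_lstm_layers, batch_first=True)
        # Fully connected head regresses the (x, y) receiver position.
        self.head = nn.Linear(hidden_dim, 2)

    def forward(self, x):
        # x: (batch, seq_len, input_dim), e.g. sequence length 20
        _, (h_n, _) = self.lstm(x)
        # Use the final hidden state of the last LSTM layer.
        return self.head(h_n[-1])

# Usage example with random tensors standing in for measured AOA sequences.
model = AOAPositionLSTM()
aoa_sequences = torch.randn(8, 20, 2)   # batch of 8, sequence length 20
positions = model(aoa_sequences)        # shape (8, 2): predicted (x, y)
```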
| Original language | English |
|---|---|
| Article number | 128761 |
| Journal | Optics Communications |
| Volume | 524 |
| DOIs | |
| State | Published - 1 Dec 2022 |
Keywords
- Light emitting diode (LED)
- Optical wireless communication (OWC)
- Visible light communication (VLC)
- Visible light positioning (VLP)