Machine Learning at the Edge Flagged for High-Efficiency 5G Optimisation
EDN's Martin Rowe has published a write-up detailing independent research by Professors Tim O'Shea and Andrea Goldsmith on how machine learning at the edge of a network can optimise signals for 5G New Radio and beyond.
Based on techniques originally developed for computer vision and molecular communications, the optimisation system developed by O'Shea and colleagues requires no knowledge of the radio-frequency chain itself, though it can be accelerated if context is provided: the data fed into the encoder is compared with the data recovered at the decoder, and from that comparison a model of the channel is built and optimised. The resulting optimisation is then fed back into the encoders and decoders as coefficients, automatically improving the signal.
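The snippet below is a minimal sketch of that encoder-channel-decoder idea, assuming a toy additive-noise channel in place of a real RF chain; the message size, layer widths, noise level and training settings are illustrative choices, not details taken from the research or the EDN article.

```python
# Sketch only: an end-to-end "learned transceiver" in PyTorch in which an encoder
# and decoder are trained jointly through a stand-in channel model, without any
# explicit description of the RF chain. All sizes and settings below are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

M = 16          # number of possible messages (4 bits per block) -- assumption
N_CHANNEL = 8   # real-valued channel uses per message           -- assumption
EBNO_DB = 7.0   # training signal-to-noise ratio                 -- assumption

encoder = nn.Sequential(nn.Linear(M, 32), nn.ReLU(), nn.Linear(32, N_CHANNEL))
decoder = nn.Sequential(nn.Linear(N_CHANNEL, 32), nn.ReLU(), nn.Linear(32, M))
optimiser = torch.optim.Adam(
    list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)

def awgn(x, ebno_db):
    """Stand-in channel model: additive white Gaussian noise."""
    rate = torch.log2(torch.tensor(float(M))) / N_CHANNEL   # bits per channel use
    noise_std = (2 * rate * 10 ** (ebno_db / 10)) ** -0.5
    return x + noise_std * torch.randn_like(x)

for step in range(2000):
    msgs = torch.randint(0, M, (256,))                       # random input data
    x = encoder(F.one_hot(msgs, M).float())                  # learned transmit signal
    x = x / x.pow(2).mean(dim=1, keepdim=True).sqrt()        # average-power constraint
    y = awgn(x, EBNO_DB)                                     # pass through channel model
    logits = decoder(y)                                      # learned receiver
    loss = F.cross_entropy(logits, msgs)                     # compare decoder output to input
    optimiser.zero_grad()
    loss.backward()
    optimiser.step()

with torch.no_grad():                                        # quick sanity check
    msgs = torch.randint(0, M, (10000,))
    x = encoder(F.one_hot(msgs, M).float())
    x = x / x.pow(2).mean(dim=1, keepdim=True).sqrt()
    pred = decoder(awgn(x, EBNO_DB)).argmax(dim=1)
    print("block error rate:", (pred != msgs).float().mean().item())
```

In a real deployment the differentiable noise model would stand in for, or be fitted to, measurements of the actual channel, and the trained weights would play the role of the coefficients pushed out to the encoders and decoders.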
Goldsmith's approach, meanwhile, uses a sliding bidirectional recurrent neural network to optimise the signal, taking a common signal-processing method and applying machine learning to greatly improve its ability to react to changing channel conditions.
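One common way a sliding bidirectional recurrent network is applied in this setting is as a learned symbol detector that is slid across the received stream, with overlapping estimates averaged. The sketch below illustrates that pattern only; the toy channel, window length and network sizes are assumptions and do not come from Goldsmith's work.

```python
# Sketch only: a sliding bidirectional recurrent detector over a toy channel.
# The ISI-plus-noise channel, GRU sizes and window length are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

WINDOW = 10      # symbols the BRNN sees at once -- assumption
HIDDEN = 32

class BRNNDetector(nn.Module):
    def __init__(self):
        super().__init__()
        self.rnn = nn.GRU(input_size=1, hidden_size=HIDDEN,
                          batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * HIDDEN, 2)   # binary symbols

    def forward(self, y):                     # y: (batch, window, 1)
        h, _ = self.rnn(y)
        return self.out(h)                    # per-symbol logits

def toy_channel(bits):
    """Stand-in channel: BPSK symbols through a short ISI tap plus noise."""
    x = 2.0 * bits - 1.0
    y = x + 0.4 * F.pad(x, (1, 0))[:, :-1] + 0.1 * torch.randn_like(x)
    return y.unsqueeze(-1)

det = BRNNDetector()
opt = torch.optim.Adam(det.parameters(), lr=1e-3)

# Train on random fixed-length windows.
for step in range(2000):
    bits = torch.randint(0, 2, (128, WINDOW)).float()
    logits = det(toy_channel(bits))
    loss = F.cross_entropy(logits.reshape(-1, 2), bits.reshape(-1).long())
    opt.zero_grad(); loss.backward(); opt.step()

# Sliding detection: move the window one symbol at a time over a longer stream
# and average the per-symbol probabilities from overlapping windows.
with torch.no_grad():
    bits = torch.randint(0, 2, (1, 200)).float()
    y = toy_channel(bits)
    probs = torch.zeros(200, 2)
    counts = torch.zeros(200, 1)
    for start in range(200 - WINDOW + 1):
        p = F.softmax(det(y[:, start:start + WINDOW]), dim=-1)[0]
        probs[start:start + WINDOW] += p
        counts[start:start + WINDOW] += 1
    est = (probs / counts).argmax(dim=-1)
    print("bit error rate:", (est != bits[0].long()).float().mean().item())
```

Because the network is retrained (or fine-tuned) from received data rather than derived from a fixed analytical model, it can adapt as the channel changes, which is the advantage the article highlights.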
"Channel sounding of 5G signals started years ago with the intent of gaining knowledge on how 5G signals will react in a real-world environment. From those experiments, engineers have developed channel models," Rowe writes. "5G New Radio is, however, much more complex than LTE given beam steering and mmWave signals. Thus, developing mathematical models and adapting them to modems has become far more difficult with 5G."
Rowe's full piece can be read on EDN now. A brief video demonstration of O'Shea's machine learning system for 5G, meanwhile, is reproduced below.
[embed]https://youtu.be/UZBqk1OkXvo[/embed]