e-space
Manchester Metropolitan University's Research Repository

    Machine Learning-based Signal Correlation for Improved Satellite-5G Cellular Convergence Use Cases Network Quality Metrics Prediction

    Ekpo, Sunday (ORCID: https://orcid.org/0000-0001-9219-3759), Han, Liangxiu (ORCID: https://orcid.org/0000-0003-2491-7473), Ijaz, Muhammad (ORCID: https://orcid.org/0000-0002-0050-9435), Zafar, Muazzam and Gibson, Andy (ORCID: https://orcid.org/0000-0003-2874-5816) (2022) Machine Learning-based Signal Correlation for Improved Satellite-5G Cellular Convergence Use Cases Network Quality Metrics Prediction. In: CHIST-ERA Conference 2022, 24 May 2022 - 26 May 2022, Edinburgh, UK. (Unpublished)

    Presentation. Available under License: In Copyright. Download (133kB).

    Abstract

    Mobile technologies prior to the fifth generation (5G) accommodated only a single use case, with less complex quality-of-service (QoS) performance constraints. The 5G radio access technology (RAT) addresses three primary use cases. Firstly, enhanced mobile broadband (eMBB) promises enhanced indoor and outdoor broadband connections and support for applications such as augmented and virtual reality. Secondly, massive machine-type communications (mMTC) will enable the Internet of Things (IoT), spanning smart cities, smart agriculture and the industrial IoT. Thirdly, ultra-reliable low-latency communications (URLLC) aims to support smart grids, autonomous vehicles, industrial automation and telehealth. These use cases present very demanding and complex QoS performance metrics requirements for a real-time satellite-cellular convergence network ecosystem.

    Cognitive radio (CR) technology provides the dynamic spectrum access capability to share the wireless channel with licensed and unlicensed users in an opportunistic manner. However, the cognitive capability (interaction with the radio environment for temporal and spatial spectrum usage awareness) and reconfigurability (programmable transmit/receive frequencies via different access technologies) present enormous spectrum and interference management challenges, owing to the vagaries of user equipment (UE) hardware design and stochastic channel conditions.

    The proposed novel space-terrestrial communication ecosystem requires a real-time dynamic radio network learning technique for end-to-end QoS metrics assessment and decision making at each connected node. The current satellite-cellular network relies on traditional signal processing, machine learning and deep learning methods that exploit mainly the signal properties, with limited or no dynamic recourse to the hardware configuration of the primary UE. No artificial intelligence (AI) method is currently applied to QoS performance metrics prediction in a commercial, ubiquitous satellite-5G communication network.

    The proposed research will develop a deterministic adaptive machine learning algorithm (DAMLA) for ubiquitous 5G QoS prediction by bringing together experts in communication and systems engineering, big data analytics, and machine learning/AI. The proposed DAMLA will be tested and validated with a three-level 5G emulation testbed covering the network, channel and UE levels. The research will enable the 5G open radio access network (O-RAN) QoS performance metrics, viz. peak data rate (Gbit/s), user experienced data rate (Mbit/s), area traffic capacity (Mbit/s/m²), network energy efficiency, connection density (devices/km²), latency (ms), mobility (km/h) and spectrum efficiency, to be investigated, simulated and characterised for the three main 5G use cases.
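    As a rough illustration of the kind of prediction task the abstract describes, the sketch below trains an off-the-shelf regressor to predict one QoS metric (user experienced data rate) from signal-level features combined with UE hardware-configuration features. Everything here is assumed for illustration only: the feature set, the synthetic target, and scikit-learn's GradientBoostingRegressor standing in for the (unpublished) DAMLA.

    # Minimal illustrative sketch (not the authors' DAMLA): predict a 5G QoS
    # metric (user experienced data rate, Mbit/s) from signal properties plus
    # hypothetical UE hardware-configuration features, on synthetic data.
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.metrics import mean_absolute_error
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(42)
    n = 2000

    # Hypothetical signal-level features for each connected node.
    snr_db = rng.uniform(0, 30, n)                # signal-to-noise ratio (dB)
    bandwidth_mhz = rng.choice([20, 50, 100], n)  # allocated channel bandwidth
    doppler_hz = rng.uniform(0, 500, n)           # mobility-induced Doppler shift

    # Hypothetical UE hardware-configuration features (the abstract argues
    # these are under-used by purely signal-driven models).
    antenna_elems = rng.choice([2, 4, 8], n)      # UE antenna elements
    pa_backoff_db = rng.uniform(0, 6, n)          # power-amplifier back-off

    # Synthetic target: a toy Shannon-capacity-style proxy for the user
    # experienced data rate, perturbed by noise.
    rate = (bandwidth_mhz * np.log2(1 + 10 ** (snr_db / 10))
            * (1 + 0.05 * antenna_elems)
            - 0.2 * doppler_hz - 5 * pa_backoff_db
            + rng.normal(0, 10, n))

    X = np.column_stack([snr_db, bandwidth_mhz, doppler_hz,
                         antenna_elems, pa_backoff_db])
    X_tr, X_te, y_tr, y_te = train_test_split(X, rate, test_size=0.25,
                                              random_state=0)

    model = GradientBoostingRegressor().fit(X_tr, y_tr)
    print(f"MAE: {mean_absolute_error(y_te, model.predict(X_te)):.1f} Mbit/s")

    In the proposed work, measured traces from the three-level emulation testbed (network, channel and UE) would presumably replace the synthetic data, and the same framing extends to the other O-RAN QoS metrics listed above.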
