Open Access
Article
Article ID: 3072
by Farhan Nisar, Baseer Ali Rehman
Comput. Telecommun. Eng. 2025, 3(2);   
Received: 16 November 2024; Accepted: 13 March 2025; Available online: 1 April 2025;
Issue release: 30 June 2025
Abstract

The rapid expansion of cloud computing within business environments, along with the increasing complexity of organizational deployments, has led to a surge in cloud-based attacks on computer networks. These attacks exploit security vulnerabilities, leading to systemic breaches. This study explores robust defense mechanisms that leverage policy-based configurations and rule enforcement on edge network devices. The mechanisms were tested in GNS3 simulations to harden internal and external infrastructures against critical threats such as ICMP-based attacks, CDP exploitation, and port security violations.
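The port security policy mentioned above can be illustrated with a small sketch. The snippet below is a hypothetical model (not taken from the study) of the usual rule enforced on edge switch ports: the port learns up to a fixed number of source MAC addresses and shuts down on a violation; the class and method names are illustrative assumptions.

```python
class SecurePort:
    """Toy model of a switch port with port security enabled:
    at most `max_macs` source MACs may be learned; any further
    address triggers the violation action (shutdown)."""

    def __init__(self, max_macs=2):
        self.max_macs = max_macs
        self.learned = set()
        self.shutdown = False

    def receive_frame(self, src_mac):
        """Return True if the frame is forwarded, False if dropped."""
        if self.shutdown:
            return False              # a shut-down port drops everything
        if src_mac in self.learned:
            return True
        if len(self.learned) < self.max_macs:
            self.learned.add(src_mac)  # dynamically learn the address
            return True
        self.shutdown = True           # violation: shut the port down
        return False

port = SecurePort(max_macs=2)
print(port.receive_frame("aa:aa"))  # True  (learned)
print(port.receive_frame("bb:bb"))  # True  (learned)
print(port.receive_frame("cc:cc"))  # False (violation: port shut down)
print(port.receive_frame("aa:aa"))  # False (port stays down)
```

In a GNS3 lab, the equivalent behavior is configured on the switch itself; the model above only captures the learn-then-violate state machine.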

Open Access
Article
Article ID: 8228
by Esraq Humayun, Farzana Talukder Sumona, Lim Kok Cheng, Md Shahin Hossain, Ali Selamat, Ondrej Krejar
Comput. Telecommun. Eng. 2025, 3(2);   
Received: 25 May 2025; Accepted: 1 June 2025; Available online: 18 June 2025;
Issue release: 30 June 2025
Abstract

Early identification of cancer remains essential because it can greatly improve a patient's chances of survival, as cancer is a leading cause of death worldwide. In this study, we introduce a hybrid machine learning (ML) model that combines support vector machine (SVM) and random forest (RF) algorithms. To improve diagnostic accuracy, classifiers such as linear regression (LinReg), SVM, and logistic regression (LogReg) are also evaluated. The model is tested and validated on the UAE Cancer dataset, which contains patients' medical records, demographic information, clinical data, and outcomes. Our results demonstrate that the hybrid model achieved 98.3% accuracy, 98.5% recall, and 0.99 AUC-ROC, outperforming the individual classifiers. These findings must still be validated across multiple datasets, adapted for practical clinical use, and checked for bias in the underlying data. This study highlights the potential of hybrid ML models in clinical care and encourages further research on cancer diagnostics.
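The evaluation metrics reported here (accuracy, recall, AUC-ROC) can be computed directly from a classifier's outputs. The sketch below uses toy values, not the paper's data, and implements AUC-ROC via the standard rank-based (Mann-Whitney) formulation.

```python
def accuracy(y_true, y_pred):
    """Fraction of predictions that match the labels."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def recall(y_true, y_pred, positive=1):
    """True positives over all actual positives."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    return tp / (tp + fn)

def auc_roc(y_true, scores):
    """AUC-ROC via the rank (Mann-Whitney) formulation: the probability
    that a random positive is scored above a random negative."""
    pos = [s for t, s in zip(y_true, scores) if t == 1]
    neg = [s for t, s in zip(y_true, scores) if t == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0 for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example: six cases with the model's positive-class scores.
y_true = [1, 1, 1, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.6, 0.2, 0.1]
y_pred = [int(s >= 0.5) for s in scores]

print(round(accuracy(y_true, y_pred), 3))  # 0.667
print(round(recall(y_true, y_pred), 3))    # 0.667
print(round(auc_roc(y_true, scores), 3))   # 0.889
```

In practice these metrics are usually taken from a library such as scikit-learn; the hand-rolled versions above only make the definitions explicit.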

Open Access
Article
Article ID: 8431
by Zhongming Wang
Comput. Telecommun. Eng. 2025, 3(2);   
Received: 7 March 2025; Accepted: 31 May 2025; Available online: 20 June 2025;
Issue release: 30 June 2025
Abstract

The rapid densification of fifth-generation (5G) radio access networks and the growing demand for low-latency services have significantly increased the energy consumption of mobile infrastructures, raising critical concerns regarding operational cost and environmental sustainability. Multi-access edge computing has been introduced as a key architectural paradigm to support stringent latency requirements by deploying computing resources closer to base stations. However, the deployment of edge computing does not inherently guarantee energy efficiency, as edge platforms may consume substantial baseline power under low utilization if orchestration and task placement are not energy-aware. This paper proposes an energy-efficient edge computing architecture for 5G networks that integrates real-time energy monitoring with load-aware task scheduling at the edge layer. The proposed architecture is aligned with standardized 5G edge deployment frameworks and is evaluated using real operational base station data, including traffic load, computing utilization, and power consumption measurements. By leveraging real data rather than synthetic workloads, the proposed approach enables a realistic assessment of energy efficiency under practical operating conditions. Experimental results demonstrate that the proposed architecture achieves approximately 30% improvement in energy efficiency compared with a conventional edge computing deployment without energy-aware scheduling, while maintaining comparable latency performance. The findings indicate that data-driven energy-aware orchestration at the network edge can deliver measurable energy savings in commercial 5G environments. This work provides practical insights for mobile network operators seeking to reduce the energy footprint of 5G infrastructures and contributes a deployable architectural framework for energy-efficient edge computing in next-generation mobile networks.
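Load-aware, energy-aware task placement of the kind described can be sketched with a linear power model, P(u) = P_idle + (P_max - P_idle) * u, where u is node utilization. The snippet below is an illustrative greedy scheduler, not the paper's algorithm: node names, capacities, and power figures are assumptions.

```python
class EdgeNode:
    """Edge node with an assumed linear power model."""

    def __init__(self, name, capacity, p_idle, p_max):
        self.name = name
        self.capacity = capacity  # compute units available
        self.p_idle = p_idle      # watts at zero utilization
        self.p_max = p_max        # watts at full utilization
        self.load = 0.0

    def power(self, extra=0.0):
        """Power draw if `extra` demand were added."""
        utilization = (self.load + extra) / self.capacity
        return self.p_idle + (self.p_max - self.p_idle) * utilization

    def marginal_cost(self, demand):
        """Increase in power draw caused by placing `demand` here."""
        return self.power(demand) - self.power()

def place(demand, nodes):
    """Greedy energy-aware placement: among nodes with spare capacity,
    pick the one whose power draw increases the least."""
    feasible = [n for n in nodes if n.load + demand <= n.capacity]
    best = min(feasible, key=lambda n: n.marginal_cost(demand))
    best.load += demand
    return best

nodes = [EdgeNode("A", 100, 50, 150), EdgeNode("B", 100, 80, 120)]
first = place(30, nodes)   # B: shallower power slope, so lower marginal cost
second = place(80, nodes)  # A: B would exceed its capacity
print(first.name, second.name)  # B A
```

A real orchestrator would also weigh latency constraints and the monitored (rather than modelled) power readings, which is where the real-time energy monitoring in the proposed architecture comes in.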

Open Access
Article
Article ID: 8434
by Ashraf Hassan
Comput. Telecommun. Eng. 2025, 3(2);   
Received: 16 February 2025; Accepted: 19 May 2025; Available online: 18 June 2025;
Issue release: 30 June 2025
Abstract

The global rollout of 5G technology promises unprecedented data rates, ultra-low latency, and massive device connectivity. However, the research community often lacks access to large-scale, real-world datasets needed to model the highly heterogeneous nature of network performance and user quality of experience (QoE). A complex interplay of radio frequency conditions, network deployment strategies, and device capabilities shapes these characteristics. While traditional drive-testing can provide granular data, its utility is limited by spatial and temporal constraints, making it unsuitable for continuous large-scale analysis. To address this data gap, this paper introduces OPNet-Sim, a framework for generating realistic, large-scale, multi-dimensional synthetic datasets that emulate data collected from commercial 5G smartphones. The design of OPNet-Sim is informed by statistical characteristics and data schemas found in the literature and public reports on large-scale network measurement. The simulated dataset encompasses over 1.2 billion synthetic records, emulating data from more than 150,000 unique devices over 12 months. It includes detailed physical layer measurements (e.g., RSRP, RSRQ, SINR), key performance indicators (KPIs) such as throughput and latency, device context information, and network metadata. OPNet-Sim serves as both a benchmark and a synthetic data resource for researchers in telecommunications and data science. It enables the development, training, and validation of models for network performance prediction, QoE estimation for applications such as video streaming, and novel methodologies for network diagnostics, all without the privacy and access constraints associated with real user data. This paper describes the dataset generation methodology, the structural schema, validation against established models, and illustrative examples of potential applications.
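A record schema of the kind described can be sketched as follows. This is a hypothetical generator in the spirit of OPNet-Sim, not its actual methodology: the field names and value distributions (e.g., Gaussian RSRP around -95 dBm) are illustrative assumptions, not the framework's calibrated models.

```python
import random

random.seed(7)  # reproducible toy output

def synth_record(device_id, t):
    """One synthetic measurement record, loosely modelled on the fields
    named in the abstract; ranges are plausible defaults only."""
    rsrp = random.gauss(-95, 10)                 # dBm
    sinr = random.gauss(12, 6)                   # dB
    rsrq = max(-20.0, min(-3.0, rsrp / 10 + random.gauss(0, 1)))  # dB
    throughput = max(0.0, 40 + 8 * sinr + random.gauss(0, 20))    # Mbit/s
    latency = max(5.0, random.gauss(25, 8))      # ms
    return {
        "device_id": device_id,
        "timestamp": t,
        "rsrp_dbm": round(rsrp, 1),
        "rsrq_db": round(rsrq, 1),
        "sinr_db": round(sinr, 1),
        "throughput_mbps": round(throughput, 1),
        "latency_ms": round(latency, 1),
    }

# Tiny sample: 3 devices x 2 time steps (the real dataset spans
# 150,000+ devices and 12 months).
dataset = [synth_record(f"dev{d:04d}", t) for d in range(3) for t in range(2)]
print(len(dataset))  # 6
```

Coupling throughput to SINR, as done crudely above, is the kind of cross-field consistency that a validated generator must get right at scale.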


Open Access
Review
Article ID: 8430
by Donghee Njoh
Comput. Telecommun. Eng. 2025, 3(2);   
Received: 16 May 2025; Accepted: 27 May 2025; Available online: 25 June 2025;
Issue release: 30 June 2025
Abstract

The telephone has undergone a remarkable transformation from its origins as a fixed-line analogue device to today’s mobile and internet-based systems that form the backbone of global communication. This review traces the historical progression of telephony, beginning with the invention and expansion of circuit-switched fixed-line networks, through the generational evolution of mobile systems from 1G to the emerging vision of 6G, and culminating in the rise of internet-based platforms such as Voice over Internet Protocol (VoIP) and Unified Communications (UC). Each stage reflects not only technical innovation but also broader socioeconomic shifts, with implications for infrastructure, spectrum management, security, sustainability, and user behaviour. The analysis highlights how engineering responses to challenges such as noise, capacity, and scalability have shaped telephony’s evolution, while identifying future directions in satellite telephony, artificial intelligence, quantum communication, and immersive extended reality (XR). By synthesising historical, technical, and forward-looking perspectives, this review underscores telephony’s continued role as a driver of technological advancement and its enduring relevance to telecommunication engineering in the twenty-first century.
