Delay-Driven Machine Learning Scheme for TCP Data Flow Performance Enhancement
Abstract
Wireless networks with a substantial Bandwidth-Delay Product (BDP) often use the Transmission Control Protocol (TCP) for conventional data delivery. However, TCP grows its congestion window aggressively, often resulting in inefficient bandwidth utilization. Furthermore, its congestion-monitoring mechanism contributes to significant packet loss, a problem exacerbated by faster network devices and the growing Internet. This paper presents an intelligent congestion-control solution that uses machine learning models to automatically adjust the number of virtual parallel streams. The approach improves congestion control by adapting to delay variations that accompany fluctuations in the congestion window (Cwnd). MATLAB-based simulations show that the proposed technique outperforms existing algorithms in terms of bandwidth efficiency.
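To make the delay-driven idea concrete, the following is a minimal sketch of a controller that adjusts a virtual-stream count from measured queueing delay (measured RTT minus the uncongested base RTT). The thresholds, update rule, and function name are illustrative assumptions for exposition, not the paper's actual machine learning model.

```python
def adjust_streams(current_streams, rtt_ms, base_rtt_ms,
                   min_streams=1, max_streams=16):
    """Return an updated virtual-stream count based on queueing delay.

    Queueing delay = measured RTT minus the base (uncongested) RTT.
    Low delay -> add a stream to use spare bandwidth;
    high delay -> halve the streams to back off before loss occurs.
    All thresholds here are hypothetical, chosen only for illustration.
    """
    queue_delay = rtt_ms - base_rtt_ms
    if queue_delay < 5:          # little queueing: probe for more bandwidth
        current_streams += 1
    elif queue_delay > 50:       # heavy queueing: back off multiplicatively
        current_streams //= 2
    # otherwise hold the current stream count steady
    return max(min_streams, min(max_streams, current_streams))
```

For example, with a 30 ms base RTT, a 32 ms sample grows 4 streams to 5, while a 100 ms sample halves 8 streams to 4; the count is always clamped to the `[min_streams, max_streams]` range.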