On the Robustness of Deep Learning-predicted Contention Models for Network Calculus

11/24/2019
by   Fabien Geyer, et al.

The network calculus (NC) analysis takes a simple model consisting of a network of schedulers and data flows crossing them. A number of analysis "building blocks" can then be applied to capture the model without imposing pessimistic assumptions like self-contention on tandems of servers. Yet, adding pessimism cannot always be avoided. Computing the best bound on a single flow's end-to-end delay thus boils down to finding the least pessimistic contention models for all tandems of schedulers in the network — and an exhaustive search can easily become a very resource-intensive task. The literature proposes a promising solution to this dilemma: a heuristic making use of machine learning (ML) predictions inside the NC analysis. While the results of this work are promising in terms of delay bound quality and computational effort, there is little to no insight into why a prediction is made, or whether the trained machine can achieve similarly striking results in networks vastly differing from its training data. In this paper we address these pending questions. We evaluate how the training data and its features influence prediction accuracy, and we assess the method's impact and scalability. Additionally, we contribute an extension of the method that predicts the best n contention-model alternatives, in order to achieve increased robustness when the method is applied outside the training data. Our numerical evaluation shows that good accuracy can still be achieved on large networks, even though we restrict the training to networks that are two orders of magnitude smaller.
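The best-n extension described above can be sketched as a simple selection loop: an ML model ranks the contention-model alternatives for a tandem, the NC analysis computes a delay bound for each of the n highest-ranked candidates, and the tightest bound is kept. The following is a minimal illustrative sketch, not the authors' implementation; the function names, the string encoding of alternatives, and the `delay_bound` callback are all hypothetical placeholders.

```python
from typing import Callable, Sequence


def best_n_delay_bound(
    alternatives: Sequence[str],   # candidate contention models (hypothetical encoding)
    scores: Sequence[float],       # ML-predicted quality score per alternative
    delay_bound: Callable[[str], float],  # NC analysis for one alternative (placeholder)
    n: int = 3,
) -> float:
    """Keep the n alternatives the ML model ranks highest, run the NC
    analysis on each, and return the tightest (smallest) delay bound."""
    ranked = sorted(zip(scores, alternatives), key=lambda p: p[0], reverse=True)
    top_n = [alt for _, alt in ranked[:n]]
    return min(delay_bound(alt) for alt in top_n)
```

Because the minimum is taken over n analyzed candidates rather than trusting the single top prediction, a mis-ranked first choice costs tightness only if all n candidates are poor, which is the robustness argument made in the abstract.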
