
Towards Logging Noisiness Theory: quality aspects to characterize unwanted log entries

by Eduardo Mendes, et al.

Context: Logging tasks track a system's functioning by keeping records of evidence that are analyzed by monitoring and observability activities. For these activities to be effective, the quality of the consumed information must be considered. Problem: The presence of noise, that is, unwanted information, compromises the quality of log files. The noisiness of a log file can be affected by, among other things: (i) wrong choices of log severity level, (ii) the production of duplicate entries, (iii) incomplete information, (iv) inappropriately formatted entries, and (v) the sheer volume of information generated. Objective: This work aims to broadly define the concept of noise in the context of logging, proposing the initial steps of Logging Noisiness, a theory on quality aspects to characterize unwanted log entries.
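Two of the noise aspects listed above, duplicate entries (ii) and wrong severity choices (i), are easy to illustrate with a short sketch. The log lines, field layout (severity followed by message), and helper functions below are hypothetical examples, not part of the paper:

```python
from collections import Counter

# Hypothetical log lines illustrating two noise aspects:
# an exact duplicate (ii) and the same message emitted
# under two different severity levels (i).
LOG_LINES = [
    "ERROR user login failed for id=42",
    "ERROR user login failed for id=42",  # exact duplicate
    "INFO heartbeat ok",
    "DEBUG heartbeat ok",                 # same message, different level
]

def duplicate_entries(lines):
    """Return lines that appear more than once verbatim (noise aspect ii)."""
    counts = Counter(lines)
    return [line for line, n in counts.items() if n > 1]

def inconsistent_severity(lines):
    """Return messages logged under more than one severity level (noise aspect i)."""
    levels_by_message = {}
    for line in lines:
        level, _, message = line.partition(" ")
        levels_by_message.setdefault(message, set()).add(level)
    return [msg for msg, levels in levels_by_message.items() if len(levels) > 1]

print(duplicate_entries(LOG_LINES))      # duplicated lines
print(inconsistent_severity(LOG_LINES))  # messages with mixed severities
```

Real log files would need parsing beyond a simple severity-prefix split, but even this crude pass surfaces entries a monitoring pipeline would want to deduplicate or re-level.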


Log severity level classification: an approach for systems in production

Context: Logs are often the primary source of information for system dev...

Log severity levels matter: A multivocal mapping

The choice of log severity level can be challenging and cause problems i...

Active Meta-Learner for Log Analysis

The analysis of logs is a vital activity undertaken for cyber investigat...

Highly Scalable and Flexible Model for Effective Aggregation of Context-based Data in Generic IIoT Scenarios

Interconnectivity of production machines is a key feature of the Industr...

Forensic Analysis of the exFAT artefacts

Although keeping some basic concepts inherited from FAT32, the exFAT fil...

Reducing Honeypot Log Storage Capacity Consumption – Cron Job with Perl-Script Approach

Honeypot is a decoy computer system that is used to attract and monitor ...

Detecting Botnets Through Log Correlation

Botnets, which consist of thousands of compromised machines, can cause s...