WARC-DL: Scalable Web Archive Processing for Deep Learning

09/25/2022
by Niklas Deckers, et al.

Web archives have grown to petabytes. Besides providing invaluable background knowledge on many social and cultural developments over the last 30 years, they also offer vast amounts of training data for machine learning. To benefit from recent developments in deep learning, processing web archives requires a scalable solution that supports both inference with and training of neural networks. To date, there is no publicly available library for processing web archives in this way, and existing applications rely on workarounds. This paper presents WARC-DL, a deep-learning-enabled pipeline for web archive processing that scales to petabytes.
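
To illustrate the kind of processing such a pipeline performs, the following Python sketch streams records from a WARC file and runs batched inference over the extracted HTML payloads. This is not WARC-DL's actual API: the warcio library, the file name, and the classify_batch stand-in for a trained neural model are assumptions made purely for illustration.

    # Illustrative sketch only (not the WARC-DL API): stream records from a
    # WARC file and run a neural classifier over HTML payloads in batches.
    from warcio.archiveiterator import ArchiveIterator


    def classify_batch(texts):
        """Hypothetical stand-in for neural inference, e.g. model.predict()
        on a batch of documents; returns one score per input text."""
        return [0.0 for _ in texts]


    def iter_html_records(warc_path):
        """Yield (URI, decoded payload) pairs for HTML response records."""
        with open(warc_path, "rb") as stream:
            for record in ArchiveIterator(stream):
                if record.rec_type != "response":
                    continue
                headers = record.http_headers
                content_type = headers.get_header("Content-Type", "") if headers else ""
                if "text/html" not in content_type:
                    continue
                uri = record.rec_headers.get_header("WARC-Target-URI")
                payload = record.content_stream().read()
                yield uri, payload.decode("utf-8", errors="replace")


    def run_inference(warc_path, batch_size=32):
        """Batch HTML documents and yield (URI, score) pairs."""
        uris, texts = [], []
        for uri, text in iter_html_records(warc_path):
            uris.append(uri)
            texts.append(text)
            if len(texts) == batch_size:
                yield from zip(uris, classify_batch(texts))
                uris, texts = [], []
        if texts:  # flush the final partial batch
            yield from zip(uris, classify_batch(texts))


    if __name__ == "__main__":
        # "example.warc.gz" is a placeholder input path
        for uri, score in run_inference("example.warc.gz"):
            print(uri, score)

In a petabyte-scale setting, the per-file loop above would be distributed across many workers and the placeholder classifier replaced by a real neural network, which is the gap WARC-DL aims to close.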

