Scaling down Deep Learning
Though deep learning models have taken on commercial and political relevance, many aspects of their training and operation remain poorly understood. This has sparked interest in "science of deep learning" projects, many of which are run at scale and require enormous amounts of time, money, and electricity. But how much of this research really needs to occur at scale? In this paper, we introduce MNIST-1D: a minimalist, low-memory, and low-compute alternative to classic deep learning benchmarks. The training examples are 20 times smaller than MNIST examples, yet they differentiate more clearly between linear, nonlinear, and convolutional models, which attain 32%, 68%, and 94% accuracy respectively (these models obtain 94%, 99+%, and 99+% on MNIST). We then present example use cases, which include measuring the spatial inductive biases of lottery tickets, observing deep double descent, and metalearning an activation function.
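To make the comparison concrete, here is a minimal sketch (not the paper's reference code) of the three model classes the abstract contrasts. It assumes 40-dimensional 1-D examples (28x28 MNIST pixels scaled down roughly 20x) and 10 digit classes; the architectures are illustrative choices, and random tensors stand in for the real MNIST-1D data.

```python
# Sketch of the three model classes compared on MNIST-1D-style inputs.
# Assumptions: 40-dim 1-D examples, 10 classes, placeholder random data.
import torch
import torch.nn as nn

SEQ_LEN, NUM_CLASSES = 40, 10

linear = nn.Linear(SEQ_LEN, NUM_CLASSES)   # linear classifier

mlp = nn.Sequential(                        # nonlinear, fully connected
    nn.Linear(SEQ_LEN, 100), nn.ReLU(),
    nn.Linear(100, NUM_CLASSES),
)

class ConvNet(nn.Module):                   # convolutional: local,
    def __init__(self):                     # translation-equivariant filters
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.head = nn.Linear(16, NUM_CLASSES)

    def forward(self, x):                   # x: (batch, seq_len)
        h = self.features(x.unsqueeze(1))   # -> (batch, 16, 1)
        return self.head(h.squeeze(-1))

def train(model, x, y, steps=200, lr=1e-2):
    """Fit on (x, y) and return training accuracy."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = nn.functional.cross_entropy(model(x), y)
        loss.backward()
        opt.step()
    return (model(x).argmax(1) == y).float().mean().item()

# Placeholder data; substituting real MNIST-1D examples is what yields
# the 32% / 68% / 94% ordering reported in the abstract.
x = torch.randn(512, SEQ_LEN)
y = torch.randint(0, NUM_CLASSES, (512,))
for name, m in [("linear", linear), ("mlp", mlp), ("cnn", ConvNet())]:
    print(name, train(m, x, y))
```

The point of the benchmark is that this entire comparison runs in seconds on a CPU, while still separating model families by their inductive biases rather than raw capacity.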