Event Classification with Multi-step Machine Learning

06/04/2021
by Masahiko Saito, et al.

We present the usefulness and value of multi-step machine learning (ML), in which a task is organized into connected sub-tasks with known intermediate inference goals, as opposed to a single large model trained end-to-end without intermediate sub-tasks. Pre-optimized ML models are connected, and better performance is obtained by re-optimizing the connected system. The selection of an ML model from several small candidate models for each sub-task is performed using ideas from Neural Architecture Search (NAS). In this paper, Differentiable Architecture Search (DARTS) and Single Path One-Shot NAS (SPOS-NAS) are tested, with the construction of the loss functions improved so that all ML models keep learning smoothly. Using DARTS and SPOS-NAS to optimize, select, and connect the models of a multi-step machine learning system, we find that (1) such a system can quickly and successfully select highly performant model combinations, and (2) the selected models are consistent with baseline algorithms, such as grid search, and their outputs are well controlled.
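For readers unfamiliar with the two search strategies named in the abstract, the sketch below illustrates their core mechanics on a toy two-step pipeline. This is a minimal illustration under assumed settings, not the paper's implementation: the class and helper names (MixedStep, mlp, spos_forward), network sizes, and training loop are all assumptions made for the example.

```python
# Minimal sketch of DARTS-style and SPOS-style selection of one candidate
# model per sub-task in a two-step pipeline. Illustrative only: names and
# network sizes are assumptions, not taken from the paper.
import random
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedStep(nn.Module):
    """DARTS-style step: a softmax-weighted mixture of candidate sub-models."""
    def __init__(self, candidates):
        super().__init__()
        self.candidates = nn.ModuleList(candidates)
        # Architecture parameters ("alpha"): one learnable weight per candidate.
        self.alpha = nn.Parameter(torch.zeros(len(candidates)))

    def forward(self, x):
        w = F.softmax(self.alpha, dim=0)
        # Differentiable relaxation: all candidates run and their outputs are
        # blended, so gradients reach both the model weights and alpha.
        return sum(wi * m(x) for wi, m in zip(w, self.candidates))

def mlp(d_in, d_out):
    return nn.Sequential(nn.Linear(d_in, 16), nn.ReLU(), nn.Linear(16, d_out))

# Two connected sub-tasks, each with two small candidate models.
step1 = MixedStep([mlp(4, 8), mlp(4, 8)])
step2 = MixedStep([mlp(8, 1), mlp(8, 1)])
pipeline = nn.Sequential(step1, step2)

x = torch.randn(32, 4)
y = torch.randint(0, 2, (32, 1)).float()

# DARTS search step (real DARTS alternates updates of model weights and alpha
# on separate data splits; a single joint update is shown for brevity).
opt = torch.optim.Adam(pipeline.parameters(), lr=1e-3)
loss = F.binary_cross_entropy_with_logits(pipeline(x), y)
opt.zero_grad(); loss.backward(); opt.step()

# SPOS-NAS instead trains a supernet by uniformly sampling ONE candidate per
# step for each batch, and searches over the discrete paths afterwards.
def spos_forward(x):
    h = random.choice(step1.candidates)(x)
    return random.choice(step2.candidates)(h)

# After either search, the winning candidate per step is kept and the
# selected, connected pipeline is re-optimized end to end.
chosen = [int(s.alpha.argmax()) for s in (step1, step2)]
```

The final end-to-end retraining of the selected candidates corresponds to the re-optimization of the connected models described in the abstract.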

Related research

11/11/2020
Efficient Neural Architecture Search for End-to-end Speech Recognition via Straight-Through Gradients
Neural Architecture Search (NAS), the process of automating architecture...

05/12/2022
Warm-starting DARTS using meta-learning
Neural architecture search (NAS) has shown great promise in the field of...

06/23/2019
Densely Connected Search Space for More Flexible Neural Architecture Search
In recent years, neural architecture search (NAS) has dramatically advan...

10/12/2021
On the Security Risks of AutoML
Neural Architecture Search (NAS) represents an emerging machine learning...

02/27/2022
ONE-NAS: An Online NeuroEvolution based Neural Architecture Search for Time Series Forecasting
Time series forecasting (TSF) is one of the most important tasks in data...

05/08/2023
MO-DEHB: Evolutionary-based Hyperband for Multi-Objective Optimization
Hyperparameter optimization (HPO) is a powerful technique for automating...

05/19/2022
Incremental Learning with Differentiable Architecture and Forgetting Search
As progress is made on training machine learning models on incrementally...
