Machine learning of the well known things

04/25/2022
by V. Dolotin et al.

Machine learning (ML) in its current form implies that the answer to any problem can be well approximated by a function of a very peculiar form: a specially adjusted iteration of Heaviside theta-functions. It is natural to ask whether the answers to questions we already know can be naturally represented in this form. We provide elementary, yet non-evident, examples showing that this is indeed possible, and suggest looking for a systematic reformulation of existing knowledge in an ML-consistent way. The success or failure of such attempts can shed light on a variety of problems, both scientific and epistemological.
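
As a minimal sketch of the kind of representation the abstract alludes to (this example is not taken from the paper; the target function sin(x), the knot grid, and the helper names theta and step_approximation are illustrative assumptions), a well-known function can be approximated by a sum of shifted Heaviside theta-functions, the simplest "single-layer" version of an iterated theta construction:

```python
# Illustrative sketch only: approximate sin(x) on [0, 2*pi] by a
# piecewise-constant sum of shifted Heaviside theta-functions.
import numpy as np

def theta(x):
    """Heaviside step function, with theta(0) = 1/2 by convention."""
    return np.heaviside(x, 0.5)

def step_approximation(f, x, knots):
    """Approximate f on points x by f(knots[0]) + sum_k df_k * theta(x - knots[k])."""
    values = f(knots)
    jumps = np.diff(values)                     # increments of f between consecutive knots
    approx = np.full_like(x, values[0], dtype=float)
    for k, dfk in enumerate(jumps, start=1):
        approx += dfk * theta(x - knots[k])     # each term switches on once x passes a knot
    return approx

x = np.linspace(0.0, 2 * np.pi, 1000)
knots = np.linspace(0.0, 2 * np.pi, 50)         # refining the grid improves the approximation
approx = step_approximation(np.sin, x, knots)
print("max error:", np.max(np.abs(approx - np.sin(x))))
```

Refining the knot grid shrinks the error, which mirrors the paper's premise that answers we already know admit arbitrarily good representations built purely from theta-functions.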


