Symbolic Tensor Neural Networks for Digital Media - from Tensor Processing via BNF Graph Rules to CREAMS Applications

09/18/2018
by   Wladyslaw Skarbek, et al.

This tutorial material on Convolutional Neural Networks (CNNs) and their applications in digital media research is based on the concept of Symbolic Tensor Neural Networks (STNN). The set of STNN expressions is specified in Backus-Naur Form (BNF), annotated with constraints typical of labeled directed acyclic graphs (DAGs). The BNF induction starts from a collection of neural unit symbols carrying up to five extra decoration fields (including tensor depth and sharing fields). The inductive rules provide not only the general graph structure but also specific shortcuts for residual blocks of units. A syntactic mechanism for modularizing network fragments is introduced via user-defined units and their instances. Moreover, dual BNF rules are specified in order to generate the Dual Symbolic Tensor Neural Network (DSTNN). The joint interpretation of STNN and DSTNN yields the correct flow of gradient tensors back-propagated at the training stage. The proposed symbolic representation of CNNs is illustrated on six generic digital media applications (CREAMS): Compression, Recognition, Embedding, Annotation, 3D Modeling for human-computer interfacing, and data Security based on digital media objects. To make each CNN description and its gradient flow complete, the symbolic representations of the mathematically defined loss/gain functions and the gradient-flow equations for all core units used are given for every presented application. The tutorial aims to convince the reader that STNN is not only a convenient symbolic notation for public presentations of CNN-based solutions to CREAMS problems, but also a design blueprint with the potential for automatic generation of application source code.
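The abstract only outlines the grammar-driven construction, so a small illustration may help. Below is a minimal Python sketch of the general idea: neural unit symbols carrying a decoration field are assembled into a labeled DAG, with a dedicated shortcut rule for a residual block, and a forward topological order whose reversal suggests how a dual (gradient) network could be scheduled. All names here (Unit, conv, relu, residual, topo) are hypothetical illustrations, not the paper's actual STNN syntax.

    # Minimal sketch (NOT the paper's STNN grammar): symbolic units
    # assembled into a labeled DAG with a residual shortcut.
    from dataclasses import dataclass, field

    @dataclass
    class Unit:
        kind: str                         # unit symbol, e.g. "conv", "relu", "add"
        depth: int = 0                    # one decoration field (tensor depth)
        inputs: list = field(default_factory=list)

    def conv(x, depth):
        return Unit("conv", depth, [x])

    def relu(x):
        return Unit("relu", x.depth, [x])

    def residual(x, depth):
        # Shortcut rule: the block output is added to its input,
        # mirroring a dedicated residual shortcut in the grammar.
        y = relu(conv(x, depth))
        return Unit("add", depth, [x, y])

    def topo(u, seen=None, order=None):
        # Forward topological order over the DAG; reversing it gives
        # the schedule on which a dual network would push gradients.
        seen = seen if seen is not None else set()
        order = order if order is not None else []
        if id(u) in seen:
            return order
        seen.add(id(u))
        for v in u.inputs:
            topo(v, seen, order)
        order.append(u)
        return order

    net = residual(Unit("input", 64), depth=64)
    print([u.kind for u in topo(net)])    # ['input', 'conv', 'relu', 'add']

Traversing the same DAG in reverse order is, in spirit, what the STNN/DSTNN pairing formalizes for the back-propagation of gradient tensors.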


