Max-min Learning of Approximate Weight Matrices from Fuzzy Data

01/15/2023
by Ismaïl Baaj, et al.

In this article, we study the approximate solution set Λ_b of an inconsistent system of max-min fuzzy relational equations (S): A ∘_min^max x = b. Using the L_∞ norm, we compute, by an explicit analytical formula, the Chebyshev distance Δ = inf_{c ∈ 𝒞} ‖b − c‖, where 𝒞 is the set of second members of the consistent systems defined with the same matrix A. We study the set 𝒞_b of Chebyshev approximations of the second member b, i.e., vectors c ∈ 𝒞 such that ‖b − c‖ = Δ, which is associated with the approximate solution set Λ_b in the following sense: an element of Λ_b is a solution vector x* of a system A ∘_min^max x = c with c ∈ 𝒞_b. As main results, we describe the structure of both Λ_b and 𝒞_b. We then introduce a paradigm for max-min learning of weight matrices relating input data to output data in a training set. The learning error is expressed in terms of the L_∞ norm, and we compute by an explicit formula its minimal value for given training data. We give a method to construct weight matrices whose learning error is minimal, which we call approximate weight matrices. Finally, as an application of our results, we show how to approximately learn the rule parameters of a possibilistic rule-based system from multiple training data.
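The paper's explicit analytical formula for Δ is not reproduced in the abstract. As a minimal sketch of the objects involved (an illustration, not the paper's method; the function names are my own), the following Python code implements the max-min composition A ∘_min^max x and the classical consistency test for (S): the system is consistent if and only if the greatest potential solution, built component-wise from the Gödel implication ε(a, b) = 1 if a ≤ b else b, actually solves it.

```python
import numpy as np

def maxmin(A, x):
    # Max-min composition: (A ∘ x)_i = max_j min(a_ij, x_j).
    return np.max(np.minimum(A, x[None, :]), axis=1)

def greatest_potential_solution(A, b):
    # Gödel implication applied entry-wise: eps(a_ij, b_i) = 1 if a_ij <= b_i else b_i.
    E = np.where(A <= b[:, None], 1.0, b[:, None] + 0.0 * A)
    # Greatest candidate solution: x̂_j = min_i eps(a_ij, b_i).
    return E.min(axis=0)

def is_consistent(A, b):
    # Classical criterion: A ∘ x = b has a solution iff x̂ solves it.
    xhat = greatest_potential_solution(A, b)
    return np.allclose(maxmin(A, xhat), b)

def linf_residual(A, b):
    # L_∞ residual at x̂; zero exactly when the system is consistent
    # (this is a natural residual, not the paper's formula for Δ).
    xhat = greatest_potential_solution(A, b)
    return np.max(np.abs(b - maxmin(A, xhat)))
```

For example, with A = [[0.8, 0.3], [0.5, 0.9]] and b = [0.6, 0.5] the system is consistent (x̂ = [0.6, 0.5] solves it), whereas replacing the first row of A by [0.2, 0.3] makes every composed value in the first equation at most 0.3 < 0.6, so no solution exists.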
