In the next section, we derive a relation between distance measures and entropy for IVIFSs that satisfies all the axioms in the definition of entropy.
Here, we develop a technique that yields an entropy measure for IVIFSs satisfying the aforementioned properties.
The regression equations fitted to the values of differential enthalpy, differential entropy and Gibbs free energy of desorption and adsorption (Table 2) had a high degree of fit to the experimental data ($R^2 > 0.99$) and can be used to estimate these thermodynamic parameters for the desorption and adsorption processes of 'Malagueta' pepper seeds with moisture contents from 7.0 to 24.7% (d.b.) and from 4.6 to 21.3% (d.b.), respectively, at temperatures of 30, 40 and 50 °C.
Differential enthalpy and differential entropy of adsorption and desorption increase as the moisture content of the pepper seeds decreases, and are higher for the desorption process.
Figure 7 shows that the median entropy ratio of the investigated healthy motor is around 1.785.
The method is based on calculating the ratio of the information entropy of a current signal before and after wavelet packet decomposition and reconstruction.
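A minimal sketch of this entropy-ratio calculation is given below. It assumes PyWavelets for the wavelet packet step, a Shannon entropy computed from an amplitude histogram, a 'db4' wavelet, a level-3 decomposition, and reconstruction from the low-frequency approximation node; none of these choices come from the source, and the original method's wavelet, level, node selection, and entropy definition may differ.

```python
# Hypothetical sketch: entropy ratio of a current signal before and after
# wavelet packet decomposition and reconstruction (assumptions noted above).
import numpy as np
import pywt

def shannon_entropy(signal, bins=64):
    """Shannon entropy (bits) of the amplitude distribution of a signal."""
    hist, _ = np.histogram(signal, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]                      # drop empty bins to avoid log(0)
    return -np.sum(p * np.log2(p))

def entropy_ratio(current, wavelet="db4", level=3):
    """Ratio of signal entropy before vs. after wavelet packet
    decomposition and (partial) reconstruction."""
    e_before = shannon_entropy(current)

    # Full wavelet packet decomposition of the current signal.
    wp = pywt.WaveletPacket(data=current, wavelet=wavelet,
                            mode="symmetric", maxlevel=level)

    # Reconstruct from a single packet node (here the approximation branch
    # 'aaa'; which nodes the original method keeps is an assumption).
    node = "a" * level
    partial = pywt.WaveletPacket(data=None, wavelet=wavelet, mode="symmetric")
    partial[node] = wp[node].data
    reconstructed = partial.reconstruct(update=False)[: len(current)]

    e_after = shannon_entropy(reconstructed)
    return e_before / e_after

# Example with a synthetic 50 Hz current plus noise (hypothetical data).
t = np.linspace(0, 1, 4096)
current = np.sin(2 * np.pi * 50 * t) + 0.1 * np.random.randn(t.size)
print(entropy_ratio(current))
```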
This criterion states that the macrostate of a system is determined by the probability distribution that maximizes the entropy, given some constraints [2].
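As a standard illustration of this criterion (not tied to the specific system in [2]), maximizing the Shannon entropy $S = -\sum_i p_i \ln p_i$ subject to normalization $\sum_i p_i = 1$ and a fixed mean energy $\sum_i p_i E_i = \langle E \rangle$ via Lagrange multipliers gives
\[
\frac{\partial}{\partial p_i}\Bigl[-\sum_j p_j \ln p_j - \alpha\Bigl(\sum_j p_j - 1\Bigr) - \beta\Bigl(\sum_j p_j E_j - \langle E \rangle\Bigr)\Bigr]
= -\ln p_i - 1 - \alpha - \beta E_i = 0,
\]
so $p_i = e^{-\beta E_i} / \sum_j e^{-\beta E_j}$, the Boltzmann distribution, with $\beta$ fixed by the energy constraint ($\beta = 1/k_B T$ in the thermodynamic setting).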
Entropy has attracted the interest of many scientists and researchers over the past century, and not only in thermodynamics and statistical mechanics.
Scientists next compared their statistical measures of relatively higher or lower entropy with participants' scores on two standard IQ tests: the Shipley-Hartford test, which gauges verbal skills, and the Wechsler test, which assesses problem-solving abilities.
If brain entropy could offer useful insight into intelligence, Saxe proposed, then it should track closely with IQ scores.
In the following theorem, we give bounds for the weighted entropy of chemical graphs by taking atom-bond connectivity (ABC) edge weights.
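As a quick illustration, the sketch below computes a weighted graph entropy with ABC edge weights, assuming the common definition $I_w(G) = -\sum_e (w_e/W)\log_2(w_e/W)$ with $w_{uv} = \sqrt{(d_u + d_v - 2)/(d_u d_v)}$ and $W = \sum_e w_e$; the theorem's exact definition and normalization may differ.

```python
# Sketch: Shannon entropy of normalized ABC edge weights of a graph
# (a common form of weighted graph entropy; assumptions noted above).
import math
import networkx as nx

def abc_weight(G, u, v):
    """Atom-bond connectivity weight of edge uv."""
    du, dv = G.degree(u), G.degree(v)
    return math.sqrt((du + dv - 2) / (du * dv))

def weighted_entropy_abc(G):
    weights = [abc_weight(G, u, v) for u, v in G.edges()]
    total = sum(weights)
    return -sum((w / total) * math.log2(w / total) for w in weights)

# Example: the molecular graph of butane (path on 4 vertices).
G = nx.path_graph(4)
print(weighted_entropy_abc(G))
```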
The entropy measure of IVIFSs is a function $E : \mathrm{IVIF}(X) \to [0,1]$ which satisfies the following four properties.
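For reference, a formulation of these four axioms that is common in the IVIFS entropy literature (the exact statement used here may differ) is: (E1) $E(A) = 0$ if and only if $A$ is a crisp set; (E2) $E(A) = 1$ if and only if $[\mu_A^-(x), \mu_A^+(x)] = [\nu_A^-(x), \nu_A^+(x)]$ for every $x \in X$; (E3) $E(A) = E(A^c)$, where $A^c$ is the complement of $A$; (E4) $E(A) \le E(B)$ whenever $A$ is less fuzzy than $B$, i.e., componentwise $\mu_A^\pm(x) \le \mu_B^\pm(x)$ and $\nu_A^\pm(x) \ge \nu_B^\pm(x)$ when $\mu_B^\pm(x) \le \nu_B^\pm(x)$, with the reverse inequalities when $\mu_B^\pm(x) \ge \nu_B^\pm(x)$.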
where [??]_tot represents the total entropy; that is, [??]_tot = [??]_in + [??]_h.
Our motivation is, among other things, that this quantity generalizes some information measures already existing in the literature, such as the Arndt [19] entropy, which is used in physics.
The higher the probability of our "solution," the better our "solution."
Entropy can also be used to measure the quality of the state sequence of the $k$th-order HMM.
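One rough way to make this concrete, sketched below, is to score a decoded state sequence by the mean normalized entropy of the per-time-step posterior state distributions (lower entropy indicating a more confident decoding). This is an illustrative assumption; the exact entropy measure used for the $k$th-order HMM in the source may be defined differently.

```python
# Sketch: entropy-based quality score for an HMM state sequence, using the
# mean normalized entropy of per-time-step state posteriors (an assumption;
# not necessarily the source's exact formulation).
import numpy as np

def mean_normalized_state_entropy(gamma):
    """gamma: array of shape (T, N) with posterior state probabilities
    P(q_t = i | observations); each row sums to 1."""
    gamma = np.clip(gamma, 1e-12, 1.0)                    # avoid log(0)
    ent = -np.sum(gamma * np.log2(gamma), axis=1)         # entropy per step
    return float(np.mean(ent / np.log2(gamma.shape[1])))  # normalize to [0, 1]

# Hypothetical posteriors for T = 3 time steps and N = 2 states.
gamma = np.array([[0.90, 0.10],
                  [0.60, 0.40],
                  [0.99, 0.01]])
print(mean_normalized_state_entropy(gamma))  # close to 0 => confident decoding
```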