
Kolmogorov → Dempster → Zadeh. Zadeh: "[Various theories of uncertainty such as] fuzzy logic and probability theory are complementary rather than competitive."

Most Swedes are tall. Most tall Swedes are blond. What is the probability that Magnus (a Swede picked at random) is blond?

The problem involves linguistic quantifiers (Most) and linguistic attributes (tall, blond), and is therefore categorized as a prototypical advanced CWW problem. It makes an implicit assignment of the linguistic value "Most" to: the portion of Swedes who are tall, and the portion of tall Swedes who are blond.

Zadeh's quantifier syllogism:

Q1 A's are B's
Q2 (A and B)'s are C's
⟹ (Q1 × Q2) A's are (B and C)'s
⟹ At least (Q1 × Q2) A's are C's

where × is the product of two fuzzy quantifiers:

$$\mu_{Q_1 \times Q_2}(z) = \sup_{z = xy} \min\left(\mu_{Q_1}(x), \mu_{Q_2}(y)\right)$$

and At least is the following operation:

$$\mu_{\text{At least}(Q)}(x) = \sup_{y \le x} \mu_Q(y)$$
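On a discretized [0, 1] domain, the two operations above can be sketched as follows. This is a minimal sketch: the ramp model of "Most" (zero below 0.5, rising to 1 at 0.85) is an illustrative assumption, not the membership function used in the paper.

```python
import numpy as np

def quantifier_product(mu_q1, mu_q2, grid):
    """Product of two fuzzy quantifiers sampled on `grid`:
    mu_{Q1 x Q2}(z) = sup over {(x, y) : z = x*y} of min(mu_Q1(x), mu_Q2(y))."""
    out = np.zeros_like(grid)
    for i, x in enumerate(grid):
        for j, y in enumerate(grid):
            k = np.argmin(np.abs(grid - x * y))  # snap z = x*y to the nearest grid point
            out[k] = max(out[k], min(mu_q1[i], mu_q2[j]))
    return out

def at_least(mu_q):
    """mu_{At least(Q)}(x) = sup over {y <= x} of mu_Q(y): a running maximum."""
    return np.maximum.accumulate(mu_q)

# Assumed ramp model of "Most" for illustration only
grid = np.linspace(0.0, 1.0, 101)
mu_most = np.clip((grid - 0.5) / 0.35, 0.0, 1.0)
mu_most2 = quantifier_product(mu_most, mu_most, grid)
```

Because this "Most" is monotonically non-decreasing, applying `at_least` to it (or to its product with itself) leaves it unchanged, which is the property used below.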

An example: 50% of the students of the EE Department at USC are graduate students, and 80% of the graduate students of the EE Department at USC are on F1 visa. Therefore, at least 50% × 80% = 40% of the students of the EE Department at USC are on F1 visa.

In the Magnus problem: Q1 = Most, Q2 = Most, A = Swede, B = tall, C = blond. Therefore, At least (Most × Most) Swedes are both tall and blond. Most is modeled as a monotonic quantifier, so At least (Most²) = Most².

Zadeh interprets a linguistic constraint on the portion of a population as a linguistic probability (LProb), and directly concludes that LProb(Magnus is blond) = Most × Most = Most².

We construct a membership function (MF) for Most:
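A minimal sketch of such an MF, assuming a simple trapezoidal (ramp) shape; the breakpoints 0.5 and 0.85 are illustrative assumptions, not the values used in the paper:

```python
import numpy as np

def mu_most(u, a=0.5, b=0.85):
    """Trapezoidal MF for 'Most': 0 up to a, a linear ramp on [a, b], 1 above b.
    The breakpoints a and b are assumed for illustration."""
    u = np.asarray(u, dtype=float)
    return np.clip((u - a) / (b - a), 0.0, 1.0)
```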

We construct a vocabulary of type-1 fuzzy probabilities to translate the solution to a word: Absolutely improbable, Almost improbable, Very unlikely, Unlikely, Moderately likely, Likely, Very likely, Almost certain, Absolutely certain.

The MFs of the words are shown here.

The MF of Most² is depicted in the following.

We compute the Jaccard similarity between Most² and the members of the vocabulary. It is concluded that "It is Likely that Magnus is blond."
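For fuzzy sets sampled on a common grid, the Jaccard similarity is the cardinality of the intersection (pointwise minimum) divided by the cardinality of the union (pointwise maximum). A sketch, with a hypothetical `decode` helper that picks the best-matching vocabulary word:

```python
import numpy as np

def jaccard(mu_a, mu_b):
    """Jaccard similarity of two type-1 fuzzy sets sampled on the same grid."""
    return np.sum(np.minimum(mu_a, mu_b)) / np.sum(np.maximum(mu_a, mu_b))

def decode(mu_solution, vocabulary):
    """Return the vocabulary word whose MF is most similar to the solution.
    `vocabulary` maps each word to its sampled MF (a hypothetical interface)."""
    return max(vocabulary, key=lambda word: jaccard(mu_solution, vocabulary[word]))
```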

We generally have the following syllogism:

Q A's are B's ⟹ ¬Q A's are not B's

where $\mu_{\neg Q}(u) = \mu_Q(1-u)$ and $\mu_{\text{not } B}(u) = 1 - \mu_B(u)$. Thus: Most Swedes are tall ⟹ A few Swedes are not tall.
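On a symmetric sampled grid, the antonym $\mu_{\neg Q}(u) = \mu_Q(1-u)$ is simply a reversal of the sampled MF. A sketch, again using an assumed ramp model of "Most":

```python
import numpy as np

grid = np.linspace(0.0, 1.0, 101)                 # symmetric grid: u -> 1-u reverses it
mu_most = np.clip((grid - 0.5) / 0.35, 0.0, 1.0)  # assumed ramp model of "Most"

mu_few = mu_most[::-1]                            # mu_{Few}(u) = mu_{Most}(1 - u)

def mu_not(mu_b):
    """Complement of an attribute: mu_{not B}(u) = 1 - mu_B(u)."""
    return 1.0 - mu_b
```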

Similarly: Most tall Swedes are blond ⟹ A few tall Swedes are not blond. However, we do not know the distribution of blonds among those few Swedes who are not tall: all of them or none of them could be blond.

The available information is summarized in the following tree.

In the pessimistic case, none of the Swedes who are not tall are blond, so:

$$\mathrm{LProb}^{-} = \frac{\text{Most} \times \text{Most} + \text{Few} \times \text{None}}{\text{Most} + \text{Few}}$$

In the optimistic case, all of the Swedes who are not tall are blond, so:

$$\mathrm{LProb}^{+} = \frac{\text{Most} \times \text{Most} + \text{Few} \times \text{All}}{\text{Most} + \text{Few}}$$

By total probability:

LProb(blond|Swede) = LProb(tall|Swede) × LProb(blond|tall and Swede) + LProb(¬tall|Swede) × LProb(blond|¬tall and Swede)

Assuming LProb(blond|¬tall and Swede) is either None or All yields LProb⁻(Magnus is blond) or LProb⁺(Magnus is blond).
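Replacing the fuzzy quantifiers with crisp stand-in values gives a quick numeric sanity check of the structure of the two bounds. The values 0.85 for Most and 0.15 for Few are assumptions for illustration only; the actual solution computes these expressions with fuzzy weighted averages, not crisp numbers.

```python
most, few = 0.85, 0.15   # assumed crisp stand-ins for the fuzzy quantifiers
none, all_ = 0.0, 1.0    # singleton models of None and All

lprob_minus = (most * most + few * none) / (most + few)  # pessimistic bound
lprob_plus  = (most * most + few * all_) / (most + few)  # optimistic bound
print(lprob_minus, lprob_plus)  # → 0.7225 0.8725
```

Even with crisp stand-ins, the pessimistic bound is strictly below the optimistic one, reflecting the total ignorance about blonds among the not-tall Swedes.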

We also construct models for Most and Few, and a vocabulary of linguistic probabilities. All and None are modeled as singletons:

$$\mu_{\text{None}}(u) = \begin{cases} 1 & u = 0 \\ 0 & \text{otherwise} \end{cases} \qquad \mu_{\text{All}}(u) = \begin{cases} 1 & u = 1 \\ 0 & \text{otherwise} \end{cases}$$

MFs of the T2FS models of Most and Few:

We construct a vocabulary of linguistic probabilities to decode the solution to a word.

The pessimistic and optimistic linguistic probabilities are depicted here.

The Jaccard similarities between the solutions and the members of the vocabulary are shown in the following table.

"The probability that Magnus is blond is between Likely and Very Likely." Using the average centroids of the solutions, we can also say: "The probability that Magnus is blond is between around 80% and around 89%."

Linguistic approximation is similar to rounding numeric values, so the resolution of the vocabulary is important: when vocabularies are small, the pessimistic and optimistic probabilities may map to the same word. We studied the effect of the size of the vocabulary on the decoded solution.

Vocabularies with different sizes:

Tables show the similarities of the solutions with the members of each of the vocabularies.

Using all of these vocabularies, both the pessimistic and the optimistic solutions map to the same word: Likely for the first vocabulary, and Very Likely for the others. For small vocabularies, the total ignorance present in the problem does not affect the outcome.

Novel Weighted Averages are promising when dealing with linguistic probabilities. Our solution builds a probability model for the problem which obeys a set of axioms. Is the problem really reduced to calculating the belief and plausibility of a Dempster-Shafer model?
