
7.1 The Rules of Probability

Example 7.1 Section

Astragalus bone

The astragalus (ankle or heel bone) of animals was used in ancient times as a forerunner of modern dice. In fact, Egyptian tomb paintings show that sheep astragali were used in board games as early as 3500 B.C. (see Figure 7.2). When a sheep astragalus is thrown into the air, it can land on one of four sides, which were associated with the numbers 1, 3, 4, and 6 (see Table 7.1). Two sides (the 3 and the 4) are wider and each comes up about 40% of the time, while the narrower sides (the 1 and the 6) each come up about 10% of the time. Astragali were used for gambling, games, and divination purposes by the ancients.

  Toss an astragalus once. What's the chance you get a "1"?

Since probabilities can often be viewed as the proportion of times something happens, this gives our first rule of probability: the probability of an event is the long-run proportion of times it is expected to occur when the basic process is repeated over and over, independently and under the same conditions.

  Toss an astragalus once. What's the chance you get at least a 3?

Notice that there is another way to solve the previous problem. The opposite of "at least 3" is "getting a 1" (i.e., the only other possibility), so you can also figure the answer as 100% - 10% = 90%, or 0.90. This rule of the opposites is our third rule of probability: the chance of something is 1 minus the chance of the opposite thing.

  Suppose you toss an astragalus twice. What's the chance that you get "4s" on both tosses?

  In ancient Rome, the lowest score in tossing four astragali (getting all four 1s) was called the dog throw. What is the probability of getting a dog throw?
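
The three questions above can be checked with a quick simulation. The Python sketch below is not part of the original lesson; it assumes the long-run side frequencies quoted above (10% for the narrow sides 1 and 6, 40% for the wide sides 3 and 4) and that successive tosses are independent, and the names are purely illustrative.

    import random

    # Long-run side frequencies quoted in the text: the wide sides (3 and 4)
    # each come up about 40% of the time, the narrow sides (1 and 6) about 10%.
    SIDES = [1, 3, 4, 6]
    WEIGHTS = [0.10, 0.40, 0.40, 0.10]

    def toss(k):
        """Simulate k astragalus tosses (independence of tosses is assumed)."""
        return random.choices(SIDES, weights=WEIGHTS, k=k)

    trials = 200_000
    at_least_3 = sum(toss(1)[0] >= 3 for _ in range(trials)) / trials
    two_fours  = sum(toss(2) == [4, 4] for _ in range(trials)) / trials
    dog_throw  = sum(toss(4) == [1, 1, 1, 1] for _ in range(trials)) / trials

    print(f"P(at least 3) ~ {at_least_3:.3f}   (rule of opposites: 1 - 0.10 = 0.90)")
    print(f"P(two 4s)     ~ {two_fours:.3f}   (0.40 * 0.40 = 0.16)")
    print(f"P(dog throw)  ~ {dog_throw:.4f}  (0.10**4 = 0.0001)")

With enough trials the estimates settle near 0.90, 0.16, and 0.0001, the values suggested by the proportion rule, the rule of opposites, and multiplication for independent tosses.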

Example 7.2 Independent or not? Section

For each of the following pairs of events, decide whether the two events are independent.

For the next two single births at Hershey Medical Center: whether the first baby is a boy and whether the second baby is a boy.

For the next two single births at Hershey Medical Center: whether the first baby is a girl and whether both babies are girls.

For the end of the month in February next year: whether there will be snow on the ground at the State College airport on February 27th and whether there will be snow on the ground at the airport on February 28th.

Example 7.3 Section  

The highest-paid employee is randomly selected from the list of Fortune 500 companies. Which of these probabilities is the largest?

  • The chance this person is a college graduate.
  • The chance this person is a college graduate with a Business degree.
  • The chance this person is a college graduate with an Engineering degree.

Other Interpretations Section  

While we can often think of how the process leading to data might be repeated, some events arise in situations that are not easily seen as repeatable. In such situations, the relative frequency interpretation of probability may seem inappropriate. For example, answering a question like "What is the chance that our next President will be a woman?" would seem to require a different interpretation of the meaning of probability. Luckily, it is perfectly reasonable to assign probabilities to events outside of the relative frequency interpretation, as long as they satisfy the rules of probability above. Personal probabilities that satisfy these rules give a coherent interpretation, even if one person's assignment differs from another's.

3.3 Two Basic Rules of Probability

In calculating probability, there are two rules to consider when you are determining if two events are independent or dependent and if they are mutually exclusive or not.

The Multiplication Rule

If A and B are two events defined on a sample space, then P ( A AND B ) = P ( B ) P ( A | B ). This is the multiplication rule.

This equation can also be rearranged as P ( A | B ) = P ( A AND B ) / P ( B ), which is the definition of the conditional probability of A given B.

If A and B are independent , then P ( A | B ) = P ( A ). In this special case, P ( A AND B ) = P ( A | B ) P ( B ) becomes P ( A AND B ) = P ( A ) P ( B ).

A bag contains four green marbles, three red marbles, and two yellow marbles. Mark draws two marbles from the bag without replacement. The probability that he draws a yellow marble and then a green marble is P ( yellow ) P ( green | yellow ) = (2/9)(4/8) = 1/9.

Notice that P ( green | yellow ) = 4/8. After the yellow marble is drawn, there are four green marbles in the bag and eight marbles in all.
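
As a cross-check, here is a small Python sketch (an illustration, not part of the original example) that enumerates every ordered pair of draws from the bag described above and compares the count-based answer with the multiplication rule.

    from itertools import permutations
    from fractions import Fraction

    bag = ["G"] * 4 + ["R"] * 3 + ["Y"] * 2        # 4 green, 3 red, 2 yellow marbles

    # Every ordered pair of distinct marbles is one way to draw two without replacement.
    pairs = list(permutations(range(len(bag)), 2))
    favorable = sum(1 for i, j in pairs if bag[i] == "Y" and bag[j] == "G")

    print(Fraction(favorable, len(pairs)))          # 1/9 by direct counting
    print(Fraction(2, 9) * Fraction(4, 8))          # 1/9 by the multiplication rule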

The Addition Rule

If A and B are defined on a sample space, then P ( A OR B ) = P ( A ) + P ( B ) − P ( A AND B ).

Draw one card from a standard deck of playing cards. Let H = the card is a heart, and let J = the card is a jack. These events are not mutually exclusive because a card can be both a heart and a jack.

If A and B are mutually exclusive , then P ( A AND B ) = 0. Then P ( A OR B ) = P ( A ) + P ( B ) − P ( A AND B ) becomes P ( A OR B ) = P ( A ) + P ( B ).

Draw one card from a standard deck of playing cards. Let H = the card is a heart and S = the card is a spade. These events are mutually exclusive because a card cannot be a heart and a spade at the same time. The probability that the card is a heart or a spade is P ( H ) + P ( S ) = 13/52 + 13/52 = 26/52 = 1/2.
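
A brief Python sketch (illustrative only, not part of the original text) that builds a 52-card deck and verifies both forms of the addition rule: with the overlap subtracted for heart-or-jack, and with no overlap for heart-or-spade.

    from fractions import Fraction

    ranks = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
    suits = ["hearts", "diamonds", "clubs", "spades"]
    deck = [(rank, suit) for rank in ranks for suit in suits]      # 52 cards

    def prob(event):
        """P(event) for a single card drawn at random from the deck."""
        return Fraction(sum(1 for card in deck if event(card)), len(deck))

    heart = lambda c: c[1] == "hearts"
    spade = lambda c: c[1] == "spades"
    jack = lambda c: c[0] == "J"

    # Heart or jack: not mutually exclusive, so subtract the overlap (the jack of hearts).
    print(prob(lambda c: heart(c) or jack(c)))                                   # 4/13
    print(prob(heart) + prob(jack) - prob(lambda c: heart(c) and jack(c)))       # 4/13

    # Heart or spade: mutually exclusive, so the probabilities simply add.
    print(prob(lambda c: heart(c) or spade(c)))                                  # 1/2
    print(prob(heart) + prob(spade))                                             # 1/2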

Example 3.14

Klaus is trying to choose where to go on vacation. His two choices are: A = New Zealand and B = Alaska.

  • Klaus can only afford one vacation. The probability that he chooses A is P ( A ) = .6 and the probability that he chooses B is P ( B ) = .35.
  • P ( A AND B ) = 0 because Klaus can only afford to take one vacation.
  • Therefore, the probability that he chooses either New Zealand or Alaska is P ( A OR B ) = P ( A ) + P ( B ) = .6 + .35 = .95. Note that the probability that he does not choose to go anywhere on vacation must be .05.

Example 3.15

Carlos plays college soccer. He makes a goal 65 percent of the time he shoots. Carlos is going to attempt two goals in a row in the next game. A = the event Carlos is successful on his first attempt. P ( A ) = .65. B = the event Carlos is successful on his second attempt. P ( B ) = .65. Carlos tends to shoot in streaks. The probability that he makes the second goal given that he made the first goal is .90.

a. What is the probability that he makes both goals?

a. The problem is asking you to find P ( A AND B ) = P ( B AND A ). Since P ( B | A ) = .90: P ( B AND A ) = P ( B | A ) P ( A ) = (.90)(.65) = .585.

Carlos makes the first and second goals with probability .585.

b. What is the probability that Carlos makes either the first goal or the second goal?

b. The problem is asking you to find P ( A OR B ).

P ( A OR B ) = P ( A ) + P ( B ) − P ( A AND B ) = .65 + .65 − .585 = .715

Carlos makes either the first goal or the second goal with probability .715.

c. Are A and B independent?

c. No, they are not, because P ( B AND A ) = .585.

P ( B ) P ( A ) = (.65)(.65) = .423

.423 ≠ .585 = P ( B AND A )

So, P ( B AND A ) is not equal to P ( B ) P ( A ).

d. Are A and B mutually exclusive?

d. No, they are not because P ( A and B ) = .585.

To be mutually exclusive, P ( A AND B ) must equal zero.
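
The arithmetic in Example 3.15 can be reproduced in a few lines. The sketch below is only an illustration of the rules applied to the numbers given above; the variable names are ours.

    p_a = 0.65           # P(A): Carlos scores on the first attempt
    p_b = 0.65           # P(B): Carlos scores on the second attempt
    p_b_given_a = 0.90   # P(B | A): streak effect

    p_a_and_b = p_b_given_a * p_a        # multiplication rule
    p_a_or_b = p_a + p_b - p_a_and_b     # addition rule

    print(round(p_a_and_b, 3))           # 0.585
    print(round(p_a_or_b, 3))            # 0.715
    print(round(p_a * p_b, 4))           # 0.4225 (about .423), not 0.585,
                                         # so A and B are not independent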

Try It 3.15

Helen plays basketball. For free throws, she makes the shot 75 percent of the time. Helen must now attempt two free throws. C = the event that Helen makes the first shot. P ( C ) = .75. D = the event Helen makes the second shot. P ( D ) = .75. The probability that Helen makes the second free throw given that she made the first is .85. What is the probability that Helen makes both free throws?

Example 3.16

A community swim team has 150 members. Seventy-five of the members are advanced swimmers. Forty-seven of the members are intermediate swimmers. The remainder are novice swimmers. Forty of the advanced swimmers practice four times a week. Thirty of the intermediate swimmers practice four times a week. Ten of the novice swimmers practice four times a week. Suppose one member of the swim team is chosen randomly.

a. What is the probability that the member is a novice swimmer?

a. There are 150 members; 75 of these are advanced, and 47 of these are intermediate swimmers. So there are 150 − 75 − 47 = 28 novice swimmers. The probability that a randomly selected swimmer is a novice is 28/150.

b. What is the probability that the member practices four times a week?

b. (40 + 30 + 10)/150 = 80/150

c. What is the probability that the member is an advanced swimmer and practices four times a week?

c. There are 40 advanced swimmers who practice four times per week, so the probability is 40/150.

d. What is the probability that a member is an advanced swimmer and an intermediate swimmer? Are being an advanced swimmer and being an intermediate swimmer mutually exclusive? Why or why not?

d. P (advanced AND intermediate) = 0, so these are mutually exclusive events. A swimmer cannot be an advanced swimmer and an intermediate swimmer at the same time.

e. Are being a novice swimmer and practicing four times a week independent events? Why or why not?

e. No, these are not independent events. P (novice AND practices four times per week) = 10/150 ≈ .0667, while P (novice) P (practices four times per week) = (28/150)(80/150) ≈ .0996. Since .0667 ≠ .0996, the events are not independent.
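
A short sketch (illustrative only, with made-up variable names) that tabulates the swim-team counts from Example 3.16 and recomputes parts (a) through (e), including the independence check in part (e).

    from fractions import Fraction

    total = 150
    members = {"advanced": 75, "intermediate": 47}
    members["novice"] = total - members["advanced"] - members["intermediate"]   # 28
    practice4 = {"advanced": 40, "intermediate": 30, "novice": 10}

    p_novice = Fraction(members["novice"], total)                     # (a) 28/150
    p_practice4 = Fraction(sum(practice4.values()), total)            # (b) 80/150
    p_adv_and_practice4 = Fraction(practice4["advanced"], total)      # (c) 40/150

    # (e) Independence check: compare P(novice AND practices 4x) with P(novice)P(practices 4x).
    p_novice_and_practice4 = Fraction(practice4["novice"], total)     # 10/150
    print(p_novice, p_practice4, p_adv_and_practice4)
    print(round(float(p_novice_and_practice4), 4),
          round(float(p_novice * p_practice4), 4))                    # 0.0667 vs 0.0996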

Try It 3.16

A school has 200 seniors of whom 140 will be going to college next year. Forty will be going directly to work. The remainder are taking a gap year. Fifty of the seniors going to college are on their school's sports teams. Thirty of the seniors going directly to work are on their school's sports teams. Five of the seniors taking a gap year are on their school's sports teams. What is the probability that a senior is taking a gap year?

Example 3.17

Felicity attends a school in Modesto, CA. The probability that Felicity enrolls in a math class is .2 and the probability that she enrolls in a speech class is .65. The probability that she enrolls in a math class GIVEN that she enrolls in speech class is .25.

Let M = math class, S = speech class, and M | S = math given speech.

  • What is the probability that Felicity enrolls in math and speech? Find P ( M AND S ) = P ( M | S ) P ( S ).
  • What is the probability that Felicity enrolls in math or speech classes? Find P ( M OR S ) = P ( M ) + P ( S ) − P ( M AND S ).
  • Are M and S independent? Is P ( M | S ) = P ( M )?
  • Are M and S mutually exclusive? Is P ( M AND S ) = 0?

a. P ( M AND S ) = P ( M | S ) P ( S ) = .25(.65) = .1625

b. P ( M OR S ) = P ( M ) + P ( S ) − P ( M AND S ) = .2 + .65 − .1625 = .6875

c. No, P ( M | S ) = .25 and P ( M ) = .2.

d. No, P ( M AND S ) = .1625.

Try It 3.17

A student goes to the library. Let events B = the student checks out a book and D = the student checks out a DVD. Suppose that P ( B ) = .40, P ( D ) = .30, and P ( D | B ) = .5.

  • Find P ( B AND D ).
  • Find P ( B OR D ).

Example 3.18

Researchers are studying one particular type of disease that affects women more often than men. Studies show that about one woman in seven (approximately 14.3 percent) who live to be 90 will develop the disease. Suppose that of those women who develop this disease, a test is negative 2 percent of the time. Also suppose that in the general population of women, the test for the disease is negative about 85 percent of the time. Let B = woman develops the disease and let N = tests negative. Suppose one woman is selected at random.

a. What is the probability that the woman develops the disease? What is the probability that the woman tests negative?

a. P ( B ) = .143; P ( N ) = .85

b. Given that the woman develops the disease, what is the probability that she tests negative?

b. Among women who develop the disease, the test is negative 2 percent of the time, so P ( N | B ) = .02

c. What is the probability that the woman has the disease AND tests negative?

c. P ( B AND N ) = P ( B ) P ( N | B ) = (.143)(.02) = .0029

d. What is the probability that the woman has the disease OR tests negative?

d. P ( B OR N ) = P ( B ) + P ( N ) − P ( B AND N ) = .143 + .85 − .0029 = .9901

e. Are having the disease and testing negative independent events?

e. No. P ( N ) = .85; P ( N | B ) = .02. So, P ( N | B ) does not equal P ( N ).

f. Are having the disease and testing negative mutually exclusive?

f. No. P ( B AND N ) = .0029. For B and N to be mutually exclusive, P ( B AND N ) must be zero.

Try It 3.18

A school has 200 seniors of whom 140 will be going to college next year. Forty will be going directly to work. The remainder are taking a gap year. Fifty of the seniors going to college are on their school's sports teams. Thirty of the seniors going directly to work are on their school's sports teams. Five of the seniors taking a gap year are on their school's sports teams. What is the probability that a senior is going to college and plays sports?

Example 3.19

Refer to the information in Example 3.18 . P = tests positive.

  • Given that a woman develops the disease, what is the probability that she tests positive? Find P ( P | B ) = 1 − P ( N | B ).
  • What is the probability that a woman develops the disease and tests positive? Find P ( B AND P ) = P ( P | B ) P ( B ).
  • What is the probability that a woman does not develop the disease? Find P ( B′ ) = 1 − P ( B ).
  • What is the probability that a woman tests positive for the disease? Find P ( P ) = 1 − P ( N ).

a. P ( P | B ) = 1 − P ( N | B ) = 1 − .02 = .98

b. P ( B AND P ) = P ( P | B ) P ( B ) = .98(.143) = .1401

c. P ( B' ) = 1 − P ( B ) = 1 − .143 = .857

d. P ( P ) = 1 − P ( N ) = 1 − .85 = .15
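
The quantities in Examples 3.18 and 3.19 follow mechanically from the three given numbers. The sketch below (an illustration, not part of the original text) recomputes them.

    p_b = 0.143          # P(B): the woman develops the disease
    p_n = 0.85           # P(N): a test is negative in the general population of women
    p_n_given_b = 0.02   # P(N | B): negative test among women who have the disease

    p_b_and_n = p_b * p_n_given_b             # (3.18c)  .0029
    p_b_or_n = p_b + p_n - p_b_and_n          # (3.18d)  .9901

    p_p_given_b = 1 - p_n_given_b             # (3.19a)  .98
    p_b_and_p = p_p_given_b * p_b             # (3.19b)  .1401
    p_not_b = 1 - p_b                         # (3.19c)  .857
    p_p = 1 - p_n                             # (3.19d)  .15

    print(round(p_b_and_n, 4), round(p_b_or_n, 4))
    print(p_p_given_b, round(p_b_and_p, 4), round(p_not_b, 3), round(p_p, 2))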

Try It 3.19

Refer to the information in Try It 3.17.

  • Find P ( B′ ).
  • Find P ( D AND B ).
  • Find P ( B | D ).
  • Find P ( D AND B′ ).
  • Find P ( D | B′ ).

Want to cite, share, or modify this book? This book uses the Creative Commons Attribution License and you must attribute Texas Education Agency (TEA). The original material is available at: https://www.texasgateway.org/book/tea-statistics . Changes were made to the original material, including updates to art, structure, and other content updates.

Access for free at https://openstax.org/books/statistics/pages/1-introduction
  • Authors: Barbara Illowsky, Susan Dean
  • Publisher/website: OpenStax
  • Book title: Statistics
  • Publication date: Mar 27, 2020
  • Location: Houston, Texas
  • Book URL: https://openstax.org/books/statistics/pages/1-introduction
  • Section URL: https://openstax.org/books/statistics/pages/3-3-two-basic-rules-of-probability

© Jan 23, 2024 Texas Education Agency (TEA). The OpenStax name, OpenStax logo, OpenStax book covers, OpenStax CNX name, and OpenStax CNX logo are not subject to the Creative Commons license and may not be reproduced without the prior and express written consent of Rice University.

Essential Probability

Part I - Basics

Random Experiments

Sample spaces, populations, rules for combining events, probabilities of events, rules of probability, equally likely outcomes, sampling without replacement, sampling with replacement.

Conditional Probability

Bayes' Rule and Total Probability

A random experiment is one whose outcome is uncertain and which can be repeated indefinitely under essentially identical conditions.  Examples are rolling a die, tossing a coin, and making a laboratory measurement with a degree of uncontrollable error.

The sample space of a random experiment is the set of all its possible outcomes.  We will denote the sample space by the Greek letter Ω (upper-case omega).  Individual outcomes of an experiment will be denoted generically by ω (lower-case omega).  For example, in the experiment of tossing a six-sided die and observing the number of spots, the sample space might be described as Ω = {1,2,3,4,5,6}.

A population is a set of individuals or objects that forms the subject of a statistical investigation.  Usually, the population is so large that it is not feasible to examine every member of the population and a smaller subset (called a sample ) is chosen to represent the population.  In such cases, the sample space is not the population.  Rather, it is the collection of all samples (subsets of the population) of a given size.

An event is a subset of the sample space.  Events may be described in mathematical notation or in informal ordinary language.  For example, the event E = "The number of spots is even" can also be written E = {2,4,6}.  Events are denoted generically by upper-case Latin letters, such as "E" above.

If ω is the outcome of a random experiment, the event E occurs if ω ∈ E.

1.  The complement of E, denoted E c , occurs if and only if E does not occur.  In other words, E c occurs if and only if ω ∉ E; as a set, E c = {ω ∈ Ω : ω ∉ E}.

2.  The union of a finite or infinite sequence of events is an event.  The union occurs if and only if at least one of the individual events in the sequence occurs.

3.  The entire sample space Ω is an event (the certain event).  The certain event arises naturally when an ordinary language description of an event is satisfied by every outcome of the random experiment.

4.  The empty set φ is an event.  The empty event arises naturally when an ordinary language description cannot be satisfied by any outcome of the random experiment.  Note that φ = Ω c , so the empty event never occurs, just as the certain event always occurs.

5.  The intersection of a finite or infinite sequence of events is an event.  The intersection occurs if and only if each of the individual events in the sequence occurs.

If E 1 ∩ E 2 = φ, the events E 1 and E 2 are said to be disjoint .  This means that they cannot both occur.

A probability assignment or probability measure is a way of assigning probabilities between 0 and 1 to events.  In other words, it is a function P whose domain is the set of all events associated with a random experiment and whose codomain is the unit interval [0, 1].  A probability assignment is an essential part of modeling a random experiment.  Ideally, it comes from a detailed background knowledge of the phenomena of the experiment.  Probability theory is the study of the mathematical consequences of the basic properties (axioms) of probability assignments given in the next paragraph.  Statistics is the study of methods of using data to assess the correctness of probability models of real life experiments.

Many random experiments have only a finite number of possible outcomes.  For such an experiment, let n denote the number of possible outcomes.  A singleton event is an event that consists of only one outcome ω.  Denote this event by {ω}.  The experiment is said to have equally likely outcomes if P ({ω}) = 1/ n for all ω.  It then follows from the basic rules of probability that for any event E,

P ( E ) = #( E ) / n,

where #( E ) is the number of outcomes in the event E.  The assumption of equally likely outcomes may or may not be appropriate for a given experiment.  We almost always assume equally likely outcomes for certain simple experiments involving tossing dice, drawing cards, etc.

If a finite population has M members, a sample without replacement of size m is simply a subset of size m of the population.  The number of such subsets is

C( M, m ) = M ! / ( m ! ( M − m )! ),

where, e.g., M ! ( M factorial) is the product of the positive integers from 1 to M, inclusive.  Consider a random experiment whose outcome is a subset of size m from the population.  If all outcomes are equally likely, the probability of each singleton event {ω} is given by

P ({ω}) = 1 / C( M, m ) = m ! ( M − m )! / M !.

This is what is usually meant by "choosing a random sample of size m ".  Sometimes the order of presentation of the population members selected is important.  In these cases, the outcome of the experiment is a non-repeating sequence (not just a subset) of length m from the population.  These are also called permutations of length m from the population.  The number of permutations of length m is

M ! / ( M − m )! = M ( M − 1 ) ⋯ ( M − m + 1 ).

If all permutations of length m are equally likely, the result of the experiment is called an ordered sample without replacement from the population.

An ordered sample of size m , with replacement , from a population is a possibly repeating sequence of length m from the population.  The number of such sequences is M m .  It is not hard to see that if the sample size m is much less than the population size M , a very large fraction of the ordered samples with replacement will not repeat themselves anyway, so there is little practical difference between ordered sampling with and without replacement.  In such circumstances, samples without replacement are often treated as samples with replacement for mathematical convenience.
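
The counts described above can be computed directly. The sketch below (a Python 3.8+ illustration with made-up values M = 10 and m = 3) uses math.comb and math.perm for the subset and permutation counts.

    from math import comb, perm

    M, m = 10, 3    # illustrative population and sample sizes

    subsets = comb(M, m)        # M! / (m! (M - m)!) : unordered samples without replacement
    ordered = perm(M, m)        # M! / (M - m)!      : ordered samples without replacement
    with_replacement = M ** m   # ordered samples with replacement

    print(subsets, ordered, with_replacement)   # 120 720 1000
    print(1 / subsets)                          # chance of any one particular subset,
                                                # if all subsets are equally likely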

Conditional Probability and Independence

Probabilities discussed up to this point have been unconditional probabilities.  If D and E are events and P ( E ) > 0, the conditional probability of D, given that E occurs, is

P ( D | E ) = P ( D ∩ E ) / P ( E ).

If the experiment has equally likely outcomes, P ( D | E ) is just the fraction of all the outcomes in E where D also occurs.  Two events D and E are independent if P ( D ∩ E ) = P ( D ) P ( E ).  If P(E) > 0, this is equivalent to P ( D | E ) = P ( D ), i.e., the conditional probability of D given E is the same as the unconditional probability of D .  For example, in the experiment of drawing a single card from a standard deck with equally likely outcomes, the events "Draw a Heart" and "Draw a Queen" are independent.  The events "Draw a Heart" and "Draw a red card" are not independent.  They are dependent.
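
A small sketch (illustrative only) that checks these statements by enumeration over an equally likely 52-card deck, using the definition P ( D | E ) = P ( D ∩ E ) / P ( E ).

    from fractions import Fraction

    ranks = [str(n) for n in range(2, 11)] + ["J", "Q", "K", "A"]
    suits = ["hearts", "diamonds", "clubs", "spades"]
    deck = [(rank, suit) for rank in ranks for suit in suits]    # 52 equally likely outcomes

    def P(event):
        return Fraction(sum(1 for card in deck if event(card)), len(deck))

    heart = lambda c: c[1] == "hearts"
    queen = lambda c: c[0] == "Q"
    red = lambda c: c[1] in ("hearts", "diamonds")

    # Conditional probability from the definition.
    print(P(lambda c: queen(c) and heart(c)) / P(heart))    # 1/13, the same as P(queen)

    # Independence checks.
    print(P(lambda c: heart(c) and queen(c)) == P(heart) * P(queen))   # True: independent
    print(P(lambda c: heart(c) and red(c)) == P(heart) * P(red))       # False: dependent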

Bayes' Rule and Total Probability

Let E 1 , E 2 , ..., E k be pairwise disjoint events such that P ( E i ) > 0 for each i and

E 1 ∪ E 2 ∪ ... ∪ E k = Ω.

This means that one of the events E i must occur and that only one can occur.  Let D be another event.  The law of total probability says that

P ( D ) = P ( D | E 1 ) P ( E 1 ) + P ( D | E 2 ) P ( E 2 ) + ... + P ( D | E k ) P ( E k ),

and Bayes' rule says that

P ( E i | D ) = P ( D | E i ) P ( E i ) / [ P ( D | E 1 ) P ( E 1 ) + ... + P ( D | E k ) P ( E k ) ].

For example, let D denote a set of symptoms exhibited by a patient and let E 1 , E 2 , ..., E k be a collection of mutually exclusive disease conditions that might account for the symptoms.   For each disease E i , there is a certain probability P ( D | E i ) that a sufferer of that disease will have the symptoms D and there is a certain probability P ( E i ) that a patient will have that specific disease.  Then Bayes' rule gives the probability that the patient has the specific disease, given that he or she has the symptoms D .
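
A minimal sketch of the law of total probability and Bayes' rule as stated above. The prevalences and symptom probabilities below are invented purely for illustration.

    def total_probability(priors, likelihoods):
        """Law of total probability: P(D) = sum over i of P(D | E_i) P(E_i)."""
        return sum(p_e * p_d_given_e for p_e, p_d_given_e in zip(priors, likelihoods))

    def bayes(priors, likelihoods):
        """Bayes' rule: posterior P(E_i | D) for each i."""
        p_d = total_probability(priors, likelihoods)
        return [p_e * p_d_given_e / p_d for p_e, p_d_given_e in zip(priors, likelihoods)]

    # Invented numbers for illustration: three mutually exclusive disease conditions
    # E_1, E_2, E_3 with prevalences P(E_i) and symptom probabilities P(D | E_i).
    priors = [0.70, 0.20, 0.10]
    likelihoods = [0.10, 0.50, 0.90]

    print(round(total_probability(priors, likelihoods), 2))   # P(D) = 0.26
    print(bayes(priors, likelihoods))                         # posteriors; they sum to 1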

Lesson 6: Assigning Probabilities to Events; Probability Rules

Hi everyone! Read through the material below, watch the videos, work on the Word lecture and follow up with your instructor if you have questions.

You have, more than likely, used probability. In fact, you probably have an intuitive sense of probability. Probability deals with the chance of an event occurring. Whenever you weigh the odds of whether or not to do your homework or to study for an exam, you are using probability. In this chapter, you will learn how to solve probability problems using a systematic approach.

WeBWorK. Set 3.1

Learning Outcomes

  • Understand and use the terminology of probability.
  • Determine whether two events are mutually exclusive and whether two events are independent.
  • Calculate probabilities using the Addition Rules and Multiplication Rules.
  • Construct and interpret Venn Diagrams.
  • Construct and interpret Tree Diagrams.

Common Resources

  • 3.1 Terminology
  • 3.2 Independent and Mutually Exclusive Events
  • 3.3 Two Basic Rules of Probability
  • 3.5 Tree and Venn Diagrams
  • Lesson06-ProbabilityBasics
  • WordLecture-06
  • Open Intro Slides

Basic Probability, Definitions, and Venn Diagrams

Properties of Probability, the Addition Rule

Excel-based Course Resources

  • Introductory Statistics by Sheldon Ross, 3rd edition: Sections 4.1-4.3

R-based Course Resources

  • R-laboratory

Watch the video introduction to probability .

  • What is a random phenomenon?
  • Explain why weather is an example of a random phenomenon.
  • What does it mean when a weather reporter says that there is a 70% chance of rain tomorrow?
  • If we flip a fair coin repeatedly, what can be said about the proportion of heads in the short run? What can be said about the proportion of heads in the long run?
  • What can you say about an event whose probability is close to one compared to an event whose probability is close to zero?

Consider three sets A , B , and C . What does $n(A\cup B \cup C)$ look like?
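
One way to explore this question is to check the inclusion-exclusion pattern numerically. The sketch below (with three small made-up sets) is only an illustration of that counting identity.

    # Three small made-up sets, just to check the counting pattern.
    A, B, C = {1, 2, 3, 4}, {3, 4, 5, 6}, {4, 6, 7}

    lhs = len(A | B | C)
    # Inclusion-exclusion:
    # n(A∪B∪C) = n(A) + n(B) + n(C) − n(A∩B) − n(A∩C) − n(B∩C) + n(A∩B∩C)
    rhs = (len(A) + len(B) + len(C)
           - len(A & B) - len(A & C) - len(B & C)
           + len(A & B & C))

    print(lhs, rhs)    # both are 7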



Unit 7: Probability

About this unit

Probability tells us how often some event will happen after many repeated trials. You've experienced probability when you've flipped a coin, rolled some dice, or looked at a weather forecast. Go deeper with your understanding of probability as you learn about theoretical, experimental, and compound probability, and investigate permutations, combinations, and more!

Basic theoretical probability

  • Intro to theoretical probability
  • Probability: the basics
  • Simple probability: yellow marble
  • Simple probability: non-blue marble
  • Intuitive sense of probabilities
  • The Monty Hall problem
  • Simple probability (practice)
  • Comparing probabilities (practice)

Probability using sample spaces

  • Example: All the ways you can flip a coin
  • Die rolling probability
  • Subsets of sample spaces
  • Subsets of sample spaces (practice)

Basic set operations

  • Intersection and union of sets
  • Relative complement or difference between sets
  • Universal set and absolute complement
  • Subset, strict subset, and superset
  • Bringing the set operations together
  • Basic set notation (practice)

Experimental probability

  • Experimental probability
  • Theoretical and experimental probabilities
  • Making predictions with probability
  • Simulation and randomness: Random digit tables
  • Experimental probability (practice)
  • Making predictions with probability (practice)

Randomness, probability, and simulation

  • Experimental versus theoretical probability simulation
  • Theoretical and experimental probability: Coin flips and die rolls
  • Random number list to run experiment
  • Random numbers for experimental probability
  • Statistical significance of experiment
  • Interpret results of simulations (practice)

Addition rule

  • Probability with Venn diagrams
  • Addition rule for probability
  • Addition rule for probability (basic)
  • Adding probabilities (practice)
  • Two-way tables, Venn diagrams, and probability (practice)

Multiplication rule for independent events

  • Sample spaces for compound events
  • Compound probability of independent events
  • Probability of a compound event
  • "At least one" probability with coin flipping
  • Free-throw probability
  • Three-pointer vs free-throw probability
  • Probability without equally likely events
  • Independent events example: test taking
  • Die rolling probability with independent events
  • Probabilities involving "at least one" success
  • Sample spaces for compound events (practice)
  • Independent probability (practice)
  • Probabilities of compound events (practice)
  • Probability of "at least one" success (practice)

Multiplication rule for dependent events

  • Dependent probability introduction
  • Dependent probability: coins
  • Dependent probability example
  • Independent & dependent probability
  • The general multiplication rule
  • Dependent probability
  • Dependent probability (practice)

Conditional probability and independence

  • Calculating conditional probability
  • Conditional probability explained visually
  • Conditional probability using two-way tables
  • Conditional probability tree diagram example
  • Tree diagrams and conditional probability
  • Conditional probability and independence
  • Analyzing event probability for independence
  • Calculate conditional probability (practice)
  • Dependent and independent events (practice)


Physics > Data Analysis, Statistics and Probability

Title: The Art of Probability Assignment

Abstract: The problem of assigning probabilities when little is known is analyzed in the case where the quantities of interest are physical observables, i.e., can be measured and their values expressed by numbers. It is pointed out that the assignment of probabilities based on observation is a process of inference, involving the use of Bayes' theorem and the choice of a probability prior. When a lot of data is available, the resulting probabilities are remarkably insensitive to the form of the prior. In the opposite case of scarce data, it is suggested that the probabilities be assigned such that they are the least sensitive to specific variations of the probability prior. In the continuous case this results in a probability assignment rule which calls for minimizing the Fisher information subject to constraints reflecting all available information. In the discrete case, the corresponding quantity to be minimized turns out to be a Renyi distance between the original and the shifted distribution.
Comments: Presented at MaxEnt 2012, the 32nd International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, July 15-20, 2012, Garching near Munich, Germany; Reference issue fixed
Subjects: Data Analysis, Statistics and Probability (physics.data-an)
Cite as: arXiv:1208.5276 [physics.data-an]



The probability of an event (for example, the event that a randomly chosen person has blood type O) can be estimated by the relative frequency with which the event occurs in a long series of trials.

Example: Toss a fair coin three times. The sample space is {hhh, hht, hth, htt, thh, tht, tth, ttt}. Now define the event "getting no h"; this event is {ttt}.

#1 Rules of Probability

The probability of an event can range anywhere from 0 (indicating that the event will never occur) to 1 (indicating that the event is certain).

– For any event A, 0 ≤ P(A) ≤ 1.

#2 Rules of Probability

The probabilities of all possible outcomes together must total 1.

– The sum of the probabilities of all possible outcomes is 1.

#3 Rules of Probability

This rule deals with the relationship between the probability of an event and the probability of its complement.

– P(not A) = 1 − P(A).
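
The coin example above makes the three rules concrete. The sketch below (illustrative only) enumerates the eight equally likely outcomes and checks each rule.

    from fractions import Fraction
    from itertools import product

    outcomes = ["".join(t) for t in product("ht", repeat=3)]    # the 8 equally likely outcomes

    def P(event):
        return Fraction(sum(1 for o in outcomes if event(o)), len(outcomes))

    no_h = lambda o: "h" not in o            # the event {ttt}

    print(P(no_h))                                            # 1/8, between 0 and 1 (Rule #1)
    print(sum(P(lambda o, x=x: o == x) for x in outcomes))    # 1 (Rule #2)
    print(P(lambda o: "h" in o), 1 - P(no_h))                 # both 7/8 (Rule #3, complements)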

A new approach for generation of generalized basic probability assignment in the evidence theory

  • Theoretical advances
  • Published: 17 February 2021
  • Volume 24 , pages 1007–1023, ( 2021 )


  • Yongchuan Tang (ORCID: orcid.org/0000-0003-2568-9628)
  • Dongdong Wu
  • Zijing Liu


The process of information fusion needs to deal with a large amount of uncertain information that is multi-source, heterogeneous, inaccurate, unreliable, and incomplete. In practical engineering applications, Dempster–Shafer evidence theory is widely used in multi-source information fusion owing to its effectiveness in data fusion. Information sources have an important impact on multi-source information fusion in environments that are complex, unstable, uncertain, and incomplete. To address the multi-source information fusion problem, this paper considers the situation of uncertain information modeling from the closed-world to the open-world assumption and studies the generation of basic probability assignment with incomplete information. A new method is proposed to generate the generalized basic probability assignment (GBPA) based on the triangular fuzzy number model under the open-world assumption. First, the maximum, minimum, and mean values of the triangular membership function of each attribute in a classification problem are obtained to construct a triangular fuzzy number representation model. Then, by calculating the length of the intersection between the sample and the triangular fuzzy number model, a GBPA set with an assignment for the empty set can be determined. The proposed method can not only be used simply and flexibly in different complex environments, but it also loses less information in information processing. Finally, a series of comprehensive experiments based on the UCI data sets is used to verify the rationality and superiority of the proposed method.
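
The abstract outlines the general recipe: build a triangular fuzzy number from the minimum, mean, and maximum of each attribute per class, measure how a new sample relates to those models, and leave any unassigned mass on the empty set. The Python sketch below is only a rough, simplified illustration of that idea under our own assumptions (a single attribute, plain membership values in place of the paper's intersection lengths, and made-up class models); it is not the authors' exact GBPA construction.

    def triangular_membership(x, lo, mode, hi):
        """Membership of value x in the triangular fuzzy number (lo, mode, hi)."""
        if x <= lo or x >= hi:
            return 0.0
        return (x - lo) / (mode - lo) if x <= mode else (hi - x) / (hi - mode)

    def simple_gbpa(x, class_models):
        """Very simplified GBPA: class memberships become singleton masses and any
        leftover mass goes to the empty set (the open-world assumption)."""
        memberships = {c: triangular_membership(x, *m) for c, m in class_models.items()}
        total = sum(memberships.values())
        if total > 1:                                    # rescale if memberships exceed 1
            memberships = {c: v / total for c, v in memberships.items()}
            total = 1.0
        gbpa = {frozenset([c]): v for c, v in memberships.items()}
        gbpa[frozenset()] = 1.0 - total                  # mass assigned to the empty set
        return gbpa

    # Hypothetical (min, mean, max) models for one attribute of three classes.
    models = {"class A": (4.3, 5.0, 5.8), "class B": (4.9, 5.9, 7.0), "class C": (4.9, 6.6, 7.9)}
    print(simple_gbpa(4.6, models))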




About this article

Tang, Y., Wu, D. & Liu, Z. A new approach for generation of generalized basic probability assignment in the evidence theory. Pattern Anal Applic 24 , 1007–1023 (2021). https://doi.org/10.1007/s10044-021-00966-0

Download citation

Received : 14 May 2020

Accepted : 25 January 2021

Published : 17 February 2021

Issue Date : August 2021

DOI : https://doi.org/10.1007/s10044-021-00966-0


Keywords

  • Generalized evidence theory
  • Modified generalized combination rule
  • Generalized basic probability assignment
  • Multi-source information fusion
  • Dempster–Shafer evidence theory
  • Triangular fuzzy number
