
PROBABILITY AND STOCHASTIC PROCESSES YATES 3RD PDF





This text, Probability and Stochastic Processes: A Friendly Introduction for Electrical and Computer Engineers, 3rd Edition, by Roy D. Yates and David J. Goodman, introduces engineering students to probability theory and stochastic processes.


Chapter 4 covers pairs of random variables including joint probability functions, conditional probability functions, correlation, and covariance.

Chapter 5 extends these concepts to multiple random variables, with an emphasis on vector notation. In studying Chapters 1—5, students encounter many of the same ideas three times in the contexts of abstract events, discrete random variables, and continuous random variables. Armed with the fundamentals, students can move next to any of three subsequent chapters. Chapter 6 teaches students how to work with sums of random variables.

For the most part it deals with independent random variables and derives probability models using convolution integrals and moment generating functions. A presentation of the central limit theorem precedes examples of Gaussian approximations to sums of random variables. Chapter 8 introduces Bayesian hypothesis testing, the foundation of many signal detection techniques.

Chapter 9 presents techniques for using observations of random variables to estimate other random variables. Some of these techniques appear again in Chapter 11 in the context of random signal processing. Many instructors may wish to move from Chapter 5 to Chapter 10, which introduces the basic concepts of stochastic processes with the emphasis on wide sense stationary processes. It provides tools for working on practical applications in the last two chapters.

Chapter 11 introduces several topics related to random signal processing. Chapter 12 introduces Markov chains and their practical applications. The text includes several hundred homework problems, organized to assist both instructors and students. The problem numbers refer to sections within a chapter.

For example, Problem 3. Skiers will recognize the following symbols: every ski area emphasizes that these designations are relative to the trails at that area. Further Reading: Libraries and bookstores contain an endless collection of textbooks at all levels covering the topics presented in this textbook. We know of two in comic book format [GS93, Pos01]. The reference list is a brief sampling of books that can add breadth or depth to the material in this text. Most books on probability, statistics, stochastic processes, and random signal processing contain expositions of the basic principles of probability and random variables, covered in Chapters 1-4.

In advanced texts, these expositions serve mainly to establish notation for more specialized topics. One of them presents probability as a branch of number theory. The summaries at the end of Chapters 5-12 refer to books that supplement the specialized material in those chapters. At Wiley, we are pleased to acknowledge the continuous encouragement and enthusiasm of our executive editor, Bill Zobrist, and the highly skilled support of marketing manager Jennifer Powers, Senior Production Editor Ken Santor, and Cover Designer Dawn Stanley.

Unique among our teaching assistants, Dave Famolari took the course as an undergraduate. Finally, we acknowledge with respect and gratitude the inspiration and guidance of our teachers and mentors who conveyed to us when we were students the importance and elegance of probability theory. Usually these students recognize that learning probability theory is a struggle, and most of them work hard enough to do well. Other people have the opposite problem. The work looks easy to them, and they understand everything they hear in class and read in the book.

There are good reasons for this assumption. There are very few basic concepts to absorb. The terminology, like the word probability itself, in most cases consists of familiar words. With a few exceptions, the mathematical manipulations are not complex. You can go a long way solving problems with a four-function calculator.

However, most people who do well in probability need to practice with a lot of examples to get comfortable with the work and to really understand what the subject is about. Most of the work in this course is that way, and the only way to do well is to practice a lot. Most people can do it in a respectable time, provided they train for it.

So, our advice to students is, if this looks really weird to you, keep working at it. You will probably catch on. It may be harder than you think. The theoretical material covered in this book has helped both of us devise new communication techniques and improve the operation of practical systems.

If you master the basic ideas, you will have many opportunities to apply them in other courses and throughout your career. We have worked hard to produce a text that will be useful to a large population of students and instructors. We welcome comments, criticism, and suggestions.

Feel free to send us e-mail at ryates winlab. In addition, material is available at the book's Website. Roy D. Yates and David J. Goodman. Now you can begin. The title of this book is Probability and Stochastic Processes.

We say and hear and read the word probability and its relatives possible, probable, probably in many contexts. Within the realm of applied mathematics, the meaning of probability is a question that has occupied mathematicians, philosophers, scientists, and social scientists for hundreds of years.

Everyone accepts that the probability of an event is a number between 0 and 1. Some people interpret probability as a physical property like mass or volume or temperature that can be measured. This probability is closely related to the nature of the coin. Fiddling around with the coin can alter the probability of heads. Another interpretation of probability relates to the knowledge that we have about something. We might assign a low probability to the truth of the statement, It is raining now in Phoenix, Arizona, because we know that Phoenix is in the desert.

However, our knowledge changes if we learn that it was raining an hour ago in Phoenix. This knowledge would cause us to assign a higher probability to the truth of the statement, It is raining now in Phoenix. Both views are useful when we apply probability theory to practical problems. While the structure of the subject conforms to principles of pure logic, the terminology is not entirely abstract.

The point of view is different from the one we took when we started studying physics. There we said that if we do the same thing in the same way over and over again — send a space shuttle into orbit, for example — the result will always be the same.

To predict the result, we have to take account of all relevant facts. In this case, repetitions of the same procedure yield different results. While each outcome may be unpredictable, there are consistent patterns to be observed when we repeat the procedure a large number of times. Understanding these patterns helps engineers establish test procedures to ensure that a factory meets quality objectives.

In this repeatable procedure (making and testing a chip) with unpredictable outcomes (the quality of individual chips), the probability is a number between 0 and 1 that states the proportion of times we expect a certain thing to happen, such as the proportion of chips that pass a test. As an introduction to probability and stochastic processes, this book serves three purposes. To exhibit the logic of the subject, we show clearly in the text three categories of theoretical material. Three axioms are the foundation on which the entire subject rests.

Each theorem would be accompanied by a complete proof. While rigorous, this approach would completely fail to meet our second aim of conveying the intuition necessary to work on practical problems.

To address this goal, we augment the purely mathematical material with a large number of examples of practical phenomena that can be analyzed by means of probability theory. We also include brief quizzes that you should try to solve as you read the book. Each one will help you decide whether you have grasped the material presented just before the quiz.

The problems at the end of each chapter give you more practice applying the material introduced in the chapter.


Some of them take you more deeply into the subject than the examples and quizzes do. Most people who study probability have already encountered set theory and are familiar with such terms as set, element, union, intersection, and complement. For them, the following paragraphs will review material already learned and introduce the notation and terminology we use here. A set is a collection of things. We use capital letters to denote sets.

The things that together make up the set are elements. When we use mathematical notation to refer to set elements, we usually use small letters. Thus we can have a set A with elements x, y, and z. One way to define a set is simply to name the elements: A = {x, y, z}. In addition to set inclusion, we also have the notion of a subset, which describes a relationship between two sets.

This is the mathematical way of stating that A and B are identical if and only if every element of A is an element of B and every element of B is an element of A. This is the set of all things that we could possibly consider in a given context. In any study, all set operations relate to the universal set for that study. The members of the universal set include all of the elements of all of the sets in the study. We will use the letter S to denote the universal set.

The null set, which is also important, may seem like it is not a set at all. It is customary to refer to Venn diagrams to display relationships among sets. By convention, the region enclosed by the large rectangle is the universal set S. Closed surfaces within this rectangle denote sets.

There are three operations for doing this: union, intersection, and complement. Union and intersection combine two existing sets to produce a third set. The complement operation forms a new set from one existing set. Another notation for intersection is AB. The complement of a set A, denoted by Ac, is the set of all elements in S that are not in A.

It is a combination of intersection and complement. In working with probability we will frequently refer to two important properties of collections of sets.

A collection of sets A1, ..., An is mutually exclusive if and only if Ai ∩ Aj = ∅ for every i ≠ j. A collection of sets A1, ..., An is collectively exhaustive if and only if A1 ∪ A2 ∪ ... ∪ An = S. As we see in the following theorem, showing that a collection has these properties can be complicated. Theorem 1.
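The two collection properties just defined lend themselves to a mechanical check. A minimal Python sketch (the book's own code is MATLAB; the universal set and collection below are arbitrary illustrations, not from the text):

```python
# Check the two collection properties from the text for a candidate
# collection of sets. The sets here are illustrative choices.
from itertools import combinations

S = {1, 2, 3, 4, 5, 6}                 # universal set
parts = [{1, 2}, {3, 4}, {5, 6}]       # candidate collection A1, A2, A3

# Mutually exclusive: every pair has an empty intersection.
mutually_exclusive = all(a & b == set() for a, b in combinations(parts, 2))

# Collectively exhaustive: the union of all the sets is S.
collectively_exhaustive = set().union(*parts) == S

print(mutually_exclusive, collectively_exhaustive)  # True True
```

A collection with both properties is exactly an event space in the sense used later in this chapter.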

Proof: There are two parts to the proof. Quiz 1. In addition, each slice may have mushrooms (M) or onions (O) as described by the Venn diagram at right. Probability is a number that describes a set. The higher the number, the more probability there is. In this sense probability is like a quantity that measures a physical phenomenon, for example a weight. However, it is not necessary to think about probability in physical terms.

Fortunately for engineers, the language of probability including the word probability itself makes us think of things that we experience. The basic model is a repeatable experiment. An experiment consists of a procedure and observations.

There is uncertainty in what will be observed; otherwise, performing the experiment would be unnecessary. Some examples of experiments include:

1. Flip a coin. Did it land with heads or tails facing up?
2. Walk to a bus stop. How long do you wait for the arrival of a bus?
3. Give a lecture. How many students are seated in the fourth row?
4. Transmit one of a collection of waveforms over a channel.

What waveform arrives at the receiver? Which waveform does the receiver identify as the transmitted waveform? For the most part, we will analyze models of actual physical experiments. We create models because real experiments generally are too complicated to analyze.

Is it rush hour? Some drivers drive faster than others. Consequently, it is necessary to study a model of the experiment that captures the important part of the actual physical experiment. Since we will focus on the model of the experiment almost exclusively, we often will use the word experiment to refer to the model of an experiment.


Example 1. Flip a coin and let it land on a table. Observe which side head or tail faces you after the coin lands. Heads and tails are equally likely.

As we have said, an experiment consists of both a procedure and observations. It is important to understand that two experiments with the same procedure but with different observations are different experiments. For example, consider these two experiments: Observe the sequence of heads and tails.

Observe the number of heads. These two experiments have the same procedure: They are different experiments because they require different observations. We will describe models of experiments in terms of a set of possible experimental outcomes.

In the context of probability, we give precise meaning to the word outcome. In probability terms, we call this universal set the sample space. The requirement that outcomes be mutually exclusive says that if one outcome occurs, then no other outcome also occurs.

For the set of outcomes to be collectively exhaustive, every outcome of the experiment must be in the sample space. In common speech, an event is just something that occurs. In an experiment, we may say that an event occurs when a certain phenomenon is observed. That is, for each outcome, either the particular event occurs or it does not. Table 1. All of this may seem so simple that it is boring. A probability problem arises from some practical situation that can be modeled as an experiment.

Getting this right is a big step toward solving the problem. Each subset of S is an event. An outcome x is a nonnegative real number. A short-circuit tester has a red light to indicate that there is a short circuit and a green light to indicate that there is no short circuit. Consider an experiment consisting of a sequence of three tests. In each test the observation is the color of the light that is on at the end of a test.

An outcome of the experiment is a sequence of red r and green g lights. We denote the event that light n was red or green by R n or G n. We can also denote an outcome as an intersection of events R i and G j.

In Example 1. An event space and a sample space have a lot in common. The members of both are mutually exclusive and collectively exhaustive. The members of a sample space are outcomes. By contrast, the members of an event space are events. The event space is a set of events sets , while the sample space is a set of outcomes elements. Usually, a member of an event space contains many outcomes. Consider a simple example: Examine the coins in order penny, then nickel, then dime, then quarter and observe whether each coin shows a head h or a tail t.

What is the sample space? How many elements are in the sample space? The sample space consists of 16 four-letter words, with each letter either h or t. For example, the outcome tthh refers to the penny and the nickel showing tails and the dime and quarter showing heads. There are 16 members of the sample space. Continuing Example 1. Each B i is an event containing one or more outcomes. Its members are mutually exclusive and collectively exhaustive. The experiment in Example 1. Mathematically, however, it is equivalent to many real engineering problems.

For example, observe a pair of modems transmitting four bits from one computer to another. For each bit, observe whether the receiving modem detects the bit correctly c , or makes an error e. Or, test four integrated circuits. For each one, observe whether the circuit is acceptable a , or a reject r.

In all of these examples, the sample space contains 16 four-letter words formed with an alphabet containing two letters. The concept of an event space is useful because it allows us to express any event as a union of mutually exclusive events. We will observe in the next section that the entire theory of probability is based on unions of mutually exclusive events.
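Sample spaces like these can be enumerated directly. A short Python sketch (the book's own code is MATLAB) of the 16-word coin sample space, grouping outcomes by the number of heads; reading the events Bi as "exactly i heads" is an illustrative interpretation, since the OCR lost the example's exact definition:

```python
from itertools import product

# Sample space for flipping a penny, nickel, dime, and quarter:
# 16 four-letter words over the alphabet {h, t}, as in the text.
sample_space = [''.join(w) for w in product('ht', repeat=4)]
assert len(sample_space) == 16

# Candidate event space {B0, ..., B4}: Bi collects the outcomes
# with exactly i heads (hypothetical reading of the Bi).
B = {i: [s for s in sample_space if s.count('h') == i] for i in range(5)}
print([len(B[i]) for i in range(5)])  # [1, 4, 6, 4, 1]
```

The sizes 1, 4, 6, 4, 1 sum to 16, and the Bi are mutually exclusive and collectively exhaustive, as an event space requires.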

The following theorem shows how to use an event space to represent an event as a union of mutually exclusive events. Figure 1. Many practical problems use the mathematical technique contained in the theorem. Classify each one as a voice call (v) if someone is speaking, or a data call (d) if the call is carrying a modem or fax signal. Your observation is a sequence of three letters (each letter is either v or d). For example, two voice calls followed by one data call corresponds to vvd.

Write the elements of the following sets: This leads to a set-theory representation with a sample space universal set S , outcomes s that are elements of S , and events A that are sets of elements. To complete the model, we assign a probability P[ A] to every event, A, in the sample space. With respect to our physical idea of the experiment, the probability of an event is the proportion of the time that event is observed in a large number of runs of the experiment.

This is the relative frequency notion of probability. Mathematically, this is expressed in the following axioms. Axiom 3 For any countable collection A 1 , A2 ,.

We will build our entire theory of probability on these three axioms. Axioms 1 and 2 simply establish a probability as a number between 0 and 1. Axiom 3 states that the probability of the union of mutually exclusive events is the sum of the individual probabilities. We will use this axiom over and over in developing the theory of probability and in solving problems. In fact, it is really all we have to work with. Everything else follows from Axiom 3.

To use Axiom 3 to solve a practical problem, we refer to Theorem 1. A useful extension of Axiom 3 applies to the union of two disjoint events: if A and B are mutually exclusive, then P[A ∪ B] = P[A] + P[B]. Although it may appear that this is a trivial special case of Axiom 3, it is not; in fact, a simple proof uses only the three axioms. If you are curious, Problem 1. It is a simple matter to extend the result to any finite union of mutually exclusive events. The correspondence refers to a sequential experiment consisting of n repetitions of the basic experiment. We refer to each repetition of the experiment as a trial. In these n trials, N_A(n) is the number of times that event A occurs.
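The relative-frequency idea, N_A(n)/n settling down as n grows, is easy to simulate. A Python sketch (the book uses MATLAB; the event A = "heads" on a fair coin and the seed are arbitrary choices made here for illustration):

```python
import random

# Relative-frequency interpretation: in n trials, N_A(n)/n should
# approach P[A]. Simulated for a fair-coin event A with P[A] = 0.5.
random.seed(1)  # arbitrary seed, for repeatability

def relative_frequency(n):
    n_A = sum(random.random() < 0.5 for _ in range(n))  # N_A(n)
    return n_A / n

for n in (100, 10_000, 1_000_000):
    print(n, relative_frequency(n))
```

As n increases, the printed ratios cluster ever more tightly around 0.5, which is the pattern the relative-frequency notion of probability describes.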

Another consequence of the axioms can be expressed as the following theorem. Applying Theorem 1. The expression in the square brackets is an event. Within the context of one experiment, P[A] can be viewed as a function that transforms event A to a number between 0 and 1.

In these experiments we say that the n outcomes are equally likely. What is the probability of each outcome? Find the probabilities of the following events. A score of 90 to 100 is an A, 80 to 89 is a B, 70 to 79 is a C, 60 to 69 is a D, and below 60 is a failing grade of F. While we do not supply the proofs, we suggest that students prove at least some of these theorems in order to gain experience working with the axioms.

The following useful theorem refers to an event space B1, B2, .... In this table, the rows and columns each represent an event space. This method is shown in the following example. It also observes whether calls carry voice (v), data (d), or fax (f).

This model implies an experiment in which the procedure is to monitor a call and the observation consists of the type of call, v, d, or f, and the length, l or b. In this problem, each call is classified in two ways.

Proof: Each outcome si is an event (a set with the single element si). Since outcomes are by definition mutually exclusive, B can be expressed as the union of m mutually exclusive sets.

Comments on Notation: We use the notation P[·] to indicate the probability of an event; the expression in the square brackets is an event. For convenience, we will sometimes write P[si] rather than the more formal notation for the probability of an outcome, and we abbreviate the probability of an intersection, writing P[AB] for P[A ∩ B].

Conditional probabilities correspond to a modified probability model that reflects partial information about the outcome of an experiment. The modified model has a smaller sample space than the original model. As we suggested earlier, it is sometimes useful to interpret P[A] as our knowledge of the occurrence of event A before an experiment takes place.

Thus P[A] reflects our knowledge of the occurrence of A prior to performing an experiment. Sometimes, we refer to P[A] as the a priori probability, or the prior probability, of A.

In many practical situations, it is not possible to find out the precise outcome of an experiment. Rather than the outcome s itself, we obtain information that the outcome is in the set B. That is, we learn that some event B has occurred. Conditional probability describes our knowledge of A when we know that B has occurred but we still don't know the precise outcome. The notation for this new probability is P[A|B].

We read this as "the probability of A given B." Let B denote the event that the first chip tested is rejected, and let A denote the event that the second chip tested is rejected. The chips come from a high-quality production line; therefore the prior probability P[A] is very low. In advance, we are pretty certain that the second circuit will be accepted. The conditional probability of the event A given the occurrence of the event B is P[A|B] = P[AB]/P[B], defined only when P[B] > 0. Note that P[A|B] is a respectable probability measure relative to a sample space that consists of all the outcomes in B.

This means that P[A|B] has properties corresponding to the three axioms of probability:

Axiom 1: P[A|B] ≥ 0.
Axiom 2: P[B|B] = 1.
Axiom 3: For any countable collection A1, A2, ... of mutually exclusive events, P[A1 ∪ A2 ∪ ... | B] = P[A1|B] + P[A2|B] + ....

We saw in Example 1. Therefore, the a priori probability that the second chip is rejected is low. The conditional probability of the second chip being rejected given that the first chip is rejected is, by definition, the ratio of P[AB] to P[B]. The information that the first chip is a reject drastically changes our state of knowledge about the second chip.

What is the conditional probability that the bottom card is the ace of clubs given that the bottom card is a black card? The sample space consists of the 52 cards that can appear on the bottom of the deck. Let A denote the event that the bottom card is the ace of clubs. Let B be the event that the bottom card is a black card. Given B, the conditional probability of A is P[A|B] = P[AB]/P[B] = (1/52)/(26/52) = 1/26. Let X1 and X2 denote the number of dots that appear on die 1 and die 2, respectively.

What is P[A]? What is P[B]? What is P[A|B]?


Each outcome is a pair of values (X1, X2). The rectangle represents A. The triangle represents B; it contains six outcomes. From the definition of conditional probability, we write P[A|B] = P[AB]/P[B] = 1/2. We can also derive this fact from the diagram by restricting our attention to the six outcomes in B (the conditioning event) and noting that three of the six outcomes in B (one-half of the total) are also in A.
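Conditional probabilities over equally likely outcomes reduce to counting, which is easy to check in code. A Python sketch; the events below are hypothetical stand-ins, since the OCR lost the example's exact A and B, chosen so that B has six outcomes of which half lie in A:

```python
from fractions import Fraction
from itertools import product

# Conditional probability by counting equally likely outcomes.
# Hypothetical events: B = {X1 + X2 = 7} has six outcomes, and
# A = {X1 >= 4} contains three of them, so P[A|B] = 1/2.
S = list(product(range(1, 7), repeat=2))          # 36 pairs (X1, X2)
B = [(x1, x2) for x1, x2 in S if x1 + x2 == 7]
AB = [(x1, x2) for x1, x2 in B if x1 >= 4]

P_A_given_B = Fraction(len(AB), len(B))           # |AB| / |B|
print(P_A_given_B)  # 1/2
```

Because all 36 outcomes are equally likely, P[A|B] = P[AB]/P[B] collapses to the ratio of outcome counts, exactly the "three of the six outcomes" argument in the text.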

Classify each one as either video (v) or data (d). Your observation is a sequence of three letters (each one is either v or d).

For example, three video packets corresponds to vvv. Count the number of video packets Nv in the three packets you have observed. Describe in words and also calculate the following probabilities. A partition divides the sample space into mutually exclusive sets.

The event space {Bi} is not a sample space because it lacks the finest-grain property. Learning that an experiment produces an event B1 tells you that one coin came up heads, but it doesn't tell you which coin it was. In words, this example states that the event "less than three heads" is the union of the events "zero heads," "one head," and "two heads."

It states that we can find the probability of A by adding the probabilities of the parts of A that are in the separate components of the event space. Proof: The proof follows directly from Theorem 1. In this table, the rows and columns each represent a partition. This method is shown in the following example. It classifies all emails as either long (l), if they are over 10 MB in size, or brief (b). It also observes whether the email is just text (t), has attached images (i), or has an attached video (v).

This model implies an experiment in which the procedure is to monitor an email and the observation consists of the type of email, t, i, or v, and the length, l or b. The sample space has six outcomes. In this problem, each email is classified in two ways. The sample space can be represented by a table in which the rows and columns are labeled by events and the intersection of each row and column event contains a single outcome. The corresponding table entry is the probability of that outcome.
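A two-way table like the one just described can be sketched as a dictionary keyed by (length, type). The six outcome probabilities below are hypothetical, since the OCR lost the table's actual entries; they are chosen only to sum to 1:

```python
# Two-way classification of emails: lengths {long l, brief b} by
# types {text t, image i, video v}. Entries are hypothetical.
P = {('l', 't'): 0.05, ('l', 'i'): 0.10, ('l', 'v'): 0.15,
     ('b', 't'): 0.40, ('b', 'i'): 0.20, ('b', 'v'): 0.10}
assert abs(sum(P.values()) - 1.0) < 1e-9  # a valid probability model

# Marginal probabilities come from summing a row or a column,
# which is Theorem 1 (total probability) in table form.
P_brief = sum(p for (length, _), p in P.items() if length == 'b')
P_video = sum(p for (_, kind), p in P.items() if kind == 'v')
print(P_brief, P_video)
```

Summing a row gives the probability of a length event; summing a column gives the probability of a type event.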

In this case, the table is. Thus we can apply Theorem 1. Proof: This follows from Theorem 1. The usefulness of the result can be seen in the next example. Resistors within 50 Ω of the nominal value are considered acceptable. Each hour, machine B1 produces resistors, B2 produces resistors, and B3 produces resistors. All of the resistors are mixed together at random in one bin and packed for shipment. What is the probability that the company ships an acceptable resistor?

Using the resistor accuracy information, we formulate a probability model. Now it is a simple matter to apply the law of total probability to find the probability that a resistor shipped by the company is acceptable. Bayes' theorem is a simple consequence of the definition of conditional probability. It has a name because it is extremely useful for making inferences about phenomena that cannot be observed directly.
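The law of total probability and Bayes' theorem from this passage can be sketched numerically. A Python sketch; the machine shares and per-machine acceptance rates below are hypothetical stand-ins for the figures the OCR dropped:

```python
# Law of total probability and Bayes' theorem for the resistor
# example. Machine shares and acceptance rates are hypothetical.
P_B = {'B1': 0.3, 'B2': 0.4, 'B3': 0.3}          # P[Bi], machine shares
P_A_given_B = {'B1': 0.8, 'B2': 0.9, 'B3': 0.6}  # P[A|Bi], acceptable rates

# Total probability: P[A] = sum over i of P[A|Bi] * P[Bi]
P_A = sum(P_A_given_B[b] * P_B[b] for b in P_B)

# Bayes' theorem: P[B3|A] = P[A|B3] * P[B3] / P[A]
P_B3_given_A = P_A_given_B['B3'] * P_B['B3'] / P_A
print(round(P_A, 3), round(P_B3_given_A, 3))
```

With these stand-in numbers, P[A] = 0.78 and P[B3|A] ≈ 0.23: even though B3 supplies 30% of the resistors, it accounts for less than a quarter of the acceptable ones, which is the kind of inference Bayes' theorem supports.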

For each possible state Bi, we use Bayes' theorem, P[Bi|A] = P[A|Bi]P[Bi]/P[A]. What is the probability that an acceptable resistor comes from machine B3? Using Bayes' theorem, we have P[B3|A] = P[A|B3]P[B3]/P[A]. Since all of the quantities we need are given in the problem description, the answer follows directly. Quiz 1. Classify the behavior as buying (B) if a customer purchases a smartphone.

Otherwise the behavior is no purchase (N). Find the following probabilities. Two events A and B are independent if and only if P[AB] = P[A]P[B]. To interpret independence, consider probability as a description of our knowledge of the result of the experiment. The fact that the outcome is in B is partial information about the experiment.

It is in this sense that the events are independent. Problem 1. The logic behind this conclusion is that if learning that event B occurs does not alter the probability of event A, then learning that B does not occur also should not alter the probability of A.

Keep in mind that independent and mutually exclusive are not synonyms. In some contexts these words can have similar meanings, but this is not the case in probability. Axiom 3 enables us to add their probabilities to obtain the probability of the union. Definition 1. Are the events B1 and B2 independent? Learning whether or not the event B2 (second customer downloads a phone) occurs drastically affects our knowledge of whether or not the event N2 (second customer does not buy a phone) occurs.

Learning whether or not the event B2 (second customer downloads a phone) occurs does not affect our knowledge of whether or not the event B1 (first customer downloads a phone) occurs. In many practical applications we reason in the opposite direction. Our knowledge of an experiment leads us to assume that certain pairs of events are independent. We then use this knowledge to calculate the probabilities of other events. A mechanical test determines whether pins have the correct spacing, and an electrical test checks the relationship of outputs to inputs. We assume that electrical failures and mechanical failures occur independently.

Our information about circuit production tells us that mechanical failures occur with a given probability. What is the probability model of an experiment that consists of testing an integrated circuit and observing the results of the mechanical and electrical tests?

Let M and E denote the events that the mechanical and electrical tests are acceptable. Using the independence assumption and Definition 1, we can compute the probability of every outcome of the experiment. Thus far, we have considered the independence of a pair of events.
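The probability model for the circuit-test experiment follows directly from the independence assumption. A Python sketch; the failure probabilities below are hypothetical, since the OCR truncated the values given in the text:

```python
# Probability model for independent mechanical (M) and electrical (E)
# test outcomes. Failure probabilities are hypothetical stand-ins.
p_mech_fail = 0.05
p_elec_fail = 0.20

P_M = 1 - p_mech_fail   # P[M]: mechanical test acceptable
P_E = 1 - p_elec_fail   # P[E]: electrical test acceptable

# Under independence, joint probabilities are products:
# P[ME] = P[M] * P[E], and likewise for the other three outcomes.
P_ME = P_M * P_E                         # both tests acceptable
P_both_fail = p_mech_fail * p_elec_fail  # both tests fail
print(round(P_ME, 4), round(P_both_fail, 4))
```

The four products P[M]P[E], P[M](1-P[E]), (1-P[M])P[E], and (1-P[M])(1-P[E]) sum to 1, so the independence assumption yields a complete probability model for the experiment.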

Often we consider larger sets of independent events. For more than two events to be independent, the probability model has to meet a set of conditions: each pair of events must be independent, and the probability of the intersection of all the events must equal the product of the individual probabilities. The final condition is a simple extension of Definition 1. These three sets satisfy the final condition of Definition 1.

On the other hand, if we know that n events are mutually independent, we can compute the probability of the intersection of any subset of them: just multiply. Classify each one as video (v) if it is a video packet, or data (d) otherwise. Your observation is a sequence of two letters (either v or d). For example, two video packets corresponds to vv. Denote the identity of packet i by Ci. MATLAB programs in this book can be downloaded from the companion website. Like a sophisticated scientific calculator, MATLAB performs numerical calculations.

It can also simulate experiments with random outcomes. To simulate experiments, we need a source of randomness. MATLAB uses a computer algorithm, referred to as a pseudorandom number generator, to produce a sequence of numbers between 0 and 1.

The calculation of each random number is similar to an experiment in which all outcomes are equally likely and the sample space is all binary numbers of a certain length.

Each number is interpreted as a fraction, with a binary point preceding the bits in the binary number. A MATLAB simulation of an experiment starts with rand: rand(m,n) produces an m x n array of random numbers. Similarly, rand(n) produces an n x n array, and rand(1) is just a scalar random number. Each number produced by rand(1) is in the interval (0, 1).

Each time we use rand, we get new, seemingly unpredictable numbers. Finally, we can initialize the random number generator with a seed. Initializing the random number generator with the same seed always generates the same sequence. However, it can be instructive to use the same repeatable sequence of rand values. Repeat this experiment 5 times.
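The seeding behavior has a direct analogue in Python's random module (a sketch shown here because the book itself uses MATLAB's rand): the same seed reproduces the same pseudorandom sequence.

```python
import random

# Python analogue of MATLAB's rand and seeding (illustration only).
random.seed(1)                              # fix the seed
first = [random.random() for _ in range(5)]  # five pseudorandom numbers

random.seed(1)                              # reset to the same seed
second = [random.random() for _ in range(5)]

# Identical seeds yield identical, repeatable sequences.
assert first == second
# Each value lies in [0, 1), mirroring rand's (0, 1) behavior closely.
assert all(0 <= x < 1 for x in first)
```

Repeatable sequences are useful for debugging a simulation: the "random" inputs stay fixed while the program logic changes.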

Problems. Difficulty: Easy, Moderate, Difficult, Experts Only. All pizzas have cheese but not all pizzas have tomato sauce. Roman pizzas can ...; Neapolitan pizzas always have tomato ... (a) Are N and M mutually exclusive?

A Neapolitan ...tive? State ... cheese. Draw a Venn diagram that shows this condition. In words, ... A user of the wi-fi connection can transmit a short signal corresponding to, for example, a high-speed mouse-click transmission. (f) Are C and D mutually exclusive? ... the first card. How many outcomes are in the event that the first card is a heart?

Give four examples of partitions. Either a circuit is acceptable (a) or it fails (f). For example, aaf is the observation that ... from machines X, Y, and Z, respectively. Let Ri be the ... Let Wj be the event that the white die rolls j.

For any collection of events A1, ... Which axioms of probability are needed to prove the fact? ...onds. Slow programs (W) require at least ... Observe the length of the source code and the run time. Calculate the following probabilities: P[W], P[B], ... Phone calls can be classified by the traveling speed of the user as they move from cell to cell. During a call, a telephone either performs zero hand... In addition, each call is ... The probability model for this experiment has the following information: ... the experiment?

The following table describes the probabilities of the possible ... What is the probability that the first card is a heart? Let Ri denote the event that the student ... gets an F by getting less than 4?

Let E denote the event that the pea has yellow seeds. The two groups were crossbred so that each pea plant in the ... What is the probability that 3 is rolled given that the roll is greater than 1? You draw three cards. Let Ci denote the event that card i is picked. Let Ei denote the event that ... What is P[E2|O1], the conditional probability ...

If you buy one Banana at full price, ... the first card is odd? What is the probability that a couple buys a pair of Bananas? ... Lyme disease and human granulocytic ehrlichiosis ...

If you monitor ... Example 1. To generate a ... the probability that the two phones sold are the same? Charge for M minutes ... event areas are proportional to their probabilities ... events that are independent. Pairwise independent events meet the first three conditions of Definition 1 ... many visibly different kinds of pea plants.

One of Mendel's most significant results was the conclusion that genes determine ... Mendel ... (b) A^c and B are independent. In this notation, rr denotes a pea with two "round" genes.

The first generation were either rwyg, rwgy, wryg, or wrgy. ... C that are independent. What is the probability P[R] that a second-generation plant has round peas? ... a vector T of independent test scores ... equally likely. Each ... In analyzing sequential experiments, tree diagrams display the outcomes of the subexperiments in a sequential experiment.

The labels of the branches are probabilities and conditional probabilities. Many experiments consist of a sequence of subexperiments. The procedure followed for each subexperiment may depend on the results of the previous subexperiments. We often find it useful to use a type of graph referred to as a tree diagram to represent the sequence of subexperiments. To do so, we assemble the outcomes of each subexperiment into sets in a partition.

Each branch represents an outcome of a subexperiment. The events in the partition of the second subexperiment appear as branches growing from every node at the end of the first subexperiment. The labels of the branches are probabilities and conditional probabilities. Some of them have their roots on top and leaves on the bottom.

The nodes at the end of the final subexperiment are the leaves of the tree. Each leaf corresponds to an outcome of the entire sequential experiment.

The probability of each outcome is the product of the probabilities and conditional probabilities on the path from the root to the leaf. This is a complicated description of a simple procedure, as we see in the following example. The experiment of testing a resistor can be viewed as a two-step procedure.

First we identify which machine (B1, B2, or B3) produced the resistor. Second, we find out if the resistor is acceptable. Draw a tree for this sequential experiment. What is the probability of choosing a resistor from machine B2 that is not acceptable? To use the tree ... 0. This is a consequence of the law of total probability. Moreover, Axiom 2 implies that the probabilities of all of the leaves add up to 1. Example 2.

In particular, the timing was designed so that with probability 0. ... Assuming the first light is equally likely to be red or green, what is the probability P[G2] that the second light is green? Also, what is P[W], the probability that you wait for at least one of the first two lights?

Lastly, what is P[G1|R2], the conditional probability that the first light is green given that the second light is red?

The probability that the second light is green is ... An alternative way to get the same answer is to observe that W is also the complement of the event that both lights are green. Thus, ... Coin 1 is biased. Suppose you pick a coin at random and flip it. Let Ci denote the event that coin i is picked.

Let H and T denote the possible outcomes of the flip. Given that the outcome of the flip is a head, what is P[C1|H], the probability that you picked the biased coin? Given that the outcome is a tail, what is the probability P[C1|T] that you picked the biased coin?
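These two conditional probabilities follow from Bayes' rule. A minimal Python sketch, assuming (for illustration only, since the text's bias value is truncated) that the biased coin 1 shows heads with probability 3/4 while coin 2 is fair:

```python
# Bayes' rule for the two-coin experiment.
# The bias 0.75 for coin 1 is an ASSUMED value for illustration.
p_C1, p_C2 = 0.5, 0.5          # each coin equally likely to be picked
p_H_given_C1 = 0.75            # assumed bias of coin 1
p_H_given_C2 = 0.5             # coin 2 is fair

# Law of total probability for the flip outcome.
p_H = p_H_given_C1 * p_C1 + p_H_given_C2 * p_C2

# Bayes' rule: posterior probability that the biased coin was picked.
p_C1_given_H = p_H_given_C1 * p_C1 / p_H
p_C1_given_T = (1 - p_H_given_C1) * p_C1 / (1 - p_H)
```

With these assumed numbers, heads makes coin 1 more likely than not (0.6) while tails makes it less likely (1/3), matching the intuition that a head is weak evidence for the head-biased coin.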

Tree diagrams provide a clear explanation of the answer. Your goal is to select the door that hides the car. You make a preliminary selection and then a final selection. The game proceeds as follows: You select a door. The host, Monty Hall (who knows where the car is hidden), opens one of the two doors you didn't select to reveal a goat. Monty then asks you if you would like to switch your selection to the other unopened door.

After you make your choice (either staying with your original door or switching doors), Monty reveals the prize behind your chosen door.

To maximize your probability P[C] of winning the car, is switching to the other door (a) a good idea, (b) a bad idea, or (c) does it make no difference? To solve this problem, we will consider the "switch" and "do not switch" policies separately. That is, we will construct two different tree diagrams: the first describes what happens if you switch doors, while the second describes what happens if you do not switch.

First we describe what is the same no matter what policy you follow. Suppose the doors are numbered 1, 2, and 3. Let Hi denote the event that the car is hidden behind door i. Also, let's assume you first choose door 1. Whatever door you do choose, that door can be labeled door 1, and it would not change your probability of winning. Now let Ri denote the event that Monty opens door i, which hides a goat. If the car is behind door 1, Monty can choose to open door 2 or door 3 because both hide goats.

He chooses door 2 or door 3 by flipping a fair coin. If the car is behind door 2, Monty opens door 3, and if the car is behind door 3, Monty opens door 2. Let C denote the event that you win the car and G the event that you win a goat. After Monty opens one of the doors, you decide whether to change your choice or stay with your choice of door 1. Finally, Monty opens the door of your final choice, either door 1 or the door you switched to. The tree diagram in Figure 2 shows the switch policy: if the car is behind door 1 and Monty opens door 2 (event R2), you switch to door 3 and then Monty opens door 3 to reveal a goat (event G).

On the other hand, if the car is behind door 2, Monty reveals the goat behind door 3; you switch to door 2 and win the car. Similarly, if the car is behind door 3, Monty reveals the goat behind door 2; you switch to door 3 and win the car. For always switch, we see that ...

If you do not switch, the tree is shown in Figure 2. In this tree, when the car is behind door 1 (event H1) and Monty opens door 2 (event R2), you stay with door 1 and then Monty opens door 1 to reveal the car. On the other hand, if the car is behind door 2, Monty will open door 3 to reveal the goat.

Since your final choice was doo r 1, Monty opens door 1 to rev ea I the goat. For do not switch ,. Note that the two trees look largely the sa me because the key step where you make a choice is somewhat hidden because it is impl ied by t he f irst t wo bra nches fol lowed in the tree. Quiz 2. However , paging atternpts don't alvva: Consequent ly, t 11e system v.

If the results of all paging attempts are independent and a single paging attempt succeeds with probability 0. ... In all applications of probability theory it is important to understand the sample space of an experiment. The number of outcomes can be enormous, as in the following simple example. Display the cards in the order in which you choose them. How many different sequences of cards are possible?

The procedure consists of seven subexperiments. In each subexperiment, the observation is the identity of one card. The first subexperiment has 52 possible outcomes corresponding to the 52 cards that could be drawn.

For each outcome of the first subexperiment, the second subexperiment has 51 possible outcomes corresponding to the 51 remaining cards. Therefore there are 52 x 51 outcomes of the first two subexperiments. The total number of outcomes of the seven subexperiments is 52 x 51 x 50 x 49 x 48 x 47 x 46.
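The seven-factor product can be checked directly; a short Python sketch (not the book's code) confirms it equals the factorial form 52!/45!:

```python
import math

# Number of ordered seven-card sequences from a 52-card deck:
# 52 choices for the first card, 51 for the second, ..., 46 for the seventh.
n_sequences = 1
for cards_left in range(52, 45, -1):   # 52, 51, ..., 46
    n_sequences *= cards_left

# The same count via the factorial form 52! / (52 - 7)!.
assert n_sequences == math.factorial(52) // math.factorial(45)
```

The loop is the fundamental counting principle applied seven times; the factorial ratio is just a compact way of writing the same product.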

If one subexperiment has k outcomes and the other subexperiment has n outcomes, then the combined experiment has nk outcomes. The first subexperiment is "Flip a coin and observe either heads H or tails T." In Example 2, ... In general, an ordered sequence of k distinguishable objects is called a k-permutation.

To find (n)k, suppose we have n distinguishable objects, and the experiment is to choose a sequence of k of these objects. There are n choices for the first object, n - 1 choices for the second object, etc. Therefore, the total number of possibilities is (n)k = n(n-1)...(n-k+1). Sampling without Replacement. Sampling without replacement corresponds to a sequential experiment in which the sample space of each subexperiment depends on the outcomes of previous subexperiments.

Choosing objects randomly from a collection is called sampling, and the chosen objects are known as a sample. A k-permutation is a type of sample obtained by specific rules for selecting objects from the collection. Different outcomes in a k-permutation are distinguished by the order in which objects arrive in a sample. For example, in many card games, only the set of cards received by a player is of interest. In contrast to this example with six outcomes, the next example shows that the k-permutation corresponding to an experiment in which the observation is the sequence of two letters has 4!...

Example 2. Each subset is called a k-combination. The words for this number are "n choose k," the number of k-combinations of n objects. Choose a k-combination out of the n objects. Choose a k-permutation ... Theorem 2. By Theorem 2, ... Since there are (n)k possible outcomes of the combined experiment, ...

Rearranging the terms yields our next result: (n choose k) = (n)k / k! = n! / (k! (n-k)!). In addition, we observe that (n choose k) = (n choose n-k). The logic behind this identity is that choosing k out of n elements to be part of a subset is equivalent to choosing n - k elements to be excluded from the subset.
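Both the factorial formula and the symmetry identity are easy to verify numerically; a minimal Python sketch using math.comb:

```python
import math

# C(n, k) = n! / (k! (n-k)!), and the symmetry C(n, k) = C(n, n-k):
# including k elements is the same choice as excluding the other n-k.
n = 52
for k in range(n + 1):
    direct = math.factorial(n) // (math.factorial(k) * math.factorial(n - k))
    assert direct == math.comb(n, k) == math.comb(n, n - k)
```

The integer division is exact here because k!(n-k)! always divides n! evenly.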

Here, we adopt the following convention: Definition 2. ... By contrast, we found in Example 2 ... The ratio is 7! ... There are 11 players on a basketball team. The starting lineup consists of five players. A baseball team has 15 field players and ten pitchers. Each field player can take any of the eight nonpitching positions. The starting lineup consists of one pitcher and eight field players. For each choice of starting lineup, the manager must submit to the umpire a batting order for the 9 starters.

The number of possible batting orders is N x 9! ... You are given seven cards at random from the deck. What is the probability that you have no queens? Consider an experiment in which the procedure is to select seven cards at random from a set of 52 cards and the observation is to determine if there are one or more queens in the selection. The probability of receiving no queens is the ratio of the number of outcomes with no queens to the number of outcomes in the sample space.

Another way of analyzing this experiment is to consider it as a sequence of seven subexperiments. The first subexperiment consists of selecting a card at random and observing whether it is a queen.

The probability of the event N7, that no queen is received in your seven cards, is the product of the probabilities of the branches leading to N7. Sampling with Replacement. Consider selecting an object from a collection of objects, replacing the selected object, and repeating the process; this is called sampling with replacement. Each selection is the procedure of a subexperiment. In this section we ... In the next section we derive probability models for experiments that specify sampling with replacement.
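Both versions of the seven-card question can be checked numerically. A Python sketch computes the no-queens probability without replacement in two equivalent ways (combination ratio and tree-branch product), and also the with-replacement variant treated in the next example:

```python
import math

# Without replacement: ratio of combinations ...
p_no_replace = math.comb(48, 7) / math.comb(52, 7)

# ... and the product of conditional probabilities along the tree branches.
p_branches = 1.0
for i in range(7):
    p_branches *= (48 - i) / (52 - i)  # no queen on draw i+1, given none so far

# With replacement: the 7 draws are independent, each avoids the 4 queens.
p_replace = (48 / 52) ** 7
```

Replacement makes avoiding queens slightly easier (about 0.571 versus 0.550), since earlier "safe" draws never deplete the non-queen cards.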

You are given seven cards at random from the deck. After receiving each card you return it to the deck and receive another card at random. Observe whether you have received any queens among the seven cards you were given. What is the probability that you have received no queens?

The sample space contains 52^7 outcomes. There are 48^7 outcomes with no queens. If this experiment is considered as a sequence of seven subexperiments, the tree looks the same as the tree in Example 2. It is possible to connect two memory cards, two cameras, or two printers to the laptop. How many ways can we use the two USB slots?

Let xy denote the outcome that device type x is used in slot A and device type y is used in slot B. The sample space S contains nine outcomes. The fact that Example 2 ... Since we ... Hence, by the fundamental counting principle, ...


In Example 2, ... This result generalizes naturally when we ... The experiment consists of a sequence of n identical subexperiments. Sampling with replacement ... Using xi to denote the outcome of the ith subexperiment, the result for n repetitions of the subexperiment is a sequence x1, ..., xn.

Each microprocessor is tested to determine whether it runs reliably at an acceptable clock speed. In testing four microprocessors, the observation sequence, x1 x2 x3 x4, is one of 16 possible outcomes. Note that we can think of the observation sequence x1, ..., xn ... We use the notation xi to denote the grade of the ith student.

For example, the grades for the class could be ... In Example 2, ... A more challenging problem than finding the number of possible combinations of ... objects sampled with replacement from a set of n objects is to calculate the number of observation sequences such that each ...

Example 2 ... Even in this simple example it is not a simple matter to determine all of the outcomes, and in most practical applications ... More generally, for length-n binary words with n1 1's, we choose (n choose n1) ... Theorem 2.

Of course, there are ... The notation for the number of outcomes is ... It is referred to as the multinomial coefficient. To derive a formula for the multinomial coefficient, we generalize the logic used in deriving the formula for (n choose k).

The details can be found in the proof of the following theorem. Start with n empty slots and perform the following sequence of subexperiments. Subexperiment 0: label n0 slots as s0. There are (n choose n0) ways to do this. After n0 slots have been labeled, there are (n - n0 choose n1) ways to perform subexperiment 1. From the fundamental counting principle, ... When there are ten students in a probability class, the professor always issues two grades of A, three grades of B, three grades of C, and two grades of F.

How many different ways can the professor assign grades to the ten students? The number of ways that fit the curve is the multinomial coefficient 10! / (2! 3! 3! 2!). An example of a code word is ... Independent trials are identical subexperiments in a sequential experiment.
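The grade-assignment count above is a multinomial coefficient; a short Python sketch evaluates it for the (2, 3, 3, 2) curve:

```python
import math

# Multinomial coefficient: total! / (n0! n1! ... nm-1!).
def multinomial(counts):
    total = sum(counts)
    coef = math.factorial(total)
    for c in counts:
        coef //= math.factorial(c)   # exact integer division
    return coef

# Two A's, three B's, three C's, two F's among ten students.
ways = multinomial([2, 3, 3, 2])     # 10! / (2! 3! 3! 2!)
```

With only one grade category the coefficient collapses to 1, and with two categories it reduces to the ordinary binomial coefficient.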

The probability models of all the subexperiments are identical and independent of the outcomes of previous subexperiments. The results of all trials of the subexperiment are mutually independent.

An outcome of the complete experiment is a sequence of successes and failures denoted by a sequence of ones and zeroes. For example, ... To find P[En0,n1], we first consider an example. To find P[E2,3], we observe that the outcomes with three successes in five trials are 11100, 11010, 11001, 10110, 10101, 10011, 01110, 01101, 01011, and 00111. We note that the probability of each outcome is a product of five probabilities, each related to one subexperiment.

In outcomes with three successes, three of the probabilities are p and the other two are 1 - p. Therefore each outcome with three successes has probability (1 - p)^2 p^3. From Theorem 2, ... To find P[E2,3], we add up the probabilities associated with the 10 outcomes with 3 successes, yielding P[E2,3] = 10 (1 - p)^2 p^3. If we randomly test ... resistors, what is the probability of Ti, the event that i resistors test acceptable? Testing each resistor is an independent trial with a success occurring when a resistor is acceptable.
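The pattern here is the binomial probability of exactly k successes in n independent trials. A Python sketch (with p = 0.5 chosen only for illustration):

```python
import math

# P[exactly k successes in n trials] = C(n, k) p^k (1 - p)^(n - k).
def binomial_pmf(n, k, p):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

# The E_{2,3} case: 3 successes in 5 trials, here with an assumed p = 0.5.
p_three_of_five = binomial_pmf(5, 3, 0.5)   # 10 * (1/2)^5 = 0.3125
```

Summing the pmf over k = 0, ..., n gives 1, since the events "exactly k successes" partition the sample space.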

This shows that although we might expect the number acceptable to be close to 78, that does not mean that the probability of exactly 78 acceptable is high. Thus the information "zero" is transmitted as 00000 and "one" is transmitted as 11111. The receiver detects the correct information if three or more binary symbols are received correctly. In this case, we have five trials corresponding to the five times the binary symbol is sent.


On each trial, a success occurs when a binary symbol is received correctly. The error event E occurs when the number of successes is strictly less than three: ... By increasing the number of binary symbols per information bit from 1 to 5, the cellular phone reduces the probability of error by more than one order of magnitude, from 0. ...
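The repetition-code error probability is a binomial tail sum. A Python sketch, with the per-symbol success probability q = 0.9 assumed for illustration (the text's numbers are truncated):

```python
import math

# 5-fold repetition code: the receiver errs when fewer than 3 of the 5
# symbols arrive correctly. q = 0.9 is an ASSUMED per-symbol success prob.
q = 0.9
p_error = sum(math.comb(5, k) * q**k * (1 - q)**(5 - k) for k in range(3))

# Without coding, a single symbol error loses the bit.
p_single = 1 - q
```

With these assumed numbers the coded error probability is about 0.0086 versus 0.1 uncoded, i.e. better by more than an order of magnitude, at the cost of sending five symbols per bit.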

An outcom e of t he experiment cor1s ist s of a sequen ce of ri subexperimer1t out - corr1es. In t he probability t ree of t he ex perirr1ent, eac11 r1ode 11as m branches and branch i has probability 'fJi The probability of an ot1tcorr1e of t11e seql1en tial experi- rr1er1t is just t he prod uct of t hen, branch probabilit ies on a path frorn the root of t he t r ee t o t he leaf represent ing t he ot1tcorr1e.

Note that the notation En0,... ... To calculate P[En0,...], ... Applying Theorem 2, ... Example 2.

From T heorem 2. T he packet has been coded in Sl1ch a v. If rnore thar1 three bits are received in error, then the packet is decoded with errors. Sequential experimer1ts are models for practical processes that depend on several operations to succeed.

In sorr1e cases, the processes cor1tain redundant componen ts t hat protect the en t ire process frorr1 the failure of one or more com- por1er1ts. Let Wi denote the event t hat corr1por1ent i succeeds. As depict ed in F igt1re 2.

Components in series. The operation succeeds if all of its components succeed. One example of such an operation is a sequence of computer programs in which each program after the first one uses the result of the previous program.

Therefore, the complete operation fails if any component program fails. The probability that the operation succeeds is P[W] = P[W1] P[W2] ... P[Wn]. With components in series, the probability of a successful operation is lower than the success probability of the weakest component. Components in parallel. This operation occurs when we introduce redundancy to promote reliability. If the independent components in parallel have different success probabilities p1, p2, ...

In a redundant system, such as a space shuttle, there are n computers on board so that the shuttle can continue to function as long as at least one computer operates successfully. If the components are in parallel, the operation fails when all elements fail, so P[W] = 1 - (1 - P[W1]) ... (1 - P[Wn]). Figure 2: on the left is the original operation; on the right is the equivalent operation with each pair of series components replaced with an equivalent component. We can analyze ... Draw a diagram of the operation and calculate the probability that the operation succeeds.

A diagram of the operation is shown in Figure 2. The entire operation then consists of W5 and W6 in parallel, which is also shown in Figure 2. The success probability of the operation is ... Note that in Equation 2, ...
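The series and parallel composition rules used above can be captured in a short sketch (the component success probability 0.9 is assumed for illustration; any composite operation reduces by applying these two functions repeatedly):

```python
# Reliability of independent components combined in series or in parallel.
def series(*probs):
    out = 1.0
    for p in probs:          # succeeds only if every component succeeds
        out *= p
    return out

def parallel(*probs):
    fail = 1.0
    for p in probs:          # fails only if every component fails
        fail *= 1 - p
    return 1 - fail

p = 0.9                      # ASSUMED component success probability
p_series = series(p, p)      # weaker than either component alone
p_parallel = parallel(p, p)  # redundancy improves reliability
```

A mixed diagram, such as two series pairs placed in parallel, is just a nested call: parallel(series(p, p), series(p, p)).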

The reason is that for the process to be successful, all components must be successful. On the other hand, we ... Each failure probability is the difference between 1 and the success probability. Hence in Equation 2, ... MATLAB code can yield quick simulations of many experiments. In each experiment, flip a coin ... times and record the number of heads in a vector Y such that the jth element Y(j) is the number of heads in the jth subexperiment. Since Y sums X across the first dimension, Y(j) is the number of heads in the jth subexperiment.

Each Y(j) is between 0 and ... and generally in the neighborhood of ... The output of a sample run is shown in Figure 2. Your output should be a 4 x 1 vector X such that X(i) is the number of grade i microprocessors. Note that "help hist" will show the variety of ways that the hist function can be called. In particular, hist(G,T) creates bins centered around each T(j) and counts the number of elements of G that fall into each bin.
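The coin-flip simulation has a direct Python analogue (a sketch only: the experiment counts are truncated in the text, so 100 experiments of 100 flips each are assumed here):

```python
import random

# Repeat 100 experiments; in each, flip a fair coin 100 times and record
# the number of heads. The counts 100/100 are ASSUMED for illustration.
random.seed(0)                      # repeatable run
n_experiments, n_flips = 100, 100
Y = [sum(random.random() < 0.5 for _ in range(n_flips))
     for _ in range(n_experiments)]

# Each count lies between 0 and n_flips, typically near n_flips / 2.
assert all(0 <= y <= n_flips for y in Y)
```

A histogram of Y (MATLAB's hist, or collections.Counter in Python) shows the counts clustering around 50, the analogue of the sample run described above.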

In MATLAB all variables are assumed to be matrices. In addition, we ... Thus x and X in a MATLAB program are different variables. Simulate ... thick coin flips. Your output should be a 3 x 1 vector X such that X(1), X(2), and X(3) are the number of occurrences of heads, tails, and edge. Problems. Difficulty: Easy, Moderate, Difficult, Experts Only. A test for ... the result of flip i.

Suppose the test gives the correct answer ... of the time. What is P[-|H], the conditional probability that a person tests negative given that ... (b) What is the probability ... the first flip is heads given that the second flip is tails? When the first photodetector is acceptable, the second photodetector is acceptable ... If the player makes exactly one free throw, the game goes into overtime. However, if the ... (a) Find the probability that exactly one photodetector of a pair is acceptable.


What is the probability that the game ... detectors in a pair are defective? Coin A ... You pick a coin randomly. If the flip is heads, you guess that the flipped coin is B; otherwise, you guess that ... Use Hi and Ti to denote the result of flip i. Let B1 be the event that ... What is the probability P[C] that your guess is correct?

Are H1 and H2 independent? Explain your answer. ... counted in the attempt to win two of three ... Let Hi be the event that ... Let Ti be the event that tails occurs on flip i. The defect is rare; a newborn infant will have the defect ... In the general exam of a newborn, a particular heart arrhythmia A occurs with probability ... (a) Draw the tree for this experiment. Label the probabilities of all outcomes.

What is P[D]? In a newborn with the defect, the lab test is positive with probability ... detectors produced by the machine in Problem 2. ... Without water, the plant has a 90 percent chance of dying. ... infant has heart surgery performed for ...