Consistency of Parameter Estimates in Statistics

Contents (3 segments)

Segment 1 (00:00 - 05:00)

Welcome back. We've been talking about parameter estimation, a statistical method for fitting the parameters of a probability distribution from data, and today I want to introduce the notion of the consistency of a parameter estimate θ̂. Let me write down what we're talking about. We have some probability distribution of x given parameters θ, maybe a Poisson distribution or a normal distribution, and given some measurement data X, I want to find the best estimate θ̂ of the parameters of that distribution. In the case of a Poisson I'd be estimating the rate parameter λ; in the case of a normal distribution I'd be estimating the mean and the variance, μ and σ². This idea of consistency is super important: it tells us whether or not this estimate converges to the true parameter values in the large-n limit, the limit of a large data sample.

So I'm going to define what I mean by consistency, and then we're going to state the fact that the parameter estimate θ̂ obtained through the method of moments is in fact a consistent estimate of the parameters.

Consistency: suppose θ̂ is an estimate of a true parameter θ, based on a sample of size n. I'll write θ̂_n, with the subscript making explicit that the estimate comes from a sample of size n. Then θ̂_n is consistent if it converges to the true value θ in probability as n goes to infinity. We've seen this notion before, when we looked at the law of large numbers; "converges in probability" is a precise mathematical definition, which I'll write out now.

Remember that θ̂ is a random variable, because it is a function of a bunch of samples which are themselves random variables: each of the X's, the data I collect as a statistician, is a random variable drawn from this distribution. So the estimate itself is a random variable with a mean, a standard deviation, a whole distribution of its own. For this estimate to converge to the true value, the probability of it being far from the true value has to vanish. Mathematically:

P(|θ̂_n − θ| > ε) → 0 as n → ∞, for every ε > 0.

In words: the probability that my estimate is more than ε away from the true value goes to zero as n goes to infinity. You could formulate this with epsilons and deltas, as in calculus, but this is how to write it. So consistency means that, as n goes to infinity, the probability that our estimate is more than ε away from the true value goes to zero for all positive ε. Intuitively, the distribution of θ̂_n has to concentrate around the true value θ: its mean has to approach the true value and its variance has to go to zero as n goes to infinity for this probability to go to zero. And now I'm going to state a fact about the method of moments that I think is pretty useful:
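As an aside not in the lecture, the definition above can be checked numerically. The sketch below is a minimal illustration assuming NumPy; the true rate λ = 3, the tolerance ε = 0.1, and the sample sizes are arbitrary choices. It estimates P(|λ̂_n − λ| > ε) by Monte Carlo, where λ̂_n is the sample mean of n Poisson draws, and shows that probability shrinking as n grows.

```python
import numpy as np

rng = np.random.default_rng(1)
lam_true = 3.0   # true Poisson rate (chosen for illustration)
eps = 0.1        # the tolerance epsilon in the definition
trials = 2000    # Monte Carlo repetitions per sample size

p_far_by_n = {}
for n in (10, 100, 1000):
    # Each row is one experiment: n Poisson draws, lambda-hat = sample mean.
    lam_hats = rng.poisson(lam_true, size=(trials, n)).mean(axis=1)
    # Fraction of experiments where the estimate missed by more than eps:
    # a Monte Carlo estimate of P(|lambda-hat_n - lambda| > eps).
    p_far_by_n[n] = np.mean(np.abs(lam_hats - lam_true) > eps)
    print(f"n = {n:5d}   P(|error| > {eps}) ≈ {p_far_by_n[n]:.3f}")
```

The printed probabilities decrease toward zero as n grows, which is exactly convergence in probability of λ̂_n to λ.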

Segment 2 (05:00 - 10:00)

The method of moments estimates, which we call θ̂, are consistent. This is a fact, and I am not going to prove it; it's actually a pretty good exercise for making sure you understand the method of moments, but I'll walk you through approximately how the proof goes. The idea is that the estimates θ̂ are a function of the estimated moments μ̂_k. Remember, the first moment is the expected value of x, the second is the expected value of x², and so on for higher and higher moments. What you need to prove first, a lemma if you will, is that the estimated moments μ̂_k are consistent, meaning that they converge to the true moments in probability as n goes to infinity ("in probability" meaning exactly the expression above).

You've already seen an example of this: the law of large numbers is essentially a statement of this lemma for the special case k = 1. It says that the sample mean of your data,

μ̂_1 = (1/n) Σ_{i=1}^{n} X_i,

the estimated first moment or estimated expectation value, converges to the true mean as n goes to infinity. We've already stated and proven the law of large numbers; remember, we used the Markov and Chebyshev inequalities to prove it.

So, first off, you don't need to prove this yourself; you can take my word that the method of moments estimates θ̂ are consistent, meaning they converge in probability to the true value: the variance of that random variable goes to zero as n goes to infinity, and its expected value approaches the true parameter value we're trying to estimate. But if you want to make sure you understand all of these concepts, the method of moments and the law of large numbers, you can actually prove it: first show that the estimated moments μ̂_k are consistent, i.e. that they converge in probability to the true moments. The k = 1 case is the law of large numbers, so go back to that lecture, watch how we proved it using the Markov and Chebyshev inequalities, and extend that argument to k = 2, 3, 4 and all k. Once all of these moments are consistent, you can show that the estimated parameters θ̂ are consistent as well. That would be a really nice exercise, and if you can do it you'll have really good mastery over all of this material.

One last point, a couple of little notes so I don't forget. Consistency essentially means that θ̂ is an asymptotically unbiased estimator of the true θ: in the large-n limit, the expected value of θ̂ equals the true value of θ. So that's what consistency means, and you can
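A hedged sketch of the method-of-moments recipe described here, for the normal case (not from the lecture; it assumes NumPy, and the true values μ = 1.5, σ = 2 are arbitrary). The estimated moments μ̂_1 and μ̂_2 are plugged into the moment equations μ = μ_1 and σ² = μ_2 − μ_1², and because the sample moments are consistent, the resulting estimates tighten around the true parameters as n grows.

```python
import numpy as np

rng = np.random.default_rng(2)
mu_true, sigma_true = 1.5, 2.0   # true parameters (illustrative)

for n in (100, 10_000, 1_000_000):
    x = rng.normal(mu_true, sigma_true, size=n)
    m1 = x.mean()          # estimated first moment,  mu_1-hat
    m2 = (x ** 2).mean()   # estimated second moment, mu_2-hat
    # Invert the moment equations: mu = mu_1, sigma^2 = mu_2 - mu_1^2.
    mu_hat = m1
    sigma2_hat = m2 - m1 ** 2
    print(f"n = {n:8d}   mu-hat = {mu_hat:.4f}   sigma^2-hat = {sigma2_hat:.4f}")
```

With σ² = 4, the printed estimates approach (1.5, 4.0) as n increases, illustrating consistency of the method-of-moments estimates through consistency of the sample moments.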

Segment 3 (10:00 - 10:00)

prove it for the method of moments; that's the thumbnail sketch. It's also true for the maximum likelihood estimate, which we'll be talking about soon: the maximum likelihood estimate is also a consistent, asymptotically unbiased estimate of the true parameter values. So this is really useful, and it's related to the law of large numbers and the things we've looked at before. Okay, thank you.
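As a preview of the maximum likelihood estimate mentioned above, here is a minimal sketch, not from the lecture, assuming NumPy; the grid search is used only for transparency, and λ = 3 with 5000 samples are arbitrary choices. For Poisson data the log-likelihood is (Σ xᵢ) log λ − nλ plus a constant, and its maximizer over a fine grid lands on the sample mean, which is also the method-of-moments estimate of λ.

```python
import numpy as np

rng = np.random.default_rng(3)
lam_true = 3.0                       # true rate (illustrative)
x = rng.poisson(lam_true, size=5000)
n = len(x)

# Poisson log-likelihood up to an additive constant: (sum x_i) log(lam) - n*lam.
lams = np.linspace(1.0, 6.0, 501)    # grid over candidate rates, spacing 0.01
loglik = x.sum() * np.log(lams) - n * lams
lam_mle = lams[np.argmax(loglik)]    # grid maximizer of the log-likelihood

print(f"sample mean = {x.mean():.3f}   grid MLE = {lam_mle:.2f}")
```

The grid MLE agrees with the sample mean to within the grid spacing, consistent with the closed-form result that the Poisson MLE of λ is exactly the sample mean.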

Other videos by the author — Steve Brunton
