How is it possible for both the likelihood and log-likelihood to be asymptotically normal?
I was trying to understand asymptotic normality of the posterior better, and came across a confusing point. So let's say we have a likelihood, $L(\theta \mid X) = \prod_{i=1}^n p(X_i \mid \theta)$, so the log-likelihood is $J(\theta) = \log L = \sum_{i=1}^n \log p(X_i \mid \theta)$.
$J$ is itself a sum of random variables, so the log-likelihood $J$ will be asymptotically normal, by the central limit theorem.
But we can also show the likelihood is asymptotically normal through a Taylor expansion. Let $\hat{\theta}$ be the MLE. So we have
$J(\theta) = J(\hat{\theta}) + \nabla J \cdot (\theta - \hat{\theta}) + \frac{1}{2}(\theta - \hat{\theta}) H (\theta - \hat{\theta})$. Since $\hat{\theta}$ is the MLE, we know $\nabla J = 0$, and $I(\theta) = -H$, so this reduces to
(1) $J(\theta) = \log L = J(\hat{\theta}) - \frac{1}{2}(\theta - \hat{\theta}) I(\theta) (\theta - \hat{\theta})$.
Now exponentiating (1), we get
$e^J = L = k e^{-\frac{1}{2}(\theta - \hat{\theta}) I(\theta) (\theta - \hat{\theta})}$, which is also asymptotically normal, with $L \sim N(\hat{\theta}, I(\theta)^{-1})$.
Am I making a mistake here...?
bayesian mathematical-statistics likelihood asymptotics
asked 6 hours ago by user49404, edited 5 hours ago by kjetil b halvorsen
If the log-likelihood is asymptotically normal, then the likelihood must be asymptotically lognormal. Can it then at the same time be asymptotically normal? Asymptotics can be strange ...
– kjetil b halvorsen, 5 hours ago
1 Answer
I think you just have to be precise about what you mean by "asymptotically normal." For example, when people say that "a sum of random variables is asymptotically normal by the central limit theorem," they usually really mean a precise statement about convergence in distribution, e.g.,
Central Limit Theorem (Lindeberg–Lévy version).
Suppose $(X_n)_{n=1}^\infty$ is a sequence of i.i.d. random variables with mean $\mu$ and variance $\sigma^2 < \infty$. Let $S_n = n^{-1}(X_1 + \cdots + X_n)$ (the $n$th sample mean). Then
$$
\sqrt{n}\,(S_n - \mu) \Rightarrow N(0, \sigma^2)
$$
as $n \to \infty$ (here $\Rightarrow$ denotes convergence in distribution).
This doesn't say that $S_n \Rightarrow N(\mu, \sigma^2/n)$ as $n \to \infty$, which is formally impossible because the expression on the right-hand side involves $n$, but it is often informally stated as $S_n \approx N(\mu, \sigma^2/n)$ for large $n$ (the symbol $\approx$ should be read "is approximately distributed as").
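To see the informal statement numerically, here is a minimal simulation sketch (my illustration, not part of the original answer; it assumes i.i.d. Exponential(1) draws, so $\mu = 1$ and $\sigma^2 = 1$):

```python
# Minimal simulation sketch (assumed setup: n i.i.d. Exponential(1) draws,
# so mu = 1 and sigma^2 = 1). Check that sqrt(n) * (S_n - mu) ~ N(0, sigma^2).
import numpy as np

rng = np.random.default_rng(0)
n, reps = 500, 10_000

# Each row is one sample of size n; take the sample mean of each row.
s_n = rng.exponential(scale=1.0, size=(reps, n)).mean(axis=1)
z = np.sqrt(n) * (s_n - 1.0)  # standardized sample means

print(f"mean(z) = {z.mean():+.3f} (target 0)")
print(f"var(z)  = {z.var():.3f} (target 1)")
```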
In your case, you have a sequence $(L_n)_{n=1}^\infty$ of log-likelihoods that, after appropriate standardization, become a sequence $(S_n)_{n=1}^\infty$ that satisfies
$$
\sqrt{n}\,(S_n - \theta) \Rightarrow N(0, \sigma^2)
$$
as $n \to \infty$ (for some $\theta$ and $\sigma^2$). Now you can recall the delta method:
Delta Method.
Suppose $(S_n)_{n=1}^\infty$ is a sequence of random variables satisfying
$$
\sqrt{n}\,(S_n - \theta) \Rightarrow N(0, \sigma^2)
$$
as $n \to \infty$ for some constants $\theta$ and $\sigma^2$. Let $g : \mathbb{R} \to \mathbb{R}$ be a function such that $g'(\theta)$ exists and is nonzero. Then
$$
\sqrt{n}\,(g(S_n) - g(\theta)) \Rightarrow N\left(0, \sigma^2 \left(g'(\theta)\right)^2\right)
$$
as $n \to \infty$.
The hand-wavy interpretation of this is that if
$$
S_n \approx N(\theta, \sigma^2 / n)
$$
for large $n$, then
$$
g(S_n) \approx N\!\left(g(\theta), \sigma^2 \left(g'(\theta)\right)^2 / n\right)
$$
for large $n$ (provided that $g'(\theta)$ exists and is nonzero).
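In the question's setting, the relevant transformation is $g = \exp$, the map from log-likelihood to likelihood. Spelling that case out (a step left implicit above): $g'(\theta) = e^{\theta}$, which exists and is nonzero for every $\theta$, so
$$
\exp(S_n) \approx N\!\left(e^{\theta},\ \frac{\sigma^2 e^{2\theta}}{n}\right)
$$
for large $n$.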
In particular, it shouldn't be surprising that the sequences $(S_n)_{n=1}^\infty$ and $(\exp(S_n))_{n=1}^\infty$ are simultaneously "asymptotically normal."
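A companion sketch (again my illustration, same assumed Exponential(1) setup as above) confirms the delta-method prediction for $g = \exp$: the standardized values of $\exp(S_n)$ should have variance close to $\sigma^2 (g'(\mu))^2 = e^2 \approx 7.389$.

```python
# Companion sketch (assumed setup: same Exponential(1) draws as above):
# apply g(x) = exp(x) to the sample means and check the delta-method
# variance sigma^2 * g'(mu)^2 = e^2 for mu = 1, sigma^2 = 1.
import numpy as np

rng = np.random.default_rng(1)
n, reps, mu = 500, 10_000, 1.0
s_n = rng.exponential(scale=1.0, size=(reps, n)).mean(axis=1)

z = np.sqrt(n) * (np.exp(s_n) - np.exp(mu))  # delta-method standardization
print(f"var(z) = {z.var():.3f} (target {np.exp(2 * mu):.3f})")
```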
answered 4 hours ago by Artem Mavrin, edited 4 hours ago
This was super helpful, thanks for the reply. I missed the obvious delta method connection.
– user49404, 2 hours ago