A Sentence with Different Parse Tree Structures
I just read about parse trees for representing a parsed sentence as input to an NLP task.

My understanding is that a valid parse tree of a sentence should be validated by a linguistic expert, so I concluded that a sentence has only one parse tree structure.

But is that correct? Is it possible for a sentence to have more than one valid parse tree of the same type (e.g. constituency-based)?

natural-language-processing
asked 4 hours ago by malioboro
2 Answers
Grammars in NLP broadly correspond to context-free grammars (CFGs) in formal language theory. If the CFG underlying the task is ambiguous, then a single sentence can have more than one derivation, and therefore more than one parse tree.

Hence, whether there can be more than one valid parse tree depends on the grammar.

answered 3 hours ago by programmer
But is that correct? Is it possible for a sentence to have more than one valid parse tree of the same type (e.g. constituency-based)?

The fact that a single sequence of words can be parsed in different ways depending on context (or "grounding") is a common basis of miscommunication, misunderstanding, innuendo and jokes. One classic NLP-related "joke" (which predates modern AI and NLP) is:

Time flies like an arrow.
Fruit flies like a banana.

There are actually several valid parse trees even for these simple sentences. Which ones come "naturally" will depend on context. Anecdotally, I only half got the joke when I was younger: I did not know there were such things as fruit flies, so I was partly confused by the literal (but still validly parsed, and somewhat funny) reading that all fruit can fly about as well as a banana does.

Analysing these kinds of ambiguous sentences quickly leads to the grounding problem: without some referent for its symbols, a grammar is devoid of meaning, even if you know the rules and can construct valid sequences. For instance, the joke above works partly because the nature of time, when referred to in a particular way (as a singular noun, not as a possession or property of another object), leads to a well-known metaphorical reading of the first sentence.

A statistical ML parser could get both sentences correct through training on many relevant examples (or, trivially, by including the examples themselves with correct parse trees). This does not solve the grounding problem, but may be of practical use for any machine required to handle natural-language input and map it to some task.

I did check a while ago, though, and most part-of-speech taggers in Python's NLTK get both sentences wrong. I suspect this is because resolving sentences like those above, and AI "getting language jokes", is not a high priority compared with more practical uses such as chatbots and summarisers.

edited 53 mins ago
answered 1 hour ago by Neil Slater
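The ambiguity both answers describe can be made concrete with a tiny constituency parser. The grammar and lexicon below are my own illustrative assumptions, not taken from either answer: a minimal CFG under which "time flies like an arrow" has exactly two valid constituency trees, one where "time" is the subject and "flies" the verb, and one where "time flies" is a compound noun and "like" the verb.

```python
# Toy ambiguous CFG -- the grammar and lexicon are illustrative assumptions.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["N"], ["N", "N"], ["Det", "N"]],
    "VP": [["V", "NP"], ["V", "PP"]],
    "PP": [["P", "NP"]],
}
LEXICON = {
    "time": {"N"}, "flies": {"N", "V"},
    "like": {"V", "P"}, "an": {"Det"}, "arrow": {"N"},
}

def parses(symbol, words, i, j):
    """Return all parse trees for words[i:j] rooted at symbol, as nested tuples."""
    trees = []
    if j - i == 1 and symbol in LEXICON.get(words[i], set()):
        trees.append((symbol, words[i]))  # lexical leaf
    for rhs in GRAMMAR.get(symbol, []):
        trees += [(symbol,) + kids for kids in splits(rhs, words, i, j)]
    return trees

def splits(rhs, words, i, j):
    """All ways to carve words[i:j] into consecutive subtrees matching rhs."""
    if not rhs:
        return [()] if i == j else []
    return [(head,) + tail
            for k in range(i + 1, j + 1)
            for head in parses(rhs[0], words, i, k)
            for tail in splits(rhs[1:], words, k, j)]

sentence = "time flies like an arrow".split()
trees = parses("S", sentence, 0, len(sentence))
for t in trees:
    print(t)
print(len(trees))  # 2 -- the sentence is genuinely ambiguous under this grammar
```

Under this grammar the two trees correspond to the "time moves quickly, the way an arrow does" and "time-flies are fond of arrows" readings. A real constituency parser (e.g. NLTK's `ChartParser` over an equivalent `nltk.CFG`) would enumerate the same ambiguity.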