
A Sentence with Different Parse Tree Structures


I just read about parse trees as a way to represent a sentence given as input to an NLP task.

In my understanding, a valid parse tree of a sentence should be validated by a linguistic expert, so I concluded that a sentence has only one parse tree structure.

But is that correct? Is it possible for a sentence to have more than one valid parse tree of the same type (e.g. constituency-based)?










natural-language-processing

asked 4 hours ago by malioboro

2 Answers

Grammars in NLP broadly correspond to context-free grammars (CFGs) in formal language theory. If the CFG underlying the NLP task is ambiguous, then a single sentence can have more than one derivation and therefore more than one parse tree.

Hence, whether there can be more than one valid parse tree depends on the grammar.
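
For a concrete illustration (my own sketch, not part of the original answer), here is a small ambiguous CFG written with Python's NLTK; the grammar and sentence are chosen purely for illustration. The classic prepositional-phrase attachment ambiguity gives two constituency trees for the same sentence:

    import nltk

    # A toy ambiguous grammar: "with the telescope" can attach either
    # to the verb phrase (the seeing was done with a telescope) or to
    # the noun phrase (the man has a telescope).
    grammar = nltk.CFG.fromstring("""
        S  -> NP VP
        NP -> 'I' | Det N | Det N PP
        VP -> V NP | VP PP
        PP -> P NP
        Det -> 'the'
        N -> 'man' | 'telescope'
        V -> 'saw'
        P -> 'with'
    """)

    parser = nltk.ChartParser(grammar)
    sentence = "I saw the man with the telescope".split()

    # An ambiguous grammar yields more than one tree for one sentence.
    for tree in parser.parse(sentence):
        print(tree)

With this grammar the parser prints two distinct trees, one for each PP attachment, which is exactly the situation described above.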






answered 3 hours ago by programmer


"But is that correct? Is it possible for a sentence to have more than one valid parse tree of the same type (e.g. constituency-based)?"

The fact that a single sequence of words can be parsed in different ways depending on context (or "grounding") is a common basis of miscommunication, misunderstanding, innuendo and jokes.

One classic NLP-related "joke" (one that predates modern AI and NLP) is:

Time flies like an arrow.

Fruit flies like a banana.

There are actually several valid parse trees even for these simple sentences. Which ones come "naturally" will depend on context. Anecdotally, I only half got the joke when I was younger, because I did not know there were such things as fruit flies, so I was partly confused by the literal (but still validly parsed, and somewhat funny) reading that all fruit can fly about as well as a banana does.

Analysing these kinds of ambiguous sentences quickly leads to the grounding problem: without some referent for its symbols, a grammar is devoid of meaning, even if you know the rules and can construct valid sequences. For instance, the joke works partly because the nature of time, when referred to in a particular way (as a singular noun, not as a possession or property of another object), leads to a well-known metaphorical reading of the first sentence.

A statistical ML parser could get both sentences correct through training on many relevant examples (or trivially by including the examples themselves with correct parse trees). That would not solve the grounding problem, but it may be of practical use for any machine required to handle natural language input and map it to some task.

I did check a while ago, though, and most part-of-speech taggers in Python's NLTK get both sentences wrong. I suspect this is because resolving sentences like those above, and AI "getting language jokes" in general, is not a high priority compared to more practical uses such as chatbots and summarisers.
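
As a quick way to reproduce that check, here is a minimal sketch (my own, not part of the original answer); the exact tags you get depend on the NLTK version and the tagger model installed:

    import nltk

    # One-time model downloads, if not already present:
    # nltk.download('punkt'); nltk.download('averaged_perceptron_tagger')

    for sentence in ["Time flies like an arrow.", "Fruit flies like a banana."]:
        tokens = nltk.word_tokenize(sentence)
        print(nltk.pos_tag(tokens))

If the tagger assigns the same tags to "flies" and "like" in both sentences, it has missed the reading in which "fruit flies" is a noun compound and "like" is the main verb, which is the failure mode described above.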






answered 1 hour ago, edited 53 mins ago by Neil Slater