Slideshows by User: thomas_a_mathew (SlideShare feed)

Text Categorization using N-grams and Hidden-Markov-Models
Sat, 11 Jun 2011 16:19:59 GMT
/thomas_a_mathew/text-categorization-using-ngrams-and-hiddenmarkovmodels
In this paper I discuss an approach for building a soft text classifier based on a Hidden Markov Model. The approach treats a multi-category text classification task as predicting the best possible hidden sequence of categories from the observed sequence of text tokens. This method allows for the possibility that different sections of a large block of text may hint at different yet related text categories, and the HMM predicts such a sequence of categories. The most probable sequence of categories can be estimated using the Viterbi algorithm.
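The decoding step the abstract refers to can be sketched as follows. This is a minimal illustration of Viterbi decoding over hidden categories, not the paper's model: the two categories, the vocabulary, and every probability below are invented for the example.

```python
# Sketch: find the most likely hidden category sequence for a token stream
# with the Viterbi algorithm. All states and probabilities are illustrative.
import math

states = ["sports", "finance"]
start = {"sports": 0.5, "finance": 0.5}
# Transitions favour staying in the same category across adjacent tokens.
trans = {
    "sports": {"sports": 0.8, "finance": 0.2},
    "finance": {"sports": 0.2, "finance": 0.8},
}
# Emissions: per-category unigram likelihoods for a tiny vocabulary.
emit = {
    "sports": {"goal": 0.4, "team": 0.4, "stock": 0.1, "bank": 0.1},
    "finance": {"goal": 0.1, "team": 0.1, "stock": 0.4, "bank": 0.4},
}

def viterbi(tokens):
    """Return the most probable hidden category sequence for the tokens."""
    # v[s] holds (log-probability, best path ending in state s).
    v = {s: (math.log(start[s]) + math.log(emit[s][tokens[0]]), [s])
         for s in states}
    for tok in tokens[1:]:
        v = {
            s: max(
                (p + math.log(trans[prev][s]) + math.log(emit[s][tok]),
                 path + [s])
                for prev, (p, path) in v.items()
            )
            for s in states
        }
    return max(v.values())[1]

print(viterbi(["goal", "team", "stock", "bank"]))
# → ['sports', 'sports', 'finance', 'finance']
```

Note how the decoded sequence switches category mid-document, which is exactly the "different sections hint at different categories" behaviour the abstract describes.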

Text Categorization using N-grams and Hidden-Markov-Models, by Thomas Mathew
Natural Language Generation from First-Order Expressions
/thomas_a_mathew/nl-gfrom-fol2
In this paper I discuss an approach for generating natural language from a first-order logic representation. The approach shows that a grammar definition for the natural language and a lambda-calculus-based collection of semantic rules can be applied for bi-directional translation using an overgenerate-and-prune mechanism.
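The overgenerate-and-prune idea can be sketched in a few lines: enumerate every sentence a tiny grammar licenses, compose a logical form for each, and keep only the sentences whose semantics match the target expression. The grammar, lexicon, and semantic composition below are toy assumptions for illustration, not the paper's rule set.

```python
# Sketch: overgenerate-and-prune realization of a first-order expression.
# The grammar and semantics are illustrative, not from the paper.
from itertools import product

# Toy lexicon: surface word -> logical constant or predicate name.
nouns = {"john": "John", "mary": "Mary"}
verbs = {"loves": "Loves", "sees": "Sees"}

def generate_all():
    """Overgenerate: every Subject-Verb-Object string the grammar allows."""
    for subj, verb, obj in product(nouns, verbs, nouns):
        yield f"{subj} {verb} {obj}"

def semantics(sentence):
    """Compose a logical form, e.g. 'john loves mary' -> 'Loves(John, Mary)'."""
    subj, verb, obj = sentence.split()
    return f"{verbs[verb]}({nouns[subj]}, {nouns[obj]})"

def realize(target_lf):
    """Prune: keep only the sentences whose semantics equal the target."""
    return [s for s in generate_all() if semantics(s) == target_lf]

print(realize("Loves(John, Mary)"))  # → ['john loves mary']
```

Because the same semantic rules drive both parsing (string to logic) and pruning (logic to string), the sketch also shows why the mechanism is naturally bi-directional, at the cost of enumerating the grammar's output space.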

Sat, 11 Jun 2011 16:14:39 GMT
Natural Language Generation from First-Order Expressions, by Thomas Mathew
My interests lie in unlocking business value from insights derived from unstructured data and media. www.linkedin.com/in/thomasmathew