{"id":6878,"date":"2021-06-19T18:16:14","date_gmt":"2021-06-19T18:16:14","guid":{"rendered":"http:\/\/invbat.com\/blog2\/?p=6878"},"modified":"2021-06-19T18:16:14","modified_gmt":"2021-06-19T18:16:14","slug":"transfomrer-is-a-deep-machine-learning-model-designed-to-handle-sequential-input-data-such-as-natural-language-but-it-does-not-process-the-words-by-sequence-of-appearance-but-rather-by-context-me","status":"publish","type":"post","link":"https:\/\/invbat.com\/blog2\/index.php\/2021\/06\/19\/transfomrer-is-a-deep-machine-learning-model-designed-to-handle-sequential-input-data-such-as-natural-language-but-it-does-not-process-the-words-by-sequence-of-appearance-but-rather-by-context-me\/","title":{"rendered":"<a href=\"https:\/\/en.wikipedia.org\/wiki\/Transformer_(machine_learning_model)\"><font color=\"blue\">    Transformer is a deep machine learning model designed to handle sequential input data such as natural language. But it does not process the words in their order of appearance; rather, it processes them by contextual meaning, weighing the influence of each word's position in the sequence using combinatorial probability and statistical frequency distributions. <\/font><\/a>"},"content":{"rendered":"\n<p><a href=\"https:\/\/en.wikipedia.org\/wiki\/Transformer_(machine_learning_model)\"> \n\nTransformer is a deep machine learning model designed to handle sequential input data such as natural language. 
But it does not process the words in their order of appearance; rather, it processes them by contextual meaning, weighing the influence of each word's position in the sequence using combinatorial probability and statistical frequency distributions.\n\n<br>\n\n<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Transformer is a deep machine learning model designed to handle sequential input data such as natural language. 
But it does not process the words in their order of appearance; rather, it processes them by contextual meaning, weighing the influence of each word's position in the sequence using combinatorial probability and statistical frequency distributions.<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[390,391],"tags":[],"class_list":["post-6878","post","type-post","status-publish","format-standard","hentry","category-bert-machine-learning-for-nlp","category-transformer-machine-learning-model"],"_links":{"self":[{"href":"https:\/\/invbat.com\/blog2\/index.php\/wp-json\/wp\/v2\/posts\/6878","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/invbat.com\/blog2\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/invbat.com\/blog2\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/invbat.com\/blog2\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/invbat.com\/blog2\/index.php\/wp-json\/wp\/v2\/comments?post=6878"}],"version-history":[{"count":1,"href":"https:\/\/invbat.com\/blog2\/index.php\/wp-json\/wp\/v2\/posts\/6878\/revisions"}],"predecessor-version":[{"id":6879,"href":"https:\/\/invbat.com\/blog2\/index.php\/wp-json\/wp\/v2\/posts\/6878\/revisions\/6879"}],"wp:attachment":[{"href":"https:\/\/invbat.com\/blog2\/index.php\/wp-json\/wp\/v2\/media?parent=6878"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/invbat.com\/blog2\/index.php\/wp-json\/wp\/v2\/categories?post=6878"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/invbat.com\/blog2\/index.php\/wp-json\/wp\/v2\/tags?post=6878"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}
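The behavior the post describes (weighing each position's influence by context rather than reading words strictly in order) is the self-attention mechanism at the heart of the Transformer. Below is a minimal pure-Python sketch of scaled dot-product self-attention; the 2-D token vectors are made-up illustrative embeddings, not from any real model.

```python
import math

def softmax(xs):
    # Numerically stable softmax: turns raw scores into weights summing to 1.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def self_attention(vectors):
    # Each position's output is a weighted sum over ALL positions,
    # with weights from scaled dot products -- so influence comes from
    # contextual similarity, not from order of appearance.
    d = len(vectors[0])
    out = []
    for q in vectors:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in vectors]
        weights = softmax(scores)
        out.append([sum(w * v[i] for w, v in zip(weights, vectors))
                    for i in range(d)])
    return out

# Toy "sentence" of three 2-D token embeddings (purely illustrative).
tokens = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
result = self_attention(tokens)
```

Each output row is a convex combination of the input vectors, which is why every position can draw information from every other position in a single step. (A full Transformer adds learned query/key/value projections, multiple heads, and positional encodings on top of this core idea.)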