Natural language processing using transformer architectures:
Whether you need to automatically judge the sentiment of a user review, summarize long documents, translate text, or build a chatbot, you need the best language model available. In 2018, pretty much every NLP benchmark was crushed by novel transformer-based architectures, replacing long-standing architectures based on recurrent neural networks.
Saved in:

Author: Géron, Aurélien
Corporate body: Safari, an O'Reilly Media Company
Format: Electronic Video
Language: English
Published: [Place of publication not identifiable] : O'Reilly Media, Inc., 2020
Edition: 1st edition
Links: https://learning.oreilly.com/library/view/-/0636920373605/?ar
Summary: Whether you need to automatically judge the sentiment of a user review, summarize long documents, translate text, or build a chatbot, you need the best language model available. In 2018, pretty much every NLP benchmark was crushed by novel transformer-based architectures, replacing long-standing architectures based on recurrent neural networks. In short, if you're into NLP, you need transformers. But to use transformers, you need to know what they are, what transformer-based architectures look like, and how you can implement them in your projects. Aurélien Géron (Kiwisoft) dives into recurrent neural networks and their limits, the invention of the transformer, attention mechanisms, the transformer architecture, subword tokenization using SentencePiece, self-supervised pretraining (learning from huge corpora), one-size-fits-all language models, BERT and GPT-2, and how to use these language models in your projects using TensorFlow. What you'll learn: Understand transformers and modern language models and how they can tackle complex NLP tasks. Identify what tools to use and what the code looks like.
Description: Online resource; Title from title screen (viewed February 28, 2020)
Extent: 1 online resource (1 video file, approximately 45 min.)
Format: Mode of access: World Wide Web.
Internal format
MARC
LEADER 00000cgm a22000002 4500
001 ZDB-30-ORH-050574639
003 DE-627-1
005 20240228121009.0
006 m o | |
007 cr uuu---uuuuu
008 200324s2020 xx ||| |o o ||eng c
035 |a (DE-627-1)050574639
035 |a (DE-599)KEP050574639
035 |a (ORHE)0636920373605
035 |a (DE-627-1)050574639
040 |a DE-627 |b ger |c DE-627 |e rda
041 |a eng
100 1 |a Géron, Aurélien |e VerfasserIn |4 aut
245 1 0 |a Natural language processing using transformer architectures |c Géron, Aurélien
250 |a 1st edition.
264 1 |a [Erscheinungsort nicht ermittelbar] |b O'Reilly Media, Inc. |c 2020
264 2 |a Boston, MA |b Safari.
300 |a 1 Online-Ressource (1 video file, approximately 45 min.)
336 |a zweidimensionales bewegtes Bild |b tdi |2 rdacontent
337 |a Computermedien |b c |2 rdamedia
338 |a Online-Ressource |b cr |2 rdacarrier
500 |a Online resource; Title from title screen (viewed February 28, 2020)
520 |a Whether you need to automatically judge the sentiment of a user review, summarize long documents, translate text, or build a chatbot, you need the best language model available. In 2018, pretty much every NLP benchmark was crushed by novel transformer-based architectures, replacing long-standing architectures based on recurrent neural networks. In short, if you're into NLP, you need transformers. But to use transformers, you need to know what they are, what transformer-based architectures look like, and how you can implement them in your projects. Aurélien Géron (Kiwisoft) dives into recurrent neural networks and their limits, the invention of the transformer, attention mechanisms, the transformer architecture, subword tokenization using SentencePiece, self-supervised pretraining-learning from huge corpora, one-size-fits-all language models, BERT and GPT 2, and how to use these language models in your projects using TensorFlow. What you'll learn Understand transformers and modern language models and how they can tackle complex NLP tasks Identify what tools to use and what the code looks like.
538 |a Mode of access: World Wide Web.
710 2 |a Safari, an O'Reilly Media Company. |e MitwirkendeR |4 ctb
966 4 0 |l DE-91 |p ZDB-30-ORH |q TUM_PDA_ORH |u https://learning.oreilly.com/library/view/-/0636920373605/?ar |m X:ORHE |x Aggregator |z lizenzpflichtig |3 Volltext
912 |a ZDB-30-ORH
935 |c vide
951 |a BO
912 |a ZDB-30-ORH
049 |a DE-91