Kikai gakushū enjinia no tame no Transformers: saisentan no shizen gengo shori raiburari ni yoru moderu kaihatsu / = Natural language processing with transformers :
機械学習エンジニアのためのTransformers : 最先端の自然言語処理ライブラリによるモデル開発 /
Saved in:
Main Authors: | Tunstall, Lewis ; Werra, Leandro von ; Wolf, Thomas |
---|---|
Other Authors: | Nakayama, Hiroki |
Format: | Electronic eBook |
Language: | Japanese |
Published: | Tōkyō-to Shinjuku-ku : Orairī Japan, 2022 |
Edition: | Shohan. |
Subjects: | Natural language processing (Computer science) ; Python (Computer program language) ; Deep learning (Machine learning) |
Links: | https://learning.oreilly.com/library/view/-/9784873119953/?ar |
Summary: | "Since their introduction in 2017, transformers have quickly become the dominant architecture for achieving state-of-the-art results on a variety of natural language processing tasks. If you're a data scientist or coder, this practical book -now revised in full color- shows you how to train and scale these large models using Hugging Face Transformers, a Python-based deep learning library. Transformers have been used to write realistic news stories, improve Google Search queries, and even create chatbots that tell corny jokes. In this guide, authors Lewis Tunstall, Leandro von Werra, and Thomas Wolf, among the creators of Hugging Face Transformers, use a hands-on approach to teach you how transformers work and how to integrate them in your applications. You'll quickly learn a variety of tasks they can help you solve. Build, debug, and optimize transformer models for core NLP tasks, such as text classification, named entity recognition, and question answering Learn how transformers can be used for cross-lingual transfer learning Apply transformers in real-world scenarios where labeled data is scarce Make transformer models efficient for deployment using techniques such as distillation, pruning, and quantization Train transformers from scratch and learn how to scale to multiple GPUs and distributed environments." -- |
Item Description: | Includes bibliographical references |
Physical Description: | 1 online resource (424 pages) : color illustrations. |
ISBN: | 9784873119953 4873119952 |
Staff View
MARC
LEADER | 00000cam a22000002c 4500 | ||
---|---|---|---|
001 | ZDB-30-ORH-089971590 | ||
003 | DE-627-1 | ||
005 | 20240228121924.0 | ||
007 | cr uuu---uuuuu | ||
008 | 230327s2022 xx |||||o 00| ||jpn c | ||
020 | |a 9784873119953 |c electronic bk. |9 978-4-87311-995-3 | ||
020 | |a 4873119952 |c electronic bk. |9 4-87311-995-2 | ||
035 | |a (DE-627-1)089971590 | ||
035 | |a (DE-599)KEP089971590 | ||
035 | |a (ORHE)9784873119953 | ||
040 | |a DE-627 |b ger |c DE-627 |e rda | ||
041 | |a jpn | ||
082 | 0 | |a 006.3/5 |2 23/eng/20230220 | |
100 | 1 | |a Tunstall, Lewis |e VerfasserIn |4 aut | |
240 | 1 | 0 | |a Natural language processing with transformers |
245 | 1 | 0 | |6 880-01 |a Kikai gakushū enjinia no tame no Transformers |b saisentan no shizen gengo shori raiburari ni yoru moderu kaihatsu / = Natural language processing with transformers : |c Lewis Tunstall, Leandro von Werra, Thomas Wolf cho ; Nakayama Hiroki yaku = Natural language processing with transformers : building language applications with Hugging Face / Lewis Tunstall, Leandro von Werra and Thomas Wolf |
246 | 3 | 1 | |a Natural language processing with transformers : |
250 | |6 880-02 |a Shohan. | ||
264 | 1 | |6 880-03 |a Tōkyō-to Shinjuku-ku |b Orairī Japan |c 2022 | |
300 | |a 1 Online-Ressource (424 Seiten) |b color illustrations. | ||
336 | |a Text |b txt |2 rdacontent | ||
337 | |a Computermedien |b c |2 rdamedia | ||
338 | |a Online-Ressource |b cr |2 rdacarrier | ||
500 | |a Includes bibliographical references | ||
520 | |a "Since their introduction in 2017, transformers have quickly become the dominant architecture for achieving state-of-the-art results on a variety of natural language processing tasks. If you're a data scientist or coder, this practical book -now revised in full color- shows you how to train and scale these large models using Hugging Face Transformers, a Python-based deep learning library. Transformers have been used to write realistic news stories, improve Google Search queries, and even create chatbots that tell corny jokes. In this guide, authors Lewis Tunstall, Leandro von Werra, and Thomas Wolf, among the creators of Hugging Face Transformers, use a hands-on approach to teach you how transformers work and how to integrate them in your applications. You'll quickly learn a variety of tasks they can help you solve. Build, debug, and optimize transformer models for core NLP tasks, such as text classification, named entity recognition, and question answering Learn how transformers can be used for cross-lingual transfer learning Apply transformers in real-world scenarios where labeled data is scarce Make transformer models efficient for deployment using techniques such as distillation, pruning, and quantization Train transformers from scratch and learn how to scale to multiple GPUs and distributed environments." -- | ||
546 | |a In Japanese. | ||
650 | 0 | |a Natural language processing (Computer science) | |
650 | 0 | |a Python (Computer program language) | |
650 | 0 | |a Deep learning (Machine learning) | |
650 | 4 | |a Traitement automatique des langues naturelles | |
650 | 4 | |a Python (Langage de programmation) | |
650 | 4 | |a Apprentissage profond | |
650 | 4 | |a Deep learning (Machine learning) | |
650 | 4 | |a Natural language processing (Computer science) | |
650 | 4 | |a Python (Computer program language) | |
700 | 1 | |a Werra, Leandro von |e VerfasserIn |4 aut | |
700 | 1 | |a Wolf, Thomas |e VerfasserIn |4 aut | |
700 | 1 | |a Nakayama, Hiroki |e ÜbersetzerIn |4 trl | |
966 | 4 | 0 | |l DE-91 |p ZDB-30-ORH |q TUM_PDA_ORH |u https://learning.oreilly.com/library/view/-/9784873119953/?ar |m X:ORHE |x Aggregator |z lizenzpflichtig |3 Volltext |
880 | 1 | 0 | |6 245-01/Chin |a 機械学習エンジニアのためのTransformers : |b 最先端の自然言語処理ライブラリによるモデル開発 / |c Lewis Tunstall, Leandro von Werra, Thomas Wolf著 ; 中山光樹訳 = Natural language processing with transformers : building language applications with Hugging Face / Lewis Tunstall, Leandro von Werra and Thomas Wolf. |
880 | |6 250-02/Chin |a 初版. | ||
880 | 1 | |6 264-03/Chin |a 東京都新宿区 |b オライリー・ジャパン |c 2022 | |
912 | |a ZDB-30-ORH | ||
951 | |a BO | ||
049 | |a DE-91 |