Can GPT-3 Pass a Writer’s Turing Test?

- Katherine Elkins, Kenyon College
- Jon Chun, Kenyon College
Description
Until recently, the field of natural language generation relied upon formalized grammar systems, small-scale statistical models, and lengthy sets of heuristic rules. This older technology was fairly limited and brittle: it could remix language into word-salad poems or chat with humans within narrowly defined topics. Recently, very large-scale statistical language models have dramatically advanced the field, and GPT-3 is just one example. It can internalize the rules of language without explicit programming or rules. Instead, much like a human child, GPT-3 learns language through repeated exposure, albeit on a much larger scale. Without explicit rules, it can sometimes fail at the simplest of linguistic tasks, but it can also excel at more difficult ones like imitating an author or waxing philosophical.
Journal

- Journal of Cultural Analytics (CA), 5 (2), 2020-09-14
Details

- CRID: 1360302871504265088
- ISSN: 2371-4549
- Data Source: Crossref