# Writing on AI
AI Playground
<https://20b.eleuther.ai/>
## <https://app.wombo.art/>



## <https://hypnogram.xyz/>

## <https://soundcloud.com/slimgoldie>
<https://soundcloud.com/slimgoldie/sets/the-real-goldfish>
AI-based music by Slim Goldie.
[www.neuerordner7.art/](http://www.neuerordner7.art/)
Texts etc.
<https://www.neuerordner7.art/index.html>
The Papierkorb (recycle bin) generates its own texts when you click on it :-)
<https://neuerordner7.art/papierkorb/index.html>
# Pharmako-AI
K Allado-McDowell
<https://zabriskie.de/product/pharmako-ai-by-k-allado-mcdowell/>
* 2020
* ISBN 9781838003906
* 152 pages
* Softcover
* 20 × 13 × 1.5 cm
**Introduced by Irenosen Okojie**
**Cover by Refik Anadol**
During the first summer of the coronavirus pandemic, a diary entry by K Allado-McDowell initiates an experimental conversation with the AI language model GPT-3. Over the course of a fortnight, the exchange rapidly unfolds into a labyrinthine exploration of memory, language and cosmology.
The first book to be co-created with the emergent AI, *Pharmako-AI* is a hallucinatory journey into selfhood, ecology and intelligence via cyberpunk, ancestry and biosemiotics. Through a writing process akin to musical improvisation, Allado-McDowell and GPT-3 together offer a fractal poetics of AI and a glimpse into the future of literature.
*Pharmako-AI* reimagines cybernetics for a world facing multiple crises, with profound implications for how we see ourselves, nature and technology in the 21st century.
**Generative Pre-trained Transformer 3** (**GPT-3**) is an autoregressive language model that uses deep learning to produce human-like text. It is the third-generation language prediction model in the GPT-n series created by OpenAI, a San Francisco-based artificial intelligence research laboratory.[2] GPT-3’s full version has a capacity of 175 billion machine learning parameters. GPT-3, which was introduced in May 2020 and has been in beta testing as of July 2020,[3] is part of a trend in natural language processing (NLP) systems toward pre-trained language representations.[1] Before the release of GPT-3, the largest language model was Microsoft’s Turing NLG, introduced in February 2020, with a capacity of 17 billion parameters, less than 10 percent of GPT-3’s.[4]
The quality of the text generated by GPT-3 is so high that it is difficult to distinguish from that written by a human, which has both benefits and risks.[4] Thirty-one OpenAI researchers and engineers presented the original May 28, 2020 paper introducing GPT-3. In their paper, they warned of GPT-3’s potential dangers and called for research to mitigate risk.[1]:34 David Chalmers, an Australian philosopher, described GPT-3 as “one of the most interesting and important AI systems ever produced.”[5]
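The "autoregressive" idea above can be sketched with a toy example: the model repeatedly samples the next token conditioned on what it has generated so far. The hand-made bigram table and all probabilities below are purely illustrative assumptions for the sketch; GPT-3 instead learns 175 billion parameters and conditions on long contexts, not just the previous word.

```python
import random

# Toy "language model": for each token, a list of possible next tokens
# with sampling weights. Illustrative only -- not GPT-3's actual model.
BIGRAMS = {
    "<s>": (["the"], [1.0]),
    "the": (["cat", "dog"], [0.6, 0.4]),
    "cat": (["sat", "ran"], [0.5, 0.5]),
    "dog": (["sat", "ran"], [0.5, 0.5]),
    "sat": (["<e>"], [1.0]),
    "ran": (["<e>"], [1.0]),
}

def generate(seed=0, max_len=10):
    """Autoregressive sampling: emit one token at a time,
    each conditioned on the previously generated token."""
    rng = random.Random(seed)
    tokens = ["<s>"]
    while tokens[-1] != "<e>" and len(tokens) < max_len:
        choices, weights = BIGRAMS[tokens[-1]]
        tokens.append(rng.choices(choices, weights=weights)[0])
    return " ".join(tokens[1:-1])  # drop the start/end markers

print(generate())
```

Real models replace the lookup table with a neural network that outputs a probability distribution over a large vocabulary, but the generation loop is structurally the same.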
## You might also like...
Scott Rettberg and 2 others
# Electronic Literature Communities (Computing Literature)
<https://elmcip.net/sites/default/files/media/critical_writing/attachments/rettberg_electronicliteraturecommunities.pdf>
---