Language Models
Sequential Data and Recurrent Neural Networks
among the reasons I use large pre-trained language models sparingly in my computer-generated poetry practice is that being able to know whose voices I'm speaking with is… actually important, as is being understanding how the output came to have its shape — @aparrish, full thread
LLM Training
Datasets for LLMs
Climate Impact
Web Servers with Node.js + p5.js
Code Examples and Implementations
Replicate
Ollama
OpenAI
Assignment
- Read Language models can only write ransom notes by Allison Parrish and review The Foundation Model Transparency Index. What questions arise for you about using LLMs in your work at ITP?
- Experiment with prompting a large language model in some way other than a provided interface (e.g. ChatGPT) and document the results in a blog post. Consider how working with an LLM compares to generating text with other methods, including but not limited to Markov chains and context-free grammars. Here are some options:
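One way to prompt a model outside a chat interface is through Ollama's local HTTP API, one of the options listed above. A minimal sketch, assuming Ollama is running on its default port (11434) with a model already pulled — the model name `llama2` and the example prompt are placeholders, not requirements:

```javascript
// Sketch: prompting a locally running model via Ollama's /api/generate endpoint.
// Assumes `ollama serve` is running and a model (here "llama2") has been pulled.

function buildOllamaRequest(prompt, model = 'llama2') {
  // Separate request construction from the network call so it is easy to inspect
  return {
    url: 'http://localhost:11434/api/generate',
    options: {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      // stream: false returns one JSON object instead of a stream of chunks
      body: JSON.stringify({ model, prompt, stream: false }),
    },
  };
}

async function generate(prompt) {
  const { url, options } = buildOllamaRequest(prompt);
  const res = await fetch(url, options);
  const data = await res.json();
  return data.response; // the generated text
}

// generate('Write a two-line poem about rain.').then(console.log);
```

Because the request is plain JSON, the same prompt can be swapped across models for comparison — a useful setup when contrasting LLM output with Markov chain or context-free grammar output for the blog post.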
Add your assignment below via Pull Request
(Please note you are welcome to post under a pseudonym and/or password protect your published assignment. For NYU blogs, privacy options are covered in the NYU WordPress Knowledge Base. Finally, if you prefer not to post your assignment here at all, you may email the submission.)