A2Z-F24

Language Models

Sequential Data and Recurrent Neural Networks

Transformers and Large Language Models

among the reasons I use large pre-trained language models sparingly in my computer-generated poetry practice is that being able to know whose voices I’m speaking with is… actually important, as is understanding how the output came to have its shape - @aparrish, full thread

LLM Training

Datasets for LLMs

Climate Impact

Web Servers with Node.js + p5.js
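As a starting point, here is a minimal sketch of this kind of setup, assuming an Express-based Node.js server and a public/ folder holding the p5.js sketch (the folder name, route, and port are placeholders, not the exact class example):

```js
// server.js — minimal Express server for a p5.js sketch
const express = require('express');
const app = express();

// serve index.html, sketch.js, and anything else in public/
app.use(express.static('public'));

// a small API route the p5.js sketch can fetch() from
app.get('/api/hello', (req, res) => {
  res.json({ message: 'hello from the server' });
});

app.listen(3000, () => {
  console.log('Listening at http://localhost:3000');
});
```

Run it with node server.js and open http://localhost:3000; the sketch in public/ can then call fetch('/api/hello') to talk to the server.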

Code Examples and Implementations

Replicate
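A minimal sketch of calling a hosted model with the replicate npm package; the model identifier and prompt below are placeholders, and an API token is required:

```js
// requires: npm install replicate, and REPLICATE_API_TOKEN set in the environment
import Replicate from 'replicate';

const replicate = new Replicate({ auth: process.env.REPLICATE_API_TOKEN });

// model name is a placeholder — swap in whichever model you are using
const output = await replicate.run('meta/meta-llama-3-8b-instruct', {
  input: { prompt: 'Write a two-line poem about autumn.' },
});

// language models on Replicate typically return an array of text chunks
console.log(output.join(''));
```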

Ollama
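A minimal sketch of talking to a locally running Ollama server over its REST API, assuming Ollama is installed and a model (here llama3.1, as a placeholder) has already been pulled:

```js
// Node 18+ has fetch built in; Ollama listens on port 11434 by default
const response = await fetch('http://localhost:11434/api/generate', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    model: 'llama3.1',          // placeholder — use a model you have pulled
    prompt: 'Why is the sky blue?',
    stream: false,              // one JSON response instead of a stream
  }),
});

const data = await response.json();
console.log(data.response); // the generated text
```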

Transformers.js
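A minimal sketch of text generation with Transformers.js, which downloads and runs the model locally (in the browser or in Node); the model name and generation options are placeholders:

```js
import { pipeline } from '@xenova/transformers';

// downloads and caches the model the first time it runs
const generator = await pipeline('text-generation', 'Xenova/distilgpt2');

const output = await generator('Once upon a time, there was', {
  max_new_tokens: 30,
  temperature: 0.8,
  do_sample: true,
});

console.log(output[0].generated_text);
```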

OpenAI
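A minimal sketch using the official openai npm package; the model name and prompts are placeholders, and an API key is required:

```js
// requires: npm install openai, and OPENAI_API_KEY set in the environment
import OpenAI from 'openai';

const openai = new OpenAI(); // reads OPENAI_API_KEY automatically

const completion = await openai.chat.completions.create({
  model: 'gpt-4o-mini', // placeholder — use whichever model you prefer
  messages: [
    { role: 'system', content: 'You are a helpful writing assistant.' },
    { role: 'user', content: 'Suggest a title for a poem about rain.' },
  ],
});

console.log(completion.choices[0].message.content);
```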

Assignment

Add your assignment below via Pull Request

(Please note you are welcome to post under a pseudonym and/or password protect your published assignment. For NYU blogs, privacy options are covered in the NYU Wordpress Knowledge Base. Finally, if you prefer not to post your assignment here at all, you may email your submission.)