

Discover more from Bestiario substack
Random picks 🎲
The age of average: Why it all looks the same by @alexjmurrell
Five charts that changed the world: Data visualization 101
https://www.bbc.co.uk/ideas/videos/five-charts-that-changed-the-world/
100 years of robots taking our jobs: Find out which machines were about to take all the jobs over the last century.
https://newsletter.pessimistsarchive.org/p/robots-have-been-about-to-take-al
Poetry camera: photo to written poetry with just a click! by @carolynz and @Flomerboy
What can’t large language models do? Yoav Goldberg explains.
Tool shed ⚒️
Text to Plot: Thousands of apps were generated within a week of GPT-4's release; one of them, OpenAxis, helps you get data and generate a plot.
Sunny side of the street: Spring is here and you need to know where the sun shines, by @shadowmap
https://app.shadowmap.org/ (better with laptop)
Memory lane 📽️
Cameron’s world: a web collage of text and images excavated from the buried neighbourhoods of archived GeoCities pages. We’ll talk about GeoCities another time.
Ego corner 🪞
We got a brand new website! Go play with our new ball!
Word pool 🤽
Luddite: One opposed to industrialisation, automation, computerisation, or new technologies in general. The name comes from a secret organisation, a radical faction that destroyed textile machinery during the 19th century. [w]
Large language model (LLM): LLMs are usually very large deep neural networks (hundreds of billions of parameters), trained on billions of pages of material in a particular language while attempting a specific task, such as predicting the next word. As a result, they become sensitive to contextual relationships between the elements of that language (words, phrases, etc.). They can be used to recognise, summarise, translate, predict and generate text. [w]
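That “predict what comes next” objective can be sketched in miniature with a bigram counter: a deliberately toy stand-in (tiny made-up corpus, word counts instead of billions of parameters) for what a real LLM learns at scale.

```python
from collections import Counter, defaultdict

# Toy corpus standing in for "billions of pages of material".
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count which word follows which: the simplest form of
# "sensitivity to contextual relationships between words".
follows = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    follows[current][nxt] += 1

def predict_next(word):
    # Return the most frequent follower seen in the corpus.
    return follows[word].most_common(1)[0][0]

print(predict_next("the"))  # → "cat" ("cat" follows "the" most often here)
```

A real LLM replaces the count table with a neural network conditioned on the whole preceding context, but the training signal is the same kind of next-element prediction.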
Prompt engineering: Methods for communicating with an LLM to steer its behaviour toward desired outcomes. A whole course by DAIR here. [w]
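One common prompt-engineering technique is few-shot prompting: showing the model a couple of worked examples so it continues in the same format. The task and reviews below are made up purely for illustration.

```python
# A hypothetical few-shot prompt. The solved examples steer the model
# toward answering in the same "Sentiment: ..." format for the last review.
prompt = """Classify the sentiment of each review as positive or negative.

Review: "Loved every minute of it."
Sentiment: positive

Review: "A complete waste of time."
Sentiment: negative

Review: "The visuals were stunning."
Sentiment:"""

# The prompt ends mid-pattern, inviting the model to complete it.
print(prompt.endswith("Sentiment:"))  # → True
```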
Find the canvas of all discoveries in our kinopio. Thank you all for reading!