technologyreview.com/2020/12/0

> Strubell’s study found that one language model with a particular type of “neural architecture search” (NAS) method would have produced the equivalent of 626,155 pounds (284 metric tons) of carbon dioxide—about the lifetime output of five average American cars. A version of Google’s language model, BERT, which underpins the company’s search engine, produced 1,438 pounds of CO2 equivalent in Strubell’s estimate...
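The quoted figures can be sanity-checked with a quick unit conversion. A minimal sketch, assuming only the standard pound-to-kilogram definition (the constant is not from the article):

```python
# Check the article's pounds -> metric tons conversion.
LB_TO_KG = 0.45359237  # standard definition of the avoirdupois pound in kg

def pounds_to_metric_tons(lb: float) -> float:
    """Convert pounds to metric tons (1 t = 1000 kg)."""
    return lb * LB_TO_KG / 1000.0

nas_tons = pounds_to_metric_tons(626_155)  # NAS model estimate from the quote
print(round(nas_tons))  # 284, matching the quoted "284 metric tons"
```

The same function puts the BERT figure of 1,438 pounds at roughly 0.65 metric tons, which makes the scale gap between the two estimates easy to see.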
