• LostXOR@fedia.io · 2 months ago

    This article estimates that GPT-4 took around 55 GWh of electricity to train. A human needs maybe 2000 kcal (about 2.33 kWh) a day and lives 75 years, for a lifetime energy consumption of roughly 64 MWh (about 860x less than just training GPT-4).
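    The arithmetic is easy to sanity-check; a minimal sketch, using 1 kcal ≈ 1.163 Wh and treating the 55 GWh figure as the article's estimate (the human-side numbers are rough assumptions, not measurements):

    ```python
    # Back-of-envelope check of the numbers above.
    GPT4_TRAINING_KWH = 55e6            # 55 GWh, the article's training estimate

    KCAL_TO_KWH = 1.163e-3              # 1 kcal = 1.163 Wh
    daily_kwh = 2000 * KCAL_TO_KWH      # ~2.33 kWh per day
    lifetime_kwh = daily_kwh * 365 * 75 # ~63,700 kWh ≈ 64 MWh over 75 years

    ratio = GPT4_TRAINING_KWH / lifetime_kwh
    print(f"lifetime: {lifetime_kwh / 1000:.0f} MWh, ratio: {ratio:.0f}x")
    ```

    So training GPT-4 once costs on the order of 860 human lifetimes of food energy.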

    So not only do shitty “AI” models use >20x the energy of a human to “think,” training them uses the lifetime energy equivalent of hundreds of humans. It’s absolutely absurd how inefficient this technology is.