Revenge of the Humanities

The simple fact of the matter is that interacting with the most significant technology of our time—language models like GPT-4 and Gemini—is far closer to interacting with a human than to how we have historically interacted with machines.

Can an A.I. Make Plans?

How can these powerful systems beat us at chess yet falter at basic math? This paradox is more than an idiosyncratic design quirk; it points to something fundamental about how large language models think.

Computational Power and AI

Large-scale compute is also environmentally unsustainable. Chips are highly toxic to produce and require enormous amounts of energy to manufacture: TSMC alone accounts for 4.8 percent of Taiwan's national energy consumption, more than the entire capital city of Taipei. Running data centers is likewise costly for the environment: by some estimates, every prompt run on ChatGPT consumes the equivalent of pouring out an entire bottle of water.