Tech
- Unless you work as an engineer for the company that designs the parts, you don't actually build a PC. You push the components into the backbone circuit board (the "motherboard").
- A good GPU can cost more than all the rest of the computer’s components combined.
- Cultural artifacts are repurposed all the time. For example, a widely used medical blood thinner started as rat poison.
- I assume the rear toggle switch sends power to the motherboard, and the recessed switch powers on the entire system.
Nvidia Recursion
*I’m going to write 500 to 1,000 words or so, composed in one sitting, and post them here. I intend these posts to be more ephemeral than those on gebloom.com.*
If you take Ubers and occasionally find yourself baffled about how to exit the back seat, you’re probably in a Tesla. The pressure is on when you search for the door handle while your driver is in a rush to pick up their next customer. The Tesla door handle is recessed, and I find myself looking for a handle rather than the telltale outline in the interior that says handle here.
I felt a different kind of pressure: that of disappointing my 14-year-old grandson, who had already waited over a month for a custom-built “gaming PC” from a company big in reputation and small in employee count. Dell and HP, the dominant Windows PC makers, construct office and home computers from cheap parts adequate to run Microsoft Office and a browser. But computer game players graduating from gaming consoles such as the Sony PlayStation require big-cased computers that can house a large and expensive dedicated graphics processing unit (GPU) such as those made by Nvidia.
You may have heard of Nvidia. For most of their years, they were known as the graphics processor company that made increasingly powerful GPUs to service the ever-growing graphics demands of modern computer games. Now they’re the most valuable company in the history of capitalism, worth on paper (the stock market) four trillion dollars. How did that happen?
Many hands make light work
—A sentiment from the New Testament and a Chinese proverb
It turns out that the talent of the GPU for powering computer graphics is the same talent that trains large language models (LLMs) such as OpenAI’s ChatGPT, Google’s Gemini, and Anthropic’s Claude. The main engine of a computer, the central processing unit (CPU), processes only one instruction at a time. Modern CPUs are fast, but not fast enough to push demanding modern computer graphics or train an LLM. The GPU, by contrast, is a parallel processor that runs many instructions at once. The CPU is a speedy modern train. The GPU is a freeway adjacent to the train tracks, full of sports cars.
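The serial-versus-parallel contrast can be sketched in a few lines of Python. This is a toy illustration, not real GPU code: the loop mimics the CPU's one-instruction-at-a-time style, while `map` expresses the GPU's "same operation, many data items at once" idea (on a real GPU, thousands of cores would each handle a different pixel simultaneously).

```python
# Toy illustration: brightening every pixel of a tiny "image".
pixels = [10, 40, 90, 200]

# CPU-style: one instruction stream, one pixel at a time.
brightened_serial = []
for p in pixels:
    brightened_serial.append(min(p + 50, 255))  # cap at max brightness

# GPU-style in spirit: the same operation applied across all the
# data at once. (Python's map still runs serially here; on a GPU,
# each pixel would get its own core.)
brightened_parallel = list(map(lambda p: min(p + 50, 255), pixels))

print(brightened_serial)    # [60, 90, 140, 250]
print(brightened_parallel)  # [60, 90, 140, 250]
```

Both paths compute the same answer; the difference is that graphics work (and LLM training) involves millions of such independent, identical operations, which is exactly the shape of work a parallel processor devours.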
Because I had “built” two computers in the past, I recognized all the parts inside my grandson’s computer, but I could not find the on switch. To be fair, I thought we had already turned the computer on with a toggle switch in the back, and I feared there was something wrong with the new machine or that we hadn’t hooked up the display correctly.
Because custom-built computers are assembled from a vast variety of parts—only your willingness to spend sets the limit—the builder can’t supply more than basic instructions for your particular order. After troubleshooting for an hour, I searched Google’s Gemini LLM—powered by billions of dollars of those linked parallel-processing Nvidia GPUs—and Gemini explained that the on-off switch was recessed (like the Tesla door handle) in the black metal case, a switch difficult to discover in that dark bedroom. My grandson pushed the switch and the display lit up. Nvidia GPU to power the computer. Nvidia GPUs to learn how to turn the computer on.
–––