THE COST (AND BENEFITS) OF GPT

By Mark Anderson

Last Call! Register for Future in Review now to take advantage of our summer pricing of $4,900. The price jumps to $5,900 starting Sept. 1.

_____

Why Read: The world of GPT is too much with us, given its unreliable outputs. It's time to stop fawning over the mirror of human text and start counting the cost of the folly that is shipping broken product.

 

Author's Note: We have been writing for several months about the technical need for Explainable AI, particularly for those running GAI. Please refer back to those SNS issues if interested in that aspect of this discussion.

On GPT, with Humor

Reinforcement learning is interesting; without it, GPT is just another iterative, go-forward, pre-trained, large language-model neural network based on transformer architecture.

Right? 
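For the handful of readers who would rather see that mouthful unpacked than mocked, here is a toy sketch, in Python, of the "iterative, go-forward" (autoregressive) part: the model repeatedly guesses the next word and feeds its own guess back into the input. Everything here - the corpus, the counting, the names - is my own invention for illustration only, and the reinforcement-learning layer that makes ChatGPT feel polished is left out entirely.

    import random

    # Toy stand-in for "pre-training": tally which word tends to follow which
    # in a comically small corpus. Real GPTs learn this over trillions of
    # tokens with a transformer, but the go-forward loop below is the same idea.
    corpus = "the model predicts the next word and then the next word after that".split()
    follow = {}
    for prev, nxt in zip(corpus, corpus[1:]):
        follow.setdefault(prev, []).append(nxt)

    def generate(prompt_word, steps=6):
        out = [prompt_word]
        for _ in range(steps):
            candidates = follow.get(out[-1])
            if not candidates:                     # nothing ever followed this word
                break
            out.append(random.choice(candidates))  # feed the output back into the input
        return " ".join(out)

    print(generate("the"))   # e.g., "the next word and then the next"

Run it a few times and you will also get a feel for why the output can wander: the loop only ever asks "what plausibly comes next?", never "is this true?"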

At this point, I should have already lost something like 98% of readers, and it's therefore almost time for my first GPT joke to help bring you back. (There are two, in total.)

But even now, outside these pages, you have begun to realize that no one has a clue how this stuff works, that it's prone to dramatic and dangerous failures ("hallucinations" - the technical term - is too kind, I think), and that there is really no way to fix this.

As my team knows, I've been working on a refined comedy form I'm calling The Elevator Joke. It's for all those times we spend 10 seconds or so with strangers in elevators. It turns out, with the right goofy smile, this is the perfect time to break through all those shared, awkward silences with a ridiculous joke.

In the case I'm thinking of, we were attending the Fortune Brainstorm AI conference in San Francisco, and we'd just gotten our first agenda dose of GPT champions, capped with a stunning (not) demonstration, by Stability.ai, of dogs wearing different kinds of hats.

So of course, since GPT stands for generative pre-trained transformer - and since I assumed that all my elevator companions were GPT fans and AI execs - well, as the elevator doors closed, I asked, "OK, for 10 points, who can define what a transformer is?"

See, this wasn't really a joke, although the outcome was pretty funny: a couple of floors later, with the doors opening, no one had done it. Some adventurous soul finally shouted, "So what is it?!"

Here's the definition on Wikipedia: "A transformer is a deep learning architecture that relies on the parallel multi-head attention mechanism."
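If that sentence clears nothing up, you are in good company. For the curious, here is roughly what "parallel multi-head attention" boils down to, in plain Python with toy numbers; the dimensions and random weights are mine, chosen only to show the shape of the thing, not anyone's production model.

    import numpy as np

    def attention(Q, K, V):
        # Every position scores every other position, then takes a weighted
        # average of the values - "attending" to the rest of the sequence.
        scores = Q @ K.T / np.sqrt(K.shape[-1])
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)       # softmax
        return weights @ V

    def multi_head_attention(x, num_heads=4):
        # The "parallel" part: several independent attention heads run side by
        # side on slices of the representation, then get stitched back together.
        seq_len, d_model = x.shape
        d_head = d_model // num_heads
        rng = np.random.default_rng(0)
        heads = []
        for _ in range(num_heads):
            Wq, Wk, Wv = (rng.standard_normal((d_model, d_head)) for _ in range(3))
            heads.append(attention(x @ Wq, x @ Wk, x @ Wv))
        return np.concatenate(heads, axis=-1)

    tokens = np.random.default_rng(1).standard_normal((5, 32))  # 5 tokens, 32 dims each
    print(multi_head_attention(tokens).shape)                    # (5, 32)

Stack a few dozen layers of that, plus the feed-forward and normalization plumbing this sketch leaves out, and you have the "architecture" nobody in the elevator could define.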

Now is the time to insert graphics of the architecture to make everything clear again, since I sense many of you may still not be following. It is really simple, so just jump down to "Upgrades," right now, then come back.

(Pause)