16 Comments
Collette Greystone:

There were three acronyms in computing in the late '70s and early '80s: KISS, GIGO, RTKP.

Keep it simple stupid.

Garbage in garbage out.

Reduce it to a known problem.

Guess they don’t teach those ideas anymore?

Alejandro Piad Morffis:

I mention KISS and GIGO to my students all the time ;) but hadn't heard the last one. I'm stealing it.

Collette Greystone:

Generational transfer of knowledge.

Alejandro Piad Morffis:

GTK!

Michael Woudenberg:

It doesn't matter how well we've coded the algorithm if it's on top of bad data...

Andrew Smith:

I always enjoy apocryphal history lessons; there's a reason they keep being retold!

This is well said: "what the researchers asked it to do, but not what they wanted it to do."

Asked vs wanted is such an important and clear distinction.

Alejandro Piad Morffis:

It's the hardest problem in AI again: alignment.

Andrew Smith:

Is another way to think about this: GIGO (Garbage In, Garbage Out)?

I mean, the flawed assumptions are always going to bring in an element of garbage, right?

Alejandro Piad Morffis:

Yep. GIGO is sound!

Andrew Smith:

Thanks, man. I really like to make sure I'm picking up what you're putting down!

Alejandro Piad Morffis:

Part of the problem in this case is that it's very hard to realize you had garbage. The COVID images are good, and the non-COVID ones are good too. The issue arises when you combine them without considering all the assumptions.

Andrew Smith:

I mean, that is THE problem, right? (I mean, the problem you want to identify in the article, not the only actual problem with AI!)
