There were three acronyms in computing in the late '70s and early '80s: KISS, GIGO, and RTKP.

Keep it simple stupid.

Garbage in garbage out.

Reduce-it to-a known problem.

Guess they don’t teach those ideas anymore?

I mention KISS and GIGO to my students all the time ;) but I hadn't heard the last one. I'm stealing it.

Generational transfer of knowledge.

GTK!

It doesn't matter how well we've coded the algorithm if it's on top of bad data...

I always enjoy apocryphal history lessons; there's a reason they keep being retold!

This is well said: "what the researchers asked it to do, but not what they wanted it to do."

Asked vs wanted is such an important and clear distinction.

It's the hardest problem in AI again: alignment.

Is another way to think about this GIGO (Garbage In, Garbage Out)?

I mean, the flawed assumptions are always going to bring in an element of garbage, right?

Yep. GIGO is sound!

Thanks, man. I really like to make sure I'm picking up what you're putting down!

Part of the problem in this case is that it's very hard to realize you had garbage. The COVID images are good, and the non-COVID images are also good. The issue arises when you combine them without considering all the assumptions.
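
A minimal sketch of that failure mode, under purely hypothetical assumptions: two "good" datasets differ only by an acquisition artifact (a small brightness offset standing in for scanner or hospital differences) and contain no disease signal at all, yet a classifier trained on the combined data scores well above chance because the source is perfectly correlated with the label.

```python
# Hypothetical illustration: two individually "good" datasets that differ
# only by an acquisition artifact (mean brightness), not by disease signal.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n, d = 500, 64  # images flattened to 64 "pixels" for the sketch

# COVID set: source A, images slightly brighter on average.
covid = rng.normal(loc=0.2, scale=1.0, size=(n, d))
# Non-COVID set: source B, images slightly darker on average.
non_covid = rng.normal(loc=-0.2, scale=1.0, size=(n, d))

X = np.vstack([covid, non_covid])
y = np.concatenate([np.ones(n), np.zeros(n)])  # 1 = labeled COVID

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y
)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("Held-out accuracy:", clf.score(X_test, y_test))
# Accuracy is well above chance even though there is no disease signal
# at all: the model has learned which source the image came from,
# i.e. the garbage hidden in how the two sets were combined.
```

On a random split of the combined data, internal validation looks fine; the shortcut only becomes visible when the model is tested on images where source and label are no longer correlated.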

I mean, that is THE problem, right? (That is, the problem you wanted to identify in the article, not the only actual problem with AI!)
