Motor coach futurism

Tieandjeans


from Zvi

That is not remotely what Google did with Gemini. To the extent it was, Google trained Gemini to classify a large group of request types as harmful and not to be produced, and it very intentionally overrode the clear intent and preferences of its users. I am sympathetic to Mitchell's argument that this was largely a failure of competence, that the people who actually know how to do these things wisely have been disempowered, and that will get discussed more later. The catch is that to do these things wisely, in a good way, that has to be your intention.


I see the similarity between the Gemini incident and the language of the NYT headline from earlier in the roundup. Sentences with "white people" as a phrase carry a strong semantic weight. It's worth steering around that phrase in many cases.

But when you steer around mentioning whiteness absolutely, in every case, it really warps your decision space.

This adjusts my models in two ways:

We're all piled into this Winnebago, and we have rolled low on this Drivin':slowTakeoff skill test. But lots of the froth about this story seems to be rubbernecking back at the specific curb we clipped.

My model is just more worried about how bad we are at steering this bus at this speed.