If experts in every field are saying "AI can't do my job, but it can probably do everyone else's," why would we trust their beliefs about fields they *aren't* experts in, but not their beliefs about fields they *are* experts in?
Wouldn't it make more sense to assume that the reason everyone thinks AI can't do their *own* job is because they know enough about their own field to see the ways in which an AI output fails, but they think it can do other jobs because they aren't experts so can't tell why an AI output might look good but not actually hold up? A sort of "Gell-Mann amnesia" effect.
That's a great point, and I'd bet there's a lot of truth in it. Experts genuinely can see AI's limitations in their field more clearly, and yes, this strikes me as exactly analogous to Gell-Mann amnesia. I think this starts to get problematic when people make claims like "AI will *never* replace my job" and cite some specific skill to justify it. Or, more to the point in this piece, when they use their own perceived current exceptionalism (real or not!) as a way to dismiss or ignore broader questions about whether AI *should* replace a given job even if it *can*.