As an aside, and more generally, a problem that bothers me: the economics of AI replacing knowledge workers doesn't add up. If all knowledge workers become manual laborers, the market for AI augmentation products shrinks. Manual laborers don't need much AI augmentation, almost by definition. So if all the knowledge workers go away, what is the market for the AI suppliers?
Also, the feedback loop. If the next 10 generations of AI products are trained on the output of the previous 10 generations, with a diminishing proportion of human-generated content, is there a risk of a degenerative feedback loop, with the ability to "judge" decaying in the same way? (I'll be candid and say that I believe, but don't know, that humans at the upper margin are better at willing and judging than any AI. Let's not underestimate the willing piece: the best way to do something can only be judged in the context of what you want to achieve, and that's a bazillion-factor optimization problem.)
Intuitively, this seems correct.
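The degradation worry can be caricatured with a toy simulation. This is a sketch under loud assumptions, not a claim about real training dynamics: each "generation" here is just a Gaussian fitted to a small sample drawn from the previous generation's fit, so estimation error compounds and the spread of the distribution tends to collapse over generations.

```python
import random
import statistics

def run_generations(n_generations=100, sample_size=20, seed=0):
    """Toy 'generations trained on generations' loop.

    Generation 0 is a unit Gaussian standing in for human-generated data.
    Each subsequent generation fits mean/spread to a finite sample drawn
    from the previous generation's fit. The spread estimator is biased
    low, so diversity tends to drain out of the loop.
    """
    rng = random.Random(seed)
    mu, sigma = 0.0, 1.0          # generation 0: the "human" distribution
    history = [sigma]
    for _ in range(n_generations):
        sample = [rng.gauss(mu, sigma) for _ in range(sample_size)]
        mu = statistics.fmean(sample)
        sigma = statistics.pstdev(sample)  # population std: E[s^2] < sigma^2
        history.append(sigma)
    return history

if __name__ == "__main__":
    hist = run_generations()
    print(f"generation 0 spread:   {hist[0]:.3f}")
    print(f"generation 100 spread: {hist[-1]:.3f}")
```

With a small sample per generation, the fitted spread shrinks over successive generations, which is the toy version of the diversity (and judging material) draining away. Real systems are vastly more complicated, but the direction of the compounding error is the point of the sketch.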