Yes, the main problem with developing AI is that we really don’t understand how we think. Current AI doesn’t understand anything; it just imitates human output by processing a vast amount of existing output. But we do know a lot more now about how we think, understand and speak than we did a hundred years ago, and as a linguist you know this work isn’t standing still. Compare it with genetics - 70 years ago we didn’t even know about DNA, and now we can splice genes. The fact that there’s still a lot of baseline work to do shouldn’t cast doubt on the goal, should it?
Oh yes it should. We have spent thousands of years looking at these things, and look where we are.
For almost all of those thousands of years, no tools existed to analyze the actual mechanics of brain function. The development of all sciences has been exponential in the last couple of centuries. I’ll be here if you decide you want to converse like someone with a master’s degree instead of a mediocre high school student scrolling lemmy on the toilet.
Lol. Good luck, mister exponential science