I play a lot of video games, and one of the major limiting factors in games today, really since their inception, is that no matter how pretty the world is, it's ultimately scripted. The creatures and characters within the world move along a predefined path, even if that path is very wide. You'll notice it whenever you try to do something outside the bounds of what the programmers intended. Sometimes the thing you're interacting with will simply repeat itself because it doesn't know how to respond beyond what it's been told to accept. In artificial intelligence there is a test, the Turing test, meant to see whether a machine has the ability to think for itself. So far no computer system has completely passed it, although many have tried. It's hard to say just when we'll find ourselves surrounded by machines that think. The real question, though, is this: if we were to turn a test like that on ourselves, how well would we actually do, given a long enough testing period?
Each of us lives our life based on the experiences we've gained over time. We have the capability to learn from these experiences, and with any luck our past mistakes help prevent future ones. If you're anything like me, though, that doesn't really stop the mistakes from happening. Instead it's just a variation on a theme, over and over again. Could this be because we ourselves are limited by life's parameters? A computer has no idea how to respond to a question about theoretical physics if it has no experience with it. The knowledge has to be programmed into its system for it to have any understanding. Our brains are very much the same way. If you were to ask me a question about theoretical physics, I'd do roughly as well as a computer that could speak. So let's say both the computer and I are given some experience, and we're now able to respond to questions regarding the theory of everything. Our answers would still be limited to the knowledge we've been given, combined with our previous experiences. As a rational creature, I may be able to make intuitive leaps in logic based on seemingly unrelated fields of study. The computer would also be able to answer in its own way. Given time, though, we'd both eventually hit the upper limit of our knowledge and experience.
When we meet new people for the first time, we may find them refreshing and interesting because, at least initially, they are outside our experience. What they say or do may seem like a mystery until we get to know them better. As we do, we can start to predict how they'll react to certain situations. If we spend a really long time with them, we may start to know them better than they know themselves. The things they do become almost repetitive. That's not necessarily a bad thing, either. No person has endless experience, even if we wish we did. On a long enough timeline, everyone comes to a point where the well is dry. If that's the case, then just how dynamic are we compared to some video game character, or a robot being tested for its ability to think on its own? We look at a computer system and decide, through a complex series of tests, that it doesn't think, and yet if we were to extend the same kind of test to ourselves, it's likely we'd eventually reach a point where we'd be hard pressed to say whether our own behavior is intelligent. What if there is something else out there in the universe with its own standards for determining whether something is intelligent? Something with standards so far beyond what we can comprehend that we'd be looked at roughly the same way we look at a calculator: yes, it can do simple tasks, but it can't "think". Does that call into question our abilities, or our standards? Just how sure can we be when our only basis for intelligent thought is how everything compares to our own minds?