A.I.


Any thoughts on this one? I'll add my own soon enough... going to see it a second time today.

-- Mat Rebholz (matrebholz@yahoo.com), July 04, 2001

Answers

Just saw it yesterday for the first time, and I'm still in a bit of shock. A.I. is by far Spielberg's best work, and while for me, that in itself isn't saying much, it is a great film regardless of who made it. Since my reactions to the film are so complex and far-ranging, I'll just make a few points to avoid a tangled narrative.

1. Anyone who thinks this is a movie about a robot boy searching for love is hopelessly stuck in literal-mindedness.

2. Whether intentionally or not, this has got to be the most anti-clerical mainstream Hollywood movie I've ever seen. You'd have to go back to Bunuel or Jodorowsky's "Santa Sangre" for a comparably sustained attack on the church in cinematic form.

3. The irony of David's plight is that he is already superior to that which he seeks to become and doesn't know it. It's his programming (indoctrination) which makes him blind to the power of his potential. The Pinocchio story, which is nothing more than a retelling of the Judaeo-Christian myth, is exploded thoroughly. He is taught to stay on the straight and narrow path, for in the end, his faith will bestow on him a great reward. But what if the promised reward is not for him to have? David's tragedy is that he's unable to be free of his single-minded fixation.

4. The ending is marred only by the annoying and needless narration (probably added at the last minute). Imagine the ending without it, and it is sublime. A simulation of a boy professing and receiving love from a simulation of his mother. The mother's perception of this event is entirely illusory (and for this reason cruel, if she is truly alive for that one day.) For David, it's a somewhat pathetic cardboard version of his dream, yet it doesn't matter. He can't really distinguish between simulation and reality anyway, since he himself is a simulation. It's perfectly ambiguous: both happy and hellish at once.

5. To an artificial mind, the concept of reality is meaningless. And by artificial mind, I'm not just talking about robots. All cultural conditioning is man-made. Our thought patterns, sense of values, notions of morality are indeed mostly engineered.

6. We can imagine that the sentient machines are the result of an evolutionary process that began with the creation of compassionate robots of which David was the first. The movie is clearly misanthropic in its depiction of humans as cruel, lustful, and environmentally destructive. I like the twisted optimism of showing the inheritors of humanity to be enlightened machines of our own design.

7. Even the nature of love itself is questioned as an absolute good. The human need for love is an affliction in David's case, and leads to acts of destruction: David's "murder" of his double and his own attempted "suicide". The brilliance behind the conception of David's character is that he is an automaton driven, not by reason (the usual depiction of robots in SF), but purely by emotion. His emotions are beyond the scale of human capacity, and he serves as a counterpoint to the argument that what makes humans better is our ability to feel.

8. For those who think it's too bad that Kubrick didn't make this film himself, I (surprising myself) disagree. For once, Spielberg's penchant for overwrought sentimentality is applied to the perfect subject. Since A.I. is about artificial emotions, who better to direct it than the master of the phony sentiment? I think Kubrick was right when he judged Spielberg to be the more appropriate orchestrator of this material. The process of watching parts of the film is very much like becoming robotic ourselves. In this case, perfectly complementary.

-- Peter Chung (neo830holy@orgio.net), July 16, 2001.


No. A.I. may be a great film on a purely intellectual level, and it may indeed contain a great deal of moral ambiguity (making little Haley unlikable is the bravest thing this movie ever did), but on philosophical and ethical grounds, it is shallow, cynical, and ultimately reprehensible.

The "irony" in David's longing to become human, coupled with the ending, has an underlying message: technology as the Second Coming, a judgment both amplifying and cleansing us of our flaws. And what is left? Perfect "mecha" that all look the same. Here, spirituality and Darwinism is conflated.

It's not for nothing that the evolved mecha resemble the "Grays", stereotypical movie aliens. Among UFO supporters, Grays are perceived as peaceful visitors from the future - a future where man has evolved to ONE common denominator. The inherent racism in such an assumption (that humans would have to be the same color to co-exist peacefully) is glossed over. Darwin himself was a virulent racist, a champion of eugenics all too eager to extend his theory of "survival of the fittest" into human society.

Like most eco-apocalyptic fantasies, A.I. attempts to justify Darwinistic, linear growth by incorporating a redemptive Armageddon. Sure, we'll destroy ourselves; but it's OK, the film reassures, because we built ourselves better. Evolution at work.

Furthermore, while I accept that some emotions are the result of conditioning, not ALL feeling can be taught. There is such a thing as gut feeling, an intuitive sense of right and wrong that these androids lacked. David didn't fail because he was programmed to feel, thus compromising his logical superiority - he failed because those feelings were programmed. "The mind (artifice) creates monstrosities, paradise makes the opposite" - Killing Joke, "Intellect".

A.I. is, in many respects, a monstrosity - a beautiful, hideous, evolutionary propaganda film, with an environmental and spiritual message potentially as harmful as anything Riefenstahl ever did. You may ask what ethical judgments have to do with art, but in my opinion, art without a strong ethical base (ethics being defined as basic principles of living) is meaningless.

-- Inukko (nadisrec@worldnet.att.net), July 17, 2001.


Inukko, I agree with most of what you're saying, especially the idea that high aesthetic values do not redeem art that is morally bankrupt. I’ve always believed that Spielberg is the most cynical of filmmakers. His need to sentimentalize is directly proportional to his lack of faith in the audience to feel emotion on their own. Kubrick is often called a cold-hearted misanthrope—yet watching his films, I know he respects me as a viewer and holds me in high enough regard not to talk down to me. That’s the opposite of cynicism.

I really have no idea if the virtues I perceive in A.I. are intentional, and as I’ve said many times, it ultimately doesn’t matter. Every creative choice in a film reflects the judgment of the filmmaker, consciously or unconsciously.

I do not agree that the identical appearance of the mechas at the end need be seen as a veiled signifier of racism. (Even though Spielberg has revealed his deep-seated racist views in the Indiana Jones movies.) They are faceless, genderless and egoless as well as being without race. And I wouldn’t say that their presence is entirely comforting—even if it is for David. The tone of the ending is ambiguous. It is neither celebratory nor is it regretful of humanity’s passing.

The story is told from the sentient machines’ point of view. The narrator is the voice of the robot who explains things to David at the end. From their perspective, the humans were savages, much as we regard Neanderthals. And who’s to say that Neanderthals weren’t our betters?

But then, the movie is not really about robots—not any more than Pinocchio is about a wooden boy. A.I. is a fairy tale, and so speaks in the language of metaphor and symbol. In Collodi’s case, the puppet represents the worldly body, subject to base instincts and temptations which would hinder it from attaining the godly state of realboy-dom (sorry, I don’t know what other word to use), in other words, man’s immortal life in heaven. The problem I’ve always had with Pinocchio (and, yes, Christianity) is that it places happiness always beyond reach, and in the process deprives us of the joy of living in the moment. This is why David is such a tragic figure, even at the end, when he finally attains the fulfillment of his dream. His reward is ultimately hollow.

David never reaches enlightenment nor frees himself of the burden of his idée fixe. He is a model of how not to live. Unlike most such stories, the unwavering pursuit of one’s dream is not presented as a virtue.

-- Peter Chung (neo830holy@orgio.net), July 17, 2001.


OK, I can appreciate that. It's not how I read the film, but you're speaking from a very different perspective.

I agree about Spielberg being the more cynical of the two; it is, in part, my dislike for his "deus ex machina" scenarios that led to my strong reaction against this movie. Giving us perfect machines to carry on the flame of humanity feels like an easy-out - forgivable in pure entertainment, but I resented the idea that this was somehow an important message. On the other hand, a lot of Hollywood films attack religion just to be "with it"... the explanation you gave here made more sense.

Veiled religious significance aside, I'd have an easier time accepting A.I. if the concept of feeling robots hadn't been touted as something profound. When robots (or aliens, or angels) are portrayed as superior, it's usually so we can project our own wishful thinking... to me, A.I. represents escapism of the worst kind.

-- Inukko (nadisrec@worldnet.att.net), July 17, 2001.


I have not seen this film, but this discussion urges me to. The only thing I feel compelled to add is that when Hollywood portrays love, it is the usual brand of love it accepts as the only brand: the love that stems from the need to satisfy ourselves, our wants, our needs, in relation to the approval of those around us. We do this in films without recognizing it. Perhaps growth beyond this sort of love is what would make for a truly superior people. To make it clearer: didn't Ma Barker's sons love her? Does the emotion that drove them to behave in accordance with her approval prove them superior for having loved? Love in a state of perfection is not even truly considered here, from what I see, but rather a self-centered sort of love, and so deserving of the schmaltz bestowed upon it by Spielberg...

-- Barbara e. (Suesuesbeo9@cs.com), July 20, 2001.


Wow... that actually has some relevance to what I'm doing. I can't reveal anything more yet, but thanks for the encouragement Barb.

I've been on a hating jag recently (with regard to film)... don't like any of the movies I've seen lately. Final Fantasy was... ergh. But I take back what I said about AI being harmful; maybe "clumsy" is a better description. Anyway, you could do worse. I loved Duel ;)

-- Inukko (nadisrec@worldnet.att.net), July 20, 2001.


Of course, the love of a son for a mother and the desire for her love returned is not exactly an inferior aspiration, and realboy-dom is the cutest word for Pinocchio's aspiration that I have ever heard.

-- Barbara e. (Suesuesbeo9@cs.com), July 21, 2001.

Now that I'm finally back on the net, it's about time I put down my own thoughts on this film, even though it's been more than a month since I saw this thing. Let's see...

1. To echo Peter somewhat, my most overwhelming impression while watching this was how almost all of the character interaction was either between machine and man or between machine and machine. At first, I thought, "well, there's really no substance there... how funny that they've tricked us into being interested in it!" After further thought, I realized that human interaction doesn't necessarily require a human presence... it's the behavior that defines humanity, not the thing behaving. The system is human; we are not. I believe in the Turing test of machine intelligence: if a machine seems intelligent, one must assume that it is. If it seems to feel, then it does feel.

2. I've always found the traditional sci-fi plot of the robot seeking humanity to be somewhat uninteresting -- this film, though, let me happily think of other things. It was something of a reversal... instead of the viewer trying to define a robot's humanity, the viewer tries to define his own humanity by observing how his emotions can just as easily be swayed by the human behavior of an inhuman machine. Of course, this goal was stated rather blatantly from the outset of the film.

3. Lots of clever bits of writing helped the story. One of my favorites: Professor Hobby's conversation with the mecha in the first scene ("What did I do to your feelings?" "You did it to my hand.")

4. On that note, some nice, subtle mecha acting, probably aided greatly by equally subtle makeup jobs, like the woman from the first scene and Gigolo Joe. Perhaps my favorite shot in the film is our first view of Joe's face, the camera peering closely into his eyes as he provides emotional support to a troubled woman. His expression caught me immediately -- perfectly sculpted, soft, gentle, but eerily distant. In that single image, they had captured for me the mood of the era, and of the film.

5. I liked the contrast in environments. The shift from David's abandonment to Joe's first scene was especially nice -- suddenly lost in an "adult world", we're immediately thrust into dealing with issues of abuse and sexuality (the only components of the experience of love for many adults, very much unlike David's version of the feeling).

6. Teddy was my favorite character. For me, he quite blatantly represented one of the movie's main ideas: despite the fact that these machines are distinctly artificial, they can seem as "real" as the next character. If you notice (I suspect this was planned, but it was pulled off quite subtly for the most part), none of Teddy's lines reveal any true intelligence. In fact, our current technologies could imitate any of his interactions. He's a simple question-answer program, designed to aid in a child's upbringing by making bedtime story statements ("I see the moon"). And when he doesn't know the answer, he gives magic 8-ball responses. ("Is it real?" "I can't tell yet.") Furthermore, he's designed to protect himself as the product, and his fellow products ("I'll break", "You will break.") I was surprised by some reviews I read, which named Teddy as the wisest character of the bunch. He was the simplest of them all, and yet, he was viewed as equal, if not superior, to his companions. (A rough sketch of how little code such a canned-response program needs appears after this list.)

7. I was alarmed at how many small children came with their parents to this movie. One boy was pelting his mother with questions during the ending, and she seemed just as confused as he was.

8. I must admit that the final scenes were hard to sit through, and on the second viewing, I even had to walk out about ten minutes before the credits (a very rare thing for me). The explanatory dialogue was unbearable, depending on my mood. Even though I like the idea of resurrecting a person from the past using quantum mechanics or some similar method (in fact, I wrote a short story a few years ago on the idea), the last thing I wanted was Grandpa explaining the process to David using Star Trek terms like "space-time pathway". Also, I found the CG fairy uncomfortable to watch. David twitching upon being woken, though, sent shivers down my spine, as did his frozen form rigidly maneuvering out of the newly-opened vehicle (yes, I'm swayed by imagery). I agree with Peter that the sentient machines were not racist -- rather, they were entirely devoid of such notions (believe it or not, a secondary theme of another story I've been working on).
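
To make point 6 concrete, here is a minimal sketch of the kind of canned-response, question-answer program described there -- a few fixed lines keyed to recognizable prompts, with magic 8-ball fallbacks for everything else. The prompts, function name, and responses are purely illustrative (borrowed from the lines quoted above), not anything from the film's actual production. In Python:

    import random

    # Fixed lines keyed to recognizable prompts -- the "bedtime story" and
    # self-preservation statements mentioned in point 6.
    CANNED = {
        "what do you see": "I see the moon.",
        "will you break": "I'll break.",
        "will i break": "You will break.",
    }

    # Magic 8-ball style fallbacks for anything the program can't match.
    FALLBACKS = ["I can't tell yet.", "Maybe.", "Ask again later."]

    def teddy_reply(prompt: str) -> str:
        """Return a canned line if the prompt matches, otherwise a noncommittal fallback."""
        key = prompt.lower().strip(" ?.!")
        for phrase, line in CANNED.items():
            if phrase in key:
                return line
        return random.choice(FALLBACKS)

    # "Is it real?" matches nothing, so it draws a fallback such as "I can't tell yet."
    print(teddy_reply("Is it real?"))
    print(teddy_reply("What do you see?"))  # matches: "I see the moon."

The point is not that this is how Teddy was actually built, only that nothing in his dialogue requires more than this sort of lookup.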

Whew. I just got off work, and I'm tired and full of caffeine. I'll be sure to add anything else I might remember.

-- Mat Rebholz (loquat@csranet.com), August 01, 2001.


Hey, Mat is back!

I can't really argue with that (although I don't agree with point 1; I don't see machines ever becoming sentient, only people finding new ways to channel their own sentience through them). The ending, I still have "issues" with... I don't think it's racist, not directly, but I still resented its implications. And yeah, the dialogue was rather cheesy.

I *do* like the close-up of the blue fairy as it breaks into a thousand pieces, though... that's some cool (no pun intended) imagery.

-- Inukko (nadisrec@worldnet.att.net), August 02, 2001.


I guess I didn't make it clear enough (I was a little out of it when I wrote that), but I think machine sentience is a moot point. I don't particularly see machines ever becoming sentient; however, they can become human (which is effectively the same thing for our purposes), and I agree with you: we only apply our own sentience, our own humanity, to them. They are vessels for our own emotions. (I'd also argue that this process doesn't require a non-human target; we do it with other people on a daily basis.) I'm currently writing a short novel based in part upon this concept, and on the malleable nature of reality that Peter referred to earlier (I agree with your comments on the topic); I'll be sure to post a link to it once it's finished and on the net.

-- Mat Rebholz (loquat@csranet.com), August 03, 2001.

