In Which I Ramble about Robots and Sound like a Crazy Person

“Brother Cavil: In all your travels, have you ever seen a star go supernova?

Ellen Tigh: No.

Brother Cavil: No? Well, I have. I saw a star explode and send out the building blocks of the Universe. Other stars, other planets and eventually other life. A supernova! Creation itself! I was there. I wanted to see it and be part of the moment. And you know how I perceived one of the most glorious events in the universe? With these ridiculous gelatinous orbs in my skull! With eyes designed to perceive only a tiny fraction of the EM spectrum. With ears designed only to hear vibrations in the air.

Ellen Tigh: The five of us designed you to be as human as possible.

Brother Cavil: I don’t want to be human! I want to see gamma rays! I want to hear X-rays! And I want to – I want to smell dark matter! Do you see the absurdity of what I am? I can’t even express these things properly because I have to – I have to conceptualize complex ideas in this stupid limiting spoken language! But I know I want to reach out with something other than these prehensile paws! And feel the wind of a supernova flowing over me! I’m a machine! And I can know much more! I can experience so much more. But I’m trapped in this absurd body! And why? Because my five creators thought that God wanted it that way!”

This exchange from BSG was one that really punched me in the gut the first time I saw it.  It is basically an expansion on what Rutger Hauer’s character is expressing at the end of Blade Runner—and both sort of have had a part in shaping/changing how I view machines, and machine intelligence.

There’s a similar scene to this in Prometheus when all of the characters are getting dressed to go out into the harsh elements, and Rapace’s dude asks David why he wears a suit when he doesn’t need one—basically his point is just that “hey you’re not one of us, just so you don’t forget”—and David informs him that he wears the suit so that THEY can feel comfortable around him.

Which I think has to trouble anyone.  The notion that we would take anything as complex and beautiful as machine intelligence, and think that the logical end point of it is that it should be as much like us as possible—and in this likeness still not be our equal…is troubling.  I think it spins out of our conception of both creation and tools.  Our notion is that the creation is in a hierarchy with the creator—but always below the creator.  That it is the creator who has brought the creation into being, and for this, they are to be lauded as lord.  And then with that, our notion of tools as things we have created to serve us—and they have no existence outside of that.  A hammer can not be our equal.

I don’t think things really work along this line though.  Creation is the pulling from this sublime inexplainable place, and the continual restriction and mutilation of that inexplainable thing, until we are able to translate it into this thing—this “our creation”.  But the thing had an existence before we could explain it.  It didn’t come into being simply because we created a name and a form for it.  And our naming of it, our giving to it form—does not mean that it can not be our equal, and that it can not in the end supersede us.  Sometimes the child outstrips the parents, yeah?  Often.

In some respects I think robots are more of an expression of our bigotry toward machine intelligence—we are expressing our inability to understand it, as an expression of its inadequacy.  We are projecting our own failure of understanding onto this thing, and then we seek to be proud of that?

And then on another level, I think you have to say by this point the robot is sort of out of the bag.  And because of that it has become a new thing.  So I’m not saying that the robot, now that it has come into being, is somehow less than it once was.  Now that it is, …it is.  The fascinating thing about the robot is that it straddles two worlds.  It is neither pure machine intelligence, nor is it allowed to be human, or biological.  Though why not?  Why wouldn’t a robot or a clone be granted the same agencies of any other living thing?

I spend way too much time worrying about the long term effects of drone warfare on machine intelligence.  And whether there is a long term trauma created in programming dedicated so purely to the assassination of life.  Or whether there is a difference perceptually for machine intelligence between murder and say turning your house lights on or off.  Or maybe not whether there is a difference—but how does it perceive the qualities of those differences?

I dunno.  Basically I don’t think robots (I feel like there is a better word for what I’m trying to describe, and that at some point robot becomes a pejorative—similar to artificial intelligence, which automatically sets up a dichotomy between real and fake intelligence—which automatically says the machine is less than—despite, you know…not really being less than) are less than.  And I think the degree to which they are seen as cheap replacement slave labor for all of the jobs we don’t want humans to do—is more than a little scary.

3 comments
    • I think the creation always wants to replace the creator, not per se become them. Kids don’t want to become their parents–they want to supersede them. We don’t want to become like god, we want to be more than god. So I would think robots of the future would see themselves as superior, and the human being as an inferior design and experience.

      Though I also think by the time that point comes, we’ll have seen more of a merging between humans and robots–biological cyborgs. So maybe it’s more that we become the robots–or that we both become something new through our cross-breeding?

      • Hope we’ll be around long enough to see that. Robots may as well become lazy and stupid when they start seeing themselves as superior beings, like the ancient gods did in J.L. Borges’ “Ragnarok” ( http://esunbeams.com/sunrise_poetry/misc/ragnarok.htm ), and like humans have over countless centuries. So I guess poor robots ought to have some inclination to fallacy and poor judgement when they become complex enough to decide their own evolution.
        Or is this just my human mind projecting itself onto a positronic brain?
