We Talk “IF” With Alterna Comics – Part 2

Nicholas Bennett
Subscriber
September 19th, 2015


We at ComiConverse are pleased to present the second in our series of Q&As with some of the many diverse voices from the upcoming science-fiction-themed anthology by Alterna Comics – IF.

Founded by Peter Simetti, Alterna Comics offers a unique platform for writers to work in the short form and present their ideas. While one of the big lines in IF says "there are no questions, only answers", we hope to do a little bit of both in these posts.

Below is a series of back-and-forths with some of the creators, who give their takes on science fiction and the many questions arising in the genre as a whole.

More of these exchanges will be posted in the coming weeks.

You can check out their page for this project for more information.

You can also visit Alterna Comics and get your copy of the series.

Check out part one here.

Nick Bennett for ComiConverse:

Dino said this in one of his e-mails, and I think it applies here:

Maybe if the story is too true to life, and we recognize the grocery store, the microwave oven, the taxi cab, then we use a subjective lens and we're not able to empathize with the characters in quite the same way.

Alex mentioned the speed with which we are advancing. You see companies on the news working on enhanced cybernetics. It is possible we will all see viable versions of these kinds of technologies within our lifetimes. I think James did a nice job with the newspaper clippings and the slow burn of how technology is introduced into our society.

Glenn, you mentioned Blade Runner and Terminator, two films where identity is crucial. In both stories, robots are able to approximate humans. While in Love By Numbers it was clear your robot was a machine, James went with the humanoid look in Apex War, where one couldn't tell the difference between a robot and a person. We could be here all day naming the stories that have touched upon this subject.

Perhaps it is the enemy you know (a giant Transformer) vs. the enemy you don't (Terminator).

Do you think we inherently trust clunky robots more than ones approximating human features?

James Roche, author of Apex War:

Maybe we'd trust a robot more if we could see its gears turning and hydraulics pumping, because it'd look like just another machine designed to make our lives easier. A glorified Roomba vacuum.

I think once you throw 'sentience' into the mix, though - the idea of a robot being able to think and have feelings, to recognize itself for what it is and where it came from - then the blanket idea of trusting one right off the bat, skin or no skin, goes out the window.

How much do you really 'trust' a human before getting to know them?

Maybe enough that they're not going to bludgeon you to death with a bowling pin then and there, but not enough to let them sit at home with your family while you run out for milk. And, after all, most of *them* have human features too.

If I have a heavy-duty metal robot staring at me with round, jet-black eye-holes (lifeless eyes, like a doll's eyes!) and this thing can think for itself, it's become another 'being' in my home and around my family, and the enemy I knew becomes the one I now don't.

I hate to even refer to the idea of the sentient robot as an enemy. I know that guys like Stephen Hawking and Elon Musk are cautioning against this, that it's a very bad idea, but do we really *know* that that'd be the case? Maybe they'll understand that we share the same rock, hurtling around a giant ongoing nuclear explosion at tens of thousands of miles an hour, and that we should work together to take care of it and figure out how to live away from it as well.

Maybe they'll help us?


Glenn Matchett, author of Love By Numbers:

I think that either way, even if they looked like us, we'd probably all treat robots relatively poorly. Humanity in general is never kind to itself, so I don't see why we'd be particularly trusting or caring toward something that a large number of people would consider an object.

The new AMC drama Humans made some great points about how we treat robots. In the show, robots were treated as servants and sex slaves and given any job a human found undesirable. However, the question of how human the robots were, and how much feeling they had, was a big theme of the show. Mostly, though, it showed that humanity treated very human-looking robots as servants barely worth their attention, as expensive toys, or as objects of hatred.

I think in my story, the robots I introduce are probably viewed similarly to how computers or televisions were initially. At first, if someone had one, it was a big deal, but over time they're just taken for granted. Even Back To The Future showed that the concept of someone having more than one television in 1955 was unbelievable. Even ten years ago, how many households could afford more than one computer? Now we have one on our damn phones.

I think my robots are still somewhat specialist but are becoming more commonplace. The main robot thinks of itself as more, but the object of its 'desire' sees it as no more than an appliance or, at best, a pet. In one scene she undresses in front of the robot, something we would feel comfortable doing in front of an appliance or a pet but certainly not something we'd do casually in front of another person.

I don't think it's really a question of trust; like an appliance or a pet, it's as if the robot isn't even there. If, perhaps, it had a human outer shell, then undressing in front of it would be a bit more off-putting. Again, I don't see this as a trust issue but more of a subconscious security thing that's in all of us. It's like the majority of us shutting the bathroom door when we're alone; we do it without thinking about it. How the robots in my universe are treated is much the same.

Alex Eckman-Lawn, author of Moon:

I tend to think we (humans) are pretty resentful of anything we can't control. There's also an entitlement that comes with perceived ownership. The more self-sufficient machines become, the more uncomfortable people are about them, even while we become more and more reliant on them. It's kind of the perfect storm of human insecurity, megalomania, and paranoia.

I think people are most comfortable when a "robot" is easily identifiable, and confined to a strict set of parameters. As soon as they start imitating us, we feel threatened. Then again, I feel threatened by almost everything, so maybe I'm the wrong guy to ask.

 

Nick Bennett is a Contributor to ComiConverse. Follow him on Twitter: @TheTVBuddy


