Hi folks, unfortunately this week is going to be a bit of a short one, and a little light on the content, as I’m going on vacation on Thursday and trying to wrap up a couple of things before I leave, including a series of documents at work and the first draft of my novel so that I can get it to beta readers. Apologies for these being more “bite-sized” versions of my thoughts. I expect a lot more next week, with my batteries recharged.
Confession time: until this past Friday night, I had never seen an episode of Star Trek: The Next Generation. I know, right? What kind of self-respecting geek could I possibly be? The truth is that the concept never really appealed to me. I mean, okay, I loved the original series, but not for the reasons that most Trekkers liked it. I just thought it was hilariously hokey, and I didn't see how I could get that same sort of effect from a show like The Next Generation. So I remained a TNG virgin until Friday, when we were testing the setup of another completely useless television gadget and watched an episode at the fiancée's insistence.
She picked a great episode to break me in, "Measure of a Man," in which the question of the android Data's essential humanity is put to the test in a trial. The quality of the writing in the episode just bowled me over, to go with some pretty great acting. Certainly not what I was expecting. The questions dealt with in the episode actually reminded me of some of the quandaries that we tackled in my college class on Postmodernism and Technology (basically, the Cyberpunk Class). They boiled down to the question of what makes us essentially human, and whether a replication of humanity in an artificial intelligence would make that intelligence "human".
It’s a question that I actually think about quite a bit in terms of not just artificial intelligence but also animals – and the concept of “other” intelligences, such as hypothetical angels or other supernatural beings. How much would we have in common? Which of their actions would seem human to us? Would there be an essential alien quality to their behavior, and if not, why not? It informs some of the decisions that I made with characters in Corridors of the Dead and with some future ideas coming down the pike.
But I think the biggest question, for me, is that of the moral framework of such characters. How much of morality is innate to humanity, how much of it is culture, and how much could an outsider learn to effectively mimic it? Those are the kinds of questions that I like to explore in my works, and while I’m not sure if I effectively answer them, sometimes it’s just important to pose them.
The reason I bring all of this up is to suggest one way to add personal emotion and stakes to a story: look at some existential questions that resonate with you, and attempt to explore them within the context of your story.