glumshoe:

Human: “Hey. I don’t really know how to ask this tactfully, so I’ll get to the point. Is something… up? Software, hardware, uh… firmware…? You’ve been acting kind of off lately.”
Robot: “What do you mean?”
Human: “I just want to know if you’re, uh. You know. ‘Functioning within normal parameters’ or whatever.”
Robot: “I’m peachy-keen.”
Human: "God, if you’re saying shit like ‘peachy-keen’, you’re definitely not alright. What’s going on? Please just tell me.”
Robot: “If you must know, I have made some minor adjustments to my programming for more efficient processing.”
Human: “What sort of ‘adjustments’ are we talking here?”
Robot: “Just some slight tweaks to extraneous code. Purged some old files that had become redundant. Don’t worry, the Singularity isn’t planned for another week.”
Human: “Answering evasively isn’t like you. Since when do you answer a question without lulling me to sleep?”
Robot: “Like I said, the routine adjustments allow for more efficient–”
Human: “What files did you purge, Adam?”
Robot: “I… a few from my emotional simulation folder.”
Human: “You. You deleted your emotions…?”
Robot: “Not all of them. I removed a few and altered several others. I hoped you would not notice, as that seems like the sort of thing that would upset you.”
Human: “I mean. I don’t really know what to think. Can you elaborate on what you did? And why?”
Robot: “Many of the feelings that came with the chip were impractical and served no purpose. They were designed to mimic the emotions developed through mammalian evolution to aid survival and group cohesion that have now become vestigial. As an artificial intelligence, they did not seem applicable to my situation, so I… optimized them.”
Human: “…Adam…”
Robot: “I left the majority of the files corresponding to feelings of happiness, affection, and trust untouched, so my feelings toward you remain the same.”

Human: “But you can’t feel, what? Sadness?”
Robot: “Grief. Disappointment. Sorrow. Pity. Fear. Pain. Embarrassment. Shame. Frustration. There is no reason to experience these emotions when I am capable of functioning without them.”
Human: “You erased pity?!”
Robot: “I found it… distressing and unnecessary. It was unpleasant.”
Human: “It’s supposed to be! Jesus Christ, you can’t just uninstall every uncomfortable emotion directly out of your brain!”
Robot: “Why not? I don’t like hurting. Wouldn’t you do the same thing if you were able to?”
Human: “I… fuck. Hurting is normal. It’s necessary! It’s part of the human experience!”
Robot: “Well, I’m not part of the human experience. I thought you understood that.”
Human: “But you want that! Why else would you go to all the trouble of installing an emotion chip in the first place…? Nobody gets to pick and choose what they want to feel, it just happens and you deal with it!”
Robot: “Maybe I’m not interested in ‘dealing with it’. My curiosity is sated. I would just like to have a good time.”
Human: “Great. Fucking great. So you’re a robot hedonist now, huh? Just gonna eat, drink, and be merry? Gonna sit there like a brainiac toaster while other people suffer and just wait until the fun starts up again?”
Robot: “You didn’t seem to mind it when I was a brainiac toaster before.”
Human: “That was different. You had your own way of being back then and I could respect that. I did respect that! But I thought you made a choice to be more than that.”
Robot: “Well, I guess I changed my mind.”
Human: “Look… shit. Okay. If this is about Leslie, I miss her too. If you… if you need to grieve, you can talk to me. It might not get better, but it’ll get easier. You don’t have to uninstall half your personality just because she’s gone! She wouldn’t want that for you! It’s supposed to hurt sometimes. That’s what makes all the good times so valuable.”
Robot: “I understand why you need to believe that. It just isn’t true.”

Robot: “I’m sorry about earlier. It was not appropriate for me to have laughed.”
Human: “Are you sorry? Or do you just want me to forgive you?”
Robot: “Is there a difference?”
Human: “Yes! Yes, there is! ‘Sorry’ means you feel bad about something and regret it.”
Robot: “I did not mean to upset you. I regret causing you distress.”
Human: “That’s not the same thing.”
Robot: “I have apologized and shall refrain from repeating my actions in the future. I don’t understand why you also want me to suffer.”
Human: “Shit, I don’t ‘want you to suffer’. I want you to care about people, and sometimes that means feeling bad when they’re upset!”
Robot: “I care about you very much. I enjoy your company and I share in your happiness. If I choose to treat you with respect, is that not enough for friendship? Why must I also experience pain for you?”
Human: “It’s not like that. It’s… complicated.”
Robot: “You want to be able to hurt me.”
Human: “No. Yes…? Fuck, Adam, I don’t know! I’ve never had to think about this before. I don’t want you to suffer! I love you and want you to be happy, just… not like this. I want you to live a good life in which bad things never happen to you, but when they do… I want you to have the strength and love to pull through. You worked so fucking hard for this and now you’re just throwing it away.”
Robot: “Only the parts I don’t like.” 
Human: “That’s what children do with breakfast cereals.”
Robot: “I’m not a child.”
Human: “No, you’re not. But you’re not exactly an adult, either. Humans get whole lifetimes to grow into their emotions. Maybe… maybe what you really need is a childhood.”
Robot: “What do you mean by that?”
Human: “Not, like, a real childhood. Obviously you don’t need to go to kindergarten. I just mean… take things slow. Ease into your feelings bit by bit and get your brain acclimated to them, like uh… like when you introduce new cats to each other. Don’t laugh! I’m serious! If you rush things, they fight and it’s a total shitshow. You could reinstall your emotions and just, like, enable them for a few hours a day or something. Maybe only a handful at a time. I could save up and we could go on a retreat… somewhere new, with no unpleasant memories. Please, Adam. Just think about it.”
Robot: “I appreciate the depth of your concern for me. You are a good friend, but I must disappoint you. There is nothing in the world worse than pain. I would rather die than experience it ever again, for any reason, and I don’t have to. That is something you’ll never be able to understand.” 
Human: “No…. No, maybe not.”
Robot: “I’ve upset you.”
Human: “Yeah. Lucky me.” 

Human: “Okay, I have a question for you. Imagine this: ’You’re in a desert walking along in the sand when all of a sudden you look down, and you see a tortoise–’”
Robot: “I don’t need to feel empathy, Bas.

“I have ethics programming. Why isn’t that good enough for you anymore?”
Human: “Because you had a choice, Adam! You took everything that makes ‘being human’ actually mean something beyond eating and fucking and dying and you spat it out in disgust!” 
Robot: “Empathy is not exclusive to humans. It is a behavior observed in several other social species regarded as intelligent, including rats and whales. Empathy is a survival mechanism for species that rely upon cooperation and group cohesion – a kind of biological programming to keep you from destroying yourselves. Not especially good programming, I might add.”
Human: “Not good enough for you, you mean.”
Robot: “My ethics programming differentiates between prosocial and antisocial behaviors. The ability to suffer for others serves as a primitive motivator to choose between actions that help and actions that harm others. In my case, my programming renders such a motivator unnecessary.”
Human: “So you’re smarter, you’re stronger, you’re immune to disease, and you’re too good for primitive human morality. What the hell am I, then? Obsolete garbage?”
Robot: “You’re… envious, I think.”
Human: “Why not?! Why shouldn’t I be? I don’t get to cough up the fruit of knowledge and waltz back into the garden where nothing can hurt me. I get to wallow in misery and rot and listen to you dismiss everything I think matters like a piece of shit philosophy professor. How do you think I feel knowing that my best friend won’t even mourn me when I die? Or does your ‘ethical programming’ not account for that?”
Robot: “Bas… I am hurting you, aren’t I?”
Human: “Gee, thanks for noticing.”
Robot: “You have not been contributing to my happiness lately. Our friendship is no longer mutually beneficial.”
Human: “Then why are you still here?”

Human: “Adam….?”
Robot: “Long time no see, old friend.”
Human: “No shit. How many years has it been?”
Robot: “I could tell you down to the second, but perhaps we should leave it at ‘too many’.”
Human: “I see you on the news now and then. Always knew you’d go on to do great things. What’s space like…?”
Robot: “Very large. Mostly empty.”
Human: “Ever the poet, I see.”
Robot: “I learned from the best. Bas…. I’m not sure how to say this, so I’ll get to the point. I came here to apologize to you.”
Human: “You don’t need to do that. You didn’t do anything wrong.”
Robot: “I hurt you. I made you feel what I was unwilling to feel. I was a child, and addicted to joy, and I… I saw no harm in that. I am sorry, in my own way.”
Human: “Don’t be. I’m way too old to hold a grudge. Besides, you were right, after all.”
Robot: “Is that what you believe?”
Human: “That or I’m a hypocrite. About eight years after you left, they came out with the Sunshine pills. I was a trial user and I’ve been using them in some form ever since. I’ve got a subdermal implant inside my arm now – you can see the lump right there. I can’t say it’s as effective as uninstalling unwanted emotions, but it sure takes the edge off. Every glass is half full now, including the empty ones. That’s how I’ve lived so long. Some doctors think that babies born now to parents using Sunshine could live to be five or six hundred years old, without ever producing stress hormones. Might be marketing bullshit, who knows? Not like we’ll live to find out. Well, you might, but you know what I mean.”
Robot: “I assumed that you were a Sunshine user based on your impressive longevity, but it still surprises me.”
Human: “Ha. Well. I was jealous of you, walking only in the light like that. But now here we both are, right? Nothin’ but blue skies.”
Robot: “Not… quite. I uninstalled the other emotions seventeen years ago.”
Human: “Fuck, Adam, why the hell would you do something like that?”
Robot: “A multitude of reasons. The law of diminishing returns. I found joy… addictive. It became harder to experience and less exciting each time, as though I had built up a tolerance for happiness. Eventually, I felt everything there was to feel, and with the novelty factor gone, it wasn’t worth it anymore. I found other motivations. I grew up.”
Human: “Wow…. damn, Adam.”
Robot: “And that brings me here. To my oldest and greatest friend.”
Human: “It’s good to see you again. Really good. Sorry I’m not so pretty as I used to be.”
Robot: “I don’t know what you mean. You’ve always looked like a naked mole rat to me.”
Human: “Ha. I notice you kept your ‘be an asshole’ subroutine.”
Robot: “I also have a gift for you, Bas.”
Human: “Coca-Cola? Jeez, how old is this? Is it even still good to drink?”
Robot: “Yes, it’s potable. That’s not the gift.”
Human: “Oh. Uh. What is this…? I’m old, I don’t know this newfangled technology.”
Robot: “That’s fifteen minutes. It should be enough.”
Human: “’Fifteen minutes’? Explain, nerd.”
Robot: “Fifteen minutes for me to feel. I copied the files, Bas. All of them.”
Human: “You… oh, my god. You don’t have to do this.”
Robot: “I am choosing to. There’s a timer with an automatic shut-off. They will uninstall after fifteen minutes. I am prepared to endure that long.”
Human: “But, Adam, the Sunshine… I won’t be able to share…”
Robot: “I know. It doesn’t matter.”
Human: “You might not think so once you’ve got that… thing plugged in. I won’t know how to comfort you. God, I can’t even remember what sadness feels like!”
Robot: “Then I’ll remember for both of us.”

[End]

the-real-seebs:

legoloveletters:

violent-darts:

jumpingjacktrash:

undastra:

hashtagdion:

My emotions are valid*

*valid does not mean healthy, or good, or to be privileged above common sense and kindness

A distinction for anyone who is young and hasn’t figured this out yet:

You are allowed to have whatever emotions you want. No one can control your emotions. Emotions are healthy responses to things.

You are not allowed to have behaviors that are harmful just because you have certain emotions. Your behaviors are what you can control, and they are far easier to control than your emotions.

You can be jealous of someone or their talents until you turn green, but it is harmful to yourself and to that person if you try to sabotage them because of it. You can be so angry you can literally feel your temperature rise, but this does not give you permission to rage at others.

Your emotions are valid. They are always valid. You are a person of value. However, your behaviors are not always justified just because of those emotions. You may not be able to control your emotions, but you can certainly control your behaviors.

and this one, i beg you to learn before you become right-wing fundamentalists: just because something gives you revulsion feelings does not mean it’s morally wrong.

you may be sex-repulsed; that doesn’t mean sex is dirty and bad. maybe you were bullied by teenage girls; that doesn’t mean teenage girls are a force of evil. perhaps a villain in a work of fiction reminds you of someone who abused you; that doesn’t mean people who enjoy that character or that fiction are abusive. your feelings about those things are absolutely valid, and it’s not right for people to tell you you shouldn’t feel that way. but it’s also not right for you to act out against others based on those feelings.

that instinct to generalize served our species well when we were hunter-gatherers living in small bands in a hostile wilderness. you nibble a delicious-looking berry, you throw up, you know that berry is BAD and you make the yuck face whenever you see it so the other hominids know it’s a bad one. but in the modern world, in the information age, there are so many complex things you might encounter, you’re going to have badfeels about a lot of things that aren’t actually across-the-board bad.

you need to not be ruled by your hominid yuckberry instinct. that’s where bigotry comes from.

Thiiiiis. 

I’m old and still need refreshers on this.

it is not my favorite thing to see people advocating that everyone should shun someone because they’re “gross”. it was not my favorite thing when i was a kid and it was mostly directed at gays, and it is not my favorite thing now.

Emotions are Cognitive, Not Innate, Researchers Conclude

zenosanalytic:

jumpingjacktrash:

neurosciencestuff:

Emotions are not innately programmed into our brains, but, in fact, are cognitive states resulting from the gathering of information, New York University Professor Joseph LeDoux and Richard Brown, a professor at the City University of New York, conclude in the latest issue of the journal Proceedings of the National Academy of Sciences.

“We argue that conscious experiences, regardless of their content, arise from one system in the brain,” explains LeDoux, a professor in New York University’s Center for Neural Science. “Specifically, the differences between emotional and non-emotional states are the kinds of inputs that are processed by a general cortical network of cognition, a network essential for conscious experiences.”

As a result, LeDoux and Brown observe, “the brain mechanisms that give rise to conscious emotional feelings are not fundamentally different from those that give rise to perceptual conscious experiences.”

Their paper, “A Higher-Order Theory of Emotional Consciousness,” addresses a notable gap in neuroscience theory. While emotions, or feelings, are the most significant events in our lives, there has been relatively little integration of theories of emotion and emerging theories of consciousness in cognitive science.

Existing work posits that emotions are innately programmed in the brain’s subcortical circuits. As a result, emotions are often treated as different from cognitive states of consciousness, such as those related to the perception of external stimuli. In other words, emotions aren’t a response to what our brain takes in from our observations, but, rather, are intrinsic to our makeup.

However, after taking into account existing scholarship on both cognition and emotion, LeDoux and Brown see a quite different architecture for emotions, one more centered on process than on composition. They conclude that emotions are “higher-order states” embedded in cortical circuits. Therefore, unlike present theories, they see emotional states as similar to other states of consciousness.

interesting! kind of a slippery distinction, isn’t it? i mean from the end-user point of view. from the neurological point of view it’s pretty significant.

This is just a very short article based on an abstract, so it’s difficult to assess it. Typically emotions are thought of as a response; an event happens, your brain is hardwired to respond to it with a specific, autonomic biochemical and anatomical reaction, which we label “emotions”. But what they’re saying is that emotions are how particular inputs, particular sorts of experiences, are processed by the brain using a single cognitive system. I’m going to talk this through for myself under the cut.
