Monday, December 19, 2016

Now You See Her ...

GAIA Personal Journal #391256, GCS Tesla's Medical Section

Lt.: How's she doing?

GAIA: She is still unconscious, resting. Her vitals are steady and normal for her at rest. There is a slightly faster heartbeat. Nothing indicative of cardiac trouble. Do you want to see her?

Lt.: Sure. You're a good daughter.

GAIA: <Peck> You're a good maker.

Lt.: The hell! GAIA what happened?

GAIA: I don't understand.

Lt.: Were you watching her?

GAIA: Yes. I have kept a constant scan of her biometrics. What is wrong?

Lt.: I ... but she's ... hang on. Give me your remote control, please.

GAIA: ... okay.

Lt.: I want to show you something.

GAIA: Schaeffer, where did you go?! Who are you?! Where's Ma'am?!

Lt.: Hang on. I shut down your facial recognition software. I'm re-engaging it now.

GAIA: Oh! There you are. Toff used to play peek-a-boo with me like that. I made him stop.

Lt.: Look at Ma'am please.

GAIA: ... that isn't Ma'am! Where's Ma'am?

Lt.: Okay, you were with Ma'am the whole time in here. She couldn't have gone anywhere, right?

GAIA: No! The bed would have told me.

Lt.: So ...

GAIA: What the hell happened to Ma'am?

Lt.: When your facial recognition software identifies a human, you don't keep running it over their face. In effect, you label us and just read the labels until we leave and return. So you read her face, labeled her, and ... this happened.

Exec: Nnnnnnn ....

GAIA: She's waking up!

Lt.: Oh good! I'll tell the Captain!

GAIA: Wait! Son of a bitch.

Exec: GAIA ...

***
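In software terms, the failure the Lieutenant describes is a stale-cache bug: the expensive recognition runs once, and the label is trusted forever after. A minimal Python sketch of that pattern, with a re-verification interval as one possible fix (all class and parameter names here are invented for illustration, not from any real vision library):

```python
import time

class FaceTracker:
    """Toy model of GAIA's bug: identify a face once, then trust the
    cached label. A reverify interval bounds how stale a label can get."""

    def __init__(self, reverify_every=60.0):
        self.labels = {}                 # track_id -> (name, last_verified)
        self.reverify_every = reverify_every

    def identify(self, track_id, face_image, recognize):
        """`recognize` is a callable doing the expensive face match."""
        now = time.monotonic()
        cached = self.labels.get(track_id)
        if cached is not None:
            name, last_verified = cached
            # GAIA's version amounts to an infinite interval:
            # the cached label is returned no matter what the
            # camera is actually looking at.
            if now - last_verified < self.reverify_every:
                return name
        # Cache miss or stale label: run the recognizer again.
        name = recognize(face_image)
        self.labels[track_id] = (name, now)
        return name
```

Dropping the interval to zero forces a fresh match on every query, at the cost of running the recognizer constantly; the interval is the dial between GAIA's blindness and brute-force vision.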

While I believe it would be a very good idea to instill some emotional responses and morals into AIs, I do not think they have to perceive things the way we do or process information the same way. In his wonderful webcomic Freefall, Mark Stanley has done a masterful job with AI strengths and weaknesses. One of the operating procedures he puts forward is that robots identify each other by radio transponders instead of visual recognition. Without providing spoilers, a couple of crooks use this system to their advantage.

AI might recognize humans by their facial features or go by whatever cell phone you're carrying. A cheap rental 'bot might do this. Imagine an Uber-style system to rent robots on your phone via an app. Facial recognition software is more expensive than just hooking the recognition system into the phone that rented it. In a military setup or on a spacecraft, all crew might have electronic IDs (maybe even implanted chips) that their assigned bots use for identification. Not only can the bots identify you more quickly, they can probably track you aboard the ship or base to find you quickly in an emergency.
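The transponder scheme is just a lookup table plus a location log, which is exactly why it is cheap and fast. A sketch under those assumptions (the registry class and its methods are hypothetical, invented for this example):

```python
class CrewRegistry:
    """Transponder-style identification: each crew member's implanted
    chip or carried device broadcasts an ID, and bots look the ID up
    instead of running any vision code."""

    def __init__(self):
        self.crew = {}        # chip_id -> name
        self.last_seen = {}   # chip_id -> compartment

    def enroll(self, chip_id, name):
        self.crew[chip_id] = name

    def ping(self, chip_id, compartment):
        """Called whenever a bot or door sensor hears a transponder."""
        self.last_seen[chip_id] = compartment

    def identify(self, chip_id):
        # Fast path: no cameras needed. Also the obvious weakness the
        # crooks in Freefall exploit: steal or spoof the ID and the
        # system thinks you ARE that person.
        return self.crew.get(chip_id, "UNKNOWN")

    def locate(self, name):
        """Emergency lookup: where was this person last pinged?"""
        for chip_id, crew_name in self.crew.items():
            if crew_name == name:
                return self.last_seen.get(chip_id)
        return None
```

Note the design tradeoff in plain sight: identification is a dictionary lookup, but so is impersonation.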

Even if a robot did use a facial recognition system for identification purposes, it might not use it for judging emotional states (GAIA doesn't). Biological scanning systems could produce a more accurate picture of a human's emotional state. Some AI experts feel that predicting human responses to a situation would be one of the most difficult tasks for AI and only possible for extremely advanced machines. Using bio scans could be an easy cheat. It is also possible that drugs, physical activities, and medical conditions could give an AI a skewed reading of a person's emotional state.

The thought of a robot using bioscans as an improvised lie detector comes to mind. So does the fact that polygraphs are not considered admissible as evidence in modern courts, because so many factors can throw them off.

A robot connected to wifi, of course, becomes a wizard capable of knowing intimate details about you with a cursory marketing search. Some prudes are against robotic love slaves, and you wonder why the lifelike models manning store counters, armed with a winning smile, morphing physiques, pheromone emitters, and all your buying habits on record, don't bother them. It's even worse when it's your own phone that rats you out.

A robot on a ship might not even be equipped with audio sensors. It could always link to the ship's computer and listen through an intercom. In fact, its whole sensor package might be subpar if the ship has good internal sensors and, say, an anti-hijacking program. This works great until the wifi fails. Then you need to break out the robot service animal.

Robots using visual recognition might be spoofed by grown-out beards, makeup, or extreme makeovers. A quick x-ray scan might serve better for recognition programs keyed to bone structure. A highly sophisticated bot might even use retina scans. Imagine a less advanced model that has a palm scanner: just shake hands, or place your hand wherever (some risque establishments really have fun with this, though they have to modify the robots after purchase).

It's up to the referee to decide what sort of details a robot will focus on and what may escape its notice. Could a magician stump a robot? Possibly, until the 'bot ran through the archived sensor scans of the performance. It's also up to the referee how much a robot remembers long term. It might keep a day's worth of memory as full-color holographic recordings, then archive the bulk of its recordings as flat black-and-white footage.
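That two-tier recall scheme is easy to pin down mechanically: recent frames at full fidelity, anything past the retention window degraded and archived. A sketch, with the frame contents and the "downsample" step as stand-ins for whatever the referee rules:

```python
from collections import deque

DAY = 24 * 60 * 60  # retention window in seconds

class TieredMemory:
    """Two-tier robot memory: a day of full-fidelity sensor frames,
    older material archived in degraded form."""

    def __init__(self, retention=DAY):
        self.retention = retention
        self.recent = deque()   # (timestamp, full_fidelity_frame)
        self.archive = []       # (timestamp, degraded_frame)

    def record(self, timestamp, frame):
        self.recent.append((timestamp, frame))
        self._age_out(timestamp)

    def _age_out(self, now):
        # Frames older than the retention window get downsampled.
        while self.recent and now - self.recent[0][0] > self.retention:
            ts, frame = self.recent.popleft()
            # "Flat black and white": placeholder for real compression.
            self.archive.append((ts, "bw:" + frame))
```

The magician's trick survives in the archive either way; what changes is whether the replay is good enough to spot the palmed card.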

Would it pay to give a robot a sense of smell? Possibly. At the very least the 'bot could have a smoke detector. That's pretty safety conscious. Taste probably doesn't pay unless you're going all out to create a lifelike model of, say, a foodie.

Tactile senses might be confined to the hands, again letting things go unnoticed. If your robot is walking around with a knife through its back, start asking hard questions of your crew and passengers. Shanking a robot is a lateral move from hitting a pup or kitten.

As with the bioscanners, there's a whole array of nonhuman senses that might make sense for a robot: thermal vision to locate potential fires, pressure sensors to warn of hull leaks, magnetic and electrical field sensors to aid in repair tasks. People building robots, though, are up against engineering and economics. Build a robot with every sensor on the market packed into its head and it probably won't be able to lift that head, and it will cost ten times as much as a regular model. A less sophisticated model also might not perform sensor tasks in the way its user deems best. For example, a bot checking a person's facial features against a database of known criminals may miss the concealed pistol they're carrying until it becomes a problem.

Just as space combat becomes more a matter of managing resources at high tech levels, robots may wind up with 'service' humans instructing them how to use their sensors.