Mind you, I could be partially or entirely wrong. But that's why
   we're hashing out these ideas back and forth.

   We're creating a feedback loop, complete with NOR capabilities,
   between the two of us, as we absorb, reject, and reflect each
   other's ideas.

   Communication changes our own code. But we have something that's
   _not_ programmed into machines yet: emotional states. That lack
   is what makes machines scary.

   The mistake we make is ultimately Platonic in nature, but really
   Aristotelian: the excluded middle. Logic is not separate from
   emotion. Logic is pushed BY emotion and intertwined with it. It's
   in the way the biochemicals squirt around in our brains. We're
   also not separate beings from our environments.

   Machines with ambitions to take over Earth will have been
   programmed with a limited emotional set: perhaps that of Economic
   Theory, of "Pure Reason". 18th-century thinking, Adam Smith, etc.

   *That* type of AI would be frightening.

   But emotional logic programmed into AI wouldn't be frightening
   at all, unless the ability to cope with those emotional states
   weren't programmed in as well.