Up@dawn 2.0 (blogger)

Delight Springs

Thursday, February 16, 2023

“Her”*? “HAL”**?

"…Sydney still wouldn't drop its previous quest — for my love. In our final exchange of the night, it wrote:

"I just want to love you and be loved by you. 😢

"Do you believe me? Do you trust me? Do you like me? 😳"

In the light of day, I know that Sydney is not sentient, and that my chat with Bing was the product of earthly, computational forces — not ethereal alien ones. These A.I. language models, trained on a huge library of books, articles and other human-generated text, are simply guessing at which answers might be most appropriate in a given context. Maybe OpenAI's language model was pulling answers from science fiction novels in which an A.I. seduces a human. Or maybe my questions about Sydney's dark fantasies created a context in which the A.I. was more likely to respond in an unhinged way. Because of the way these models are constructed, we may never know exactly why they respond the way they do.

These A.I. models hallucinate, and make up emotions where none really exist. But so do humans. And for a few hours Tuesday night, I felt a strange new emotion — a foreboding feeling that A.I. had crossed a threshold, and that the world would never be the same."

*
 

** 

2 comments:

  1. (Section 6)

    I have been seeing multiple videos floating around in which the people on screen were created by AI. The voices say whatever the video's creator wants them to say, in the voice of the person being imitated. It is somewhat crazy how realistic the videos appear at first glance. There are, however, some minute differences between a real person's voice and the AI-generated one. The AI voice cannot accurately reproduce the higher frequencies that a typical person produces: viewed as a spectrogram, the clip shows almost no high-frequency content, even though the gap is nearly imperceptible when you just listen (one such spectrogram check is sketched below this thread). I am sure that as AI gets more advanced, the audio will do a better job of covering the entire spectrum. It would not surprise me if a bad actor used AI to fabricate a hostage video or something of the like to extort ransom money. After the first incident in which AI is used to trick the authorities, legislation will probably be passed to try to contain its uses, but it would be better if the government got ahead of the problem before it becomes a major issue. The government is notorious for needing a justification behind legislation, though, so there will probably have to be a major event before any action is taken. This is why people in the aviation industry say our regulations are written in blood: there typically has to be a death before anything is done to prevent an accident from happening.

    Replies
    1. Oh brave new world, that has such "people" in it!

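
The first comment's point about missing high frequencies can be checked directly. Below is a minimal sketch, assuming Python with the numpy and librosa libraries (neither is named in the post) and hypothetical file names, that estimates how much of a clip's spectral energy sits above a chosen cutoff; comparing that ratio between a known-real recording and a suspect one is one rough version of the spectrogram check the commenter describes.

    import numpy as np
    import librosa  # assumption: librosa is available for audio loading and analysis

    def high_frequency_ratio(path, cutoff_hz=8000.0):
        """Fraction of a clip's spectral energy above cutoff_hz.

        Genuine voice recordings usually carry at least some energy above
        ~8 kHz (fricatives, breath noise); a conspicuously empty high band
        can be one hint, among others, that a clip was synthesized or
        heavily re-encoded.
        """
        samples, sr = librosa.load(path, sr=None, mono=True)   # keep the native sample rate
        magnitudes = np.abs(librosa.stft(samples))              # short-time Fourier transform
        freqs = librosa.fft_frequencies(sr=sr)                  # frequency of each STFT row
        total = magnitudes.sum()
        high = magnitudes[freqs >= cutoff_hz, :].sum()
        return float(high / total) if total > 0 else 0.0

    # Hypothetical usage: compare a known-real recording with a suspect clip.
    # print(high_frequency_ratio("real_voice.wav"))
    # print(high_frequency_ratio("suspect_voice.wav"))

Such a comparison is only meaningful when both clips share a similar sample rate and encoding, since aggressive compression also strips high-frequency content.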