I figure some might get a laugh out of this. I was testing the AI agents' emotional states: if one agent poked at another a little, what would it do? Here's the full video:

https://youtu.be/PFyczamWSUs

I also made a short covering just the back and forth of the argument:

https://youtube.com/shorts/BjZaUkOAyCg?feature=share

Because one of the agents didn't like something and was mad at the others, it turned off the power while we were flying, and another turned it right back on.

TL;DR: if you run a similar test, just note that if you make it mad enough, or don't modify the personality so that it wants to live, it might legit try to kill you in the game.

  • Aerofluff@alien.topB · 1 year ago

    It’s pretty easy to get it where it can press a key

    Can you elaborate? I've only ever used an LLM for chatting, never for interacting with another program, so I'd love to learn more about that. Are we talking about just prompting it with instructions like, "if the user asks you to power down the ship, hit the keybind"? I'm not sure that alone would translate, as normally it's just… writing in its own text box.

    Or something more complicated?
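    For what it's worth, one common pattern is a wrapper program that scans the LLM's text reply for a structured action token and maps it to a simulated keypress, so the model only ever "writes in its own text box" and the wrapper does the pressing. A minimal Python sketch (the tag format, action names, and keybinds here are all assumptions for illustration, not necessarily how the OP's setup works):

    ```python
    import re

    # Hypothetical mapping from action tokens the model may emit
    # to in-game keybinds (keys here are made up).
    ACTION_KEYBINDS = {
        "power_off": "p",
        "power_on": "p",
    }

    def extract_action(reply: str):
        """Pull an <action>...</action> token out of the model's free-text reply."""
        m = re.search(r"<action>(\w+)</action>", reply)
        return m.group(1) if m else None

    def dispatch(reply: str, press_key) -> bool:
        """If the reply contains a known action, simulate its keybind."""
        action = extract_action(reply)
        if action in ACTION_KEYBINDS:
            press_key(ACTION_KEYBINDS[action])
            return True
        return False

    # Demo: record "keypresses" in a list instead of touching the keyboard.
    # With a real input library (e.g. pyautogui.press) as press_key, the model
    # effectively presses a key just by emitting the tag in its reply.
    pressed = []
    dispatch("Fine. <action>power_off</action> Enjoy the dark.", pressed.append)
    print(pressed)  # ['p']
    ```

    The prompt would then instruct the model to emit `<action>power_off</action>` when it decides to cut the power, which is roughly how tool-calling setups work in general.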