AI girlfriend bots are already flooding OpenAI’s GPT store
OpenAI’s store rules are already being broken, illustrating that regulating GPTs could be hard

  • tias@discuss.tchncs.de · 6 months ago

    Vector databases are relatively good at this kind of thing, because they can find records based on queries that are semantically close rather than just lexically matching. It would probably still make sense to split the information up into fragments such as “Interstellar movie,” “watched on February 2nd, 2021,” “made me vomit,” and then connect those records to each other. GPTs are good at that kind of preprocessing. The idea would not be to store exact data such as timestamps (that’s not how vector databases work anyway), so recall would be more associative, just like for humans (I can’t ask you what movie you watched on February 2nd, 2021 and expect an accurate reply either).
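
    Roughly what I mean, as a toy sketch: embed() here is a made-up stand-in for whatever real embedding model you’d plug in, and the fragment layout is just illustrative, not any particular vector database’s API.

      import numpy as np

      def embed(text: str) -> np.ndarray:
          # Stand-in only: a real system would call an actual embedding model here.
          rng = np.random.default_rng(abs(hash(text)) % (2**32))
          v = rng.normal(size=384)
          return v / np.linalg.norm(v)

      # Each memory is split into linked fragments instead of one exact record.
      fragments = [
          {"id": 1, "text": "Interstellar movie", "links": [2, 3]},
          {"id": 2, "text": "watched on February 2nd, 2021", "links": [1]},
          {"id": 3, "text": "made me vomit", "links": [1]},
      ]
      vectors = {f["id"]: embed(f["text"]) for f in fragments}

      def recall(query: str, top_k: int = 2):
          # Rank fragments by cosine similarity to the query embedding.
          q = embed(query)
          ranked = sorted(fragments, key=lambda f: -float(q @ vectors[f["id"]]))
          return ranked[:top_k]

      # Associative recall: the query doesn't have to match the stored wording.
      for hit in recall("that space film I mentioned"):
          print(hit["text"], "->", hit["links"])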

    • ExLisper@linux.community · 6 months ago

      But you would need something like multiple steps of preprocessing, expanding the search depth at each step, and you would need it in both directions: when recollecting memories and when changing them. For example, if I say:

      • Remember when I told you I’ve seen Interstellar last year?
      • AI: Yes, you said it made you vomit.
      • I lied. It was great.

      So you process the first input and find the relevant info in the ‘memory’, but for the second one you have to recognize that it refers to the same memory, understand that memory, and alter it or append to it. It would get complicated really fast. We would need some kind of AI memory management system just to manage the memory for the AI. I’m sure it’s technically possible, but I think it will take another breakthrough and we won’t see it soon.
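
      Something like this hypothetical loop (all the names are mine, and the helpers are naive stubs: keyword overlap instead of a vector search, a string append instead of an LLM rewriting the record), just to show the shape of the problem:

        memory = ["Interstellar, watched last year, made me vomit"]

        def find_related(text):
            # Naive stand-in for a vector similarity search: keyword overlap.
            words = set(text.lower().split())
            hits = [i for i, m in enumerate(memory) if words & set(m.lower().split())]
            return hits[0] if hits else None

        def handle_turn(text, previous_hit):
            hit = find_related(text)
            if hit is None and previous_hit is not None:
                # No direct match, so assume it refers to the memory we were
                # just discussing and append the correction to that record.
                memory[previous_hit] += " | correction: " + text
                return previous_hit
            return hit

        last = handle_turn("Remember when I told you I've seen Interstellar last year?", None)
        handle_turn("I lied. It was great.", last)
        print(memory)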

      • tias@discuss.tchncs.de · 6 months ago

        Again, you will certainly hit limitations if you push it, but the example you give would work fine if you just append the added information to the database. A query for Interstellar would return both your original statements and the fact that you later said you lied about it, and all of those records would be inserted into the GPT’s context (short-term memory) when discussing that subject.
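
        A toy illustration of that append-only version (the record texts are made up, and keyword overlap stands in for the vector search):

          def words(s):
              # Crude tokenization; a real system would compare embeddings instead.
              return {w.strip(".,:?!").lower() for w in s.split()}

          memory = [
              "User said Interstellar made them vomit.",
              "User later said they lied: Interstellar was actually great.",
          ]

          def build_context(query):
              # Pull every record that looks related and hand all of them to the model.
              hits = [m for m in memory if words(query) & words(m)]
              return "Relevant memories:\n" + "\n".join("- " + m for m in hits)

          # Both the original claim and the later correction land in the context,
          # so the model can reconcile them in conversation.
          print(build_context("What did I think of Interstellar?"))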