So, real quick: I’ve been exploring local LLMs for a bit. In this video I get into what I think is the future for LLMs, but in a nutshell I think Microsoft will eventually push out a local LLM to machines to cut down on a lot of resources and cost. In doing so, it will likely become possible for developers to tap into that local LLM for their games.

The worries I’ve seen brought up are:

  1. Spoilers - As mentioned in the video, it is currently possible, and should always be possible, to solve this by controlling what gets sent to the LLM. The LLM can’t talk about what it doesn’t know (see the sketch after this list).
  2. The NPC talks about stuff it shouldn’t - fine-tuning solves this problem to an extreme degree. The better you prep the model, the less likely it is to go off script, and how you code your end matters even more.
  3. Story lines shouldn’t be dynamic - the answer is simple: don’t use the LLM for those lines or for the NPCs that deliver them.
  4. Cost - Assuming I’m right that Microsoft and others will add a local LLM, the local part removes this problem.
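
For the first three points, here’s a minimal sketch (my own, not from the video; the names and the `run_local_llm` stub are made up) of how a game could keep an LLM-driven NPC in its lane: the prompt only ever contains lore the NPC is allowed to know, and key story lines bypass the model entirely.

```python
from dataclasses import dataclass, field

def run_local_llm(prompt: str) -> str:
    """Stand-in for whatever local-model API the OS eventually exposes."""
    return "(model reply would go here)"

@dataclass
class NPC:
    name: str
    persona: str                                                   # system-prompt style description
    known_lore: list[str] = field(default_factory=list)            # only facts this NPC may reveal
    scripted_lines: dict[str, str] = field(default_factory=dict)   # fixed, authored story dialogue

def build_prompt(npc: NPC, player_input: str) -> str:
    """Point 1: spoiler content simply never appears in known_lore, so the model can't reveal it."""
    lore = "\n".join(f"- {fact}" for fact in npc.known_lore)
    return (
        f"You are {npc.name}. {npc.persona}\n"
        f"You only know these facts:\n{lore}\n"
        f"If asked about anything else, say you don't know.\n"
        f"Player: {player_input}\n{npc.name}:"
    )

def npc_reply(npc: NPC, player_input: str, story_beat: str | None = None) -> str:
    # Point 3: key story lines never go through the LLM at all.
    if story_beat is not None and story_beat in npc.scripted_lines:
        return npc.scripted_lines[story_beat]
    return run_local_llm(build_prompt(npc, player_input))
```

The point is that spoilers never reach the model in the first place, so there is nothing for it to leak.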

https://www.youtube.com/watch?v=N31x4qHBsNM

It is also possible to have a given NPC show different emotions, and to direct those emotions, as shown here where I tested it with anger (a rough sketch of one way to wire this up follows the link).

https://www.youtube.com/shorts/5mPjOLT7H-Q
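
As a hypothetical sketch (this is not the exact setup used in the short), one way to direct emotion is to pass a target emotion into the prompt and ask the model to prefix its reply with an emotion tag that the engine maps to an animation or voice preset. Here `parse_reply` runs on a hard-coded string standing in for the model output.

```python
import re

def emotion_prompt(npc_name: str, persona: str, emotion: str, player_input: str) -> str:
    """Ask the model to act out a directed emotion and tag its reply."""
    return (
        f"You are {npc_name}. {persona}\n"
        f"You are currently feeling {emotion}. Stay in character, and start your "
        f"reply with the emotion in square brackets, e.g. [angry].\n"
        f"Player: {player_input}\n{npc_name}:"
    )

def parse_reply(raw: str, fallback_emotion: str) -> tuple[str, str]:
    """Split '[angry] Get out!' into a tag (for animation / voice presets) and the spoken line."""
    match = re.match(r"\s*\[(\w+)\]\s*(.*)", raw, re.DOTALL)
    if match:
        return match.group(1).lower(), match.group(2)
    return fallback_emotion, raw  # model ignored the format: fall back to the directed emotion

# A hard-coded string stands in for the model's output here:
tag, line = parse_reply("[angry] I told you to stay away from my stall!", "anger")
print(tag, "->", line)   # angry -> I told you to stay away from my stall!
```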

  • rainbowkarin@alien.top

    I always felt like Animal Crossing would be a perfect candidate for LLMs. A lot of the appeal of the earlier games was its sheer quantity of charming dialogue that also changed depending on the time of day or current events. You could also write letters to the villagers, but they always replied with something vague.

    So having LLMs generate endless, creative dialogue and engaging response letters could probably breathe new life into the older games.

    Also, off-topic: I’m surprised I haven’t seen any AI-based world generators trained on fantasy heightmaps.

  • JustOneAvailableName@alien.top

    but in a nutshell I think Microsoft will eventually push out a local LLM to machines to cut down on a lot of resources and cost

    Not their valuable LLMs. You don’t want to give those weights away.

  • involviert@alien.top

    I don’t agree with the assumption that there is pressure for companies like MS to reduce costs via local models. Compute on that gamer’s PC is probably the biggest problem right now, especially since in a game pretty much all of the hardware is already being used to the limit. And then you throw a 10GB LLM on top, maybe even loading different finetunes for different jobs? Then the TTS model? This does not result in reasonable response times any time soon, at least not with somewhat generalistic models.

    On the other hand, that’s something MS must like a whole lot. What you see as “optimizing costs” is optimizing their profit away. They can sell that compute to you. That’s great for them, not something to be optimized away. And it’s the best DRM ever too.

    • crua9@alien.top (OP)

      Microsoft has said many times that the next version (which many expect to be Windows 12) is going to focus on AI. Not only that, some Linux distros are building around it, aiming for something like Her (the movie).

      Keep in mind that if you’re offline you can’t use a cloud version, and the current version they put out has the Bing budget limitations. So if you, say, ask it to put your computer in dark mode, that counts against your budget for the day, and the budget is there because of the resources used on the cloud.

      So now assume they (Microsoft) and others are right that the main way people will interact with computers in the future is through these models: you interface with the system like you would with a human, it figures out what you want, and it does it. If it is stuck on the cloud, you’re limited to a budget on how many times you can use it per day, or it will cost you. Whereas if it’s local, it uses your resources and you can use it as much or as little as you want. Plus you can use it offline.

      TL;DR: this stuff is going to local LLMs.

  • sshan@alien.top

    First use probably won’t be deploying live to games.

    It will be to turbocharge developers so they can develop and tweak storylines and content faster: get inspiration and first drafts done 10x quicker, then polish.

    It will be part of the tooling, not the actual game. But just a guess.

  • sdfgeoff@alien.top

    There is no doubt current-gen games will be made way more awesome with AI, but I think the real power of LLMs is opening up whole new types of game. Current games are (largely) static universes and static stories, with simple things like physics simulations (bullets, destruction). These are all things that are easy to program using traditional techniques.

    LLMs will allow new genres: spying simulators where control of information flow has a significant impact. Political games where you actually have to negotiate and convince NPCs of your actions. Managerial games with subordinates who can operate independently. RTS games where you operate as a general because AIs handle passing battlefront information up the command structure. And probably dozens more I haven’t yet thought of…

    Many board games rely on social aspects. These can now be made into single player games.

    • thewayupisdown@alien.top

      (I only lurk here, so apologies for any mistakes or dumb ideas.) Anyway, I spent some time trying to get a text-based fanfiction game in the Disco Elysium setting working in GPT-4 when it was still only 8k (?) tokens. I got it to keep track of the sympathy level between Harry and Kim, and to have the voices in Harry’s head as well as NPCs speak a) in a dialect (Scouse, Cockney, …) and style fitting their character, and b) reproduce to some extent the Martinaise Créole spoken in the game by having x% of words (nouns primarily) replaced permanently with their German, Romanian, French, … translations. After an instruction phase I had the game first think up the story and then transform it into a branching narrative. At some point it worked pretty well, until it started forgetting things and hallucinating about prior events. Thus I never got to finish the first day or judge the narrative. Then it felt like GPT-4 was taking a dive: the dialects were verbalized poorly (lots of ‘h’ in Cockney), or at some point it would just write “(Received Pronunciation)” or similar in front and print the answer in standard English.

      I had started implementing a stripped-down version of a savegame, having GPT-4 use a compression algorithm and hand the compressed state over from round to round, when I lost interest / realized I needed a better understanding of the fundamentals.
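
      As a rough sketch (my own, not what was actually done above; the lexicon and fraction are invented), the “Martinaise Créole” word swapping could also be done in game code rather than by asking GPT-4 to do it: a small lookup table plus a deterministic per-word coin flip, so a word that gets swapped stays swapped permanently.

      ```python
      import hashlib

      # Hypothetical mini-lexicon: English word -> German / Romanian / French replacement.
      CREOLE_LEXICON = {
          "door": "Tür",
          "money": "bani",
          "night": "nuit",
          "harbour": "Hafen",
          "police": "poliție",
      }

      def should_swap(word: str, fraction: float) -> bool:
          """Deterministic per word, so a swapped word stays swapped for the whole game."""
          digest = int(hashlib.md5(word.encode("utf-8")).hexdigest(), 16)
          return (digest % 1000) < fraction * 1000

      def creolize(text: str, fraction: float = 0.3) -> str:
          """Swap roughly `fraction` of the replaceable words, keeping punctuation."""
          out = []
          for word in text.split():
              stripped = word.rstrip(".,!?;:")
              trailing = word[len(stripped):]
              core = stripped.lower()
              if core in CREOLE_LEXICON and should_swap(core, fraction):
                  out.append(CREOLE_LEXICON[core] + trailing)
              else:
                  out.append(word)
          return " ".join(out)

      print(creolize("The police watch the harbour at night.", fraction=0.5))
      ```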

  • Crafty-Run-6559@alien.top

    I think this would play well with some games.

    Might need to go back to dual graphics card setups: one for the finetuned LLMs and one for the actual game.

    But imagine a Civilization 7 with LLM-based AI.

    I can’t imagine the batshit stuff that would come out of that, and the replayability would be high. It would even pair well with the more limited context (we don’t really need Catherine the Great to hold a grudge from 3000 BC in 1800 AD); a toy sketch of this is at the end of this comment.

    I could see secret alliances, clandestine plans to take over the world, strange treaties… the world congress could be a lot more fun/interactive…

    Why did Gandhi team up with Sulaiman to launch a first strike nuclear attack on me??? What does he mean the burning flesh of my people is more satisfying than the scent of fresh rose water???
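
    A toy sketch of that limited-context point (the names and prompt wording are placeholders, not anything from an actual Civilization title): each AI leader keeps only a short rolling window of diplomatic events, so ancient grudges fall out of the prompt on their own.

    ```python
    from collections import deque

    class LeaderMemory:
        """Rolling diplomatic memory: old events age out of the prompt automatically."""

        def __init__(self, leader: str, max_events: int = 20):
            self.leader = leader
            self.events: deque[str] = deque(maxlen=max_events)

        def remember(self, turn: int, event: str) -> None:
            self.events.append(f"Turn {turn}: {event}")

        def diplomacy_prompt(self, proposal: str) -> str:
            history = "\n".join(self.events)
            return (
                f"You are {self.leader}, a civilization leader.\n"
                f"Recent events you remember:\n{history}\n"
                f"The player proposes: {proposal}\n"
                f"Reply in character and say whether you accept."
            )

    catherine = LeaderMemory("Catherine the Great", max_events=10)
    catherine.remember(12, "The player pillaged our trade route.")
    catherine.remember(45, "The player signed a research agreement with us.")
    print(catherine.diplomacy_prompt("a joint war against Gandhi"))
    # The finished prompt would then go to whatever local model the game ships with.
    ```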

  • damnagic@alien.top

    I doubt LLMs will become an integral part of storytelling any time soon (at least not within 5 years). Where I think they will play a major part (and already are) is in making the game: larger and more detailed environments, bigger and more expansive (curated) stories, and potentially, I think, very, very tiny models trained to the tits for limited and controlled interactions. Running them will take nothing and the results won’t break the world.