@FermiEstimate@lemmy.dbzer0.com

lmao, Zoom is cooked. Their CEO has no idea how LLMs work or why they aren't fit for purpose, but he's 100% certain someone else will somehow solve this problem:
So is the AI model hallucination problem down there in the stack, or are you investing in making sure that the rate of hallucinations goes down?
I think solving the AI hallucination problem — I think that’ll be fixed.
But I guess my question is by who? Is it by you, or is it somewhere down the stack?
It’s someone down the stack.
Okay.
I think either from the chip level or from the LLM itself.
If I was them (and in a way I am) I’d probably kill the witnesses and bail.
So I totally get this conclusion, but I think it's worth slowing down and considering whether this makes as much sense as it seems at first glance. The fact that magic exists means that killing someone doesn't do much to shut them up, if a sufficiently powerful entity is willing to spend the resources. The fact that undeath exists means that killing someone carries a very real risk of making them a bigger threat than they were in life--it's not like you can just stab a ghost. Cultists, being familiar with eldritch powers themselves, know this full well. This means that keeping people merely out of communication might be the simplest way to achieve their actual goal with the minimum of fuss. They don't need someone quiet forever; they just need enough secrecy to achieve their goals. Murdering every person who takes an interest in them is mission creep.
Also, keep in mind that cults generally exist for specific purposes, and people join them for specific purposes. These purposes aren't necessarily overtly evil at the rank-and-file level, which is integral to their recruitment. The turnip farmer who wants to resurrect a dead harvest god to grow more turnips might be okay with some dodgy rituals the church wouldn't approve of, but straight up committing multiple murders might take some working up to, if he can be talked into it at all.
So in short, consider what your cult wants, and assume a degree of rationality and thoughtfulness (at least, when they're not channeling horrors from another plane). What do they want, and how could the party provide it?
You earn medals and requisition points from playing. Requisition points unlock new stratagems, which include everything from weapons to orbital bombardments, while medals get you new weapons, cosmetics, etc. You can also collect samples on missions, and these unlock permanent upgrades for stratagems. There are player levels too, but past the basics these just unlock new titles.
The battle pass equivalent is Warbonds, which include new weapons, armor, cosmetics, etc. Unlike most games' battle passes, Warbonds don't expire, and you can find enough premium currency while playing to get them without too much trouble.
On the whole, new warbond weapons tend to be different rather than obvious upgrades. The default assault rifle you get stays perfectly viable throughout the game.
All I really know is shoot bug and if you aren’t getting friendly fired to hell and back you’re playing wrong
You've pretty much got it down, though you also shoot terminators.
OpenAI: "Our AI is so powerful it's an existential threat to humanity if we don't solve the alignment issue!"
Also OpenAI: "We can devote maybe 20% of our resources to solving this, tops. We need the rest for parlor tricks and cluttering search results."
If they're really lucky, they'll end up working for the Laundry only once. Residual Human Resources is a bad way to go out.
Charles Stross' Laundry series is basically this concept set in the present day: magic is a branch of mathematics, which means it can be computed and programmed.
It is perhaps worth noting at this point that the series' genre is cosmic horror.
FWIW, the shield backpack and either AMR or Quasar/EAT have served me well against bots, but I typically run light armor. I bring the grenade pistol to handle factories.
If you aren't already using it, there's never been a better time to get into the AMR: they buffed its damage and finally fixed the scope's zeroing.
Orbital stratagem timings make no sense, and are strictly a gameplay balance issue that *cannot* be realistic: the loading screen shows the first helldiver dropping from well outside the atmosphere and taking several minutes to reach the ground, yet turrets deploy in 3 seconds?
I assumed this was because equipment can endure acceleration that would make a person pass out, or at least leave them combat-ineffective on landing. A trip from the Kármán line to the ground in a few seconds would involve some deeply unpleasant G-forces... in opposite directions, back-to-back.
Come to think of it, this might explain why different gear has different call down times, as more fragile stuff might require a slower and (relatively) gentler drop.
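For what it's worth, the "deeply unpleasant G-forces" claim is easy to sanity-check with a Fermi estimate. As a rough sketch (the 100 km altitude and 10-second drop time are assumptions, not anything the game states), suppose the pod accelerates for the first half of the distance and brakes equally hard for the second half:

```python
# Back-of-envelope drop estimate (hypothetical numbers, not from the game).
KARMAN_LINE_M = 100_000  # ~100 km, conventional edge of space
DROP_TIME_S = 10         # "a few seconds", assumed
G = 9.81                 # standard gravity, m/s^2

# Accelerate for t/2, then decelerate for t/2 at the same magnitude:
# distance = 2 * (1/2 * a * (t/2)^2)  =>  a = 4 * d / t^2
accel = 4 * KARMAN_LINE_M / DROP_TIME_S**2
print(f"Required acceleration: {accel:.0f} m/s^2 ({accel / G:.0f} g)")  # ~4000 m/s^2, ~408 g
```

Several hundred g, sustained for seconds, first one way and then the other. Turrets, maybe. People, no.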
For safety reasons, a magical museum might well want to avoid putting actual items on display, especially since Illusion magic makes it trivial to create simulated substitutes crowds can safely interact with. This also keeps security budgets manageable, since even a magical museum probably doesn't have the kind of money to protect the arcane equivalent of a small nuclear arsenal.
So the museum researchers might be happy keeping items on indefinite loan, as long as they know the borrower and have guaranteed access when they need it. This way the mage owning the item is responsible for keeping it safe, and arcane historians--who definitely didn't spend all those years in magic academy just to play guard--don't have to.
As a bonus, this method gives the party a quirky but knowledgeable NPC contact who can give them clues or set up sidequests whenever you need one.