Egregoros

Signal feed

Paco Hope

@paco@infosec.exchange

Amateur professional #selfhost sysadmin. Professional amateur #cloud #security at #AWS. Also fond of #cats, #cigars, #whiskey and #pipes. I like board games and some video games. I am #covid cautious and I still #wearamask. Opinions are my own, but they can be yours too. 100% Organic: No artificial colors, preservatives, or intelligence added.


Latest notes

But we are not done.

Teach your developers to ignore #AI generated warnings with this one weird trick!

If you look at my previous post, under the erroneous "You should use OAI" message, there's a little icon and the word "Explain." You can ask the #LLM to explain the warning. Here's the explanation that Claude Sonnet 4.0 gives for that warning. For those of you who don't want to read a screenshot, the LLM has prompted itself with:

Explain CloudFront Distribution Resources have an S3 Origin configured without an Origin Access Identity (OAI). issue in 70-frontend-stack.yaml at [29]

And the LLM's response to its own prompt is:

This is a false positive security warning. The issue is incorrectly flagged because:

You ARE using access control - Line 16-23 creates an Origin Access Control (OAC), which is the modern replacement for OAI

OAC is properly configured - Line 38 attaches the OAC: OriginAccessControlId: !GetAtt CloudFrontOriginAccessControl.Id

S3 bucket policy enforces OAC - Lines 120-135 ensure only CloudFront can access the S3 bucket

The security scanner likely only checks for the legacy OriginAccessIdentity property and doesn't recognize the newer OriginAccessControlId. Your configuration is actually more secure than using the deprecated OAI.

The warning can be safely ignored - your setup uses the AWS-recommended OAC approach for securing S3 origins.
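For anyone who hasn't seen the pattern it's describing, the OAC wiring looks roughly like this. This is a sketch, not the actual template: resource and bucket names are illustrative, and the line numbers the LLM cites won't correspond to anything here.

```yaml
# Illustrative CloudFormation sketch of the OAC pattern; names are made up.
CloudFrontOriginAccessControl:
  Type: AWS::CloudFront::OriginAccessControl
  Properties:
    OriginAccessControlConfig:
      Name: frontend-oac
      OriginAccessControlOriginType: s3
      SigningBehavior: always
      SigningProtocol: sigv4

Distribution:
  Type: AWS::CloudFront::Distribution
  Properties:
    DistributionConfig:
      Enabled: true
      DefaultCacheBehavior:
        TargetOriginId: S3Origin
        ViewerProtocolPolicy: redirect-to-https
        CachePolicyId: 658327ea-f89d-4fab-8c53-fc2e73c2cf0f  # managed CachingOptimized
      Origins:
        - Id: S3Origin
          DomainName: !GetAtt FrontendBucket.RegionalDomainName
          # The modern replacement for the legacy OriginAccessIdentity:
          OriginAccessControlId: !GetAtt CloudFrontOriginAccessControl.Id
          S3OriginConfig:
            OriginAccessIdentity: ""  # deliberately empty when using OAC

FrontendBucketPolicy:
  Type: AWS::S3::BucketPolicy
  Properties:
    Bucket: !Ref FrontendBucket
    PolicyDocument:
      Statement:
        - Effect: Allow
          Principal:
            Service: cloudfront.amazonaws.com
          Action: s3:GetObject
          Resource: !Sub "${FrontendBucket.Arn}/*"
          Condition:
            StringEquals:
              AWS:SourceArn: !Sub "arn:${AWS::Partition}:cloudfront::${AWS::AccountId}:distribution/${Distribution}"
```

A scanner that only greps for a non-empty `OriginAccessIdentity` will flag exactly this, even though the bucket policy locks access to the one distribution.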

Thanks for wasting my time AND egregious amounts of electricity generating a pointless "high priority" security warning.

@feld They’re just an English staple. When I lived in the UK I insisted on Jif peanut butter even though it was like £6.99 for a jar that was $3.99 in the US. Now that I live in the US, there are a few English things I must have on hand. Jacob’s Cream Crackers is one. Perfect underlayment for a good cheese. (PG Tips tea, marmite, and crumpets being some others)

@jerry I just heard a brief story on Marketplace about people taking “sleepcations.” They take a weekend and go somewhere. Not someplace nice or fancy or expensive. Maybe not even far. Just away. And they spend the whole weekend in a hotel sleeping. Maybe treat themselves to a room service breakfast, but mostly just get a hotel room and sleep a lot.

Here is a way that I think #LLMs and #GenAI are generally a force against innovation, especially as they get used more and more.

TL;DR: 3 years ago is a long time, and techniques that old are the most popular in the training data. If a company like Google, AWS, or Azure replaces an established API or a runtime with a new API or runtime, a bunch of LLM-generated code will break. The people who vibe code won't be able to fix the problem because nearly zero data exists in the training data set that references the new API/runtime. The LLMs will not generate correct code easily, and they will constantly be trying to edit code back to how it was done before.

This will create pressure on tech companies to keep old APIs and things running, because of the huge impact it will have to do something new (that LLMs don't have in their training data). See below for an even more subtle way this will manifest.

I am showcasing (only the most egregious) bullshit that the junior developer accepted from the #LLM. The LLM used out-of-date techniques all over the place. It was using:

  • AWS Lambda Python 3.9 runtime (will be EoL in about 3 months)
  • AWS Lambda NodeJS 18.x runtime (already deprecated by the time the person gave me the code)
  • Origin Access Identity (an authentication/authorization mechanism that started being deprecated when OAC was announced 3 years ago)
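For the record, the runtime problems are one-line fixes in the template. A sketch, with made-up resource names; runtime identifiers are current as I write this, so check the AWS Lambda runtimes table before copying:

```yaml
# Illustrative Lambda resources; only the Runtime lines matter here.
PyFunction:
  Type: AWS::Lambda::Function
  Properties:
    Runtime: python3.12        # was python3.9 (approaching EoL)
    Handler: index.handler
    Role: !GetAtt LambdaRole.Arn
    Code:
      ZipFile: |
        def handler(event, context):
            return "ok"

NodeFunction:
  Type: AWS::Lambda::Function
  Properties:
    Runtime: nodejs20.x        # was nodejs18.x (deprecated)
    Handler: index.handler
    Role: !GetAtt LambdaRole.Arn
    Code:
      ZipFile: |
        exports.handler = async () => "ok";
```

The fix is trivial; the point is that the LLM kept steering the code back to the old runtimes.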

So I'm working on this dogforsaken codebase and I converted it to the new OAC mechanism from the out-of-date OAI. What does my (imposed by the company) AI-powered security guidance tell me? "This is a high priority finding. You should use OAI."

So it is encouraging me to do the wrong thing and saying it's high priority.

It's worth noting that when I got the code base and it had OAI active, Python 3.9, and NodeJS 18, I got no warnings about these things. Three years ago that was state of the art.

@halva You didn’t hear the news? This is the new, 100% accurate, practically instantaneous #Bluetooth pairing method. They call it “plugging in a wire.” People are complaining it reduces range and increases tripping hazards. Apparently you can go farther than Bluetooth if you have the right equipment. But I hear that you have to literally pay by the meter to increase your range with these new-fangled “wires”. Sounds like a rip off to me.

More moaning about #linux. I'm trying to get a discourse server off the ground. Why the fuck do they insist on installing non-OS stuff in the middle of all the OS stuff?

/var/discourse is not a good default. And then fucking docker wants to be /var/lib/docker. Never mind how /var/lib doesn't make any goddamn sense.

Related to my earlier discussion of hard drive partitioning. What I would like to do is have a volume that is not the operating system, but is instead all the application data. The discourse data, database, assets people upload, etc. That way I can have this nice virtual disk that encapsulates it. I could theoretically build a new node, attach this drive to the new node, and migrate the site. I can snapshot that drive more frequently than, say, the OS drive. Lots of benefits to encapsulating it.

But no. The discourse "easy installer" sprays shit all over /var. I'm gonna get some database down in /var/lib/docker/something and I'm going to get assets living in /var/discourse, and umpteen gajillion container images in /var/lib/docker.

But I also have fucking /var/log for OS logs, and /var/run for runtime information like PIDs, and /var/lock and /var/tmp. I think /var/most-important-app-on-the-system is NOT where you put application software. So my volume/filesystem to encapsulate discourse? Is that a big ass /var?

I love #FreeBSD because it is so well organized. They even go to the trouble to make packages like postfix or apache fit the idiom, rather than let it install in /var/lib/opt/sbin/etc or some shit.

This public service announcement sponsored by Old Man Yells at Cloud, Inc.

One of the ways that LLM-authored code improves productivity is by merely SAYING it does things. It's way faster than the whole time-consuming process of actually doing things. This is real code someone sent to me for review.

Content warning: re: long tech rant (disney, hulu, amazon)

@ava When older folks, like parents, complain that all technology sucks, I just solemnly nod.

I feel your pain. I've done stuff like this for my parents and in-laws. It always leaves me scratching my head: "They think normal people can understand all this shit?"

@evacide “#Microsoft says it will provide encryption keys for Windows PC data protected by BitLocker where it has access to them and it's received a valid warrant.”

The word “valid” sure is doing a lot of work there. This is the most corrupt DoJ and FBI in generations. One that ignores court rulings that it disagrees with. So what way is the warrant “valid”? Syntactically? Grammatically? Because if we get any deeper, like morally or ethically, the argument gets harder to make.

This silly statement from #openai about #security drives me crazy. People talk about this all the time as if it means something.

‘files in ChatGPT as a whole are "encrypted by default at rest and in transit"’

What attack does that encryption at rest defeat? What hacker says “darn it! I would have gotten the data if it hadn’t been for that pesky encryption at rest?”

Think it over. Go ahead. I’ll wait.

Physical theft of hard drives/storage. That’s it. Encryption at rest at OpenAI, or any cloud, defeats the same singular attack that it defeats when you encrypt the hard drive on your laptop: if someone physically steals the device, they don’t get the data.

They can sell your data. They can store it (encrypted at rest) on a web site that has a vulnerability or incorrect security, and bad people can download the unencrypted data. They can share it with “partners” who misuse it. Encrypting at rest is NOT an important protection. Literally every other protection is more important.

https://www.darkreading.com/remote-workforce/chatgpt-health-security-safety-concerns