


An extended US military campaign in Iran could be constrained by a munitions shortage
A lot of ifs for something that the US is known to prioritize above all else. What, affordable food? Sorry, got to make missiles.
Well, in all fairness, protests these days don’t get a message across so much as present new targets for the Government to shoot at.


So why wouldn’t that extend to all those other things I listed?
There’s a fundamental difference between the action prohibited and the means by which the prohibition is enforced. We can ban drunk driving, and we can enforce that ban by arresting people who drive drunk, or by shooting everyone who walks out of a bar and touches a car. The latter is extreme, but it technically does the thing we’re after: if we murder everyone who walks out of a bar drunk, we technically prevent drunk drivers.
That’s the issue. We are trying to make it so computers keep us in check. That’s a bad idea for roughly the same reasons installing breathalyzers in every car would be a bad idea. We’re trying to paper over actual enforcement, so that when there’s a failure we don’t have to blame lawmakers for making bad choices or law enforcement for not doing their job; we can just blame computers.
I just hope you can understand why that’s bad.
Like… the Flock cameras. Made to pinpoint the movements of criminals so that law enforcement doesn’t have to do the legwork. That’s a great starting intention, but having cameras that watch everyone at all times? That’s bad. And I think you can understand why it would be bad.
Kids still drink, kids still vape, kids still get behind the wheel when they ought not to. It’s up to us humans to enforce our rules on other humans. And the more we forget that, the more we hand power over to whoever is controlling the computers or technical aspects or whatever.
If parents don’t want their kids watching porn, that’s a pretty easy fix that doesn’t require us to hand over critical functions of our computers to some 3rd party to, at some later date, do something we know not of.
Like, goodness, how are the bad aspects of this not obvious outright? How did we get to a point where we’re so blind to how all of this can go off the rails so quickly? All of these are bad things, for reasons complicated enough that they might not fit in 5,000 characters or less. But they’re bad. Take the whole having-a-computer-verify-age-by-scanning-a-barcode thing: what happens when that company signs a deal with a health insurer? What would happen if the Kroger Plus card data was sent over to your insurance provider? Everything you bought at the grocery store becomes something your insurance provider has access to?
Like c’mon how are we not seeing this? It’s not about “kid should have access to porn”, it’s about how we go about enforcing the whole “kids shouldn’t have access to porn.” You have to understand, I’m making a statement not about the “ends,” I’m making a statement about the “means.”
We all seem to get so caught up on the end goal that we forget to stop and consider the actual path we’ve selected. We’re so preoccupied with whether or not we can prevent something that we don’t stop to think if we should reconsider how we go about it.
Please, I’m begging you: there’s a really important point in this and we keep failing to see it, A LOT! Like, I’m glad everyone is starting to understand the dangers of having a Ring camera everywhere, but it’s so frustrating that it took a Super Bowl ad for it to finally sink in how bad an idea it is, when a lot of people were pointing this out very early on with the Ring TOS.
I’m getting old and I’m getting tired of this happening over and over. I don’t want any of us agreeing to something that already has a pretty easy fix, and that has massive ramifications down the road if we go down the proposed path. It’s not the ends, it’s the means, it’s the means. We keep selecting means that have really bad consequences.


I’m not saying kids should be handed porn; that’s such a stupid take. I can do the exact same thing to your comment.
Parents shouldn’t take care of their kids; instead we should let the government tell them what to think, what to eat, who to hate, and who to pray to.
Do you see how brain dead your comment is?


We don’t want kids downloading bad stuff.
Then parents should keep an eye on their kids. Or just not give them full-on access to the computer.
I hate that politicians keep trying to invent technology to do a parent’s job.


That’s comforting to hear. Thank you.


I mean, isn’t this what Gacha games basically are?


Just when I think California couldn’t possibly come up with dumber laws, they deliver yet again.
There are genuine concerns they could be addressing, but instead they go after something that’s going to be near impossible for them to enforce.
Blueprints for homemade 3D printers exist that can be built with a pretty short list of parts from Digikey.


One can only wonder as to why that may be.


Well, I can say technology-stack cost is a consideration, but it isn’t a leading one. Even in the scope of AI, the machine cost isn’t the primary factor. The single biggest cost in development is scope and complexity. That’s the thing AI is pitched as addressing: easing the complexity of projects. And I don’t say that as a promotion of AI; that’s just what all the advertising for this stuff is about.
The largest cost is things being way too complex, and right behind that is hiring. I’ve known companies to let go of staff too soon, and it cost them when they needed new talent back in. It’s cheaper in the long run to give raises to the people who’ve been there than to fire them and go looking for new young blood. I usually put it to C-staff like this: we don’t need a lot of energy to keep a rock shaped as a wheel rolling, but if you stop it and it falls flat, you’re going to pay to upright that stone. Sometimes that is what’s required, sometimes it isn’t. But it’s a choice no one should make lightly, without serious consideration of what’s to come.
Third-party integrations are also a big cost, and again, that’s something AI is “supposed to” help with, for kind of the same reason as the first: complexity. That’s the big thing to keep in mind when normal people talk about AI. Complexity. It’ll help you understand both sides of the AI debate. We are asking for more, faster, with more complex interactions, and keeping a team on top of all of that is… really hard, to say the least. We’ve come up with all kinds of “solutions” to that complexity, which I won’t go into, but things like CI, agile development, etc. are examples. And ask anyone: quote-unquote agile is a… complicated topic to broach. It absolutely has its fans and detractors; there’s no universal consensus.
But as AI eats all the hardware in existence, apparently, there’s a tipping point: even if it ate all of it, that consumption cuts people off from the product itself. At some point people need a device to access AI for AI to be useful. There’s a reason we don’t put grocery stores in a remote forest far from everyone.
The more likely thing is that AI carves out the upper end of the industry. Think about it for a second. The nVidia 5090 (Blackwell architecture) surpasses the H100/H200 (Hopper architecture) in AI performance. That 5090 is consumer grade. It’s wild that the consumer market has room for such a massively powerful device. But I feel that’s going to change. Let’s imagine an nVidia 6090, a hypothetical next-gen GPU. In the AI world, that 6090 isn’t marketed toward consumers; it’s an enterprise device, just like you won’t run into some average gamer running a Threadripper CPU, because that’s HPC territory. Instead, the 6090 becomes consumer grade after five years, and it’s the top-of-the-line consumer GPU when it finally hits. The reality is that they’d just be putting the 6090s they couldn’t sell into consumer-friendly boxes.
We have to understand that “consumer grade” has come to encompass a LOT. And yes, we are having a massive knee-jerk reaction to that right now. I say the overreaction playing out is going to hurt the “AI bubble” more than it helps. We’ve yet to see whether AI companies have bitten off more than they can chew. If these chips are produced and an AI company misses a payment, that’s one thing. But if they keep missing payments for hardware already delivered, it will topple the whole effing thing so hard that we’ll be replacing Humpty Dumpty with Sam Altman in children’s songs. Not that they’d go bankrupt, exactly, but we would see a surge of GPUs hitting the market, causing prices to plummet faster than a Boeing aircraft. The sheer ripple would begin ripping apart startups, slowly work its way up to the endpoints that provide AI services, and just keep going, like a snowball down a Swiss ski resort’s freshly groomed slope.
AI companies would indeed have leverage, but that’s a double-edged sword to swing; they can do a lot more harm to themselves than good. We just don’t know right now, because this is all new. And yeah, our monkey brains hate not knowing, but that’s really all that can be said at this point. We just don’t know. The interwoven chart everyone passes around, where A is buying from B, selling to C, who sells to A and buys from B, etc., etc., usually ends badly. But this situation has a lot of “unique”… “investors”, aka a shit-ton of corruption that governments around the world are overlooking. And there are still very greedy people betting against them all, because if their bet pays out, they’ll be billionaires overnight.


But the alternative costs from the other side are creeping up. If you are a company looking to hire developers to write software, you need to provide development machines to those developers. A development machine that might have cost $2000 a couple years ago is well on the way to $6000-$7000 in the near future.
This isn’t true, because development doesn’t have to happen on a maxed-out system. Myself, I work on AS400 stuff. The compiler is on the remote machine. The database is on the remote machine. The JVM is on the remote machine. All I need is an editor and the ability to SSH into the machine; that’s all that’s required for development. SQL queries? I do those in a web tab, and the query is run on the remote machine.
I could easily do all the code, debugging, SQL queries, ORM, the Java stuff, the node.js stuff (yes, AS400 has node.js, and it works pretty well with COBOL objects), and so on, on a Raspberry Pi if I needed to. That might surprise some people, and others not at all. Because waaaaaaaaaaaayyy back, that’s how it actually worked. You had a terminal, and the machine you wrote code on, debugged on, and whatnot was in the basement, and you talked to it via some twinax. Historically speaking, we’ve done way more development on machines nowhere near us than on our own machines.
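That thin-client loop is easy to sketch. Here is a minimal illustration in Python, assuming you have SSH access to the remote box; the host name and the AS400 compile command in the comment are hypothetical, and the `runner` parameter exists only so the same loop can be dry-run against a local shell.

```python
import subprocess

def run_remote(command: str, runner=("ssh", "dev@as400.example.com")) -> str:
    """Run a command on the remote dev machine; nothing heavy runs locally.

    `runner` is the prefix that ships the command somewhere else. With the
    default it is plain SSH; swap it for ("sh", "-c") to dry-run the same
    loop against the local shell instead of a real remote host.
    """
    result = subprocess.run(
        [*runner, command], capture_output=True, text=True, check=True
    )
    return result.stdout

# The whole "dev machine" is then: edit locally, run everything remotely, e.g.
#   run_remote('system "CRTBNDRPG PGM(MYLIB/HELLO)"')  # hypothetical compile
```

The point being, the laptop only needs enough horsepower for an editor and a terminal; the compiler and the database never leave the remote machine.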
As for the mobile stuff, that’ll figure itself out. Or it won’t, and we’ll just have wrappers called apps that are really full-screen web pages. But again, on the AS400, we had a product from Profound that basically wrapped a web app into a “native” app, and the Profound product handled all the stuff like taking a picture and putting it on the IFS for an RPG program to pick up.
There are all kinds of ways around this notion that devs need beefy laptops; working without them is literally the way we did it for nearly forty years.
Tesla hasn’t put out a successful new product in 20 years, and it continues to barrel right along, with its useless hack CEO hanging on as the richest person in the world.
This conflates a ton of unrelated things. Tesla has been unseated by BYD, but the US Government is still hung up about Chinese carmakers entering the US market. Musk is the richest person because his wealth is largely predicated on the US economy, which, thanks to previous international cooperation, stood as the leader. Every day we are witnessing the US become weaker on the global stage, which means his wealth translates less and less to things outside the US. The US stock market is only a big deal because the US and the dollar are what so many things are pegged to. When that ceases to be the case, the US stock market loses value in an international sense, and since Musk’s wealth is largely tied to it, his wealth goes down too.
But the success of Tesla isn’t actually success; it’s a story of the incredibly sorry state of US automakers, who can’t provide a solid alternative. But don’t get confused: Tesla sales are slowing drastically. This is why we’re seeing the consolidation of Twitter, Tesla, and SpaceX. They’re being consolidated because they’re hitting rough patches standing alone.
So please don’t confuse the odd situation of Tesla with “bubbles don’t HAVE to burst anymore.” Those are not correct conclusions here. Ford, GM, and the rest of the US automakers are so down BAD at the moment that Tesla is able to shine. That’s really the only thing keeping them afloat; US carmakers are jokes at this point.
NOBODY who is responsible for enforcing anything like responsible economic activity will EVER allow the bubble to burst
Greed. That’s what you’re describing. But greedy people come in all kinds of flavors, and there’s no shortage of people who short the market and make mad cash doing so. Greed is universal, but the “upside” (I guess we’ll call it that) is that the system allows greedy people to bet against the system. There are people putting money on this whole thing crashing down, and the bigger the fall, the bigger the reward. There were a ton of people who made billions when the housing market crashed. People watched 9/11 and bet that the market would collapse in response. Don’t underestimate people’s greed. There are people who would bet money on innocent people getting shot if the odds were good.
Now, all that said, there’s a difference between this AI bubble and the technology. If the bubble pops, AI will still be a thing; the bubble is not AI itself, it’s how we’re developing AI at the breakneck speed we’re going at.


Then we have the whole TPUSA halftime show, which you can’t stop me from reading as the t-pussa halftime show.
I hope all the t-pussas out there enjoy it.
How does that change what I said? Remote X is massively more bandwidth-hungry than all the alternatives. I mean, things like TeamViewer Tensor exist, and from what I’ve done with it, it’s massively stable. RHEL works perfectly with it. So I don’t want to hear this “can’t get a commercially supported…” line; there are tons of vendors that will do thin clients for you.
X is a terrible protocol for modern widgets, and modern widgets do their best to work around X; that’s literally in the code. Look at GTK or Qt: both actively avoid working with X when they can and just render directly, because by every metric it’s better to work directly with the hardware than to go through some slow middle layer that just spins and wastes cycles.
Heck, even the X developers have left X, because it’s done. It’s a dead technology. It doesn’t matter how many people are deploying it in enterprise environments, or how well they are deploying it. There are no devs on the project, and GPUs keep changing. There are only so many ways you can keep band-aiding a GPU into pretending it’s a giant framebuffer; at some point there will be a break in the underlying architecture of GPUs, and treating them as just VRAM to dump data into will no longer work. The amount of die space for the backwards-compatible VGA and SHM paths is minuscule on today’s cards.
Heck, using MIT-SHM with X11 on a Pi is terrible. You usually get worse results, because the underlying hardware is woefully unsuited to being treated like an old video card; you actually do better using hardware acceleration. The usual mantra for X11 apps on a Pi is: if you get good results with shared memory, use that and never upgrade your underlying Pi; otherwise, always use the hardware where possible.
Also, unlike X, Wayland generally expects a GPU in your remote desktop servers, and have you seen the prices for those lately?
You don’t even need a good one by today’s standards. At most, a compositor just needs to convert a pixmap into a texture. Anything that supports GLX_EXT_texture_from_pixmap is enough, and at low resolutions you can just hand it to your CPU; we’re not talking intense operations. Literally anything from the last fifteen years of GPUs has enough power to handle these operations reasonably. Shoot, if your thin client is a Pi, the Pi itself has vastly more resources than you need. You could literally run a cluster of Pis if you wanted. labwc is a completely fine compositor for basic thin clients, and it’s basically the replacement for X on the Pi, because X11 was so misaligned with how modern GPUs actually work.
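For a sense of scale on “not intense operations,” a quick back-of-envelope; the resolution and frame rate here are my own illustrative picks, not measurements:

```python
# Why CPU-side pixmap-to-texture conversion is fine at thin-client
# resolutions: it is essentially a per-frame memory copy/swizzle.
width, height = 1280, 1024          # a typical thin-client desktop
pixels = width * height             # pixels touched per frame
mpix_per_s = pixels * 60 / 1e6      # at 60 frames/sec, in megapixels/sec

print(f"{pixels} pixels/frame, {mpix_per_s:.1f} Mpix/s at 60 fps")
# → 1310720 pixels/frame, 78.6 Mpix/s at 60 fps
```

At 4 bytes per pixel, that is on the order of 315 MB/s of copying, well within reach of even a Pi-class CPU moving memory at several GB/s, and trivial for any GPU of the last fifteen years.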
What I am saying is that X can be whatever you like in “enterprise deployment”, but X has stopped matching what modern machines look like. Video cards have become more than a bunch of bits dumped into VRAM. No matter how many deployments you’ve done, that doesn’t change the fact that X barely resembles what systems of the last twenty-five years look like. Nobody is working on it. You can have 100 deployments under your belt; nobody is still working on it. No matter how you slice the attributes of X, nobody is actively coding for X any longer. And as for damage tracking and the like, lots of implementations back wl_surface_damage_buffer with hardware EGL/DMA-BUF paths, because GPUs have been smart enough to do that on their own for the last fifteen years, and most compositors use it.
Again, it doesn’t matter how many deployments you might have; the hardware does it better than X ever will. It’s impossible for X to do it better, because there’s nobody there to write anything better. And it will stay that way until the heat death of the universe, unless someone picks up the massive task of taking care of Xorg. Nothing changes that reality.
Does this mean you need to drop X11 tomorrow? No. That’s the entire point of Xorg being open: you can keep it until someone rips it from your cold dead hands. But your stubbornness does not change the fact that X is absolute garbage on the network, is massively inefficient, and that most things these days actively avoid using X directly; when they have to use it, they just stuff uncompressed bits into a massive packet with zero optimization. You can totally mill grain with a stone wheel today; no one stops you. But you’re not going to convince many people that that is the best way to mill grain. I don’t know what else to say. I don’t want you to stop using X, but your usage of it doesn’t change any fact I’ve stated. It’s a very fat, very unoptimized, very slow protocol, and there are indeed commercial solutions that are better. I’ve named one, but there are many. That is just reality; the world has moved past dual-channel RAM and dumb framebuffers. I’ve built VGA video cards, I know how to build a RAMDAC from logic gates, and all of that is gone in today’s hardware, while X still carries these silly assumptions about hardware that doesn’t exist anymore.
And the network transparency argument is long gone. While you can indeed push windows over the wire, most toolkits use client-side rendering and decorations, so you’re just sending bloated pixmaps across the wire, where things like RDP, VNC, etc. deal far better with compression, window damage, and so on. And anything relying on or accelerated by DRI3 is simply NOT network transparent.
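To put numbers on “bloated pixmaps,” here is a back-of-envelope sketch; the figures are illustrative assumptions, not benchmarks of any particular setup:

```python
# Cost of shipping raw, uncompressed frames, as client-side-rendered
# toolkits effectively do over remote X (no damage-aware compression).
width, height, bytes_per_px = 1920, 1080, 4        # 1080p, 32-bit pixels
frame_bytes = width * height * bytes_per_px        # bytes per full frame
mbit_per_s = frame_bytes * 30 * 8 / 1_000_000      # at 30 full redraws/sec

print(f"{frame_bytes / 1e6:.1f} MB per frame")     # → 8.3 MB per frame
print(f"{mbit_per_s:.0f} Mb/s at 30 fps")          # → 1991 Mb/s at 30 fps
```

Real sessions redraw far less than the full screen, which is exactly why damage-aware, compressing protocols like RDP and VNC come out ahead: they send only the regions that changed, and they send them compressed.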
Most modern toolkits have moved past X11 because the X protocol was severely lacking, and there was no good way for a committee to modify the protocol in a unified manner. I mean, look at the moving of heaven and earth it took to land the XFixes and Damage extensions. Toolkits wanted deep access to the underlying hardware, so they went out of their way to work around X, because it just could not keep up.


Shapiro’s previously unreported disclosure, dated Friday, came as part of a list of “corrections” to testimony by top SSA officials during last year’s legal battles over DOGE’s access to Social Security data. They revealed that DOGE team members shared data on unapproved “third-party” servers and may have accessed private information that had been ruled off-limits by a court at the time
Wow… Such shock… Much surprise.
Yeah, they were basically sending XLSs and CSVs of everyone’s shit over non-secure channels, like Cloudflare shares that had unknown people in them who didn’t work for the Government.
Because they’re all fucking amateur idiots who think they know what the fuck they’re doing, and they don’t. So you get shit like: “Hey brah, pass me that sheet you were working on.” “NP, posted the link in the Molon Labe Discord channel. Hey, you should check out the meme BindensProstate6969 posted LOLOL.”
I mean, why did anyone think anything different was ever going to come of this? And now a bunch of unknown private citizens, a loose collective fringe group that believes crazy shit, have everyone’s details.


Yes, this has a name: asymmetric warfare. Anyone at the other end of the US military has to engage in it at present, and the prospect that the United States will likely elect a Trump 2.0 president down the road is what is driving a lot of these countries to begin ramping up domestic weapons production.
All this will do is take resources that have long been given to the people and redirect them to building weapons, until the entire world is armed to the teeth and we are exactly where we were just before World War I. However, this will make those who profit from war even richer along the way, and thus the people are robbed of a world of shared opportunity we could have had, just because some rich assholes in the United States wanted more money.
This is one of those speeches that’s going to be a thing people study and read deep into, because it perfectly captures this moment in history and the generalized sentiment of the massive shift in the world today.


At this rate it’s Congress’ limp-dick energy that lets them get away with this. It’s absolutely amazing how spineless the institution has become.


But the prize was made of chocolate, so it’s only peace until it’s completely gone. Then it’s back to authoritarianism.
Firmware on these is locked down pretty tight. They’re usually using CC2510s or CC2530s. The CC2510 has a voltage-glitch attack you can use to attempt to read the contents via the DCOUPL capacitor, but it’s not very effective and you can only read a few bytes per attack.
You can see a GitHub with some tools people have created here. Eventually someone is going to read the firmware off these and be able to hack them; it’s just a matter of time.
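To give a feel for why “a few bytes per attack” still makes this just a matter of time, here is a rough dump-time estimate; all three input numbers are my own assumptions for illustration, not measured values:

```python
# Hypothetical dump-time estimate for a glitch attack that leaks only a
# few bytes per attempt. All figures are illustrative assumptions.
flash_bytes = 32 * 1024         # CC2510 flash tops out at 32 KB
bytes_per_attack = 4            # assumed yield of one successful glitch
seconds_per_attack = 10         # assumed reset + glitch + readout cycle

attacks_needed = flash_bytes // bytes_per_attack
hours = attacks_needed * seconds_per_attack / 3600
print(f"{attacks_needed} attacks, roughly {hours:.0f} hours")
# → 8192 attacks, roughly 23 hours
```

Slow, but entirely automatable with a microcontroller driving the glitcher and logging the leaked bytes, which is exactly why it really is just a matter of time.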