In a time when some of Satya Nadella's chickens are coming home to roost, with Windows being the most obvious example and most of their AI efforts quickly approaching the same fate, it's good to laugh at their stupidity as a consolation prize.
In the past Microsoft fucked up so many times, but they had absolute dominance of the market and a huge pool of talented, knowledgeable people capable of making them try again and win. Times have changed; many have retired or been laid off to make way for the next round of "cheap young" talent in the form of contract workers.
Now that they have the Cloud, I'm not so sure the Windows division can turn this turd around this time. Xbox has tangentially been the canary.
Absolutely this. It's IBM all over again: in 1989, they owned the PC market. It was theirs. Of course the cracks were deep and deepening, but IBM could still have maintained leadership of that industry to this day. Instead, they squeezed so hard that the PC market fell from their iron grip.
The fact IBM still exists and is an important company is irrelevant. They lost control of the de facto computing standard. Microsoft could lose control as well.
Maybe, but the markets also shift. Companies evolve. Old products fade out of importance and new goods and services appear. Their company value has never been higher.
I'm not sure I would have liked a world where IBM continued to control so much - we probably would have a much smaller open ecosystem.
I mean, the stuff I was hearing about Microsoft over a decade ago was that they were giving up on the OS and moving everything to a cloud-based SaaS model. Basically, focusing energy where the money is.
If you are at JPMorgan or in aerospace today using quantum computing, there is a very good chance you are using an IBM Heron 156-qubit scalable computer. They have been pushing quantum computers into very large companies for quite a while now.
Google is strictly in research mode, but doing a lot of good, hard work.
Can you give some concrete examples of “cheap young” contract workers being hired to work on product features? You seem to know a lot of things so maybe some concrete examples will help.
> Times have changed, many have retired or been laid off to make way for the next round of "cheap young" talent in the form of contract workers.
A couple of quotes from the article above:
"WebView2-based Microsoft Teams consistently uses 1-2GB of RAM while doing nothing. Microsoft likely doesn’t know how to make these web apps use fewer resources, so it’s instead moving Teams calling to a separate process to reduce crashes."
"But Teams is not the only web app causing trouble when RAM prices are about to soar, as we also have WhatsApp. When WhatsApp debuted on Windows, it was an Electron app. However, Meta later upgraded it to WinUI/XAML (also known as native code on Windows), and WhatsApp eventually became one of the best apps [... using] less than 200MB of RAM and had smoother animations and faster load times."
It seems that most developers these days focus on web-exclusive technologies and try to force desktop and system level programs into this paradigm.
C, C++, and C# programmers seem to be as rare as hen's teeth these days.
Are colleges and universities not teaching these languages anymore? Is this a symptom of 'cloud-first' strategies where it's easier to 'just use JavaScript' for everything, or perhaps developer laziness/reluctance to learn another 'lower level' language?
I really don't understand the appeal of web-centric languages like JavaScript and TypeScript in the desktop and systems realm. They lack a standard library (which genuinely scares me: supply chain attacks...), which likely contributes to the RAM consumption issue as developers keep piling on packages for one specific function missing from another imported library, and they can't be compiled natively into small binaries that don't depend on a runtime or a bulky embedded interpreter.
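As a rough illustration of that package-piling habit (date-fns is a real npm library; it's used here only as a stand-in for the general pattern, not as anything tied to the apps discussed above):

    // Formatting a date: a trivial need that, without a comprehensive standard
    // library, is very often solved by pulling in yet another package.
    import { format } from "date-fns";
    console.log(format(new Date(), "yyyy-MM-dd"));

    // The same thing with built-ins only, which many projects skip in favor of a dependency:
    const d = new Date();
    const pad = (n: number) => String(n).padStart(2, "0");
    console.log(`${d.getFullYear()}-${pad(d.getMonth() + 1)}-${pad(d.getDate())}`);

Each of those small dependencies drags in its own transitive tree, which is where the supply chain worry comes from.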
Yes, C# technically falls afoul of the runtime point (it needs .NET), but at least it has a comprehensive standard library backed by an enterprise (Microsoft, for all its faults), not by random developers on the internet.
Microsoft allowing key components of Windows 11 to be rewritten in web-wrappers is only going to drive people further into Linux, as the RAM affordability crisis continues.
Electron apps are indeed resource hogs, but that's not the reason Teams is crap. Nor is XAML the reason the WhatsApp app is good.
Developers focus on web because that's where the money is. Who would want to go down the Desktop road when it's less money and a dying field?
Similarly, IMO, declarative is the way to go for making UIs. A lot of these desktop UI frameworks are procedural, which is a drag to write UIs in, and they also have a fraction of the size and support that, say, React does.
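To make the declarative point concrete, here is a minimal TypeScript/React sketch (the names are illustrative only, not taken from any app discussed above): the component describes what the UI should look like for the current state, and the framework works out the screen updates.

    import { useState } from "react";

    // Declarative: the return value describes the UI for the current state;
    // React re-renders whenever setCount changes that state.
    export function Counter() {
      const [count, setCount] = useState(0);
      return (
        <button onClick={() => setCount(count + 1)}>
          Clicked {count} times
        </button>
      );
    }

    // A procedural toolkit would instead look roughly like:
    //   const button = new Button("Clicked 0 times");
    //   button.onClick(() => { count++; button.setLabel(`Clicked ${count} times`); });
    // i.e. you create widgets and mutate them by hand instead of describing the result.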
Another thing is that hardware is always getting better. There is no incentive to improve performance if no one complains. A vocal minority of tech guys raving about how Electron apps are resource hogs doesn't dictate what's performant and what's not.
I feel like this 'cloud-first' strategy will only get worse now that AI-assisted development is common. I notice my personal AI-assisted C# projects get far more complex than when I use some JS framework.
If it’s not the colleges and universities, you can bet the AIs are better trained on JS/TS.
Honest q, 3rd party crashing in: does it make a difference? My assumption is not; for parsing and for understanding it's grammatically fine, if significantly less common than the "for" construct.
Microsoft doing Microsoft things. Even with all those fresh coats of "open source" paint they've bathed themselves in over the last decade, they really can't change their DNA.
Expect the amount of f*ckery to increase as the AI realities set in but the number has to go up either way.
It reminds me of the good old days of Visual Studio + .NET + SQL Server where they played these games too.
"Just" some Copilot integration (in the form of chat or smart suggestions) is just the start.
The next major Windows 11 update, coming in 2026, will have full agentic AI with full control over your (your?) PC. And it will hard-require a pretty recent processor with a Neural Processing Unit to make it work (so a lot more e-waste is coming).
Nope, you go to upgrade, because windows update downloaded it and restarted, and it tells you “Your processor is not supported”.
Why would it be any different from the Windows 7 -> Windows 10 debacle? Disabling entire processor families after it boots into the installation and has wiped the previous Windows.
I think they fixed that to some degree. I have an old win10 PC that now has a persistent "upgrade to W11" banner that informs me my PC is below spec, so I can't upgrade. Fine by me!
A 2025 Linux kernel with all recent features is able to boot on a system from 2006.
Likewise, the Windows 11 kernel (Windows 11 is just a rebranded Windows 10; just look at the full build number, which starts with 10.x) could boot systems from ~2017 onwards. Maybe with some kernel features disabled that most (if not all) Windows 10 users would not miss anyway, but it could still boot without any issues. Those running a Rufus-patched Windows 11 are living proof of this.
This was never a technical issue, or one which could cost them money, but a cold-blooded business decision which generated thousands upon thousands of kilos of e-waste.
>A 2025 Linux kernel with all recent features is able to boot on a system from 2006.
Because no one on the kernel team likes deleting code, specifically because someone will try to install it on their old ass work laptop from a decade ago.
Microsoft choosing not to support that old ass laptop is a business decision; there are costs involved in maintaining the support structure. Linux, by contrast, is primarily funded by enterprises who use it on servers, whose hardware may not be refreshed for long stretches of time.
If Linus Torvalds or Miguel de Icaza introduces Copilot, I swear I'm going to go all in on BSD.
Almost nobody has this functionality on their desktop processor. I hate Microsoft as much as the next guy but there's enough real problems to yell about without making some up. The agentic AI will even be entirely opt in.
It's not. Just as TPM and several new CPU features are hard requirements for Windows 11 (which can be patched out by tools like Rufus, but that can leave you with a broken system after every single update you install), an NPU is a hard requirement for the upcoming major update with agentic AI.
They got a storm of criticism after that announcement, but Microsoft seemingly has not given a single fuck about that and has not backtracked on this decision.
(Just like they technically could have allowed Windows 11 to run on older PCs with some disabled features out of the box, but didn't)
Fuck did it really? How on earth does that pan out? Who uses notepad? Writers? Word. Coders? Vim or VSCode or <IDE of choice>. I just don’t understand their logic.
They make this beautiful pasture (Windows XP wallpaper) and then lay mines all over the field. Put up signs that say “Free Lemonade” and charge for parking.
It still starts very fast, even with quite large files and line wrapping. (Pretty much on par with lite-xl, though lite-xl did get much faster in a recent version; prior to that, it was easier and faster to launch Notepad.)
I've heard GitHub makes more money from Copilot than from everything else combined. You can think what you want about the strategy, but it's hard to ignore that.
But enterprises can negotiate not to use (and not pay for) Copilot, can't they? Or go with another provider if it's such a big deal. Plus, it being enabled by default in every VS Code (I haven't checked this; last I remember you need to sign in with GitHub) gets you onto the free tier, where you generate zero revenue for Microsoft and some expense (not too much, probably).
I’m not sure why people are surprised. If you watch Nadella interviews, he tells you what he thinks and where he wants to take the company.
He touts AI, services, agentic copilot, and all the other stuff customers are railing against.
Some Windows manager got crucified on X recently for an enthusiastic tweet about turning Windows into an agentic OS. People called for this person's firing. But this was straight out of Nadella's playbook.
Windows users are not the customers; businesses are. Tech conglomerates and everyone adjacent are going for the big money, because that's what everyone is doing. It really is a fantastic world devoid of anything but ROI numbers: the fastest way to get rich or die trying. Gangstas have got nothing on these cats.
I'm doubtful many businesses are requesting many of these features.
I don't think it's wise for them to want stuff like Recall (data exfiltration), or current state-of-the-art agents doing calculations or analysis for them, at least without a qualified human closely reviewing its output and conclusions.
I do see businesses wanting simpler, more reliable software with fluid and consistent interfaces, but MSFT isn't focusing on that.
Business owners are stupid, and they, too, need to signal to their investors that they're "all in" on AI.
AI is, currently, more of a culture than a product for businesses. At least, the ones that don't literally make the models. I'm sure a lot of business owners really do want all the AI stuff, and then their employees will just work around it, like they do with all sales or signaling driven decisions.
The bit that really annoyed me: you can't even remove the Copilot button from the Office ribbon any more. Microsoft simply have hidden the option in the Ribbon customisation settings.
Even though I don't use it, and have disabled as much Copilot functionality as Microsoft will let me.
At least in my experience, most of the poor UX can be explained by the fact that LG shipped underpowered hardware for the OS and apps that are expected to run on it. I bought my TV a year ago and it lags or loses input on the main menu, and it's even worse in apps. Forget it if you want to use the overlay menu to change a setting lol
If you remember Palm/HP webOS, it had Preware homebrew that didn't require exploits to run, it was supported by default and was amazing. LG patched the one vuln that would have let me at least root the TV.
The Android TV devices I bought from reputable retailers are at least beefy enough to handle input without lag, and I can run whatever APK I want on them.
The only software I want to run on my TV is the TV channels and all the streaming operators; for anything else, I have devices that I don't need to root.
My Android TV in the other room is likewise just good enough to run Android; it's not going to win any benchmarks either.
Agreed that the overlay menus on webOS take their time to come up, but I am not going into them often enough for that to get on my nerves.
I have a C8 from LG, and I'm still so happy with it after so many years; it works wonderfully as a dumb panel, and a great panel at that. I wonder whether it's still possible to use the newer ones like that. Does anyone have any experience? Asking because our neighbors want the same great "tv".
I have to agree; simply not buying LG isn't an option, as we'd have to rule out just about everyone for the same reason.
I have a slightly older webOS LG TV; it has a PS5, a Switch 2, a Fire Stick 4K Max, and an Onkyo receiver plugged in, and as an OLED TV it's incredible. LG would always be my first choice for picture. I don't care about built-in sound as I use a sound system.
Right now I'm in the market for another TV at around 65inches and was looking at the 2025 model LG OLED, I likely won't connect it to the internet and will probably just hook up an Apple TV following some discussion in another comment section about how much I hate my Fire TV for being ad-ridden.
Really I wish LG or someone would just make a dumb TV with 4+ HDMI, ARC, perhaps DP and a remote and let us hook up what we want; but it'll never happen.
This is my plan for the beginning of the new year (the 42" model), for mixed gaming and desktop usage (I know OLED isn't ideal for Windows work, but non-OLED gaming monitors are rather crap, e.g. due to less-than-ideal local dimming, ghosting, and mediocre colors compared to OLED, and so on).
I didn't plan on also making it an internet-connected TV; now I darn sure as hell won't.
It's a really sad state of affairs that the best course of action for new hardware now is to simply use it as it is, and never update it or plug it online, since for any chance of a minor issue being fixed there is 100x the risk it will go to shit in substantial ways (I have a Samsung Q990D; the soundbar literally died for good after an official update, though at least that one you had to push yourself from the phone or via USB).
Not possible with everything, or at least not without substantial hacking for many.
That seems a bit of an overreaction. The top 10 front loading washing machines on Consumer Reports' rating list are 8 LGs followed by a Samsung and another LG.
If you don't want WiFi you can still get a top rated washer. The LG WM3400CW, which is in a 3 way tie for high score, does not have WiFi (or Bluetooth, or any other radio).
Note: Consumer Reports says that it does have WiFi but they are mistaken. It does have LG's "SmartDiagnosis" which lets you view diagnostic data in their app which is probably what confused them. On models with WiFi the app gets the data via the network.
On the 3400 you press some buttons on the washer to tell it to send diagnostics, and then it sends them acoustically similar to the way analog modems sent data. You tell their app to use the mic to listen to that and decode the data.
The WM3470CW, #10 on the Consumer Reports list, also is radio free and uses sound for SmartDiagnosis. Consumer Reports correctly lists this one as not having WiFi.
That's the problem. Front-loading washers have generally been a terrible invention. Unbalancing and mold are among the widespread problems. The actually reliable washers are still top-load.
I've always wondered, since we only have front-load washers here in the UK: is there some sort of advantage to them, aside from space, which seems to be the obvious one? Does gravity help with battering the clothes around when the drum spins slowly enough that they can fall from the top of the drum?
Front loaders are gentler on clothes, use a lot less water, use a lot less energy, and spin faster in the spin cycle so there is less work for your dryer if you use one.
Top loaders are easier to load and unload, cheaper, and slightly easier to maintain.
With front loaders you should wipe the gasket after use because water left in its folds can promote mold and odors. With both you should leave the door open when not in use so air can circulate in the drum. With a front loader the open door can get in the way and is easier to accidentally close.
Interesting, thanks, I had no idea about much of this, I was aware of the door/mould thing, and stacking, though it's not something I've ever seen done here in the UK personally.
As a "typical" British household, we don't use a dryer, don't even own one in fact, we just hang our clothes to dry, which always struck me as ironic for such a humid, cold country, with smaller (than the US) homes and thus less space to hang stuff to dry.
It’s funny, I never connected my G5 to the network or accepted any of the optional T&Cs, so there’s now numerous places in the UI that say “accept terms to see personalised content”.
While I'm not a fan of mobbing on someone, as it easily escalates into bullying and gratuitous attacks, parodying his name is the least of my concerns. And he is a public figure. Being the head designer at Apple grants you that status, and don't doubt for a second that anyone who wants that type of job plays the fame/status game.
Build the "killer app" that the audience wants and needs, where D is the best lang to do it in and justifies the investment in learning yet another lang. I have no idea what that is, and nobody knows; it's the universal problem of any product/lang/tech. Right place at the right time, I guess.
Of course it can, but different killer apps for a different crowd. A missile tracking system wouldn't be the kind of application to do in PHP. Wrong app, wrong crowd.
I've been aware of D since its inception, more or less, but don't know it very well. I would say D lacks a "bombastic" feature, and maybe that's both the reason it's not used more and why it's such a good language.
It's not "memory safe" like Rust; yes, it's fast, but so are C and C++; it doesn't have the "massive parallelism / always-on" robustness of Erlang. It has a bit of everything, which is both good and bad.
Being a mid all-rounder is OK in my book; perhaps it's more a matter of some "post-AI" tech startup adopting it and getting massive or famous, like Ruby did because of the Web 2.0 era or Erlang with the WhatsApp thing.
Maybe D is good the way it is and will always be there.
D doesn't have a bombastic or killer feature. What it has is elegance. It simplifies things, and smooths out the ugly stuff. You don't have to worry if your char is signed or unsigned, or how many bits are in an int, or whether to use dot or arrow, or remember to make your destructor virtual, and on and on.
It's a more memory-safe language than C/C++; there's no need to worry about forward references; it has strong encapsulation, simple modules, and so on.
And let's face it - the C preprocessor is an abomination in modern languages, why does it persist?
Over the years, especially in areas where some "hard science" is needed, I've been seeing the US startup / US founder announcements where in reality it's mostly a cash grab for VC money: a glowing marketing operation for some dipshit business venture outsourced to Indians or Eastern Europeans, or the typical trust fund kid playing the "Little Timmy does a startup" game.
The type of guy Cook is was the "best" and safe choice for a company like Apple on the trajectory it was on. Now everyone is a multimillionaire in the bank, but the culture inside is quite hollowed out. Good luck to the next guy; he'll need all of it.
What's up with the naming of rage forked projects?
OpenMaxIO, Forgejo, Valkey, OpenTofu..
Some worse than others, but still...
Btw, MinIO making these not-so-open open-source moves should not be a surprise. Since the beginning, YEARS ago, their own CEO and lead people talked in a way that made it clear they wanted to follow the HashiCorp playbook, and a few years later they took quite a few hundred million in investment.
So let's not be childish: adopt it for what it is, and when it happens, adapt to it for what it is.
OpenTofu is a "best of a bad situation" kind of thing. The project was originally called OpenTF, which makes sense since Terraform config files almost always end in `.tf`, but HashiCorp sent out the lawyers, so they had to change names. Plus, it goes well with OpenBao, their Vault fork.
I don't get your point about naming. What would you have them named? OpenMaxIO seems different than any of the other ones you listed. They wouldn't have been able to name it OpenMinIO without some legal problems.