Growing up (born in late 70s), all I heard was “OMG OVER POPULATION” and how the planet can’t support the projected N billion people who will be living on it.
Now that the birth rate is actually slowing down to correct itself and we’re not all breeding like rabbits, that’s a bad thing?
This feels like a capitalist concern, “we won’t have enough workers to produce goods and then consume them!”
> Elderly care is basically going to wipe generational savings from the 20th century off the map
Probably for the best.
Currently most of that wealth is being hoarded by the top 0.1%, at the expense of 8 billion people having to deal with global warming for the foreseeable future (i.e., centuries).
If that's the best humanity can do with wealth, then burn it all down. As long as we keep some advances from medicine (vaccines, dentistry) and technology which aren't as energy intensive, it should all work itself out in the end.
What's going to wipe out billionaires is lack of a highly-educated workforce, because no one is having babies.
And no, you can't completely solve this by immigration (because the demographic crisis is global).
They might still stay billionaires in absolute terms, but a lot of their wealth will be wiped out as companies struggle to sell their goods to a population with reduced purchasing power (since we're too busy taking care of elderly folks)
"Among the major global environmental crises
– climate change, biodiversity loss and land
degradation, and pollution and waste – population
growth is most evidently a key factor in biodiversity
decline. This is largely due to increased demand
for food production, which leads to agricultural
expansion and land degradation (Cafaro, Hansson
and Götmark 2022). As the population grows and
consumption rises, fewer resources and less habitat
are available for non-human species (Crist 2019).
Overpopulation occurs when the total human
population multiplied by per capita consumption
surpasses the capacity of sustainable ecosystems
and resources. Although the global human population
continues to grow, per capita consumption is
increasing at a faster rate. To the extent that people are
disrupting natural habitats and degrading ecosystem
services for future generations, despite regional heterogeneity, some research suggests that most of the world’s nations may be considered overpopulated
(Lianos and Pseiridis 2016; Tucker 2019)"
Specifically going back to the 70s overpopulation concerns, things shifted with the Green Revolution / Norman Borlaug, but it came at the cost of reducing groundwater supply and reducing agricultural diversity. See 'The Globalization of Wheat' and https://climatewaterproject.substack.com/p/groundwater-and-c...
It's possible to have both overpopulation (too large a population for a given metric like water, energy, pollution, etc.) and demographic collapse (too many old people, not enough young workers). It's not intuitive, but they are separate phenomena.
The reaction to overpopulation concerns probably discouraged people from having kids but it's unlikely to be the main cause.
Less consumer demand means fewer jobs. When people can't find good well-paying jobs, they become pretty unhappy, and they won't be magically enlightened and out of misery by being told it's the capitalist wheel turning.
On a macro scale you want to see country-wide economic statistics go up, regardless of who the money ends up with. When your population's age distribution is uneven, you get swings in productivity and spikes in elderly-care costs that drag those metrics down. Combine that with short-term politics that isn't incentivized to prepare for it, but rather to play hot potato with it, and you get interesting situations. In the worst case, if the country is functioning paycheck to paycheck, every member of the workforce ends up supporting multiple elderly people and children via taxes, since their taxes were already spent on X or stolen long ago, back during the productivity boom.
Capitalism does not need and has never had free markets, though some arguments for capitalism being ideal rest on the assumption of free markets, along with a stack of other idealized assumptions, like human behavior conforming to rational choice theory.
China encourages exports and has no recent history of confiscating property owned by foreigners. Combined with cheap labor, this makes it a great place to set up sweatshops. If you are selling a good that can be made in a low-labor-cost area but you use high-cost labor, you will be outcompeted: the market won't buy your expensive products, so over time all the successful firms make their low-skill products in sweatshop zones.
The wiki article states "Up to 10,000 TWh/yr of power could be generated from OTEC without affecting the ocean's thermal structure", which converts to about 500 GW, which... isn't that much.
10,000 TWh/y = 1e7 GWh/y; divide that by 365.25 days/y to get a daily output of 27,379 GWh/day, then by 24 h/day to get an average continuous power of 1,141 GW. It's still more than a terawatt, three orders of magnitude larger than the largest nuclear reactors.
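If you want to sanity-check the conversion yourself, here is a throwaway C sketch (the 10,000 TWh/yr figure is taken straight from the comment above; nothing else is assumed):

    #include <stdio.h>

    int main(void) {
        double twh_per_year   = 10000.0;                /* figure quoted from the wiki article */
        double gwh_per_year   = twh_per_year * 1000.0;  /* 1 TWh = 1,000 GWh, so 1e7 GWh/y */
        double hours_per_year = 365.25 * 24.0;          /* roughly 8,766 h/y */
        /* Average continuous power in GW: energy per year divided by hours per year. */
        printf("%.0f GW\n", gwh_per_year / hours_per_year);  /* prints about 1141 */
        return 0;
    }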
This is neat, but it's not zettelkasten - it's building a browse-able knowledge DB from content.
Zettelkasten is about writing down your ideas in response to content, with a link to that content, and then linking to other ideas that you've already logged. It's not an extraction of ideas from that content. This is a common misunderstanding of zettelkasten.
> rampant monkeypatching that made the standard library hard to count on
That was very frustrating when doing regular development after using Rails: all the "built-ins" were actually patched into the stdlib by Rails and not available without it.
> What I want is a dispassionate discussion of how different language features impact code quality
This can be difficult because code quality, productivity, and safety are hard to objectively define and measure, so we always fall back on differences in interpretation and experience.
I would be interested in serious attempts to argue for this, even if they can't reasonably be backed up by data.
For example, I think there's a pretty strong argument that immutability makes it easier to write multithreaded code (perhaps at some performance cost), because it entirely prevents common types of bugs.
Similarly there's a good argument that making methods open to extension (like Kotlin or Julia) makes it easier for an ecosystem to adopt unified APIs without explicit coordination.
There's obviously a very strong argument that Garbage Collection prevents a lot of memory safety bugs, at costs to interoperability and performance.
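To make the immutability argument concrete, here's a minimal C sketch (the names are invented for the example, and C's const is only a convention, not language-enforced immutability like Rust's): readers of data that is never mutated after construction need no locking, and the race only becomes possible once something starts writing.

    #include <pthread.h>
    #include <stdio.h>

    typedef struct {
        double rate;
        int    year;
    } Config;                       /* built once, then only read */

    static void *reader(void *arg) {
        const Config *cfg = arg;    /* no lock needed: nothing writes after the threads start */
        printf("rate=%.2f year=%d\n", cfg->rate, cfg->year);
        return NULL;
    }

    int main(void) {
        Config cfg = { .rate = 0.05, .year = 2024 };   /* "frozen" before any thread exists */
        pthread_t t[4];
        for (int i = 0; i < 4; i++)
            pthread_create(&t[i], NULL, reader, &cfg);
        for (int i = 0; i < 4; i++)
            pthread_join(t[i], NULL);
        /* Data-race bugs only appear the moment some thread mutates cfg
           concurrently, which is exactly the class of bug language-level
           immutability rules out. */
        return 0;
    }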
I am the last person to ever promote Perforce, but as of last year or so it has the option for binary delta transfer using FastCDC.
Even without that, it is just straight up a lot faster than Git LFS. I know this because I benchmark it against Git pretty frequently, as I am creating my own large-file-capable VCS.
What do you mean by this? It's hardly equivalent to LFS. The binary files aren't replaced with a text pointer with actual content stored on a server elsewhere. Binary files are stored in the same place as text files.
From the user's perspective, when set up correctly Git LFS is transparent and they don't see the text pointers - the binary files are replaced on push and pull to the server.
It's the same user experience as Perforce?
Yes, Git is more low-level and it's possible to see those text pointers if you want to.
This is what you want to believe, but it's not true.
I’m really sorry, Git LFS is an ugly hack, and it’s always painful when you discover that some gamedev team has been forced into it by “better knowing” software developers.
It reminds me a lot of “features” of software that are clearly a box-ticking exercise, like how MS Teams technically has a whiteboard feature. Yet it lacks any depth: it’s not persistent so it’s gone after the call, and it’s clunky to use and to save.
… but technically the feature exists, so it’s harder to argue for better software that’s fit for purpose, like Miro or Mural.
The question as I recall was what Perforce does that Git LFS doesn't, so I'm sorry to disappoint but my hands were tied.
Anyway, I dunno, man. If you want binary files to work, some form of per-file mutex is indeed a requirement. And for this to work well, without being a lot of hassle (and regarding that, see point 2, which I note has been accepted without comment - not that I expected anything else, the argument that Git is the artist-friendly choice would be a difficult one to make), any modification of the mutex's state has to involve a round trip to ensure the info is up to date. You can't rely on something local, that only gets updated sometimes, because then the info can be out of date! Worst case, N people find out too late that they've all been making changes simultaneously, and now N-1 of them will almost certainly lose work.
(You might be inclined to moan at people for not going through the full process, but: we have computers now! They can do the full process for us!)
> produce memory safe software with a bit of discipline
"a bit of discipline" is doing a lot of work here.
"Just don't write (memory) bugs!" hasn't produced (memory) safe C, and they've been trying for 50yrs. The best practices have been to bolt on analyzers and strict "best practice" standards to enforce what should be part of the language.
You're either writing in Rust, or you're writing in something else + using extra tools to try and achieve the same result as Rust.
Like Rust, Zig has type-safe enums/sum types. That alone eliminates a lot of problems with C. Plus sane error handling with good defaults that are better than Rust's also contributes to code with fewer bugs.
Sure, there is no borrow checker, but a lot of memory-safety issues in C and C++ come from the lack of good containers with sane interfaces (std::* in C++ is just bad from a memory-safety point of view).
If C++ had gained proper sum types, error handling, and Zig-style templates 15 years ago, instead of the insanity that is modern C++, Rust might not exist or might be much more niche at this point.
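For anyone who hasn't been bitten by it, the C problem that tagged/sum types solve looks roughly like this (names invented for the example): the tag and the union travel together only by convention, and nothing forces you to check it before reading a member.

    #include <stdio.h>

    enum Kind { KIND_INT, KIND_STR };

    struct Value {
        enum Kind kind;
        union {
            int         i;
            const char *s;
        } as;
    };

    int main(void) {
        struct Value v = { .kind = KIND_INT, .as.i = 42 };
        /* Nothing stops you from writing printf("%s\n", v.as.s) here; it
           compiles fine and reinterprets the int as a pointer. A checked sum
           type (Rust enum, Zig tagged union) makes that mistake unrepresentable. */
        if (v.kind == KIND_INT)
            printf("%d\n", v.as.i);
        else
            printf("%s\n", v.as.s);
        return 0;
    }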
AFAIK "P2688 R5 Pattern Matching: match Expression" exists and is due C++29 (what actually matters is when it's accepted and implemented by compilers anyway)
Also, cheap bounds checks (in Rust) are contingent on Rust's aliasing model.
I actively dislike Zig's memory safety story, but this isn't a real argument until you can start showing real vulnerabilities --- not models --- that exploit the gap in rigor between the two languages. Both Zig and Rust are a step function in safety past C; it is not a given that Rust is that from Zig, or that that next step matters in practice the way the one from C does.
Wasn't Bun the project where the creator once tweeted something along the lines of "if you're not willing to work 50+ hours a week don't bother applying to my team"? Because if so then I'm not surprised and also don't think Zig is really to blame for that.
I'm pretty sure that in an overworked environment the engineers would reach for Rust's unsafe mode pretty quickly because they're too tired to make sense of the borrow checker.
I'm no expert, but I've been hacking in Rust for several years now, and the only unsafe I've written was required as part of building a safe interface over some hardware peripherals. Exactly as intended.
The borrow checker is something new Rust devs struggle with for a couple months, as they learn, then the rules are internalized and the code gets written just like any other language. I think new devs only struggle with the borrow checker because everyone has internalized the C memory model for the last 50 years. In another 50, everyone will be unlearning Rust for whatever replaces it.
Just for background, I have not tried out either Zig or Rust yet, although I have been interestedly reading about both of them for a while now, on HN and other places, and also in videos, and have read some of the overview and docs of both. But I have a long background in C dev earlier. And I have been checking out C-like languages for a while such as Odin, Hare, C3, etc.
> "Just don't write (memory) bugs!" hasn't produced (memory) safe C
Yes it did, of course. Maybe it takes years of practice, the assistance of tools (there are many, most very good), but it's always been possible to write memory safe large C programs.
Sure, it's easier to write a robust program in almost every other language. But to state that nobody ever produced a memory safe C program is just wrong. Maybe it was just rhetoric on your part, but I'm afraid some may read that and think it's a well-established fact.
>Yes it did, of course. Maybe it takes years of practice, the assistance of tools (there are many, most very good), but it's always been possible to write memory safe large C programs.
Can you provide examples for it? Because it honestly doesn't seem like it has ever been done.
I don't understand where you stand. Surely, you don't mean that all C programs have memory bugs. But on my side, I'm not claiming that discipline makes C a memory safe language either. This discussion has taken a weird turn.
> you don't mean that all C programs have memory bugs
Well all of them "potentially" do, which is enough from a security standpoint
There have been enough zero-days exploiting memory bugs that we know the percentage is also non-trivial.
So yes, if programmers can write bugs they will; Google SREs were the first to famously measure bugs per release as a metric instead of the old-fashioned (and naive) "we aren't gonna write any more bugs".
Haven't written C in a while, but I think this program has an integer overflow error when you input 2 really large integers such that the sum is more than a 32-bit signed integer can hold.
Also I believe entering null values will lead to undefined behaviour.
I'm not sure how showing that gp can't even write a dozen lines of memory safe C proves that doing so for the exponentially harder 100+k LoC projects is feasible.
The program contains potential use of uninitialized memory UB, because scanf error return is not checked and num1 and num2 are not default initialized. And a + b can invoke signed integer overflow UB. A program with more than zero UB cannot be considered memory safe.
For example if the program runs in a context where stdin can't be read scanf will return error codes and leave the memory uninitialized.
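Since the snippet itself isn't quoted in this thread, here's a hedged sketch of what the fixed version presumably looks like (variable names guessed from the comments above): check the scanf return, default-initialize, and test for overflow before performing the addition.

    #include <limits.h>
    #include <stdio.h>

    int main(void) {
        int num1 = 0, num2 = 0;                    /* default-initialize: no UB if scanf fails */
        if (scanf("%d %d", &num1, &num2) != 2) {   /* check the return value */
            fprintf(stderr, "expected two integers\n");
            return 1;
        }
        /* Test for overflow *before* adding, since the overflowing addition
           itself is the UB. */
        if ((num2 > 0 && num1 > INT_MAX - num2) ||
            (num2 < 0 && num1 < INT_MIN - num2)) {
            fprintf(stderr, "sum would overflow\n");
            return 1;
        }
        printf("%d\n", num1 + num2);
        return 0;
    }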
num1 and num2 are declared on the stack and not the heap. The lifetimes of the variables are scoped to the function and so they are initialized. Their actual values are implementation-specific ("undefined behavior") but there is no uninitialized memory.
> And a + b can invoke signed integer overflow UB. A program with more than zero UB cannot be considered memory safe.
No, memory safety is not undefined behavior. In fact Rust also silently allows signed integer overflow.
Remember, the reason memory safety is important is because violating it allows for untrusted code execution. Importantly here, even if you ignore scanf errors and integer overflow, this program accesses no memory that is not stack local. Now if one of these variables were cast into a pointer and used to index into a non-bounds-checked array, then yes, that would be memory unsafety. But the bigger code smell there is casting an index into a pointer without doing any bounds checking.
That's sort of what storing indexes separately from references in a lot of Rust structures is doing inadvertently. It's validating accesses into a structure.
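A minimal sketch of that "validate the index before using it" point (the function name is invented for the example):

    #include <stddef.h>

    /* Returns 1 and writes the element if idx is in range, 0 otherwise;
       roughly the check Rust's slice indexing performs for you implicitly. */
    int get_checked(const int *arr, size_t len, size_t idx, int *out) {
        if (idx >= len)
            return 0;
        *out = arr[idx];
        return 1;
    }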
Regarding initialization, if one wants portable code that works for more than one machine+compiler version, it's advisable to program against the abstract machine specified in the standard. This abstract machine does not contain a stack or heap.
Generally your comment strikes me as assuming that UB is some kind of error. In practice UB is more a promise the programmer made to never do certain things, allowing the compiler to assume that these things never happen.
How UB manifests is undefined. A program that has more than zero UB cannot be assumed to be memory safe, because we can't make any general assumptions about its behavior: UB is not specified to be localized, and it can manifest in any way, rendering all assumptions about the program moot. In practice, when focusing on specific compilers and machines, we can make reasonable localized assumptions, but these are always subject to change with every new compiler version.
Memory safety is certainly critical when it comes to exploits, but even in a setting without adversaries it's absolutely crucial for reliability and portability.
> In fact Rust also silently allows signed integer overflow.
Silently for release builds, and a panic in debug builds. The behavior is implementation-defined, not undefined; in practice this is a subtle but crucial difference.
Take this example: https://cpp.godbolt.org/z/58hnsM3Ge. The only kind of UB AFAICT is signed integer overflow, and yet we get an out-of-bounds access. If instead the behavior were implementation-defined, the check for overflow would not have been elided.
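I won't reproduce the linked godbolt code here, but the canonical illustration of that difference is the post-hoc overflow check:

    /* Hypothetical example, not the linked snippet: a wraparound check written
       *after* the addition. Because signed overflow is UB, an optimizing
       compiler may assume a + b did not overflow and delete the branch, so the
       guard silently disappears. Build with -fwrapv (which makes signed
       overflow wrap, i.e. defined) and the check survives and does its job. */
    int add_or_fail(int a, int b) {
        int sum = a + b;          /* UB if this overflows */
        if (b > 0 && sum < a)     /* "did we wrap?" is legally always false */
            return -1;
        return sum;
    }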
I wasn't trying to be a dick; I'm saying that in my experience no big C program is ever safe. You replied that it is possible, and I asked for an example. Providing a small script to prove that big C programs are safe isn't enough.
Making a broad statement like there has never been a memory safe C program is a bit of a dickish thing to say.
especially when you phrase it as
> Can you provide examples for it? Because it honestly doesn't seem like it has ever been done.
it comes off as pedantic and arrogant.
It obviously is possible to write memory safe software in C, and obviously it has been done before, otherwise we would not be currently communicating over the goddamn internet.
Asking for evidence of something this obvious is akin to asking for a source on if water is in fact wet.
I think pretty much any non trivial C example has memory safety issues. It doesn't mean that they aren't useful and can't be used. But time and time again we have seen security reports that point to memory issues. So no, I don't think I'm asking for something obvious, quite the contrary. I think the claim that it's possible to write big C programs that are memory safe is really strong and I heavily disagree with it.
It's not dickish, and it's weird that you seem to feel attacked/offended by that. It is a realistic conclusion that we have come to over the course of decades of C usage. One could call it wisdom or collective learning.