Omarchy is the passion project of a really wealthy person and is backed by his profitable business. What does ‘sponsoring Omarchy’ mean? Like.. where does that money go?
I think it amounts to providing free premium CDN service, the stuff you'd usually have to pay for. They didn't say anything about cash money changing hands.
That’s really reasonable then (apart from any disagreements with the author's views, I guess). Omarchy isn’t just a post-installation script; they have the entire thing bundled as an ISO, so I can see why an in-kind sponsorship of a CDN makes sense. Although it’s still unclear to me how Omarchy specifically fits into ‘the future of the open web’ vs Ladybird
I worked at an org which has a ‘modern’ NodeJS+React codebase and an ancient legacy Django app on Python 2.7 that is nearly 15 years old.
I was worried that the old codebase would be a pain to work on. It was completely the other way around. The Django app was a complete joy to work with and I literally had so much fun tidying it up and working with it that I’ll be sad when they finally retire it in favor of the new Go/React rewrite.
I just verified: you can install Python 2.7 on an up-to-date Windows install as well as an up-to-date Linux install. Python 2.7 hasn't received security updates (or really any updates) in years, but that doesn't mean it can't still work on a current OS.
Yeah, it's been 5 years (almost 6) since Python 2.7 stopped receiving security updates, but it does still run on modern OSes.
Looking at the list, I'm actually kind of surprised there aren't more CVEs for python 2.7, but if you're only running it locally or on an intranet I could see letting it ride.
It's nothing like a full rewrite. I migrated multiple projects; while there is a significant amount of work involved, it's a tiny fraction of what a full rewrite would require.
Did it a couple of times. Not something you can do with your eyes closed, but not even close to the nightmare of upgrading a JS application or upgrading a rails app.
I remember having great fun in QuickBASIC. And my son enjoys Scratch.
Django code is much more fun to work with than Node, but I can't imagine developing something competitive in it in 2025 to what I'm developing in Node. Node is a pain in the butt, but at the end of the day, competitiveness is about what you deliver to the user, not how much fun you have along the way.
* I think the most fundamental problems are developer-base/libraries and being able to use the same code client-side and server-side.
* Django was also written around the concept of views and templates and similar, rather than client-side web apps, and the structure reflects that.
* While it supports async and web sockets, those aren't as deep in the DNA as for most Node (or even aiohttp) apps.
* Everything I do now is reactive. That's just a better way to work than compiling a page with templates.
I won't even mention mobile. But how you add that is a big difference too.
It's very battery-included, but many of the batteries (e.g. server-side templating language) are 2005-era nickel cadmium rather than 2025-era lithium ion.
I would love to see a modern Node framework as pleasant to work with, as well thought-out, engineered, documented, supported, and designed as Django, but we're nowhere close to that yet.
You spell out a lot of examples, but all of them are purely technical. What is it that you can deliver to the user using Node that you cannot deliver using Django? This is a genuine question.
The lithium-ion battery analogy seems fitting: When we're not careful about sourcing those modern batteries from a trustworthy supply-chain, they tend to explode and injure the user.
Man, the only true part is the async/web socket part (and that's mostly because of Python, not Django itself) ... you can do a lot, and by a lot I mean almost 99% of websites/apps out there, with Django and its 2005-era nickel-cadmium features
Racket is a fun language. My university uses the bundled teaching languages for first year CS courses. Some people really hate it, and others silently like it.
Might have been Waterloo's Introduction to Functional Programming (CS 135). I have TA'd (technically ISA'd) that course several times and helped countless students in office hours. The struggling students didn't just hate Racket, they hated the whole HTDP philosophy of following a "design recipe" and writing documentation prior to implementing a function. Most of those struggling students essentially waited till the last minute to do the documentation, completely flouting the intention of the course.
I don't know if the strong students had the intended approach since they were never in office hours asking for help!
I really loved the course too. That's why I kept working for it! It always made me sad when students hated the course, which was most of the ones I met in office hours. I think the students who really loved the course did well enough that they didn't come to office hours, so I never met them!
I admit I'm one of those students who never used Racket in a non-academic setting (but mostly because I needed to contribute to already-existing projects written in different languages), and I was taught Racket from one of its main contributors, John Clements at Cal Poly San Luis Obispo. However, learning Racket planted a seed in me that would later grow into a love of programming languages beyond industry-standard imperative ones.
I took a two-quarter series of classes from John Clements: the first was a course on programming language interpreters, and the second was a compilers course. The first course was taught entirely in Racket (then called DrScheme). As a guy who loved C and wanted to be the next Dennis Ritchie, I remember hating Racket at first, with all of its parentheses and feeling restricted by immutability and needing to express repetition using recursion. However, we gradually worked our way toward building a Scheme meta-circular evaluator. The second course was language-agnostic. Our first assignment was to write an interpreter for a subset of Scheme. We were allowed to use any language. I was tired of Racket and wanted to code in a much more familiar language: C++. Surely this was a sigh of relief, right?
It turned out that C++ was a terrible choice for the job. I ended up writing a complex inheritance hierarchy of expression types, which could have easily been implemented using Racket's pattern matching capabilities. Additionally, C++ requires manual memory management, and this was before the C++11 standard with its introduction of smart pointers. Finally, I learned how functional programming paradigms make testing so much easier, compared to using object-oriented unit testing frameworks and dealing with mutable objects. I managed to get the project done and working in C++, but only after a grueling 40 hours.
I never complained about Racket after that.
In graduate school, I was taught Scala and Haskell from Cormac Flanagan, who also contributed to Racket. Sometime after graduate school, I got bit by the Smalltalk and Lisp bugs hard... now I do a little bit of research on programming languages when I'm not busy teaching classes as a community college professor. I find Futamura projections quite fascinating.
I'm glad I was taught programming languages from John Clements and Cormac Flanagan. They planted seeds that later bloomed into a love for programming languages.
that's an often repeated misconception about lisps.
lisps are pretty good at low-level programming, but then you'll need to make some compromises like abandoning the reliance on the GC and managing memory manually (which is still a lot easier than in other languages due to the metaprogramming capabilities).
there are lisps that can compile themselves to machine code in 2,000-4,000 LoC altogether (i.e. compiler and assembler included; https://github.com/attila-lendvai/maru).
i'm not saying that there are lisp-based solutions that are ready for use in the industry. what i'm saying is that the lisp language is not at all an obstacle for memory-limited and/or real-time programs. it's just that few people use them, especially in those fields.
and there are interesting experiments for direct compilation, too:
BIT: A Very Compact #Scheme System for #Microcontrollers (#lisp #embedded)
http://www.iro.umontreal.ca/~feeley/papers/DubeFeeleyHOSC05....
"We demonstrate that with this system it is clearly possible to run realistic Scheme programs on a microcontroller with as little as 3 to 4 KB of RAM. Programs that access the whole Scheme library require only 13 KB of ROM."
"Many of the techniques [...] are part of the Scheme and Lisp implementation folklore. [...] We cite relevant previous work for the less well known implementation techniques."
People always point this out as a failure, when it is the contrary.
A programming language being managed doesn't mean we need to close the door to any other kind of resource management.
Unless it is something hard real-time (and there are options there as well), we get to enjoy the productivity of high-level programming while at the same time having the tools at our disposal to do low-level systems stuff, without having to mix languages.
C++ is one of my favourite languages, and I got into a few cool jobs because of my C++ knowledge.
However, given the option I would mostly reach for managed compiled languages as a first choice, and only if really, really required, for something like C++; and even then, probably as a native library that gets consumed, instead of 100% pure C++.
I didn’t know you like C++. I’ve been reading your posts for a few years now and your advocacy of the Xerox PARC way of computing. I’ve found that most Smalltalkers and Lispers are not exactly fond of C++. To be fair, many Unix and Plan 9 people are also not big C++ fans despite C++ also coming from Bell Labs.
Back when C++ was becoming famous, my favourite programming language was Object Pascal, in the form of Turbo Pascal, having been introduced to it via TP 5.5 OOP mini booklet.
Shortly thereafter Turbo Pascal 6 was released, and I got into Turbo Vision, followed by Turbo Pascal for Windows 1.5 on Windows 3.1 the year thereafter.
I was a big Borland fan, thus when you got the whole package it was Object Pascal/C++; naturally C was there just because all C++ vendors started out as C vendors.
On Windows and OS/2 land, C++ IDEs shared a lot with Smalltalk and Xerox PARC ideas in developer experience, it wasn't the vi + command line + debuggers are for the weak kind of experience.
See Energize C++, as Lucid was pivoting away from Common Lisp, with Cadillac (what we would call an LSP nowadays), where you could do incremental compilation at the method level and hot reload.
You're right: although C++ was born on UNIX at Bell Labs, there is that point of view, which is also a reason why I always had much more fun with C++ across Mac OS, OS/2, Windows, BeOS, and Symbian, with their full-stack frameworks and IDE tooling.
However with time I moved into managed languages, application languages, where it is enough to make use of a couple of native libraries, if really required, which is where I still reach for C++.
I use it professionally. My favorite is its seemingly complete lack of bad behavior:
"3" + 1 is neither "4", "31", nor 4. It's illegal.
0 is not false (the truthiness of 0 in other languages causes endless confusion with filters and &&s).
Loop variables captured by closures don't all end up at the max value by the time the closures are finally called.
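For contrast, a quick sketch (my own, using standard Node semantics) of how those three cases actually play out in JavaScript:

```javascript
// 1. Implicit coercion: string + number concatenates rather than erroring.
const coerced = "3" + 1;                  // "31" in JS; a type error in Racket

// 2. 0 is falsy, so a truthiness filter silently drops legitimate zeros.
const kept = [0, 1, 2].filter(x => x);    // [1, 2]; the 0 vanishes

// 3. A `var` loop variable is shared by every closure created in the loop,
//    so each callback sees the final value of the counter.
const fns = [];
for (var i = 0; i < 3; i++) fns.push(() => i);
const results = fns.map(f => f());        // [3, 3, 3], not [0, 1, 2]

console.log(coerced, kept, results);
```

(Using `let` instead of `var` fixes the third case; these are exactly the footguns the comment is crediting Racket with not having.)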
And some positives:
Immutable/functional is the default, but mutability is easy too.
Nice optional, keyword, and variable arity support.
Straight forward multithreading, semaphores, shared state, and unshared state.
Excellent module system:
- renames both in and out, including prefixes, all applied to arbitrary scopes of identifiers (I may be using inaccurate terminology)
- nested submodules
- automatic tests and/or "main" submodules
.....etc.......
If I could be granted a wish, though, it would be for nice struct syntax, but I think that's in Racket's successor Rhombus; I haven't personally tried it yet.
I also sometimes wish it was slightly Haskell-ier in various ways, as did the talented individual who created Hackett.
If I were to guess why it's not used, it's because it's not used, which has a kind of downward-spiral force thing going on with it. If you're a random guy in charge of 200 dudes at BigCo, your first thought probably isn't "We should rewrite this whole thing in Racket!", it's probably more like "We should fire everyone and have Claude rewrite our codebase into Rust!" and tell your boss you saved 200*0.5M a year and ask for a commensurate bonus. But if you're solo and in charge of designing, implementing, and maintaining a system with 1 or 2 people over the next 20 years, you can use whatever language you want, and Racket's a pretty good choice.
None of this really ruins the language for me, considering pros vs cons as a whole, but sometimes I'm slowed down: by the time I finish mentally spelling out and typing the struct accessors, I've half forgotten the context I was in. And in general I'm sensitive to "eye bleed". Sometimes Racket looks like:
(define define match-define define define-values define begin cond define define)
when the real meat of the algorithm is more like:
cond
...and where Haskell's "where"s, "|"s, and "="s shine.
I'm sure I've over-answered your question but it's the holidays and I'm bored :)
edit: Since Racket uses dot already, it would probably have to be a different character, or the other way around.
University is meant to open people's horizons, to teach them how to learn, to show them computing systems in action that most people in programming bootcamps never deem possible, unless they are curious enough to learn about computing history.
Sometimes it takes a couple of years before a seed grows. I for one had a professor who said: "I am not here to teach you C or Java. I am here to teach you computer programming." and then went on to take us on a tour through various paradigms, including Prolog, back then DrScheme (which turned into Racket), C, Java and Python. At the time I didn't understand Scheme at all. Didn't understand the idea of passing a function as an argument, so deeply rooted in the imperative world I was. But a couple of years later, I came upon HN and comments mentioning SICP ... Didn't that professor teach us something about that? What if I took a look and started learning from this book everyone is recommending?
And there it was. I worked through approximately 40% of the book's exercises and gained a very solid grasp of Scheme and Racket, and for any hobby project I would take out Racket and try to build it. Along the way I learned many things that I would still not know today had I stuck with only mainstream imperative languages. I wouldn't be half the computer programmer that I am today without going the SICP and Scheme way. I also worked through The Little Schemer. What an impressive little book it is!
So it is far from what you claim. In fact even a little exposure to Scheme once upon a time can make all the difference.
Everyone gets to choose which language they use for their personal projects.
Where are all the Racket personal projects?
N.B. I say this as someone who personally contributed small fixes to Racket in the 90s (when it was called mzscheme) and 00s (when it was called PLT Scheme).
I view Racket as an academic language used as a vehicle for education and for research. I think Racket does fine in its niche, but Racket has a lot of compelling competitors, especially for researchers and professional software engineers. Those who want a smaller Scheme can choose between plenty of implementations, and those who want a larger language can choose Common Lisp. For those who don't mind syntax different from S-expressions, there's Haskell and OCaml. Those who want access to the Java or .NET ecosystems could use Scala, Clojure, or F#.
There's nothing wrong with an academic/research language like Racket, Oberon, and Standard ML.
I wish Standard ML had a strong ecosystem and things like a good dependency/package manager. I really liked it. But there is even less of an ecosystem around it than some other niche languages, and I've gone down the rabbit hole of writing everything myself often enough to know that at some point I will either hit the limit of my energy and burn out, or the limits of my mathematical understanding when implementing something. For example, how to sample from a normal distribution when the standard library only has a uniform one: there are so many approaches for an approximation, but to really understand them you need to understand a lot of math.
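On the normal-from-uniform question: the standard answer is the Box-Muller transform, which turns two independent uniform samples into two independent standard-normal samples. A sketch in JavaScript rather than SML (the function name is mine; any language with log, sqrt, sin, and cos works the same way):

```javascript
// Box-Muller transform: map two uniform(0,1] samples to two independent
// standard-normal samples via polar coordinates.
function boxMuller(u1, u2) {
  const r = Math.sqrt(-2 * Math.log(u1)); // radius
  const theta = 2 * Math.PI * u2;         // angle, uniform on [0, 2*pi)
  return [r * Math.cos(theta), r * Math.sin(theta)];
}

// Usage with any uniform source, e.g. Math.random():
const [z0, z1] = boxMuller(1 - Math.random(), Math.random());
```

(The `1 - Math.random()` keeps the first argument in (0, 1], since `Math.random()` can return exactly 0 and `log(0)` blows up.)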
Anyway, I like the language. Felt great writing a few Advent of Code puzzles in SML/NJ.
Racket is my first choice for most code I write these days and I've published a fair number of libraries into the raco package manager ecosystem in hopes other people using Racket might find them useful too.
Maybe my use case is abnormal, but I allocate the majority of my resources to a primary VM where I run everything, including containers, etc. By running Proxmox, I can now back up my entire server and even transfer it across the network. If I ever have some software to try out, I can do it in a new VM rather than on my main host. I can also ‘reboot’ my ‘server’ without actually rebooting the real computer, which meant less fan noise and fewer interruptions back when I used an actual rack-mounted server at home.