I personally got one of those jobs at a smaller, "non-tech" (read: engineering) company, quite close to where I grew up. Got to pick my language and tools, have ownership of my slice of the product, live a very comfortable and flexible 9-to-5 where they gave me the room to do my best work. Pays maybe half of the typical inflated tech salary. But I'm doing well for myself.
My only issues at this point are 1) my life has bucket list things to check off that have nothing to do with work, 2) I'm running out of super interesting things to work on and entering a sort of maintenance mode on my program, 3) the old guard, who were all retirement age, are checking out for a new generation, with a new direction.
So all that being said, I've been looking for work far afield in places closer to my bucket list items. Hopefully, I can find another smaller, specialized company that needs a programmer with my skillset, because this experience has been fantastic.
Couldn't cellular or Starlink substitute in this day and age? 5G coverage is pretty widespread now. Even if you might not actually want to leave, it's a perfectly good threat and functions as competition in their eyes.
I tried to leave just a few days ago and use Mint Mobile wireless internet. But then I found that I'd be double-NATed and my reverse proxy wouldn't work, so I'd have to give up my servers and VPN.
Nope. Usually much slower, with higher latency, and carrier-grade NAT (CGNAT) limits what you can do with your connection. Starlink uses CGNAT, and some mobile carriers do as well.
I greatly look forward to the day when the Godot team focuses on UI tools and workflow, a layout and theming engine, and slim UI-focused builds, so we can avoid this local-server React-UI insanity. It's such a stupid way to build desktop applications.
Most of the performance penalty for the languages you mentioned is because they're dynamically typed and interpreted. The GC is a much smaller slice of the performance pie.
In native-compiled languages (Nim, D, etc.), the penalty for GC can be astoundingly low. With a reference counting GC, you're essentially emulating "perfect" use of C++ unique_ptr. Nim and D are very much performance-competitive with C++ in more data-oriented scenarios that don't have hard real-time constraints, and that's with D having a stop-the-world mark-and-sweep GC.
The issue then becomes compatibility with other binary interfaces, especially C and C++ libraries.
> With a reference counting GC, you're essentially emulating "perfect" use of C++ unique_ptr.
Did you mean shared_ptr? With unique_ptr there's no reference-counting overhead at all. When the reference count is atomic (as it must be in the general case), it can have a significant and measurable impact on performance.
You might be right. Though with the way I design software, I'm rarely passing managed objects via moves to unrelated scopes. So usually the scope that calls the destructor is my original initializing scope. It's a very functional, pyramidal program style.
Definitely true! Probably add Swift to that list as well. Apple has been pushing to use Swift in WebKit in addition to C++.
Actually, Nim 2 and Swift both use automatic reference counting, which is very similar to using C++'s shared_ptr or Rust's Rc/Arc. If I couldn't use Nim, I'd probably go for Swift. Rust mostly gives me a headache. However, Nim is fully open source and independent.
Though Nim 2 does default to an RC-plus-cycle-collector memory management mode. You can turn off the cycle collector with mm:arc, or switch to atomic reference counting with mm:atomicArc. Perfect for most system applications or embedded!
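For reference, the mode is just a compile-time switch; as of Nim 2.x the relevant invocations look roughly like this (double-check exact spellings against the current Nim docs):

```shell
nim c --mm:orc app.nim        # default: reference counting plus a cycle collector
nim c --mm:arc app.nim        # plain reference counting, no cycle collector
nim c --mm:atomicArc app.nim  # atomic (thread-safe) reference counting
```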
IMHO, most large Rust projects will likely use Rc or Arc types, or use lots of clone calls. So performance-wise it's not going to be too different from Nim or Swift or even D, really.
> IMHO, most large Rust projects will likely use Rc or Arc types, or use lots of clone calls. So performance-wise it's not going to be too different from Nim or Swift or even D, really.
I do not think so. My personal experience is that you can get far in Rust without cloning or Rc/Arc while still not opting for unsafe. It is good to have ownership and borrowing as the default and use Rc/Arc only when (and especially where) needed.
Out of curiosity, I ran some basic grep and wc passes over the Ion Shell project. About 2.19% of its function declarations use Rc or Arc in the declaration. That is pretty low.
Naive grepping for `&` (excluding `&&`, and assuming most occurrences are borrows) turns up 1135 lines. `clone` occurs on 62 lines, a ratio of about 5.4%. Including the Rc and Arc lines alongside the clones gets you to roughly 10.3% versus borrows. That's using line counts as a rough surrogate for actual Rc/Arc usage.
For context, doing a grep for `ref object` vs `object` in my company's Nim project and its deps gives a rate of 2.92% ref objects vs value objects. Nim will use pointers to value objects in many functions. That's actually much lower than I would've guessed.
Overall: 2.19% of Rust functions in Ion using Rc/Arc vs 2.92% of my Nim project's types being ref objects rather than value objects. So it's not unreasonable to hold that they make similar use of reference counting vs value types.
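For anyone curious, the counting approach was roughly this; a hypothetical sketch with a made-up sample file and illustrative patterns, not the exact commands or code I ran against Ion:

```shell
# Sketch of the Rc/Arc-vs-fn counting described above.
# The sample file and regexes are illustrative only.
mkdir -p /tmp/rc_count && cat > /tmp/rc_count/demo.rs <<'EOF'
fn plain(x: &str) -> usize { x.len() }
fn shared(s: Arc<String>) -> usize { s.len() }
fn counted(r: Rc<Vec<u8>>) -> usize { r.len() }
fn other(y: &[u8]) -> usize { y.len() }
EOF
total=$(grep -cE '^\s*(pub )?fn ' /tmp/rc_count/demo.rs)   # all fn declarations
rc=$(grep -cE '\b(Rc|Arc)<' /tmp/rc_count/demo.rs)         # lines mentioning Rc/Arc
echo "$rc of $total fns use Rc/Arc"   # prints "2 of 4 fns use Rc/Arc"
```

On the real project you'd run the greps recursively (`grep -r`) over `src/`; counting matching lines, as here, only approximates counting actual usages.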
Modularity is how we put out some long-running fires at my company. Our old infrastructure, from people who had long since left, was a cross-cutting mish-mash of monoliths that each tried to do everything: data analysis, web serving, PDF generation. Written in everything from C to Perl to PHP to VBA. All were supported because different clients depended on different monoliths, depending on when they were brought aboard. Basically, each language was chosen to serve whatever walled-garden platform a given employee or client expected to use. The monoliths didn't even talk to each other, and they reimplemented the same algorithms. It was a mess.
We spun out the specialized tasks (data analysis and PDF generation key among them) to native-compiled binaries or containerized packages like Gotenberg, started moving data between modules via JSON, isolated the legacy monoliths to containers, unified on our now-modularized PHP backend, and have been working on updating or replacing any remaining pieces with new modules that serve the task better. Our clients and non-engineering employees get antsy, but as a smaller company with a smaller programming team, we simply cannot maintain multiple 20-year-old codebases with near-total overlap. It makes no sense now, and it didn't make sense when each of them was created.
Throwing my hands up and moving to Nim was downright easy next to the excessive effort I put into trying out Nuitka, Numba, and PyInstaller for my use case. If you want static compilation, use a language and libraries built with that assumption as a ground rule. The herculean effort of building a half-compatible compiler for a dynamic language seems like a fool's errand; it would be a fun curiosity if so many people hadn't already tried it, especially with Python.
I was looking for someone else who had done this; I had the exact same experience.
That said, for anyone looking into a completely statically typed language that has nice ergonomics, is easy to pick up but has enough depth to keep you busy for weeks on end, and is versatile enough to be used for anything: do yourself a favor and give Nim a try.
which has high compatibility and relatively good performance for that kind of thing. The strength of Python is that so many people are trying things with it.
all of this is well and good if you completely forget that there are billions of lines of Python in prod right now. so your grand epiphany is basically on the level of "let's rewrite it in Rust". i'll let you hold your breath until that rewrite is done (and in the meantime i'll explore workarounds).
I finished the rewrite in just a few months and have been happily maintaining it for two years and extending it with powerful features it never had before, partially thanks to optimized native binary speed, partially thanks to the ludicrous compatibility of a musl-libc static binary. The rewrite is the backend foundation of a web product stack that is best-in-category.
I didn't choose Nim because I was an evangelist; I had only toyed with it twice before. After much evaluation (and wrestling with an ultra-frustrating earlier Rust-rewrite attempt by my predecessor), Nim surfaced as the best tool for the specific job. And it still is. It doesn't have to be our entire product stack; it's doing a small subset of very important jobs, doing them well, and is invoked by other languages that do their jobs better as web servers and such. A modular stack helps each language and tool shine where it works best.
Nobody's talking about porting billions of lines of code, for all we know it's just for personal projects, or a learning experience.
This kind of reply kills an idea before it's even started; it smells like the sunk cost fallacy.
OTOH, I do understand the weight of an existing corpus in production; witness the ton of COBOL code still running. But still, your reply kind of sucks.
Now there are billions of lines of Python in production. But it wasn't so long ago that it seemed like the entire industry was going to completely standardise on C++, and if you wanted a picture of your future, it would be grinding away debugging buffer overflows and thread lockups, forever.
Nim honestly seems like a near-perfect general-purpose language. I wish it had been invented 10 years earlier, or that adoption would drastically increase, because the lack of ecosystem makes it tough right now.
I have a Master's in CS and am primarily a backend software engineer focused on mathematical and data analysis, currently specializing in vibration, signal, and sensor analysis, as well as data visualization. I recently reverse-engineered and rewrote from scratch an obfuscated backend analysis codebase, resulting in a maintainable, readable, concise, well-documented system.
I have some past experience building backend web services with Clojure, Postgres, and Datomic, and more recent experience making local frontend GUI tools with TypeScript.
I'm a strong advocate of FOSS tools, powerful pragmatic languages, modular technology stacks, and simple solutions. All directly informed by experience with how proprietary languages & libraries, monolithic stacks, and clever programming wizardry result in systems that are brittle and crash-prone, expensive or impossible to maintain, and difficult to defend to clients and courts.
This code style is psychotic. I had to reverse-engineer and verify a C codebase that was machine-obfuscated, and it was still clearer to follow than this. Increasing clarity through naming is great, but balancing information density is, dare I say, also a desirable goal. Compacting code hits rapidly diminishing returns once you're leaning on a language's insignificant whitespace.