
One day, long after I'm gone, people will finally accept that Python and JavaScript are no longer young languages.

JavaScript is 26 years old, Python is 31. They both continue to grow in importance year-on-year, JavaScript because there is nothing on the horizon which will plausibly replace it, and Python because a large number of industries and programmers genuinely love it.

I think there's a nontrivial chance they'll both still be languages of primary importance in 50 years, but I'd bet my bottom dollar that they'll at least remain as relics yet needing support the way Fortran and COBOL exist today.


> people will finally accept that Python and JavaScript are no longer young languages

> JavaScript is 26 years old, Python is 31

I can't speak for Python, but Javascript has changed¹ massively in recent years, more so (I expect) than Fortran or COBOL ever did in their active history. It could be argued that what we have now is a younger language with the same name.

> but I'd bet my bottom dollar that they'll at least remain as relics yet needing support

This I definitely agree with, though I suspect less so than Fortran/COBOL/similar. It is much cheaper to rebuild these days, so many other things change around your projects² anyway, and there are more forces pushing for change, such as a legion of external security concerns. That will add up to far fewer projects³ left to be maintained that haven't been redone in something new; the ones that do remain will be those that fell into the comfy gap between the cushions of “it still works, don't touch it” and “it is far more hassle to replace than to live with as-is”.

----

[1] the core language is still the same, but there is so much wrapped around it from the last decade or so that I suspect someone who learned it fresh recently would struggle initially on ECMAScript 3 or before/equivalent.

[2] where a Fortran/COBOL project might live for all its decades on the same hardware using the same library versions.

[3] not absolutely fewer, of course, but fewer relative to the number of people capable of working on them – much of the price commanded by legacy COBOL work is due to very few having trained on the language in decades, and many of those that did earlier being fully not-coming-back-for-any-price retired or no longer capable at all (infirm or entirely off this mortal coil), so those remaining in appropriate health and available are in demand despite a relatively small number of live projects.


Fortran77 vs Fortran90 were fairly different languages that required a substantial revision to the numerical methods assignments that I had in the early 90s as the department shifted from one to the other.

https://www.nsc.liu.se/~boein/f77to90/f77to90.html

> There are now two forms of the source code. The old source code form, which is based on the punched card, and now called fixed form and the new free form.

> ...

> A completely new capability of Fortran 90 is recursion. Note that it requires that you assign a new property RESULT to the output variable in the function declaration. This output variable is required inside the function as the "old" function name in order to store the value of the function. At the actual call of the function, both externally and internally, you use the outer or "old" function name. The user can therefore ignore the output variable.


> but Javascript has changed¹ massively in recent years

Does anyone have any good resource to learn modern JavaScript? Not any of the weekly js framework, but the updated language, capabilities and patterns.


I have found https://javascript.info/ to be a good resource for both learning and reference around modern JS. I visit it instead of MDN with regularity for practical examples of JS features.

The grammar can be a bit spotty in places - but it is open source and has gotten a lot better.


I can recommend Gary Bernhardt's execute program[0]. One of the courses offered is "Modern Javascript" which goes through additions in ES5 and ES2020. There are also multiple courses on typescript. It does cost some money, but there are occasionally special offers.

[0] https://www.executeprogram.com/


Yes...Fortran at least has changed a lot since inception. There have been Fortran 90, 95, 2003, 2008 & 2018 standards since, to keep up with the various industry fads of the time (You want OO Fortran? Sure thing.). You can get a good overview of Fortran features from inception through the 2008 standard in the paper "The Seven Ages of Fortran" by Michael Metcalf or on the Fortran wiki (https://fortranwiki.org/fortran/show/Standards).


Does a lot of that extra pool of features get used in production (relative to more "legacy" code), as seen with JS projects being regularly re-engineered, or is the Fortran user base more conservative? I'd expect the latter, but this is just a gut guess.


I'm not doing much work with Fortran-using communities these days, so this is an opinion only, and probably out of date in various ways.

Yes, developers are using the new features. Most code I touched was regularly using up to 2003 features, with later stuff dependent on other factors (solid compiler support across different compilers being a big one). However, most Fortran programs are going to be fleshed out with components taken from the vast collections of well debugged and tested libraries available, many of which are still F77, and probably will stay that way. Fortran is more 'conservative' in the sense that there's not much compulsion to rewrite working code in the name of 'refactoring' or whatever. Adoption of new features is more 'I'm going to use a select set of things that will make my life appreciably easier' rather than 'everyone on board with all the latest shiny things'.


> there is nothing on the horizon which will plausibly replace it

I'm not going to be making any bets - but the one project with real potential is WASM. A mature, polyglot ecosystem on top of WASM runtimes with web APIs seems like it could displace JS as #1 in the browser.


Almost no languages run as WASM.

This is not likely to change anytime soon (if ever), as nobody is working on it, and there is even quite strong opposition to getting features in that are fundamentally needed to run anything other than the very few languages that already compile to WASM. ("Nobody" is interested in invalidating their investment in JS ;-)).

Also WASM is actually slow, or better said, "it does not deliver its full potential".

It will need advanced JIT compilers to keep up with the other two major VM languages. But in this regard WASM is around 20 years of constant development and improvement behind.

My strongest hopes in this regard currently rest with Microsoft (even though I don't trust this company at all!), who are indeed interested in running their CLR stuff in a WASM VM, and could probably deliver on the needed features. But then, when you run a CLR VM (or a JVM) on top of a WASM VM, you know, you're just building the next Matryoshka… There are no real benefits to that besides "look mom, it runs in the browser".


Probably not. Unless you're rendering to another target besides the DOM (i.e. canvas), I doubt you'll see JS displaced as #1 in the browser. JS is not the performance bottleneck, the DOM itself is. And in the meantime, you've got 25 years of example code, component libraries, talent development, dev productivity tooling, browser integration, etc. built up around it.

And unlike other operating systems, the browser does not give you any kind of standard library of reasonably good components to build on. So the sheer size and volume of components and the ecosystem built up around npm will be an uphill battle for any WASM target language to compete with.


> Unless you're rendering to another target besides the DOM (i.e. canvas), I doubt you'll see JS displaced as #1 in the browser.

If we're talking on the level of 20, 30, 50 years, we may in fact be able to move away from a DOM-based web. And WASM is simply a binary spec, so it can adjust to whatever comes over the horizon. We've had similarly sized giants rise and fall in that span.


Python3 yes, but Python2 will have faded away.

Perl! Oh, poor Perl.

Python 3, or its children, will be around a long time. As will some version of /bin/sh


Yes, Perl certainly took an odd turn on their 'next gen version of the language' journey, but I'm willing to bet there will be a Perl community running 5.247.2 or some such decades from now, alongside sh, awk & sed.


> As will some version of /bin/sh

I hope not!

That's one of the things I pray every day will go away. (Even though I don't believe in any gods, and have been a Linux-only user for the last 20 years.)

The Unix shell language is one of the most horrific legacy technologies still around. I really wish it would die soon™ and finally be replaced by something sane!


It won't because it's too close to human interaction.


I don't get this argument.

UIs changed in the past, and still change the whole time.

Why would shell be any different?


Why did Python win the war with Ruby? Was it purely the math community deciding "this is where we throw our weight", leaving Ruby the runt of the litter?


The libraries. Ruby has Rails. Python has... everything else (plus Django, so it also kinda has "a Rails"). You'll likely be using something less well-maintained and shakier if you use Ruby outside of Rails stuff, than if you'd picked Python. Python's basically the modern Perl.

Why that all happened, IDK.

I write that as someone with a soft spot for non-Rails Ruby (after much consideration and repeated encounters, I kinda hate Rails). But it's rarely the best choice, unfortunately.


> Why that all happened, IDK.

I'd reckon the parent's suspicion about the scientific community is correct in that it was a large influence. When ML and deep learning blew up, the academic Python community was in a great position -- you had numpy and scipy early on (both of which can optionally use BLAS and LAPACK, btw), then scikit-learn for ML, matplotlib for plotting results, OpenCV ports, etc. As for why Python was adopted so early by the scientific community, I'm not sure. Maybe because it was a scripting language that was also very friendly for hooking into C and Fortran?


I genuinely love Python. Not in a shallow feature-to-feature way. But deeply that it has enabled a career and provides a livelihood to me and my family. It puts bread on the table. It taught me how to program and it taught me the power of computers.

Life changing tool. No other tool in my house comes close to what computers + python has done in my life.


Oh, I like it too. It's got problems like most languages that see any actual use, but it's totally OK, even good. I didn't intend my post as a put-down of Python, so if it came off that way—whoops, not what I was going for.


I kinda hate Django (ducks). The data model being so intricately tied to the business logic makes it impossible to refactor.


It's funny, you don't hear much about the Python/Ruby war anymore. Python was more of a general purpose language and had decent web frameworks (Django and Flask primarily). Ruby's main claim to fame was, and still is, Rails. Rails has lost a bit of steam over the years, partly due to node.js and the microservice revolution, so to speak. If anything, Sinatra is a better fit for microservices and yes, sure microservices aren't a perfect fit for all use cases, but they do exist now and are reasonably popular compared to when Rails first came out.

Additionally, Python made significant inroads as a teaching/academic language and a scientific/math/ML language.

Way back in 2004, I had been using C/C++, Java and Perl and was ready for something new and useful. I'd heard about Ruby and Python at that point and tried both. Ruby felt too much like Perl for my tastes (no surprise, it's kind of like OO Perl) and while I didn't love the significant whitespace in Python, it just looked cleaner and simpler to me.

I have been using Python off and on ever since. I have worked with Ruby a bit as well. What's funny is that they are fairly similar and I've long argued that the two language communities would be better and stronger if they "joined forces".

But of course people have strong opinions about programming languages. Myself personally, I like Python a lot more than Ruby, but I've been using Go for a few years now and it's my current language of choice.


Ruby was very much general-purpose. Homebrew was written in Ruby. Vagrant was written in Ruby.


True, but Python became more popular as a general purpose language. For example, Python started shipping in most Linux distributions sometime in the late 2000s; Ruby did not.

I didn't mean to imply that Ruby isn't or can't be a general purpose language.


Growth of data science and AI/ML saved Python from being over leveraged on web dev backends.

I’d say also it was more at war with node until data science took off.


It was already in wide use for scientific computing by 2000, due to the comparative ease of writing interfaces to C code. The main idea was to use Python as a glue language to "steer" high-performance computing.

The Python/C API was easy to learn and use, Python's reference counts worked well for C-based objects, and it was easier to build non-trivial data structures than Perl or Tcl, which were its two main competitors at the time.

(Tcl extensions required manual garbage cleanup, I remember Perl's extension API as being rather complex, and I had to read the Advanced Perl manual to understand something as simple as having a list of dictionaries.)
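
As a minimal sketch of why that glue role was so comfortable (my own toy example, with the libm lookup being a platform assumption rather than anything guaranteed): a "list of dictionaries" is just a literal, and calling into C can be a few lines of stdlib ctypes rather than a hand-written extension module.

    import ctypes
    import ctypes.util

    # Non-trivial data structures need no special syntax or extra modules.
    atoms = [
        {"symbol": "H", "mass": 1.008},
        {"symbol": "O", "mass": 15.999},
    ]
    print(sum(a["mass"] for a in atoms))

    # Glue to C: load the C math library and call cos() directly.
    # Assumes find_library can locate libm (typical on Linux/macOS).
    libm = ctypes.CDLL(ctypes.util.find_library("m"))
    libm.cos.argtypes = [ctypes.c_double]
    libm.cos.restype = ctypes.c_double
    print(libm.cos(0.0))  # -> 1.0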


Node didn't even exist yet when python and ruby were in competition.


Performance.

So many people say it doesn't matter. Until it does.

Python works around it by having so many libraries built in C or C++.
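
A minimal sketch of that workaround (my own toy example; timings will vary by machine, and NumPy is just one of many C-backed options): push the hot loop down into compiled code instead of iterating in Python.

    # The same reduction in pure Python vs. a C-backed library.
    import timeit

    import numpy as np

    xs = list(range(1_000_000))
    arr = np.arange(1_000_000, dtype=np.float64)

    def pure_python():
        total = 0.0
        for x in xs:
            total += x * x
        return total

    def with_numpy():
        return float(np.dot(arr, arr))  # the loop runs in compiled code

    print(timeit.timeit(pure_python, number=10))
    print(timeit.timeit(with_numpy, number=10))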


> Python works around it by having so many libraries built in C or C++.

Which works quite fine, until it doesn't.

By then, the needed rewrite in some language that delivers decent performance and safety all over the place in one package will be very expensive.

I'm not saying that you should avoid Python (and its native code kludge) altogether but when using it just pray that you never reach that point mentioned above. It's a dead end and will likely require an almost full rewrite of a grown, business critical (and already heavily optimized) application.


Prototyping in Python, then rewriting the performance critical parts in a speedier more difficult language is one of the most efficient paths a project could take.


I knew Python decently well before I ever played with Ruby.

Ruby to me feels like a very ugly version of Python. It's like Python and Perl had a baby, and I have very strong negative opinions of Perl's syntax. It baffles me how a language that people jokingly refer to as a "write-only" language ever got any sort of ground.


Python is easier to use if you come from a C/C++ style coding background.


I also think it is easier to use, period. I've used Ruby professionally since the Rails 1 days, and still program in it most days. A couple of years ago, while working at an AI company, I helped out on an ML project due to a time crunch, and I needed to use Python to contribute. I wasn't asked to do anything ML-specific, but rather help by building out infrastructure and data processing pipelines, i.e. the stuff that used the ML models.

I'd never used Python before but within a couple of hours I was writing code and in less than a week I'd engineered a pretty slick, very robust pipeline. I was quite honestly fairly astonished at how quickly I became productive in the language.

I could be wrong about this (my experience with Python started and stopped in that one week) but the impression I got was that Python is smaller, more constrained (i.e. fewer ways to do the same thing), and syntactically less complex.


Python is easier to use if you come from almost any background, programming or not. I believe this is primarily b/c there isn't a lot of "special syntax" in Python, it's all very explicit and common. The same is not true with Ruby.


Could you point out specific parts of python that are easier for someone with C/C++ background as opposed to Ruby? I remember starting with Ruby (after rudimentary CS50-level C), and finding it quite reasonable and logical, and nicer than python. I still think it's nicer than python, although I've long since stopped using it.


I believe the issue isn't so much "vanilla python" vs "vanilla ruby" for a developer coming from a C background, but rather that ruby's programming style leads to a significant bit of metaprogramming, which (aside from being a bit of a challenge to get one's head around) leads to various shops and frameworks building their own DSL for writing ruby.

Open classes give me the security heebie jeebies.

    irb(main):001:0> "foo".bar
    (irb):1:in `<main>': undefined method `bar' for "foo":String (NoMethodError)
            from /usr/local/lib/ruby/gems/3.1.0/gems/irb-1.4.1/exe/irb:11:in `<top (required)>'
            from /usr/local/bin/irb:25:in `load'
            from /usr/local/bin/irb:25:in `<main>'
    irb(main):002:1* class String
    irb(main):003:2*   def bar
    irb(main):004:2*     "foobar!"
    irb(main):005:1*   end
    irb(main):006:0> end
    => :bar
    irb(main):007:0> "foo".bar
    => "foobar!"
    irb(main):008:0> 
On one hand, that's really neat. On the other hand, the ability to add or modify a method in a system class is not something that I'd want near production code. I'm sure that other orgs have sufficient checks and style guides to prevent something like this from creeping in... but that sort of flexibility in the language is something that I'd prefer to stay away from if I want to be able to reason about ruby code.
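
For contrast, and purely as a hedged sketch (exact error messages differ between Python versions): CPython flatly refuses the equivalent patch on built-in types, which arguably means less DSL-style magic for a newcomer to absorb.

    # CPython won't let you add methods to built-in types,
    # so "foo".bar() can't be conjured into existence at runtime.
    try:
        str.bar = lambda self: "foobar!"
    except TypeError as exc:
        print(exc)  # e.g. "cannot set 'bar' attribute of immutable type 'str'"

    # Extending via a subclass (or patching your own classes) still works.
    class Greeting(str):
        def bar(self):
            return "foobar!"

    print(Greeting("foo").bar())  # -> foobar!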

See also the Ruby Conf 2011 talk "Keeping Ruby Reasonable" by Joshua Ballanco (https://youtu.be/vbX5BVCKiNs), which gets into first-class environments and closures.


I don't think that argument holds up. A VR headset, say, is astronomically high up the hierarchy of needs, but I know very few people who'd say the inherent value of a piece of modern technology is zero.

It really seems to be art specifically which people are often keen to describe as worthless, not any particular category of good that artwork might fall into.


> A VR headset, say, is astronomically high up the hierarchy of needs

Debatable. One of the most popular uses of VR is pornography, which is targeted at one of the needs on the very bottom layer. Other uses are probably mostly serving mid-level needs like social belonging or esteem needs.


I get frustrated with games that scale things with me. I want to be able to feel myself getting better, having my enemies scale with me on any axis really takes away from that feeling that I'm progressing.

I say this not because I think it's an especially noteworthy or important objection, but to echo GP's point, that it's very hard to find AI that suits everyone, and it's not just a matter of difficulty.


Exactly. I do want AI to scale with me, but by me selecting better AI when I get better. That's how I measure my progress: by having to select better AI. That makes my progress transparent.

And by having different AI at different difficulty levels, I think you actually can have AI that suits everyone. But I don't think most game companies like developing several separate versions of AI. They just want one that's superficially "good enough".


I recall seeing Sid Meier say in a talk that he liked to scale his AI to give players the sense of being on a knife's edge, and then scale it back near the end of a game to give a feeling of victory over impossible odds.


I think this is a weird framing of the issue. Sure, lots of businesses go under, and maybe being larger would have saved them, but maybe not. Plenty of VC-funded businesses go under precisely because they tried to be too large, when they could have perfectly comfortably served a few satisfied initial clients for enough money to pay all their bills.

I think the idea that companies go under because they aren't ambitious enough says more about modern attitudes towards growth than it does about the reality of business.


When I say growth I mean net profits. Those imploding VC companies were never profitable.

A larger profitable company has more chance of survival by shrinking into a smaller profitable company. It's a buffer. But an already small profitable company doesn't have that option, there's more risk.


For anyone as confused as me, from Wikipedia:

> [Hunger] stones were embedded into a river during droughts to mark the water level as a warning to future generations that they will have to endure famine-related hardships if the water sinks to this level again.


I think there is a similar concept in Japan about marking the high point of Tsunamis as a warning for future generations.


"If you see me, weep"


EDIT: Never mind, sorry, I missed part of the article, it does indeed say what you say.

---

That's the opposite of what your source says:

> In writing, however, use to in place of used to is an error.

"Used to X" was the standard past tense of "to use" in the sense of being in the habit of:

"I used to fish": I was previously in the habit of fishing; correct both in the past and today.

"I use to fish": I am presently in the habit of fishing; correct in the past but no longer understood today.

The second, however, is according to MW occasionally misused to mean "I was previously in the habit of fishing".


There's a difference when use is preceded by did:

> The problem becomes a little trickier in constructions with did. The form considered correct following did, at least in American English, is use to. Just as we say "Did he want to?" instead of "Did he wanted to?," so we say "Did he use to?" instead of "Did he used to?" Here again, only in writing does the difference become an issue.

> While in American English "did used to" is considered an error...

Personally, "didn't used to" looks jarring to me, but it made me open the article, so…


Oh, excuse me, I entirely missed that section of the page. Makes sense there's a geographical distinction, I'm not American so I suppose I don't have the intuitive issue with it, but on reading I see the logic.


Honestly, I started working this out because I thought it would be negligible. But I think you're right to doubt. Leaving aside the question of how much a human produces, since two people have suggested human output is neutral and I don't know enough to question it:

It takes an average of 0.10 kcal to walk up or down a step.[1]

2.2kg of CO2 are emitted per 2000kcal of consumption (I just averaged table 3 for want of a better idea)[2]

37 steps in a staircase (TFA, 46 total - 9 flat)

3.7kcal burned, 3.7kcal * 1.1g/kcal ~= 4g CO2 per person per trip

Obviously very rough, but unless I've made an order-of-magnitude error it's in the same ballpark.

[1] https://pubmed.ncbi.nlm.nih.gov/9309638/

[2] https://nutritionj.biomedcentral.com/articles/10.1186/s12937...
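
And purely as a sanity check on the arithmetic (a toy sketch using the same figures and assumptions as above, nothing more precise):

    kcal_per_step = 0.10          # [1], up/down averaged
    co2_g_per_kcal = 2200 / 2000  # [2]: 2.2 kg CO2 per 2000 kcal -> 1.1 g/kcal
    steps = 46 - 9                # TFA: 46 total minus 9 flat

    kcal_burned = steps * kcal_per_step
    co2_grams = kcal_burned * co2_g_per_kcal
    print(kcal_burned, co2_grams)  # ~3.7 kcal, ~4.1 g CO2 per person per trip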


I don't know, I maintain that policy fairly strictly, but I can imagine falling for this.

I won't as a policy give out information to an incoming call, and I do call back if they want any info from me. But my working memory is not endless. The topic of discussion had changed three times before he was asked for any information, and the information still wasn't PII, it was a confirmation code. The scammer knew enough about him that he wasn't especially on alert. I can well imagine that flag in my mind that I was on an incoming call having been lost before we got to that point. And I suspect that's exactly how the scam was designed.


Computers don't execute code perfectly 100% of the time.

I agree that it's fundamentally different, but I'm not exactly sure how, and I think it's subtler than you're suggesting.


It is fundamentally different in its mathematical foundations; some functions are formally verified and will therefore execute 100% perfectly (I guess you are talking about actual bugs like hardware issues?); what gpt3 does is not even close to that; if you put the same input to gpt3 multiple times it comes up with different answers. That is nowhere close to a computer executing an algorithm.


I'm not talking about GPT-3, I'm discussing the theoretical question raised by the grandparent of my comment: How is predicting the output of a function fundamentally different from executing the code?

We call computers deterministic despite the fact that they don't perform the calculations we set them with perfect reliability. The probability that they'll be correct is very high, but it's not 1. So the requirement we have for something to be considered deterministic is certainly not "perfectly a hundred percent of the time", as the parent to my comment suggested.


> if you put the same input to gpt3 multiple times it comes up with different answers. That is nowhere close to a computer executing an algorithm.

It's a non-deterministic algorithm, of which many kinds exist. Producing different answers that are close-ish to correct is in fact what a Monte Carlo algorithm does. Not that you'd use GPT3 as a Monte Carlo algorithm though, but it's not that different.
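
As a minimal sketch of that distinction (my own toy example, nothing to do with GPT-3 itself): a Monte Carlo estimate of pi is a perfectly respectable algorithm, yet it returns a different, roughly-correct answer on every run.

    import random

    def estimate_pi(samples: int = 100_000) -> float:
        # Sample points in the unit square; the fraction landing inside the
        # quarter circle approximates pi/4.
        inside = sum(
            1
            for _ in range(samples)
            if random.random() ** 2 + random.random() ** 2 <= 1.0
        )
        return 4 * inside / samples

    # Different runs give different answers, all close-ish to 3.14159...
    print(estimate_pi())
    print(estimate_pi())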


Sure, but if you have something as clear as some of the actual deterministic python code from the article, this doesn’t fly.

Close-ish to correct makes sense for some problems and makes no sense at all for others.


Assuming no hardware errors, they do.


I don't think that's a reasonable assumption. If we allow ourselves to assume no errors, we could just assume GPT-3 makes no errors and declare it equivalent to a code interpreter.


Interpreter? Sure. That interpretation is not "equivalent to executing the code", though.

Imagine a C compiler that does aggressive optimizations - sacrificing huge amounts of memory for speed. On one hand, it even reduces computational complexity, on the other it produces incorrect results for many cases.

GPT-3 as presented here would be comparable to that. Neither are equivalent to executing the original code.

Meanwhile, the result of something like gcc is equivalent, even if it runs on a computer with faulty RAM.


I've lost track of what point you're making.

Speed and memory are orthogonal to my point, which is about the output of two methods of arriving at an answer. I'm obviously not saying GPT-3 is anything like as efficient as running a small function.

What distinction are you drawing between the output of an interpreted program and a compiled program?


After two or three years, neither would the owner.

Edit: More seriously, if they took this approach, people would regularly lose their codes, which would necessitate a backup means of obtaining it. Which would no doubt require the purchaser log into their Google account. And we're back at square one.


> After two or three years, neither would the owner.

It might surprise you, but back before games were always online, everyone had to keep track of license keys. Still have a box with all of them. Only those that required feedback from activation servers are now useless, because the companies killed the servers.


And I misplaced a few CD keys over the years, and sometimes even repurchased the game for $20-50.

Modern phones are closer to $1000.


Some I still have, too. But even back then, I just looked for a key online rather than bothering to search for it. That was way quicker ... you didn't even have to go to any dark sites. Google showed them right up front.

(for single player games)

