Other than that, I would've understood this notion better in the '90s when we were all on dial-up. Maybe my perception is skewed by growing up and watching pictures load in real time on websites?
Now, even with outdated hardware on an OK connection, even larger sites like WAPO (3 MB) load what feels like instantly (within 5-10 seconds). If it loaded in 2 seconds or 1 second, I really don't know how that would impact my life in any way.
As long as a site isn't sluggish while you browse around.
Most of the time, and to a lot of people, it doesn't matter. I have a fast mobile data plan and fast home Internet. But even I have encountered the following circumstances where I wish sites were smaller:
- on an underground subway with slow and spotty connection
- on slow airplane WiFi
- on slow hotel WiFi
- on a slow mobile network while traveling internationally
I disagree that literacy is all about reading fiction, and I wish we could broaden it a little, even at school.
This might be an odd take, but I never liked reading books and have read very few books in my whole life. I do love to read news articles, forum posts, magazines etc. because the format fits me.
Judging by my education level and career, I'd say I did just fine without opening a single book.
I enjoyed reading fiction until I studied electrical engineering. After that I felt that fiction was a waste of time. On a cognitive level I don't think it is, but I think the feeling comes from having been instilled with a sense that reading has to involve some degree of learning, as though everything I do has to involve some self-betterment. It's frustrating, and I feel a sense of loss about it.
It's more of an oligopoly than a cartel to be honest. It's more like how the telco industry operates. High barriers to entry with generous subsidies and incentives for the existing few providers.
I wish breakthroughs in Alzheimer's were like breakthroughs in battery tech. Batteries are getting dramatically cheaper and denser every year with a clear path for 10+ years of improvements.
Alzheimer's treatments today slow cognitive decline by maybe 3 months and cost a fortune.
Both the paper itself and the article about it state in their titles that this is in mouse models of the disease.
Seems like many here bring up this point for no reason other than to repeat it over and over as a low-effort dismissal. It is exhausting to read these threads.
At the moment I am writing this, there is only a single comment about the actual article.
Why stop there? They should expand it to blockchain, 3D printing, VR, and quantum computing to make sure it really tickles the executives' imagination...
I get that we're skeptical of every little company doing "AI" projects, and we should be, but local AI for browsers already has several obvious applications - translating text without sending it off to a third party like Google, automatically generating subtitles / transcripts for arbitrary audio / video, etc.
Firefox, like most independent browsers, is also reliant on search engine revenue. If AI threatens search, then it also threatens Mozilla.
Plus Chrome, Brave, Safari, and Edge are already doing the same kinds of research, for the same reasons.
But I don't understand why people think AI can be lumped into the same category as VR/3D printing/blockchain/etc. It's clearly going to change the world, and in 200 years we'll look back on it the way we now look back on the invention of electricity. (You know, assuming we're still around as a species.)
To me, it's so weird to watch people dunk on AI for hallucinating a bit or because some CEOs hack it into their product in unfulfilling ways. Look how bad image generation was just a year ago (fuzzy and mushy), and now we have movie clips that are almost studio quality (but generated in minutes).
I agree that AI/ML is overhyped right now. There are however some practical applications for machine learning, even for web browsers. The most obvious one to me is translating content in foreign languages. Another would be tools that make poorly designed websites more accessible for the visually impaired.
As someone who is fluently bilingual, I have tested the exact things people say AI in the browser would be good for, translation and subtitle generation, and they were abysmal. It produced sentences that absolutely look correct, but the meaning had fundamentally changed. Someone fluent in both languages would know what it intended to say, but for those people we wouldn't need the translation anyway.
It was deemed far too much work to be viable at this point, seeing as we would still need to employ someone to double-check the translations anyway, so the project was canned.
Yeah, it doesn't really enable new projects in the language arts, unless it's cases where accuracy and truth matter less than volume (and those are mostly things that are harmful to society).
You can now write a non-fiction book with a subject matter expert and an editor, skipping the ghost writer. Saves one whole salary, right? No, because it’ll take a lot more time from both the SME and editor to reach the same quality level as before. You can dial the quality down a little—worse presentation of material, more repetition, more inaccuracies—and save some money, which was harder to reliably do before, so there’s that I guess, but now we’re back at “it saves money if you’re trying to make junk”.
I don't know how you can say that. We passed the Turing test within the past year, and nobody even noticed because that now seems like such a low bar for where we'll be in a few years.
ML is an important component of AI, but it's not a simple "rebrand". I get that it's easy to think of it as marketing hype, because there are a lot of people (the same people who got on the blockchain bandwagon) shilling crap. But what we're seeing isn't overhyped; it's severely underhyped.
> I don't know how you can say that. We passed the Turing test within the past year
Easy: remember OpenAI's claims regarding the bar exam? And did you read the news the last couple of days about how it was all bollocks?
Great... it's another hype cycle. It's "just" statistics, very impressive, but as soon as it runs out of training data, it will plateau... and it will run out by GPT-5 or GPT-6 :)
Then it can learn from its predecessor's hallucinations maybe :D
> But what we're seeing isn't overhyped; it's severely underhyped.
I don't know if passing the Turing test is really that much of an accomplishment. Some would argue we created software that passed the Turing test when Eliza was made back in the 60s. I mean, Eliza really shows that the Turing test isn't actually much of a test. You're testing more how easy it is to fool a person than how intelligent an artificial system actually is.
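For context, the whole Eliza trick was a handful of canned pattern reflections. A toy sketch in the same spirit, in Python (not Weizenbaum's actual rule set, just an illustration of the idea):

    import random
    import re

    # Eliza-style "conversation": no understanding, just reflecting the
    # user's own words back through regex templates.
    RULES = [
        (re.compile(r"\bI am (.*)", re.I),
         ["Why do you say you are {0}?", "How long have you been {0}?"]),
        (re.compile(r"\bI feel (.*)", re.I),
         ["Why do you feel {0}?"]),
        (re.compile(r"\bbecause (.*)", re.I),
         ["Is that the real reason?"]),
    ]
    FALLBACKS = ["Please go on.", "Tell me more.", "How does that make you feel?"]

    def respond(line: str) -> str:
        for pattern, templates in RULES:
            match = pattern.search(line)
            if match:
                return random.choice(templates).format(match.group(1))
        return random.choice(FALLBACKS)

    print(respond("I am tired of reading these threads"))
    # e.g. "Why do you say you are tired of reading these threads?"

A dozen rules like these were enough to convince some people they were talking to a therapist, which says more about us than about the software.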
Agreed, which is why the next part of my sentence went on to say "nobody even noticed because that now seems like such a low bar for where we'll be in a few years"
All the while there are numerous issues with Firefox on M1. They simply ignore or close them as “works for me” [1] while I personally have encountered at least half of these issues as of last month.
IMO they should first focus on being a good cross-platform browser that works well on desktop ARM before doing anything else.
Thanks for this link. Firefox has been getting worse for me stability-wise on my M1 Mac: even with tab discarding it consumes huge amounts of power, and at least two or three times a day it will just stop loading webpages, show errors in the network tab, and need to be restarted. I spend a couple of hours every few weeks trying to track down the issues in Firefox, and even in the bug tracker I can't find answers.
I also have a bizarre problem where any Chromium-based browser (Chrome, Brave, Edge) is extremely slow to load any page since upgrading to Sonoma, while Firefox and Safari are near-instant: it can take 60 seconds to even start the DNS lookup. After a couple of minutes it will eventually fully load a page. I've seen other people mention the same issue online, but no fixes. I have spent hours trying to debug and track down that problem too.
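If anyone else is hitting this, one quick way to narrow it down is to time a bare DNS lookup outside any browser, to see whether the OS resolver or the browser itself is at fault. A minimal sketch (example.com is just a stand-in hostname):

    import socket
    import time

    # Time a plain DNS resolution through the OS resolver. If this is fast
    # but Chromium is slow, the problem is likely in the browser (e.g. its
    # own async DNS or proxy settings), not in the system resolver.
    host = "example.com"
    start = time.monotonic()
    socket.getaddrinfo(host, 443)
    print(f"DNS lookup for {host} took {time.monotonic() - start:.2f}s")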
It's discouraging how much it feels like every software tool I use on every device has gone to shit, especially things as fundamental as a web browser.
Yeah, I stopped reading when I saw the focus on AI. And I could probably cobble together a good proposal, but I'm not willing to lump it in with that sociodigital fatberg.
To flesh this out, the root notes of C and A minor are not that close in the circle of fifths, but the chords sound similar because they share 2 notes (C is spelled C, E, G, and A minor is spelled A, C, E). A common device in major keys is to take a part played over the "1" chord (C in this case) and play it again, or some close variation, over the chord 2 steps down (A minor in this case) to get a "kind of the same, but sadder" version of the same part.
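A throwaway Python sketch makes the overlap concrete (pitch classes as integers, 0 = C; the triad formulas are just the standard major/minor intervals):

    # Pitch classes: C=0, C#=1, ..., B=11.
    NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

    def triad(root: int, quality: str) -> set:
        """Root + third + fifth; a major third is 4 semitones, minor is 3."""
        third = 4 if quality == "major" else 3
        return {root % 12, (root + third) % 12, (root + 7) % 12}

    c_major = triad(0, "major")   # {C, E, G}
    a_minor = triad(9, "minor")   # {A, C, E}
    print(sorted(NAMES[n] for n in c_major & a_minor))  # ['C', 'E']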
My son and I created a choose-your-own-adventure console game using Repl.It and Python.
It gave us time to go over programming fundamentals, but for the most part we just had fun and came up with wacky content for players to progress through. Later, we went back and added static images.
One of the most satisfying things for him was being able to spin up his retro game at school for his friends to play.
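For anyone who wants to try the same with their kid: the core of a game like ours is just a dictionary of scenes and an input loop. A minimal sketch (the scenes here are made-up placeholders, not our actual game):

    # Each scene id maps to (text, {choice key: (label, next scene id)}).
    scenes = {
        "start": ("You wake up in a spooky castle.",
                  {"1": ("Open the door", "hall"),
                   "2": ("Go back to sleep", "end")}),
        "hall": ("The hall is full of suspiciously wacky suits of armor.",
                 {"1": ("Poke one", "end")}),
        "end": ("The adventure ends here. Thanks for playing!", {}),
    }

    current = "start"
    while True:
        text, choices = scenes[current]
        print("\n" + text)
        if not choices:
            break  # a scene with no choices ends the game
        for key, (label, _) in choices.items():
            print(f"  {key}) {label}")
        pick = input("> ").strip()
        if pick in choices:
            current = choices[pick][1]

Everything after that is content, which is exactly the part a kid can own.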