Google had PageRank, which gave it much better quality results, and it kept users around by offering lots of free services (like Gmail) that were better than existing paid ones. The difference was night and day compared to the best other search engines at the time (WebCrawler was my go-to, then sometimes AltaVista). The quality difference between "foundation" models is nil. Even the huge models they run in datacenters are hardly better than local models you can run on a machine with 64 GB+ of RAM (though faster, of course). As Google grew, it got better and better at giving you good results and fighting spam, while other search engines drowned in spam and were completely ruined by SEO.
PageRank wasn't that much better. It was better, and word spread. Google also had a very clean UI at a time when sites like Excite and Yahoo had super-bloated pages.
That was the differentiation. What makes you think AI companies can't find moats similar to Google's? With the right UX and the right model, a winner can race past everyone.
I remember the pre-Google days when AltaVista was the best search engine. It just did keyword matching, so you'd have to wade through pages of results hoping to find something of interest.
Google was like night & day. PageRank meant that typically the most useful results would be on the first page.
Why would you take on that burn rate when you can invest, get the investment back over time in cloud spend, and maybe make off like bandits when they IPO?
The deadpan, emotionless delivery of the original memes is an important part of their humor. This remaster looks fancy but loses the entire spirit of the thing.
Given how famously Teslas switch self-driving off moments before an accident, these statistics are impossible to trust (and also because Musk is famously a liar).
> If FSD (Supervised) was active at any point within five seconds leading up to a collision event, Tesla considers the collision to have occurred with FSD (Supervised) engaged for purposes of calculating collision rates for the Vehicle Safety Report.
Sure, it's very possible, and I'm sure it has happened (an Autopilot-instigated crash occurring more than 5 seconds after disengagement), but I do think 5 seconds is a reasonable threshold for this data, at the very least.
Another example scenario: an FSD-caused spin-out, then getting hit by another car close to 5 seconds later. I've come close to this one before with a tire blowout but wasn't hit, though maybe you're right that these would be in the tail of the data.
Sorry, this is kind of nuts to me. You want something to play video games for you because the video game isn't fun? Just play a game that is fun. The point of the game is to play it.
Sorry if it was unclear, but I want something that can act like a dumb co-op partner in games that are obviously balanced around co-op play, so I can have fun playing the game. I've run into a few games like that and ended up dropping them when solo play got too tedious.
"The Electric Monk was a labour-saving device, like a dishwasher or a video recorder. Dishwashers washed tedious dishes for you, thus saving you the bother of washing them yourself, video recorders watched tedious television for you, thus saving you the bother of looking at it yourself; Electric Monks believed things for you, thus saving you what was becoming an increasingly onerous task, that of believing all the things the world expected you to believe."
If the game requires you to perform the grind yourself, without delegating it (think Albion Online, Eve Online, Black Desert Online, Path of Exile, etc., basically every MMO with an economy), it means the grind is actively part of the game design, and delegating it to a robot is just cheating (and against the ToS in all of those cases).
So the point of GP's post, which I agree with, is that if that kind of grind is not part of the fun for you, then the game wasn't designed for you (and please don't cheat, because you ruin others' fun).
OCaml is fantastic, but I avoided it and went with Rust for many projects because it had no multicore story and no green threads for a very long time. But it has that now, and I wish I could go back and start everything over in OCaml.
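For context on what changed: OCaml 5 added true shared-memory parallelism via domains, plus effect handlers that libraries like Eio build lightweight concurrency on. A rough sketch of the domains part (sum_range is just a toy function I made up for the example):

    (* Minimal OCaml 5 parallelism sketch: Domain.spawn runs work on
       another core; Domain.join waits for its result. *)
    let sum_range lo hi =
      let total = ref 0 in
      for i = lo to hi do
        total := !total + i
      done;
      !total

    let () =
      (* One domain handles the upper half while the current domain
         handles the lower half, then the results are combined. *)
      let upper = Domain.spawn (fun () -> sum_range 500_001 1_000_000) in
      let lower = sum_range 1 500_000 in
      Printf.printf "sum = %d\n" (lower + Domain.join upper)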
I've felt very similarly, and adopted Rust for those reasons. Even though I'd still love to have a garbage-collected, AOT-compiled ML variant, using OCaml still feels like a step backwards now. OCaml's module system and generics feel clumsy and unintuitive now that I've used Rust's trait system.
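To illustrate what I mean (a rough sketch; SHOW and PrintTwice are just names I made up): where Rust lets you write fn go<T: Display>(x: T) and call a method, the OCaml route is a signature plus a functor, which is where the clumsiness comes in:

    (* A signature playing the role a trait bound plays in Rust. *)
    module type SHOW = sig
      type t
      val show : t -> string
    end

    (* A functor: code generic over any module satisfying SHOW. *)
    module PrintTwice (S : SHOW) = struct
      let go (x : S.t) = print_endline (S.show x ^ " " ^ S.show x)
    end

    module IntShow = struct
      type t = int
      let show = string_of_int
    end

    (* You have to apply the functor and name the result before use. *)
    module P = PrintTwice (IntShow)
    let () = P.go 42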
Like clockwork, every year or so someone emerges and says "I'm going to fix computing," and then it never happens. We're as mired in the status quo in computing as we are in politics, and I don't see any way out of it, really.
Also, the website is very low-contrast (brighten up that background gray a bit!).
Pretty much. I mean, best of luck to them. One has to try if anything is to change but I have seen this kind of thing so many times. Filled with enthusiasm but lacking in execution.
I have been having a lot of fun with the PicoCalc. It's not targeted at end users, but it's fun for developers who want a taste of building things from first principles. More than anything, it can live independently from your other devices.
It comes with a BASIC interpreter, but the thriving community develops plenty of stuff: Python, Lua, Forth, Lisp... all have nice ports you can play with on-device. There is also a library of software developed for the RP2040 that has been ported, such as a classic Mac emulator. If you feel that a microcontroller is too underpowered, you can plug in a Luckfox Lyra, which runs a proper Linux with 128 MB of RAM.