Zee2's comments | Hacker News

Seems like the search is based only on the transcript/dialogue - not an image embedding. Would be super cool to actually use some CLIP/embedding search on these for a more effective fuzzy lookup.

Agreed. If you search for Barney, say, none of the top ten results actually picture him; they're mostly people speaking to or about him. Even running the frames through a vision LLM to get a list of keywords would yield better results than the subtitles, I suspect.

How would someone go about doing this, just curious?

You’d just run every picture through CLIP. Instead of text-to-image like most end users do with something like Stable Diffusion (been a while since I’ve done this), CLIP maps images and text into the same embedding space, so you can go the other direction: embed each frame once, embed the search text, and match on similarity. If you actually want descriptive words for each frame, you’d put an image-captioning model on top.

I’d guess famous characters like Bart, Marge, and the other Simpsons characters are well represented in CLIP’s training data, so matching them would be pretty easy.

Feel free to correct me on small details if anyone has this fresher in their mind, but I think that’s roughly right.
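
Roughly, the search side could look something like this (a sketch only: the Hugging Face checkpoint name, the frame directory, and the example query are placeholders I picked, not anything the site actually uses). Embed every frame once offline, then at query time embed the text and rank frames by cosine similarity:

    # Rough sketch of CLIP-based frame search. Checkpoint name, frame
    # directory, and query are illustrative placeholders.
    from pathlib import Path

    import torch
    from PIL import Image
    from transformers import CLIPModel, CLIPProcessor

    model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
    processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

    # 1) Embed every frame once, offline (batch this for a real corpus).
    frame_paths = sorted(Path("frames").glob("*.jpg"))
    images = [Image.open(p) for p in frame_paths]
    with torch.no_grad():
        image_emb = model.get_image_features(**processor(images=images, return_tensors="pt"))
    image_emb = image_emb / image_emb.norm(dim=-1, keepdim=True)

    # 2) At query time, embed the search text and rank frames by similarity.
    with torch.no_grad():
        text_emb = model.get_text_features(
            **processor(text=["Barney drinking at Moe's"], return_tensors="pt", padding=True))
    text_emb = text_emb / text_emb.norm(dim=-1, keepdim=True)

    scores = (image_emb @ text_emb.T).squeeze(-1)
    top = scores.topk(min(10, len(frame_paths)))
    for score, idx in zip(top.values.tolist(), top.indices.tolist()):
        print(frame_paths[idx], round(score, 3))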


That is something that Macs do at the system level (if you were experiencing this on a Mac)

This is super cool, and exactly what I’d want! Although I just tried creating a Drop (twice) and it didn’t seem to work.


This is AI-written.

- Ten em-dashes

- "not just A, but B"

    - wasn’t just a vacuum cleaner; it was a small computer on wheels
    - they didn’t merely create a backdoor; they utilized it
    - they hadn’t merely incorporated a remote control feature. They had used it to permanently disable my device
- incessant bullet points/markdown-style formatting

- And an overly dramatic/promotional tone

Obviously the image is AI as well, but /shrug


I don’t see why you’re being downvoted.


Because using good typography and editing is not a unique fingerprint of AI generated content.


AI's habit of writing every story as a bullet-point list really isn't a good writing style.


It was until AI started doing it everywhere. When I took a technical writing course in college, many years ago, breaking up paragraphs with bullet-point lists was one of the core techniques taught for writing clear, effective documentation.


I don’t think there’s any single way to be sure, but it sure reads like ChatGPT to me. Which I’m not sure is such a bad thing—I presume the author used an AI to help them write the story, but the story is real. Or maybe they edited it themselves to make it sound more generic. Whatever the reason, the style takes away from my reading experience. It’s a blog post, I expect some personality!


Yeah, stopped reading immediately when I noticed this


Oh boy. Is this not essentially a neuralese CoT? They’re explicitly labelling z/z_L as a reasoning embedding that persists/mutates through the recursive process, used to refine the output embedding z_H/y. Is this not literally a neuralese CoT/reasoning chain? Yikes!
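
To make concrete what I mean by that (a toy sketch on my part, not the paper's actual code; every module name and size here is made up): the inner state is refined purely in embedding space and never surfaces as readable tokens.

    # Toy illustration of a latent recursive "reasoning" loop, not the
    # paper's code. The low-level state z_L is refined over inner steps
    # without ever being decoded to tokens, then used to update z_H.
    import torch
    import torch.nn as nn

    d = 256
    f_L = nn.GRUCell(d, d)          # stand-in for the low-level update
    f_H = nn.GRUCell(d, d)          # stand-in for the high-level update
    readout = nn.Linear(d, 1000)    # maps z_H to output logits

    def forward(x_emb, inner_steps=8, outer_steps=4):
        z_L = torch.zeros(x_emb.shape[0], d)
        z_H = torch.zeros(x_emb.shape[0], d)
        for _ in range(outer_steps):
            for _ in range(inner_steps):
                # z_L persists and mutates across steps but is never
                # emitted as text: this is the "neuralese" part.
                z_L = f_L(x_emb + z_H, z_L)
            z_H = f_H(z_L, z_H)
        return readout(z_H)

    y = forward(torch.randn(2, d))   # (2, 1000) logits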


This is almost certainly LLM generated.

Six flowery “from-to”s in one article:

>from beginners to seasoned professionals

>from seasoned backend engineers to first-time data analysts

>from GPU acceleration and distributed training to model export

>from data preprocessing with pandas and NumPy to model serving via FastAPI

>from individual learners to enterprise teams

>from frontend interfaces to backend logic

And more annoyingly, at least four “not just X, but Y”.

>it doesn’t just serve as a first step; it continues adding value

>that clarity isn’t just beginner-friendly; it also lowers maintenance costs

>the community isn’t just helpful, it’s fast-moving and inclusive

>this network doesn’t just solve problems; it also shapes the language’s evolution

And I won’t mention the em-dashes out of respect to the human em-dash-users…

This stuff is so tiring to read.


The fact that the first ad on the page (for me) is a scam for "Free Robux!" is hilarious.


It’s still my strongly held belief that things like this are among humanity’s grandest achievements.


It's definitely an "on the shoulders of giants standing on the shoulders of giants" thing. Insane breakthrough technologies on top of other insane breakthroughs. Firing lasers at microscopic molten drops of metal in a controlled enough manner to get massively consistent results like what??


It’s a mind-blowing achievement, nothing short of sorcery if you think about it.

ASML machines hit tin droplets with a 25 kW laser 50,000 times a second to turn them into plasma and create the necessary extreme-ultraviolet light, and although that generates around 500 W of EUV, only a small fraction reaches the wafer due to losses along the way. I believe it was around 10%.
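
Quick back-of-the-envelope from those numbers (the ~10% figure is my recollection, not a verified spec):

    # Back-of-the-envelope using the figures above; the transmission
    # fraction is recalled from the video, not a verified spec.
    laser_power_w = 25_000    # drive laser average power
    pulse_rate_hz = 50_000    # tin droplets hit per second
    euv_source_w = 500        # EUV generated at the source
    transmission = 0.10       # assumed fraction reaching the wafer

    energy_per_pulse_j = laser_power_w / pulse_rate_hz    # 0.5 J per droplet
    euv_at_wafer_w = euv_source_w * transmission          # ~50 W at the wafer
    laser_to_euv = euv_source_w / laser_power_w           # ~2% conversion

    print(energy_per_pulse_j, euv_at_wafer_w, laser_to_euv)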

Here’s an incredible, very detailed video about it: https://youtu.be/B2482h_TNwg


That is a very high quality video.

One thing I am curious about: how many generations of process shrink is one of these machines good for? They talk about regular EUV and then High-NA EUV for finer processes, but presumably each machine works for multiple generations of process shrink? If so, what needs to be adjusted to move to a finer generation of lithography and how is it done? Does ASML come in and upgrade the machine for the next process generation, or does it come out of the box already able to deliver a resolution a few steps beyond the current state of the art?


If you’re interested in this stuff, Asianometry has lots of great videos. They’re not all on semiconductors, but he’s done a lot on the history, the developments, and what’s going on in that world.

https://www.youtube.com/c/Asianometry/videos


I've learned so much from this channel. One of the best out there.

Even the video about zippers was fascinating.

https://www.youtube.com/watch?v=9d6eNmtHFQk


That is a really cool video, thank you!

Maybe the high water usage is at some other stage? Or at intermediate preceding stages? I'd love to understand more end-to-end, as surely it isn't as easy as popping a wafer into a semi-truck-trailer-sized lithography machine.


Check out the Branch Education channel, they have a series of videos that explain how the underlying transistors are made in 3d space with multilayer exposures etc.

One thing to understand is that you’re seeing the accumulation of over 50 years of incredible engineering and cutting-edge science; these things were invented incrementally.


Lithography is one of many steps, but probably the most important one. You use it to expose a photoresist, which creates a mask for further processing. After exposing the photoresist you develop it, removing either the exposed or the unexposed resist. The remaining photoresist is then the mask: you either etch or dope the surface not covered by it, or deposit material on top. Then you remove the mask and start all over again for the next layer. The high water usage comes from repeatedly needing to clean the surface to remove chemicals and photoresist.
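
A toy sketch of that per-layer cycle, just to show how the cleaning (and therefore the water) multiplies across layers; the layer count and litres-per-rinse are made-up placeholders, not real fab figures:

    # Toy sketch of the per-layer patterning cycle described above.
    # LAYER_COUNT and WATER_PER_RINSE_L are illustrative placeholders.
    PER_LAYER_STEPS = [
        "coat photoresist",
        "expose (lithography)",
        "develop (remove exposed or unexposed resist)",
        "etch / dope / deposit through the resist mask",
        "strip resist",
        "rinse and clean wafer",   # this is where the water goes
    ]
    LAYER_COUNT = 60               # illustrative number of patterned layers
    WATER_PER_RINSE_L = 100        # hypothetical litres of ultrapure water per rinse

    rinses = sum(1 for _ in range(LAYER_COUNT)
                 for step in PER_LAYER_STEPS if "rinse" in step)
    print(f"~{rinses} rinses per wafer, ~{rinses * WATER_PER_RINSE_L} L of water (illustrative)")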


I've been trying to find that video!! Our professor showed it in class but I was half-asleep and I wanted to rewatch it so badly.


Thanks for this! This must be the best video on EUV lithography that has ever been made.


I think this clockwork-in-a-vacuum was preceded by the eidophor: a projector with a spinning disc of oil onto which an electron gun draws an image that is then illuminated by an arc lamp.

https://www.youtube.com/watch?v=5783jPTKzjk


Nah, an NVIDIA GPU has way more raw computational power and smarts.


See how fast it is running on 8 watts.


GPUs are far more regular in structure, with lots of "copy-pasted" blocks, because they are a collection of many relatively simple processors.


I’ll give it a shot. Zippers/velcro are critical for most modern military gear. Elevators are used to increase the storage capacity for warplanes on aircraft carriers. Thermometers (well, any temperature sensing device) are important for many weapons systems, guidance computers, etc. Wind turbines… hmm, the infamous Stuka siren was basically a wind turbine welded to the side of the plane!

(This is mostly facetious)


In my experience every hotel-integrated charger is barely 5 V / 0.5 A, much less PD or anything else. The parent comment was talking about the integrated USB chargers in rooms, if I’m understanding correctly (hence the need to still bring their own).


Confused, since 2x10 W sounds adequate for their overnight charging. (Which is still an upgrade for hotels, but a rather old one...)
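
Rough numbers on that (battery capacity and charging efficiency are assumptions on my part):

    # Rough charge-time comparison; battery capacity and efficiency are assumed.
    battery_wh = 15.0      # typical phone battery, assumed
    efficiency = 0.85      # assumed wall-to-battery efficiency

    for label, watts in [("5 V / 0.5 A hotel port", 5 * 0.5), ("10 W port", 10.0)]:
        hours = battery_wh / (watts * efficiency)
        print(f"{label}: ~{hours:.1f} h to full")

So a genuine 10 W port charges a phone in a couple of hours, while a 5 V / 0.5 A port like the ones described above needs most of the night.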

