
When using the OpenTelemetry Collector you might need a set of modules not supported by a standard distribution, e.g. the otelcol-k8s variant but with an extra exporter from otelcol-contrib. I found building such a custom distribution a bit cumbersome, but this project made it much easier.
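
For context, the standard route for this is the OpenTelemetry Collector Builder (ocb), driven by a manifest listing the Go modules to compile in. A minimal sketch, assuming you want a k8s-flavoured receiver plus one contrib exporter (the module versions here are illustrative and must match your collector release):

    # builder-config.yaml for ocb
    dist:
      name: otelcol-custom
      description: otelcol-k8s plus an extra contrib exporter
      output_path: ./otelcol-custom
    receivers:
      - gomod: github.com/open-telemetry/opentelemetry-collector-contrib/receiver/k8sclusterreceiver v0.114.0
    processors:
      - gomod: go.opentelemetry.io/collector/processor/batchprocessor v0.114.0
    exporters:
      - gomod: go.opentelemetry.io/collector/exporter/otlpexporter v0.114.0
      - gomod: github.com/open-telemetry/opentelemetry-collector-contrib/exporter/prometheusexporter v0.114.0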

Rust runs quite well today via WebAssembly. Continuing to improve interop between Web API / WASM / language runtimes seems like a good route that allows people to use the language they prefer against shared Web APIs.
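
As a rough sketch of what that interop looks like today with the wasm-bindgen and web-sys crates (the Window and Document features must be enabled for web-sys in Cargo.toml; the function names are my own):

    use wasm_bindgen::prelude::*;

    // Exported to JavaScript; callable like a normal function from the page.
    #[wasm_bindgen]
    pub fn greet(name: &str) -> String {
        format!("Hello, {name}!")
    }

    // Calling a Web API (the DOM) from Rust through the web-sys bindings.
    #[wasm_bindgen]
    pub fn set_page_title(title: &str) {
        let document = web_sys::window()
            .expect("no window")
            .document()
            .expect("no document");
        document.set_title(title);
    }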

What's really neat is that most of the problems are so compact that you can read the full problem in a minute, then spend hours or days thinking about possible solutions.

I created a PDF version that I keep on my reMarkable for puzzling: https://github.com/pveierland/project_euler_offline
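
To give a sense of the compactness: Problem 1 is a single sentence ("Find the sum of all the multiples of 3 or 5 below 1000"), and a first brute-force solution is only a few lines, e.g. in Rust:

    // Project Euler, Problem 1: sum of all multiples of 3 or 5 below 1000.
    fn main() {
        let sum: u32 = (1u32..1000).filter(|n| n % 3 == 0 || n % 5 == 0).sum();
        println!("{sum}"); // prints 233168
    }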


Thank you for the PDF version!


Isn't it good that there are some forcing factors to help ensure the quality of the content? I get that there's plenty of drama and difficulty in building and moderating the content of Wikipedia, but it certainly does not appear to stagnate in terms of content if you look at e.g. the number of articles on English Wikipedia. The overall process appears to produce great outcomes, and it is the greatest collection of knowledge ever created.

https://en.wikipedia.org/wiki/Wikipedia:Size_of_Wikipedia


GP wasn't complaining.


I took it as a statement that it is prohibitively difficult to contribute to Wikipedia, and wanted to point out that a large number of contributions are being made and that the resulting quality is high, in part due to the difficulty of making contributions.


My comment was disputing the statement above that anyone can just stick junk in Wikipedia. While, yes, anyone can submit edits, it's pretty hard to get them accepted, so the content on Wikipedia is more reliable than just a public notepad.


You are mistaking quality for difficulty. Many people have quality information to contribute but lack the time for the politics.


Where do I mistake quality for difficulty?

My statement was that the quality of Wikipedia overall is high, and that one of the reasons is that they set and enforce standards for contributions.

Certainly many people are put off by the process and will not have time to deal with it, but I would expect such cases to be more likely on controversial topics and less likely on uncontroversial ones. Inherently, collaborating on difficult topics will be a difficult process, which also means there is likely no easy answer for how to keep that process from discouraging anyone.


The forcing factors aren't what they are supposed to be, though. "Credible" sources and citations are exclusively up to the article moderators' personal tastes, which are very rarely objective.


> The forcing factors aren't what they are supposed to be, though.

Is it clear what they should be instead, and are there any examples of mechanisms that have worked better at this scale? How are you judging that they are not what they are supposed to be?

If the resulting body of work, the totality of Wikipedia, manages to be a curated, high-signal collection of knowledge as a result of these mechanisms, how can it be said that they are not working? Having forcing factors that push contributors to increase the quality of their edits, even if those factors are not ideally aligned or executed, seems like a good thing overall. I'm not saying its processes and mechanisms cannot be improved; I'm saying I believe it is incorrect to claim that they are not working as a whole.

> "Credible" sources and citations are exclusively up to the article moderators personal tastes which are very rarely objective.

Overall I believe Wikipedia is curated by a large group of people who coordinate through various rules and consensus mechanisms. I don't believe it is correct to state that sources and citations are exclusively up to any specific article moderator, as moderators need to build consensus and co-exist with the rest of the moderation.

Precisely because Wikipedia is such a large body of work, having a large number of curators with different tastes and motivations makes it more resistant to corruption. How would you determine that their selection of sources and citations is very rarely objective, especially when objectivity itself is hard to agree upon for many of the topics covered?

From my perspective it seems far more important to consider the quality and value of the totality of Wikipedia, which is massive and a sign that many things are working, rather than insisting that it is not working, especially at a time when knowledge is being broadly attacked and Wikipedia is one of the targets.




Shorter than I would have imagined.


The SpecTec mentioned in the announcement is really cool. They're using a single source of truth to derive LaTeX, Sphinx docs, Coq definitions for proofs, and an AST schema. Building the language spec so that its soundness can be proven, with everything derived from one source of truth, seems super useful.

https://webassembly.org/news/2025-03-27-spectec/


> "The content loading failed."

It's amazing how far and how short we've come with software architectures.


This "plan" seems more like a vision announcement compared to Master Plan #3. MP3 had a structured and quantified thesis, while this post offers few details. Wonder if they'll release a more substantiated version?

https://www.tesla.com/ns_videos/Tesla-Master-Plan-Part-3.pdf


Will any source to build new images remain available without subscription?


They write in the press release that the sources remain under the Apache 2 license; they are just stopping free distribution of prebuilt images.

Edit: As far as I can see, that's true.

Source code for OCI images: https://github.com/bitnami/containers/tree/main/bitnami

Charts: https://github.com/bitnami/charts/tree/main/bitnami


> Source code for OCI images: https://github.com/bitnami/containers/tree/main/bitnami

If you look at the folders there, you'll see that all of the older Dockerfiles have been removed, even for versions of software that are not EOL.

For example:

PostgreSQL 13 (gone): https://github.com/bitnami/containers/tree/main/bitnami/post...

PostgreSQL 14 (gone): https://github.com/bitnami/containers/tree/main/bitnami/post...

PostgreSQL 15 (gone): https://github.com/bitnami/containers/tree/main/bitnami/post...

PostgreSQL 16 (gone): https://github.com/bitnami/containers/tree/main/bitnami/post...

PostgreSQL 17 (present): https://github.com/bitnami/containers/tree/main/bitnami/post...

> The source code for containers and Helm charts remains available on GitHub under the Apache 2.0 license.

Of course they're all still in the Git history: https://github.com/bitnami/containers/commit/7651d48119a1f3f... but then they must have a very interesting interpretation of what "available" means.


It looks like setting up a mirror and CI/CD on top of GitHub might work for some time; GHCR is free for public images.
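
As a minimal sketch of the mirroring step, while the images are still pullable (the image name, tag, and org here are illustrative):

    # Mirror an existing image to GHCR before it disappears.
    docker pull bitnami/postgresql:17
    docker tag bitnami/postgresql:17 ghcr.io/your-org/postgresql:17
    docker login ghcr.io   # authenticate with a GitHub personal access token
    docker push ghcr.io/your-org/postgresql:17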


I've been thinking a lot about this kind of thing recently, and put up a prototype of htvend [1] that allows you to archive dependencies during an image build. The idea is that if you have a mix of private/public dependencies, the upstream ones can be saved off locally as blobs, allowing your build process to be re-run in the future even if the upstream assets become unavailable (as appears to be the case here).

[1] https://github.com/continusec/htvend


Their Dockerfiles include things like downloading pre-built binaries from $SECRET_BASEURL, which is hosted by them; the URL can still be found in the git log, though. I imagine it will go offline or be put behind auth soon enough.


Or, if you have a decent-sized deployment in one of the clouds, it's extremely likely you already use its internal registry (e.g. AWS ECR). I know that we do. So it's just a case of setting up a few Docker build projects in Git that push to your own internal registry.


Is it clear whether the Debian image sources will continue to be maintained?


I do not see any direct statement that they will stop maintaining the sources as open source.

We'll see :)


It is at the top of the announcement. This only affects OCI images, not source code: "The source code for containers and Helm charts remains available on GitHub under the Apache 2.0 license."

