
So this seems to be a build definition and some form of attestation system? Does this require that builds be done via CI systems instead of on ad hoc developer machines?

I find that for many npm packages, I don't know how the published artifacts were actually built, and for some projects that I rebuilt myself in Docker, I got vastly different sizes of distribution artifacts.
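A quick way to see where such a size difference comes from is to diff the file lists of the registry tarball against your local rebuild. A minimal sketch (the helper names and demo tarballs here are illustrative, not any real tool's API):

```python
import os
import tarfile
import tempfile

def member_sizes(tgz_path):
    """Map each regular-file member of a .tar.gz to its size in bytes."""
    with tarfile.open(tgz_path, "r:gz") as tar:
        return {m.name: m.size for m in tar.getmembers() if m.isfile()}

def diff_tarballs(published, rebuilt):
    """Report files present in only one artifact, or differing in size."""
    a, b = member_sizes(published), member_sizes(rebuilt)
    return {
        "only_in_published": sorted(set(a) - set(b)),
        "only_in_rebuild": sorted(set(b) - set(a)),
        "size_mismatch": sorted(k for k in set(a) & set(b) if a[k] != b[k]),
    }

def make_tarball(path, files):
    """Build a small .tar.gz from a {name: bytes} mapping (demo only)."""
    with tarfile.open(path, "w:gz") as tar:
        for name, data in files.items():
            src = os.path.join(os.path.dirname(path), name.replace("/", "_"))
            with open(src, "wb") as f:
                f.write(data)
            tar.add(src, arcname=name)

if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as d:
        pub = os.path.join(d, "published.tgz")
        reb = os.path.join(d, "rebuilt.tgz")
        # Hypothetical case: the published tarball accidentally shipped tests.
        make_tarball(pub, {"pkg/index.js": b"x" * 10,
                           "pkg/test/spec.js": b"y" * 500})
        make_tarball(reb, {"pkg/index.js": b"x" * 10})
        print(diff_tarballs(pub, reb))
```

In practice you'd point the two paths at the `.tgz` fetched from the registry and the one produced by your Docker rebuild.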

Also, it seems like this is targeting pypi, npm, and crates at first - what about packages in linux distro repositories (debian, etc.)?



Author here!

> Does this require builds are done via CI

Nope! One use for OSS Rebuild would be to give maintainers with idiosyncratic release processes a way to offer strong build-integrity assurances to their downstream users. This wouldn't force them into any particular workflow, just require that their process be reproducible in a container.

> for some projects that I rebuilt myself in docker, I got vastly different sizes of distribution artifacts.

Absolutely. OSS Rebuild can serve to identify cases where there may be discrepancies (e.g. accidentally included test or development files) and publicize that information so end-users can confidently understand, reproduce, and customize their dependencies.
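The core check behind any rebuild comparison can be sketched as digest equality between the published artifact and the rebuilt one. This is a generic illustration, not OSS Rebuild's actual API:

```python
import hashlib
import os
import tempfile

def sha256_of(path):
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def rebuild_matches(published, rebuilt):
    """Bit-for-bit equality check between two build artifacts."""
    return sha256_of(published) == sha256_of(rebuilt)

if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as d:
        a = os.path.join(d, "published.bin")
        b = os.path.join(d, "rebuilt.bin")
        for p in (a, b):
            with open(p, "wb") as f:
                f.write(b"identical payload")
        print(rebuild_matches(a, b))
```

When the digests differ, you fall back to the kind of file-level diff discussed above to explain the discrepancy.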

> what about packages in linux distro repositories (debian, etc.)

OSS Rebuild actually does have experimental support for Debian rebuilds, not to mention work towards JVM and Ruby support, although no attestations have been published yet. There is also no practical impediment to supporting additional ecosystems. The existing support is more reflective of the size of the current team rather than the scope of the project.


The industry has been coalescing around third-party attestation for open source packages since COVID. The repercussions of this will be interesting to watch, but I don't see any benefits (monetary or otherwise) for the poor maintainers dealing with them.

There are probably a lot of people who see GenAI as the solution to Not Invented Here: just have it rewrite your third-party dependencies! What could go wrong. There's also some irony in third-party dependencies ending up more audited/reviewed than the internal code they get integrated into.


I don't mind if the "third parties" are other trusted developers of the same project, for example. But please let's not centralise it. We're just going to get Robespierre again.


Twofold: AI makes it easier to find "issues" in existing software and automate the CVE process. This means more "critical" vulnerabilities requiring attention from developers using these packages.

At the same time, rolling your own implementation with GenAI will be quick and easy. No outsiders are checking that code, so no CVEs. Just sit back and relax.


For packages in Linux distributions you probably want one of:

- https://reproducible.archlinux.org/ (since 2020)

- https://reproduce.debian.net/ (since 2024)

There's arch-repro-status and debian-repro-status respectively to show the status of the packages you have installed, but since it's not yet possible to make an open-source computer out of reproducible-only software, there isn't really any tooling to enforce this through policy.
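If such policy tooling did exist, the gate itself would be simple: flag any installed package without a known-reproducible status. A hypothetical sketch with made-up data (a real tool would query something like reproduce.debian.net rather than a hardcoded set):

```python
def unreproducible(installed, known_reproducible):
    """Return installed packages lacking a reproducibility attestation."""
    return sorted(p for p in installed if p not in known_reproducible)

# Illustrative data only; not real package status.
installed = ["bash", "coreutils", "examplepkg"]
known = {"bash", "coreutils"}
print(unreproducible(installed, known))  # → ['examplepkg']
```

The hard part isn't the check, it's that today the flagged set would include packages you can't realistically uninstall.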



