
Yes, it is a driver which is signed and tested by Microsoft. The driver allows arbitrary unsigned code to run. Why is that allowed?


The driver is some kind of AV/signature detection hook, e.g. a "check every open() against this list of checksums and refuse to open known viruses" style of system. The 'update' was a borked definition file which triggered a bug in that system.

It's not code execution without signing, and I think they probably do want these definition files to be updated hands-free.

The real problem was the lack of testing rather than the actual mechanism, I think.
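
To make that failure mode concrete, here's a minimal user-space sketch of a checksum-style definition parser. The record layout, field names, and the specific bug are all made up for illustration (this is not CrowdStrike's channel-file format or code); it just shows how a parser that trusts lengths read from a definition file falls over on a malformed update, and how the same logic inside a kernel driver bugchecks the machine instead of just crashing a process.

    /* Hypothetical sketch of a naive definition-file parser.
     * The record layout and field names are invented for
     * illustration; this is not CrowdStrike's real format or code. */
    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    struct def_record {
        uint32_t hash_len;   /* length of the checksum that follows */
        uint8_t  hash[32];   /* checksum of a known-bad file */
    };

    /* Walks the raw definition blob record by record. The bug: it
     * trusts hash_len taken straight from the file, so a corrupt
     * update overruns both the blob and the local buffer. In user
     * space that's a crash; in a kernel driver it's a bugcheck on
     * every boot until the file is removed. */
    static int load_definitions(const uint8_t *blob, size_t len)
    {
        size_t off = 0;
        int count = 0;
        while (off + sizeof(uint32_t) <= len) {
            uint32_t hash_len;
            memcpy(&hash_len, blob + off, sizeof hash_len);

            uint8_t hash[32];
            memcpy(hash, blob + off + sizeof hash_len, hash_len); /* no bounds check */

            off += sizeof hash_len + hash_len;
            count++;
        }
        return count;
    }

    int main(void)
    {
        /* Simulate a borked update: a tiny blob whose first record
         * claims a 4096-byte checksum. */
        uint8_t bad_update[16] = {0};
        uint32_t bogus_len = 4096;
        memcpy(bad_update, &bogus_len, sizeof bogus_len);

        printf("loaded %d records\n", load_definitions(bad_update, sizeof bad_update));
        return 0;
    }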


This is the nugget of the issue. The code-signing process, in this case, was abused to verify something that fundamentally cannot give the guarantee "doesn't crash your OS," because it is allowed to run arbitrary code in the form of novel commands in what is essentially a DSL. So if code-signing is supposed to be a guarantee from MS that "this code can't crash your system," it should never have been signed... But then MS would have been on the hook for blocking a competitor.

There is no guarantee the law is written soundly.


To get a driver signed by Microsoft, the developer of the driver is required to provide a full cert pass log from the Windows Hardware Lab Kit to dev center [0]. Do you have any article that says the CrowdStrike driver has been tested by Microsoft?

[0]: https://learn.microsoft.com/en-us/windows-hardware/drivers/i...


To avoid going through the full cert process on every update, the sensor itself was certified, but it also loaded code from an uncertified module so that it could be updated quickly to catch new threats. It's a tough corner to be in: to function properly it needs to update very quickly, but the cert process takes a while to complete, so they went with this workaround of a signed module loading uncertified code.
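
Here's a rough sketch of that split, with invented function and file names (the real image verification is the WHQL/Authenticode check Windows does at driver load, and the rapidly-updated content in CrowdStrike's case is its channel files): the certified binary gets a signature gate, while the definitions it consumes are read and trusted as-is.

    /* Hypothetical sketch of "signed module, unsigned content".
     * Function and file names are made up for illustration. */
    #include <stdint.h>
    #include <stdio.h>
    #include <stdlib.h>

    /* Stand-in for the signing/certification gate on the driver image
     * itself. In reality Windows verifies the driver's signature once,
     * at load time. */
    static int image_signature_ok(const char *path)
    {
        (void)path;
        return 1; /* assume the certified sensor passes */
    }

    /* Definition/content updates arrive on a fast cadence and are read
     * directly: no certification gate and, in this sketch, no format
     * validation either, which is the door a bad update walks through. */
    static uint8_t *load_definition_file(const char *path, size_t *out_len)
    {
        *out_len = 0;
        FILE *f = fopen(path, "rb");
        if (!f) return NULL;
        fseek(f, 0, SEEK_END);
        long n = ftell(f);
        fseek(f, 0, SEEK_SET);
        if (n <= 0) { fclose(f); return NULL; }
        uint8_t *buf = malloc((size_t)n);
        if (buf && fread(buf, 1, (size_t)n, f) == (size_t)n) {
            *out_len = (size_t)n;
        } else {
            free(buf);
            buf = NULL;
        }
        fclose(f);
        return buf; /* handed straight to the parser, trusted as-is */
    }

    int main(void)
    {
        if (!image_signature_ok("sensor.sys"))
            return 1;

        size_t len = 0;
        uint8_t *defs = load_definition_file("definitions.bin", &len);
        printf("signed sensor loaded %zu bytes of unverified definitions\n", len);
        free(defs);
        return 0;
    }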


...you want Microsoft to forbid you from running certain kinds of programs on your own machine, even if you really, really insist on it, do I understand you correctly?


More like: "...you want Microsoft to forbid you from running certain kinds of programs (with gaping security holes / processes) on your own machine" YES


> (with gaping security holes / processes)

The problem is that you're assuming you can prove a program doesn't have security holes and bad processes.


You're moving the goalposts waaaay far down. How about just following best practices? How about not allowing runtime code injection? It turns out security holes often have a lot in common, and so do the ways to mitigate them. Stop 100% of security holes? Nah. Stop 99.9% of them? Yes, and what an improvement.


The Crowdstrike failure was not caused by running unsigned code.



