What do bureaucrats get right that causes problems for engineers? US gov vs. Memory Safety
During 2024, the US government issued a cybersecurity recommendation[0] which sparked a disproportionate amount of controversy.
[0] - https://www.cisa.gov/resources-tools/resources/product-security-bad-practices
It drew a lot of unreasonable comments like "we cannot rewrite the kernel in two years" or "you can write bugs in Java too"
from drama queens who couldn't even be bothered to evaluate its reasoning... I mean, to read the recommendation.
And I'm saying this as someone who's been paid to do C and still receives money for doing C++.
So, what's the issue? As you're likely aware, software is embedded in nearly every aspect of modern society:
banking, healthcare, military, education, energy delivery, airlines, entertainment, and countless other sectors.
Essentially, the entire nation's critical infrastructure heavily depends on software.
Given that software can be vulnerable and potentially exploited to cause chaos, interruptions, damage, and more
(as we've seen with incidents like CrowdStrike's BSOD, though that one wasn't caused by a vulnerability),
it's in the government's interest to ensure software is as robust, reliable, and secure as possible. Hence the recommendation.
But let's consider the industry's perspective:
In recent years, research conducted by security teams at Microsoft[1] and Google (Chromium)[2] found that around 70% of security vulnerabilities (CVEs) were related to memory issues.
Switching to safer systems programming languages could significantly improve this situation.
Even if not all memory issues can be resolved by changing languages, a 50%, 75%, or 90% (who knows how much) improvement would still be a substantial gain.
[1] https://msrc.microsoft.com/blog/2019/07/a-proactive-approach-to-more-secure-code/
[2] https://www.chromium.org/Home/chromium-security/memory-safety/
So, as you can see, even those giant, experienced software behemoths with highly paid engineers
not only suffer significantly from memory issues but also believe there's a better way, so it's not just a bureaucrats' fantasy.
So, the "Product Security Bad Practices" publication says that the development of new product lines in a memory-unsafe language (e.g., C or C++) where there are available alternatives is dangerous and significantly elevates risk to national security, national economic security, and national public health and safety.
And it's perfectly fucking rational. If it is possible to write your new product in another technology that reduces the potential for CVEs occurring, then you should do it.
Another thing: what is this two-years business about?
For existing products that are written in memory-unsafe languages, not having a published memory safety roadmap by January 1, 2026 is dangerous and significantly elevates risk to national security, national economic security, and national public health and safety. The memory safety roadmap should outline the manufacturer’s prioritized approach to eliminating memory safety vulnerabilities in priority code components (e.g., network-facing code or code that handles sensitive functions like cryptographic operations). Manufacturers should demonstrate that the memory safety roadmap will lead to a significant, prioritized reduction of memory safety vulnerabilities in the manufacturer’s products and demonstrate they are making a reasonable effort to follow the memory safety roadmap. This does not apply to products that have an announced end-of-support date that is prior to January 1, 2030.
Recommended action: Software manufacturers should build products in a manner that systematically prevents the introduction of memory safety vulnerabilities, such as by using a memory safe language or hardware capabilities that prevent memory safety vulnerabilities. Additionally, software manufacturers should publish a memory safety roadmap by January 1, 2026.
So, if your existing product uses memory-unsafe languages, you’re expected to prepare a plan and a roadmap to enhance its security.
Some might say, "The government expects your boss to give you time to improve or refactor issues in your product - isn't that great?"
Frankly, I struggle to understand why some people are so opposed to the government expecting software vendors to prioritize security.
It is called software engineering after all, right?
Even if you don't like that they label C or C++ as unsafe, the data somewhat supports their stance.
And really, the argument that highly skilled engineers can write safe code isn't a good one, because highly skilled engineers do not scale.