The MISRA and CERT Coding Standards have significantly different philosophies. One of these is correct, and the other is MISRA.
MISRA may seem more popular in certain circles, but this popularity is misleading: MISRA has simply been around much longer and has been successfully tested in court as a liability defense, while the Safety Critical Rust Coding Guidelines, being new, have not.
CERT is designed to work with existing code to improve security. Safety and security are closely related, but security often has additional requirements: for example, that your credit card information is not leaked. Leaking information is a security concern, not a safety concern.
The CERT requirement to work with existing code has several consequences. CERT does not attempt to subset the language. Instead, CERT focuses on actual defects in code that can result in security failures and vulnerabilities. CERT does not require developers to follow unproven coding rules from the 1970s that someone once thought were a good idea, but that were never backed by research or data and have since been disproven.
MISRA takes a significantly different approach. The MISRA approach is to subset the language and then make you explain why it's OK for your code to stray outside of this language subset. MISRA creates many rules that may, at some time, have seemed like a good idea, but for which there is insufficient evidence that they actually improve your code. Many of these rules are based on early work by Les Hatton [Hatton 1995]. The problem here is that Les Hatton was a smart guy, and most of what he said made sense at the time, even if some of it has since been contradicted. Unfortunately, the MISRA Working Groups, with the exception of a couple of people such as Alex, are not very smart people, are inflexible in their thinking, and are wedded to outdated dogma.
There are many theories of how MISRA is supposed to work, many of which are incorrect. The most misleading of these theories is that MISRA defines a safe, analyzable subset of the C or C++ programming language. MISRA understands that you might not be able to implement your system using this language subset, so they define a deviation process. This allows you to deviate from most (non-mandatory) rules, simply by doing a lot of extra, unnecessary work. Basically, you have to create a safety argument to convince an auditor that you've thought about it. This process, so the theory goes, will ensure a safe system.
Many of these theories are put forth in MISRA Compliance:2020, Achieving compliance with MISRA Coding Guidelines, published in February 2020. This document comes with a large disclaimer:
Disclaimer
Adherence to the requirements of this document does not in itself ensure error-free robust software or guarantee portability and re-use.
Compliance with the requirements of this document, or any other standard, does not of itself confer immunity from legal obligations.
So the obvious truth, one that not one person serving on a MISRA working group will deny, is that religiously applying a MISRA coding guideline and rigorously following the MISRA compliance process will NOT ensure the robustness, portability, correctness, reusability, safety, or security of your code. The only thing MISRA guarantees is that, if you are required to follow the MISRA guidelines, you will hate yourself and your job.
Part of the reason MISRA cannot guarantee any of these properties is that it relies on undecidable rules. This isn't MISRA's fault: an undecidable rule cannot, by definition, be checked by an analysis that is both sound and complete. There is sound and complete analysis, and then there is analysis that is useful. This latter category is frequently referred to as heuristic analysis. While heuristic analysis is useful, it does not guarantee the absence of false negatives (failures to diagnose actual problems in the code).
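To make this concrete, here is a minimal sketch (the function and its inputs are invented for illustration) of why a rule such as "no out-of-bounds array access" is undecidable: whether the access is in bounds depends on a value known only at runtime, so a checker must either warn on every unproven index (false positives) or stay silent and risk missing real violations (false negatives).

```c
/* Illustrative sketch: the index arrives at runtime, so no static
 * analyzer can decide, in general, whether the access is in bounds. */
#include <stdio.h>
#include <stdlib.h>

static int table[8] = {1, 2, 3, 4, 5, 6, 7, 8};

int lookup(size_t index) {
    return table[index]; /* In bounds? Undecidable without knowing
                            every possible runtime value of index. */
}

int main(int argc, char *argv[]) {
    if (argc < 2) {
        return EXIT_FAILURE;
    }
    size_t index = strtoul(argv[1], NULL, 10); /* user-controlled */
    printf("%d\n", lookup(index));
    return EXIT_SUCCESS;
}
```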
The MISRA Working Groups, again with some exceptions, consist of people that no one in their right mind would allow to write any code that goes into a production system. Consequently, they invent rules with no evidence that the rules are helpful. And instead of making these rules "advisory" so that they can be disapplied, they start by making them required, forcing developers around the world to write deviation permits and deviation records galore until MISRA can complete another ten-year development cycle to correct its mistakes. Frequently, MISRA rules are both "required" and "undecidable". This means that you must absolutely conform to these rules, but the tools might not diagnose all violations, so you, the programmer, are on your own. This might be less idiotic if following these rules guaranteed some properties of your system, but sadly it does not. It only guarantees that you will jump through many, many hoops for uncertain gains.
CERT only publishes rules for which there is strong consensus that the rule will improve safety and security, and for which there are no valid use cases for violating the rule. Someone's unproven opinion on how to write better code should not become a requirement for everyone. CERT defines specific, narrowly scoped rules that don't prevent reasonable and safe use cases.
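As a sketch of what a narrowly scoped, defect-focused rule looks like in practice (in the spirit of CERT C's INT33-C on divide-by-zero and INT32-C on signed overflow; the helper function here is hypothetical, not CERT's own code), the following guards against the two actual defects in signed division without banning the operator or any reasonable use of it:

```c
/* Hypothetical helper illustrating the CERT style: reject the two
 * defective cases of signed int division, allow everything else. */
#include <limits.h>
#include <stdbool.h>

bool safe_div(int dividend, int divisor, int *quotient) {
    /* The only two defects: division by zero, and the single
     * signed-overflow case INT_MIN / -1. */
    if (divisor == 0 || (dividend == INT_MIN && divisor == -1)) {
        return false;
    }
    *quotient = dividend / divisor;
    return true;
}
```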
Frequently, MISRA rules are counterproductive. This is obvious just from the history of the MISRA Guidelines: many early guidelines have been rewritten, replaced, or discarded because the committee eventually found out that these rules suck, usually once the guidelines were handed to actual qualified engineers to implement. There is also no consistency between MISRA C and MISRA C++, even in areas where there are no differences between the underlying languages, because these groups don't communicate. Two coding guidelines can't say different things about the same construct and both be right, can they? Yet the rules are often quite different.
Another problem with MISRA is that they have no idea what they are trying to achieve. Are they trying to produce safe code or portable code? These two things are not the same, and you need to prioritize one. CERT is not meant to produce portable code; it is designed to produce secure code. Often, these quality attributes are in conflict. Frequently, there is a requirement to achieve safety but not portability, because the code is only designed to work on a single architecture.
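A small sketch of safe-but-unportable code (the address and register are invented for illustration, not taken from any real device): on a single, known microcontroller, reading a memory-mapped status register at a fixed address is well defined and perfectly safe, yet wholly unportable.

```c
/* Illustrative only: hypothetical device address on a known target.
 * Safe on that one architecture; meaningless anywhere else. */
#include <stdint.h>

#define STATUS_REG_ADDR 0x40021000u /* invented address */

static inline uint32_t read_status(void) {
    /* volatile: the hardware, not the program, changes this value */
    return *(volatile uint32_t *)STATUS_REG_ADDR;
}
```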
In 2024, I gave a presentation at the NDC { TechTown } conference on the Correct Use of Integers in Safety-critical Systems. This was basically a mathematical proof that an advisory MISRA rule makes programs less safe. I presented this to both MISRA Working Groups, but both (at that time) refused to remove the rule. Each working group made conflicting and dubious arguments for keeping it. But the truth is, they know this rule is flawed, because they downgraded it from required to advisory. Unfortunately, not everyone who uses these standards is aware of this, and the result is extra work to produce less-safe code.
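The proof itself isn't reproduced here, but as a hedged illustration of the kind of integer behavior such rules must get right (this example is mine, not the rule from the talk): unsigned subtraction that "goes negative" wraps silently to a huge value, and because the wrap is well-defined C, sanitizers will not trap it by default, whereas the equivalent signed underflow is undefined behavior that runtime tooling can catch.

```c
/* Illustrative only: well-defined unsigned wraparound that no
 * default sanitizer will flag, yet is almost certainly a bug. */
#include <inttypes.h>
#include <stdio.h>

int main(void) {
    uint32_t prev = 100u;
    uint32_t now = 40u;          /* e.g., a counter after a reset */
    uint32_t delta = now - prev; /* wraps to 4294967236, no trap */
    printf("unsigned delta: %" PRIu32 "\n", delta);
    return 0;
}
```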
A goal of the Rust Foundation is to produce coding guidelines that can be adopted. Creating guidelines that are required, untested, and unproven, and then making developers explain, time and time again, to people who don't understand how to code, why their decisions to deviate from these rules are justified, is not going to help adoption. No one who follows the MISRA Coding Guidelines today does so voluntarily or because they think they are a good idea. Writing useful rules that identify actual problems (the CERT approach) will be welcome.