There is a whole industry of people providing operating systems and compilers for high-assurance embedded development, led by companies like Green Hills Software. Nobody would ship a safety-critical embedded project using either GCC or Clang.
At the highest levels of safety, the DO-178B standard for avionics software requires compilers to generate code with a fixed structural relationship between source and output patterns, where each pattern is independently verified, so that every output instruction is directly traceable to a source instruction. All other code generated by the compiler has to be manually verified. This rules out most of the optimization techniques found in modern compilers. Also, the emphasis on Worst-Case Execution Time (WCET) verification means that performance improvements only matter if your WCET estimates can incorporate them.
Back in 1996-1997 I was a federal contractor working as a software consultant and analyst, using several different logistics and provisioning systems for the US Army as part of downsizing a base and moving its functions to a different state.
We used different operating systems, databases, and programming languages to create redundant systems that solved the same problem or generated the same report. We would have at least three programs running on three different technologies. If the reports were all the same, the system was in good shape; if one of the reports differed somehow, we knew we had to check and fix something.
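The cross-checking workflow described above can be sketched in a few lines of shell. This is only an illustration of the idea, not the actual process used; the report file names and contents here are made up:

```shell
# Three independently built implementations each emit the same report.
# (Sample data written here just so the sketch is self-contained;
# in practice each file would come from a different program/stack.)
printf 'ITEM-1 40\nITEM-2 7\n' > report_unix.txt     # e.g. from a C program
printf 'ITEM-1 40\nITEM-2 7\n' > report_clipper.txt  # e.g. from Clipper
printf 'ITEM-1 40\nITEM-2 7\n' > report_oracle.txt   # e.g. from PL/SQL

# Pick one report as the reference and compare the others against it.
# cmp -s is silent and returns nonzero on the first difference.
if cmp -s report_unix.txt report_clipper.txt && \
   cmp -s report_unix.txt report_oracle.txt
then
    echo "reports agree"
else
    echo "reports differ - investigate" >&2
fi
```

With three or more independent copies you can also vote: if exactly one report disagrees, that implementation is the first place to look.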
We used SunOS/Solaris, HP-UX, Linux (Slackware, IIRC), DOS/Windows 3.0, Windows 95, Windows NT, etc. In some cases there were C programs using the native C compiler on whichever Unix system was in use, and on Windows or DOS we used whatever language or software was available. I worked in Clipper, dBase, Oracle PL/SQL, SQL Server, MS Access, and sometimes even flat text files downloaded from mainframes or FTP sites on the military network that needed converting to different databases. As a federal contractor I was limited in what I could do; for example, my PC did not have a CD drive, and they would not allow ODBC drivers to be installed. I did not have administrative access, only a shared maintenance account used to work with databases, and that account was itself limited in many ways.
My point is that you work with whatever they give you: you may be limited in what you can do, but you use what you have available. So yes, you might be limited to GCC on a Linux system with no root access to install Clang or LLVM.
Was this before or after Bradley Manning? (I think he used a CD-RW drive on a "secure" computer to extract the data he sent to Wikileaks.)
> account that others shared
Shared account? Really? Sounds like top-notch security in action. [/sarcasm]
> no root access to install CLANG or LLVM
You don't need root access; you can just do:
./configure --prefix=$HOME/clang
or whatever the equivalent is if clang uses a different build system.
If /home is mounted noexec, that's a real problem, but if you're doing software development, noexec would mean you couldn't run the programs you're writing either.
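For what it's worth, Clang/LLVM does use a different build system (CMake, in current releases), but the same no-root trick applies via `CMAKE_INSTALL_PREFIX`. A rough sketch from the LLVM monorepo checkout, assuming you have CMake and a host compiler available:

```shell
# User-local build and install of Clang, no root required.
# Run from the top of an llvm-project checkout.
cmake -S llvm -B build \
      -DLLVM_ENABLE_PROJECTS=clang \
      -DCMAKE_BUILD_TYPE=Release \
      -DCMAKE_INSTALL_PREFIX=$HOME/clang
cmake --build build
cmake --install build

# Then put it on your PATH:
export PATH=$HOME/clang/bin:$PATH
```

Expect the build itself to take a long time and a lot of disk space, which on a locked-down shared machine may be its own problem.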
Of course, just because there's no technical reason you can't do something, doesn't mean there isn't a nontechnical / political reason you shouldn't. If you'd made a project change as large as using a different compiler without approval from (or at least notification to) higher up, you might have gotten in trouble.
It was in 1996-1997, so it was before. They would install the OS and then remove the CD-ROM drive for security reasons. One of those reasons was to make sure no unauthorized software was installed; and yes, they had CD burners back then and saw them as a security risk for copying information.
You will find that not all federal systems are secure and run by experts. Some federal employees are not qualified for their jobs, and that is why federal contractors are hired to make up for it. For example, some of our servers and systems were not behind a firewall and had public Internet IP addresses. I agree a shared account is a security risk; when someone changed the password to the maintenance account, we were locked out and had to file a form to learn the new password. In fact, to do anything like install software or configure a system, we had to file a form first.
Yes, the federal government and chain of command require that I file a request before I use a new software product. Even if it can be downloaded and run from the home directory, if I don't get permission for it, I am in deep trouble. Even something as small as refreshing an IP address: if I do it myself I am in trouble; I have to call their help desk and have them refresh and renew it for me.
Actually, VxWorks uses gcc (and make for the build system). At Level A I think you are required to have 100% test coverage at the machine-instruction level, but I don't remember there being that kind of fixed relationship between source code and compiler output. I have developed only for Level B and have never been involved in toolchain verification, so maybe there is something I am missing.