With you every step of the way: from code quality to comprehensive security

Publisher: EE小广播 | Last updated: 2023-11-29 | Source: EEWORLD | Author: Shawn Prestridge, Senior Field Application Engineer at IAR / US FAE Team Lead

Security has always been a very hot topic, and it seems like every week we hear news about how such and such a company was hacked and the data of millions of users was leaked.




Part of the reason we see so many security issues lies in the way we treat security: it is often considered an afterthought, something added to a device at the end of development. However, complex systems, and embedded systems in particular, have a large attack surface, which gives attackers plenty of opportunities to probe for holes in their armor. If you study how most hackers attempt to break into systems, you'll quickly discover that their favorite tool is finding and exploiting software vulnerabilities in devices.


If software vulnerabilities are the entry point hackers exploit, then we need to improve our code quality to close it off. But how serious is the problem, and what can we do about it?

 

Code vulnerabilities are easy targets for hackers


Poor code quality is actually a widespread problem, and there is quite a bit of evidence to support the claim that poor coding directly leads to vulnerabilities. While many software engineering experts have been preaching this for years, the moment it truly sank in was perhaps 2001, when the Code Red worm launched a buffer overflow attack against Microsoft's Internet Information Services (IIS). [1] Although the first documented buffer overflow attack occurred in 1988 and targeted the Unix finger command, its impact on ordinary people was very limited and therefore did not make headlines.


As Code Red caused massive Internet slowdowns and was splashed across the news, we suddenly saw buffer overflow attacks everywhere, with security researchers and hackers finding these vulnerabilities in all kinds of systems, including embedded systems. Using a buffer overflow attack, a hacker can run any code they want on the affected system, targeting anything that uses a fixed-length buffer to hold text or data. The hacker fills the buffer space to the maximum and then writes executable code past the end of the legitimate buffer space. The compromised system then executes the code at the end of the buffer, which in many cases lets the attacker do whatever they want. [2]
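To make the pattern concrete, here is a minimal hypothetical sketch (the function name and buffer size are illustrative, not taken from the article or its references) of the kind of code these attacks rely on:

```c
#include <string.h>

/* Hypothetical request handler illustrating the classic overflow pattern:
 * attacker-controlled input is copied into a fixed-length buffer with no
 * length check, so anything longer than 63 characters overwrites adjacent
 * stack memory, potentially including the return address. */
void handle_request(const char *input)
{
    char buffer[64];

    strcpy(buffer, input);    /* no bounds check (CWE-120 territory) */

    /* ... parse and act on buffer ... */
}
```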


This type of attack became an urgent problem because code that checks and enforces buffer limits was not common at the time. Such checks are now recommended by many coding standards, such as mitre.org's Common Weakness Enumeration (CWE), which classifies this type of vulnerability. [3] Unfortunately, developers generally do not look for this problem when writing code; code analysis tools are usually needed to find these issues so that developers realize the problem exists and fix it. A simple code quality improvement like this can greatly improve the security of your code by eliminating one of the most common tactics used by hackers. Checking and enforcing buffer lengths in your code is therefore simply good coding practice.
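As a minimal sketch of what checking and enforcing the buffer limit in code might look like, assuming the same hypothetical handler as above:

```c
#include <stdio.h>
#include <string.h>

#define BUF_LEN 64u

/* Bounds-enforced variant: the length of the incoming data is checked
 * against the buffer size and the copy itself is capped, so oversized
 * input is rejected instead of spilling past the end of the buffer. */
int handle_request_checked(const char *input)
{
    char buffer[BUF_LEN];

    if ((input == NULL) || (strlen(input) >= BUF_LEN)) {
        return -1;    /* reject rather than truncate silently */
    }

    (void)snprintf(buffer, sizeof buffer, "%s", input);

    /* ... parse and act on buffer ... */
    return 0;
}
```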


Not just a buffer overflow


However, the problem isn't just buffer overflows; it is systemic: sloppy coding routinely leads to countless security holes that hackers can exploit to break into the system. A paper published by the Software Engineering Institute (SEI) makes this point very clearly:


"...Quality performance metrics provide a basis for determining high-quality products and predicting safety and security outcomes. Many items in the Common Weakness List (CWE), such as improper use of programming language constructs, buffer overflows, validated input Value failures, etc., may be related to low-quality coding and development processes. Improving code quality is a necessary condition for solving some software security problems.”[4]


The paper also points out that because many security problems are caused by software vulnerabilities, security problems can be treated just like more common coding vulnerabilities, and you can apply traditional quality assurance techniques to help solve at least part of the security problem.

Since normal software quality assurance processes allow us to estimate the number of defects remaining in a system, can the same be done for security vulnerabilities? While the SEI does not confirm a mathematical relationship between code quality and security, they do state that between 1% and 5% of software defects are security vulnerabilities, and go on to note that their evidence shows that when security vulnerabilities are tracked, they can be used to accurately estimate the level of code quality in the system. [4] This ultimately shows that code quality is a necessary (but not sufficient) condition for security, thoroughly refuting the idea that security can be treated as something added to the device at the end of development. Instead, security must run through the DNA of the project, from design to coding, all the way to production.

 

Coding standards can help a lot


Many of the most common security vulnerabilities are addressed by coding standards such as mitre.org's Common Weakness Enumeration, which also point out other areas to focus on, such as divide-by-zero errors, data injection, loop irregularities, null-pointer exploits, and string parsing errors. MISRA C and MISRA C++ likewise promote coding security and reliability to keep security vulnerabilities from seeping into your code. While these coding standards can catch many common vulnerabilities, developers must think further ahead when writing code: How could a hacker exploit the code I just wrote? Where is the vulnerability? Have I made assumptions about what the inputs will look like and how the outputs will be used? A good rule of thumb is that if you are making assumptions, those assumptions should be turned into code, as sketched below, to ensure that what you are expecting is actually what you are getting. If you don't, hackers will take advantage.
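As an illustration of turning assumptions into code, the hypothetical routine below (names and limits are invented for the example, not from the article) makes each of its assumptions explicit, covering several of the defect classes listed above: null pointers, string parsing, value ranges, and division by zero.

```c
#include <errno.h>
#include <limits.h>
#include <stddef.h>
#include <stdlib.h>

/* Hypothetical parser that turns its assumptions into explicit checks:
 * the pointers may be NULL, the text may not be a valid number, the value
 * may be out of range, and the divisor may be zero. Each assumption is
 * verified in code instead of being trusted. */
int average_per_slot(const char *count_text, int slots, int *out)
{
    char *end = NULL;
    long count;

    if ((count_text == NULL) || (out == NULL)) {   /* null-pointer assumption */
        return -1;
    }
    if (slots <= 0) {                              /* divide-by-zero assumption */
        return -1;
    }

    errno = 0;
    count = strtol(count_text, &end, 10);          /* string-parsing assumption */
    if ((errno != 0) || (end == count_text) || (*end != '\0') ||
        (count < 0) || (count > INT_MAX)) {        /* range assumption */
        return -1;
    }

    *out = (int)(count / (long)slots);
    return 0;
}
```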


But what about open source software? The typical argument for using open source components in design relies on a "proven in use" argument: so many people use it, it must be good. The same SEI paper also has some elaboration on this issue:


"Besides being free, one of the touted benefits of open source is the idea that 'having a lot of people paying attention to the source code means security issues can be discovered quickly and anyone can fix them without relying on the vendor.' However, the reality is What happens is that without a disciplined and consistent focus on eliminating vulnerabilities, security holes and other vulnerabilities will appear in the code.”[4]


In other words, the SEI believes that the "proven in use" argument is meaningless, and that applying quality assurance to open source code is reminiscent of the story of Everybody, Somebody, Anybody, and Nobody. Furthermore, your own tests are not enough to prove that the code is satisfactory. The SEI says that code quality standards like CWE can uncover problems in your code that are often not caught by standard tests and are usually only discovered when hackers exploit them. [4] To illustrate the point, in May 2020 researchers at Purdue University demonstrated 26 vulnerabilities in the open source USB stack used in Linux, macOS, Windows, and FreeBSD. [5] So when it comes to security, code quality is key, and all code matters.


Code analysis tools help with standards compliance


So, in addressing code quality issues, what can we do to improve the security of our applications? The simple answer is to use code analysis tools. There are two basic types: static analysis tools, which only look at the source code of the application, and runtime (or dynamic) analysis tools, which instrument the code to look for vulnerabilities such as null-pointer dereferences and data injection at run time. IAR provides both kinds of tools: the static analysis tool IAR C-STAT and the runtime analysis tool IAR C-RUN, both fully integrated into the IAR Embedded Workbench development environment. High-quality code analysis tools include checks for CWE, MISRA, and CERT C; CERT C is another coding standard designed to promote secure coding. Together, these three rule sets form a powerful combination for security-minded coding: the rule sets partly overlap, but each also provides unique checks that help ensure your code has a high degree of security. Using these standards will also help ensure you have the highest quality code and may even uncover some latent vulnerabilities in your code.
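To illustrate the division of labor between the two kinds of tools in general terms (a hypothetical example, not output or behavior claimed for C-STAT or C-RUN specifically):

```c
#include <stddef.h>

extern size_t read_index(void);   /* hypothetical: returns an untrusted index */

static int table[8];

/* Two illustrative defects: the constant out-of-bounds write below can be
 * flagged by a static analyzer just by reading the source, while the index
 * returned by read_index() is only known at run time, so catching a bad
 * value there is the job of an instrumented (runtime-checked) build. */
void update_table(void)
{
    size_t i = read_index();

    table[8] = 1;                 /* statically detectable: index 8 is out of bounds */
    table[i] = 2;                 /* only detectable at run time if i >= 8 */
}
```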


High-quality code is safe code


Ensuring code quality goes hand in hand with ensuring code security. Don't leave responsibility for code quality to others, because other people's vulnerabilities can become your security nightmare. There is hope, though: code analysis tools can help you find defects quickly, before they cause trouble. The road to security always runs through code quality.

