
Compiler Qualification, Certification and ISO 26262

By Joe Drzewiecki, associate director of development tools, and Steve Vernier, manager of product engineering, automotive products at Microchip Technology

It may seem paradoxical that qualifying a tool for use in a functional safety application cannot fall to the tool provider. A closer look at the ISO 26262 qualification process will explain this apparent paradox. The process starts with determining the ASIL (Automotive Safety Integrity Level) of your application and the TCL (Tool Confidence Level) required of each tool that you plan to use. Determining the ASIL of your application is outside the scope of this article.

Determining the TCL required of the tools to be used

When developing an item with ISO 26262 safety requirements, the standard requires that all software tools used in that item’s development have documented evidence as to why each tool is unlikely to introduce an error that will cause unsafe operation. A perfectly good design intent can be made unsafe by malfunctioning tools. The word “unlikely,” as opposed to “impossible,” is key: the intent is to increase confidence that the tool does not create safety-relevant errors, not to guarantee that it works perfectly. In the case of C compilers, there may be several to choose from and many considerations involved in the final selection. Confidence that the chosen compiler does not introduce errors into the design is essential.

To help determine the level of detail to which a tool should be analyzed, a risk assessment (tool classification) is performed. Tool classification considers two aspects for each use case (see Table 1): the tool’s impact on the safety of the design (Tool Impact, or TI), and the probability that an error could be generated and go undetected (Tool error Detection, or TD). TI and TD are not affected by how often the tool has been used or by which company made it. Risk is not limited to what goes directly into the item; it also covers information that affects design choices (e.g. simulation results) or test results. Error detection can of course be built into the tool, but the TD determination should also include the evaluation and testing of the item later in the development process. Risk can be minimized by using tools only for their intended purpose and under their intended usage conditions, which are usually described in a tool’s user guide and/or safety manual. Note: TCL1 tools require no qualification.
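The TI/TD-to-TCL mapping described above can be sketched as a small lookup. This is an illustrative simplification of the common reading of the ISO 26262-8 classification scheme; the function name and integer encoding are ours, not from the standard.

```python
def tool_confidence_level(ti: int, td: int) -> int:
    """Return the Tool Confidence Level for a tool use case.

    ti: Tool Impact (1 = the tool cannot affect safety,
        2 = a safety impact is possible)
    td: Tool error Detection (1 = high confidence that tool errors
        are prevented or detected, 2 = medium, 3 = low)
    """
    if ti not in (1, 2) or td not in (1, 2, 3):
        raise ValueError("TI must be 1-2 and TD must be 1-3")
    if ti == 1 or td == 1:
        # No safety impact possible, or errors are very likely caught
        # (e.g. by downstream testing of the item): no qualification needed.
        return 1
    return 2 if td == 2 else 3
```

For example, a code generator whose output is exhaustively exercised by item-level testing might be argued down to TD1, landing at TCL1 and so needing no further qualification.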

Tool Qualification based on ASIL and TCL

ISO 26262 recognizes four qualification methods for each TCL, each ranked by the ASIL of the application (see Table 2).

Fig. 2

Increased confidence from use is restrictive in that it applies only to exactly the same version of the same tool in the same application. However, if that applies to your situation, it’s there to use. The second method, evaluation of the tool development process, requires deep insight into the development processes of your tool provider. That may be possible, but it is not likely to be used except through the mitigating influence of a third-party certification body, which can gain access to this usually proprietary information during certification audits. The third method is validation of the software tool. Validation is a tried-and-true method for ensuring proper operation of software. Depending on the tool, however, the degree of expertise necessary to validate it may be excessive; compilers certainly fall into that category. Finally, qualification gets easier if the tool has been developed in compliance with a safety standard. Processes that comply with safety standards are still fairly new, and many mature tools predate them, so this qualification method is too new to apply to them.

Validating the qualification

Of course, you must be comfortable with the thoroughness of the qualification that you performed when you present it to an auditor. As mentioned, one would have to be a qualified compiler validation expert to determine if the hundreds of thousands of tests performed on the compilers, such as MPLAB XC functional safety compilers, were necessary and sufficient.

One sure way to validate a tool’s qualification is to leverage third-party certification from an accredited body like TÜV SÜD. Such an organization brings a vast array of experience and expertise in both functional safety and tool certification. Using comprehensive information and incisive on-site audits, a third-party organization examines the material supplied by the tool provider, including process definitions and documentation, validation methodologies and results, a safety plan, a Failure Mode and Effects Analysis (FMEA) and a functional safety manual, to ensure that any provisional classification and qualification documents provided by a tool vendor meet a rigorous, high standard of attainment.

Microchip simplifies meeting your Functional Safety requirements

Along with the certificate from TÜV SÜD and the reports that substantiate it, Microchip provides all of the above-mentioned documentation along with our just-released MPLAB XC Compilers for Functional Safety – in one convenient package. Also included in the package are classification and qualification documents for MPLAB X IDE and MPLAB debuggers and programmers to Tool Confidence Level 1 (TCL 1). Since the MPLAB XC compiler products support all of Microchip’s microcontrollers (MCUs), every MCU Microchip offers is included in this functional safety solution. Microchip helps customers simplify the development tool qualification process for their functional safety requirements.


Taking formal methods mainstream

In academia, we refer to computing science. In industry, we refer to software engineering. An engineer is a skilled technician who develops and applies scientific knowledge to solve technological problems. Too often in practice software people must resort to skillful tinkering as opposed to sound engineering. That’s why at Verum, we’ve dedicated ourselves to the development and application of scientific knowledge to solve the technological problems underlying this phenomenon. To meet these challenges head on, we’re developing a language that enables building reactive systems at industrial scale. The language offers built-in verification and allows for reasoning about both the problem and the solution. It’s complemented by tooling that automates every development aspect from specification, construction, and documentation to verification and validation. In this talk, we’ll present what we’ve achieved and what will come tomorrow, when we stop tinkering in software development.
Rutger van Beusekom holds an MSc in mechanical engineering from Eindhoven University of Technology. From 1999-2005, he worked as a software engineer at Philips CFT. From 2005-2007, he was a software engineer and team lead at Philips Research. Since 2007, he’s been at Verum, in the roles of consultant, software engineer, team shepherd, architect and CTO, working together with and at ASML, Ericsson, FEI, Philips and other customers.

Developing for safety and security

Software systems have exploded in complexity, leading to an enormous increase in the number of vulnerabilities available for exploitation by bad actors. This affects safety, as safety and security are inextricably linked. Cars today contain one hundred million lines of code, but should we be proud or ashamed? Developing systems that need to be both safe and secure will require a shift in thinking away from huge monolithic designs toward minimalistic, component-based architectures that enable components to be fully validated and tested, eliminating vulnerabilities. This talk explains how we need to change software development to make security and safety the main criteria.
Chris Tubbs is an industry veteran with 46 years’ experience in the avionics, simulation, medical, automotive and software industries. After 15 years in the aerospace industry managing safety-critical systems, he co-founded companies in the simulation and medical-imaging markets in the roles of commercial and managing director. He then spent eight years in the automotive industry in Germany and the Netherlands as a development and business development manager, after which he joined Green Hills Software in 2008. He was promoted to Director of Business Development EMEA in 2012 and has since specialized in safety and security.

Remodeling legacy software

Have you ever considered remodeling your kitchen while continuing to cook in it? It may not sound obvious, but that’s exactly what this talk is about. Within Kulicke & Soffa, high-tech pick & place machines are developed for the semiconductor industry. For the development of these machines, a software stack is used whose development started more than a decade ago. Over the years, different machine types were developed from this codebase, which led to a situation where alternative flows are implemented in various areas of the codebase. Therefore, the decision was made to group product-type-specific code, under the constraint that feature development had to continue in the same codebase. Remodeling while cooking! This talk will take you through the remodeling and the challenges that come with it.
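The kind of restructuring described here, pulling machine-type branches out of shared code and behind a per-product-type interface, can be sketched as follows. All names are hypothetical illustrations, not Kulicke & Soffa code.

```python
from abc import ABC, abstractmethod

# Before the remodel, shared code typically branches on machine type:
#   if machine_type == "A": ... else: ...
# scattered through many areas. Grouping product-type-specific code
# means moving each alternative flow behind one interface.

class PlacementFlow(ABC):
    """Interface for the product-type-specific part of a machine cycle."""

    @abstractmethod
    def pick_strategy(self) -> str: ...

class TypeAFlow(PlacementFlow):
    def pick_strategy(self) -> str:
        return "single-nozzle pick"

class TypeBFlow(PlacementFlow):
    def pick_strategy(self) -> str:
        return "gang pick"

def run_cycle(flow: PlacementFlow) -> str:
    # Shared code no longer branches on machine type; it delegates,
    # so feature work on the shared cycle can continue undisturbed.
    return f"executing {flow.pick_strategy()}"
```

The payoff is that new machine types add a class instead of another branch in every alternative flow, which is what makes "remodeling while cooking" tractable.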
Corné van de Pol is a software architect and trainer at Alten Nederland. This position has given him the opportunity to work for a range of companies, including Philips, Vanderlande, ASML and Kulicke & Soffa. He enjoys learning and helping others, and with over 10 years of experience as a professional software engineer, he has specialized in agile software development, object-oriented design and clean code.
Erik Onstenk is lead software architect at Kulicke & Soffa Netherlands. He joined Kulicke & Soffa (formerly Assembléon) in 2007. Over the years, he has worked on the control software of the entire machine portfolio. His current focus is redefining the reference architecture to better suit recent developments and facilitate future expansions.

Why high process compliance is no guarantee for good software quality

In the automotive industry, ASPICE (Automotive SPICE) is used for measuring an organization’s capability to develop high-quality software. Companies supplying software to automotive manufacturers are required to have a minimum maturity level to ensure that they deliver that high quality. Still, having high-quality processes in place and complying with them is no guarantee. To see why that is, and what else is needed to assure high-quality software, we first need to understand the many different aspects of software quality and the influence they have. In this talk, Ger Cloudt will present a holistic view on software quality using the 1+3 SQM approach, addressing the consequences of high or low quality for each of the four defined quality types.

Ger Cloudt studied electronics at the University of Applied Sciences in Venlo (the Netherlands). At companies like Philips, NXP and Bosch, he has gained more than 35 years of experience in in-product software development across different industries, including industrial automation, healthcare, automotive, semiconductors, security and building technologies. After having developed software for over 15 years, he became a software development manager, leading numerous engineering teams. During all these years, he developed a vision on what really matters in software development, which he has encapsulated in his book “What is software quality?”.

Opportunities and challenges of high-throughput 3D metrology equipment for semiconductor process control

With the shipment of its first system to a high-end chip manufacturer, Nearfield Instruments proves that the semiconductor market is very much open to innovative solutions for advanced process control metrology. This first product, Quadra, can measure in-line and in great detail (ångstroms) the on-surface high-aspect-ratio (10:1) features of integrated circuits. The company is now scaling up to deliver dozens of its scanning probe metrology systems per year.

Nearfield founder Hamed Sadeghian foresees the Quadra metrology platform becoming the basis for several products and product lines. All of them will solve different problems the semiconductor industry faces in following Moore’s Law with its ever smaller and 3D features. Nearfield expects to deliver its second product line based on the Quadra platform next year. This system will be able to image, non-destructively, subsurface structures with nano-precision.

In this talk, Hamed Sadeghian will highlight the major requirements for developing non-destructive 3D high-volume manufacturing metrology equipment in the semiconductor industry, the architecture of Quadra (including software) and the challenges faced and overcome. He will also address the impact of the system architecture on the outsourcing strategy to the high-tech supply chain.

Hamed Sadeghian received his PhD (cum laude) in 2010 from Delft University of Technology. Four years later, he obtained an MBA degree from the Vlerick Business School in Belgium. He is the founder (2001) of Jahesh Poulad Co., a manufacturer of mechanical equipment.

Hamed was a principal scientist and Kruyt member of TNO and led a team of thirty researchers in nano-optomechatronic instrumentation at TNO in Delft from 2011 to 2018. In 2016, he co-founded Nearfield Instruments and is currently CEO/CTO at this scale-up that recently shipped its first in-line metrology system to a high-end chip manufacturer.


Mastering the edge: critical factors to enabling edge computing

There’s no denying that cloud computing has been a top technology over the past two decades. Having so many of us work from home since the start of the pandemic would have been impossible not that long ago. Yet even though the cloud is key today, it can’t handle the technologies of the future. Self-driving cars are a perfect example: they need to make ultra-fast, perfectly accurate decisions, with no time to wait for data to be processed in a data center. This is where edge computing comes in. Edge computing cuts across the IoT, from home and work to the most complex environment of all, the vehicle. Coupled with the rising digitalization that connects everything, high-performance edge compute platforms are transforming ecosystems and the development landscape. In this talk, Maarten Dirkzwager will share why mastering edge computing with the right level of safety and security is critical to enabling next-generation technologies.


Maarten Dirkzwager is responsible for corporate strategy and chief of staff to the NXP management team. He joined the company in 1996, when it was still part of Philips. After several roles in central engineering, he moved to Philips Semiconductors in Hong Kong in 2005, where he was responsible for the innovation, efficiency and strategy of the discrete back-end factories. In 2009, he moved to the corporate strategy team in the Netherlands, where he was involved in the transition of NXP to a profitable high-performance mixed-signal player. In 2015, he played a leading role in NXP’s acquisition and integration of Freescale, which created one of the leading semiconductor companies and a leader in automotive semiconductors. In 2017 and 2018, he worked as head of strategy for ASML and AMS, after which he returned to NXP in early 2019.