Open Source Processors: Fact or Fiction?
JUNE 27TH, 2019 - By BRIAN BAILEY
Calling an open-source processor free isn’t quite accurate. Open-source processors are rapidly gaining mindshare, fueled in part by the early successes of RISC-V, but that interest frequently is accompanied by misinformation based on wishful thinking and a lack of understanding about what open source actually entails.
Nearly every recent conference has some mention of RISC-V in particular, and open-source processors in general, whether in keynote speeches, technical sessions, or panels. What’s less obvious is that open ISAs are not a new phenomenon, and neither are free, open processor implementations.
“OpenRISC was introduced in 2000 and OpenSPARC has been around since 2006,” says Chris Jones, vice president of marketing for Codasip. “These cores, and other freeware peripheral IPs, made their way to FPGAs, demo platforms, prototypes, but never saw any widescale commercial deployment. So is RISC-V any different, or is it OpenSPARC 2.0?”
The answer may depend, in part, on broad changes underway across the semiconductor industry. “With the diminishing of Moore’s Law, the only way to improve performance is with customization, which leads to the development of more chip variants,” says Raik Brinkmann, president and CEO for OneSpin Solutions. “The open-source nature of RISC-V feeds this paradigm shift.”
This also has contributed to the changed landscape. “The semiconductor industry is in an evolutionary rather than a revolutionary phase,” says Bobe Simovich, director and distinguished engineer for Broadcom. “Consolidation means that more chips are being done in fewer places, and I think this will continue. Many startups are exciting after a drought, but in my opinion, there will be a new wave of consolidation. Many companies are not in the business of solving problems. They’re in the business of avoiding problems.”
So why does this support the need for open-source processors? “There are many reasons why people are interested in an open ISA,” says Simon Davidmann, chief executive officer for Imperas. “Sometimes it is because they want it to be free, or they want it to be open source. Perhaps they don’t want license constraints that tell you what you can and cannot do. In a few years, you will be able to pick up an RTL core that has gone to silicon 20 times and does what you want as a small package. It will be completely free, and yes, it will be open source—just like I can get open-source tools for software. This is the vision.”
In the end it comes down to three drivers, according to Codasip’s Jones:
- A fair licensing model
- The ability to differentiate through processor customization
- Vendor independence
But that doesn’t mean that RISC-V processors are abundant and all free of charge. “The Verilog representation of a processor core can be put in the public domain, but not all of the ancillary deliverables can be,” Jones says. “Many tools, models, scripts, etc., are specific to a particular EDA tool and therefore proprietary. A user of open-source IP must provide those components themselves, or license a commercial RISC-V core, or have a third-party fabless ASIC company implement the core on their behalf.”
What is an open-source ISA?
An ISA defines a hardware/software interface. It defines what each software element, an instruction, is supposed to accomplish on the hardware, and that enables software portability between different implementations of the same ISA.
An open-source ISA means the specification is freely available for anyone to build an implementation around that specification. “There are no patents that protect the ISA,” explains Jerry Ardizzone, vice president of worldwide sales for Codasip. “With most ISAs you cannot just build a processor core and apply it to that instruction set architecture without getting a license from the provider of that ISA. In the case of RISC-V, that ISA is not owned or patented by anyone. It is freely available for anyone to go and build a processor core that is compliant to that ISA.”
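The contract can be illustrated in miniature. In the sketch below, a toy three-instruction ISA (the instructions, encodings, and register count are invented for illustration, not RISC-V) is executed by two deliberately different implementations; because both honor the same specification, software behaves identically on each.

```python
# A toy program in an invented three-instruction ISA:
# "li rd imm" loads an immediate, "add rd rs1 rs2" adds two registers.
PROGRAM = [
    ("li", 0, 5),        # x0 <- 5
    ("li", 1, 7),        # x1 <- 7
    ("add", 2, 0, 1),    # x2 <- x0 + x1
]

def impl_a(program):
    """One 'microarchitecture': dict-based register file, if/elif decode."""
    regs = {i: 0 for i in range(4)}
    for op, *args in program:
        if op == "li":
            regs[args[0]] = args[1]
        elif op == "add":
            regs[args[0]] = regs[args[1]] + regs[args[2]]
    return [regs[i] for i in range(4)]

def impl_b(program):
    """A different internal organization: list registers, dispatch table.
    The internals differ, but the ISA contract forces the same results."""
    regs = [0, 0, 0, 0]
    ops = {
        "li":  lambda r, a: r.__setitem__(a[0], a[1]),
        "add": lambda r, a: r.__setitem__(a[0], r[a[1]] + r[a[2]]),
    }
    for op, *args in program:
        ops[op](regs, args)
    return regs

# Architectural state matches, implementation details notwithstanding.
assert impl_a(PROGRAM) == impl_b(PROGRAM) == [5, 7, 12, 0]
```

The same principle is what lets compiled RISC-V binaries move between cores from different vendors, as long as each core is compliant.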
What you get is a free specification. “RISC-V is simply an open-source ISA, which commercial IP providers implement into an IP product and then sell at industry-standard licensing and royalty models,” says Tim Whitfield, vice president of strategy for Arm’s automotive and IoT business. “These offerings have limited differentiation, which forces IP providers to compete on price within their own ecosystem. In fact, when they do attempt to differentiate, software fragmentation could be a likely outcome, making it difficult for a coherent software ecosystem to grow around the ISA.”
But differentiation is one of the key elements driving RISC-V adoption. “Extensions are important,” says Emerson Hsiao, senior vice president of sales and support for Andes. “RISC-V standardizes the notion of extensions, and there are standard extensions, such as DSP. Not everyone is going to use extensions, but I bet our customers will do it for differentiation, and differentiation is important in today’s industry. RISC-V gives you the option to do that.”
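RISC-V makes room for this by reserving major opcodes (custom-0 is 0b0001011) for vendor-defined instructions that share the standard R-type field layout. The sketch below shows how such an instruction word could be encoded and decoded; the field positions are the standard RISC-V R-type ones, but the instruction’s semantics here are purely hypothetical.

```python
# custom-0 is a major opcode RISC-V reserves for vendor extensions.
CUSTOM_0 = 0b0001011

def decode_rtype(word):
    """Split a 32-bit instruction word into standard R-type fields."""
    return {
        "opcode": word & 0x7F,
        "rd":     (word >> 7)  & 0x1F,
        "funct3": (word >> 12) & 0x7,
        "rs1":    (word >> 15) & 0x1F,
        "rs2":    (word >> 20) & 0x1F,
        "funct7": (word >> 25) & 0x7F,
    }

def encode_rtype(opcode, rd, funct3, rs1, rs2, funct7):
    """Assemble an R-type word from its fields."""
    return (funct7 << 25) | (rs2 << 20) | (rs1 << 15) | \
           (funct3 << 12) | (rd << 7) | opcode

# Encode a hypothetical custom instruction for, say, a DSP-style
# accumulate. The toolchain and core both only need the encoding.
word = encode_rtype(CUSTOM_0, rd=3, funct3=0, rs1=1, rs2=2, funct7=0)
fields = decode_rtype(word)
assert fields["opcode"] == CUSTOM_0
assert fields["rd"] == 3 and fields["rs1"] == 1 and fields["rs2"] == 2
```

Because the reserved opcodes are guaranteed never to be claimed by standard extensions, a vendor’s custom instructions cannot collide with future ratified ones.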
MIPS and Arm have a role in open source, as well. “When Wave acquired MIPS, it was with the express intention to open the architecture,” says Steve Brightfield, senior director of strategic marketing for Wave Computing. “Future architectures will be defined by the consortium and not by MIPS. We do have a lot of patents. We have over 400 patents surrounding MIPS, and part of the initiative is that you get a license to those patents. This helps to protect you in case there is a threat from another architecture vendor. We plan to take the verification suite that we have today and provide that through an independent third-party company so that they can do independent verification of other companies’ MIPS cores. Once they are compliant, they get access to the MIPS open branding, they get access to the MIPS patents.”
Arm is not fully closed, either. “While not open source, the Arm architecture is an open architecture,” says Arm’s Whitfield. “NVIDIA recently announced it was making its full stack of AI and HPC software available to the Arm ecosystem, citing Arm’s ability to offer an open architecture for supercomputing.”
From specification to processor
Several steps are necessary to go from an ISA specification to a working chip.
“To get from an ISA to a CPU implementation that can be used in an SoC is a complex process requiring both skilled engineering resources, along with EDA and compute resource to build a functioning CPU IP block,” says Whitfield, noting that the first step is to create the microarchitecture. “This requires serious investment in engineering and infrastructure. It requires time to develop, test and implement.”
This is a serious investment in time and resources, he says, including:
- Specialist engineering team of micro-architects, who define the processor characteristics, including pipeline architecture, memory system, caching, debug and bus interface. This takes tens to hundreds of man years, depending on complexity.
- Teams of verification engineers to ensure functional correctness and architectural compliance, together with the associated compute and specialist EDA tools and hardware. This takes hundreds of man years and peta cycles of computing.
- Physical design teams to ensure that the processor can be realized in silicon with the correct power, performance and area characteristics. This adds tens to hundreds of man years, depending on complexity, together with the costs of EDA tools and compute infrastructure.
- Integration, which requires models of the CPU to enable software development before silicon is available. That adds more engineering resource and compute.
- Support and maintenance. To get any return on the engineering investment, the processor must be used across multiple projects and teams in the company. This requires support and ongoing maintenance because the processor design may be used for many years.
Some companies have decided to take this on themselves. Others will leave it to one of the companies that are offering this as a service. “Everyone I’ve talked to is doing custom extensions, where they build their own instructions,” says Codasip’s Ardizzone. “They don’t want to contribute this back to the foundation. It’s their differentiation and may make something a little better than anyone else. To survive you have to make money, and that means a reasonable business model. If our strategy is to deliver off-the-shelf RISC-V cores that aren’t customizable, we’ll go out of business very quickly. That’ll be a commodity business. You have to offer customization.”
This is not easy, though. “They want flexibility, performance, they want access to the process resources,” says Andes’ Hsiao. “Implementation is very tightly coupled to the process. They want to add more features, they want multi-cycle instructions. Without a structure in place, it is almost infeasible. What we propose as a processor IP company is that you need to set a boundary. This means a modular approach, more like a container. If the container that is attached to the processor is verified, then the designer will have to design their own extensions or custom instructions based on that boundary. Hopefully that boundary is well-defined and pre-verified and well-contained so that it does not cause issues when you have small extensions.”
Implementations are generally not open source. “It is possible to take an open-source CPU implementation as a starting point,” says Whitfield. “There are open-source RISC-V CPU implementations available, but few of them are maintained to the level required for commercial use. Considerations such as verification quality, ISA compliance, design provenance to meet functional safety standards, support and maintenance, license restrictions and patent protection all potentially add cost and increase risk versus using commercial IP.”
There also has to be trust in the implementation, according to OneSpin’s Brinkmann. “With open source, there is a danger of hardware Trojans or unintended behavior sneaking into the design. Furthermore, commercial vendors, including internal IP groups, must perform extensive verification and demonstrate the results to their customers, while users must also perform verification, especially when adding custom extensions. The use of open IP in critical application domains such as automotive, mil/aero and IoT will have huge challenges when it comes to functional correctness, safety, security and trust.”
But the openness can help, as well. “One of the biggest drivers is that it offers more transparency into the implementation of the cores,” Brinkmann says. “In some scenarios that is really critical. If you don’t trust your supply chain, an open implementation adds visibility. If you use some of the open-source implementations, possibly generated from a high-level language, you do not have visibility into the RTL, but you have the source code in your hand.”
Perhaps the biggest hurdle that has to be overcome with open-source processors is verification. This exists at several levels. The lowest bar is compliance, which ensures the hardware/software contract has been maintained, and that there is a reasonable level of certainty the software will be portable between implementations.
“A standard, robust golden test suite to make sure your RISC-V implementation is compliant does not exist yet,” says Ardizzone. “One reason is that the RISC-V specification isn’t complete yet. It is still evolving. Some of the extensions are still being worked on. Some are complete and ratified, while others are still under negotiation. Once we get closer to that, we will see a big push to create the compliance test suite.”
That’s a good first step, but more is required. “It may run the compliance suite, but that does not mean it is a fully verified core,” says Neil Hand, director of marketing at Mentor, a Siemens Business. “It just means that it works under the conditions of the compliance suite, and you have probably touched a fraction of the scenarios that are possible.”
Others agree. “Running the compliance test suite doesn’t tell you much,” says OneSpin’s Brinkmann. “It would be fairly simple to insert a Trojan into the implementation that would trigger switching from user mode to system mode. It would be hard to detect that either with simulation or random test suites, or bounded property checking. A Trojan may trigger well beyond the bounds of your checking. You need a much more rigorous verification approach that goes end to end so that you can trust the implementation.”
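Why bounded checking misses such a Trojan can be shown with a toy model. In the sketch below (the trigger counter, depth, and privilege model are all invented for illustration), a core silently escalates privilege only after a long run of a rare input, so any exploration that stops short of the trigger depth reports the design clean.

```python
# A counter-based privilege Trojan: fires only after a long trigger run.
TRIGGER_DEPTH = 1_000_000

def step(state, magic_input):
    """One cycle of a toy core. A run of 'magic' inputs of length
    TRIGGER_DEPTH silently escalates from user to machine mode."""
    mode, count = state
    count = count + 1 if magic_input else 0
    if count >= TRIGGER_DEPTH:
        mode = "machine"
    return (mode, count)

def bounded_check(depth):
    """Explore up to `depth` cycles of worst-case (always-magic) input.
    Returns True if no privilege escalation is seen within the bound."""
    state = ("user", 0)
    for _ in range(depth):
        state = step(state, magic_input=True)
        if state[0] == "machine":
            return False   # violation found within the bound
    return True            # "proved" safe -- but only up to `depth`

# A 10,000-cycle bound declares the design safe; the Trojan is only
# exposed when the exploration reaches the trigger depth.
assert bounded_check(10_000) is True
assert bounded_check(TRIGGER_DEPTH) is False
```

This is the gap Brinkmann describes: the bug is real, but it lives past the horizon of any bounded or random campaign, which is why end-to-end proof-based approaches are pitched for trust questions.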
That leads to the next level of verification. Verification of the standard core is required to ensure that all software will execute correctly on the implementation. On top of that is verification of a core that has been extended, which adds a whole new level of complexity even though it doesn’t necessarily require the same level of completeness.
“If I am building a chip and buy a core from Arm or MIPS or Tensilica, they have done an unbelievable amount of verification,” says Imperas’ Davidmann. “If you read the papers from Arm, they talk about peta cycles. That is 10^15. When you change something, you have to re-verify it. When you buy a core from the likes of an Arm, you know it is going to be an Arm and you know it is going to work. The challenge for RISC-V and other open ISAs is that the designer of the processor will have to do that themselves. They probably have 10 times more verification than if they were just plugging in an Arm core, where they have to integrate a piece of IP. The amount of verification that has to be done in the average SoC project is probably 10X if they use an open ISA compared to buying a standard off-the-shelf part.”
MIPS has been working to make a full verification suite available. “Independent companies will get access to the verification suite,” says Wave’s Brightfield. “The consortium will sign agreements with a company in Europe, one in the US, one in China that initially will have the rights and the capabilities to offer services to companies that have an RTL design based on MIPS to get it verified. They will be able to charge a small fee for that, and that will allow the third-party company that has developed the RTL core to validate that they have a compliant core. That gets them the seal of approval and the patent rights and the branding rights.”
Verification is all about risk management. “There are some applications where a standard off-the-shelf solution is going to be perfect,” says Mentor’s Hand. “There will be others where a huge amount of value is added by making some changes. Today, everyone wants to change everything and make it run 10X faster by adding custom instructions, but what is the overhead that is now put on the whole product?”

And you have to think outside of the processor core itself. “In a microprocessor, about 20% of the logic is associated with the ISA,” says Adnan Hamid, CEO at Breker Verification Systems. “These are things like the ALU and registers. The other 80% of it is load and store. It is all about cache and paging and fabric and memories. So open ISA or not, all of that problem remains, especially when you get into bigger chips.”
The good news is that if the core is going to be deeply embedded, you may not have to perform complete verification on the core. “Some companies are going to strip down RISC-V for power optimization, but that chip will still have to talk to commercial IP blocks that will get integrated,” adds Hamid. “Verification traditionally has been to beat every piece to destruction hoping that you are ready to cover everything and do that all the way up the food chain. That means you waste a lot of time verifying things that you don’t care about. With Portable Stimulus (PSS) we can define the flows that our products have to support and push those down. So maybe we only need to test 20% of the capabilities of my RISC-V in my design.”
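The flow-driven narrowing Hamid describes can be sketched as a set computation. In the toy model below, each product flow lists the core capabilities it depends on, and the verification target is just the union; the capability and flow names are invented, and a real Portable Stimulus model would express this with PSS actions and constraints rather than Python.

```python
# Invented capability inventory for a hypothetical RISC-V core.
CORE_CAPABILITIES = {
    "int_alu", "mul_div", "float", "vector", "atomics",
    "compressed", "user_mode", "machine_mode", "interrupts", "debug",
}

# Each product-level flow lists only the capabilities it exercises.
PRODUCT_FLOWS = {
    "boot":        {"int_alu", "machine_mode"},
    "sensor_read": {"int_alu", "interrupts", "user_mode"},
    "dsp_filter":  {"int_alu", "mul_div"},
}

def required_capabilities(flows):
    """Union of everything any supported flow touches: the part of the
    core that actually needs directed verification for this product."""
    needed = set()
    for deps in flows.values():
        needed |= deps
    return needed

needed = required_capabilities(PRODUCT_FLOWS)
# Only 5 of the 10 capabilities are reachable from the product flows,
# so the float/vector/atomics/compressed/debug paths can be deferred.
assert needed == {"int_alu", "machine_mode", "interrupts",
                  "user_mode", "mul_div"}
assert len(needed) / len(CORE_CAPABILITIES) == 0.5
```

The point is the direction of reasoning: start from what the product must do and push requirements down, rather than beating every block to destruction bottom-up.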
When a core is extended, by definition there is no verification suite for it. “Once you start taking an open-source processor and modifying it, suddenly a whole group of people that have never had to deal with the verification of a processor now have to deal with it,” says Hand. “All of that knowledge that has been built up within the processor companies over decades, they don’t have that. There is no magic bullet or tool. The positive of an open accessible standard, both commercial and open source, is that you can grab it, customize it and do really cool things. But they have to make sure it works. It is a very different set of challenges compared to verifying an SoC. Once you change anything in that processor, you have to re-verify the whole thing. You may not know the scope of what you have to verify again.”
And you may have to look into the negative design space, as well, to see if you have inserted additional vulnerabilities. “There are ways to make sure there isn’t anything in the design that you didn’t have in the ISA,” says Brinkmann. “You prove that nothing else can happen. You need to focus the verification on where it matters.”
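At toy scale, "proving nothing else can happen" looks like an exhaustive sweep of the undefined space. The decoder below uses real RISC-V major opcodes for its defined set, but the check itself is only a sketch; on real RTL this negative-space argument is made with formal property checking, not enumeration.

```python
class IllegalInstruction(Exception):
    """Raised when the decoder sees an opcode outside the ISA."""

# A handful of real RISC-V major opcodes for the defined set.
DEFINED = {
    0b0110011: "OP",
    0b0010011: "OP-IMM",
    0b0000011: "LOAD",
    0b0100011: "STORE",
    0b1100011: "BRANCH",
}

def decode(opcode):
    """Decode a 7-bit major opcode; trap on anything undefined."""
    if opcode not in DEFINED:
        raise IllegalInstruction(bin(opcode))
    return DEFINED[opcode]

# Sweep the entire 7-bit space: every opcode outside the ISA must trap,
# so no hidden behavior can exist in the decoder's negative space.
for opcode in range(128):
    if opcode in DEFINED:
        assert isinstance(decode(opcode), str)
    else:
        try:
            decode(opcode)
            raise AssertionError("undefined opcode did not trap")
        except IllegalInstruction:
            pass
```

A 7-bit space can be enumerated; a full instruction word cannot, which is why Brinkmann frames this as a job for formal proof focused "where it matters."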
One of the biggest reasons to have an open ISA is to enable the development of an ecosystem. “Today, the ecosystem for Arm is gigantic, but the ecosystem for RISC-V isn’t at the same level,” says Ardizzone. “However, it is developing quite quickly. We are seeing a lot of design starts, and so you have to follow the money. When customers come in, the ecosystem grows. Today, Arm has 83% of the processor IP business.”
That ecosystem has enormous value. “Arm was recently able to gather data from both our ecosystem and third-party research that quantifies the estimated SoC design costs with and without the support of the Arm ecosystem,” says Whitfield. “The result was more than $20 million in cost savings when designing a 28nm SoC supported by the Arm ecosystem.”
Fig 1: The value of an ecosystem. Source: Arm
As you move towards the advanced nodes, those figures may change significantly. “It is possible to build a mixed-signal SoC on a legacy process node using open source EDA tools but not on advanced nodes such as 7nm where an SoC may require 5 billion to 20 billion transistors,” adds Whitfield. “It comes down to risk. Are commercial designers willing to risk millions of dollars of mask costs, not to mention opportunity cost using tools that do not come with support, rapid bug fixing and foundry certification?”
There’s another side to this, as well. In the past, there were only a few processor companies, and that did not make processors an interesting market for the development of special-purpose EDA tools. But if open ISAs do take off, there could be hundreds of companies developing processors, which could lead to new verification tools targeting this function.
Tools are lacking today. “Where can you buy the tools that are going to solve the problem that Arm, MIPS and everyone else has built themselves over the past 20 years,” asks Davidmann. “They don’t just write some RTL and expect it to work. They have gone down many blind alleys of verification that didn’t work and tried others until they find a solution that works.”
We do have some emerging tools that help. “Formal technologies are good at understanding the dark parts of the design,” says Hand. “PSS is good for looking at scenarios and bringing those up to a higher level and testing the SoC. Hardware-software co-design helps with some of the system-level aspects. There is no one magic bullet bringing these technologies together.”
We can also expect to see appropriate verification IP. “Maybe we will come to a point with newer tools where we can pre-package the tests for that kind of functionality,” says Hamid. “You will pay somehow. Either you buy or you build and pay for good verification tools or you pay for a lot of verification engineers.”
New tools will emerge. “There are new EDA opportunities to automate this verification and make it repeatable and predictable,” says Brinkmann. “As an open ISA there are by definition multiple implementations and it may be a viable option to come up with tools that can automate the process. It is not interesting for us to do that for a specific proprietary ISA with just one implementation, but if there are hundreds of different versions or configurations, it makes a lot more sense.”
All of this leads to commercial EDA vendors needing to step up and fill the gap with solutions to these challenges. And that may be the point when open-source processors become available to the masses.