Bluespec - ESL Synthesis


Bluespec, Inc. develops ESL synthesis toolsets that significantly accelerate hardware design and modeling and reduce verification costs. At the end of May the company announced Open ESL Synthesis Extensions to SystemC, creating a unified environment for modeling, design, and verification. I had an opportunity to speak with Shiv Tasker, Bluespec CEO, and George Harper, VP of Marketing, about the product and their firm.

Would you give me a brief biography?
Shiv Tasker: 17 years in EDA. I started at Intergraph, Valid, and Cadence. I was at ViewLogic (Sr. Vice President for Worldwide Sales, Consulting Services and Corporate Marketing) just before the merger with Synopsys. Then I stepped out of EDA for about 6 years. After that I became a cofounder of Bluespec. The company is more than two years old; it will be three in July.

George Harper: I am mostly a chip guy. This is my first EDA company. I have both designed chips (LSI Logic) and marketed chips (Director of Marketing at Trebia Networks, senior marketing positions at Conexant Systems, formerly Maker Communications, and Shiva Corporation).

There is a famous scene in Godfather III where the Al Pacino character says that every time he tries to get out, they pull him back in. You were out of EDA for six years. What brought you back?
Pain! I am a masochist. Actually, I had a very successful startup in life sciences and was thinking about what I would do next. I was introduced to Professor Arvind, the Johnson Professor of Computer Science and Engineering at MIT, who teaches computer architecture and computer science. He said he had some interesting technology. He had founded a company called Sandburst, a fabless semiconductor company offering a 10-Gbit Ethernet router; Sandburst has since been acquired by Broadcom. They were developing this Bluespec technology. In 2003, as a result of the telecommunications meltdown, the VCs were wondering why a chip company was spending money developing an EDA tool. I said, "Let me take a look at it. If it is sufficiently interesting, maybe I will write a business plan and spin it out of Sandburst." It was really a compelling technology. I went out and raised the money and started Bluespec.

The technology was TRS, or Term Rewriting Systems?
The Arvind concept was elevating how we express design by raising the abstraction level around concurrency. I felt it was really powerful in terms of addressing the complex data path and control logic types of problems that had not been successfully solved. A lot of people had tried over a number of years, including, if you remember, Synopsys. Arvind convinced me. When I first encountered the technology I said that there are a lot of bodies here, and I have to wonder if the well is poisoned. He said he was teaching a seminar at MIT: "There are a bunch of people from Intel, IBM and TI coming. Why don't you come?" I took the seminar and the labs. I talked to the guys from Intel and TI and asked them what they thought: is there something here? The unanimous feedback was yes, there is something compelling here. I also came away with a very long list of things that needed to be fixed for it to be a commercial product. That's pretty much where we spent the first year, building the host language around SystemVerilog, which in 2003 was just coming online; Accellera was going through its standardization process. We did that for a couple of years. It was obvious to us that there would always be a group of people who were more comfortable with C++. We asked ourselves, "How can we take the concepts, which are where the value is, and apply them in a SystemC environment?" We spent last year doing that. Now we have both a SystemVerilog environment and a SystemC environment.

Bluespec has raised a couple of rounds of venture capital?
We have done a series A and a series B. I would like to think that we have taken less money than most other EDA companies of our vintage and have gone further in terms of both product delivery and customer satisfaction and having reference customers.

How large a company is Bluespec?
We are around 42 people. We have 18 people in the US and 2 in Europe. The rest are in Armenia and India. All of our engineering is done in the US. Armenia and India handle our Q/A as well as customer support for our European and Indian customers.

Any particular reason why Armenia and India?
There is a lot of chip design going on in India right now. Large customers like ST and TI have very large organizations in India that do a lot of chip design, and we have to find ways of supporting those customers. We chose Armenia because it is very close to Europe, the cost is pretty low, and the quality of the people is very high.

When did Bluespec introduce its first product?
Our first product shipped in mid-2005. We actually started getting our first customers by about 2Q05. With new technology and a different methodology, we had a fairly long sales cycle initially; even though we started working with people earlier, we didn't close our first order until 2Q05. Since then we have had very good traction.

How many customers does Bluespec have?
We don't list the customers. Most of our major customers are in wireless, mobile applications. Our big customers are ST, TI, Nokia and Analog Devices.

Are the products in production use at these firms?
Yes, the previous product, the one based on SystemVerilog, is in production use. The one in the SystemC environment is being rolled out this week.

Who was the target of your first product?
Our first product was more or less an RTL replacement. It is targeted at people who are Verilog and VHDL literate, who want to move up the level of abstraction in how they do implementations, and who are comfortable learning SystemVerilog. The new product is targeted at architects, verification people, and people who are C++ literate and want a way of bridging the gap between architecture and implementation.

Would you describe the product you are announcing?
George: We are bringing the same concepts that we have proven out in our SystemVerilog product to people using SystemC. In many ways we are taking a technology that people are using for modeling and making it really viable for hardware design. We've got ESL Synthesis extensions, similar in concept to the Transaction Level Modeling (TLM) extensions in the SystemC world. They are open, free, and out there for people to download, and people have been downloading them since the announcement. We are doing several things with this announcement in terms of technology and product. For the first time, in a broad-based fashion, we are allowing SystemC to be a single environment for everything from modeling to verification to design. We are providing a very clear path, which didn't exist before, from a higher level description into hardware for control logic and what we call complex data paths, that is, data paths with tightly intertwined control logic. And we are providing a solution that, as we found consistently in the first projects, can deliver designs, from the writing of the design through verification, in less than half the time.

You mentioned projects. Were there any early releases or beta sites?
These are with people who have used the Bluespec SystemVerilog product, which uses the same concepts and the same capabilities in a different syntactic environment. For example, just yesterday we got the results from an evaluation whose conclusion was that it was 5 times faster than what the design would have been in a VHDL environment. If you are familiar with Deepchip, there is a letter from a designer at ST who concluded that his first design was done in half the time; that is, the design through the verification of the module he worked on. The technology is proven, and proven with customers with real results. We expect to see the same results in the SystemC environment.

What deficiencies and issues do you see with SystemC?
SystemC is used today to build high level functional models and to do architectural exploration at a level that is really detached from what the implementation, the hardware architecture, and the microarchitecture will look like. What this means is that when people move further into the design process, into implementation, they do a manual rewrite, because there is no automated way to go from those models to an efficient implementation. They typically get only one shot at doing that, because they have to do it essentially by hand. There are a couple of obvious issues with this. You have to maintain two separate environments, with not a lot of consistency or reuse across them. The other issue is that when you do this high level modeling you are really detached from understanding the implications of your choices in terms of the real hardware. Things like queuing performance you can understand with a high level model, but power, area and cost, if you evaluate those in an environment that is devoid of any understanding of what the hardware looks like, then you really cannot make accurate tradeoffs. Particularly today, with the emphasis on mobility, creating optimal solutions for those markets really means optimizing for cost, because you are dealing with the consumer market, and for low power and low energy usage, because you are not running off the wall, you are running on batteries.

The other thing that we have seen is that the model for concurrency and communication inside of SystemC, with threads and events, is very much an RTL-level model for managing those things. Concurrency with threads is really very difficult. We have found, with people we've talked to who have built SystemC models, that they tend not to build in a lot of concurrency, because they are trying to avoid the complexity that threads and events force upon them. This means the models inherently are not going to reflect what the hardware is going to look like, because you don't have a lot of concurrency in the model. It is difficult to scale complex concurrency-based models, so ultimately concurrency is avoided. Again, this makes it harder to translate those high level models, automatically or even manually, into a real implementation.
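The contrast Harper draws, threads and events versus a higher level description of concurrency, can be made concrete with a toy sketch. The Python below is illustrative only; the names `RuleSim`, `rule`, and `step` are invented for this sketch and are not Bluespec's actual semantics or API. The idea it shows is guarded atomic actions: a scheduler fires every rule whose guard currently holds, one at a time, so each action runs to completion and the coordination lives in the guards rather than in thread interleavings.

```python
# Toy sketch (hypothetical names, not Bluespec's real semantics):
# concurrency expressed as guarded atomic "rules" instead of threads
# and events. Each rule is a (guard, action) pair; a scheduler fires,
# one at a time per step, every rule whose guard currently holds, so
# actions never interleave mid-update.
from collections import deque

class RuleSim:
    def __init__(self):
        self.rules = []  # list of (name, guard, action)

    def rule(self, name, guard, action):
        self.rules.append((name, guard, action))

    def step(self):
        """One 'cycle': fire each currently-enabled rule atomically."""
        fired = []
        for name, guard, action in self.rules:
            if guard():      # implicit condition, like a rule guard
                action()     # atomic: runs to completion before the next rule
                fired.append(name)
        return fired

# A two-stage pipeline: input -> stage -> output. Each transfer is
# guarded by queue state; there are no explicit handshaking threads,
# event notifications, or sensitivity lists.
inq, midq, out = deque([1, 2, 3]), deque(), []
sim = RuleSim()
sim.rule("stage",   lambda: bool(inq),  lambda: midq.append(inq.popleft() * 10))
sim.rule("consume", lambda: bool(midq), lambda: out.append(midq.popleft()))

while inq or midq:
    sim.step()
print(out)  # each element has passed through both stages
```

With threads, the two stages would need events or condition variables to coordinate; here each rule simply states when it may fire, which is the abstraction-raising move the interview describes.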

Are you familiar with Gartner Dataquest's ESL Landscape, shown at DAC in 2005, which maps where vendors in high level modeling fit in terms of automating synthesis down into RTL? There are three different design types or methodologies: algorithmic, processor/memory, and pure control logic based. The second I equate to complex data path design, as in memory controllers and DMA controllers. If you look at the firms that are automating high level ESL models, e.g. Forte, AccelChip, and Synplicity, they all fit squarely into the first category. These are people taking C and C++ descriptions of an algorithm, typically tightly nested for loops, and automating the generation of RTL. Those would be things like FIR filters and IFFTs. Everything else that people do in a typical hardware design doesn't fit into that category. So if you are building a cache controller, a DMA controller, a network engine, a bridge chip, and so on, none of those things fit into that category. In terms of synthesis, Bluespec, with our SystemVerilog based product, is the only solution that raises the level of abstraction and automates synthesis in those areas.

The question then is, "What is not covered, previous to this announcement, by existing SystemC, C, and C++ synthesis?" Thierry Baucheon, R&D Director at the HEG Division of STMicroelectronics, says, "We've looked across the things that we are doing and Bluespec is the only solution that can address 90% of that."

Based upon these issues with SystemC, and where current products fit today, we believe that one of the big things SystemC needs is ESL synthesis as described by the Gartner Dataquest Landscape.

Where does Bluespec stand?
We had our initial customer in Q2 of last year and have seen uptake including three of the top IDMs in the mobile industry. Our results are proven, and there is a pretty compelling quote saying we are twice as fast for the first project out of the gate with them.

Bluespec has been used for cache controllers, processors, memory controllers, DSPs, bus bridges, DMA controllers, serial controllers, audio, video, and bus controllers, all things you would never see a traditional behavioral synthesis tool do, but which we can do with ESL Synthesis much more effectively while still generating high quality RTL.

The product?
Shiv talked about raising the level of abstraction for the way concurrency and communication are expressed. Basically, instead of threads and events we have something called rules, which allow a much higher level way of describing things, closer to the operation, and which help simplify the complexity of expressing concurrency. Then there is something called interface methods, which allow a much more powerful way of expressing how you communicate between blocks. Rules are how you express things within a module, how you articulate complex concurrency. Interface methods are how you compose larger systems very quickly. You can start thinking about building a harness of what your interfaces look like across your system. The power of the interface method is the ability to express not only what the port list or the wire interface looks like, but also what the correct protocol and behavior is for communicating between blocks. That becomes a very powerful way of taking building blocks and composing them together much more quickly, without having to resort to a paper specification of what the interface looks like; in a sort of automated way you get interfaces properly connected and working. Both of these things are built into our ESL extensions for SystemC. We have a language reference manual, as well as examples and tools, available on our website for download.
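The interface-method idea, that a method carries its own readiness condition along with its data, can be sketched in the same toy style. The class and method names below (`GuardedFIFO`, `can_enq`, `can_deq`) are hypothetical, not Bluespec's real API; the point is that the handshake travels with the interface instead of living in a paper specification.

```python
# Toy sketch (hypothetical names): an "interface method" bundles a data
# transfer with its readiness condition, so a caller can only interact
# when the protocol allows it. Misuse is caught automatically rather
# than documented on paper.
class GuardedFIFO:
    def __init__(self, depth):
        self.depth, self.items = depth, []

    # Each method exposes a readiness predicate alongside its action,
    # the way an interface method carries an implicit condition.
    def can_enq(self): return len(self.items) < self.depth
    def can_deq(self): return len(self.items) > 0

    def enq(self, x):
        assert self.can_enq(), "enq while full: protocol violation"
        self.items.append(x)

    def deq(self):
        assert self.can_deq(), "deq while empty: protocol violation"
        return self.items.pop(0)

# Composing blocks: a producer and consumer connect only through the
# methods; correct use of the handshake is enforced by the interface.
f = GuardedFIFO(depth=2)
sent, received = [10, 20, 30], []
while sent or f.can_deq():
    if sent and f.can_enq():
        f.enq(sent.pop(0))
    if f.can_deq():
        received.append(f.deq())
print(received)  # data arrives in order, handshake enforced
```

Because the protocol is part of the interface, two blocks written independently can be connected without a separate written agreement about when each side may push or pull data, which is the composition benefit described above.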

What is the tool flow for your product?
You can think about a system model where you take arbitrary SystemC blocks as models, say a codec model, and mix them with other components in your system designed with the ESL synthesis extensions, which we are calling ESE (pronounced "easy"). Then you can simulate those in standard SystemC and C tools (GCC, the OSCI simulator, or a commercial SystemC simulator). The arbitrary SystemC blocks you have today run through core SystemC with TLM extensions, and the ESE based designs run through the ESL Synthesis extensions; the whole design will simulate. Later this year we will be rolling out our synthesis tool, ESEComp, which will take the ESE SystemC blocks and synthesize very high quality RTL out of them.

What are the benefits?
What are the things one would like to see out of an ESL solution for chip based design? A common environment, a common language, a unified environment for doing modeling and verification. Not only very high level macroarchitectural exploration, but also marrying that with effective microarchitectural exploration, particularly as you start thinking about low power and low cost design for consumer based solutions. Ideally, in addition to the algorithmic synthesis solutions that are commonly available, you would like to be able to synthesize all aspects of the design, including the complex data paths and control logic.

What does ESE provide for modeling and architecture?
For people doing SystemC today, it provides a single environment for modeling and design, eliminates the need for a separate environment, provides the opportunity for a completely different implementation, and provides a common language for modeling and architecture people to discuss things with design people. It is more accurate with respect to the hardware: you can really assess the implications of your choices in terms of power, latency and timing. It provides much faster implementation of complex concurrency, and design composition through interface methods, than you would have with threads and events and a strict RTL model for communicating between blocks.

What does ESE provide for verification?
Similar types of benefits. You are often trying to express concurrency in testbenches, and if you are creating golden reference models to be used for verification comparison against an actual implementation, being able to provide hardware accuracy and to get complexity done quickly is tremendously valuable. Finally, for those who want to do designs, or to pass the designs on, the benefits are obvious in terms of being able to synthesize no-compromise RTL out of the design.

The synthesis extensions are available today on the website. You can download software, documentation and examples, including a language reference manual, tutorials, code examples and a user guide. Importantly, this is free for untimed simulation, which covers a lot of the things people do today with SystemC based models, and it works with the standard tools available today.


ESE      - untimed simulation, no clocks
ESEPro   - clock-scheduled simulation, available today
ESEComp  - synthesizes Verilog RTL from SystemC designs; demos at DAC, rolling out in the second half of the year

Customer quotes from Deepchip and your responses on your website raise issues related to the learning curve associated with new methodologies and/or tools.
Typically we have a three day training session where we go over slides, tutorials and so forth. In all the quotes you have seen, and in every evaluation we have done, design through verification has exceeded that 2X improvement for the first design out of the gate. After one week of training and evaluation preparation you are productive. It takes some time to really become an expert and to feel comfortable, but productivity really happens from day one.

There is an interesting discussion of experience in the letter from the ST engineer that said “Even after coding with this for a few weeks I was convinced that I was more efficient with RTL. But then I started getting into the more complex part. It sort of became easier. In the end it looked like I was 2X faster. For my next design I much prefer to use Bluespec.”

So the first project is highly productive but there is a period of becoming comfortable, an adjustment to loving the tool.
In general we found when people get through their first project, they never want to go back to RTL level design.

Is this based on a double blind study or a gut feel?
Actually, a little bit of both. In many cases someone has done a similar block where they had comparative data, and they are providing that comparison with their experience on the Bluespec projects. This is not true in all cases, because sometimes they are building a new block, so it is not an apples to apples comparison. But in the majority of cases there is some sort of apples to apples comparison within the company, so they can provide feedback.

So, you are providing new tools, or extensions of tools, to targeted end users rather than trying to change who does what?
I think that is true. As Shiv said, the Bluespec SystemVerilog tool targeted hardware designers and architects familiar with hardware languages and concepts; with that solution we are replacing what they are doing today. With the SystemC product we are targeting people doing modeling, architecture and verification today with SystemC, and providing an opportunity to add design to that mix. The main thing we are targeting is along the lines of what people are doing today, but allowing them to do it significantly better. Over time that will provide some opportunity to do more. We believe very much in fitting with existing methodologies and tool sets and not requiring firms to do a lot of organizational learning to adopt a really compelling solution. If you try to change too many dimensions, it becomes very difficult to adopt a new approach.

How does Bluespec sell its products?
It's primarily direct sales. We have a few reps. We have a west coast presence that covers the east coast and east Asia, somebody who covers the other two thirds of the US and Canada, and somebody who covers Europe and India. We have a few select reps that work in a direct sales capacity, and a distribution rep in Taiwan.

How many seats does the ever hard to define typical user have?
It depends on the product. The synthesis tool is a time-based license; typically there would be some sharing, so there will not be just one tool at a customer. We have simulation-based licenses as well. Depending on the customer, these could range from multiple seats per engineer to less than one seat per engineer. It depends upon how the tool is leveraged by a particular engineer.

I think you said that the major application for these tools has been mobile cell phones.
It is mostly a coincidence rather than anything else. We have had tremendous success with mobile semiconductor and systems companies. There are some good reasons why that area of the market has adopted Bluespec. It offers the ability to provide very rapid architectural exploration and feedback, not only improving the design cycle, but also enabling the best decisions about the implementation for the consumer marketplace. There are lots of good fits there. The tool sets are generally applicable to chip design, whether ASIC or FPGA.

Whom do you see as competition for your products?

Generally we do not see a direct competitor. There are people currently using SystemVerilog; the decision would be, do I do it at the Verilog level or do I do it at a higher level. We see the algorithmic synthesis tools as being really very complementary to Bluespec. If you want an implementation generated automatically from a SystemC or C based algorithm, that's a terrific fit for an algorithmic synthesis tool; if you are looking to do that algorithm by hand in RTL, we think we are a viable alternative. They automate high level descriptions from C to RTL; we focus on complex data paths and control logic. We expect that we will be in the same design groups, as part of a complete ESL story for people doing hardware design.

What is the market size for this type of product?
If you think about the market size for people doing modeling and chip design today, that's the TAM (Total Available Market) for what we are doing. Our focus is on people looking to drastically improve chip development and drastically reduce its cost, whether ASIC or FPGA, and on people who want to improve quality and wrestle with the complexity of what designs have become today. Back when I did my first chip design, I worked on a MIPS processor at LSI Logic; my first design was done in schematic capture. You can't afford to do that with designs today, given their complexity. It is getting to the point where it is untenable to do very high quality, fast implementations of designs in RTL. We are offering something the industry didn't have prior to Bluespec: a way to take complex hardware, complex concurrency-based designs, and do them at a much higher level and higher quality, and yet integrate nicely with existing RTL level tools.

No competition?
When we go to prospects, they are making a decision about whether to continue to do RTL level design or not, so in some sense our competition is existing approaches. But in terms of the technology, and in terms of where we are positioned, for the types of solutions we do well, taking a high level description of complex data paths to hardware, we are not aware of any competing technology.

What do you see as obstacles to your commercial success?
Shiv: Inertia is a big one. Methodology change is never easy. In addition to retraining people in a new skill set, there are always things people need to work through in terms of implications on process and methodology, organization and on who does what. These things take time. We are 15 years into RTL design. A lot of people have tried things and have a sour taste, a very jaded view about the possibility that there is something that can do a very high quality hardware implementation for the types of things they are doing. Part of the inertia is really overcoming preset viewpoints as to what is possible.

What is your strategy for doing that?
Shiv: My concise description is to focus on a few customers, make them successful and widely publicize those successes.

That takes time on both the sales and implementation side?
Yes. That is reflected in how much money you take in, how you calibrate your burn rate, how you set the size of the organization, how careful you are in terms of matching revenue and expenses, and all those things. It is a philosophy. We consider this to be a marathon, not a sprint, and we have geared ourselves to running a marathon.

The top articles over the last two weeks as determined by the number of readers were:

Mentor Graphics Announces User2User 2006 Best Paper Award Winners User2User 2006 was held May 2 through May 5, 2006, in San Jose, California, and was attended by approximately 500 Mentor customers from 12 countries. U2U 2006 presentations and papers are available at

Cadence Debuts Industry's First Transaction-Based System Verification and Management Solution
Cadence announced the industry's first automated end-to-end transaction-based flow from architectural modeling to full system validation. The newly enhanced Incisive Enterprise solution combines verification management technology, SystemC/mixed-language simulation, and hardware acceleration/emulation for customers verifying and validating complex SoCs and systems. The Incisive Plan-to-Closure methodology has been updated to include transaction-based acceleration and transaction-level modeling methodologies to guide design and verification teams through the verification process.

Joe Costello to Keynote Monday Opening of Design Automation Conference; Former Cadence CEO Challenges Semiconductor Industry to See Technology Through the Eyes of Consumers
Joe Costello, chairman of Orb Networks and former CEO of Cadence Design Systems, will open this year's Design Automation Conference (DAC) with a keynote titled, "iPod or Iridium: Which One Are You Going To Be?" During this session, Joe will challenge participants with these fundamental questions: Are you going in the right direction? Are you bending your minds with the complexity of implementing modern-day systems and chips? Are you racing toward the right finish line? What are consumers really looking for? What will convergence really lead to and are you positioned to take advantage of all it will bring our industry and our world?

Mentor Graphics Introduces Catapult SL, the First High-Level Synthesis Tool to Create High-Performance Subsystems from Pure ANSI C++ Mentor today expanded the Catapult product line with Catapult SL (System Level). The Catapult SL tool supports complex hierarchical design, includes new technology that improves block-level performance and offers links to power analysis tools to help reduce power consumption by up to 30 percent. The Catapult SL tool is priced at $350,000, and is currently available in either term or perpetual licenses. Other members of the Catapult product family are priced starting at $140,000.

National Semiconductor Equips All Employees With 30-Gigabyte Video Apple iPods to Cap Off Most Successful Year in Company's History While designed for personal entertainment, the popular Apple MP3 player will be used as a new training and communications tool at National, providing a convenient real-time method for its 8,500 employees to download National podcasts and other employee communications. National reported sales of $2.16 billion for fiscal 2006, which ended May 28, 2006

43rd Design Automation Conference Features Management Day Focused on Intersection of Business and Technology DAC will offer a Management Day track again this year, following the success of last year's "Management Day at DAC." The Management Day track, to be held on Tuesday, July 25, helps managers make decisions where the technology and business of IC and system design meet. The 43rd DAC will be held July 24-28, 2006, at the Moscone Center in San Francisco.

Other EDA News

Gidel is proud to introduce PROCSpark ll™

Design Automation Conference to Host 3rd UML for SoC Design Workshop July 23

Freescale Tapeout Marks 1,000th Design for Cadence CeltIC NDC

Making the Case: ENOVIA MatrixOne to Outline Criteria for Driving Corporate-Wide Buy-In for PLM

Comit Systems Expands Adoption of Cadence Encounter Digital IC Design Technology

Renesas Deploys Novas' Debug System

ATI Implements Mentor Graphics Modular TestKompress for Production Test of Advanced 90nm Graphics Processor

Verari Systems Announces New CEO

Celoxica, GiDEL Bring FPGA+DSP Acceleration to Embedded Imaging Applications

Altera's Stratix II FPGAs Provide Complete Design Security Solution for Protection of Intellectual Property

Hong Kong's ASTRI Reconfirms Cadence as Key EDA Solutions Provider

Other IP & SoC News

MIPS Technologies Joins The SPIRIT Consortium

Acacia Technologies Licenses Resource Scheduling Technology to Madrigal Soft Tools

Samsung to Benefit From ARM and National Semiconductor Power Management Collaboration

AnalogicTech Announces High Efficiency, Integrated Step-Down Converter, LDO for GSM Applications

Therma-Wave Wins Fab Expansion Repeat Metrology Orders

Darfon Selects Cypress's WirelessUSB(TM) LP Radio System-on-Chip and enCoRe(TM) II MCUs for Next-Generation Wireless Mice

Wave Systems Announces New Licensing Agreement with STMicroelectronics for Additional PC Security Software

Avago Technologies Introduces Higher-Resolution Reflective Optical Encoders with Increased Performance, Temperature Tolerance

LSI Extends 4 Gb/s Fibre Channel Reach With New Teradata Deployments

Imagination Technologies Formally Launches U.S. Operations

Primarion Introduces Industry's First Dual-Phase, Programmable Digital Power Conversion and Power Management IC

National Semiconductor's New Boomer Audio Subsystems with RF Suppression Help Shield High-Frequency Interference

DALSA Semiconductor Delivers World's First 100+ Million Pixel CCD Image Sensor Chip to Semiconductor Technologies Associates 'STA'

IDT Enhances Efficiency of Next-Generation Wireless Infrastructures with Industry's Only Pre-Processing Switch