"May you live in interesting times," goes the familiar Chinese curse. Considered a curse, because interesting times herald change, and change brings with it both good times as well as challenging ones.
This year marks the 25th anniversary of my own involvement with design technology. Way back in 1970, I had the good fortune to bump into Professor Donald Pederson in the computer room of the University of Melbourne, Australia, and before I knew it, I was a SPICE-1 developer, or I suppose a SPICE-1 hacker in today's lingo. And the times, for me at least, have remained "interesting" ever since.
Twenty-five years.
From FORTRAN, punched cards, line printer output, even computer rooms as we old fogies used to call them, to the interactive, multimedia world we live in today. Along the way, I have had the distinct privilege to observe, and participate in, a process of profound change and to work with many very talented people, the people who have driven that change, over the past quarter century.
In a keynote address, one is encouraged to deal only with the prime underlying elements or themes, to try to set a tone for the days ahead. That is not easy in an industry that has proven time and time again, at least to me, that the devil is almost always in the details! Predicting the future of an industry like ours is always risky. I'll begin by setting a context using a few well-known predictions from the past.
As many of you may know, it was back in 1943 when Thomas Watson, then Chairman of IBM, was quoted as saying that he believed the total world market for computers was maybe five.
And Ken Olsen of DEC is quoted as seeing no reason why anyone could possibly want a computer in their home... The votes aren't in on that one yet!
When Popular Mechanics predicted back in 1949 that computers may, someday, weigh less than 1.5 tons, they were both optimistic and, eventually, correct.
On the other hand, the business editor of Prentice Hall was off the mark somewhat in 1957 when he predicted that the data processing fad wouldn't last a year.
The problem is further exacerbated by the fact that many of our users are often more conservative than we are!
...as this IBM engineer demonstrated back in 1968 when he questioned the value of the integrated circuit in design. Fortunately, our users are very quick to appreciate the value of a new tool or methodology, especially when there is even the slightest hint that one of their key competitors might adopt it.
The point I make is that long-range predictions of the future in technology--in our field--tend almost always to be overly pessimistic. The long-term future almost always outperforms the pundits, often in unpredicted directions. We are almost always too conservative, even in an industry that has changed the world.
When we think of the semiconductor industry today, we see plots on semi-log paper continuing upward and to the right. We imagine lots more gates, mixed-signal design, more chips on a board, and lots more hard work for us all. We imagine tough problems just getting tougher, and they are important problems and they are getting tougher. But are big, complex, deep-submicron chips the only path which leads to the future? Fortunately, there is a small cadre of marginally crazy people out there trying to change the problem rather than simply trying to solve it.
Kris Pister Micromachine Chip Photo
Try to imagine, for example, self-powered chips powered only by incident radiation or some novel battery technology, communicating amongst each other using tiny semaphores, by waving tiny micromechanical mirrors, or by making noises like crickets in the night. These unpackaged, tiny silicon die might float around us, perhaps sampling atmospheric conditions and signaling results back to some central site. They might be dumped by the bucket-load from transcontinental jet aircraft, working together as a team as they drift slowly to earth. Or perhaps they might be glued onto the leading edge of a jetliner, steering the plane using thousands of 100-micron-long micro fingers. Or perhaps a slurry of such chips might be painted onto the surface of a bridge, or the walls of a house, sampling temperatures and stresses. Fortunately, there are people considering such things, even developing them. And I'm sure, in the future, such developments will come to pass.
Imagine design tools and services distributed throughout the world, sitting on servers or personal computers just waiting to jump into action. Tools and services provided by industry, by academia, or even by individual contributors working from their homes. Imagine design systems comprised of tens or perhaps hundreds of such tools and services, linked together to evaluate tradeoffs for their users among a wide range of component and architectural options. User interfaces that are automatically downloaded across the network when a capability is invoked, agents that wander the network building repositories of interesting information, guided by general input from the user. Services spanning chip design, packaging, rapid prototyping, assembly, test, and volume manufacturing, of both electrical as well as mechanical subsystems, all on line. Services which provide technical as well as business-related information. Environments in which it is easy to try-before-you-buy, where on-line training comes with the tool or service. Such systems are really not that far off!
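In modern terms, the service linkage sketched above amounts to a shared registry that clients query by name, never knowing where a given tool actually runs. The following toy sketch is purely illustrative: the `ServiceRegistry` class, the service names, and the numbers are all invented, and the "remote" services are stood in for by local functions.

```python
# A toy sketch of a networked design-service registry: providers publish
# named services, and a client discovers and invokes them by name alone.
# All names and figures (optimize_netlist, estimate_area) are invented.

class ServiceRegistry:
    """Maps service names to callables; a stand-in for a network directory."""
    def __init__(self):
        self._services = {}

    def publish(self, name, fn):
        self._services[name] = fn

    def invoke(self, name, *args, **kwargs):
        # In a real system this would be a remote procedure call over
        # a standard protocol; here the "remote" service is local.
        return self._services[name](*args, **kwargs)

# Two independent providers publish services into the shared registry.
registry = ServiceRegistry()
registry.publish("estimate_area", lambda gates: gates * 1.25)  # area units per gate (made up)
registry.publish("optimize_netlist", lambda gates: gates * 9 // 10)  # 10% gate reduction (made up)

# A client chains services into a small design flow without knowing
# where, or by whom, each service is implemented.
optimized = registry.invoke("optimize_netlist", 10000)
area = registry.invoke("estimate_area", optimized)
print(optimized, area)  # 9000 11250.0
```

The point of the sketch is the indirection: swapping a local lambda for a true networked service changes nothing from the client's perspective, which is exactly what standard interfaces and protocols would make possible.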
In this light, if there is a single point I wish to make here today, it is that as a discipline, both in industry and in academia, we are just not taking enough risks today, and most certainly not from a technical perspective.
We still have just as many bright, creative people as we have always had, most of whom stand ready to change the world in very dramatic ways. Many of them continue to be frustrated as they share their vision in a hiring process which often seems to consider Windows 95 programming experience its most important strategic hiring criterion. It is perhaps no wonder then that the business community believes that "Buying companies is a legitimate way of doing research and development in this industry," as Robert Stern, an analyst with Smith-Barney, put it.
Taking technical or business risk is not new to us. Virtually every successful start-up in our industry owes its early advantage to such risks. And the larger companies in our industry are often forced to take on substantial business risk, as they struggle to feed the ever-demanding Wall Street inferno.
However, if we are to continue the revolution, if we are to exceed our own expectations, we must be prepared to take even bigger risks--both from a technical as well as from a business perspective, and both in industry as well as in academia. This risk is a very personal one: it involves putting our reputations and our careers on the line at all levels of our organizations, from student to CEO.
Those who fund research must return to the longer-term emphases that actually led us to where we are today. Don't ask for deliverables in six months or a year; ask what we might learn from this work. Don't ask for software, but rather expect a deeper understanding. Even in the question itself, one prejudices the process. Trust that the deliverables, the understanding, and even perhaps the software will come; it's more likely to be what you really need if you don't insist upon it.
As you look around the Conference this week, and reflect back on the state of our industry a quarter century ago, I'm sure you will agree we have a lot to be very proud of indeed. The $100B semiconductor industry simply would not exist in its present form if it wasn't for our meager $1.2B contribution. ...That's right, we just don't get the respect we deserve!
We sit here, in June 1995, with predictions of a rosy business future. With 1994 growth at almost 11%, its highest level in over five years according to Dataquest, and with some analysts predicting 14% CAGR through 1998, things seem quite healthy.
On the other hand, we have certainly had our ups and downs in the past and, at times, our customers have had good cause for concern. We have seen major design technology companies spring from the vision of a core team to valuations of hundreds of millions of dollars in just a few years. And we have seen many of them vanish into the oblivion of merger and acquisition, or relegation to the pink sheets, even more rapidly.
Over the years, many of us have searched for an understanding of this boom-to-bust phenomenon, and there have been many theories put forward. In 1992, in a tutorial at this Conference, I proposed a stellar analogy for consideration and I will give you an update here today.
Hertzsprung-Russell Diagram
For those who missed the tutorial back then, the analogy is based on this diagram, which attempts to capture the evolution of stars. Most stars start out life in the lower right-hand corner and, as they increase in both temperature and luminosity, they move along the main sequence to the upper left. Depending on prevailing conditions, they may leave the main sequence, expanding in size and cooling down a bit to become red giants. In this process, they often suck in surrounding matter, even perhaps small stars, as they expand their horizons and grow. But in that process, they often become less efficient, and ultimately the nuclear reaction that got them to where they are can no longer support their mass, and so they implode, moving to the lower left of the diagram as white dwarfs, or perhaps even black holes!
In 1992 I proposed an EDA analogy to this evolution, and this is where the companies I used to illustrate the analogy seemed to be at the time. Synopsys was relatively young and growing rapidly. It seemed to me that Viewlogic was about to peel off into red-giant status, and Cadence and Mentor were poised for impending implosion.
Since then, most of the smaller companies shown here were either absorbed by the giants, moved out of the EDA business almost entirely, or changed their names, perhaps hoping to create a new image.
In terms of four of the major players at the time, Cadence, Mentor, Viewlogic, and Synopsys: Synopsys appears to have continued along the main sequence, almost doubling its shareholder value in absolute terms--and seems now well-positioned for red-giant status. Cadence and Mentor both seemed to start out in the direction of implosion, dropping to about one third of their June 1992 value, only to recover to new highs, standing almost 30% above their 1992 levels in absolute terms, while Viewlogic grew to almost double its June 1992 value before collapsing to about one third of that value today. Now you might say that the performance of Cadence and Mentor has bucked the trend, that the model no longer applies. However, if one normalizes the performance of these companies to the overall performance of the market using the S&P 500, the performance of both companies has been almost flat, actually even negative when compared to indices which emphasize high-technology stocks.
Both companies have made dramatic changes in both organization and business emphasis over the past three years, re-orienting themselves in new and exciting directions. As a fellow revolutionary, I sincerely hope these changes lead to success and that these companies can establish a new plateau for design technology tools and services. ...But I'll hold on to my model for a little while yet!
There are still many small companies being created every year, often founded by employees of the major players or by former design technology customers. These smaller companies actually seem to be increasing, as a percentage of the industry.
DAC Booth Space Data
Using Design Automation Conference booth and suite space as a very rough measure of discretionary corporate resources, I have plotted the number of companies as a function of the space they have purchased over the past nine years. As you can see, the segment showing the most rapid growth over the past decade is that of companies purchasing 200 square feet or less. Another observation is that this year we have the largest number of companies ever exhibiting at the DAC.
So the EDA carousel may characterize a phenomenon many of us have observed over the years. But what is its root cause? And is it possible to defuse it?
I believe the cause is most strongly related to a lack of consistent and sustained investment in the technical infrastructure of the industry and that it most certainly could be avoided.
However, once again, it does require considerable risk, all the more so for an established player. The past efforts of SDA Systems, who introduced the concept of a user-extensible EDA system and common data model; Viewlogic, who adopted the EDIF interchange format at a very early stage, even before it was a published standard; and Mentor Graphics, in its Falcon development, represented major investments in such infrastructure and are to be applauded.
Standard ways of representing, storing, and retrieving information, and standard protocols and interfaces which can empower the creators of new forms of intellectual property are a must. I am pleased to see that Sematech, EDAC, and the CFI have embarked on such a project. My hope is that these standards are developed and promoted from a wide-area-network-based perspective--from an Internet perspective--a point to which I shall return later.
There are many possible ways to approach the next revolution in our industry, and I could certainly spend an hour on each one, but in the interest of fueling discussion and debate at the Conference I will at least mention a few of them--the topics that seem to me to be the most obvious, the shortest approaches to revolutionary change in aspects of our industry. However, there are no paths to revolution per se--its eventual outcome cannot be planned. To paraphrase a modern revolutionary in the field of philosophy, revolution, like truth, is actually a "pathless land."
The topic we refer to as Deep Sub-Micron is perhaps the most obvious approach to revolutionary change; certainly it was a hot topic at last year's Conference and is the approach that has been most discussed over the past year. The one caveat I will add here is that this is as much about methodology as it is about tools--as much about the way the tools are organized and the way they interact as it is about the specific capabilities or particular emphasis of the tools themselves. The real winners here will be those who re-think the entire back-end IC design flow, not those who simply improve or augment existing offerings, existing methodology. With one hundred million devices and eight layers of relatively poor interconnect, we are wiring-driven--the problem becomes routing and placement, rather than placement and routing. Even logic synthesis must change, with a return to a switch-level emphasis and library generation on-the-fly.
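To make the "wiring-driven" point concrete, here is a toy sketch of the half-perimeter wirelength (HPWL) metric, a standard rough estimate of the routing a given placement will demand. The cells, nets, and coordinates below are invented; the sketch only illustrates why, in a wiring-dominated regime, placement is evaluated by the routing it implies.

```python
# Toy illustration of a wiring-centric cost metric: half-perimeter
# wirelength (HPWL) estimates each net's routing demand as the
# half-perimeter of the bounding box of its pin locations.

def hpwl(net):
    """Half-perimeter of the bounding box of a net's (x, y) pins."""
    xs = [x for x, y in net]
    ys = [y for x, y in net]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def total_wirelength(nets):
    return sum(hpwl(net) for net in nets)

# Two candidate placements of the same three-net design (coordinates
# are made up): the clustered one pulls connected pins together, so a
# wiring-driven flow would prefer it before detailed routing begins.
spread    = [[(0, 0), (9, 9)], [(0, 9), (9, 0)], [(0, 0), (9, 0), (9, 9)]]
clustered = [[(0, 0), (1, 1)], [(0, 1), (1, 0)], [(0, 0), (1, 0), (1, 1)]]

print(total_wirelength(spread), total_wirelength(clustered))  # 54 6
```

A flow that is "routing and placement, rather than placement and routing" in effect optimizes a metric like this one first, rather than discovering the wiring cost only after cells are placed.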
Representation, Animation, and Modeling
At the other end of the electronic design spectrum, we have heard a lot about HLDA and ESDA, system-on-a-chip and hardware-software co-design. We are told of the advantages of cycle-based simulation and the promise of formal verification. There is a revolution coming here, and all of the aforementioned will doubtless be a part of it. However, once more, this is not a tool issue per se but rather largely a methodology issue. It has to do first with how best to represent the design at higher levels of abstraction: what are the right primitives to use in our models? Right from both the user's point of view as well as the design technologist's. Acceptable ways to represent the passage of time, to represent the protocols between elements of the design and, of at least equal importance, the best ways for the user to interact with the design. In my view, a key insight here is the realization that in today's world, with its complex and ever-changing interactions among the components of an electronic system, modeling and animating the environment in which a design is expected to operate is as important as modeling the design itself. To me, this means a lot more than simply a pins-out perspective. If one starts by building a system that can represent and animate the rest of the world, then animating and evaluating the design itself becomes straightforward. To that end, combinations of hardware as well as software that support the use of real-time external interfaces, like PCI or NuBus for example, can save the user a great deal of modeling effort as well as running time.
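The "model the environment first" idea can be sketched in a few lines. In this purely illustrative toy (the transaction format, the `bus_environment` generator, and the `TinyMemory` design are all invented), the world outside the design is modeled independently as a stream of bus transactions, and the design-under-test is then animated against that stream.

```python
# Toy sketch of environment-first modeling: the environment (a bus
# issuing read/write transactions) is modeled on its own, and any
# design-under-test can be animated against it unchanged.

def bus_environment():
    """Generates (operation, address, data) transactions, standing in
    for the world outside the design. Values are made up."""
    yield ("write", 0x10, 7)
    yield ("write", 0x20, 5)
    yield ("read", 0x10, None)
    yield ("read", 0x20, None)

class TinyMemory:
    """A trivial design-under-test: a memory-mapped register file."""
    def __init__(self):
        self.store = {}

    def respond(self, op, addr, data):
        if op == "write":
            self.store[addr] = data
            return None
        return self.store.get(addr)

# Animate the design against the environment model.
dut = TinyMemory()
responses = [dut.respond(*txn) for txn in bus_environment()]
print(responses)  # [None, None, 7, 5]
```

Because the environment is a self-contained model, it can be reused to evaluate any alternative design that speaks the same protocol, which is precisely the leverage the passage above argues for.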
Chip, Package, and Assembly Integration
According to Technology Forecasters, the world-wide growth in contract board assembly and related services is running at about 15% annually, and this industry will generate about $39B worth of revenues in 1997, $16.5B of which will be generated in North America alone. There are many reasons contributing to this growth, but to me the most important factor is the ongoing trend towards outsourcing of both assembly and manufacturing, driven by the increasing sophistication, and hence cost, of the equipment needed to manufacture competitive products. The factors driving this trend are very similar to those that drove the development of the ASIC industry at the beginning of the last decade. As was the case back then, a key factor which differentiated winners from losers in the early days was the quality of their design technology, the value they could add to their customers' design process. As was the case with the move from silicon foundry to ASIC, the value added through design technology--tools, design flows, and standard libraries--was key to driving higher margins and building a successful new industry. Packaging and assembly is bound to move in the same direction as the players in this industry attempt to move beyond the label of board-stuffer, and high-quality design technology, from tools to services, is bound to play a central role here.
But, as with the ASIC phenomenon, those design technology suppliers who simply test the water with a retrofitted IC/PCB design system are bound to fail. Once again, this revolution demands revolutionary thinking, it demands revolutionaries.
A key to all of the above, and to just about everything we do, is the value we add, the value we can deliver to our customers, and we deliver that value in the form of intellectual property. This is a very complex topic and one that is sure to grow in importance in the years ahead. In the ongoing debate about the role of intellectual property, it is important to apply this term in its broadest possible sense--libraries, tools, methodology, interfaces, standards, services, in the form of hardware as well as software, in fact any organization of information produced by the intellect is a candidate here. However, the word property is as important in this context as the word intellectual. Property implies ownership, and systems which can facilitate the creation and dissemination of a wide range of products of the intellect without compromising the ownership aspect will play a central role in this revolution as well. And this leads me to my final opportunity for revolution.
The Internet Phenomenon
Without doubt, the opportunity which is likely to cause the most dramatic changes in our world-view over the next few years is the Internet--the application of wide-area networks to EDA. Every presentation I have seen and every article I have read which tries to predict the impact of this phenomenon on our industry falls woefully short of the mark. The future EDA scenario I painted earlier, with tools and services linked throughout the world via networks and standard protocols, is almost upon us. If you haven't yet surfed the net with a tool like Netscape, do it! If you don't know what Lycos is, find out. If Tcl/Tk and Java mean nothing to you, it's time to find out what they are. You will be a user; the only question is who will be the provider of your technology.
You can't buy software.
A key insight here is the realization that one cannot really buy software. It cannot be re-sold like a used car or a house. Many now view the running of application software as a performance of it--you perform the software, much like you might perform an opera or a play, or play a videotape to perform a movie.
...but you can pay for it!
However, you certainly can pay for it! We know that! But what does one pay for? The right to use the software and the right to updates. Once this insight is clear, then the way one sells access to software and services can be separated from the physical medium on which it resides. What you pay is related to what level of service you receive, not how many megabytes are packed onto your hard drive. Whether the tools and data reside locally, are copied over the network at run-time, or are invoked on a remote server becomes a caching issue, performance- and security-related but not a direct business issue.
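The separation of payment from medium can be sketched as a simple metering wrapper. Everything in this toy is invented (the `MeteredService` class, the rate, the service name); it only illustrates that if each invocation is metered, the bill depends on service used, not on where the bits reside.

```python
# Toy sketch of pay-for-use rather than pay-for-media: a wrapper meters
# each invocation, so the bill reflects the level of service received,
# not how many megabytes sit on the user's disk. Rates are made up.

class MeteredService:
    def __init__(self, name, rate_per_use):
        self.name = name
        self.rate = rate_per_use
        self.uses = 0

    def invoke(self, work):
        self.uses += 1
        return work()  # run locally, remotely, or from a cache: same bill

    def bill(self):
        return self.uses * self.rate

# Four metered runs of a (stand-in) simulation service.
simulator = MeteredService("simulate", rate_per_use=3.0)
for _ in range(4):
    simulator.invoke(lambda: "results")
print(simulator.bill())  # 12.0
```

The design point is that `invoke` is indifferent to where the work actually executes, which is exactly why locality becomes a caching issue rather than a business one.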
The implication of such a change is profound, especially to todays EDA direct-sales organizations. A user will be able to surf the net, find a tool or service, evaluate it, learn it, use it, and link it into an overall design flow without leaving his or her own desk. User-customized design systems will become the norm rather than the exception, and the small companies with good ideas and useful technologies--useful intellectual property--will be able to provide them with low overhead, via standard and safe EDA interfaces and protocols. Those under-200 square foot companies I mentioned earlier.
In my view, the biggest hurdle we, as an industry, have to overcome to make such a future a reality is the security issue. But, make no mistake, it is a problem that will be solved.
This raises another important--I would even say central--challenge that we face as an industry and as a discipline today--the issue of standards. If we are to empower the many service providers out there, as well as the many who would like to provide service, whether the service be in the form of fabrication, assembly, or design technology support, we must establish some collection of standard interfaces and protocols. Standards, correctly adopted, can empower an entire industry. If developed or promoted incorrectly, they can at best stifle progress, waste millions of dollars, and, more importantly, waste the time of many invaluable minds. Gerry Langeler's comment on a panel at this Conference in 1990 summarized what seemed to be the prevailing view of many of our customers at that time.
With five more years of history behind one such standard, Joe Costello was reported to have made the following comment recently regarding VHDL, a comment with which, as many of you already know, I identify quite closely! (By Jeff Dorsch, Electronic News)
As an industry, we still do a very poor job of providing an effective and balanced forum for the development and critical evaluation of standards.
Groups of well-intentioned and enthusiastic volunteers are certainly not the answer--and I say that as one who has spent a considerable fraction of his adult life playing such a role.
For the most part, standards are viewed by the design technology industry as either a tactical business weapon that can be used to get a leg up on the existing competition or simply as overhead--a capability one is forced to provide to satisfy a customer check list.
By the same token, our customers have at least as critical a role to play in this debate as well. How often do our users really participate with a critical eye in the early stages of the development of a standard? How often have we seen them raise their voices early to actually oppose the development of a proposed standard, even when it is clear to them that it will not improve, but rather is likely to complicate their lives? How often have we heard it said, "The nice thing about standards is that there are so many to choose from."
In proportion to the potential long-term benefit as well as potential costs associated with standards, neither the design technology companies nor their customers spend enough high-quality time on the management of this issue. Let's hope we do a better job with our wide-area network interfaces and protocols.
In summary, we have come a long way over the past quarter century and it should be clear to us all that we have an exciting future ahead of us.
I would like to take this opportunity to thank all of those who I have had the distinct privilege to work with over the years, in all aspects of our field, and most especially I thank my many students. It is through their eyes that I continue to see the world in a new light almost every day.
We are, once again, on the verge of a revolution, a revolution which has a number of important dimensions, each of which will change our world and will benefit our users in profound ways. Success in any one of these areas involves taking risks, big risks, in both research and in development. As a discipline, I hope we find a way to undertake these challenges and opportunities but, no matter what happens, the times are certainly bound to remain interesting.