What is EDA, anyway?

Wikipedia defines Electronic Design Automation as 

"a category of software tools for designing electronic systems such 
as integrated circuits and printed circuit boards."

OK, that's not a terrible start, but it's still not terribly detailed, either. 

The rest of the article is OK, but it's not really up to Wikipedia standards,
so I'm going to take my own swing at things. A few quick disclaimers to get
out of the way:

 * I work in EDA for one of the Big Three dominant players
 * I specialize in IC layout (more on that later), so my knowledge drops off
   quickly the further we move from that portion of the design flow, and goes
   asymptotically to 0 as we move into Printed Circuit Board (PCB) design

				Glossary of Terms

 * IC:   Integrated Circuit
 * PCB:  Printed Circuit Board
 * ASIC: Application Specific Integrated Circuit, a chip or portion of a chip
         tailored to a specific function and immutable once manufactured
 * FPGA: Field-programmable Gate Array, a very large array of standard logic
         gates that can be re-configured as needed for multiple functions
 * Gate: Overloaded term with 2 possible meanings:
         1. An element of Boolean logic (AND, OR, NOT, XOR, etc) made up of
            multiple transistors
         2. The controlling input of an individual Field-Effect Transistor,
            used to enable or disable the channel that conducts current from
            the source to the drain of the FET.
 * Register: A logic circuit used to store the value of a signal until it can
             be sent to the next stage (generally by a clock signal pulse)
 * RTL:  Register-Transfer Level, a style of coding in a hardware design
         language like VHDL or Verilog that describes the movement of signals
         through a circuit from one set of logic registers to the next (see
         the short Verilog sketch after this glossary).
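
To make the last two entries a bit more concrete, here is a minimal Verilog
sketch of a register described at the RTL level (the module and signal names
are my own invention, not taken from any particular design):

    // A hypothetical 8-bit register: on each rising clock edge, the value on
    // d is captured and held on q until the next edge.
    module byte_reg (
        input  wire       clk,
        input  wire [7:0] d,
        output reg  [7:0] q
    );
        always @(posedge clk)
            q <= d;
    endmodule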

				Overview

Broadly, EDA software can be divided into two major areas:

 * Analysis/Verification: Does the design work? Can it be manufactured without
   errors? Analysis tools can include simulators, static design checkers, formal
   verification tools (which use mathematical proofs to check design
   equivalence), and even hardware emulators that can mimic the design in something
   closer to real-time operation (usually by compiling the design onto an FPGA).
 * Design: Defining the architecture/function, creating constraints that must be
   satisfied, describing and implementing the design itself (logically and
   physically). Tools in this space include schematic editors, layout
   generators, logic synthesis, and automatic place-and-route solutions.

Both areas can be applied to either PCBs or ICs. The design challenges
for both are similar in broad strokes, but the physical scale varies vastly,
from a PCB with visible components numbering into the low thousands to an IC
that may contain 1x10^9 transistors (and a similar number of wires).

				So what is EDA doing in the picture?

At its most minimal, it provides basic CAD functionality. The simplest flow can
be conceptualized as defining a design, laying out the components, analyzing the
layout to make sure it matches the logical definition, and sending the design
off for fabrication. Let's take a look at a very basic workflow.

				Basic EDA, at human scales

A user starts a schematic with the parts they need and connects everything manually,
then exports the schematic data to a layout tool to physically place the components 
and connect them with real wires (this layout can also be manual in the most
straightforward case, once the initial data is populated). With the layout done,
an analysis is run to check the layout vs. the schematic (LVS) to make sure the
physical function matches the logical function. Basic design rule checks (DRC)
are also run to look for opens, shorts, and other situations that could produce
defects during the manufacturing process. If everything passes, the design data
is exported in a format suitable for manufacturing and sent off.

				Large-scale designs bring large-scale problems

So, we have seen a very basic flow that is mostly manual and uses EDA software
mainly for data transfer and some analysis (LVS and DRC). For small designs,
manual input is sufficient, and allows for a high degree of designer control,
but it cannot scale to very large designs, because there is only so much a
human can do.

Note: from this point forward most discussion will focus on digital logic
design. The world of analog is still tightly controlled and resistant to
automation beyond what I have described above. 

To scale up, we need to expand the amount of automation in the design process,
starting with the definition of the design itself. We have to move from a
manually-drawn schematic to something more algorithmic, but still with a
reasonable amount of designer control. Into the breach come Hardware Description
Languages (HDLs), which look like computer code, but include constructs like
clocks and events that are necessary to describe the function of a typical
digital circuit. The dominant language today is Verilog (with vaguely C-like
syntax), but some design flows still use VHDL (which has a syntax derived from
Ada). With an HDL, the design can now be described at a higher level, with
signals propagating from stage to stage through the design, being processed with
various logical operations at each stage.
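
As a rough illustration (the names and widths here are invented for the
example, not from any real design), here is a tiny Verilog fragment in which
data is registered, processed by some logic, and registered again at the next
stage on the following clock edge:

    // Hypothetical two-stage pipeline: add, then invert, one stage per clock.
    module two_stage (
        input  wire       clk,
        input  wire [7:0] a,
        input  wire [7:0] b,
        output reg  [7:0] result
    );
        reg [7:0] sum_q;            // register between the two stages

        always @(posedge clk) begin
            sum_q  <= a + b;        // stage 1: register the sum of the inputs
            result <= ~sum_q;       // stage 2: register the inverted sum
        end
    endmodule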

Now, at this point we are presented with a design that has specific functions
defined, but no actual circuits to implement these functions. What shall we do?

Enter Logic Synthesis. The synthesis process takes the design as one input, and
a library of discrete Boolean logic gates as the other. The library contains
specific versions of the various functions needed (3-input AND, 2-input OR, Full
Adder, etc.) at various sizes and drive strengths (a larger, stronger gate can
drive bigger loads and longer wires with sharper signal transitions). The output
of synthesis is a gate-level netlist: the original RTL mapped onto specific
instances of those library cells.
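
Purely as a sketch of the idea (the cell names below are invented placeholders,
not from a real library), a piece of RTL like assign y = (a & b) | c; might
come out of synthesis looking roughly like this gate-level Verilog:

    // Hypothetical gate-level netlist: the same function built from specific
    // library cells, each chosen at a particular size and drive strength.
    module y_logic (input wire a, b, c, output wire y);
        wire and_out;
        AND2_X1 u_and (.A(a), .B(b), .Y(and_out));   // 2-input AND, small drive
        OR2_X2  u_or  (.A(and_out), .B(c), .Y(y));   // 2-input OR, larger drive
    endmodule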

Coming out of logic synthesis, we want to make sure the synthesized netlist
still matches the RTL input. For this, we typically use Formal Verification
tools to run a static
analysis of various test points in the design. Full dynamic simulation can also
be run, but takes significantly longer, so this is often done in parallel with
the rest of the design process. 
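
One way to picture what such an equivalence check is doing: place the original
RTL and the synthesized netlist side by side, drive them with the same inputs,
and prove that their outputs can never differ. A hand-wavy Verilog sketch of
that idea, reusing the hypothetical y_logic module from above:

    // Conceptual "miter" for equivalence checking: if the tool can prove that
    // mismatch is never 1 for any input, the two versions are equivalent.
    module miter (input wire a, b, c, output wire mismatch);
        wire y_rtl, y_gates;
        assign y_rtl = (a & b) | c;                          // original RTL function
        y_logic u_impl (.a(a), .b(b), .c(c), .y(y_gates));   // synthesized netlist
        assign mismatch = y_rtl ^ y_gates;                   // 1 only if they disagree
    endmodule

Real formal tools do this comparison with mathematical proof engines rather
than by simulating every possible input.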

For many designs, this is the point where we add test logic. This can be used
to probe internal design points, but is also used to load test patterns into
the design and then step them through it on a tester to check that they produce
the correct output (this loading process is referred to as "scanning" and the
logic that supports it is often called "scan" logic).

With the design mapped into real circuits, we can now move from definition to
physical layout. This is done with an automated place-and-route (PNR) tool,
which places the logic gates onto a predefined grid within the design, routes
the wiring between the gates, and optimizes the result against various design
constraints that are provided by both the designer and the library provider.
These include timing-based checks, electrical correctness rules, and many other
design rules that probably deserve articles of their own. 

With the design now implemented, we move back into the analysis stage. Formal
checks are run again, to ensure PNR didn't change any logical function.
Simulation may also be run, as we now have real circuits and wires and can do a
full dynamic analysis (which will take a long time). Static Timing Analysis is
done to ensure that every timing path in the design meets its requirements
within the clock cycle, and LVS/DRC checks are done just as in the simpler
flow, only on much larger data sets with far more complex rules.
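
As a rough illustration of a single setup check in Static Timing Analysis (the
numbers below are invented for the example): for each register-to-register
path, the clock period must cover the launching register's clock-to-Q delay,
the combinational logic delay, and the capturing register's setup time.

    T_clock >= T_clk_to_q + T_logic + T_setup

    e.g. with a 1.0 ns clock:
         0.10 + 0.75 + 0.05 = 0.90 ns  ->  0.10 ns of slack, the path passes
         0.10 + 0.95 + 0.05 = 1.10 ns  -> -0.10 ns of slack, the path fails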

Finally, the design is exported to be sent off to manufacturing. I will not even
begin to touch on this process, as semiconductor manufacturing is incredibly
complex and fills books. 

				So what's the point of all that?

In short, scale. As mentioned, a modern IC can have in excess of 1x10^9 devices,
each with multiple wires connected to it, plus thousands of inputs/outputs, and
the whole design must satisfy complex manufacturing rules, stay within power
budgets, and operate over a wide range of temperature and voltage conditions.
Doing this by hand is simply infeasible, so we leverage the vast computational
resources at our disposal to automate the process.

Obviously (?) I have vastly simplified the process here. EDA is a
multi-billion-dollar* industry with thousands of engineers working to improve
the state of the art and push automation into new areas (AI, ML, higher-level
abstraction, etc.). 

All so you can run Doom.


* US version of both billions and dollars