

Introduction to VLSI Systems

4.1 VLSI

The abbreviation VLSI stands for Very Large Scale Integration. Several earlier forms of integration of ICs on a chip were developed before it.



SSI denotes Small Scale Integration, the first ever form of integration, with fewer than 100 transistors per chip. It is the most basic and most primitive form of integration, and SSI devices are no longer in use.


MSI stands for Medium Scale Integration. This is an advanced version of SSI in which a chip might include up to 1000 transistors. MSI devices are more advanced and faster than SSI, and as a result they offer greater processing efficiency.


LSI stands for Large Scale Integration. An LSI chip includes 1,000-10,000 transistors. LSI devices are better than MSI in terms of processing and operating speeds, output efficiency, and so on, and more functions and coding can be incorporated. As a result they paved the way for advances in the technologies of electronics and efficient coding.

VLSI is one of the most modern developments in IC integration. VLSI chips include more than 10,000 transistors. They cannot be handled with ordinary programming and require specialized programming and coding techniques. They paved the way for ASIC designs, VLSI design flows, and complex integrated circuits, and helped revolutionize IC technology.


Structured VLSI design is a modular methodology originated by Carver Mead and Lynn Conway for saving microchip area by minimizing the interconnect fabric's area. This is achieved by the repetitive arrangement of rectangular macro blocks, which can be interconnected by wiring by abutment. An example is partitioning the layout of an adder into a row of equal bit-slice cells. In complex designs this structuring may be achieved by hierarchical nesting. Structured VLSI design was popular in the early 1980s but later lost popularity with the advent of placement and routing tools, which waste a great deal of area on routing; this is tolerated because of the progress of Moore's Law. When introducing the hardware description language KARL in the mid-1970s, Reiner Hartenstein coined the term "structured VLSI design" (originally "structured LSI design").


As microprocessors become more complex due to technology scaling, microprocessor designers have encountered several challenges which force them to think beyond the design plane and look ahead to post-silicon. These challenges are enumerated below, and various technological efforts and modifications are being made to address them.
1. Power usage/Heat dissipation

As threshold voltages have ceased to scale with advancing process technology, dynamic power dissipation has not scaled proportionally. Maintaining logic complexity when scaling the design down only means that the power dissipation per area will go up. This has given rise to techniques such as dynamic voltage and frequency scaling (DVFS) to minimize overall power.
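The effect of DVFS follows from the standard dynamic-power relation P = αCV²f: because supply voltage enters quadratically, lowering voltage and frequency together cuts power roughly with the cube of the scaling factor. The Python sketch below uses invented, illustrative numbers, not figures for any real process:

```python
# A rough sketch of why DVFS saves power: dynamic power scales as
# P = alpha * C * V^2 * f  (alpha: activity factor, C: switched capacitance,
# V: supply voltage, f: clock frequency). All values below are hypothetical.

def dynamic_power(alpha, c_farads, v_volts, f_hertz):
    """CMOS dynamic power dissipation in watts."""
    return alpha * c_farads * v_volts ** 2 * f_hertz

# Nominal operating point (illustrative numbers only).
nominal = dynamic_power(alpha=0.2, c_farads=1e-9, v_volts=1.2, f_hertz=2e9)

# DVFS: scale both voltage and frequency to 80% of nominal.
scaled = dynamic_power(alpha=0.2, c_farads=1e-9, v_volts=0.96, f_hertz=1.6e9)

print(f"nominal: {nominal:.3f} W, scaled: {scaled:.3f} W")
print(f"power saved: {1 - scaled / nominal:.1%}")   # ~49% for a 20% slowdown
```

A 20% reduction in both voltage and frequency thus yields nearly half the power, which is why DVFS is so attractive despite the performance cost.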

2. Process variation

As photolithography techniques tend closer to the fundamental laws of optics, achieving high accuracy in doping concentrations and etched wires is becoming more difficult and prone to errors due to variation. Designers now must simulate across multiple fabrication process corners before a chip is certified ready for production.
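Corner analysis amounts to evaluating the same timing path under several sets of device parameters. The sketch below uses a toy delay model with invented slow/typical/fast (SS/TT/FF) multipliers, not data from any real process design kit:

```python
# A toy illustration of process-corner analysis: evaluate a path-delay model
# at the slow/typical/fast corners and check each against the clock period.
# The multipliers and delays are invented for illustration only.

CORNERS = {"SS": 1.25, "TT": 1.00, "FF": 0.85}   # hypothetical delay multipliers

def path_delay_ns(base_delay_ns, corner):
    """Scale a nominal path delay by the corner's multiplier."""
    return base_delay_ns * CORNERS[corner]

CLOCK_PERIOD_NS = 2.0
for corner in CORNERS:
    delay = path_delay_ns(1.7, corner)
    status = "meets timing" if delay <= CLOCK_PERIOD_NS else "FAILS timing"
    print(f"{corner}: {delay:.3f} ns -> {status}")
```

The point of the exercise is that a design passing at the typical corner can still fail at the slow corner, so sign-off requires all corners to meet timing.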

3. Stricter design rules

Due to lithography and etch issues with scaling, design rules for layout have
become increasingly stringent. Designers must keep ever more of these rules in
mind while laying out custom circuits. The overhead for custom design is now
reaching a tipping point, with many design houses opting to switch to electronic
design automation (EDA) tools to automate their design process.


Though VLSI designs are effective, further improvements and advancements are being made in the field of VLSI. This is done by further increasing the number of transistors per chip, which leads to the development of ultra-VLSI technologies. Problems of VLSI such as power consumption and process variation are also being addressed.

4.2 VHDL

VHDL stands for VHSIC (Very High Speed Integrated Circuit) Hardware Description Language. It is used in electronic design automation to describe digital and mixed-signal systems such as field-programmable gate arrays and integrated circuits. VHDL is designed to fill a number of needs in the design process. First, it allows description of the structure of a system, that is, how the system is decomposed into sub-systems and how those sub-systems are interconnected. Second, it allows specification of the function of the system, that is, what the system does. Third, it allows the detailed structure of a design to be developed from a rather abstract system specification.


VHDL was originally developed at the behest of the U.S. Department of Defense in order to document the behavior of the ASICs that supplier companies were including in equipment. That is to say, VHDL was developed as an alternative to huge, complex manuals which were subject to implementation-specific details. The idea of being able to simulate this documentation was so obviously attractive that logic simulators were developed that could read VHDL files. The next step was the development of logic synthesis tools that read the VHDL and output a definition of the physical implementation of the circuit.

Because the Department of Defense required as much of the syntax as possible to be based on Ada, in order to avoid re-inventing concepts that had already been thoroughly tested in the development of Ada, VHDL borrows heavily from the Ada programming language in both concepts and syntax.

In February 2008, Accellera approved VHDL 4.0, also informally known as VHDL 2008, which addressed more than 90 issues discovered during the trial period for version 3.0 and includes enhanced generic types. In 2008, Accellera released VHDL 4.0 to the IEEE for balloting for inclusion in IEEE 1076-2008. The VHDL standard IEEE 1076-2008 was published in January 2009.

The initial version of VHDL, designed to IEEE standard 1076-1987, included a wide range of data types, including numerical (integer and real), logical (bit and boolean), character and time, plus arrays of bit called bit_vector and of character called string. A problem not solved by this edition, however, was "multi-valued logic", where a signal's drive strength (none, weak or strong) and unknown values are also considered. This required IEEE standard 1164, which defined the 9-value logic types: scalar std_ulogic and its vector version std_ulogic_vector.
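The nine std_ulogic values, and the idea of resolving two drivers on one net, can be sketched as follows. The resolver below is a simplified illustration of the principle (strong values override weak ones, conflicts become unknown), not the full IEEE 1164 resolution table:

```python
# The nine IEEE 1164 std_ulogic values:
#   'U' uninitialized, 'X' forcing unknown, '0'/'1' forcing low/high,
#   'Z' high impedance, 'W' weak unknown, 'L'/'H' weak low/high, '-' don't care.
# Below is a simplified two-driver resolver sketching the idea; it is NOT the
# exact resolution table from the standard.

STD_ULOGIC = "UX01ZWLH-"

def resolve(a, b):
    """Resolve two std_ulogic drivers on the same net (simplified rules)."""
    strong, weak = {"0", "1"}, {"L", "H", "W"}
    if "U" in (a, b):
        return "U"          # uninitialized dominates everything
    if a == "Z":
        return b            # high impedance yields to the other driver
    if b == "Z":
        return a
    if a == b:
        return a
    if a in strong and b in weak:
        return a            # a strong drive overrides a weak drive
    if b in strong and a in weak:
        return b
    return "X"              # conflicting drives: unknown

print(resolve("0", "1"), resolve("1", "Z"), resolve("H", "0"))   # X 1 0
```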


Features of VHDL:

1. Strongly typed language
2. Case insensitive
3. Supports flexible design methods
4. Supports both top-down and bottom-up design methods


VHDL is commonly used to write text models that describe a logic circuit. Such a model is processed by a synthesis program, but only if it is part of the logic design. A simulation program is used to test the logic design using simulation models to represent the logic circuits that interface to the design; this collection of simulation models is commonly called a testbench. VHDL has constructs to handle the parallelism inherent in hardware designs, but these constructs (processes) differ in syntax from the parallel constructs in Ada (tasks). Like Ada, VHDL is strongly typed and is not case sensitive. In order to directly represent operations which are common in hardware, VHDL has many features not found in Ada, such as an extended set of Boolean operators including nand and nor. VHDL also allows arrays to be indexed in either ascending or descending direction; both conventions are used in hardware, whereas in Ada and most programming languages only ascending indexing is available.
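The two indexing conventions can be emulated to show that a descending range simply reverses the mapping from index to storage position. The BitVector class below is a hypothetical illustration of the idea, not a full model of VHDL array semantics:

```python
# Emulating VHDL's "7 downto 0" (descending) vs "0 to 7" (ascending) ranges:
# in both cases the bits are written left to right, but the index attached to
# the leftmost element differs. Illustrative sketch only.

class BitVector:
    def __init__(self, bits, left, right):
        self.bits = list(bits)             # stored left-to-right as written
        self.left, self.right = left, right
        self.descending = left > right     # e.g. (7 downto 0)

    def __getitem__(self, i):
        pos = self.left - i if self.descending else i - self.left
        return self.bits[pos]

word = [1, 0, 1, 1, 0, 0, 1, 0]
down = BitVector(word, left=7, right=0)    # "7 downto 0"
up = BitVector(word, left=0, right=7)      # "0 to 7"
print(down[7], up[0])   # both read the leftmost bit: 1 1
```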

VHDL has file input and output capabilities and can be used as a general-purpose language for text processing, but files are more commonly used by a simulation testbench for stimulus or verification data. There are some VHDL compilers which build executable binaries. In this case, it might be possible to use VHDL to write a testbench to verify the functionality of the design using files on the host computer to define stimuli, to interact with the user, and to compare results with those expected. However, most designers leave this job to the simulator. It is relatively easy for an inexperienced developer to produce code that simulates successfully but that cannot be synthesized into a real device, or is too large to be practical. One particular pitfall is the accidental production of transparent latches rather than D-type flip-flops as storage elements.
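The behavioral difference behind this pitfall is that a latch is level-sensitive while a flip-flop is edge-sensitive. The following Python sketch simulates both; it is a behavioral model for illustration only, not synthesis semantics:

```python
# A transparent latch passes its input whenever enable is high; a D flip-flop
# only samples its input on a rising clock edge. In VHDL, an incomplete
# if/else in a combinational process typically infers the latch by accident.

class TransparentLatch:
    def __init__(self):
        self.q = 0
    def update(self, enable, d):
        if enable:                 # level-sensitive: output follows d while enabled
            self.q = d
        return self.q

class DFlipFlop:
    def __init__(self):
        self.q = 0
        self._last_clk = 0
    def update(self, clk, d):
        if clk and not self._last_clk:   # edge-sensitive: sample on rising edge
            self.q = d
        self._last_clk = clk
        return self.q

latch, dff = TransparentLatch(), DFlipFlop()
# Hold clock/enable high and wiggle d: the latch follows d, the flip-flop holds.
latch.update(1, 1); dff.update(1, 1)          # rising edge: both capture 1
print(latch.update(1, 0), dff.update(1, 0))   # latch drops to 0, DFF holds 1
```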


The key advantage of VHDL, when used for systems design, is that it allows the behavior of the required system to be described (modeled) and verified (simulated) before synthesis tools translate the design into real hardware (gates and wires). Another benefit is that VHDL allows the description of a concurrent system. VHDL is a dataflow language, unlike procedural computing languages such as BASIC, C, and assembly code, which all run sequentially, one instruction at a time.

A VHDL project is multipurpose: once created, a calculation block can be used in many other projects, and many formational and functional block parameters can be tuned (capacity parameters, memory size, element base, block composition and interconnection structure). A VHDL project is also portable: a computing device project created for one element base can be ported to another element base, for example a VLSI with various technologies.


4.3 VERILOG

Verilog is used as a hardware description language, the Verilog HDL. It was designed to be simple, intuitive, and effective at multiple levels of abstraction in a standard textual format for a variety of design tools, including verification simulation, timing analysis, test analysis, and synthesis. The Verilog language essentially contains a set of built-in primitives such as logic gates, user-defined primitives, switches, and wired logic. It also has device pin-to-pin delays and timing checks. The mixing of abstraction levels is supported by semantics such as nets and registers.



Verilog was invented by Phil Moorby and Prabhu Goel in 1983 at Automated Integrated Design Systems (renamed Gateway Design Automation in 1985) as a hardware modeling language. Gateway Design Automation was purchased by Cadence Design Systems in 1990. Cadence now has full proprietary rights to Gateway's Verilog and the Verilog-XL logic simulator. Originally, Verilog was intended only to describe designs and allow simulation; support for synthesis was added afterwards.


With the increasing success of VHDL at the time, Cadence decided to make
the language available for open standardization. Cadence transferred Verilog into
the public domain under the Open Verilog International (OVI) (now known as
Accellera) organization. Verilog was later submitted to IEEE and became IEEE
Standard 1364-1995, commonly referred to as Verilog-95.


Extensions to Verilog-95 were submitted back to the IEEE to cover the deficiencies that users had found in the original Verilog standard. These extensions became IEEE Standard 1364-2001, known as Verilog-2001. Verilog-2001 is a significant upgrade from Verilog-95. First, it adds explicit support for (two's complement) signed nets and variables. Previously, code authors had to perform signed operations using awkward bit-level manipulations (for example, the carry-out bit of a simple 8-bit addition required an explicit description of the boolean algebra to determine its correct value). The same function under Verilog-2001 can be more succinctly described by one of the built-in operators: +, -, /, *, >>>.
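The point of the carry-out example can be made concrete in any language: with sized arithmetic, the carry-out of an 8-bit addition is simply bit 8 of the 9-bit result, with no explicit boolean algebra required. A Python analogue:

```python
# What Verilog-2001's sized arithmetic spares the author: instead of building
# the carry-out of an 8-bit addition from explicit boolean algebra over the
# operand bits, the carry is just bit 8 of the full-width result.

def add8(a, b):
    """8-bit addition, returning (sum mod 256, carry-out bit)."""
    total = (a & 0xFF) + (b & 0xFF)
    return total & 0xFF, (total >> 8) & 1   # carry-out is simply bit 8

print(add8(200, 100))   # (44, 1): 300 wraps to 44 with a carry out
```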


Not to be confused with SystemVerilog, Verilog 2005 (IEEE Standard 1364-2005) consists of minor corrections, spec clarifications, and a few new language features (such as the uwire keyword). A separate part of the Verilog standard, Verilog-AMS, attempts to integrate analog and mixed-signal modeling with traditional Verilog.


SystemVerilog is a superset of Verilog-2005, with many new features and capabilities to aid design verification and design modeling. As of 2009, the SystemVerilog and Verilog language standards were merged into SystemVerilog 2009 (IEEE Standard 1800-2009). The advent of hardware verification languages such as OpenVera and Verisity's e language encouraged the development of Superlog by Co-Design Automation Inc., which was later purchased by Synopsys. The foundations of Superlog and Vera were donated to Accellera and later became the IEEE standard P1800-2005: SystemVerilog.


A Verilog design consists of a hierarchy of modules. Modules encapsulate design hierarchy and communicate with other modules through a set of declared input, output, and bidirectional ports. Internally, a module can contain any combination of the following: net/variable declarations (wire, reg, integer, etc.), concurrent and sequential statement blocks, and instances of other modules (sub-hierarchies). Sequential statements are placed inside a begin/end block and executed in sequential order within the block, but the blocks themselves are executed concurrently, qualifying Verilog as a dataflow language.

Verilog's concept of 'wire' consists of both signal values (4-state: "1, 0, floating, undefined") and strengths (strong, weak, etc.). This system allows abstract modeling of shared signal lines, where multiple sources drive a common net. When a wire has multiple drivers, the wire's (readable) value is resolved by a function of the source drivers and their strengths.
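This resolution can be sketched as a function over (value, strength) pairs. The model below is a simplified illustration; real Verilog distinguishes more strength levels (including charge strengths) and more subtle combination rules than a single maximum:

```python
# A simplified sketch of Verilog net resolution: each driver contributes a
# (value, strength) pair; the readable wire value comes from the strongest
# driver, and equal-strength conflicts resolve to 'x' (unknown). The strength
# ordering follows Verilog's supply > strong > pull > weak > highz, but the
# table here is a reduced, illustrative subset.

STRENGTH = {"supply": 4, "strong": 3, "pull": 2, "weak": 1, "highz": 0}

def resolve_net(drivers):
    """drivers: list of (value, strength) pairs; returns the resolved value."""
    live = [(v, s) for v, s in drivers if v != "z"]   # 'z' drives nothing
    if not live:
        return "z"
    top = max(STRENGTH[s] for _, s in live)
    values = {v for v, s in live if STRENGTH[s] == top}
    return values.pop() if len(values) == 1 else "x"  # conflict -> unknown

print(resolve_net([("1", "strong"), ("0", "weak")]))    # strong 1 wins -> 1
print(resolve_net([("1", "strong"), ("0", "strong")]))  # conflict -> x
```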

The Programming Language Interface (PLI) provides a programmer with a mechanism to transfer control from Verilog to a program function written in C. It is officially deprecated by IEEE Std 1364-2005 in favor of the newer Verilog Procedural Interface (VPI), which completely replaces the PLI. The PLI enables Verilog to cooperate with other programs written in C, such as test harnesses, instruction set simulators of a microcontroller, debuggers, and so on.


Advantages of Verilog:

1. Shortens the design verification loop
2. Verification through simulation
3. Allows architectural tradeoffs with short turnaround
4. Enables automatic synthesis
5. Reduces time for design capture
6. Easy to change

4.4 FPGA


A Field-Programmable Gate Array (FPGA) is an integrated circuit designed to be configured by the customer or designer after manufacturing—hence "field-programmable". The FPGA configuration is generally specified using a hardware description language (HDL), similar to that used for an application-specific integrated circuit (ASIC); circuit diagrams were previously used to specify the configuration, as they were for ASICs, but this is increasingly rare. FPGAs can be
used to implement any logical function that an ASIC could perform. The ability to
update the functionality after shipping, partial re-configuration of the portion of the
design and the low non-recurring engineering costs relative to an ASIC design
(notwithstanding the generally higher unit cost), offer advantages for many
applications. FPGAs contain programmable logic components called "logic
blocks", and a hierarchy of reconfigurable interconnects that allow the blocks to be
"wired together" somewhat like many (changeable) logic gates that can be inter-wired in (many) different configurations. Logic blocks can be configured to
perform complex combinational functions, or merely simple logic gates like AND
and XOR. In most FPGAs, the logic blocks also include memory elements, which
may be simple flip-flops or more complete blocks of memory.
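The reason a logic block can implement "any" combinational function is that its core is typically a k-input lookup table (LUT) holding the function's entire truth table in 2^k configuration bits. A simplified Python model of a 2-input LUT, configured first as AND and then as XOR:

```python
# A k-input LUT is just a 2**k-entry table of configuration bits indexed by
# the input values. Reprogramming the table changes the implemented function.
# This is a behavioral sketch of the idea, not a model of any real device.

def make_lut(truth_table):
    """truth_table: output bits indexed by the inputs packed as an integer."""
    def lut(*inputs):
        index = 0
        for bit in reversed(inputs):      # pack inputs into the table index
            index = (index << 1) | bit
        return truth_table[index]
    return lut

# Configuration bits for inputs (a, b) = (0,0), (1,0), (0,1), (1,1):
and_gate = make_lut([0, 0, 0, 1])
xor_gate = make_lut([0, 1, 1, 0])

print(and_gate(1, 1), xor_gate(1, 1))   # 1 0
```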

In addition to digital functions, some FPGAs have analog features. The most
common analog feature is programmable slew rate and drive strength on each
output pin, allowing the engineer to set slow rates on lightly loaded pins that would
otherwise ring unacceptably, and to set stronger, faster rates on heavily loaded pins
on high-speed channels that would otherwise run too slow. Another relatively
common analog feature is differential comparators on input pins designed to be
connected to differential signaling channels. A few "mixed signal FPGAs" have
integrated peripheral Analog-to-Digital Converters (ADCs) and Digital-to-Analog
Converters (DACs) with analog signal conditioning blocks allowing them to
operate as a system-on-a-chip.[5] Such devices blur the line between an FPGA,
which carries digital ones and zeros on its internal programmable interconnect
fabric, and field-programmable analog array (FPAA), which carries analog values
on its internal programmable interconnect fabric.


The FPGA industry sprouted from programmable read-only memory (PROM) and programmable logic devices (PLDs). PROMs and PLDs both had the option of being programmed in batches in a factory or in the field (field-programmable); however, their programmable logic was hard-wired between logic gates.
In the late 1980s the Naval Surface Warfare Department funded an experiment
proposed by Steve Casselman to develop a computer that would implement
600,000 reprogrammable gates. Casselman was successful and a patent related to
the system was issued in 1992. The 1990s were an explosive period of time for
FPGAs, both in sophistication and the volume of production. In the early 1990s,
FPGAs were primarily used in telecommunications and networking. By the end of
the decade, FPGAs found their way into consumer, automotive, and industrial
applications. FPGAs got a glimpse of fame in 1997, when Adrian Thompson, a researcher working at the University of Sussex, merged genetic algorithm technology and FPGAs to create a sound recognition device. Thompson's algorithm configured an array of 10 x 10 cells in a Xilinx FPGA chip to discriminate between two tones, utilising analogue features of the digital chip. The application of genetic algorithms to the configuration of devices like FPGAs is now referred to as evolvable hardware.


A recent trend has been to take the coarse-grained architectural approach a step
further by combining the logic blocks and interconnects of traditional FPGAs with
embedded microprocessors and related peripherals to form a complete "system on
a programmable chip". This work mirrors the architecture by Ron Perlof and Hana
Potash of Burroughs Advanced Systems Group which combined a reconfigurable
CPU architecture on a single chip called the SB24. That work was done in 1982.
Examples of such hybrid technologies can be found in the Xilinx Virtex-II PRO
and Virtex-4 devices, which include one or more PowerPC processors embedded
within the FPGA's logic fabric. The Atmel FPSLIC is another such device, which
uses an AVR processor in combination with Atmel's programmable logic
architecture. The Actel SmartFusion devices incorporate an ARM architecture
Cortex-M3 hard processor core (with up to 512kB of flash and 64kB of RAM) and
analog peripherals such as a multi-channel ADC and DACs to their flash-based
FPGA fabric.


Applications of FPGAs include digital signal processing, software-defined radio, aerospace and defense systems, ASIC prototyping, medical imaging, computer vision, speech recognition, cryptography, bioinformatics, computer hardware emulation, radio astronomy, metal detection, and a growing range of other areas.

FPGAs originally began as competitors to CPLDs and competed in a similar space, that of glue logic for PCBs. As their size, capabilities, and speed increased, they began to take over larger and larger functions, to the point where some are now marketed as full systems on chips (SoC). Particularly with the
introduction of dedicated multipliers into FPGA architectures in the late 1990s,
applications which had traditionally been the sole reserve of DSPs began to
incorporate FPGAs instead.

FPGAs especially find applications in any area or algorithm that can make use of the massive parallelism offered by their architecture. One such area is code breaking, in particular brute-force attacks on cryptographic algorithms.

FPGAs are increasingly used in conventional high-performance computing applications where computational kernels such as FFT or convolution are performed on the FPGA instead of a microprocessor.