
11. Explain the lex and yacc tools.
Lex: a scanner generator; the scanner it produces can identify tokens in the input.
Yacc: a parser generator; yacc takes a concise description of a grammar and produces a C
routine that can parse that grammar.
12. Give the structure of a lex program.
Definition section - any initial C program code
%%
Rules section - patterns and actions separated by whitespace
%%
User subroutines section - consists of any legal C code.
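As a sketch of the three sections, here is a minimal, hypothetical lex specification that counts words (the behavior is illustrative; the point is the layout of the sections):

```lex
%{
/* Definition section: initial C code, copied verbatim into the scanner. */
#include <stdio.h>
int words = 0;
%}
%%
[A-Za-z]+    { words++; }       /* Rules section: pattern, whitespace, action */
.|\n         { /* ignore anything else */ }
%%
/* User subroutines section: any legal C code. */
int main(void)   { yylex(); printf("%d words\n", words); return 0; }
int yywrap(void) { return 1; }
```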
13. The lexer produced by lex is a C routine called yylex().
14. Explain yytext: yytext contains the text that matched the pattern.
15. The parser produced by yacc is called yyparse().
16. Why do we have to include y.tab.h in the lex specification?
y.tab.h contains the token definitions, e.g. #define letter 258.
17. Explain the structure of a yacc program.
Definition section - declarations of the tokens used in the grammar
%%
Rules section - grammar rules and their actions
%%
User subroutines section
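The parallel three-section layout can be sketched with a hypothetical yacc grammar for sums of single digits (the DIGIT token and the actions are illustrative assumptions, not from the text):

```yacc
%{
/* Definition section: C declarations used by the grammar. */
#include <stdio.h>
int yylex(void);
void yyerror(const char *s) { fprintf(stderr, "%s\n", s); }
%}
%token DIGIT
%%
/* Rules section: grammar rule (pattern) and action. */
expr : expr '+' DIGIT   { $$ = $1 + $3; }
     | DIGIT            { $$ = $1; }
     ;
%%
/* User subroutines section. */
int main(void) { return yyparse(); }
```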
18. Explain yyleng?
yyleng contains the length of the string our lexer recognizes.
19. Define action?
The C code associated with a lex pattern or a yacc rule. When the pattern
or rule matches an input sequence, the action code is executed.
20. Define Pattern?
In a lex lexer, a regular expression that the lexer matches against the input.
21. What is an Alphabet?
A set of distinct symbols. For example, the ASCII character set is a
collection of 128 different symbols. In a lex specification, the alphabet is the
native character set of the computer, unless you use %T to define a custom
alphabet.
22. Define ASCII?
American Standard Code for Information Interchange, a collection of 128
symbols representing the common symbols found in the American alphabet:
lower- and upper-case letters, digits, and punctuation, plus additional characters
for formatting and control of data communication links.

23. What is Ambiguity?
An ambiguous grammar is one with more than one rule or set of rules that
match the same input.
24. Define Finite Automaton?
An abstract machine which consists of a finite number of states and transitions.
Finite automata are useful in modeling many commonly occurring computer
processes and have useful mathematical properties.
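A finite automaton can be sketched as a small recognizer in plain C; this hypothetical example (not from the text) is a two-state DFA that accepts binary strings containing an even number of 1s:

```c
#include <stdbool.h>

/* DFA with two states: EVEN (accepting) and ODD.
 * Reading a '1' toggles the state; a '0' leaves it unchanged. */
enum state { EVEN, ODD };

static bool accepts_even_ones(const char *s) {
    enum state st = EVEN;                    /* start state */
    for (; *s; s++) {
        if (*s == '1')
            st = (st == EVEN) ? ODD : EVEN;  /* toggle on '1' */
        else if (*s != '0')
            return false;                    /* symbol outside the alphabet */
    }
    return st == EVEN;                       /* accept iff we end in EVEN */
}
```

The machine never looks back at earlier input; its entire memory is the current state, which is what makes it "finite".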
25. Define Input.
A stream of data read by a program. For instance, the input to a lex scanner
is a sequence of bytes, while the input to a yacc parser is a sequence of tokens.
26. Define Language.
A well-defined set of strings over some alphabet; informally, some set of
instructions for describing tasks which can be executed by a computer.
27. Define Precedence.
The order in which particular operations are performed; for example,
multiplication is done before addition in arithmetic expressions.
28. What is a Parser?
A parser for a grammar is a program which takes a string of the language as its input
and produces either a corresponding parse tree or an error.
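To make "parse tree or error" concrete, here is a hypothetical recursive-descent recognizer in C for the toy grammar S -> '(' S ')' S | empty (balanced parentheses); for brevity it reports success or failure instead of building a tree:

```c
#include <stdbool.h>

/* Grammar: S -> '(' S ')' S | empty.
 * Returns true and advances *pp past the longest prefix derivable from S. */
static bool parse_s(const char **pp) {
    if (**pp == '(') {
        (*pp)++;                        /* consume '(' */
        if (!parse_s(pp)) return false;
        if (**pp != ')') return false;  /* missing ')': syntax error */
        (*pp)++;                        /* consume ')' */
        return parse_s(pp);             /* trailing S */
    }
    return true;                        /* empty production always succeeds */
}

/* Accept iff the whole input derives from S. */
static bool parse(const char *s) {
    return parse_s(&s) && *s == '\0';
}
```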
29. What is the Syntax of a Language?
The rules which tell whether a string is a valid program or not are called the syntax.
30. What is the Semantics of a Language?
The rules which give meaning to programs are called the semantics of a language.
31. What are tokens?
When a string representing a program is broken into a sequence of substrings, such that
each substring represents a constant, identifier, operator, keyword, etc. of the language, these
substrings are called the tokens of the language.
32. What is Lexical Analysis?
The function of a lexical analyzer is to read the input stream representing the source
program, one character at a time, and to translate it into valid tokens.
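The character-at-a-time translation can be sketched with a hypothetical hand-written routine in C (the token names are illustrative assumptions, not lex's) that classifies the next token as a number or an identifier:

```c
#include <ctype.h>

/* Illustrative token codes; a real lexer would share these with the parser. */
enum token { TOK_EOF, TOK_NUMBER, TOK_IDENT, TOK_ERROR };

/* Examines one character at a time at *pp and advances past the token found. */
static enum token next_token(const char **pp) {
    const char *p = *pp;
    while (isspace((unsigned char)*p)) p++;      /* skip whitespace */
    if (*p == '\0') { *pp = p; return TOK_EOF; }
    if (isdigit((unsigned char)*p)) {
        while (isdigit((unsigned char)*p)) p++;  /* consume the whole number */
        *pp = p;
        return TOK_NUMBER;
    }
    if (isalpha((unsigned char)*p) || *p == '_') {
        while (isalnum((unsigned char)*p) || *p == '_') p++;
        *pp = p;
        return TOK_IDENT;
    }
    *pp = p + 1;                                 /* unknown character */
    return TOK_ERROR;
}
```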
33. How can we represent a token in a language?
The tokens in a language are represented by a set of regular expressions. A regular
expression specifies a set of strings to be matched. It contains text characters and operator
characters. The advantage of using regular expressions is that a recognizer can be automatically
generated.
34. How are the tokens recognized?
The tokens, which are represented by regular expressions, are recognized in an input
string by means of state transition diagrams and finite automata.
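As a sketch of recognition by a state transition diagram, this hypothetical C routine (not from the text) walks the states of a diagram for identifiers: letter or underscore first, then letters, digits, or underscores:

```c
#include <stdbool.h>
#include <ctype.h>

/* States of the transition diagram:
 * 0 = start, 1 = inside an identifier (accepting), 2 = dead (reject). */
static bool is_identifier(const char *s) {
    int state = 0;
    for (; *s; s++) {
        switch (state) {
        case 0:  /* first character must be a letter or '_' */
            state = (isalpha((unsigned char)*s) || *s == '_') ? 1 : 2;
            break;
        case 1:  /* remaining characters: letters, digits, or '_' */
            if (!isalnum((unsigned char)*s) && *s != '_') state = 2;
            break;
        }
        if (state == 2) return false;  /* dead state: no way back */
    }
    return state == 1;                 /* accept only in the accepting state */
}
```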

35. Are Lexical Analysis and Parsing two different Passes?
These two can form two different passes of a parser. The lexical analyzer can store all the
recognized tokens in an intermediate file and give it to the parser as input. However, it is more
convenient to have the lexical analyzer as a coroutine or a subroutine which the parser calls
whenever it requires a token.
36. What are the Advantages of using Context-Free grammars?
It is precise and easy to understand.
It is easier to determine syntactic ambiguities and conflicts in the grammar.
37. What are Terminals and Non-Terminals in a grammar?
Terminals: all the basic symbols or tokens of which the language is composed are
called terminals. In a parse tree, the leaves represent the terminal symbols.
Non-terminals: these are syntactic variables in the grammar which represent sets of
strings the grammar is composed of. In a parse tree, all the inner nodes represent the
non-terminal symbols.
