Umair Nazir
REG.NO: SP13-BCS-O11
ASSIGNMENT: A.I#2
more than 1 million lines of code... and this may not work properly as assumed.
Human neurons work in parallel; artificial neurons cannot work in parallel.
Artificial neural networks (ANNs) are mathematical constructs, originally designed
to approximate biological neurons. Each "neuron" is a relatively simple element;
for example, it sums its inputs and applies a threshold to the result to determine
the output of that "neuron".
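The sum-and-threshold behaviour described above can be sketched as a small function. This is a minimal illustration, not a production implementation; the weights, threshold, and inputs are made-up values chosen so the neuron acts like a logical AND gate.

```python
# A minimal sketch of one artificial "neuron": it sums weighted inputs
# and applies a threshold to produce the output. All numbers below are
# illustrative, not taken from the text.

def neuron(inputs, weights, threshold):
    """Weighted sum followed by a step (threshold) activation."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Example: two inputs behaving like a logical AND gate.
print(neuron([1, 1], [0.6, 0.6], 1.0))  # fires: 0.6 + 0.6 >= 1.0
print(neuron([1, 0], [0.6, 0.6], 1.0))  # does not fire
```

Adjusting the weights and threshold is exactly the "setting of the weighting on each connection" that the next paragraph describes.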
Several decades of research went into discovering how to build network
architectures using these mathematical constructs, and how to automatically set the
weighting on each of the connections between the neurons to perform a wide range
of tasks. For example, ANNs can recognize hand-written digits.
A "biological neural network" would refer to any group of connected biological
nerve cells. Your brain is a biological neural network, and so is a group of neurons
grown together in a dish so that they form synaptic connections. The term
"biological neural network" is not very precise; it doesn't define a particular
biological structure.
In the same way, an ANN can mean any of a large number of mathematical
"neuron"-like constructs, connected in any of a large number of ways, to perform
any of a large number of tasks.
An artificial neural network is basically a mathematical function. It is built from
simple functions whose parameters (numbers) get adjusted.
Step 4. Steps 2-3 are repeated for all the remaining layers (it is possible for the
model to have more than 2 layers), until we are left with the output node.
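The layer-by-layer repetition described in Step 4 can be sketched as a simple forward pass: each layer applies a weighted sum and an activation, and its result feeds the next layer until only the output node remains. The layer sizes, weights, and choice of ReLU activation here are illustrative assumptions, not details from the assignment.

```python
# A minimal forward pass through a small network: each layer computes an
# activated weighted sum, and the result feeds the next layer until the
# output node. All numbers below are illustrative.

def relu(x):
    return max(0.0, x)

def layer(inputs, weights, biases):
    """One layer: each output neuron is an activated weighted sum."""
    return [relu(sum(x * w for x, w in zip(inputs, ws)) + b)
            for ws, b in zip(weights, biases)]

def forward(x, layers):
    for weights, biases in layers:
        x = layer(x, weights, biases)   # repeat the same step per layer
    return x

# One hidden layer, then a single output node (more layers are possible).
layers = [
    ([[0.5, -0.2], [0.3, 0.8]], [0.0, 0.1]),  # layer 1: 2 inputs -> 2 neurons
    ([[1.0, -1.0]],             [0.0]),       # layer 2: 2 -> 1 output node
]
print(forward([1.0, 2.0], layers))
```

Appending another `(weights, biases)` pair to `layers` gives a deeper model without changing `forward` at all, which is the point of Step 4.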
Q NO #2
Data Structures and Searching Algorithms
Computer systems are often used to store large amounts of data from which individual
records must be retrieved according to some search criterion. Thus the efficient
storage of data to facilitate fast searching is an important issue. In this section, we
shall investigate the performance of some searching algorithms and the data structures
which they use.
Sequential Searches
Let's examine how long it will take to find an item matching a key in the collections
we have discussed so far. We're interested in:
a. the average time
b. the worst-case time and
c. the best possible time.
However, we will generally be most concerned with the worst-case time as
calculations based on worst-case times can lead to guaranteed performance
predictions. Conveniently, the worst-case times are generally easier to calculate than
average times.
If there are n items in our collection - whether it is stored as an array or as a linked list
- then it is obvious that, in the worst case, when there is no item in the collection with
the desired key, n comparisons of the key with the keys of the items in the collection
will have to be made.
To simplify analysis and comparison of algorithms, we look for a dominant operation
and count the number of times that dominant operation has to be performed. In the
case of searching, the dominant operation is the comparison. Since the search requires
n comparisons in the worst case, we say this is an O(n) (pronounced "big-Oh-n" or
"Oh-n") algorithm. The best case - in which the first comparison returns a match -
requires a single comparison and is O(1). The average time depends on the probability
that the key will be found in the collection - this is something that we would not
expect to know in the majority of cases.
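A short sequential search that also counts comparisons makes the O(n) and O(1) cases above concrete. The function name and the return convention (index plus comparison count, with -1 for "not found") are our own choices for illustration.

```python
# Sequential (linear) search over an unsorted list, counting key
# comparisons to show the best and worst cases discussed above.

def sequential_search(items, key):
    comparisons = 0
    for i, item in enumerate(items):
        comparisons += 1
        if item == key:
            return i, comparisons   # found: index and comparisons used
    return -1, comparisons          # not found: n comparisons were made

data = [7, 3, 9, 4, 1]
print(sequential_search(data, 7))   # (0, 1)  best case: one comparison, O(1)
print(sequential_search(data, 8))   # (-1, 5) worst case: n = 5 comparisons, O(n)
```

The second call shows the worst case from the text: when no item has the desired key, all n keys must be compared.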
Binary Search
However, if we place our items in an array and sort them in either ascending or
descending order on the key first, then we can obtain much better performance with
an algorithm called binary search.
In binary search, we first compare the key with the item in the middle position of the
array. If there's a match, we can return immediately. If the key is less than the middle
key, then the item sought must lie in the lower half of the array; if it's greater then the
item sought must lie in the upper half of the array. So we repeat the procedure on the
lower (or upper) half of the array.
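The procedure just described - compare the key with the middle item, then repeat on the lower or upper half - can be written iteratively as follows. This is a sketch for an array sorted in ascending order; the function name and the -1 "not found" convention are our own choices.

```python
# Binary search on an array sorted in ascending order: compare the key
# with the middle item, then repeat the procedure on the half that can
# still contain the key.

def binary_search(sorted_items, key):
    low, high = 0, len(sorted_items) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_items[mid] == key:
            return mid              # match: return immediately
        elif key < sorted_items[mid]:
            high = mid - 1          # key must lie in the lower half
        else:
            low = mid + 1           # key must lie in the upper half
    return -1                       # key is not in the array

data = [1, 3, 4, 7, 9]
print(binary_search(data, 7))   # 3
print(binary_search(data, 8))   # -1
```

Because each comparison halves the remaining range, the worst case needs only about log2(n) comparisons instead of the n required by sequential search.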