1.1 Basic Data Structure (2)
-
8/10/2019 1.1 Basic Data Structure (2)
DATA STRUCTURE
-
Definitions
A data structure is a way of organizing data items that considers not only the elements stored but also their relationships to each other.
A logical and mathematical model of a particular organization of data items is called a data structure.
-
Types of Data Structure
1) Primitive data structure: -
These are basic structures and are directly operated upon by the machine instructions.
Integer
Float
Character
Pointer
-
Types of Data Structure
2) Non-Primitive data structure: -
These are derived from the primitive data structures and emphasize the structuring of a group of homogeneous or heterogeneous data items.
Array
Structure/Union
List
-
Types of Data Structure
List can be further divided into two parts: -
1) Linear list: - Every item is related to its previous and next item. Data is arranged in a linear sequence, and the data items can be traversed in a single run, e.g. array, stack, linked list, queue. Its implementation is easy.
2) Non-linear list: - Every item is attached to many other items. Data is not arranged in sequence and cannot be traversed in a single run, e.g. tree, graph. Its implementation is difficult.
-
Array
An array can be defined as a finite set of homogeneous elements or data items; that is, an array can contain only one type of data.
Array elements are stored at consecutive memory locations, and arrays are static by nature: when the user declares an array, it occupies that memory whether or not the user makes use of all of it.
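As a sketch, the consecutive-storage and constant-time-access properties can be checked directly in C (the function names here are illustrative, not from the slides):

```c
#include <stddef.h>

/* Homogeneous elements at consecutive memory locations:
   element i lives exactly i * sizeof(int) bytes past element 0. */
int element_offset_ok(void) {
    int a[5] = {10, 20, 30, 40, 50};
    return ((char *)&a[3] - (char *)&a[0]) == (ptrdiff_t)(3 * sizeof(int));
}

/* Access by index takes constant time regardless of array size. */
int third_element(void) {
    int a[5] = {10, 20, 30, 40, 50};
    return a[2];
}
```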
-
Linked-List
It can be defined as a collection of a variable number of data items. An element of a list must contain at least two fields: one for storing data or information, and the other for storing the address of the next element. Each such element is referred to as a node; therefore a list can be defined as a collection of nodes.
-
A linked list a1, a2, ..., an is shown in the figure.
As shown in the diagram, the header node contains the address of the first node, which has two parts in which the data and the address of the next node are stored separately. In this way, by traversing, the user can find the other nodes. The last node contains null in its address part.
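A minimal C sketch of such a node, assuming a singly linked list of integers (`push_front` and `list_sum` are illustrative helper names, not from the slides):

```c
#include <stdlib.h>

/* A node holds data plus the address of the next node. */
struct node {
    int data;
    struct node *next;
};

/* Insert a new node at the front of the list; returns the new head. */
struct node *push_front(struct node *head, int value) {
    struct node *n = malloc(sizeof *n);
    if (n == NULL) return head;   /* allocation failed: keep old list */
    n->data = value;
    n->next = head;
    return n;
}

/* Traverse until the null address that marks the last node. */
int list_sum(const struct node *head) {
    int sum = 0;
    for (; head != NULL; head = head->next)
        sum += head->data;
    return sum;
}
```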
-
Stack
A stack is an ordered collection of objects that are inserted and removed according to the last-in first-out (LIFO) principle.
Insertion of an element into a stack is called Push, and deletion of an element from a stack is called Pop. Insertion and deletion of elements can be done only from one end, called TOS or top of stack.
-
Stack
1. Push(2)   → stack contains: 2
2. Push(10)  → stack contains: 10, 2 (10 on top)
3. Pop() → returns 10; stack contains: 2
4. Pop() → returns 2; stack is empty
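The Push/Pop trace above can be sketched with a fixed-size array-based stack in C (a minimal sketch; `struct stack` and `MAX` are assumptions, not from the slides):

```c
#include <assert.h>

#define MAX 16

/* A fixed-size array-based stack; TOS is the only end used. */
struct stack {
    int items[MAX];
    int top;        /* index of the next free slot; 0 means empty */
};

void push(struct stack *s, int v) {
    if (s->top < MAX)
        s->items[s->top++] = v;
}

/* Pop returns the most recently pushed value (LIFO). */
int pop(struct stack *s) {
    assert(s->top > 0);   /* caller must not pop an empty stack */
    return s->items[--s->top];
}
```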
-
Queue
A queue is an ordered collection of elements which works on the FIFO principle.
Elements can be inserted into a queue from one end, called REAR, and elements can be deleted from the other end, called FRONT.
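A minimal circular-array sketch of these REAR/FRONT operations in C (the names `enqueue`/`dequeue` follow the slide's terminology; the struct layout is an assumption):

```c
#define QMAX 16

/* A circular array-based queue: insert at REAR, delete at FRONT. */
struct queue {
    int items[QMAX];
    int front, rear, count;
};

void enqueue(struct queue *q, int v) {
    if (q->count == QMAX) return;        /* full: ignored in this sketch */
    q->items[q->rear] = v;
    q->rear = (q->rear + 1) % QMAX;
    q->count++;
}

/* Dequeue returns the oldest element (FIFO). Caller must check count. */
int dequeue(struct queue *q) {
    int v = q->items[q->front];
    q->front = (q->front + 1) % QMAX;
    q->count--;
    return v;
}
```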
-
Tree
A tree can be defined as a finite set of data items. A tree is a non-linear type of data structure in which data items are arranged or stored in a hierarchical relationship.
-
Graph
A graph G(V, E) is a set of vertices V and a set of edges E. An edge connects a pair of vertices and may have a weight, such as a length, cost, or another measure recorded on the graph.
-
Memory Allocation
There are two types of memory allocation: -
1) Compile-time or static allocation
int x, y;
2) Run-time or dynamic allocation
1) malloc() - malloc(no. of elements * size of element)
2) calloc() - calloc(no. of elements, size of element)
3) realloc() - realloc(ptr_var, new_size)
4) free() - free(ptr_var)
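The four run-time allocation calls can be exercised together in a small C sketch (the helper functions are illustrative, not from the slides):

```c
#include <stdlib.h>

/* Run-time allocation with malloc(no. of elements * size of element),
   growth with realloc(ptr_var, new_size), release with free(ptr_var). */
int grow_and_sum(int n) {
    int *a = malloc(n * sizeof *a);           /* uninitialized block */
    if (a == NULL) return -1;
    for (int i = 0; i < n; i++)
        a[i] = i + 1;

    int *tmp = realloc(a, 2 * n * sizeof *a); /* grow; contents preserved */
    if (tmp == NULL) { free(a); return -1; }
    a = tmp;

    int sum = 0;
    for (int i = 0; i < n; i++)
        sum += a[i];
    free(a);                                  /* return memory to the system */
    return sum;
}

/* calloc(no. of elements, size of element) also zero-initializes. */
int calloc_is_zeroed(int n) {
    int *b = calloc(n, sizeof *b);
    if (b == NULL) return 0;
    int zero = (b[0] == 0 && b[n - 1] == 0);
    free(b);
    return zero;
}
```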
-
Data Structure Operations
Traversing - accessing each record to process that item.
Searching - finding the location of a record with a given key value, or one that satisfies a condition.
Inserting - adding a new record to the structure.
Deleting - removing a record from the structure.
Sorting - arranging records in logical order.
Merging - combining records from two sorted files into one file.
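As a sketch of the traversing and searching operations, a minimal linear search in C (the function name is illustrative):

```c
/* Linear search: traverse each record and compare it with the key.
   Returns the index of the first match, or -1 if not found. */
int linear_search(const int *a, int n, int key) {
    for (int i = 0; i < n; i++)
        if (a[i] == key)
            return i;
    return -1;
}
```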
-
Algorithm
An algorithm is a well-defined computational procedure of finite steps which takes some values as input and produces some values as output.
In other words, an algorithm is a finite sequence of computational steps that transform the input into the output.
-
Complexity of Algorithms
Algorithmic complexity is concerned with how fast or slow a particular algorithm performs. We define complexity as a numerical function T(n): time versus the input size n. We want to define the time taken by an algorithm without depending on implementation details. The way around this is to estimate the efficiency of each algorithm asymptotically. We measure the time T(n) as the number of elementary "steps" (defined in any way), provided each such step takes constant time.
- Time Complexity: running time of the program as a function of the size of the input.
- Space Complexity: amount of computer memory required during program execution, as a function of the input size.
-
Asymptotic Notation
When we look at input sizes large enough to make only the order of growth of the running time relevant, we are studying the asymptotic efficiency of algorithms.
Three notations:
Big-Oh (O) notation
Big-Omega (Ω) notation
Big-Theta (Θ) notation
-
Asymptotic notations
[Figure: graphical comparison of the O, Ω, and Θ bounds]
-
O notation: - (Upper bound) This notation gives the upper bound for a function to within a constant factor. It is represented as
f(n) = O(g(n))
which means that f (the running time of the algorithm) grows no faster than g as n (the input size) gets larger; the growth rate of f(n) is asymptotically bounded above by that of g(n). Big-oh is the most used notation because it represents the worst-case behavior of an algorithm.
-
Θ notation: - (Tight bound) This notation bounds a function to within constant factors. It is represented as
f(n) = Θ(g(n))
if there exist positive constants n0, C1 and C2 such that to the right of n0, the value of f(n) always lies between C1·g(n) and C2·g(n) inclusive.
-
Ω notation: - (Lower bound) This notation gives a lower bound for a function to within a constant factor. It is represented as
f(n) = Ω(g(n))
if there exist positive constants n0 and C such that to the right of n0, the value of f(n) always lies on or above C·g(n).
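The three notations can be stated formally in standard textbook form, consistent with the constants C, C1, C2 and n0 used above:

```latex
f(n) = O(g(n))      \iff \exists\, C > 0,\ n_0 > 0 :\ 0 \le f(n) \le C\,g(n) \quad \forall n \ge n_0 \\
f(n) = \Omega(g(n)) \iff \exists\, C > 0,\ n_0 > 0 :\ 0 \le C\,g(n) \le f(n) \quad \forall n \ge n_0 \\
f(n) = \Theta(g(n)) \iff \exists\, C_1, C_2 > 0,\ n_0 > 0 :\ C_1\,g(n) \le f(n) \le C_2\,g(n) \quad \forall n \ge n_0
```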
-
Categories of algorithm
Seven functions that often appear in algorithm analysis:
Constant      1
Logarithmic   log n
Linear        n
Log Linear    n log n
Quadratic     n^2
Cubic         n^3
Exponential   2^n
-
The Constant Function
Constant Time: O(1)
An algorithm is said to run in constant time if it requires the same amount of time regardless of the input size.
Examples:
array: accessing any element
fixed-size stack: push and pop methods
fixed-size queue: enqueue and dequeue methods
-
The Linear Function
Linear Time: O(n)
An algorithm is said to run in linear time if its execution time is directly proportional to the input size, i.e. time grows linearly as the input size increases.
Examples:
array: linear search, traversing, find min
ArrayList: contains method
queue: contains method
-
Logarithmic Time: O(log n)
An algorithm is said to run in logarithmic time if its execution time is proportional to the logarithm of the input size.
Example:
binary search
Quadratic Time: O(n^2)
An algorithm is said to run in quadratic time if its execution time is proportional to the square of the input size.
Examples:
bubble sort, selection sort, insertion sort
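The binary-search example can be sketched in C, assuming a sorted array (a standard formulation, not taken verbatim from the slides):

```c
/* Binary search on a sorted array: each step halves the search
   interval, so the running time is O(log n).
   Returns the index of key, or -1 if absent. */
int binary_search(const int *a, int n, int key) {
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;   /* avoids overflow of lo + hi */
        if (a[mid] == key)
            return mid;
        else if (a[mid] < key)
            lo = mid + 1;
        else
            hi = mid - 1;
    }
    return -1;
}
```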
-
The Cubic Function and Other Polynomials
Cubic function:
f(n) = n^3
Polynomials:
f(n) = a_0 + a_1 n + a_2 n^2 + ... + a_d n^d
d is the degree of the polynomial.
a_0, a_1, ..., a_d are called coefficients.
-
The N-Log-N Function
f(n) = n log n
This function grows a little faster than the linear function and a lot slower than the quadratic function.
-
Example
Algorithm Mystery(n)                # operations
    sum ← 0                         1
    for i ← 0 to n − 1 do           n + 1
        for j ← 0 to n − 1 do       n(n + 1)
            sum ← sum + 1           n · n
Total number of steps: 2n^2 + 2n + 2
-
Example: Find the sum of array elements
Input size: n (number of array elements)
Algorithm arraySum(A, n)
    Input: array A of n integers
    Output: sum of elements of A    # operations
    sum ← 0                         1
    for i ← 0 to n − 1 do           n + 1
        sum ← sum + A[i]            n
    return sum                      1
Total number of steps: 2n + 3
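The arraySum pseudocode translates directly to C; the step counts from the slide appear as comments:

```c
/* arraySum from the slide, written in C: one pass over the array,
   2n + 3 elementary steps, i.e. O(n) time. */
int array_sum(const int *A, int n) {
    int sum = 0;                  /* 1 step */
    for (int i = 0; i < n; i++)   /* n + 1 comparisons */
        sum += A[i];              /* n additions */
    return sum;                   /* 1 step */
}
```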
-
Example: Find the max element of an array
Input size: n (number of array elements)
Algorithm arrayMax(A, n)
    Input: array A of n integers
    Output: maximum element of A    # operations
    currentMax ← A[0]               1
    for i ← 1 to n − 1 do           n
        if A[i] > currentMax then   n − 1
            currentMax ← A[i]       n − 1
    return currentMax               1
Total number of steps: 3n
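Likewise, arrayMax in C (the caller is assumed to pass n ≥ 1, matching the pseudocode's use of A[0]):

```c
/* arrayMax from the slide in C: at most 3n steps, O(n) time.
   Caller must pass n >= 1. */
int array_max(const int *A, int n) {
    int currentMax = A[0];
    for (int i = 1; i < n; i++)
        if (A[i] > currentMax)
            currentMax = A[i];
    return currentMax;
}
```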
-
Space complexity
Space complexity is a function describing the amount of memory (space) an algorithm takes in terms of the amount of input to the algorithm. We often speak of "extra" memory needed, not counting the memory needed to store the input itself. Again, we use natural (but fixed-length) units to measure this. We can use bytes, but it's easier to use, say, the number of integers used, the number of fixed-sized structures, etc. In the end, the function we come up with will be independent of the actual number of bytes needed to represent the unit. Space complexity is sometimes ignored because the space used is minimal and/or obvious, but sometimes it becomes as important an issue as time.
-
Space complexity
For example, we might say "this algorithm takes n^2 time," where n is the number of items in the input. Or we might say "this algorithm takes constant extra space," because the amount of extra memory needed doesn't vary with the number of items processed. For both time and space, we are interested in the asymptotic complexity of the algorithm: when n (the number of items of input) goes to infinity, what happens to the performance of the algorithm?