
Report on Departmental Activities

Audio-Watermarking Based Ownership Verification System Using Enhanced DWT-SVD Technique
1. INTRODUCTION
Digital watermarking has been an effective method of hiding information in digital media. It can be applied to different types of media such as text, image, audio and video, with the aim of protecting copyright or verifying media ownership. Of these, audio ownership verification is the focus of this paper. The digital revolution in the audio industry has brought new ownership verification challenges, as recorded audio can be copied and distributed easily. A number of digital audio watermarking techniques have therefore been proposed to solve or mitigate the impact of these challenges [1], [10].
In digital watermarking, data are hidden in another object (the cover data) and the watermarked object is then sent over the internet. Another approach used for audio ownership verification is known as audio fingerprinting [1]. In such methods, the audio features of a given audio signal are extracted and saved in a database as the original audio fingerprint. Unlike fingerprinting, watermarking techniques can be implemented in the spatial domain, where the watermark is embedded directly into the digital media by modifying its values. Alternatively, the digital media can be transformed to a frequency domain in which the embedding process takes place [2], [3]. Ramakrishnan et al. [4] suggested an image watermarking scheme that uses the Discrete Wavelet Transform (DWT) and Singular Value Decomposition (SVD) to improve the robustness of still-image watermarks. Lalitha et al. [5] compared the performance of Discrete Cosine Transform (DCT)-SVD based audio watermarking with DWT-SVD based audio watermarking and concluded that DWT-SVD demonstrated better experimental results than DCT-SVD. Unlike the human visual system, the human auditory system is much more sensitive, and it is therefore more difficult to achieve the necessary imperceptibility [6], [7]. Moreover, audio signals are typically represented by far fewer samples per time period, which makes it more difficult to embed the same amount of information robustly and imperceptibly. The amount of research on audio watermarking is therefore relatively limited, owing to these sensitive properties of the human auditory system and of audio signals. Özer et al. [8] proposed an SVD-based technique to watermark audio signals with minimal effect on the original audio signal. Al-Haj and Mohammad [9] reported a digital audio watermarking technique based on a combination of DWT and SVD with the aim of improving watermark robustness against different signal attacks.
In audio copyright-ownership verification systems, however, the watermarking technique must satisfy not only the robustness requirement but also other important requirements such as imperceptibility, to ensure minimal distortion of the audio signal, and uniqueness and immutability, to differentiate the owner from other parties claiming the copyright. Al-Yaman et al. [10] extended the work reported in [9] by using an effective code-assignment method to improve the quality of the watermarked audio signal, minimizing the effect of the ones in the digital pattern of the embedded watermark. The required minimum audio-signal cover was also reduced by using a hash-function encoding method.
In this paper, we propose an enhanced audio ownership verification system which builds on the findings reported in [10]. The suggested enhancements, which include new audio-signal framing, DWT matrix formation and embedding methods, aim at improving the minimum audio-cover period, the quality of the watermarked audio and its robustness against various attacks. The remainder of this paper is organized as follows: Section 2 gives an overview of the embedding and extraction of the audio watermark into and from the original audio signal. The proposed modifications and enhancements are detailed in Section 3. In Section 4, the obtained experimental results are presented and compared with those obtained by the old system. Finally, Section 5 concludes the work and findings presented in this paper.

2. SYSTEM OVERVIEW
Fig. 1 shows a conceptual overview diagram of the proposed audio ownership verification system. As illustrated, the watermark (an image or logo) that identifies the owner is embedded into the audio file in a way that should not affect the quality of the original audio. This is achieved by using the enhanced framing and DWT matrix formation. The ownership of the audio under verification can then be confirmed by comparing the provided watermark image with that embedded in the audio signal. The proposed watermark embedding and extraction processes can be outlined briefly as follows.

Figure 1. System overview.


A. Watermark Embedding Process
The watermark embedding process for an audio signal proceeds through the following steps: the original audio signal is sampled, the obtained samples are partitioned into frames (framing), and a four-level DWT is then performed. Next, the obtained DWT coefficients are decomposed using the SVD technique. SVD is a numerical technique used to transform a matrix into a series of linear approximations that expose its underlying structure. Mathematically, the SVD of an n×n matrix A is defined by the operation [10], [11]:
A = U S V^T    (1)
In matrix form, Eqn. (1) can be presented as [9]:

A = [u1 u2 ... un] diag(σ1, σ2, ..., σn) [v1 v2 ... vn]^T    (2)

where the columns of U and V are the left and right singular vectors of A and σ1 ≥ σ2 ≥ ... ≥ σn ≥ 0 are the singular values on the diagonal of S.
The digital-image watermark is then hashed using the SHA-1 algorithm [10], and the hash bits are embedded into the DWT-SVD-transformed audio signal according to the following formula [9]:

S1,1w = S1,1 (1 + α W(n))    (3)

where W(n) is the nth hash bit (0 or 1) of the watermark image, α is the watermark intensity, S1,1 is the top-left value of the S-matrix, and S1,1w is the watermarked S1,1. For example, if α is set to 0.1 as the selected watermark intensity, then S1,1w equals 1.1 S1,1 when W(n) is 1 and S1,1 when W(n) is 0. To produce the desired watermarked audio, the inverse SVD and inverse DWT are then performed. The implemented stages of watermark embedding are shown in Fig. 2 [10]. In the present work, the framing, matrix-formation and embedding stages represent the core enhancements suggested.
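As a minimal sketch of this embedding step (assuming one selected frame has already been transformed into the DWT coefficient matrix described in Section 3, held in a variable Matrix, with wbit the current hash bit and alpha the intensity), the S-matrix modification of Eqn. (3) and its inverse can be written in MATLAB following the source code listed at the end of this report:

% Embed one watermark bit into the coefficient matrix of a selected frame.
[U, S, V] = svd(Matrix);            % singular value decomposition of the frame matrix
s11 = S(1,1);                       % original singular value, kept as the key
S(1,1) = s11 * (1 + alpha*wbit);    % Eqn. (3): S11w = S11*(1 + alpha*W(n))
Matrix_w = U*S*V';                  % inverse SVD gives the watermarked matrix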
B. Watermark Extraction Process
The extraction procedure of the encrypted watermark image from the watermarked audio signal can
be achieved by following a process similar to that of the watermark embedding described above. The
implemented stages of watermark extraction are shown in Fig. 3.

Figure 2. The Watermark Embedding Process.

Figure 3. The watermark extraction process.


C. Ownership Verification Process
The extracted bits are compared with those obtained from applying the Hash (SHA-1) on the
watermark. The system confirms verification when the compared bits completely match. This
procedure is shown in Fig. 4.

Figure 4. Ownership verification process.
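A minimal sketch of this comparison, following the HashDB structure used in the source code at the end of this report (data_rx is the hash string recovered from the watermarked audio):

% Compare the extracted hash with the stored owner hashes.
Owner = '';
for i = 1:numel(HashDB)
    if strcmpi(data_rx, HashDB(i).Hash)   % verification requires a complete match
        Owner = HashDB(i).OwnerName;
    end
end
if isempty(Owner)
    disp('Ownership could not be verified.')
end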

3. THE PROPOSED ENHANCEMENTS


This section explains the suggested enhancements to the audio ownership verification system under consideration. As mentioned in Section 1, these enhancements, which focus on the audio-signal framing, matrix formation and embedding methods, aim at improving the minimum audio-cover period, the quality of the watermarked audio and its robustness against various attacks. The suggested enhancements and their impact on the system are described as follows.
A. Framing of Audio Signal
Applying the watermark to all frames degrades the signal-to-noise ratio of the audio signal; applying it only to selected frames therefore improves the quality of the watermarked audio. In addition, since noise is random and might affect only part of the audio signal, the probability of losing the watermark is decreased. For example, using a sampling frequency of 44100 samples per second, the audio signal is partitioned into frames of 800 samples each. Selection of this frame length improved the minimum audio-cover period compared with the minimum periods reported in [9] and [10]. A frame group is then formed by taking one frame and skipping 29 others. According to Eqn. (4), this frame skipping allows 1.84 watermark bits to be added per second:

Bits per second = fs / (Lf (Ns + 1))    (4)

where fs is the sampling frequency, Lf is the frame length and Ns is the number of skipped frames. The suggested frame selection method is illustrated in Fig. 5.

Figure 5. Audio segmentation scheme.


Such a frame selection method minimizes the required audio-cover period by 69% when compared
to that reported in [10], and the suggested bit-assignment stage is therefore no longer needed. This
improvement is achieved irrespective of the watermark size. Table 1 shows different watermark size
examples and the required minimum audio-cover period for both the old and suggested methods.

TABLE 1. Minimum audio-cover period versus watermark size.
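As a worked check of Eqn. (4) and of the periods listed in Table 1, the following MATLAB fragment computes the embedding rate and the minimum audio-cover period for a watermark of len_w bits; the (len_w + 1) frame-group allowance follows the embedding code listed at the end of this report, and the len_w value below is only an illustrative assumption.

fs  = 44100;                    % sampling frequency (samples/s)
Lf  = 800;                      % frame length (samples)
Ns  = 29;                       % skipped frames per group (1 used + 29 skipped)
bps = fs / (Lf*(Ns + 1));       % Eqn. (4): 44100/(800*30) = 1.8375, about 1.84 bit/s
len_w   = 280;                  % example watermark length in bits (assumed value)
min_sec = (len_w + 1)*(Ns + 1)*Lf / fs;   % minimum audio-cover period in seconds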


B. Matrix formation
The discrete wavelet transform (DWT) provides a time-frequency representation of a given signal. Starting from the original audio signal S, the DWT produces two sets of coefficients: the approximation coefficients A (low frequencies), produced by passing S through a low-pass filter, and the detail coefficients D (high frequencies), produced by passing S through a high-pass filter. Depending on the application and the length of the signal, the low-frequency part may be further decomposed into high- and low-frequency parts. Fig. 6 shows a four-level DWT decomposition of the signal S; the original signal can be reconstructed using the inverse DWT process [9]. Having the same D component repeated several times in the matrix formation reported in [9], [10] may cause a mathematical problem when performing the inverse SVD, since the inverse changes the watermarked signal matrix and the correct value of the appropriate D cannot be identified among the repeated copies [11]. A four-level DWT transformation is therefore performed on each frame to produce five multi-resolution sub-bands: D1, D2, D3, D4 and A4, as illustrated in Fig. 6.

Figure 6. Four-level DWT decomposition.


These sub-bands are re-arranged in a new matrix form, as shown in Fig. 7. The matrix size is obtained from:

Matrix size = 2 × (L/2)    (5)

where L is the frame length. In this matrix form, unlike the previous one, there are no repeated values for the same D.

Figure 7. The suggested matrix formation.
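A minimal sketch of the four-level decomposition and the suggested matrix formation for one 800-sample frame (held in the variable frame, as in the source code at the end of this report), using the same 'db1' wavelet as that code:

wname = 'db1';                    % Haar wavelet, as used in the source code
[A1, D1] = dwt(frame', wname);    % level 1: 800 -> 400 + 400 coefficients
[A2, D2] = dwt(A1, wname);        % level 2: 400 -> 200 + 200
[A3, D3] = dwt(A2, wname);        % level 3: 200 -> 100 + 100
[A4, D4] = dwt(A3, wname);        % level 4: 100 -> 50 + 50
% Suggested 2 x (L/2) matrix: row 1 holds A4, D4, D3, D2 (50+50+100+200 = 400),
% row 2 holds D1 (400 values); no D component is repeated.
Matrix = [A4 D4 D3 D2; D1];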


C. Embedding Process
The robustness of the watermark increases with increasing values of the watermark intensity α in Eqn. (3). Large values of α make the watermarked audio more immune to noise and attacks and the watermark easier to extract, while smaller values of α increase the probability of losing the
embedded watermark bits. However, increasing α causes the SNR to decrease, as shown in Fig. 8. Selection of the optimum value of α is therefore a compromise between quality and robustness. Fig. 9 shows how robustness increases with increasing values of α.

Figure 8. SNR versus watermark intensity (α).

Figure 9. Robustness versus watermark intensity (α).
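The trend in Fig. 8 can be reproduced by sweeping the watermark intensity and measuring the SNR of the watermarked signal; the sketch below assumes the Audio_DataEmbedding function and the variables sig, w and vm_idx prepared as in the source code at the end of this report.

alphas = 0.05:0.05:0.5;                        % candidate watermark intensities
snr_dB = zeros(size(alphas));
for k = 1:numel(alphas)
    [sig_w, key] = Audio_DataEmbedding(sig, w, alphas(k), 800, vm_idx);
    noise = sig_w - sig;                       % embedding distortion
    snr_dB(k) = 10*log10(sum(sig.^2)/sum(noise.^2));
end
plot(alphas, snr_dB), xlabel('alpha'), ylabel('SNR (dB)')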

4. RESULTS AND DISCUSSION


In this section, the impact of the suggested enhancements on the quality of the watermarked audio signal is presented and compared with that of the old system. As mentioned earlier, the encoding stage results in a clear improvement in the inaudibility of the watermark. This improvement is noticeable when listening to the watermarked audio and is confirmed by the signal-to-noise ratio (SNR) measurement. Fig. 10 shows the SNR test for embedding watermarks of different random sizes in an audio file of 4:22 minutes. In this example, the signal-quality improvement of the proposed system over the old system is clear.

Figure 10. SNR for watermarks of different sizes.


Another performance comparison between the old and suggested systems is shown in Fig. 11. This figure illustrates the significant improvement of the suggested system in minimizing the required audio-cover signal. Finally, a number of tests were also performed to measure system robustness against various types of attack. The obtained results demonstrated robustness against several benchmark attacks, as shown in Table 2. The attacks used are defined by the Stirmark watermarking benchmark [12].

Figure 11. SNR versus minimum audio-cover period.


The robustness against attacks is measured using the bit error rate (BER), defined as the ratio of incorrectly extracted bits to the total number of embedded bits [13]:

BER = (1/L) Σ_{n=1}^{L} (Wn ⊕ W'n)    (6)

where L is the watermark length, Wn is the nth bit of the original embedded watermark and W'n is the nth bit of the extracted watermark.
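Eqn. (6) can be computed directly in MATLAB, assuming W and W_ext hold the original and extracted watermark bit vectors of length L:

% Bit Error Rate: fraction of extracted bits that differ from the embedded bits.
L   = numel(W);
BER = sum(xor(W, W_ext)) / L;      % Eqn. (6)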

5. CONCLUSIONS
This paper presented novel extensions to a previously reported digital audio watermarking algorithm based on DWT and SVD techniques. The suggested extensions expand the possibility of inserting more frames in the original audio signal and improve the quality of the watermarked audio by changing the value of the watermark intensity. The performance of the proposed enhancements was tested under different watermark sizes and audio periods. The obtained test results showed improved performance over the original algorithm in terms of audio signal quality (25% higher SNR), a 69% reduction in the required minimum audio-cover period, and improved robustness against various watermark benchmark attacks.

REFERENCES
1. Y. Liu, H. S. Yun, J. S. Sung and N. S. Kim, "A novel audio fingerprinting scheme based on sub-band envelope hashing", Proc. Asia-Pacific Signal and Information Processing Association, pp. 813-816, Oct. 2009.
2. A. Bouridane, L. Ghouti, M. Ibrahim and S. Boussakta, "Digital Image Watermarking Using Balanced Multiwavelets", IEEE Transactions on Signal Processing, Vol. 54, No. 4, pp. 1519-1536, April 2006.
3. S. Ramakrishnan, T. Gopalakrishnan and K. Balasamy, "SVD Based Robust Digital Watermarking For Still Images Using Wavelet Transform", CCSEA 2011, CS & IT 02, pp. 155-167, 2011.
4. N. Lalitha, G. Suresh and V. Sailaja, "Improved Audio Watermarking Using DWT-SVD", International Journal of Scientific & Engineering Research, Vol. 2, Issue 6, June 2011, ISSN 2229-5518.
5. W.-N. Lie and L.-C. Chang, "Robust and High-Quality Time-Domain Audio Watermarking Based on Low-Frequency Amplitude Modification", IEEE Transactions on Multimedia, Vol. 8, No. 1, February 2006.
6. H. Özer, B. Sankur and N. Memon, "An SVD Based Audio Watermarking Technique", ACM Workshop on Multimedia and Security, New York, August 1-2, 2005.
7. A. Al-Haj and A. Mohammad, "Digital Audio Watermarking Based on the Discrete Wavelets Transform and Singular Value Decomposition", European Journal of Scientific Research, ISSN 1450-216X, Vol. 39, No. 1, pp. 6-21, 2010.
8. H. Andrews and C. Patterson, "Singular Value Decomposition (SVD) Image Coding", IEEE Trans. on Communications, 42(4), pp. 425-432, 1976.
9. A. Lang, Stirmark Benchmark for Audio (SMBA), [Online]: http://amsl-smb.cs.uni-magdeburg.de/smfa/main.php, accessed on December 13, 2011.
10. J. Gordy and L. Bruton, "Performance Evaluation of Digital Audio Watermarking Algorithms", Proc. of the 43rd IEEE Midwest Symposium on Circuits and Systems, pp. 456-459, 2000.

Source code
function varargout = gui_audioWatermark(varargin)
%GUI_AUDIOWATERMARK M-file for gui_audioWatermark.fig


gui_Singleton = 1;
gui_State = struct('gui_Name',       mfilename, ...
                   'gui_Singleton',  gui_Singleton, ...
                   'gui_OpeningFcn', @gui_audioWatermark_OpeningFcn, ...
                   'gui_OutputFcn',  @gui_audioWatermark_OutputFcn, ...
                   'gui_LayoutFcn',  [], ...
                   'gui_Callback',   []);
if nargin && ischar(varargin{1})
gui_State.gui_Callback = str2func(varargin{1});
end
if nargout
[varargout{1:nargout}] = gui_mainfcn(gui_State, varargin{:});
else
gui_mainfcn(gui_State, varargin{:});
end
% End initialization code - DO NOT EDIT

% --- Executes just before gui_audioWatermark is made visible.


function gui_audioWatermark_OpeningFcn(hObject, eventdata, handles, varargin)
% This function has no output args, see OutputFcn.
% hObject    handle to figure
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% varargin   unrecognized PropertyName/PropertyValue pairs from the
%            command line (see VARARGIN)
% Choose default command line output for gui_audioWatermark
handles.output = hObject;
handles.alpha = 0.1;
handles.AudioInput = [];
handles.Fs = 8000;
handles.LogoHash = '';
handles.key = [];
handles.vm_idx = [];
set(handles.hashIn,'string',' ')
set(handles.pushbutton7,'enable','off','visible','off')
set([handles.PlayAudio,handles.LoadLogo,handles.InsertWatermark],...
'Enable','inactive')
cla(handles.InputAxis)
cla(handles.OutputAxis)
cla(handles.LogoAxis)
set(handles.InputAxis,'xtick',[],'ytick',[])
set(handles.OutputAxis,'xtick',[],'ytick',[])
set(handles.LogoAxis,'xtick',[],'ytick',[])
set(handles.hashIn,'backgroundcolor',[1 1 1])


set(handles.InsertWatermark,'string','Insert Watermark')
set(handles.LoadInput,'string','Load Input Audio')
set(handles.PlayAudio,'string','Play Input Audio')
set(handles.OutputAxis,'visible','on','xtick',[],'ytick',[])
set(handles.LoadLogo,'visible','on')
set(handles.PlayOutput,'visible','on')
set(handles.LogoAxis,'visible','on','xtick',[],'ytick',[])
set(handles.InsertWatermark,'Enable','inactive')
set(handles.popupmenu1,'value',1)
% Update handles structure
guidata(hObject, handles);
% UIWAIT makes gui_audioWatermark wait for user response (see UIRESUME)
% uiwait(handles.figure1);

% --- Outputs from this function are returned to the command line.
function varargout = gui_audioWatermark_OutputFcn(hObject, eventdata, handles)
% varargout cell array for returning output args (see VARARGOUT);
% hObject    handle to figure
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% Get default command line output from handles structure
varargout{1} = handles.output;

function hashIn_Callback(hObject, eventdata, handles)


% hObject    handle to hashIn (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% Hints: get(hObject,'String') returns contents of hashIn as text
%        str2double(get(hObject,'String')) returns contents of hashIn as a double

% --- Executes during object creation, after setting all properties.


function hashIn_CreateFcn(hObject, eventdata, handles)
% hObject    handle to hashIn (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called
% Hint: edit controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
set(hObject,'BackgroundColor','white');
end

% --- Executes on button press in LoadInput.


function LoadInput_Callback(hObject, eventdata, handles)
% hObject    handle to LoadInput (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
[handles.fn,pn] = uigetfile('*.wav');
if handles.fn==0
return;
end
set(handles.hashIn,'string','')
[handles.AudioInput, handles.Fs]= wavread([pn,handles.fn]);
plot(zeros(size(handles.AudioInput(:,1))),'parent',handles.OutputAxis)
plot(handles.AudioInput(:,1),'parent',handles.InputAxis)
set(handles.InputAxis,'XLim',[0,size(handles.AudioInput,1)])
set(handles.OutputAxis,'XLim',[0,size(handles.AudioInput,1)])
set(handles.InputAxis,'xtick',[],'ytick',[])
set(handles.OutputAxis,'xtick',[],'ytick',[])
set(handles.PlayAudio,'Enable','on')
set(handles.LoadLogo,'Enable','on')
set(handles.hashIn,'backgroundcolor',[1 1 1])
% Update handles structure
guidata(hObject, handles);
% --- Executes on button press in PlayAudio.
function PlayAudio_Callback(hObject, eventdata, handles)
% hObject    handle to PlayAudio (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
if ~isempty(daqfind)
wavplay(handles.AudioInput(:,1),handles.Fs)
end
% --- Executes on button press in LoadLogo.
function LoadLogo_Callback(hObject, eventdata, handles)
% hObject    handle to LoadLogo (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
cla(handles.LogoAxis)
set(handles.hashIn,'backgroundcolor',[1 1 1],'string','')
[fn,pn] = uigetfile('*.png;*.bmp;*.jpg;*.tif','Select the logo file');
if fn==0
return;
end
set(handles.hashIn,'string','')
Logo = imread([pn,fn]);
handles.Logo = Logo;
% Update handles structure
guidata(hObject, handles);
Hash = hash(Logo);


HashDB = struct('OwnerName','','Hash','');
if exist('HashDB.mat','file')
load('HashDB.mat')
end
imshow(Logo,'parent',handles.LogoAxis)
pause(1)
OwnerName = '';
for i = 1:numel(HashDB)
if strcmpi(Hash,HashDB(i).Hash)
        OwnerName = HashDB(i).OwnerName;
    end
end
if isempty(OwnerName)
msgbox('Please Enter Owner Name to update the Hash Database','New Logo','warn')
set(handles.hashIn,'string',OwnerName,...
'backgroundcolor',[1,1,0],...
'enable','on')
set(handles.pushbutton7,'enable','on','visible','on')
return;
end
set(handles.hashIn,'string',OwnerName,...
'backgroundcolor',[0.5,1,0],...
'enable','inactive')
set(handles.pushbutton7,'enable','off','visible','off')
handles.LogoHash = Hash;
set(handles.InsertWatermark,'Enable','on')
% Update handles structure
guidata(hObject, handles);
% --- Executes on selection change in popupmenu1.
function popupmenu1_Callback(hObject, eventdata, handles)
% hObject    handle to popupmenu1 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
if get(handles.popupmenu1,'value')==2
cla(handles.OutputAxis)
cla(handles.LogoAxis)
set(handles.hashIn,'string','','backgroundcolor','w')
set(handles.InsertWatermark,'string','Extract Watermark')
set(handles.LoadInput,'string','Load Audio')
set(handles.PlayAudio,'string','Play Audio')
set(handles.OutputAxis,'visible','off')
set(handles.LoadLogo,'visible','off')
set(handles.PlayOutput,'visible','off')
set(handles.LogoAxis,'visible','off')
set(handles.InsertWatermark,'Enable','on')
set(handles.hashIn,'string','')


else

set(handles.InsertWatermark,'string','Insert Watermark')
set(handles.LoadInput,'string','Load Input Audio')
set(handles.PlayAudio,'string','Play Input Audio')
set(handles.OutputAxis,'visible','on')
set(handles.LoadLogo,'visible','on')
set(handles.PlayOutput,'visible','on')
set(handles.LogoAxis,'visible','on','xtick',[],'ytick',[])
set(handles.InsertWatermark,'Enable','inactive')
set(handles.hashIn,'string','')

end
% Hints: contents = cellstr(get(hObject,'String')) returns popupmenu1 contents as cell array
%        contents{get(hObject,'Value')} returns selected item from popupmenu1

% --- Executes during object creation, after setting all properties.


function popupmenu1_CreateFcn(hObject, eventdata, handles)
% hObject    handle to popupmenu1 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called
% Hint: popupmenu controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
set(hObject,'BackgroundColor','white');
end

% --- Executes on button press in InsertWatermark.


function InsertWatermark_Callback(hObject, eventdata, handles)
% hObject    handle to InsertWatermark (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% Frame length in time domain
len_Frame_t = 800;
% Count of frames per group
group_members = 30;
if strcmpi(get(hObject,'string'),'Insert Watermark')
% Prepare the watermark bits
w = dec2bin(handles.LogoHash)-48;

% FINDING HOW MUCH OF THE SIGNAL IS NEEDED FOR THIS DATA
len_w = numel(w);
len_group = group_members*len_Frame_t;
len_s = (len_w+1) * len_group;
signal = handles.AudioInput(:,1);
if len_s > numel(signal)


msgbox('The selected audio signal is very short','!!!','warn')
return
end
sig = signal(1:len_s);
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% DATA EMBEDDING
% A frame among the 30 frames in a group is selected
% This frame is subjected to 4 level dwt
% The coefficients such as A4,D4,D3,D2 and D1 are used to make a matrix.
% SVD of this matrix is taken and the first element of the s-matrix is
% subjected to the watermarking process: s11w = s11 * (1 + alpha*w(n))
% where,
% s11 = first element of s-matrix
% w(n) = the watermark bit
% alpha = watermark strength
alpha = 0.1;
% Process
vm_idx = 31:30:len_s;

[sig_w, key] = Audio_DataEmbedding(sig,w,alpha,len_Frame_t,vm_idx);


handles.key = key;
signal(1:len_s) = sig_w;

fn = sprintf('%s_e.wav',handles.fn);
wavwrite(signal,handles.Fs,32,fn);
fn = sprintf('%s_k.mat',handles.fn(1:4));
save(fn,'key')
handles.OutputAudio = signal;
handles.AudioInput = signal;
plot(signal,'parent',handles.OutputAxis)
set(handles.OutputAxis,'XLim',[0,size(signal,1)])
set(handles.OutputAxis,'xtick',[],'ytick',[])
set(handles.PlayOutput,'Enable','on')
handles.alpha = alpha;
handles.vm_idx = vm_idx;
guidata(hObject,handles)

else

signal = handles.AudioInput(:,1);
if isempty(handles.key)
fn = sprintf('%s_k.mat',handles.fn(1:4));
if exist(fn,'file')
load(fn)


end

else

key = handles.key;
end
alpha = handles.alpha;
len_s = numel(key)*group_members*len_Frame_t;
vm_idx = 31:30:len_s;
W = Audio_DataExtracting(signal,key,alpha,len_Frame_t,vm_idx);
bitsPerChar = 7;
len_data_rx = numel(W);
bits_rx = reshape(W,[len_data_rx/bitsPerChar,bitsPerChar]);
ascii_rx = char(bits_rx+48);
char_rx = bin2dec(ascii_rx);
data_rx = char(char_rx)';

if ~exist('HashDB.mat','file')
return
end
load('HashDB.mat')
Owner = '';
for i= 1:numel(HashDB)
if strcmpi(data_rx, HashDB(i).Hash)
Owner = HashDB(i).OwnerName;
end
end
set(handles.hashIn,'string',Owner,'enable','inactive')
guidata(hObject,handles)
end
% Update handles structure
guidata(hObject, handles);
function hashOut_Callback(hObject, eventdata, handles)
% hObject    handle to hashOut (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% Hints: get(hObject,'String') returns contents of hashOut as text
%        str2double(get(hObject,'String')) returns contents of hashOut as a double

% --- Executes during object creation, after setting all properties.


function hashOut_CreateFcn(hObject, eventdata, handles)
% hObject    handle to hashOut (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called


% Hint: edit controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
set(hObject,'BackgroundColor','white');
end

% --- Executes on button press in PlayOutput.


function PlayOutput_Callback(hObject, eventdata, handles)
% hObject    handle to PlayOutput (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
wavplay(handles.OutputAudio,handles.Fs)

% --- Executes on button press in pushbutton7.


function pushbutton7_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton7 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
OwnerName = get(handles.hashIn,'string');
if isempty(OwnerName)
msgbox('Please Enter Owner Name', 'Empty field !!!', 'warn')
return;
end
Logo = handles.Logo;
if exist('HashDB.mat','file')
load('HashDB.mat')
else
HashDB = struct('OwnerName','','Hash','');
end

n = numel(HashDB);
if isempty(HashDB(n).Hash)
n = n-1;
end
n = n+1;
HashDB(n).OwnerName = OwnerName;
if isempty(HashDB(n).OwnerName)
return;
end
Hash = hash(Logo);
for i = 1:numel(HashDB)
if strcmpi(Hash,HashDB(i).Hash)
OwnerName0 = HashDB(i).OwnerName;
if ~strcmpi(OwnerName0,HashDB(n).OwnerName)


msgbox('Invalid Owner','!!!','error')
return;
end

end
end

HashDB(n).Hash = Hash;
save HashDB HashDB
handles.LogoHash = Hash;
set(handles.pushbutton7,'enable','off','visible','off')
set(handles.LoadLogo,'enable','inactive')
set(handles.InsertWatermark,'enable','on')
guidata(hObject,handles)

hash.m
function h = hash(inp)
% HASH  Convert an input variable into a message digest using the SHA-1
%       hash algorithm.
%
% USAGE: h = hash(inp)
%
% inp = input variable, of any of the following classes:
%       char, uint8, logical, double, single, int8, uint8,
%       int16, uint16, int32, uint32, int64, uint64
% h   = hash digest output, in hexadecimal notation

inp=inp(:);
% convert strings and logicals into uint8 format
if ischar(inp) || islogical(inp)
inp=uint8(inp);
else % convert everything else into uint8 format without loss of data
inp=typecast(inp,'uint8');
end
% create hash
x=java.security.MessageDigest.getInstance('SHA-1');
x.update(inp);
h=typecast(x.digest,'uint8');
h=dec2hex(h)';
if(size(h,1))==1 % remote possibility: all hash bytes < 128, so pad:
h=[repmat('0',[1 size(h,2)]);h];
end
h=lower(h(:)');
h=h(1:4);
clear x
return

Audio_DataEmbedding.m
function [sig_w, key] = Audio_DataEmbedding(sig,w,alpha,len_Frame_t,vm_idx)
% AUDIO WATERMARKING
% Initialization of the output signal


sig_w = sig;
% length of the input data bitstream
len_w = numel(w);
% Initialization of the key vector
key = zeros(1,len_w);
% Process
for n = 1:len_w
%% DWT
% [CA,CD] = DWT(X,'wname') computes the approximation
%           coefficients vector CA and detail coefficients vector CD,
%           obtained by a wavelet decomposition of the vector X.
%           'wname' is a string containing the wavelet name.
wname = 'db1';
start_idx = vm_idx(n)*len_Frame_t+1;
finish_idx = start_idx+len_Frame_t-1;

frame = sig(start_idx :finish_idx);


% LEVEL 1 DWT
[A1, D1]= dwt(frame',wname);
% Length of the coefficient
len_d1 = length(D1);

% LEVEL 2 DWT
[A2, D2]= dwt(A1,wname);
% Length of the coefficient
len_d2 = length(D2);

% LEVEL 3 DWT
[A3, D3]= dwt(A2,wname);
% Length of the coefficient
len_d3 = length(D3);

% LEVEL 4 DWT
[A4, D4]= dwt(A3,wname);
% Length of the coefficient
len_d4 = length(D4);
len_a4 = length(A4);

%% MATRIX FORMATION
Matrix = [A4 D4 D3 D2; D1];


%% SINGULAR VALUE DECOMPOSITION


% SVD  Singular value decomposition.
%      [U,S,V] = SVD(X) produces a diagonal matrix S, of the same
%      dimension as X and with nonnegative diagonal elements in
%      decreasing order, and unitary matrices U and V so that
%      X = U*S*V'.
[u s v] = svd(Matrix);
% extract the first element of the s-matrix
s11 = s(1,1);
key(n) = s11;
% Embedding process
s11w = s11 * (1 + alpha*w(n));
% Replace the first element of s-matrix with the embedded element
s(1,1) = s11w;
% INVERSE SINGULAR VALUE DECOMPOSITION
Matrix_w = u*s*v';

%% DECOMPOSITION OF THE MATRIX INTO THE COEFFICIENTS


start_A4 = 1;
finish_A4 = len_a4;
A4_w = Matrix_w(1,start_A4:finish_A4);
start_D4 = len_a4+1;
finish_D4 = len_a4+len_d4;
D4_w = Matrix_w(1,start_D4:finish_D4);
start_D3 = len_a4+len_d4+1;
finish_D3 = len_a4+len_d4+len_d3;
D3_w = Matrix_w(1,start_D3:finish_D3);
start_D2 = len_a4+len_d4+len_d3+1;
finish_D2 = len_a4+len_d4+len_d3+len_d2;
D2_w = Matrix_w(1,start_D2:finish_D2);
start_D1 = 2;
finish_D1 = len_d1;
D1_w = Matrix_w(2,1:len_d1);

%% INVERSE DWT 4 LEVEL


A3_w = idwt(A4_w,D4_w,wname);
A2_w = idwt(A3_w,D3_w,wname);
A1_w = idwt(A2_w,D2_w,wname);
frame_w = idwt(A1_w,D1_w,wname);
sig_w(start_idx:finish_idx) = frame_w;
end

Audio_DataExtracting.m
function W = Audio_DataExtracting(sig_w,key,alpha,len_Frame_t,vm_idx)
% AUDIO WATERMARK EXTRACTING
% Length of the key
len_k = numel(key);
% Initialization of the output bitstream vector
W = zeros(1,len_k);

% Process
for n = 1:len_k
%% DWT
% [CA,CD] = DWT(X,'wname') computes the approximation
%           coefficients vector CA and detail coefficients vector CD,
%           obtained by a wavelet decomposition of the vector X.
%           'wname' is a string containing the wavelet name.
wname = 'db1';
start_idx = vm_idx(n)*len_Frame_t+1;
finish_idx = start_idx+len_Frame_t-1;

frame = sig_w(start_idx :finish_idx,1);


% LEVEL 1 DWT
[A1, D1]= dwt(frame',wname);
% Length of the coefficient
len_d1 = length(D1);

% LEVEL 2 DWT
[A2, D2]= dwt(A1,wname);
% Length of the coefficient
len_d2 = length(D2);

% LEVEL 3 DWT
[A3, D3]= dwt(A2,wname);
% Length of the coefficient
len_d3 = length(D3);

% LEVEL 4 DWT
[A4, D4]= dwt(A3,wname);
% Length of the coefficient
len_d4 = length(D4);


len_a4 = length(A4);
%% MATRIX FORMATION
Matrix = [A4 D4 D3 D2; D1];
%% SINGULAR VALUE DECOMPOSITION
% SVD  Singular value decomposition.
%      [U,S,V] = SVD(X) produces a diagonal matrix S, of the same
%      dimension as X and with nonnegative diagonal elements in
%      decreasing order, and unitary matrices U and V so that
%      X = U*S*V'.
[u s v] = svd(Matrix);
% extract the first element of the s-matrix
s11w = s(1,1);
s11 = key(n);
% Extraction process
s11 = round((s11w/s11 - 1)/alpha);
W(n) = s11;
end

HashDB_Updater.m
close all;
clear all;
clc
[fn,pn] = uigetfile('*.jpg;*.png;*.bmp;*.tif','Select the logo file');
Logo = imread([pn,fn]);
if exist('HashDB.mat','file')
load('HashDB.mat')
else
HashDB = struct('OwnerName','','Hash','');
end

n = numel(HashDB);
if isempty(HashDB(n).Hash)
n = n-1;
end
n = n+1;
HashDB(n).OwnerName = input('Input the name of the Owner','s');
if isempty(HashDB(n).OwnerName)
return;
end
Hash = hash(Logo);

for i = 1:numel(HashDB)
if strcmpi(Hash,HashDB(i).Hash)
OwnerName0 = HashDB(i).OwnerName;
if ~strcmpi(OwnerName0,HashDB(n).OwnerName)
disp('Invalid Owner')
return;
end
end
end

HashDB(n).Hash = Hash;
save HashDB HashDB

HARDWARE CONFIGURATION
The printed circuit board (PCB) provides the electrical interconnections between the various components and also provides mechanical support for them. The components are soldered to the PCB, and the quality of the soldering directly affects the reliability of the circuit. The procedure for fabricating the PCB for a general project is described below.
Fabrication of a printed circuit board consists of the following steps.
1. Layout preparation
2. Artwork preparation
3. Film master production
4. Pattern transfer
5. Etching
6. Drilling

LAYOUT PREPARATION
The layout is commonly prepared at a scale of 2:1. This offers a reasonable compromise between the accuracy gained and the handling convenience of the 2:1 artwork relative to the actual PCB area. Grid systems are commonly used for preparing the layout. The use of a grid sheet gives more convenience in
placement of components and conductors. A grid based on 0.1 inch spacing is found to be too coarse; a grid spacing of 0.025 inch, or even 0.1 mm, is recommended.
Procedure
1. Each PCB layout is viewed from the component side.
2. Designing of the layout is started only when a complete component list and circuit diagram are available.
3. The larger components are placed first, and the space in between is then filled with the smaller ones.
4. In designing the PCB layout, it is very important to divide the circuit into functional sub-units, each of which is realized in a defined portion of the board.
5. The components are placed on the grid sheet taking the standard length and width.
6. The lead positions in the layout are circled taking the standard size of the land pads.
7. These pads are then interconnected as per the circuit diagram.
8. The mirror image of this layout gives the solder side of the PCB.
PATTERN TRANSFER
After the film is processed, the film master is obtained. The transfer of the conductor pattern from the film master onto the copper-clad base material is done mainly by two methods: photo printing and screen printing. Photo printing is an extremely accurate process that is also applied in the fabrication of semiconductors. Screen printing is a comparatively cheap and simple method of transfer, although less precise than photo printing; because it is less costly, this method is commonly used.
SCREEN PRINTING
In screen printing, the process is very simple. A screen fabric with uniform mesh openings is stretched and fixed on a solid frame of metal or wood. The circuit pattern is photographically transferred onto the screen, leaving the meshes in the rest of the area closed. In the actual printing step, ink is forced by a moving squeegee through the open mesh onto the surface of the material to be printed. The light-sensitive material is coated onto the screen and, using the film master,
the pattern is transferred to the screen. Then, using ink, the pattern is transferred to the copper-clad sheet.

Two methods are used for transferring the pattern onto the screen:

1. Direct method
2. Indirect method

In the direct method, photographically sensitive emulsions are used for transferring the pattern; the wet material is uniformly coated onto the screen and then exposed. In the indirect method, a photographically sensitive film is transferred to the screen; the film is exposed and then stuck onto the screen. The pattern is then transferred to the screen using the ink and squeegee.
ETCHING
The removal of unwanted copper from the copper-clad sheet is known as etching. For this, four types of etchants are used:

1. Ferric chloride
2. Cupric chloride
3. Chromic acid
4. Alkaline ammonia

Among these, ferric chloride is the cheapest and most popular etchant, and it is suited to both home and industrial applications. The high corrosive power of ferric chloride leads to short etching times and little under-etching. Ferric chloride also matches well with both photo-printed and screen-printed resists.

DRILLING
Drilling of component mounting holes into PCBs is by far the most important mechanical machining operation in the PCB production process. The importance of hole drilling on PCBs has grown further with electronic component miniaturization and its need for smaller hole diameters and higher packing density, where hole punching is practically ruled out. Four types of drilling are commonly used:
1. Drilling by direct sight.
2. Drilling by optical sight.
3. Jig drilling.
4. NC drilling.
COMPONENT MOUNTING
Components are basically mounted on one side of the board. Polarized two-lead components are mounted so that their marking or orientation is uniform throughout the board. The component orientation can be either horizontal or vertical, but the directions should be kept uniform; the uniformity in orientation of polarized components is determined during the design of the PCB.
Some recommended mounting techniques are given below.
Horizontally mounted resistors must touch the board, to avoid lifting of the solder along with the copper pattern when pressure is applied to the resistor body. Vertically mounted resistors should not be flush with the board surface, to avoid strain on the solder joint as well as on the component lead junction due to the different thermal expansion coefficients of the lead and board materials; where necessary, spacers should be provided. Coated or sealed components have to be mounted in such a way as to leave a certain distance from the board. When jumper wires cross over conductors, the jumper wire must be insulated. Transistors should never be mounted flush to the board, as this could place considerable stress on the solder joints and on the lead connections, besides causing possible overheating during the soldering operation.

SOLDERING
Soldering is an important process in assembling electronic products, and solder joints are formed by the nature of the wetting process. Solder does not stick to insulating surfaces; on most metals, wetting takes place and a joint forms if the workpiece and solder are hot enough and the surfaces are clean and free from oxides. The surfaces to be soldered must be sufficiently hot to permit wetting by the solder, while the component being soldered should not be damaged by the heating. The metal surfaces to be wetted must have good wettability, and they must be clean so that metallic contact can be established between solder and metal. For good soldering, select the proper rating of soldering iron and the proper tip, and pick the proper solder wire; commonly a 60:40 tin-lead composition is
used. Make good contact between the soldering tip and the part to be soldered. Apply a small amount of flux and keep the soldering iron in contact with the joint until good spreading of the solder has been achieved.
TESTING AND ASSEMBLY
Assembly consists of soldering the components and wires on the PCB and the mechanical assembly of the wired PCB with the other sub-assemblies. Testing is carried out even at the design phase itself, at breadboard level, to verify the design so that few circuit changes are required after the PCB has been designed.
TESTING

After soldering the components onto the PCB, the board is thoroughly cleaned of any residual flux and wire leads. All the components are checked for their value and, where applicable, for proper orientation. Before the ICs are inserted into their sockets, power is applied to the board and the voltages are measured at the IC power pins; power is then switched off before the ICs are inserted. Press the required switch and check whether the corresponding code is available at the various stages (receiver, display and the motor driver input). If all these requirements are satisfied, connect the required appliance in the circuit.
ASSEMBLY
The tested PCB is mounted on the mechanical structure. The mechanical structure is
constructed using PVC foam sheet. The ultrasonic transmitter receiver module is mounted on the
side of the structure such that the sound waves transmitted can easily be received. The indicators are
also made easily visible.

PASSWORD BASED DOOR LOCKING SYSTEM


AIM
Nowadays the operation of many devices is based on digital technology: for instance, token-based digital identity devices such as the FortiToken mobile token, and digital door-lock systems for automatic door opening and closing. These locking systems are used to
control the movement of the door and work without requiring a key to lock or unlock it. The locking system is controlled by a keypad installed at the side edge of the door.
In this project, a keypad is attached to the door for its opening and closing operations. After the 4-digit code is entered, if it matches the predefined code, the door unlocks for a limited period of time. Once this fixed period has elapsed, the relay re-energizes and the door is locked again. If anyone enters a wrong code three times in an attempt to open the door, the system immediately switches on an alarm or buzzer.

BLOCK DIAGRAM
[Block diagram: a power supply, keypad, LCD, buzzer, and motor driver with motor, all connected to an 8051 microcontroller.]

The operation of this system can be described by the above block diagram, which consists of a keypad, a buzzer, an LCD, a motor driver and a motor around the microcontroller. The keypad is the input device through which the code to open the door is entered; it passes the entered code to the microcontroller. The buzzer and LCD are the output devices used for displaying information and for alarming. The motor opens and closes the door, and the motor driver drives this motor on receiving control signals from the microcontroller.
The microcontroller used in this project is from the 8051 family and is programmed with the Keil software. When a person enters a password using the keypad keys, which range from 0 to 9 and include Enter and Escape keys, the microcontroller immediately reads the data and compares it with the stored data.

If the data matches, the microcontroller sends the message "Code is authenticated" to the LCD display. It then sends command signals to the motor-driver IC to rotate the motor in the direction that opens the door. After some time, the spring system with its specified time delay closes the relay, and the door starts closing and slowly returns to its normal closed position.
If the person attempting to open the door enters a wrong password, the microcontroller switches on the alarm for further course of action. In this way, a simple door-lock system can be implemented with the use of a microcontroller.
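The firmware itself is written in C for the 8051 using Keil; the fragment below is only a behavioral sketch of the password-comparison logic, written in MATLAB (the language used elsewhere in this report) with assumed variable names and an assumed stored code.

stored_code  = '1234';             % assumed stored password
max_attempts = 3;
for attempt = 1:max_attempts
    entered = input('Enter the 4-digit code: ', 's');    % stands in for the keypad read
    if strcmp(entered, stored_code)
        disp('Code is authenticated - door unlocked')    % LCD message; motor opens the door
        break
    elseif attempt == max_attempts
        disp('Wrong code entered three times - alarm ON')  % buzzer is switched on
    else
        disp('Wrong code - try again')
    end
end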

CIRCUIT DIAGRAM

PCB LAYOUT

RESULT
The project Password Based Door Locking System was successfully designed and tested. It
was found to function as per the expectations.

SPY ROBOT
AIM
To develop a robot that explores a remote location by mapping its path with the help of a server and sensing obstacles with an IR sensor, and that transmits the collected data to the control station by wireless means.

BLOCK DIAGRAM
This project develops a mobile robot used to explore a remote location by mapping its path with the help of a server and sensing obstacles with an IR sensor, and transmitting the collected data to the control station by wireless means. The project consists of a microcontroller section and a mechanical section. The microcontroller section stores a program that controls the movements of the robot and the object/obstacle detection; the mechanical section consists of DC motors for the movements of the robot. The server helps in tracing the path of the robot. A robot is a mechanical device, a manipulator designed to perform many different tasks and capable of repeated, variable programming. To perform its assigned tasks, the robot moves parts, objects, tools and special devices by means of programmed motions and points. The science and technology that deals with the study of robots, their design, manufacture and application is known as robotics. Although the appearance and capabilities of robots vary vastly, all robots share the features of a mechanical, movable structure under some form of autonomous control. The control of a robot involves three distinct phases: perception, processing and action.
The robot has a mechanical structure consisting of servo motors for its movements, and electronic circuitry that acts as the CPU or head (the intelligence) of the robot. A microcontroller acts as the central processing unit; it stores an algorithm that controls the movements of the robot, the ambient-parameter sensing and the object/obstacle detection. The robot is fitted with various sensors that monitor atmospheric temperature, light intensity, relative humidity and pressure. Objects are detected using an IR sensor module comprising an IR transmitter and receiver section. If the distance from the robot to an object is found to be less than a predefined value, the sensor produces control signals to the CPU, which changes the motion algorithm to match the situation and changes the path of the robot. The raw analog outputs from the sensors, after filtering and signal processing, are directed to the ADC (Analog to Digital Converter) module. The digital output of the ADC, after further processing, is given to the USART (Universal Synchronous Asynchronous Receiver Transmitter) module for communication. The results are automatically archived at
each time interval. A DC motor is like a light bulb: it has no electronics of its own and requires a large drive current to be supplied to it. This is the function of the L293D chips on the Handy Board, which act as high-current switches for operating the DC motors.
The control station, interfaced with a wireless communication module, interprets the incoming data transmitted from the microcontroller via ZigBee and displays the required data in a corresponding manner. Application software can be developed using MATLAB to achieve the interfacing between the control station and the robot.
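As a minimal sketch of such control-station software (the COM port, baud rate and message format below are assumptions, not part of the project description), MATLAB's serial interface can read the ZigBee module and display the incoming sensor reports:

s = serial('COM3', 'BaudRate', 9600);    % ZigBee module on an assumed serial port
fopen(s);
for k = 1:100                            % read a fixed number of reports
    report = fgetl(s);                   % one text line per sensor report (assumed format)
    fprintf('Robot data: %s\n', report);
end
fclose(s); delete(s);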
[Block diagram - Transmitter section: a PC running MATLAB connected to a Zigbee module. Receiver section: a power supply, Zigbee module and IR sensor connected to an AVR microcontroller, which drives Motor 1 and Motor 2 through Motor Driver 1 and Motor Driver 2.]

CIRCUIT DIAGRAM

PCB LAYOUT

RESULT
The project Spy Robot was successfully designed and tested. It was found to function as per
the expectations.
