Katherine Louise A. Rogacion
July 6, 2016
Change Control
Overview
The purpose of this document is to outline the steps for the daily GR replica process.
Daily GR Replica Process
1. The export runs at 6 PM CEST (server time) on the TRSEKA180 (10.42.8.180) server.
Note: The job usually takes approximately 1.5 hours to run. To check on the export process, run the command below at the Linux command prompt. When the export finishes, you will get an email notification with the subject “Export of TRCOTXKA GR tables is now finished,” and ps -ef | grep expdp will no longer show a process corresponding to expdp. Example:
$ ps -ef | grep expdp
A total of 22 dump files are produced in the /u01/export directory on the TRSEKA180 (10.42.8.180) server. For example:
-rw-r----- 1 oracle oinstall 163840 Aug 3 19:25 GR_Tables_2016_GSDBA_wrkdatacallbacks_02.dmp.gz
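The check in step 1 can be wrapped in a small helper. This is a sketch, not part of the documented procedure; the [e]xpdp bracket pattern keeps grep from matching its own entry in the process list.

```shell
#!/bin/sh
# Sketch of the step-1 check: report whether a Data Pump export (expdp)
# process is still running on this server. The pattern '[e]xpdp' matches
# "expdp" but not the grep command itself.
if ps -ef | grep '[e]xpdp' > /dev/null; then
    echo "export still running"
else
    echo "export finished"
fi
```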
2. Compress the export Data Pump files located in the /u01/export directory on the TRSEKA180 (10.42.8.180) server. The command to use is gzip.
Example:
$ gzip trcotxka_gsdba_partitioned_tables_03.dmp
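Step 2 can be applied to all 22 dump files in one pass. A minimal sketch (the loop and the EXPORT_DIR override are assumptions, not from this document):

```shell
#!/bin/sh
# Compress every Data Pump dump file in the export directory with gzip.
# EXPORT_DIR defaults to /u01/export as in the text; override it for testing.
EXPORT_DIR="${EXPORT_DIR:-/u01/export}"
for f in "$EXPORT_DIR"/*.dmp; do
    [ -e "$f" ] || continue   # no .dmp files left to compress
    gzip "$f"                 # produces $f.gz and removes the original
done
```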
3. Delete the export dump files in /oracle/backup/daily_replica on the TCSVDB101 (10.240.70.19) server left over from the previous day's import. The command to delete them is rm. List them first:
ls -lrt *.dmp
Example:
rm *.dmp --> This deletes all export dump files; the * wildcard signifies that all export dump files will be deleted.
4. Transfer the export dump files from TRSEKA180 (10.42.8.180) to TCSVDB101 (10.240.70.19). Use the scp command to transfer them.
Example:
scp trcotxka_gsdba_partitioned_tables_03.dmp.gz oracle@10.240.70.19:/oracle/backup/daily_replica
oracle@10.240.70.19's password:
5. When you have transferred the export dump files to TCSVDB101 (10.240.70.19), decompress the compressed files with gunzip.
Example:
$ gunzip trcotxka_gsdba_partitioned_tables_03.dmp.gz
6. Before importing the tables into the BOBJTEST database, you will have to drop the existing tables and sequences. The scripts are located in the /home/oracle/scripts directory.
a. Run the script drop_proddba_gsdba_before_import.sh to drop the selected tables that will be imported.
Example: nohup ./drop_proddba_gsdba_before_import.sh & --> This will run the script as a background process.
b. Run the script drop_sequences_proddba_gsdba_bobjtest.sh to drop the sequences.
Example: ./drop_sequences_proddba_gsdba_bobjtest.sh
7. Import the schemas by running the scripts manually. The import scripts are located in /home/oracle/scripts directory
in TCSVDB101 (10.240.70.19) server.
We run the imports separately because Oracle consumes undo space in the undo tablespace while an import runs. The scripts have been broken down by schema so that one schema at a time is imported, along with its bigger tables, keeping undo tablespace usage manageable.
Example: nohup ./import_gsdba_partitioned_tables.sh & --> Run it as a background process.
Notes:
tail -f nohup.out --> This will continuously print the import progress to your screen in your PuTTY session.
Sample output in nohup.out:
===================================
Importing GR tables
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
Another way to see the progress of the import is to enter the command below.
ps -ef | grep impdp
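The one-schema-at-a-time approach in step 7 can be enforced with a small wrapper. This is a sketch; run_imports is a hypothetical helper, and only the first script name comes from this document.

```shell
#!/bin/sh
# Run the given import scripts strictly one after another, so that only one
# schema's import is consuming undo space at any moment.
run_imports() {
    for script in "$@"; do
        echo "starting $script"
        sh "$script" || { echo "$script failed; stopping"; return 1; }
        echo "finished $script"
    done
}

# Example (first script name from this document; in practice launch the whole
# wrapper with nohup ... & as shown above):
# run_imports ./import_gsdba_partitioned_tables.sh
```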
8. While the import is running, it is important to monitor the flash recovery area on the server, because the import process generates archived redo log files. The flash recovery area for the BOBJTEST database is located at /backup/flash_recovery_area/BOBJTEST/BOBJTEST/archivelog. Select the directory for the appropriate date to see where the archived redo logs are being generated.

/backup/flash_recovery_area/BOBJTEST/BOBJTEST/archivelog
total 8
drwxrwx--- 2 oracle oinstall 4096 Jul 6 16:19 2016_07_05

/backup/flash_recovery_area/BOBJTEST/BOBJTEST/archivelog/2016_07_06
[oracle@tcsvdb101 2016_07_06]$
total 2806140

To free up space, simply delete the .arc files under the appropriate date directory, for example /backup/flash_recovery_area/BOBJTEST/BOBJTEST/archivelog/2016_07_06.
Steps to free up space in the flash recovery area:
a. cd /backup/flash_recovery_area/BOBJTEST/BOBJTEST/archivelog/2016_07_06 (use the appropriate date directory)
b. rm *
c. export ORACLE_SID=BOBJTEST
d. rman target /
e. RMAN> crosscheck archivelog all; --> This marks the files deleted in step b as expired. The output shows "validation failed for archived log" for each missing file and ends with a line such as "Crosschecked 10 objects".
f. RMAN> list expired archivelog all; --> This will show all expired archivelog files.
g. RMAN> delete expired archivelog all; --> This will list all expired archivelog files and prompt you with this question: Do you really want to delete the above objects (enter YES or NO)? Simply type YES.
h. To verify that you have freed up space and reduced the space usage, log in to SQL*Plus as the sysdba user and run the query below.
select name, space_limit, space_used, space_reclaimable
from v$recovery_file_dest;
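If you prefer to delete only older archive logs rather than everything in a date directory, a find-based variant looks like this. This is a sketch; the two-day retention window and both environment variables are assumptions, and RMAN still needs the crosscheck / delete expired steps afterwards.

```shell
#!/bin/sh
# Remove .arc files older than RETAIN_DAYS from the flash recovery area.
# FRA_DIR defaults to the path in the text; both variables are assumptions.
FRA_DIR="${FRA_DIR:-/backup/flash_recovery_area/BOBJTEST/BOBJTEST/archivelog}"
RETAIN_DAYS="${RETAIN_DAYS:-2}"
find "$FRA_DIR" -type f -name '*.arc' -mtime +"$RETAIN_DAYS" -print -delete
```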
9. When all the import scripts have run and the import is complete, you will need to re-create the sequences on the BOBJTEST database.
b. Ensure ORACLE_SID is set to the TRCOTXKA database by running echo $ORACLE_SID.
c. Connect to the TRCOTXKA database in SQL*Plus as the sysdba user (sqlplus / as sysdba). You can also do this in SQL Developer: connect to TRCOTXKA using the system user. The password is easyjet.
d. Run the query below. The results will be used to create the sequences.
from dba_sequences
e. Copy the results to a temporary file on your laptop. (I use Notepad for this.)
10. Connect to the BOBJTEST database as the sysdba user and run the query below.
If the query returns a value of 660, we'll need to recompile the invalid objects by running the script utlrp.sql.
After recompiling, run the query again. It should return 146 invalid objects, which is fine.
11. Verify the number of tables imported to the BOBJTEST database. The queries below should be run after the whole process.
and object_name in
('APPROVED_TL','BOARTICLE','BOARTICLE_HISTORY','BO_HISTORY','CT_HISTORY','DATA_BO','DATA_CALLBACKS',
 'DATA_CMS1','DATA_CMS1_ADJ','DATA_MANUAL','DATA_TCMS','DC_HISTORY','IMPORTLOG','PC_HISTORY','STAT',
 'USC_HISTORY','USC_HISTORY_TCMS','BO_HISTORY','CA_HISTORY','CONSULTANT_TIME','DATA_ICC','DATA_TCA',
 'DATA_TCMS_ADJ','DATA_TCS','DATA_TCS_ABSENCE','DATA_TCS_OUTBOUND','DATA_UPSELL','DC_HISTORY',
 'EMPLOYEE_ABSENCE','FAILED_DATA_CALLBACKS','FTA_AGENT','FTA_DS_PRICE_HISTORY','FTA_FG_DATA',
 'FTA_FORECAST_GROUP_HISTORY','FTA_SAVED','FTA_SG_DATA','FTA_SHRINKAGE','FTA_STAFF_GROUP_HISTORY',
 'GRETL_DAILY_EXPECTATION','MTB_HISTORY','RW_CAE_HISTORY','SOCIAL_STAT','WRK_DATA_CALLBACKS',
 'ABSENCE_CODES','ABSENCE_CODES_TCS','ABSENCE_TYPES','BACKOFFICE','BILLING_TYPE','BO_ACTIONOUTCOME',
 'BO_CATEGORY','BO_CHANNEL','BO_CLIENTPRODUCT','BO_CLOSURE','BO_COMPLEXITY','BOBJ_MNG_USERS',
 'BOBJ_RS_BUILT','BOBJ_RS_COUNTRY','BOBJ_RS_NAME','BOBJ_RS_REGION','BOBJ_RS_SITE','BOBJ_TARG_ADMIN',
 'BOBJ_TARG_LEVEL_NAMES','BOBJ_TARG_LEVELS','BOBJ_TARG_STATUS','BOBJ_TARG_TYPES','BOBJ_TARG_UNITS',
 'BOBJ_TARG_VALUES','CALLBACK_ACCOUNTS','CONSULTANT_TASK','DURATION_CONTRACTS','FTA_DS_PRICE',
 'FTA_FORECAST_GROUP','FTA_STAFF_GROUP','GRETL_IMPORT_CONFIG','GRETL_LOG4NET','MAILBOX_TASK_DURATION',
 'MAILBOXES','SOCIAL_EMPLOYEEUSERS','SOCIAL_FILEADMINISTRATORS','SOCIAL_SMADMINISTRATORS','TASKS',
 'UPSELL_CATEGORY','UPSELL_CODES','UPSELL_CODES_TCMS','USE_TCS')
order by object_name;
and object_name in
('COSTPRO_PROFILE','COSTPRO_RANGE','CURRENCY','EXT_COMPANY','TC_CONTRACT_TYPE','TC_COUNTRIES',
 'TC_DEPARTMENTS','TC_EMPLOYEES','TC_GROUPS','TC_HRMS_EMPLOYEES','TC_HRMS_RELATIONSHIP','TC_REGIONS',
 'TC_ROLES','TC_SITES','TC_VCF','V_TC_EMPLOYEES');
12. Delete all the export dump files and log files dated for the current day in the /u01/export directory on the TRSEKA180 (10.42.8.180) server. This is in preparation for the next daily run of the export job.
cd /u01/export
ls -lrt
-rw-r--r-- 1 oracle oinstall 1532 Jul 6 16:29 trcotxka_gr_tcadba_tables.log
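Step 12 can be expressed with find so that only files dated for the current day are removed. This is a sketch; the helper name, the -daystart / -mtime 0 combination (GNU find), and the file patterns are assumptions based on the listings above.

```shell
#!/bin/sh
# Delete today's export dump and log files from the given directory
# (defaults to /u01/export as in the text), printing each file as it goes.
cleanup_todays_exports() {
    dir="${1:-/u01/export}"
    find "$dir" -maxdepth 1 -daystart -mtime 0 -type f \
        \( -name '*.dmp' -o -name '*.dmp.gz' -o -name '*.log' \) \
        -print -delete
}
```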