
KISHORE VAJJA

E-Mail: vajjakishore@gmail.com
Phone: 703-200-8877 (M)

Profile: Over eight and a half years of industry experience in software testing, quality
assurance and off-shore project coordination for web and Windows application
development.

Education: M.S. in Mathematics with Computer Science.


M.Phil. in Mathematics

Computer Skills:

Testing Tools: Manual Testing, HP TestDirector/Quality Center, QuickTest
Professional (QTP), WinRunner, LoadRunner, TeamTrack, Rational
ClearQuest, Test Tracker, Rational Robot, Rational Functional
Tester and Rational Quality Manager.

Databases: Oracle, SQL Server and MS Access.

Web Design: HTML, DHTML and FrontPage.

Operating Systems: Windows 95/98/2K/NT/2003/XP, UNIX Workstation

Hardware: IBM-compatible PCs, HP-UX workstation, Sun Solaris UNIX
workstation and HCL-UNIX (mini).

Protocols: ATM, SNMP, FTP and TCP/IP.

Application Servers: WebLogic, WebSphere and JBoss

Web Servers: Apache, MS IIS and Tomcat

Tools/Utilities: MS Word, MS Excel and PowerPoint.

Version Control Tools: PVCS (Polytron Version Control System), RCS (Revision
Control System), CVS (Concurrent Versions System), ClearCase and
Visual SourceSafe.

Test Methods: Waterfall, Agile/Scrum and Rapid Application Development.


PROJECT EXPERIENCE:

Client: UPS Supply Chain Solutions (Aug. ‘08 – Present)


Role: Sr. QA Analyst
Project: Flex Global View (FGV)
Environment: JAVA, J2EE, Oracle, JDBC, PL/SQL, HP Quality Center, PVCS, JAVA
BEANS, JAXB, EDI, XML, UNIX, TOAD, Cognos 8.4, I*Net, Report Net,
IP Switch FTP tool and MS-Excel

FGV is a shipment-visibility system comprising five major modules: Manage Shipments,
Manage Customs, Manage Purchase Orders, Manage Inventory and Reports.

Manage Shipments supports adding/editing shipments, tracking shipments and small
packages, and adding/editing Carrier/Container/Destination milestones.

Manage Customs supports creating/editing and tracking Customs Entries.

Manage Purchase Orders supports tracking Purchase Orders, creating Vendor
Compliances/Booking Requests and viewing PO status.

Manage Inventory supports tracking inventory, maintaining warehouses and creating
Shipping Orders.

Finally, the Reports module provides all reports related to the above modules, including
Customer Customs Reports.

So far I have been extensively involved in the Manage Shipments, Manage Purchase
Orders, Manage Customs and Reports modules.

Roles and Responsibilities:

 Attended the Business Requirement Document (BRD) review meetings conducted
by the BA team to better understand the application functionality, and gathered
the Functional Requirement Document (FRD) to create the Test Plan and Test
Scenarios.
 Involved in formal and informal Peer Review meetings to discuss the Project
Test Plan documentation.
 Involved in internal and external Test Plan review meetings attended by peer
members, the QA Lead, BAs, client users and the development team.
 Conducted Baseline, Sanity, Functional, Integration, Ad-hoc, Regression and
UAT testing against existing and new products.
 Extensively involved in weekly/daily Defect Review and Test Execution Status
Meetings to discuss/explain the defects with development team.
 Verified the De-Normalization functionality of data into different tables by
loading an XML file.
 Involved in the preparation of Release Notes based on fixed/unfixed defects.
 Involved in both front-end and back-end (database) testing during the tenure.
 Extensively used the TeamTrack defect-management tool to log defects and for
reporting purposes.
 Created and executed all test cases in the Test Plan and Test Lab modules of
HP Quality Center, respectively.
 Recorded and played back QTP scripts for manual test cases against FGV front
end screens by inserting different types of check points.
 Gathered and Prepared test data to execute the test cases.
 Executed SQL scripts in UNIX to insert Shipments, Carrier/Container Status
Messages in FGV Database.
 Executed join SQL statements to validate the inserted/updated data in FGV
Database.
 Verified the update functionality when system received an update of a
shipment/Custom Entry/Purchase Order.
 Created Requirement Traceability Matrix to verify the test coverage.
 Worked on JMS Queues to drop shipments, Milestones into FGV Database to
validate the new functionalities of the project.
 Attended weekly QA staff meetings for management updates, discussions on
improving test methodologies, and brain teasers.
 Tested Cognos reports functionality extensively.
 Logged in to UNIX servers to access log files and attach error information to
defects, helping developers better understand them.
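The back-end checks described above, loading shipment data and validating it with join queries, can be illustrated with a minimal sketch. The table and column names below are hypothetical stand-ins for the FGV schema, and Python's built-in sqlite3 module is used purely for illustration; the actual work used Oracle SQL on UNIX.

```python
import sqlite3

# Hypothetical, simplified stand-ins for FGV shipment tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE shipment (shipment_id TEXT PRIMARY KEY, origin TEXT, destination TEXT);
    CREATE TABLE milestone (shipment_id TEXT, milestone_type TEXT, status TEXT);
""")

# Insert a shipment and a carrier status message, as the SQL scripts did.
conn.execute("INSERT INTO shipment VALUES ('SHP-001', 'HKG', 'LAX')")
conn.execute("INSERT INTO milestone VALUES ('SHP-001', 'CARRIER', 'DEPARTED')")

def shipment_milestones(conn, shipment_id):
    """Join query validating that the inserted data landed in both tables."""
    return conn.execute(
        """SELECT s.shipment_id, s.destination, m.milestone_type, m.status
           FROM shipment s JOIN milestone m ON s.shipment_id = m.shipment_id
           WHERE s.shipment_id = ?""",
        (shipment_id,),
    ).fetchall()

print(shipment_milestones(conn, "SHP-001"))
# → [('SHP-001', 'LAX', 'CARRIER', 'DEPARTED')]
```

A test passes when the join returns exactly the rows inserted by the script, and an empty result for an unknown shipment signals a de-normalization or load failure.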

Client: AT&T Mobility (Feb. ‘08 – Jul. ‘08)


Role: QA Lead Analyst
Project: PCA RepUI
Environment: JAVA, Oracle, JDBC, Different APIs, Mercury Quality Center, Rational
Clear Quest, CARE/TLG Clients, ETQC API Web tool, Global Web
Login/MY PRISM (version controller) and MS-Excel

The PCA (Personalized Course of Action) RepUI application focuses on providing Vitals,
Sales and Operational Recommendations for existing AT&T Wireless customers. Once a
customer is added to the OPUS Admin tool, account aggregation is performed by PCA and
captured in the pCA database, where it remains valid for the next 2 days. PCA RepUI
provides the GUI part of the application.
This application works with different sub-systems (CAM, TLG, BID, LTVO, ED,
Clarify, OPUS, CPC, DLC and MRE) through its individual APIs which support PCA at
account details aggregation level.

Roles and Responsibilities:

 Analyzed the Business Requirement Document (BRD) and Functional
Specification Documents (FSD) to prepare Test Cases based on the Test Plan and
Use Cases, and attended review meetings to better understand the applications
 Participated in walkthroughs with the Team Lead, System Analyst, HCD Designer,
Project Coordinator and the development team to discuss outstanding defects and
scope change requests
 Analyzed test data and conducted database/data-driven testing under certain
business rules for data population in the DB
 Attended triage meetings to understand and consider the scope changes for the
release during the test execution phase
 Wrote and executed test cases in HP Quality Center for the release
 Performed Smoke, Functional, GUI/design, Black Box, End-to-End,
Navigational and Regression tests during the execution phase
 Performed extensive test-data analysis comparing the API Web tool against the
pCA DB to confirm that pCA database aggregation of Make, Model, SKU and
device first-use date is correct.
 Solely performed API testing with different sub-systems as part of back-end
testing
 Tested the application using Agile methodology.
 Executed CARE and TLG test data in OPUS RepUI to verify the account created
date, features on the account, upgrade eligibility, account status, contract
information, Rate Plan details, etc.
 Uploaded and executed the Functional Test cases into HP Quality Center.
 Extensively involved in performing data-driven tests using parameterization
techniques.
 Involved in writing the project test plans for this test effort using the AT&T
template, and collected the team meeting minutes.
 As a senior team member, participated in creating the LOEs for this test effort
 Used Rational Clear Quest Bug Reporting tool to log the defects and reporting
purposes for developers to fix the defects
 Dealt with System Analyst and the HCD (Human Centered Department) designer
to discuss the system and jumppage/wire frames changes
 Wrote and executed SQL queries to access the pCA database as part of data-
driven testing.
 Performed JUnit framework testing and created Traceability matrices
 Suggested/Created application scope Changes in this release
 Conducted application-level Regression, End-to-End and System tests after
some Windows patches were implemented
 Worked very closely with the Development team to analyze the application at the
backend level and to describe the defects.
 Trained and assisted new team members on data analysis using the API tools.
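The parameterized, data-driven testing described above can be sketched as follows. The upgrade-eligibility rule, field names and threshold below are invented purely for illustration and are not the actual AT&T business rules; Python stands in for the JUnit-style framework actually used.

```python
from datetime import date

# Hypothetical rule: an account qualifies for an upgrade once its contract
# has been active for at least 20 months (rule and names are invented).
def upgrade_eligible(contract_start: date, as_of: date, months_required: int = 20) -> bool:
    months_active = (as_of.year - contract_start.year) * 12 + (as_of.month - contract_start.month)
    return months_active >= months_required

# Data-driven execution: each row is one parameterized test case
# (input data plus the expected result).
test_data = [
    (date(2006, 1, 15), date(2008, 3, 1), True),   # 26 months active
    (date(2007, 9, 1),  date(2008, 3, 1), False),  # 6 months active
]

for start, as_of, expected in test_data:
    actual = upgrade_eligible(start, as_of)
    assert actual == expected, f"{start}: expected {expected}, got {actual}"
print("all data-driven cases passed")
```

The same loop structure scales to hundreds of rows pulled from a spreadsheet or database, which is the point of parameterization: one test body, many data sets.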

Client: Cingular Wireless, Inc., (Mar. ‘07 – Jan. ‘08)


Role: QA Lead Analyst
Project: PSC KIOSK/RepUI
Environment: JAVA, Oracle, Mercury Quality Center, Rational Clear Quest, Franks
Tool, ETQC API Web tool and MS-Excel

PSC (Phone Support Center) KIOSK is an application that supports troubleshooting of
customers’ phone-device problems. A customer can troubleshoot a problem phone at a
PSC Kiosk. Based on the customer’s account relation/Vitals and troubleshooting
selections, PSC offers the customer different options (BRE, INS, SUS, WSC, WXE, OOW
and REP), and the customer must pick one.

PSC RepUI is an application operated by a store representative to assist the customer
with a problem phone device in the store. It is a replication of KIOSK: if a customer
is not satisfied at the KIOSK level, or comes to the store representative with a
problem phone, the store rep takes care of the customer.

These applications work based on different backend servers (CAM, TLG, DLC, MRE,
RLM, Clarify, OPUS, CPC, Oracle 11i and Compass) which support PSC at different
stages of application flow.

Roles and Responsibilities:

 Involved in gathering the System Requirement Documentation (SRD) and attended
walkthroughs to better understand the applications
 Worked very closely with the Dev team to analyze the application at the backend
level and to describe the defects
 Extensively involved in executing CARE and TLG test data in OPUS RepUI to
collect the associated account number along with ZIP, SSN, warranty, insurance, etc.
 Wrote and executed all the Kiosk and RepUI test cases in HP Quality Center
for all the releases.
 System, End-to-End, Integration and Functionality tests were the focus of
test execution for this application
 Logged all the defects (applications and enhancement) in Rational Clear Quest
for development team to fix the defects.
 Performed UAT in the Innovation Lab for KIOSK and RepUI during application
pre-staging
 Performed extensive test analysis using the ETQC API tool to make sure the
device qualifies for warranty exchange under certain business functional rules.
 Involved in writing the project test plans for all the releases and took the
team meeting notes.
 Worked in a Rally environment.
 Participated in creating the LOEs for all the releases
 Tested and supported 4 major billing releases of this application in two
distinct, continuous phases
 Conducted extensive Regression testing as a part of Release test
 Dealt with the System Analysts with system changes
 Wrote and executed SQL queries to pull data from the AS400 server
 Wrote and executed Kiosk script-change and AT&T re-branding test cases in
Quality Center.

Client: Cingular Wireless, Inc., (Oct. ‘06 – Feb. ‘07)


Role: Test Engineer
Project: BAN Expansion Testing Project
Environment: JAVA, Oracle, Shell scripting, VISTA plus, Maestro, Testdirector, MS-
Excel, FTP, X-Tern and UNIX

BAN (Business Account Number) numbers are assigned to different markets in
ranges. When a BAN range is exhausted within a market, a new BAN range must be
assigned. The scope of the project: when the BAN is expanded (from 9 digits to 12
digits) in the SQL scripts pertaining to the extract files or reports affected by BAN,
verify that all extract files and reports generated by running the SQL scripts (daily,
weekly, bi-monthly and monthly) reflect the expanded BAN for the individual markets.

Roles and Responsibilities:


 Attended triage meetings to analyze the scope of the project
 Extensively studied business requirements to create the test plan for the
project
 Analyzed user Requirements, prepared test strategies and Test Scenarios
 Responsible for writing the test plan and test cases in Test Director
 Created different test scenarios to test the BAN expansion
 Conducted regression tests after each defect was fixed
 Performed End-to-End testing during the testing life cycle.
 Transferred extract files and reports to the application recipients through FTP
and direct connect
 Wrote Shell scripts to extract the files from the different market servers by
connecting through the X-tern tool
 Applied shell scripts on extract files and reports to test the BAN for expansion.
 Contacted recipients for UAT after transferring the files/reports.
 Viewed the extract files and reports manually to test the expanded BAN for
accuracy
 Created output files for each extract file or report affected by expanded-BAN
cases
 Generated all reports in MS-Excel to report defects to the developers
 Escalated issues encountered during the testing effort to the team lead for
correction
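The expanded-BAN verification above amounts to a format check over each generated extract file. The sketch below assumes a pipe-delimited layout with the BAN as the first field and a 12-digit expanded format; both assumptions are hypothetical illustrations, not the actual extract-file specification.

```python
import re

# Hypothetical check: after expansion, every BAN in an extract file should be
# 12 digits; a legacy 9-digit BAN indicates a script that was not expanded.
EXPANDED_BAN = re.compile(r"^\d{12}$")

def unexpanded_bans(extract_lines):
    """Return BANs (first field of each pipe-delimited line) that are not 12 digits."""
    bad = []
    for line in extract_lines:
        ban = line.split("|")[0]
        if not EXPANDED_BAN.match(ban):
            bad.append(ban)
    return bad

sample_extract = [
    "123456789012|MKT-EAST|2006-11-01",
    "987654321|MKT-WEST|2006-11-01",     # legacy 9-digit BAN: should be flagged
]
print(unexpanded_bans(sample_extract))
# → ['987654321']
```

Running such a check over every daily, weekly, bi-monthly and monthly extract gives a quick market-by-market pass/fail before the manual accuracy review.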

Client: Verizon (Oct. ‘04 – Aug. ‘06)


Role: Test Engineer

Project 1: http://inflow.ebiz.verizon.com
Environment: ASP.NET, C#, HTML, CSS, SQL Server 2000, Rational Robot, Rational
Quality Manager, Rational ClearQuest

Involved in requirements gathering, analysis and testing of the intranet site
http://inflow.ebiz.verizon.com. The purpose of the system is to maintain tasks within
an organization. The system allows management to assign tasks to their team members
and keep track of them. Team members update tasks once they finish them, and e-mail
notifications are sent in both directions.

Project 2: www.verizon.com
Environment: ASP.NET, C#, MCMS 2002, SQL Server 2000, MS Application Test
Center, Rational Robot, Rational Quality Manager, Rational ClearQuest

Involved in all phases of the test life cycle of the www.verizon.com migration project.
It is a purely content-driven site developed in Microsoft Content Management Server
(MCMS), with Home and Business as the core business units. The system allows
authorized content owners to proof and publish content dynamically. The site explains
the different services Verizon provides in the home and business sectors.

Project 3: www.itcare.ebiz.verizon.com
Environment: ASP.NET, C#, MCMS 2002, SQL Server 2000, MS Application Test
Center, Rational Robot, Rational Quality Manager, Rational ClearQuest

Involved in analysis and testing of the intranet site http://itcare.ebiz.verizon.com,
developed using Microsoft Content Management Server (MCMS). The site maintains content
for the executive conferences that take place at the end of every year. All executives
can upload photos, documents, presentations, demos, etc. It also provides a message
board for discussing conference topics.

Common Activities for all Verizon projects:

 Extensively worked with Rational Administrator to create test plans that
include test cases, expected results and prioritized tests, by going through the
existing site.
 Attended Business Requirements meetings to better understand the system.
 Created automated test scripts in Rational Robot for each test case.
 Created sample test data pools to execute the automated scripts in Rational
Robot
 Created verification points in Rational Robot after running the scripts.
 Executed each test case individually to perform the functional testing.
 Executed the test plan to perform the system testing and generated the result
report.
 Entered defects in Rational Clear Quest and assigned them to appropriate
developer through the Development team manager.
 Attended weekly meetings with BAs and leads to discuss ongoing work and
update script changes during test-case execution.
 Retested the application after defects were fixed.
 Tested content publishing workflow by logging in as different roles manually.
 Stored commonly used functions in a shared repository for access from each
script.
 Verified Verification Point values to make sure that the expected values were
generated.
 Verified user interface actions with database values to make sure that the database
is reflecting the actions the user is performing.
 Altered the test cases as per the new requirements as some additional features
were added in the migration project.
 Performed browser compatibility testing.
 Performed Smoke, System, Integration, GUI, functional, Navigational,
Regression, verification and validation testing.

Client: Washington Mutual Bank (Apr. ’03 - Jul. ’04)


Project: www.wamu.com Phase I & II
Environment: MCMS 2002, ASP, VBScript, JavaScript, HTML, SQL Server 2000,
Test Director, WinRunner, LoadRunner and windows XP.

Involved in the whole testing life cycle of the www.wamu.com public site. The site has
general information about personal, small-business and commercial banking along with
FAQs. It is a content-driven site where authorized users publish content online. The
site was migrated from Sun-based technologies to Microsoft technologies.

Roles and Responsibilities:

 Went through the existing Java site to understand the system properly.
 Extensively worked with Test Director to create test plans that include test
cases, expected results and prioritized tests, by going through the existing site.
 Altered the test cases as per the new requirements as some additional features
were added.
 Performed UNIT testing by going through the test cases manually.
 Verified data integrity as accounts were moved from the old system to the new system
 Tested content publishing workflow by logging in as different roles manually.
 Created defects in Test Director.
 Attended Triage meetings to provide input on defects and assigned them to
appropriate developer.
 Worked closely with the development team to explain defects clearly and
revalidated them once fixed.
 Created GUI scripts and executed them in WinRunner between builds.
 Involved in Black box, functional testing, integration testing, load testing,
regression and System testing.
 Performed sanity testing after each code push to test server.
 Worked with business users during UAT.
 Created the final testing report after the testing process was completed.

Client: Intellisoft Technologies, Inc. (Nov. ’01 – Jan. ’03)


Role: Off-Shore Project coordinator and tester

Project: Worked on multiple projects


http://www.bigbend.edu
http://esevaonline.com/
http://gist.ap.nic.in/apgos/

Environment: ASP, JSP, SQL Server, ORACLE, Windows XP, Windows 2000
Advanced Server, Test Director, WinRunner, LoadRunner.

Involved in analysis and testing of various web and Windows applications. The projects
were developed at both on-site (Dallas, USA) and off-shore (Hyderabad, India) locations
for a variety of clients, including state government and transportation web sites.

Roles and Responsibilities:

 Analyzed the user requirements by interacting with system architects, developers
and business users.
 Created test cases in Test Director and executed them.
 Assigned test cases to test engineers at off-shore development center.
 Coordinated with the offshore testing team on a daily basis to get the testing
status and updated bug reports.
 Provided demos to various clients periodically to update the system status.
 During testing life cycle, performed different types of testing like system testing,
Integration testing and Regression Testing.
 Interacted with business and developers in resolving issues
 Performed Browser functionality testing for IE & Netscape.
 Smoke Testing was performed after each code drop.
 Involved in UAT (User Acceptance Testing) and Implementation.
 Created the Automated test scripts using WinRunner tool for testing the
compatibility of the application with different platforms.
 Created test data for interpreting positive/negative results during functional
testing.
 Used Test Director as the bug-tracking tool to centralize bugs and follow up
on bug status.
 Performed regression testing after each build release of the application.

Client: BPCL (Jun. ’01 – Sep. ’01)


Role: Tester
Project: IMS
Environment: ORACLE, Developer/2000, UNIX, Manual Testing and Test Tracker Pro.

Designed and implemented IMS (Inventory Management System), which coordinates, at all
times, information and data relating to re-ordering a company's items. The IMS meets
needs such as capturing perpetual inventory transactions, maintaining the stores ledger
for various items, automatically indicating items' re-order levels, and generating
reports such as purchase orders, requests for proposal and the Stores Ledger. The
receiving module covers ordering materials from vendors, based on the re-order level,
in the form of purchase orders and procuring the items from them. The issuing module
covers issuing materials according to production requirements.

Roles and Responsibilities:

 Created test cases by going through the use cases and test plan.
 Performed unit testing by executing the test cases manually.
 Logged defects in Test Tracker Pro.
 Tested the defects after development team fixed them.
 Provided the test result report to lead.
