
IBM Tivoli Netcool Performance Manager for Wireless Development Toolkit

Version 5.0.5

IBM Tivoli Netcool Performance Manager for Wireless Development Toolkit, v5.0.5: Best Practice Guide

This edition applies to version 5 release 0 modification 5 of the IBM Tivoli Netcool Performance Manager for Wireless Development Toolkit and to all subsequent releases and modifications until otherwise indicated in new editions.
Copyright IBM Corp. 2007, 2013
US Government Users Restricted Rights - Use, duplication or disclosure restricted by GSA ADP Schedule Contract with IBM Corp.

Contents

1 Introduction
  1.1 Audience
  1.2 Required skills and knowledge
2 Top best practice tips
  2.1 Data entry options
  2.2 Requirements specification process
  2.3 Vendor documentation
  2.4 Available document formats
  2.5 Data formats
  2.6 Standard spreadsheet
  2.7 Flexibility
  2.8 Review process
  2.9 Consistency
  2.10 Utilize all the tools
3 Planning and project overview
  3.1 Overview of technology pack release lifecycle
  3.2 LOE and TTM
    3.2.1 The size of the requested solution
    3.2.2 Availability of data samples
    3.2.3 Vendor documentation
    3.2.4 Poor quality data samples and non-standard format
    3.2.5 Graphs and reporting requirements
    3.2.6 Quality of existing release
  3.3 Division of work
    3.3.1 Single solutions engineer
    3.3.2 One system designer and one system developer
    3.3.3 Full specialist team
4 Building a technology pack
  4.1 Data entry
    4.1.1 Data entered manually
    4.1.2 Counter template spreadsheet
  4.2 Abstract mappings
  4.3 TNPM-WDT naming
    4.3.1 Technology pack
  4.4 Technology pack counter source
  4.5 KPI design requirements
    4.5.1 NC objects
    4.5.2 Counter Block
    4.5.3 Counter Name
    4.5.4 Description
    4.5.5 Counter Type
    4.5.6 Data Type
    4.5.7 Units
    4.5.8 KPI Group Name
    4.5.9 KPI Group Abbreviations
    4.5.10 KPI Name
    4.5.11 KPI Abbreviation
    4.5.12 KPI Expression
  4.6 General definition guidelines
    4.6.1 KPI Definition
    4.6.2 Busy Hours
  4.7 Technology Pack upgrades
    4.7.1 Obsolete counters
    4.7.2 Moved/renamed counters/groups
    4.7.3 Creating a TNPM-WDT project for an upgraded technology pack
  4.8 Reporting
5 Information sources
  5.1 Vendor documentation
  5.2 Data samples
  5.3 Other Sources
    5.3.1 Standards
    5.3.2 Subject Matter Experts
    5.3.3 Existing technology packs
6 Review process
  6.1 Data Validation tool
  6.2 Gateway Validator tool
  6.3 Objects
  6.4 KPIs
  6.5 Busy Hours
7 Typical Issues Found
  7.1 Issues identified at a data audit
  7.2 Issues identified at verification of KPI repository
  7.3 Issues identified at Technology Pack testing

1 Introduction
This document provides best practice guidelines for building a technology pack by using the IBM Tivoli Netcool Performance Manager for Wireless Development Toolkit (TNPM-WDT).
It supplements, and does not replace, the documentation that is available in the information center:
http://publib.boulder.ibm.com/infocenter/tivihelp/v8r1/topic/com.ibm.tnpm.doc/welcome_tnpm_WDT.html
The IBM Tivoli Netcool Performance Manager for Wireless Development Toolkit Installation and User Guide is the reference point for all issues relating to the use of the toolkit while building a technology pack.
The aims of this document are to:
- Ensure consistency of design
- Minimize development level of effort and accelerate solution availability
- Improve the quality of technology packs

1.1 Audience
This guide is for personnel who are responsible for creating technology packs for TNPM. Such personnel must have:
- Experience in using the TNPM product
- Training in all the standard IBM Tivoli Netcool Performance Manager for Wireless Development Toolkit components

1.2 Required skills and knowledge
Required skills and knowledge include:
- Microsoft Windows operating systems
- UNIX operating systems, especially Solaris
- Database principles
- Graphical user interfaces
- General IT and telecommunications principles
- Network operator's OSS and BSS systems architecture
- Tivoli Netcool Performance Manager - Wireless component, loader mapping syntax

2 Top best practice tips
When building technology packs, there is a strong focus on ensuring the quality of the delivered solution. In addition, minimizing level of effort (LOE) and accelerating time to market (TTM) is critical. Here are the top best practice tips that are essential in driving these requirements.

2.1 Data entry options


Make full use of data entry options. If a configured gateway is already available, consider
using the LIF parser.
The LIF Parser is a tool that reads all the blocks and variables in any gateway LIF file,
and inserts the details into the KPI Repository, against a specified LIF type name. The
LIF type is associated with a given technology and vendor. The purpose of the tool is to
save a product definer from manually entering information for LIF variables using the
KPI Repository UI. After these LIF variables are stored in the database, they can be used
to automate the production of one-to-one KPIs. The LIF Parser is used after network
objects (and their associated attributes) are defined in the KPI Repository by using the
UI.
For complex and large solutions, fully utilize the TNPM-WDT CSV spreadsheet. For
small simple solutions, direct data entry into the IBM Tivoli Netcool Performance
Manager for Wireless Development Toolkit user interface may be the most appropriate
method. Remember that these methods are not mutually exclusive and can be used in combination.

2.2 Requirements specification process


Use a requirements specification process to gather inputs and maintain consistency. It is
essential that all technical requirements for the technology pack are clearly documented
and managed under change control.

2.3 Vendor documentation


Use vendor documentation as your guide. Object modeling, KPI naming, KPI grouping, data types, and so on are documented in the standard vendor documentation. Unless there is technical justification and a customer (operator) requirement to do otherwise, utilize what the vendor has given you.

2.4 Available document formats


Make full use of all available document formats to improve data entry. Most vendors provide their standard documentation set, covering their list of performance management counters, in PDF format. Some documents can run to hundreds of pages and several thousand counters/KPIs. PDF converter tools can be used, but with limited results, so it is worthwhile investigating whether these documents are available in a more editable format that you can cut and paste into the CSV spreadsheet. A word processing format is useful, a spreadsheet is much better, and XML is an excellent format to work with.

2.5 Data formats


Use standard data formats where possible, for example, XML and CSV. This ensures
consistency, makes it easier to find skilled engineers to perform the gateway work, and
utilizes already available code.

2.6 Standard spreadsheet


Have requirements entered into the standard spreadsheet. Where possible, get the subject matter experts (SMEs) to enter counter/KPI requirements that fall outside of the standard vendor documentation in a spreadsheet, preferably in the format used by the TNPM-WDT CSV spreadsheet.

2.7 Flexibility
Be flexible in how you assign engineers to build a technology pack and how you allocate work. Small, non-repeatable requirements may only require one or two experienced engineers to build a solution. A requirement for numerous large solutions may necessitate a more process-driven build and test environment, where individual activities are allocated across a larger number of more specialized staff.

2.8 Review process


Assign time for a review process after the definition phase is complete. Allow time for
another experienced engineer or customer to review the design prior to production of the
final code. This is a worthwhile step to ensure requirements are correct.

2.9 Consistency
Be consistent in design. Most technology packs will not be one-offs and will be upgraded in the future. They will also need to interact with technology packs developed elsewhere. Utilize the supplied global object model (GOM), and use the standard Tivoli Netcool Performance Manager for Wireless Development Toolkit documentation and the details in this best practice guide to assist you.

2.10 Utilize all the tools


Utilize all the tools that are available in the TNPM-WDT. The TNPM-WDT contains a variety of useful tools to assist in building a technology pack. They all have a purpose and are there to ensure quality and minimize LOE. Spend time reviewing what the full set of tools is capable of, and use them where relevant.

3 Planning and project overview


3.1 Overview of technology pack release lifecycle
The following diagram illustrates a traditional high-level overview of the end-to-end process to build a technology pack. The process is set up to support a sizeable number of parallel builds. The IBM Tivoli Netcool Performance Manager for Wireless Development Toolkit Installation and User Guide provides a more detailed diagram of the technical aspects of a project.
[Figure: technology pack release lifecycle. Plan (business decision to build; business requirements captured; project plan put in place; resources assigned) → Design (create and define KPI repository; schema design; produce PRS documenting design; define GW requirements; upgrade analysis if relevant) → Build (verify KPI repository; run data validation tests; implement required GW changes; create documentation and packaging) → Validation (QA and integration testing; run data validation tests) → Release (final packaging; final release to customer).]

A more agile system than the model above must be encouraged, focusing on reducing TTM while maintaining quality. You should:
- Place strong focus on the quality and content of a clear and concise initial design.
- Trust the TNPM-WDT tools and minimize unnecessary testing.
- Reduce excessive QA validation by bringing more of it into the build process.
- Use light documentation and process where possible.

Important aspects of planning and building technology packs are the LOE required, the division of work, and TTM.

3.2 LOE and TTM
There are no definitive limits on how quickly a technology pack can be produced. A large number of variables influence the start time of a project, the total effort required, and the elapsed time before the final solution is available.
The following sections list the typical areas that affect LOE and TTM.
3.2.1 The size of the requested solution
A new technology pack covering 2000 KPIs will have a higher LOE than a technology pack that only requires support for 50 KPIs.

3.2.2 Availability of data samples
If no data samples are available, the technology pack build can be delayed.

3.2.3 Vendor documentation
In some situations, the vendor documentation does not contain all the inputs essential for the completion of the technology pack definition. Some vendor documentation may also increase the time needed to extract the key requirements and delay entry into the KPI repository.

3.2.4 Poor quality data samples and non-standard format
Standard gateways for parsing XML, CSV, and so on are included in the TNPM-WDT solution. Some additional gateway coding is still required, but far less than when an unstructured or poor-quality data format is used.

3.2.5 Graphs and reporting requirements
These add value to the end solution but require definition, build, and test time.

3.2.6 Quality of existing release
If a technology pack is an upgrade, additional effort may be required to resolve outstanding defects if the existing baseline is of poor quality. This may also mean more design effort to support the resolution of these defects.

3.3 Division of work
Depending on the level of resources and expertise available, you can follow three main models. Choose the model that best meets your requirements against the available resources and expertise.
3.3.1 Single solutions engineer
If an engineer has the requisite skills in telecommunications, database design, gateway coding, and other areas, that engineer can build and test a full technology pack alone. This model may apply when limited resources are available or when the volume of solution requirements does not merit a larger team.
Advantages
- No handover points between different parts of the process.
Disadvantages
- No independent review of the definition or final code. Potential issues may be raised at customer deployment.
- Resources cannot be run in parallel or doubled up to accelerate TTM.

3.3.2 One system designer and one system developer
The work can be divided between one engineer focusing on designing the technology pack and another implementing the build, coding, and test.
Advantages
- Allows specialist allocation of skills.
- Allows some work in parallel.
Disadvantages
- Handover material required between the two engineers has to highlight specific requirements.

3.3.3 Full specialist team
A complete specialist team can be used to focus on specific areas:
- System designer focusing on the definition.
- Specialist gateway engineer.
- System developer building the solution.
- QA test engineer focusing on verification.
Advantages
- Allows specialist allocation of skills.
- Allows work in parallel.
Disadvantages
- Handover material is much more complex.
4 Building a technology pack

4.1 Data entry
When building a technology pack, it is essential to optimize data entry, especially for large and complex solutions. There are three methods for data entry. They are not mutually exclusive, and a combination of them can be utilized to load data in the most efficient manner. The three methods are as follows.
4.1.1 Data entered manually
Data is entered manually by using the TNPM-WDT user interface. This is useful for small, simple solutions where data can be entered manually with ease.

4.1.2 Counter template spreadsheet
A counter template spreadsheet is populated with definition data and loaded. For large and complex solutions, this is the main form of data entry. It allows the engineer to define the technology pack and visualize it in one spreadsheet. Data can be moved easily, and the powerful functions of a spreadsheet are invaluable in managing the data.
After installation, the counter template can be found in the following directory:
\IBM\Tivoli Netcool\TNPM-WDT\data\counter_parser

[Figure: an example counter template spreadsheet.]

The counters listed in the spreadsheet must correspond to the vendor documentation in order and naming convention, to maintain consistency and usability.
Columns A-K need to be populated as a minimum. See the IBM Tivoli Netcool Performance Manager for Wireless Development Toolkit Installation and User Guide for a full explanation of each column. To assist in the review process, it is useful to follow a convention that highlights issues:
- Each KPI must be written in blue and each counter in black.
- Each KPI/counter row used as a busy hour must be highlighted in yellow.
- Where there is doubt about a definition, that cell must be highlighted in red.
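As a sketch of what a template row might contain, the following snippet builds a one-row CSV in memory. The column names here are illustrative assumptions only; the authoritative A-K column layout is defined in the Installation and User Guide.

```python
import csv
import io

# Illustrative column names -- NOT the authoritative template layout;
# consult the Installation and User Guide for the real columns A-K.
columns = ["Block", "CounterName", "Description", "CounterType",
           "DataType", "Units", "KPIGroupName", "KPIGroupAbbrev",
           "KPIName", "KPIAbbrev", "KPIExpression"]

rows = [
    # A plain one-to-one counter (hypothetical names and values).
    ["Traffic", "droppedCalls", "Number of dropped calls",
     "Accumulation", "Integer", "#", "Cell.Ericsson.GSM.Traffic",
     "CELL_TRAFFIC", "Dropped_calls", "Dropped_calls", ""],
]

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(columns)
writer.writerows(rows)
print(buf.getvalue().strip())
```

Generating template rows programmatically in this way can be useful when transcribing a large vendor counter list into the spreadsheet.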

4.2 Abstract mappings
Elements in technology packs are case-sensitive. Therefore, you must use consistent capitalization for blocknames that are used within a technology pack in the following scenarios:
- The same blockname is used across multiple KPI groups
- The same blockname is used by attributes and KPIs
Ensure the blocknames are consistent across the technology pack.

4.3 TNPM-WDT naming

4.3.1 Technology pack
Technology pack names must be built as follows:
<Vendor>_<Technology>_<Subsystem>_<Release>
Examples for each component are listed in the following table.
Vendor: ALU, Cisco, Ericsson, Huawei, Motorola, NSN, ZTE, Comverse, Tekelec, Intervoice, Starent, Openmind
Technology: ATM, CDMA, EVDO, GPRS, GSM, HSPA, IP, LTE, SDH, UMTS, VoIP, WiMAX
Subsystem: BSS, NSS, GGSN, SGSN, MSC-S, UTRAN, MGW, HLR, SAE, PDN, SGW, HSS
Release: W10B, R13, V100R008, B10, U4.1, RU20, P7.1, M14.3, V18, UA6, BR10, S13

It is important to ensure that each technology pack has a unique name and uses the correct naming conventions and version as defined by the vendor.
There is some flexibility in the naming process, influenced by the scope of the project. For example, a wireless packet data network may be delivered as one technology pack, or split into components such as SGSN and GGSN.
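The naming convention above can be sketched as a small helper that builds and sanity-checks a pack name. This is illustrative only, assuming the four components never themselves contain underscores or whitespace:

```python
import re

def pack_name(vendor, technology, subsystem, release):
    """Build a technology pack name following the
    <Vendor>_<Technology>_<Subsystem>_<Release> convention."""
    name = "_".join([vendor, technology, subsystem, release])
    # Sanity check: four underscore-separated, non-blank components.
    if not re.fullmatch(r"[^\s_]+_[^\s_]+_[^\s_]+_[^\s_]+", name):
        raise ValueError("name does not follow the convention: " + name)
    return name

print(pack_name("Ericsson", "GSM", "BSS", "R13"))  # Ericsson_GSM_BSS_R13
```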

4.4 Technology pack counter source

The name of the counter source must be the same as the technology pack. For most technology packs, one counter source must be used, and it must contain all required counters and KPIs.
For a technology pack upgrade, the counter source of the previous release has to be retained. This is mainly for obsolete counter purposes and backward compatibility. To do this, follow the sequence of instructions below:
1. Create a project and make sure Omit GOMs is selected. Import the previous version's repository.
2. Create a counter source for the upgraded version.
3. In the following .xml files, find and replace old_source_name (N-1) with new_source_name (N) (this can be done in any text editor):
   - abstract_attribute_mapping.xml
   - abstract_kpi_mapping.xml
   - metalayer_kpi_mapping.xml
   - vendor_Counter.xml
4. Import these four XML files into the KPI repository.
5. Check that both the previous counter source and the new counter source are retained.
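Step 3 above can be scripted rather than done by hand. The sketch below performs the textual find-and-replace across the four files; the source names are placeholders that you would substitute with the real N-1 and N counter source names:

```python
import pathlib

# Placeholder source names -- substitute the real N-1 and N names.
OLD, NEW = "old_source_name", "new_source_name"

FILES = ["abstract_attribute_mapping.xml",
         "abstract_kpi_mapping.xml",
         "metalayer_kpi_mapping.xml",
         "vendor_Counter.xml"]

def rename_source(directory):
    """Replace the old counter source name with the new one in each
    of the four exported repository XML files."""
    for name in FILES:
        path = pathlib.Path(directory) / name
        text = path.read_text()
        path.write_text(text.replace(OLD, NEW))
```

A plain textual replace is safe here only if the old source name does not occur as a substring of unrelated content; review the files after running it.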

4.5 KPI design requirements

4.5.1 NC objects
The Global Object Model (GOM) must be used where applicable. For example, for cell counters, use the cell objects. The creation of new objects must be minimized where possible.
If a new object (except secondary reference objects) has to be created, the mandatory fields to be implemented are:
- ID
- Name
- Parent (for example, MSC, BSC, SGSN)
- Vendor
- Network
- Region
- Vendor Version
- Technology
The descriptions must be consistent with those in the GOM:
- ID: A unique identifier for the XXX.
- Name: A user-friendly name, preferably unique, for the XXX.
- Vendor: Manufacturer of the XXX.
- Network: Network associated with the XXX.
- Region: Region associated with the XXX.
- Version: Hardware/software version of the XXX.
- Technology: Technology of the network/element (for example, GSM, GPRS, UMTS).
Here XXX can be Cell, MSC, SGSN, GGSN, BSC, Processor, Interface, and so on.
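The mandatory field list above lends itself to a simple completeness check when object definitions are drafted outside the tool. The object values below are hypothetical; only the field set comes from the list above:

```python
# Mandatory fields for a new (non secondary reference) object,
# as listed above.
MANDATORY_FIELDS = {"ID", "Name", "Parent", "Vendor", "Network",
                    "Region", "Vendor Version", "Technology"}

def check_object(definition):
    """Return the mandatory fields missing from an object definition."""
    return sorted(MANDATORY_FIELDS - set(definition))

# Hypothetical Cell object; the values are illustrative only.
cell = {
    "ID": "CELL-001",
    "Name": "Cell 1 of BSC 7",
    "Parent": "BSC",
    "Vendor": "Ericsson",
    "Network": "GSM-NET-1",
    "Region": "North",
    "Vendor Version": "R13",
    "Technology": "GSM",
}
print(check_object(cell))  # [] -> nothing missing
```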
To clone objects, select an object and click the Copy button on the toolbar to open the Copy Object Wizard.
In many cases, the objects are already defined and listed in the vendor documentation. These must be followed unless there is strong justification to change. If the object model is not documented, it can be deduced from the parent/daughter relationships and/or from the raw data samples. The raw data samples may also provide useful additional information on attributes.
To capture the hierarchy for documentation purposes, use the Object Hierarchy viewer: right-click an icon, select Open new Window, expand the nodes, resize the window, and press <Ctrl><Alt><Prt Sc>.
NC mapping (object attributes)
In some cases, object attributes may not be available at design time and may only become available in the field, after deployment of the technology pack. Region is one example.
In these cases, enter "as populated by customer" for data that the customer is expected to insert manually. This must only be used for static values.
For some attributes, the value can change often, for example TRX. If this information cannot be loaded from the raw performance data or from a configuration file, then "no mapping" must be used to identify this. This leaves the field defined for future updates if a reliable method of loading and updating the information is found.
It is important to define the most appropriate unique primary ID. An ID that is unique in raw data from a single network element may not be unique in a multi-element environment.

The Gateway Validator tool (-keyval mode) in the TNPM-WDT can be used to detect
non-unique primary IDs that will result in oscillations.

4.5.2 Counter Block
The name of the block that the counter appears in, if the row does not represent a complex calculation. This may sometimes be referred to as a LIF block, a vendor table, or a counter group. Raw data and vendor documentation can give good guidelines on naming.
4.5.3 Counter Name
The name of the counter as it appears in the vendor documentation and as exported across the Northbound Interface from the vendor equipment. This may be a counter that is self-describing, such as dropped calls, or a counter ID. Ensure that the correct counter name is mapped.
4.5.4 Description
A description of the measurement. This must not contain any carriage returns or any non-ASCII characters. Although not essential to build a definition, clear and concise descriptions are important to the end usability of the final technology pack. Vendor documentation contains these descriptions.
4.5.5 Counter Type
The type of counter, for KPI measurement: Accumulation, Intensity, Erlangs, PercentOK, or PercentFail. The counter parser will disallow any other types, as they would break a check constraint in the KPI repository database.
It is important to maintain consistency for similar counters and between raw counters and those rolled up to a higher level.
The vendor documentation may contain this information.
4.5.6 Data Type
The data type of the measurement: Integer, Int8, Float, String, or Date. The Counter Parser will disallow any other types, as they would break a check constraint in the KPI repository database.
Int or Int8? All non-float counters must be Integer, with the exception of bytes, octets, and frames type counters (Int8), as their values may exceed 2^31.
Use Float when needed.
The vendor documentation may contain this information. If it is not available, you can often deduce the data type from the counter descriptions.
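The Integer/Int8/Float guidance above can be expressed as a small chooser. A minimal sketch, assuming you know (or can estimate from the counter description) the largest value a counter can reach:

```python
def data_type(max_value, is_float=False):
    """Pick a KPI repository data type following the guidance above:
    Float when needed, Int8 for counters that can exceed 2**31
    (bytes/octets/frames style counters), Integer otherwise."""
    if is_float:
        return "Float"
    if max_value >= 2 ** 31:
        return "Int8"
    return "Integer"

print(data_type(10_000))        # Integer -- e.g. a dropped-calls counter
print(data_type(2 ** 40))       # Int8 -- e.g. a downlink-octets counter
print(data_type(0, is_float=True))  # Float
```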
4.5.7 Units
The unit must always be in the plural and is of free-text format. In 90% of cases, the unit must be "#" (number).
The following units are acceptable: Seconds, Milliseconds, %, Bytes, Octets, Kbits, Packets, Frames, Kbits/s, Minutes, Erlangs, and so on.
What is critical is consistency when defining units. The vendor documentation may contain this information.
4.5.8 KPI Group Name
The KPI group name is the name of the KPI group into which the produced KPI must be placed.
Avoid abbreviations in the measurement group names (industry-accepted and standards-body acronyms are acceptable). No spaces are permitted; spaces must be replaced with underscores. For example, Handovers_UMTS.
If you need to use the element name in the measurement name, add "overview" to the KPI group name. For example, Bearer.Ericsson.GSM.Bearer_overview.
Only the first letter must be capitalized. For example, Handover_failures_per_causes.
When the different causes are detailed, the measurement group must be named xxx_per_causes. For example, Handover_failures_per_causes.
Try to keep KPI group names to no more than 30 characters.
For upgrades involving KPI groups with more than 30 counters, it is at the designer's discretion whether these remain as they are (easier for upgrades) or are split. Large groups tend to be unmanageable, and in most cases there are logical ways to split the data into more manageable chunks. The vendor documentation often gives good pointers on how this is done.
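The naming rules above can be checked mechanically during review. A minimal sketch covering the checks that are easy to automate (acronym handling, such as UMTS in Handovers_UMTS, is deliberately left to human judgment):

```python
def group_name_issues(name):
    """Check a KPI group name against the guidelines above.
    Returns a list of problems; an empty list means the name passes."""
    issues = []
    if " " in name:
        issues.append("spaces must be replaced with underscores")
    if len(name) > 30:
        issues.append("longer than 30 characters")
    if name and not name[0].isupper():
        issues.append("first letter must be capitalized")
    return issues

print(group_name_issues("Handover_failures_per_causes"))  # []
print(group_name_issues("handover failures"))  # two issues reported
```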
4.5.9 KPI Group Abbreviations
The KPI group abbreviation must be no more than 19 characters and must be prefixed with the object. For example, CELL_TRAFFIC for Cell.Ericsson.GSM.Traffic.
Use information from the vendor documentation. Measurements are grouped within the documentation, so it makes sense to retain this grouping for the abbreviation; for example, Cell group M123456 would have the abbreviation CellM123456.
Special characters and separation
Only the characters "_" and "." are allowed in KPI names. No blanks, single/double quotes, and so on.
The "." can be used to split the KPI groups into different hierarchy levels when needed. For example:
- Cell.Ericsson.GSM.CPUE.Overload
- Cell.Ericsson.GSM.CPUM.Overload
- Cell.Ericsson.GSM.LP.Overload
- Cell.Ericsson.GSM.RP.Overload
- TRX.Alcatel.GSM.QoS.RX_level_UL
- TRX.Alcatel.GSM.QoS.RX_level_DL
- TRX.Alcatel.GSM.QoS.RX_quality

4.5.10 KPI Name
The meaningful name of the KPI. Only valid if the vendor counters have to be renamed; otherwise, use the counter names as KPI names.
KPI names should have fewer than 30 characters in most cases. The abbreviation for the KPI should be no more than 30 characters and should be the same as the counter/KPI name where possible.
The KPI name should avoid duplicating information given in the KPI group name. For example, if the KPI group name is "Visitor subscribers", then it is unnecessary to use Attempts_authentification_for_visitors as the KPI name.
4.5.11 KPI Abbreviation
An abbreviation for a KPI that is created from a counter by the Counter Parser tool. It must be no more than 30 characters long and is eventually used for raw column naming. This can be the same as the counter name if that name is no more than 30 characters.
4.5.12

KPI Expression

The (optional) mapping expression for a KPI that is created from a counter by the Counter
Parser tool. This is generally used where the KPI being mapped is calculated from more
than one counter, that is, where there is not a true one-to-one mapping.
It is also used when a counter name is not a meaningful description, so that a more
appropriate KPI name is required.
All complex KPI calculations must be mapped to the metalayer, so that the calculation
is done in the metalayer rather than the loadmap. This means that the KPI Expression
column in the spreadsheet needs to be filled in. Ensure you use curly brackets { } to
indicate that the calculation is done in the metalayer.
The only exception to this is for complex KPIs used in busy hour calculations. These
should have their calculations performed in the loadmap. In this case, either fill in column
C in the spreadsheet or fill in column K, ensuring you use ordinary brackets ( ) where
appropriate. Check that the KPI has store raw and store summary selected once the
spreadsheet has been loaded into the repository.
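The bracket convention above can be illustrated with a small sketch. The helper function and the expressions are hypothetical, not part of the toolkit; they only show how curly brackets mark a metalayer calculation while ordinary brackets leave the calculation in the loadmap:

```python
def expression_target(kpi_expression: str) -> str:
    """Return where a KPI expression is evaluated, based on its brackets.

    Curly brackets { } indicate a metalayer calculation; anything else
    (including ordinary brackets) is left to the loadmap.
    """
    expr = kpi_expression.strip()
    if expr.startswith("{") and expr.endswith("}"):
        return "metalayer"
    return "loadmap"


# A derived percentage KPI, computed in the metalayer (illustrative expression):
metalayer_expr = "{100 * Successful_TCH_assigned / Attempts_TCH_assigned}"
# A busy hour KPI, computed in the loadmap (illustrative expression):
loadmap_expr = "(Successful_TCH_assigned + Failed_TCH_assigned)"

print(expression_target(metalayer_expr))  # metalayer
print(expression_target(loadmap_expr))    # loadmap
```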

4.6 General definition guidelines


4.6.1 KPI Definition

Foreign objects

Sometimes it is possible to get the same performance statistic from two distinct elements
(elements that are linked together, such as BSC/MSC, SGSN/GGSN, or PCU/SGSN).
For example, the MSC can provide Handover measurements for the BSC element. In
this case, the measurements should be named UMTS_Handover_from_MSC to ensure
that there is no conflict with the BSC KPI measurement.

BSC.Ericsson.GSM.UMTS_Handover (KPI group name in the BSC Technology Pack)


BSC.Ericsson.GSM.UMTS_Handover_from_MSC (KPI group name in the MSC
Technology Pack)
Aggregated group

Sometimes the performance statistics from a lower-level element can be aggregated in
order to calculate a BH. For example, TRX to Cell, Cell to BSC, and Interface to SGSN.
BSC.Ericsson.GSM.Traffic_aggregated_from_cell
This aggregation can be carried out in the gateway or metalayer.
KPI groups with secondary keys

The use of secondary keys is not recommended.


KPI Name Order
Attempts, successful and failure

Attempts, request, successful, unsuccessful, failure, etc. should always be at the
beginning of the counter/KPI names if possible. For example:

Request_MS_initiated_PDP_activations
Attempts_MS_initiated_PDP_activations
Successful_MS_initiated_PDP_activations
Failed_MS_initiated_PDP_activations

Incoming and outgoing

The word incoming or outgoing should be used as a suffix or prefix of the KPI name. For example:

Unblock_messages_incoming
Unblock_messages_outgoing
Incoming_unblock_messages
Outgoing_unblock_messages

KPI names per cause (due to)

Use due to for a KPI name expressing a cause. For example,
HO_due_to_distance, HO_due_to_power, HO_due_to_better_cell, etc.
Lowercase and uppercase

All counter/KPI names must start with an uppercase character, with the rest in lowercase
(except for acronyms and abbreviations). All acronyms (such as APN, BSC,
GPRS, GSM, PDP, TCH, PDCH, and SDCCH) and abbreviations (such as HO, MS, UL,
DL) must be in uppercase.
Total/Min/Max/Mean/Avg

The KPI should start with:

Tot_ (or Total_), Min_, Max_, Mean_, Peak_, Avg_

Total or Number must not be used when the KPI names are explicit. For example,
Successful_TCH_requested (it is not necessary to use Total_successful_TCH_requested).
Percentage KPIs

All percentage KPIs must start with %_.

The name of the percentage KPI must be the same as the underlying KPI with the prefix %_.
For example, Successful_TCH_assigned should have:

%_Successful_TCH_assigned as the KPI name for the percentage.

hoSuccessOutgoingMbandMbandMsDualb must have:

%_hoSuccessOutgoingMbandMbandMsDualb as a KPI name.

Special characters and separation

Only the special character _ is permitted in a KPI name. No spaces, single/double
quotes, brackets, etc. are supported by the platform. A KPI name should not start with a
number.
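The character and length rules above can be captured in a simple check. This is a minimal sketch only; the function name and the exact rule set are assumptions drawn from this guide, not part of the product:

```python
import re

MAX_KPI_NAME_LEN = 30


def is_valid_kpi_name(name: str) -> bool:
    """Check a KPI name against the naming conventions in this guide:
    at most 30 characters, only letters/digits/underscore, no leading
    number, with an optional %_ prefix for percentage KPIs."""
    if len(name) == 0 or len(name) > MAX_KPI_NAME_LEN:
        return False
    # Percentage KPIs carry the %_ prefix; validate the remainder.
    if name.startswith("%_"):
        name = name[2:]
    # Must start with a letter, then letters, digits, or underscore.
    return re.fullmatch(r"[A-Za-z][A-Za-z0-9_]*", name) is not None


print(is_valid_kpi_name("Incoming_unblock_messages"))  # True
print(is_valid_kpi_name("%_Successful_TCH_assigned"))  # True
print(is_valid_kpi_name("3G_call_drops"))              # False (leading digit)
print(is_valid_kpi_name("HO due to distance"))         # False (spaces)
```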
Singular or plural

The plural should be used as much as possible. For example:

Incoming_unblock_messages
Successful_MS_initiated_PDP_activations

If that is not possible, the singular should be used.


4.6.2 Busy Hours

Within a performance management solution, it is essential to monitor the value of
counters/KPIs during periods of peak load. It is important to define suitable busy hours
that can be applied across the Technology Pack.
Attempts or Successful

The "attempts" counters should be preferred for BH calculations.


Naming

The BH acronyms must be suffixed with bh. For example, ctchbh.

The acronym must be as short as possible, without underscores. The maximum length is 10
characters.
The acronym must be built as follows:

<vendor abbreviation><object abbreviation><description of the BH>bh

The busy hour names should be built as follows:

<Vendor abbreviation>_<Object abbreviation>_<description of the BH>bh

For example:

erctchbh - for Ericsson Cell TCH traffic busy hour

Example components

Examples for each component are listed in the following table.


Vendor
al Alcatel
cs Cisco
co Comverse
er Ericsson
hu Huawei
lu Lucent
mo Motorola
nk Nokia
nt Nortel
se Siemens
tk Tekelec

Object
b BSC
c Cell
h HLR
g GGSN
ip IP Interface
m MSC
pc PCU
s SGSN
sp Signalling
rs Routeset
l Link
ls Linkset

Description
tch TCH Traffic
cch CCH Traffic
ul UL RLC
dl DL RLC
ip Incoming packets
op Outgoing packets
ib Incoming Bytes
ob Outgoing Bytes
ic Incoming calls
oc Outgoing calls
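Putting the components together, a busy hour acronym can be assembled and checked as in the sketch below. This is a hypothetical helper; the dictionaries reproduce only a few entries from the tables above:

```python
# Component tables (partial), taken from the abbreviation lists above.
VENDOR = {"Ericsson": "er", "Alcatel": "al", "Nokia": "nk"}
OBJECT = {"BSC": "b", "Cell": "c", "SGSN": "s"}
DESCRIPTION = {"TCH Traffic": "tch", "UL RLC": "ul", "Incoming calls": "ic"}

MAX_BH_ACRONYM_LEN = 10


def bh_acronym(vendor: str, obj: str, description: str) -> str:
    """Build <vendor abbreviation><object abbreviation><description>bh,
    enforcing the no-underscore and 10-character rules from this guide."""
    acronym = VENDOR[vendor] + OBJECT[obj] + DESCRIPTION[description] + "bh"
    if len(acronym) > MAX_BH_ACRONYM_LEN or "_" in acronym:
        raise ValueError(f"invalid busy hour acronym: {acronym}")
    return acronym


# Ericsson Cell TCH traffic busy hour, as in the example above:
print(bh_acronym("Ericsson", "Cell", "TCH Traffic"))  # erctchbh
```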

Foreign Busy Hours

Avoid using busy hours/counters defined in other technology packs when defining a
technology pack.
Applying BHs to KPI groups

Some objects do not have their own BHs. Some examples are listed below.
Object                     BHs to apply

Antenna_Branch             Cell
AUC                        HLR
Bearer                     Bearer or its parent
BSC                        BSC
BS_Carrier                 Cell
CDMA_Channel               Cell
Cell                       Cell
Common channel control     Cell
D_Link                     D-Link or its parent
DLCI                       DLC or its parent
GGSN                       GGSN
GPRS_Tunnel                Parent
HLR                        HLR
IP_Interface               Interface or its parent
LAC                        LA or MSC
LAPD                       BSC
Neighbour                  Cell
Neighbour_BSC              BSC
MSC                        MSC
MSC_Neighbour              MSC
OSI_Channel                Cell
PVC                        Parent
Processor                  Processor or its parent
SMSC                       SMSC
Radio_Link                 Cell
Routing area               RA or SGSN
Route_if                   Route or its parent
RNC                        RNC
SGSN                       SGSN
Signalling_Link            Signalling link or its parent
Signalling_LinkSet         Signalling linkset or its parent
Signalling_RouteSet        Signalling routeset or its parent
Signalling_Point           Signalling point or its parent
Supplementary_Service      SS or its parent
PCU                        PCU or BSC
TDMA_Channel               Cell
TRX                        Cell
VLR                        MSC

4.7 Technology Pack upgrades


When upgrading an existing technology pack, it is important to maintain the underlying
structure of the previous version. A focus must be placed on implementing new, changed,
and deleted counters/KPIs.
This includes maintaining:

Existing KPI group names


Counter/KPI allocation to an object
Busy hour allocation
Summary definitions
Data types
Units
Counter/KPI names and descriptions

If there is an underlying technical issue such as a defect, it has to be resolved, and this
may influence structure and content. Defects must be documented and assessed during
the definition and build of the upgrade.
For moved, renamed, or obsolete counters, the guidelines in the following sections must be
used.
4.7.1 Obsolete counters

Obsolete counters can be identified by using the Obsolete From column in the
KPI_Group window. A drop-down list of all the available counter sources is
displayed. Choose the release that the KPI is obsolete from. This automatically adds
-Obsolete in xx.xx- to the description.
Retain abstract mappings so that the loader can pick them up (to support a customer
network in transition; leave it to services to disable them upon customer approval).
Flag this in any handover document created.
Obsolete counters must be retained for two upgrades and then physically removed. The
main reason for this is to support customers with multiple live releases and during a
network upgrade. This is a recommendation rather than mandatory; provided there is
agreement with the customer, obsolete counters can be removed earlier. For example,
with Ericsson UTRAN P6 > P7 > P7.1 > W10B, the W10B pack will need to remove the
obsolete P6 counters.
Where an entire group is obsolete, add a note to the KPI group description and mark the
descriptions for each individual KPI within the group.
4.7.2 Moved/renamed counters/groups

Ensure that you use the KPI Administration tool to move or copy counters and the KPI
Group Administration tool to move or copy KPI groups. Both tools can be found in the
TNPM-WDT under Definitions -> Utilities. This ensures that a record of what has
happened is retained within the TNPM-WDT repository.
Ensure that the abstract mappings for both Object Attributes and KPIs are correct with
reference to the updated gateway. This is especially important for LIF blocks and counter
names defined in previous releases that may not have been fully verified due to lack of data.
4.7.3 Creating a TNPM-WDT project for an upgraded technology pack

When creating a project in the TNPM-WDT for an upgraded technology pack, make sure
Omit GOMs is selected. This is to ensure any changes done in the previous release are
correctly imported, for example data length.

4.8 Reporting
As a general guideline, you must target at least 25% of the KPI groups to define
reports/graphs for. For example, if a technology pack has 100 KPI groups, define
approximately 25 reports/graphs as a minimum.
Some KPI groups have multiple reports/graphs defined, for example, for important
measurements or where there are too many KPIs for one report.
Customer and SME input is important in deciding which graphs and reports to define. Some
customers may want only a data repository, whereas others may want an extensive list of
predefined graphs and reports.
Do not try to give too much information in one graph/report. Create an overview of the
information and use the drill down feature to run a second report with more detailed
information.
The drill down must be defined when needed. For example, a Handover overview report
must drill down to the Handover failures per cause report. The drill down can also be
defined from a parent to a sub-object (e.g. BSC to Cell or Cell to TRX).

A report must not contain more than 20 counters/KPIs to maintain usability.


Do not mix graph and report components in the same technology pack. If in doubt then
split the report into several individual reports.

5 Information sources
It is important to gather as much information as possible to add value to a technology
pack. Use all possible sources where necessary.

5.1 Vendor documentation


This is the main source of information to gather requirements for a technology pack. In
most cases, each vendor has a suite of documentation detailing:

All the counters/KPIs exported over the Northbound Interface per release
This usually contains the majority of the raw information that needs to be
processed for any technology pack

In some cases, the following are also available, and all can feed into the definition:

Documented format of the data structure


Detailed Managed Object hierarchy document
100% of performance management counters in a spreadsheet format
New/changed/deleted counters in a specific release
Detailed KPI formula as determined by the vendor

5.2 Data samples


Data samples are then required to allow full validation of the information sources. The
target is usually 24 hours' worth of 100% real data coverage across all objects. In many
cases this is not possible, and the focus has to be on getting data samples that cover as much
of this as possible.

5.3 Other Sources


5.3.1 Standards

Industry standards are a useful tool in gathering additional information. For example,
3GPP standards within the wireless arena.
Most vendor solutions are based upon industry standards in some way, from both content
and structural perspectives. They must be referenced where possible.
5.3.2 Subject Matter Experts

If working in an operations group or in a professional services role, there is usually a
strong pool of technical knowledge to provide detailed guidance on technical content. It
is important to leverage this experience as much as possible in terms of counter/KPI
definitions, structure, and so on. This information must be documented as requirements
and used as a baseline for the whole technology pack definition.

5.3.3 Existing technology packs

Pre-existing technology packs, counter spreadsheets and vendor documentation can all be
used to assist in a new solution. For example, while building a technology pack for
wireless messaging for a vendor, you can use an existing technology pack for the vendor
to highlight key areas of functionality and content that would most likely be required.

6 Review process
A review process must be carried out once the definition phase of the project is complete.
The following sections detail the main areas that must be covered.

6.1 Data Validation tool

The Data Validation tool must be used prior to review, with all relevant errors
fixed. Many of the common errors in technology packs can be highlighted and
resolved at this phase.
Any review must also consider best practice guidelines as defined in this
document.

6.2 Gateway Validator tool

If LIF data previously exists, use the Gateway Validator.

6.3 Objects

Ensure that the correct objects have been set.
Check that correct NC mappings are defined for primary, relationship, and
configuration attributes and are validated with data.
Each object must have a relevant description.

6.4 KPIs

Counters must be defined as per the vendor documentation. These must only be
changed if there is justification to do so.
Ensure that a complete counter set has been entered into the repository.
Review derived KPI syntax.
Review counter type and data type.
Check KPI group sizes.

6.5 Busy Hours

Check that the most relevant busy hour (usually traffic related) has been defined.
Busy hour names should comply with the best practice convention.

7 Typical Issues Found


7.1 Issues identified at a data audit

Counters/KPIs may be found in the raw data but not in the KPI repository.
The vendor documentation is the baseline for determining content. In some
cases, counters/KPIs may be missed due to a documentation or data entry error.
KPI abbreviation names may have to be changed due to Oracle reserved words.
These must be flagged at data validation.
It is also possible that 100% of the raw data is not intended to be supported
within the release.
Missing counters/KPIs intended for the release will need to be added to the KPI
repository.
KPI group reallocation may be required. Some counters/KPIs may need to be
reallocated to the correct KPI group.
ID mapping changes may be required.

7.2 Issues identified at verification of KPI repository

Incorrect KPI or busy hour mappings.
Changes that are meant to be in the KPI repository may not have been included.
Inconsistent summary types defined for KPIs under the same KPI group.

7.3 Issues identified at Technology Pack testing

Errors produced by tools or the platform.
Missing KPIs from older versions, which are only visible during the
development phase.
When older versions of the technology pack are available, there might be
differences between tables/KPIs that require manual verification and, in some
cases, technology pack rework.
