
Issue 27 December 2008

intoIT

Auditing systems development



In this issue

Australia: Assuring SAP
China: My Olympic torch relay
Brazil: Audit of information technology in the Siape consignment module
Brazil: Audit of information technology in the justice and public safety information integration system (Infoseg)
India: Why E Governance Projects Fail
Netherlands: Why government ICT projects run into problems
China, Kuwait, Pakistan and Bhutan: Auditing Systems Development
Netherlands
Why government ICT projects run into problems


Eefje Leydesdorff and Thomas Wijsman from the Netherlands Court of Audit

Government ICT projects are carried out in a complex environment. The complexity is threefold: political, organisational and technical factors complicate these projects. Therefore government ICT projects are difficult to govern. Actors involved in government ICT projects are inclined to turn a blind eye to these difficulties because each actor, for its own legitimate reasons, has an interest in large and ambitious projects. While both politicians and ministers should take our observations to heart, we believe that the latter hold the key to break out of the spiral in which projects become too complex. Ministers should be more realistic in their ambitions and should make sure to keep a firm grip on their ICT projects.

Introduction
The Dutch government experiences severe difficulties managing ICT projects. Projects run into problems: they become far more expensive than budgeted, need more time than planned or do not deliver the intended results. The problems can have an adverse impact on government processes, and often a great deal of money is involved. For instance, because of substantial problems with the project to implement a new procedure for the payment of housing and medical care benefits, 52,000 people did not receive the benefits they were entitled to at the end of 2005. Another case in point is the failure of a project aimed at developing an ICT system to support a future human resources shared services centre; because of an unconstructive relationship between government and the supplier, the supplier threw in the towel, which ended the project. A recent example is a project for the renewal of a social benefits system, which was abandoned after €87 million had been spent. Such problems are not unique to the Dutch government. In the United States, serious problems with ICT projects led to the introduction of the Clinger-Cohen Act in 1996. The private sector also struggles to manage ICT projects. Projects in the public sector, however, are much more in the spotlight.

In this article we will discuss the underlying causes of the problems with government ICT projects. It is based on an audit that we performed at the request of our parliament.

Request from parliament


On 5 June 2007, members of the House of Representatives addressed questions to the Minister of the Interior about newspaper reports on large amounts of public money wasted on failing ICT. According to the media, ICT experts estimated that the Dutch government spends between €4 billion and €5 billion a year on failing ICT projects. As a result of the debate with the minister, the House asked the government to give an overview of all large ICT projects of central government and to examine how coordination by the Minister of the Interior could be strengthened. The House also requested the Netherlands Court of Audit to investigate the persistent problems with ICT projects. We were asked to identify the main underlying causes of problems with ICT projects of the central government and to make recommendations for improvement. The request also comprised four other questions, but in this article we focus on the first question.1 This article is based on part A of the audit, which we published at the end of 2007 (http://www.rekenkamer.nl/9282000/d/p425_report.pdf). We published part B in July 2008.

1 The four other questions, not covered here, were: What is the quality of the information provided to the House of Representatives, and how useful are the project administrations in providing this information? How are the efficiency and effectiveness of expenditure on ICT projects accounted for? What indication can the Court of Audit give of avoidable costs and avoidable delays? What view does this investigation give on the possibilities and limitations of a government-wide investigation into avoidable costs and delays in central government ICT projects since 2000?


Cover photo supplied by the Chinese National Audit Office

Editorial
Welcome to Edition 27. An awful lot seems to have happened (at least for me!) since the last edition went to the printers, not least two key events: the Beijing Olympics, and the INTOSAI Working Group on IT Audit annual meeting held in Tokyo. In this edition we cover both!

Firstly, but in reverse chronological order, the Olympics. Ms Yang Li (aka Amy Young), Director of IT Audit at the National Audit Office of the People's Republic of China, and a member of our very own INTOSAI Working Group on IT Audit, was selected to participate in the Olympic Torch Relay. She ran her leg in Qinghai Province, in the northwest of China. And yes, we have the photos! It is good to see IT Audit represented in such a prestigious way.

The 17th Meeting of the Working Group was held in Tokyo from 21-23 May. The Japan Board of Audit were excellent hosts, and everything ran very smoothly. We are very grateful to the team who made it happen, including all those who worked so hard behind the scenes. In addition to updates on continuing work to share information on IT Audit (including this journal), the Working Group considered updates on its projects: IT Governance, Auditing Systems Development, e-governance risks, IT tools for electronic audit papers, countering fraud in an IT environment, and the use of SAP in public administration. The Group also discussed a wide range of suggestions for future work, and heard presentations from SAIs on their own work that is of wider interest.

As is our usual practice, we shall publish the outputs of this work in intoIT as they are finalised and agreed. Also, where the Working Group feels that wide exposure and comments will help, we will publish the draft and seek views. This edition contains such a draft on Auditing Systems Development. A team led by SAI China, with support from SAIs Kuwait, Pakistan and Bhutan, has produced an exposure draft and is seeking wider views before finalisation. So please let them have your views (details are in the article).

We are also very grateful to SAIs Australia, Brazil, India and Netherlands for their contributions to this edition. As I have said many times before, intoIT only survives because of your contributions, so please keep them coming. We are particularly keen to include a range of short news items about what is happening in the world of IT Audit in your country. I am hoping that future editions will cover the topics of Enterprise Resource Planning systems and IT tools for documenting audits, so if you have any experience of these I would be very pleased to hear from you.

Contributions are welcome at any time; please email them to intoIT@nao.gsi.gov.uk. We are very much aware of the difficulty of writing in English if that is not your first language, and we are very happy to help with your article to ensure that it gets published to the best advantage. We will also let you see exactly what we propose to publish before we do so.

Here's hoping to hear from you soon!

Steve Doughty
Editor

Steve Doughty

intoIT is the journal of the INTOSAI Working Group on IT Audit. The journal is normally published twice a year, and aims to provide an interesting mix of news, views and comments on the audit of ICT and its use in Supreme Audit Institutions (SAIs). Material in the journal is not copyrighted for members of INTOSAI. Articles from intoIT can be copied freely for distribution within SAIs, reproduced in internal magazines and used on training courses. The Editor welcomes unsolicited articles on relevant topics, preferably accompanied by a photograph and short biography of the author, and short news items for inclusion in future issues. The views expressed by contributors to this journal are not necessarily those of the editor or publisher. Contributions should be sent to: The Editor of intoIT, National Audit Office, 151 Buckingham Palace Road, London SW1W 9SS, United Kingdom. E-mail: intoit@nao.gsi.gov.uk. Website: www.intosaiitaudit.org

Australia

Assuring SAP

Enterprise resource planning and related applications software from SAP has been implemented by many Australian Government bodies as the basis of their financial and human resources management systems. Amy Fox and Lesa Craswell from the Australian National Audit Office (ANAO) chart the development of the ANAO's audit of SAP and, in particular, its use of specialised audit software, SAP Assure. There have been a variety of significant benefits from the use of SAP Assure, and the ANAO has achieved a much greater level of assurance about the controls in client systems.

Organisation Background
In Australia the Office of the Auditor-General was established in 1901 under the authority of the Audit Act 1901, the fourth act to be passed by the first assembly of the Commonwealth of Australia Parliament. In 1997 the Auditor-General Act 1997 replaced the Audit Act, taking effect on 1 January 1998. Under this Act the independence and mandate of the Australian National Audit Office (ANAO) were further strengthened, as the Auditor-General became an Officer of the Parliament.

Information Technology Audit (IT Audit) exists to provide an integrated audit support service to all business units within the ANAO. While administratively part of the Assurance Audit group, it has primary responsibility for the management and delivery of IT audit activities to both the Assurance (Financial) and Performance Audit service groups. The primary objective of IT Audit is the provision of independent assurance through the detection and evaluation of risks faced by Australian Government entities in their adoption and use of existing and emerging technologies. There are currently 23 permanent members of IT Audit, supplemented by contract staff, each with a range of educational and technical skills and experience. The variety of technology, accounting, auditing and graduate backgrounds provides a diverse team capable of successfully undertaking most of the audit activities with which it is charged.

One of the services offered by IT Audit is the provision of IT application control assessments of the various SAP environments within audit clients. SAP is a key area of review for the ANAO as it has

been implemented by multiple Australian Government entities for use as their financial management and/or human resource management systems. The purpose of our review is to undertake risk identification, risk assessment, control identification and control assessment of entities' SAP implementations. This contributes a significant level of assurance for the financial statement audits undertaken by the ANAO.

Development of SAP Audit Approach


The ANAO developed and published a SAP Better Practice Guide1 in 1998. This guide was prepared to assist Australian Government entities to ensure that security and internal control considerations (in the form of better practice procedures) within their SAP systems were configured and implemented correctly. The guide was updated and expanded in 2004.

Until 2005 the ANAO's approach to auditing SAP was through the use of manual work programmes. These programmes were developed internally over a long period of time but were resource intensive to complete. Our approach also imposed significantly on the staff at the audit client, and did not appropriately consider differences in SAP versions, audit client business processes, and/or SAP customisations. In addition, our audit scope and coverage were primarily limited to the BASIS module in SAP, specifically security role allocation and administration. To support our audit process, the ANAO relied heavily on consultants and contractors to provide specialist SAP knowledge and training.

1 Better Practice Guides can be downloaded from the Publications section of the ANAO's website: http://www.anao.gov.au

In the later part of 2004 the ANAO became aware of a consulting firm that had developed a software tool, called SAP Assure2, which in part automated the audit of SAP. The capabilities of this software went significantly beyond the audit coverage and capability of the ANAO's manual programmes, and extended audit analysis into areas such as SAP configuration controls, audit client manual and business process controls, and control risk assessments. Importantly, the software was able to undertake this analysis via read-only access to the audit client's SAP environment, and was capable of customisation and configuration to cater for the different environments we were required to audit.

A decision was made to pilot SAP Assure after a desktop review of several comparable products. The review panel consisted of both IT and Assurance Audit specialists, and considered factors such as usability, access requirements, cost, functionality, and ongoing training needs. In late 2004 the ANAO undertook a live pilot assessment of the tool at the then Department of Transport and Regional Services.3 The pilot was a combined effort with the product vendor and included involvement from a range of ANAO staff, including IT and Assurance Audit personnel.

The outcomes of the pilot were positive. The pilot team realised immediate audit benefits in terms of improved audit coverage, efficiency, automation of reporting, and focussed recommendations for configuration and security improvements. They also identified considerable ongoing benefits that could be gained by the ANAO in using the tool as part of our audit coverage. Consequently, a business case was developed and subsequently approved to implement the tool progressively across our SAP audit clients.

The rollout of SAP Assure was internally phased across a number of key audit clients. Our implementation used an integrated approach between the IT and Assurance auditors, supported by both the IT and Assurance Audit executives as well as audit client management. A programme of technical training was provided to IT Audit staff, with additional process training being provided to Assurance Audit staff. Staff champions were designated for the tool and provided a direct feedback loop from audit staff to the vendor regarding implementation or operational issues, and recommendations for possible improvements. These activities contributed greatly to the overall success of the implementation.

Analysis Undertaken
There are three modules to the SAP Assure tool:
- Security;
- Configuration; and
- Data integrity and analysis.

Using SAP Assure as part of our audit approach presents considerable opportunity for audit clients to be active participants in the audit process. The clients provide background information for module configuration as well as information on their business processes supported by SAP. In addition to system controls and processes, SAP Assure allows manual processes to be considered as part of the overall audit risk and control assessment. This ensures that a holistic view of all processes and configurations can be taken by the audit team.

Module: Controls
Key Functions:
- Provides a comprehensive assessment of controls within an SAP R/3 environment.
- Inbuilt knowledgebase of controls against which your environment is assessed.
- Automatically identifies and reports internal control weaknesses.
- Facilitates an integrated audit.
- Assists with the ongoing monitoring of key SAP controls.

Module: Security
Key Functions:
- Inbuilt knowledgebase of SAP transactions and segregation of duties points.
- Provides proactive review of segregation of duties.
- Assesses security within all SAP processes.
- Assists with the ongoing monitoring of SAP access.

Module: Integrity
Key Functions:
- Identifies integrity risks.
- Enables assessment against preset tolerances.
- Automatically identifies duplicate and potentially fraudulent transactions, and financial statement disclosure concerns.
- Assists with assessing the integrity of master data.

2 The supplier of the software is now Protiviti Independent Risk Consulting. More details at: www.protiviti.com.au/portal/site/pro-au/menuitem.422c12144c9aeef4ca19f110f5ffbfa0/
3 Now the Department of Infrastructure, Transport, Regional Development and Local Government.
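To make the segregation of duties review in the Security module above concrete, the sketch below shows the general shape of such a check. It is only an illustration: the transaction names, conflict pairs and user assignments are hypothetical, and the snippet does not reflect how SAP Assure itself is implemented.

```python
# Illustrative sketch only: a minimal segregation-of-duties (SoD) check.
# The transaction codes, conflict pairs and user assignments are hypothetical;
# a tool such as SAP Assure works from a much larger knowledgebase and reads
# the assignments directly from the SAP system.

# Pairs of transactions one user should not be able to perform together,
# e.g. creating a vendor invoice and releasing its payment.
CONFLICT_PAIRS = [
    ("CREATE_VENDOR_INVOICE", "RELEASE_PAYMENT"),
    ("CREATE_PURCHASE_ORDER", "APPROVE_PURCHASE_ORDER"),
    ("MAINTAIN_USER_ROLES", "POST_JOURNAL_ENTRY"),
]

# Hypothetical extract of which transactions each user can execute.
USER_TRANSACTIONS = {
    "jsmith": {"CREATE_PURCHASE_ORDER", "APPROVE_PURCHASE_ORDER"},
    "mlee":   {"CREATE_VENDOR_INVOICE"},
    "admin1": {"MAINTAIN_USER_ROLES", "POST_JOURNAL_ENTRY", "RELEASE_PAYMENT"},
}

def find_sod_conflicts(user_transactions, conflict_pairs):
    """Return a list of (user, transaction_a, transaction_b) conflicts."""
    conflicts = []
    for user, transactions in user_transactions.items():
        for a, b in conflict_pairs:
            if a in transactions and b in transactions:
                conflicts.append((user, a, b))
    return conflicts

if __name__ == "__main__":
    for user, a, b in find_sod_conflicts(USER_TRANSACTIONS, CONFLICT_PAIRS):
        print(f"SoD conflict: {user} can perform both {a} and {b}")
```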

The Benefits

A number of the ANAO's audit clients viewed the introduction of the SAP Assure software as a business improvement opportunity. They were keen to obtain access to the audit results in order to assess internally how they could better configure SAP to meet their entity's needs. Some have subsequently purchased the tool for their own use. In contrast, a small number of audit clients were not enthusiastic about the additional level of scrutiny and identification of risks provided by the tool.

There have been a variety of significant initial and ongoing benefits achieved by the ANAO as a result of the rollout of SAP Assure. These include:
- Greater coverage for audit risk and control assurance across the SAP environments (we are now capable of reviewing configuration and security across all SAP modules, not just BASIS);
- Increased efficiency in the second year of implementation (review of SAP was more efficient in the second year across all audit clients);
- Identification of business improvement opportunities for audit clients;
- A standardised approach to reviewing SAP across a diverse audit client base (individual configurations differ widely, however the overall audit approach is now consistent); and
- A much greater level of assurance achieved where audit clients now have strong controls (and those with weaker controls are able to identify and remediate to improve their controls).

Benchmarking and continuous monitoring ensure that subsequent reviews are more efficient. Benchmarking allows the comparison of prior year results against the current settings, allowing any changes and issue remediation to be identified very quickly. This feature enables prior year results to be fed into the risk assessment process for the current year's audit.

The Challenges

As with any software tool, in addition to licensing fees, there are ongoing costs associated with training and maintaining the knowledge of staff. There were also initial costs and challenges involved in the ANAO's implementation of SAP Assure. These included:
- Purchase of the product;
- Initial training;
- Greater understanding required of audit areas not previously reviewed;
- Risks identified impacting the overall audit approach; and
- Ensuring support from audit managers and audit clients.

The complexity of the SAP environment and the ability of the tool to reflect this also presented some additional challenges, including:
- Interpretation of results (determining whether a particular setting was appropriate or not based on audit risk and control assessments);
- Variances of configuration between audit clients (as each client has different business requirements and processes);
- Audit clients unaware of particular settings (they had been set during the original implementation of SAP using the standard Australian Government template and never subsequently reviewed/updated);
- Differences between the initial configuration of the tool (which had a standard setup for industry and private sector use) and the configuration of the SAP environments under review by the ANAO (requiring additional software setup and customisation for government use);
- Maintaining and assuring security of audit client data; and
- Turnover of key ANAO staff members initially involved in the pilot and introductory training (this has subsequently been addressed and now the majority of ANAO IT Audit staff are trained in the use of the tool).

Key Improvement Opportunities

With the assistance of SAP Assure the ANAO has been able to identify and recommend a significant number of improvements to both the configuration and security management of SAP across a large number of Australian Government entities. These recommendations are centred on the general themes of:
- Incompatible duties, and ensuring an appropriate level of system enforcement of segregation of duties between key financial transactions (for example, transaction creation and transaction approval);
- Rationalisation of the number of users with access to functions and transactions considered by the ANAO as sensitive;
- Rationalisation of the number of users with high-level administration access (for example, SAP*);
- A lack of setting and enforcement within configuration items of organisational and/or accounting policy (for example, asset useful life settings in SAP differing from those required by accounting policy); and
- Key system messages not being set, or being set as warnings instead of fatal error messages.

The Future of SAP Assurance within the ANAO

Many Australian Government entities have moved, or are preparing to move, to SAP ECC6 during the 2007-09 financial years. Within the ANAO there are internal processes underway to ensure that SAP Assure is able to continue to meet our ongoing needs in this changing environment. As part of this, the ANAO has successfully undertaken a second pilot implementation of SAP Assure on a new audit client implementation of SAP ECC6. The aim of this pilot was to observe the SAP ECC6 implementation process, identify issues the tool needed to address to continue performing, provide important staff development and learning opportunities, and provide the audit client with an opportunity for business process improvement. The ANAO is currently in the process of updating its SAP Better Practice Guide to reflect changes to the SAP system environment and new functionality since the original guides were released. The new guide is expected to be completed by March 2009.
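The benchmarking described under "The Benefits" above, comparing the prior year's extracted settings with the current year's so that only changes need detailed follow-up, can be sketched as a simple comparison of two snapshots. The setting names and values below are hypothetical, and the snippet illustrates the concept only, not how SAP Assure stores its results.

```python
# Illustrative sketch: compare a prior-year snapshot of configuration settings
# with the current year's, so only changed or new settings need detailed audit
# follow-up. Setting names and values are hypothetical.

prior_year = {
    "tolerance_limit_invoice_variance": "2%",
    "asset_useful_life_buildings_years": "40",
    "duplicate_invoice_check": "warning",
}

current_year = {
    "tolerance_limit_invoice_variance": "5%",   # changed since the last audit
    "asset_useful_life_buildings_years": "40",  # unchanged
    "duplicate_invoice_check": "warning",
    "payment_block_on_price_variance": "off",   # new setting this year
}

def benchmark(prior, current):
    """Classify each current setting as unchanged, changed or new."""
    report = {"unchanged": [], "changed": [], "new": []}
    for name, value in current.items():
        if name not in prior:
            report["new"].append(name)
        elif prior[name] != value:
            report["changed"].append((name, prior[name], value))
        else:
            report["unchanged"].append(name)
    return report

if __name__ == "__main__":
    result = benchmark(prior_year, current_year)
    for name, old, new in result["changed"]:
        print(f"Changed: {name}: {old} -> {new}")
    for name in result["new"]:
        print(f"New setting to assess: {name}")
    print(f"{len(result['unchanged'])} settings unchanged since the prior year")
```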

Lesa Craswell is a Director at the Australian National Audit Office (ANAO) in Canberra, Australia. Lesa has previously been a member of the local board of the Canberra Information Systems Audit & Control Association Chapter and has over ten years' IT experience in Australian Government. lesa.craswell@anao.gov.au

Amy Fox is a Senior Director at the Australian National Audit Office (ANAO) in Canberra, Australia. Amy has previously been a member of the local board of the Canberra Information Systems Audit & Control Association Chapter, and is a Certified Information Systems Auditor and a Chartered Accountant. amy.fox@anao.gov.au

My Olympic torch relay


International Olympic Day is celebrated on 23 June. On that day in 2008, 162 torch bearers took part in the torch relay of the Beijing Olympics beside the beautiful Qinghai Lake in Qinghai Province, in the northwest of China. Ms Yang Li, who is Deputy Director of the IT Audit Center at the National Audit Office of the People's Republic of China, ran the 121st leg. After the torch relay, Yang was interviewed by telephone by the interpreter from the Official Website of the Beijing 2008 Olympic Games.

"I had never thought that I could connect with the Olympic Games so near. Before last National Day, I thought the Olympic Games were only related with the athletes and the champions. After I was selected as a Torch Bearer organized by Lenovo last October, I was extremely pleased. My family and my colleagues felt proud for me. It is not only my glory but also that of the 80,000 Chinese auditors. Today I really hope they could also share the happiness," Yang Li said.

My family and my colleagues felt proud for me. It is not only my glory but also the 80,000 Chinese auditors. Today I really hope they could also share the happiness

Brazil

Audit of information technology in the Siape consignment module


What is the Siape consignment module?
The Integrated System of Human Resources Management (Siape) is a human resources system that processes and controls a payroll in the order of R$ 52 billion a year, covering about 1,300,000 civil servants, retirees and pension holders of the Executive Branch. Siape's consignment module consists of a set of computer-based transactions that support systematic payroll consignment procedures. These systematic procedures consist of the provision of services to civil servants, retirees and pension holders of the Executive Branch, who are referred to as consignees, by entities duly registered and authorised to make deductions from the payroll, which are referred to as consigners. Payroll consignment procedures are applied to approximately 1,300 consignees, for whom transactions amounting to over R$ 300 million a month are carried out.

Tribunal de Contas da União (SAI Brazil), Information Technology Audit Secretariat

Why did TCU carry out this audit?

Because of problems related to deductions from the payroll of civil servants, retirees and pension holders in amounts exceeding the legal limits, and other irregularities, which led the National Treasury General Attorney's Office in the State of Amapá and the Federal Attorney's Office in the State of Amapá to file a representation, the Court carried out this audit for the purpose of investigating controls and procedures related to the consignment of amounts in the payroll.

Main TCU findings

TCU detected various shortcomings directly related to the systematic consignment procedures, among which the following stand out:
- inclusion of consignments without authorisation from the consignee;
- undue re-inclusion of already excluded or finalised consignments;
- undue exclusion of consignments;
- changes in the amounts to be transferred to consignees;
- non-charging of a fee for using the system for the optional consignment rubric;
- inclusion of optional consignments under compulsory consignment rubrics, and of expenses not legally provided for in monthly payment rubrics;
- existence of a consignment rubric not provided for in the law;
- lack of controls at the beginning of the consignment flow;
- absence of criteria to punish a consignee who acts in an irregular or illegal way; and
- absence of contractual tools between the consignees and the Central Agency of the Civil Staff System of the Federal Administration (Sipec).

Other shortcomings were detected in the Siape system, its access control and the environment in which it is executed that jeopardise the correct operation of the consignment module. Among them, special mention should be made of the following:
- the staff in charge of managing the system are dissatisfied with the workload involved;
- lack of controls over those in charge of registering individuals in the Siape system and operating it;
- existence of general staff in charge of registering individuals in the system who do not belong to the regular staff of the managing unit;
- members of the Siape system development and maintenance team with uncontrolled access to the production environment; and
- lack of a unified channel to process complaints.

TCU determinations and recommendations

The main conclusion of this audit is that there are no controls allowing one to tell for sure that the Sole Paragraph of Article 45 of Law 8,112/90 is being complied with, i.e., that deductions from the payroll are being made upon authorisation from the civil servants involved. Therefore, for the purpose of adjusting the systematic consignment procedures to the provisions of the normative guidelines that regulate the matter, determinations were proposed to the Human Resources Secretariat of the Ministry of Planning, Budget and Management (SRH/MP), among which the following stand out:
- immediate suspension of optional consignments if there are any doubts as to whether or not they were authorised by the consignee, preventing their re-inclusion until it is actually confirmed that the consignee authorised the consignment;
- the authorisation of the consignee should be registered in the Siape system before the consignment is actually made;
- applicable internal administrative measures should be taken to recover damages to the public treasury as a result of the non-collection of fees for covering the costs of processing data related to optional consignments;
- prior formalisation of a contract or agreement between consignees and the Central Agency of the Sipec system to operate in the Siape and Siapenet systems;
- associations and clubs which operate in the Siape system should be made up exclusively of federal civil servants; and
- minimum documentation for each consignee should be required and maintained.

Determinations were also proposed for the purpose of improving the control and transparency of the systematic procedures applied to Siape consignments, together with recommendations aimed at improving procedures and internal controls.

TCU DELIBERATION
Sentence n. 1,505/2007, TCU-Plenary. Rapporteur: Justice Valmir Campelo
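The first determinations above come down to a single preventive control: no optional consignment should reach the payroll without a registered authorisation, and total deductions should stay within the legal limit. The sketch below illustrates that control in miniature; the field names, the 30 per cent margin and all figures are hypothetical assumptions chosen for the example and are not drawn from the Siape system.

```python
# Illustrative sketch only: the kind of control TCU recommended for payroll
# consignments. Field names, the 30% margin and all figures are hypothetical;
# they are not taken from the Siape system.

CONSIGNABLE_MARGIN = 0.30  # assumed ceiling for deductions, for illustration

# Authorisations registered in the system before any deduction is processed.
registered_authorisations = {
    ("servant-001", "consigner-A"),  # servant authorised deductions by consigner A
}

payroll = [
    {"servant": "servant-001", "salary": 5000.00,
     "deductions": [("consigner-A", 400.00)]},
    {"servant": "servant-002", "salary": 3000.00,
     "deductions": [("consigner-B", 250.00)]},   # no authorisation on file
    {"servant": "servant-003", "salary": 2000.00,
     "deductions": [("consigner-A", 900.00)]},   # exceeds the margin, unauthorised
]

def check_consignments(payroll, authorisations, margin):
    """Flag deductions with no registered authorisation or above the margin."""
    findings = []
    for entry in payroll:
        total = sum(amount for _, amount in entry["deductions"])
        if total > entry["salary"] * margin:
            findings.append((entry["servant"],
                             "total deductions exceed the margin", total))
        for consigner, amount in entry["deductions"]:
            if (entry["servant"], consigner) not in authorisations:
                findings.append((entry["servant"],
                                 f"no registered authorisation for {consigner}",
                                 amount))
    return findings

if __name__ == "__main__":
    for servant, issue, amount in check_consignments(
            payroll, registered_authorisations, CONSIGNABLE_MARGIN):
        print(f"{servant}: {issue} (R$ {amount:.2f})")
```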

Brazil

Audit of information technology in the justice and public safety information integration system (Infoseg)
What is the INFOSEG system?

The National Public Safety and Justice Information Integration System (Infoseg) was set up to integrate and provide information from public safety, justice and inspection agencies of the Federal Government, the States and the Federal District. Public agents registered in the system can access information on the web on investigations, proceedings, arrest warrants, firearms, vehicles and drivers, organised in four browsing modules. The Individuals Module is the main and most complex Infoseg module, and its database, called the National Index (IN), is the responsibility of the National Public Safety Secretariat of the Ministry of Justice (Senasp/MJ). The National Index is an index of basic information on individuals from all over the country, such as the existence of an arrest warrant, an investigation or a judicial proceeding related to them. After an initial search of the IN, detailed information can be obtained through a link to the originating State databases.

Why did TCU carry out this audit?

Considering that the public safety topic is on the agenda of Brazilian society, and that a previous TCU audit had detected difficulties in implementing such an important system, the Court carried out this audit for the purpose of evaluating safety-related aspects and the consistency of the information managed by the Infoseg system.

Main TCU findings

TCU detected serious improprieties in the system, particularly with regard to its management:
- insufficient regulation;
- inconsistencies between the criminal databases in the units of the Federation and the IN;
- lack of a clear definition of the meaning of the information making up the IN;
- lack of a formally defined information safety policy;
- inadequate human resources framework and system usability;
- lack of a definition of the owners of certain assets;
- non-existence of a business continuity plan;
- poor management of back-up copies;
- inappropriate procedure for controlling changes in the system;
- shortcomings in the safety of the physical facilities;
- inappropriate operation of the user assistance service; and
- insufficient audit trails and shortcomings in labour contracts.

Among the problems which were detected, special mention should be made of the inconsistencies between the data contained in the National Index and those contained in the databases of the agencies that feed the system. Considering that Brazil's public safety agents use the information available in this system to make decisions such as whether or not to arrest an individual, the inconsistencies that were detected are extremely serious, and urgent measures should be taken by the agency in charge to correct them. Apart from jeopardising the reliability of the system, these inconsistencies can have serious consequences, such as the unwarranted arrest of an innocent citizen or the non-arrest of a criminal.

TCU determinations and recommendations

With the aim of improving the Infoseg system, which is a powerful tool that can contribute to ensuring the timeliness, efficiency and efficacy of inspection, public safety and justice actions, TCU recommended that the system should be institutionalised through a federal law. The Court determined that Senasp should, among other measures, correct the errors that generate inconsistencies between the criminal databases in the units of the Federation and the IN, define the meaning of the information making up the IN, and draw up information safety and access control policies and a business continuity plan. It also determined that the secretariat should evaluate the outsourcing of staff in the Infoseg managing unit, so that its management is taken care of by sufficient permanent civil servants appropriately trained to perform strategic and sensitive activities.

TCU DELIBERATION
Sentence n. 71/2007, TCU-Plenary. Rapporteur: Deputy Justice Augusto Sherman Cavalcanti
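The most serious finding, and the corresponding determination, concerns consistency between the central National Index and the State databases that feed it. A minimal sketch of such a reconciliation is shown below; the record layout and the data are hypothetical and far simpler than the real system.

```python
# Illustrative sketch: reconcile a central index against the source databases
# that feed it. Record layouts and data are hypothetical and far simpler than
# the real National Index (IN).

# Central index: person id -> flags summarising what the states reported.
national_index = {
    "person-1": {"arrest_warrant": True},
    "person-2": {"arrest_warrant": False},
    "person-3": {"arrest_warrant": True},   # not present in any state database
}

# Source databases of two states.
state_databases = {
    "state-A": {"person-1": {"arrest_warrant": True}},
    "state-B": {"person-2": {"arrest_warrant": True}},  # disagrees with the index
}

def reconcile(index, sources):
    """Report index entries that disagree with, or are missing from, the sources."""
    findings = []
    for person, flags in index.items():
        state_records = [db[person] for db in sources.values() if person in db]
        if not state_records:
            findings.append((person, "no supporting record in any state database"))
            continue
        # The index should show a warrant if and only if some state reports one.
        warrant_in_states = any(rec.get("arrest_warrant") for rec in state_records)
        if flags.get("arrest_warrant") != warrant_in_states:
            findings.append((person, "arrest warrant flag differs from state data"))
    return findings

if __name__ == "__main__":
    for person, issue in reconcile(national_index, state_databases):
        print(f"{person}: {issue}")
```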


India

Why E Governance Projects Fail
E Governance, especially in developing countries, is looked upon as a means to change the very concept of governance, resulting in empowerment of citizens and increased transparency in public dealings by governments; increased efficiency in the delivery of public goods is an inherent underlying assumption. This paper, by Dr Ashutosh Sharma, shares some of the problems that may derail the process and need to be guarded against by vigilant auditors, who should bring these to public attention in a timely fashion.

In the legendary, award-winning Hindi novel Raag Darbari1 (meaning 'The song of the court') there is a character who intersperses the story of the main protagonists in a tragicomic manner. His quest for the Holy Grail is translated into futile attempts at getting a copy of the land records from the village bureaucracy. E governance would have been good news to him and his millions of fellow beings. However, according to an oft-quoted 2003 survey on e-government initiatives in developing/transitional countries, only 15 per cent of e government projects can be termed successful, with 35 per cent total failures and 50 per cent partial failures, where the outcomes are classified as follows:
- Total failure: the initiative was never implemented or was implemented but immediately abandoned.
- Partial failure: major goals for the initiative were not attained and/or there were significant undesirable outcomes.
- Success: most stakeholder groups attained their major goals and did not experience significant undesirable outcomes.

"Part of the inhumanity of the computer is that, once it is competently programmed and working smoothly, it is completely honest." Isaac Asimov

Though this survey was on e government and not e governance, a very large number of e governance projects have, over the years, belied the promise that they once showed. SAI India, over the last four years, has conducted numerous audits of e governance projects, with the scope ranging from evaluating the system development methodology to the overall performance in terms of the achievement of objectives. The results brought into focus the fact that e governance is much more than a technological initiative: it is made up of a complex set of relationships between stakeholders' commitment, structured developmental processes and adequate infrastructural resources. There were a number of reasons for e governance projects not doing well or falling short of expectations. Many should be applicable across national boundaries and could serve as guiding points for auditors. Some of the more important ones are shared below.

1 Lack of business process modification

Lack of business process modification in many well-meaning projects, and duplication of the manual processes in the IT environment, were seen as major reasons for the end users/citizens not associating any value addition with the projects; they looked upon e governance as an unwelcome addition to the hurdles to be crossed before getting the work done. For example, in departments which maintain land records, especially in rural areas, the details regarding land ownership, cropping patterns etc. were computerised, but no legal sanctity was given to the output generated by such systems in the absence of a commensurate change in the statutes. Similarly, lack of horizontal integration means that e governance projects continue to deliver services in a fragmented and unsatisfactory fashion, resulting in the end users having to approach a multitude of government agencies and thus defeating the promise of 'less government in your life'. Moreover, ambiguity about the very concept of e governance results in many government entities categorising projects such as office automation and inventory management as e governance projects. Thus vast sums of money are spent on computerisation activities without giving the e governance related benefits to the end users.

1 Author Shrilal Shukla. English version published by Penguin.

2 Vendor driven initiatives


Currently e governance is the buzzword in the corridors of power in governments and the international donor agencies. Vast sums of monies are being promised and given to implement such schemes. However a close scrutiny reveals, startlingly, that the preference for IT components such as the hardware and software such as operating systems and RDBMS change dramatically for similar projects within the same country in the same period of time. This is sometimes reflected in a kind of a secular trend resulting from an unstated agenda or a conscious shift. While there may be only limited objections to choosing one technology over the other, auditors need to monitor and examine the trends. It is also seen that often the Acquisition and implementation processes are not monitored in an effective fashion and deliverables are often less than the specifications. However due to a hurry to get things going the projects may be operationalised even when they are not fully ready. Moreover it is not only in the Acquisition and Implementation but also in the Delivery and Support areas that excessive dependence on the developer(s)/vendors is seen resulting in large revenue expenditure while the untrained work force of the government entities sit idle. Additionally there is often poor control over outsourcing. The benchmarks for evaluating performance of the service

provider are not set out in a transparent fashion and are often biased towards it. For example a penalty clause for deficient services and extended liability is often absent or too poorly drafted to be legally enforceable. This completes the chain which started from lack of transparency in selection of technology/vendor then goes through to less than adequate receipt of deliverables and continues to large payments for services which are not monitored for performance; the citizen or the governed being the only loser.

Sometimes e governance projects, paradoxically, become victims of their own success. The demand for the services rendered by them may end up outstripping the capacity both of the infrastructure and of the organisational preparedness. This is especially true in cases of start small, rollout fast and scale big later model which is increasingly gaining in popularity.

4 Vested interests
It was often seen that there was clearly stated commitment from the Political establishment but continuous resistance by a section of the executive and other stakeholders adversely affected by transparency brought in by e governance. E governance is a catchy slogan which translates into power to the people and paints a picture where the omnipotent computer(s) would take over all those functions of the state which entail an unnecessary interaction of the common man with a government official. This immediately attracts the fancy of the citizens who are also potential voters, and look forward to a corruption and discretion free system where each individual is treated according to transparent rules. This enthusiasm for IT enabled e governance allows the governments to announce and launch mega e governance schemes which often translate into large scale expenditure on hardware and software. These are often associated with lack of transparency in acquisition and creation of technological and physical infrastructure, an irony since the projects themselves seek to increase transparency in the governance mechanisms.

3 Individual led initiatives


In many projects at the system development stages, especially when the user requirements were being made, there was no effective communication between the users to share the domain knowledge with the system developer(s). This was particularly true of projects which were being implemented as a result of individual initiatives emanating from the top of the management hierarchy. In such cases the developers also felt answerable to none except the management at the very top. This soon caused even the enthusiasts at the operational level to lose interest and the projects were implemented by going though the motions. This led to the development of systems which were inherently deficient and soon ran into the ground after the change of guard at the top management level. Even where the systems become operational and were hailed as success stories poor change management controls meant that over a period of time they completely stopped doing what they had set out to achieve.

12

During audit the government functionaries were often found painting a worse picture of the e governance projects than the actual situation. The expectation was that a very critical audit report would help in derailing the process of e governance.

However there are also strong vested lobbies which feel threatened by this transparent governance and often they were seen to do anything to either discredit a new project or not allow it to take off at all. Though the bogey of unemployment resulting from computerisation is long dead, the resistance continues as it has been realised that automation of backend procedures would eventually result in e governance.

5 Confidentiality issues
A major concern is the lack of attention to issues relating to the confidentiality of data, for example in e tendering systems or regarding the personal details of citizens. If an e tendering system stores bid data in unencrypted form before the opening date, PKI is not mandatory for the submission of bids, logical time locks to disable access to bid details before the bid opening date are absent, and there is inadequate provision of activity logs for system and data administrator activities, then the system must be labelled as extremely prone to manipulation and does more harm than good to the cause of IT in improving governance. One may be surprised to find cases where large contracts have been decided on the basis of such a system. Information technology indeed cuts both ways! Similarly, if personal details such as social security numbers or taxation details in an e tax return filing system are not kept in a secure environment, this will ultimately undermine the confidence of users in such systems. Ironically, IT enabled e governance can also facilitate fraud. It was observed that in cases of computerised lucky draws for houses and residential plots the algorithm had been tampered with to favour a few, completely contrary to the spirit of a lucky draw, in which the results should be random. As a result, some sections of citizens started blaming IT for the problem. Clearly the issue was not one of IT enabled fraud but of the organisation not addressing the risks arising from the very nature of the technology.
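The kind of controls described above can be made concrete with a small sketch. The following Python fragment (the class, the field names and the use of the cryptography library's Fernet cipher are illustrative assumptions, not a description of any audited system) shows bids being encrypted at submission, a logical time lock refusing access before the opening date, and every access attempt being logged for the auditor:

from datetime import datetime, timezone
from cryptography.fernet import Fernet  # symmetric cipher; the key should be held outside the application

class BidStore:
    # Illustrative bid store: bids are held encrypted and cannot be read before the opening date.
    def __init__(self, key: bytes, opening_date: datetime):
        self._cipher = Fernet(key)
        self._opening_date = opening_date
        self._bids = {}        # bidder_id -> encrypted bid document
        self._audit_log = []   # every submission and access attempt is recorded

    def submit(self, bidder_id: str, bid_document: bytes) -> None:
        # Encrypt immediately; the plaintext bid is never stored.
        self._bids[bidder_id] = self._cipher.encrypt(bid_document)
        self._audit_log.append((datetime.now(timezone.utc), "SUBMIT", bidder_id))

    def open_bid(self, bidder_id: str, requested_by: str) -> bytes:
        now = datetime.now(timezone.utc)
        self._audit_log.append((now, "OPEN_ATTEMPT", f"{bidder_id} by {requested_by}"))
        # Logical time lock: decryption is refused before the published opening date.
        if now < self._opening_date:
            raise PermissionError("Bids cannot be opened before the opening date")
        return self._cipher.decrypt(self._bids[bidder_id])

An auditor reviewing such a system would look for exactly these features: encryption at rest, an enforced opening date, and a tamper evident log of administrator activity.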

6 The digital divide
There is always a risk that the implementation of e governance projects is prioritised so as to benefit only certain sections of society. Additionally, e governance delivery mechanisms may not take account of the existing digital divide. This can cause even the most well intentioned initiatives to fall short of their objectives. Though innovative methods were seen, such as e governance kiosks manned by paid non government facilitators to help citizens, the fact remains that without bridging the digital divide e governance projects may not gain the critical mass needed to be effective. Successful e governance implementation rests on four main components: identification of end users' needs, business process modification, use of information technology and, most importantly, committed government intent.

Deficiencies in any of these will result in e governance projects failing to achieve their objectives. It should be noted that identifiable and measurable parameters for assessing the success of e governance projects are not easy to formulate. This is especially true of the intangible or soft benefits, which take the form of increased transparency, a sense of economic and social empowerment through access to information, and better efficiency in the delivery of public services. In the absence of benchmarking, owing to the uniqueness of some of the projects, making a quick judgment about their success or failure is a risk that all auditors must guard against.

Dr Ashutosh Sharma joined SAI India in 1997. He is a medical doctor from Delhi University and a CIA and CISA. He has worked as the Chief Information Security Officer of SAI India and as its Director of IT Audit. He currently works as a member of the faculty for IT and IT audit at SAI India's Training Academy for its officers. He has participated in various international events of the UN and INTOSAI. He is also a recipient of the Prime Minister of India Award for Excellence in Public Administration for 2006-2007.


Why government ICT projects run into problems


Eefje Leydesdorff and Thomas Wijsman from the Netherlands Court of Audit
Government ICT projects are carried out in a complex environment. The complexity is threefold: political, organisational and technical factors complicate these projects, which makes them difficult to govern. Actors involved in government ICT projects are inclined to turn a blind eye to these difficulties because each actor, for its own legitimate reasons, has an interest in large and ambitious projects. While both politicians and ministers should take our observations to heart, we believe that the latter hold the key to breaking out of the spiral in which projects become too complex. Ministers should be more realistic in their ambitions and should make sure to keep a firm grip on their ICT projects.
Introduction
The Dutch government experiences severe difficulties managing ICT projects. Projects run into problems: they turn out far more expensive than budgeted, need more time than planned or do not deliver the intended results. The problems can have an adverse impact on government processes, and often a great deal of money is involved. For instance, because of substantial problems with the project to implement a new procedure for the payment of housing and medical care benefits, 52,000 people did not receive the benefits they were entitled to at the end of 2005. Another case in point is the failure of a project aimed at developing an ICT system to support a future human resources shared services centre: because of an unconstructive relationship between government and the supplier, the supplier threw in the towel, which ended the project. A recent example is a project for the renewal of a social benefits system, which was abandoned after €87 million had been spent.

Such problems are not unique to the Dutch government. In the United States serious problems with ICT projects led to the introduction of the Clinger-Cohen Act in 1996. The private sector also struggles to manage ICT projects. Projects in the public sector, however, are much more in the spotlight.

In this article we will discuss the underlying causes of the problems with government ICT projects. It is based on an audit that we performed at the request of our parliament.

Request from parliament
On 5 June 2007, members of the House of Representatives addressed questions to the Minister of the Interior about newspaper reports on large amounts of public money wasted on failing ICT. According to the media, ICT experts estimated that the Dutch government spends between €4 billion and €5 billion a year on failing ICT projects. As a result of the debate with the minister, the House asked the government to give an overview of all large ICT projects of central government and to examine how coordination by the Minister of the Interior could be strengthened. The House also requested the Netherlands Court of Audit to investigate the persistent problems with ICT projects.

We were requested to identify the main underlying causes of problems with ICT projects of the central government and to make recommendations for improvement. The request also comprised four other questions, but in this article we focus on the first question.1 This article is based on part A of the audit that we published at the end of 2007 (http://www.rekenkamer.nl/9282000/d/p425_report.pdf). We published part B in July 2008.

1 The four other questions, not covered here, were:
- What is the quality of the information provided to the House of Representatives and how useful are the project administrations in providing this information?
- How are the efficiency and effectiveness of expenditure on ICT projects accounted for?
- What indication can the Court of Audit give of avoidable costs and avoidable delays?
- What view does this investigation give on the possibilities and limitations of a government-wide investigation into avoidable costs and delays in central government ICT projects since 2000?

Method
To start with, we defined an ICT project as follows: an ICT project is a project whose aim is to develop and/or introduce an ICT system. We understand development to mean the specification, procurement and internal and external construction or modification of the system; introduction means both the technical and the organisational implementation. We based our report on three sources of information. First, we used previous audits of ICT projects by the Netherlands Court of Audit. Second, we conducted a study of the causes given in national and international literature. As our third source, we consulted experts from diverse backgrounds: government, ICT suppliers, academia and IT audit. A great deal of literature has been published on how to manage ICT projects and how to control the risks. Despite the many manuals and methods, large projects keep running into problems. Our aim was not to write the next 'how to' project management handbook or to reproduce a list of common, well known failure factors. Instead, we strived to identify the main underlying causes of the persistent problems with ICT projects.

Characteristics of Government ICT projects


We identified three factors of complexity that characterise government ICT projects, namely political, organisational and technical complexities. Government ICT projects often fail to a certain degree because of a combination of these three factors. In this section we will describe these factors that complicate ICT projects, often to the extent of becoming unmanageable.

Political complexity
Political decision makers tend to believe that ICT is the ideal solution to any policy problem. Senior civil servants, ministers and members of parliament often do not fully understand what ICT can do and, more importantly, what it cannot do. A minister who takes decisions without seeking adequate advice runs the risk that the project is unrealistic from the start. Also, it is not uncommon for a project deadline to be the outcome of a political debate or the statement of an ambition rather than the result of an underpinned and realistic planning exercise. Unrealistic timing can get projects into serious problems.

Another aspect of political complexity is that the political environment is highly dynamic. Political changes with considerable consequences for the project should lead to a reconsideration of the project conditions in terms of ambition, time, money and human resources. The same applies when problems or new risks arise. However, such reconsiderations are not always politically opportune. As a result, those responsible are tempted to keep muddling on even when continuation of the project is no longer justified by a valid business case.

Organisational complexity
Government ICT projects are often also complex because several organisations are involved that are more or less autonomous. One can think of a number of organisations joining forces in an ICT project because their business processes are related and require an exchange of information. Central steering of the project is difficult, and sometimes even impossible, in these cases. Since organisations tend to act primarily from their own main goals, their contribution to and acceptance of the ICT project depend largely on how the ICT project can serve these specific goals. Another aspect of organisational complexity is the strong interconnection between ICT and organisation: an ICT project generally implies an organisational change and, conversely, organisational change can have a significant impact on the ICT landscape.

Technical Complexity
There is an inherent mismatch in flexibility between ICT systems and political and organisational processes. While political and organisational processes are dynamic and flexible by nature, once a decision has been taken to develop a particular ICT system and the project is under way, it is difficult to change the project. Such changes are not impossible, but they have their price in terms of time and budget overruns. Another complicating technical factor is that ICT systems often have to be connected to other systems already in operation. Compatibility between ICT systems, already a major issue within a single organisation, becomes especially challenging where a number of organisations are involved. The problems that arise when collating data from two or more organisations whose data entries do not agree with each other (conversion problems) are often underestimated. Also, advances in ICT succeed each other at a daunting pace: expertise and know-how quickly become obsolete, and new techniques that become available during the project place the chosen strategy in a new light.
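To give a concrete flavour of the conversion problem mentioned above, the sketch below (Python; the file layout, field names and matching rule are assumptions made up for illustration) reconciles a shared reference field between extracts from two organisations before their data are collated:

import csv

def load_by_key(path: str, key: str) -> dict:
    # Index a CSV extract by a shared identifier (e.g. a citizen or case number).
    with open(path, newline="") as f:
        return {row[key]: row for row in csv.DictReader(f)}

def reconcile(path_a: str, path_b: str, key: str, field: str) -> list:
    # Report identifiers missing from either extract or whose shared field disagrees.
    a, b = load_by_key(path_a, key), load_by_key(path_b, key)
    findings = []
    for ident in sorted(set(a) | set(b)):
        if ident not in a or ident not in b:
            findings.append(f"{ident}: present in only one organisation's extract")
        elif a[ident][field].strip().lower() != b[ident][field].strip().lower():
            findings.append(f"{ident}: '{a[ident][field]}' vs '{b[ident][field]}'")
    return findings

# Usage (file and field names are illustrative):
for finding in reconcile("org_a_clients.csv", "org_b_clients.csv", key="client_id", field="date_of_birth"):
    print(finding)

Every unresolved difference is a conversion issue that costs time and money once the joint system goes live.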

Tension between organisational, political and technical complexity


[Figure: the project, with its ambition, time, funds and personnel, sits at the centre of the tension between organisational, political and technical complexity. Source: Court of Audit, 2007]


Why government projects become unwieldy


Our main finding is that government ICT projects are often too ambitious and too complex because of the combination of the political, organisational and technical factors we mentioned in the previous section. A project that is too complex lacks balance between the ambitions and the available human, financial and time resources. In theory, the solutions to reduce complexity are relatively simple, if not obvious. The motto is: start small and proceed in small steps. Minimise the organisational and technical complexity. Organisational complexity, for example, can be reduced by limiting the number of organisations involved. Depending on the type of project, pilot schemes can be carried out or developments can be piggy backed with one organisation developing an application and others adopting the functionalities they need. Technical complexity can be reduced by opting for standard software. The 80/20 rule is also often applicable. About 80% of the work required to develop an ICT application is concerned with the last 20% of the applications functionality. Do all exception rules really have to be programmed? Or can some exceptions be replaced with manual procedures? Complexity is sometimes a given, for example because legislation must apply to all citizens at the same moment. If it is a given, other conditions, such as completion time, must be adapted for the project to remain realistic. A project can also be made more manageable by dividing it into smaller, more controllable subsidiary projects. All the recipes given above are known. But they are often not applied even though those involved know that projects are doomed to failure if they are too ambitious or too complex. Why is this so? The Court of Audit understands the cause to be in the area of legitimate interests of the actors concerned.

The actors involved in the initial stages of an ICT project are ministers, the House of Representatives and ICT providers. Each of these actors, for its own legitimate reasons, has an interest in large and ambitious projects. The House of Representatives not only exercises parliamentary control over the government but also takes its own initiatives to steer the governments actions. In this latter role, the House often expects the government to solve complex problems, preferably as quickly as possible. These demands usually culminate in complex projects with tight deadlines. Ministers like to show they are decisive. Decisiveness is best displayed by an ambitious project subject to a definite and tight deadline. Announcing a feasibility study or a small scale pilot scheme is not usually seen as decisive action. To survive, ICT providers need contracts, preferably big ones. And they are unlikely to refuse the additional work brought about by additional requirements. Since all these actors have a natural tendency to think in terms of big solutions to big problems and therefore cannot keep each other in check during this critical phase, an ICT project can quickly be sucked into a spiral of growing complexity during the process of discussion and negotiation. The parties entrap each other in the spiral and inevitably agree upon a project that is too complex but has the status of political fact from which there is no elegant way back.

Recommendations
We are convinced that ministers hold the key to breaking out of the spiral in which projects become too complex. A minister not only has a voice in the political decision making but is also responsible for the management and execution of a project. In the capacity of client, moreover, the minister is in direct contact with the provider. Our recommendations are therefore designed to strengthen a minister's position. The underlying thought is that if ministers have put their house in order they can take a more considered view of the dynamic environment in which ICT projects become increasingly ambitious. We can summarise our recommendations as follows: be realistic about the ambitions and make sure you keep a firm grip on your ICT projects. Realism means being aware that:
- ICT is not a quick fix to a problem;
- political deadlines can be fatal to a project;
- ICT ambitions also display a gap between policy and practice;
- changes during the project are often inevitable;
- an exit strategy prevents muddling on.

To keep a grip on ICT projects:
- the minister should be a serious counterpart for both the House and the ICT provider;
- decisions should be taken in phases;
- decisions should be based on well considered plans, and projects should be evaluated as part of an overall project portfolio;
- reconsideration along the way should be made possible.

Closing remarks
1 The factors that we identified above are not to be used as an excuse for failing ICT projects: 'Government ICT projects are inherently complex, therefore my project failed to deliver.'
2 While we explicitly address the ministers in the report, they cannot do it all by themselves. The House of Representatives should also be willing to be more realistic in its demands.
3 Although we address the ministers in our recommendations, this does not imply that we see the minister as some sort of super project manager. All who are involved in the decision making should take the lessons to heart.

Eefje Leydesdorff MSc MPIM Eefje Leydesdorff is a senior auditor at the Netherlands Court of Audit and works on her PhD. She is interested in questions on policy, public organisations and information management. Before joining the Netherlands Court of Audit in 2006 she worked on several ICT and information management projects in the public sector. She obtained a masters degree in Public Information Management at TiasNimbas Business School (2006) and a masters in Artificial Intelligence at the University of Amsterdam (2002).

Thomas Wijsman MA Thomas Wijsman is a project manager at the Netherlands Court of Audit, coordinating IT audits. An auditor since 1986, his focus since the mid-nineties is mainly on IT deployed by Dutch ministries and public bodies. He has a masters degree in psychology / educational science, a higher educational degree in IT, and a post-graduate degree in IT auditing. He received his certification as a qualified IT auditor from NOREA, the professional association for IT-auditors in the Netherlands. He is a member of NOREAs editorial staff and Technical Committee, as well as the coordinator of a working group of government IT auditors.


Auditing Systems Development


This paper is an exposure draft of a new major output of the INTOSAI Working Group on IT Audit, produced by a research team from the SAIs of China, Kuwait, Pakistan and Bhutan. Readers comments would be very much welcomed by the team. They should be emailed to cnao@audit.gov.cn or amycnao@yahoo.com, marked for the attention of Ms Yang Li of the Chinese National Audit Office.
As Information Technology has advanced, government organisations have become increasingly dependent on the use of IT to carry out their business operations and service delivery and to process, maintain and report essential information. Organisations often spend significant resources in developing, acquiring and maintaining application systems that are important to their effective functioning. These systems in turn manage critical information and should be considered an asset that needs to be effectively managed and controlled. But heavy reliance on IT can also result in unacceptable levels of disruption if the systems development is delayed or does not work as intended. Many risks can, to some extent, affect the successful development or acquisition of a new information system. These include the risk that the system will:
- never be delivered;
- be delivered late (time overrun);
- exceed budget (cost overrun);
- divert user resources to an unacceptable degree;
- not deliver the required functionality;
- contain errors;
- be unfriendly to use;
- fail frequently during operation;
- not perform to the required standard;
- be difficult and costly to operate, maintain and expand;
- not interconnect with other systems.

The investigation of IT project failures has revealed a number of common problems, which can be summarised as follows.

Failure to assess (and manage) project risks, in particular stemming from:
- an unrealistic business case;
- technology problems;
- lack of senior management involvement;
- lack of user commitment to the project.

Ineffective project management:
- inexperienced IT project managers;
- failure to apply, or an inadequate, IT project management methodology;
- lack of quality standards;
- vague or incomplete specification of requirements;
- failure to streamline the requirement specification, resulting in slipping deadlines and rising costs.

Mismanaging consultants and suppliers:
- failure to seek competitive tenders;
- high staff turnover (no continuity);
- vague terms of reference for consultants and open-ended contracts;
- failure to monitor and control consultancy costs;
- lack of independent quality assurance of consultants' work.

System implementation failures:
- unrealistic delivery deadlines;
- inadequate acceptance testing;
- taking shortcuts due to lack of time (particularly cutting back on training, testing and quality reviews);
- unworkable or non-existent contingency plans.

The risk of project failure can be dramatically reduced by breaking the project down into a number of manageable stages, where the aim of each stage is to produce one or more pre-defined products. The method typically used is the System Development Life Cycle (SDLC), a process involving multiple stages (from establishing feasibility to carrying out post-implementation reviews) that is used to convert a management need into an application system, whether custom-developed, purchased or a combination of both. The advantage of a structured approach is that it helps to reduce the complexity of planning, monitoring and control. It also offers a number of points during the project where progress against pre-defined deliverables can be reviewed and corrective action taken as necessary (including abandoning the project!). To ensure a structured, cost-effective and efficient audit of a systems development project, a new approach should be adopted: auditing each completed phase of the development and giving input that allows corrections to be made before the next phase begins. This differs from the traditional approach of auditing systems development projects only after the project has been completed, where the focus is on assessing whether the development took place on a structured basis. The IT auditor's overall objective, like that of everyone else involved, is to contribute to the success of the system project. IT auditors are best qualified to do this by helping to control the exposures resulting from the project, and by giving management reasonable assurance that this has been done.

1. Audit plan
When auditors begin auditing the work of the SDLC, they should first prepare an audit plan; a good plan is a good beginning. In the audit plan, the auditor should identify the audit scope, determine the audit objectives, gather basic information about the organisation, determine materiality, assess risk, and evaluate internal controls. During planning, auditors should communicate with the organisation about the audit objectives and obtain relevant information about its information systems, technology infrastructure, the structured approach used for developing application systems, organisational structure, IT strategic plan and so on.

Auditors should also have an understanding of the above information, establish levels of materiality, assess risk, and take into consideration the internal controls over the SDLC. While planning the review of the SDLC of an application system, auditors should consider:
- the acquisition/development mode, technology, size, objectives and intended usage of the application system;
- the project structure for acquisition and implementation;
- the skill and experience of the project team;
- the SDLC model chosen;
- the formal SDLC methodology and customised process design adopted, if any;
- risks that are likely to affect the SDLC;
- any concerns or issues perceived by appropriate management;
- the current SDLC stage;
- any prior review of the earlier SDLC stages of the application system;
- any prior SDLC reviews of similar application systems;
- any other risk assessments/reviews by the IT auditors or others (such as IT staff) that have a bearing on the proposed review;
- the skill and experience level of the IT auditors available and the possibility of obtaining competent external assistance where necessary.

Auditors should communicate with the project sponsor about:
- the objectives of the review;
- the scope of the review in terms of the SDLC stages to be covered;
- the type of review: whether it is a pre-implementation review of the proposed SDLC, a parallel/concurrent review as SDLC stages are being executed, or a post-implementation review after the SDLC stages in question are completed;
- the timeframe of the review (start and end dates);
- the processes for reporting observations and recommendations;
- the process for following up on agreed actions.

Identify audit objectives
An SAI gets its authority from the government or the law to review and assess a government department's operations. The audit objectives are to ensure that systems under development successfully meet the organisation's aims and objectives. From its audit work, the SAI comes to conclusions and issues reports about:
- the effectiveness and efficiency of operations;
- the reliability of financial reporting; and
- compliance with applicable laws and regulations.

More detailed objectives vary with the audit scope, for example:
- whether information systems sufficiently meet the business needs of the organisation;
- whether adequate controls and audit facilities have been built into the system.

Collect the background information
As well as the information gathered at the planning stage, auditors should obtain detailed information about the organisation and the environment in which it operates in order to perform the control review. A vital part of this is gathering information about the organisation's IT systems; without it, the auditors would be unable to say that a full understanding of the organisation has been achieved. This information will allow the auditors to assess the complexity of the systems to be reviewed, which in turn affects the skills and resources required to carry out the review. The information gathered by the auditors should include:
- key documents: the IT strategy, the business plan and expenditure profiles;
- the hardware used to run the financial systems;
- system software (operating system, utilities, security software and networking software);
- financial and other applications: auditors should determine which applications are to be reviewed as part of the audit process;
- key organisation staff in both the finance and IT departments: when carrying out a detailed control review, IT auditors need to contact and interview these staff;
- whether there have been any problems with the organisation's IT systems (for example, in previous years the system may have been unable to produce a complete trial balance).

IT auditors will also need to know:
- what changes, if any, are planned for the IT system;
- are there any written policies or detailed management practices to govern systems development projects?
- is there a management steering committee?
- who is assigned specific responsibilities for systems development projects?
- is there an appropriate systems development methodology which provides for periodic milestone events?
- is there a project management and control system which requires preparation of time and cost budgets and then measurement of actual against planned results?
- is there an independent quality assurance function which monitors the details of systems development projects?
- is a project manager assigned with overall responsibility for the direction and coordination of systems development?
- are adequate standards for complex systems development used?
- do documentation standards provide detailed guidance for each step and each product during systems development?
- is there a comprehensive data security function which monitors systems development, maintenance and operation?
- is there a comprehensive data administration function, and have detailed responsibilities and authority been established?
- is there a comprehensive data dictionary, and is it required during systems development and modification?
- are feasibility, impact, cost/benefit and risk analyses prepared, approved and maintained during systems development projects?
- are internal controls and security features included in the systems design?
- are the internal auditors required to monitor systems development projects, sign off at milestones, and review and approve acceptance test results?

All this information could describe the existing information systems and technology, identify available resources, and define known problems. It can be collected in such ways as:
- previous working papers: auditors should confirm that the information remains up to date and accurate;
- observation: touring the organisation's IT facilities;
- interviewing IT personnel;
- reviewing internal audit reports.

Assess materiality, identify the software to be audited
In a single IT audit project, auditors obtain clear audit objectives and scope from the project sponsor, so identifying which systems development cycle should be audited is not an issue. When the review is part of a financial audit project, auditors need to identify the application software to be audited and ascertain the audit scope, because organisations usually have many applications for different departments or business areas. In this case, the value of the assets controlled by the system(s), or the value of the transactions processed per day/week/month/year, should be considered when assessing materiality.
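As a simple illustration of the materiality guidance above, the following sketch (Python; the application data and the percentage threshold are illustrative assumptions, not prescribed values) derives a planning materiality figure from the value of transactions processed and flags the applications that exceed it:

# Illustrative only: thresholds and application data are assumptions for the example.
applications = {
    "payroll":        {"annual_transaction_value": 250_000_000},
    "grants_payment": {"annual_transaction_value": 40_000_000},
    "fleet_booking":  {"annual_transaction_value": 1_200_000},
}

PLANNING_MATERIALITY_RATE = 0.01   # e.g. 1% of total transaction value (assumed, not prescribed)

total_value = sum(app["annual_transaction_value"] for app in applications.values())
planning_materiality = total_value * PLANNING_MATERIALITY_RATE

# Applications whose transaction value exceeds planning materiality are candidates
# for an SDLC/application control review within the financial audit.
in_scope = [name for name, app in applications.items()
            if app["annual_transaction_value"] >= planning_materiality]

print(f"Planning materiality: {planning_materiality:,.0f}")
print("Applications selected for review:", ", ".join(in_scope))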

Identify the software life cycle
Adopting a suitable SDLC is important for developing project management. Each organisation should establish an SDLC methodology and assign responsibility for each phase of the cycle so that system design, development and maintenance can progress smoothly and accurately. The cycle starts with a perceived need and extends through feasibility study, design and development, testing, implementation, system acceptance and approval, post-implementation review, and maintenance of the application and systems software. Following each phase of this cycle ensures that the new or revised software meets the organisation's needs, that adequate internal controls are consistent with management's objectives, and that the application is properly implemented. The method can be adjusted to comply with project requirements. Auditors should interview the project managers and clarify the SDLC and the detailed process.

Identify the controls adopted in the SDLC
Auditors should ask management what risks have been identified and what control actions have been adopted at each stage of the SDLC. At the same time the auditors can perform some non-sampling control tests. Auditors should prepare a control list for each stage to support the review work, along the following lines.

SDLC Methodology
1. Determine the extent of the responsibilities of management, internal audit, users, quality assurance, and data processing during system design, development and maintenance.
2. Review SDLC work papers to determine if the appropriate levels of authorisation were obtained for each phase.

Requirement Analysis
3. Review and evaluate the procedures for performing a requirement analysis.
4. Review the requirement analysis for a recent project and determine if it conforms to standards.

Systems Design and Development
5. Review and evaluate the procedures for systems design and development.
6. Review design specification schedules, look for written evidence of approval, and determine if the design specifications comply with the standards.
7. Determine if an audit trail and programmed controls are incorporated in the design specifications of a recent project.
8. Review samples of source documents used for data entry, included in the SDLC working papers of a recently developed application, and determine if they are designed to facilitate the accurate gathering and entry of information.
9. Obtain and review programs to determine if they comply with the organisation's programming standards.

Testing Procedures
10. Review and evaluate the procedures for system and program testing.
11. Review documented testing procedures, test data and resulting output to determine if they appear to be comprehensive and if they follow the organisation's standards.
12. Review the adequacy of tests.

Implementation Procedures
13. Review and evaluate procedures for program promotion and implementation.
14. Review documentation of the program promotion procedure and determine if the standards are followed. Trace selected program and system software changes to the appropriate supporting records to determine if the changes have been properly approved.
15. Review documentation of the conversion/implementation of a newly developed application and determine if the organisation's implementation procedures were followed.

Post-implementation Review
16. Review and evaluate the procedures for performing post-implementation reviews.
17. Review program modifications, testing procedures and the preparation of supporting documentation to determine if the organisation's standards are being followed.

Maintenance of Applications
18. Review and evaluate the procedures for the maintenance of existing applications.

Maintenance of Systems Software
19. Review and evaluate the procedures for modifying systems software.
20. Review systems software modifications, testing procedures and the preparation of supporting documentation to determine if the organisation's standards are being followed.
21. Review and evaluate the documentation of in-house developed systems software and the features/options of proprietary systems software in use.

Preliminarily assess whether controls are in effect
Based on the evaluation of the information system development controls and the results of the non-sampling control tests, the auditors should make a preliminary assessment of the effectiveness of the controls. For each significant assertion in each significant account, the auditors should assess control risk at one of the following three levels:
- Low control risk: the auditors believe that controls will prevent or detect any aggregate misstatements that could occur in the assertion in excess of design materiality.
- Moderate control risk: the auditors believe that controls will be in effect, but not enough to prevent or detect all aggregate misstatements that could occur in the assertion in excess of design materiality.
- High control risk: the auditors believe that controls have essential defects and are unlikely to prevent or detect aggregate misstatements that could occur in the assertion in excess of design materiality.

During the check, the auditors should obtain sufficient:
- information to develop comments in the auditors' report or management letter; and
- evidence to support the preliminary assessment of the effectiveness of internal controls.
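To make the three risk levels concrete, the following sketch (Python; the mapping rule from test outcomes to a risk level is an illustrative assumption, not a prescribed formula) records the outcome of the control tests for an assertion and derives a preliminary control risk rating:

from dataclasses import dataclass

@dataclass
class ControlTest:
    control: str      # e.g. "authorisation of design changes"
    effective: bool   # outcome of the non-sampling control test

def preliminary_control_risk(tests: list) -> str:
    # Illustrative mapping from test outcomes to a control risk level.
    if not tests:
        return "High"                      # no evidence that controls operate
    failed = sum(1 for t in tests if not t.effective)
    failure_rate = failed / len(tests)
    if failure_rate == 0:
        return "Low"                       # controls expected to prevent or detect material misstatement
    if failure_rate <= 0.25:               # threshold is an assumption for the example
        return "Moderate"
    return "High"

tests = [
    ControlTest("design changes approved by steering committee", True),
    ControlTest("acceptance test results signed off by users", False),
    ControlTest("programmed edit checks documented in design", True),
]
print("Preliminary control risk:", preliminary_control_risk(tests))

Whatever rule is used, the point is that the rating and the evidence behind it are documented so the assessment can be supported later.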

Confirm the audit method
Auditors can collect evidence using methods such as interviews, questionnaires, tours of the business and reviews of documents. Auditors can also use computer assisted audit techniques and tools (CAATTs) to examine data flows, for example snapshot, tracing and mapping techniques, or to verify data and file integrity with parallel simulation, test data and integrated test facilities.
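As an illustration of one such CAATT, the sketch below (Python; the file layout, field names and the benefit calculation rule are assumptions made up for the example) applies the parallel simulation idea: the auditor independently recomputes an expected result from the input data and compares it with the output produced by the client's application:

import csv

def expected_benefit(record: dict) -> float:
    # Auditor's independent re-performance of the calculation rule
    # (the 40% rate and the cap are illustrative assumptions).
    assessed = float(record["assessed_income"])
    benefit = max(0.0, (30_000 - assessed) * 0.40)
    return round(min(benefit, 5_000.0), 2)

def parallel_simulation(input_file: str, output_file: str) -> list:
    # Compare the system's output with independently recalculated values.
    with open(output_file, newline="") as f:
        system_results = {row["claim_id"]: float(row["benefit_paid"]) for row in csv.DictReader(f)}

    exceptions = []
    with open(input_file, newline="") as f:
        for row in csv.DictReader(f):
            recalculated = expected_benefit(row)
            paid = system_results.get(row["claim_id"])
            if paid is None or abs(paid - recalculated) > 0.01:
                exceptions.append(f"claim {row['claim_id']}: system {paid}, auditor {recalculated}")
    return exceptions

# Usage: every exception is a difference to be investigated and documented in the working papers.
for line in parallel_simulation("claims_input.csv", "benefits_paid.csv"):
    print(line)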
2. Audit Implementation

Project initiation
A project is a management environment set up to deliver a business product in response to a specified business case. The aim of project initiation is to undertake the groundwork for the future management of a project and to obtain authority for it to proceed. Depending on the scale of the project, initiation might relate to a study (e.g. a feasibility study), to the entire project, or to a single stage of a project (e.g. the technical design stage); an initiation stage might therefore occur at a number of points in the SDLC. Definite financial authority is normally needed before the initiation stage is reached. The level of authority necessary to approve the expenditure will generally depend on the value of the project, the organisation's rules and delegated financial approvals. The results of this work should be presented to the approving authority (e.g. the IT Steering Committee) in a formal report, the Project Initiation Document (PID), for it to consider and authorise. The PID should include details of:
- legislative and/or business needs;
- project objectives and time-scales;
- management and organisation (Project Board, Project Manager, Stage Manager and the Quality Assurance Team);
- the scope of the project;
- major control points for all stages of the project;
- all project deliverables;
- technical and resource plans in sufficient detail to calculate and allocate staff, costs and resources for the project;
- quality criteria;
- risks to successful completion, and proposed ways of managing identified risks;
- any training needs which must be satisfied before the project commences;
- any options raised in earlier reports (e.g. the Feasibility Study Report);
- the continuing validity of the existing user requirements, and of any assumptions and recommendations;
- the identification and management of both business and security risks;
- any need to sustain or improve the level of service with limited or reduced staff numbers;
- obsolescence of existing hardware, software or communications.

Audit considerations
The purpose of the project, and its scale, should be taken into account when deciding exactly what needs to be reviewed at this stage of the SDLC. Generally speaking, the following factors should be considered:
- Has an appropriate authorising authority given formal approval for the project to proceed?
- When approving the project, were adequate alternative options considered during the feasibility study and presented to the approving authority? Was each option evaluated in terms of its business benefits, costs and strategic fit?
- Are the estimates of business benefits achievable and measurable, and have workable methods for measuring achievement been defined?
- Does the business case include the costs of staff training and of developing a business continuity plan?
- Is the estimated pay-back period longer than the likely economic working life of the system?
- Does the viability of the business case rely heavily on long term estimates? (The risks associated with long term measurement periods need to be included in the project risk assessment.)
- Does the cost/benefit analysis include an appropriate tolerance (e.g. 10%) to take account of under-estimates of costs and over-estimates of benefits?
- Have project risks been identified, measured and considered by the approving authority?
- Does the project clearly link with existing or future business needs?
- Does the project clearly define:
  - the end product(s) to be produced at each stage of the project?
  - time-scales and deadlines for project stages?
  - budgets and resource allocations for each project stage?
  - project organisation and responsibilities?
  - arrangements for monitoring and reporting progress?
  - the quality assurance criteria to be applied at each stage of the project?
  - arrangements for assessing risks to the successful completion of the project as it progresses?
- Has a full time and experienced Project Manager been appointed to manage the project?
- Do project management standards specify the stages at which products are to be produced and progress reviews are to take place?
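A minimal sketch of the pay-back and tolerance checks listed above (Python; the cost, benefit and tolerance figures are invented for illustration):

def payback_period_years(initial_cost: float, annual_net_benefit: float,
                         cost_tolerance: float = 0.10, benefit_tolerance: float = 0.10) -> float:
    # Pay-back period with a prudence margin: costs inflated, benefits deflated.
    adjusted_cost = initial_cost * (1 + cost_tolerance)
    adjusted_benefit = annual_net_benefit * (1 - benefit_tolerance)
    if adjusted_benefit <= 0:
        return float("inf")
    return adjusted_cost / adjusted_benefit

# Example figures (illustrative only)
payback = payback_period_years(initial_cost=12_000_000, annual_net_benefit=2_500_000)
economic_life_years = 7
print(f"Adjusted pay-back period: {payback:.1f} years")
if payback > economic_life_years:
    print("Warning: pay-back exceeds the likely economic working life of the system")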

Feasibility study
The Feasibility Study Report is the end product of the feasibility study. Its main objective is to determine whether a proposal is viable and to recommend suitable action where necessary. The study might be undertaken by the organisation's own staff (from both the IT department and the end users), by external consultants, or a mixture of both. The study recommends the best way forward and it will:
- define the problems or needs that require a solution;
- define the broad or major requirements of the required solution;
- determine if a computerised solution is required or desired;
- determine if an existing system can be enhanced to correct the situation;
- determine if a commercial product offers a solution to the problem;
- for each alternative, provide an estimate of costs, benefits, technical and business risks and time-scales, and an assessment of the option's fit or compliance with the organisation's IT strategy;
- identify a suitable solution to the problem, and seek authority to proceed with its development;
- recommend developing or acquiring a demonstration system.

Audit considerations
This is an analysis of the possibility and worthiness of undertaking the project, determining whether a proposal is viable and recommending a suitable action. Auditors should check:
- Is the feasibility analysis well documented and clear?
- Have departments involved in systems development and operation been consulted during the feasibility analysis, and have their recommendations been included?
- Does the feasibility analysis reflect any significant differences from the original objectives, boundaries and interfaces?
- Is the preliminary design in sufficient detail to support time and cost estimates, cost/benefit analysis and the impact study adequately?
- Does the preliminary design meet user requirements?
- Does the preliminary design reflect corporate standards?
- Has the project plan been prepared?
- Are the conclusions and recommendations supported by the feasibility analysis?
- Do the recommendations conform with corporate policies and practices?
- Has the feasibility analysis report been submitted to the management steering committee for action?
- Have responsible departments signed off the feasibility phase?
- Have the internal auditors prepared a milestone report with opinions and recommendations for the feasibility analysis phase?

Project planning
Project planning outlines plans for the remainder of the project, including the time-scale for implementation and the proposed management structure for project development and implementation. In the project initiation phase, systems are planned using a strategic approach: executives and others evaluate the effectiveness of systems in terms of meeting the entity's mission and objectives. This process includes general guidelines for system selection and systems budgeting. Management develops a written long-term plan for systems that is strategic in nature. The plan will change within a few months, but there is much evidence that such planning is conducive to achieving effective IT solutions over the long term. During this phase, several documents will be generated, including the long-term plan, policies for the selection of IT projects, and long-term and short-term IT budgets, as well as preliminary feasibility studies and project authorisations. Project proposals should have been documented when submitted to management, and a project schedule should exist that contains the approved projects. The presence of these documents illustrates a structured, formal approach to systems development and, as such, an effective planning system for IT projects and systems. It also demonstrates a formal manner of approving IT projects. IT auditors will verify the presence of the systems planning phase and take samples of the documents to verify the effectiveness of that system. The same audit procedures apply to all of the other seven phases and will therefore not be repeated in the narratives of those phases.

Audit considerations
The purpose of this step is to determine whether the project team established a project plan, whether the plan was followed, and whether any deviations, including extensions of the schedule, were documented. Auditors should check:
- Is the plan documented?
- Do the time frames appear realistic?
- Are the critical phases determined?
- Does the plan require management/user approval at specified points?
- Can the project be cancelled at the earliest points?
- Did the project plan include all the required phases of project development, including the test phase, training for users, conversion, and implementation?
- Does it cover all applications and areas concerned?
- Does it cover all interfaces to/from the application?

User Requirement Analysis
The purpose of this step is to understand the existing system and determine the users' information and performance requirements. In this phase, IT professionals gather the information requirements for the IT project; facts and samples to be used in the project are gathered primarily from end users. A system analyst or developer then processes the requirements, producing the User Requirement Specification (URS), which summarises the analysis of the IT project. The URS is about getting what you want: it is written in non-technical terms and consolidates all the material produced to date relating to the business functions of the required system. It is a detailed statement of the users' requirements and provides a basis for design work, for suppliers to submit proposals, and for acceptance testing criteria. A good specification should be ACCURATE: accurate, clear, concise, unambiguous, relevant, adequate, thorough and effective. The User Requirement Specification describes the:
- organisation's business;
- formal declaration of the users' requirements of the proposed system;
- existing system (including deficiencies);
- objectives of the proposed system;
- required functions (mandatory and optional);
- expected performance;
- constraints (e.g. environment, accommodation, locations of staff);
- project timetable;
- facilities required;
- IT security;
- acceptance testing criteria;
- documentation;
- training;
- maintenance.

If a decision has been made to buy a system (or indeed a service, such as facilities management), the URS should be sufficiently comprehensive to form the basis for advising potential suppliers in full of the organisation's needs, and to enable them to respond with detailed proposals of how they propose to satisfy those needs. As a rule, the URS should therefore be written in such a way that it does not constrain the options open to either designers or prospective suppliers to provide innovative solutions, by not specifying exactly what technical solutions are to be employed in meeting the users' requirements. It is important to ensure that the system's final owner signs off the User Requirement Specification to signify understanding and agreement before the project proceeds further.

Audit considerations
- Are efficiency and effectiveness addressed?
- Are user requirements well documented and clear?
- Is the responsible user executive specified?
- Have the user executives approved the requirements?
- Is a priority for implementation requested?
- Is the project included in the long- or short-range systems plan?
- Are the business objectives expressed clearly?
- Is the scope of the project well defined?
- Are claimed benefits supported?
- Is a solution, or solutions, to the business objectives proposed?
- Are the necessary audit functions included in the new system?
- Does the requirements study include potential needs for the future?
- Does the requirements study consider potential use in meeting common needs?
- Are existing systems to be replaced or interfaced clearly identified?
- Are existing systems to be replaced or interfaced documented adequately and accurately?
- Is the new system compatible with other applications/systems?
- Could the new system recover after failure?
- Have other departments involved in systems development and operation been consulted during preparation of the requirements, and have their recommendations been included?
- Do user requirements include security, controls and privacy measures?
- Do the benefits claimed appear to be reasonable?
- Do the user requirements appear to reflect actual needs?
- Are effective change and version control procedures in place?
- Does the procurement procedure help to ensure that the organisation obtains good value for money (VFM)?

Purchased software or systems development
When an organisation plans to use some kind of software, it must decide whether to purchase it on the open market or have it developed by its own programmers. Software products purchased on the open market are often credible and precisely tested, but not specialised for the organisation. Applications developed by the organisation's own programmers are mostly suitable for the organisation, but they are more likely to have security vulnerabilities and hidden failures. So both situations should be audited.

Purchased software
This topic may include the procurement process. Purchased software packages should be compatible with existing IT function operations, meet the requirements of the users, and be reliable enough to work satisfactorily under operational workloads and conditions. Software product acquisition procedures should follow the organisation's policies, and these products should be tested and reviewed before they are used and paid for.

Audit considerations
- Are there vendor evaluation criteria?
- Are there invitation procedures for bidding?
- Are there selection procedures for the vendor?
- Does the contract provide for product requirements as stated or modified by the organisation?
- Does the vendor warrant that the product will perform as specified in the contract?
- Does the contract indicate how performance against those product specifications will be measured?
- Does the vendor warrant that the product will meet the requirements in the organisation's operating environment?
- Does the contract specify the date on which the product will be operational?
- Does the contract indicate the level of performance for the product?
- Does the contract provide for a remedy to the organisation when the product fails to achieve the performance level?
- Does the contract provide for a system of controls (security controls, audit trail features, password controls, etc.) sufficient to detect reliability concerns?
- Does the software provide sufficient data validation routines to detect input errors?
- Does the contract provide for adequate controls to detect the loss of file integrity?
- Does the contract provide for adequate backup and recovery controls?
- Does the vendor provide manuals for systems analysts and programmers to understand the application?
- Are the operator manuals included in the contract?
- Does the contract provide for the specified user manuals?
- Does the contract provide for documentation to assist organisation personnel in tracking down and correcting problems?
- Does the contract specify the costs associated with performing maintenance?
- Is the length of the maintenance warranty period specified in the contract?
- Does the organisation have the right to have maintenance performed by someone other than the vendor?
- Have provisions been made for vendor personnel to have access to restricted areas to perform maintenance?
- Does the vendor require communication access to a vendor computer to perform maintenance?
- Does the contract provide for needed hardware upgrades?
- Does the contract provide for upgrading the application software in line with operating system upgrades?
- Does the contract state how the user will request changes to the software?
- Does the contract provide for the costs of enhancing the software at later dates?
- Has the vendor selection process been fair?
- Is the contract such that it would encourage the vendor to complete the contract?
- Does the selected vendor have a high probability of being in business for the duration of the contract?
- Have penalties been established in case the vendor fails to meet the contractual requirements?
- Do the terms of the contract conform to the organisation's contractual requirements?
- If the contract is terminated, have the termination provisions been specified?
- If the contract is a lease contract, does it provide for part of that lease to be applicable to a purchase?
- Can the organisation terminate the contract at any time?
- Does the contract specify the state or country whose laws govern the contract?
- Does the vendor provide for operator training?
- Is the location of training specified in the contract?
- Does the contract provide for training of data processing personnel in the use of the application?
- Does the contract provide for training of user personnel in preparing input and using system outputs?
- Can user personnel reasonably be expected to prepare the application input accurately and completely?
- Are the reports and manuals designed for the skill levels present in the organisation?
- Can the software be moved from the current hardware to the next most logical piece of hardware?
- Will the vendor continue support for a reasonable period of time?

If the contract is a lease contract, does it provide for part of that lease being applicable to a purchase? Can the organisation terminate the contract at any time? Does the contract specify the state or country whose laws govern the contract? Does the vendor provide for operator training? Is the location of training specified in the contract? Does the contract provide for training of data processing personnel in the use of the application? Does the contract provide for training of user personnel in preparing input and using system outputs? Can user personnel be reasonably expected to prepare the application input accurately and completely? Are the reports and manuals designed for the skill levels present in the organisation? Can the software be moved from the current hardware to the next most logical piece of hardware? Will the vendor continue support for a reasonable period of time?

Audit considerations
l

Are systems specifications documented well and clearly? Have significant changes to systems design been controlled and approved by cognisant authority? Has a detailed work plan been prepared for the systems specifications phase? Has the systems development methodology been used effectively during development of systems specifications? Has the project management and control system been used effectively? Has actual accomplishment during development of systems specifications been reasonably close to estimates? Are systems development team resources adequate to accomplish objectives? Have time and cost estimates, cost/ benefit analysis, and impact study been updated? Have significant changes to project scope been approved by the management steering committee? Do systems specifications reflect accurately approved functional design features and user requirements? Is it reasonable to expect the systems specifications to be implemented satisfactorily within user and data processing environments? Do the systems specifications provide adequately for internal controls and data security? Do the systems specifications provide adequately for requested audit features? Has an appropriate configuration for hardware and software been selected for implementation of the systems design and specifications? Have the hardware and software selected been reviewed for adequacy of internal controls, data security, integrity, and dependability? Do systems specifications provide adequately for corporate standards and practices? Have systems acceptance criteria been updated? Has the systems test plan been updated? Has data administration reviewed systems specifications? Has data security reviewed systems specifications? Has quality assurance reviewed systems specifications? Has data processing operations reviewed systems specifications? Have user departments reviewed systems specifications? Has the risk analysis been updated?
l

Have systems specifications been submitted to the management steering committee for action? Have responsible departments signed off the systems specifications? Have the internal auditors prepared a milestone report with opinions and recommendations for the systems specifications phase?

General design Before coding and developing, an organisation should have specific software design, which encompasses general design and detailed design. Designers should produce one or more models of what they see a system eventually looking like, with ideas from the analysis section either used or discarded. The general design translates requirement specifications to future software architecture.

Audit considerations
• Were users adequately consulted?
• Were alternative designs considered?
• Did the selected design meet the user requirement?
• Was an adequate financial audit trail provided?
• Were adequate controls provided?
• Was the design flexible enough to cope with change?
• Were hardware and software configurations specified?
• Did system security designs meet user needs?
• Did users sign off the system design?
Systems development
The systems development process is the translation of users' needs or goals into software products. The developed software should meet the expectations of the organisation's users and run reliably. The systems development process comprises several stages, including specifying user requirements, general design, detailed design, systems development, development testing, acceptance and so on.

Systems Requirement Specifications
Once the user requirement specifications have been approved, the project team starts designing the new system. The system design is meant to be a blueprint of the new IT system. The project team considers and evaluates alternative designs and selects the one that is expected to meet the user requirements most satisfactorily within the given constraints. Specifying user requirements encompasses those tasks that go into determining the needs or conditions to be met for a new or altered product, taking account of the possibly conflicting requirements of the various stakeholders. The output of this stage is the system design document (SDD). The SDD is submitted to top management for approval. The SDD includes the following:
• data flow in the information system;
• database structure;
• hardware and software configurations;
• the user interface, that is, how the users are expected to interact with the system;
• physical facilities required.

Detailed design
Detailed design is the step in which the software documentation is prepared for coding. In this stage, the organisation should prepare the detailed design and the technical software application requirements, and define the criteria for acceptance of those requirements. The organisation should have the requirements approved to ensure that they correspond to the high-level design, and should reassess them when significant technical or logical discrepancies occur during development or maintenance. The confidentiality, integrity and availability of the system should also be considered.


Audit considerations
• Is the systems design well documented and clear?
• Have significant changes to the preliminary design been controlled and approved by cognisant authority?
• Has a detailed work plan been prepared for the design phase?
• Has the systems development methodology (structured design techniques, prototyping, etc) been used effectively?
• Has the project management and control system been used effectively?
• Has actual accomplishment been reasonably close to estimates?
• Are systems development team resources adequate to accomplish objectives?
• Have time and cost estimates, cost/benefit analysis, and impact study been updated?
• Have significant changes to project scope been approved by the management steering committee?
• Do detailed functional design features accurately reflect approved detailed user requirements?
• Is it reasonable to expect the designed system to be implemented satisfactorily within the user and data processing environments?
• Does the design provide adequately for internal controls and data security?
• Does the design provide adequately for requested audit features?
• Have the requirements for hardware and systems software been developed, and can they be met satisfactorily with resources available or approved for installation?
• Does the design provide adequately for corporate standards and practices?
• Have systems design acceptance criteria been prepared? Has the systems test plan been prepared?
• Does the design provide adequately for incident management (offsite backup and recovery measures, etc)?
• Does the design provide adequately for capacity management?
• Has data administration reviewed the systems design? Has data security reviewed the systems design? Has quality assurance reviewed the systems design? Has data processing operations reviewed the systems design? Have cognisant user departments reviewed the systems design?
• Has a risk analysis been conducted?
• Is the input defined in detail? Is the output defined in detail? Is the functional logic defined in detail? Is the logical file structure defined in detail?
• Has the systems design been submitted to the management steering committee for action?
• Have responsible departments signed off the systems design?
• Have the internal auditors prepared a milestone report with opinions and recommendations for the design phase?

Systems development
Systems development transfers the design onto the physical system: building the technical architecture, purchasing the materials needed to build the system, and building the database and programs. IT specialists write the programs that will be used on the system. Several kinds of development methodology are used in systems development, such as Data-Oriented Development, Object-Oriented Development, Component-Based Development, Web-Based Development, Prototyping, Rapid Development and Agile Development. The auditor may consider the use of program coding standards; these standards enhance the quality of programming activities and future maintenance capabilities, as illustrated in the sketch below.

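As an illustration only (not taken from the article), the following minimal Python sketch shows the kind of conventions a program coding standard might require: descriptive names, a docstring stating purpose and inputs, and explicit input validation. The function and its rules are hypothetical.

# Hypothetical example of code written to a simple in-house coding standard:
# descriptive names, a documented purpose, and explicit validation with
# meaningful errors rather than silent failure.

def calculate_net_pay(gross_pay: float, tax_rate: float) -> float:
    """Return net pay for one pay period.

    gross_pay must be non-negative; tax_rate must be between 0 and 1.
    """
    if gross_pay < 0:
        raise ValueError("gross_pay must not be negative")
    if not 0 <= tax_rate <= 1:
        raise ValueError("tax_rate must be between 0 and 1")
    return round(gross_pay * (1 - tax_rate), 2)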
Audit considerations
• Has a detailed work plan been prepared for the systems development phase?
• Has the systems development methodology been used effectively during the systems development phase? Is the methodology used for systems development appropriate?
• Has the project management and control system (version control, incident/problem management capability, etc) been used effectively during the systems development phase?
• Has actual accomplishment during systems development been reasonably close to estimates?
• Have significant changes to systems specifications been controlled and approved by cognisant authority?
• Are systems development team resources adequate to accomplish the objectives of the systems development phase?
• Have time and cost estimates, cost/benefit analysis, impact study, and risk analysis been updated?
• Have significant changes to project scope been approved by the management steering committee?
• Are there version controls during the systems development phase? Is there incident/problem management capability?
• Do program specifications and user procedures accurately reflect approved systems specifications?
• Do program specifications and user procedures provide adequately for internal controls and data security?
• Do program specifications and user procedures provide adequately for requested audit features?
• Are data elements, including interfacing data sets, entered in the data dictionary?
• Have procedures and/or programs been developed and documented for loading data files, initialising data files, systems conversion, year-end processing, onsite backup and recovery, and offsite backup and recovery?
• Is there a detailed, written training plan?
• Is there a detailed, written test plan, including unit test, integrated test, systems test, pilot test, acceptance test and parallel test? Has a test coordinator been assigned?
• Are tests documented well? Have all tests been reviewed in detail by at least one level? Have the test results been reviewed by the internal auditors, and are they satisfied?
• Do products of the systems development phase conform with corporate standards and practices?
• Have products of the systems development phase been submitted to the management steering committee for action?
• Have responsible departments signed off products of the systems development phase?
• Have the internal auditors prepared a milestone report with opinions and recommendations for the systems development phase?

Development testing
Development testing generally comprises unit testing and integration testing. Unit testing is the testing of an individual program module in an isolated environment before combining it with other modules to form a program. The objective is to determine whether the module is capable of accepting specified input and producing the correct outputs. The programming team leader normally carries out unit testing. Program testing follows similar objectives, but with all the modules in place to form a complete program. Integration testing is the process of adding new programs to an evolving system. Testing needs to find errors in the interfaces between programs, discrepancies between the program functions performed and those specified, and any unspecified functions that are performed. Development testing may also be elaborated in more detail, specifically with regard to:
• recovery testing;
• security testing;
• stress testing;
• volume testing;
• performance testing.

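To make the idea of unit testing concrete, the sketch below (a hypothetical example, not part of the article) exercises a small pay-calculation function in isolation against specified inputs and expected outputs, using Python's standard unittest module. The function is defined inline so the example is self-contained.

import unittest

def calculate_net_pay(gross_pay, tax_rate):
    # Module under test, defined inline so the example is self-contained.
    if gross_pay < 0 or not 0 <= tax_rate <= 1:
        raise ValueError("invalid input")
    return round(gross_pay * (1 - tax_rate), 2)

class TestCalculateNetPay(unittest.TestCase):
    def test_typical_values(self):
        # 1000 gross at 20% tax should give 800 net.
        self.assertEqual(calculate_net_pay(1000.0, 0.20), 800.0)

    def test_rejects_negative_gross_pay(self):
        with self.assertRaises(ValueError):
            calculate_net_pay(-1.0, 0.20)

if __name__ == "__main__":
    unittest.main()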
25

Audit considerations
• Determine if the system is adequately tested prior to implementation, that the test plan includes all aspects of the new system, and that all unexpected results are thoroughly resolved.
• Has the test plan been documented, including:
  – unit test;
  – integrated test;
  – system test, including interfaces;
  – pilot test;
  – parallel test?
• Are the users included in the testing?
• Has testing been done at a proper testing facility?
• Has testing covered the system functionality requested by the audit function at the user requirements and design stages?
• Has software scanning been done to see if any unnecessary code resides in the system?
• Do the users have to sign off on the success of the test programme?
• Are all aspects of the system tested, as outlined in the detailed requirements?
• Have the test results been reviewed in detail?
• Is there a problem resolution procedure for those tests not meeting the expected results?
Parallel running, post-implementation review and maintenance
A post-implementation review (PIR) is the final stage of a system development project. Its aim is to establish the degree of success achieved by the development project, and whether any lessons can be applied to improving the organisation's development process. Parallel running and maintenance should also be taken into account. Auditors should pay attention to the adequacy of the system in meeting user requirements and to the evaluation of cost benefits or return-on-investment measurements.

Audit considerations
• Has all relevant data been transferred to the new system in a controlled manner?
• Which changeover approach has been used: parallel changeover, phased changeover or abrupt changeover?
• Are backup and recovery procedures documented, and have they been tested?
• Has the training programme been completed? Has any attempt been made to measure its effectiveness?
• Are user manuals clear, unambiguous and easy to understand? Do they incorporate all late changes to the system?
• Have responsibilities been assigned for carrying out clerical procedures and controls, and have they been tested?
• Has a System Administrator been appointed and trained? Are system administration activities documented?
• Has a documented plan been produced for reverting to the existing system should the need arise? Is it workable?
• Is there a system security policy? Has it been approved by the System Owner? Is it commensurate with the corporate IT Security Policy? Does it address all relevant risks? Has it been implemented?
• Has a documented business continuity plan been produced? Has it been tested?
• Are documented change and configuration management procedures in place?
• Has a monitoring process been established to determine the efficacy of the system?
• Has the system proved stable since go-live?
• Is a service level agreement in place for the system? Have all parties been satisfied with the level of service to date?
• Has the system integrated effectively with other systems?
• Has vendor support been adequate, effective and timely?
• Has security within the system been effective?
• If the system has been a joint effort between two or more vendors, have they worked effectively together?
Configuration management and change management
Configuration and change management help ensure an orderly process for the control of changes to project baseline products as they evolve through each project phase.

Audit considerations
• Are there procedures and policies related to change management?
• Have all changes from the original specification been properly identified, assessed/evaluated, reviewed, tested, implemented, logged and authorised?
• Have all changes to the application since go-live been logged and authorised?
• Have all changes before and after the implementation been tested? Have all changes been documented?
• Are changes which require more than a specified level of resource, or which are likely to cause significant slippage in the project timetable, referred to the Project Board for approval?
• Have all changes been reviewed for compliance with change and configuration management procedures, and authorised for release?

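As an illustration of how questions like these can be tested with a simple computer-aided audit technique, the Python sketch below compares a change log against an approvals list and flags changes with no recorded authorisation. It is a hypothetical example: the field names, identifiers and data are assumptions, not taken from any particular system.

# Hypothetical change-log extract and approvals list; field names are assumed.
change_log = [
    {"change_id": "CHG-001", "description": "Tax table update"},
    {"change_id": "CHG-002", "description": "New payslip report"},
    {"change_id": "CHG-003", "description": "Emergency fix to interface"},
]
approved_changes = {"CHG-001", "CHG-002"}

# Flag any logged change that has no matching approval record.
unauthorised = [c for c in change_log if c["change_id"] not in approved_changes]

for change in unauthorised:
    print(f"No approval on file for {change['change_id']}: {change['description']}")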

Acceptance
Acceptance is based on an analysis of the User Requirement Specification and any other acceptance criteria defined during design and development. The aim is to identify which requirements, facilities and functions should be tested, their relative importance, and the method of testing to be adopted for each. User acceptance testing and quality acceptance testing are both appropriate methods during acceptance.
Audit considerations
• Are the results of the test plan satisfactory?
• Has data processing operations conducted a systems turnover evaluation, and is the result satisfactory?
• Is the system documented adequately?
• Has an internal controls review been made? Is the level of internal controls satisfactory?
• Are the results of the parallel test satisfactory?
• Are the results of the backup and recovery tests satisfactory?
• Have responsible departments approved the system for implementation?
• Has the management steering committee approved the system for implementation?
• Have the internal auditors prepared a milestone report with opinions and recommendations for systems implementation?

Segregation of duty
In a manual system, separate persons should be responsible for initiating transactions, recording transactions, and maintaining custody of assets. As a basic control, segregation of duty prevents or detects errors and irregularities. In an IT system, however, the traditional notion of segregation of duties does not always apply, because the program performs functions that in a manual system would be considered incompatible. So segregation of duties must exist in a different form.
Audit considerations
• Is there clear segregation of duties among those who build, test and operate the system?
• Is there an implemented practice in the IT function to ensure that roles and responsibilities are properly exercised?
• Do all personnel have sufficient authority and resources to execute their roles and responsibilities?
• Does management make sure that personnel are performing only authorised duties relevant to their respective jobs and positions?

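A simple computer-aided check can support these considerations by scanning a user-role extract for conflicting combinations of duties. The Python sketch below is a hypothetical illustration: the user names, role names and conflict pairs are assumptions, not drawn from the article or any particular system.

# Hypothetical user-role extract; users and role names are assumptions.
user_roles = {
    "asmith": {"input_transactions", "authorise_transactions"},
    "bjones": {"input_transactions"},
    "cwong":  {"authorise_transactions"},
}

# Pairs of roles that one person should not hold together (assumed conflict matrix).
conflicting_pairs = [("input_transactions", "authorise_transactions")]

for user, roles in user_roles.items():
    for role_a, role_b in conflicting_pairs:
        if role_a in roles and role_b in roles:
            print(f"Segregation-of-duties conflict: {user} holds both {role_a} and {role_b}")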

Operation management

Input controls
Input controls are to ensure the authenticity, accuracy, completeness and timeliness of data entered into the system. A manual or operating procedure should exist for system users.

Audit considerations
• Transactions are from recognised sources. Determine the audit trail for documents prior to input to an application. Follow a document through to check that controls ensure input is only accepted from recognised sources, e.g. a valid timesheet.
• Transactions are explicitly authorised by either manual or electronic means. Establish how input is authorised.
• Request a list of all users of the system from the Systems Administrator. Ensure that all system users are valid employees and users.
• Password controls should be effective in restricting access. Ensure that access to the system requires a unique ID and password. Ideally the password should be alphanumeric and changed periodically.
• Input and authorisation functions are restricted and separated. Is there an effective segregation of duties between inputting and authorising transactions? Can the system produce a system security report which includes user access permissions?
• Input of parameters for processing and other standing data is strictly controlled. What controls exist to prevent accidental or malicious changes to fixed data parameters, e.g. tax calculations or pay rises? Check the correctness of key values and data within the system. Does the system record a history of standing data changes?
• Data should be subject to validation for completeness and accuracy at the input stage. Establish if key fields are validated, what the criteria are and who ensures this is carried out.
• There should be clear procedures for data items rejected on input. Ascertain how rejected inputs are treated and reported. From samples of rejected records, ensure that they are amended and successfully re-input.
• Clear timetables should exist for input and should be adhered to. Ascertain who is responsible for authorising the processing of jobs and what procedures are in place. Are they reviewed on a regular basis?
• Checks should be made to detect possible duplicate input records. Determine what checks for duplicate input are carried out by the application itself, and how they are reported and followed up. Determine the action taken and the reason for the duplicates arising.

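To illustrate the kind of validation and duplicate checking an auditor would expect an application to perform (or could re-perform as a computer-aided test), the Python sketch below checks a small set of records for out-of-range values and duplicate keys. It is a hypothetical example: the timesheet fields, key definition and permitted range are assumptions.

# Hypothetical timesheet records; field names and limits are assumptions.
records = [
    {"employee_id": "E100", "week": "2008-W45", "hours": 38},
    {"employee_id": "E101", "week": "2008-W45", "hours": 95},   # fails the range check
    {"employee_id": "E100", "week": "2008-W45", "hours": 38},   # duplicate key
]

seen_keys = set()
for record in records:
    key = (record["employee_id"], record["week"])
    if not 0 <= record["hours"] <= 80:
        print(f"Rejected {key}: hours {record['hours']} outside permitted range")
    elif key in seen_keys:
        print(f"Rejected {key}: possible duplicate input")
    else:
        seen_keys.add(key)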
Output controls
If output data has been classified according to the Security Policy/Plan, information can be classified as restricted, confidential, public, etc. Output controls should ensure that the processing of stored information is correct and appropriate to the circumstances.

Audit considerations
• Is there detailed documentation for output requirements? (Output includes reports as well as files.) Are all departments' concerns considered? Does the documentation include the following:
  – who is to receive the reports;
  – retention of reports and files; and
  – whether the audit trail is sufficient to identify who, when, how and why a user accessed a resource or amended an item?
• Does the output provide the users with the ability to control and ensure the completeness, accuracy and authorisation of the data?
• Do the reports include the ability to trace the originator of each transaction?
• Do the reports include control totals, if applicable? Is there a means to verify the information included on the reports?
• Have the routing and distribution procedures been established?

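Control totals are one practical way to verify report output against the underlying data. The Python sketch below re-computes record counts and value totals and compares them with the totals printed on a report; it is a hypothetical example, and the transaction values and total names are assumptions.

# Hypothetical payment transactions and the control totals printed on the report.
transactions = [125.00, 300.50, 74.25]
report_control_totals = {"record_count": 3, "total_value": 499.75}

# Recalculate the totals independently from the underlying transactions.
recalculated = {
    "record_count": len(transactions),
    "total_value": round(sum(transactions), 2),
}

if recalculated == report_control_totals:
    print("Control totals agree with the underlying transactions")
else:
    print(f"Control total exception: report {report_control_totals}, recalculated {recalculated}")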
Processing controls
Processing controls should ensure that the system's processing meets the objectives defined in the original proposal.

Audit considerations
• Were the expected benefits of the new system realised? Does the system perform as expected?
• If differences were found between expected and actual results, were they investigated? If inefficiencies were noted, were they documented?
• Are transactions and account balances properly recorded on the accounting systems, if applicable? (Which accounts will the transactions affect?)
• Have written procedures been prepared that explain all error codes and messages, and the corrective action for each?
• Does the application have provisions that prevent concurrent file/record updates? Is the file/record locked when one user is accessing it in update mode, and are appropriate error messages provided?
• Does the application have controls to check for data integrity?
• Can system-generated transactions be traced back to the source for reconciliation? Are there adequate audit trails for tracing purposes?

Maintenance management

Help desk management
The help desk should respond quickly to a user's problem, and transfer it or deal with it promptly, so that the problem has the least possible effect on the running system. Furthermore, it should analyse the problem to find the underlying reason, and then classify the problem and provide support for other work.

Audit considerations
• Can the help desk respond to users' problems, and transfer or deal with them quickly, so that problems have the least possible effect on the running system?
• Can the help desk analyse problems and find out the reasons?
• Can the help desk classify problems and provide support for other work?


Are logs periodically checked? Auditors should check:
• Does the application have the capability to successfully perform logging?
• Have all failed logon attempts been logged?
• Are all sensitive transactions and changes logged and an audit trail created?
• Does the audit trail contain who made the change, when it was made, and what was changed?
• Is the system administrator the only one who has access to change or delete these logs or audit trails?

Is there a periodic check on the application? Auditors should check:
• Have the processes and tools used to report, track, approve, fix and monitor changes to the application been determined?
• Does the code reside in a code library or a different tool when being changed? Has access to the code library been restricted?
• Have all requests for change been reviewed and authorised?
• Have all completed changes been reviewed for compliance with change and configuration management procedures, and authorised for release?

Security management

Physical security
Physical security is the protection of personnel, hardware, programs, networks and data from physical circumstances and events that could cause serious losses or damage to an enterprise, agency or institution. This includes protection from fire, natural disasters, burglary, theft, vandalism and terrorism. Controls should be adopted to minimise the risk from potential threats such as water, electrical supply, fire, etc.

Audit considerations
• Are items secured in some way? Are terminals in a locked, inaccessible area, kept away from the public and unauthorised users?
• Are there controls over the modems?
• Are diskettes stored in a fireproof cabinet?
• Are backups stored off site? Are backup materials stored in a secure tape library?

Logical security
Logical security consists of software safeguards for an organisation's systems, including user ID and password access, authentication, access rights and authority levels. These measures are to ensure that only authorised users are able to perform actions or access information in a network or a workstation.

Audit considerations
• Are there varying levels of security access for different types of transactions: inquiry only, update non-monetary transactions, update financial transactions, and add/delete records? Are the levels appropriately assigned to the user department staff?
• Who has the ability to change passwords? Does the user department or data security control the password assignments? If controlled by the user department, does the staff member also have authority to input transactions?
• Are passwords masked and encrypted, or are they stored in a visible file?
• Are there controls to log and monitor all sign-on attempts, both valid and invalid? Is all access to the system monitored?
• Does the application have controls in place to prevent unauthorised access to the system? Does the system lock out after a certain number of invalid sign-on attempts?
• Are both a password and a logon-id required for access to the system?
• Are there controls against modern threats such as viruses, Trojan horses, worms, logic bombs, denial of service attacks, etc?

Data security
Data security is the means of ensuring that data is kept safe from corruption and that access to it is suitably controlled.

Audit considerations
• Are different access levels, such as read only, update, delete and add, set to protect different data?
• Are different access levels set for different personnel?

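As a small illustration of how the logical security considerations above can be tested with a computer-aided technique, the Python sketch below counts failed sign-on attempts per user in a log extract and flags accounts that reached the lockout threshold, so the auditor can confirm the lockout actually occurred. It is a hypothetical example: the log fields, user names and threshold are assumptions.

from collections import Counter

# Hypothetical sign-on log extract; field names and the lockout threshold are assumptions.
signon_log = [
    {"user": "asmith", "result": "FAIL"},
    {"user": "asmith", "result": "FAIL"},
    {"user": "asmith", "result": "FAIL"},
    {"user": "asmith", "result": "FAIL"},
    {"user": "bjones", "result": "OK"},
]
lockout_threshold = 3  # assumed policy: lock the account after three failed attempts

failures = Counter(entry["user"] for entry in signon_log if entry["result"] == "FAIL")
for user, count in failures.items():
    if count >= lockout_threshold:
        print(f"{user}: {count} failed sign-ons - confirm the account was locked out")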
Business continuity and disaster recovery
With the formidable challenges and the growing complexity of IT systems that support business operations, an organisation must make comprehensive, managed efforts to prioritise key business processes, identify significant threats to normal operation, and plan mitigation strategies to ensure an effective and efficient organisational response to the challenges that surface during and after a crisis. This is called business continuity planning. Disaster recovery involves an immediate intervention to minimise further losses brought on by a disaster and to begin the process of recovery, including activities and programmes designed to restore critical business functions and return the organisation to an acceptable condition. Auditors should determine if there are adequate backup and recovery procedures developed for the system:
• Have procedures been developed for disaster recovery and restart for the system? Have the recovery/restart procedures been documented? Do the procedures include all foreseeable circumstances? Do the plans include recovery of hardware and software?
• Are there procedures for the periodic backup of the system? How often will backups be done? How long will the backups be kept? What media will the backups be done on (tape, disk, diskette)?
• Have the backup procedures been documented? How will the backups be labelled? Is the labelling consistent?

Audit Trail Reports
Auditors should determine if there are adequate and effective audit trails and reports designed in the system:
• Are detailed audit trail reports produced by the system automatically?
• Are audit reports listed on the report distribution schedule?
• Are the user departments satisfied with the information produced on the audit reports? Will the reports meet user and management needs? Will the reports satisfy audit needs?
• Can users input information which will alter the audit trail reports?
• Are the reports distributed to and reviewed by the appropriate people?

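A basic check on the completeness of an audit trail is to confirm that every entry identifies who made the change, when it was made, and what was changed. The Python sketch below is a hypothetical illustration of such a check; the field names and sample entries are assumptions, not taken from any particular system.

# Hypothetical audit-trail extract; field names are assumptions.
audit_trail = [
    {"user": "asmith", "timestamp": "2008-11-03 09:12", "field": "bank_account", "old": "123", "new": "456"},
    {"user": None,     "timestamp": "2008-11-04 14:30", "field": "pay_rate",     "old": "20",  "new": "25"},
]

# Each entry should identify who made the change, when, and what was changed.
required_fields = ("user", "timestamp", "field", "old", "new")
for entry in audit_trail:
    missing = [f for f in required_fields if not entry.get(f)]
    if missing:
        print(f"Incomplete audit trail entry {entry}: missing {missing}")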
Staff training
Insufficient training will increase the risk of the application being misused or the system being interrupted. The organisation should make sure that its staff are well trained, and that training materials are available and up to date. Auditors should interview the development and user department leaders, discuss the training processes, and obtain the latest training materials, user references and other support materials. To determine whether the IT staff and all users received proper training prior to implementation, auditors should review the following:
• whether there is a detailed Training Manual, User Manual and Technical Manual;
• in the case of outsourcing or in-house development, whether the above manuals are produced by the end of the development activity and have been delivered by the application provider, and whether the manuals have been checked and signed off;
• in the case of a ready-made application, whether all the different users received their manuals prior to implementation;
• whether all manuals were reviewed and signed off;
• whether all common users had been trained before deployment;
• whether Security Awareness Training has been included;
• whether there is special training for system maintenance staff and management.

3. Audit report
As a result of the audit work, auditors should produce a full report. Generally speaking, the report should include:
• General descriptions: in this section, auditors should state the audit objectives and scope, the methods used and the risk assessment.
• Report of the audit findings and their impacts: auditors should state the detailed findings of control weaknesses and their substantive impacts.
• Audit recommendations: it is preferable that auditors give recommendations for addressing control weaknesses.
A standardised format for writing audit reports should at least include the following sections:
• Executive Summary: restates the conclusion(s) for each audit objective and summarises significant findings and recommendations.
• Background: provides background information about the purpose/mission of the audited area. It should also indicate whether or not a follow-up on the previous audit is included.
• Audit Objectives: lists all audit objectives.
• Scope & Methodology: identifies audited activities, the time period, and the nature and extent of audit tests performed.
• Audit Results: this section should be restricted to documented factual statements which can be substantiated. Statements of opinion, assumption and conclusion should be avoided.
• Conclusions: the auditor's opinion or conclusion based on the objectives of the audit should be stated.
• Recommendations: the auditor's recommendations based on the results of the audit should be stated. Each recommendation should be preceded by a discussion of the finding and followed by management's response to the recommendation. If management's response is too lengthy to be included in the body of the report, a summary of the response should be included in the report with the complete response attached to the report (i.e. as an appendix).


Bibliography
1. The General Audit Guideline of the State Audit Bureau of Kuwait.
2. The High Tech Acquisitions Audit Manual of the State Audit Bureau of Kuwait.
3. COBIT 4.0.
4. System Audit, M Revathy Sriram, Tata McGraw-Hill, 2001.
5. Managing The Audit Function, Michael P. Cangemi and Tommie Singleton, John Wiley & Sons, 2003.
6. Auditing Hardware and Software Contracts, William E. Perry, EDP Auditors Foundation.
7. www.adm.uwaterloo.ca
8. Post Implementation Reviews, David M. Burbage, 2001.
9. System Development Project, Judy Condon, 1999.
10. www.da.ks.gov
11. www.asosai.org
12. COBIT 4.1, IT Governance Institute, www.ISACA.org, 2007.
13. Auditing Systems Development, INTOSAI IT Audit Committee, 2007.
14. IT Audit Guidelines, 6th ASOSAI Research Project, 2003.
15. Why IT projects fail, Steve Doughty, INTO IT 14, 2001.
16. A new approach to the auditing of system development projects in South Africa, Eddie Pelcher, INTO IT 16, 2002.
17. System Development Life Cycle (SDLC) Review, Document G23, www.isaca.org
18. System Development Life Cycle and IT Audits, Tommie W. Singleton, www.isaca.org, 2007.
19. Chinese ITIL White Paper, 2004.

Glossary
Risk: The potential that a given threat will exploit vulnerabilities of an asset or group of assets to cause loss and/or damage to the assets. It is usually measured by a combination of impact and probability of occurrence.
SDLC: System development life cycle. The phases deployed in the development or acquisition of a software system. Typical phases include the feasibility study, requirements study, requirements definition, general design, detailed design, programming, testing, installation and post-implementation review.
SDD: System design document.
CAATTs: Computer-aided audit techniques and tools.
URS: User Requirement Specification, which summarises the analysis of the IT project.


The INTOSAI information technology journal

it
www.intosaiitaudit.org

