ICASA and USSASA Predetermined Objectives – 2013/14 March 2013 Portfolio committee.

Page 1

ICASA and USSASA Predetermined Objectives – 2013/14
March 2013
Portfolio Committee

Page 2

Reputation promise/mission

The Auditor-General of South Africa has a constitutional mandate and, as the Supreme Audit Institution (SAI) of South Africa, it exists to strengthen our country’s democracy by enabling oversight, accountability and governance in the public sector through auditing, thereby building public confidence.

Page 3

Independent Communication Authority of South Africa (ICASA)

Page 4

OVERVIEW

• A review of the draft 2013/14 Annual Performance Plan and the related draft Strategic Plan was performed (audits of these plans are performed only on departments; for public entities a review is performed).

• Our focus was to assess the usefulness of the information contained in the plans in terms of the:

– measurability and relevance of indicators (well-defined, verifiable, relevant)

– measurability of targets (specific, measurable, time-bound, relevant)

• Findings and discussions

• Conclusion in management report
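In effect, the review scores each indicator and target against a checklist and reports an error rate per criterion. A minimal sketch of that calculation in Python, with hypothetical indicator names and criterion flags (not taken from the actual plans):

```python
# Illustrative error-rate calculation: the share of items that fail a
# given measurability criterion. Indicator names and flags are made up.
def error_rate(items, criterion):
    """Fraction of items that do NOT meet the given criterion."""
    failures = sum(1 for item in items if not item[criterion])
    return failures / len(items)

indicators = [
    {"name": "Licences issued within 60 days", "well_defined": True, "verifiable": True},
    {"name": "Improved stakeholder engagement", "well_defined": False, "verifiable": True},
    {"name": "Spectrum assignments completed", "well_defined": True, "verifiable": False},
    {"name": "Complaints resolved", "well_defined": False, "verifiable": False},
]

print(f"{error_rate(indicators, 'well_defined'):.0%} of indicators were not well-defined")
print(f"{error_rate(indicators, 'verifiable'):.0%} of indicators were not verifiable")
```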

Page 5

Findings – Measurability of Indicators

WELL-DEFINED

Definition:
The indicator needs to have a clear, unambiguous definition so that data will be collected consistently and be easy to understand and use (supported by Appendix E).

Error rate:
30% of indicators were not well-defined.

Page 6

Findings – Measurability of Indicators (continued)

VERIFIABLE

Definition:
It must be possible to validate the processes and systems that produce the indicator (supported by Appendix E).

Error rate:

30% of indicators were not verifiable.

Page 7

Findings – Measurability of Targets

SPECIFIC

Definition:
The nature and the required level of performance can be clearly identified.

Error rate:
12% of targets were not specific.

Page 8

Findings – Measurability of targets (continued)

MEASURABLE

Definition:

The required performance can be measured.

Error rate:

12% of targets were not measurable.

Page 9

Findings – Measurability of targets (continued)

TIME BOUND

Definition:

The time period or deadline for delivery is specified.

Error rate:

0% of targets were not time bound.

Page 10

Findings – Relevance

RELEVANCE

Definition:

Indicators: The indicator must relate logically and directly to an aspect of the institution’s mandate and the realization of strategic goals and objectives.

Targets: The required performance is linked to the achievement of a goal.

Error rate:

0% of indicators and related targets were not relevant.

Page 11

TECHNICAL INDICATOR DESCRIPTIONS

• The Framework for Strategic Plans and Annual Performance Plans, issued by National Treasury (NT) and enforced by Instruction Note 33, requires all departments, constitutional institutions and schedule 3A and 3C public entities to compile technical indicator descriptions (see the Annexure E extract in the slides) for all performance indicators included in their plans, effective from the 2012/13 reporting period.

• These technical indicator descriptions must be published on the website of the department/constitutional institution /public entity.

• ICASA had not compiled any technical indicator descriptions for either the 2012/13 year or the 2013/14 year (at the time of our review).
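A technical indicator description is essentially a structured record per indicator. The field names below follow the template commonly associated with the NT framework, but the example values are entirely hypothetical:

```python
# Hypothetical technical indicator description, structured along the lines
# of the National Treasury framework's template. All values are illustrative,
# not drawn from ICASA's actual plans.
technical_indicator_description = {
    "indicator_title": "Number of broadcasting licences issued",
    "short_definition": "Count of new broadcasting licences issued in the year",
    "purpose_importance": "Tracks delivery on the licensing mandate",
    "source_of_data": "Licensing register",
    "method_of_calculation": "Simple count of licences issued",
    "data_limitations": "Dependent on completeness of the register",
    "type_of_indicator": "Output",
    "calculation_type": "Cumulative",
    "reporting_cycle": "Quarterly",
    "new_indicator": False,
    "desired_performance": "Meet or exceed the annual target",
    "indicator_responsibility": "Licensing division",
}

print(technical_indicator_description["indicator_title"])
```

A record like this is what gives auditors a fixed definition and calculation method to verify reported results against.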

Page 12

CONCLUSION

• The errors identified relating to the 2013/14 plan:

– Targets not specific: below the threshold for qualification;

– Targets not measurable: below the threshold for qualification;

– Indicators not well-defined: above the threshold for qualification;

– Indicators not verifiable: above the threshold for qualification.

• Thresholds on errors identified:

– 0% to 19%: no opinion

– 20% to 50%: qualified

– above 50%: adverse or disclaimer
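The threshold rule above maps an error rate to an audit outcome band. A simple sketch of that mapping (the handling of exact boundary values is an assumption for illustration; the slide does not specify it):

```python
def audit_outcome(error_rate: float) -> str:
    """Map an error rate (0.0-1.0) to the outcome bands on this slide.

    Boundary handling (e.g. exactly 0.50) is assumed, not specified.
    """
    if error_rate > 0.50:
        return "adverse or disclaimer"
    if error_rate >= 0.20:
        return "qualified"
    return "no opinion"

# ICASA 2013/14: 30% of indicators not well-defined falls in the qualified band.
print(audit_outcome(0.30))  # qualified
```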

Page 13

RECOMMENDATION

• Develop and implement standard operating procedures.

• Clearly define roles and responsibilities and link them to individual performance contracts.

• Establish a forum for the portfolio to share insights and follow a consistent approach.

• Continue to involve the AGSA in the planning process, allowing sufficient time for the AGSA to review the draft plans.

Page 14

Universal Service and Access Agency of South Africa (USAASA)

Page 15

Findings – Measurability of Indicators

WELL-DEFINED

Definition:
The indicator needs to have a clear, unambiguous definition so that data will be collected consistently and be easy to understand and use (supported by Appendix E).

Error rate:

32% of indicators were not well-defined.

Page 16

Findings – Measurability of Indicators (continued)

VERIFIABLE

Definition:
It must be possible to validate the processes and systems that produce the indicator (supported by Appendix E).

Error rate:

0% of indicators were not verifiable.

Page 17

Findings – Measurability of Targets

SPECIFIC

Definition:

The nature and the required level of performance can be clearly identified.

Error rate:

24% of targets were not specific.

Page 18

Findings – Measurability of targets (continued)

MEASURABLE

Definition:

The required performance can be measured.

Error rate:

24% of targets were not measurable.

Page 19

Findings – Measurability of targets (continued)

TIME BOUND

Definition:

The time period or deadline for delivery is specified.

Error rate:

0% of targets were not time bound.

Page 20

Findings – Relevance

RELEVANCE

Definition:

Indicators: The indicator must relate logically and directly to an aspect of the institution’s mandate and the realization of strategic goals and objectives.

Targets: The required performance is linked to the achievement of a goal.

Error rate:

0% of indicators and related targets were not relevant.

Page 21

CONCLUSION

• The errors identified are still under discussion with management at USAASA.

• Adjustments to improve the quality of the data included in the planning documents are likely to be made by the entity.

Page 22

RECOMMENDATION

• Planning documents should be independently reviewed within the entity, and the reviews should ensure adherence to the frameworks issued by National Treasury.

• Oversight committees (e.g. the audit committee) should continuously monitor the compliance and quality of planning documents.

• The department/entity should develop technical indicator descriptions to ensure a consistent understanding of the indicators and of the process to reach the objectives.

Page 23

THANK YOU