Anomalies in projects
Transcript of the "Anomalies in projects" training
1
Training
HOW TO DETECT ANOMALIES
IN PROJECTS
Brussels, 16 November 2010
BOU2 – Conference Room: Copernic
2
Agenda
09h30 Introduction & Aims
09h45 Projects & Work Plans
11h00 Coffee
11h15 Applicants & Partners
11h45 People
12h15 Conclusions
12h30 End
3
Purpose of the training
• Raise awareness of the nature of possible fraud
• Create an appropriate reflex (we are not inspectors!)
• Give tips on spotting problems in projects as early as possible
• Point out indicators of potential fraud, information sources you can use, and ways to follow up
4
Be aware that:
• This course is about increasing the effectiveness of controls currently carried out, not imposing an additional layer of control
• The everyday actions of operational and financial actors and of the AO are the minimum necessary base for gaining assurance against errors and irregularities
• In this context, aware staff can add real value in spotting potential fraud indicators.
5
What attitude to adopt in everyday work?
Professional scepticism: a point of view which is neither positive nor negative with respect to the persons or entities applying for project funding
• Professional scepticism means recognizing throughout the process that there is a possibility of fraud and remaining alert for potential indicators
• It is not cost effective to control 100% of the projects (you cannot control everything): strike the right balance between fraud risk and control
6
Discovering anomalies
Indicators
– Signs which can show irregularity but are innocent most of the time
– Not "Red Flags" – nothing so concrete
Next Steps
– Further analysis by PO/PM
– In all cases the PO/PM must immediately contact the HoS and/or HoU!
– Other tasks to be done: note for the file, info plus, contact R2 (LO/Financial sector), audit request, note for OLAF
7
Before we start / Disclaimer
• The examples used in the training are real (anonymised) cases. They were detected during ex-post audits in the Commission.
• Most of the examples come from the research family, but the anomalies they show are universal in nature.
• Risk levels differ among the various policy areas. We propose some tools to identify possible anomalies that may be the result of fraud; some tools may be more appropriate than others for your daily work, and it is up to you to use them.
8
Anomalies in projects
OLAF has identified 3 groups of irregularities:
1) Overcharging of staff costs (poor or non-existent recording of time spent, false/falsified accounting records and/or statements) linked with projects or workpackages
2) Plagiarism of scientific documents (falsification of activity reports)
3) Fraudulent use of names of companies in order to obtain a grant (“Fake” partners, “Fake” people)
9
[Diagram: anomaly types arranged along two axes – from "by mistake" to "on purpose", and from "easy to see" to "hard to spot"]
• Subcontractor swap
• Fake partners
• Multiple claims, multiple projects
• Extensive plagiarism
• Subcontract to own company
• Overcharge on single project
• Costs claimed in wrong period
• Miscalculated hourly rates
• Fake people
"Fake" Projects & WPs
10
“Fake” Projects & WPs
[Diagram: document trail – project proposal → time sheets → cost statements → audit certificates]
11
Examples
The coordinator does not send the grant agreement to the other partners involved and does not transfer their part of the EU grant.
It over-executes tasks (which were assigned to other partners) and under-executes other tasks in which it is not interested.
P5 case
12
Examples
Charging for overtime; inconsistencies in the hours charged to the project and the number of employees working on it; mismatches between staff salary slips and the time sheets presented to the Commission.
The time sheets were related to a period outside the project’s scope. There were no time sheets for one person who was indicated as working on the project, but who denied any involvement.
OLAF case
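Checks of the kind described in this case can be sketched as a simple cross-check script. This is a minimal illustration only: the record format, field names, and the 12-hour daily threshold are assumptions, not an official schema or rule.

```python
from datetime import date

def check_timesheets(entries, project_start, project_end, staff_on_payroll):
    """Flag timesheet entries outside the project period, for people
    not on the partner's salary slips, or with implausible daily hours.
    Entry format (person, day, hours) is a hypothetical example."""
    flags = []
    for person, day, hours in entries:
        if not (project_start <= day <= project_end):
            flags.append((person, day, "outside project period"))
        if person not in staff_on_payroll:
            flags.append((person, day, "not on salary slips"))
        if hours > 12:  # arbitrary plausibility threshold
            flags.append((person, day, "implausible daily hours"))
    return flags

# Illustrative data, not from a real case
entries = [
    ("A. Smith", date(2010, 3, 1), 8),   # fine
    ("A. Smith", date(2008, 5, 2), 8),   # before the project started
    ("B. Ghost", date(2010, 3, 1), 8),   # never appears on salary slips
]
flags = check_timesheets(entries, date(2010, 1, 1), date(2011, 12, 31),
                         {"A. Smith"})
```

Such a script only surfaces candidates for follow-up; the actual judgement (and the contact with HoS/HoU) remains with the PO/PM.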
13
Examples
The report showed clear similarities with results funded under another grant agreement. The research work actually conducted did not correspond to the work plan. The results described in the report were identical to those reported for a previous project and related to results already funded by the Commission under another grant agreement (same coordinator, similar partnership).
According to the work plan of the grant agreements, the work on the two projects should have been very different (laboratory analysis / developing and testing equipment).
OLAF case
14
“Fake” Projects & WPs
• Overpayment for new work
– Vague WP descriptions
– Too many partners per task
– Subcontracts to related companies
• Work that has already been done
– Other EC or national programme
– Extension of existing project without (much) new work
– Plagiarism
15
Subcontracting
• Partner – Communications expert
– Peach has been selected by the consortium for its experience in communication campaigns, with special target on modern ICT-based communication methods ...
• Subcontracting
– The other subcontracting costs are related to communication activities and ICT technologies that cannot be carried out directly by the partners for the specific technical expertise required.
16
Subcontracting
• Peach key staff:
– Organisation: Peach, Name: Mr Kernel
– Organisation: Confiture, Name: Ms Jam
• Facebook + Google: Married
17
Fake projects/WPs - Indicators
• Too many partners per WP
• Partner's core business unrelated to work
• Multiple partners in dissemination, business planning, marketing
• Many staff declared – implausible WP complexity
• Deliverables of a very general nature
• Deliverables with indications of plagiarism
• High cost of generic output (project website)
• Subcontracting:
– large parts of workpackages
– to related companies
– to obscure companies
18
Information Sources
• Project information
– SAYKISS (Business Object)
– National agencies, national contact points
– Cordis (also via Google: "site:cordis.europa.eu")
– Google
• WP
– Proposal, description of tasks, reports, deliverables
19
Possible follow up
• Too many partners / task
– Organise a monitoring visit
– Try to clarify or reallocate responsibilities, sufficient detail, etc.
– Ask for reporting & review on each partner's work
– Check partners are real (see later slides)
• Non-original work
– Ask to position the project with respect to existing work
– Check for plagiarism
– Technical review, brief experts
20
Plagiarism - Definition
One definition of plagiarism is the fraudulent appropriation of another person's work without citing the source.
plagiarize
1.take and use (the thoughts, writings, inventions, etc. of another person) as one’s own.
(Concise Oxford Dictionary)
21
Plagiarism - Indicators
• Generic content, not linked to project
• Marked style changes
• Strange / nonsensical text
• Figures / text mismatch
• Over-familiar content
22
Strange text
Original: Usability guide for mobile games
The most comprehensive definition of playability states: The degree to which a game is fun to use, with an emphasis on the interaction style and plot-quality of the game.
Deliverable: application platform req.
The most comprehensive definition of functionality states: The degree to which an application is fun to use, with an emphasis on the interaction style and plot-quality of the application
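Near-verbatim text with a few substituted terms, as in this example, is easy to confirm mechanically. For instance, Python's standard-library difflib gives a character-level similarity ratio between the two passages (values near 1.0 indicate heavy copying); this is a quick triage aid, not proof of plagiarism.

```python
import difflib

original = ("The most comprehensive definition of playability states: "
            "The degree to which a game is fun to use, with an emphasis on "
            "the interaction style and plot-quality of the game.")
deliverable = ("The most comprehensive definition of functionality states: "
               "The degree to which an application is fun to use, with an "
               "emphasis on the interaction style and plot-quality of the "
               "application")

# SequenceMatcher.ratio() returns a value in [0, 1]
ratio = difflib.SequenceMatcher(None, original, deliverable).ratio()
print(f"similarity: {ratio:.2f}")
```

Independently written texts on the same topic typically score far lower; a very high ratio is a reason to look at the direction of copying and at missing references.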
23
Inconsistent figures
24
Inconsistent figures
[Figure 2: chart totalling 60 widgets]
Deliverable: a total of 70 widgets were surveyed and the results are presented in Figure 2.
25
Plagiarism - Examples
Main output of an LLP project was a Manual for teachers and pupils on "Learning English with Innovative Learning and Thinking Techniques"
26
Establishing Plagiarism
1. Show copying has happened
2. Show direction of copying
3. Show lack of references
27
Plagiarism - tools
• Google / Bing search
– "phrase in quotes"
• Doc Cop (http://www.doccop.com)
– Up to 500 words
– Report by email
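The quoted-phrase search can be semi-automated by slicing a deliverable into word n-grams and wrapping each in quotes, ready to paste into a search box. A rough sketch; the n-gram length and stride are arbitrary choices, and the function name is illustrative:

```python
import re

def quoted_queries(text, ngram=8, step=40):
    """Slice a text into word n-grams and wrap each in double quotes,
    producing exact-phrase search queries (simple heuristic)."""
    words = re.findall(r"[\w'-]+", text)
    return ['"%s"' % " ".join(words[i:i + ngram])
            for i in range(0, max(1, len(words) - ngram + 1), step)]

sample = ("The degree to which a game is fun to use, with an emphasis "
          "on the interaction style and plot-quality of the game.")
for q in quoted_queries(sample, ngram=6, step=6):
    print(q)
```

Distinctive, jargon-heavy phrases make better queries than generic ones; a hit on several independent phrases from the same deliverable is a much stronger indicator than a single match.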
28
Doc Cop
29
Plagiarism follow up
Assess
– likelihood of expert confusion
– extent of copying
– explicit claims of originality
– repetitive plagiarism
Responsibility & resources
Appropriate action
– rejection for resubmission ... terminating participation
– AO decision
30
“Fake” Partners
• Nonexistent companies
• Existing companies, different purpose
• Existing companies, unaware
31
Partner’s website
• Does the website of the partner exist?
• Does the project fit the partner's core activities?
• Does the website give contact information – and does it match the DoW?
• Is the website registered by the partner?
– www.domaintools.com / whois
– www.networksolutions.com
32
Company registry, phone
• List of company registration websites
– http://www.rba.co.uk/sources/registers.htm
– Don't always get full information
– Coface (more details, not free)
• Infobel, ixquick
– Cross-check the phone number given by the partner with the phone number in the yellow/white pages
– Reverse search on the phone number given by the partner where available, yellow & white pages
33
Examples
34
Examples
35
Earth Match – partner in SOLARSYS
36
Emsoft.com
37
Earthmatch.com.mt
38
Examples
39
Examples
A network of companies organised projects to get funding without actually doing any work (several EU programmes). A company presented project proposals, acting as intermediary for third companies and including in its team consultants who were unaware of “their” participation.
Projects were presented by different legal entities situated at the same addresses in different countries.
OLAF case
40
Examples
Multiple financing for similar projects run by independent but connected companies, several of them based all over Europe. The companies participated as independent members of consortia without giving any indication that they belonged to the same group.
Certain persons were listed as either owner or director in a number of the companies concerned. A large number of people appeared to manage or work for several of the listed companies, often combining roles as employee in one company and in-house expert in others. Some of the companies existed only on paper.
OLAF case
41
Tools - internet search
• Search for the company in Google
– Not reassuring if nothing found
• Translation tools
– http://translate.google.com
– babelfish.yahoo.com
– www.systran.fr
– ECMT, EC Machine Translation Service (https://webgate.ec.europa.eu/mt/ecmt/Menu.do?method=login)
– http://eurovoc.europa.eu/ (the EU's multilingual thesaurus)
42
Address Checks
• Google Maps / Streetview
• Bing Aerial Photos
43
120 Organisations, 1 Address:
44
Indicators
• Email address not in company domain
• Phone number = fax number
• Phone number = gsm number
• Website registered by another company
• Website or phone numbers in another country
• Corporate website
– without contact coordinates
– "under construction"
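The indicator list above lends itself to a first-pass mechanical screen. A minimal sketch, assuming a hypothetical partner record format (the field names, the `.example` domain, and the Belgian mobile prefix are all illustrative assumptions):

```python
MOBILE_PREFIXES = ("+32 4",)  # assumption: Belgian GSM prefix; extend per country

def partner_flags(p):
    """Return which of the listed indicators apply to a partner record.
    The record fields are illustrative, not an official schema."""
    flags = []
    domain = p["email"].split("@")[-1].lower()
    if domain not in p["website"].lower():
        flags.append("email address not in company domain")
    if p["phone"] == p["fax"]:
        flags.append("phone number = fax number")
    if p["phone"].startswith(MOBILE_PREFIXES):
        flags.append("phone number = gsm number")
    if p["registrant"].lower() != p["name"].lower():
        flags.append("website registered by another company")
    return flags

# Illustrative partner, loosely based on the Peach example earlier
peach = {"name": "Peach", "website": "www.peach.example",
         "email": "kernel@gmail.com", "phone": "+32 470 11 22 33",
         "fax": "+32 470 11 22 33", "registrant": "Confiture"}
flags = partner_flags(peach)
```

Each flag is only an indicator, most of which are innocent on their own; several flags together are what should trigger the follow-up steps on the next slides.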
45
Information Sources
• Project proposal, DoW, project reports
• SAYKISS
• Company website
• Company registration sites
• www.braintrack.com (web directory for university and college search)
• Phone / fax numbers
46
Information Sources: SAYKISS
47
Information Sources: SAYKISS
48
Information Sources: SAYKISS
49
Information Sources: SAYKISS
50
Possible follow up
• Check for other indicators
• Request further information
• Assess
– No action needed
– Reinforced monitoring / targeted review
– That partner to be present at meetings
– Consult R2 (LO/Financial sector)
51
“Fake” People
• Non-existent people
• Existing, but
– not relevant
– not employed
– not aware of project
• People in multiple roles / companies / projects
52
Neuron – partner in BRAIN
53
Neuron: Key staff
• DoW description for Niklas Synapse
– Computer Science degree
– Experienced ICT researcher
– etc.
54
Neuron – Key staff
55
Neuron – Key staff
56
iThink - partner in BRAIN
57
Existing person – not aware
58
Example
• Fake participants in an event organised by the beneficiary (town-twinning project)
• See list of participants
59
Indicators
• Key staff
– not found on internet
– from top management
– always the same names
– appear for multiple companies
– live in a different country from the company
• CV mismatch – LinkedIn / DoW
• Phone number = gsm number
• No business cards
• Non-company email (yahoo, gmail etc.)
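Two of the indicators above ("always the same names", "appear for multiple companies") can be checked mechanically across a consortium. A minimal sketch, assuming a hypothetical mapping from partner to its declared key staff (the names below echo the anonymised examples in these slides):

```python
from collections import defaultdict

def recurring_names(key_staff):
    """Map each declared name to the companies listing it and return
    the names that appear for more than one partner."""
    seen = defaultdict(set)
    for company, names in key_staff.items():
        for name in names:
            seen[name.strip().lower()].add(company)
    return {n: sorted(c) for n, c in seen.items() if len(c) > 1}

# Illustrative consortium data
staff = {"Neuron": ["Niklas Synapse", "A. Axon"],
         "iThink": ["Niklas Synapse"],
         "Peach":  ["Mr Kernel"]}
repeats = recurring_names(staff)
```

A name shared across partners is not irregular in itself (secondments and joint appointments exist), but it justifies a closer look at CVs and employment.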
60
Information Sources
• Names
– Proposal, DoW
– IAR, PMR (ask for names), deliverable authors
– Business cards, presence lists (phone & email)
• General information on people
– Google, 123people
– Microsoft Academic Research (http://academic.research.microsoft.com/)
• CVs – LinkedIn
• Phone/Fax – Infobel
61
Possible follow up
• If you see an indicator
– Check information sources to see if the person exists, is an ICT researcher, and is linked to the partner
– Request further information: justification of role, fixed phone number, ...
• Assess
– No action needed
– Reinforced monitoring, targeted review
– Consult R2 (LO/Financial sector)
62
Conclusions
• The aims of this training were to:
• Help you to spot problems in projects as early as possible
• Show you indicators and information sources
• Gather and share best practice
63
Conclusions
• There are irregularities in our projects
• Most projects and partners are trying to do genuine work – only a few cases are irregular.
• We would like to hear from you!
– Suggestions of information sources on data mining
– Next steps & procedures
– Requirements for future IT tools