NCIIT 12 Proceedings


Proceedings of the DRDO-Sponsored
Third National Conference on Innovations in Information Technology (NCIIT 2012)
24th & 25th February 2012

Organized by
Department of Computer Science and Engineering
Bannari Amman Institute of Technology
Sathyamangalam – 638 401, Erode District, Tamil Nadu


ORGANIZING COMMITTEE

PATRON

Dr S V Balasubramaniam

Chairman, BIT

CHIEF PATRON

Dr S K Sundararaman

Director, BIT

CHAIRMEN

Dr A M Natarajan

Chief Executive, BIT

Dr A Shanmugam

Principal, BIT

CONVENER

Dr. P. Thangaraj

Professor & Head / CSE, BIT

ORGANIZING SECRETARY

Dr. K. Premalatha

Professor / CSE, BIT

COORDINATOR

Mr. J Vijay Franklin

Assistant Professor / CSE, BIT


TECHNICAL COMMITTEE

Dr A Kannan, Anna University, Chennai
Dr S Kanmani, Pondicherry Engg. College, Pondicherry
Dr K Chandrasekaran, NIT, Surathkal
Dr. Sudha Sadhasivam, PSG College of Tech, Coimbatore
Dr. M.L. Valarmathi, GCT, Coimbatore
Dr. N. Shanthi, K.S. Rangasamy College of Technology, Tiruchengode
Mr. N. Sundersan, CTS, Bangalore
Dr. R. Thangarajan, Kongu Engineering College, Erode
Dr. R.R. Rajalaxmi, Kongu Engineering College, Erode
Dr. B. Sathyabama, Sona College of Technology, Salem
Dr. B. Nagarajan, Bannari Amman Institute of Technology, Sathyamangalam
Dr. Amitabh Wahi, Bannari Amman Institute of Technology, Sathyamangalam


About the Conference

The Third National Conference on Innovations in Information Technology will provide an excellent national forum for sharing knowledge and results in the theory, methodology, applications and innovations of Information Technology. The conference invites significant contributions to all major fields of Computer Science and Information Technology, in both their theoretical and practical aspects. The aim of the conference is to bring together academic scientists, industry researchers and research scholars to exchange and share their experiences and research results on all aspects of Information Technology, and to discuss the practical challenges encountered and the solutions adopted.

About Bannari Amman Institute of Technology

The Bannari Amman Institute of Technology (BIT) is the fruit of the dedicated efforts of the Bannari Amman Group, a leading corporate house in South India, under the dynamic chairmanship of the visionary Dr. S. V. Balasubramaniam, to establish a center of excellence in Engineering & Technology. It is an impressive campus, situated in serene surroundings at the foothills of the Nilgiris. The institute is affiliated to Anna University, Coimbatore and approved by AICTE, New Delhi. The institute offers several undergraduate and postgraduate programmes in Engineering and Technology. The institution is ISO 9001:2000 certified for its quality education, and most of the courses are accredited by the National Board of Accreditation (NBA), AICTE.

About the Department of CSE

The Department of Computer Science & Engineering is a unique center of BIT established in 1996. It offers a 4-year B.E. (CSE) programme and a 2-year M.E. (CSE) programme. The department has dedicated and specialized faculty members in different areas of Computer Science & Engineering, with rich experience in academics, industry and research. The department has well-equipped and spacious laboratories with modern computing equipment.


LIST OF PAPERS

Venue: CSE Smart Hall I & II

S.NO.  TITLE OF THE PAPER – PAGE NO.

1. A Secure And Dependable Storage With Multiple Cloud Servers – 1
2. A Survey: Job Scheduling Algorithms In Grid Environment – 1
3. An Efficient And Secure Nonlinear Programming Outsourcing in Cloud Computing – 2
4. Caching Scheme For Distributed Data Sharing In Peer-To-Peer Environments – 2
5. Enabling Public Auditability And Data Dynamics For Storage Security In Cloud Computing – 3
6. Enhancing Performance Of Home Network Using Middleware – 3
7. Improving Hierarchical Load Balanced Algorithm For Job Scheduling In A Grid Environment – 4
8. Parallel Scheduling And Security Mechanism For Heterogeneous Distributed Systems – 4
9. Performance Analysis And Optimization Of Multi-Cloud Computing For Loosely Coupled MTC Applications – 5
10. Resume Analyzer Website For IT Companies Using Cloud Computing – 5
11. A Study Of Congestion Control Algorithm In Wireless Sensor Networks – 6
12. Accurate Tracking In Wireless Sensor Network By Sensor Collaboration – 6
13. Achieving High-Throughput Multicast Routing Based On Rateguard In Wireless Mesh Networks – 7
14. Analysis Of Contention Based Method For MAC Layer In Wireless Networks – 7
15. Channel Allocation For Uncoordinated WLANs – 8
16. Cognitive Topology Control In CR-MANETs – 8
17. Constructing Minimum Size Wireless Sensor Networks To Provide Critical Square Grid Coverage Using Steiner Tree Based Critical Grid Coverage Algorithm – 9
18. Controlled Sink Mobility For Prolonging Wireless Sensor Networks Lifetime – 9
19. Cooperative Multi-Hop Transmission In Wireless Networks – 10
20. Customized QoS Metric Based On Data Traffic In Wireless Sensor Network Routing – 10
21. Secure And Efficient Retrieval Of Data In Cloud Computing – 11
22. Workflow Optimization For Allocation Of Jobs In Grid Environments – 11
23. Efficient Resource Selection And Load Balancing Algorithm Based On The Scheduling Of Parallel Applications – 12
24. Novel Method For Throughput Prediction Of Network Service And Transfer Of Data Packets – 12
25. Efficient Clustering And Discovery Of Resources In Wide-Area Distributed Computational Grids – 13
26. Concert Measure Of Network I/O Workload In Virtualized Datacenter Using Paravirtualization – 13
27. Classification And Evaluation Of Grid Resources Based On Grid Tasks For Minimizing Overhead Computation Time – 14
28. Grid Technology For Neuroscience – 14
29. Automatic Reconfigurable System For Run-Time Application – 15
30. Secured And Efficient Outsourcing Of Linear Programming In Cloud Computing – 15
31. Fairness Scheduler With Hierarchical Classification For Resource Management In Grid Environment – 16
32. A New Wireless Web Access For Web Surfing Based On Cloud Computing – 16
33. User Movement And Service Prediction Scheme For Mobile Environment – 17
34. Efficient Data Broadcasting In Underwater Wireless Communication Networks – 17
35. Efficient Data Collection Over Multitraffic Flow Using The Pass Node Deployment – 18
36. Energy Efficient On-Demand Routing Protocol For Local Monitoring In Wireless Sensor Networks – 18
37. Enhanced Medium Access Control Using Cross-Layer Link Asymmetry Interaction For Wireless Mesh Network – 19
38. Multiuser Detection And Collision Avoidance In Wireless Network – 19
39. Supporting Efficient And Scalable Multicasting Over Mobile Ad Hoc Networks – 20
40. Improving ADCC For Home Automation Networks In High Volume Sensed Data – 21
41. A Novel Approach for Network Security Using Data Mining – 21
42. A Secure Authentication for Blocking Misbehaving Access using Ticket Based Method – 22
43. A Secure Key Transfer In Decentralized Secure Group Communication By Using MDS Codes – 22
44. Adaptive Audio Steganography Based On LSBMR Algorithm – 23
45. An Architecture To Provide Authentication In Anonymous Networks – 23
46. An Improved Method Based On Anonymization Algorithms For Preserving Source-Location Privacy In Wireless Sensor Network – 24
47. Authentic Non Symmetric Group Key Agreement Using Broadcast Encryption – 24
48. Automotive CAN Network Attacked By Security Threats – 25
49. Balancing Revocation And Storage Trade-Offs In Privacy-Preserving Universal Authentication Protocol – 25
50. Blacklisting Misbehaving Users In Indistinctive Networks – 26
51. Concealment Of Information In Inactive Audio Frames Of VoIP – 26
52. Detection Of Malicious User In Cooperative System – 27
53. Dynamic Path Selection For Secure Communication In Peer-To-Peer Systems – 27
54. Improving Security And Efficiency In Mobile IP Networks – 28
55. Low Cost And Low Power Security System Based On GSM Technology – 28
56. Low-Rate DDoS Attack Detection And Modified IP Traceback – 29
57. Mitigating Selective Forwarding TCP Attacks By Combining MAITH With A Channel-Aware Approach In MANET – 29
58. Restoring Network Connectivity By Securing The Topology Of Wireless Sensor Networks From Malicious Attack – 30
59. Ticket Based Security Architecture For Anonymizing And Tracing Misbehaving Clients In Wireless Mesh Networks – 30
60. Stealthy Attacks In Wireless Adhoc Networks: Detection In Multihop Networks – 31
61. Accuracy Optimization For High-Dimensional Data Using DENCOS Clustering Algorithm – 31
62. Efficient Bridging of Domain Distribution Gap using BIG Algorithm in Transfer Learning – 32
63. Document Segmentation Approaches And Techniques – An Overview – 32
64. E-Mail Abstraction Scheme For Spam Detection – 33
65. Automatic Segmentation Of Retinal Images By Using Morphological Watershed And Region Growing Method – 33
66. Boosting-SVM And SRM-SVM Cascade Classifiers In Face Verification – 34
67. Detection Of Video Copy Using Fingerprint Extraction – 34
68. Detection of WML In Brain Images Using Geostatistical Fuzzy Clustering – 35
69. Feature Extraction Of Intraductal Breast Lesion Images Using GMM – 35
70. Furthest Nearest Neighbour Criterion Based Active Learning In KNN And SVM Classifiers – 36
71. Image Segmentation For High Spatial Resolution Using Marker Based Watershed Algorithm – 36
72. Improving Web Image Search Using GMI Method – 37
73. Optimal Contrast Tone-Mapping Using Linear Programming For Image Enhancement – 37
74. Optimal Feature Region Set Selection For Robust Digital Image Watermarking – 38
75. Reversible Data Hiding Technique For Hiding Secret Data In Video Scene – 38
76. Road Detection From A Single Image Using Simulated Annealing – 39
77. The Effective Color Feature Selection And Mechanism For Face Recognition – 39
78. Vessel Tree Segmentation In Lung Images Affected By Interstitial Lung Diseases – 40
79. Handwritten Letter Recognition Using Classification Algorithms – 40
80. An Approach To Motion Detection In Video Sequence – 41
81. Location Monitoring Algorithms For Wireless Adhoc Networks – 41
82. Modified Multimedia Architecture For Mobile Multimedia Application – 42
83. Enhancing Reliability And Lifetime Maximization In Duty Cycled Wireless Sensor Network Based On Forwarding Procedure – 42
84. Performance Analysis And Improvement Measures For Cognitive Radio Networks – 43
85. Relay Node Placement Scheme To Increase Life Time In Wireless Sensor Networks – 43
86. Reliability And Securing Topology Maintenance Protocols For Sensor Network – 44
87. Robust Diffusion Of Video Using SUV In VANETs – 44
88. Stable Channel Based Routing With Node-Path Handoff In MANET – 45
89. File Sharing In Unstructured Peer-To-Peer Network Using Sampling Technique – 45
90. Supporting Scalability And Stateless Multicasting In MANET – 46
91. A Novel Approach On Greedy Maximal Scheduling Algorithm On Embedded Networks – 46
92. Location Tracking In Underwater Sensor Networks Using Mobility Prediction By Divided Sensitive Ranges – 47
93. A High-Quality Secured Shell For Location Monitoring System In Wireless Sensor Networks – 47
94. Localization Scheme For Minimizing Error In Wireless Sensor Networks Using Monte Carlo Localization Algorithm – 48
95. Latency For Vertical Handoff Decision In Heterogeneous Networks – 48
96. Secure Hybrid Range Query Framework In Tiered Sensor Network – 49
97. Secure On-Demand Multicast Routing Protocol For Wireless Mesh Networks – 49
98. Route Redirection In Unstructured Network Using Chord – 50
99. Secure And Efficient Incentive Protocol For Wireless Sensor Networks – 50
100. Automated urban drinking water distribution and water theft control – 51
101. Realizing Programmable Logic Control Environment Using ARM7 Microcontroller With Wired And Wireless Communication Capabilities To Host – 51
102. Sea Waves Signal Processing Using Recurrent Neural Networks – 52
103. Design Of Multiband Microstrip Patch Antenna – 52
104. Linked Data Generation Framework and Its Application – 53
105. Enhancement Of Wireless Sensor Network Based On Clustering Approach – 53
106. High Dimensional Data Anonymous Publication And Updates To Confidential Databases – 54
107. Identification Of Spam Using Structure Abstraction Generation – 54
108. Mining KDD Cup Database For Intrusion Detection Based On Fuzzy Class-Association Rule Mining Using Genetic Network Programming – 55
109. Liver Cancer Classification From Gene Expression Using Swarm Intelligence – 56
110. An Efficient False Hits Reduction by Authenticated Multistep Nearest Neighbor Search – 57
111. Feature based Semantic Multi-Document Update Summary Generation – 57
112. Improving The Intrusion Detection System Using An Elegant Adaptive Learning Technique – 58
113. Web User Interference By Clustering – 58
114. An Indexing Method For XML Data – 59
115. Customized News Filtering and Summarization System Based on Personal Interest – 59
116. Low Power Filter Design Using Optimized Multiplexer Based Multiplier And Adder Cell – 60
117. Optimum Throughput Estimation In Multiband Multiantenna Wireless Mesh Networks – 60
118. Background Modeling and Subtraction of Dynamic Scenes – 61
119. A Comparison Study Of Genetic Algorithm And Artificial Immune System – 61
120. Biometrics as an Authentication Measure – 62
121. Review on Free and Open Source Software – 62
122. Dynamic Bandwidth Adaptation supported Adaptive Call Admission Control Mechanism for 3GPP: LTE Networks – 63
123. An Efficient Jamming Detection in Wireless Mesh Networks – 63
124. Enhancing Privacy And Reducing The Traffic Levels In Vehicular Ad-Hoc Networks – 64
125. Threat Modeling Using An Attack Surface Metric – 64
126. Secure Routing Through Trusted Nodes for Mobile Adhoc Networks – 65
127. A Modified Approach For Continuous User Authentication And Intrusion Detection In High Security Mobile Ad Hoc Networks – 65
128. An Efficient Approach For Detecting Mobile Replica Node Attack In Wireless Sensor Network – 66
129. An Optimal Information Hiding Technique with Tree Based Similarity – 66
130. An Effective Minimization Of Storage Overhead For Tracking Down The Invasion Of Replicated Nodes In Wireless Sensor Networks – 67
131. Correlation-Based Traffic Analysis Attacks On Anonymity Networks – 67
132. Quick Response (QR) Code: A Review – 68
133. Modified Multimedia Architecture For Mobile Multimedia Application – 68
134. A Novel Hybrid Approach To Detect Color Texts In Natural Scene Images – 69
135. VP8 Video Codecs for mobile applications – 69
136. Image based learning to enhance the study of visually impaired person – 70
137. Efficient Iris Recognition Based Biometric Techniques For Embedded System – 70
138. Image retrieval Using multi-feature score fusion through Genetic Algorithm – 71
139. An Artificial Device To Regain Memories For Accidently Memory Lost Person's – 71
140. Intelligent Car Backup Warning System – 72


A SECURE AND DEPENDABLE STORAGE WITH MULTIPLE CLOUD SERVERS

Gomathy N, PG Student, Computer Science and Engineering, Oxford Engineering College, Trichy. [email protected]
Raghav Ramana A V T, Asso. Prof. & HOD, Information Technology, Oxford Engineering College, Trichy. [email protected]
Sampathkumar V, Prof. & HOD, Computer Science and Engineering, Oxford Engineering College, Trichy

Abstract

A cloud storage system, consisting of a collection of storage servers, provides long-term storage services over the Internet. Storing data on a single cloud server risks loss and corruption of data as well as loss of availability. A dependable storage system, a cloud of clouds, overcomes the limitations of individual clouds by using cryptography, secret sharing, erasure codes and the diversity that comes from using several clouds. Dependable storage employs a secret sharing scheme and erasure codes to avoid storing clear data in the clouds and to improve storage efficiency, amortizing the replication factor over the cost of the solution. The proposed model extends this scheme using asymmetric keys and multiple cloud servers for storage.
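Editorial note: as a rough, minimal sketch only (not the scheme proposed above), the Python snippet below illustrates the core idea of never handing any single provider the clear data, using (n, n) XOR secret sharing across hypothetical cloud stores; the erasure-coding and asymmetric-key layers mentioned in the abstract are not modeled.

import os

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_into_shares(data: bytes, n: int) -> list:
    """(n, n) XOR secret sharing: every share is needed to recover the data,
    so no single cloud ever holds the plaintext."""
    shares = [os.urandom(len(data)) for _ in range(n - 1)]
    last = data
    for s in shares:
        last = xor_bytes(last, s)
    return shares + [last]

def recover(shares: list) -> bytes:
    out = shares[0]
    for s in shares[1:]:
        out = xor_bytes(out, s)
    return out

if __name__ == "__main__":
    clouds = {}                           # stand-in for independent cloud servers
    secret = b"customer record #42"
    for i, share in enumerate(split_into_shares(secret, 4)):
        clouds[f"cloud-{i}"] = share      # each provider stores only one share
    assert recover(list(clouds.values())) == secret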

A SURVEY: JOB SCHEDULING ALGORITHMS IN GRID ENVIRONMENT

Abstract

Grid computing is now being used in many applications that go beyond the distribution and sharing of resources. The distributed resources are useful only if they are properly scheduled: an optimal scheduler results in high-performance grid computing, whereas a poor scheduler produces the opposite. Grid scheduling has therefore become a major topic in the search for new algorithm models, and scheduling in a grid environment has to satisfy a number of constraints of different problems. This study provides an even basis for comparison and gives insight into the circumstances under which one technique will outperform another. The evaluation procedure is specified, the heuristics are defined, and then the comparison results are discussed.

S.Umarani, PG Student, SNS College of Technology, Coimbatore. [email protected]
L.M.Nithya, Associate Professor, SNS College of Technology, Coimbatore
Dr.A.Shanmugam, Professor & Principal, Bannari Amman Institute of Technology, Erode


AN EFFICIENT AND SECURE NONLINEAR PROGRAMMING OUTSOURCING IN CLOUD COMPUTING

M.Madhura, PG Scholar, Department of Computer Science and Engineering, Karpagam University, India. [email protected]
R.Santosh, Assistant Professor, Department of Computer Science and Engineering, Karpagam University, India. [email protected]

Abstract

Cloud computing provides appropriate on-demand network access to a shared pool of configurable computing resources that can be rapidly deployed with great efficiency and minimal management overhead. This paper deals with the secure outsourcing of nonlinear programming (NLP). It provides a practical mechanism design which fulfils input/output privacy, cheating resilience, and efficiency. In the proposed approach, practical efficiency is achieved by explicitly decomposing the NLP computation into public NLP solvers running on the cloud and private NLP parameters owned by the customer. Compared to the general circuit representation, the resulting flexibility allows exploring an appropriate security/efficiency trade-off via a higher-level abstraction of NLP computations. It is possible to construct a set of effective privacy-preserving transformation techniques by framing the private data possessed by the client as a combination of matrices and vectors, which allows customers to transform the original NLP problem into an arbitrary-looking one while protecting sensitive input and output information. To verify the computational result, the fundamental duality theorem of NLP is explored to derive the necessary and sufficient conditions that a correct result must satisfy. Such a result verification mechanism is very efficient and incurs close-to-zero extra cost on both the cloud server and the customers.

CACHING SCHEME FOR DISTRIBUTED DATA SHARING IN PEER-TO-PEER ENVIRONMENTS

S.Umamaheswari, H.Arthi Geetha, S.Jeevanandham
Assistant Professor, Department of Computer Science and Engineering, Akshaya College of Engineering and Technology, Coimbatore. [email protected], [email protected]

Abstract

Distributed data sharing in peer-to-peer networks is implemented in two ways. One is the structured peer-to-peer network, which maintains a regular topology and provides efficient data sharing. The other is the unstructured peer-to-peer network, which maintains an arbitrary topology and provides flexibility in peer joining and leaving. To obtain both efficiency and flexibility, these two categories are combined into a hybrid peer-to-peer network. In a hybrid network, when popular data is requested by a large number of peers, the hosting peer in the multicast communication is responsible for sending data to all those requesters and can become overloaded. We therefore divide the multicast communication group into regional subgroups by implementing a caching scheme. Each subgroup is independently managed by a subgroup controller (SGC), like a separate multicast group with its own subgroup key.


ENABLING PUBLIC AUDITABILITY AND DATA DYNAMICS FOR

STORAGE SECURITY IN CLOUD COMPUTING

N.Sathya

PG Scholar, Computer Science and

Engineering, Vivekanandha College of

Engineering for Women

[email protected]

V.UshaRani

Lecturer in Computer Science and

Engineering, Vivekanandha College of

Engineering for Women

Abstract

Cloud Computing has been envisioned as the next-generation architecture of the IT enterprise. It moves the application software and databases to centralized large data centers, where the management of the data and services may not be fully trustworthy. This unique paradigm brings about many new security challenges, which have not been well understood. This work studies the problem of ensuring the integrity of data storage in Cloud Computing. In particular, we consider the task of allowing a third party auditor (TPA), on behalf of the cloud client, to verify the integrity of the dynamic data stored in the cloud. The introduction of the TPA relieves the client from auditing whether the data stored in the cloud are indeed intact, which can be important in achieving economies of scale for Cloud Computing. The support for data dynamics via the most general forms of data operation, such as block modification, insertion, and deletion, is also a significant step toward practicality, since services in Cloud Computing are not limited to archive or backup data only.

ENHANCING PERFORMANCE OF HOME NETWORK USING MIDDLEWARE

Abstract

Cloud computing allows accessing resources across the Internet transparently, requiring no expertise in, or control over, the underlying infrastructure. There is an increasing interest in sharing media files with family and friends. As cloud computing grows rapidly and Video-on-Demand (VoD) services become popular, it is critical to provide Quality of Service (QoS) to more customers under limited resources. To address this issue, we propose an adaptive QoS management framework for VoD cloud service centers. However, UPnP and DLNA were not designed for media distribution beyond the boundaries of a local network, and managing media files through web applications can be tedious. To overcome this problem, we propose Media Cloud, a middleware for set-top boxes that classifies, searches, and delivers media inside the home network and across the cloud, and that interoperates with UPnP and DLNA. We present the architecture of the service center and then illustrate the QoS controlling process. To enhance the total revenue of the service provider, we define an optimization problem that considers a "pay-as-you-go" charging model. The QoS-aware Cache Replacement algorithm is then developed and described.

T.M.Nithya, Student, II M.E CSE, Oxford Engg College, Trichy-620009. [email protected]
V.Murugan, Asst. Professor, Oxford Engg College, Trichy-620009. [email protected]
P.Saravanan, Lecturer, M.Kumarasamy College of Engineering. [email protected]


IMPROVING HIERARCHICAL LOAD BALANCED ALGORITHM FOR JOB SCHEDULING IN A GRID ENVIRONMENT

Tharani R, Master of Engineering, Sri Ramakrishna Engineering College, Coimbatore. Email: [email protected]
Deepa K, Asst. Professor, Department of Information Technology, Sri Ramakrishna Engineering College, Coimbatore.

Abstract

A grid environment collects, integrates, and uses heterogeneous or homogeneous resources scattered around the globe and connected by a high-speed network. Grid environments can be classified into two types, computing grids and data grids; this paper focuses on computing grids. In a computing grid, job scheduling is a very important task: a good scheduling algorithm can assign jobs to resources efficiently and can balance the system load. In this paper, we propose a hierarchical framework and a job scheduling algorithm called the Hierarchical Load Balanced Algorithm (HLBA) for the grid environment. In our algorithm, we use the system load as a parameter in determining a balance threshold, and the scheduler adapts the balance threshold dynamically when the system load changes. The main contributions of this paper are twofold: first, the scheduling algorithm balances the system load with an adaptive threshold, and second, it minimizes the makespan of jobs.
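Editorial note: to make the adaptive-threshold idea concrete, here is a deliberately simplified Python sketch under assumed policies (an illustration, not the HLBA algorithm itself): jobs go to the least-loaded resource, and the balance threshold is recomputed from the current average system load.

# Minimal illustration (not the authors' HLBA): jobs go to the least-loaded
# resource, and the balance threshold tracks the current average system load.
from statistics import mean

def balance_threshold(loads, tolerance=0.2):
    # Threshold adapts as the system load changes (assumed policy for this sketch).
    return mean(loads) * (1 + tolerance)

def assign_job(loads, job_cost):
    """Place the job on the least-loaded resource; report whether that
    resource now exceeds the adaptive threshold (i.e., rebalancing is due)."""
    target = min(range(len(loads)), key=lambda i: loads[i])
    loads[target] += job_cost
    return target, loads[target] > balance_threshold(loads)

if __name__ == "__main__":
    loads = [3.0, 7.5, 1.2, 4.8]          # current load per grid resource
    for cost in (2.0, 0.5, 6.0):
        target, overloaded = assign_job(loads, cost)
        print(f"job({cost}) -> resource {target}, overloaded={overloaded}")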

PARALLEL SCHEDULING AND SECURITY MECHANISM FOR

HETEROGENEOUS DISTRIBUTED SYSTEMS

K.Lakshmi Raj,

Muthayammal Engineering College,

Rasipuram, Namakkal.

Abstract

High-speed networks are used to connect heterogeneous systems. Different resource levels are used in heterogeneous distributed systems. Resources are allocated for dependent and independent tasks, and processors and memory are shared among the nodes in a heterogeneous distributed system environment. Scheduling schemes are used for the resource allocation process.

Heterogeneous systems are built to execute user applications on remote resources, with processors shared between users. Tasks fall into two categories, dependent and independent: independent tasks can be scheduled in any sequence, while dependent tasks must be scheduled with reference to their precedence. Denial of Service (DoS) attacks are initiated by malicious users, so security is provided with reference to the trust level of the resources. The current resource scheduling scheme supports only a single-resource allocation model, incurs high security overhead, does not handle HDS interconnection, and does not optimize the trust level in the scheduling methods.

The proposed system is designed to manage resources with security. A multi-dimensional resource allocation scheme is proposed for the scheduling process, and heterogeneous distributed systems (HDS) communication is provided in the system. Resource allocation is performed with security level factors. The system development is planned with a Java front end and an Oracle back end.


PERFORMANCE ANALYSIS AND OPTIMIZATION OF MULTI-CLOUD COMPUTING FOR LOOSELY COUPLED MTC APPLICATIONS

K.Indira,

M.E CSE

Mohamed Sathak Engineering College,

Kilakarai.

email_id:[email protected]

R.Karthikeyan, M.E (Ph.D),

Assist. Professor/CSE

Mohamed Sathak Engineering College,

Kilakarai.

Abstract

Cloud storage enables networked online storage where data is stored on multiple virtual servers. To carry out huge tasks in a cloud environment, a single cloud provider is not sufficient for many-task applications and services. Handling intensive tasks requires a multi-cloud environment to improve the cost-effectiveness of the deployment and to increase availability. Larger tasks are carried out by processing many tasks at a time in a cloud computing environment. In this paper, for efficient handling of multiple tasks, we perform performance analysis and optimization of all tasks in the multi-cloud environment. The performance analysis covers CPU scheduling, memory utilization, I/O tasks, resource time sharing and cost benefits. Earlier work concentrates on the deployment of multi-cloud architecture, while multi-processing needs more accuracy, scalability and efficiency. In the methodology, job allocation by a front-end server and a service LAN are used. This research achieves a multitasking environment in a multi-cloud infrastructure by providing effective tools for measuring overall performance and optimizing multi-cloud computing services.

RESUME ANALYZER WEBSITE FOR IT COMPANIES USING CLOUD COMPUTING

K.M.Bridhashree, (M.E), Lecturer, [email protected]
K.Aishwarya, B.Tech (IT), [email protected]
R.TamilSelvi, B.Tech (IT), [email protected]
Akilandeswari, B.Tech (IT), [email protected]
Angel College of Engineering and Technology, Tirupur

Abstract

In recent days, in order to recruit an employee, HR department officials have to review thousands of resumes personally and shortlist a few of them for the job, spending a lot of time on this process. To overcome this problem the LO-MATCH tool was introduced. The LO-MATCH platform addresses heterogeneity issues in the descriptions of resumes, but it is not affordable for everyone because it is not open source. In this paper a system which offers a service similar to the LO-MATCH tool is proposed. The service relies on a cloud computing environment so as to offer a SaaS type of service, so that everyone can make use of the tool just by hiring it instead of owning it.


A STUDY OF CONGESTION CONTROL ALGORITHM IN

WIRELESS SENSOR NETWORKS

Abstract

The performance of Wireless Sensor Networks (WSNs) can be affected when the network is deployed under different topologies. Without proper congestion control mechanisms, the network becomes highly complex. Congestion occurs due to buffer overflow and channel contention, and it causes packet losses, which in turn decrease network performance and throughput. It is therefore important to design protocols to control congestion, and to control traffic rather than the forwarding path. In this paper, we investigate various congestion control algorithms and evaluate their characteristics.

ACCURATE TRACKING IN WIRELESS SENSOR NETWORK BY

SENSOR COLLABORATION

Ms T. Dhanalakshmi, PG Scholar, [email protected]
Ms R. Mythili, PG Scholar, [email protected]
Prof T. Rajendran, HOD of CSE Department, [email protected]
Angel College of Engineering and Technology, Tirupur

Abstract

Heuristic techniques enable the selection of sensors whose observations are fused with the prior target location. The main objectives are to share information about the target accurately, to adjust the power consumed by the sensors while transmitting the information, and to conserve system resources by associating data tracks, which is simpler in a distributed environment with global consistency. A Mutual Information based Sensor Selection (MISS) algorithm is adopted to track the accurate information collaboratively. A novel approach to energy savings is devised as Information-Controlled Transmission Power (ICTP) adjustment in the nodes. These aspects enable a dual-space approach for both tracking and sensor resource management.

R.B.Dravida Priyaa,

ME (CCE),

SNS College of Technology,

Coimbatore.

[email protected]


ACHIEVING HIGH-THROUGHPUT MULTICAST ROUTING BASED

ON RATEGUARD IN WIRELESS MESH NETWORKS

Abstract

Recent work in multicast routing for wireless mesh networks has focused on metrics that estimate link quality to maximize throughput. Nodes must collaborate in order to compute the path metric and forward data. The assumption that all nodes are honest and behave correctly during metric computation, propagation, and aggregation, as well as during data forwarding, leads to unexpected consequences in adversarial networks where compromised nodes act maliciously. In this work, novel attacks against high-throughput multicast protocols in wireless mesh networks are identified. The attacks exploit the local estimation and global aggregation of the metric to allow attackers to attract a large amount of traffic, and they are shown to be very effective against multicast protocols based on high-throughput metrics. Aggressive path selection is a double-edged sword: while it maximizes throughput, it also increases attack effectiveness in the absence of defense mechanisms. The proposed RateGuard-based defense detects such misbehaving nodes and isolates them from the network.

ANALYSIS OF CONTENTION BASED METHOD FOR MAC LAYER

IN WIRELESS NETWORKS

Abstract

Quality of Service (QoS) is the ability to guarantee a certain level of performance to a data flow, i.e., guaranteeing a required bit rate, delay, etc. IEEE 802.11a/b/g networks do not provide QoS differentiation among multimedia traffic. QoS provisioning is one of the essential features of IEEE 802.11e, which uses Enhanced Distributed Channel Access (EDCA), a contention-based channel access mode, to provide QoS differentiation. EDCA works with four Access Categories (ACs). Differentiation of Access Categories is achieved by differentiating the Arbitration Inter-Frame Space (AIFS), the initial contention window size (CWmin), the maximum contention window size (CWmax) and the transmission opportunity (TXOP). However, AIFS, CWmin and CWmax are considered fixed for a given AC, while TXOP may be varied. A TXOP is a time period during which a station has the right to initiate transmissions onto the wireless medium. By varying the TXOP value among the ACs, QoS optimization in terms of throughput stability is studied.

Anu Manohar ,

II Yr ME CSE

Department of Computer Science and Engineering ,

M.Kumarasamy college of Engineering

Thalavapalayam,Karur-639 113,Tamil Nadu,India

[email protected]

V.R.Azhaguramyaa

PG Student,

Sri Krishna College of

Technology,

[email protected]

S.J.K.Jagadeesh Kumar

Professor & Head, Sri

Krishna College of

Technology, Coimbatore

[email protected]

P.Parthasarathi

Asst. Professor, Sri Krishna

College of Technology,

Coimbatore

[email protected]
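Editorial note on the EDCA abstract above: the toy Python sketch below only shows how per-AC parameters translate into a shorter channel-access deferral for higher-priority traffic; the AIFSN/CW/TXOP numbers and PHY timings are placeholders chosen for illustration, not values quoted from the IEEE 802.11e standard.

# Illustrative-only EDCA sketch: per-AC parameters drive how long a station
# defers before transmitting. Values below are placeholders, not quoted from
# the IEEE 802.11e standard.
import random

SLOT_US, SIFS_US = 9, 16   # assumed PHY timing for this sketch

ACCESS_CATEGORIES = {
    # name: (AIFSN, CWmin, CWmax, TXOP_limit_us)  -- placeholder numbers
    "AC_VO": (2, 3, 7, 1504),
    "AC_VI": (2, 7, 15, 3008),
    "AC_BE": (3, 15, 1023, 0),
    "AC_BK": (7, 15, 1023, 0),
}

def deferral_us(ac: str) -> int:
    """AIFS wait plus a random backoff drawn from the AC's contention window."""
    aifsn, cw_min, _cw_max, _txop = ACCESS_CATEGORIES[ac]
    aifs = SIFS_US + aifsn * SLOT_US
    backoff = random.randint(0, cw_min) * SLOT_US
    return aifs + backoff

if __name__ == "__main__":
    random.seed(1)
    for ac in ACCESS_CATEGORIES:
        print(ac, deferral_us(ac), "microseconds before channel access")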


CHANNEL ALLOCATION FOR UNCOORDINATED WLANS

Abstract

Uncoordinated WLANs are small in size, independently owned and managed, and deployed in areas where access-point density may vary greatly. Traditional wireless LANs rely on knowledgeable network administrators for centralized decisions on channel selection, whereas in uncoordinated WLANs access points are often deployed by non-specialists in an uncoordinated manner, leading to unplanned topologies, interference and unsatisfactory throughput. In the existing system, a distributed channel assignment algorithm for uncoordinated WLANs termed CACAO (Client-Assisted Channel Assignment Optimization) is used. It uses the clients' feedback traffic information, such as the number of neighbors in the channel and the channel traffic load, for better channel assignment: using the client feedback, the AP calculates the interference and switches over to a less-interfered channel. The proposed work modifies the existing CACAO algorithm by changing the way interference is calculated. Clients send the signal strength of each channel to the access points, the interference in a channel is estimated from the signal strength, and an access point and its associated clients experiencing high interference switch over to a less-interfered channel.
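Editorial note: a minimal Python sketch of that channel-selection step, under assumed report formats (an illustration of the modified scheme described above, not the published algorithm): each client reports the signal strength it hears per channel, and the AP moves to the channel with the lowest aggregate interference.

# Sketch only: pick the channel with the lowest total client-reported signal.
def least_interfered_channel(reports: dict) -> int:
    """reports maps client -> {channel: signal strength in hypothetical units}."""
    totals = {}
    for per_channel in reports.values():
        for channel, strength in per_channel.items():
            totals[channel] = totals.get(channel, 0.0) + strength
    return min(totals, key=totals.get)

if __name__ == "__main__":
    client_reports = {
        "client-A": {1: 0.8, 6: 0.1, 11: 0.4},
        "client-B": {1: 0.5, 6: 0.2, 11: 0.6},
    }
    print("switch AP and clients to channel", least_interfered_channel(client_reports))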

COGNITIVE TOPOLOGY CONTROL IN CR-MANETS

Abstract

Energy and topology are limited resources in MANETs. Topology control reduces the energy consumption of a node by reducing the number of links in the initial topology of the network. Cognitive radio networks provide dynamic spectrum allocation to cognitive users when primary users are inactive. In this paper, Prediction-based Cognitive Topology Control (the PCTC algorithm) is used to induce cognition into routing in CR-MANETs (Cognitive Radio Mobile Ad Hoc Networks) based on link-availability prediction. Topology control is performed on a cognitive radio network using the AODV protocol both with and without prediction, and simulation results show that AODV with prediction performs better than AODV without prediction.

A.S.Renugadevi

M.E Computer and Communication

Engineering

Department of information Technology

Kongu Engineering College,Perundurai

Erode Dt,TamilNadu,India

Email:[email protected]

R.Devipriya Assistant Professor

Department of Information Technology

Kongu Engineering College,Perundurai

Erode Dt,TamilNadu,India.

Email: [email protected]

Chandni

II ME (C.C.E) / Information

Technology

Kongu Engineering College

Perundurai. Erode Dt., India

[email protected]

Suresh.P

Assistant Professor/Information

Technology

Kongu Engineering College

Perundurai. Erode Dt., India

[email protected]


CONSTRUCTING MINIMUM SIZE WIRELESS SENSOR

NETWORKS TO PROVIDE CRITICAL SQUARE GRID COVERAGE

USING STEINER TREE BASED CRITICAL GRID COVERAGE

ALGORITHM

Abstract

Wireless sensor networks are used in a wide range of applications, since they have the ability to collect, process and store information as well as to communicate with other nodes. Such connected networks have been applied extensively in military and civilian applications. In certain applications, the large sensor field is divided into critical and common areas, and it is advisable to deploy the sensors preferentially in the critical areas. Thus the connected wireless sensor network is formed by deploying a minimum number of sensors at the critical grid points. In this paper, we propose a Steiner tree based algorithm to provide coverage of the critical grids. Simulation results show that the proposed algorithm obtains good solutions.

CONTROLLED SINK MOBILITY FOR PROLONGING WIRELESS

SENSOR NETWORKS LIFETIME

Abstract

Data delivery latency often increases due to the speed limit of a mobile sink, yet mobility is widely used to address the problem of data collection in Wireless Sensor Networks (WSNs). WSNs with a Mobile Sink (MS) are considered, and a comprehensive taxonomy of their architectures, based on the role of the MS, is provided. The overview of the data collection process identifies the corresponding issues and challenges. On the basis of these issues, a protocol named Energy Efficient Data Collection (EEDC) is used, focusing on the path selection problem in delay-guaranteed sensor networks with a path-constrained mobile sink. EEDC is an efficient data collection scheme which simultaneously improves the total amount of collected data and reduces the energy consumption. The optimal path is chosen to meet the delay requirement as well as to minimize the energy consumption of the entire network, and predictable sink mobility is exploited to improve the energy efficiency of the sensor network. Simulation experiments based on GloMoSim are conducted to validate the effectiveness of the presented formulations and algorithms.

Dr.S.Uma Maheswari

Electronics and communication

Engineering

Coimbatore Institute of

Technology

[email protected]

P.Nithyakalyani

Electronics and communication

Engineering

Coimbatore Institute of

Technology

Coimbatore, India.

[email protected]

T.Sudha

PG Scholar,

Computer Science and Engineering,

Muthayammal Engineering College

[email protected]

Prof.M.Sayee Kumar

Assistant Professor,

Computer Science and

Engineering,

Muthayammal Engineering College

[email protected]


COOPERATIVE MULTI-HOP TRANSMISSION IN WIRELESS

NETWORKS

Abstract

We evaluate the performance of cooperative transmission, where nodes in a sending

cluster are synchronized to communicate a packet to nodes in a receiving cluster. In our

communication model, the power of the received signal at each node of the receiving

cluster is a sum of the powers of the transmitted independent signals of the nodes in the

sending cluster. The increased power of the received signal, vis-à-vis the traditional single-

node-to-single-node communication, leads to overall saving in network energy and to end-

to-end robustness to data loss. We propose an energy-efficient cooperative protocol, and we

analyze the robustness of the protocol to data packet loss. When the nodes are placed on a

grid, it reduces the probability of failure to deliver a packet to destination. In Energy

Efficient Protocol for Cooperative networks, transmitting and receiving nodes recruit

neighboring nodes to assist in communication. We model a cooperative transmission link in

wireless networks as a transmitter cluster and a receiver cluster. Up to 80% in energy

savings can be achieved for a grid topology, while for random node placement our

cooperative protocol can save up to 40% in energy consumption relative to the other

protocols. The reduction in error rate and the energy savings translate into increased

lifetime of cooperative sensor networks.
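Editorial note: a tiny numerical illustration of the communication model stated above, where the receiving node sees the sum of the powers of the independent transmitted signals (the figures below are made up for the example).

# Example only: combine per-sender received powers at one receiving node.
from math import log10

def dbm_to_mw(dbm: float) -> float:
    return 10 ** (dbm / 10.0)

def mw_to_dbm(mw: float) -> float:
    return 10.0 * log10(mw)

if __name__ == "__main__":
    received_dbm = [-70.0, -72.0, -75.0]      # per-sender received power (made-up)
    total_mw = sum(dbm_to_mw(p) for p in received_dbm)
    print(f"combined received power: {mw_to_dbm(total_mw):.1f} dBm")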

CUSTOMIZED QOS METRIC BASED ON DATA TRAFFIC IN WIRELESS

SENSOR NETWORK ROUTING

Abstract

Wireless Sensor Networks (WSNs) with best-effort quality of service (QoS) have recently attracted a lot of attention as effective platforms for pervasive computing. The QoS requirements are efficient usage of energy, latency and reliability, so a routing protocol based on these QoS requirements is proposed. This modular protocol aims at ensuring the required QoS: a separate module is dedicated to QoS classification, and a new localized quality-of-service routing protocol attempts to fulfill the required data-related QoS metrics for each packet while considering power efficiency. The queuing module adopts a strategy of prioritizing packets. The QoS requirement is based on data traffic while also considering efficient energy usage. The metrics considered for QoS classification are reliability, transmission energy, residual energy and delay; based on these metrics, traffic is classified into several categories and queued, and sufficient QoS is provided depending on the classified traffic.

V.Yuvaraj, PG Scholar,

Department of Computer Science and

Engineering,

The Kavery Engineering College, Mecheri, Salem,

India.

[email protected]

P.Sathishkumar HOD,

Department of Computer Science and

Engineering,

The Kavery Engineering College, Mecheri,

Salem, India.

V.Jeya Priya

II year M.E-Communication Systems

Anna University of Technology Madurai

[email protected]

R.Arun Prasath

Assistant Professor-ECE Department

Anna University of Technology Madurai

prasta [email protected]
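Editorial note relating back to the QoS-classification abstract above: the sketch below bins packets into priority queues by their data-related needs; the traffic classes, packet fields and thresholds are assumptions made for the illustration, not the authors' metric.

# Assumed traffic classes and thresholds; highest priority is served first.
import heapq

PRIORITY = {"critical": 0, "delay_sensitive": 1, "reliable": 2, "best_effort": 3}

def classify(packet: dict) -> str:
    if packet.get("emergency"):
        return "critical"
    if packet.get("max_delay_ms", 1000) < 100:
        return "delay_sensitive"
    if packet.get("needs_ack"):
        return "reliable"
    return "best_effort"

def enqueue(queue: list, packet: dict, seq: int) -> None:
    heapq.heappush(queue, (PRIORITY[classify(packet)], seq, packet))

if __name__ == "__main__":
    q, packets = [], [
        {"id": 1, "needs_ack": True},
        {"id": 2, "emergency": True},
        {"id": 3, "max_delay_ms": 20},
        {"id": 4},
    ]
    for seq, p in enumerate(packets):
        enqueue(q, p, seq)
    while q:
        _, _, p = heapq.heappop(q)
        print("send packet", p["id"])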


WORKFLOW OPTIMIZATION FOR ALLOCATION OF JOBS IN

GRID ENVIRONMENTS

K.Nithiya ,ME(CSE) II YEAR

St.Michael college of

engineering and technology,

kalaiyarkoil.

[email protected]

Abstract

Grid scheduling is essential to Quality of Service provisioning as well as to efficient

management of grid resources. Grid scheduling usually considers the state of the grid

resources as well as application demands. However, such demands are generally unknown for

highly demanding applications, since these often generate data which will be transferred

during their execution. Without appropriate assessment of these demands, scheduling

decisions can lead to poor performance. This paper introduces the IPDT-FUZZY scheduler,

a scheduler which considers the demands of grid applications with such uncertainties. The

scheduler uses fuzzy optimization and both computational and communication demands are

expressed as fuzzy numbers. Its performance was evaluated, and it was shown to be

attractive when communication requirements are uncertain.

SECURE AND EFFICIENT RETRIEVAL OF DATA IN CLOUD COMPUTING

M.Mythili, Department of IT, Angel College of Engineering and Technology, Tirupur.
A.Saranyadevi, Department of IT, Angel College of Engineering and Technology, Tirupur.
A.Suresh, ME, MBA, St. Michael College of Engineering and Technology, Kalaiyarkoil.

Abstract

Cloud Computing has been envisioned as the next-generation architecture of the IT enterprise. It moves the application software and databases to centralized large data centers, where the management of the data and services may not be fully trustworthy. Cloud computing provides cheap and efficient solutions for storing and analyzing mass data. Firstly, cloud computing, the MapReduce programming model and hash tables are introduced, and a method for efficient data retrieval in a cloud environment is proposed. This work studies the problem of ensuring the integrity of data storage in Cloud Computing. In particular, we consider the task of allowing a third party auditor (TPA), on behalf of the cloud client, to verify the integrity of the dynamic data stored in the cloud. The introduction of the TPA relieves the client from auditing whether the data stored in the cloud are indeed intact, which can be important in achieving economies of scale for Cloud Computing. The support for data dynamics via the most general forms of data operation, such as block modification, insertion, and deletion, is also a significant step toward practicality, since services in Cloud Computing are not limited to archive or backup data only. While prior works on ensuring remote data integrity often lack support for either public auditability or dynamic data operations, this paper achieves both. We first identify the difficulties and potential security problems of direct extensions with fully dynamic data updates from prior works, and then show how to construct an elegant verification scheme for the seamless integration of these two salient features. In particular, to achieve efficient data dynamics, we improve the existing proof-of-storage models by manipulating the classic Merkle Hash Tree construction for block tag authentication. To support efficient handling of multiple auditing tasks, we further explore the technique of bilinear aggregate signatures to extend our main result into a multiuser setting, where the TPA can perform multiple auditing tasks simultaneously. Extensive security and performance analysis show that the proposed schemes are highly efficient and provably secure.
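Editorial note: for readers unfamiliar with the Merkle Hash Tree mentioned above, here is a rough textbook-style Python sketch (an illustration only, not the paper's block-tag construction): the verifier keeps only the root digest, and any change to a stored block changes the recomputed root.

# Minimal Merkle-root illustration over file blocks.
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(blocks: list) -> bytes:
    level = [h(b) for b in blocks]
    while len(level) > 1:
        if len(level) % 2:                 # duplicate last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

if __name__ == "__main__":
    blocks = [b"block-0", b"block-1", b"block-2", b"block-3"]
    root = merkle_root(blocks)             # retained by the client / TPA
    tampered = blocks[:]
    tampered[2] = b"block-2-modified"
    print("intact  :", merkle_root(blocks) == root)
    print("tampered:", merkle_root(tampered) == root)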

NOVEL METHOD FOR THROUGHPUT PREDICTION OF NETWORK SERVICE AND TRANSFER OF DATA PACKETS

Sangeetha. P, PG Scholar. [email protected]
R.Kanagaraj, M.E, Assist. Professor, Dept of Software Engg

Abstract

Data communication in a many-task computing environment over a widely distributed environment uses multiple parallel TCP streams to improve the end-to-end throughput of data transfers. A method is developed to determine the number of parallel streams required to achieve the best network performance. Prediction points can be obtained using Iperf and GridFTP samplings over inter-cluster protocols, aggregating traffic for high-speed encoding and using a new forward error correction scheme to handle bursty loss.
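Editorial note: as a back-of-envelope companion to the abstract above (a generic rule of thumb based on the well-known Mathis steady-state TCP bound, not the prediction method the paper describes), aggregate throughput grows roughly linearly with the number of streams until the bottleneck capacity caps it.

# Sketch only: per-stream bound MSS/(RTT*sqrt(p)), aggregate capped by the bottleneck.
from math import sqrt

def stream_throughput_bps(mss_bytes: float, rtt_s: float, loss_rate: float) -> float:
    return (mss_bytes * 8 / rtt_s) * (1.22 / sqrt(loss_rate))

def aggregate_throughput_bps(n_streams: int, mss_bytes: float, rtt_s: float,
                             loss_rate: float, bottleneck_bps: float) -> float:
    return min(n_streams * stream_throughput_bps(mss_bytes, rtt_s, loss_rate),
               bottleneck_bps)

if __name__ == "__main__":
    # Made-up link: 1460-byte MSS, 50 ms RTT, 1e-5 loss, 1 Gbit/s bottleneck.
    for n in (1, 2, 4, 8, 16):
        gbps = aggregate_throughput_bps(n, 1460, 0.05, 1e-5, 1e9) / 1e9
        print(f"{n:2d} streams -> ~{gbps:.2f} Gbit/s")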

EFFICIENT RESOURCE SELECTION AND LOAD BALANCING ALGORITHM BASED ON THE SCHEDULING OF PARALLEL APPLICATIONS

G.K. Kamalam, Assistant Professor/CSE, Kongu Engineering College, Erode, India.
M.Sarmila, PG Scholar, Dept of CSE, Kongu Engineering College, Erode, India. [email protected]

Abstract

Scheduling jobs onto the distributed heterogeneous resources of a grid is a complicated problem. The goal of scheduling is to achieve the highest possible system throughput and to match the application needs with the available computing resources. In a grid computing system, assigning all jobs to the same resource leads to a high workload on that resource and a high computational time for the processed jobs. Load balancing is a methodology involving the linear and continuous modeling of partitionable computation and communication loads for parallel processing. The ULS maintains the user-level resource pool, enables resource selection and controls the execution. WLB-based scheduling evaluates the dynamic resource pool and resource selection mechanisms, and examines the dependencies of application performance on the aggregate characteristics of the selected resources and the application profile.


EFFICIENT CLUSTERING AND DISCOVERY OF RESOURCES IN WIDE-AREA DISTRIBUTED COMPUTATIONAL GRIDS

R.Nithya, ME CSE, Kongu Engineering College, Erode, Tamil Nadu, India. Email: [email protected]

Abstract

In large-scale computational grids, discovery of heterogeneous resources as a working group is crucial to achieving scalable performance. In the existing system, a resource management scheme, a hierarchical cycloid overlay (HCO) architecture, and resource clustering and discovery algorithms for wide-area distributed grid systems are designed. Program/data locality is established by clustering resources based on their physical proximity and functional matching with user applications. A dynamism-resilient resource management algorithm, a cluster-token forwarding algorithm and deadline-driven resource management algorithms are developed, and they compare favorably with other resource discovery methods in static and dynamic grid applications. The proposed work extends the HCO model to secure and safeguard grid applications by applying virtual machine techniques, and focuses on integrating peer-to-peer and grid technologies with machine virtualization techniques for global-scale Internet applications.

CONCERT MEASURE OF NETWORK I/O WORKLOAD IN VIRTUALIZED DATACENTER USING PARAVIRTUALIZATION

Rajesh M, II Year M.E, Department of Computer Science and Engineering, Kongu Engineering College, Anna University, Tamil Nadu. Email: [email protected]

Abstract

Cloud computing [10] is gaining popularity as the way to virtualize the datacenter and increase flexibility in the use of computation resources. The virtual machine approach can dramatically improve the efficiency, power utilization and availability of costly hardware resources, such as CPU and memory. Virtualization in the datacenter was carried out in the back end of the Eucalyptus software, and the front end was installed on another CPU. Performance measurement was carried out in a network I/O application environment of the virtualized cloud. The measurements were then analyzed for the performance impact of co-locating applications in a virtualized cloud in terms of throughput and resource-sharing effectiveness, including the impact of idle instances on applications that are running concurrently on the same physical host. This project proposes a virtualization approach which uses the hypervisor to install the Eucalyptus software on a single physical machine for setting up a cloud computing environment.

Geetha M

Assistant Professor,

Department of Computer Science and

Engineering,

Kongu Engineering College,

Anna University Tamil Nadu, India.

E-mail:geetha @kongu.ac.in

Mr.N.Krishnamoorthy,

Assistant Professor, CSE, Kongu

Engineering College, Erode, Tamil

Nadu, India.

Email:[email protected]


CLASSIFICATION AND EVALUATION OF GRID RESOURCES BASED ON GRID TASKS FOR MINIMIZING OVERHEAD COMPUTATION TIME

K.Kayalvizhi, PG Student, CSE, Kongu Engineering College, Erode, Tamil Nadu, India. Email: [email protected]

Abstract

Grid computing aggregates heterogeneous resources distributed across the Internet, regardless of differences between resources such as platform, hardware, software, architecture, language, and geographical location. Resource allocation and job scheduling are the core functions of grid computing, and these functions rely on adequate information about the available resources. Timely acquisition of resource status information is of great importance in ensuring the overall performance of grid computing. Grid resource monitoring and grid resource prediction mechanisms are used for acquiring information about grid resources. A distributed system architecture is designed which addresses the key issues of system implementation, including machine-learning-based methodologies for modelling and optimization of resources. The proposed system focuses on the classification and evaluation of grid resources.

GRID TECHNOLOGY FOR NEUROSCIENCE

Jayabharathi.P,G.Rubia,

Research scholar

Karpagam University, Coimbatore

[email protected]

Abstract

Grid computing is a parallel processing architecture in which CPU resources are shared across a network and all machines function as one large supercomputer. It allows unused CPU capacity in all participating machines to be allocated to one application that is extremely computation-intensive and programmed for parallel processing. Grid computing evolved from the parallel processing systems of the 1970s, the large-scale cluster computing systems of the 1980s, and the distributed processing systems of the 1990s, and is often referred to by these names. In this paper we present an overview of brain analysis using grid computing.

S.Kuppuswami,

Principal,

Kongu Engineering College,

Erode, Tamil Nadu, India

Email: [email protected]

Mrs.Agnes Kalarani,

Professor

Karpagam University, Coimbatore

[email protected]


AUTOMATIC RECONFIGURABLE SYSTEM FOR RUN-TIME APPLICATION

S.Inayathulla, 2nd year M.E., Sudharsan Engineering College, Pudukkottai. Email id: [email protected]

Abstract

New generation embedded systems demand high performance, efficiency, and

flexibility. Reconfigurable hardware can provide all these features. However, the costly

reconfiguration process and the lack of management support have prevented a broader use

of these resources. To solve these issues we have developed a scheduler that deals with

task-graphs at run-time, steering its execution in the reconfigurable resources while

carrying out both prefetch and replacement techniques that cooperate to hide most of the

reconfiguration delays. In our scheduling environment, task-graphs are analyzed at design-

time to extract useful information. This information is used at run-time to obtain near-

optimal schedules, escaping from local-optimum decisions, while only carrying out simple

computations.

SECURED AND EFFICIENT OUTSOURCING OF LINEAR

PROGRAMMING IN CLOUD COMPUTING

Abstract

Cloud computing is the delivery of computing as a service rather than a product. It enables
customers with limited computational resources to outsource their large computation
workloads to the cloud. Economically, it provides massive computational power, bandwidth,
storage, and even appropriate software that can be shared in a pay-per-use manner. Even
though it provides tremendous benefits, security is the main concern when computing a
customer's confidential data in the cloud. Mechanisms have to be designed to protect the
data and also to detect malicious behaviour of computations in the cloud. Fully homomorphic
encryption is a technique that was used to encrypt data for computational outsourcing: the
computation was represented by a combinational Boolean circuit evaluated on encrypted
private inputs, but the high complexity and circuit sizes made it impractical to apply. To
provide a higher level of abstraction, a linear programming (LP) formulation was designed
instead; it decomposes the LP computation outsourcing into public LP solvers running on
the cloud and private LP parameters owned by the customer. As a result, both security and
efficiency are improved over the general circuit representation.
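
As a rough illustration of the decomposition idea described above, the following Python sketch (assuming numpy and scipy are available) disguises a toy LP with a random positive diagonal scaling, a permutation, and an invertible left multiplier before handing it to a public solver, and then recovers the original solution. This is only a simplified illustration of the general approach, not the protocol proposed in the paper; all matrices and values are made up for the example.

    # Toy illustration of outsourcing a disguised LP to a public solver.
    import numpy as np
    from scipy.optimize import linprog

    rng = np.random.default_rng(0)

    # Customer's private LP:  min c^T x  s.t.  A x = b, x >= 0.
    A = np.array([[1.0, 1.0, 1.0],
                  [2.0, 0.5, 1.0]])
    b = np.array([4.0, 5.0])
    c = np.array([1.0, 2.0, 3.0])

    # Disguise: x = D P y with a random positive diagonal D and permutation P,
    # and multiply the constraints by a random invertible Q.  Because D is
    # positive and P only permutes, x >= 0 holds exactly when y >= 0.
    n, m = A.shape[1], A.shape[0]
    D = np.diag(rng.uniform(0.5, 2.0, n))
    P = np.eye(n)[rng.permutation(n)]
    Q = rng.uniform(-1.0, 1.0, (m, m)) + m * np.eye(m)   # invertible with high probability

    M = D @ P                 # x = M y
    A_pub = Q @ A @ M         # what the cloud sees
    b_pub = Q @ b
    c_pub = M.T @ c

    # "Cloud" solves the disguised LP with a public solver.
    res = linprog(c_pub, A_eq=A_pub, b_eq=b_pub,
                  bounds=[(0, None)] * n, method="highs")

    # Customer recovers the solution of the original LP.
    x = M @ res.x
    print("recovered x:", np.round(x, 4), " objective:", round(float(c @ x), 4))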

Mr.T.Kapilachander M.E.,

Assistant professor,

ECE Department,

Sudharsan Engineering,

College,Pudukkottai.

Vijayalakshmi M

PG Student, Department of CSE,

Oxford Engineering College

[email protected]

Murugan V

Senior Lecturer, Department

of IT, Oxford Engineering

College

[email protected]


FAIRNESS SCHEDULER WITH HIERARCHICAL

CLASSIFICATION FOR RESOURCE MANAGEMENT IN GRID

ENVIRONMENT

Abstract

Resource management is a vital task in a grid computing environment. It is the
responsibility of the grid system to ensure that all applications, clients, and tasks requesting
resources obtain them in a timely manner. Various resource allocation strategies exist that
guide grid systems in making resource allocation decisions. The paper describes
proportional-share schedulers with O(1) overhead for resource management in a grid
environment. The fair-share scheduler(s) and hierarchical classification ensure that resources
are allocated in an efficient manner, which in turn ensures fairness in resource allocation.

A NEW WIRELESS WEB ACCESS FOR WEB SURFING BASED ON

CLOUD COMPUTING

Abstract

The growing popularity of wireless networks has led to cases of heavy utilization, in which
the wireless portion of the network becomes a major performance bottleneck. Understanding
the behavior of the wireless portion of such networks is critical to ensure their robust
operation, and this understanding can also help optimize network performance. This paper
deals with wireless networks associated with cloud computing, in which the interconnections
between nodes are implemented without the use of wires (e.g., the mobile web), and with
how such networks are accessed. Wireless operation permits services, such as long-range
communications, that are impossible or impractical to implement with the use of wires.
Here the Mobile Web refers to the use of Internet-connected applications, or browser-based
access to the Internet, from a mobile device such as a smartphone or tablet computer
connected to a wireless network.

P.Dhivya

M.E Computer Science and

Engineering, SNS College of Technology,

Coimbatore

M.Sukanya

M.E Software Engineering

SNS College of Technology,

Coimbatore

Rajeshwari.P

PG scholar

Angel College of Engineering &

Technology

[email protected]

Padmavathi.S

PG scholar

Angel College of Engineering &

Technology

[email protected]


USER MOVEMENT AND SERVICE PREDICTION SCHEME FOR

MOBILE ENVIRONMENT

Abstract

Spatial data mining techniques are used to mine data values with location

information. Temporal analysis is applied to perform time analysis. Location and time

based analysis is applied on a variety of mobile service analysis application. Different

services are provided under the mobile environment. User behavior analysis is carried out

using the sequential pattern mining methods. Location and time factors are used in the

sequential pattern mining methods. Clustering techniques are used to group up the

transactions based on the transaction relevancy. Most of the existing techniques focus on

discovering mobile patterns from the whole logs. However, this kind of patterns may not be

precise enough for predictions since the differentiated mobile behaviors among users and

temporal periods are not considered. Cluster-based Temporal Mobile Sequential Pattern

Mine (CTMSP-Mine) is used to discover the Cluster-based Temporal Mobile Sequential

Patterns (CTMSPs). A prediction strategy is proposed to predict the subsequent mobile

behaviors. In CTMSP-Mine, user clusters are constructed by Cluster-Object-based Smart

Cluster Affinity Search Technique (CO-Smart-CAST). The similarities between users are

evaluated by the Location-Based Service Alignment (LBS-Alignment). A time

segmentation approach is presented to find segmenting time intervals where similar mobile

characteristics exist. The CTMSP mine model is enhanced with hybrid prediction model

(HPM) and recursive motion functions (RMF) scheme to improve the pattern identification

and prediction accuracy levels. The system development is planned with Java language and

Oracle database.

EFFICIENT DATA BROADCASTING IN UNDERWATER WIRELESS

COMMUNICATION NETWORKS

R.Navitha

PG Scholar,

Department of CSE,

Kongu Engineering College, Erode, Tamil Nadu, India

Email: [email protected]

Abstract

Underwater wireless communications can enable many scientific, environmental,

commercial, safety, and military applications. The design of routing protocols for

Underwater Wireless Sensor Networks (UWSNs) poses many challenges due to the

intrinsic properties of underwater environments. This paper proposes an efficient adaptive

depth-based routing protocol (DBR) for disseminating the data items in the network. DBR

does not require full-dimensional location information of sensor nodes. Instead, it needs

only local depth information of every sensor node in order to forward the data packets. The

main advantage of DBR protocol is that it can handle network dynamics efficiently without

the assistance of a localization service and also it achieves higher packet delivery ratio in

dense networks.
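
The forwarding rule that DBR relies on can be illustrated with a short Python sketch: a node compares its own depth with the depth stamped by the previous hop and rebroadcasts only when the packet has moved sufficiently closer to the surface. The threshold value, the packet fields, and the handler below are illustrative assumptions rather than the protocol's exact specification.

    # Illustrative depth-based forwarding decision (not the full DBR protocol).
    def should_forward(my_depth, packet_depth, depth_threshold=5.0):
        """Forward only if this node is sufficiently shallower than the sender.

        my_depth        -- depth (metres) of the current node
        packet_depth    -- depth of the node that last forwarded the packet
        depth_threshold -- minimum depth improvement, used to suppress
                           redundant forwarders (illustrative value)
        """
        return (packet_depth - my_depth) >= depth_threshold

    def handle_packet(node, packet, history):
        """Toy receive handler: forward unseen packets that improve depth."""
        key = (packet["src"], packet["seq"])
        if key in history:                    # duplicate suppression
            return None
        history.add(key)
        if should_forward(node["depth"], packet["depth"]):
            return dict(packet, depth=node["depth"])   # stamp own depth, rebroadcast
        return None                            # stay silent; deeper than the sender

    # Example: a node at 40 m receives a packet last forwarded at 80 m.
    history = set()
    node = {"id": 7, "depth": 40.0}
    pkt = {"src": 1, "seq": 12, "depth": 80.0, "payload": b"sensor-reading"}
    print(handle_packet(node, pkt, history))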

C.Kavitha,

Sengunthar Engineering College,

Tiruchengode.

[email protected]

M.Sakthivel., M.E., (Ph.D).,

Sengunthar Engineering College,

Tiruchengode.


EFFICIENT DATA COLLECTION OVER MULTITRAFFIC FLOW

USING THE PASS NODE DEPLOYMENT

S.SARIGA,

PG Scholar, Mount Zion College

of Engineering and

Technology,Lenevilaku

Email: [email protected]

Vivian Rachel jayson,

Assistant Professor, Mount Zion

College of Engineering and

Technology,Lenevilaku

Email:

[email protected]

Abstract

In a heterogeneous wireless sensor network (WSN), relay nodes (RNs) are adopted

to relay data packets from sensor nodes (SNs) to the base station (BS). The deployment of

the RNs can have a significant impact on connectivity and lifetime of a WSN system. This

paper studies the effects of random deployment strategies. We first discuss the biased

energy consumption rate problem associated with uniform random deployment. This

problem leads to insufficient energy utilization and shortened network lifetime. To

overcome this problem, we propose two new random deployment strategies, namely, the

lifetime-oriented deployment and hybrid deployment. The former solely aims at balancing

the energy consumption rates of RNs across the network, thus extending the system

lifetime. However, this deployment scheme may not provide sufficient connectivity to SNs

when the given number of RNs is relatively small. The latter reconciles the concerns of

connectivity and lifetime extension. Both single-hop and multihop communication models

are considered in this paper. With a combination of theoretical analysis and simulated

evaluation, this study explores the trade-off between connectivity and lifetime extension in

the problem of RN deployment. It also provides a guideline for efficient deployment of

RNs in a large-scale heterogeneous WSN.

ENERGY EFFICIENT ON-DEMAND ROUTING PROTOCOL FOR

LOCAL MONITORING IN WIRELESS SENSOR NETWORKS

P.Kavitha

PG Scholar, Computer Science and

Engineering, Muthayammal Engineering

College

[email protected]

Prof.M.Sayee Kumar

Assistant Professor, Computer Science

and Engineering, Muthayammal

Engineering College

[email protected]

Abstract

Sleep-wake protocols are critical in sensor networks to ensure long-lived operation.

However, an open problem is how to develop efficient mechanisms that can be

incorporated with sleep-wake protocols to ensure both long lived operation and a high

degree of security. Our contribution in this paper is to address this problem by using local

monitoring, a powerful technique for detecting and mitigating control and data attacks in

sensor networks. In local monitoring, each node oversees part of the traffic going in and out

of its neighbors to determine if the behavior is suspicious, such as, unusually long delay in

forwarding a packet.


ENHANCED MEDIUM ACCESS CONTROL USING CROSS-LAYER

LINK ASYMMETRY INTERACTION FOR WIRELESS MESH

NETWORK

S.Priya

M.E- Computer Science & Engineering

Muthayammal Engineering College

Namakkal- 637 408

India

Email:[email protected]

D.Dhivya

Lecturer –Computer Science &

Engineering

Narasu‘s Sarathy Institute of Technology

Salem

India

Email:[email protected]

Abstract

In wireless mesh networks, the cross-layer approach eliminates the link asymmetry
problem that occurs due to heterogeneous transmission ranges. However, in the transport
layer, network events such as channel noise, mobility, and congestion deteriorate the quality
of an existing end-to-end TCP connection. Two new mechanisms, namely the TCP Fractional
Window Increment scheme and the ROute-failure notification using BUlk-losS Trigger
(ROBUST) policy, are designed to address the problem in the TCP connection. The TCP
Fractional Window Increment scheme is a preventive solution used to reduce congestion-
driven wireless link loss. The ROBUST policy is a corrective solution that enables
on-demand routing protocols to suppress overreactions induced by the aggressive TCP
behavior. This approach improves the performance and throughput of the wireless mesh
network. Simulations are performed to evaluate the performance of the network.
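
The preventive idea behind the TCP Fractional Window Increment scheme can be sketched in a few lines of Python, assuming a per-ACK update of the form cwnd = cwnd + alpha/cwnd with a fractional alpha below one; the constant used here is illustrative and not the value prescribed by the scheme.

    # Illustrative fractional congestion-window growth (assumed update rule).
    def fractional_window_increment(cwnd, alpha=0.25):
        """Grow the congestion window by a fraction of a segment per ACK."""
        return cwnd + alpha / cwnd

    cwnd = 2.0                      # segments
    for ack in range(200):          # simulate 200 returning ACKs
        cwnd = fractional_window_increment(cwnd)
    print(round(cwnd, 2))           # grows far more slowly than standard TCP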

MULTIUSER DETECTION AND COLLISION AVOIDANCE IN

WIRELESS NETWORK

S.Arunkumar,

M.E. Network Engineering,

Vel Tech Multi Tech Dr.RR Dr.SR

Engineering College.

Email: [email protected].

Y.Kallifulla,

Assistant Professor,

Department of Information Technology,

Vel Tech Multi Tech Dr.RR Dr.SR

Engineering College.

Abstract

Combating collisions is one of the major challenges in the design of MAC algorithms for
wireless networks. To overcome this problem, we propose a new MIMO/MPR-aware
cross-layer MAC/PHY design that is capable of combating collisions through the use of a
multiple packet reception technique. Recent advances in MIMO communication have
provided the possibility of simple detection of colliding packets. Analytical and simulation
results show that the proposed MAC design can considerably improve the throughput of a
WLAN operating over lossy links. Exploiting multi-packet reception (MPR) increases the
order of the capacity of random wireless networks, and MPR provides a better capacity
improvement for wireless networks. The use of MIMO in point-to-point as well as multiuser
PHY communication has been extensively studied in the recent literature. The proposed
analysis applies to both packet transmission schemes employed by DCF, namely the basic
access and the RTS/CTS access mechanisms.


SUPPORTING EFFICIENT AND SCALABLE MULTICASTING

OVER MOBILE AD HOC NETWORKS

F.AngelIgnishyaa,[II-ME]

Department of Computer Science and Engineering,

M.Kumarasamy college of Engineering

Thalavapalayam, Karur (Dt.), Tamil Nadu, India.

[email protected]

Abstract

Group communications are important in Mobile Ad hoc Networks (MANETs).

Multicast is an efficient method for implementing group communications. However, it is

challenging to implement efficient and scalable multicast in MANET due to the difficulty

in group membership management and multicast packet forwarding over a dynamic

topology. I propose a novel Efficient Geographic Multicast Protocol (EGMP). EGMP uses

a virtual-zone-based structure to implement scalable and efficient group membership

management. A network wide zone-based bidirectional tree is constructed to achieve more

efficient membership management and multicast delivery. The position information is used

to guide the zone structure building, multicast tree construction, and multicast packet

forwarding, which efficiently reduces the overhead for route searching and tree structure

maintenance. Several strategies have been proposed to further improve the efficiency of the

protocol, for example, introducing the concept of zone depth for building an optimal tree

structure and integrating the location search of group members with the hierarchical group

membership management. Finally, I design a scheme to handle empty zone problem faced

by most routing protocols using a zone structure. The scalability and the efficiency of

EGMP are evaluated through simulations and quantitative analysis. My simulation results

demonstrate that EGMP has high packet delivery ratio, and low control overhead and

multicast group joining delay under all test scenarios, and is scalable to both group size and

network size. Compared to Scalable Position-Based Multicast (SPBM), EGMP has

significantly lower control overhead, data transmission overhead, and multicast group

joining delay.


IMPROVING ADCC FOR HOME AUTOMATION NETWORKS IN

HIGH VOLUME SENSED DATA

Abstract

A wireless sensor network (WSN) consists of spatially distributed autonomous

sensors to monitor physical or environmental conditions, such as temperature, sound,

vibration, pressure, motion or pollutants and to cooperatively pass their data through the

network to a main location. There is growing interest in wireless sensor network

technology in the home automation field, but as the number of sensor nodes in the home

increases and as the data traffic generated by such nodes grows, the network becomes more

congested. Due to resource constraints, a congestion control scheme for wireless sensor

networks must be designed for simplicity and energy efficiency. In the existing system, ADCC

(Adaptive Duty-cycle Based Congestion Control), a lightweight congestion control scheme

using duty-cycle adjustment for wireless sensor networks was used. This scheme uses both

the resource control and traffic control approaches according to the amount of network

traffic for the congestion avoidance. The proposed work improves energy efficiency with

congestion control scheme implemented for Home Automation Network (HAN) with

wireless sensor network (WSN). The Improvement is made on Adaptive Duty-cycle Based

Congestion Control (ADCC) scheme. The deployment of Improved ADCC involves the

aggregation of incoming traffic and the node's channel capacity variation.
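
A minimal Python sketch of a duty-cycle based control loop is given below: the radio duty cycle is raised when the local queue builds up and lowered when traffic is light. The thresholds, step size, and queue trace are illustrative assumptions, not the parameters of ADCC or of the proposed improvement.

    # Illustrative duty-cycle adjustment driven by queue occupancy.
    def adjust_duty_cycle(duty, queue_len, queue_cap,
                          high=0.75, low=0.25, step=0.05,
                          d_min=0.05, d_max=1.0):
        occupancy = queue_len / queue_cap
        if occupancy > high:                 # congestion building up: serve faster
            duty = min(d_max, duty + step)
        elif occupancy < low:                # link under-used: save energy
            duty = max(d_min, duty - step)
        return duty

    duty = 0.10
    for q in [2, 5, 9, 12, 14, 8, 3, 1]:     # sampled queue lengths (capacity 16)
        duty = adjust_duty_cycle(duty, q, 16)
        print(f"queue={q:2d}  duty_cycle={duty:.2f}")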

A NOVEL APPROACH FOR NETWORK SECURITY USING DATA

MINING

M.M Gowthul Alam

Assistant professor

National College of Engineering,

Maruthakulam, Tirunelveli.

[email protected]

P.Rama Subramanian

P.G student

National College of Engineering,

Maruthakulam, Tirunelveli

[email protected]

Abstract

Data mining is the process of automatically searching large volumes of data for patterns.

The Network Security System has been developed to impart security to the files accessed

by the users logging into a secured network. The System determines whether the user is an

authorized user of the network. The users are validated using their login id, password and

secret codes. If the user login is valid then they are given access to view the files according

to their category classification. If the user is an intruder then an alert message is displayed.

The system detects all intruders using the decision tree algorithm, which is a type of data
mining algorithm.
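
A minimal sketch of the classification step, assuming scikit-learn is available, is shown below. The login features (password match, secret-code match, failed attempts) and the tiny training set are illustrative stand-ins for the data such a system would actually collect.

    # Illustrative intruder classification with a decision tree (scikit-learn).
    from sklearn.tree import DecisionTreeClassifier

    # Features per login attempt: [password_ok, secret_code_ok, failed_attempts]
    X = [[1, 1, 0], [1, 1, 1], [1, 0, 3], [0, 0, 5], [0, 1, 4], [1, 1, 2], [0, 0, 6]]
    y = [0, 0, 1, 1, 1, 0, 1]        # 0 = authorised user, 1 = intruder

    clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

    attempt = [[1, 0, 4]]            # correct password, wrong secret code, 4 failures
    print("intruder" if clf.predict(attempt)[0] == 1 else "authorised")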

A.M.NATARAJAN

Professor & Chief

Executive, Bannari Amman

Institute of Technology,

Sathyamangalam, Erode

N.PREMALATHA,

Assistant Professor,

Department of CSE, Kongu

Engineering College, Erode,

[email protected]

J.YASODHA,

PG Student, CSE, Kongu

Engineering College,

Erode,

[email protected]


A SECURE AUTHENTICATION FOR BLOCKING MISBEHAVING

ACCESS USING TICKET BASED METHOD

Jackulin.C ,

PG Scholar, Mount Zion College of

Engineering and Technology,lenevilaku

Email: [email protected]

D.Elavarasi M.E

Assistant Professor, Mount Zion

College of Engineering and Technology,

lenevilaku

Abstract

The main purpose of the ticket-based method is to provide a credential system that
authenticates users who misbehave through anonymous networks. These anonymous systems
help users hide their IP addresses from the server, so the server cannot identify the clients
responsible for defacement, and most abusers act against servers with the help of such
anonymous networks. Faced with this problem, a server will deny access to the entire route,
which also affects the users who communicate properly through that route; blocking the
whole route thus penalizes the other users. The ticket-based system provides authentication
for the misbehaving users: it obtains a blacklist from the server and checks every user
against it with the help of the pseudonym manager [6]. The server's blacklist helps restrict
users of the anonymous network; if a user misbehaves, the corresponding pseudonyms are
added to the blacklist, which further helps the nymble manager block those users from the
route. This system ensures that users are aware of their blacklist status before they present
a nymble, and disconnect immediately if they are blacklisted.
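
The blacklist check can be sketched as follows in Python: a pseudonym manager derives a chain of one-time tickets from an evolving hashed seed, and the server admits a request only if the presented ticket is not blacklisted. The sketch deliberately omits the blinding and the separation of trust between the pseudonym manager and the nymble manager; the key and derivation below are illustrative assumptions only.

    # Heavily simplified ticket/blacklist illustration (not the real Nymble design).
    import hashlib, hmac

    PM_KEY = b"pseudonym-manager-secret"          # illustrative key

    def ticket_chain(user_id: bytes, linkability_window: int, periods: int):
        """Derive one ticket per time period from an evolving seed."""
        seed = hmac.new(PM_KEY, user_id + bytes([linkability_window]),
                        hashlib.sha256).digest()
        tickets = []
        for _ in range(periods):
            tickets.append(hashlib.sha256(b"ticket" + seed).hexdigest())
            seed = hashlib.sha256(b"seed" + seed).digest()   # evolve the seed
        return tickets

    def server_admits(ticket: str, blacklist: set) -> bool:
        """The server admits a request only if the ticket is not blacklisted."""
        return ticket not in blacklist

    alice = ticket_chain(b"alice", 1, periods=4)
    blacklist = set()
    print(server_admits(alice[0], blacklist))     # True: not yet blacklisted
    blacklist.add(alice[1])                       # server blacklists a later ticket
    print(server_admits(alice[1], blacklist))     # False: blocked from now on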

A SECURE KEY TRANSFER IN DECENTRALIZED SECURE

GROUP COMMUNICATION BY USING MDS CODES

T.Sangeetha, R.B Dravidaa Priyaa
M.E Computer & Communication Engg
SNS College of Technology, Coimbatore
[email protected]

Mr.V.Jeyakrishnan
Assistant Professor
SNS College of Technology, Coimbatore
[email protected]

Abstract

In this paper we mainly focus on the storage, computation, and communication costs of
secure dynamic multicast key distribution. Efficient key distribution is an important problem
in secure group communication. Group membership is dynamic: a new key must be
distributed, using some encryption algorithm, whenever a member joins or is revoked from
the group. Previous work focuses on complete key graph algorithms and hierarchical key
management algorithms that follow a centralized method of secure key distribution. We
propose a new protocol framework, Secure Group Overlay Multicast (SeGrOM), that
follows a decentralized method and applies the protocol in a hierarchical structure to form
a hybrid key management algorithm. Instead of a conventional encryption algorithm, MDS
codes are used; easily combined with the hierarchical structure, they provide low and
balanced communication cost, storage cost, and computation complexity for secure dynamic
multicast key distribution.


ADAPTIVE AUDIO STEGANOGRAPHY BASED ON LSBMR

ALGORITHM

Sharmila.B

II M.E [Computer and Communication Engg]

Kongu Engineering College

Perundurai. Erode Dt., India

e-mail :[email protected]

Abstract

Steganography is the art of hiding the fact that communication is taking place by hiding
information in another medium. Many different carrier file formats can be used, but audio
files are popular because of their frequency on the Internet. For hiding secret information
in audio there exists a large variety of steganographic techniques. The least-significant-bit
(LSB) based approach is the simplest type of steganographic algorithm. In the existing
approaches, the decision of choosing the region within an audio file is made without
considering the relationship between the audio samples and the size of the secret message.
Thus the samples in low-power audio will be ruined after data hiding even at a low data
rate, so choosing high-power audio samples for data hiding is a solution. This paper presents
the results of analyzing the performance of data hiding in the high-power portions of the
audio. Moreover, to increase the complexity of intrusion detection, some preprocessing is
done. The adaptive steganography for audio files is experimented with WAV files.
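
A minimal Python/numpy sketch of LSB embedding restricted to comparatively high-power samples is given below; it illustrates the general idea only and is not the LSBMR algorithm or the preprocessing used in the paper. The energy threshold and the synthetic cover signal are assumptions for the example.

    # Illustrative LSB embedding in high-power 16-bit PCM samples.
    import numpy as np

    def _embedding_positions(samples: np.ndarray, threshold: int) -> np.ndarray:
        # Select high-power samples; masking out the LSB first keeps the
        # selection identical before and after embedding.
        return np.flatnonzero(np.abs(samples.astype(np.int32) & ~1) > threshold)

    def embed_lsb(samples: np.ndarray, message: bytes, threshold: int = 2000):
        stego = samples.copy()
        bits = np.unpackbits(np.frombuffer(message, dtype=np.uint8))
        pos = _embedding_positions(stego, threshold)
        if len(bits) > len(pos):
            raise ValueError("cover audio too small for this message")
        idx = pos[:len(bits)]
        stego[idx] = (stego[idx] & ~1) | bits.astype(stego.dtype)
        return stego

    def extract_lsb(stego: np.ndarray, n_bytes: int, threshold: int = 2000):
        pos = _embedding_positions(stego, threshold)
        bits = (stego[pos[:n_bytes * 8]] & 1).astype(np.uint8)
        return np.packbits(bits).tobytes()

    # 16-bit PCM cover signal (in practice read from a WAV file, e.g. with the
    # standard `wave` module).
    cover = (3000 * np.sin(np.linspace(0, 200, 8000))).astype(np.int16)
    stego = embed_lsb(cover, b"secret")
    print(extract_lsb(stego, 6))       # b'secret'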

AN ARCHITECTURE TO PROVIDE AUTHENTICATION IN

ANONYMOUS NETWORKS

P.Yogananth,

Department of CSE,

M. Kumarasamy College of Engineering,

Karur-639 113, India.

Email: [email protected],

Tel: +91 9965868842.

Abstract

Anonymizing networks such as Tor allow users to access Internet services privately by
using a series of routers to hide the client's IP address from the server. The success of such
networks, however, has been limited by users employing this anonymity for abusive
purposes such as defacing popular Web sites. Web site administrators routinely rely on
IP-address blocking to disable access for misbehaving users, but blocking IP addresses is
not practical if the abuser routes through an anonymizing network. As a result, administrators
block all known exit nodes of anonymizing networks, denying anonymous access to
misbehaving and behaving users alike. To address this problem, I present Nymble, a system
in which servers can "blacklist" misbehaving users, thereby blocking users without
compromising their anonymity. Our system is thus agnostic to different servers' definitions
of misbehavior: servers can blacklist users for whatever reason, and the privacy of
blacklisted users is maintained.

Shanthakumari.R

Assistant Professor/Information

Technology

Kongu Engineering College

Perundurai. Erode Dt., India

email: [email protected]

Mr.P.Saravanan ME

Department of CSE,

M. Kumarasamy College of

Engineering,

Karur-639 113, India.


AN IMPROVED METHOD BASED ON ANONYMIZATION

ALGORITHMS FOR PRESERVING SOURCE-LOCATION PRIVACY

IN WIRELESS SENSOR NETWORK

S.Reka,

PG Scholar,

Mount Zion College of

Engineering and Technology,

Lenevilaku.

[email protected]

Abstract

Sensor networks have been widely employed in many real-time applications. One of the

most obvious challenges appearing to threaten the successful deployment of sensor

networks is privacy issues including source-location privacy which can not be adequately

addressed by general security mechanisms. Focusing on this important kind of privacy,

among the many approaches proposed in the literature, self-adjusting phantom routing is a very

successful one. But it still has some weaknesses. In this paper, we propose an improved

version of it to enhance its performance. This method can decrease energy consumption and

communication cost while increasing the accuracy of the aggregate locations by minimizing

their monitored areas.

AUTHENTIC NON SYMMETRIC GROUP KEY AGREEMENT

USING BROADCAST ENCRYPTION

Eva Mariam Babu, PG Scholar,

Sri Ramakrishna Engineering College, Coimbatore.

Email: [email protected]

Abstract

Asymmetric GKA enables users of a group to derive a common encryption key which is

accessible to any outsiders. This provides confidential communications and also allows any

outsider to send encrypted messages to the users of a group. An Authenticated ASGKA

protocol offers security against active attacks in open networks. Based on this protocol, a

broadcast encryption system without relying on a trusted dealer to distribute the secret keys

to the users is proposed. The improved system also permits a sender to select receivers for
broadcast encryption and achieves perfect forward security.

C.Senthamarai, M.E., Ph.D.,

Assistant Professor, Mount Zion College of

Engineering and Technology,

Lenevilaku.

[email protected]


AUTOMOTIVE CAN NETWORK ATTACKED BY SECURITY

THREATS

C. Umaa,

PG Scholar,

Anna University of Technology,

Madurai,[email protected]

Abstract

The IT security of automotive system is an evolving area of research. To analyse the

current situation and the potentially growing tendency of arising threats we performed

several practical tests on recent automotive technology. This article summarizes the result

of four practical tests performed on the control system for the window lift, the airbag
control system, the warning light, and the central gateway. The paper further discusses two
selected countermeasures, namely intrusion detection and IT forensics measures. While these
reactive approaches are short-term measures that could already be added to today's
automotive IT architecture, long-term concepts are also briefly introduced; these are mainly
preventive but will require a major redesign. Besides a short overview of the respective
research approaches, we discuss their individual requirements, potential, and restrictions.

BALANCING REVOCATION AND STORAGE TRADE-OFFS IN

PRIVACY-PRESERVING UNIVERSAL AUTHENTICATION

PROTOCOL

M.Saranya,

PG Student,

Sri Krishna College of Technology,

[email protected]

Abstract

A novel protocol, called Priauth, achieves privacy-preserving universal authentication for
wireless communications. Verifier-Local Revocation Group Signature with Backward
Unlinkability (VLR-GS-BU) can satisfy all of its requirements. Priauth belongs to the class
of universal authentication protocols, in which the same protocol and signaling flows are
used regardless of the domain (home or foreign) a roaming user is visiting. A roaming
service should be deployed to allow people to get connected seamlessly using their devices,
without being limited by the geographical coverage of their own home networks. The key
is used to encrypt data transmitted to the servers or users, and the efficient distribution of a
new key under multiple membership changes is a critical problem in secure group
communication. The goal of the enhancement is to evaluate the trade-off between storage
and revocation cost: storage is computed in terms of the keys that each user (respectively,
the VA) maintains, while revocation cost is computed in terms of the encryptions performed
and the number of messages transmitted by the VA (visitor agent).

A.Muthu krishnan

Faculty, ECE, Anna university

of technology, Madurai

P.Dhivya,

Assistant Professor

Sri Krishna College of Technology,

[email protected]


BLACKLISTING MISBEHAVING USERS IN INDISTINCTIVE

NETWORKS

P.Suganya,

PG Scholar,

Computer Science and Engineering,

Srinivasan Engineering College.

[email protected]

Abstract

The Onion Router is used to access Internet services privately by hiding the client's IP
address from the server. The success of such networks, however, has been limited by users
exploiting this anonymity to deface popular Web sites. Administrators monitor and rely on
IP-address blocking to disable access for users who misbehave, but if the user accesses the
service from an anonymizing network it is impossible to block, and as a result web site
administrators block all users coming from the anonymizing network. To solve this problem
we use Nymble, which can blacklist only the misbehaving users even though they access
the service through anonymizing networks. Servers can blacklist these users while the
privacy of blacklisted users is maintained.

CONCEALMENT OF INFORMATION IN INACTIVE AUDIO

FRAMES OF VoIP

M.Karthick,

PG Student, Department of CSE,SKCT,

Coimbatore, Tamilnadu, India.

Mail: [email protected]

Abstract

Steganography is the hiding of a secret message within an ordinary message and the

extraction of secret message at its destination. In digital steganography, electronic

communications may include steganographic coding inside of a transport layer, such as a

document file, image file, program or protocol. This paper describes how to segregate the

audio that are streaming in the Voice over Internet Protocol (VoIP) and a steganography

algorithm for embedding data in the inactive frames of low bit rate audio streams. VoIP is

an IP telephony term for a set of facilities used to manage the delivery of voice information

over the Internet. Voice activity detection (VAD) is a technique used in speech processing
in which the presence or absence of human speech is detected. The main uses of VAD are
in speech coding and speech recognition; it can avoid unnecessary coding/transmission of

silence packets in Voice over Internet Protocol applications, saving on computation and on

network bandwidth. This VAD algorithm decides whether the current audio frame is an

active voice or inactive voice. The experimental results show the VAD process. The

purpose of segregating the audio frames is to embed the data in audio frame. These audio

frames are encoded with G.723.1 source codec.
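
A minimal energy-based VAD sketch in Python/numpy is shown below: each 240-sample frame (30 ms at 8 kHz, the G.723.1 frame size) is marked active or inactive by comparing its short-term energy with a threshold. The threshold and the synthetic signal are illustrative; the codec's own VAD logic is more elaborate.

    # Illustrative energy-threshold VAD over 30 ms frames.
    import numpy as np

    def classify_frames(samples: np.ndarray, frame_len: int = 240, threshold: float = 1e5):
        """Return a list of (start_index, is_active) decisions per frame."""
        decisions = []
        for start in range(0, len(samples) - frame_len + 1, frame_len):
            frame = samples[start:start + frame_len].astype(np.float64)
            energy = float(np.sum(frame * frame)) / frame_len   # mean power
            decisions.append((start, energy >= threshold))
        return decisions

    # Example: half a second of "speech" followed by near-silence, at 8 kHz.
    rng = np.random.default_rng(1)
    speech = (2000 * rng.standard_normal(4000)).astype(np.int16)
    silence = (50 * rng.standard_normal(4000)).astype(np.int16)
    signal = np.concatenate([speech, silence])
    active = sum(1 for _, a in classify_frames(signal) if a)
    print(f"{active} active frames out of {len(signal) // 240}")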

Prof.J.Suganya,

Assistant Professor, Computer

Science and Engineering,

Srinivasan Engineering College.

[email protected] [email protected]

P.Madhavan,

Assistant Professor,

Department of CSE,SKCT,

Coimbatore, Tamilnadu, India.

Mailto:[email protected]


DETECTION OF MALICIOUS USER IN COOPERATIVE SYSTEM

Abstract

Detection of primary users is an important task for cognitive radio systems. By countering
shadowing effects, the performance of the CR spectrum sensing system is improved through
the cooperation of a few spectrum users. The final decision regarding the presence or
absence of the PU signal is made by a parallel fusion network in which sensors send their
sensing information to an access point. Severe degradation of system performance takes
place when malicious users are present. In this paper they are identified using outlier
detection techniques for a cooperative sensing system, considering constraints such as the
lack of information about the primary signal propagation environment and the small number
of sensing data samples. A novel method is proposed here to identify such users by using
the spatial information of the CR sensors, and the performance results are simulated.
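
A minimal sketch of an outlier test is shown below, using a robust median-absolute-deviation rule on the energy values reported by the cooperating sensors; the data, the cut-off, and the rule itself are illustrative stand-ins for the paper's spatially-aware method.

    # Illustrative robust outlier detection on cooperative sensing reports.
    import numpy as np

    def flag_outliers(reports: np.ndarray, cutoff: float = 3.5):
        """Return a boolean mask marking reports that deviate too much."""
        median = np.median(reports)
        mad = max(np.median(np.abs(reports - median)), 1e-9)   # avoid divide-by-zero
        modified_z = 0.6745 * (reports - median) / mad
        return np.abs(modified_z) > cutoff

    # Reported received-energy values (dB) from 8 cooperating CR sensors;
    # sensors 2 and 6 always report a free channel to capture it for themselves.
    reports = np.array([-92.1, -91.7, -60.0, -92.5, -93.0, -91.9, -55.0, -92.2])
    print(np.flatnonzero(flag_outliers(reports)))   # indices of suspected users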

DYNAMIC PATH SELECTION FOR SECURE COMMUNICATION IN

PEER-TO-PEER SYSTEMS

AARTHI M,

PG student, Oxford

Engineering College,

Trichy.

[email protected]

Abstract

Anonymizing Peer-to-Peer (P2P) systems often incurs extra traffic costs. A peer-to-peer
network does not contain a centralized server; communication therefore takes place directly
between the peers in the network, and communication between peers requires a path to
transfer data or information from source to destination. The existing approach is a
lightweight, non-path-based mutual anonymity protocol for decentralized peer-to-peer
systems, but security is not enhanced in non-path-based systems, and a predefined path in
peer-to-peer systems invites attacks and results in non-secured communication. For that
purpose the rumor riding protocol is employed to make the communication secure by using
the "dynamic path selection" technique. The proposed system increases the query speed by
using the rumor riding protocol, and an acknowledgement of the message transfer is also
provided to the source node.

P.Sakthi Vadivel,

PG scholar,Department of ECE,

Sri Krishna college of engineering

and technology,

Coimbatore,

E-mail: [email protected]

K.Suriya,

Assistant professor,Department of ECE,

Sri Krishna college of engineering and

technology,

Coimbatore,

E-mail: [email protected]

JEYASUDHA J,

Lecturer, Oxford

Engineering

College, Trichy

Dr.SAMPATH KUMAR,

Prof & Head in Dept of

CSE, Oxford Engineering

College, Trichy


IMPROVING SECURITY AND EFFICIENCY IN MOBILE IP

NETWORKS

Dr.S.Uma,

Electronics and Communication Engineering,

Coimbatore-641 014

Abstract

The emergence of mobile devices or mobile nodes allows the users to access the network

when they are on the move. As users move frequently from one network to another network

a new IP address is assigned to the mobile node every time it visits a new network.
Informing other nodes of the change in IP address is handled by Mobile IP. A secure and

efficient ID-based registration protocol with user anonymity is proposed in this paper for

IP-based mobile networks. The protocol minimizes the registration delay through a minimal

usage of the identity (ID)-based signature scheme that eliminates expensive pairing

operations. User anonymity is achieved via a temporary identity (TID) transmitted by a

mobile user, instead of its true identity. Additional replay protection from a Foreign Agent

(FA) is included in the registration messages to prevent a possible replay attack. Numerical

analysis and computer simulation results demonstrate that the proposed protocol

outperforms the existing ones in terms of the registration delay and the computational load

on a Mobile Node (MN) while improving security.

LOW COST AND LOW POWER SECURITY SYSTEM BASED ON

GSM TECHNOLOGY

Ms. M. Poongothai

Electronics and Communication Engineering

Coimbatore Institute of Technology

E mail id: [email protected]

Abstract

In this paper, we have proposed a model that alerts the user (owner) in case of intrusion in

office premises or home by calling a pre-defined number. A Passive Infrared (PIR) sensor

is used to identify the presence of intruder by detecting the motion of a body that emits

infrared radiation. A PIC microcontroller is used to receive the signal sensed by the PIR

sensor. On receiving the signal from the sensor the controller sends AT commands to GSM

modem, which initiates a voice call to the pre-defined number. A test bed design has been
implemented and simulation results are presented.

R.Thilagavathy,

Electronics and Communication

Engineering,

Coimbatore-641 014.

[email protected]

Mr. R. Suresh

Electronics and

Communication Engineering

Coimbatore Institute of

Technology

[email protected]


LOW-RATE DDOS ATTACK DETECTION AND MODIFIED IP

TRACEBACK

Prof.T Senthil Vinayakam (Research Scholar),

Electronics and Communication Engineering,

Coimbatore.

Abstract

A distributed denial-of-service (DDoS) attack is one in which a multitude of compromised
systems attack a single target, thereby causing denial of service for users of the targeted
system. The flood of incoming messages essentially forces the target system to shut down,
denying service to legitimate users. There exist several metrics to detect DDoS attacks. In
this paper, we propose using entropy variation to discriminate a DDoS attack from a surge
of legitimate accesses. The simulation results show that the proposed information metric
can effectively detect low-rate DDoS attacks and clearly reduce false positive rates.
Furthermore, the proposed IP traceback, based on TTL-based packet marking, can find all
attacks as well as discard attack traffic. The simulation environment for flooding-type DDoS
attacks is built using the network simulator NS2.
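
A minimal Python sketch of the entropy-based detection step is given below: the Shannon entropy of the source-address distribution is computed per window, and a window is flagged when its entropy deviates sharply from a learned baseline. The window contents, the baseline, and the tolerance are illustrative assumptions, not the metric's actual parameters.

    # Illustrative entropy-variation check on per-window source addresses.
    import math
    from collections import Counter

    def entropy(addresses):
        counts = Counter(addresses)
        total = len(addresses)
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    def is_attack(window, baseline, tolerance=1.5):
        """Flag a window whose entropy differs from the baseline by more than
        `tolerance` bits in either direction."""
        return abs(entropy(window) - baseline) > tolerance

    normal = ["10.0.0.%d" % (i % 20) for i in range(400)]               # ~20 regular clients
    flood = ["172.16.%d.%d" % (i % 250, i % 199) for i in range(400)]   # spoofed sources

    baseline = entropy(normal)
    print(is_attack(normal, baseline))   # False
    print(is_attack(flood, baseline))    # True: entropy jumps with spoofed sources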

MITIGATING SELECTIVE FORWARDING TCP ATTACKS BY

COMBINING MAITH WITH A CHANNEL-AWARE APPROACH IN

MANET

Mrs.M.Kavitha,

AP/CSE,Dept of CSE,

SNS College of

Technology,

[email protected]

Abstract

TCP attacks are the major problem faced by Mobile Ad hoc Networks (MANETs) due to

its limited network and host resources. Attacker traceback is a promising solution in

MAITH Environment which allows a victim to identify the exact location of the attacker

and hence enables the victim to take proper countermeasure near attack origins, for

forensics and to discourage attackers from launching the attacks. While most of the existing

studies on selective forwarding attacks focus on attack detection under the assumption of an

error-free wireless channel,we consider a more practical and challenging scenario that

packetdropping may be due to an attack, or normal loss events such as medium access

collision or bad channel quality. However, attacker traceback in MANET is a challenging

problem due to dynamic network topology, limited network and host resources such as

memory, bandwidth and battery life. We introduce a novel method of TCP attacker

identification in MANET by combining the MAITH method with a channel-aware approach.

C.Suganthi Evangeline,

Electronics and

Communication Engineering,

[email protected].

Manjula.M ,

SNS College

of Technology,

[email protected]

Kishore Kumar.C,

SNS College of

Technology,

[email protected]

MasanaYuvaraj.S,

SNS College of

Technology,

[email protected]


RESTORING NETWORK CONNECTIVITY BY SECURING THE

TOPOLOGY OF WIRELESS SENSOR NETWORKS FROM

MALICIOUS ATTACK

D.Abila Princy,

M.E. Scholar ,

Anna University of Technology ,

Tirunelveli [email protected]

Abstract

A wireless sensor network is a collection of nodes that are organized into a cooperative
network. Topology maintenance protocols such as PEAS and CCP are important for the
operation of sensor networks, and sensor network lifetime is increased by these protocols.
The existing PEAS and CCP protocols do not consider the effect of malicious attacks. Three
types of attacks launched against these protocols are the network substitution attack, the
snooze attack, and the sleep deprivation attack. These attacks reduce the lifetime of the
sensor network and degrade the functionality of sensor applications by decreasing network
connectivity and sensing coverage. The proposed Sec-PEAS and Sec-CCP protocols take
countermeasures to increase the connectivity and coverage lifetime of the sensor network.
An authentication mechanism is provided.

TICKET BASED SECURITY ARCHITECTURE FOR ANONYMIZING

AND TRACING MISBEHAVING CLIENTS IN WIRELESS MESH

NETWORKS

P.Priyadharshini,II-M.E(CSE).,

M.Kumarasamy college of Engineering,

[email protected]

Abstract

Anonymity has received increasing attention in the literature due to users' awareness

of their privacy nowadays. Anonymity provides protection for users to enjoy network

services without being traced. While anonymity-related issues have been extensively

studied in payment-based systems such as e-cash and peer-to-peer (P2P) systems, little

effort has been devoted to wireless mesh networks (WMNs). On the other hand, the

network authority requires conditional anonymity such that misbehaving entities in the

network remain traceable. In this paper, I propose a security architecture to ensure

unconditional anonymity for honest users and traceability of misbehaving users for network

authorities in WMNs. The proposed architecture strives to resolve the conflicts between the

anonymity and traceability objectives, in addition to guaranteeing fundamental security

requirements including authentication, confidentiality, data integrity, and nonrepudiation.

Thorough analysis on security and efficiency is incorporated, demonstrating the feasibility

and effectiveness of the proposed architecture.

S.RajaRajeswari ,

Asst.Prof(CSE) ,

Anna University of Technology

,Tirunelveli


STEALTHY ATTACKS IN WIRELESS ADHOC NETWORKS:

DETECTION IN MULTIHOP NETWORKS

Abstract

In ad hoc wireless networks there are four possible attacks: misrouting, power control,
identity delegation, and colluding collision. A popular method of detecting attacks in
wireless networks is a multihop detection technique using ODMRP. Multihop networks
provide connection-oriented service by partitioning the available bandwidth into multiple
channels. ODMRP provides richer connectivity among multicast members using a
mesh-based approach and supplies multiple routes to one particular destination. A single
hop cannot transfer data effectively over long distances, whereas the multi-hop process
achieves higher gain, throughput, and efficiency. These provide additional protection against
malicious nodes by supporting multiple node-disjoint paths, and NS2 simulations are used
to show their effectiveness against representative control and data attacks.

ACCURACY OPTIMIZATION FOR HIGH-DIMENSIONAL DATA

USING DENCOS CLUSTERING ALGORITHM

M.S.Hema
Department of Computer Science
Kumaraguru College of Technology
Coimbatore, TamilNadu, India
[email protected]

S.Leela
Department of Computer Science
Kumaraguru College of Technology
Coimbatore, TamilNadu, India
[email protected]

Abstract

Subspace clustering seeks to find clusters in a dataset by selecting the most relevant

dimensions for each cluster separately. Dense regions in subspaces are predicted using

density threshold. Identifying such dense regions suffers from a critical problem, called

"the density divergence problem", which incurs a serious loss of clustering accuracy

(precision or recall) in different subspace cardinalities. This paper proposes an innovative

algorithm, called DENCOS (DENsity Conscious Subspace clustering), which adopts a

divide-and-conquer scheme to efficiently discover clusters satisfying different density

thresholds in different subspace cardinalities. As validated by extensive experiments on

various data sets, DENCOS can discover the clusters in all subspaces with high quality and

efficiency.

Gowri S

PG Student, Department of

Computer Science and

Engineering

Oxford Engineering College

[email protected]

Muthu Venkatachalam P

Professor, Department of

Computer Science and

Engineering

Oxford Engineering College

Dr Sampathkumar V

Professor and Head,

Department of Computer

Science and Engineering

Oxford Engineering College


EFFICIENT BRIDGING OF DOMAIN DISTRIBUTION GAP USING

BIG ALGORITHM IN TRANSFER LEARNING

R. Kavitha
Department of Computer Science
Kumaraguru College of Technology
Coimbatore, Tamil Nadu, India
[email protected]

C. Gowtham
Department of Computer Science
Kumaraguru College of Technology
Coimbatore, Tamil Nadu, India
[email protected]

Abstract

In data classification, the lack of labeled data degrades the classification

performance in many real world applications. To overcome this problem transfer learning

techniques are used to classify domain data from other source data. But transfer learning

may not work well when the distribution gap between source domain and target domain is

large. This paper proposes BIG (Bridging Information Gap) algorithm which effectively

extracts useful knowledge from a worldwide knowledge base and links the source and

target domains for improving the classification performance. As validated by extensive

experiments on several real-world cross-domain text classification tasks, BIG algorithm

outperforms several existing domain adaptation approaches significantly.

DOCUMENT SEGMENTATION APPROACHES AND TECHNIQUES

– AN OVERVIEW

Priyadharshini N
M.Phil Research Scholar,
PSGR Krishnammal College for Women
[email protected]

Vijaya MS
Associate Professor,
G.R.Govindarajalu School of Applied Computer Technology
[email protected]

Abstract

This paper presents an analysis of different approaches and techniques of document
segmentation. Document segmentation is a vital phase of the document analysis process. It
includes the separation of text and non-text regions in a document image. A text region
contains pure text regions, tables, mathematical equations, labels, etc., and a non-text region
contains images, graphs, charts, background regions, etc. Two categories of methods have
been used in document analysis, namely 1) hierarchical methods, including bottom-up,
top-down, and hybrid approaches, and 2) non-hierarchical methods. The segmentation
techniques that use these approaches, such as the Hough transform, skew detection,
projection profile cuts, the run-length smoothing algorithm, texture analysis, and
projection-based, pitch-based, recognition-based, and region-based techniques, are discussed
in this paper.


EMAIL ABSTRACTION SCHEME FOR SPAM DETECTION

S.Jansi Rani
II Year ME Student
Kumaraguru College of Technology
[email protected]

E.A.Vimal
Assistant Professor (SRG)
Kumaraguru College of Technology
[email protected]

Abstract

Spam is one of the major problems of today's Internet, bringing financial

damage to companies and annoying individual users. Among the approaches developed to

stop spam, filtering is an important and popular one. In the field of collaborative spam

filtering by near-duplicate detection, a superior e-mail abstraction scheme is required to

more certainly catch the evolving nature of spams. Compared to the existing methods in

prior research, in this project, we explore a more sophisticated and robust e-mail abstraction

scheme, which considers e-mail layout structure to represent e-mails. The specific

procedure SAG is proposed to generate the e-mail abstraction using HTML content in e-

mail, and this newly-devised abstraction can more effectively capture the near-duplicate

phenomenon of spams. Moreover, a complete spam detection system Cosdes has been

designed to efficiently process the near-duplicate matching and to progressively update the

known spam database. One major focus of this work is to design an innovative data

structure to facilitate the process of near-duplicate matching. SpTable and SpTrees (sp

stands for spam) are proposed to store large amounts of the email abstractions of reported

spams.
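
A minimal Python sketch of the layout-abstraction idea is given below: each e-mail is reduced to the n-grams of its HTML tag sequence, and an incoming message is treated as a near-duplicate of a reported spam when the two abstractions are sufficiently similar. The regular expression, the Jaccard test, and the threshold are simplifications standing in for the SAG procedure and the SpTable/SpTrees matching of Cosdes.

    # Illustrative layout-based near-duplicate check for e-mails.
    import re

    TAG_RE = re.compile(r"<\s*(/?[a-zA-Z][a-zA-Z0-9]*)")

    def layout_abstraction(html: str, n: int = 3):
        """Return the set of tag n-grams describing the e-mail's layout."""
        tags = [t.lower() for t in TAG_RE.findall(html)]
        return {tuple(tags[i:i + n]) for i in range(len(tags) - n + 1)}

    def is_near_duplicate(html: str, known_spam_abstractions, threshold: float = 0.8):
        probe = layout_abstraction(html)
        for spam in known_spam_abstractions:
            union = probe | spam
            if union and len(probe & spam) / len(union) >= threshold:
                return True
        return False

    spam_db = [layout_abstraction(
        "<html><body><table><tr><td><a href='x'>WIN</a></td></tr></table></body></html>")]
    incoming = "<html><body><table><tr><td><a href='y'>FREE</a></td></tr></table></body></html>"
    print(is_near_duplicate(incoming, spam_db))   # True: same layout, different words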

AUTOMATIC SEGMENTATION OF RETINAL IMAGES BY USING

MORPHOLOGICAL WATERSHED AND REGION GROWING

METHOD

D.Jebasudha
PG Scholar,
Mount Zion College of Engineering and Technology,
Lenevilaku
[email protected]

S.Kaleeswari M.Tech
Assistant Professor,
Mount Zion College of Engineering and Technology,
Lenevilaku
[email protected]

Abstract

Retinal image segmentation is essential for diagnosing various problems that occur in the
eye. It is a critical task because these images contain very small nerves and some artifacts.
This paper proposes an automatic

morphological watershed segmentation and region growing method to change the

representation of an image into something that is more meaningful and easier to analyze the

interested object. There are several methods that intend to perform segmentation, but it is

difficult to adapt easily and detect the very small nerves accurately. To resolve this

problem, this paper aims to present an adaptable automatic morphological watershed

segmentation and region growing method that can be applied to any type of retinal images

which is exactly diagnosed even with the small changes that


BOOSTING-SVM AND SRM-SVM CASCADE CLASSIFIERS IN FACE

VERIFICATION

Abstract

Face verification in the presence of age progression is an important problem. The problem
of designing and evaluating discriminative approaches without explicit age modelling is
addressed. Gradient orientations are used while magnitude information is discarded. This
representation is further improved when hierarchical information is used, which results in
the gradient orientation pyramid. When combined with a support vector machine, the
gradient orientation pyramid demonstrates excellent performance. The gradient orientation
of each color channel of human faces is robust under age progression. The feature vector,
computed as the cosines of the differences between gradient orientations at all pixels, is
given as input to the support vector machine classifier. The support vector machine is used
to divide the feature space into two classes, one for intrapersonal pairs and the other for
extrapersonal pairs. SVM is not an easy classifier to train; it is also a binary classifier and
it is immune to noise. Boosting-SVM and SRM-SVM can be used to improve the
performance, and the proposed method may give better performance.

DETECTION OF VIDEO COPY USING FINGERPRINT

EXTRACTION

Abstract

A video copy detection system is a content-based search engine focusing on Spatio-

temporal features. It aims to find whether a query video segment is a copy of video from

the video database or not, based on the signature of the video. It is hard to tell whether a
video is a copied video or merely a similar video, since the features of the content are very similar from one

video to the other. The main focus is to detect that the query video is present in the video

database with robustness depending on the content of video and also by fast search of

fingerprints. The Fingerprint Extraction Algorithm and Fast Search Algorithm are adopted

to achieve robust, fast, efficient and accurate video copy detection. As a first step, the

Fingerprint Extraction algorithm is employed which extracts a fingerprint through the

features from the image content of video. The images are represented as Temporally

Informative Representative Images (TIRI). Next step is to find the presence of copy of a

query video in a video database, in which a close match of its fingerprint in the

corresponding fingerprint database is searched using inverted-file-based method.
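
The pipeline can be sketched in Python/numpy as below: a temporally representative image is formed by weighted averaging of a group of frames, a binary fingerprint is derived from its block means, and a query is matched by Hamming distance. The block size, weighting, and the exhaustive search are illustrative simplifications of the TIRI features and the inverted-file search.

    # Illustrative TIRI-style fingerprinting and matching.
    import numpy as np

    def representative_image(frames: np.ndarray) -> np.ndarray:
        """frames: (T, H, W) grayscale; exponentially weight recent frames."""
        weights = 0.9 ** np.arange(frames.shape[0])[::-1]
        return np.tensordot(weights / weights.sum(), frames, axes=1)

    def fingerprint(image: np.ndarray, block: int = 8) -> np.ndarray:
        h, w = (image.shape[0] // block) * block, (image.shape[1] // block) * block
        blocks = image[:h, :w].reshape(h // block, block, w // block, block)
        means = blocks.mean(axis=(1, 3)).ravel()
        return (means > np.median(means)).astype(np.uint8)   # binary fingerprint

    def best_match(query_fp, database):
        distances = [np.count_nonzero(query_fp != fp) for fp in database]
        return int(np.argmin(distances)), min(distances)

    rng = np.random.default_rng(0)
    videos = [rng.integers(0, 256, (12, 64, 64)).astype(np.float64) for _ in range(5)]
    database = [fingerprint(representative_image(v)) for v in videos]

    query = videos[3] + rng.normal(0, 4, videos[3].shape)     # noisy copy of video 3
    print(best_match(fingerprint(representative_image(query)), database))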

Safiya K.M

PG student,

Sri Krishna College of

Technology,

Coimbatore, India,

[email protected]

Prof.S.Bhuvana

Assistant Professor,

Sri Krishna College

of Technology,

Coimbatore, India,

[email protected]

R.Gnana Rubini

PG Student,

Sri Krishna College of

Technology,

Prof.P.Tamijeselvy

Assistant Professor, Sri Krishna College of

Technology,

Ms. P. Anantha Prabha

Assistant Professor, Sri Krishna College of

Technology,


DETECTION OF WML IN BRAIN IMAGES USING

GEOSTATISTICAL FUZZY CLUSTERING

Abstract

White matter lesions, small areas of dead cells found in the parts of the brain that act as
connectors, are detected using magnetic resonance imaging (MRI); their automated detection
has increasingly been an active and challenging research area in computational
neuroscience. This paper

presents new image segmentation models for automated detection of white matter changes

of the brain in an elderly population. The main focus is on unsupervised clustering

algorithms. Clustering is a method for dividing scattered groups of data into several groups.

It is commonly viewed as an instance of unsupervised learning. In machine learning,

unsupervised learning refers to the problem of trying to find hidden structures in unlabeled

data. Unsupervised clustering models, Fuzzy c-means clustering and Geostatistical Fuzzy c-

means clustering algorithms partition the dataset into clusters according to some defined

distance measure. The Region of Interest (ROI) is then extracted on the membership map.

Much more accurate results are obtained by GFCM, which better localized the large regions

of WMLs when compared to FCM.
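
A minimal numpy sketch of standard fuzzy c-means is shown below (the geostatistical variant additionally weights memberships with spatial information, which is omitted here); the synthetic one-dimensional intensities stand in for MRI voxel features.

    # Illustrative fuzzy c-means clustering (standard FCM, not GFCM).
    import numpy as np

    def fuzzy_c_means(X, c=3, m=2.0, n_iter=100, seed=0):
        rng = np.random.default_rng(seed)
        U = rng.dirichlet(np.ones(c), size=len(X))          # memberships, rows sum to 1
        for _ in range(n_iter):
            Um = U ** m
            centers = (Um.T @ X) / Um.sum(axis=0)[:, None]  # membership-weighted means
            d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
            # u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1))
            U = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1)), axis=2)
        return centers, U

    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0.2, 0.05, (200, 1)),         # background
                   rng.normal(0.5, 0.05, (200, 1)),         # normal tissue
                   rng.normal(0.9, 0.05, (200, 1))])        # lesion-like intensities
    centers, U = fuzzy_c_means(X, c=3)
    print(np.sort(centers.ravel()).round(2))                # roughly [0.2, 0.5, 0.9]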

FEATURE EXTRACTION OF INTRADUCTAL BREAST LESION

IMAGES USING GMM

Abstract

Intraductal Carcinoma is a noninvasive condition in which abnormal cells are found

in the lining of a breast duct. The abnormal cells have not spread outside the duct to other

tissues in the breast. In some cases, Intraductal Carcinoma may become invasive cancer and

spread to other tissues, although it is not known at this time how to predict which lesions

will become invasive. Intraductal cancer is the most common type of breast cancer in

women. Intraductal carcinoma includes three types: Usual Ductal Hyperplasia (UDH),

Atypical Ductal Hyperplasia (ADH), and Ductal Carcinoma in Situ (DCIS). So the system

of detecting the breast microscopic tissue of UDH, ADH, DCIS is proposed. The current

standard of care is to perform percutaneous needle biopsies for diagnosis of palpable and

image-detected breast abnormalities. UDH is considered benign and patients diagnosed

UDH undergo routine follow-up, whereas ADH and DCIS are considered actionable and

patients diagnosed with these two subtypes get

M. Anitha

PG Student,

Sri Krishna College of

Technology,

[email protected]

Prof.P.Tamijeselvy

Assistant Professor,

Sri Krishna College of

Technology,

[email protected]

G.Prieyadharsini

PG Student,

Sri Krishna College of

Technology,

[email protected]

Prof.P.Tamijeselvy

Assistant Professor

Sri Krishna College of

Technology,

[email protected]


FURTHEST NEAREST NEIGHBOUR CRITERION BASED ACTIVE

LEARNING IN KNN AND SVM CLASSIFIERS

Abstract

Active learning is a supervised learning method that is based on the idea that a

machine learning algorithm can achieve greater accuracy with fewer labelled training

images if it is allowed to choose the image from which it learns. Facial age classification is

a technique to classify face images into one of the several predefined age groups. The

proposed system applies an active learning approach to facial age classification which

allows a classifier to select the data from which it learns. The classifier is initially trained

using a small pool of labeled training images. This is achieved by using the bilateral two

dimension linear discriminant analysis. Then the most informative unlabeled image is

found out from the unlabeled pool using the furthest nearest neighbor criterion, labeled by

the user and added to the appropriate class in the training set. The incremental learning is

performed using an incremental version of bilateral two dimension linear discriminant

analysis. This active learning paradigm is proposed to be applied to the k nearest neighbor

classifier and the support vector machine classifier and to compare the performance of

these two classifiers.
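
The furthest nearest neighbour criterion itself can be sketched in a few lines of numpy: among the unlabeled samples, query the one whose nearest labeled example is furthest away. The random features below are stand-ins for the bilateral 2D-LDA projections described above.

    # Illustrative furthest-nearest-neighbour sample selection.
    import numpy as np

    def furthest_nearest_neighbour(labeled: np.ndarray, unlabeled: np.ndarray) -> int:
        """Return the index (into `unlabeled`) of the most informative sample."""
        # pairwise Euclidean distances, shape (n_unlabeled, n_labeled)
        diffs = unlabeled[:, None, :] - labeled[None, :, :]
        dists = np.linalg.norm(diffs, axis=2)
        nearest = dists.min(axis=1)          # distance to the closest labeled sample
        return int(nearest.argmax())         # furthest of those nearest distances

    rng = np.random.default_rng(0)
    labeled = rng.normal(0.0, 1.0, (20, 16))       # small labeled pool
    unlabeled = rng.normal(0.0, 1.0, (500, 16))
    query_idx = furthest_nearest_neighbour(labeled, unlabeled)
    print("ask the user to label unlabeled sample", query_idx)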

IMAGE SEGMENTATION FOR HIGH SPATIAL RESOLUTION

USING MARKER BASED WATERSHED ALGORITHM

Vaigarai B
PG Scholar,
Department of CSE,
Sri Ramakrishna Engineering College,
Coimbatore
[email protected]

Mathivanan B
Asst. Professor,
Department of CSE,
Sri Ramakrishna Engineering College,
Coimbatore

Abstract

In this paper we propose an edge-embedded marker-based watershed algorithm for high spatial resolution remote sensing image segmentation. Two improvement techniques are proposed for the two key steps of marker extraction and pixel labeling, respectively, to make the algorithm more effective and efficient for high spatial resolution image segmentation. Moreover, the edge information, detected by an edge detector embedded with confidence, is used to direct the two key steps, so that objects with weak boundaries are detected and the positional accuracy of object boundaries is improved. The method performs well both in retaining weak boundaries and in reducing undesired over-segmentation.
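A generic marker-based watershed recipe with scikit-image, shown only to make the two key steps (marker extraction and pixel labelling) concrete; the threshold value is illustrative and the confidence-weighted edge embedding of the paper is not reproduced.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.filters import sobel
from skimage.segmentation import watershed

def marker_watershed(image, seed_thresh=0.05):
    """Segment a grey image: low-gradient pixels seed the markers, the gradient is the relief."""
    gradient = sobel(image)                           # edge strength
    markers, _ = ndi.label(gradient < seed_thresh)    # marker extraction
    return watershed(gradient, markers)               # pixel labelling by flooding from markers

labels = marker_watershed(np.random.rand(128, 128))
```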

Indu. M. T

PG Student,

Sri Krishna College of

Technology,

Coimbatore

[email protected]

Ms. S. Bhuvana

Assistant Professor,

Sri Krishna College of

Technology,

Coimbatore

[email protected]


IMPROVING WEB IMAGE SEARCH USING GMI METHOD

M.Sree Rajeswari, Master of Engineering, Department of CSE (PG), Sri Ramakrishna Engineering College, Coimbatore, [email protected]
J.Selvakumar, Asst. Professor, Dept. of Software Engineering, Sri Ramakrishna Engineering College, Coimbatore, [email protected]

Abstract

In traditional text-based image retrieval (TBIR), the user types a query and the relevant images are reranked using visual features after the initial text-based search. In this paper, we propose a new bag-based reranking framework for large-scale TBIR. Multi-instance (MI) learning methods are used in this bag-based reranking, and a generalised MI (GMI) setting is adopted for this application. A new method, referred to as GMI-SVM, is developed to enhance retrieval performance by propagating labels from the bag level to the instance level. To acquire bag annotations for GMI learning, we propose a bag ranking method that ranks all bags according to a bag ranking score. Experiments on the real-world data set NUS-WIDE demonstrate that our framework with automatic bag annotation achieves the best performance compared with existing image reranking methods, and that GMI-SVM achieves better performance.

OPTIMAL CONTRAST TONE-MAPPING USING LINEAR PROGRAMMING FOR IMAGE ENHANCEMENT

Dr.S.Uma Maheswari, Electronics and Communication Engineering, Coimbatore Institute of Technology, Coimbatore, India
T.Sasikumar, Electronics and Communication Engineering, Coimbatore Institute of Technology, Coimbatore, India

Abstract

A novel linear programming approach for optimal contrast-tone mapping is

proposed. A measure of contrast gain and a sister measure of tone distortion are defined for

gray level transfer functions. These definitions allow us to depart from the current practice

of histogram equalization and formulate contrast enhancement as a problem of maximizing

contrast gain subject to a limit on tone distortion and possibly other constraints that

suppress artifacts. The resulting contrast-tone optimization problem can be solved

efficiently by linear programming. The proposed constrained optimization framework for

contrast enhancement is general, and the user can add and fine tune the constraints to

achieve desired visual effects. Experimental results are presented to illustrate the

performance of the proposed method, demonstrating clearly superior performance of the

new technique over histogram equalization. In addition, two locally adaptive contrast

enhancement techniques by the proposed method are investigated.
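To make the linear-programming view concrete, here is a deliberately simplified sketch: the decision variables are the non-negative increments of a monotone grey-level transfer function, the objective is the histogram-weighted sum of increments (a contrast-gain surrogate), and a minimum step per level stands in for the tone-distortion constraint. This is not the paper's exact formulation, only an illustration solved with scipy.

```python
import numpy as np
from scipy.optimize import linprog

def contrast_tone_lp(hist, levels=256, s_min=0.2):
    """Maximise histogram-weighted contrast gain subject to a fixed output range
    and a minimum increment per grey level (a crude tone-distortion guard)."""
    p = hist / hist.sum()
    c = -p                                   # linprog minimises, so negate the gain
    A_eq = np.ones((1, levels))
    b_eq = [levels - 1.0]                    # increments must span the output range
    res = linprog(c, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(s_min, None)] * levels, method="highs")
    return np.concatenate(([0.0], np.cumsum(res.x)))   # monotone transfer function T

hist = np.random.randint(1, 100, size=256).astype(float)   # stand-in image histogram
T = contrast_tone_lp(hist)                                  # T[j] is the output level for input j
```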


OPTIMAL FEATURE REGION SET SELECTION FOR ROBUST DIGITAL IMAGE WATERMARKING

R.Sakila, II year M.E. (C.S.E), Sengunthar Engineering College, Tiruchengode, [email protected]
N.Naveenkumar, M.E., (Ph.D), Sr. Lecturer, Sengunthar Engineering College, Tiruchengode

Abstract

The efficiency of a digital watermarking algorithm is indicated by the strength of

embedded watermarks against various attacks. Attacks which attempt to destroy or

invalidate watermarks can be classified into two types, noise-like signal processing and

geometric distortions. Attacks of the first type intend to remove embedded watermarks

from the cover image by a signal processing approach. The second type of attack, which

results in synchronization errors by geometric distortions, makes a detector fail to detect the

existence of watermarks even if they are still on the image. Compared with some well-known feature-based methods, the proposed method exhibits better performance in robust digital watermarking. The proposed system is a feature region selection scheme for robust digital image watermarking; it aims to select a non-overlapping feature region set which has the greatest strength against various attacks and preserves image quality as much as possible after watermarking.

REVERSIBLE DATA HIDING TECHNIQUE FOR HIDING SECRET DATA IN VIDEO SCENE

K.Eswaramoorthy, Department of Computer Science and Engineering, R.M.K Engineering College, [email protected]
R.Dhanalakshmi, Department of Computer Science and Engineering, R.M.K Engineering College, [email protected]

Abstract

This work proposes a novel reversible data hiding scheme for Encrypted &

Compressed video scene, which enables the exact recovery of the original video upon

extraction of the embedded information. It is expected that this reversible data hiding

technique will be deployed for a wide range of applications in the areas such as secure

medical image data system, law enforcement, e-government, image authentication and

covert communication, where any distortion to the original image is not acceptable. First the raw video is taken and then split into video frames. After encrypting the image (video frame), the additional data can also be encrypted using an encryption key and then embedded into the image by modifying a small proportion of the video frame. The encrypted image containing the additional encrypted data, called the marked image, is sent to the receiver. The receiver may first decrypt the image using the encryption key, and the decrypted version is similar to the original video. According to the data-hiding key, and with the aid of spatial correlation in natural video scenes, the embedded data can also be successfully extracted and then decrypted using the encryption key.


ROAD DETECTION FROM A SINGLE IMAGE USING SIMULATED ANNEALING

Sofia Joshy, PG Student, Department of Computer Science and Engineering, Sri Krishna College of Technology, Coimbatore-641 042, [email protected]
M.Balamurugan, Assistant Professor, Department of Computer Science and Engineering, Sri Krishna College of Technology, Coimbatore-641 042, [email protected]

Abstract

Road detection algorithms are used to detect the road from a single image. This paper decomposes the road detection process into two steps: the estimation of the vanishing point associated with the main (straight) part of the road, followed by the segmentation of the corresponding road area based upon the detected vanishing point. The proposed system computes the texture orientation for each pixel in the image using Gabor filters. The confidence score for each pixel orientation is then calculated and used to estimate the vanishing point of the road through a locally adaptive soft voting (LASV) scheme. A vanishing-point-constrained road border detection method is used to find the road borders. A simulated annealing algorithm can be used to improve the performance of the system by selecting the parameters of the Gabor filters correctly.
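A sketch of the per-pixel texture-orientation step with a small Gabor filter bank in OpenCV; kernel parameters are illustrative, and the LASV voting and border detection stages are not shown.

```python
import cv2
import numpy as np

def texture_orientation(gray, n_orient=36, ksize=17):
    """Per pixel: the Gabor orientation with the strongest response, plus that
    response magnitude (used as the confidence score)."""
    thetas = np.linspace(0, np.pi, n_orient, endpoint=False)
    responses = []
    for theta in thetas:
        # ksize, sigma, theta, lambda, gamma (illustrative values)
        kernel = cv2.getGaborKernel((ksize, ksize), 4.0, theta, 10.0, 0.5)
        responses.append(np.abs(cv2.filter2D(gray, cv2.CV_32F, kernel)))
    responses = np.stack(responses)                  # (n_orient, H, W)
    return thetas[responses.argmax(axis=0)], responses.max(axis=0)

gray = np.random.rand(120, 160).astype(np.float32)   # stand-in road image
orientation, confidence = texture_orientation(gray)
```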

THE EFFECTIVE COLOR FEATURE SELECTION AND MECHANISM FOR FACE RECOGNITION

KANCHERALA. HIMABINDU

M.E (comm. systems)

Hindustan University

Chennai-603103

Tamilnadu, India

[email protected]

Abstract

This paper introduces a new color face recognition (FR) method that makes effective use of boosting learning as a color-component feature selection framework. The

proposed boosting color-component feature selection framework is designed for finding the

best set of color-component features from various color spaces (or models), aiming to

achieve the best FR performance for a given FR task. In addition, to facilitate the

complementary effect of the selected color-component features for the purpose of color FR,

they are combined using the proposed weighted feature fusion scheme.

The effectiveness of our color FR method has been successfully evaluated on the

following public face databases (DBs): Experimental results show that the results of the

proposed method are impressively better than the results of other state-of-the-art color FR

Ms .SHIJU C CHACKO

Department Electronics &

Communication

Hindustan University

Chennai-603103

Tamilnadu, India


VESSEL TREE SEGMENTATION IN LUNG IMAGES AFFECTED BY INTERSTITIAL LUNG DISEASES

Dr.S Uma Maheswari, Electronics and Communication Engineering, Coimbatore Institute of Technology, Coimbatore, India
S. X. Roger Antony, Electronics and Communication Engineering, Coimbatore Institute of Technology, Coimbatore, India

Abstract Vessel tree segmentation techniques have gained attention, since they play a key role in Computer Aided Diagnosis (CAD) applications aimed at nodule or pulmonary

embolism detection as well as Interstitial Lung Disease (ILD) pattern quantification. These

segments can act as the control points for lung image registration applications in case of

follow-up data, as well as for guiding airway tree and lung lobe segmentation.

Segmentation of lung images has been done from the past to know the exact position of the

lobes and other structures in the lungs. Segmentation of vessel trees is important in the

diagnosis of ILD. Segmentation of nodular trees is also an area of research used for

identification of Obstructive airway diseases. Techniques used till now are used only for the

extraction of vessel trees of the lungs. In this project, any pathologies and lesions present can also be extracted along with the vessels from lung images, which gives a better and clearer interpretation of the disease. Thus, this acts as a pre-processing step in the

CAD schemes of ILD patterns in Multi Detector Computed Tomography (MDCT). The

performance of this algorithm is evaluated using three main performance measures namely

area overlap, true positive fraction and false positive fraction.
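The three evaluation measures named above, computed for binary masks; "area overlap" is taken here in its common intersection-over-union sense, which is an assumption.

```python
import numpy as np

def segmentation_scores(pred, truth):
    """Area overlap, true-positive fraction and false-positive fraction for binary masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    tp = np.logical_and(pred, truth).sum()
    fp = np.logical_and(pred, ~truth).sum()
    overlap = tp / np.logical_or(pred, truth).sum()   # intersection over union
    tpf = tp / truth.sum()                            # vessel pixels correctly found
    fpf = fp / (~truth).sum()                         # background wrongly marked as vessel
    return overlap, tpf, fpf

pred = np.random.rand(64, 64) > 0.5
truth = np.random.rand(64, 64) > 0.5
print(segmentation_scores(pred, truth))
```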

HANDWRITTEN LETTER RECOGNITION USING

CLASSIFICATION ALGORITHMS

Saranya K, M.Phil Research Scholar, PSGR Krishnammal College for Women, [email protected]
Vijaya MS, Associate Professor, G.R. Govindarajalu School of Applied Computer Technology, [email protected]

Abstract

Handwriting recognition is the ability of a computer to receive and interpret intelligible handwritten input from sources such as paper documents. The image of the written text may be sensed "off line" from a piece of paper by optical scanning or intelligent word

recognition. Handwriting recognition system handles formatting, performs correct

segmentation into characters and finds the most plausible words. Classification technique in

data mining suits best for this problem. In this paper the classification algorithms namely

Naive Bayes, J48 and Multilayer Perceptron are trained and the trained models are used to

recognize the letter. The data set retrieved from UCI Machine learning repository is used

for learning. The letter recognition dataset consists of 10,000 instances and 17 attributes.

The performance of the classifiers is evaluated using 10-fold cross-validation and the

results are analyzed.
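A sketch of the evaluation pipeline with scikit-learn stand-ins (GaussianNB, a decision tree in place of Weka's J48, and an MLP), using placeholder data; loading the actual UCI letter-recognition file is left out.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier      # C4.5-style stand-in for J48
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

# Placeholder data: 16 numeric attributes per instance plus the letter label.
X = np.random.rand(1000, 16)
y = np.random.choice(list("ABCDEFGHIJ"), size=1000)

models = {
    "Naive Bayes": GaussianNB(),
    "J48-like tree": DecisionTreeClassifier(),
    "Multilayer Perceptron": MLPClassifier(hidden_layer_sizes=(64,), max_iter=300),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=10)      # 10-fold cross-validation
    print(f"{name}: mean accuracy {scores.mean():.3f}")
```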


AN APPROACH TO MOTION DETECTION IN VIDEO SEQUENCE

VANITHA.R,

M.E. Computer and communication

M.A.M. College of Engineering

E mail id: [email protected]

Abstract

Video processing is one of the most challenging areas in image processing. It deals

with identifying an object of interest. Motion detection has been used in many fields either

directly or indirectly. This paper presents an efficient approach to motion detection in video sequences using a color feature extraction operator. Using this approach we improve the background subtraction and detect the moving object with greater accuracy. Background modeling is performed so that the background is updated under changing light illumination and weather conditions. Foreground detection is done before updating the background model. Color feature extraction is used to cope with dynamic background elements such as moving leaves, rain, snow and rippling water.
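A minimal running-average background model with foreground masking, kept to grey-level differences for brevity; the colour feature extraction operator of the paper is not reproduced.

```python
import numpy as np

class RunningBackground:
    """Exponential running-average background model for grey-scale frames in [0, 1]."""
    def __init__(self, alpha=0.02, thresh=0.1):
        self.alpha, self.thresh, self.bg = alpha, thresh, None

    def apply(self, frame):
        if self.bg is None:
            self.bg = frame.astype(float)
        mask = np.abs(frame - self.bg) > self.thresh       # foreground detected first
        # background updated only where no motion was detected
        self.bg[~mask] = (1 - self.alpha) * self.bg[~mask] + self.alpha * frame[~mask]
        return mask

model = RunningBackground()
for _ in range(10):                                        # stand-in video frames
    foreground = model.apply(np.random.rand(120, 160))
```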

LOCATION MONITORING ALGORITHMS FOR WIRELESS ADHOC NETWORKS

P.Kavitha, PG Scholar, Computer Science and Engineering, Muthayammal Engineering College, [email protected]
Prof.M.Sayee Kumar, Assistant Professor, Computer Science and Engineering, Muthayammal Engineering College, [email protected]

Abstract

Monitoring personal locations with a potentially untrusted server poses privacy threats. Two algorithms, namely a resource-aware and a quality-aware algorithm, are proposed to provide high quality location monitoring services for system users. Both algorithms rely on the k-anonymity privacy concept to enable trusted sensor nodes to provide aggregate location information. The resource-aware algorithm aims to minimize communication and computational cost, while the quality-aware algorithm aims to maximize the accuracy of the aggregate locations. These two algorithms are used in ad hoc networks rather than infrastructure-based networks. A spatial histogram approach estimates the distribution of the monitored persons based on the gathered aggregate location information, while guaranteeing the location privacy of the monitored persons.
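A toy illustration of the k-anonymity idea both algorithms build on: a location is only released as an aggregate (area, count) once at least k monitored persons fall inside the area. Grid cells and values are illustrative, not the paper's cloaking rule.

```python
from collections import Counter

def aggregate_locations(positions, k=5, cell=10):
    """Release only grid cells containing at least k persons, as (x, y, size, count)."""
    counts = Counter((int(x // cell), int(y // cell)) for x, y in positions)
    return [(cx * cell, cy * cell, cell, n) for (cx, cy), n in counts.items() if n >= k]

positions = [(3, 4), (7, 2), (8, 9), (6, 5), (1, 1), (42, 40)]
print(aggregate_locations(positions, k=5))   # the lone person near (42, 40) is never reported
```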


MODIFIED MULTIMEDIA ARCHITECTURE FOR MOBILE MULTIMEDIA APPLICATION

S.Priya, M.E - Computer Science & Engineering, Muthayammal Engineering College, Namakkal-637 408, India, [email protected]
D.Dhivya, Lecturer - Computer Science & Engineering, Narasu's Sarathy Institute of Technology, Salem, India, [email protected]

Abstract A dynamically modified SRAM array is proposed for low-power mobile multimedia applications. The proposed structure uses a lower voltage for cells storing low-order bits and a nominal voltage for cells storing higher-order bits. Parametric failures due to manufacturing variations limit the opportunities for power saving in SRAM. The architecture allows the number of bits in the low-voltage mode to be reconfigured at run-time to change the error characteristics of the array. Simulations in a predictive 70 nm node show more than 45% savings in memory power with a marginal (10%) reduction in image quality.

ENHANCING RELIABILITY AND LIFETIME MAXIMIZATION IN DUTY CYCLED WIRELESS SENSOR NETWORK BASED ON FORWARDING PROCEDURE

S.Arunkumar, M.E. Network Engineering, Vel Tech Multi Tech Dr.RR Dr.SR Engineering College, Email: [email protected]
Y.Kallifulla, Assistant Professor, Department of Information Technology, Vel Tech Multi Tech Dr.RR Dr.SR Engineering College

Abstract

This paper deals with a Chinese remainder theorem (CRT) based packet forwarding technique in duty-cycled wireless sensor networks with energy-efficient multipath routing. This is a novel packet forwarding technique that provides reliable delivery and saves energy in duty-cycled wireless sensor networks. The forwarding technique reduces the burden on a node by transmitting only a few message bits. A duty-cycled wireless sensor network is a network in which nodes switch between an active state and a power-saving state to save energy. When a node wants to transmit a message to the sink, the nodes are grouped as a hierarchical cluster, and each node knows the number of neighbourhood nodes in the next level of the cluster hierarchy so that packets can be split at the nodes. Using the Chinese remainder theorem, packet splitting is performed by taking the message bits modulo a set of prime numbers, and a mask is added to the message; the result is called a CRT component. The mask gives the index of the split message. A simple reconstruction satisfies the computational complexity level of the processors in a duty-cycled wireless sensor network. A node switches between the active state and the power-saving state; if a CRT component arrives within the active state of a node it is received, otherwise the packet is lost. Using this CRT method, CRT component loss is admissible and
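A bare-bones illustration of CRT-based packet splitting and reconstruction with pairwise-coprime moduli; the mask/indexing, clustering and duty-cycle handling described above are omitted, and the moduli are arbitrary examples.

```python
from math import prod

MODULI = [13, 17, 19, 23]                 # pairwise coprime; product bounds the message value

def crt_split(message: int):
    """Each forwarder carries only the small residue message mod m_i."""
    return [message % m for m in MODULI]

def crt_reconstruct(residues):
    """Classic Chinese-remainder recombination at the sink."""
    M = prod(MODULI)
    total = 0
    for r, m in zip(residues, MODULI):
        Mi = M // m
        total += r * Mi * pow(Mi, -1, m)   # modular inverse of Mi modulo m
    return total % M

msg = 48213
assert crt_reconstruct(crt_split(msg)) == msg
```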


PERFORMANCE ANALYSIS AND IMPROVEMENT MEASURES

FOR COGNITIVE RADIO NETWORKS

Abstract

Cognitive Radio Network opportunistically exploits locally unused spectrum which

is not heavily occupied by licensed users. It provides a new path to resolve spectral scarcity

and to avoid spectral congestion, without disturbing the QoS requirements of others in the

system. Of the many challenges involved in the practical ways of framing concepts for a

working model of CR Network, we present its results for predominant applications of data

and video, in detail. This project addresses the most plausible method of integrating a full

scale CR Network capable of using TCP connections and data based traffic.

By providing a higher priority status for video traffic over data traffic feeds, simulations for

performance analysis are shown with the help of Dynamic Vertical Sharing Overlay

methodology. The theoretical values of mean delay and jitter for data and TCP packets may

be computed along with an analysis of the average throughput and efficiency of the system

for different TCP based connections. Further simulation with higher priority video signal

and data traffic will be considered. A comparative analysis of results over data and video

signal will be carried out.

RELAY NODE PLACEMENT SCHEME TO INCREASE LIFE TIME

IN WIRELESS SENSOR NETWORKS

J.YASODHA, PG Student, Department of CSE, Kongu Engineering College, Erode, Tamil Nadu, India, Email: [email protected]
N.PREMALATHA, Assistant Professor, Department of CSE, Kongu Engineering College, Erode, Tamil Nadu, India, Email: [email protected]

Abstract

To alleviate the traffic burden of sensor nodes and extend the network lifetime in

wireless sensor networks, one approach is to deploy a large number of expensive macro-

relay nodes for routing data. The problem is it will degrade the fault tolerance capacity of

WSNs. If relay nodes are uniformly placed, the issue of uneven power consumption arises

as relay nodes closer to the sink are required to relay more data and thus will deplete their

energy more quickly. Here, the relay node deployment problem is discussed under the

assumption that the relay node has the same dimension and same energy supply as the

sensor node. Based on balancing power consumption among all sensor nodes and relay

nodes, a relay node density function is deduced according to which relay nodes are placed

in the sensing field. This approach can achieve high energy utilization and the network

lifetime is significantly extended compared with the approach of uniformly placing relay

nodes.

V.P.Ajay,

[M.E – Comm.Systems]

PG Student, SKCET,

Coimbatore

Ph: +918870764430

[email protected]

Mrs. Rathika Dhamu,

Assistant Professor,

KPRIET,

Coimbatore

Ph: +919865817783

[email protected]

Dr. Sofia Sudhir

Professor,

SKCET,

Coimbatore

Ph: +919487752081

[email protected]


RELIABILITY AND SECURING TOPOLOGY MAINTENANCE

PROTOCOLS FOR SENSOR NETWORK

A.ANUPAMA, PG Student, Oxford Engineering College, [email protected]
V.MURUGAN, Senior Lecturer, Oxford Engineering College, [email protected]

Abstract

We analyze the security vulnerabilities of PEAS, ASCENT, and CCP, three well-

known topology maintenance protocols (TMPs) for sensor networks. These protocols aim

to increase the lifetime of the sensor network by only maintaining a subset of nodes in an

active or awake state. The design of these protocols assumes that the sensor nodes will be

deployed in a trusted, nonadversarial environment, and does not take into account the

impact of attacks launched by malicious insider or outsider nodes. We propose a

metaprotocol (Meta-TMP) to represent the class of topology maintenance protocols. The

Meta-TMP provides us with a better understanding of the characteristics and of how a

specific TMP works, and it can be used to study the vulnerabilities of a specific TMP. We

describe various types of malicious behavior and actions that can be carried out by an

adversary to attack a wireless sensor network by exploiting the TMP being used in the

network. We describe three attacks against these protocols that may be used to reduce the

lifetime of the sensor network, or to degrade the functionality of the sensor application by

reducing the network connectivity and the sensing coverage that can be achieved. Further,

we describe countermeasures that can be taken to increase the robustness of the protocols

and make them resilient to such attacks.

ROBUST DIFFUSION OF VIDEO USING SUV IN VANET'S

S.Parthasarathy, M.E Student, Anna University of Technology Madurai, Madurai, [email protected]
Mr.V.Arun, Assistant Professor of ECE, Anna University of Technology Madurai, Madurai

Abstract Vehicular ad hoc networks are not efficient at supporting the transmission of multimedia streams. Broadcast and multicast in ad hoc networks face the problems of the highly dynamic topology of vehicular networks and the strict delay requirements of streaming applications. This work presents an inter-vehicular communication scheme called Streaming Urban Video (SUV), which is fully distributed, dynamically adapts to topology changes, and leverages the characteristics of streaming applications to yield a highly efficient, cross-layer solution.


STABLE CHANNEL BASED ROUTING WITH NODE-PATH HANDOFF IN MANET

Jinu Mercy Joy, PG Scholar, Sri Ramakrishna Engineering College, Coimbatore, Email: [email protected]

Abstract

In wireless mobile ad hoc networks (MANETs), packet transmission is impaired by

radio link fluctuations. A variety of routing protocols have been proposed for mobile ad hoc

networks communicating over unreliable wireless links and have generally ignored channel

fading. To accommodate this channel fading, this paper proposes a novel channel adaptive

routing protocol which extends the Ad hoc On-Demand Multipath Distance Vector

(AOMDV) routing protocol. It utilizes the average nonexpiring time, combined with hop-

count, to select stable links and applies a preemptive handoff strategy to maintain reliable

connections.

FILE SHARING IN UNSTRUCTURED PEER-TO-PEER NETWORK

USING SAMPLING TECHNIQUE

Ms. P. Preethi Rebecca, Asst. Professor / CSE, St. Peter's University, Chennai
M.ARUNA, M.E (CSE), St. Peter's University, Chennai

Abstract

This paper presents a detailed examination of how the dynamic and heterogeneous

nature of real-world peer-to-peer systems can introduce bias into the selection of

representative samples of peer properties (e.g., degree, link bandwidth, number of files

shared). We propose the Metropolized Random Walk with Backtracking (MRWB) as a

viable and promising technique for collecting nearly unbiased samples and conduct an

extensive simulation study to demonstrate that our technique works well for a wide variety

of commonly-encountered peer-to-peer network

conditions. We have implemented the MRWB algorithm for selecting peer addresses. Using the Gnutella network, we empirically show that it yields more accurate samples than tools that rely on commonly-used sampling techniques, and results in dramatic improvements in efficiency and scalability compared to performing a full crawl.
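The core of the Metropolized random walk is the degree correction that makes the stationary distribution uniform over peers; a compact sketch on an adjacency dictionary follows, with MRWB's backtracking reduced to simply staying put when a move is rejected.

```python
import random

def metropolized_walk(adj, start, steps=10_000, seed=0):
    """Random walk that samples nodes nearly uniformly: a move u -> v is
    accepted with probability min(1, deg(u) / deg(v))."""
    rng = random.Random(seed)
    u, samples = start, []
    for _ in range(steps):
        v = rng.choice(adj[u])
        if rng.random() < min(1.0, len(adj[u]) / len(adj[v])):
            u = v                          # accept the move
        samples.append(u)                  # rejected moves keep the walker in place
    return samples

adj = {0: [1, 2, 3], 1: [0, 2], 2: [0, 1, 3], 3: [0, 2]}
walk = metropolized_walk(adj, start=0)
```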


SUPPORTING SCALABILITY AND STATELESS MULTICASTING IN MANET

K.Vanitha, M.E Computer and Communication, Department of Information Technology, Kongu Engineering College, Erode, TamilNadu, India, [email protected]
S.Varadhaganapathy, Professor, Department of Information Technology, Kongu Engineering College, Erode, TamilNadu, India, [email protected]

Abstract

A mobile ad hoc network is defined as a transient network which is formed

dynamically from an arbitrary collection of nodes. It can also be called an infrastructure-less network, because it has no fixed structure for positioning the nodes. Each node itself acts as a router, routing packets from one node to another

node. Multicasting is the process of delivering packets from a single source to several

destinations simultaneously. To achieve scalability, the network terrain is divided into

zones and every node is aware of its own position. The position information is integrated

with zone structure building, group membership management and multicast tree

maintenance. The concept of zone depth is used to further improve the efficiency of the

protocol. EGMP provides a high packet delivery ratio with low control overhead. The main focus

of this work is to reduce the control overhead and also to improve the performance of the

Efficient Geographic Multicast Protocol (EGMP) through stateless multicasting.

A NOVEL APPROACH ON GREEDY MAXIMAL SCHEDULING

ALGORITHM ON EMBEDDED NETWORKS

Mr.N.Kumaresan, Assistant Professor, Department of Electronics and Communication, Anna University of Technology, Coimbatore, [email protected]
N.Arun Prasath, M.E. Embedded Systems and Technologies, Department of Electronics and Communication, Anna University of Technology, Coimbatore, [email protected]

Abstract

There has been a significant amount of work done in developing low-complexity

scheduling schemes to achieve high performance in wireless networks. A centralized sub-

optimal scheduling policy, called Greedy Maximal Scheduling (GMS) is a good candidate

because its empirically observed performance is close to optimal in a variety of network

settings. However, its distributed realization requires high complexity, which becomes a

major obstacle for practical implementation. In this paper, we develop simple distributed greedy algorithms for scheduling in embedded wireless networks. The proposed greedy scheduling algorithms offer better performance, lower complexity and reduced delay. We reduce the complexity, up to near zero, by relaxing the global ordering requirement of GMS. Simulation results show that the new algorithms approximate the performance of GMS, reduce packet loss and enhance the total throughput. The algorithms also reduce the queue lengths in the wireless network.
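For reference, a centralized Greedy Maximal Scheduling sketch on a conflict graph: links are visited in order of decreasing backlog and added when no already-scheduled link conflicts with them. The distributed, relaxed-ordering variants proposed above are not shown, and queue lengths stand in for the link weights.

```python
def greedy_maximal_schedule(queues, conflicts):
    """queues: {link: backlog}; conflicts: {link: set of interfering links}."""
    schedule = set()
    for link in sorted(queues, key=queues.get, reverse=True):   # heaviest link first
        if not conflicts.get(link, set()) & schedule:           # no scheduled neighbour interferes
            schedule.add(link)
    return schedule

queues = {"a": 9, "b": 7, "c": 5, "d": 2}
conflicts = {"a": {"b"}, "b": {"a", "c"}, "c": {"b", "d"}, "d": {"c"}}
print(greedy_maximal_schedule(queues, conflicts))               # {'a', 'c'}
```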


LOCATION TRACKING IN UNDERWATER SENSOR NETWORKS USING MOBILITY PREDICTION BY DIVIDED SENSITIVE RANGES

A.Arivazhagi, PG Scholar, Computer Science and Engineering, Srinivasan Engineering College, [email protected]
Prof.S.Jayanthi, Assistant Professor, Computer Science and Engineering, Srinivasan Engineering College, [email protected]

Abstract

Location tracking of mobile sensor nodes is indispensable for underwater sensor

networks. For example, in aquatic environment monitoring applications, getting a correct

location is an essential task in order to get useful location-aware data. Only a limited

number of schemes are available for the localization service in underwater acoustic

networks. These solutions are mainly designed for small-scale networks(usually with tens

of nodes or even fewer). However, many aquatic applications, such as coastline protection, require a localization solution that works with a large number (hundreds to thousands) of nodes. This paper focuses on the localization service for large-scale mobile underwater sensor networks. We propose a mobility prediction algorithm by

dividing sensitive ranges. The division is in accordance with the cell transformation

probability. Then various estimation methods are applied according to the sensitivity of the

range to gain high precision. As it turns out, the simulation results show that the proposed

method can accurately estimate the location for mobile users even in the situation of

deficient location history.

A HIGH-QUALITY SECURED SHELL FOR LOCATION MONITORING SYSTEM IN WIRELESS SENSOR NETWORKS

N.Rekha, PG Scholar, Computer Science and Engineering, Srinivasan Engineering College, [email protected]
S.Chinnadurai, Assistant Professor, Computer Science and Engineering, Srinivasan Engineering College, [email protected]

Abstract

Advances in sensor networking and location tracking technology enable location

based applications but they also create significant privacy risk. Tracking our personal

information in an untrusted environment poses a privacy breach. In order to preserve privacy, a high quality location monitoring service for system users is needed. Hence two in-network location anonymization algorithms, namely resource-aware and quality-aware algorithms, have been proposed. Both algorithms use the k-anonymity privacy concept to enable trusted

sensor nodes to provide the aggregate location information of monitored persons for our

system and also preserve personal location privacy, while enabling the system to provide

location monitoring services. The resource-aware algorithm aims to minimize

communication and computational cost, while the quality-aware algorithm aims to

minimize the size of cloaked areas in order to generate more accurate aggregate locations.

To provide location monitoring services based on the aggregate location information, a

spatio-temporal histogram approach has been proposed that analyzes the aggregate

locations reported from the sensor nodes to estimate the distribution of the monitored


LOCALIZATION SCHEME FOR MINIMIZING ERROR IN

WIRELESS SENSOR NETWORKS USING MONTE CARLO

LOCALIZATION ALGORITHM

S.Balasubramanian, M.E Software Engineering, Anna University of Technology, Tiruchirapalli, Email id: [email protected]
S.Sathishkumar, M.E Software Engineering, Anna University of Technology, Tiruchirapalli, Email id: [email protected]

Abstract

Localization is an essential and important research issue in wireless sensor networks

(WSNs). Most localization schemes focus on static sensor networks. However, mobile

sensors are required in some applications to acquire all the relevant data. As such, a

localization scheme designed for mobile sensor networks is necessary to track the moving

nodes. In this paper, we propose a localization scheme to improve the localization accuracy

of existing protocols. In this proposed scheme, the normal nodes without location

information can estimate their own locations by gathering the positions of location-aware

nodes (anchor nodes) and the one-hop normal nodes whose locations are estimated from the

anchor nodes. In addition, we propose a scheme that predicts the moving direction of sensor

nodes to increase localization accuracy. Simulation results show that the localization error

in our proposed scheme is lower than the previous schemes in various mobility models and

moving speeds.
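A minimal sequential Monte Carlo localization step for a mobile node: particles are propagated within the maximum speed, filtered against the anchors currently heard (assumed radio range r), and resampled. The moving-direction prediction proposed above is omitted, and all parameter values are illustrative.

```python
import numpy as np

def mcl_step(particles, anchors_heard, v_max=5.0, r=30.0, rng=np.random.default_rng(0)):
    """particles: (N, 2) position hypotheses; anchors_heard: (M, 2) anchor positions."""
    # Prediction: each particle moves at most v_max in a random direction.
    angle = rng.uniform(0, 2 * np.pi, len(particles))
    step = rng.uniform(0, v_max, len(particles))
    particles = particles + np.c_[step * np.cos(angle), step * np.sin(angle)]
    # Filtering: keep particles that could hear every listed anchor.
    d = np.linalg.norm(particles[:, None, :] - anchors_heard[None, :, :], axis=2)
    survivors = particles[(d <= r).all(axis=1)]
    if len(survivors) == 0:
        survivors = particles                       # degenerate case: keep the old cloud
    # Resampling back to the original particle count.
    return survivors[rng.integers(0, len(survivors), len(particles))]

particles = np.random.default_rng(1).uniform(0, 100, (500, 2))
particles = mcl_step(particles, np.array([[50.0, 50.0], [60.0, 45.0]]))
estimate = particles.mean(axis=0)                   # location estimate for this step
```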

LATENCY FOR VERTICAL HANDOFF DECISION IN HETEROGENEOUS NETWORKS

D.Velmurugan, P.Thirumaraiselvan, Department of Electronics and Communication, Adhiparasakthi Engineering College, [email protected]

Abstract

Next generation wireless communications will likely rely on integrated networks

consisting of multiple wireless technologies. Hybrid networks based, for instance, on

systems such as WiMAX and WiFi can combine their respective advantages on coverage

and data rates, offering a high Quality of Service (QoS) to mobile users. In such

environment, WiFi/WiMAX dual mode terminals should seamlessly switch from one

network to another, in order to obtain improved performance or at least to maintain a

continuous wireless connection. Our proposed algorithm raises the system capacity, thus increasing the gain that can be achieved with a WiMAX and WiFi heterogeneous deployment. Here we use a new fuzzy logic inference system for vertical handover, which combines a trigger to continuously maintain the connection and another one to maximize the user throughput. The proposed handoff algorithm between the 802.11 WLAN and the 802.16 network is implemented. The simulation results show the behavior of the handoff latency in WLAN networks. The vertical handoff decision algorithm is able to

determine when a handoff is required, and selects the best access network that is optimized

to network conditions, quality service requirements, mobile terminal conditions, user

preferences, and service cost.


SECURE HYBRID RANGE QUERY FRAMEWORK IN TIERED SENSOR NETWORK

Se.Nerthikaa, PG Scholar, Computer Science and Engineering, Srinivasan Engineering College, [email protected]
Prof.S.Saravanan, Assistant Professor, Information Technology, Srinivasan Engineering College, [email protected]

Abstract

The two-tier architecture consisting of a small number of resource-abundant storage

nodes in the upper tier and a large number of sensors in the lower tier. Master nodes collect

data from sensor nodes and answer the queries from the network owner. In this architecture,

each sensor having multiple sensing capabilities periodically forwards the multidimensional

sensed data to the storage node. Unfortunately, the sensed data could be leaked to or could

be manipulated by compromised nodes. In this paper a centralized algorithm is proposed, representing the first distributed approximation of the facility location problem that can be practicably implemented in multihop sensor networks with local communication. Through

simulation studies, clustering technique with hybrid tree structure is used to locate nodes to

detect the adversary. In addition, this paper proposes simple extensions to our algorithms to

support dynamic networks with varying link qualities and node additions and deletions.

SECURE ON-DEMAND MULTICAST ROUTING PROTOCOL FOR WIRELESS MESH NETWORKS

R.Maheshwari, PG Student, CSE, Kongu Engineering College, Erode, Tamil Nadu, India, [email protected]
R.C.Suganthe, Professor, CSE, Kongu Engineering College, Erode, Tamil Nadu, India

Abstract

Wireless Mesh Network has become the focus of research in recent years since it

offers rich and high speed content access. Traditional multicast routing protocols in mesh

networks use hop count as the path selection metric. This may lead to the selection of poor quality links and negatively impact throughput. Hence, high-throughput multicast routing protocols use link quality as the path selection metric. In such protocols, nodes must collaborate in order to compute the path metric and forward data, and all nodes are assumed to be honest and to behave correctly during path establishment and data forwarding. However, given the shared and multihop nature of the communication, this assumption no longer holds and wireless mesh networks are subject to a wide range of security threats such as metric manipulation, Sybil, wormhole and blackhole attacks. Previous works focus on protecting the multicast protocol from metric

manipulation attack that distorts the path selection process in the entire network. In this

work, we identify sybil attack against high throughput multicast protocols in wireless mesh

network. Our proposed defense mechanism relies on Redundant Identity Detection for

detecting attacks.


ROUTE REDIRECTION IN UNSTRUCTURED NETWORK USING

CHORD

Rama B, PG Student, Oxford Engineering College, [email protected]
Muthu Venkatachalam P, Professor, Department of CSE, Oxford Engineering College
Dr.Sampath Kumar V, Professor and Head, Department of CSE, Oxford Engineering College

Abstract

Chord, a tree-based routing DHT, is used to share the storage and routing responsibility of all nodes in an unstructured network (unbounded path length). The two major problems, denying access and misrouting lookups, are addressed by replica

placements. This placement creates route redirection. The route redirection method called

neighbor set routing with the replica placement can successfully route messages to the

correct position even when some of the nodes are compromised at random. Route

redirection provides better robustness. The one-hop DHTs and DHT with multilevel

hierarchies are the proposed technique for different type of applications. The benefit of this

idea is to improve the throughput and to reduce the delay.

SECURE AND EFFICIENT INCENTIVE PROTOCOL FOR WIRELESS SENSOR NETWORKS

Aathira.R.Kurup, 2nd year M.E., Sudharsan Engineering College, Pudukkottai, Email id: [email protected]
Prof.Sri Ramalinga Ganesa Perumal, B.E., M.Tech., Head of the Department, ECE Department, Sudharsan Engineering College, Pudukkottai

Abstract

The selfish nodes in the multihop wireless network does not transfer packets as per

the requirements of the client system and it affects the network performance adversely. To

avoid this delay in the transmission of the data packets, a credit-based secure incentive protocol was proposed that stimulates cooperation among mobile nodes, is implemented in a fully distributed way and is immune to a wide range of attacks. Incentive-based protocols are more suitable for multi-hop wireless networks because, in addition to cooperation stimulation, these protocols can achieve fairness by rewarding credits to the cooperative nodes, and discourage packet-flooding attacks, where the attackers exchange bogus packets to

consume the intermediate node‘s resources because the nodes pay for relaying their

packets. However, secure incentive protocols usually use signatures to achieve payment

nonrepudiation which is important to prevent payment manipulation, and thwart free riding

attacks because the message‘s integrity is checked in each hop. These cryptosystems incur

too heavy overhead to be used efficiently in limited-resource nodes. In this paper, we

propose an Efficient and Secure cooperation Incentive Protocol (ESIP) that uses public-key

operations only for the first packet in a series, and then the efficient hashing operations are

used in the next packets, so that the overhead of the packet series converges to that of the

hashing operations. Hash chains and keyed hash values are used to achieve payment non

repudiation and thwart free riding attacks.
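A sketch of the hash-chain mechanism such incentive protocols rely on: one public-key signature covers the chain anchor, after which each packet releases the next preimage, so a payment token is verified with plain hashing. The signature itself is represented only by a comment.

```python
import hashlib

def h(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

def make_chain(seed: bytes, n: int):
    """Return [h^n(seed), ..., h(seed), seed]; the first element is the signed anchor."""
    chain = [seed]
    for _ in range(n):
        chain.append(h(chain[-1]))
    return list(reversed(chain))

def verify(anchor: bytes, token: bytes, k: int) -> bool:
    """The token released with the k-th packet needs only k hash operations to check."""
    for _ in range(k):
        token = h(token)
    return token == anchor

chain = make_chain(b"per-session secret seed", n=5)
anchor = chain[0]                 # signed once with the sender's private key (not shown)
assert verify(anchor, chain[3], k=3)
```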


AUTOMATED URBAN DRINKING WATER DISTRIBUTION AND WATER THEFT CONTROL

Renjisha E Rajan, Vivekanadha College of Engineering, Thiruchengode, Tamilnadu, India, [email protected]
Prof.P.Moorthy, Vivekanadha College of Engineering, Thiruchengode, Tamilnadu, India, [email protected]

Abstract

Water is a precious resource. Urban water supply networks are large-scale systems

that transport drinking water over vast geographical areas to millions of consumers. The

rapid growing of the wide urban residential areas imposes the expansion as well as the

modernization of the existing water supply facilities. Along with this one more problem

identified in the water supply channels is water theft since some people use ½ HP to 1 HP

pump to suck the water directly from the channel. This automated urban drinking water

distribution and theft control system is mainly to collect the real time parameters and to

control the supply if any consumer consume excess amount of water. Implementation of

this project in a domestic area is to monitor and control the real time water flow to houses,

and intimation system for theft control along with safe and secure operations. This

automated system allows overall supervision and remote control of all the water network equipment and management of the water flow according to the users' demand and the available water volume related to the reservoir levels. A process automation system based on an industrial PC system including all the network components represents

the best way to improve the water distribution technological process.

REALIZING PROGRAMMABLE LOGIC CONTROL ENVIRONMENT USING ARM7 MICROCONTROLLER WITH WIRED AND WIRELESS COMMUNICATION CAPABILITIES TO HOST

Kavi Kishore.P, M.E (Comm Systems), Hindustan University, Chennai-603103, [email protected]
Himanshu Shekhar, M.Tech (Phd...), Dept of ECE, Hindustan University, Chennai-603103, [email protected]

Abstract

In this paper, a new design is brought forward according to the requirement of

monitoring for the remote PLCs. The system based on DCS uses monitoring equipment

to collect information of PLCs by UARTs, and then the data is sent to a DCS

terminal to be processed and stored in an SQL database. The microprocessor control unit (MCU) of the monitor is based on the LPC1758, an ARM Cortex-M3 based microcontroller. The hardware configuration of the monitor, the software design and the communication method are introduced. The monitoring program of the DCS terminal is also described. The monitor for remote PLCs has the characteristics of simple equipment, low cost and accuracy, and has great value in use for process control.


SEA WAVES SIGNAL PROCESSING USING RECURRENT NEURAL NETWORKS

S.K.Umamaheswari, Ganadipathy Tulsi's Jain Engg College, Chittoor-Cuddalore Road, Kaniyambadi, Vellore-632102, Email id: [email protected]
Mrs.M.Umadevi, Ganadipathy Tulsi's Jain Engg College, Chittoor-Cuddalore Road, Kaniyambadi, Vellore-632102, Email id: [email protected]

Abstract

Recurrent neural networks (RNNs) are powerful tools to learn complex dynamical

systems, for two main reasons: i) they are universal approximators of dynamical systems,

and ii) they can exhibit continuous dynamics; a suitable property to model, for example,

agent(robot)-environment interaction. However, "standard" RNNs, such as BPTT and

RTRL, suffer from computational complexity and slow training. To overcome these difficulties, concepts such as echo state networks (ESN) and liquid state machines (LSM) have been proposed. The core idea of reservoir computing consists of using a large RNN as a "pool" of excitable complex neural dynamics, from which readout neurons can

learn to extract the current state of the network. This reduces the complexity of training to

simple linear regression while preserving the recurrent property of the network.
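A compact echo-state-network sketch in numpy, matching the idea above: a fixed random reservoir is driven by the input and only the linear readout is fitted by ridge regression. The sea-wave signal is replaced by a synthetic sinusoid and all sizes are illustrative.

```python
import numpy as np

def train_esn(u, y, n_res=200, rho=0.9, ridge=1e-6, seed=0):
    """u, y: 1-D input and target sequences; only the readout W_out is learned."""
    rng = np.random.default_rng(seed)
    W_in = rng.uniform(-0.5, 0.5, n_res)
    W = rng.uniform(-0.5, 0.5, (n_res, n_res))
    W *= rho / np.max(np.abs(np.linalg.eigvals(W)))       # set the spectral radius
    x, states = np.zeros(n_res), []
    for u_t in u:
        x = np.tanh(W @ x + W_in * u_t)                   # reservoir update
        states.append(x.copy())
    S = np.array(states)
    W_out = np.linalg.solve(S.T @ S + ridge * np.eye(n_res), S.T @ y)
    return W_out, S @ W_out                               # readout weights, fitted output

t = np.linspace(0, 20 * np.pi, 2000)
u, y = np.sin(t), np.sin(t + 0.3)                         # predict a phase-shifted signal
W_out, y_hat = train_esn(u, y)
```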

DESIGN OF MULTIBAND MICROSTRIP PATCH ANTENNA

S.Mahendrakumar, M.E., (Ph.D), Assistant Professor, Department of ECE, Velalar College of Engineering & Technology, Thindal, [email protected]

Abstract

In this project our aim is to design a patch antenna covering frequency ranges from 900 MHz to 5.35 GHz, which includes the GSM (880-960 MHz), GPS (1568-1592 MHz), DCS (1710-1880 MHz), PCS (1850-1990 MHz), UMTS (1920-2170 MHz), IEEE 802.11 b/g (2400-2484 MHz) and WLAN IEEE 802.11a (5.15-5.35 GHz) bands, in order to meet the demand for

newer microwave and millimeter-wave systems and emerging telecommunication

challenges with respect to size, performance and cost of an antenna. It offers the advantages

of light weight, low cost and ease of fabrication. Design of patch antenna includes the

analysis of antenna parameters such as Bandwidth, Gain and Efficiency which are related to

antenna dimensions and their substrate Material parameters. Further we extend our

investigations on various methods of improving the bandwidth and gain of the antenna

which is used for mobile communication.

M.Dhineshkumar,

V.S.Kamalamanickam,

R.Malathy,

M.Manikandan,

Department of ECE,

Velalar College of Engineering &

Technology, Thindal
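For the dimension analysis mentioned in the abstract, the standard transmission-line design equations for a rectangular patch can be scripted as below; the FR-4 substrate values are illustrative assumptions, not the paper's design.

```python
from math import sqrt

def patch_dimensions(f0, eps_r=4.4, h=1.6e-3):
    """Rectangular microstrip patch width and length (transmission-line model).
    f0 in Hz, relative permittivity eps_r, substrate height h in metres."""
    c = 3e8
    W = c / (2 * f0) * sqrt(2 / (eps_r + 1))
    eps_eff = (eps_r + 1) / 2 + (eps_r - 1) / 2 / sqrt(1 + 12 * h / W)
    dL = 0.412 * h * ((eps_eff + 0.3) * (W / h + 0.264)) / ((eps_eff - 0.258) * (W / h + 0.8))
    L = c / (2 * f0 * sqrt(eps_eff)) - 2 * dL
    return W, L

for f in (900e6, 1.8e9, 2.4e9, 5.25e9):                   # a few of the bands listed above
    W, L = patch_dimensions(f)
    print(f"{f / 1e9:.2f} GHz: W = {W * 1e3:.1f} mm, L = {L * 1e3:.1f} mm")
```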


LINKED DATA GENERATION FRAMEWORK AND ITS

APPLICATION

Ujjal Marjit,

C.I.R.M.,

University of Kalyani

Kalyani-741235, W.B.,

India

[email protected]

Abstract The enormous quantity of semantically interlinked data is a prerequisite for making the

Semantic Web become a reality. The major theme of the Semantic Web is to publish

structured data on the web using the Resource Description Framework (RDF). Often these expressive pieces of data remain on the web as data silos without contributing much to the creation of a global data-space. Linked Data helps machines understand web content and addresses the above-mentioned objective. In this paper our approach is to present a framework, and an application based on it, so that data from a legacy database (RDBMS) is published as linked data on the web around the concept of a global data-space.

ENHANCEMENT OF WIRELESS SENSOR NETWORK BASED ON

CLUSTERING APPROACH

Bibin.M, 2nd year ME Communication Systems, Anna University of Technology Madurai, E-mail id: [email protected]
Miss.E.Srie Vidhya Janani, Asst Professor, CSE Department, Anna University of Technology Madurai

Abstract

Wireless sensor networks are infrastructure-less and application-specific in nature. They are mostly deployed in hard-to-reach places; hence increasing the lifetime of a WSN is mandatory in safety-critical and reliability-demanding applications, where failures may cause economic losses or even fatalities. In general, clustering sensors into groups is a popular strategy to maximize

the network lifetime, but none of the clustering algorithms address the predictability issue

for time-critical WSNs. In this proposed method the HEF clustering algorithm is chosen as

a design reference model, which is proved to be an optimal clustering policy under certain

ideal conditions. To address network lifetime predictability in practice, the network lifetime

bounds and feasibility test for the HEF are developed via the worst case energy

consumption analysis. The schedulability of packet transmissions is also analyzed using various techniques.

Arup Sarkar,

Department of Computer

Science & Engineering.,

University of Kalyani

Kalyani-741235, W.B.,

India

[email protected]

Utpal Biswas,

Department of Computer

Science & Engineering.,

University of Kalyani

Kalyani-741235, W.B.,

India

[email protected]


HIGH DIMENSIONAL DATA ANONYMOUS PUBLICATION AND UPDATES TO CONFIDENTIAL DATABASES

S.Kiruthika, PG Scholar, Computer Science and Engineering, Vivekanandha College of Engineering for Women, [email protected]

Abstract

Existing research on privacy-preserving data publishing focuses on relational data: in this

context, the objective is to enforce privacy-preserving paradigms, such as k-anonymity and l-diversity, while minimizing the information loss incurred in the anonymizing process

(i.e., maximize data utility). Existing techniques work well for fixed-schema data, with low

dimensionality. We propose two categories of novel anonymization methods for sparse

high-dimensional data. The first category is based on approximate nearest-neighbor (NN)

search in high-dimensional spaces, which is efficiently performed through locality-sensitive

hashing (LSH).

IDENTIFICATION OF SPAM USING STRUCTURE ABSTRACTION

GENERATION

Sharjina Rani. R

M.E. Computer Science and Engineering

Sri Krishna College Of Engineering

and Technology

[email protected]

Abstract

Spam is the use of electronic messaging systems to send unwanted bulk messages

indiscriminately. In spam detection, collaborative filtering with near duplicate similarity

matching scheme has been recently used. Email abstraction is generated from email content

text. The main goal of near duplicate similarity matching scheme is to maintain a known

spam database, formed by user feedback, to block subsequent near-duplicate spams. The

email abstractions cannot fully catch the spams, and are not effective in near-duplicate

detection because these abstractions are too brief and thus they are not robust enough to

withstand intentional attacks. In proposed system, Collaborative Spam Detection

System(Cosdes) with a novel e-mail abstraction scheme is used. This project is about

identification of spam using HTML content in email. In this scheme, e-mail layout structure

is used to represent emails. This email abstraction scheme generates the e-mail abstraction

using HTML content in email and then designs a complete spam detection system Cosdes.

Cosdes consists of two schemes: a near-duplicate matching scheme and a progressive update scheme. The progressive update scheme enables Cosdes to keep

the most up-to-date information for near-duplicate detection. The primary idea of near

duplicate scheme for spam detection is to use reported known spams to block subsequent

ones which have similar content.

Dr.R.K.Gnanamurthy

Principal,

Vivekanandha College of

Engineering for Women


MINING KDD CUP DATABASE FOR INTRUSION DETECTION

BASED ON FUZZY CLASS-ASSOCIATION RULE MINING USING

GENETIC NETWORK PROGRAMMING

N.Mohan Prabhu,

PG student Dept of CSE ,

Sudharshan Engg College,

Pudukkottai.

Email id: [email protected]

Abstract

As the Internet services spread all over the world, many kinds and a large number of

security threats are increasing. Therefore, intrusion detection systems, which can

effectively detect intrusion accesses, have attracted attention. This paper describes a novel

fuzzy class-association rule mining method based on genetic network programming (GNP)

for detecting network intrusions. GNP is an evolutionary optimization technique, which

uses directed graph structures instead of strings in genetic algorithm or trees in genetic

programming, which leads to enhancing the representation ability with compact programs

derived from the reusability of nodes in a graph structure. By combining fuzzy set theory with GNP, the proposed method can deal with mixed databases that contain both discrete and continuous attributes, and can also extract many important class-association rules that contribute to enhancing detection ability. Therefore, the proposed method can be flexibly applied to both misuse and anomaly detection in network intrusion detection problems.

Mr.Venkatasan,

Asst Prof Dept of CSE,

Sudharshan Engg College,

Pudukkottai.


LIVER CANCER CLASSIFICATION FROM GENE EXPRESSION

USING SWARM INTELLIGENCE

N.Kannaiya Raja,

M.E., (P.hd) .,

A.P/CSE Dept.

Arulmigu Meenakshi Amman College of Engg

Thiruvannamalai Dt,

Near Kanchipuram ,

[email protected]

Abstract

Microarray gene expression analysis plays a vital role in classifying different types of liver cancer tissue samples according to their gene expression, with the help of known sample expression levels extracted from different sources. We present a parallel form of the dataset that allows a high level of classification to be performed on data collected from different hospitals worldwide, integrated through a logical network into knowledge mining and used as an image in the microarray, improving the accuracy on test samples in a computer-aided diagnosis framework compared with a single diagnosis. Such datasets are highly desirable in medical diagnosis, as medical experts can gain the additional information needed for each diagnosis. We investigate knowledge discovery from clinically collected datasets for liver cancer, which is a chronic disease and a major public health challenge in the world. According to international statistics, about 200 million people are living with liver disease worldwide and this number is expected to rise within two years. We therefore propose a Particle Swarm Optimization based Feasible Ensemble Classifier (PSO-FEC) for gene selection and accurate liver cancer classification. The FEC technique shows good discriminating power in gene expression analysis, PSO-FEC provides better classification accuracy than the GA-FEC classifier, and the FEC method can be easily extended to nonlinear classifiers.

P.Uma Devi, ME.,(P.hd).,

A.P/CSE Dept.

Arulmigu Meenakshi

Amman College of Engg

Thiruvannamalai Dt,

Near Kanchipuram ,

[email protected]

Dr. K.Arulanandam, Prof & Head,

CSE Department,

Ganadipathy Tulsi's Jain Engineering

College, Vellore,

[email protected]

N.Saranya Arulmigu Meenakshi

Amman College of Engg

Thiruvannamalai Dt, Near

Kanchipuram ,

[email protected]


AN EFFICIENT FALSE HITS REDUCTION BY AUTHENTICATED

MULTISTEP NEAREST NEIGHBOR SEARCH

Abstract

For an increasing number of modern database applications, efficient support of similarity

search becomes an important task. Multistep processing is commonly used for nearest

neighbor (NN) and similarity search in applications for costly distance computations.

Today, many such applications require a proof of result correctness. In this setting, clients

issue NN queries to a server that maintains a database signed by a trusted authority. The

server returns the NN set along with supplementary information that permits result

verification using the data set signature. An adaptation of the multistep NN algorithm incurs prohibitive network overhead due to the transmission of false hits, that is, records that are not in the NN set. In order to alleviate this problem, this paper presents a novel technique that reduces the size of each false hit, and extends it to a distributed setting where the database is horizontally partitioned over several servers.

FEATURE BASED SEMANTIC MULTI-DOCUMENT UPDATE SUMMARY GENERATION

K.Ambika, Department of Computer Science & Engineering, Kongu Engineering College, Perundurai, Tamilnadu, India, [email protected]
A.Kogilavani, Department of Computer Science & Engineering, Kongu Engineering College, Perundurai, Tamilnadu, India, [email protected]

Abstract

Multi-document summarization is an automatic procedure aimed at extraction of

information from multiple texts written about the same topic. This paper proposes an

approach to produce an extractive semantic summary from multiple documents. Initially, a list of sentences is constructed for each document by annotating the sentences according to their aspects. To calculate the score of each sentence, a set of features such as Word, Position, Length, Centrality, Sentence with Proper Noun, Numerical Data, Annotated Sentence, Preposition and Named Entities is used. Finally, the top-ranking sentences are selected for the initial summary. Then a phrase matching process is used to remove redundancies and to generate the update summary. Experimental evaluation shows that

the proposed system outperforms the existing system.
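A small sketch of feature-based sentence scoring using a few of the features listed above (position, length, numerical data and a word-overlap centrality); the weights are arbitrary and the semantic annotation, proper-noun and named-entity features are left out.

```python
import re

def score_sentences(sentences):
    """Combine simple position, length, numeric and centrality features per sentence."""
    tokens = [set(re.findall(r"[a-z]+", s.lower())) for s in sentences]
    scores = []
    for i, s in enumerate(sentences):
        position = 1.0 - i / max(1, len(sentences) - 1)        # earlier sentences rank higher
        length = min(len(s.split()) / 20.0, 1.0)               # favour reasonably long sentences
        numeric = 1.0 if re.search(r"\d", s) else 0.0          # numerical-data feature
        overlap = sum(len(tokens[i] & t) for j, t in enumerate(tokens) if j != i)
        centrality = overlap / max(1, len(sentences) - 1)
        scores.append(0.3 * position + 0.2 * length + 0.2 * numeric + 0.3 * centrality)
    return scores

docs = ["The flood displaced 3,000 people on Monday.",
        "Relief camps were opened across the district.",
        "Officials said the camps can host 5,000 people."]
top = [s for _, s in sorted(zip(score_sentences(docs), docs), reverse=True)[:2]]
```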

Jasmine Alice Manonmani. J

M.E - Software Engineering

Jayaram College of Engg and Tech,

Anna University, Trichy

Pagalavadi, Tamilnadu State 621014,

India

[email protected]

Senthilmathi. T M.E.,

Asst Professor, Dept. of CS

Jayaram College of Engg and Tech,

Anna University, Trichy

Pagalavadi, Tamilnadu State 621014,

[email protected]

Page 72: NCIIT 12 Proceedings


IMPROVING THE INTRUSION DETECTION SYSTEM USING AN

ELEGANT ADAPTIVE LEARNING TECHNIQUE

S.Suganya,

PG Student,

CSE,

Kongu Engineering College,

Erode, Tamil Nadu, India

[email protected]

Abstract

Security is an important issue in a network, since there is a possibility for third parties to interrupt the communication. It is therefore necessary to secure the system from adversaries. The technique used to detect such intrusions is called an Intrusion Detection System (IDS). Many solutions have been proposed for improving IDS; one of them is an IDS using an Artificial Neural Network (ANN). To classify the different types of attacks, the ANN uses the back-propagation algorithm with a feed-forward neural network. The main drawback of the ANN, however, is that it is inefficient for larger networks. The proposed system focuses on improving the IDS for larger networks by implementing a new technique called the Adaptive Learning System (ALS). The ALS technique maintains a detection model which has the ability to identify known intrusions in the system. Since it classifies the activities of the system, if any new intrusion or malfunction takes place, the ALS will easily identify it through the detection model and will classify those unknown activities as anomalies. This will improve the efficiency and lifespan of the network. Experiments will be evaluated with the KDD CUP 99 data sets to test the behavior of the system.
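For context, the back-propagation baseline mentioned above can be summarised by a tiny feed-forward network; this sketch is only an illustration of that baseline (binary normal/attack labels, arbitrary layer size and learning rate), not the proposed ALS technique.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_ids_ann(X, y, hidden=8, lr=0.5, epochs=2000, seed=0):
    """Tiny feed-forward network trained with back-propagation to separate
    'normal' (0) from 'attack' (1) records. Illustrative sketch only."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], hidden))
    W2 = rng.normal(scale=0.5, size=(hidden, 1))
    y = y.reshape(-1, 1)
    for _ in range(epochs):
        h = sigmoid(X @ W1)              # forward pass
        out = sigmoid(h @ W2)
        err = out - y                    # squared-error gradient (up to a factor)
        d_out = err * out * (1 - out)    # back-propagate through the sigmoids
        d_h = (d_out @ W2.T) * h * (1 - h)
        W2 -= lr * h.T @ d_out / len(X)
        W1 -= lr * X.T @ d_h / len(X)
    return W1, W2

def predict(X, W1, W2):
    return (sigmoid(sigmoid(X @ W1) @ W2) > 0.5).astype(int).ravel()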

WEB USER INTERFERENCE BY CLUSTERING

M.Malathy, 2nd year P.G Student, St. Peter's University, Chennai, [email protected]

M.C.Babu, Lecturer of CSE, St. Peter's University, Chennai.

Abstract

This paper focuses on the definition and identification of "Web user-sessions", aggregations of several TCP connections generated by the same source host. The identification of a user-session is non-trivial. Traditional approaches rely on threshold-based mechanisms. However, these techniques are very sensitive to the value chosen for the threshold, which may be difficult to set correctly. By applying clustering techniques, we define a novel methodology to identify Web user-sessions without requiring an a priori definition of threshold values. We define a clustering-based approach, discuss pros and cons of this approach, and apply it to real traffic traces. The proposed methodology is applied to artificially generated traces to evaluate its benefits against traditional threshold-based approaches. Web user-sessions tend to be Poisson, but correlation may arise during periods of anomalous network/host behavior.
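A toy sketch of the two approaches being contrasted: a fixed inactivity threshold versus a split point derived from the data itself (here a simple one-dimensional 2-means over log inter-connection gaps). This only illustrates the idea of avoiding an a priori threshold and is not the clustering methodology of the paper.

import math

def sessions_by_threshold(start_times, gap_threshold):
    """Split a host's sorted TCP connection start times into sessions using a
    fixed inactivity threshold (the traditional approach)."""
    sessions, current = [], [start_times[0]]
    for prev, t in zip(start_times, start_times[1:]):
        if t - prev > gap_threshold:
            sessions.append(current)
            current = []
        current.append(t)
    sessions.append(current)
    return sessions

def threshold_from_gaps(start_times, iters=20):
    """Derive the split point from the data: 1-D 2-means on log gaps separates
    short intra-session gaps from long inter-session gaps."""
    gaps = [math.log(t - p) for p, t in zip(start_times, start_times[1:]) if t > p]
    lo, hi = min(gaps), max(gaps)
    for _ in range(iters):
        a = [g for g in gaps if abs(g - lo) <= abs(g - hi)]
        b = [g for g in gaps if abs(g - lo) > abs(g - hi)]
        lo = sum(a) / len(a) if a else lo
        hi = sum(b) / len(b) if b else hi
    return math.exp((lo + hi) / 2)   # boundary between the two gap clusters

A host's sessions would then be obtained as sessions_by_threshold(times, threshold_from_gaps(times)).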

R.Manjuladevi

Assistant professor, Kongu

Engineering College, Erode,

Tamil Nadu, India

Page 73: NCIIT 12 Proceedings


AN INDEXING METHOD FOR XML DATA

P.Nataraj,

PG Scholar,

Department of CSE,

Kongu Engineering College,

Perundurai, Erode.

Email: [email protected]

Abstract

XML has now become the standard for transporting data between different web applications, and because of this widespread use, updates take place frequently. We need a labeling scheme that can handle such updates effectively. There are many XML labeling schemes that assign a label to every node in the XML tree structure; when an update takes place, a new label must be assigned to the affected node, and this can force changes to the existing labels because the tree structure changes, increasing the labeling complexity whenever new data is added. We need an XML labeling technique that can insert a new label without changing the labels already present in the XML tree. We propose a new labeling scheme based on Improved Binary String Labeling, and XMill compression is used to reduce the label size.
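For intuition, the sketch below shows one simple way binary-string labels, read as binary fractions, let a new sibling label be generated strictly between two existing labels so that no existing label changes on insertion; sibling order here is given by the fractional value of the label. This illustrates the general idea only and is not the Improved Binary String Labeling scheme itself.

from fractions import Fraction

def to_fraction(label: str) -> Fraction:
    # interpret a binary-string label as the fraction 0.label (base 2)
    return Fraction(int(label, 2), 2 ** len(label))

def to_label(value: Fraction) -> str:
    # binary expansion of a dyadic rational in (0, 1); terminates because
    # the denominator is a power of two
    digits = []
    while value != 0:
        value *= 2
        digits.append('1' if value >= 1 else '0')
        if value >= 1:
            value -= 1
    return ''.join(digits)

def label_between(left: str, right: str) -> str:
    # a label that sorts strictly between two existing sibling labels,
    # so neighbouring nodes never need to be relabelled on insertion
    mid = (to_fraction(left) + to_fraction(right)) / 2
    return to_label(mid)

# e.g. label_between("01", "1") == "011", whose value 0.375 lies between 0.25 and 0.5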

CUSTOMIZED NEWS FILTERING AND SUMMARIZATION

SYSTEM BASED ON PERSONAL INTEREST

Anand babu M.H

[email protected]

Department of M.E Software Engineering

Anna University of Technology,

Thiruchirapalli

Abstract

Information on the World Wide Web includes large amounts of news content. Recommendation, filtering, and summarization of Web news have received much attention in Web intelligence, aiming to find interesting news and to summarize concise content for users. In this paper, we present our research on developing the Customized News Filtering and Summarization system (CNFS). An embedded learning component of CNFS induces a user interest model and recommends customized news. A keyword knowledge base is maintained and updated in real time to reflect general Web news topics and the user's interest preferences. Non-news content irrelevant to the news Web page is filtered out. Keywords that capture the main topic of the news are extracted using lexical chains to represent semantic relations between words. An example run of our CNFS system demonstrates the superiority of this Web intelligence system.

Dr P.Jayanthi,

Assistant Professor (SG),

Department of CSE,

Kongu Engineering College,

Email: [email protected]

Page 74: NCIIT 12 Proceedings


LOW POWER FILTER DESIGN USING OPTIMIZED

MULTIPLEXER BASED MULTIPLIER AND ADDER CELL

G.Anjuga Priyanka G.Sathish Kumar Prof.B.M.Prabhu Prof.L.Raja

Angel College of Engineering and Technology,

Tirupur

Abstract

The main components of a digital filter are the adder, the multiplier and the delay element. The filter is optimized with a proposed adder and multiplier architecture based on the concept of multiplexers; the adder and multiplier are reconfigured around the multiplexer concept. In the proposed architecture, the full adder circuit has only 14 transistors, which results in low power and occupies less area; an obvious method to reduce power consumption is to reduce the number of transistors in a circuit. Among the three inputs of the full adder, one input is used as the select line for the multiplexers, and six multiplexers and an inverter are used in the proposed architecture. After analysing the performance characteristics of conventional multiplier types, it is observed that the multiplexer-based multiplication algorithm is more advantageous. The multiplexer-based adder is designed in Tanner S-Edit and the circuit is simulated. The proposed adder consumes 23% less power than the most power-efficient 10-transistor adders and is 64% faster than all other tested adders; it is therefore suitable for building larger low-power, high-performance VLSI systems.
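A behavioural sketch of the multiplexer idea: with the carry-in acting as the select line, the sum and carry-out are each chosen between two precomputed alternatives. This models only the selection logic, not the 14-transistor circuit or its power figures.

def mux(select, when0, when1):
    """2:1 multiplexer: returns when0 if select is 0, when1 if select is 1."""
    return when1 if select else when0

def full_adder(a, b, cin):
    """Full adder expressed through multiplexers, with cin as the select line:
    cin = 0 -> sum = a XOR b,  carry = a AND b
    cin = 1 -> sum = a XNOR b, carry = a OR b"""
    axb = a ^ b
    s = mux(cin, axb, 1 - axb)
    cout = mux(cin, a & b, a | b)
    return s, cout

# exhaustive check against the arithmetic definition
assert all(full_adder(a, b, c) == ((a + b + c) % 2, (a + b + c) // 2)
           for a in (0, 1) for b in (0, 1) for c in (0, 1))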

OPTIMUM THROUGHPUT ESTIMATION IN MULTIBAND

MULTIANTENNA WIRELESS MESH NETWORKS

Deepika.S, P.G Scholar, Sri Krishna College of Engineering and Technology, Coimbatore, India, 641008, Email: [email protected]

Dr.H.Mangalam, Professor, Sri Krishna College of Engineering and Technology, Coimbatore, India, 641008

Abstract

The shortage of bandwidth has made both industry and government explore new ways of using limited resources. Recent advances in signal processing, combined with those in antenna technology, provide MIMO capabilities, thereby creating opportunities for enhancing the throughput of wireless networks. Together, SDR and MIMO enable next-generation wireless networks such as mesh networks to support dynamic and adaptive bandwidth sharing along time, frequency and space. The use of multiple antennas at the transmitter suppresses the interference at the receiver and facilitates the use of multiple spectrum bands. We establish nodes in the WMN, find the shortest path, and then evaluate an approximation algorithm to predict the loss caused by the constraints and estimate the throughput. Here, LP constraint-relaxation techniques are used to characterize and analyze the maximum achievable throughput that multihop, multiband

Page 75: NCIIT 12 Proceedings


BACKGROUND MODELING AND SUBTRACTION OF DYNAMIC

SCENES

SUGANYA DEVI A, M.TECH-IT,

Sasurie College of Engineering, Tirupur, India

([email protected])

Abstract

This project presents an approach to video metrology from videos acquired by a stationary camera: we first recover an accurate model of the scene and, based on it, finely track the moving objects in that scene. We finally fuse the multi-frame measurements using the least median of squares (LMedS) as a robust cost function. The authors examine the problem of segmenting foreground objects in live video when background scene textures change over time. In particular, we formulate background subtraction as minimizing a penalized instantaneous risk functional, yielding a local online discriminative algorithm that can quickly adapt to temporal changes. We analyze the algorithm's convergence, discuss its robustness to non-stationarity, and provide an efficient

nonlinear extension via sparse kernels. To accommodate interactions among neighboring

pixels, a global algorithm is then derived that explicitly distinguishes objects versus

background using maximum a posteriori inference in a Markov random field (implemented

via graph-cuts). By exploiting the parallel nature of the proposed algorithms, we develop an

implementation that can run efficiently on the highly parallel graphics processing unit

(GPU). Empirical studies on a wide variety of datasets demonstrate that the proposed

approach achieves quality that is comparable to state-of-the-art offline methods, while still

being suitable for real-time video analysis.

A COMPARISON STUDY OF GENETIC ALGORITHM AND

ARTIFICIAL IMMUNE SYSTEM

Swathy Priyadharsini P

Bannari Amman Institute of Technology,

Sathyamangalam Erode,

Tamil Nadu, India

[email protected]

Abstract

A Bloom filter is a probabilistic and space-efficient data structure designed to check the membership of an element in a set. The trade-off of using a Bloom filter is a configurable risk of false positives; the false-positive rate can be made low if the hash bit map is sufficiently large. Spam is an unsolicited or irrelevant message sent on the internet to a large number of users or newsgroups, and a spam word list is a list of well-known words that usually appear in spam mails. In the proposed system, the Bin Bloom Filter (BBF) groups the words into a number of Bloom filters that have different false positive rates based on the weights of the spam words. The Clonal Selection Algorithm (CSA) is one of the methods in Artificial Immune Systems (AIS), which are computational methods inspired by the biological immune system. This paper demonstrates the CSA for minimizing the total membership invalidation cost of the BBF, finding the optimal false positive rates and the number of elements to be stored in the Bloom filters of each bin. The experimental results demonstrate the application of CSA in BBF and compare the results with the Genetic Algorithm (GA).
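For reference, the standard false-positive estimate for a Bloom filter with m bits, k hash functions and n stored elements is p ≈ (1 − e^(−kn/m))^k. The sketch below is a minimal single Bloom filter illustrating that trade-off; it is not the Bin Bloom Filter or the CSA optimisation described in the paper.

import hashlib, math

class BloomFilter:
    """Minimal Bloom filter: k hash functions over an m-bit array."""
    def __init__(self, m, k):
        self.m, self.k = m, k
        self.bits = bytearray(m)          # one byte per bit, for simplicity

    def _positions(self, item):
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(h, 16) % self.m

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos] = 1

    def __contains__(self, item):
        return all(self.bits[pos] for pos in self._positions(item))

def false_positive_rate(m, k, n):
    """Expected false-positive probability after inserting n elements."""
    return (1 - math.exp(-k * n / m)) ** k

# e.g. a 10 000-bit filter with 7 hashes holding 1 000 spam words:
# false_positive_rate(10_000, 7, 1_000) ≈ 0.008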

Arulanand Natarajan

Anna University of Technology

Coimbatore

Tamil Nadu, India

[email protected]

Page 76: NCIIT 12 Proceedings


BIOMETRICS AS AN AUTHENTICATION MEASURE

Anantha kumar. T

Junior Research Fellow

Department of Computer Science and Engineering

Bannari Amman Institute of Technology, Sathyamangalam, Erode

[email protected]

Abstract

The term "biometrics" is derived from the Greek words bio (life) and metric (to measure).

Biometrics refers to the automatic identification of a person based on his/her physiological

or behavioral characteristics. This method of identification is preferred over traditional

methods involving passwords and PIN numbers for its accuracy and case sensitiveness. A

biometric system is essentially a pattern recognition system which makes a personal

identification by determining the authenticity of a specific physiological or behavioral

characteristic possessed by the user. An important issue in designing a practical system is to

determine how an individual is identified. Depending on the context, a biometric system

can be either a verification (authentication) system or an identification system. Verification

involves confirming or denying a person's claimed identity while in identification, one has

to establish a person's identity. Biometric systems are divided on the basis of the

authentication medium used. They are broadly divided as identifications of Hand

Geometry, Vein Pattern, Voice Pattern, DNA, Signature Dynamics, Finger Prints, Iris

Pattern and Face Detection. These methods are used on the basis of the scope of the testing

medium, the accuracy required and speed required. Every medium of authentication has its

own advantages and shortcomings. With the increased use of computers as vehicles of

information technology, it is necessary to restrict unauthorized access to or fraudulent use

of sensitive/personal data. Biometric techniques being potentially able to augment this

restriction are enjoying a renewed interest.

FREE AND OPEN SOURCE SOFTWARE FOR DISASTER MANAGEMENT

Prasanna Balaji M.S

Department of Computer Science and Engineering

Bannari Amman Institute of Technology

[email protected]

Abstract

Free and open source software is increasingly being used in many spheres of development

including disaster management. With the economic downturn, stakeholders at both ends of

the technology divide are turning to free and open source software solutions. Despite the successes of open source, a few challenges, including sustainability, remain a problem. While the use of open source varies between "developed" and "developing" nations, lower cost and vendor independence have been cited as the key factors in favour of its use. The paper illustrates the empowerment of communities in developing nations through appropriate open source applications. Specifically, the paper describes the functioning of the "Sahana" disaster management system deployed during the 2004 Indian Ocean Tsunami. Sahana used free and open source software to create a number of functionalities, whose success has contributed to efficient and effective management of disaster relief. The

paper offers a set of generic policy options for the use of open source in disaster

management.

Page 77: NCIIT 12 Proceedings


Dynamic Bandwidth Adaptation supported Adaptive Call Admission

Control Mechanism for 3GPP: LTE Networks

Senpaka Priya V

Department of Computer Science and

Engineering,

Bannari Amman Institute of Technology

Erode, TN, India

[email protected]

J.Vijay Franklin

Department of Computer Science and

Engineering,

Bannari Amman Institute of Technology

Erode, TN, India

[email protected]

Abstract

In today's fast-moving world of mobile devices there is always a growing demand for high-rate services, so a call has to continue with the same data rate during a handoff. This paper deals with a novel approach to reduce the call dropping probability while ensuring that QoS demands are met in LTE wireless networks. The reduction is based on an Adaptive Call Admission Control (Ad-CAC) scheme which gives priority to handoff calls over new calls. The Dynamic Bandwidth Adaptation (DBA) approach is used to maximize the overall system utilization while keeping the blocking rates low. This approach also maintains a low new-call blocking rate.

Keywords-Ad-CAC, DBA, Handoff call, New call, QoS
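A toy sketch of guard-band style admission control with bandwidth degradation, meant only to illustrate how handoff calls can be prioritised while keeping blocking low; the capacities, rates and degradation rule are illustrative assumptions, not the Ad-CAC/DBA scheme itself.

class AdmissionController:
    """Toy cell model: total bandwidth, a guard band reserved for handoffs,
    and optional degradation of existing calls to squeeze a handoff in."""
    def __init__(self, capacity, guard, min_rate, full_rate):
        self.capacity, self.guard = capacity, guard
        self.min_rate, self.full_rate = min_rate, full_rate
        self.calls = []                       # current per-call rates

    def _used(self):
        return sum(self.calls)

    def admit(self, is_handoff):
        free = self.capacity - self._used()
        # new calls may not touch the guard band reserved for handoffs
        limit = free if is_handoff else free - self.guard
        if limit >= self.full_rate:
            self.calls.append(self.full_rate)
            return True
        if is_handoff:
            # dynamic bandwidth adaptation: degrade existing calls toward
            # their minimum rate until the handoff call fits
            for i, r in enumerate(self.calls):
                if free >= self.min_rate:
                    break
                give = min(r - self.min_rate, self.min_rate - free)
                self.calls[i] -= give
                free += give
            if free >= self.min_rate:
                self.calls.append(self.min_rate)
                return True
        return False                          # call blocked / dropped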

An Efficient Jamming Detection In Wireless Mesh Networks

Ms.J.Soniya. (M.E),

Anna University of Technology, Coimbatore

[email protected]

Abstract

Wireless mesh networks (WMNs) have emerged as a key technology for next-

generation wireless networking. Because of their advantages over other wireless networks,

WMNs are undergoing rapid progress and inspiring numerous applications. However, many

technical issues still exist in this field. We develop a cross-layer approach for mesh access networks to address the unidirectional link problem. The main ideas of our approach are to eliminate unidirectional links at the network layer, to introduce bidirectional links at the same layer, and to detect jammers at the network layer.

Jamming is difficult to mitigate in broadcast networks because transmitting and receiving

are inherently symmetric operations: A user that possesses the key to decode a transmission

can also use that key to jam the transmission. We describe a code tree system that provides

input to the physical layer and helps the physical layer circumvent jammers. Finally, we

demonstrate that our scheme approaches the best possible performance by performing an

extensive analysis of the system using network simulator version 2.

Mr.M.Newlin Raj kumar

M.S.,M.B.A.,(Ph.D).,

Assistant professor of CSE Dept.,

Anna University of Technology, Coimbatore.

[email protected]

Page 78: NCIIT 12 Proceedings


Enhancing Privacy And Reducing The Traffic Levels In

Vehicular Ad-Hoc Networks

A.Naveenkumar,

P.G Scholar,

[email protected],

Angel College of Engineering

and Technology, Tirupur.

Abstract

A Vehicular Ad Hoc Network (VANET) is a special kind of wireless ad hoc network characterized by high node mobility and fast topology changes. Vehicular networks can provide a wide variety of services, ranging from safety and crash avoidance to internet access and multimedia applications. Attacking and misusing such networks could have destructive consequences. It is therefore necessary to integrate security requirements into the design of VANETs and defend VANET systems against misbehaviour, in order to ensure correct and smooth operation of the network. In this paper, we propose a security system for VANETs to achieve the privacy desired by vehicles and the traceability required by law enforcement authorities, in addition to satisfying fundamental security requirements including authentication, non-repudiation, message integrity, and confidentiality. Moreover, we propose a privacy-preserving defense technique for network authorities to handle misbehaviour in VANET access, considering the challenge that privacy provides an avenue for misbehaviour. The proposed system employs an identity-based cryptosystem where

certificates are not needed for authentication. We show the fulfilment and feasibility of our

system with respect to the security goals and efficiency.

Threat Modelling Using An Attack Surface Metric

Isaiarasi.S, M.E - Software Engineering, Jayaram College of Engineering and Technology, Anna University, Trichy, [email protected]

Abstract

Measurement of software security is a long-standing challenge to the research

community. At the same time, practical security metrics and measurements are essential for

secure software development. Hence, the need for metrics is more pressing now due to a

growing demand for secure software. In this paper, we propose using a software system's attack surface measurement as an indicator of the system's security. We formalize the notion of a system's attack surface and introduce an attack surface metric to measure the attack surface in a systematic manner. Our measurement method is agnostic to a software system's implementation language and is applicable to systems of all sizes; we demonstrate

our method by measuring the attack surfaces of small desktop applications and large

enterprise systems implemented in C and Java.
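A toy illustration of the attack-surface idea: each resource (entry/exit method, channel, untrusted data item) contributes a damage-potential to attacker-effort ratio, and the metric sums these contributions per resource class. The weights and categories below are illustrative assumptions, not the calibration used in the paper.

# damage potential and attacker effort expressed as ordinal weights (assumed)
DAMAGE = {"root": 5, "user": 3, "unauthenticated": 4, "authenticated": 2}
EFFORT = {"remote": 1, "local": 3, "admin_only": 5}

def contribution(damage_level, effort_level):
    """Damage-potential / attacker-effort ratio of a single resource."""
    return DAMAGE[damage_level] / EFFORT[effort_level]

def attack_surface(methods, channels, data_items):
    """Attack surface as a triple: one summed contribution per resource class.
    Each argument is a list of (damage_level, effort_level) pairs."""
    total = lambda resources: sum(contribution(d, e) for d, e in resources)
    return total(methods), total(channels), total(data_items)

# e.g. a service with two remote entry points running as root, one local
# admin-only channel, and one file readable by unauthenticated users:
m, c, d = attack_surface(
    methods=[("root", "remote"), ("root", "remote")],
    channels=[("user", "admin_only")],
    data_items=[("unauthenticated", "local")])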

V.Surendhiran,

P.G Scholar,

[email protected],

Angel College of Engineering

and Technology, Tirupur.

Sakthivel.V,

M.E - Software Engineering,

Jayaram College of Engineering

and Technology,

Anna University, Trichy

[email protected]

Page 79: NCIIT 12 Proceedings


Secure Routing Through Trusted Nodes For Mobile Adhoc Networks

Dhanya Simon, PG Scholar

Sri Ramakrishna Engineering College, Coimbatore.

Email: [email protected]

Abstract

The infrastructure-less nature of MANETs, their dynamic network topology and the lack of a certificate authority mean that the security problems of MANETs demand more attention. The proposed algorithm provides secure routing in mobile ad hoc networks. Trust is established through friends, and special challenges are used to authenticate the nodes.

The friend nodes are rated on the basis of the amount of data transmission they accomplish

and their friendship with other nodes in the network. This provides a robust mechanism for

preventing attacks by isolating malicious nodes in the network.

A Modified Approach For Continuous User Authentication And

Intrusion Detection In High Security Mobile Ad Hoc Networks

A. Caroline, Asst Prof,

Department of CSE,

Dr.S.J.S.Paul Memorial College of engineering and

technology, [email protected]

Abstract

Since mobile ad hoc networks are made up entirely of wireless mobile nodes, they are inherently more susceptible to security threats than fixed networks, and access to wireless links is virtually impossible to control. Authentication is the hallmark of security, and failure to achieve it is the stumbling block in the way of securing MANETs. Intrusion detection systems for MANETs are likewise indispensable for a reliable system. In this paper a modified approach is proposed that considers the connectivity of the network for scheduling decisions in high-security mobile ad hoc networks. In such an environment we introduce a packet scheduling mechanism that approximates an ideal scheduling mechanism sharing the throughput among the contending flows.

Furthermore, the connectivity of the network is ensured by connectivity improvement

algorithm. Simulation results are presented to show the performance of the proposed

approach.

Page 80: NCIIT 12 Proceedings


An Efficient Approach For Detecting Mobile Replica Node Attack In

Wireless Sensor Network

Ms.Saranya

Second Year [M.E]

Dept of Information Technology

M.A.M College of Engineering

Trichy, India

[email protected]

Abstract

Sensor networks are unattended and sensor nodes are not equipped with tamper-resistant hardware, so an adversary can capture and compromise sensor nodes, make replicas of them, and then mount a variety of attacks with these replicas. Replica node attacks are dangerous because the adversary takes the secret keying materials from a compromised node, generates a large number of attacker-controlled replicas that share the compromised node's keying materials and ID, and then spreads these replicas throughout the network. With a single captured node, the adversary can create as many replica nodes as he has the hardware to generate. Previous works on replica node attack detection suffer from a high communication/storage overhead and work only for static sensor networks; they do not work in mobile sensor networks, where sensors are expected to move. An Efficient and Distributed Detection (EDD) scheme along with

Sequential Hypothesis testing for detecting replica node attacks in mobile sensor networks

is proposed. The security and performance analysis indicate that the proposed scheme can

identify replica node attacks with a high detection probability at the cost of a low

computation/communication/storage overhead.
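Sequential hypothesis testing of the kind mentioned above is commonly realised as a sequential probability ratio test (SPRT); the sketch below is a generic SPRT over per-observation "suspicious" flags (for example, a node observed moving faster than physically possible), with illustrative probabilities and error bounds rather than the paper's parameters.

import math

def sprt(observations, p0=0.1, p1=0.6, alpha=0.01, beta=0.01):
    """Generic sequential probability ratio test.
    observations: iterable of 0/1 flags (1 = suspicious evidence).
    H0: benign node (P(flag) = p0)   H1: replica node (P(flag) = p1)
    Returns 'replica', 'benign', or 'undecided' if the data runs out first."""
    upper = math.log((1 - beta) / alpha)      # accept H1 above this bound
    lower = math.log(beta / (1 - alpha))      # accept H0 below this bound
    llr = 0.0                                 # running log-likelihood ratio
    for x in observations:
        if x:
            llr += math.log(p1 / p0)
        else:
            llr += math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "replica"
        if llr <= lower:
            return "benign"
    return "undecided"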

An Optimal Information Hiding Technique With Tree Based Similarity

S.Kalaivani

PG Scholar,

Computer Science and Engineering,

Srinivasan Engineering College,

[email protected]

Abstract

Information hiding deals with distortion reduction using steganography and security enhancement using cryptography. Distortion reduction is done using Tree Based Parity Check, which uses a majority-vote strategy; Tree Based Parity Check is well suited to cloaking a message in an image, and the proposed majority-vote strategy results in the least distortion. The SHA-1 algorithm is implemented for security enhancement. The proposed method works effectively even with a large payload.

Ms. Nerthikaa

Second Year [M.E]

Dept of Computer science

and Engineering

Srinivasan College of

Engineering

Perambalur,India

[email protected]

V. Maheshwari,

Assistant Professor, Computer

Science and Engineering,

Srinivasan Engineering College.

[email protected]

Page 81: NCIIT 12 Proceedings


An Effective Minimization Of Storage Overhead For Tracking Down The

Invasion Of Replicated Nodes In Wireless Sensor Networks

R.Rajavaishnavi,

PG Scholar,

Computer Science and Engineering,

Srinivasan Engineering College

[email protected]

Abstract

Wireless sensor networks are vulnerable because nodes are often placed in a hostile or

dangerous environment where they are not physically protected. A central problem in

sensor network security is that sensors are susceptible to physical node capture attacks.

Once a sensor is compromised, the adversary can easily launch replica attacks by

duplicating the compromised node, distributing the replicas throughout the network, and

starting a variety of insider attacks. Previous works against replica attacks suffer from a high communication/storage overhead or poor detection accuracy. A new protocol called the Node Location Maintenance (NLM) protocol is used, which employs a table of values at

each node to record the trace of the random walks. Each witness node will create a new

entry in its table for every new location claim. Randomized, efficient and distributed (RED)

protocol is used for the detection of node replication attacks. RED executes routinely at

fixed intervals of time. The security and performance analysis indicate that the proposed

scheme can identify replica attacks with a high detection probability at the cost of a low

computation/communication/storage overhead.

Correlation-Based Traffic Analysis Attacks On Anonymity Networks

M.Kumaresan, II MCA, Department of Computer Science, PREC, [email protected]

Abstract

Mixes have been used in many anonymous communication systems and are

supposed to provide countermeasures to defeat traffic analysis attacks. In this project,

we focus on a particular class of traffic analysis attacks, flow correlation attacks, by

which an adversary attempts to analyze the network traffic and correlate the traffic of a

flow over an input link with that over an output link. Two classes of correlation methods

are considered, namely time-domain methods and frequency-domain methods. Based on

our threat model and known strategies in existing mix networks, we perform extensive

experiments to analyze the performance of mixes. We find that all but a few batching

strategies fail against flow-correlation attacks, allowing the adversary to either identify

ingress or egress points of a flow or to reconstruct the path used by the flow. Counterintuitively, some batching strategies are actually detrimental against attacks. The

empirical results provided in this project give an indication to designers of Mix

networks about appropriate configurations and mechanisms to be used to counter flow-

correlation attacks.
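As a minimal illustration of time-domain flow correlation, one can bin the packet timestamps of an input flow and an output flow into fixed intervals and compute the Pearson correlation of the two rate series; this generic sketch is not the specific correlation method or threat model evaluated in the project.

def rate_series(timestamps, interval, duration):
    """Packet counts per fixed-length time bin."""
    bins = [0] * int(duration / interval + 1)
    for t in timestamps:
        bins[int(t / interval)] += 1
    return bins

def pearson(xs, ys):
    """Plain Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy) if vx and vy else 0.0

def flow_correlation(in_times, out_times, interval=1.0):
    """Correlate an input flow with an output flow; values near 1 suggest the
    two observations belong to the same end-to-end flow."""
    duration = max(max(in_times), max(out_times))
    return pearson(rate_series(in_times, interval, duration),
                   rate_series(out_times, interval, duration))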

Prof.S.Chinnadurai,

Assistant Professor, Computer

Science and Engineering,

Srinivasan Engineering College

[email protected]

N.Rajavel, II MCA, [email protected]

C.Raja, II MCA, [email protected]

R.Gnanakumarn, Asst. Professor, [email protected]

Page 82: NCIIT 12 Proceedings


Quick Response (QR) Code: A Review

A.K.Shafreen Banu

Asst. Professor,

Department of IT,

Bishop Heber College,

Trichy-620017.

[email protected].

Abstract

A QR code (abbreviated from Quick Response code) is a type of matrix barcode (or

two-dimensional code) first designed for the automotive industry. More recently, the

system has become popular outside of the industry due to its fast readability and large

storage capacity compared to traditional UPC barcodes. The code consists of black modules

arranged in a square pattern on a white background. The information encoded can be made up of four standardized kinds ("modes") of data (numeric, alphanumeric, byte/binary, and kanji) or, through supported extensions, virtually any kind of data. The technology has seen frequent use in the United Kingdom and the United States; QR usage is growing fastest in Canada and Hong Kong. Created by the Toyota subsidiary Denso Wave in 1994 to track vehicles during the manufacturing process, the QR code is one of the most popular types of two-dimensional barcodes and was designed to allow its contents to be decoded at high speed.

Modified Multimedia Architecture For Mobile Multimedia Application

S.Parthasarathy

M.E Student,

Anna University of Technology Madurai,

Madurai.

[email protected]

Abstract

This paper presents a dynamically modified SRAM array for low-power mobile multimedia applications. The proposed structure uses a lower voltage for cells storing low-order bits and a nominal voltage for cells storing higher-order bits. Parametric failures due to manufacturing variations limit the opportunities for power saving in SRAM. The architecture allows the number of bits in the low-voltage mode to be reconfigured, changing the error characteristics of the array at run time. Simulations at a predictive 70 nm node show more than 45% savings in memory power with a marginal (10%) reduction in image quality.

Index Terms – Image Processing, low power, multimedia, process variation, SRAM.

M.Lovelin Ponn Felciah

Asst. Professor,

Department of IT,

Bishop Heber College,

Trichy-620017.

[email protected]

G.Kavinraj

Student, II M.Sc. IT,

Department of Information

Technology, Bishop Heber

College, Trichy-620017.

[email protected]

Mr.V.Arun,

Assistant Professor of ECE,

Anna University of

Technology, Madurai,

Madurai.

Page 83: NCIIT 12 Proceedings


A Novel Hybrid Approach To Detect Color Texts In Natural Scene

Images

C.Selvi (M.E-CSE)

PG Scholar

[email protected]

Gnanamani College of Technology

Abstract

Large amounts of information are embedded in natural scenes which are often required

to be automatically recognized and processed. This requires automatic detection,

segmentation and recognition of visual text entities in natural scene images. In this paper,

we present a hybrid approach to detect color texts in natural scene images. The approaches used in this project are the region-based and the connected-component-based approaches. A text

region detector is designed to estimate the probabilities of text position and scale, which

helps to segment candidate text components with an efficient local binarization algorithm.

To combine unary component properties and binary contextual component relationships, a

conditional random field (CRF) model with supervised parameter learning is proposed.

Finally, text components are grouped into text lines/words with a learning-based energy

minimization method. In our proposed system, a selective metric-based clustering is used to

extract textual information in real-world images, thus enabling the processing of character

segmentation into individual components to increase final recognition rates. This project is evaluated on a natural scene image dataset.

VP8 Video Codecs For Mobile Applications

Basavaraju S,

TE Dept, Dr AIT Bangalore

[email protected]

Abstract

Google has recently released the video compression format VP8 to the open source

community. This new compression format competes against the existing H.264 video

standard developed by the ITU-T Video Coding Experts Group (VCEG) in collaboration

with the ISO/IEC Moving Picture Experts Group (MPEG). This paper compares the coding standards in terms of video bit rate-distortion (quality) performance and video network traffic variability over different long video sequences. VP8 is based on decomposition of

frames into square sub-blocks of pixels, prediction of such sub-blocks using previously

constructed blocks, and adjustment of such predictions (as well as synthesis of unpredicted

blocks) using a discrete cosine transform (hereafter abbreviated as DCT).

Mr.A.A.R.Senthil Kumaar, M.E., (Ph.D)., Head of the Department, Gnanamani College of Technology.

Dr B Siva Kumar, Professor and HOD of TE Dept, Dr AIT Bangalore, [email protected]

Page 84: NCIIT 12 Proceedings


Image Based Learning To Enhance The Study Of Visually Impaired Persons

Mrs J Kokila,

PG Student,

Department of CSE,

Oxford Engineering College,

Tiruchirappalli

[email protected]

Abstract

Bag-of-visual Words (BoWs) representation has been applied for various problems in

the field of multimedia and computer vision. In this paper, descriptive visual words

(DVWs) and descriptive visual phrases (DVPs) are proposed as the visual correspondences

to the text words and phrases, where visual phrases refer to the frequently co-occurring

visual word pairs. The proposed framework will select images and transform it into a text

file using descriptive visual words and visual phrases method. The text file will be read by

the speech synthesizer which tells to the visually impaired person. Independent navigation

is always a challenge to visually impaired person, whenever they learning in the image

based books such as science, social science, biology, computer science books. We apply the

identified DVWs and DVPs in several applications including large-scale near-duplicated

image retrieval, image re-ranking, object recognition and text to speech synthesis. The

proposed image re-ranking algorithm: DWP Rank performs the state-of-the-art algorithm

by 12.4% in mean average precision and about 11 times faster in efficiency.

Efficient Iris Recognition Based Biometric Techniques For Embedded

System Ranjisha.R,

Vivekanadha College of Engineering,

Thiruchengode Tamilnadu, India

[email protected],

Abstract

Several systems require authenticating a person's identity before giving access to

resources. With new advances in technology, biometrics is one of the most promising

techniques in human recognition. Biometrics intends to identify a person by his physical

and/or behavioral characteristics. This paper presents an approach for designing personal

tokens where iris biometric authentication is applied. An iris-recognition algorithm first has

to localize the inner and outer boundaries of the iris in an image of an eye. The set of pixels

containing only the iris, normalized to compensate for pupil dilation or constriction, is then

analyzed to extract iris code needed to compare two iris images. These iris codes can be

stored in the database or in a personal token. For identification or verification, an iris code created by imaging an iris is compared to stored template(s) in a database. Iris segmentation

is very important for an iris recognition system. If the iris regions were not correctly

segmented, there would possibly exist four kinds of noises in segmented iris regions:

eyelashes, eyelids, reflections and pupil, which will result in poor recognition performance.

Hence this paper also proposes methods that enhance the quality of the segmented iris image by removing these noises.
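Iris codes are commonly compared with a mask-aware normalized Hamming distance, which gives a feel for the matching step described above; the bit representation, masks and the 0.32 decision threshold below are illustrative assumptions rather than values from this paper.

def hamming_distance(code_a, code_b, mask_a, mask_b):
    """Normalized Hamming distance between two iris codes, counting only bits
    that are valid (unmasked) in both codes; masks mark eyelid/eyelash/
    reflection regions excluded from the comparison."""
    usable = [i for i, (ma, mb) in enumerate(zip(mask_a, mask_b)) if ma and mb]
    if not usable:
        return 1.0                     # nothing to compare: treat as a non-match
    disagreements = sum(code_a[i] != code_b[i] for i in usable)
    return disagreements / len(usable)

def same_iris(code_a, code_b, mask_a, mask_b, threshold=0.32):
    """Accept the claim if the distance falls below the decision threshold."""
    return hamming_distance(code_a, code_b, mask_a, mask_b) < threshold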

Mr.N.Sathish Kumar,

UG Student,

Department of CSE,

Oxford Engineering College,

Tiruchirappalli .

[email protected]

[email protected]

Mr.P.Thanga Durai,

UG Student,

Department of CSE, Oxford

Engineering College,

Tiruchirappalli

[email protected]

Prof. P.Moorthy,

Vivekanandha College of

Engineering

Thiruchengode Tamilnadu, India

[email protected]

Page 85: NCIIT 12 Proceedings


Image Retrieval Using Multi-Feature Score Fusion Through Genetic

Algorithm

Abstract

Our work deals with image retrieval based on multi-feature score fusion using a Genetic Algorithm. The retrieval results from the color feature and the texture feature are analyzed, and the method of fusing the multi-feature scores is described. The color information of an

image is represented by the hue, saturation and intensity values. The texture features are

determined by calculating the energy, entropy, contrast and correlation values. Genetic

algorithm is applied for assigning fusion weights of multi-feature scores. Genetic algorithm

proves to be more efficient because it can assign weight functions in a randomized way.

The final query ranking is based on the total normalized distance in color and texture

features. The color and texture features of all the database images are calculated and

compared with the same features of the input image. The top 10 images with least

difference are retrieved as the output.
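A minimal sketch of the fusion and ranking step: each database image receives a combined score from its colour and texture distances weighted by GA-assigned weights, and the ten closest images are returned. The distance functions and weights here are placeholders, and the genetic algorithm that searches for the weights is not shown.

def fused_distance(query, image, w_color, w_texture, color_dist, texture_dist):
    """Weighted fusion of normalized color and texture distances.
    w_color and w_texture would come from the genetic algorithm."""
    return w_color * color_dist(query, image) + w_texture * texture_dist(query, image)

def retrieve(query, database, weights, color_dist, texture_dist, top_k=10):
    """Rank the database by fused distance and return the top-k matches."""
    w_color, w_texture = weights
    ranked = sorted(database,
                    key=lambda img: fused_distance(query, img, w_color, w_texture,
                                                   color_dist, texture_dist))
    return ranked[:top_k]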

An Artificial Device To Regain Memories For Accidentally Memory-Lost Persons

Dinakar.S,

Lecturer,

Dept of CSE,

SNS College of Technology,

Coimbatore, TamilNadu.

[email protected]

Abstract

The brain is considered one of the most complex organs in the human body; it is often called a supercomputer that cannot be decoded by any computer in the world. The proposed work describes the retrieval of stored information from memory for each and every human activity. In the case of persons who have accidentally lost their memory this process is not fully attained, so we make use of a silicon chip that generates the deep brain simulation, known as a reference signal, which can simulate the person's activity without any interruption.

S.Bhuvaneswari,

Lecturer,

Dept of CSE,

SNS College of

Technology,

Coimbatore,

TamilNadu.

[email protected]

M.Velmurugan,

Student,

Dept of CSE,

SNS College of

Technology,

Coimbatore,

TamilNadu.

[email protected]

N.Vivek Bharathi,

Student,

Dept CSE,

SNS College of

Technology,

Coimbatore,

TamilNadu.

[email protected]

Sanmukapriya.V

Department of ECE,

Sri Shakthi Institute of

Engineering and Technology,

Coimbatore – 62.

[email protected]

Shanmugapriya.V

Department of ECE,

Sri Shakthi Institute of

Engineering and Technology,

Coimbatore – 62.

[email protected]

S.Sangeetha

Asst. Prof /Department of

ECE,

Sri Shakthi Institute of

Engineering and Technology,

Coimbatore – 62.

Page 86: NCIIT 12 Proceedings


Intelligent Car Backup Warning System

Ms. K. Vanithamani, Associate Professor,

Electrical and Electronics Engineering,

Coimbatore Institute of Technology,

Coimbatore

Abstract

When the car is in reverse gear, most drivers use a back-up camera or reverse radar to know the road situation behind the vehicle. Pedestrians can only tell whether the vehicle is in reverse gear by seeing a light at the rear of the car, so they should be more careful while the car is reversing; this is the drawback while the car is in reverse gear, and backup collisions therefore occur quite often. To prevent a collision the driver should turn around and look out of the car, but this is not possible all the time. Therefore, this research tries to design an intelligent car reverse warning system using an embedded controller. The system uses sensors to detect objects when the car is engaged in reverse gear and then produces an alarm signal. If the driver is not attentive and the car crosses the predefined distance, the motor speed is slowed down, and if the car comes even closer to the obstacle it stops immediately. A touch panel is used to give input in the real-time environment, and fuzzy logic rules are used to develop the intelligent car reversing system.
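A toy sketch of the distance-based control idea (warn, slow down, stop); the thresholds and the simple linear ramp stand in for the fuzzy rule base, which is not spelled out in the abstract.

def backup_controller(distance_cm, warn_at=150, slow_at=80, stop_at=30):
    """Map the measured obstacle distance to an (alarm, speed_factor) pair.
    speed_factor scales the reversing motor speed: 1.0 = full, 0.0 = stop.
    The thresholds and linear ramp are illustrative stand-ins for fuzzy rules."""
    if distance_cm <= stop_at:
        return "STOP", 0.0
    if distance_cm <= slow_at:
        # ramp the speed down linearly as the obstacle gets closer
        factor = (distance_cm - stop_at) / (slow_at - stop_at)
        return "ALARM", round(0.5 * factor, 2)
    if distance_cm <= warn_at:
        return "ALARM", 1.0
    return "OK", 1.0

# e.g. backup_controller(100) -> ("ALARM", 1.0); backup_controller(40) -> ("ALARM", 0.1)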

A. Janetlin Anna Kiruba,

M.E, Embedded And Real

Time Systems,

Coimbatore Institute of

Technology, Coimbatore