
International Conference on Innovative Trends in Computing and Technology (ICITCT'13), March 2013

LOW COMPLEXITY FAULT DETECTION SCHEME FOR ADVANCED ENCRYPTION STANDARD


JISHNUVIMAL, PG, A.SARAVANAN, MahaBarathi Engineering College

Abstract The Advanced Encryption Standard (AES) is the accepted symmetric cryptographic standard for transferring blocks of data securely. To prevent AES from suffering fault attacks, error detection techniques can be adopted to detect errors during encryption or decryption. In this paper, low-complexity fault detection schemes for achieving a reliable AES architecture are presented. A parity-based mechanism is implemented instead of the look-up-table method for the SubBytes operation, and thereby we propose low-complexity fault detection schemes for AES encryption and decryption. Our simulation results show error coverage of greater than 99 percent for the proposed schemes.
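As a sketch of the parity idea, the check below predicts the output parity of a substitution table from the input alone and flags single-bit faults. The 4-bit S-box and all names here are illustrative stand-ins, not the real AES SubBytes table or the paper's exact scheme.

```python
# Illustrative 4-bit S-box (NOT the real AES S-box).
SBOX = [0x6, 0xB, 0x5, 0x4, 0x2, 0xE, 0x7, 0xA,
        0x9, 0xD, 0xF, 0xC, 0x3, 0x1, 0x0, 0x8]

def parity(x: int) -> int:
    """XOR of all bits of x (even/odd parity)."""
    p = 0
    while x:
        p ^= x & 1
        x >>= 1
    return p

# Precompute the expected output parity for every input, so the checker
# predicts parity from the input without trusting the (possibly faulty) output.
PREDICTED_PARITY = [parity(SBOX[i]) for i in range(16)]

def sub_with_check(x: int, inject_fault: bool = False) -> tuple[int, bool]:
    """Substitute x; return (output, parity_check_passed)."""
    y = SBOX[x]
    if inject_fault:          # flip one output bit to model a fault attack
        y ^= 0x1
    ok = parity(y) == PREDICTED_PARITY[x]
    return y, ok

y, ok = sub_with_check(0x3)
assert ok
_, ok = sub_with_check(0x3, inject_fault=True)
assert not ok  # any single-bit fault flips the parity, so it is caught
```

A single parity bit catches all odd-weight faults; multi-bit even-weight faults need additional parity groups, which is why such schemes report high but not perfect error coverage.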

COLLABORATIVE INFORMATION RETRIEVAL WITH N-GRAM EXTRACTION AND COLLABORATIVE USER RANKING
S.RENUGADEVI, S.AFSAR SALEEMA, Anna University, Chennai,

Abstract There is too much information available on the web, and users are often not patient enough to scan the long list of results returned by search engines to find relevant information. Web search can be made more useful, effective, and less burdensome by inferring what would be relevant to the current user for a given query, considering individual users' interests, and placing those results at the top, so that the user need not scroll through a long list of results. Collaboration makes it possible to search and recommend results effectively among many users. For multiple-word queries, an N-gram approach is applied to search within documents, especially when one must work with phrase queries; here the n-gram approach is used for searching and retrieving the best-matched key phrase. For multiple-word queries, we internally obtain a result set for each word in the query, and better matches combined with a good UsersRank are more likely to occur in the first spots. Collaborative information retrieval in the ranking phase provides the pages most relevant to and preferred by the users in the collaboration.
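A minimal illustration of word-level n-gram matching for phrase queries might look like the following; the function names and the simple overlap score are our own simplifications, not the paper's exact ranking formula or its UsersRank combination.

```python
def ngrams(text: str, n: int):
    """Return the list of word-level n-grams of `text`."""
    words = text.lower().split()
    return [" ".join(words[i:i + n]) for i in range(len(words) - n + 1)]

def ngram_overlap(query: str, doc: str, n: int = 2) -> float:
    """Fraction of the query's n-grams that also occur in the document."""
    q = ngrams(query, n)
    if not q:
        return 0.0
    d = set(ngrams(doc, n))
    return sum(g in d for g in q) / len(q)

score = ngram_overlap("information retrieval systems",
                      "a survey of information retrieval systems on the web")
```

Bigram overlap rewards documents that preserve the query's word order, which is what distinguishes phrase matching from treating the query as a bag of independent words.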
Organized by: Department of Computer Science and Engineering & Information Technology, The Rajaas Engineering College, Vadakangulam


RECENT TREND IN IT - HAPTIC TECHNOLOGY IN SURGICAL SIMULATION AND MEDICAL TRAINING (A TOUCH REVOLUTION)
Vickram College of Engineering

ALAGESWARAN.A, OM PRAKKASH T.S,

Abstract Engineering finds a wide range of applications in every field, and the medical field is no exception. One of the technologies that aids surgeons in performing even the most complicated surgeries successfully is Virtual Reality (VR). Even though virtual reality is employed to carry out operations, the surgeon's attention remains one of the most important parameters: any mistake may lead to a dangerous end. One may therefore think of a technology that reduces the burden on the surgeon by providing more efficient interaction than VR. This dream has come to reality by means of a technology called haptic technology. Haptics is the science of applying tactile sensation to human interaction with computers. In this paper we discuss the basic concepts behind haptics, along with haptic devices and how these devices interact to produce the sense of touch and force-feedback mechanisms. The implementation of this mechanism by means of haptic rendering and contact detection is also discussed. We mainly focus on the application of haptic technology in surgical simulation and medical training. Further, we explain the storage and retrieval of haptic data while working with haptic devices, and illustrate the necessity of haptic data compression.

i-TREESEARCH USING TOP-K APPROXIMATE SUBTREE MATCHING


INDU R NETHAJI Mahendra Institute of Engineering & Technology

Abstract This research paper implements i-TreeSearch using Top-k Approximate Subtree Matching (TASM). It is the problem of finding the k best matches of a small query tree within a large document tree using the canonical tree edit distance as a similarity measure between subtrees.

Evaluating the tree edit distance for large XML trees is difficult. The best known algorithms have cubic runtime and quadratic space complexity, and, thus, do not scale. TASM-postorder is a memory-efficient and scalable TASM algorithm. This paper proves an upper bound for the maximum subtree size for which the tree edit distance needs to be evaluated. The upper bound depends on the query and is independent of the document size and structure. A core problem is to efficiently prune subtrees that are above this size threshold. I develop an algorithm based on the prefix ring buffer that allows us to prune all subtrees above the threshold in a single postorder scan of the document.
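The pruning idea can be sketched as a single postorder pass that records only subtrees whose size is at most a threshold tau; this in-memory version deliberately omits the prefix ring buffer and the streaming aspects of TASM-postorder, and the tree encoding is our own.

```python
# Nodes are (label, children) tuples. A subtree larger than the threshold
# tau can never be among the k best matches for a small query, so it is
# pruned; candidates are collected in a single postorder traversal.

def prune(node, tau, out):
    """Postorder pass: append (label, size) of subtrees with size <= tau
    to `out`, and return this subtree's size."""
    label, children = node
    size = 1 + sum(prune(c, tau, out) for c in children)
    if size <= tau:
        out.append((label, size))
    return size

#        a
#       / \
#      b   c
#     / \
#    d   e
doc = ("a", [("b", [("d", []), ("e", [])]), ("c", [])])
out = []
total = prune(doc, tau=3, out=out)
# kept in postorder: d(1), e(1), b(3), c(1); root a has size 5 > 3 and is pruned
```

Each node is visited once, so the candidate set is produced in linear time, independent of the document's total size relative to the query.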

SEMANTIC WEB SERVICES - A SURVEY

GAYATHIRI

Abstract Semantic Web Services are services whose meaning and behavior are defined so that the web itself can understand and satisfy the requests of people. The idea is to have data on the web defined and linked in a way that machines can use it not just for display, but for automation, integration, and reuse of data across various applications. Semantics are introduced to overcome the limitations of Web services: average WWW searches examine only about 25% of potentially relevant sites and return a lot of unwanted information, information on the web is not suitable for software agents, and the web keeps doubling in size. Semantic Web Services are built on top of Web services, extended with rich semantic representations and with capabilities for automatic reasoning developed in the field of artificial intelligence. This work attempts to give an overview of the underlying concepts and technologies, along with the categorization, selection, and discovery of services based on semantics.

DELIVERING SCALABLE HIGH BANDWIDTH STORAGE FOR HIGH SPEED DATA TRANSFER
L.M. GLADIS BEULA, MRS. N. SARAVANAN, VelTech MultiTech Dr.Rangarajan Dr.Sakunthala Engineering College.


Abstract A number of high-bandwidth networks have been constructed; however, existing high-speed protocols cannot fully utilize their bandwidth, as their fixed-size application-level receiving buffers suffer from a buffer bottleneck. In this paper, we analyze the buffer bottleneck problem and propose Rada. By periodically detecting the data arrival rate and consumption rate in the buffer using an exponential moving average (EMA) scheme, Rada adapts the buffer size dynamically: it increases/decreases the buffer when the data arrival rate is consistently faster/slower than the data consumption rate. The adaptation extent of each increase/decrease operation is based on a linear aggressive-increase conservative-decrease scheme, and memory utilization is based on a weighted mean function. To achieve high-speed data transfer as well as easy deployment, User Datagram Protocol (UDP) based high-speed protocols running at the application level have recently been proposed and deployed, yet these too cannot fully utilize high-bandwidth networks without adaptive buffering.
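A toy version of the EMA-based adaptation rule might look like this; the class name, smoothing factor, and increase/decrease constants are invented for illustration and are not Rada's actual parameters.

```python
class AdaptiveBuffer:
    """Tracks arrival/consumption rates with an EMA and resizes the buffer
    using an aggressive-increase, conservative-decrease rule (constants
    are illustrative, not Rada's)."""

    def __init__(self, size=64, alpha=0.2, inc=2.0, dec=0.5):
        self.size = size
        self.alpha = alpha            # EMA smoothing factor
        self.inc, self.dec = inc, dec
        self.arrival_ema = 0.0
        self.consume_ema = 0.0

    def observe(self, arrived, consumed):
        a = self.alpha
        self.arrival_ema = a * arrived + (1 - a) * self.arrival_ema
        self.consume_ema = a * consumed + (1 - a) * self.consume_ema
        gap = self.arrival_ema - self.consume_ema
        if gap > 0:
            # producer persistently faster: grow aggressively
            self.size += int(self.inc * gap)
        elif gap < 0:
            # consumer faster: shrink conservatively, never below 1
            self.size = max(1, self.size - int(self.dec * (-gap)))
        return self.size

buf = AdaptiveBuffer()
for _ in range(10):
    buf.observe(arrived=100, consumed=60)   # arrivals outpace consumption
assert buf.size > 64                        # buffer grew to absorb the burst
```

The asymmetry (large `inc`, small `dec`) reflects the intent in the abstract: under-provisioning the buffer drops data, while over-provisioning only wastes memory, so growth should be the faster reaction.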

DEFENCE TO UNSAFE COMPONENT LOADINGS


GREESHMA BANERJI., HEMALATHA.B., Mahendra College of Engineering for Women

Abstract Dynamic loading is an important mechanism for software development. It gives an application the flexibility to dynamically link a component and use its exported functionality. In general, an operating system or runtime environment resolves the loading of a specifically named component by searching for its first occurrence in a sequence of directories determined at runtime. Correct component resolution is critical for reliable and secure software: dynamic loading can be hijacked by placing an arbitrary file with the specified name in a directory searched before the target component is resolved. A key step in dynamic loading is component resolution, i.e., how to locate the correct component for use at runtime. Operating systems generally provide two resolution methods, specifying either the full path or the file name of the target component. It is therefore important to detect and fix these vulnerabilities. This is the first automated technique to detect vulnerable and unsafe dynamic component loadings. We classify two types of unsafe dynamic loadings, resolution failure and resolution hijacking, and develop an effective dynamic program analysis to detect and avoid both types of unsafe loadings. The technique can detect more than 1,700 unsafe DLL loadings and discover new serious attack vectors for remote code execution. Detected malicious DLL files are prevented from loading while a file is being opened or copied, and the user is asked whether or not to continue. If the user chooses to continue, the system proceeds with opening the corresponding software or with the copying process; otherwise the user can select the stop option.

JAMMING ATTACK PREVENTION IN WIRELESS NETWORK USING PACKET HIDING METHODS


E.GOPINATHDHINAKARAN, S.V.MANIKANTHAN M.E., Dr.Pauls Engineering College

Abstract Modern society is heavily dependent on wireless networks for data transmission, and while data is transmitted over the wireless medium, jamming attacks can occur. Selective jamming attacks can be launched by performing real-time packet classification at the physical layer. All-Or-Nothing Transformation (AONT) methods introduce a modest communication and computation overhead. In this method, a block encryption algorithm is used to hide the messages, but the algorithm does not consider timing limits and parameter lengths. To overcome this problem, a smart code generator algorithm is used. This technique provides a strong security level in the wireless medium.

PREVENTION OF BLACK HOLE ATTACK AND CO-OPERATIVE BLACK HOLE ATTACK IN MANET

JEEVA

Abstract Advancement in the research field has witnessed rapid development in Mobile Ad-hoc Networks (MANETs). Their distributed nature and infrastructure-less structure make them easy prey to security threats. A black hole is a malicious node that replies to route requests claiming it has a fresh route to the destination and then drops all the packets it receives; the damage is more serious when such nodes work as a group, a variant called the co-operative black hole attack. In this work, we have designed a routing solution called Trust-Based DSR (TBDSR) that enables the Dynamic Source Routing (DSR) protocol to find a secure end-to-end route free of black hole nodes with cooperation from the neighbors. Our solution can also protect the network in the presence of colluding attackers without the need for promiscuous monitoring of neighbor nodes. The extended defense routing protocol works efficiently for malicious-node detection and removal in the case of the co-operative black hole attack, resulting in increased network performance. Keywords: Black Hole Attack, Cooperative Black Hole Attack, Ad Hoc Networks, DSR.

AUTHENTICATION ON KEY MANAGEMENT FRAMEWORK WITH HYBRID MULTICASTING NETWORK.


JEYABHARATHI, P.S.R. Engineering College.

Abstract A wireless ad hoc network is a collection of wireless hosts that can be rapidly deployed as a multi-hop packet radio network without the aid of any established infrastructure or centralized administration. The cost reduction and fast evolution experienced by wireless communication technologies have made them suitable for a wide spectrum of applications, one of which is multicasting networks. Multicasting systems aim at providing a platform for various applications that can improve safety and efficient group communication. The proposed asynchronous key verification scheme, as part of the protocol, yields a significant reduction in message delay.

EMPOWERED SERVICE DELEGATION WITH ATTRIBUTE ENCRYPTION FOR DISTRIBUTED CLOUD COMPUTING

M.JOTHIMANI

Nandha Engineering College,

Abstract Cloud computing has emerged as one of the most influential paradigms in the IT industry. Because this new computing technology requires users to entrust their valuable data to cloud providers, there have been increasing security and privacy concerns about outsourced data. Several schemes employing attribute-based encryption (ABE) have been proposed for access control of outsourced data in cloud computing, but most of them suffer from inflexibility in implementing complex access control policies. Allowing cloud service providers (CSPs), which are not in the same trusted domains as enterprise users, to take care of confidential data may raise security and privacy issues. To keep sensitive user data confidential against untrusted CSPs, a natural approach is to apply cryptography, disclosing decryption keys only to authorized users. The scheme should also provide high performance, full delegation, and scalability, so as to best serve the needs of accessing data anytime and anywhere, delegating within enterprises, and supporting a dynamic set of users. HASBE employs multiple value assignments for access expiration time to deal with user revocation more efficiently than existing schemes, and it provides fine-grained access control and full delegation. Based on the HASBE model, we finally propose a scalable revocation scheme that delegates most of the computing tasks of revocation to the CSP, to support a dynamic set of users efficiently.

A SIMPLE METHOD FOR LYMPHOMA CLASSIFICATION USING PRINCIPAL COMPONENT ANALYSIS


METILDA.D, KALAIVANI.I Dr. Sivanthi Aditanar College of Engineering

Abstract Lymphoma is a cancer of the lymphocytes. The proposed approach classifies three types of malignant lymphoma: chronic lymphocytic leukemia, follicular lymphoma, and mantle cell lymphoma. Initially, raw pixels are transformed with a set of transforms into spectral planes; both simple and compound transforms are computed. Raw pixels and spectral planes are then routed to the second stage, where the set of features is computed. A single feature vector is formed by fusing all computed features, and the classification mechanism is carried out to classify the malignancies by type.
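A minimal PCA step of the kind such pipelines use for dimensionality reduction before classification can be sketched as follows, on synthetic data rather than lymphoma images; the feature dimensions and component count are arbitrary choices for illustration.

```python
import numpy as np

# Synthetic feature matrix: 50 samples x 10 features, with one direction
# of dominant variance standing in for an informative image feature.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 10))
X[:, 0] *= 10.0

Xc = X - X.mean(axis=0)                 # center each feature
cov = np.cov(Xc, rowvar=False)          # 10x10 covariance matrix
vals, vecs = np.linalg.eigh(cov)        # eigh: ascending eigenvalues
order = np.argsort(vals)[::-1]          # sort by explained variance
components = vecs[:, order[:2]]         # keep the top-2 principal components
Z = Xc @ components                     # reduced feature vectors, shape (50, 2)
```

The reduced vectors `Z` would then feed the classifier in place of the raw fused feature vector, trading a little variance for far fewer, decorrelated inputs.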

EFFICIENT RESOURCES PROVISIONING IN CLOUD SYSTEMS FOR COST BENEFITS


M.KARTHI S.NACHIYAPPAN Velammal College of Engineering and Technology

Abstract Cloud providers can offer cloud consumers two provisioning plans for computing resources, namely a reservation plan and an on-demand plan. In general, the cost of computing resources provisioned under the reservation plan is cheaper than under the on-demand plan. Many kinds of resource provisioning options are available in the cloud environment to reduce the total cost and better utilize cloud resources. However, the best advance reservation of resources is difficult to achieve due to uncertainty in consumers' future demand and providers' resource prices. To address this problem, a probabilistic cloud resource provisioning (PCRP) algorithm is proposed by formulating a probabilistic model. The solution of the PCRP algorithm considers a state-based machine, the probability of utilization, and estimated future demand.
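The reservation-versus-on-demand trade-off can be illustrated with a toy expected-cost calculation; the prices and the demand distribution below are invented, and this is a simplification, not the PCRP algorithm itself.

```python
# Illustrative hourly prices: reserved capacity is cheaper per unit but is
# paid for whether used or not; on-demand is paid only when needed.
RESERVED_PRICE = 0.06
ON_DEMAND_PRICE = 0.10

# Made-up probability distribution over hourly demand (instances needed).
demand_dist = {2: 0.2, 4: 0.5, 6: 0.3}

def expected_cost(reserved: int) -> float:
    """Expected hourly cost when `reserved` instances are pre-paid and any
    excess demand is served on-demand."""
    cost = 0.0
    for demand, p in demand_dist.items():
        on_demand = max(0, demand - reserved)
        cost += p * (reserved * RESERVED_PRICE + on_demand * ON_DEMAND_PRICE)
    return cost

# Pick the reservation level that minimizes expected cost.
best = min(range(0, 7), key=expected_cost)
```

With these numbers the optimum reserves for the typical demand (4 instances) and buys the rare peak on-demand, which is exactly the kind of decision that becomes hard when the demand distribution itself is uncertain.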

DERIVING CAPACITY LIMITS OF DATA COLLECTION FOR RANDOM WIRELESS SENSOR NETWORKS
MR.K.A.RAJA, M.E, K.LAVANYA, Ranipettai Engineering college.

Abstract A wireless sensor network (WSN) consists of spatially distributed sensors that monitor environmental conditions and cooperatively collect and pass their data through the network to a main location or sink. Data collection is thus a fundamental function of a wireless sensor network. The performance of data collection can be characterized by the rate at which sensing data can be collected and transmitted to the sink node: data collection capacity reflects how fast the sink can collect sensing data from all sensors under the interference constraint. In this project, in order to obtain optimal performance for arbitrary sensor networks, we use a simple BFS-tree method for data collection and a greedy scheduling algorithm for deriving capacity bounds of data collection under a general graph model, where two nearby nodes may be unable to communicate due to barriers or path fading, and where sensor nodes can be deployed in a Gaussian distribution with any network topology.
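The BFS-tree construction can be sketched as follows; the topology is a toy example, and the scheduling step is reduced to computing hop counts to the sink (the paper's greedy interference-aware scheduler is omitted).

```python
from collections import deque

def bfs_tree(adj, sink):
    """Return parent pointers of a breadth-first tree rooted at `sink`,
    so every node reaches the sink along a minimum-hop path."""
    parent = {sink: None}
    q = deque([sink])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in parent:      # first visit = fewest hops
                parent[v] = u
                q.append(v)
    return parent

def depths(parent):
    """Hop count from each node to the sink; a packet from node u needs
    at least depths[u] transmissions to reach the sink."""
    d = {}
    for u in parent:
        k, v = 0, u
        while parent[v] is not None:
            v = parent[v]
            k += 1
        d[u] = k
    return d

# Toy topology: node 0 is the sink.
adj = {0: [1, 2], 1: [0, 3, 4], 2: [0], 3: [1], 4: [1]}
parent = bfs_tree(adj, sink=0)
```

On top of such a tree, a greedy scheduler assigns transmission slots so that interfering links never share a slot; the capacity bound then follows from the bottleneck around the sink.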

DETECTING COMPROMISED MACHINERY BY MONITORING SOCIABLE COMMUNICATION


MRS. K.ARUNA N.LAKSHMI, V.GAYATHRI, G.ABIRAMI A.V.C College of Engineering

Abstract This paper focuses on the detection of compromised machines in a network that are used for sending spam messages, commonly referred to as spam zombies. The nature of sequentially observing outgoing messages gives rise to a sequential detection problem. In this project, we develop a spam zombie detection system, named SPOT, that monitors outgoing messages. SPOT is designed based on a statistical method called the Sequential Probability Ratio Test (SPRT), a powerful method for testing between two hypotheses (in our case, a machine is compromised versus the machine is not compromised) as the events (in our case, outgoing messages) occur sequentially. Our evaluation studies show that SPOT is an effective and efficient system for automatically detecting compromised machines in a network. The system identifies spam messages and detects the compromised machine based on those messages; after identification, SPOT restricts outgoing messages from the corresponding compromised machine. The system can be used in online applications and helps reduce the large number of compromised machines in a network.
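Wald's SPRT, the statistical core described above, can be sketched as follows; the spam probabilities under each hypothesis and the error bounds are illustrative values, not SPOT's deployed parameters.

```python
import math

def sprt(observations, p0=0.2, p1=0.8, alpha=0.01, beta=0.01):
    """Sequential Probability Ratio Test.
    H1: machine compromised (each message is spam with prob. p1).
    H0: machine clean (spam prob. p0).
    `observations` is a sequence of booleans (True = message classified
    as spam). Returns 'H1', 'H0', or 'continue' (need more messages)."""
    upper = math.log((1 - beta) / alpha)    # cross above: accept H1
    lower = math.log(beta / (1 - alpha))    # cross below: accept H0
    llr = 0.0                               # running log-likelihood ratio
    for spam in observations:
        if spam:
            llr += math.log(p1 / p0)
        else:
            llr += math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "H1"
        if llr <= lower:
            return "H0"
    return "continue"

assert sprt([True] * 6) == "H1"     # a steady run of spam flags the zombie
assert sprt([False] * 6) == "H0"    # a steady run of ham clears the machine
```

The appeal of SPRT for this setting is that it bounds both error probabilities (alpha, beta) while reaching a decision after the fewest expected observations, so a persistently spamming machine is flagged after only a handful of messages.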

AUTOMATED DETECTION OF CYBER SECURITY ATTACKS


SABARINATHAN P., KAVIYARASI S, PABCET.

Abstract Internet services and applications have become an inextricable part of daily life, enabling communication and the management of personal information from anywhere. To accommodate this increase in application and data complexity, web services have moved to a multi-tiered design wherein the web server runs the application front-end logic and data are outsourced to a database or file server; this is the main reason attackers target the database. Such cyber-security attacks can be detected using DoubleGuard, which differs from approaches that correlate alerts from independent IDSs. The container-based, session-separated web server architecture enhances security and provides isolation between the information flows, which are separated into per-session containers. Virtualization is used to isolate objects and enhance security, and lightweight containers can have considerable performance advantages over full virtualization.

SECURE AUTHENTICATION USING BIOMETRIC CRYPTOSYSTEM
MS. N.MADHU SUGANYA, MS.T.MEKALA M.Kumarasamy College of engineering

Abstract Cryptography is a means to protect data during transmission over a wireless network. It is used in information security to protect information from unauthorized or accidental disclosure while the information is in transit (either electronically or physically) and while it is in storage; otherwise the information could be accessed by unauthorized users for malicious purposes. It is therefore necessary to apply effective encryption/decryption methods to enhance data security. The existing system only limits the total number of users from unknown remote hosts to as low as from known remote hosts. It uses white-list values for tracking legitimate users, but the cookie value expires after a certain time period, so attackers may use different browsers, try another machine, or retry after some time; moreover, if a malicious attack occurs, the authenticated user does not know about it. The proposed system uses two algorithms, a Bio-Metric Encryption Algorithm (BEA) and a Minutiae Extraction Algorithm (MEA), and uses multi-biometric features for authentication. In addition, this system dynamically generates a new session key for each transaction, and after each transaction the authenticated user must change their PIN to improve security. The proposed system thus protects data confidentiality, data integrity, authentication, availability, and access control of information over the network.

DISTRIBUTED OPPORTUNISTIC ROUTING WITH CONGESTION DIVERSITY


MR. M. ISLABUDEEN, M.E., (Ph.D.), P. NAGARAJAN, Syed Ammal Engineering College

Abstract The main challenge in the design of minimum-delay routing policies is balancing the trade-off between routing packets along the shortest paths to the destination and distributing traffic according to the maximum backpressure. Combining important aspects of shortest-path and backpressure routing, this paper provides a systematic development of a distributed opportunistic routing policy with congestion diversity (D-ORCD) for wireless ad-hoc networks. D-ORCD uses a measure of draining time to opportunistically identify and route packets along paths with an expected low overall congestion. D-ORCD is proved to ensure a bounded expected delay for all networks and under any admissible traffic. Realistic QualNet simulations for 802.11-based networks demonstrate a significant improvement in average delay over comparable solutions in the literature.

ENHANCED MEASURES FOR PERSONALIZATION OF WEB SEARCH RESULT


MR.P.PRABU MR.T.GOPALAKRISHNAN Bannari Amman Institute of Technology

Abstract Web search results should be reordered according to the user's profile so that the results are more relevant to that user; this is called personalization. The profile is created from input given directly by the user in any form (keywords, instructions, etc.) and from the user's browsing patterns, and it refers to data that may be maintained at the client or server level. Reordering of results should maintain a consistent standard or degree of relevance during retrieval. Hyperlinks are formed from the data collected based on the reordered search results, and the hyperlink data is ranked using the PageRank algorithm and the Apriori algorithm.
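A minimal power-iteration PageRank over a toy hyperlink graph might look like this; the graph, damping factor, and iteration count are illustrative, and the personalization/Apriori layers of the paper are not modeled.

```python
def pagerank(links, d=0.85, iters=50):
    """Power-iteration PageRank. `links` maps each node to the list of
    nodes it links to; returns a rank for every node summing to 1."""
    nodes = list(links)
    n = len(nodes)
    rank = {u: 1.0 / n for u in nodes}
    for _ in range(iters):
        new = {u: (1 - d) / n for u in nodes}
        for u, outs in links.items():
            if outs:
                share = d * rank[u] / len(outs)
                for v in outs:
                    new[v] += share
            else:                       # dangling node: spread rank evenly
                for v in nodes:
                    new[v] += d * rank[u] / n
        rank = new
    return rank

links = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
r = pagerank(links)
# "c" is linked by both "a" and "b", so it should rank highest
```

A personalized variant would replace the uniform `(1 - d) / n` teleport term with a distribution biased toward pages in the user's profile.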

EVOLUTION OF ONTOLOGY BASED ON FREE TEXT DESCRIPTOR FOR WEB SERVICE


RAGAVENDIREN

Abstract Ontologies have become the de-facto modeling tool of choice, employed in a variety of applications and prominently in the Semantic Web. Nevertheless, ontology construction remains a daunting task. Ontological bootstrapping, which aims at automatically generating concepts and their relations in a given domain, is a promising technique for ontology construction. Bootstrapping an ontology from a set of predefined textual sources, such as Web services, must address the problem of multiple concepts that are largely unrelated. This paper exploits the fact that Web services usually consist of both WSDL and free-text descriptors. The WSDL descriptor is evaluated using two methods, namely Term Frequency/Inverse Document Frequency (TF/IDF) and Web context generation. We propose an ontology bootstrapping process that integrates the results of both methods and validates the concepts using the free-text descriptors, which offer a more accurate definition of the ontologies. We extensively validated our ontology.
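The TF/IDF scoring step can be illustrated on toy service descriptors; the documents and terms below are invented, and real WSDL evaluation would tokenize operation and message names rather than plain sentences.

```python
import math

# Invented WSDL-like free-text descriptors keyed by service name.
docs = {
    "svc1": "get weather forecast by city",
    "svc2": "get stock quote by symbol",
    "svc3": "city map and weather layers",
}

def tf_idf(term, doc_id):
    """Term frequency x inverse document frequency of `term` in one doc."""
    words = docs[doc_id].split()
    tf = words.count(term) / len(words)                 # frequency in doc
    df = sum(term in d.split() for d in docs.values())  # docs containing term
    idf = math.log(len(docs) / df) if df else 0.0
    return tf * idf

# "forecast" appears in only one descriptor, so it outweighs the
# more widely shared term "weather" as a concept candidate for svc1.
```

Terms with high TF/IDF become candidate ontology concepts; the paper then cross-validates these candidates against the Web-context method and the free-text descriptors.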


AUTOMATION OF PUBLIC DISTRIBUTION SYSTEM USING RFID CARD AND BIOMETRIC FOR FASTER AND SAFER ACCESS
K.RAJA, MR. N. ANANDA KUMAR, Arunai College of Engineering.

Abstract Our paper focuses on the design and implementation of a computerized Public Distribution System (ration shop) throughout the state. In the current scenario, all public and private sectors are computerizing their processes to simplify them and reduce errors. The Civil Supplies Corporation is the major public sector body that manages and distributes essential commodities to all citizens. In this system, products such as rice, sugar, dhal, and kerosene are distributed through conventional ration shops. The conventional system has limitations: because of manual measurement, users cannot get the accurate quantity of material, and there is a chance of illegal usage of the products. We have therefore proposed computerization of the ration shop; to enhance security, we have introduced a fingerprint check for opening the billing interface, so as to avoid illegal entries without the knowledge of the ration-card holder. Users also get the accurate quantity of supplies at the correct price. In this automated system, the conventional ration card is replaced by an RFID smart card holding all the details about the user, and the card holder's fingerprint is used for authentication. Monitoring the Public Distribution System is one of the big issues in the public sector, so we have eased monitoring of the whole system, and public complaints are sent directly to the higher authority without any intermediary.

DATA MINING APPROACH TO DETECT SPAM ON FACEBOOK


A.RAJASEKAR,G.SUTHAKAR, Jayaraj Annapackiam C.S.I.College of Engineering

Abstract In this work, we present a text-based spam detection application for social networks, tested in particular on Facebook spam. We developed an application to test the prototype of Facebook spam detection. The features used for checking spam are the number of keywords, the average number of words, the text length, and the number of links. The data mining model, based on the J48 decision tree, is created using Weka [1]. The methodology can be extended to include other attributes, and the prototype demonstrates real use as a Facebook application.
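The feature extraction, together with a hand-written stand-in for the learned J48 tree, might look like this; the keyword list and the decision thresholds are invented for illustration and do not come from the trained Weka model.

```python
import re

SPAM_KEYWORDS = {"free", "win", "click", "prize"}   # illustrative list

def features(post: str) -> dict:
    """Compute the four feature types named in the abstract for one post."""
    words = post.lower().split()
    return {
        "n_keywords": sum(w.strip(".,!") in SPAM_KEYWORDS for w in words),
        "text_length": len(post),
        "n_words": len(words),
        "n_links": len(re.findall(r"https?://\S+", post)),
    }

def classify(post: str) -> str:
    """Hand-written stand-in for a learned decision tree (thresholds invented)."""
    f = features(post)
    if f["n_links"] >= 2:
        return "spam"
    if f["n_keywords"] >= 2 and f["n_words"] < 15:
        return "spam"
    return "ham"

assert classify("Click here to win a FREE prize! http://x.io http://y.io") == "spam"
assert classify("Meeting moved to 3pm, see agenda attached.") == "ham"
```

In the real pipeline these feature vectors would be exported (e.g., as ARFF) and the split thresholds learned by J48 from labeled posts rather than written by hand.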

AUTHENTICATION BASED CLOUD STORAGE AND SECURE DATA FORWARDING


RAJASEKARAN.S.,KALIFULLA.Y., MURUGESAN.S Veltech Multitech Dr.Rangarajan Dr.Sakunthala Engineering College,

Abstract A cloud storage system, consisting of a collection of storage servers, provides long-term storage services over the Internet. Storing data in a third party's cloud system causes serious concern over data confidentiality. General encryption schemes protect data confidentiality but also limit the functionality of the storage system, because only a few operations are supported over encrypted data. Constructing a secure storage system that supports multiple functions is challenging when the storage system is distributed and has no central authority. We propose a threshold proxy re-encryption scheme and integrate it with a decentralized erasure code to formulate a secure distributed storage system. The distributed storage system not only supports secure and robust data storage and retrieval, but also lets a user forward his data in the storage servers to another user without retrieving the data back. The main technical contribution is that the proxy re-encryption scheme supports encoding operations over encrypted messages as well as forwarding operations over encoded and encrypted messages. Our method fully integrates encrypting, encoding, and forwarding. We analyze and suggest suitable parameters for the number of copies of a message dispatched to storage servers and the number of storage servers queried by a key server; these parameters allow more flexible adjustment between the number of storage servers and robustness.

NORMALIZED MEAN MEDIAN FILTER FOR HIGHLY CORRUPTED IMPULSE NOISE IMAGE
P.RAJESWARI Anna University, Regional Centre,


Abstract A novel normalized mean median filter is presented for the removal of salt-and-pepper noise from highly corrupted noisy images. Noisy pixels are replaced with either the computed mean value or the computed median value; the proposed method replaces only the noisy pixels. Experimental results show the superiority of the proposed algorithm compared to state-of-the-art methods such as the standard median filter and the progressive switching median filter, especially when the image is corrupted with high-density impulse noise.
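A simplified switching filter in this spirit can be sketched as follows: only extreme-valued pixels (0 or 255) are treated as salt-and-pepper noise and replaced by the median of their noise-free neighbours. The mean fallback and the normalization of the actual proposed filter are omitted, so this is an illustration of the switching idea, not the paper's algorithm.

```python
def denoise(img):
    """img: 2-D list of grayscale values in [0, 255]. Pixels equal to 0 or
    255 are treated as impulse noise and replaced by the median of their
    noise-free 3x3 neighbours; a noisy pixel with no clean neighbour is
    left unchanged. Clean pixels are never modified."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]                 # work on a copy
    for y in range(h):
        for x in range(w):
            if img[y][x] not in (0, 255):
                continue                          # noise-free: untouched
            nbrs = [img[j][i]
                    for j in range(max(0, y - 1), min(h, y + 2))
                    for i in range(max(0, x - 1), min(w, x + 2))
                    if (i, j) != (x, y) and img[j][i] not in (0, 255)]
            if nbrs:
                nbrs.sort()
                out[y][x] = nbrs[len(nbrs) // 2]  # median of clean neighbours
    return out

img = [[10, 12, 11],
       [13, 255, 12],    # centre pixel corrupted by salt noise
       [11, 10, 13]]
clean = denoise(img)
```

Note the simplification: a genuinely black (0) or white (255) image pixel would be misread as noise here, which is one reason the full method adds its normalization and mean/median switching logic.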

EFFECTIVE OPTIMIZATION OF VIDEO TRANSMISSION IN WLAN


RAMYASREE.R.S. MRS.A.CYNTHIA Dhanalakshmi Srinivasan College of Engg.and Tech.,

Abstract The prevalence of high-definition (HD) cameras, televisions, Blu-ray players, and DVD recorders means that almost all video content is now captured and recorded digitally, much of it in HD. MPEG-2, H.264/AVC, and VC-1 are the most popular codecs in use today; these rely on decorrelating transforms, motion estimation, intra prediction, and variable-length entropy coding (VLC) to achieve good picture quality at high compression ratios. Alongside the need for efficient video compression, there is a critical requirement for error resilience, in particular in wireless networks, which are characterized by highly dynamic variations in error rate and bandwidth. Compression techniques based on prediction and variable-length coding render an encoded bit stream highly sensitive to channel errors. In this paper, techniques such as pyramid vector quantization (PVQ) are implemented to prevent error propagation through the use of fixed-length codewords in the wireless environment, and the frame performance of the video is observed in the pyramid vector section, which offers greater compression performance across various techniques.

CLOUD COMPUTING

M.RIZWANA BARVEEN R.V.R.PRIYANKA

KLN College of information technology

Abstract The term cloud computing has become one of the latest buzzwords in the IT industry. Cloud computing is an innovative approach that leverages existing IT infrastructure to optimize compute resources and manage data and computing workloads. Cloud computing promises to increase the velocity with which applications are deployed, increase innovation, and lower costs, all while increasing business agility. Cloud computing supports every facet of the infrastructure, including the server, storage, network, and virtualization.

A GLOBAL THRESHOLD BASED APPROACH FOR DENDRITIC SPINE DETECTION


MR.S.ATHINARAYANAN, K. SAM ELIEZER

Abstract Neuron reconstruction and dendritic spine identification on a large data set of microscopy images is essential for understanding the relationship between the morphology and functions of dendritic spines. Dendrites are the tree-like structures of neuronal cells, and spines are small protrusions on the surface of dendrites. Spines have various visual shapes (e.g., mushroom, thin, and stubby) and can appear or disappear over time. Existing neurobiology literature shows that the morphological changes of spines and the dendritic spine structures are highly correlated with their underlying cognitive functions. Accurately and automatically extracting meaningful structural information from a large microscopy image data set is a difficult task. One challenge in spine detection and segmentation is how to automatically separate touching spines. In this paper, touching spines are detected based on various global and local geometric features of the dendrite structure, and a breaking-down and stitching-up algorithm is used to segment them.
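
As one concrete instance of the "global threshold" step that separates bright dendrite/spine foreground from background, Otsu's method picks the intensity that maximizes between-class variance. The abstract does not specify the authors' criterion, so this is an illustrative stand-in, not their algorithm.

```python
import numpy as np

def otsu_threshold(img):
    """Global threshold by Otsu's method: choose the intensity t that
    maximizes the between-class variance of the two resulting classes.
    Assumes an 8-bit grayscale image (values 0..255)."""
    hist, _ = np.histogram(img.ravel(), bins=256, range=(0, 256))
    total = img.size
    total_sum = (np.arange(256) * hist).sum()
    best_t, best_var = 0, -1.0
    w0, sum0 = 0, 0
    for t in range(256):
        w0 += hist[t]                 # pixels in class 0 (intensity <= t)
        sum0 += t * hist[t]
        if w0 == 0 or w0 == total:
            continue
        w1 = total - w0
        m0, m1 = sum0 / w0, (total_sum - sum0) / w1
        var_between = w0 * w1 * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```

Pixels above the returned threshold form the candidate dendrite/spine mask on which the geometric features would then be computed.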

SYMPATHETIC NODE LOCALIZABILITY OF WIRELESS AD HOC AND SENSOR NETWORKS


G.BASKARAN, M.SARANYA Srinivasan Engineering College,

Abstract Location awareness is a highly challenging problem for wireless sensor networks. When localizing nodes using GPS, it is observed that the network is not entirely localized, and the number of nodes that can be located within a network cannot be identified; node localizability testing cannot be achieved. A new scheme, based on Euclidean distance ranging techniques and a polynomial-time localizability-testing algorithm, is proposed for node localizability. It can identify the number of nodes that can be located in a connected network. The nodes can be uniquely localized, and the path can be identified using vertex-disjoint paths. Node localizability provides useful guidelines for network deployment and other location-based services.

A LOSSLESS COMPRESSION SCHEME FOR BAYER COLOR FILTER ARRAY IMAGES USING GENETIC ALGORITHM
S.SARANYA, Mr.G.MOHANBAABU, Dr.G.ATHISHA, PSNA College of Engg & Tech

Abstract A portable device such as a digital camera with a single sensor and a Bayer color filter array (CFA) requires demosaicing to reconstruct a full color image. In most digital cameras, Bayer CFA images are captured and demosaicing is generally carried out before compression. Recently, it was found that compression-first schemes outperform the conventional demosaicing-first schemes in terms of output image quality. A genetic-algorithm-based lossless compression scheme for Bayer CFA images is proposed in this paper. Simulation results show that the proposed compression scheme


can achieve a better compression performance than conventional lossless CFA image coding schemes.

HOMOMORPHIC AUTHENTICATION WITH DYNAMIC AUDIT FOR CATCHING THE MODIFICATIONS OF DATA IN MULTI CLOUD STORAGE
V.B.VINITHA Prathyusha Institute of Technology and Management

Abstract A multi cloud is a cloud computing environment in which an organization provides and manages some internal and external resources. Provable data possession (PDP) is an audit technique for ensuring the integrity of data in storage outsourcing. However, early remote data audit schemes focused on static data, and the fact that users no longer have physical possession of the possibly large volume of outsourced data makes data integrity protection a very challenging task. This paper proposes a homomorphic authentication with dynamic audit mechanism in multi clouds to support scalable service and data migration, in which multiple cloud service providers collaboratively store and maintain the clients' data. Security in the cloud is achieved by signing each data block before sending it to the cloud using the Boneh-Lynn-Shacham (BLS) algorithm, which is more secure compared to other algorithms. To ensure the correctness of data, we consider an external auditor, called a third-party auditor (TPA), to verify the integrity of the data stored in the cloud on behalf of the cloud user. The audit service construction is based on fragment structure, random sampling, and index-hash table techniques, supporting provable updates to outsourced data and timely anomaly detection. The security of this scheme is based on a multi-prover zero-knowledge proof system, which can satisfy completeness, knowledge

soundness, and zero-knowledge properties. The technique of bilinear aggregate signatures is used to achieve batch auditing, which reduces the computation overhead. Extensive security and performance analysis shows that the proposed schemes are provably secure and highly efficient.

A SHORT-WAVE INFRARED NANOINJECTION IMAGER WITH 2500 A/W RESPONSIVITY AND LOW EXCESS NOISE
K.SRIJA V.SINDHU

Abstract

We report on a novel nanoinjection-based short-wave infrared imager, which consists of InGaAs/GaAsSb/InAlAs/InP-based nanoinjection detectors with internal gain. The imager is 320×256 pixels with a 30 μm pixel pitch. The test pixels show responsivity values in excess of 2500 A/W, indicating generation of more than 2000 electrons/photon with high quantum efficiency. This amplification is achieved at complementary metal-oxide-semiconductor (CMOS) compatible, sub-volt bias. The measured excess noise factor F of the hybridized imager pixels is around 1.5 in the responsivity range 1500 to 2000 A/W. The temperature behavior of the internal dark current of the imager pixels is also studied from 300 to 77 K. The presented results show, for the first time, that the nanoinjection mechanism can be implemented in imagers to provide detector-level internal amplification, while maintaining low noise levels and CMOS compatibility.

DATA HIDING IN MPEG VIDEO FILES USING BLOCK SHUFFLING APPROACH


GOPU

Abstract Data hiding involves two sets of data, namely the cover medium and the embedded data, which is called the message. Early video data hiding approaches applied still-image watermarking techniques to video by hiding the message in each frame independently. This work deals with two approaches for data hiding. The first approach uses the quantization scale of constant-bit-rate video and second-order multivariate regression; however, the message payload is restricted to one bit per macroblock. The second approach uses Flexible Macroblock Ordering to allocate macroblocks to slice groups according to the content of the message. In existing work on compressed video, packets may be lost if the channel is unreliable. To improve the robustness of the existing work, a block shuffling scheme is proposed to isolate erroneous blocks caused by packet loss, and data hiding is applied to add additional protection for motion vectors. The existing solutions cause compression overhead, while the proposed solution reduces the effect of packet loss during transmission.
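
The core of a block shuffling scheme is a key-driven, invertible permutation of macroblock order, so that the blocks lost with one packet end up scattered among intact neighbours rather than wiping out a contiguous region. A minimal sketch, assuming a shared integer key and a plain Fisher-Yates permutation (the paper's exact shuffling rule is not given in the abstract):

```python
import random

def shuffle_blocks(blocks, key):
    """Sender side: permute macroblock order with a PRNG seeded by a
    shared key. Returns the shuffled list and the permutation used."""
    order = list(range(len(blocks)))
    random.Random(key).shuffle(order)
    return [blocks[i] for i in order], order

def unshuffle_blocks(shuffled, key):
    """Receiver side: regenerate the same permutation from the key and
    invert it to restore the original macroblock order."""
    order = list(range(len(shuffled)))
    random.Random(key).shuffle(order)
    restored = [None] * len(shuffled)
    for pos, i in enumerate(order):
        restored[i] = shuffled[pos]
    return restored
```

After unshuffling, any blocks lost in transit are spatially isolated, which makes error concealment from surrounding correct blocks far more effective.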


PERCOLATION THEORY BASED 2-D NETWORK CONNECTIVITY IN VANETS


SAKTHI LAKSHMI PRIYA C Anna University, Regional centre

Abstract A Vehicular Ad Hoc Network (VANET) is a special category of ad hoc network in which vehicles act as nodes. Due to its fast-moving nature, network connectivity is an important factor because it can greatly affect the performance of VANETs. Percolation theory [7] can be used to analyze the connectivity of VANETs through theoretical deduction and to discover the quantitative relationship among network connectivity, vehicle density, and transmission range. When the vehicle density or transmission range is large enough, there is a jump in network connectivity. By knowing the vehicle density, it is possible to calculate the minimum transmission range needed to achieve good network connectivity. As a large transmission range can cause serious collisions in wireless links, there is a trade-off in choosing a proper transmission range. Proper analysis of the transmission range can be useful in the real-world deployment of VANETs.
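
The density/range/connectivity relationship can be illustrated with a Monte-Carlo experiment on a one-dimensional road: drop vehicles at a given density and check whether every gap between consecutive vehicles is within the transmission range. This is a toy 1-D model of our own, not the paper's percolation derivation; all parameter names are illustrative.

```python
import random

def connectivity_probability(density, tx_range, road_len=1000.0,
                             trials=500, seed=1):
    """Estimate the probability that vehicles dropped uniformly on a
    road segment (density in vehicles/metre) form one connected chain:
    connected iff every inter-vehicle gap is <= tx_range (metres)."""
    rng = random.Random(seed)
    connected = 0
    for _ in range(trials):
        n = max(2, int(density * road_len))
        xs = sorted(rng.uniform(0, road_len) for _ in range(n))
        if all(b - a <= tx_range for a, b in zip(xs, xs[1:])):
            connected += 1
    return connected / trials
```

Sweeping `tx_range` at fixed density shows the sharp jump the abstract describes: the estimate stays near 0 below a critical range and rises quickly to near 1 above it, which is how a minimum transmission range for a known vehicle density could be read off.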

INFORMATION LOSS REVELATION USING FAKE OBJECTS


SREEKUMAR K N Mahendra Institute of Engineering & Technology

Abstract A data distributor has given sensitive data to a set of supposedly trusted agents (third parties). Some of the data are leaked and found in an unauthorized place (e.g., on the web or on somebody's laptop). The distributor must assess the likelihood that the leaked data came from one or more agents, as opposed to having been independently gathered by other means. This paper proposes data allocation strategies (across the agents) that improve the probability of identifying leakages. These methods do not rely on alterations of the released data (e.g., watermarks). In some cases, we can also inject realistic but fake data records to further improve our chances of detecting leakage and identifying the guilty party.
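
One simple way to rank agents by guilt, in the spirit of the model above, is to credit each agent for every leaked object it holds, weighted by how many agents share that object — objects given to few agents are more incriminating. This scoring rule is a deliberate simplification of the probabilistic model in the leakage-detection literature, for ranking only, not an exact posterior.

```python
def guilt_scores(leaked, agents_data):
    """Rank agents by how well their allotted data explains the leak.
    Each leaked object adds 1/k to every agent holding it, where k is
    the number of agents that hold the object."""
    scores = {a: 0.0 for a in agents_data}
    for obj in leaked:
        holders = [a for a, d in agents_data.items() if obj in d]
        if not holders:
            continue                    # object obtained elsewhere
        for a in holders:
            scores[a] += 1.0 / len(holders)
    return scores
```

This also shows why fake records help: a fake object handed to exactly one agent contributes a full 1.0 to that agent's score if it surfaces in the leak, pointing unambiguously at the guilty party.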


ADAPTIVE DEFENSE AGAINST VARIOUS ATTACKS IN DOS LIMITING NETWORK ARCHITECTURE


A.ANTON STENY, MRS.A.MANIAMMAL Kurinji College of Engineering & Technology

Abstract DoS attacks aim to exhaust scarce resources by generating illegitimate requests from one or more hosts, which affects the reliability of the Internet and threatens both routers and hosts. To counter this, a new mechanism named Adaptive Selective Verification (ASV) is proposed: a distributed adaptive mechanism, based on selective verification, for thwarting attackers' efforts to deny service to legitimate clients. An empirical evaluation of the ASV protocol is performed with the aim of understanding its behavior under attack.

A CRYPTOGRAPHIC APPROACH FOR EFFICIENT KEYWORD SEARCH SCHEME IN CLOUD COMPUTING


K.FATHIMA BUSHRA N. MARTINA P. USHADEVI Dr.Sivanthi Aditanar College of Engineering

Abstract A user stores his personal files in a cloud and retrieves them wherever and whenever he wants. For the sake of protecting the user's data privacy and query privacy, a user should store his personal files in an encrypted form in a cloud and then send queries in the form of encrypted keywords. However, a simple encryption scheme may not work well when a user wants to retrieve only files containing certain keywords using a thin client. First, the user needs to encrypt and decrypt files frequently, which depletes too much CPU capability and memory power of the client. Second, the service provider cannot determine which files contain keywords specified by a user

if the encryption is not searchable. Therefore, it can only return all the encrypted files. A thin client generally has limited bandwidth, CPU, and memory, and this may not be a feasible solution under the circumstances. In this paper, we investigate the characteristics of cloud computing and propose an efficient privacy-preserving keyword search scheme for cloud computing. It allows a service provider to participate in partial decipherment to reduce a client's computational overhead, and enables the service provider to search the keywords on encrypted files while protecting the user's data privacy and query privacy efficiently. We prove that our scheme is semantically secure.

DATA DREDGING FOR STREAMING DATA NUGGETS USING XQUERIES


A.T.SUMITHA, R. VAISHNAVI Sri Sairam Engineering College

Abstract The ever-growing Internet holds a very large amount of digital information in the form of semi-structured documents, and retrieving data relevant to a user query is a Herculean task. Indeed, documents are often so large that the dataset returned as an answer to a query may be too huge to convey interpretable knowledge. In this work an approach is described based on Tree-based Association Rules (TARs), which provide approximate, intensional information on both the structure and the contents of XML documents, and can be stored in XML format as well. This mined knowledge is later used to provide: (i) a concise idea of both the structure and the content of the XML document and (ii) quick, approximate answers to queries. In this work we focus on the second feature. A prototype system and experimental results demonstrate the effectiveness of the approach.

DIGITAL IMPLEMENTATION OF MULTILAYER PERCEPTRON NETWORK FOR PATTERN RECOGNITION


A.THILAGAVATHY, K.VIJAYA KANTH Srinivasan Engineering College, Perambalur.


Abstract Artificial neural networks owe their processing capabilities to a parallel architecture and are widely used in pattern recognition, system identification, and control problems. A multilayer perceptron is an artificial neural network with one or more hidden layers. This paper presents the digital implementation of a multilayer perceptron network using an FPGA (Field Programmable Gate Array) for image recognition. The network was implemented with three types of nonlinear activation function: hardlims, satlins, and tansig. The network was implemented using VHDL hardware description language code and an XC3S250E-PQ208 Xilinx FPGA device. The results obtained with the Xilinx Foundation 9.2i software are presented and analyzed in terms of device utilization and time delay.
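
The three activation functions named above (hardlims, satlins, tansig are the conventional names for the symmetric hard limit, symmetric saturating linear, and hyperbolic-tangent sigmoid) and the layer-by-layer forward pass the FPGA datapath computes can be modelled as follows. This is a floating-point behavioural reference, not the fixed-point VHDL design; the random weights in the usage are placeholders for trained, quantized values.

```python
import numpy as np

def hardlims(x):
    """Symmetric hard limit: outputs -1 or +1."""
    return np.where(x >= 0, 1.0, -1.0)

def satlins(x):
    """Symmetric saturating linear: identity clipped to [-1, 1]."""
    return np.clip(x, -1.0, 1.0)

def tansig(x):
    """Hyperbolic-tangent sigmoid."""
    return np.tanh(x)

def mlp_forward(x, weights, biases, activations):
    """One forward pass of a multilayer perceptron: each layer applies
    an affine transform followed by its nonlinearity."""
    a = x
    for W, b, f in zip(weights, biases, activations):
        a = f(W @ a + b)
    return a
```

A 2-input, 4-hidden, 1-output network is then `mlp_forward(x, [W1, W2], [b1, b2], [tansig, tansig])`; in hardware each `W @ a + b` maps to multiply-accumulate units and each activation to a lookup table or piecewise circuit.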

COMPUTER AIDED DIAGNOSIS OF CANCER WITH MAMMOGRAM USING FUZZY CASE BASED REASONING
A.ATHIRAJA, Dr.P.VENKATA KRISHNAN, Dr. A. ASKARUNISHA Vickram College of Engineering,

Abstract This project diagnoses cancer based on microcalcifications (MCs) in mammography images. A pre-processing technique is applied to the mammogram, which is then converted into a training dataset. A perceptron algorithm is used to classify cells as MC-present or MC-absent; the collection of MC-present cells forms the region of interest (ROI). A Case-Based Reasoning (CBR) classifier is then used to classify MC-present cells into the following classes: initial, small, medium, high, and very high. A prediction is made as to whether the patient is affected by cancer; if so, antibiotics or an operation are suggested. The decision making uses a fuzzy-set approach. This method provides better performance compared to previous diagnosis techniques.


OFFLINE HANDWRITTEN TAMIL CHARACTER RECOGNITION USING HMM MODEL


M.AYSHWARIYA, M.ANTONY ROBERT RAJ Alpha College of Engineering

Abstract Character recognition is an important area in the image processing and pattern recognition fields. Handwritten character recognition refers to the process of converting handwritten characters into Unicode characters. The recognition system can be either on-line or off-line. Offline Tamil handwritten character recognition is a difficult problem because of the high variability and ambiguity in the character shapes written by individuals. Many approaches have been proposed to solve this complex problem, but problems encountered by researchers still include long network training time, long recognition time, and low recognition accuracy. The performance of a character recognition system depends on proper feature extraction and correct classifier selection. This paper proposes an approach for offline recognition of Tamil characters using their structural features. Structural features are based on topological and geometrical properties of the character, such as aspect ratio, cross points, loops, branch points, inflection between two points, and horizontal curves at the top or bottom. These features are fed to a Hidden Markov Model (HMM) classifier for recognizing offline Tamil handwritten characters. A higher degree of accuracy has been obtained with the implementation of this approach on a comprehensive database, and the precision of the results demonstrates its applicability to commercial usage. The proposed concept is a solution crafted to enhance computational efficiency and improve recognition accuracy.

SLOW FEATURE ANALYSIS: A NOVEL APPROACH TO HUMAN ACTION RECOGNITION


L. BERWIN RUBIA, K. MANIMALA Dr. Sivanthi Aditanar College of Engineering

Abstract Slow Feature Analysis (SFA) extracts slowly varying features from a quickly varying input signal. The SFA framework is introduced to the problem of human action recognition by incorporating supervised information into the original unsupervised SFA learning. First, interest points are detected in local spatial and temporal regions, and local features are described with the SFA method. Each action sequence is represented by the Accumulated Squared Derivative (ASD), which is a statistical distribution of the slow features in an action sequence [1]. Descriptive statistical features are extracted in order to reduce the dimension of the ASD feature. Finally, a one-against-all support vector machine (SVM) is trained to classify actions represented by the statistical features.
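
The core SFA computation — find directions in which the signal's temporal derivative varies least — reduces, in the linear case, to whitening the input and taking the smallest eigenvectors of the derivative covariance. A minimal linear-SFA sketch (without the quadratic expansion or the supervised extension the paper adds), assuming a full-rank input signal:

```python
import numpy as np

def slow_feature_analysis(X, n_features=1):
    """Linear SFA: centre and whiten the (T x d) signal X, then
    project onto the directions where the whitened signal's temporal
    derivative has minimal variance (slowest features first)."""
    X = X - X.mean(axis=0)
    # whiten via C^{-1/2} from the eigendecomposition of the covariance
    C = np.cov(X, rowvar=False)
    d, E = np.linalg.eigh(C)
    W = E @ np.diag(1.0 / np.sqrt(d)) @ E.T
    Z = X @ W
    # eigenvectors of the derivative covariance, ascending eigenvalues
    dZ = np.diff(Z, axis=0)
    Cd = np.cov(dZ, rowvar=False)
    _, Ed = np.linalg.eigh(Cd)
    return Z @ Ed[:, :n_features]       # slowest components
```

Given a mixture of a slow and a fast source, the first extracted component recovers the slow source up to sign and scale, which is exactly the property exploited for action descriptors.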

ANALYSIS, FUTURE AND COMPARISON OF 4G TECHNOLOGY


DIVYA PRIYADHARSHINI M PREETHI S Anna University, Regional Centre
Abstract

Until recently, the means of communication have been only voice and text, and voice and SMS services were given top priority by telecom networks. But the Internet has provided many other services, like electronic file sharing, online gaming, e-commerce, and access to any information by just googling, which appeal to people because these services are cost effective and also reduce the burden on the human part. Making these services available on mobile devices has far more benefits and interesting possibilities. However, today's Internet, through cables and wireless, limits connectivity to a small region called a Local Area Network (LAN) or a Wireless Local Area Network (WLAN) hot spot, respectively. Adding advanced service support to today's voice-dominated telecom mobile networks is not an easy task either. Globally there is a perception that IP is the protocol that will enable new possibilities for the telecom sector in the future. This article discusses the features of 4G, the edge it provides once operational, its impact on India, barriers to implementation of 4G, and recommendations to overcome these barriers.

A HYBRID NETWORK FOR AUTOMATIC GREENHOUSE MANAGEMENT


R.NARMATHA, C.K.NITHYA, G.RANJITHA, M.KALAIYARASI P.S.R.Rengasamy College of Engineering for women

Abstract A greenhouse is a building in which plants are grown in a closed environment, and greenhouse management is the control of several greenhouses. The wireless section is located in the indoor environment where great flexibility is needed, particularly in the production area of the greenhouse. The wired section is mainly used in the outside area as a control backbone, to interconnect the greenhouse with the control room. An integrated wired/wireless solution uses the advantages of both technologies to improve performance. In the wired section, a controller area network (CAN) type network has been chosen on account of its simplicity, robustness, low cost, and good performance. For the wireless part, a ZigBee type network has been chosen. The SCADA system monitors and controls data in a simple way. To maintain optimal environmental conditions, greenhouse management requires data acquisition using SCADA (supervisory control and data acquisition).

AUTOMATIC DETECTION OF FACIAL FEATURES USING HAAR CLASSIFIER


K.JEYASREE, V.RAJALEKSHMI Lord Jegannath College of Engineering & Technology

Abstract This paper proposes automatic detection of facial features in an image, which can be an important stage for various facial image manipulation tasks, such as face recognition, facial expression recognition, 3D face modeling, and facial feature tracking. Region detection of facial features like the eyes, pupils, mouth, nose, nostrils, lip corners, and eye corners in facial images with neutral expressions and varying illumination is a challenging task. In this paper, we present different methods for fully automatic region detection of facial features. An object detector is used along with Haar-like cascaded features in order to detect the face, eyes, and nose. Novel techniques using the basic concepts of facial geometry are proposed to locate the mouth, nose, and eye positions. The estimation of the detection region for features like the eyes, nose, and mouth enhances the detection accuracy effectively. An algorithm using the H-plane of the HSV color space is proposed for detecting the eye pupil within the detected eye region. The proposed algorithm is tested on 100 frontal face images with two different facial expressions (neutral face and smiling face).
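
The Haar-like features such a detector thresholds are rectangle-sum differences, made cheap by an integral image (summed-area table): any rectangle sum costs four lookups. A sketch of that feature computation (illustrative of the building block, not the trained cascade itself):

```python
import numpy as np

def integral_image(img):
    """Summed-area table with a zero border: ii[r, c] = img[:r, :c].sum()."""
    return np.pad(img, ((1, 0), (1, 0))).cumsum(0).cumsum(1)

def rect_sum(ii, r, c, h, w):
    """Sum of the h x w rectangle whose top-left pixel is (r, c), in O(1)."""
    return ii[r + h, c + w] - ii[r, c + w] - ii[r + h, c] + ii[r, c]

def haar_two_rect_vertical(ii, r, c, h, w):
    """Two-rectangle Haar-like feature: top half minus bottom half of the
    window — a horizontal-edge response of the kind cascades threshold."""
    half = h // 2
    return rect_sum(ii, r, c, half, w) - rect_sum(ii, r + half, c, half, w)
```

A cascade evaluates thousands of such features per window, but each one stays constant-time thanks to the integral image, which is what makes Haar detection fast enough for face, eye, and nose localization.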

KEY MANAGEMENT SCHEME FOR VANET BASED ON VECTOR GROUP


K.JEYASREE, V.RAJALEKSHMI Lord Jegannath College of Engineering & Technology

Abstract Vehicular Ad Hoc Networks can offer various services to the user. In this paper, a key management scheme based on vector groups is proposed for VANETs to overcome high memory overhead and to reduce the high computational time of the existing system. We propose a vector-based cryptosystem to achieve security in terms of privacy and authentication. Keywords: key management, privacy, authentication, energy consumption, pairwise key establishment.

PERFORMANCE ANALYSIS AND DESIGN OF ENERGY EFFICIENT ARITHMETIC ADDERS BY PTL TECHNOLOGY AND ITS APPLICATION

L.KRISHNAKUMARI K.MURUGAN

National College of Engineering

Abstract Energy efficiency is one of the most desired features for modern electronic systems designed for high-performance and portable applications. Based on this, a high-speed, low-power full-adder cell is designed with DPL and SR-CPL internal logic to reduce the power-delay product. The full adder is implemented with an alternative internal logic structure based on multiplexing in order to reduce power consumption and delay. The designed DPL and SR-CPL full adders show reductions in power consumption and delay, and post-layout simulations show that the proposed full adders give better energy efficiency than other logic implementations. The full-adder cells are implemented with an internal logic structure based on the multiplexing of the Boolean functions XOR/XNOR and AND/OR, to obtain balanced delays in the SUM and CARRY outputs, respectively, and pass-transistor powerless/groundless logic styles, in order to reduce power consumption.
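
The multiplexed structure described above can be checked behaviourally: SUM selects between A XOR B and A XNOR B with Cin, and CARRY selects between A AND B and A OR B with Cin. The model below verifies that logic, as a truth-table reference only — the transistor-level DPL/SR-CPL circuits are of course what the paper actually designs.

```python
def mux(sel, d1, d0):
    """2:1 multiplexer: d1 when sel is 1, else d0."""
    return d1 if sel else d0

def full_adder(a, b, cin):
    """Full adder via the multiplexed XOR/XNOR and AND/OR structure:
    SUM   = Cin ? (A XNOR B) : (A XOR B)
    CARRY = Cin ? (A OR B)   : (A AND B)"""
    xor_, xnor_ = a ^ b, 1 - (a ^ b)
    and_, or_ = a & b, a | b
    s = mux(cin, xnor_, xor_)
    cout = mux(cin, or_, and_)
    return s, cout
```

Both multiplexers switch on the same select signal (Cin), which is what yields the balanced SUM and CARRY delays the abstract mentions.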

A DESIGN OF VIRUS DETECTION PROCESSOR FOR EMBEDDED NETWORK SECURITY


S.SHAMILI AND B.KARTHIGA

Abstract An Intrusion Detection System (IDS) has emerged as one of the most effective ways of furnishing security to those connected to the network, and the heart of a modern intrusion detection system is a pattern matching algorithm. A network security application needs the ability to perform pattern matching to protect against attacks like viruses and spam. Firewall solutions are not scalable and do not address the difficulty of antivirus processing. The main aim of this work is to furnish a powerful systematic virus detection hardware solution with minimum memory for network security. Instead of placing the entire patterns on a chip, a two-phase antivirus processor works by condensing as much of the important filtering information as possible onto a chip. Thus, the proposed system mainly concentrates on reducing the memory gap in on-chip memory.
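
The two-phase idea — a small filter over condensed pattern information on-chip, with exact verification against the full pattern set off-chip — can be sketched in software with a bit-vector filter over fixed-length pattern prefixes. The class name, filter size, and hash construction are all illustrative choices of ours, not the paper's hardware design.

```python
import hashlib

class TwoPhaseMatcher:
    """Phase 1: a compact bit-vector filter over fixed-length pattern
    prefixes (standing in for on-chip memory) screens each text offset.
    Phase 2: filter hits are verified exactly against the full
    patterns (standing in for off-chip memory)."""

    def __init__(self, patterns, m_bits=1 << 16, prefix_len=4):
        self.patterns = [p for p in patterns if len(p) >= prefix_len]
        self.k, self.m = prefix_len, m_bits
        self.bits = bytearray(m_bits // 8)
        for p in self.patterns:
            for pos in self._hashes(p[:prefix_len]):
                self.bits[pos // 8] |= 1 << (pos % 8)

    def _hashes(self, s):
        h = hashlib.sha256(s.encode()).digest()
        return (int.from_bytes(h[:4], "big") % self.m,
                int.from_bytes(h[4:8], "big") % self.m)

    def _maybe(self, s):
        return all(self.bits[p // 8] & (1 << (p % 8))
                   for p in self._hashes(s))

    def scan(self, text):
        hits = set()
        for i in range(len(text) - self.k + 1):
            if self._maybe(text[i:i + self.k]):      # cheap phase 1
                for p in self.patterns:               # exact phase 2
                    if text.startswith(p, i):
                        hits.add(p)
        return hits
```

Phase 1 may raise false alarms (hash collisions) but never misses a real prefix, so phase 2 only has to run on a small fraction of offsets — the same trade that lets the hardware keep only condensed filtering information on-chip.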

A NOVEL HIERARCHICAL ACCESS CONTROL APPROACH IN CLOUD COMPUTING


MR. V. VENKATESA KUMAR, M. MUTHULAKSHMI Anna University Regional Centre,

Abstract In cloud computing, users store their data in storage provided by service providers. To achieve security in storage devices, HASBE (Hierarchical Attribute-Set-Based Encryption) extends CP-ABE with a hierarchical structure of cloud users. The data owner can concurrently obtain encrypted data and decryption keys, and users are allowed to access files without authorization. When user revocation takes place, the data must be re-encrypted and re-uploaded to the cloud, and this must be done by the data owner itself, which increases computation and bandwidth costs. The HASBE scheme proposes a new scalable hierarchical attribute access control by introducing a key server, which provides both efficient access control for outsourced data and management of encryption/decryption keys. Keys must be generated for users when user revocation takes place, and a generated key may be useless if the user is revoked after the key generation process, increasing the key generation time. The HASBE scheme therefore also proposes a new scheme for balancing key generation and user revocation. It provides an effective way of generating keys and a highly secure, effective way of accessing data in a cloud environment.

IMPLEMETATION OF TURBO CODED WIRELESS SYSTEM AND STBC BASED SPATIAL DIVERSITY FOR WSN
N.NAVEEN KUMAR, MR.P.SAMPATH KUMAR Varuvan Vadivelan Institute of Technology

Abstract Uncoded systems have been discussed and their energy efficiency calculated in previous works. An energy-efficient virtual multiple-input multiple-output (MIMO) communication architecture based on a turbo coder is proposed for energy-constrained, distributed wireless sensor networks. As sensor nodes are generally battery-powered devices, the critical concern is how to reduce the energy consumption of nodes so that the network lifetime can be extended to reasonable times. The efficiency of space-time block code (STBC) encoded cooperative transmission is studied. Energy consumption differs for coded and uncoded systems; although STBC is discussed, a channel encoding scheme consumes more power while the system is operating. Energy efficiency is analyzed as a trade-off between the reduced transmission energy consumption and the increased electronic and overhead energy consumption. Simulations are expected to show that, with proper design, cooperative

transmission can enhance energy efficiency and prolong sensor network lifetime. The BER performance is also analyzed under various SNR conditions, and simulation results are included. Since a turbo encoder and decoder are used for this coded system, the BER is expected to approach zero at an SNR as low as 3 dB.

CLOUD COMPUTING

S.SANTHOS KUMAR, NIRMAL KUMAR N Gnanamani College of Technology

Abstract Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. The concept of cloud computing fills a perpetual need of IT: a way to increase capacity or add capabilities on the fly without investing in new infrastructure, training new personnel, or licensing new software. Cloud computing encompasses any subscription-based or pay-per-use service that, in real time over the Internet, extends IT's existing capabilities. Cloud computing provides computation, software, data access, and storage services that do not require end-user knowledge of the physical location and configuration of the system that delivers the services. Parallels to this concept can be drawn with the electricity grid, wherein end-users consume power without needing to understand the component devices or infrastructure required to provide the service.

THEFT IDENTIFICATION DURING DATA TRANSFER IN IT SECTOR
PREETHI

Abstract Theft identification during data transfer arises when a data distributor has given sensitive data to trusted agents and some of the data is leaked and found in an unauthorized place. The system can use data allocation strategies, or can also inject "realistic but fake" data records, to improve identification of leaked data and of who leaked it. The fake objects look exactly like original data, so the agents cannot distinguish them. Much of an organization's data is leaked through e-mails; data leaked from mail can be detected and identified through the fake objects. The e-random and s-random algorithms are used to minimize the shared content as well as to detect the guilty agent. The leaked data from the organization can be sent to third parties in the form of cipher text.

A DIFFERENTIATED QUALITY OF SERVICE BASED SCHEDULING ALGORITHM FOR REAL TIME TRAFFIC

AFSAN FAZIHA.R, ABINAYA DEEPIKA.R, DEEPALAKSHMI.R Velammal College of Engineering

Abstract Allocating resources to increase the QoS (quality of service) is a very important factor for any network carrying various types of real-time traffic, and real-time applications benefit most from QoS adaptation. Various scheduling disciplines are employed at the router to guarantee the QoS of the network. DiffServ (Differentiated Services) is an IP-based QoS support framework that differentiates between different classes of traffic. The function of the core router of the network is to forward packets according to the per-hop behavior associated with the DSCP (Differentiated Services Code Point) value. We therefore propose a Quality of Service Model Scheduling Algorithm (QSMA) and a random walk protocol as an effective scheme to maintain QoS parameters such as packet loss, packet delay, and bandwidth, providing absolute differentiated services for real-time applications.
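
The DSCP-keyed per-hop behaviour at a core router can be illustrated with a strict-priority queue: each packet's DSCP maps to a class, and higher classes are always served first. The DSCP-to-priority map and the plain strict-priority discipline below are illustrative only; the paper's QSMA would refine this with explicit delay, loss, and bandwidth targets.

```python
import heapq
from itertools import count

# Illustrative DSCP -> priority map (lower number = served first):
# EF for real-time traffic, an AF class for video, best effort last.
DSCP_PRIORITY = {46: 0,   # EF   (expedited forwarding, e.g. voice)
                 34: 1,   # AF41 (video)
                 0: 2}    # BE   (best effort)

class DiffServScheduler:
    """Strict-priority core-router queue keyed on the packet's DSCP;
    packets of equal priority leave in FIFO order."""

    def __init__(self):
        self._q = []
        self._seq = count()   # tie-breaker preserving FIFO per class

    def enqueue(self, packet, dscp):
        prio = DSCP_PRIORITY.get(dscp, 2)   # unknown DSCP -> best effort
        heapq.heappush(self._q, (prio, next(self._seq), packet))

    def dequeue(self):
        return heapq.heappop(self._q)[2] if self._q else None
```

Strict priority minimises delay for the EF class but can starve best-effort traffic under load, which is precisely why differentiated-services schedulers add rate limits or weighted service on top of it.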

AN EFFICIENT WAY OF SIMILARITY SEARCH DATA PRESERVATION USING ASCENT PLUNGE METHOD
S.SELVABHARATHI, S.SARANYA, K.SELVASHEELA, N.P.RAJESWARI Veerammal Engineering College

Abstract This project presents a preliminary formulation of the ascent plunge (gradient descent) method with data seclusion (privacy) preservation. Two approaches, a stochastic approach and a least-squares approach, are proposed for securely computing the blocks for both horizontally partitioned and vertically partitioned data. In horizontal and vertical partitioning, the parties hold the same objects for the same set of attributes. The mined attributes are confined securely and can be accessed only with the key, which is generated from the DSA algorithm. The data are accessed using the linear regression method. Secure matrix multiplication is used for the computations of linear regression and classification, performing the operations securely and finding a user's data set without accessing the private data. This method securely performs the ascent plunge method over vertically partitioned data.
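
The abstract's "ascent plunge" appears to be gradient descent. For reference, here is the plain, non-private gradient-descent fit of a least-squares linear regression — the computation the secure protocol would evaluate jointly. All data sits in one place in this sketch; in the secure setting, the `X.T @ (X @ w - y)` products would instead be computed with secure matrix multiplication over the parties' partitioned shares.

```python
import numpy as np

def gradient_descent_lr(X, y, lr=0.5, steps=500):
    """Gradient descent for least-squares linear regression:
    repeatedly step against the gradient X^T (X w - y) / n."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)   # least-squares gradient
        w -= lr * grad
    return w
```

With an intercept column in `X`, the fit recovers the true coefficients on clean data; the stochastic variant the abstract mentions would use a single sampled row per step instead of the full gradient.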

DUAL PROTECTING MECHANISM FOR MULTITIER WEB APPLICATION

SHANOFER SHAJAHAN

Abstract A web application is an application that is accessed over a network such as the Internet. Web applications are increasingly used for critical services, and to cope with the increase in demand and data complexity they have moved to a multitier design. Thus web applications have become a popular and valuable target for security attacks. These attacks have recently become more diverse, and attackers' attention has shifted from attacking the front end to exploiting vulnerabilities of the web application in order to corrupt the back-end database system. To penetrate their targets, attackers may exploit well-known service vulnerabilities. To protect multitier web applications, several intrusion detection systems have been proposed. By monitoring both web requests and the subsequent database requests, we are able to ferret out attacks that an independent IDS would not be able to identify. An intrusion detection system (IDS) is used to detect potential violations in database security. In every database, some attributes are considered more sensitive to malicious modification than others.
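The idea of correlating front-end and back-end traffic can be sketched as a lookup from web request patterns to the SQL statements they are allowed to trigger. The mapping table and query strings below are hypothetical stand-ins for what a real system would learn during a training phase.

```python
# Minimal sketch of web/DB correlation: each web request pattern maps to
# the set of SQL templates it may legitimately cause. The entries here are
# invented training output, not a real ruleset.
ALLOWED = {
    ("GET", "/products"): {"SELECT * FROM products"},
    ("POST", "/login"):   {"SELECT hash FROM users WHERE name=?"},
}

def is_intrusion(method, path, sql_queries):
    """Flag a session whose database traffic deviates from its web request."""
    allowed = ALLOWED.get((method, path), set())
    return any(q not in allowed for q in sql_queries)

normal = is_intrusion("GET", "/products", ["SELECT * FROM products"])
attack = is_intrusion("POST", "/login", ["DROP TABLE users"])
print(normal, attack)  # False True
```

A single-tier IDS looking only at the SQL stream could not tell the second query was unexpected for that request; the pairing is what exposes it.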

A NOVEL CHANNEL ADAPTIVE ROUTING WITH HANDOVER IN MANETs


A.SIVAGAMI, M.RENUKADEVI P.S.R.Rengasamy College of Engineering For Women

Abstract Radio link fluctuation makes packet transmission in mobile ad hoc networks a difficult task. To overcome this we propose a new protocol, a novel channel adaptive routing protocol, which reduces the impact of channel fading. The proposed protocol selects stable links for route discovery using an average non-fading duration technique, and a handoff strategy maintains reliable connections. The protocol provides a dual attack on unnecessary route discoveries: predicting path failure leading to handoff, and then bringing paths back into play when they are again available, rather than simply discarding them at the first sign of a fade.
Keywords- Mobile ad hoc networks, Average non-fading duration, Routing protocols, Channel adaptive routing.

A RANKING BASED APPROACH FOR HANDLING SENSITIVE DATA


TANIA VERONICA SEBASTIAN, J. RAJA, Annai Mathammal Sheela Engineering College

Abstract An organization undoubtedly wants to preserve and retain the data stored in its computers; on the other hand, this data is necessary for daily work processes. Users within the organization's perimeter (e.g., employees, subcontractors, or partners) perform various actions on this data (e.g., query, report, and search) and may be exposed to sensitive information embodied within the data they access. In an effort to determine the extent of damage that a user can cause to an organization using the information obtained, the concept of a ranking-based approach to security alerts for handling sensitive organizational data is introduced. The score measure is tailored for tabular data sets (e.g., result sets of relational database queries) and cannot be applied to non-tabular data such as intellectual property, business plans, etc. By assigning a score that represents the sensitivity level of the data a user is exposed to, the weight can determine the extent of damage to the organization if the data is misused. Using this information, the organization can then take appropriate steps to prevent or minimize the damage.
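One simple way to realize such a score for tabular results is to sum per-attribute sensitivity weights over the rows a user retrieves. The weights and attribute names below are invented for illustration; in practice they would be assigned by the organization's domain experts.

```python
# Hypothetical sensitivity weights per attribute (not from the paper).
WEIGHTS = {"name": 1.0, "salary": 5.0, "ssn": 10.0}

def sensitivity_score(result_set):
    """Score a tabular query result: sum of attribute weights over all rows."""
    return sum(WEIGHTS.get(col, 0.0) for row in result_set for col in row)

rows = [{"name": "A", "salary": 50000},
        {"name": "B", "salary": 60000}]
score = sensitivity_score(rows)
print(score)  # 12.0 -> two rows x (1.0 + 5.0)
```

A larger result set, or one exposing higher-weighted attributes such as `ssn`, would score higher and could trigger a stronger alert.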

A NOVEL ON-DEMAND CO-OPERATIVE OPPORTUNISTIC ROUTING SCHEME FOR CLUSTER BASED MANET
ABINAYA DEEPIKA.R, AFSAN FAZIHA.R, DEEPALAKSHMI.R.Velammal College of Engg and Tech

Abstract Mobile networks have received a great deal of attention during the last few decades due to their potential applications, large scale, improved flexibility, and reduced costs. Variation in link quality and routing assignment are the major problems in a communication network. This work addresses two problems associated with mobile networks: a method to reduce overhead between the nodes, and energy-balanced routing of packets by co-operative opportunistic routing for cluster-based communication. We propose a modified algorithm that uses On-Demand Opportunistic Group Mobility Based Clustering (ODOGMBC) for forming clusters and predicting cluster mobility with a neighborhood update algorithm. Cluster formation involves election of a mobile node as cluster head; each cluster comprises a cluster head and non-cluster-head nodes that form the cluster dynamically. Each node in the network continuously finds its neighbours by communicating with them, and nodes keep consistent, updated routing information in the route cache through the neighborhood update algorithm. In the routing process, a packet forwarded by the source node is updated by an intermediate forwarder if the topology undergoes changes. This opportunistic routing scheme provides responsive data transportation and manages the nodes effectively, even in a heavily loaded environment. Thus, our proposed routing technique helps to reduce overhead, increases efficiency and gives better control of path selection.

ANDROID MOBILES TO STOP THE PASSWORD STEALING ATTACKS


P. PREETHY JEMIMA, S. MUTHUKUMARASAMY, S.A. Engineering College

Abstract Text passwords have been adopted as the primary means of user authentication on websites. Humans are not experts at memorizing them, so they rely on weak passwords. Because these are static passwords, an adversary can launch attacks to steal them, and users suffer from several security drawbacks: phishing, keyloggers and malware. This problem can be overcome by a protocol named oPass, which leverages a user's cellphone and an SMS to thwart password stealing. oPass largely avoids man-in-the-middle attacks. In case users lose their cellphones, the scheme still works by reissuing the SIM card and the long-term password. This is an efficient user authentication protocol at an affordable cost.
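The anti-keylogging idea rests on one-time passwords: each login code is valid once, so capturing it is useless. A standard way to derive such codes from a shared secret is HMAC-based truncation in the style of HOTP (RFC 4226); this sketch is not the oPass protocol itself, only the one-time-code primitive it builds on.

```python
import hmac, hashlib, struct

def one_time_password(secret: bytes, counter: int, digits: int = 6) -> str:
    """HOTP-style one-time password (RFC 4226 dynamic truncation)."""
    msg = struct.pack(">Q", counter)                       # 8-byte counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                             # low nibble picks offset
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# The phone and the server share `secret`; each login uses a fresh counter,
# so a keylogged code cannot be replayed for the next session.
otp = one_time_password(b"shared-secret", counter=1)
print(otp)  # a 6-digit code
```

Because the code is a deterministic function of (secret, counter), the server can verify it without the secret ever being typed on the untrusted machine.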

RIHT: A NOVEL HYBRID IP TRACEBACK SCHEME


ATHIRA JAYAN , MALU FATHIMA AFSAR National College Of Engineering

Abstract Because the Internet is widely applied in various fields, more and more network security issues emerge and catch people's attention. Adversaries often hide themselves by spoofing their own IP addresses and then launching attacks. For this reason, researchers have proposed many traceback schemes to trace the source of such attacks. Some use only one packet in their packet logging schemes to achieve IP tracking. Others combine packet marking with packet logging and thereby create hybrid IP traceback schemes demanding less storage but requiring a longer search. In this paper, we propose a new hybrid IP traceback scheme with efficient packet logging that aims at a fixed storage requirement for each router (under 320 KB, according to CAIDA's Skitter data set) without the need to refresh the logged tracking information, and that achieves zero false positive and false negative rates in attack-path reconstruction. In addition, we use a packet's marking field to censor attack traffic on its upstream routers. Lastly, we simulate and analyze our scheme in comparison with other related research in the following aspects: storage requirement, computation, and accuracy.

EFFECTIVE ADAPTIVE PREDICTION SCHEME FOR WORKLOADS IN GRID ENVIRONMENT
MOHAMED AFFIR. A, VIJAYA KARTHIK.P, VASUDEVAN, Kalasalingam University

Abstract It is easy to predict workload when a task is not complex, but it is difficult to predict grid performance for a complex task, because heterogeneous resource nodes are involved in a distributed environment. Time-consuming execution of workload on a grid is even harder to predict due to heavy load fluctuations. In this paper we use a polynomial fitting method for CPU workload prediction. Errors that occur while predicting the workload of the grid are denoted prediction errors; these are minimized using a technique called EBAF (Estimation Based Adaptive Filter). A resource window is generated and mean square error analysis is done, where the error, the difference between the true value and the predicted value, is calculated. Finally, benchmark techniques are applied to evaluate the performance of the grid.
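Polynomial fitting for workload prediction can be sketched with least squares via the normal equations. The CPU-load history below is invented, and the degree and window length are arbitrary choices; the paper's EBAF error filtering is not shown.

```python
def polyfit(xs, ys, degree=2):
    """Least-squares polynomial fit via the normal equations (pure Python)."""
    n = degree + 1
    # Build A^T A and A^T y for the Vandermonde matrix A[i][j] = xs[i]**j.
    ata = [[sum(x ** (i + j) for x in xs) for j in range(n)] for i in range(n)]
    aty = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(n)]
    # Solve the linear system by Gaussian elimination with partial pivoting.
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(ata[r][col]))
        ata[col], ata[pivot] = ata[pivot], ata[col]
        aty[col], aty[pivot] = aty[pivot], aty[col]
        for r in range(col + 1, n):
            f = ata[r][col] / ata[col][col]
            ata[r] = [a - f * b for a, b in zip(ata[r], ata[col])]
            aty[r] -= f * aty[col]
    coeffs = [0.0] * n
    for r in range(n - 1, -1, -1):
        coeffs[r] = (aty[r] - sum(ata[r][c] * coeffs[c]
                                  for c in range(r + 1, n))) / ata[r][r]
    return coeffs  # [c0, c1, c2] for c0 + c1*x + c2*x^2

# Hypothetical CPU-load history sampled at t = 0..5; predict the next point.
history = [0.20, 0.23, 0.30, 0.41, 0.56, 0.75]
c = polyfit(range(6), history)
predicted = c[0] + c[1] * 6 + c[2] * 36
print(round(predicted, 2))  # 0.98
```

The residual between `predicted` and the load actually observed at t = 6 is exactly the prediction error that an adaptive filter such as EBAF would then try to drive down.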

ASSURING DATA AVAILABILITY ALL TIME IN CLOUD USING ERASURE CODE TECHNIQUE
A. FLORENCE, M. DHANALAKSHMI, V.VENU MOHAN KUMAR Saveetha School of Engineering

Abstract Cloud computing is the long-dreamed vision of computing as a utility, where data owners can remotely store their data in the cloud and enjoy on-demand high-quality applications and services from a shared pool of configurable computing resources. This paper focuses on the security of cloud data storage with an effective and flexible distributed storage verification scheme to ensure the correctness and availability of users' data in the cloud at all times. The proposed design lets the data owner encrypt the data before it reaches the cloud server, and guarantees the simultaneous identification of misbehaving servers under Byzantine failure, malicious data modification attacks and even server colluding attacks. By implementing an erasure code technique, data can be recovered from the above failures, thereby achieving data availability in the cloud at all times. Storing data in a third party's cloud system causes serious concern about data confidentiality, so this project also lets users safely delegate the integrity checking tasks to third-party auditors (TPA). The proposed design further supports secure and efficient dynamic operations on outsourced data, including block modification, deletion, and append. This project ensures proper double-time data security.

AN EFFECTIVE METHOD TO COMPOSE RULES USING RULE ONTOLOGY IN REPEATED RULE ACQUISITION FROM SIMILAR WEB SITES
A.L.ANUSHYA, M.U.ABDUL BASITH, S.ARUL, K.SANGEETHA, SNS College of Technology

Abstract The semantic content of web pages is used to extract rules from similar web pages of the same domain. Rule acquisition is used to acquire rules from web pages that contain unstructured texts, acquiring rules for a site by using similar rules of other sites in the same domain rather than extracting rules from each page from scratch. We propose an automatic rule acquisition procedure using a rule ontology, RuleToOnto, which represents information about the rule components and their structures. The rule acquisition procedure consists of a rule component identification step and a rule composition step. The rule component identification is complete. We use a genetic algorithm for the rule composition, and we perform experiments demonstrating that our ontology-based rule acquisition approach works in a real-world application.

FUZZY LOGIC BASED SCALABLE PACKET CLASSIFICATION ON FPGA


SILPA.L., MRS.NAGESWARI., The Rajaas Engineering College

Abstract Multi-field packet classification has evolved from traditional fixed 5-tuple matching to flexible matching with an arbitrary combination of numerous packet header fields. For example, the recently proposed OpenFlow switching requires classifying each packet using up to 12 packet header fields. It has become a great challenge to develop scalable solutions for next-generation packet classification that support higher throughput, larger rule sets and more packet header fields. The general packet classification problem has received a great deal of attention over the last decade: the ability to classify packets into flows based on their packet headers is important for security, virtual private networks and packet filtering applications. In this project we propose a new approach to packet classification based on fuzzy logic decision trees. We focus here only on the problem of identifying the class to which a packet belongs, and present a fuzzy-decision-tree-based linear multi-pipeline architecture on FPGAs for wire-speed multi-field packet classification. A new method of fuzzy decision trees, called soft decision trees, is used. This method combines tree growing and pruning, to determine the structure of the soft decision tree, with refitting and backfitting, to improve its generalization capabilities. We consider next-generation packet classification problems where more than 5-tuple packet header fields are classified. Several optimization techniques are proposed to reduce the memory requirement of the state-of-the-art decision-tree-based packet classification algorithm. When matching multiple fields simultaneously, it is difficult to achieve both a high classification rate and modest storage in the worst case. Our soft-decision-tree-based scheme can be considered among the algorithms with the highest throughput and efficiency.
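The tree-based classification the abstract builds on can be illustrated with a crisp (non-fuzzy) hand-built tree over a few header fields; a soft decision tree would weight both branches at each node instead of taking exactly one. Field values and class names below are invented for the example.

```python
# A hand-built decision tree over packet header fields, illustrating how
# tree traversal maps a packet to a rule class. Values/classes are made up.
def classify(pkt):
    if pkt["proto"] == 17:                         # UDP branch
        return "voip" if pkt["dst_port"] == 5060 else "default"
    if pkt["proto"] == 6:                          # TCP branch
        if pkt["dst_port"] in (80, 443):
            return "web"
        # Example policy: block other TCP from the 10.0.0.0/8 range.
        return "blocked" if pkt["src_ip"].startswith("10.") else "default"
    return "default"

web = classify({"proto": 6, "dst_port": 443, "src_ip": "8.8.8.8"})
voip = classify({"proto": 17, "dst_port": 5060, "src_ip": "10.0.0.1"})
print(web, voip)  # web voip
```

On an FPGA each tree level becomes a pipeline stage, so one packet is classified per clock cycle once the pipeline is full, which is where the wire-speed claim comes from.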

ENHANCED CLOUD STORAGE SECURITY WITH AVP MECHANISM


ANANDH.A Saveetha Engineering college

Abstract Cloud storage is online storage where data is stored in virtualized pools of storage hosted by third parties. However, data storage may not be fully trustworthy, which poses many security challenges for cloud storage. Access control, version control, and public auditing are taken into account to secure the data stored in the cloud. The proposed secured overlay cloud storage system provides fine-grained access by using hierarchy-based access control: access to the cloud storage system is granted based on the user's group type. Version control is the framework that eliminates data redundancy and provides version backup in the cloud storage. On top of the version control design, a layered approach of cryptographic protection is added to enhance data security. Version control is employed in the cloud storage by implementing an appropriate storage mechanism. Finally, public auditing in the cloud storage system is enforced to maintain the activity log and to analyse the data accessed by the users; thus data stored in the cloud storage is audited and any kind of modification to the data is reported to the administrator.

DETECTING SESSION HIJACKS IN WIRELESS NETWORKS
BANU PRIYA.E , MOHAMMAD MALIK MUBEEN.S. National College of Engineering

Abstract Among the variety of threats and risks that wireless LANs face, session hijacking attacks are common and serious ones. When a session hijacking attack occurs, an attacker forces a normal user to terminate its connection to an access point (AP) by first masquerading as the AP's MAC address. The attacker then associates with the AP by masquerading as the user's MAC address and takes over its session. Current techniques for detecting session hijacking attacks are mainly based on spoofable and predictable parameters such as sequence numbers, which can be guessed by attackers. To enhance the reliability of intrusion detection systems, mechanisms that utilize the unspoofable PHY-layer characteristics are needed. We show that using a Wavelet Transform (WT), the colored noise with complex Power Spectral Density (PSD) in our case can be approximately whitened. Since a larger Signal to Noise Ratio (SNR) increases the detection rate and decreases the false alarm rate, the SNR is maximized by analyzing the signal at specific frequency ranges.

ENERGY CONSUMPTION AND LOCALITY OF SENSOR NETWORKS


ARAVINTH.S Mr. RAMALINGAM SAKTHIVELAN N.M.K. Shri Krishna Engineering College

Abstract Wireless sensor networks (WSNs) are used in many areas for critical infrastructure monitoring and information collection. For WSNs, source-location privacy (SLP) service is further complicated by the fact that sensor nodes generally consist of low-cost and low-power radio devices, so computationally intensive cryptographic algorithms (such as public-key cryptosystems) and large-scale broadcasting-based protocols may not be suitable. We propose criteria to quantitatively measure source-location information leakage in routing-based SLP protection schemes for WSNs, and through this model identify the vulnerabilities of existing SLP protection schemes. We then propose a scheme to provide SLP through routing to a randomly selected intermediate node (RSIN) and a network mixing ring (NMR). The security analysis, based on the proposed criteria, shows that the proposed scheme can provide excellent SLP. Messages are sent securely: adversaries are not able to identify the source location, nor can they interrupt the messages, because of the secure algorithms. Comprehensive simulation results demonstrate that the proposed scheme is very efficient, can achieve a high message delivery ratio, and can be used in many practical applications.

PACKET DATA REDUNDANCY ELIMINATION IN DATA AGGREGATION
C.HANNAH JASMINE S.SIVARANJANI Kalasalingam University

Abstract A wireless sensor network consists of a sizable number of sensor nodes and a base station. Many nodes transmit similar information to the base station, so energy is wasted and the network lifetime is drained faster. In this paper we propose an energy-efficient heterogeneous clustering protocol with a Support Vector Machine (SVM) based on data aggregation. It collects and compresses the data from the various end points, so minimum energy is spent, the network lifetime is prolonged, and the number of transmissions is minimized. The performance of the proposed technique is then compared with the LEACH protocol. Simulation results show that the proposed mechanism can efficiently remove data redundancy in a wireless sensor network.
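The aggregation step can be sketched as a cluster head averaging its members' readings before forwarding a single value to the base station. Node IDs and readings are invented, and averaging is only one possible aggregation function (the paper pairs aggregation with an SVM, which is not shown).

```python
# Cluster-head aggregation sketch: near-identical member readings are
# reduced to one forwarded value, cutting transmissions from N to 1.
def aggregate(readings):
    """readings: {node_id: sensed_value} collected by the cluster head."""
    return round(sum(readings.values()) / len(readings), 2)

cluster = {"n1": 30.1, "n2": 30.3, "n3": 29.9, "n4": 30.1}
summary = aggregate(cluster)
print(summary)  # 30.1 -- one packet to the base station instead of four
```

Since radio transmission dominates a sensor node's energy budget, sending one aggregated packet instead of four redundant ones is where the lifetime gain comes from.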

EFFICIENT AND EFFECTIVE DATA MINING WITH BLOOMCAST AND RANKING OF DATA BY STEMMING ALGORITHM IN UNSTRUCTURED P2P NETWORKS
P.ARUNA .P.RANJAN School of Computing Sciences, Hindustan University.

Abstract Efficient and effective full-text retrieval and ranking in unstructured peer-to-peer networks remains a challenge in the research community, because it is difficult, if not impossible, for unstructured P2P systems to effectively locate items with guaranteed recall, and existing schemes to improve the search success rate often rely on replicating a large number of item replicas across the wide area network, incurring large communication and storage costs. Due to the exact-match problem of DHTs and the federated search problem, such schemes provide poor full-text search capacity. This work proposes replication of Bloom Filters (BFs) for efficient and effective data retrieval and ranking in unstructured P2P networks. The needs of users vary, so what may be interesting for one may be completely irrelevant for another. The role of the ranking process is thus crucial: select the pages that are most likely to satisfy the user's needs, and bring them to the top positions. Ranking of data is performed based on term frequency and keywords. By replicating the encoded term sets using BFs and stemming of words instead of raw documents among peers, the communication and storage costs are greatly reduced, while full-text multi-keyword searching is supported.

BEHAVIOURAL BASED SECURED USER AUTHENTICATION USING IMAGE CAPTCHA
S.DEEPAN K.SURESH KUMAR Saveetha Engineering College,

Abstract In recent days, web access has become more popular, and with more user access come many threats. Web access has been controlled by providing secured authentication (username and password). Remembering the password is the main challenging task for the user when accessing a webpage or email account. A major problem in security is the fact that internet users have online accounts on many websites, systems, and devices that require them to remember passwords for identification. Because users can only remember a limited number of passwords, many simply forget them. Security questions provide the solution for re-accessing their accounts, but since remembering the password is the main challenge, it is also difficult for the user to give the appropriate answer. To overcome this problem a new technique called image CAPTCHA is proposed.

SUPERVISED CLUSTERING ALGORITHM FOR GENE DATA CLASSIFICATION


DR. S.SAKTHIVEL D.GOMATHI Anna University of Chennai

Abstract A microarray is an array of gene data; in turn, a gene datum is nothing but a cell. Microarray technology is an important biotechnological means that allows us to record the expression levels of thousands of genes simultaneously within a number of different samples. An important application of microarray gene expression data in functional genomics is to classify samples according to their gene expression profiles. The proposed work applies the mutual information criterion to evaluate a set of attributes and to select an informative subset to be used as input data for microarray classification. In this large set of genes, only a few are effective for performing diagnostics in an optimal way. In order to find the effective group of genes we propose a Supervised Clustering Algorithm (SCA). Existing unsupervised clustering algorithms group genes according to mean and standard deviation measures and do not consider parameters such as mutual information or correlation. The proposed algorithm computes the similarity between attributes; this similarity measure is useful for reducing the redundancy among the attributes. The original gene set is partitioned into subsets or clusters with respect to the similarity measure, and a single gene from each cluster having the highest gene-class relevance value is selected as the representative gene. The proposed supervised attribute clustering algorithm yields biologically significant gene clusters, and its performance is effective when compared with existing algorithms both qualitatively and quantitatively.

INDIAN LICENSE PLATE RECOGNITION BASED ON OPTICAL CHARACTER RECOGNITION
R.DENNIS,DR.R.K.SELVAKUMAR Cape Institute of Technology

Abstract Indian license plate recognition based on optical character recognition (ILPROCR) plays an important role in numerous applications, and a number of techniques have been proposed. The approach comprises stages of pre-processing, license plate detection, extraction of characters and numbers from the detected plate, license plate segmentation and character recognition. In the experiments, all types of license plates, captured by camera at different times of day and in different weather conditions, were used. This paper provides a character recognizer for the identification of the characters in the license plate.

ANALYSIS AND COMPENSATION FOR NONLINEAR INTERFERENCE OF MULTICARRIER MODULATION OVER SATELLITE LINK
A.IDA BERYL J.BEMINA Anand Institute of Higher Technology

Abstract We present the analytical characterization of the nonlinear interference that results when passing more than one high-order modulation carrier through the same nonlinear transponder high-power amplifier (HPA). A Volterra filter is proposed which is novel in its implementation of this analytical characterization and its modeling of intersymbol interference (ISI) and adjacent channel interference (ACI). The focus is on adaptive algorithms with pilot-based training, so that the solutions are completely blind to unknown transponder HPA characteristics and can rapidly respond to varying operating back-off levels. Furthermore, two families of adaptive solutions are provided to compensate for nonlinear ISI and ACI. The first set performs adaptive channel inversion and then applies equalization; the second performs adaptive channel identification and then applies cancellation. The effectiveness of the proposed analysis and techniques is demonstrated via extensive simulations for high-order QAM and APSK modulations. Coded performance with selected LDPC codes designed for the DVB-S2 standard is also included. Finally, computational complexity is assessed, and the performance impact is quantified when complexity is reduced by decreasing the number of Volterra coefficients.

CLOUD INFORMATION ACCOUNTABILITY FRAMEWORK


SELVAMANJU.E,. MS.S.AGNES JOSHY, Francis Xavier Engineering College,

Abstract: Cloud computing is a large-scale distributed storage system. It offers the end user resources and highly scalable services. In cloud services, users' data are usually placed in a remote area; users do not operate on the data directly, so they fear losing their own data. A highly decentralized information accountability framework is therefore used to continuously monitor the users' data in the cloud, providing both logging and auditing mechanisms for users' data and users' control. Any access to a user's data triggers authentication and automated logging through JAR (Java ARchive) programmable capabilities: JAR files automatically log the usage of the user's data by any entity in the cloud. In addition, the approach can handle personally identifiable information and provides a security analysis and reliability. It is essential to provide such an effective mechanism.

RISK FACTOR ASSESSMENT FOR HEART DISEASE USING DATA MINING TECHNIQUES
I.S.JENZI, P.PRIYANKA, DR.P.ALLI Velammal College of Engineering and Technology

Abstract Cardiovascular disease is a major threat to half of the world's population. The term heart disease covers all the diverse diseases affecting the heart. The healthcare industry generates large amounts of data that are too difficult to analyze by traditional methods, which shows the significance of computer-assisted methods for making correct decisions. The objective of this paper is to develop a heart disease prediction system using data mining techniques, which helps to identify useful patterns of information from the medical data for quality decision making. Association rules and classification techniques like Decision Tree, Naive Bayes and Neural Networks are used in the literature for disease prediction. This work concentrates on building a classifier model using a Decision Tree for predicting heart disease. The system is implemented on the .NET platform and the popular data mining tool WEKA is also used. The result obtained from the classifier enables us to establish significant patterns and relationships between the medical factors relating to heart disease.
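The core step of decision-tree induction, choosing the attribute threshold that best separates the classes, can be sketched with Gini impurity. The four-patient table below is invented for illustration and is far smaller than any real clinical data set.

```python
# Gini-impurity split search on a toy (invented) table of
# (age, cholesterol) -> has_disease labels.
def gini(labels):
    """Gini impurity of a binary label list."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2 * p * (1 - p)

def best_split(rows, labels, feature):
    """Find the threshold on `feature` minimizing weighted Gini impurity."""
    best = (float("inf"), None)
    for t in sorted({r[feature] for r in rows}):
        left = [l for r, l in zip(rows, labels) if r[feature] <= t]
        right = [l for r, l in zip(rows, labels) if r[feature] > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(rows)
        best = min(best, (score, t))
    return best

rows = [{"age": 39, "chol": 180}, {"age": 45, "chol": 260},
        {"age": 62, "chol": 290}, {"age": 58, "chol": 300}]
labels = [0, 0, 1, 1]                       # 1 = heart disease present
print(best_split(rows, labels, "age"))      # (0.0, 45) -> split at age <= 45
```

A tree builder (such as the ones inside WEKA) applies this search recursively over all attributes, growing child nodes until the leaves are pure or a stopping rule fires.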

A NOVEL CORRELATION PRESERVING INDEXING METHOD FOR DOCUMENT CLUSTERING IN CORRELATION SIMILARITY MEASURE SPACE
S.GEOFFRIN, MRS.J.C KANCHANA, KLN College of Engineering,

Abstract This paper presents a new spectral clustering method called correlation preserving indexing (CPI), which is performed in the correlation similarity measure space. In this framework, the documents are projected into a low-dimensional semantic space in which the correlations between the documents in the local patches are maximized while the correlations between the documents outside these patches are minimized simultaneously. Since the intrinsic geometrical structure of the document space is often embedded in the similarities between the documents, correlation as a similarity measure is more suitable for detecting the intrinsic geometrical structure of the document space than Euclidean distance. Consequently, the proposed CPI method can effectively discover the intrinsic structures embedded in high-dimensional document space. The effectiveness of the new method is demonstrated by extensive experiments conducted on various data sets and by comparison with existing document clustering methods.
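The correlation similarity CPI works in can be computed directly on term-frequency vectors, and the example shows why it beats Euclidean distance for documents: scaling a vector (a longer document on the same topic) leaves correlation unchanged while the Euclidean distance grows. The two toy vectors are invented.

```python
import math

# Pearson correlation between two term-frequency vectors: the similarity
# measure CPI clusters in, contrasted with Euclidean distance.
def correlation(u, v):
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    du = [x - mu for x in u]
    dv = [x - mv for x in v]
    num = sum(a * b for a, b in zip(du, dv))
    den = math.sqrt(sum(a * a for a in du)) * math.sqrt(sum(b * b for b in dv))
    return num / den

# d2 is d1 scaled by 2 (same topic, longer document): correlation sees them
# as identical even though their Euclidean distance is large.
d1 = [3, 0, 1, 2]
d2 = [6, 0, 2, 4]
print(round(correlation(d1, d2), 3))  # 1.0
```

This scale invariance is what the abstract means by correlation capturing the intrinsic geometry of the document space better than Euclidean distance.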


RFID AND ZIGBEE BASED MANUFACTURE MONITORING SYSTEM


R.JEGATHEESWARI , R.KANITHA Kalasalingam Institute of Technology

Abstract Automatic Identification Technologies (AIT) have revolutionized the way the world conducts commerce, but many people do not really understand what these technologies do or how AIT is changing our lives. Automatic identification technology comprises numerous technologies such as RFID, OCR, 2D barcodes, magnetic strips, smart cards, voice recognition, and biometrics. Automatic identification holds the promise of collecting data about a movable asset in the physical world with 100 percent accuracy in real time. Traditional methods of monitoring production in enterprises by humans on site are unable to meet the expectations for efficiency, accuracy and cost as product lifecycles are shortened continuously. Setting up an RFID- and ZigBee-based manufacturing monitoring system is a good approach to improve monitoring efficiency and thereby management efficiency in enterprises. Although there are still some problems to be solved for RFID and ZigBee technologies, their unique features still make a monitoring system based on them a promising system for manufacturing enterprises. The architecture of the RFID- and ZigBee-based monitoring system is presented in this paper.

A MACHINE DOCTOR THAT DIAGNOSES OPHTHALMOLOGY PROBLEMS USING NEURAL NETWORKS
A.JENEFA., S RAJI., Francis Xavier Engineering college

Abstract Ophthalmology is the branch of medicine that deals with the eye and its problems. An expert system contains the knowledge about particular diseases. The machine doctor is one that can treat ophthalmology problems by using an expert system, without any human doctor. We mainly deal with ophthalmology problems such as myopia, hypermetropia, astigmatism, presbyopia, retinopathy and glaucoma. Our machine doctor provides both advice about diseases and information about diseases, using the expert system and auto-refraction. The machine doctor takes input as queries, data, voice, etc., and provides output either as a printed statement or through an electronic device. The neural networks concept, with the back-propagation algorithm, is used to process the input. The main aim is to give advice to rural people and to make our machine doctor user friendly and cost effective.

EMPOWERED SERVICE DELEGATION WITH ATTRIBUTE ENCRYPTION FOR DISTRIBUTED CLOUD COMPUTING
M.JOTHIMANI Nandha Engineering College

Abstract Cloud computing has emerged as one of the most influential paradigms in the IT industry. Since this new computing technology requires users to entrust their valuable data to cloud providers, there have been increasing security and privacy concerns about outsourced data. Several schemes employing attribute-based encryption (ABE) have been proposed for access control of outsourced data in cloud computing, but most of them suffer from inflexibility in implementing complex access control policies. Allowing cloud service providers (CSPs), which are not in the same trusted domains as enterprise users, to take care of confidential data may raise potential security and privacy issues. To keep sensitive user data confidential against untrusted CSPs, a natural way is to apply cryptographic approaches, disclosing decryption keys only to authorized users. Such a scheme should also provide high performance, full delegation, and scalability, so as to best serve the needs of accessing data anytime and anywhere, delegating within enterprises, and achieving a dynamic set of users. HASBE (hierarchical attribute-set-based encryption) employs multiple value assignments for the access expiration time to deal with user revocation more efficiently than existing schemes, and it provides fine-grained access control and full delegation. Finally, based on the HASBE model, we propose a scalable revocation scheme that delegates most of the computing tasks in revocation to the CSP, to achieve a dynamic set of users efficiently.

A NOVEL CORRELATION PRESERVING INDEXING METHOD FOR DOCUMENT CLUSTERING IN CORRELATION SIMILARITY MEASURE SPACE
S.GEOFFRIN, MRS.J.C KANCHANA KLN College of Engineering,

Abstract This paper presents a new spectral clustering method called correlation preserving indexing (CPI), which is performed in the correlation similarity measure space. In this framework, the documents are projected into a low-dimensional semantic space in which the correlations between the documents in the local patches are maximized while the correlations between the documents outside these patches are minimized simultaneously. Since the intrinsic geometrical structure of the document space is often embedded in the similarities between the documents, correlation as a similarity measure is more suitable for detecting the intrinsic geometrical structure of the document space than Euclidean distance. Consequently, the proposed CPI method can effectively discover the intrinsic structures embedded in high-dimensional document space. The effectiveness of the new method is demonstrated by extensive experiments conducted on various data sets and by comparison with existing document clustering methods.
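As a small illustration of the similarity measure underlying CPI, the snippet below computes the Pearson correlation between two term-count vectors in plain Python (the vectors and the equal-length assumption are illustrative, not from the paper). Two documents with proportional term counts are perfectly correlated even though their Euclidean distance is large, which is exactly why correlation suits the document space better.

```python
# Correlation similarity between two document term vectors (a sketch of
# the measure CPI operates in, assumed to be Pearson correlation).
from math import sqrt

def correlation(u, v):
    """Pearson correlation of two equal-length numeric vectors."""
    n = len(u)
    mu_u, mu_v = sum(u) / n, sum(v) / n
    du = [x - mu_u for x in u]          # center each vector
    dv = [x - mu_v for x in v]
    num = sum(a * b for a, b in zip(du, dv))
    den = sqrt(sum(a * a for a in du)) * sqrt(sum(b * b for b in dv))
    return num / den

doc_a = [2, 0, 4, 6]
doc_b = [4, 0, 8, 12]   # same topic, twice the length
# Proportional term counts -> correlation 1, despite large Euclidean gap.
print(round(correlation(doc_a, doc_b), 6))  # -> 1.0
```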

ROBUST FUZZY SCHEDULING IN OFDMA NETWORKS FOR RESOURCE ALLOCATION


S.BRENIJA STANLEY DR.M.IRULAPPAN Francis Xavier Engineering College

Abstract This paper deals with an opportunistic resource scheduling problem for the relay-based Orthogonal Frequency Division Multiple Access (OFDMA) cellular network, where relay stations (RSs) perform opportunistic network coding with downlink and uplink sessions of a mobile station (MS). To this end, consider time-division duplexing (TDD), where each time-slot is divided into three phases according to the type of transmitter nodes, i.e., the base station (BS), MSs, and RSs. Moreover, to improve the flexibility of resource allocation, a dynamic TDD scheme is applied, in which the time duration of each phase in each time-slot can be adjusted. For the opportunistic network coding, a novel model is introduced for network-coding-aware RSs, with which an opportunistic network coding problem can be reduced to an opportunistic subchannel scheduling problem. Scheduling is provided by a fuzzy algorithm. This paper formulates an optimization problem that aims at maximizing the average weighted-sum rate for both downlink and uplink sessions of all MSs, in order to satisfy the quality-of-service (QoS) requirements of each MS. It develops a resource scheduling algorithm that optimally and opportunistically schedules subchannels, transmission power, network coding, reduced power consumption, and the time duration of each phase in each time-slot. Through numerical results, it measures how the network coding strategy and dynamic TDD each affect the network performance in various network environments.

RTOS BASED MONITORING OF THE INDUSTRIAL ENVIRONMENT WITH AN EMBEDDED SYSTEM INTEGRATED IN A WSN
R.JAYAKUMAR MRS.R.THENMOZHI Ganadipathy tulsis jain engineering college

Abstract The system proposed in this paper aims to reduce the task-switching delay and to increase the number of motors whose torque and efficiency are monitored in an automated industrial environment, in real time, by employing wireless sensor networks (WSNs). An embedded system is employed for acquiring electrical signals from the motor in noninvasive and invasive manners, and then performing local processing for torque and efficiency estimation. The values calculated by the embedded system are transmitted to a monitoring unit through an IEEE 802.15.4-based WSN. At the base unit, various motors can be monitored in real time. The RTOS VxWorks reduces the switching delay between two tasks according to their assigned priorities; the delay is theoretically zero, which increases the monitoring time. IEEE 802.15.4 ZigBee works on a MAC-address basis. When the speed and temperature exceed the threshold level, the ARM controller controls the motor and enables the buzzer. VxWorks provides a high-performance, scalable, reliable and high-throughput monitoring system.

Keywords: VxWorks, efficiency estimation, embedded systems, induction motors, torque measurement, wireless sensor networks (WSNs).

TOP-K RESPONSES USING KEYWORD SEARCH OVER RELATIONAL DATABASES THROUGH TUPLE UNITS
MRS.G.JEYASRI, MRS.K.UMAMAHESWARI, University College of Engineering

Abstract Existing keyword search methods on databases usually find Steiner trees composed of connected database tuples as answers. They identify Steiner trees on the fly by discovering rich structural relationships between database tuples, without considering the fact that these structural relationships can be precomputed and indexed. Tuple units have been proposed to improve search efficiency by indexing structural relationships between tuples, and existing methods identify a single tuple unit to answer keyword queries. In many cases, however, multiple tuple units should be combined to answer a keyword query, so these methods will involve false negatives. To handle this problem, we study how to integrate multiple related tuple units to effectively answer keyword queries. To achieve high performance, two novel indexes are used: a single-keyword-based structure-aware index and a keyword-pair-based structure-aware index. Structural relationships between different tuple units are incorporated into the indexes. By using the indexes, we can effectively identify the answers of integrated tuple units. New ranking techniques and algorithms are implemented to progressively find the top-k answers.
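The two indexes can be pictured as plain dictionaries, as in the sketch below. The tuple-unit contents and identifiers are hypothetical; the point is only that a keyword pair maps directly to the units containing both keywords, so a two-keyword query needs no on-the-fly join.

```python
# Illustrative sketch of a single-keyword index and a keyword-pair index,
# both mapping to the tuple units that answer a query.
from itertools import combinations
from collections import defaultdict

tuple_units = {                      # hypothetical tuple-unit texts
    "u1": "database keyword search",
    "u2": "keyword query ranking",
    "u3": "steiner tree algorithms",
}

single_index = defaultdict(set)      # keyword -> tuple-unit ids
pair_index = defaultdict(set)        # (kw1, kw2) -> tuple-unit ids
for uid, text in tuple_units.items():
    words = sorted(set(text.split()))
    for w in words:
        single_index[w].add(uid)
    for pair in combinations(words, 2):
        pair_index[pair].add(uid)

# A two-keyword query is answered directly from the pair index.
print(sorted(pair_index[("keyword", "search")]))  # -> ['u1']
```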

IMAGE STEGANOGRAPHY USING ADAPTIVE LSB ENCODING


ANAND.G, ABINAYA.S, KARTHIGA, RPSRR College of Engineering for Women

Abstract Steganography is the art and science of hiding secret data within innocent cover data so that it can be securely transmitted over a network. Steganography hides the very fact that communication is taking place by secretly embedding the information in other data. Many different carrier file formats can be used, but digital images are popular because of their frequency on the internet. Hiding information within an image can be done by various steganography methods; some are more complex than others, and all of them have respective strong and weak points. This paper intends to give an overview of image steganography, its uses and techniques. We propose an edge-adaptive scheme that selects the sharper pixels along the edges for hiding the message. The advantage is that the smooth regions are much less affected. The new technique adaptively selects the pixels at the edges (sharper regions) for hiding the secret data, rather than selecting them randomly using a PRNG. For lower embedding rates, only sharper edge regions are used. When the embedding rate increases, more edge regions can be released adaptively for data hiding by adjusting a few parameters.

A TREE BASED MINING APPROACH FOR DETECTING INTERACTION PATTERNS
LINCY JANET. F, KARTHIKA. N, M. JANAKI MEENA Velammal College of Engineering and Technology

Abstract Human interactions are defined as the social behavior or communicative actions taken by meeting participants related to the current topic. The various kinds of human interactions are proposing an idea, giving comments, acknowledgement, asking an opinion, positive opinion and negative opinion. These interactions are essential to predict the user's role, attitude and intention towards the topic. This project focuses only on the task-oriented interactions that address the task-related aspects. Mining human interactions is important to access and understand meeting content quickly. This project proposes a mining method to extract the frequent human interaction patterns. The interaction flow is represented as a tree. Hence, tree-based mining algorithms, namely frequent interaction tree pattern mining and frequent interaction subtree pattern mining, are designed to analyze and extract frequent patterns from the constructed trees.
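A much-simplified sketch of the tree representation and of frequency counting is given below. It mines only parent/child edges rather than the full subtree patterns the paper's algorithms handle, and the interaction names and support threshold are illustrative.

```python
# Represent each meeting's interaction flow as a tree (interaction ->
# the interactions it triggered) and count recurring parent/child pairs.
from collections import Counter

meetings = [   # three toy meeting trees
    {"propose": ["comment", "ask_opinion"], "ask_opinion": ["pos_opinion"]},
    {"propose": ["comment"], "comment": ["neg_opinion"]},
    {"propose": ["comment", "acknowledge"]},
]

edge_counts = Counter()
for tree in meetings:
    for parent, children in tree.items():
        for child in children:
            edge_counts[(parent, child)] += 1

min_support = 2   # illustrative support threshold
frequent = {e for e, c in edge_counts.items() if c >= min_support}
print(frequent)  # -> {('propose', 'comment')}
```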

EMBEDDED SYSTEM BASED OILWELL HEALTH MONITORING AND CONTROLLING USING SENSOR NETWORK
T.KATHIRAVAN., MR.M.SUDHAKARAN Ganadipathy tulsis jain engineering college

Abstract A security management system is described in this project. The wireless security system adopts a two-level structure. The first level consists of some remote controllers and a launcher, which includes a wireless burglar alarm, fault alarm, power-off alarm, self-checking alarm and some wireless night patrol points. The second level consists of a wireless receiver and a wireless alarm controller. A gas sensor detects the flammable gas that generally evolves from the oil wells; when gas is detected, an exhaust fan switches on automatically to remove the particles. If the temperature becomes too high, a cooling fan is triggered to reduce or maintain the temperature in the wells. With the help of current and potential transformers we can find the fluctuations in the pumping section. If the oil level varies from the indicated level, the system gives an alert message via a voice indicator. The system also measures humidity and uses a pH sensor. All data are transmitted to and monitored on a PC in the control room.
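The threshold-and-actuate logic described above can be sketched as a simple rule table. All sensor names, units, and threshold values below are made up for illustration; the real system runs on an embedded controller rather than in Python.

```python
# Hedged sketch of the alarm logic: compare readings against thresholds
# and return the actions the controller would trigger.
THRESHOLDS = {"gas_ppm": 400, "temp_c": 60, "oil_level_cm": 30}

def check_well(readings):
    """Return the list of actions triggered by one set of readings."""
    actions = []
    if readings["gas_ppm"] > THRESHOLDS["gas_ppm"]:
        actions.append("start_exhaust_fan")     # flammable gas detected
    if readings["temp_c"] > THRESHOLDS["temp_c"]:
        actions.append("start_cooling_fan")     # over-temperature
    if readings["oil_level_cm"] < THRESHOLDS["oil_level_cm"]:
        actions.append("voice_alert_low_oil")   # oil below indicated level
    return actions

print(check_well({"gas_ppm": 520, "temp_c": 45, "oil_level_cm": 25}))
# -> ['start_exhaust_fan', 'voice_alert_low_oil']
```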

A WEIGHTED PERIODICAL PATTERN MINING AND PREDICTION FOR PERSONAL MOBILE COMMERCE
P. LAKSHMIPRIYA, M.KALIDASS Maharaja Engineering College

Abstract The development of wireless and web technologies has allowed mobile users not only to request various kinds of services from mobile devices at anytime and anywhere but also to use their mobile devices to make business transactions easily, e.g., via a digital wallet. The location of the mobile phone user is an important piece of information used during mobile commerce or m-commerce transactions. Mining and prediction of users' mobile commerce behaviors, such as their movements and purchase transactions, have been studied in data mining research. Most of the previous studies adopt a pattern mining approach. However, pattern mining needs more time to mine the frequent patterns in transaction databases when the data size increases. In this study, we propose a Weighted Sequential Pattern (WSP) and periodical pattern method for mining and prediction of the purchase behavior of mobile users, which reduces the time complexity and mines accurate results for each itemset. A performance study shows that WSP and periodical pattern mining is efficient and accurate at predicting both large and small frequent patterns, and is about an order of magnitude faster than some recently reported frequent-pattern mining methods.

A JOINT SENTIMENT TOPIC DETECTION FROM TEXT USING SEMI-SUPERVISED TACTIC
KARTHIKA .N , LINCY JANET .F, JANAKI MEENA.M., Velammal College of Engineering and Technology

Abstract Sentiment analysis, or opinion mining, is the process of detecting the subjective information in a given text. Text may include subjective information such as opinions, attitudes and feelings. Sentiment analysis also has an important potential role as an enabling technology for other systems. This paper employs two semi-supervised probabilistic approaches, the JST model and the Reverse JST model, to detect sentiment topics. The system designed in this paper classifies positive and negative labels of an online review. In JST, document-level sentiment classification is based on topic detection and topic sentiment analysis. In the JST process, sentiment labels are associated with documents, topics are generated dependent on the sentiment distribution, and words are generated conditioned on the sentiment-topic pair. In Reverse JST, the sentiment label is dependent on the topics: topics are associated with documents, sentiment labels are associated with topics, and words are associated with both topics and sentiment labels. In LDA, topics are associated with documents and words are associated with the topic distribution. JST and Reverse JST are evaluated on four different domains using the Gibbs sampling algorithm. The nature of JST makes it highly portable to other domains. We compare JST and Reverse JST with latent Dirichlet allocation, and observe that the topics and topic sentiments detected by JST are indeed coherent and informative.

A SURVEY ON TRANSACTION MAPPING ALGORITHM FOR MINING FREQUENTLY OCCURRING DATASETS
S.SURIYA, R.M.MEENAKSHI Velammal College of Engineering and Technology

Abstract An algorithm for mining complete frequent itemsets, referred to as the transaction mapping algorithm, is surveyed. In this algorithm, the transaction ids of each itemset are mapped and compressed to continuous transaction intervals in a different space, and the counting of itemsets is performed by intersecting these interval lists in a depth-first order along the lexicographic tree. When the compression coefficient becomes smaller than the average number of comparisons for interval intersection at a particular level, the algorithm switches to transaction id intersection.

DIGITAL IMAGE FORGERY DETECTION AND ESTIMATION THROUGH EXPLORING IMAGE MANIPULATIONS
MARY METILDA. J Roever Engineering College

Abstract In the modern age in which we are living, digital images play a vital role in many application areas. At the same time, image retouching techniques have also advanced, which poses a serious threat to the security of digital images. To cope with this problem, the field of digital forensics and investigation has emerged and provided some trust in digital images. The proposed technique for image authentication detects the manipulations that are done to digital images. Most image forgeries, such as copy-and-paste forgery, region duplication forgery and image splicing forgery, involve basic image operations or manipulations; if there is evidence of basic image alterations in a digital image, we can say that the image has been altered. This paper aims at detecting the basic image operations such as re-sampling (rotation, rescaling), contrast enhancement and histogram equalization, which are often applied in forged images. The available interpolation-related spectral signature method is used for detecting rotation and rescaling and for estimating parameters such as rotation angles and rescale factors. This rotation/rescaling detection method flags some unaltered images as altered when the images are JPEG compressed; we overcome that problem by adding noise to the input images. We also use an existing fingerprint detection technique for detecting contrast enhancement and histogram equalization.
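One such fingerprint can be illustrated compactly: after a contrast stretch or histogram equalization, a pixel histogram typically shows empty bins ("gaps") between occupied ones. The gap counter below is a crude, illustrative stand-in for the cited fingerprint detection technique, not the authors' method, and the pixel values are synthetic.

```python
# Count empty histogram bins between the first and last occupied bin --
# a crude indicator of contrast enhancement on integer pixel values.
def histogram_gaps(pixels, levels=256):
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    occupied = [i for i, c in enumerate(hist) if c > 0]
    # Empty bins strictly between the first and last occupied level.
    return sum(1 for i in range(occupied[0], occupied[-1]) if hist[i] == 0)

original = [10, 11, 12, 13, 14, 15, 16, 17]   # contiguous gray levels
stretched = [2 * p for p in original]         # simple contrast stretch
print(histogram_gaps(original))   # -> 0
print(histogram_gaps(stretched))  # -> 7
```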

OPASS: A USER AUTHENTICATION PROTOCOL RESISTANT TO ATTACKS AND CAPABLE OF DETECTING PHISHING WEBSITES
M.MINU SUNITHA MARY., MS.E.SALOME , Holy Cross Engineering College

Abstract Today security concerns are on the rise in all areas such as banks, governmental applications, the healthcare industry, military organizations, educational institutions, etc. Government organizations are setting standards, passing laws and forcing organizations and agencies to comply with these standards, with non-compliance being met with wide-ranging consequences. There are several issues when it comes to security concerns in these numerous and varying industries, with one common weak link being passwords. Most systems today rely on static passwords to verify the user's identity. However, such passwords come with major management and security concerns. Users tend to use easy-to-guess passwords, use the same password in multiple accounts, and write their passwords down or store them on their machines. Furthermore, hackers have the option of using many techniques to steal passwords, such as shoulder surfing, snooping, sniffing, guessing, etc. Several proper strategies for using passwords have been proposed, but they did not meet companies' security concerns. Two-factor authentication using devices such as tokens and ATM cards has been proposed to solve the password problem and has been shown to be difficult to hack. Two-factor authentication is a mechanism which implements two factors and is therefore considered stronger and more secure than the traditionally implemented one-factor authentication system. Withdrawing money from an ATM machine utilizes two-factor authentication: the user must possess the ATM card, i.e. what you have, and must know a unique personal identification number (PIN), i.e. what you know.

ENERGY EFFICIENT SENSORY DATA COLLECTION AND RECONCILING FROM DAMAGED NETWORK
MS.SRIE VIDHYA JANANI.E., NANDHINI.B Anna University, Regional Centre

Abstract Wireless sensor networks have a wide range of applications in the field of networks. The sink nodes need to communicate effectively with the other sensor nodes. Factors such as cluster size, energy and lifetime of the nodes should be considered to make the communication effective. While transmitting, the nodes are grouped in clusters with one head per cluster. The clusters nearer to the sink nodes may run out of energy due to continuous utilization, so an intermediate node, called the AGM node, is used for communication. The sensor nodes in a cluster first send their information to their cluster head (chosen on the basis of higher residual energy); the cluster head in turn sends the information to the AGM node, and the AGM transmits it to the respective sensor node and vice versa. Selection of the AGM among many nodes, and the entire process, is carried out on the basis of the maneuver algorithm, which has six phases: compact clustering, AGM selection, inter-clustering, load balancing and data distribution, communication and replenishment, and the reconcile algorithm. In this process the cluster head transmits data after eliminating redundancy in it. In case of massive damage, the reconcile algorithm is used for the efficient usage of the available AGM nodes to recover the relapsed network.

FEATURE EXTRACTION AND VERIFICATION OF SIGNATURE IMAGES
A.VAIRAMUTHU,NAVIAJOSEPH,A.RUBIYA,S.RAMYA P.S.R.Rengasamy college of engineering for women

Abstract Communication leads to the development of languages. Writing is an art which varies from person to person. A signature is one of the best ways to identify a person, though the signature of the same individual may vary with time and situation. Signature verification is very important in the field of person authentication, such as in the military, banking, etc. In order to identify and control forgeries we go for signature verification. There are three types of forgeries: random forgery, simple forgery and skilled forgery. In this paper we present a suitable and efficient method for offline signature verification with good reliability and accuracy. This method is very useful for identifying forgeries. The signature verification process includes a preprocessing stage, a feature extraction stage and a signature verification stage. The method is reliable and inexpensive, and skilled forgeries are also identified using it.

AIRBORNE INTERNET
NITHIYA.A ., MANIMEKALAI.V., National engineering college

Abstract The word on just about every Internet user's lips these days is "broadband." We have so much more data to send and download today, including audio files, video files and photos, that it's clogging our wimpy modems. Many Internet users are switching to cable modems and digital subscriber lines (DSLs) to increase their bandwidth. There's also a new type of service being developed that will take broadband into the air. Our paper explains some of the drawbacks that exist in satellite Internet and introduces the airborne Internet, called High Altitude Long Operation (HALO), which would use lightweight planes circling overhead to provide data delivery faster than a T1 line for businesses; consumers would get a connection comparable to DSL. The HALO Network will serve tens of thousands of subscribers within a super-metropolitan area by offering ubiquitous access throughout the network's signal "footprint." The HALO aircraft will carry the "hub" of a wireless network having a star topology. The initial HALO Network is expected to provide a raw bit capacity exceeding 16 Gbps. The concept of basic network connectivity could be used to connect mobile vehicles, including automobiles, trucks, trains, and even aircraft; network connectivity could be obtained between vehicles and a ground network infrastructure.

BRAIN TUMOR DETECTION AND AUTOMATIC SEGMENTATION
K.MUTHUREGA, A.NIVETHA, H.PADMAPRIYA, Dr.K.Ramasamy P.S.R.Rengasamy College of Engineering for women

Abstract Brain tumor detection and segmentation is a complicated task in MRI. The MRI shows the regular and irregular tissues, to differentiate the overlapping tissues in the brain. The automatic seed selection method has a problem when there is no tumor growth but even a small white region is present. Because the edges of the tumor are not sharp, the result is not accurate; this happens at the initial stage of the tumor. So texture-based detection is used, and the image is segmented automatically to separate regular and irregular tissue and obtain the tumor area from the unaffected area. The method used here is the seeded region growing method, implemented in MATLAB 7.8.0.347.

CLASSIFICATION MODEL FOR EARLY DISEASE DIAGNOSIS USING DATA MINING
P.PRIYANKA,I.S.JENZI, DR.P.ALLI Velammal College of Engineering and Technology

Abstract Cerebrovascular disease is a disease threatening human health seriously; it is ranked as the second leading cause of death after ischemic heart disease. Discovering and preventing cerebrovascular disease as early as possible has become critical. In clinical practice its occurrence is so abrupt and fierce that it is hard to make an early and accurate diagnosis and prediction beforehand. To overcome this, a cerebrovascular disease predictive model is constructed using classification algorithms. This work obtains data on patients including their physical exam results, blood test results and diagnosis data, with the purpose of constructing an optimum cerebrovascular disease predictive model. Three classification models are constructed using classification techniques: a Bayesian classifier, a decision tree, and a back-propagation neural network. This work focuses on providing a pre-processed dataset from which missing and unwanted values are removed. The pre-processed data are classified according to the age attribute, and features are extracted using feature extraction. The mean and standard deviation of each attribute are calculated. The attributes matching the threshold value, based on the importance of the attributes, are extracted employing the SVM algorithm. The extracted attributes are split as T1, T2 and T3 and are used for constructing the classification models. The efficiencies of the models are compared with each other, the model with the best efficiency is taken, and rules are predicted.

WIDE RANGE REPUTATION BASED ANNOUNCEMENT SCHEME FOR VANETS
P.RAJASEKAR., MRS.A.H.RAGAMATHUNISA BEGUM,National Engg College

Abstract Using mobile ad hoc networks in an automotive environment (VANET) opens up a new set of applications, such as passing information about local traffic or road conditions. This can increase traffic safety and improve mobility. One of the main advantages is the forwarding of event-related messages. Vehicular ad hoc networks (VANETs) allow vehicles to generate and broadcast messages to inform nearby vehicles about road conditions, such as traffic congestion and accidents. Neighboring vehicles can use this information, which may improve road safety and traffic efficiency, but messages generated by vehicles may not be reliable. The existing system uses an announcement scheme for VANETs based on a reputation system that allows evaluation of message reliability and improves secure and efficient reputation broadcasting in VANETs. Our proposed system extends the current scheme in such a way that a message can be utilized by vehicles in a greater area.

AGENT TRUST FOR EFFECTIVE COMMUNICATION IN INTELLIGENT TRANSPORTATION SYSTEM
S.RAMAPRIYA, S.PADMADEVI Velammal College of Engineering and Technology

Abstract An increasingly large number of cars are being equipped with global positioning system and Wi-Fi devices, enabling vehicle-to-vehicle (V2V) communication with the aim of providing road safety and increased passenger comfort. This technology creates the need for agents that assist users by intelligently processing the most relevant received information. Some of these mobile agents try to maximize their car owner's utility by sending out erroneous information. The consequence of acting on erroneous information implies the serious need to establish trust among mobile agents. The main aim of this work is to develop a model of the trustworthiness of agents in other vehicles, so as to receive the most relevant information. The challenge is to design intelligent agents that enable the sharing of information between vehicles in mobile ad hoc vehicular networks (VANETs). This work develops a multifaceted trust modeling approach that incorporates role-based trust, priority-based trust, experience-based trust and majority-based trust, and is able to restrict the number of reports received. It includes an algorithm that proposes how to integrate the various dimensions of trust, with experiments to validate the benefit of the agent approach, stressing the importance of each of the different facets. The result provides an important methodology to enable effective V2V communication via intelligent mobile agents.

CRITICAL EVENT DETECTION AND MONITORING USING NOVEL SLEEP SCHEDULING IN WSN
K.RAMYA, MR. V.SENTHIL MURUGAN, Srinivasan Engineering College

Abstract In wireless sensor networks, only a small number of packets have to be transmitted during critical event monitoring. If any critical event is detected, the alarm packet should be broadcast to the entire network as early as possible; therefore, broadcasting delay is an important problem for critical event monitoring applications. To prolong the network lifetime, sleep scheduling methods are commonly employed in WSNs, which results in a significant broadcasting delay. A novel sleep scheduling method is proposed, based on a level-by-level offset schedule, to achieve a low broadcasting delay in wireless sensor networks (WSNs). There are two phases in alarm broadcasting: first, if a node detects a critical event, it creates an alarm message and quickly transmits it to a center node along a predetermined path in a node-by-node offset way; then the center node broadcasts the alarm message to the other nodes along another predetermined path without collision. An on-demand distance vector routing protocol is established in one traffic direction for alarm transmission. The proposed system is applicable to military and forest-fire applications.
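The level-by-level offset idea can be sketched as a wake-up timetable: nodes k hops from the center wake k slots after it, so the alarm advances one level per slot instead of waiting out a full duty cycle. The topology, node names and slot length below are illustrative, not from the paper.

```python
# Assign each node a wake-up offset proportional to its hop level.
def offset_schedule(levels, slot_ms):
    """Map each node to its wake-up offset in milliseconds."""
    return {node: level * slot_ms
            for level, nodes in enumerate(levels)
            for node in nodes}

# levels[0] is the center node; levels[k] holds nodes k hops away.
levels = [["center"], ["a", "b"], ["c", "d", "e"]]
sched = offset_schedule(levels, slot_ms=20)
print(sched["center"], sched["a"], sched["e"])  # -> 0 20 40
```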

CONVOLUTION PARTIAL TRANSMIT SEQUENCE SCHEME FOR IMPROVING THE ENERGY EFFICIENCY OF A TRANSMITTED SIGNAL
L.NIRMALADEVI, D.RATHIMEENA, A.SHIFANA YASMIN, P.SURESH PANDIYARAJAN P.S.R.Rengasamy College of Engineering for women

Abstract The peak-to-average power ratio (PAPR) of an OFDM system is reduced by using the Convolution Partial Transmit Sequence (C-PTS) scheme. C-PTS requires several inverse fast Fourier transform (IFFT) operations, which increase its computational complexity. In our method the number of IFFT operations is reduced, at the cost of only a slight PAPR loss. Simulations are performed with a QPSK-modulated OFDM signal and a Saleh-model power amplifier (PA). The linearity and efficiency of the Saleh-model PA are increased by the effect of digital predistortion (DPD).
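For reference, the quantity being reduced can be computed directly. The sketch below evaluates the PAPR of one OFDM symbol with a naive inverse DFT in plain Python; the subcarrier count and the all-equal symbols are illustrative (identical symbols on every subcarrier give the worst case, PAPR = 10·log10(N) dB).

```python
# PAPR of one OFDM symbol via a naive inverse DFT (no FFT library needed).
import cmath
import math

def papr_db(freq_symbols):
    """PAPR in dB of the time-domain symbol built from the given
    frequency-domain subcarrier symbols."""
    n = len(freq_symbols)
    time_samples = [
        sum(X * cmath.exp(2j * cmath.pi * k * t / n)
            for k, X in enumerate(freq_symbols)) / n
        for t in range(n)
    ]
    powers = [abs(x) ** 2 for x in time_samples]
    return 10 * math.log10(max(powers) / (sum(powers) / n))

worst = [1 + 1j] * 8   # identical symbols on all 8 subcarriers
print(round(papr_db(worst), 2))  # -> 9.03
```

PTS-style schemes fight exactly this coherent addition by phase-rotating sub-blocks of the subcarriers before the IFFT and picking the rotation with the lowest PAPR.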

SECURE TRANSACTION IN ONLINE BANKING SYSTEM USING IB-MRSA


S.RENUGA DEVI., S.CHIDAMBARAM., V.MANIMARAN., National Engineering College,

Abstract Nowadays, as more clients use online banking, online banking systems are becoming more desirable, and a banking system must keep client information and product data secure from attackers. The goals are to maintain clients' trust and the confidentiality of their online banking services (purchasing items, checking account information, etc.), to understand how attackers compromise accounts, and to develop methods to protect them. Towards this purpose, we present a modified model to authenticate clients for online banking transactions by utilizing Identity-Based mediated RSA (IB-mRSA) in conjunction with the one-time ID concept, to increase security in online banking. The introduced system exploits a method for splitting private keys between the client and the Certification Authority (CA) server. The generated key is split between two parties: one share for the SEM (SEcurity Mediator) and one for the client. The client uses its key share in the cryptographic operation on a message, and the SEM uses its share to complete the decryption of client requests.
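The key-splitting step can be illustrated with a toy mediated-RSA example: the private exponent d is split as d = d_client + d_SEM (mod φ(n)), so neither party alone can decrypt, but their partial decryptions multiply to the plaintext. The primes below are tiny and there is no padding; real IB-mRSA uses full-size keys and identity-derived public exponents.

```python
# Toy mediated-RSA key split (illustration only; insecure parameters).
import random

p, q, e = 61, 53, 17
n, phi = p * q, (p - 1) * (q - 1)
d = pow(e, -1, phi)                  # full private exponent
d_client = random.randrange(1, phi)  # client's random share
d_sem = (d - d_client) % phi         # SEM's complementary share

msg = 42
cipher = pow(msg, e, n)
partial_sem = pow(cipher, d_sem, n)        # SEM's half-decryption
partial_client = pow(cipher, d_client, n)  # client's half-decryption
recovered = (partial_sem * partial_client) % n
print(recovered)  # -> 42
```

Because the shares only combine modulo φ(n), revoking a user is as simple as having the SEM discard its share, which is the property mediated-RSA schemes exploit.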

THE EYE MOUSE IS THE EQUIVALENT OF THE CONVENTIONAL COMPUTER MOUSE


RAJA SARATHA

Abstract: The Eye Mouse is the equivalent of the conventional computer mouse, but it is entirely controlled by eye and nose movements. This offers interesting possibilities for the study of eye movements during drawing, as well as providing a unique device that allows disabled users to operate computers. In this paper we introduce the design of a system for controlling a PC by eye movements. During the last ten years computers have become common tools of work; it is nearly impossible to exist without them in everyday life. We are witnessing the revolutionary introduction of computers and information technologies into daily practice. Healthy people use a keyboard, mouse, trackball, or touchpad for controlling the PC. However, these peripherals are usually not suitable for disabled people, who may have problems using them, for example when they suffer from myopathy or cannot move their hands after an injury. Therefore we propose a way to make it easier for disabled people to control a PC.

THREE-PORT SERIES-RESONANT DC-DC CONVERTER TO INTERFACE RENEWABLE ENERGY SOURCES WITH BIDIRECTIONAL LOAD AND ENERGY STORAGE PORTS
RESMI.S.P AND LINDA PHILIP Udaya School of Engineering

Abstract Future renewable energy systems will need to interface several energy sources, such as fuel cells and photovoltaic (PV) arrays, with the load along with battery backup. A three-port converter finds applications in such systems since it offers reduced conversion stages, a high-frequency ac link, a multiwinding transformer on a single core, and centralized control. Applications include fuel-cell systems, automobiles, and stand-alone self-sufficient residential buildings.

MOBILE DATA GATHERING USING PPS IN WIRELESS SENSOR NETWORK


MR. R.SARAVANAN V.REVATHI, Anna University of Chennai,

Abstract Energy consumption is a primary concern in a wireless sensor network. To pursue maximum energy saving at sensor nodes, a mobile collector should traverse the transmission range of each sensor in the field so that each data packet can be transmitted directly to the mobile collector without any relay. This approach, however, leads to significantly increased data-gathering latency due to the low moving velocity of the mobile collector. This paper studies the tradeoff between energy saving and data-gathering latency in mobile data gathering by exploring a balance between the relay hop count of local data aggregation and the length of the mobile collector's moving tour. We propose a polling-based mobile gathering approach and formulate it as an optimization problem, named bounded relay hop mobile data gathering. Specifically, a subset of sensors is selected as polling points that buffer locally aggregated data and upload it to the mobile collector when it arrives; when sensors are affiliated with these polling points, any packet relay is guaranteed to be bounded within a given number of hops. We then give two efficient algorithms for selecting polling points among sensors.
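The bounded-relay-hop idea can be sketched as a greedy covering heuristic (our own simplification, not one of the paper's two algorithms): repeatedly pick the node that covers the most unassigned sensors within d hops, make it a polling point, and assign those sensors to it.

```python
from collections import deque

def within_hops(adj, src, d):
    """Return the set of nodes reachable from src in at most d hops (BFS)."""
    seen = {src: 0}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        if seen[u] == d:
            continue
        for v in adj[u]:
            if v not in seen:
                seen[v] = seen[u] + 1
                queue.append(v)
    return set(seen)

def greedy_polling_points(adj, d):
    """Greedy sketch of bounded-relay-hop polling-point selection:
    pick the node covering the most unassigned sensors within d hops,
    assign them to it, and repeat until every sensor is assigned."""
    unassigned = set(adj)
    points = {}
    while unassigned:
        best = max(adj, key=lambda u: len(within_hops(adj, u, d) & unassigned))
        covered = within_hops(adj, best, d) & unassigned
        points[best] = covered
        unassigned -= covered
    return points

# Example: a 5-node line topology with relay bound d = 1 hop.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
points = greedy_polling_points(adj, 1)
assert set().union(*points.values()) == set(adj)   # every sensor is covered
```

Each sensor then relays its data to its polling point in at most d hops, and the mobile collector only visits the polling points.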

BRAIN TUMOR DETECTION AND IDENTIFICATION USING IMAGE PROCESSING AND SOFT COMPUTING
A.SAHAYASUJI G.ATHILAKSHMIVINOTHINI P.NIVETHA

Abstract In this paper, modified image segmentation techniques are applied to MRI scan images in order to detect brain tumors. For the segmentation step we analyze existing methods and propose a better algorithm for tumor detection. For the subsequent identification step, a probabilistic neural network classifier is used to distinguish tumor tissue from normal tissue.

A SECURE DATA FORWARDING IN THE CLOUD STORAGE SYSTEM BASED ON PROXY RE-ENCRYPTION
J.SHYAMALA, B.VINISHA CATHRINE ANTONUS, R.SIVASUBRA NARAYANAN HolyCross Engineering College

Abstract: Cloud computing enables highly scalable services to be easily consumed over the Internet on an as-needed basis. Cloud storage is a model of networked online storage where data is kept in virtualized pools of storage, generally hosted by third parties: hosting companies operate large data centers, and people who require their data to be hosted buy or lease storage capacity from them. Data robustness is a major requirement for storage systems, and there have been many proposals for storing data over storage servers. One way to provide robustness is to replicate a message so that each storage server stores a copy; a decentralized erasure code is more suitable for use in a distributed storage system. We construct a secure cloud storage system that supports secure data forwarding by using a proxy re-encryption scheme. The encryption scheme supports decentralized erasure codes over encrypted messages and forwarding operations over encrypted and encoded messages. Our system is highly distributed: storage servers independently encode and forward messages, and key servers independently perform partial decryption. We analyze and suggest suitable parameters for the number of copies of a message dispatched to storage servers and the number of storage servers queried by a key server; these parameters allow a more flexible tradeoff between the number of storage servers and robustness.
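The robustness idea can be illustrated with the simplest possible erasure code, a single XOR parity block. The paper's decentralized erasure code is far more general; this sketch tolerates only one lost share and uses names of our choosing:

```python
def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def encode(blocks):
    """Store k equal-size data blocks plus one XOR parity block
    (tolerates the loss of any single share)."""
    parity = blocks[0]
    for b in blocks[1:]:
        parity = xor_bytes(parity, b)
    return blocks + [parity]

def recover(shares, lost_index):
    """Rebuild the share at lost_index by XOR-ing every surviving share."""
    rebuilt = None
    for i, s in enumerate(shares):
        if i == lost_index:
            continue
        rebuilt = s if rebuilt is None else xor_bytes(rebuilt, s)
    return rebuilt

data = [b"serv", b"er_1", b"data"]        # equal-size blocks
shares = encode(data)                     # 3 data blocks + 1 parity block
assert recover(shares, 1) == b"er_1"      # lose share 1, rebuild it
```

An erasure code stores n shares of which any sufficient subset reconstructs the data, which costs far less storage than full replication for the same fault tolerance.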

IMPROVING PERFORMANCE OF VIDEO TRANSMISSION QUALITY USING ENCRYPTION AND RESOURCE ALLOCATION METHOD
M.SARANYA, R. KAVITHA Velammal College of Engineering and Technology

Abstract Visual Sensor Networks (VSNs) are recent communication infrastructures used to capture and transmit dense visual information from an application context. This paper addresses the most relevant challenges posed by VSNs, namely energy efficiency and security. Energy efficiency is one of the most challenging issues for multimedia communication in VSNs due to resource constraints and the requirements for high bandwidth and low transmission delay. Sending video data takes considerable time because a video file is much larger than a text file, so the video data is compressed before being sent to the destination. Another important factor during data transfer is security. This paper proposes joint compression and encryption to enable faster and more secure transmission of video data; the joint algorithms resolve the two major issues of energy efficiency and security when confidential video data is sent over a VSN.

ADAPTIVE COUNTERMEASURE TO PREVENT DOS ATTACKS USING AN ADAPTIVE SENSITIVE AUTHORIZATION
N.SELVAGANAPATHY G.VINOTHCHAKKARAVARTHY Velammal College of Engineering and Technology

Abstract DoS attacks aim to exhaust scarce resources by generating illegitimate requests from one or many hosts, damaging the system. To avoid this, we propose Adaptive Selective Verification (ASV), a distributed adaptive mechanism for thwarting attackers' efforts to deny service to legitimate clients. It is based on selective verification with auction-based payment and limits the network paths available to individual users: an adaptive bandwidth limit is set, with server state whose size remains small and constant regardless of the attackers' actions, and the limit is changed dynamically depending on client usage. We perform an empirical evaluation of the ASV protocol with the aim of understanding its performance against attackers in practice, and we enhance the system by adding multiple client properties for estimating the attack rate.

AN EFFICIENT CLUSTERING ALGORITHM FOR SPAM FILTERING.


P.SHARMILA, J.SHANTHALAKSHMI REVATHY Velammal College of Engineering and Technology

Abstract Clustering high-dimensional data results in overlap and loss of some data. This paper extends k-means clustering with a weight function for clustering high-dimensional data. The weight function is determined by a vector space model that converts the high-dimensional data into a vector matrix. The proposed algorithm performs fuzzy projective clustering, which finds the overlapping boundaries in various subspaces; its objective function finds the relevant dimensions by discarding irrelevant dimensions during cluster formation. We illustrate this with document clustering, taking email documents as sample datasets.

FAST PERCEPTUAL VIDEO ENCRYPTION USING RANDOM PERMUTATION ON MODIFIED DCT CO-EFFICIENTS
M.SHENBAGAVALLI., S.RAJAGOPAL., L.JERART JULUS National engineering college

Abstract Videos are generally large in volume. Video encryption, also known as video scrambling, is a powerful technique for preventing unwanted interception. In this paper a robust perceptual video encryption technique is applied by selecting one of multiple unitary transforms at the transformation stage, according to an encryption key generated by a random permutation method. Each frame is split into its RGB components, and by altering the phase angle of the encryption key the separated components of each frame undergo the unitary transform. The transformed frame contains coefficients with both high- and low-frequency components. In the first stage, the IDCT is applied to the encrypted frames, which are then combined to form the encrypted video. In the second stage, the encrypted frames are quantized and encoded; to overcome the drawbacks of Huffman coding, an adaptive arithmetic encoder is used at the coding stage, yielding the encrypted bit stream. In the third stage, decryption recovers the original video. The performance under various parameters is also analyzed. This methodology will be useful for video-based services over networks.
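The key-driven scrambling at the transformation stage can be illustrated with a keyed permutation of (notional) DCT coefficients. This is a deliberate simplification of the scheme, and the names and values are ours:

```python
import random

def permute(coeffs, key):
    """Scramble a coefficient block with a key-seeded random permutation."""
    order = list(range(len(coeffs)))
    random.Random(key).shuffle(order)     # permutation is fully determined by key
    return [coeffs[i] for i in order], order

def unpermute(scrambled, order):
    """Invert the permutation to recover the original coefficient order."""
    out = [0] * len(scrambled)
    for pos, i in enumerate(order):
        out[i] = scrambled[pos]
    return out

block = [52, -18, 7, 3, 0, 0, 1, 0]       # stand-in for 8 DCT coefficients
enc, order = permute(block, key=0xC0FFEE)
assert unpermute(enc, order) == block
```

In a real system the receiver regenerates `order` from the shared key rather than receiving it alongside the data; permuting in the transform domain scrambles the picture perceptually while preserving the coefficient statistics that the entropy coder relies on.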

A NOVEL CHANNEL ADAPTIVE ROUTING WITH HANDOVER IN MANETS


A.SIVAGAMI, L.MARISELVI, M.RENUKADEVI P.S.R.Rengasamy College of Engineering For Women

Abstract Radio link fluctuation makes packet transmission in mobile ad hoc networks difficult. To overcome this we propose a novel channel-adaptive routing protocol that reduces the impact of channel fading. The protocol selects stable links during route discovery using an average non-fading duration technique, while a handoff strategy maintains reliable connections. It thus provides a dual attack on the problem: avoiding unnecessary route discoveries, and predicting path failure leading to handoff and then bringing paths back into play when they are again available, rather than simply discarding them at the first sign of a fade.

SENSING ENVIRONMENTAL AND DISASTER USING CLOUD COMPUTING INFRASTRUCTURE


SUDHAGAR.V, MUTHU PATTAN.V Lord Jegannath College of Engineering and Technology

Abstract Remote monitoring systems are growing very rapidly thanks to the growth of supporting technologies. Problems that arise in remote monitoring include the number of objects to be monitored and how quickly, and in what volume, data must be transmitted to the data center to be processed properly. This study proposes using a cloud computing infrastructure as the processing center for remote sensing data, focusing on sensing environmental conditions and early disaster detection; both have become important issues, especially in big cities with many residents. We build a conceptual and prototype model in a comprehensive manner, from the remote terminal unit through to the method for data retrieval, and we propose using the FTR-HTTP method to guarantee delivery from remote client to server.

A MULTI-RESOLUTION FAST FILTER BANK USING CYCLOSTATIONARY FEATURE DETECTION FOR SPECTRUM SENSING IN MILITARY RADIO RECEIVER
SUNDARESAN.V REVATHI.B

Abstract Nowadays scarcity of spectrum is a major issue in wireless communication, so the spectrum must be used efficiently. This can be achieved with cognitive radio, whose major challenge is spectrum sensing. A multi-resolution fast filter bank using cyclostationary feature detection is proposed to sense various ranges of spectrum in military radio receivers, overcoming the constraint of fixed spectrum sensing. With cyclostationary feature detection, small sub-bands can also be sensed and used for LAN communications in military applications. Cyclostationary feature detection lets us classify and identify the primary signal as either a Digital Video Broadcasting-Terrestrial (DVB-T) signal or a wireless microphone signal; knowing which primary signal is present helps the cognitive radio use a fraction of the TV band when only a wireless microphone signal occupies the channel. It can also detect features of the primary signal such as double sideband, data rate, and modulation type.

IMPROVING EFFICIENCY IN MULTICASTING PROTOCOL FOR AD HOC NETWORKS


S.SURYA, SANGEETHASENTHILKUMAR Oxford Engineering College

Abstract Multicasting protocols support group communication in ad hoc networks. Receiver-based protocols are one such class, proposed as a means of allowing communication when nodes do not maintain any state information; in these protocols, receivers contend to become the next-hop router of a packet. For multicast communication, the RB Multicast protocol is used, which simply embeds a list of the multicast members' addresses in packet headers to let receivers decide the best way to forward the multicast traffic. We propose a new retransmission scheme to enhance the performance of RB Multicast for receiver-based protocols, using an effective duty-cycle assignment technique based on distance that minimizes the expected energy dissipation for a given node distance to the sink. This method achieves energy efficiency and a high packet delivery ratio even under heavy network traffic, without significantly sacrificing latency or throughput.

RESILIENT AUTHENTICATED EXECUTION OF CRITICAL APPLICATIONS IN CORRUPTED ENVIRONMENT USING VMM


S. VALARMATHI Mr. S. SATHISHKUMAR Srinivasan Engineering College

Abstract A VMM (Virtual Machine Monitor) is used to develop a resilient execution environment for a critical application, even in the presence of a corrupted OS kernel. An attacker may try to capture application content by corrupting the OS while the application is executing; previously, when the attacker corrupted the OS by injecting code, the application terminated immediately without completing. In the present system, even in the presence of corruption the application executes without interruption, providing resilient authenticated execution of critical applications in an untrusted environment. The VMM monitors all activity during execution and serves as an online recovery scheme that identifies such corruption, repairs the corrupted memory, and allows the process to continue normal execution. VMM solutions generally fall into two categories: memory authentication, which checks the integrity of an application, and memory duplication, which rectifies the corruption. The system can be applied to military applications, hospitals, and other critical applications.

IMPLEMENTATION OF EFFICIENT LIFTING BASED MULTI LEVEL 2-D DWT
R.VIJAYAMOHANARENGAN Indra Ganesan College of Engineering,

Abstract We present a modular, pipelined architecture for lifting-based multilevel 2-D DWT. A VHDL model was described and synthesized to implement the architecture, which is optimized in an efficient pipelined and parallel design to increase speed and achieve higher hardware utilization. The two-dimensional discrete wavelet transform (2-D DWT) is widely used in many image compression techniques because it can decompose signals into different sub-bands with both time and frequency information, facilitating a high compression ratio. Designing an efficient VLSI architecture to implement the DWT computation for real-time applications is therefore a challenging problem. Owing to its regular and flexible structure, the design can be extended easily to different resolution levels, and its area is independent of the length of the 1-D input sequence. Compared with other known architectures, the proposed design requires the least computing time for 1-D lifting DWT.

EVALUATION OF DATA TRANSFERRING IN MULTICORE SYSTEM
B.VINISHA CATHRINE ANTONUS, J.SHYAMALA, A.JEYAMURUGAN HolyCross Engineering College,

Abstract Receive side scaling (RSS) is a NIC technology that provides the benefits of parallel receive processing in multiprocessing environments. However, RSS lacks a critical data steering mechanism that would automatically steer incoming network data to the same core on which its application thread resides. This absence causes inefficient cache usage if an application thread is not running on the core on which RSS has scheduled the received traffic to be processed, and results in degraded performance. To remedy this limitation, Intel's Ethernet Flow Director technology was introduced; however, our analysis shows that Flow Director can cause significant packet reordering, which has various negative impacts in high-speed networks. We propose a NIC data steering mechanism, mainly targeted at TCP, to remedy the RSS and Flow Director limitations. We term a NIC with such a data steering mechanism A Transport-Friendly NIC (A-TFN). Experimental results have proven the effectiveness of A-TFN in accelerating TCP/IP performance.
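The core of receive-side steering is a flow hash that pins every packet of a TCP connection to one receive queue and core. A minimal sketch of the idea (real RSS uses a Toeplitz hash over the 4-tuple with a secret key, not SHA-256; the names here are ours):

```python
import hashlib

NUM_CORES = 4

def steer(src_ip, src_port, dst_ip, dst_port):
    """Map a TCP 4-tuple to a core index. Because the hash depends only
    on the flow identity, every packet of a flow lands on the same core,
    which preserves in-order processing and keeps the flow's state in
    that core's cache."""
    key = f"{src_ip}:{src_port}->{dst_ip}:{dst_port}".encode()
    digest = hashlib.sha256(key).digest()
    return digest[0] % NUM_CORES

core = steer("10.0.0.1", 40000, "10.0.0.2", 80)
assert core == steer("10.0.0.1", 40000, "10.0.0.2", 80)  # stable per flow
assert 0 <= core < NUM_CORES
```

The problem the paper targets is that this mapping is oblivious to where the consuming application thread actually runs; A-TFN's contribution is steering to the consumer's core without introducing the reordering that per-packet re-steering can cause.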

MEMBERSHIP MANAGEMENT IN LARGE SCALE RELIABLE STORAGE SYSTEM


K.VINOTHINI Ms. B. AMUTHA Srinivasan Engineering College

Abstract The current system has limitations in handling reconfigurations of a replica set, and lifetime membership is also difficult. We therefore maintain dynamically changing system membership in a large-scale reliable storage system with a membership service that performs automatic reconfiguration, carried out together with dBQS (database Query Service). dBQS is interesting in its own right because its storage algorithms extend existing Byzantine quorum protocols to handle changes in the replica set, and it differs from previous DHTs by providing Byzantine fault tolerance and offering strong semantics. We develop two heuristic algorithms for these problems; experimental studies show that the heuristics perform well in reducing communication cost and are close to optimal solutions.

ADAPTIVE IMAGE SEGMENTATION BASED ON HUMAN VISUAL ATTENTION


C.UMAMAHESWARI, S.ROSLIN MARY Anand Institute of Higher Technology.

Abstract One of the major high-level tasks in computer vision is object detection and recognition. The human visual system observes and understands a scene or image by making a series of fixations. Every fixation point lies inside a particular region of arbitrary shape and size in the scene, which can be an object or just part of one. Using the fixation point as an identification marker on the object, our method segments the object of interest by finding the optimal closed contour around the fixation point in polar space. The proposed segmentation framework combines visual cues in a cue-independent manner. The algorithm is well suited to an active observer capable of fixating at different locations in the scene, yet it applies to a single image: the optimal closed contour around a given fixation point is found. The framework establishes a simple feedback loop between the mid-level cues (regions) and the low-level cues (edges), and the segmentation is refined through this feedback process. Our algorithm is parameter-free, computationally efficient, and robust.

MOUSE ACTIVITY BY FACIAL EXPRESSIONS USING ENSEMBLE METHOD


ANANDHI.P MS.V.GAYATHRI., Srinivasan Engineering College

Abstract This graduation project presents an application capable of replacing the traditional mouse with the human face as a new way to interact with the computer. Facial features (nose tip and eyes) are detected and tracked in real time and their actions are used as mouse events: the coordinates and movement of the nose tip in the live video feed become the coordinates and movement of the mouse pointer on the user's screen, while left/right eye blinks fire left/right mouse click events. The only external device the user needs is a webcam that feeds the program the video stream. In the past few years technology has become more advanced and less expensive; with the availability of high-speed processors and inexpensive webcams, more and more people have become interested in real-time applications involving image processing. One of the promising fields in artificial intelligence is HCI (Human-Computer Interaction), which aims to use human features (e.g., face, hands) to interact with the computer. One way to achieve this is to capture the desired feature with a webcam and monitor its actions in order to translate them into events that communicate with the computer. In our work we aim to help people whose hand disabilities prevent them from using a mouse, by designing an application that uses facial features (nose tip and eyes) to interact with the computer.

SERVER BASED ADVANCED VANET COMMUNICATION AND APPLICATION


T.SIRON ANITA SUSAN N.SURESH Kurinji College of Engineering and Technology

Abstract Vehicular ad hoc networks (VANETs) enable vehicles to communicate with each other but require efficient and robust routing protocols to succeed. We exploit the infrastructure of roadside units (RSUs) to route packets in VANETs efficiently and reliably. Our system operates by using vehicles to carry and forward messages from a source vehicle to a nearby RSU, routing these messages through the RSU network if needed, and finally sending them from an RSU to the destination vehicle. All RSUs are interconnected with each other. In our system we implement the same vehicle communication in a server-based manner: this makes communication more effective, reduces delay with maximized throughput, and lets us predict the traffic density of a particular area of the network. This helps vehicles such as ambulances, police vehicles, and commercial users at no cost, and also supports social network applications. The RSU acts as a reporter to its particular server, and the server is responsible for maintaining the tables in its database, so communication is faster than in the existing model. We evaluate the performance of our system using the ns2 simulation platform and compare our scheme to existing solutions; the results prove the feasibility and efficiency of our scheme.

AUTHORITY DETECTION AND COMMUNICATIONS INTO DIVERSE SERVICE-ORIENTED SYSTEMS


ANITHA.S.GEORGE MRS.ANTO.D.BESANT

Abstract Web-based collaborations and processes have become essential in today's business and service environments. Such processes typically span interactions between people across globally distributed companies, and Web services and SOA are the de facto technology for implementing compositions of humans and services. The increasing complexity of compositions and the distribution of people and services require adaptive and context-aware interaction models. To support complex interaction scenarios, we introduce a mixed service-oriented system composed of both human-provided and Software-Based Services (SBSs) interacting to perform joint activities or to solve emerging problems. However, people's competencies evolve over time, requiring approaches for the automated management of actor skills, reputation, and trust. Discovering the right actor in mixed service-oriented systems is challenging due to the scale and temporary nature of collaborations. We present a novel approach addressing the need for flexible involvement of experts and knowledge workers in distributed collaborations, arguing that the automated inference of trust between members is a key factor for successful collaboration. Instead of taking a security perspective on trust, we focus on dynamic trust in collaborative networks. We discuss Human-Provided Services (HPSs), which allow experts to offer their skills and capabilities as services that can be requested on demand, and an approach for managing user preferences and network structures. Our main contribution is a context-sensitive trust-based algorithm called ExpertHITS, inspired by the concept of hubs and authorities in Web-based environments; ExpertHITS takes trust relations and link properties in social networks into account to estimate the reputation of users.
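ExpertHITS builds on Kleinberg's classical HITS iteration, reproduced below on a toy endorsement graph (the trust and link-property weighting that ExpertHITS adds is not shown):

```python
def hits(adj, iterations=50):
    """Classical HITS: a node's authority score sums the hub scores of
    nodes pointing to it; its hub score sums the authority scores of
    nodes it points to. Scores are L2-normalized each round."""
    nodes = list(adj)
    hub = {u: 1.0 for u in nodes}
    auth = {u: 1.0 for u in nodes}
    for _ in range(iterations):
        auth = {v: sum(hub[u] for u in nodes if v in adj[u]) for v in nodes}
        norm = sum(a * a for a in auth.values()) ** 0.5
        auth = {v: a / norm for v, a in auth.items()}
        hub = {u: sum(auth[v] for v in adj[u]) for u in nodes}
        norm = sum(h * h for h in hub.values()) ** 0.5
        hub = {u: h / norm for u, h in hub.items()}
    return hub, auth

# Tiny endorsement graph: users 0 and 1 both endorse user 2,
# so user 2 emerges as the top authority (the reputed expert).
adj = {0: [2], 1: [2], 2: []}
hub, auth = hits(adj)
assert max(auth, key=auth.get) == 2
```

In the ExpertHITS setting, edges would carry trust weights derived from past interactions rather than plain endorsements.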

PREVENTING INSIDER THREATS RELATED TO CLOUD COMPUTING


P. ARUL SELVAM Mani Institute of Engineering and Technology

Abstract

There is a predictive modeling framework that integrates a diverse set of data sources from the cyber domain and provides automated support for detecting high-risk behavioral "triggers" to help focus the analyst's attention and inform the analysis. Designed to be domain-independent, the system may be applied to many different threat- and warning-analysis and sense-making problems. In this paper, we identify two important areas for cloud-related insider threats: normal-user behavior analysis and policy integration. Few publicly available data sets characterize normal user behavior in relation to indicators of insider threats, much less indicators related to cloud-based insiders; efforts addressing the challenge of collecting and analyzing normal user behavior should be careful to include attributes useful for cloud-based research as well. The other problem is exploring how organizations can better manage discrepancies among cloud-based security policies. We also plan to explore how such policies could be enforced on semi-trusted and/or untrusted cloud infrastructures.

PACKET CONCEALING METHODS FOR BLOCKING FUSSY JAMMING ATTACKS

A.ARUNADEVI., S.ATHIRAYAN Pandian Saraswathi Yadav Engineering College

Abstract: Privacy is a major requirement in wireless networks. Attacking and misusing such networks can have destructive consequences, so it is necessary to integrate security to defend against misbehavior. This paper considers the problem of an attacker disrupting an encrypted wireless network through jamming. The open nature of the wireless medium leaves it vulnerable to intentional interference attacks, typically referred to as jamming, which can be used as a launch pad for mounting Denial-of-Service attacks on wireless networks. Typically, jamming has been addressed under an external threat model; however, adversaries with internal knowledge of protocol specifications and network secrets can launch low-effort jamming attacks that are difficult to detect and counter. In this work, we address the problem of selective jamming attacks in wireless networks, in which the adversary is active only for a short period of time, selectively targeting messages of high importance. To mitigate these attacks, we develop three schemes that prevent real-time packet classification by combining cryptographic primitives with physical-layer attributes. With these schemes, brute-force attacks against the encryption can be slowed down, and protection is also provided against chosen-plaintext and related-message attacks.
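One way such schemes defeat real-time classification is a commitment-style construction: the scrambled payload is transmitted first and the key is revealed only after the transmission completes, so a selective jammer cannot classify the packet while it is still on the air. A toy sketch of this idea (not the paper's exact construction; all names are ours, and the XOR keystream limits packets to 32 bytes here):

```python
import hashlib, os

def commit(packet):
    """Sender side: hide the packet behind a fresh random key.
    The blob goes out first; the key is revealed only afterwards."""
    key = os.urandom(16)
    stream = hashlib.sha256(key).digest()                # keystream from the key
    blob = bytes(b ^ s for b, s in zip(packet, stream))  # hides the contents
    tag = hashlib.sha256(key + packet).hexdigest()       # binds key to packet
    return blob, tag, key

def open_commitment(blob, tag, key):
    """Receiver side: once the key arrives, unscramble and verify."""
    stream = hashlib.sha256(key).digest()
    packet = bytes(b ^ s for b, s in zip(blob, stream))
    assert hashlib.sha256(key + packet).hexdigest() == tag
    return packet

blob, tag, key = commit(b"HELLO")
assert open_commitment(blob, tag, key) == b"HELLO"
```

By the time the jammer learns whether the packet was worth jamming, the transmission is already over.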

AN ENHANCED SERVICE COMPOSITION USING SEMANTIC BASED AUTOMATED SERVICE DISCOVERY


MR.K.BABU, MR.J.MANNAR MANNAN,Anna University,

Abstract As a greater number of Web services become available, automatic discovery is recognized as an important task. To promote the automation of service discovery, different semantic languages have been created that allow describing the functionality of services in a machine-interpretable form using Semantic Web technologies. The problem is that users do not have intimate knowledge of semantic Web service languages and related toolkits. We propose a discovery framework that enables semantic Web service discovery and composition based on an ontology framework. We describe a novel approach for automatic discovery of semantic Web services that employs LSI to match a user request, expressed in a service discovery language, with a semantic Web service description, and we present an efficient semantic matching technique to compute the semantic distance between ontological concepts. Service composition is implemented as well. Our approach to semantics-based Web service discovery involves semantic service categorization and semantic enhancement of the service request: functional-level service categorization is achieved with an ontology framework, clustering is used to classify Web services accurately by functionality, and the semantic categorization is performed offline at the Universal Description, Discovery and Integration (UDDI) registry. The semantic enhancement of the service request achieves a better match with relevant services.
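The LSI matching step can be sketched with a toy term-document matrix: truncate the SVD to k latent concepts, fold the request into that space, and rank service descriptions by cosine similarity. The matrix contents and names below are illustrative, not from the paper:

```python
import numpy as np

# Toy term-document matrix: rows = terms, columns = service descriptions.
terms = ["book", "flight", "hotel", "payment"]
A = np.array([[1, 0, 0],    # "book"    appears in service 0
              [1, 1, 0],    # "flight"  in services 0 and 1
              [0, 1, 1],    # "hotel"   in services 1 and 2
              [0, 0, 1]],   # "payment" in service 2
             dtype=float)

# LSI: keep only the k strongest latent concepts of the SVD.
k = 2
U, s, Vt = np.linalg.svd(A, full_matrices=False)
docs_k = (np.diag(s[:k]) @ Vt[:k]).T          # services in concept space

def match(query_terms):
    """Fold a request into concept space, rank services by cosine similarity."""
    q = np.array([1.0 if t in query_terms else 0.0 for t in terms])
    q_k = q @ U[:, :k]
    sims = docs_k @ q_k / (np.linalg.norm(docs_k, axis=1) * np.linalg.norm(q_k))
    return int(np.argmax(sims))

assert match({"flight", "book"}) == 0
```

The truncation is what lets LSI match a request to a description even when they share no literal terms, because co-occurring terms collapse onto the same latent concept.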

SECURE AND ROBUST IRIS RECOGNITION USING POSSIBILISTIC FUZZY MATCHING ON LOCAL FEATURES
S. DIVYA MRS.C.AKILA Anna University, Regional Center

Abstract Non-contact biometrics such as face and iris have additional benefits over contact-based biometrics such as fingerprint and hand geometry. However, three important challenges must be addressed in a non-contact biometric authentication system: handling unconstrained acquisition, robust and accurate matching, and privacy enhancement without compromising security. In this paper, a novel possibilistic fuzzy matching strategy with invariant properties is proposed, providing a robust and effective matching scheme for two sets of iris feature points. In addition, a nonlinear normalization model is adopted to provide more accurate positions before matching, and an effective iris segmentation method is proposed to refine the detected inner and outer boundaries into smooth curves. For feature extraction, Gabor filters detect local feature points in the segmented iris image in the Cartesian coordinate system and generate a rotation-invariant descriptor for each detected point. The proposed matching algorithm then computes a similarity score for two sets of feature points from a pair of iris images. The approach also enhances privacy and security by providing cancelable iris templates. Results on public datasets show significant benefits of the proposed approach.

ENSURING AUTHENTICATION OF CLOUD INFORMATION USING JAR
FEMILA.D, GOLD BEULAH PATTUROSE. Holy Cross Engineering College

Abstract Information housed in the cloud is often seen as valuable to individuals with malicious intent. People store a great deal of personal information and potentially sensitive data on their computers, and this information is now being transferred to the cloud. This makes it critical to understand the security measures the cloud provider has in place, and equally important to take personal precautions to secure one's data. To provide security, a new highly decentralized cloud information accountability framework is used to keep track of the actual usage of users' data in the cloud. The programmable capabilities of Java Archive (JAR) files ensure that any access to users' data triggers authentication and automated logging. To strengthen users' control, a distributed auditing mechanism is used, operating in two modes: push and pull.
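The JAR's access-triggered logging can be sketched as a wrapper that checks authorization on every read and appends the outcome to an audit trail. This is a plain-Python stand-in for the framework's JAR logic, with class and field names of our choosing:

```python
import datetime

class LoggedData:
    """Wraps a data item so that every read attempt is authenticated and
    recorded in an audit log that the owner can later push or pull."""
    def __init__(self, owner, payload, authorized):
        self.owner = owner
        self._payload = payload
        self._authorized = set(authorized)
        self.log = []          # audit trail: (timestamp, user, action, outcome)

    def read(self, user):
        ok = user in self._authorized
        stamp = datetime.datetime.now(datetime.timezone.utc).isoformat()
        self.log.append((stamp, user, "read", "granted" if ok else "denied"))
        if not ok:
            raise PermissionError(f"{user} is not authorized")
        return self._payload

item = LoggedData("alice", b"secret-report", authorized={"alice", "bob"})
assert item.read("bob") == b"secret-report"
try:
    item.read("eve")            # unauthorized access is logged, then refused
except PermissionError:
    pass
assert [entry[3] for entry in item.log] == ["granted", "denied"]
```

In the actual framework the wrapper travels with the data inside a JAR, so logging happens wherever the data goes, not only on the owner's machine.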

ACRN: ADAPTIVE CONTROL CHANNEL FOR ADHOC COGNITIVE RADIO NETWORKS


M.GANGADHARAN Mr.R.MUTHU KUMAR National Engineering College

Abstract Cognitive radio (CR) is an emerging technology for accessing the spectrum dynamically that can flexibly and efficiently achieve open spectrum sharing. A CR system is an intelligent wireless communication system that is aware of incoming signals. Cognitive radio ad hoc networks require reliable communication and the exchange of spectrum-management information between neighboring nodes, for which a common control channel (CCC) is usually used. This paper presents a CCC selection protocol, DCP, implemented in a distributed way according to the appearance patterns of primary systems and the connectivity among nodes. The proposed system minimizes the possibility of CCC disruption by primary-user activities, maximizes node connectivity, and reduces both the frequency with which control channels are changed and cluster reformation.

IMPROVEMENT OF SOURCE AND MESSAGE AUTHENTICATION USING TAM PROTOCOL FOR ADHOC NETWORKS
M.KARTHIK, P.IMMANUEL VINOTH, M.E., J.GOLD BEULAH PATTUROSE, Holy Cross Engineering College

Abstract Multicast streams are the dominant application traffic pattern in many mission-critical ad hoc networks. The limited computation and communication resources, the large-scale deployment and the unguaranteed connectivity to trusted authorities make known security solutions for wired and single-hop wireless networks inappropriate for such an application environment. This paper promotes a novel Tiered Authentication scheme for Multicast traffic (TAM) for large-scale dense ad hoc networks. Nodes are grouped into clusters. Multicast traffic within the same cluster employs one-way chains in order to authenticate the message source. Cross-cluster multicast traffic includes message authentication codes (MACs) that are based on a set of keys. Each cluster uses a unique subset of keys to look for its distinct combination of valid MACs in the message in order to authenticate the source. TAM combines the advantages of the secret-information-asymmetry and time-asymmetry paradigms and exploits network clustering to reduce overhead and ensure scalability. The numerical and analytical results demonstrate the performance advantage of TAM in terms of bandwidth overhead and delivery delay.
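As a rough illustration of the one-way chain idea the abstract relies on (a minimal sketch, not the authors' TAM implementation; seed, chain length and hash choice are assumptions):

```python
import hashlib

def make_chain(seed: bytes, n: int) -> list:
    """Build a one-way hash chain; the last element is published as a
    commitment, and earlier elements are disclosed in reverse order."""
    chain = [seed]
    for _ in range(n):
        chain.append(hashlib.sha256(chain[-1]).digest())
    return chain

def verify(disclosed: bytes, commitment: bytes, max_steps: int) -> bool:
    """A receiver re-hashes a disclosed key until it reaches the commitment;
    forging a valid key would require inverting the hash."""
    h = disclosed
    for _ in range(max_steps):
        h = hashlib.sha256(h).digest()
        if h == commitment:
            return True
    return False

chain = make_chain(b"secret-seed", 10)       # chain[-1] is public
assert verify(chain[7], chain[-1], 10)       # genuine key, 3 hashes below anchor
assert not verify(b"forged", chain[-1], 10)  # forgery is rejected
```

Disclosing chain elements in reverse order is what gives the scheme its source-authentication property: each disclosed key validates against the previously known one.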

MINING WEB GRAPHS FOR SOCIAL RECOMMENDATIONS


L.JENITHA MARYROSELIN MARY Anand Institute of Technology

Abstract The diverse and rapidly growing body of web information makes it critical to organize and utilize the information efficiently and effectively. User-generated information is more freestyle and less structured, which increases the difficulty of mining useful information from these data sources. Recommender systems are used to satisfy the information needs of Web users and improve the user experience in many Web applications. These recommendation systems are based on collaborative filtering, a technique that automatically predicts the interest of an active user by collecting rating information from other similar users or items. The web recommendation process faces three challenges. The first is that it is not easy to recommend latent semantically relevant results to users. The second is how to take the personalization feature into account. The last is that it is time-consuming and inefficient to design different recommendation algorithms for different recommendation tasks. An efficient diffusion graph method is proposed to overcome these challenges. The diffusion graph method can work with directed and undirected graphs and achieves a 30% efficiency improvement over the existing system.
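A graph-diffusion recommender of the kind the abstract describes can be sketched as a personalized random walk with restart over a small user-item graph (illustrative only; the graph, restart weight `alpha` and iteration count are assumptions, not the paper's algorithm):

```python
import numpy as np

def diffusion_rank(adj, seed, alpha=0.85, iters=50):
    """Diffuse relevance mass from a seed node over the graph
    (personalized PageRank-style random walk with restart)."""
    A = np.asarray(adj, dtype=float)
    col_sums = A.sum(axis=0)
    P = A / np.where(col_sums == 0, 1, col_sums)   # column-stochastic walk matrix
    r = np.zeros(len(A)); r[seed] = 1.0            # restart vector at the active user
    x = r.copy()
    for _ in range(iters):
        x = alpha * P @ x + (1 - alpha) * r        # power iteration
    return x

# tiny undirected graph: node 0 = active user, nodes 1-2 = items, node 3 = similar user
adj = [[0, 1, 1, 0],
       [1, 0, 0, 1],
       [1, 0, 0, 1],
       [0, 1, 1, 0]]
scores = diffusion_rank(adj, seed=0)
assert scores[0] == max(scores)   # restart mass keeps the seed highest
```

Items are then ranked for the active user by their diffusion scores, which naturally blends graph structure with personalization.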

DESIGN AND IMPLEMENTATION OF T RUST- AWARE ROUTING FRAMEWORK FOR WSN


JAYA MARIYAPAN

Abstract: The multi-hop routing in wireless sensor networks (WSNs) offers little protection against identity deception through replaying routing information. An adversary can exploit this defect to launch various harmful or even devastating attacks against the routing protocols, including sinkhole attacks, wormhole attacks and Sybil attacks. The situation is further aggravated by mobile and harsh network conditions. To secure WSNs against adversaries misdirecting the multi-hop routing, we have designed and implemented TARF, a robust trust-aware routing framework for dynamic WSNs. Without tight time synchronization or known geographic information, TARF provides trustworthy and energy-efficient routes. Most importantly, TARF proves effective against those harmful attacks developed out of identity deception; the resilience of TARF is verified through extensive evaluation with both simulation and empirical experiments on large-scale WSNs under various scenarios, including mobile and RF-shielding network conditions. Further, we have implemented a low-overhead TARF module in TinyOS; as demonstrated, this implementation can be incorporated into existing routing protocols with the least effort. Based on TARF, we also demonstrated a proof-of-concept mobile target detection application that functions well against an anti-detection mechanism.

MVEE DEFENCE AGAINST CODE INJECTION ATTACKS
JOHN SAMUEL B, BALA MURUGAN, Mohamed Sathak Engineering College

Abstract The growth of interconnected computers increases the number and complexity of attacks, so computer systems need appropriate security mechanisms. Intrusion detection and prevention systems play an important role in detecting and preventing attacks before they can compromise software. A multi-variant execution environment (MVEE) is an intrusion detection and prevention mechanism that executes several slightly different versions of the same program, called variants, concurrently. At least two (n >= 2) variants are run, each containing the same operational units as the original program. The variants are built to have indistinguishable behavior under normal execution conditions, but if any variant comes under attack, there are noticeable divergences in their execution behavior. A monitor compares the behavior of the variants at specific synchronization points and raises an alarm when a divergence is detected. Here a monitoring mechanism that runs in user space to supervise the variants is presented.

FORENSIC HASH COMPONENT FOR IMAGE REGISTRATION AND TAMPERING DETECTION USING SIFT FEATURE
LATHARANI D, NAGALINGA RAJAN A, Infant Jesus College of Engineering

Abstract

The trustworthiness of photographs has an essential role in many areas, including forensic investigation, criminal investigation, medical imaging, and journalism. But in today's digital age, it is possible to alter the information very easily, and one of the main problems is the authentication of the image received in a communication. This paper proposes a robust alignment method which makes use of an image hash component based on the Bag of Features paradigm. The forensic hash component is a short signature attached to an image before transmission, acting as side information for analyzing the processing history and trustworthiness of the received image. The estimator is based on a voting procedure, and SIFT and block-based features are used to detect and localize image tampering. Experiments show that the proposed approach obtains a significant margin in terms of registration accuracy, discriminative performance and tampering detection.

RECORD LINKAGE AND DE-DUPLICATION USING FEBRL BASED INDEXING


K.MALA, S.CHINNADURAI Srinivasan Engineering college,

Abstract Record linkage is the problem of identifying similar records across different data sources. The similarity between two records is defined based on domain-specific similarity functions over several attributes. De-duplicating one data set or linking several data sets are increasingly important tasks in the data-preparation steps of many data mining projects. The aim is to match all records relating to the same entity. Different measures have been used to characterize the quality and complexity of data linkage algorithms, and several new metrics have been proposed. An overview of the issues involved in measuring data linkage and de-duplication quality and complexity is presented. A matching tree is used to overcome communication overhead and to give the same matching decision as the conventional linkage technique. New indexing techniques for scalable record linkage and de-duplication are developed within the Febrl framework, along with an investigation of learning techniques for efficient and accurate indexing.
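The indexing (blocking) step the abstract refers to can be illustrated with a minimal sketch: records are grouped by a cheap blocking key, and only records sharing a block are compared with a similarity function (the key and the character-set Jaccard measure are illustrative assumptions, not Febrl's actual indexing methods):

```python
from itertools import combinations

def blocking_key(rec):
    """Cheap blocking key: first letter of surname + birth year."""
    return rec["surname"][:1].lower() + str(rec["year"])

def jaccard(a, b):
    """Character-set Jaccard similarity, a stand-in for a domain-specific measure."""
    sa, sb = set(a.lower()), set(b.lower())
    return len(sa & sb) / len(sa | sb)

def candidate_pairs(records):
    """Index records into blocks; compare only records sharing a block key."""
    blocks = {}
    for i, r in enumerate(records):
        blocks.setdefault(blocking_key(r), []).append(i)
    for ids in blocks.values():
        yield from combinations(ids, 2)

records = [
    {"surname": "Smith", "year": 1980},
    {"surname": "Smyth", "year": 1980},
    {"surname": "Jones", "year": 1975},
]
pairs = list(candidate_pairs(records))
assert pairs == [(0, 1)]   # blocking prunes the Smith/Jones comparison entirely
matches = [(i, j) for i, j in pairs
           if jaccard(records[i]["surname"], records[j]["surname"]) > 0.5]
assert matches == [(0, 1)]
```

Blocking reduces the quadratic comparison space to within-block pairs, which is exactly what makes the linkage scalable.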

PROVIDING SECURE AND SCALABLE ACCESS CONTROL OF OUTSOURCED DATA IN CLOUD COMPUTING
T.MANIMARAN, S. PACKIYA RAJKUMAR, Pandian Saraswathi Yadav Engineering College

Abstract:

The mainstay of the project is to secure the data stored on the cloud from unauthorized users and to provide access control over outsourced data. Cloud computing has emerged as one of the most influential paradigms in the IT industry in recent years. Since this new computing technology requires users to entrust their valuable data to cloud providers, there have been increasing security and privacy concerns about outsourced data. Several schemes have been proposed for access control of outsourced data in cloud computing. Most of the proposed schemes employ attribute-based encryption (ABE) for access control, but they lack flexibility and scalability. To overcome this, we introduce an encryption scheme based on a hierarchical structure of users. This scheme provides an efficient key-management mechanism for distributing decryption keys to authorized users; it remains scalable as the number of users grows and is flexible enough to revoke the keys of a previously legitimate user. The main advantage of this scheme is that the data owner need not always be online. Users without a decryption key cannot view the content of a file, and the file is secured from unauthorized access.
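The hierarchical key-management idea can be sketched with hash-based key derivation: a user holding an ancestor key can derive every descendant key, while the one-way hash prevents climbing back up the hierarchy (an illustrative assumption; the paper's actual construction is not reproduced here):

```python
import hashlib

def derive_key(parent_key: bytes, child_label: str) -> bytes:
    """A node's key is derived from its parent's key and its label; the
    one-way hash makes the derivation irreversible."""
    return hashlib.sha256(parent_key + child_label.encode()).digest()

root = b"data-owner-master-key"          # held only by the data owner
dept = derive_key(root, "dept/finance")  # given to the finance department
team = derive_key(dept, "team/audit")    # derivable by any finance key holder

# an authorized finance user can reach the audit team's key...
assert derive_key(derive_key(root, "dept/finance"), "team/audit") == team
# ...but a sibling department derives an unrelated key
assert derive_key(root, "dept/hr") != dept
```

Because keys are derived on demand, the owner only distributes one key per subtree, which is what keeps key management scalable as the user population grows.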

DESIGN AND IMPLEMENTATION OF MICROSTRIP LINE STEPPED IMPEDANCE LOW PASS FILTER
S.MANIPRIYA, B.HEMALATHA, Sri Venkateswara College of Engineering

Abstract Filters play an important role in microwave applications, and microstrip filters play various roles in wireless and mobile communication systems. There is an increasing demand for newer microwave and millimeter-wave systems to meet emerging telecommunication challenges with respect to size, performance and cost. A microstrip stepped-impedance low-pass filter is designed for low cost, low insertion loss and low return loss using a microstrip layout. The filter is a sixth-order Butterworth low-pass filter operating at 1.5 GHz on an FR4 substrate with permittivity 4.4, substrate thickness 1.6 mm and loss tangent 0.02, and is used in GPS applications. Microstrip technology is used for its simplicity and ease of fabrication. The design and simulation are performed using the Agilent ADS (Advanced Design System) tool to plot the insertion and return loss of the low-pass filter.
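The starting point of such a design, the normalized Butterworth prototype element values for the sixth-order filter, can be computed from the standard closed form g_k = 2 sin((2k-1)π/2n); the stepped-impedance section lengths below use assumed realizable line impedances and a ladder starting with a series inductor (a design sketch, not the authors' exact layout):

```python
import math

def butterworth_g(n):
    """Normalized Butterworth low-pass prototype values g1..gn (g_{n+1} = 1)."""
    return [2 * math.sin((2 * k - 1) * math.pi / (2 * n)) for k in range(1, n + 1)]

g = butterworth_g(6)   # sixth order, as in the abstract
print([round(v, 4) for v in g])   # [0.5176, 1.4142, 1.9319, 1.9319, 1.4142, 0.5176]

# Stepped-impedance approximation: electrical length (degrees) of each section.
# Z_low and Z_high are assumed realizable low/high microstrip impedances.
Z0, Z_low, Z_high = 50.0, 20.0, 120.0
lengths_deg = [math.degrees(gk * Z0 / Z_high) if k % 2 == 0   # series L -> high-Z line
               else math.degrees(gk * Z_low / Z0)             # shunt C -> low-Z line
               for k, gk in enumerate(g)]
```

The physical section lengths then follow from the guided wavelength on FR4 at 1.5 GHz; that substrate-dependent step is left to the ADS layout.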

CLASSIFICATION OF CHROMOSOMES USING FEED FORWARD BACK PROPAGATION ALGORITHM


RAJALAKSHMI M, SBC Engineering College

Abstract Karyotyping is a common method in cytogenetics. Automatic classification of the chromosomes within microscopic images is the first step in designing an automatic karyotyping system. This is a difficult task, especially if the chromosome is highly curved within the image. The main aim of this paper is to define a new group of features for better representation and classification of chromosomes. This paper proposes classification and analysis of human chromosomes in the following steps: (i) image-processing utilities and filters are used to remove noise; (ii) the filtered image is passed to a segmentation algorithm to segment the image; (iii) the segments then enter two tracks for classifying chromosomes. The first track uses image processing to measure the length of the chromosomes, while the second initiates a feed-forward neural network trained by means of the back-propagation algorithm. Using the feed-forward neural network and back-propagation algorithm, the width, position and average intensity of each chromosome are determined. The back-propagation algorithm achieves high accuracy with minimum training time, which makes it suitable for real-time chromosome classification in the laboratory. In our paper, segmentation is done using image processing, and classification is done using a feed-forward neural network with the back-propagation algorithm.
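A feed-forward network trained by back-propagation, as used in the second track, can be sketched in a few lines of NumPy on synthetic data (the feature dimensions, hidden size and labels here are illustrative assumptions, not the paper's chromosome data):

```python
import numpy as np

rng = np.random.default_rng(0)

# two-layer network: 4 inputs (e.g. length, width, position, mean intensity)
# -> 8 sigmoid hidden units -> 2 output classes
W1 = rng.normal(0, 0.5, (4, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 2)); b2 = np.zeros(2)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

X = rng.normal(0, 1, (40, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(int)   # synthetic, separable labels
T = np.eye(2)[y]                          # one-hot targets

for _ in range(500):                      # plain gradient descent
    H = sigmoid(X @ W1 + b1)              # forward pass
    O = sigmoid(H @ W2 + b2)
    dO = (O - T) * O * (1 - O)            # backprop through output sigmoid
    dH = (dO @ W2.T) * H * (1 - H)        # backprop through hidden sigmoid
    W2 -= 0.5 * H.T @ dO / len(X); b2 -= 0.5 * dO.mean(0)
    W1 -= 0.5 * X.T @ dH / len(X); b1 -= 0.5 * dH.mean(0)

acc = (O.argmax(1) == y).mean()
assert acc > 0.6   # the network learns the synthetic separation
```

In the paper's setting the inputs would be the extracted chromosome features and the outputs the chromosome classes; only the training rule is the same.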

DATA MASKING IN MPEG VIDEO FILES WITH MESSAGE EXTRACTION ACCURACY


R.ELAKIYA P.HEMALATHA Anand Institute of Higher Technology

Abstract An automated scheme for information hiding and message prediction is proposed. In existing methods, the message prediction rate is low and audio loss occurs while embedding data in video. Two data-hiding methods are introduced to improve the quality of the video with a high message prediction rate. To avoid audio loss, the FFmpeg media tool is used to split the audio and video. An image is taken at random from the video and divided into n chunks; the message is converted into ASCII format, transformed into binary bits, and these bits are stored in each pixel's RGB value. The first method, multivariate regression, hides one bit in each pixel; the second, flexible macroblock ordering, hides three bits in each pixel. Finally, the audio and video are recombined and sent to the receiver. The two methods are compared: multivariate regression provides higher video quality than flexible macroblock ordering with exact message prediction, and both are examined visually against existing methods to compare performance. The scheme improves efficiency with respect to quality distortion, message prediction, channel bit errors and packet losses; the video quality of the proposed work can be enhanced by up to 90%.
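The ASCII-to-bits embedding pipeline described above can be illustrated with a minimal least-significant-bit sketch over a list of RGB pixels (a generic stand-in; the paper's multivariate-regression and flexible-macroblock-ordering methods are not reproduced here):

```python
def to_bits(message: str) -> list:
    """ASCII message -> stream of 8-bit binary digits, as the abstract describes."""
    return [int(b) for ch in message for b in format(ord(ch), "08b")]

def embed(pixels, bits):
    """Hide one bit per pixel in the least significant bit of the blue channel."""
    out = []
    for (r, g, b), bit in zip(pixels, bits):
        out.append((r, g, (b & ~1) | bit))
    return out + pixels[len(bits):]          # untouched pixels pass through

def extract(pixels, n_bits):
    """Recover the LSB stream and regroup it into ASCII characters."""
    bits = [p[2] & 1 for p in pixels[:n_bits]]
    return "".join(chr(int("".join(map(str, bits[i:i + 8])), 2))
                   for i in range(0, n_bits, 8))

frame = [(10, 20, 30)] * 64                  # dummy 'video frame' of pixels
stego = embed(frame, to_bits("Hi"))
assert extract(stego, 16) == "Hi"
assert all(abs(a[2] - b[2]) <= 1 for a, b in zip(frame, stego))  # <= 1 LSB distortion
```

Keeping the per-pixel change within one least significant bit is what bounds the quality distortion the abstract measures.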

ENABLING CLOUD COMPUTING SECURITY FROM SINGLE TO MULTI CLOUDS

K.A MOHAMED RIYAZUDEEN

Abstract The use of cloud computing has increased rapidly in many organizations. Cloud computing provides many benefits in terms of low cost and accessibility of data. Ensuring the security of cloud computing is a major factor in the cloud computing environment, as users often store sensitive information with cloud storage providers, but these providers may be untrusted. Dealing with single cloud providers is predicted to become less popular with customers due to the risks of service availability failure and the possibility of malicious insiders in the single cloud. A movement towards multi-clouds, in other words "interclouds" or "cloud-of-clouds", has emerged recently. This paper surveys recent research related to single- and multi-cloud security and addresses possible solutions. It is found that research into the use of multi-cloud providers to maintain security has received less attention from the research community than the use of single clouds. This work aims to promote the use of multi-clouds due to their ability to reduce the security risks that affect cloud computing users.

TOUCH SCREEN TECHNOLOGY

K.Revathi, P.Vinotha, Kalasalingam Institute of Technology

Abstract This document gives details about touch-screen technology: its history, its construction and its usage. The various technologies used for making touch screens are also described. Finally, it adds details of the operating systems involved.

DESIGN AND IMPLEMENTATION OF BI-QUAD ANTENNA WITH PARABOLIC REFLECTOR FOR ENHANCING THE COVERAGE AREA OF A Wi-Fi ACCESS POINT

R.Kanchana, S.Saraswathy, R.Ruban Thomas, Vel Tech Multi Tech Engg College

Abstract "The next decade will be the Wireless Era" (Intel executive Sean Maloney). Today's networks, especially LANs, have drastically changed: people expect not to be bound to the network. In this scenario, wireless LAN (WLAN) offers tangible benefits over traditional wired networking. Wi-Fi (Wireless Fidelity) is a generic term that refers to the IEEE 802.11 communications standard for Wireless Local Area Networks (WLANs). Wi-Fi works in three modes, namely ad hoc, infrastructure and extended. An ad hoc network is a peer-to-peer mode and does not use any intermediary device such as an access point (AP). Infrastructure and extended modes use an AP as the interface between wireless clients; the wireless network is formed by connecting all the wireless clients to the AP. A single access point can support up to 30 users and can function within a range of 100-150 feet indoors and up to 300 feet outdoors. The coverage area depends on where the AP is placed. The AP has a traditional omnidirectional antenna. The aim of this project is to increase the coverage area of an AP by replacing the traditional omnidirectional antenna with a bi-quad antenna with a parabolic reflector.

MACHINE LEARNING APPROACH FOR MEDICAL LANGUAGE PROCESSING
SABARI ANANDHA RAJA C, SASIDHARAN G, VIKNESH R, Krishnasamy College of Engineering & Technology

Abstract The machine learning (ML) field has gained momentum in almost every domain of research and has recently become a reliable tool in the medical domain. The empirical domain of automatic learning is used in tasks such as medical decision support, medical imaging, protein-protein interaction, extraction of medical knowledge, and overall patient management care. ML is envisioned as a tool by which computer-based systems can be integrated in the healthcare field in order to provide better, more efficient medical care. This paper describes an ML-based methodology for building an application that is capable of identifying and disseminating healthcare information. It extracts sentences from published medical papers that mention diseases and treatments, and identifies the semantic relations that exist between diseases and treatments. Our evaluation results for these tasks show that the proposed methodology obtains reliable outcomes that could be integrated in an application to be used in the medical care domain. The potential value of this paper stands in the ML settings that we propose and in the fact that we outperform previous results on the same data set.

CRYPTOGRAPHY: A NETWORK SECURITY MEASURE
N.SUBAS CHANDRAN, T.SATHYARAJ, Nandha College of Technology

Abstract In today's world, data transmission means everything; it can decide the future of a person or of a whole nation. In other words, whoever holds the data rules the world. All systems are being computerized, and computers appear in every field of human work. As the need for computers increases, there is an equal increase in computer crime. Securing the data in a computer from illegal access should be the most important task for anyone who owns it. Physical security can protect data stored in a computer system, and software is available to protect that data, but it becomes hard to protect data while it is being transferred. Cryptography comes into play here, yet most cryptographic methods are almost completely breakable using frequency analysis and other techniques. Because the proposed technique applies encryption at the compiler level, it becomes relatively hard to crack; moreover, unlike many encryption techniques, it encodes the data dynamically, making it harder to break.

DETECTION OF OCCUPIED AND AVAILABLE SPACES IN A CAR PARKING SYSTEM USING HAAR-LIKE FEATURES
P.K. RAMYA DEVI SRI, M.SARANYA, V.SERMA SELVA NARMADHA, L.VALLI@ANUSHYA Jayaraj Annapackiam CSI College Of Engineering

Abstract This paper describes an approach to monitoring and managing a parking area using a vision-based parking system. With the rapid increase in cars, finding available parking space efficiently, to avoid traffic congestion in a parking area, is becoming a necessity in car-park management. Current car-park management depends on either human personnel keeping track of the available spaces or a sensor-based system that monitors the availability of each space or the overall number of available spaces. In both situations, the only information available is the total number of spaces, not their actual locations. In addition, the installation and maintenance cost of a sensor-based system depends on the number of sensors used. This paper presents a vision-based system that can detect and indicate the available parking spaces in a car park using Haar-like features. The methods used to detect available spaces are based on coordinates that indicate the regions of interest and on a car classifier. The initial work reported here has an accuracy ranging from 90% to 100% for a four-space car park, indicating that a vision-based car-park management system would be able to detect and indicate the available spaces.
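A Haar-like feature, the building block of such a car classifier, is just a difference of rectangular pixel sums, computed in constant time from an integral image; a minimal sketch (illustrative values, not the paper's trained classifier):

```python
def integral_image(img):
    """Summed-area table: sat[y][x] = sum of img[0..y][0..x]."""
    h, w = len(img), len(img[0])
    sat = [[0] * w for _ in range(h)]
    for y in range(h):
        run = 0
        for x in range(w):
            run += img[y][x]
            sat[y][x] = run + (sat[y - 1][x] if y else 0)
    return sat

def rect_sum(sat, x0, y0, x1, y1):
    """Pixel sum over the inclusive rectangle, in O(1) via four lookups."""
    s = sat[y1][x1]
    if x0: s -= sat[y1][x0 - 1]
    if y0: s -= sat[y0 - 1][x1]
    if x0 and y0: s += sat[y0 - 1][x0 - 1]
    return s

def haar_edge(sat, x, y, w, h):
    """Two-rectangle (left minus right) Haar-like edge feature."""
    half = w // 2
    left = rect_sum(sat, x, y, x + half - 1, y + h - 1)
    right = rect_sum(sat, x + half, y, x + w - 1, y + h - 1)
    return left - right

# dark object on a bright background: left half intensity 1, right half 9
img = [[1, 1, 9, 9]] * 4
sat = integral_image(img)
assert haar_edge(sat, 0, 0, 4, 4) == -64   # strong edge response
```

A cascade of many such features, evaluated over the regions of interest that mark each parking stall, is what decides whether a car is present.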

TEXTURE CLASSIFICATION BASED ON NEURAL NETWORK

Sheeba Thankachan, S. Nageswari

Abstract As a newly developed 2-D extension of the wavelet transform using multiscale and directional filter banks, the contourlet transform can effectively capture the intrinsic geometric structures and smooth contours of a texture image, which are the dominant features for texture classification. In this paper, I propose a neural network classifier, which works similarly to the human neural system, and use it to classify the texture category. Textures are classified with a neural network trained on contourlet features, and the classification result is enhanced using a feed-forward back-propagation neural network. A two-layer feed-forward network with sigmoid hidden and output neurons can classify vectors arbitrarily well, given enough neurons in its hidden layer. The network is trained with the scaled conjugate gradient back-propagation algorithm.

REMOTE ACCESS & DISPLAY FOR CLIENT LEVERAGING MOBILE COMPUTING


P.SUBASHINI, SUGANYA, R.RUTHRA, M.SUJITHA, A.V.C College of Engineering

Abstract:

This project shows how a PC can be controlled from a remote place using a smartphone over the Internet: the PC's screen is displayed on the mobile device, which turns the phone into a wireless keyboard and mouse with a touchpad over your own wireless network. The application runs on Android-based mobiles and requires a server application on the computer, a device running the Android operating system, and some sort of wireless connection between them. By obtaining the IP address of the PC and browsing to it on the mobile phone, the PC screen can be accessed on the mobile. The system supports web applications with a database for storing web pages; the mobile application retrieves the required data at set time intervals by connecting to the web server. You can also view your phone's screen on your computer monitor, which is useful for placing Android notifications alongside the other notification boxes on your monitor, using it as an on-monitor caller ID, and taking screenshots and screencasts. Remote keyboard/mouse control is convenient for entering data without pecking at the tiny on-screen keyboard.

INTELLIGENT TECHNIQUE BASED MULTICONSTRAINED QOS ROUTING FOR MANETS

M.Surya, M.Sundarambal, V.Jayanthi, Coimbatore Institute of Technology

Abstract A mobile ad hoc network (MANET) is a collection of mobile hosts that can communicate with the aid of intermediate mobile hosts, without a fixed infrastructure or centralized administration. Quality of Service (QoS) routing in MANETs has become an important issue due to the emergence of multimedia services. An optimal path is found by considering multiple QoS parameters, namely bandwidth, delay, energy and number of hops. Simulation is carried out to measure the achieved throughput and delay.
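One simple way to combine such constraints, shown here as an illustrative sketch rather than the paper's intelligent technique, is to prune links that violate the bandwidth constraint and then run a shortest-delay search, checking the delay bound on the result (the graph and constraint values are assumptions):

```python
import heapq

def qos_route(graph, src, dst, min_bw, max_delay):
    """Prune links below the bandwidth constraint, then Dijkstra on delay;
    a returned path therefore satisfies both constraints."""
    pq = [(0, src, [src])]
    seen = set()
    while pq:
        delay, node, path = heapq.heappop(pq)
        if node == dst:
            return path if delay <= max_delay else None
        if node in seen:
            continue
        seen.add(node)
        for nxt, (bw, d) in graph.get(node, {}).items():
            if bw >= min_bw and nxt not in seen:     # bandwidth filter
                heapq.heappush(pq, (delay + d, nxt, path + [nxt]))
    return None

# edges: neighbor -> (bandwidth, delay); values are illustrative
graph = {
    "A": {"B": (10, 2), "C": (2, 1)},
    "B": {"D": (10, 2)},
    "C": {"D": (10, 1)},
}
assert qos_route(graph, "A", "D", min_bw=5, max_delay=10) == ["A", "B", "D"]
assert qos_route(graph, "A", "D", min_bw=20, max_delay=10) is None
```

Energy and hop count can be folded in the same way, either as additional pruning constraints or as terms in the path cost.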

UNCERTAINTY-BASED SAMPLE SELECTION IN CLOUD COMPUTING USING FUZZY DECISION TREE INDUCTION

K.Uthra Devi, V.Venkateshwaridevi, Oxford Engineering College

Abstract Sample selection chooses a number of representative samples from a large database so that a learning algorithm can have a reduced computational cost and an improved learning accuracy. Many sample-selection algorithms, such as IBL, CNN, and their extensions, have selection mechanisms that are tied to the class labels of the samples to be selected, so their selection results depend directly on those labels. Thus, these algorithms can condense the data set but cannot reduce labeling cost, and they do not clarify how to use a search query or how to return results from the decision tree. To overcome these drawbacks, this paper constructs a fuzzy decision tree in a cloud environment and finds the accurate sample. The cloud-storage method takes a single round of communication to guarantee retrieval of the exact result. Uncertain data are stored in cloud storage fully secured using an encryption algorithm, with all data stored as ciphertext. In the cloud environment the data are indexed, which makes it easy to find the exact result using a search query.

VEHICULAR NETWORK FOR INTELLIGENT TRAFFIC SYSTEM TO AVOID ACCIDENTS USING HYBRID SCALABILITY PROTOCOL

G.Vigneshchakkaravarthy, Adhiparasakthi Engineering College

Abstract Recent advances in wireless technologies have given rise to the emergence of vehicular ad hoc networks (VANETs): highly mobile wireless networks designed to support vehicular safety, traffic monitoring, and other commercial applications. In VANETs, vehicle mobility causes the communication links between vehicles to break frequently. Such link failures require a direct response from the routing protocols, leading to a potentially excessive increase in routing overhead and degradation in network scalability. In such networks, the limited coverage of Wi-Fi and the high mobility of the nodes generate frequent topology changes and network fragmentation. An efficient routing strategy is therefore crucial to the deployment of VANETs. A series of representative metaheuristic algorithms (particle swarm optimization, differential evolution, genetic algorithm, and simulated annealing) are studied in this paper to find optimal configurations of the routing protocol automatically. VANETs are attracting considerable attention from the research community and the automotive industry as a way to improve the services of Intelligent Transportation Systems (ITS). Traffic data from a limited region of the road map is collected to capture realistic mobility. In this work, the entire region has been divided into various smaller routes. The realistic mobility model used here considers the driver's route choice at run time. It also studies the clustering effect caused by traffic lights used at intersections to regulate traffic movement in different directions. The fundamental challenges of providing live multimedia streaming (LMS) services in VANETs come from achieving a stable and high streaming rate (smooth playback) for all interested vehicles while using minimal bandwidth resources, especially under the highly dynamic topology of VANETs and the lossy nature of vehicular wireless communications. In VANETs, high speed is the defining characteristic, leading to frequent link breakdowns, interference and so on; the performance of ad hoc routing protocols is therefore key to improving Quality of Service (QoS). In this paper we study various ad hoc routing protocols (reactive, proactive and hybrid), taking into consideration parameters such as speed, altitude and mobility in a real VANET scenario. The AODV and DYMO (reactive), OLSR (proactive) and ZRP (hybrid) protocols are compared for the IEEE 802.11 (MAC) and IEEE 802.11 (DCF) standards using QualNet as the simulation tool, since IEEE 802.11 covers both the physical and data-link layers. The performance of the protocols in these layers helps in making the right selection of protocol for high-speed mobility. Varying the VANET parameters shows that in real traffic scenarios the proactive protocol performs more efficiently for IEEE 802.11 (MAC) and IEEE 802.11 (DCF).

DYNAMIC LOAD MONITORING AND RESOURCE ALLOCATION IN CLOUD ENVIRONMENT

B.Vignesh Kumar, S.Dhanasekaran, Kalasalingam University

Abstract In this paper, dynamic load monitoring and resource allocation in a cloud environment are managed. Load monitoring is the process of calculating the free physical memory and CPU utilization of each virtual machine, and the monitoring is continuous. Resource allocation is based on the maximum available free physical memory and proceeds in two basic cases. In the first case, the physical memory required to process the job is less than the available physical memory, so the job is allocated directly once the condition is satisfied. In the second case, the required physical memory is greater than the available free physical memory, so the job cannot be allocated directly without degrading performance; instead, the job is split according to the need and the available physical memory, and each part is allocated to a node whose free physical memory exceeds that part's need, after which the job is processed. The aim is to achieve maximum utilization of cloud resources in an effective manner.

GRID COMPUTING

S.Vignesh Kumar, R.VinothKumar, Jayaram College of Engineering and Technology

Abstract A grid computing system is a geographically distributed environment with autonomous domains that share resources amongst themselves. Grid computing presents a new trend in distributed computing and Internet applications, which can construct a virtual single image of heterogeneous resources, provide a uniform application interface and integrate widespread computational resources into a super, ubiquitous and transparent aggregation. Grid computing is a systems-design paradigm that efficiently uses distributed computing resources, usually on the scale of supercomputers. We design, implement and evaluate Mobile OGSI.NET, a distributed software system that permits device collaboration and better resource usage while conforming to the grid computing OGSI specification. Grid computing provides a challenge for visualization system designers: it gives users access to widely distributed networks of computing resources to solve large-scale tasks such as scientific computation. Finally, we describe the middleware challenges implied by the approach and an architecture for grid computing using virtual machines.

AN EFFICIENT MINING FOR HIGH UTILITY PATTERN FROM WEB LOG DATA USING UP CATALOG

P.Visnu Pria, S.Geetha, Oxford Engineering College

Abstract Discovering useful patterns hidden in a database plays an essential role in several data mining techniques, such as frequent pattern mining, weighted frequent pattern mining and high utility pattern mining. In frequent pattern mining, the importance of each item is not considered. To address this problem, Weighted Association Rule Mining (WARM) was proposed, in which the weight of each item is considered; however, it still does not satisfy the user's requirements. Hence, Utility Pattern (UP) mining was proposed. Mining high-utility itemsets refers to the discovery of itemsets with high utility, such as profit. Existing methods often generate a huge set of potential high-utility itemsets, and their mining performance degrades when the database contains large transactions. Hence, to accommodate large numbers of transactions, the UP catalog was proposed, and the UP-Growth algorithm is used to prune candidate itemsets effectively. Finally, heuristic rules are framed with respect to the datasets; by adopting this rule-framing strategy, the strength of the itemsets is evaluated. The proposed mechanism reduces tree-construction cost and time, and is applicable to large numbers of logs.
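The notion of "utility" that UP-Growth prunes over can be made concrete with a naive enumeration sketch (the tiny database and unit profits are assumptions; UP-Growth replaces this brute-force search with tree-based pruning):

```python
from itertools import combinations

# transaction database: item -> quantity, plus external per-unit profits
profits = {"a": 5, "b": 2, "c": 1}
db = [{"a": 1, "b": 2}, {"a": 2, "c": 3}, {"b": 4, "c": 1}]

def utility(itemset, tx):
    """Utility of an itemset in one transaction: quantity x unit profit,
    counted only if every item of the set occurs in the transaction."""
    if not set(itemset) <= tx.keys():
        return 0
    return sum(profits[i] * tx[i] for i in itemset)

def high_utility_itemsets(db, min_util):
    """Naive enumeration of itemsets whose total utility meets min_util."""
    items = sorted({i for tx in db for i in tx})
    result = {}
    for r in range(1, len(items) + 1):
        for iset in combinations(items, r):
            u = sum(utility(iset, tx) for tx in db)
            if u >= min_util:
                result[iset] = u
    return result

hui = high_utility_itemsets(db, min_util=12)
assert hui[("a",)] == 15     # 1*5 + 2*5
assert ("c",) not in hui     # total utility 3*1 + 1*1 = 4 < 12
```

Frequency alone would rank "c" highly (it occurs in two transactions); weighting by profit is precisely what distinguishes utility mining from frequent pattern mining.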

BLOOD VESSEL SEGMENTATION ON DIGITAL FUNDUS IMAGES

G.Priyanka, Mrs.G.Jemilda, Jayaraj Annapackiam CSI College of Engineering

Abstract The main objective of this project is to detect and segment the blood vessels in digital fundus images. Diabetic retinopathy, one of the leading causes of visual impairment, is characterized by the development of abnormal new retinal vessels. This project uses a gray-level-based feature method for segmenting the blood vessels from the optic disk. Fifteen feature parameters associated with shape, position, orientation, brightness, contrast and line density are calculated for each candidate segment. Based on these features, each segment is categorized as normal or abnormal using a support vector machine (SVM) classifier. The methodology uses morphological operations to obtain the blood vessels.

COMMUNITY ANOMALY DETECTION SYSTEM IN COLLABORATIVE INFORMATION SYSTEMS Thiraviaselvi.G, Subbu Lakshmi.T.C Francis Xavier Engineering College Abstract Collaborative Information Systems (CIS) integrate and coordinate information from diverse sources and allow groups of users to communicate and cooperate over common tasks. CIS are increasingly relied upon to manage sensitive information, yet current security mechanisms to detect insider threats are ill-suited to monitor systems in which users function in dynamic teams. An insider threat is a malicious hacker who is an employee of an institution, or an outside person who poses as an employee by obtaining false credentials, and causes damage to sensitive information. The Community Anomaly Detection System (CADS), an unsupervised learning framework to detect insider threats based on the access logs of collaborative environments, is introduced. The framework is based on the observation that typical CIS users tend to form community structures based on the subjects accessed (e.g., patient records viewed by healthcare providers). CADS consists of two components: 1) relational pattern extraction, which derives community structures, and 2) anomaly prediction, which uses a statistical model based on nearest-neighbor networks to determine when users have sufficiently deviated from their communities. It is capable of detecting anomalous insiders in systems that use dynamic teams.
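The nearest-neighbor deviation idea behind CADS can be illustrated with a toy sketch. Jaccard similarity over access sets and the toy access logs below are our own simplifying assumptions, not the paper's exact statistical model.

```python
# Sketch of nearest-neighbor anomaly scoring: a user whose access pattern
# is far from those of their closest peers gets a high anomaly score.

def jaccard(a, b):
    return len(a & b) / len(a | b) if a | b else 0.0

def anomaly_score(user, others, k=2):
    """1 - mean similarity to the k most similar other users."""
    sims = sorted((jaccard(user, o) for o in others), reverse=True)
    return 1.0 - sum(sims[:k]) / k

# Access logs: each set holds the subjects (e.g. patient records) a user viewed.
community = [{"p1", "p2", "p3"}, {"p1", "p2"}, {"p2", "p3"}]
insider   = {"p7", "p8", "p9"}   # touches records nobody else does

print(anomaly_score({"p1", "p2", "p3"}, community[1:]))  # low: fits the community
print(anomaly_score(insider, community))                 # 1.0: no overlap with any peer
```

The second score is maximal because the insider shares no subjects with any community member, which is the deviation the anomaly-prediction component flags.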

DLA: DYNAMIC LEARNING ALGORITHM FOR ANOMALY DETECTION IN MOBILE AD HOC NETWORKS (MANET) J.Vinoth Kumar, Ms. K. Madheswari SSN College of Engineering Abstract Mobile ad hoc networks are multi-hop networks of independent mobile nodes without any fixed infrastructure. In a MANET it is difficult to identify malicious nodes because the network topology constantly changes due to node movement. As the topology of a MANET constantly changes over time, simple use of a static baseline profile is not efficient. In this paper, the Dynamic Learning Algorithm (DLA) is proposed to detect anomalies and establish a normal profile. The anomaly detection scheme is based on a dynamic learning algorithm in which the training data is updated at a particular time interval. On comparison of a sample packet with the normal baseline profile, the attack is identified and stored in the packet-attack database. The NS2 simulator is used for MANET simulation with scenarios based on routing attacks on the AODV protocol.

IMPROVED UNSUPERVISED SEGMENTATION ALGORITHM FOR TISSUE PATHOLOGY Aiswarya Gopinath, Mr.Rajesh T PSN College of Engineering and Technology

Abstract Unsupervised segmentation of histopathological tissue images has two main contributions. First, a new set of high-level texture features is introduced to represent prior knowledge of the spatial organization of the tissue components. Second, it proposes to obtain multiple segmentations by multilevel partitioning of a graph constructed on the tissue objects, which are then combined using an ensemble function. To define objects in these components, the k-means algorithm can be used. The k-means algorithm partitions an image into K clusters and assigns each point to the cluster whose center (also called the centroid) is nearest. Another clustering algorithm, Fuzzy C-Means (FCM), also known as soft k-means, is available. In fuzzy clustering, each point has a certain degree of belonging to each cluster, rather than belonging completely to just one cluster. Thus, points that lie on the edge of a cluster may belong to it to a lesser degree than points in its center. FCM is therefore more suitable for elongated clusters and provides better results than k-means; hence, FCM is used as the clustering algorithm in this paper. Multilevel graph partitioning increases the diversity of the individual segmentations, and hence improves the final result.
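A minimal sketch of the fuzzy membership degrees described above, for 1-D points with the standard fuzzifier m = 2; the centers and points are invented for illustration.

```python
# Sketch: fuzzy c-means membership of a point in each cluster (fuzzifier m=2).
# Unlike k-means, each point belongs to every cluster with a degree in [0,1],
# and the degrees for a point sum to 1.

def memberships(x, centers, m=2.0):
    d = [abs(x - c) for c in centers]
    if 0.0 in d:                       # point coincides with a center
        return [1.0 if di == 0 else 0.0 for di in d]
    p = 2.0 / (m - 1.0)
    return [1.0 / sum((d[i] / d[k]) ** p for k in range(len(centers)))
            for i in range(len(centers))]

centers = [0.0, 10.0]
print(memberships(5.0, centers))   # [0.5, 0.5]: edge point, equal degrees
print(memberships(1.0, centers))   # mostly in the first cluster
```

Points midway between centers get split degrees rather than a hard assignment, which is exactly why FCM degrades more gracefully on elongated clusters.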

CLUSTERING BASED DICTIONARY GENERATION FOR DIABETIC RETINOPATHY DETECTION Brindha.N.R, Mr.Sundaraguru.R PSN College of Engineering and Technology Abstract A good eye is an important and significant factor in retaining the independence and quality of life of all living beings. Diabetic retinopathy is a retinopathy caused by complications of diabetes that damage the retina. It affects the back part of the eye and damages the blood vessels of the retina. This causes blurry vision, scarring, cloudiness and increased pressure, which lead to blindness. This work helps to detect diabetic retinopathy (DR) using the lesions in a fundus image, and it is capable of detecting the affected red- and bright-lesion parts of the retina without pre/post processing. The method is based on the concept of marking the points of interest (POI) in the

lesion locations to build a visual word dictionary. The POI regions help to classify whether the fundus image is normal or diabetes-affected. The approach is extended by adding feature information to the visual word dictionary, making it applicable to different types of retinal lesions, with a specific projection space for each class of interest instead of a common dictionary for all classes. The red and bright lesions are classified with the visual word dictionary using cross validation and cross-dataset validation to show the efficiency of the approach. For the final classification, an SVM is proposed for two-class machine-learning classification. The visual word dictionary does not depend on the resolution of the image. The proposed work shows the ability to detect and classify retina images under different conditions.

CANCER DISCOVERY USING CLUSTER ENSEMBLE APPROACH Dr.S.Ramakrishnan, Mrs.L.Meenachi, Ms.G.T.Citra, Ms.I.Swathi shree, Ms.C.Deepika Abstract Cancer discovery is one of the most important clinical applications. In this paper, we use the cluster ensemble approach to diagnose cancer cells. Our aim is to improve the accuracy of the clusters that are formed. Initially, prior knowledge about the dataset is represented as pairwise constraints. Then, multiple datasets are generated from the given dataset using the random subspace (RSS) technique. Spectral Clustering (SC) is applied to these datasets to obtain a set of clustering solutions. Using the pairwise constraints, a confidence factor for each solution is calculated. A consensus matrix is formed using the obtained clustering solutions and confidence factors. Finally, the consensus matrix is partitioned to get the final clusters. The method is also applied to the datasets and the accuracy of those approaches has been found. The experiments show that the cluster ensemble helps to further improve the accuracy of a single clustering algorithm. As the method is more accurate, we apply it to various applications.

3-D VOLUME RECONSTRUCTION FOR MEDICAL APPLICATION Manikandan.V, Divya.P, Selvam College of Technology Abstract DDoS (Distributed Denial of Service) attacks threaten Internet security nowadays. Such attacks are among the hardest security problems to address because they are simple to implement, difficult to

prevent, and very difficult to trace. As a result, no effective and efficient method to deal with this issue exists so far. This work proposes a novel traceback method for DDoS attacks based on entropy variations between normal and DDoS attack traffic, which is fundamentally different from commonly used packet-marking techniques. In comparison to existing DDoS traceback methods, the proposed strategy possesses a number of advantages: it is memory non-intensive, efficiently scalable, robust against packet pollution, and independent of attack traffic patterns.
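The entropy variation at the heart of the method can be sketched as the Shannon entropy of the per-flow packet distribution; the flow counts below are invented for illustration.

```python
# Sketch: Shannon entropy of the distribution of packets over flows seen
# at a router. During a DDoS flood a few flows dominate the traffic, so
# the entropy drops sharply against the router's normal baseline, and that
# drop is what the traceback follows upstream.
import math

def entropy(counts):
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c)

normal = [10, 12, 9, 11, 10]      # balanced traffic across five flows
attack = [200, 3, 2, 1, 1]        # one flow dominates

print(round(entropy(normal), 3))
print(round(entropy(attack), 3))  # much lower: flags the variation
```

A router would compare the live value against its baseline mean and deviation rather than a single snapshot, but the ordering shown here is the signal.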

EFFICIENT APPROACH TO PATENT SEARCH PARADIGM T.Krishna Chaitanya, D.Hemavathi SRM University Abstract The patent search process has attracted considerable attention recently because of the need to search patent data in large data sets. In several other methods the user follows a try-and-see approach, issuing several queries to check whether the answers are relevant or not, which is a complex process. Measures such as error correction, topic-based query suggestion, and query expansion are used to evaluate the interestingness of the search criteria. Using these techniques, this project proposes a new method for the search process that improves search efficiency. The key to efficiently finding relevant answers in large data sets is to partition them. In this project, partitions are constructed from the USPTO dataset. First, we partition patents into small partitions based upon their classes and topics. Then, for a given query, we find the highly relevant partitions and answer the query based on them. Finally, the relevant answers from the different partitions are combined to generate the top-k answers of the patent-search query.
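The partition-then-search flow can be sketched as below. Scoring partitions by simple term overlap, and the toy patent IDs and class codes, are our own illustrative assumptions, not the paper's ranking model.

```python
# Sketch: group patents by class, route a query to the most relevant
# partitions, and scan only those partitions for top-k answers.
from collections import defaultdict

def build_partitions(patents):
    parts = defaultdict(list)
    for pid, cls, text in patents:
        parts[cls].append((pid, text))
    return parts

def search(query, parts, k=2, top_parts=1):
    terms = set(query.lower().split())
    def score(text):
        return len(terms & set(text.lower().split()))
    # Route: rank partitions by their best-matching document.
    ranked = sorted(parts.values(),
                    key=lambda docs: max(score(t) for _, t in docs),
                    reverse=True)
    hits = [(score(t), pid) for docs in ranked[:top_parts] for pid, t in docs]
    return [pid for s, pid in sorted(hits, reverse=True)[:k] if s]

patents = [("US1", "G06F", "neural network training method"),
           ("US2", "G06F", "distributed network storage"),
           ("US3", "A61K", "pharmaceutical compound formulation")]
parts = build_partitions(patents)
print(search("neural network", parts))   # ['US1', 'US2']
```

The saving comes from `ranked[:top_parts]`: the pharmaceutical partition is never scanned for this query.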

SLM: SECURE LEADER MODEL FOR INTRUSION DETECTION IN MOBILE AD-HOC NETWORK J.Mervin, Ms.K.Madheswari SSN College of Engineering Abstract Mobile Ad hoc Networks (MANETs) have no fixed chokepoints/bottlenecks where Intrusion Detection Systems (IDSs) can be deployed. Hence, nodes are clustered, and each node needs to run its own IDS and cooperate with others to ensure security. Since the mobile nodes are energy-limited, it is inefficient to provide IDS service at all the nodes in a cluster. To balance the resource consumption among all nodes and prolong the lifetime of a MANET, the nodes with the most remaining resources should be elected as leaders. The proposed system deals with leader election in the presence of selfish nodes for intrusion detection in MANETs. The two obstacles to achieving this goal are the presence of malicious and selfish nodes: since their resources are private information, nodes may behave selfishly to increase their own benefits, or behave maliciously and withhold IDS service from the other nodes in the cluster. The proposed system controls selfish and malicious nodes by providing incentives in the form of reputation.

MULTIKEYWORD SEARCHING IN PEER-TO-PEER NETWORK USING GOSSIP ALGORITHM COMBINED WITH QUERY RATING TECHNIQUES A.Azhakeswari, V.Venkateshwaradevi Oxford Engineering College Abstract A peer-to-peer network is a popular tool for sharing information on the web, where information resides on millions of sites in a distributed manner. Existing P2P retrieval mechanisms provide a scalable distributed hash table (DHT) that allows every individual keyword to be mapped to the set of documents/nodes across the network that contain it. Using this single-keyword index, a list of entries for each keyword in a query can be retrieved with existing DHT lookups. For multi-keyword search, the simple solution of merging the results of each single-keyword search incurs a lot of traffic. A Bloom Filter (BF) is an effective way to reduce this communication cost: instead of sending the set of documents, a BF can be transferred, reducing the traffic. The Bloom filter settings can be optimized by increasing the filter size and the number of hash functions used. A gossip algorithm is used to scan the documents for a specified keyword and count the number of occurrences of the keyword across documents; when the count reaches a threshold value, the entry is automatically stored in the DHT table. This paper introduces a query rating technique that counts the number of users who use the same keyword. The query rating can be combined with the gossip algorithm to get better performance, thereby reducing the search time.
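The Bloom-filter intersection step can be sketched as follows; the filter size, hash count, and document IDs are illustrative choices, not the paper's parameters.

```python
# Sketch: instead of shipping a full posting list for one keyword, a node
# ships a small bit array; the peer tests its own posting list against it
# and returns only the candidate intersection.
import hashlib

class BloomFilter:
    def __init__(self, m=256, k=3):
        self.m, self.k, self.bits = m, k, 0

    def _positions(self, item):
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(h, 16) % self.m

    def add(self, item):
        for p in self._positions(item):
            self.bits |= 1 << p

    def __contains__(self, item):
        # May yield false positives, never false negatives.
        return all(self.bits >> p & 1 for p in self._positions(item))

# Node A's documents containing "cloud"; node B intersects with "security".
bf = BloomFilter()
for doc in ["d1", "d4", "d9"]:
    bf.add(doc)

security_docs = ["d2", "d4", "d7", "d9"]
print([d for d in security_docs if d in bf])   # candidate intersection
```

The traffic saving is the point: the filter is m bits regardless of how long the posting list is, at the cost of a small, tunable false-positive rate.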

A NOVEL CLOUD COMPUTING SECURITY MANAGEMENT FRAMEWORK Santhakumar.D, Pragash.K, Balamurali.K CK College of Engineering & Technology Abstract Cloud computing places an organization's sensitive data in the control of a third party, introducing a significant level of risk to the privacy and security of the data. This paper focuses on cloud computing, the various cloud deployment models, and the main security risks and issues currently present within the cloud computing industry. It also proposes a collaboration-based security management framework for the cloud computing model.

E-SECURE: PRESERVING SECURITY THROUGH THREE-FACTOR AUTHENTICATION IN DISTRIBUTED SYSTEMS K.USHA, K.PRAKASH, P.MANOJ, M.SURIYAPRAKASH R.V.S College of Engineering and Technology Abstract This paper investigates a systematic and real-time approach for authenticating clients by three factors, namely password, smart card, and biometrics. E-Secure is proposed to upgrade two-factor authentication to three-factor authentication. As part of security within distributed systems, the three authentication factors efficiently protect various services and resources from unauthorized use. This not only significantly improves information assurance at low cost but also protects client privacy in distributed systems. In addition, it retains several practice-friendly properties of the underlying two-factor authentication.

A DYNAMIC DATA REPLICATION IN CLOUD TO INCREASE SYSTEM AVAILABILITY Rajapriya G, Vijayakumar D, Srinivasagan K G National Engineering College Abstract Data replication is a method to improve the performance of data access in distributed systems. Dynamic replication adapts the replication configuration to changes in user behavior over time to retain the benefits of replication. To improve system availability, popular data is replicated to multiple suitable locations, so users can access the data from a nearby site. Deciding a reasonable number of replicas and the right locations for them has become a challenge in cloud computing. This paper presents a dynamic data replication strategy suitable for distributed computing environments. It includes: analyzing and modeling the relationship between system availability and the number of replicas; evaluating and identifying popular data and triggering a replication operation when its popularity passes a dynamic threshold; calculating a suitable number of copies to meet a reasonable system byte-effective-rate requirement and placing replicas among data nodes in a balanced way; and designing the dynamic data replication algorithm in a cloud. The proposed method increases availability and performance, reduces user waiting time, and also reduces the execution time of the system.

AGENT BASED NETWORK SNIFFER INTRUSION DETECTION Mrs.Dhanalakshmi, Mr. A.R.M. Ravi Shankar Arunai Engineering College Abstract Network sniffing is considered a major threat to networks and web applications. Every device connected to an Ethernet segment receives all the data passed on that segment. By default, the network card processes only data addressed to it; however, listening programs turn the network card into a mode of reception of all packets, called promiscuous mode. A sniffer is a program that eavesdrops on all the data moving on the network [1]. In this mode the NIC does not perform its basic task of filtering, and forwards all the packets moving in the network to the system for further processing. A sniffer does not generate any traffic in the network, so it cannot be detected easily. Many sniffers like Wireshark, Cain & Abel, and EtherSniff are available at no cost on the Internet. Many solutions have been proposed for the detection of network sniffing, including AntiSniff [2], SnifferWall [3], and Sniffer Detector [4], but no solution guarantees full security. In this paper we propose mobile agents as a solution to the problem of sniffer detection. Mobile agents perform a task by migrating to and executing on several hosts connected to the network. For sniffer detection, the network administrator sends special types of mobile agents into the network and collects information from different nodes. After analyzing this information, the network administrator can identify the computer systems running in promiscuous mode.
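One classic trap many sniffer detectors build on can be sketched as below. The network here is simulated in plain objects; real agents would send the probe frame to live hosts, and the MAC addresses are invented.

```python
# Sketch: the administrator sends a probe framed with a bogus destination
# MAC. A NIC doing its normal filtering drops it, but a host in promiscuous
# mode passes it up the stack and may respond, revealing itself.

class Host:
    def __init__(self, name, mac, promiscuous=False):
        self.name, self.mac, self.promiscuous = name, mac, promiscuous

    def receives(self, dst_mac):
        # A normal NIC accepts only its own MAC (or broadcast).
        return self.promiscuous or dst_mac in (self.mac, "ff:ff:ff:ff:ff:ff")

def detect_sniffers(hosts, bogus_mac="de:ad:be:ef:00:01"):
    """Hosts that process a frame not addressed to them are suspects."""
    return [h.name for h in hosts if h.receives(bogus_mac)]

lan = [Host("alice", "aa:aa:aa:aa:aa:01"),
       Host("bob",   "aa:aa:aa:aa:aa:02", promiscuous=True),
       Host("carol", "aa:aa:aa:aa:aa:03")]

print(detect_sniffers(lan))   # ['bob']
```

A mobile agent generalizes this: it carries the probe logic to each segment, runs the test locally, and reports the suspects back to the administrator.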

ANDROID AND WIRELESS BASED ROBOTIC VEHICLE CONTROL D.Santhakumar, P.Prakash, T.Pradeep Kumar Chander CK College of Engineering & Technology Abstract Our paper discusses controlling a robot through wireless communication via an application created on a smartphone. Instructions are given to the smartphone through voice; the basic instructions are forward, backward, left, right and stop. The instructions are transferred to the robot over the wireless medium, and the robot moves on its wheels on receiving the command. The robot is built on an Arduino board, which receives the instructions and performs the task. Simultaneously it captures video and sends it to the Android mobile.

IMPLEMENTATION OF A VISIBLE WATERMARKING IN A SECURE STILL DIGITAL CAMERA USING VLSI DESIGN
SELVI AND JEYALAKSHMI

Abstract Watermarking is the process that embeds data called a watermark, a tag, or a label into a multimedia object, such as an image, video, or text, for copyright protection. According to human perception, digital watermarks can be either visible or invisible. A visible watermark is a secondary translucent image overlaid on the primary image that appears visible to a viewer on careful inspection. An invisible watermark is embedded in such a way that the modifications made to the pixel values are perceptually unnoticeable, and it can be recovered only with an appropriate decoding mechanism. This paper presents a new very-large-scale integration (VLSI) architecture for implementing two visible digital image-watermarking schemes. The proposed architecture is designed to allow easy integration into any existing digital camera framework.

COMPARISON OF CONVENTIONAL MODEL BASED TECHNIQUES WITH MODEL PREDICTIVE CONTROL METHOD FOR TWO-TANK SYSTEM
T.SIVARAM, DR.A.KAVITHAMANI, DR.V.MANIKANDAN Coimbatore Institute of Technology

Abstract

A conventional controller cannot maintain the desired liquid level of a two-tank system under load disturbances and unpredictable environmental conditions. Maintenance of the desired level is necessary, as otherwise the tank will overflow, leading to catastrophic losses. This paper compares conventional controllers such as feedback control, feed-forward control, and Internal Model Control (IMC) with a model predictive controller. The performance analysis is done using integral error criteria. The controllers are simulated and evaluated using MATLAB/SIMULINK.

REFINING ANOMALISTIC MALICIOUS ATTACKS IN SOVEREIGN SETUP STEPHEN Abstract The Internet is a global system of interconnected computer networks that use the standard Internet protocol suite to serve billions of users worldwide. Recently the Internet has been plagued by malicious activities ranging from spam and phishing to malware, denial-of-service (DoS), man-in-the-middle, and eavesdropping attacks. Much of this activity thrives on armies of compromised hosts, or botnets, which are scattered throughout the Internet. Malicious activity is not necessarily evenly distributed across the Internet: some networks may employ lax security, resulting in large populations of compromised machines, while others may tightly secure their network and not have any malicious activity. The proposed scheme concentrates on three frequently occurring malicious attacks, namely botnet, phishing, and DDoS attacks, by detecting them and providing countermeasures. The evaluated results of the proposed methodologies are compared with existing works to prove their efficiency.

A DYNAMIC LOAD BALANCING SCHEME FOR ENERGY EFFICIENT RESOURCE UTILIZATION IN CLOUD COMPUTING
A.SUNITHA, D.VIJAYAKUMAR, DR.K.G.SRINIVASAGAN, R.SABARI MUTHU KUMAR National Engineering College

Abstract Cloud computing is an Internet-based use of computer applications, mainly used to share hardware and software resources over the network rather than on a remote server. Load balancing techniques share the workload across the individual nodes of the system to improve both resource utilization and job response time, while also avoiding a situation where some nodes are always busy while others are idle or doing lower-priority work. The load balancing problem can be addressed using the load strategy method and by creating multiple instances. Using the optimum scheduling algorithm, tasks are scheduled based on deadline and cost constraints, which gives the minimum turnaround time of the system. The load strategy method uses dynamic scheduling to balance the load and also reduce the execution time. The concept of green computing is also considered to reduce energy consumption: task consolidation is used to increase resource utilization and reduce the energy of resources that would otherwise sit idling yet still draw power.

ANALYSIS OF ION CHANNELS AND BIO-MOLECULES USING NANO-ELECTRONIC MOSFETS T.S.Guruprasath, V.Nishanth Jain MNM Jain Engineering College Abstract This paper considers the use of nano-scale double-gated MOSFETs which, when interfaced with the human skin, can help us analyze ion channels and their functions in the working of various bio-molecules and cells. This can then be applied in fields such as anti-venom research; the effects of paralysis and epilepsy can be studied in detail, possibly allowing certain medical cases to be cured rather than resorting to measures such as amputation.

AN IMPLEMENTATION OF SEMI-SUPERVISED LEARNING FROM MICROARRAY SAMPLE CLASSIFICATION Mr.A.P.Gopu, K.S.MANOJEE, Selvam College of Technology Abstract The recent advancement and wide use of high-throughput technology are producing an explosion in the use of gene expression phenotypes for identification and classification in a variety of diagnostic areas. An important application of gene expression data in functional genomics is to classify samples according to their gene expression profiles. In most gene expression data, the number of training samples is very small compared to the large number of genes involved in the experiments, and among this large number of genes, only a small fraction is effective for performing a certain task. Furthermore, a small subset of genes is desirable in developing gene-expression-based diagnostic tools that deliver precise, reliable, and interpretable results. In this project, a new supervised attribute clustering algorithm is proposed to find co-regulated clusters of genes whose collective expression is strongly associated with the sample categories or class labels. A new

quantitative measure, based on mutual information, is introduced to compute the similarity between attributes. The proposed supervised attribute clustering method uses this measure to reduce the redundancy among genes. A single gene from each cluster having the highest gene-class relevance value is first selected as the initial representative of that cluster. The representative of each cluster is then modified by averaging the initial representative with other genes of that cluster whose collective expression is strongly associated with the sample categories. Finally, the modified representative of each cluster is selected to constitute the resulting reduced feature set.
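The mutual-information measure used for gene-class relevance can be sketched as below; the toy discretized expression profiles and labels are invented for illustration, and real pipelines would discretize microarray values first.

```python
# Sketch: mutual information between a discretized gene-expression profile
# and the sample class labels, usable as a gene-class relevance score or,
# between two genes, as the similarity the clustering is built on.
import math
from collections import Counter

def mutual_information(xs, ys):
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum((c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

labels     = ["tumor", "tumor", "normal", "normal"]
gene_good  = ["hi", "hi", "lo", "lo"]   # tracks the labels perfectly
gene_noise = ["hi", "lo", "hi", "lo"]   # independent of the labels

print(mutual_information(gene_good, labels))   # 1.0 bit
print(mutual_information(gene_noise, labels))  # 0.0 bits
```

The gene with the highest such score in each cluster is the natural choice of initial cluster representative described above.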

ARDUINO BASED AUTOMATIC CONTROL FOR LABORATORY STERILIZATION PROCESS


T. K. SETHURAMALINGAM, M. KARTHIGHAIRASAN PRIST University,

Abstract An automatic control for a laboratory sterilization process is designed using an Arduino, an open hardware platform that allows fast prototype development. The system is composed of four modules: a gas control module, a CPU module, a flame detection module, and a pressure sensor module. A servomotor is attached to a gas valve. Four control positions are performed by the module; the two main positions are used to open or close the valve, which opens at 90° and closes at 0°. The Arduino Servo library allows the board to control RC (hobby) servo motors, which have integrated gears and a shaft that can be precisely controlled. The DS18B20 digital thermometer provides 9-bit to 12-bit Celsius temperature measurements and communicates over a 1-Wire bus that by definition requires only one data line (and ground) for communication with a central microprocessor. It detects a temperature rise after the Arduino sends a signal to turn on the gas, and otherwise the gas is turned off. The pressure sensing module was designed to obtain the internal pressure of the system. In the final product, the pressure sensor does not present an output signal; only a numeric value is displayed on the LCD screen.

DISTRIBUTED SECURE DATA FORWARDING IN CLOUD STORAGE SYSTEM Amritha.S, Mr.S.Saravana Kumar, Srinivasan Engineering College Abstract

A cloud storage system is used to store large amounts of data on storage servers; a cloud system provides a large number of storage servers offering long-term storage service over the Internet. A third party's cloud system does not provide data confidentiality, and constructing a centralized storage system for the cloud makes it easy for hackers to steal data. General encryption schemes protect data confidentiality. In the proposed system, a secure distributed storage system is formulated by integrating a threshold proxy re-encryption scheme with a decentralized erasure code. The distributed storage system not only supports secure and robust data storage and retrieval, but also lets a user forward data from one user to another without retrieving the data back. The main technical contribution is that the proxy re-encryption scheme supports encoding operations over encrypted messages as well as forwarding operations over encoded and encrypted messages; the method fully integrates encrypting, encoding, and forwarding. The proposed system can be applied to military and hospital applications, and to other secret data transmission.

COMMUNITY ANOMALY DETECTION SCHEME IN COLLABORATIVE INFORMATION SYSTEMS Anitha.S, Mr.P.Krishna Kumar Pet Engineering College Abstract Collaborative Information Systems (CISs) are deployed within a diverse array of environments that manage sensitive information. Current security mechanisms detect insider threats, but they are not suited to monitoring systems in which users function in dynamic teams. This system uses the Community Anomaly Detection System (CADS), which utilizes a relational framework to detect insider threats based on the access logs of collaborative information, building on the observation that typical CIS users tend to form community structures based on the subjects accessed. Anomaly prediction can then be used to detect deviations from expected behavior.
CADS is also extended into MetaCADS to incorporate the semantics of the subjects accessed by the users.

AN EFFICIENT PRIVACY PRESERVING SCHEME FOR SOURCE-SINK LOCATION IN WIRELESS SENSOR NETWORKS S.Asha, B.Sivasankar Sardar Raja College of Engineering Abstract While many protocols for sensor network security provide confidentiality for the content of messages, contextual information usually remains exposed. Such contextual information can be exploited by an adversary to derive sensitive information such as the locations of monitored objects and data sinks in the field. Attacks on these components can significantly undermine any network application. Existing techniques defend against the leakage of location information from a limited adversary who can only observe network traffic in a small region; however, a stronger adversary, the global eavesdropper, is realistic and can defeat these techniques. This paper first formalizes the location privacy issues in sensor networks under this strong adversary model and computes a lower bound on the communication overhead needed to achieve a given level of location privacy. The paper then proposes two techniques to provide location privacy to monitored objects (source-location privacy), periodic collection and source simulation, and two techniques to provide location privacy to data sinks (sink-location privacy), sink simulation and backbone flooding. These techniques provide trade-offs between privacy, communication cost, and latency. Through analysis and simulation, we demonstrate that the proposed techniques are efficient and effective for source- and sink-location privacy in sensor networks.

OPTIMAL CSI FEEDBACK WITH WATER FILLING PRECODER IN A MIMO SYSTEM Duffy Asenath.S Francis Xavier Engineering College Abstract In this paper we study the relation between the ergodic capacity and the feedback interval for a Multiple Input Multiple Output (MIMO) system. Based on this relation, an optimal feedback interval is derived for the Rayleigh fading channel. The minimum differential feedback rate is also determined for this system, considering the channel estimation error and channel quantization distortion.

MEMS SHUNT SWITCH FOR RECONFIGURABLE ANTENNA APPLICATIONS Eby Sam Stewart.L, Mr V.S.Selvakumar, Dr.L.Sujatha REC Abstract Frequency reconfigurable antennas have been widely used in most of the latest communication systems. Due to their excellent out-of-band rejection property they are used in satellite communications as well. The most important component in a reconfigurable antenna is the RF switch that provides switching between different antenna patterns. In a typical application like satellite communication, the switch has to operate at high frequencies, very low temperatures, and under exposure to cosmic rays. The performance of conventional switches is affected by the bias lines and the voltage provided to them. An RF MEMS switch with low actuation voltage and low loss is proposed so as to improve the performance of the reconfigurable antenna. Traditional PIN diode and FET switches exhibit high loss at high frequencies; these devices also consume a large amount of power for operation and are affected by temperature variations.

ENHANCE THE SECURITY TO MITIGATE ONLINE PASSWORD GUESSING L.Janani, S.Ilavarasan Saveetha Engineering College Abstract Security has become an important aspect with respect to passwords, and several techniques are already available to provide a secure password. A new Password Guessing Resistant Protocol (PGRP) is introduced to protect against brute-force and dictionary attacks. The project to be developed prevents password-guessing attacks between a server and a number of clients. Brute-force and dictionary attacks on remote login services are widespread and ever increasing, while Automated Turing Tests continue to be very effective and easy to deploy. The proposed PGRP is easy to deploy and scalable, requiring minimal computational resources in terms of memory, processing time and disk space. Online password guessing attacks are inevitable and commonly observed against web applications. PGRP limits the total number of login attempts from unknown remote hosts at a reasonable cost of inconvenience to the user. PGRP's performance is analyzed on two real-world datasets and found more promising than existing proposals.
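The core PGRP policy can be sketched as follows. The specific thresholds, and treating "known host" as simply "has logged in successfully before", are our simplifying assumptions rather than the protocol's exact rules.

```python
# Sketch: known hosts (past successful logins) keep a few free failed
# attempts, while unknown hosts are forced through an ATT (a CAPTCHA-like
# Automated Turing Test) much sooner.

class LoginGuard:
    def __init__(self, known_limit=3, unknown_limit=1):
        self.known_hosts = set()
        self.failures = {}
        self.known_limit, self.unknown_limit = known_limit, unknown_limit

    def requires_att(self, host):
        limit = self.known_limit if host in self.known_hosts else self.unknown_limit
        return self.failures.get(host, 0) >= limit

    def record(self, host, success):
        if success:
            self.known_hosts.add(host)
            self.failures[host] = 0
        else:
            self.failures[host] = self.failures.get(host, 0) + 1

g = LoginGuard()
g.record("10.0.0.5", success=False)
print(g.requires_att("10.0.0.5"))   # True: unknown host, one failure
g.record("10.0.0.9", success=True)
g.record("10.0.0.9", success=False)
print(g.requires_att("10.0.0.9"))   # False: known host keeps free attempts
```

This asymmetry is what throttles automated guessing from new hosts while keeping the inconvenience to legitimate users low.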

FREQUENT ITEMSETS MINING ON LARGE UNCERTAIN DATABASES: USING RULE MINING ALGORITHM Jency Varghese, K.Soundararajan Vivekanandha College of Engineering for Women Abstract In recent years, due to the wide applications of uncertain data, mining frequent itemsets over uncertain databases has attracted much attention. In uncertain databases, the support of an itemset is a random variable instead of a fixed occurrence count of the itemset. In sensor monitoring systems and data integration, the data manipulated is highly ambiguous. The important issue is extracting frequent itemsets from a large uncertain database, interpreted under the Possible World Semantics. An uncertain database contains an exponential number of possible worlds, but the mining process can be modeled as a Poisson binomial distribution, and an approximation algorithm is established to discover frequent itemsets from large uncertain databases efficiently. We propose incremental mining algorithms, which enable probabilistic frequent itemset results to be refreshed, and we evaluate the support for incremental mining and discovery of frequent itemsets. Both tuple and attribute uncertainty are supported, and the incremental mining algorithm maintains the mining results.
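The Poisson binomial view of support can be made concrete with a small sketch; the exact dynamic program below is a baseline that the paper's approximation would replace, and the per-transaction probabilities are invented.

```python
# Sketch: in an uncertain database each transaction contains the itemset
# only with some probability, so the support count follows a Poisson
# binomial distribution. This DP computes P(support >= minsup) exactly.

def frequentness_probability(probs, minsup):
    """P(at least minsup of the independent containment events occur)."""
    dist = [1.0]                       # dist[k] = P(support == k) so far
    for p in probs:
        new = [0.0] * (len(dist) + 1)
        for k, q in enumerate(dist):
            new[k] += q * (1 - p)      # transaction does not contain itemset
            new[k + 1] += q * p        # transaction contains itemset
        dist = new
    return sum(dist[minsup:])

probs = [0.9, 0.8, 0.5, 0.4]           # per-transaction containment probabilities
print(round(frequentness_probability(probs, 2), 4))   # 0.906
```

An itemset is a probabilistic frequent itemset when this probability exceeds a user threshold; the exact DP is O(n²) per itemset, which is what motivates the approximation.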

ENERGY EFFICIENT IN IMPROVED EDF FOR MULTI-CORE SYSTEM X.Jude Roy Jeyaseelan, M.Poongothai, Coimbatore Institute of Technology, Coimbatore Abstract Embedded systems have been widely used in portable devices. To meet real-time application demands, the computing capability of an embedded system should be high, and it is very important in embedded system design to minimise energy consumption while still meeting those demands. Dynamic voltage scaling technology enables effective reduction of energy consumption by utilizing slack time to modify the operating voltage and frequency of the processor. Multi-core systems provide better throughput than a single-core processor working at the same clock frequency. The proposed IEDF-DVS (Improved EDF with Dynamic Voltage Scaling) scheduling algorithm can effectively reduce energy consumption in a multi-core environment and ensure that all tasks meet their deadlines.
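The energy argument behind DVS can be illustrated with the standard dynamic-power model P = C·V²·f: stretching a task over its slack at a lower frequency permits a lower voltage, and energy falls quadratically with voltage. The capacitance, voltage and frequency figures below are illustrative assumptions, not numbers from the paper.

```python
def dynamic_energy(c_eff, volt, freq_hz, seconds):
    """Dynamic CMOS energy: P = C * V^2 * f, so E = C * V^2 * f * t."""
    return c_eff * volt**2 * freq_hz * seconds

# A task needs 1e9 cycles before a 2 s deadline on a 1 GHz core.
cycles = 1e9

# Full speed: finish in 1 s at 1.2 V, then idle (idle cost ignored here).
e_full = dynamic_energy(1e-9, 1.2, 1e9, cycles / 1e9)

# DVS: use the slack, running at 0.5 GHz, which (we assume) permits 0.6 V.
e_dvs = dynamic_energy(1e-9, 0.6, 0.5e9, cycles / 0.5e9)
```

Halving both voltage and frequency here yields roughly a 4x energy reduction for the same work, which is why slack reclamation pays off as long as deadlines are still met.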

DEEP PACKET INSPECTION WITH BIT-REDUCED DFA FOR CLOUD SYSTEM S. Aravindh, U. MuthuPandi@Vignesh, Chandy College of Engineering Abstract With the development of cloud computing, its security issues have received more and more attention. There is a great demand for examining the content of data or packets in order to improve cloud security. In this paper, we propose a new pattern-matching algorithm for cloud systems: it first performs inexact matching to filter out non-attack information and then does exact matching to obtain the final attack information. A preliminary evaluation shows that the presented algorithm, named Bit-Reduced DFA, is feasible.

EFFICIENT RFID STREAM ON STATIC AND DYNAMIC READER USING DATA INFERENCE AND DATA COMPRESSION B. Mahendiran, N. Velmurugan, Saveetha Engineering College Abstract RFID technology is used to identify multiple objects simultaneously in many areas such as healthcare, pharmaceuticals and supply chains. An RFID warehouse presents certain challenges, including incomplete data, lack of location and containment information, and high data volumes. Because of collisions, some tags cannot be identified, and it is difficult to determine the location as well as the containment relationships between tags. Redundant data can lead to excessive space consumption and difficult query processing in the warehouse. Readers are of two kinds: fixed and mobile. In this paper, data inference and compression techniques are used to identify tags and the containment relationships among them, reduce redundant data efficiently, and ease query processing in a distributed environment.
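One common form of the compression step, collapsing consecutive duplicate readings of the same tag at the same location into a single stay record, can be sketched as follows. This is a simplification for illustration; the paper's techniques also infer missing readings and containment, which are not modelled here.

```python
def compress_readings(readings):
    """Collapse consecutive duplicate (tag, location) readings into stay
    records (tag, location, t_first, t_last) - a standard RFID cleaning step
    that removes the bulk of redundant reader output."""
    stays = []
    for tag, loc, t in readings:
        if stays and stays[-1][0] == tag and stays[-1][1] == loc:
            stays[-1] = (tag, loc, stays[-1][2], t)   # extend the current stay
        else:
            stays.append((tag, loc, t, t))            # open a new stay
    return stays
```

A tag read every epoch at the same dock door thus produces one record per stay instead of one per read, which is what makes warehouse-scale query processing tractable.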

ENHANCING SECURITY AND PRIVACY IN DECENTRALIZED KP-ABE WITH KEY REVOCATION R. Nithya, B. Parkvi, Oxford Engineering College

Abstract In an attribute-based encryption (ABE) scheme, each user is identified by a set of attributes, and some function of those attributes is used to determine decryption ability for each ciphertext. Our proposed system differs from multi-authority ABE schemes in that each authority can work independently, without any cooperation or a central authority. The global identifier (GID) is used to tie all of a user's secret keys together, while corrupted authorities cannot pool a user's attributes by tracing it; revoked users are handled by re-encrypting the data using the key revocation algorithm.

FPGA IMPLEMENTATION OF ROBOTIC ARM HAVING THREE LINK MANIPULATOR A.K.Nitin Subramonium, D.Nanda Gokul, N.Vijaya Kumar, G.Saravanan

KPR Institute of Engineering and Technology Abstract Small robots can be beneficial to important applications such as movement of objects, beverages, placing chips on board, welding purpose and many other industry applications. A reconfigurable technique based on Field Programmable Gate Array (FPGA) is used to implement the robotic arm movement which has the potential for greater functionality, higher performance and lower power dissipation. The FPGA controller is used to generate direction and the number of pulses required to rotate for a given angle. Pulses are sent as a square wave which is generated using Pulse Width Modulation (PWM) and the number of pulses determines the angle of rotation. The frequency of square wave determines the speed of rotation. The proposed control scheme has been realized using XILINX FPGA SPARTAN-3E. The real time operation is done with DC servo motors that are joined for testing motion of the robot arm.
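The pulse-count and speed relations described above can be expressed directly: the number of pulses fixes the rotation angle, and the square-wave frequency fixes the rotation time. The steps-per-revolution figure below is a hypothetical motor/gear parameter, not one taken from the paper.

```python
def pulses_for_angle(angle_deg, steps_per_rev=200):
    """Number of PWM pulses the controller must emit to rotate a joint by
    angle_deg, assuming steps_per_rev pulses per full revolution
    (illustrative motor parameter)."""
    return round(angle_deg / 360.0 * steps_per_rev)

def rotation_time(n_pulses, pulse_freq_hz):
    """Duration of the move: the square-wave frequency sets the speed."""
    return n_pulses / pulse_freq_hz
```

With 200 pulses per revolution, a 90-degree move takes 50 pulses; at a 100 Hz pulse train that move lasts half a second. In the FPGA these would be counter comparisons rather than floating-point arithmetic.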

PREDICTING EARTHQUAKES THROUGH DATA MINING Ramya.A.P, Rizvana.M, Anna University of Technology Abstract Data mining consists of an evolving set of techniques that can be used to extract valuable information and knowledge from massive volumes of data. Data mining research and tools have focused on commercial-sector applications; only a few data mining studies have focused on scientific data.

This paper aims at a further data mining study on scientific data. It highlights the data mining techniques applied to mine for surface changes over time (e.g. earthquake rupture), techniques that help researchers to predict changes in the intensity of volcanoes. The paper uses predictive statistical models that can be applied to areas such as seismic activity and the spreading of fire. The basic problem in this class of systems is unobservable dynamics with respect to earthquakes, whereas the space-time patterns associated with the time, location and magnitude of sudden events above the force threshold are observable. This paper derives observable space-time earthquake patterns from unobservable dynamics using data mining techniques, pattern recognition and ensemble forecasting. It thus gives insight into how data mining can be applied to finding the consequences of earthquakes and hence alerting the public.

SECURE COMMUNICATION BY QUANTUM PROTOCOL Ramya.A.P, Rizvana.M, Anna University of Technology Abstract In the communication scenario, there is a bottleneck in providing secure communication without encryption and without a secret key between users. To achieve this objective, principles of quantum mechanics can be applied: quantum mechanics replaces the mathematical encryption of conventional techniques. Enciphering and deciphering techniques are often utilized between senders and receivers to achieve enhanced secure communication, but they require mathematical computation and reduce speed. To avoid this, the quantum protocol contains an information channel whose existence is undetectable by any currently known technology; such hidden channels could effectively provide secure communication without encryption. The protection that quantum mechanics offers to keys could extend to the information transmitted during communication itself, thereby eliminating the use of key-based encryption. The email infrastructure has been chosen as the communication medium, providing email-based secure distributed information retrieval. The system is based on a store-and-forward paradigm, utilizing the public email system to facilitate search distribution and collaborative information retrieval; it ensures information security and privacy in the search and exchange of sensitive data over an open network. In this paper, a quantum protocol for secure transmission of data using qubits is presented.

The quantum protocol does not require a shared secret key, and secure communication can be achieved without encryption between the two parties.

OPTIMAL BIASED SYSTEM TO IMPROVE SNR Renu Roy, Francis Xavier Engineering College Abstract This paper deals with the problem of improving SNR. The modulating signal is the output of optical OFDM and is bipolar in general, so it cannot be used for intensity-modulated direct-detected systems, which require unipolar signals. Biasing followed by clipping simply transforms the bipolar signal into a unipolar one. This work is not focused on eliminating all clipping but on providing sufficient bias so that clipping is not the dominant noise source in the system.
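The bias-then-clip transformation from a bipolar to a unipolar waveform is simple to state in code. This is a minimal sketch of the principle only; choosing the bias optimally against SNR is the paper's actual subject and is not modelled here.

```python
def to_unipolar(samples, bias):
    """Add a DC bias to a bipolar OFDM waveform and clip the residual
    negative excursions at zero, yielding the unipolar drive signal an
    intensity-modulated direct-detected optical link requires."""
    return [max(s + bias, 0.0) for s in samples]

def clipped_fraction(samples, bias):
    """Share of samples that still get clipped - kept small by choosing a
    sufficient (but not excessive, power-wasting) bias."""
    return sum(1 for s in samples if s + bias < 0) / len(samples)
```

Raising the bias reduces the clipped fraction (less clipping noise) but increases the average optical power, which is the trade-off an optimal-bias design balances.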

LICENSE PLATE LOCALIZATION FROM LOCAL IMAGES Abstract Automatic license plate recognition (ALPR) for vehicles is a challenging area of research due to its importance to a wide range of commercial applications. The first and most important stage of any ALPR system is the localization of the license plate within the image captured by a camera. A variety of techniques has already been reported for localization of the license plate and recognition of the license number thereafter, but most of the work seems to be applicable only in a very controlled environment. In the current work, we have concentrated on localizing license plate regions in true-color still snapshots captured in a very realistic setting. The technique is based on a novel multi-stage analysis of vertical edge gradients from contrast-stretched grayscale images and successfully localizes the actual license plates in 89.2% of the images.

PRIVACY PROTECTING DISTRIBUTED ACCOUNTABILITY FOR DATA SHARING IN THE CLOUD G.Sathiya, V.VenkateshwaraDevi, Oxford Engineering College

Abstract Cloud computing is the use of computing resources (hardware and software) that are delivered as a service over a network (the Internet). It entrusts remote services with users' data, software and computation. Even though this new emerging technology has many advantages, users lose control of their own data (particularly financial and health data). This paper proposes an enhanced Decentralized Cloud Accountability Framework (DCAF) to keep track of the actual usage of users' data in the cloud. This object-centred framework uses JAR files which enclose the logging mechanism together with users' data and policies. To strengthen users' control, DCAF is also provided with effective auditing mechanisms. The proposed methodology applies advanced obfuscation techniques to the JAR files and provides additional security to the JRE.

HUMAN ACTION RECOGNITION IN DYNAMIC BACKGROUND USING DYNAMIC PROTOTYPE TREE S.Subalin, R.Femila Goldy, Anand Institute of Technology Abstract Action recognition has been a popular research topic in the vision community due to its wide applicability to multimedia analysis and video surveillance. Shape and motion are the most important and useful visual cues for human action recognition. Existing systems rely on high-dimensional descriptors for modeling action frames, which require tremendous amounts of computation for large-scale action retrieval and recognition. Moreover, existing approaches mostly assume simple backgrounds or static cameras and do not explicitly consider the challenging case of dynamic backgrounds. In the proposed system, an efficient prototype-based approach performs recognition via prototype matching and look-up table indexing. Actions are modeled by learning their prototypes in a joint shape-motion space via k-means clustering, and frame-to-frame distances are rapidly estimated via fast look-up table indexing. Once an action is recognized, a message is triggered to the administrator. Finally, the recognized actions are converted into frames; at the same instance, the frames from the Global Server are retrieved, and gray-scale conversion is applied to both the recognized frames and the Global Server frames to recognize the human.
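The prototype-matching step can be sketched as a nearest-centroid lookup in the joint shape-motion space: once every frame is mapped to the index of its closest k-means prototype, frame comparisons reduce to cheap table lookups on those indices. Descriptors and prototypes here are plain tuples for illustration.

```python
def nearest_prototype(frame_descriptor, prototypes):
    """Map a joint shape-motion descriptor to the index of its closest
    learned prototype (k-means centroid). Recognition then works on short
    sequences of prototype indices instead of high-dimensional descriptors."""
    def dist2(a, b):
        # squared Euclidean distance, enough for argmin comparisons
        return sum((x - y) ** 2 for x, y in zip(a, b))

    return min(range(len(prototypes)),
               key=lambda i: dist2(frame_descriptor, prototypes[i]))
```

With k prototypes, a k-by-k table of pairwise prototype distances can be precomputed once, so the per-frame matching cost no longer depends on the descriptor dimensionality.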

DESIGN OF ENERGY OPTIMIZATION ALGORITHM FOR THE WIRELESS SENSOR NETWORKS P.YOGAPRIYA, Dhanalakshmi Srinivasan College of Engg. and Tech Abstract In a power-constrained wireless sensor network, power consumption is an important criterion: by reducing the overall power consumption of the network, the network lifetime can be extended. Of the transmission, receiving, idle and sleep states, the idle state consumes more power than the sleep state, so much energy can be saved by putting idle nodes to sleep. Hence we propose to reduce the overall energy consumption to the maximum extent. Initially the energy of the nodes is estimated based on their respective states; then the idle nodes not involved in transmission are put to sleep, saving an overall power consumption of 10.23. After a detailed study of node-state behavior, a neighbor table is maintained for all nodes along with the RSSI metric. Based on the RTS/CTS concept, the neighbor node with the higher RSSI value is elected as the forwarding node, and the other nodes that receive the RTS are made to hibernate. Since the RSSI metric is based on distance, and the elected forwarding node will always be at roughly the same distance, maximum power drain occurs at the forwarding nodes. To overcome this, a sequential approach based on maximum RSSI and maximum residual power of the neighboring nodes is used for the selection of forwarding nodes. With this sequential approach, every individual node's power is utilized effectively for the entire network.
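The election of a forwarding node from RSSI and residual energy might be sketched as below. Combining the two metrics with a product is an illustrative assumption, since the abstract does not give an explicit formula; the point is only that link quality alone would drain one node, while a joint score rotates the role.

```python
def elect_forwarder(neighbors):
    """Pick the next-hop forwarder from a neighbor table.

    Each neighbor is a dict with 'rssi' (normalized link quality) and
    'residual_energy' (normalized remaining battery). Scoring them jointly -
    here by a simple product, an assumed combination - avoids repeatedly
    electing the same well-placed but depleted node."""
    return max(neighbors, key=lambda n: n["rssi"] * n["residual_energy"])
```

A node with a slightly worse link but a much fuller battery wins the election, spreading the forwarding load across the neighborhood.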

VLSI DESIGN OF MIMO OFDM FOR THE FUTURE WIRELESS COMMUNICATION G.Jaya Padmapriya Nallathambi, Jaya Engineering College Abstract We design an OFDM physical layer that follows the IEEE 802.11a standard, then devise an efficient pipelined architecture and incorporate it into the MIMO-OFDM physical layer. In our experiments, we compare our pipelined architecture to the baseline MIMO-OFDM physical-layer implementation, which uses the same number of fast Fourier transform (FFT) blocks as antennas. The implementation efficiency of our pipelined architecture, compared with the baseline MIMO-OFDM system, is evaluated using two methods: (1) using just one FFT block, and (2) using a Radix-2 pipelined streaming FFT block versus the Radix-4 FFT block used in the baseline system. Our experiments show that at least 30 percent of the resources in the baseline MIMO-OFDM system can be saved using our proposed architecture, while achieving the same data rate; this data rate can be doubled with approximately the same resource reduction. Moreover, by exploiting dynamic reconfiguration, our MIMO-OFDM system can adapt to various operating modes. For MIMO-OFDM systems using a single FFT block shared across modulations, our experimental results show that the proposed implementation saves at least 60 percent of the hardware resources while achieving the same data rate as known baseline MIMO-OFDM implementations. We also show that as more channels are used, more resources can be saved with the proposed architecture.

EVOLUTION OF ONTOLOGY BASED ON FREE TEXT DESCRIPTOR FOR WEB SERVICE S. Chitra, P.Ragavendiran, V. Kajendran, M. Manish Kumar Singh, MVIT Abstract Ontologies have become the de-facto modeling tool of choice, employed in a variety of applications and prominently in the Semantic Web. Nevertheless, ontology construction remains a daunting task. Ontological bootstrapping, which aims at automatically generating concepts and their relations in a given domain, is a promising technique for ontology construction. Bootstrapping an ontology based on a set of predefined textual sources, such as Web services, must address the problem of multiple concepts that are largely unrelated. This paper exploits the fact that Web services usually consist of both WSDL and free-text descriptors. The WSDL descriptor is evaluated using two methods, namely Term Frequency/Inverse Document Frequency (TF/IDF) and Web context generation. We propose an ontology bootstrapping process that integrates the results of both methods and validates the concepts using the free-text descriptors, which offer a more accurate definition of the ontologies. We extensively validated our ontology bootstrapping process.
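The first WSDL evaluation method, TF/IDF, has a standard formulation that can be sketched directly. Documents here are plain token lists; real WSDL descriptors would first be tokenized and stop-word filtered.

```python
import math

def tf_idf(term, doc, corpus):
    """Plain TF/IDF: a term scores high when it is frequent in one
    descriptor (tf) but rare across the whole corpus (idf). Assumes the
    term occurs in at least one corpus document."""
    tf = doc.count(term) / len(doc)
    n_containing = sum(1 for d in corpus if term in d)
    idf = math.log(len(corpus) / n_containing)
    return tf * idf
```

A term like "flight" appearing in only one service descriptor outranks a term like "book" shared by several, which is exactly the property used to pick concept candidates.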

ANOMALY DETECTION IN MULTI-TIER WEB APPLICATIONS WITH ENHANCED KEY GENERATION IN DOUBLE GUARD ARCHITECTURE P. Sivaranjani, A.Sandhiya Devi, R.Suganya, S.Uthayashangar, Manakula Vinayagar Institute of Technology Abstract The services provided by the internet keep increasing, as web services and applications are used to satisfy our day-to-day requirements. Since internet services are common to all users, the security level provided to the personal details of a single user is not satisfactory. Double Guard is a system that provides security to both the front end (web server) and the back end (database) using the lightweight virtualization concept; however, it ensures security only after the creation of user-dedicated sessions and fails to provide initial-level security. We therefore propose a scheme that provides initial security to web applications within the Double Guard system by means of re-encrypted secret key generation. The system was applied successfully to prominent web applications.

FAST AND ACCURATE MULTITASK SALIENCY DETECTION BY CONSIDERING THE FEATURES OF COLOUR, TEXTURE, ORIENTATION AND LUMINANCE S.Hambert Solomon raja, PSN College of Engineering and Technology Abstract We propose a new type of multitask saliency detection which aims at detecting the image regions that represent the scene. This definition differs from previous ones, whose goal is either to identify fixation points or to detect the dominant object. In accordance with our saliency definition, we present a detection algorithm based on four principles observed in the psychological literature. The benefits of the proposed approach are evaluated in two applications where the context of the dominant objects is just as essential as the objects themselves. In image retargeting we demonstrate that using our saliency prevents distortions in the important regions; in summarization we show that it helps to produce compact, appealing, and informative summaries.

CLUSTERED TECHNIQUES OVER BIOLOGICAL SEQUENCE PREDICTED USING BIMAX S. K. Anantha Priyaa Abstract Five biclustering algorithms (QUBIC, SAMBA, ISA, FABIA, and BIMAX) were applied to identify biclusters in gene expression data. The GDS 1620 dataset and a pathway dataset were used to compare the five algorithms under different dimensions. Testing verified the corresponding biological significance using Gene Ontology (GO) and Protein-Protein Interaction (PPI), and performance and quality were evaluated using two scoring methods: Weighted Enrichment (WE) scoring and PPI scoring. BIMAX was found to be the most efficient of the five biclustering algorithms. In the proposed work, viral-infection datasets are clustered using the BIMAX algorithm and then biclustered in order to avoid sequence redundancy. According to this study, clustering and then biclustering the data make it easy to analyze viral infections.

A TWIN PRECISION MULTIPLIER BASED MULTI-RESOLUTION FAST FILTER BANK FOR SPECTRUM SENSING IN MILITARY RADIO RECEIVERS Ancy Michel.M, Oxford Engineering College Abstract In this paper, we propose a twin-precision-multiplier-based multi-resolution filter bank (MRFB) for spectrum sensing in military radio receivers. The flexibility to realize a multiple-sensing-resolution spectrum sensor is achieved by suitably designing the prototype filter and efficiently selecting the varying-resolution sub-bands without hardware re-implementation. To improve performance, we use a twin-precision technique for the fast filter bank in MR receivers. A twin-precision multiplier is able to adapt to the different requirements of a multi-resolution filter bank: by adapting to the actual multiplication bit width, it is possible to save power, increase speed and double the computational throughput. With our proposed twin-precision multiplier scheme for the fast filter bank in MR receivers, the execution time is reduced and efficiency is increased. The proposed filter bank architecture achieves a lower gate count and power reduction compared with conventional filter banks.
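The twin-precision idea, one wide multiplier producing two independent narrow products when the cross partial products are gated off, can be mimicked arithmetically. This sketch models an 8x8 multiplier split into two 4x4 ones; in hardware the cross terms would be gated in the partial-product array rather than subtracted after the fact.

```python
def twin_multiply_4x4(a_hi, a_lo, b_hi, b_lo):
    """Emulate twin-precision mode on an 8x8 multiplier: pack two 4-bit
    operands into each 8-bit input, suppress the cross partial products,
    and read the two independent 4x4 products out of the disjoint halves
    of the 16-bit result."""
    a = (a_hi << 4) | a_lo
    b = (b_hi << 4) | b_lo
    full = a * b
    # remove the cross terms a_hi*b_lo and a_lo*b_hi (gated off in hardware)
    full -= ((a_hi * b_lo) << 4) + ((a_lo * b_hi) << 4)
    lo_product = full & 0xFF          # a_lo * b_lo, fits in 8 bits
    hi_product = (full >> 8) & 0xFF   # a_hi * b_hi, fits in 8 bits
    return hi_product, lo_product
```

Doubling the throughput this way for the narrow coefficients of a filter bank is what lets the same multiplier serve several sensing resolutions.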


A FULLY AUTOMATIC SEGMENTATION OF LIVER AND HEPATIC TUMOR FROM 3D CT ABDOMINAL IMAGES Angel.G, Evangeline Angel.S, Hyrun Fathima I, JayarajAnnapackiam C.S.I College of Engineering, Nazareth Abstract An adaptive initialization method was developed to produce fully automatic processing frameworks based on graph-cut and gradient-flow active contour algorithms. This method was applied to abdominal Computed Tomography (CT) images for segmentation of liver tissue and hepatic tumors. Twenty-five anonymized datasets were randomly collected from several radiology centres without specific requests on acquisition parameter settings or patient clinical situation as inclusion criteria. The resulting automatic segmentations of liver tissue and tumors were compared to reference-standard delineations performed manually by a specialist. The analyzed datasets presented 52 tumors: the graph-cut algorithm detected 48 tumors, while the active contour algorithm detected only 44. In addition, the graph-cut algorithm required less time than the active contour one. The implemented initialization method allows fully automatic segmentation, leading to superior overall performance of the graph-cut algorithm in terms of accuracy and processing time. The initialization method presented here proved suitable and reliable for two different segmentation techniques and could be extended further.

PRE-COMPUTATION ARCHITECTURE WITH T-ALGORITHM FOR VITERBI DECODER DESIGN P.Moorthy, Anju Sasi, Vivekananda College of Engineering for Women Abstract A popular combination in modern coding systems is the convolutional encoder and the Viterbi decoder. With a proper design, they can jointly provide acceptable performance with feasible decoding complexity. In this paper, we propose an area-efficient architecture based on pre-computation for Viterbi decoders incorporating the T-algorithm.
Through optimization at both the design level and the architecture level, the new architecture greatly shortens the long critical path introduced by the conventional T-algorithm. A general solution to derive the optimal pre-computation steps is also given in the paper. The Viterbi decoder is the dominant module in TCM decoders used in space communications. The design example, using a rate convolutional code used in the TCM system,


provided in this work demonstrates more than twice the improvement in clock speed with negligible computation overhead and high power reduction while maintaining decoding performance.

INTERACTIVE IMAGE SEGMENTATION USING DYNAMIC BAYESIAN NETWORK Anu Antony, Benesh Selva Nesan, A.Prabin, The Rajaas Engineering College Abstract Segmenting semantically meaningful whole objects from images is a challenging problem, and it becomes especially so without higher-level common-sense reasoning. In this project, we present an interactive segmentation framework that integrates image appearance and boundary constraints in a principled way to address this problem. In particular, we assume that small sets of pixels, referred to as seed pixels, are labeled as object and background. The seed pixels are used to estimate the labels of the unlabeled pixels using Dirichlet-process multiple-view learning. The boundary extraction problem is formulated as a dynamic Bayesian network, and a novel approach to Dirichlet mixtures with state pruning is used to find the optimal boundary in a robust and efficient manner based on the extracted external and internal local costs, thus handling much more inexact user boundary annotations than existing methods.

IMPLEMENTING SCALING ONLINE SOCIAL NETWORKS USING SOCIAL PARTITIONING AND REPLICATION Anusree V.K, M. Edwin Jayasingh, Infant Jesus College of Engineering Abstract Distributed networking is an important application of networking, and a social networking service is an online service that focuses on the building of social networks. Vertical scaling adds resources to a single node in a system, involving the addition of CPUs or memory to a single computer. Such vertical scaling of existing systems enables more effective use and provides more resources for the hosted set of operating systems. Horizontal scaling adds more nodes to a system, such as adding a new computer to a distributed software application. This model has created an increased demand for shared data storage with very high I/O performance. When a server's CPU is bound, adding more servers does not help serve more requests [1]. Due to the complex nature of OSNs, existing partitioning techniques do not produce an optimal solution for back-end data scalability. In this paper, a social partitioning and replication model for the middleware jointly partitions and replicates the underlying community structure to provide an optimal solution for back-end data scalability for online social networks and to ensure that all data is local.

A STUDY ON UPCOMING TRENDS IN ELDERLY HOME MONITORING SYSTEM N.Beaulah, AL.Kumarappan, Sri Sai Ram Engineering College Abstract Wireless sensor networks offer extra advantages when combined with the advancements in MEMS technology and are nowadays widely used in biomedical applications. A WSN-based home monitoring system observes elderly activity behaviour; by regular monitoring we can determine the wellness of the elderly. The system can also be used to monitor physiological parameters, such as temperature and heart rate, of a human subject. Using MEMS sensors to detect falls and to measure different vital signs, the person is wirelessly monitored within his own house, which preserves the privacy of the elderly. The captured signals are wirelessly transmitted to an access point located within the patient's home; this connectivity is based on wireless data transmission at the 2.4-GHz frequency. The access point is a small box attached to the Internet through a home asymmetric digital subscriber line router. Afterwards, the data are sent to the hospital via the Internet in real time for analysis and/or storage. The programmed system minimizes the number of false messages sent to the care provider and supports the inhabitant through suitable prompts and actions to be performed when there is irregular behaviour in the daily activity.

PERFORMANCE ASSESSMENT OF CLUSTER NODES WITH DYNAMIC PRIORITY SCHEDULING IN PRIVATE CLOUD ENVIRONMENT Balakannan S.P, Bhavani R, Kalasalingam University Abstract Cloud computing is an emerging technology that provides its users with efficient and reliable resource-sharing mechanisms. Users of the cloud can store, retrieve and process their data and files on demand. The use of differentiated cloud services spares users high computational cost and reduces the burden of data storage. In order to get an effective cloud service, it is crucial to determine the performance of the cloud nodes. In existing work, the performance of cluster nodes has been evaluated with different cluster configurations deployed in a multi-cloud environment, based on throughput measures. In our proposed work, we analyze the performance of the cluster nodes in a cloud environment, evaluating each node in the cluster based on its scalability and response time. We also concentrate on the scheduling of tasks to the cluster nodes, using the Dynamic Priority Scheduling algorithm to schedule the tasks in the cluster.

FIELD PROGRAMMABLE GATE ARRAY BASED ADAPTIVE NEUROMORPHIC CHIP FOR OLFACTORY SYSTEM M.Sudalina Devi, J.Nalini, PSN College of Engineering and Technology Abstract An electronic nose is an artificial neural network system used to detect or classify the odour of a specimen, and it finds wide application in commercial industries. To identify a new sample and then estimate its concentration, we use both a spike-timing-dependent plasticity learning technique and the least-squares regression principle: the first teaches the system how to discriminate among different gases, while the second uses the least-squares regression approach to predict the concentration of each type of sample. The proposed system suggests a scalable and generic architecture. It aims at reducing area overhead by incorporating a transposable SRAM array that shares learning circuits which grow with the number of neurons; the system is also trained for use in the chemical industry by coupling a chemo-sensor array. All the component subsystems implemented on the neuromorphic chip have been successfully tested in FPGA.
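The spike-timing-dependent plasticity rule used for gas discrimination can be sketched with the standard exponential STDP window: a synapse is strengthened when the pre-synaptic spike precedes the post-synaptic one, and weakened otherwise. The constants below are illustrative defaults, not values from the paper.

```python
import math

def stdp_delta_w(dt, a_plus=0.1, a_minus=0.12, tau=20.0):
    """Pairwise STDP weight update for a spike pair separated by
    dt = t_post - t_pre (in ms). Potentiation for dt > 0 (causal pair),
    depression for dt <= 0, both decaying exponentially with |dt|.
    a_plus, a_minus and tau are illustrative constants."""
    if dt > 0:
        return a_plus * math.exp(-dt / tau)    # pre before post: strengthen
    return -a_minus * math.exp(dt / tau)       # post before pre: weaken
```

Closely timed causal pairs produce the largest potentiation, which is how repeated presentation of one gas pattern carves out a selective set of synapses.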

INFORMATION SECURITY USING CHAOTIC MIXING IN DIGITAL STEGANOGRAPHY

L.ESTHER PONNAMMAL, V.DIVYA MEENAKSHI, R.M.K. Engineering College Abstract The rapid development of data transfer through the internet has made it easier to send data accurately and faster to the destination. One of the most important factors in information technology and communication has been the security of information, and for this purpose the concept of steganography is used. Steganography is an art of invisible communication. In this paper we propose a new method for strengthening the security of information through a combination of signal processing, cryptography and steganography. A data-hiding technique is used which generates an identifier based on chaotic mixing, providing the structure for the generation of a sorted sequence from an original sequence. The concept of toral automorphism is applied: a digital image subjected to iterated actions of a matrix A first encounters complete chaos, i.e. the lattice L (the stego-image) disperses with its points distributed irregularly, and then these points come back to their original positions after a specific number of iterations. This is one of the secure methods of data hiding.

BIOMETRIC IDENTIFICATION OF PERIOCULAR REGIONS USING TEXTURE FEATURE P.Fasca Gilgy Mary, P.Sunitha Kency Paul, J. Dheeba, Noorul Islam Centre For Higher Education Abstract Biometrics is used in computer science as a form of identification and access control. Biometric identification is the process by which a person can be identified by his characteristics. Automated biometric identification can be done rapidly and uniformly, with a minimum of training, and provides extremely accurate and secure access to information. In this project work, periocular biometric recognition is used, which is based on the appearance of the region around the eye. Periocular recognition may be useful in applications where it is difficult to obtain a clear picture of an iris for iris biometrics or a complete picture of a face for face biometrics. Acquisition of periocular biometrics does not require high user cooperation or a close capture distance, unlike other ocular biometrics. This region usually encompasses the eyelids, eyelashes, eyebrows, and the neighbouring skin area, and it carries information for both face recognition and iris recognition systems. In this work, Local Binary Patterns (LBPs) are used for feature extraction on the periocular images. LBP is a type of feature used for classification in computer vision and a powerful feature for texture. A modified backpropagation neural network classifier is used for the classification and recognition of an authenticated individual.
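The LBP feature extraction named above has a standard 3x3 formulation that can be sketched directly: each of the eight neighbours contributes one bit of a "brighter than the centre" test, giving an 8-bit texture code per pixel.

```python
def lbp_code(patch):
    """8-bit local binary pattern for the centre pixel of a 3x3 patch
    (a list of three rows of three intensities). Each neighbour sets its
    bit when it is at least as bright as the centre; histograms of these
    codes over image blocks form the periocular texture descriptor."""
    c = patch[1][1]
    # neighbours taken clockwise starting from the top-left corner
    ring = [patch[0][0], patch[0][1], patch[0][2], patch[1][2],
            patch[2][2], patch[2][1], patch[2][0], patch[1][0]]
    return sum((1 << i) for i, v in enumerate(ring) if v >= c)
```

A flat bright ring around a dark centre yields code 255, a dark ring yields 0; intermediate codes capture edges and corners, and their block histograms feed the classifier.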

AN EFFICIENT REFINING OF CBMIR THROUGH SUPERVISED LEARNING APPROACH A.Glorin Brittal Rani ,Sangeetha Senthilkumar Oxford Engineering College Abstract CBIR technique is becoming increasingly important in medical field in order to store, manage, and retrieve image data based on user query. In order to reduce the computational time for SVM
104

training we introduce unsupervised clustering before supervised classification. Searching is done by means of matching the image features such as texture, shape, or different combinations of them. Texture features play an important role in computer vision, image processing and pattern recognition. We introduce a novel method of using SVM classifier followed by KNN for CBIR using texture and shape feature. PERFORMANCE ENHANCEMENT OF EZ-SOURCE INVERTER USING INDUCTION MOTOR N.Gurusakthi, R.Sivaprasad Abstract Since the Z-Source element has fewer complexes they are used mainly for buck-boost energy conversion with the help of passive elements. The further advancement in the Z-Source is the Embedded EZ-Source inverter that can produce the same gain as Z-Source inverter. The input to the Embedded EZ-Source inverter is been obtained from solar cell .Therein the ripples which we obtain from the output voltage of solar cell is filtered using Z-Filter. Pure DC is given to the threephase inverter, followed by the conversion to balanced AC. The output of the Embedded EZ Source inverter is used to control the harmonics present in the load. The entire process is analysed with the help of MATLAB-SIMULINK DESIGN OF SOC WIRE BASED ON NOC ARCHITECTURE Hemalatha.T, Daisy Rani.T Abstract The objective is to design and implement SoCWire using NOC architecture. SOC wire using NOC architecture have two parts they are SOC wire CODEC and SOC wire switch.SOC wire CODEC comprises of five major parts they are receiver FIFO, receiver ,statemachine,transmitter and transmitter FIFO. Which connects a node or host system to a SOC wire network.SOC wire switch is used for routing data of many CODEC from one node to many other nodes. In my work I have considered four such nodes. Along with SOC wire CODEC Hamming code is introduced for single bit error detection and correction. 
The SoCWire switch is implemented with an 8-port crossbar switch. The design is synthesized on Xilinx ISE 9.1 using VHDL coding. SoCWire is mainly used for space applications.

DOMOTATION USING HYBRID PROTOCOLS
ABINAYA.K, BALAJI.G, JEGANATHAN.V
Angel College of Engineering and Technology, Sri Sai Ram Engineering College, Hindustan University
Abstract Building automation optimizes energy savings and reduces operating costs, lowering the total cost of ownership. It furthermore enhances security, protection, and convenience. This paper focuses on the integration of electrical automation devices using hybrid protocols. It facilitates the creation of new functionalities by connecting individually working electrical systems and circuits into a network. It deals with automation of the building in aspects like heating, ventilation and air conditioning (HVAC), lighting control, and health monitoring. So, the final building automation system has different subsystems which are finally brought together into an integrated building supervision system. The main purpose is to provide the end user with an economical, fully centralized system in which home appliances are managed by both wired and wireless networks. The system is able to automate the building with low power expenditure, and it can be implemented with increased power backup, with the backup power utilized on a priority basis. Maintenance risk can be reduced because of the shared communication between the subsystems.

INVISIBILITY TECHNOLOGY REPLACES MAGIC
P.David raj, T.Baskaran, V.R.S College of Engineering and Technology

Abstract This paper describes a kind of active camouflage system named Optical Camouflage. Optical Camouflage uses Retro-reflective Projection Technology, a projection-based augmented-reality system composed of a projector with a small iris and a retro-reflective screen. The object that needs to be made transparent is painted or covered with retro-reflective material. A projector then projects the background image onto it, making the masked object virtually transparent.

NEIGHBORING NODE INTRUSION DETECTION SECURITY FOR MANET ROUTING
Gopalakrishnan.T, Jahir Hussain.V, M.A.M College of Engineering

Abstract Mobile ad hoc networking (MANET) has become an exciting and important technology in recent years because of the rapid proliferation of wireless devices. A MANET is more prone to both insider and outsider attacks than wired and infrastructure-based wireless networks. It is highly vulnerable to attacks due to the open medium, dynamically changing network topology, cooperative algorithms, and lack of centralized administration. Intrusion detection systems (IDS) provide inspection and observation capabilities that offer nearby security to a node and help to determine the specific trust level of other nodes. We propose a Neighboring Node Intrusion Detection (NNID) security routing mechanism to detect the Black Hole Attack (BHA) over the Ad hoc On-Demand Distance Vector (AODV) MANET routing protocol. In the NNID security routing mechanism, intrusion detection is performed by neighboring nodes using the nodes near the attacker node. By applying the NNID security routing mechanism, the security mechanism overhead is decreased.

BRAIN CONTROLLED CAR FOR DISABLED USING ARTIFICIAL INTELLIGENCE
J.Subash, K.S.Sharath kumar

Abstract This paper considers the development of a brain-driven car, which would be of great help to physically disabled people. Since these cars rely only on what the individual is thinking, they do not require any physical movement on the part of the individual. The car integrates signals from a variety of sensors such as video, weather monitoring, and anti-collision sensors; it also has an automatic navigation system in case of emergency. The car works on the asynchronous mechanism of artificial intelligence. It is a great advance of technology which will make the disabled, abled. In the 40s and 50s, a number of researchers explored the connection between neurology, information theory, and cybernetics. Some of them built machines that used electronic networks to exhibit rudimentary intelligence, such as W. Grey Walter's turtles and the Johns Hopkins Beast. Many of these researchers gathered for meetings of the Teleological Society at Princeton and the Ratio Club in England. Most researchers hope that their work will eventually be incorporated into a machine with general intelligence (known as strong AI), combining all the skills above and exceeding human abilities at most or all of them. A few believe that anthropomorphic features like artificial consciousness or an artificial brain may be required for such a project.

DEVELOPMENT OF REPLICA FREE REPOSITORIES USING PARTICLE SWARM OPTIMIZATION ALGORITHM
Jeby K Luthiya, SNS College of Technology; C. Umamaheswari, Vivekananda College of Engineering for Women

Abstract The increasing volume of information available in digital media is becoming a challenging problem for data administrators. Usually built on data gathered from different sources, data repositories such as those used by digital libraries and e-commerce brokers present records with disparate schemata and structures. The increased volume also creates redundant data in the database, so a system or method to control redundancy and duplication has become essential. In the proposed approach, we present a method that uses the Particle Swarm Optimization (PSO) algorithm to generate the optimal similarity measure for deciding whether data is duplicate or not. The PSO algorithm generates the optimal similarity measure from the training datasets. Once the optimal similarity measure is obtained, deduplication of the remaining datasets is done with the help of that measure.

NONINVASIVE LOAD DELEGATION MODEL IN GRID ENVIRONMENT
B.JEYANTHI, S.SUPRAKASH, Kalasalingam University

Abstract In a Grid environment, the main consideration of a scheduling policy is to reduce the response time and execution time of a job. Some applications might need several resources of different types, and it is common for the resource needs of grid applications to go beyond what is available in any one of the sites making up a grid. To run such applications, a method known as co-allocation is needed, that is, the simultaneous or coordinated access of single applications to resources of possibly multiple types in multiple locations managed by different resource managers. Allocating resources across multiple clusters can reduce the execution time of a job by an appreciable amount. Such multi-cluster systems can give access to larger computational power and to a wide range of resources. In this paper, we propose a decentralized grid system model as a group of clusters. We then introduce decentralized job scheduling algorithms that perform intra-cluster and inter-cluster (grid) job scheduling. We model the grid as a group of clusters, with groups of users submitting jobs to the various clusters. In centralized scheduling, the scheduler of each cluster schedules the submitted jobs. In decentralized scheduling, jobs, although submitted locally, may be migrated to a different cluster so as to reduce their processing time.
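The PSO-based deduplication idea described above decides whether two records are duplicates by scoring their similarity against a threshold. The sketch below is illustrative only: the per-field weights and the threshold are hand-picked stand-ins for the values a particle swarm would actually optimize, and the simple string-ratio similarity is an assumption, not the paper's measure.

```python
# Illustrative record-deduplication sketch. In the PSO-based approach,
# the weights and the decision threshold are the parameters the swarm
# would optimize; here they are fixed by hand for demonstration.
from difflib import SequenceMatcher

def field_sim(a, b):
    """Normalized string similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def is_duplicate(rec1, rec2, weights, threshold=0.85):
    """Weighted average of per-field similarities, compared to a threshold."""
    score = sum(w * field_sim(rec1[k], rec2[k]) for k, w in weights.items())
    return score / sum(weights.values()) >= threshold

weights = {"title": 0.6, "author": 0.4}   # hypothetical, hand-picked values
r1 = {"title": "Data Clustering Methods", "author": "A. Kumar"}
r2 = {"title": "Data Clustering  Methods", "author": "A Kumar"}
print(is_duplicate(r1, r2, weights))
```

In the paper's setting, each particle would encode a candidate weight vector, and the swarm's fitness function would measure deduplication accuracy on the labeled training datasets.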

A NOVEL CLUSTERING METHOD BASED ON HACO AND FAPSO FOR CONTINUOUS DOMAIN
P.Kiruthiga, J.Mercy Geraldine, Srinivasan Engineering College

Abstract Data has to be clustered for easy mining of the required content. Data clustering is an essential technique for web applications and organizations; however, clustering performance has to be optimized to form usable and efficient data clusters. Many optimization methods have been suggested to improve the clustering performance of fuzzy c-means clustering, and the FAPSO and HACO optimization techniques have been proposed for this purpose. However, these traditional methods suffer from various limitations, such as sensitivity to initialization, trapping in local minima, and lack of prior knowledge of the optimum parameters of the kernel functions. Considering the performance of clustering techniques, kernel methods are used in the kernelized fuzzy c-means algorithm to improve the clustering performance of the well-known fuzzy c-means algorithm. This is done by mapping the considered dataset non-linearly into a higher-dimensional space, where the obtained dataset is more likely to be linearly separable. To overcome the drawbacks, a new clustering method based on the recently proposed hybrid ant colony optimization for continuous domains and particle swarm optimization is proposed. The proposed method is applied to a dataset extracted from the MIT-BIH arrhythmia database; four domain features are extracted for each type, and training and test sets are formed. This algorithm can be used in various applications, such as web applications and classifying ECG records.

ONTOLOGY EXTRACTION FROM WIKIPEDIA
Mahalekshmi

Abstract Ontology plays an important role in knowledge management and the semantic web. In this paper we construct an ontology for the domain of computer science and propose an automatic updating methodology, using the computer science portion of Wikipedia as our source. The proposed approach consists of three phases. In the first phase, the wiki pages are downloaded from Wikipedia using a web robot. In the second phase, concepts and relations are identified from the extracted wiki pages for constructing the ontology, using the proposed CRI algorithm. In the third phase, any modification in the wiki pages is identified and updated automatically.

ADAPTIVELY PIPELINED PARALLEL LINEAR PHASE MIXED DIGITAL FIR FILTER
Maragathavalli.N, Sriram Engineering College

Abstract
Based on fast FIR algorithms, this brief proposes high-throughput, low-latency digital finite impulse response (FIR) structures. These structures are beneficial for symmetric convolution of odd length in terms of hardware cost. The proposed parallel FIR structures exploit the intrinsic nature of symmetric coefficients as well as asynchronous pipelines, which reduce latency and improve area efficiency. The main aim is to reduce the number of multipliers, which require more space; using adders instead of multipliers is advantageous because adders occupy less space. For an N-tap three-parallel FIR filter, the proposed structure can save N/3 multipliers at the cost of additional adders in the pre-processing and post-processing blocks. The proposed parallel FIR structures can lead to significant hardware savings for symmetric convolution, which is the main process in FIR filter implementation. The proposed structure can be viewed as a filter having synchronous and asynchronous components performing the filter function.

POWER OPTIMIZATION IN MULTITHRESHOLD CMOS
Minu Johny, P.Moorthy, Vivekananda College of Engineering for Women

Abstract Power gating (MTCMOS) has emerged as an increasingly popular technique to reduce leakage power during standby mode while attaining high speed in the active mode. We propose a new reactivation solution which helps in controlling mode-transition noise and in achieving minimum reactivation times. A triple-phase sleep-signal slew-rate modulation technique (TPS) has been proposed as an efficient solution to such problems. In order to achieve the best leakage power saving under equal-noise constraints with UMC 80nm CMOS technology, a digital sleep-signal modulator is presented. Reactivation time, mode-transition energy consumption, and leakage power consumption are reduced, thus optimizing total power consumption in multithreshold CMOS. In this method, a total power of about 0.618 mW is consumed by the new sleep-signal generator. The results obtained indicate that our proposed techniques can achieve a 70.7% reduction in total power.

DIGITAL JEWELLERY MADE POSSIBLE USING WIRELESS COMMUNICATIONS
R.Selva suganthi, Dr.Sivanthi Adithanar College of Engineering, Tiruchendur

Abstract Mobile computing is beginning to break the chains that tie us to our desks, but many of today's mobile devices can still be a bit awkward to carry around. In the next age of computing, there will be an explosion of computer parts across our bodies, rather than across our desktops. Basically, jewellery adorns the body and has very little practical purpose. The combination of microcomputer devices and increasing computer power has allowed several companies to begin producing fashion jewellery with embedded intelligence, i.e., digital jewellery. Digital jewellery can best be defined as wireless, wearable computers. Even the devices we use are protected by passwords, and it can be frustrating trying to keep up with all of the passwords and keys needed to access any door or computer program. This paper discusses a new Java-based computerized ring that will automatically unlock doors and log on to computers, allowing us to communicate by way of e-mail, voicemail, and voice communication, and explains how various computerized jewellery (ear-rings, necklaces, rings, bracelets, etc.) will work with mobile embedded intelligence.

INTELLIGENT CAR SYSTEM FOR ACCIDENT PREVENTION USING ARM-7
R.Muthu Lakshmi, K.Anitha, Dr.Sivanthi Aditanar College of Engineering

Abstract This project is about making cars more intelligent and interactive: they may notify or restrain the user under unacceptable conditions, and they may provide critical information about real-time situations to rescue services, the police, or the owner himself. Driver fatigue resulting from sleep deprivation or sleep disorders is an important factor in the increasing number of accidents on today's roads. In this paper, we describe a real-time online safety prototype that controls the vehicle speed under driver fatigue. The purpose of such a model is to provide a system that detects fatigue symptoms in drivers and controls the speed of the vehicle to avoid accidents. The main components of the system are a number of real-time sensors (gas, eye-blink, alcohol, fuel, and impact sensors) and a software interface with GPS and the Google Maps API for location.

AN ONTOLOGY BASED APPROACH FOR ACTIVITY RECOGNITION FROM SMART HOMES
Mr. Diwakar, Mr. S. Nandha Gopal, Arunai Engineering College

Abstract

In the first instance, it aims to provide an overview addressing the state of the art in the area of activity recognition in smart homes. Smart homes are augmented residential environments equipped with sensors, actuators, and devices. Earlier methods used data-driven approaches only on sensor data; this paper introduces a knowledge-driven approach to real-time, continuous activity recognition and describes the underlying ontology-based recognition process. We analyze the characteristics of smart homes and Activities of Daily Living (ADL), upon which we build both context and ADL ontologies. For the recognition process, we concern ourselves with one type of stochastic signal model, the hidden Markov model.

CROP CLASSIFICATION USING MODIS IMAGERY
G. ARULSELVI, N.NANDHINI, Annamalai University

Abstract This paper aims at proposing a remote sensing-based methodology to map the spatio-temporal evolution of the frontiers and the evolution stages of the agricultural frontier in Tamil Nadu. This active agricultural frontier is moving toward the Cuddalore, Perambalur, and Nagapattinam areas in the State of Tamil Nadu, where expansion of crops like sugarcane, cashew nut, and casuarina has been considered a main driver of deforestation for more than 30 years. Here, deforestation and classification maps (computed on MODIS EVI time series) are used. Geographical concepts of this agricultural frontier assume that its progress is carried out through five stages corresponding to the evolution of three frontiers (deforestation, economic, and intensification frontiers). The final maps highlight the fact that few areas have reached the final intensive stage of the agricultural frontier.

DATA MINING OF SOCIAL MEDIA SPECIFIC STRINGS FOR RAPID FORENSIC INVESTIGATION
N.Nivaedita, G.Rajeswari, V.Sowmiya kalaimathi, JCE

Abstract Instant messengers have become an important means of communication. Millions of people, regardless of age, nationality, gender, and computer skills, spend a lot of time using them every day. Social networks have thus occupied the place of the traditional messaging systems of the past, and more and more communications are migrating from public chat rooms and private messengers into online Social Networking Sites (SNSs). As cybercrimes have mushroomed in recent years, more and more digital crime investigations have strong relations to these SNSs. Communications extracted from social networking sites can be extremely valuable and useful to all kinds of investigators, including forensic investigators. We put the spotlight on distinct strings specific to each SNS, on volatile memory analysis, and on the display of case details as keys to a successful digital investigation through rapid and improved retrieval performance.

BRAIN TUMOR DETECTION AND IDENTIFICATION USING IMAGE PROCESSING AND SOFT COMPUTING
G.Athilakshmi vinothini, P.Nivetha, A.Sahaya Suji

Abstract In this paper, modified image segmentation techniques are applied to MRI scan images in order to detect brain tumors. For segmentation, we analyze the existing process and propose a better algorithm for detecting brain tumors. For the subsequent identification step, we use a probabilistic neural network classifier, which distinguishes tumor tissue from normal tissue.
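The two-stage pipeline the brain-tumor abstract describes (segmentation, then classification of the segmented region) can be sketched minimally as follows. The toy image, the fixed intensity threshold, and the nearest-mean rule standing in for the paper's probabilistic neural network are all illustrative assumptions, not the authors' actual algorithm.

```python
# Minimal two-stage sketch: threshold-based segmentation of a grayscale
# slice, then classification of the segmented region. A nearest-mean rule
# over mean intensity stands in for the paper's PNN classifier.

def segment(image, threshold):
    """Return coordinates of pixels brighter than the threshold."""
    return [(r, c) for r, row in enumerate(image)
                   for c, v in enumerate(row) if v > threshold]

def classify(image, region, class_means):
    """Assign the class whose mean intensity is nearest the region's mean."""
    mean = sum(image[r][c] for r, c in region) / len(region)
    return min(class_means, key=lambda label: abs(class_means[label] - mean))

# Made-up 3x3 "slice" with one bright blob; made-up class means.
slice_ = [[10, 12, 11],
          [13, 200, 210],
          [12, 205, 11]]
region = segment(slice_, threshold=100)
label = classify(slice_, region, {"tumor": 190, "normal": 20})
print(region, label)
```

A real system would of course operate on full MRI volumes and use richer texture and shape features, but the control flow (segment, extract features, classify) is the same.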

CIPHERTEXT-POLICY ATTRIBUTE SET BASED ENCRYPTION FOR SECURE CLOUD COMPUTING
Priyanga P.T and Anand T, Madha Engineering College

Abstract Cloud computing is the delivery of computing and storage capacity as a service to a community of end recipients. The cloud makes it possible for you to access your information from anywhere at any time and removes the need for you to be in the same physical location as the hardware that stores your data. To keep shared data confidential against untrusted cloud service providers, a natural approach is to store only encrypted data in the cloud. Several schemes have been employed to provide access control of outsourced data, but most of them suffer from inflexibility in implementing complex access control policies. To provide access control, this work uses a hierarchical attribute-set-based encryption scheme that extends ciphertext-policy attribute-set-based encryption. It consists of a hierarchical structure of system users: a cloud service provider, data owners, data consumers, a number of domain authorities, and a trusted authority. In this system, neither data owners nor data consumers are always online; they come online only when necessary, while the cloud service provider, the trusted authority, and the domain authorities are always online. The scheme can also solve the user revocation problem by assigning multiple values to the same attribute. Users may try to access data files either within or outside the scope of their access privileges, and malicious users may collude with each other to get sensitive files beyond their privileges. Using this ciphertext-policy attribute-set-based encryption we can encrypt files while uploading, and it also helps to detect and recover the data in case of any corruption that might happen in the cloud.

MEDICAL IMAGE ANALYSIS USING MORPHIC CLUSTERING ALGORITHM
M. Karthikeyen, R.Rajakumari, G.R.Hemalakshmi, N.B.Prakash, National Engineering College

Abstract Medical image analysis is used for understanding the underlying physiological causes of a disease. In medical practice, physicians refer their critical patients and determine the treatment for a particular disease by imaging the affected part of the body. Imaging helps normal patients to pursue further treatment and abnormal patients to identify the nature of the abnormality. Our automated image analysis approach includes registration, segmentation, anatomical parameterization and modeling, tissue classification, shape analysis, and pathology detection in individuals or groups. This paper proposes a clustered morphic algorithm to identify branching points in images. This method is used to change the representation of an image into something that is more meaningful and easier to analyze for the object of interest.

AN AUTHENTICATED AND SECURE COMMUNICATION FOR DISTRIBUTED CLUSTERS VIA MESSAGE PASSING INTERFACE
M.A.Maffina, R.S.RamPriya, Jayamatha Engineering College

Abstract In a public network, an increase in the number of clusters connected to each other becomes a potential threat to security applications running on the clusters. To address this problem, a Message Passing Interface (MPI) is developed to preserve security services in an unsecured network. The proposed work focuses on MPI rather than other protocols because MPI is one of the most popular communication protocols on distributed clusters. Here the AES algorithm is used for encryption/decryption, and an interpolation polynomial algorithm is used for key management; these are integrated into Message Passing Interface Chameleon version 2 (MPICH2) with the standard MPI interface to become ES-MPICH2. ES-MPICH2 is a new MPI that provides security and authentication for distributed clusters, unifying cryptographic and mathematical concepts. The major goal of ES-MPICH2 is to support a large variety of computation and communication platforms. The proposed system is based on both cryptographic and mathematical concepts, which leads to an error-free message passing interface with enhanced security.

PERFORMANCE ANALYSIS OF COGNITIVE RADIO WITH SVD AND WATER FILLING TECHNIQUES
Remika Ngangbam, R. Anandan, Dhanalakshmi Srinivasan College of Engg. and Tech.

Abstract In this paper, the performance of secondary users (SUs) is analyzed. Multi-carrier systems are one of the best candidates for cognitive radio (CR) networks because of their spectrum shaping and highly adaptive capabilities. Since SUs in this structure use a limited number of sub-carriers because of the deactivation of the primary users' (PUs) bands, the total capacity of CR networks is limited. Considering different conditions to obtain the maximum total capacity of CR networks, a water-filling algorithm for the Rayleigh fading channel with singular value decomposition is proposed. Theoretically, it is shown that this proposed algorithm can maximize the total capacity while keeping the interference caused in the PUs' bands within a tolerable range. To reduce the algorithm's complexity, a sub-optimal scheme is proposed. The simulation results of the new algorithms are compared with previous methods, demonstrating the enhancement and efficiency of the proposed algorithms. Furthermore, the simulation results show that the proposed schemes can load more power into the CR users' band in order to achieve higher transmission capacity for a given interference threshold.
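The water-filling step the cognitive-radio abstract builds on can be sketched generically: pour a fixed power budget over sub-carriers so each active one is filled to a common "water level". The channel gains, noise level, and power budget below are made-up values, and the sketch omits the paper's SVD precoding and the interference constraint toward primary users.

```python
# Generic water-filling power allocation across sub-carriers.
# Each active sub-carrier i gets p_i = mu - noise/g_i, where the water
# level mu is chosen so the allocations sum to the total power budget;
# sub-carriers too weak to get positive power are dropped.

def water_fill(gains, total_power, noise=1.0):
    """Return per-sub-carrier powers (same order as gains)."""
    active = sorted(gains, reverse=True)
    mu = 0.0
    while active:
        # Water level if every currently active sub-carrier gets power.
        mu = (total_power + sum(noise / g for g in active)) / len(active)
        if mu - noise / active[-1] >= 0:   # weakest channel still positive
            break
        active.pop()                        # drop it and recompute the level
    return [max(0.0, mu - noise / g) if g in active else 0.0 for g in gains]

alloc = water_fill([2.0, 1.0, 0.1], total_power=4.0)
print(alloc)  # most power goes to the strongest sub-carrier
```

In the CR setting described above, the gains would come from the singular values of the channel matrix after SVD, and the water level would additionally be capped so the power leaked into the primary users' bands stays below the interference threshold.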

BRAIN CONTROLLED CAR FOR DISABLED USING ARTIFICIAL INTELLIGENCE
R.Muthu lakshmi, N.Dharini, Dr.Sivanthi Aditanar College of Engineering, Tiruchendur

Abstract This paper considers the development of a brain-driven car, which would be of great help to physically disabled people. Since these cars rely only on what the individual is thinking, they do not require any physical movement on the part of the individual. The car integrates signals from a variety of sensors such as video, weather monitoring, and anti-collision sensors; it also has an automatic navigation system in case of emergency. The car works on the asynchronous mechanism of artificial intelligence. It is a great advance of technology which will make the disabled, abled.

ABE BASED EXTENDABLE SHARING OF PERSONAL HEALTH RECORD IN CLOUD
N.karthikeyan, Sri SaiRam Engineering College

Abstract In medical organizations, patients' personal information is maintained at cloud servers, which can be insecure. This information is generally called personal health records (PHR). It should not be outsourced to third parties, which is what happens now. To avoid this, personal health records are encrypted before being outsourced. Yet issues such as risks of privacy exposure, scalability in key management, flexible access, and efficient user revocation remain the most important challenges toward achieving fine-grained, cryptographically enforced data access control. We implement a patient-centric framework with two different domains, known as public domains (PUD) and personal domains (PSD). To provide fine-grained access control we introduce Attribute-Based Encryption (ABE). Our work is differentiated from previous work by including a multiple-data-owner scenario. It reduces the key-management complexity for owners and users.

VIRTUAL REALITY - CONFUSING THE BRAIN TO REDUCE PAIN FOR PATIENTS
D.Nandhini, K.Gomathi, V.R.S College of Engineering and Technology

Abstract The essence of immersive virtual reality (iVR) is the illusion it gives users that they are inside the computer-generated virtual environment. This unusually strong illusion is theorized to contribute to the successful pain reduction observed in burn patients who go into VR during wound care (www.vrpain.com) and to successful VR exposure therapy for phobias and post-traumatic stress disorder (PTSD). The present study demonstrated for the first time that subjects could experience a strong illusion of presence during an fMRI despite the constraints of the fMRI magnet bore (i.e., immobilized head and loud ambient noise).

REVERSE SKYLINE QUERY PROCESSING FOR UNCERTAIN DATA
G.Saraswathi, Oxford Engineering College

Abstract Data uncertainty is an intrinsic attribute of multidimensional datasets, so the reverse skyline query is applied to the multidimensional dataset to trace out the important points in the multidimensional data space. The reverse skyline query is very valuable for geographical information systems, urban planning, and military deployment. The proposed system processes the reverse skyline query using the skyband method to answer the problem of the resultant query set. An energy-efficient technique drastically reduces the communication cost among the nodes and saves energy during the evaluation of reverse skyline queries. Finally, the reverse skyline query is applied over multiple networks to extract useful information from substantial uncertain data readings.

MAPPING RESOURCE USING NETWORKED CLOUD THROUGH ILS
N.Sharanya, C.Selva kumar

Abstract In cloud computing, configurable resources are provided over the internet as a service. The basic goal is to create a fluid pool of virtual resources across computers, servers, and data centers that enables users to access stored data and applications on an on-demand basis. Virtual Network Mapping (VNM) plays a central role in building a virtual network (VN). During this mapping process, each node of the VN is assigned to a node of the physical network (PN), and each virtual link is assigned to a path or flow in the PN. To deal with the inherent complexity and scalability issues of the resource mapping problem across different administrative domains, many techniques are described in this article.

AN EFFICIENT PROTECTION FOR MULTITIER WEB APPLICATION USING DOUBLE GUARD SYSTEM
B.SHARMILA DEVI, S.P. MANIKANDAN, S.M.K.Fomra Institute of Technology

Abstract Internet services and applications have become an inextricable part of daily life, enabling communication and the management of personal information from anywhere.
To accommodate this increase in application and data complexity, web services have moved to a multitiered design wherein the web server runs the application front-end logic and data are outsourced to a database or file server. In this paper, we present DoubleGuard, an IDS system that models the network behavior of user sessions across both the front-end web server and the back-end database. By monitoring both web and subsequent database requests, we are able to ferret out attacks that an independent IDS would not be able to identify. Furthermore, we quantify the limitations of any multitier IDS in terms of training sessions and functionality coverage. We implemented DoubleGuard using an Apache web server with MySQL and lightweight virtualization. We then collected and processed real-world traffic over a 15-day period of system deployment in both dynamic and static web applications. Finally, using DoubleGuard, we were able to expose a wide range of attacks with 100 percent accuracy while maintaining 0 percent false positives for static web services and 0.6 percent false positives for dynamic web services.

INTELLIGENT SYSTEM ON WATER QUALITY MONITORING AND PURIFICATION IN WATER TANK
R.SHARMILA, G.K.YAGHAVI, T.VASANTH, V.TAMILARASU, K.S.RANGASAMY COLLEGE OF TECHNOLOGY

Abstract The main objective of this paper is to provide automation in a water tank. It includes a water-level indicating system, a quality monitoring system, a facility for checking the purity of water by determining its pH range, and purification of the water coming from the source with the help of a reverse osmosis process. The level of water is determined with a capacitance level sensor and is displayed via LCD interfacing. This level indication acts as a protection mechanism to protect the motor from dry-run conditions by turning the motor ON and OFF according to the upper and lower levels of water in the tank. The pH content of the water is determined with a pH sensor. The efficiency and performance of water purification can be improved with the help of the reverse osmosis process; this purification system is cost-effective and more efficient. Thus this project was developed with the aim of automating the water tank, which saves time and avoids health problems through continuous quality monitoring of the water in the tank. The mud identification feature further enhances the advantage of this project.

SELFCARE EMERGENCY DETECTION AND MEDICAL ASSISTANCE USING ANT COLONY ALGORITHM
Sreema.S.Kumar, C.Jeyanthi, PSN College of Engineering & Technology

Abstract


With the pervasiveness of smart phones and the advance of Wireless Body Sensor Networks (WBSN), mobile healthcare (m-Healthcare), which extends the operation of healthcare providers into a pervasive environment for better health monitoring, has attracted considerable interest recently. However, the flourishing of m-Healthcare still faces many challenges, including medical assistance and adequate emergency assistance. In this paper, we propose an emergency check and quick medical aid for patients, which would enhance m-Healthcare in rural areas. Using a smart phone, the Personal Health Information (PHI) of registered patients can be intensively collected. In existing systems, a medical observer must stay active the whole time while the system gathers the PHI of the patient, checking whether all the biological parameters are in the normal condition. In this paper, the emergency is detected by monitoring the patients against the updated and previously stored data. Also, a swarm-based algorithm (the ant colony algorithm) is implemented to calculate the shortest path from the hospital to the patient whenever an emergency is detected.
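The ant-colony shortest-path step described above can be illustrated with a toy sketch: ants repeatedly walk from source to destination, choosing edges with probability weighted by pheromone over distance, and the best path found so far is reinforced. The graph (node 0 as hospital, node 3 as patient), the pheromone constants, and the iteration counts are all hypothetical; a deployed system would run this on a real road network.

```python
# Toy ant-colony shortest-path sketch on a small weighted graph.
import random

def aco_shortest_path(graph, src, dst, ants=10, iters=20, evap=0.5, seed=1):
    """Best path found by simple ant walks with pheromone reinforcement."""
    random.seed(seed)
    tau = {(n, m): 1.0 for n in graph for m in graph[n]}  # pheromone per edge
    best_path, best_cost = None, float("inf")
    for _ in range(iters):
        for _ in range(ants):
            node, path, cost = src, [src], 0.0
            while node != dst:
                choices = [m for m in graph[node] if m not in path]
                if not choices:        # dead end: abandon this ant
                    break
                # Edge attractiveness: pheromone divided by edge length.
                weights = [tau[(node, m)] / graph[node][m] for m in choices]
                nxt = random.choices(choices, weights)[0]
                cost += graph[node][nxt]
                path.append(nxt)
                node = nxt
            if node == dst and cost < best_cost:
                best_path, best_cost = path, cost
        if best_path is not None:      # evaporate, then reinforce best path
            tau = {e: (1 - evap) * t for e, t in tau.items()}
            for a, b in zip(best_path, best_path[1:]):
                tau[(a, b)] += 1.0 / best_cost
    return best_path, best_cost

# Hypothetical road graph: node 0 = hospital, node 3 = patient.
roads = {0: {1: 1, 3: 5}, 1: {0: 1, 3: 1}, 3: {0: 5, 1: 1}}
print(aco_shortest_path(roads, 0, 3))
```

For a static graph this finds the same answer Dijkstra's algorithm would; the ACO formulation pays off when edge costs change over time (traffic), since the pheromone trails adapt as ants keep exploring.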

SECURITY USING KERBEROS FOR STORAGE CLOUD WITH HIGH VOLUME OF DATA FORWARDING K. Subalakshmi Oxford Engineering College, Abstract Cloud computing is technology that uses the internet and central remote servers to maintain data and applications. In Cloud storage system is the collection of storage servers such as data storage server and key storage server. Data Storage server to maintain the datas, files etc. Key Storage server to maintain the secrets key. General encryption defends the data confidentiality but restrains the functionality of the storage system. In existing system the datas or files are stored in cloud system but its not secured because the third party are entered in cloud system and corrupted the datas or files in cloud system. So data confidentiality is less and data forwarding is not possible. To overcome this problem introduced Kerberos and RMI Distributor in proposed system. Kerberos is a network authentication protocol. Datas or files are securely stored in cloud storage system. The proxy re-encryption scheme supports encrypted messages as well as forwarding operations over encoded and encrypted messages. In method fully integrates encrypting, encoding, and forwarding. GPRS DRIVING WAP ON THE ROAD TO 3G

E.NISHA, P.SUBITHA, SCAD ENGINEERING COLLEGE

Abstract Mobile telephony allowed us to talk on the move. The Internet turned raw data into helpful services that people found easy to use. Now, these two technologies are converging to create third-generation mobile services. In simple terms, third-generation (3G) services combine high-speed mobile access with Internet Protocol (IP)-based services. But this doesn't just mean a fast mobile connection to the World Wide Web. Rather, it means whole new ways to communicate, access information, conduct business, learn, and be entertained -- liberated from slow, cumbersome equipment and immovable points of access. Mobile computing is being heralded as the new killer app for the Internet. While 3G hasn't arrived yet, 2.5G is here! The technologies at the forefront of the 2.5G push are GPRS (General Packet Radio Service), EDGE (Enhanced Data rates for Global Evolution), WCDMA (Wideband Code Division Multiple Access), and WAP (Wireless Application Protocol).
SIGNIFICANT ANALYSIS OF TERMS IN BOOTSTRAPPING ONTOLOGY FOR WEB SERVICES
Swathi
Abstract Ontology construction is important for semantics-based web services. Ontological bootstrapping, which aims at automatically generating concepts and their relations in a given domain, is a promising technique for ontology construction. Bootstrapping an ontology based on a set of predefined textual sources, such as web services, must address the problem of multiple, largely unrelated concepts. In this paper, the ontology bootstrapping process for web services is based on the WSDL document. The proposed approach uses the results of significance analysis and web context extraction for the ontology evolution. The significance analysis provides the importance of every token extracted from the WSDL document. Based on the significance score, the ontology constructed for web services provides the best service to the users.
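Token significance of the kind described can be approximated with TF-IDF scoring; the scoring choice and the token lists are assumptions for illustration, not the paper's exact measure:

```python
import math
from collections import Counter

def tfidf_scores(docs):
    """TF-IDF significance score for every token of every document.

    docs: list of token lists (e.g. names extracted from WSDL files).
    Returns a list of {token: score} dicts, one per document.
    """
    n = len(docs)
    # document frequency: in how many descriptions each token appears
    df = Counter(tok for doc in docs for tok in set(doc))
    scores = []
    for doc in docs:
        tf = Counter(doc)
        scores.append({t: (tf[t] / len(doc)) * math.log(n / df[t]) for t in tf})
    return scores
```

Tokens that appear in few service descriptions score higher, so they are better candidates for domain concepts than generic terms shared by every service.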

ENHANCED PRIVACY ID WITH REVOCATION CAPABILITIES
R.Thenmoli, S.Sathyaraj, Oxford Engineering College
Abstract


Direct Anonymous Attestation (DAA) is a scheme that enables the remote authentication of a Trusted Platform Module (TPM) while preserving the user's privacy. A TPM can prove to a remote party that it is a valid TPM without revealing its identity and without linkability. In the DAA scheme, a TPM can be revoked only if the DAA private key in the hardware has been extracted and published widely, so that verifiers obtain the corrupted private key; otherwise, the TPM cannot be revoked. Furthermore, a TPM cannot be revoked by the issuer if the TPM is found to be compromised after the DAA issuing has occurred. While still providing unlinkability, our scheme provides a method to revoke a TPM even if the TPM private key is unknown. Our EPID scheme is efficient and provably secure in the same security model as DAA, i.e., in the random oracle model under the strong RSA assumption and the decisional Diffie-Hellman assumption.
CLOUD COMPUTING
R.Uma Mageswari, G.Jeba Glorinthal
Abstract Cloud computing is basically an Internet-based network made up of large numbers of servers, mostly based on open standards, modular and inexpensive. Clouds contain vast amounts of information and provide a variety of services to large numbers of people. The benefits of cloud computing are reduced data leakage, decreased evidence-acquisition time, elimination or reduction of service downtime, forensic readiness, and decreased evidence-transfer time. The main factor to be discussed is the security of cloud computing, which is a risk factor involved in major computing fields.
USING SMS IN MOBILE PHONE FOR HOME APPLIANCES CONTROLLING THROUGH PC PARALLEL PORT INTERFACING
C.M.Vasuki, S.Pavithra, Adiyamaan College of Engineering
Abstract This paper presents a system for remote controlling of a PC with a mobile telephone through accessing the main PC ports, serial and parallel: the serial port for transferring data from the mobile phone to the PC, and the parallel port for interfacing the PC with real-time controlling hardware. The system is implemented by using SMS (Short Message Service) as supported by all modern mobile phone devices and mobile telecommunication networks. The software for the whole system is designed and implemented with the KORAK Telecom network in Erbil City, a Nokia mobile phone device, and an ordinary PC running Windows XP or compatible. The software for the system is divided into two parts: the mobile-to-PC part through the serial port is a general commercial program associated with Nokia mobile devices, and the second part, which accesses the SMS file and controls all parts of the system, is designed using Microsoft Visual C++ Ver. 6. Such an idea is quite new and represents the ability of anyone who has a mobile phone and a PC to remotely control major devices in his/her home, office, etc.
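The SMS-to-parallel-port mapping might look like the following sketch; the appliance names, pin assignments and message format are invented for illustration, not taken from the paper:

```python
# Map each appliance to one data pin (bit) of the parallel port's data
# register (pins D0..D7). The appliance names and the pin layout here
# are hypothetical examples.
PIN = {"LIGHT": 0, "FAN": 1, "TV": 2, "AC": 3}

def apply_sms_command(state, sms_text):
    """Update the 8-bit parallel-port data byte from an SMS like 'FAN ON'.

    state: current data-register byte (0..255); returns the new byte,
    which would then be written to the port (e.g. at address 0x378 on a PC).
    """
    device, _, action = sms_text.strip().upper().partition(" ")
    if device not in PIN or action not in ("ON", "OFF"):
        return state                      # ignore malformed messages
    mask = 1 << PIN[device]
    return state | mask if action == "ON" else state & ~mask
```

Each received SMS toggles one bit; the relay-driver hardware wired to the data pins then switches the corresponding appliance.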

REFINEMENT IN COGNITIVE RADIO NETWORKS FOR SPECTRUM HANDOFF
Venkatesan.D, Sathishkumar.S
Abstract Cognitive radio (CR) can significantly improve spectrum efficiency by allowing secondary users to temporarily access the primary users' under-utilized licensed spectrum. Spectrum mobility issues arise when a primary user appears on a channel occupied by secondary users. The secondary users must vacate the occupied channel because the primary users have preemptive priority to access channels. A spectrum handoff technique can help the interrupted secondary user vacate the occupied licensed channel and find a suitable target channel to resume its unfinished data transmission. A preemptive resume priority (PRP) M/G/1 queueing network model characterizes the spectrum usage behaviors of connection-based multiple-channel spectrum handoffs. The proposed model derives the closed-form expression for the extended data delivery time of different proactively designed target channel sequences under various traffic arrival rates and service time distributions. The analytical method analyzes the latency performance of spectrum handoffs based on the target channel sequences specified in the IEEE 802.22 wireless regional area network standards. We also suggest a traffic-adaptive target channel selection principle for spectrum handoffs under different traffic conditions.
I-DRIVE - AN INTELLIGENT DRIVING SYSTEM
S.VINOTHKUMAR, G.ESWARAN (Srinivasan Engineering College, SNS College of Engineering)
Abstract Our project is to create an automated four-wheeler driving system without using GPS (Global Positioning System) for position tracking. This is to reduce the occurrence of road accidents and make traveling safe. It integrates various technologies and provides a platform for a wide range of applications too. To detect obstacles we use sensors. For sensing other vehicles we use a reader which reads the signals transmitted from those vehicles. We use a roadmap to travel, and the automobile's dynamics to track its own position; this is used as an alternative to GPS. All these data are then combined and plotted on a frame buffer, and algorithms are applied to generate a path from the frame buffer. Using this we can achieve safe and easy travel. iDrive allows the driver and front-seat passenger to control such amenities as the climate (air-conditioner and heater), the audio system (radio and CD player), the navigation system and the communication system. Recently, iDrive has been used in BMW cars.
AUTHENTICATION ON KEY MANAGEMENT FRAMEWORK WITH HYBRID MULTICASTING NETWORKS
Suganthi P, Ramya K, Sree Sowdambika College of Engineering
Abstract Ad hoc networks are dynamically created and maintained by the individual nodes comprising the network. They do not require a pre-existing architecture for communication purposes and do not rely on any type of wired infrastructure; all communication occurs over a wireless medium. The design and management of ad hoc networks is significantly more challenging than that of contemporary networks. Authenticating the multicast session is important. To authenticate, several factors should be considered; the major issues are resource constraints and the wireless links. In addition to being resource-efficient and robust, the security solution must serve large groups of receivers over long multi-hop paths. The authentication must be done without much delay and should be independent of the other packets. In existing work, TAM, a Tiered Authentication scheme for Multicast traffic, was proposed for ad hoc networks. It exploits network clustering to reduce overhead and improve scalability.
Its two-tiered hierarchy combines time and secret-information asymmetry to achieve resource efficiency and scalability. In the proposed system, an asynchronous authentication scheme using shared key management is proposed to resolve the most conflicting security requirements, such as group authentication and conditional privacy. The proposed batch verification scheme, as part of the protocol, yields a significant reduction in message delay; since a shared-key process is used, the storage management requirement is also very low.
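The time-asymmetry tier resembles TESLA-style delayed key disclosure; the hash-chain sketch below illustrates that general idea, not the paper's exact construction:

```python
import hashlib
import hmac

def make_key_chain(seed, length):
    """Build a one-way hash chain of MAC keys.

    Keys are used in reverse order of generation, so revealing the key of
    interval i lets any receiver hash it forward to the published anchor.
    """
    chain = [hashlib.sha256(seed).digest()]
    for _ in range(length - 1):
        chain.append(hashlib.sha256(chain[-1]).digest())
    chain.reverse()          # chain[0] is the public anchor
    return chain

def authenticate(anchor, key, interval, message, tag):
    """Verify a MAC once the interval's key is disclosed.

    Checks (1) the key belongs to the chain: hashing it `interval` times
    must reproduce the anchor; (2) the MAC over the message matches.
    """
    probe = key
    for _ in range(interval):
        probe = hashlib.sha256(probe).digest()
    if probe != anchor:
        return False
    expected = hmac.new(key, message, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)
```

The sender MACs each multicast packet with the current interval's key and discloses that key only after the interval ends; receivers buffer packets and verify once the key arrives, so no per-receiver shared secret is needed.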

AUTHORITY DETECTION AND COMMUNICATIONS IN DIVERSE SERVICE-ORIENTED SYSTEMS
Anitha.S.George, Anto.D.Besant

Abstract Web-based collaborations and processes have become essential in today's business environments. Such processes typically span interactions between people and services across globally distributed companies. Web services and SOA are the de facto technology to implement compositions of humans and services. The increasing complexity of compositions and the distribution of people and services require adaptive and context-aware interaction models. To support complex interaction scenarios, we introduce a mixed service-oriented system composed of both human-provided and Software-Based Services (SBSs) interacting to perform joint activities or to solve emerging problems. However, competencies of people evolve over time, thereby requiring approaches for the automated management of actor skills, reputation, and trust. Discovering the right actor in mixed service-oriented systems is challenging due to the scale and temporary nature of collaborations. We present a novel approach addressing the need for flexible involvement of experts and knowledge workers in distributed collaborations. We argue that the automated inference of trust between members is a key factor for successful collaborations. Instead of following a security perspective on trust, we focus on dynamic trust in collaborative networks. We discuss Human-Provided Services (HPSs) and an approach for managing user preferences and network structures. HPS allows experts to offer their skills and capabilities as services that can be requested on demand. Our main contributions center around a context-sensitive trust-based algorithm called ExpertHITS, inspired by the concept of hubs and authorities in web-based environments. ExpertHITS takes trust relations and link properties in social networks into account to estimate the reputation of users.
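The hubs-and-authorities computation underlying ExpertHITS can be illustrated with plain, unweighted HITS; ExpertHITS itself additionally folds trust weights into the links:

```python
def hits(edges, n_iter=50):
    """Basic HITS: hub and authority scores from a directed link set.

    edges: list of (src, dst) pairs. Here every link counts equally;
    a trust-aware variant would weight each link by the trust relation.
    """
    nodes = {u for e in edges for u in e}
    hub = {n: 1.0 for n in nodes}
    auth = {n: 1.0 for n in nodes}
    for _ in range(n_iter):
        # authority: a node endorsed by good hubs
        auth = {n: sum(hub[u] for u, v in edges if v == n) for n in nodes}
        # hub: a node that points at good authorities
        hub = {n: sum(auth[v] for u, v in edges if u == n) for n in nodes}
        # normalise so the scores do not blow up over iterations
        na = sum(auth.values()) or 1.0
        nh = sum(hub.values()) or 1.0
        auth = {n: s / na for n, s in auth.items()}
        hub = {n: s / nh for n, s in hub.items()}
    return hub, auth
```

In a collaboration graph where an edge means "delegated work to", high authority identifies sought-after experts, which is the reputation signal the abstract describes.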

ADAPTIVE MULTIPLE REGION SEGMENTATION BASED ON OUTDOOR OBJECT DETECTION
Anju.J.A, Mr.Jenopaul, PSN College of Engineering and Technology
Abstract The main research objective of this paper is detecting object boundaries in outdoor scene images based solely on some general properties of real-world objects. Here, segmentation and recognition should not be separated, but treated as an interleaving procedure. In this project, an adaptive global clustering technique is developed that can capture the non-accidental structural relationships among the constituent parts of structured objects, which usually consist of multiple constituent parts. Background objects such as sky, tree, and ground are also recognized based on color and texture information. This process groups them together accordingly, without depending on a priori knowledge of the specific objects. The proposed method outperformed two state-of-the-art image segmentation approaches on two challenging outdoor databases and on various outdoor natural scene environments, improving segmentation quality. This clustering technique is used to overcome strong reflection and over-segmentation. The proposed work shows better performance and improves background identification capability.
IMPLEMENTING SCALING ONLINE SOCIAL NETWORKS USING SOCIAL PARTITIONING AND REPLICATION
Anusree V.K, M. Edwin Jayasingh, Infant Jesus College of Engineering
Abstract A distributed network is an important application of networking. A social networking service is an online service that focuses on the building of social networks. Vertical scaling adds resources to a single node in a system, involving the addition of CPUs or memory to a single computer. Such vertical scaling of existing systems is effective and provides more resources for the hosted set of operating systems. Horizontal scaling adds more nodes to a system, such as adding a new computer to a distributed software application. This model has created an increased demand for shared data storage with very high I/O performance. When a server's CPU is bound, adding more servers does not help serve more requests [1]. Due to the complex nature of OSNs, the existing partitioning techniques do not produce an optimal solution for back-end data scalability. In this paper, a social partitioning and replication model for the middleware performs joint partitioning and replication of the underlying community structure, to provide an optimal solution for back-end data scalability for online social networks and to ensure that all data is local.

EFFICIENT FORWARDING PROTOCOL TO TOLERATE SELFISH BEHAVIOR IN SOCIAL MOBILE NETWORKS
Mr.M.Arul Sanka, Prof.S.Gokul Pran, Ratnavel Subramaniam College of Engineering and Technology
Abstract Nodes should accept using their own energy and bandwidth just to carry other people's messages. One fundamental and natural question, especially in this setting, is why nodes should do so. We present two forwarding protocols for mobile wireless networks of selfish individuals. We assume that all nodes are selfish and show formally that both protocols are strategy-proof, that is, no individual has an interest to deviate. Extensive simulations with real traces show that these protocols introduce an extremely small overhead in terms of delay, while the techniques introduced to enforce faithful behaviour have the positive and quite surprising side effect of improving performance by reducing the number of replicas and the storage requirements of the nodes. We also test these protocols in the presence of a natural variation of the notion of selfishness: nodes that are selfish with outsiders and faithful with people from the same community. Even in this case, the protocols are shown to be very efficient in detecting possible misbehaviour.
IMAGE PROCESSING
S.ASWINI, K.SUBBAMMAL, SCAD College of Engineering and Technology

Abstract Morphological image processing is an important tool in digital image processing, since it can rigorously quantify many aspects of the geometrical structure of an image in a way that agrees with human intuition and perception. Morphological image processing technology is based on geometry and emphasizes studying the geometric structure of an image: when processing an image with morphological theory, we can find the relationship between each part of the image and thereby comprehend its structural character. In the morphological approach an image is analyzed in terms of some predetermined geometric shape known as a structuring element. Morphological processing is capable of removing noise and clutter, as well as editing an image based on the size and shape of the objects of interest. Morphological image processing is used in place of linear image processing, because the latter can distort the underlying geometric form of an image, whereas in morphological image processing the information of the image is not lost. In morphological image processing the original image can be reconstructed by using the dilation, erosion, opening and closing operations a finite number of times. The major objective of this paper is to construct this class of finite-length morphological image processing tools in a suitable mathematical structure using the Java language. The operations are applied to binary images. The morphological image processing has been implemented and successfully tested in FORENSICS: fingerprint enhancement and reduction of noise in fingerprint images.
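The four operations can be sketched on binary images represented as coordinate sets; this is a generic illustration of the mathematics, not the paper's Java tool:

```python
def dilate(img, se):
    """Binary dilation: img and se are sets of (row, col) pixels;
    se offsets are taken relative to its origin (0, 0)."""
    return {(r + dr, c + dc) for (r, c) in img for (dr, dc) in se}

def erode(img, se):
    """Binary erosion: keep pixels where the whole SE fits inside img."""
    return {(r, c) for (r, c) in img
            if all((r + dr, c + dc) in img for (dr, dc) in se)}

def opening(img, se):
    """Erosion then dilation: removes specks smaller than the SE."""
    return dilate(erode(img, se), se)

def closing(img, se):
    """Dilation then erosion: fills holes smaller than the SE."""
    return erode(dilate(img, se), se)
```

Opening with a small structuring element is exactly the noise-removal step used in fingerprint enhancement: isolated noise pixels vanish while ridge-sized structures survive.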

MATHEMATICAL MORPHOLOGY BASED FEATURE EXTRACTION FOR REMOTELY SENSED IMAGE TRAFFIC PATTERN
S.Jayarani, S.Nivetha, P.Mohanadivya, S.Pavithra, S. Premkumar, Narasus Sarathy Institute of Technology
Abstract Feature extraction is transforming the input data into a set of features; it is a special form of dimensionality reduction. The process of feature extraction using traditional methods is tedious and time-consuming. In order to reduce the human effort involved in feature extraction from remotely sensed imagery, semi-automatic and automatic feature extraction algorithms are developed. GIS databases must be frequently and accurately maintained and updated, so the remotely sensed images must be analyzed and features extracted on a regular basis for the effectiveness of GIS databases. This project presents a new method for feature extraction from high-resolution remotely sensed images based on binary mathematical morphology operators. The proposed approach involves several advanced morphological operators, among which is an adaptive hit-or-miss transform with varying sizes and shapes of the structuring element, used to extract different features. The structuring element is important in feature extraction, because based on its size and shape different types of features can be extracted from a single image. In our methodology, rectangular or square shapes are used for building extraction and a line shape is used for road extraction. In earlier days feature extraction was done by different methods. One is Principal Component Analysis (PCA), which, while optimal in the mean-square sense for representation, is not appropriate for classification. So the Discriminant Analysis Feature Extraction (DAFE) algorithm was developed; it also has the weakness that it is not directly related to the probability of error in classification. On the other hand, Decision Boundary Feature Extraction (DBFE) has outperformed some approaches in terms of overall and average accuracies. Our project is used to extract different types of features from a single remotely sensed image. Experiments made on IKONOS and WorldView satellite images show the effectiveness of the methodology.
MULTIPLE KEY GENERATION USING ELLIPTIC CURVE CRYPTOGRAPHY FUSION ALGORITHM FOR BIOMETRIC SOURCE
Joju John, Mr T. Rajesh

Abstract Biometric data are stored in a template, and various techniques are used to protect the template against privacy and security threats. Binary vectors derived from biometric samples underlie a great portion of template protection techniques, yet for the same template protection system a large variation in key length is observed. We determine the analytical relationship between the classification performance of the fuzzy commitment scheme and the theoretical maximum key size, for a Gaussian biometric source with a given number of enrolment and verification samples, feature components, and biometric source capacity. The analysis shows that the estimated maximum key size and the classification performance are interdependent. Both the theoretical analysis and an experimental evaluation showed that feature interdependencies have a large impact on performance and key size estimates.
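A fuzzy commitment scheme of the kind analyzed can be sketched with a simple repetition code; the code choice, bit lengths and function names here are illustrative, not the paper's construction:

```python
import hashlib

REP = 5  # each key bit is repeated 5 times; majority vote corrects up to 2 flips

def commit(key_bits, bio_bits):
    """Fuzzy commitment: hide a codeword with the biometric bits.

    Returns (helper, pseudonym): helper = codeword XOR biometric,
    pseudonym = hash of the key (stored instead of the key itself).
    """
    codeword = [b for b in key_bits for _ in range(REP)]
    helper = [c ^ b for c, b in zip(codeword, bio_bits)]
    pseudonym = hashlib.sha256(bytes(key_bits)).hexdigest()
    return helper, pseudonym

def verify(helper, pseudonym, bio_bits):
    """Recover the key from a noisy biometric probe and check the hash."""
    noisy = [h ^ b for h, b in zip(helper, bio_bits)]
    # majority-decode each group of REP repeated bits
    key = [int(sum(noisy[i:i + REP]) > REP // 2)
           for i in range(0, len(noisy), REP)]
    return hashlib.sha256(bytes(key)).hexdigest() == pseudonym
```

The stronger the code (larger `REP`), the more biometric noise is tolerated but the fewer key bits fit in a template of fixed size, which is exactly the key-size/performance trade-off the abstract studies.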

A NON-LINE-OF-SIGHT APPROACH IN VANETS USING MOBILE TOWER TECHNOLOGY
A.Joshua Issac, S.Sathyaraj, Oxford Engineering College, Trichy
Abstract In the past decade, GPS has been used in vehicles, but GPS is starting to show some undesired problems, such as not always being available or not being robust enough for some applications. For this reason a number of other localization techniques, such as dead reckoning, cellular localization, and image/video localization, have been used in VANETs to overcome GPS limitations. In Vehicular Ad Hoc Networks (VANETs), vehicles communicate with each other and possibly with a roadside infrastructure to provide a long list of applications, varying from transit safety to driver assistance and internet access, with direct communications. Direct communication affects the localization services; in this project, to overcome this problem, a location verification protocol has been proposed. Dealing with obstacles is a challenge in VANETs, as moving obstacles such as trucks are part of the network and have the same characteristics as a VANET node. The protocol provides VANET position integrity through filtering, and additionally a collaborative protocol verifies an announced position when direct communication between the questioned node and the verifier is not possible. In addition to verifying a node's location in a multihop cooperative approach, several security measures were included to improve message integrity.

VOICE CALLS OVER WI-FI
K.PRIYADHARSHINI, G.LAKSHMI, SRI SAI RAM ENGINEERING COLLEGE
Abstract The use of Wi-Fi enabled cell phones to access the internet away from the PC is greatly increasing. Using Wi-Fi enabled phones as IP phones and their communication within a local wireless LAN is discussed in this paper. The proposed model is a form of telecommunication that allows data and voice transmissions to be sent across a wide range of interconnected networks. Phones which are Wi-Fi enabled and have the J2ME platform can communicate with each other through the free 2.4 GHz communication channel. Since this channel is free, security is a concern. To overcome this, the packets of data may be encrypted in the header and payload by different encryption techniques. However, even then security is a concern only within the specific network; the communication is completely safe from attacks external to this local network. Each mobile device connects to a WLAN router and identifies itself in the routing table. Calls can be placed by a user by sending the packets to the router, which then tries to find the destination. The destination must also be connected to the WLAN; if not, the Wi-Fi server can tunnel the calls to the GSM network using a UNC (Unified Mobile Access Network Converter). Since the communication channel is capable of being affected by an outside influence (hacking), it is provided with complex cryptography techniques, which engender high security. Our proposal allows free calls within the network, with high-quality voice transmission. This model will be a prototype of itinerant devices communicating in the Wi-Fi bandwidth, and will greatly reduce the communication cost in large organizations.

PRIVACY PRESERVING ON DEMAND ROUTING USING USOR FOR MANET
Mahesh kumar.M, Saravanan. S, Srinivasan Engineering College
Abstract Mobile ad hoc networks often support sensitive applications. These applications may require that users' identity, location, and correspondents be kept secret. This is a challenge in a MANET because of the cooperative nature of the network and the broadcast nature of the communication. A number of anonymous routing schemes have been proposed for ad hoc networks to provide better support for privacy protection, but they bring significant computation overhead. However, none of these schemes offers complete unlinkability or unobservability, since data packets and control packets are still linkable and distinguishable in these schemes. USOR is efficient as it uses a novel combination of group signatures and ID-based encryption for route discovery. Wormhole attacks cannot be prevented by the USOR mechanism. The proposed system is aimed at developing an unobservable routing scheme resistant against DoS attacks, such as gray hole/black hole attacks, to protect network-layer reactive protocols. It discovers malicious nodes during the route discovery process when they propagate fabricated routing information to attract the source node to send data through malformed packets. Security analysis demonstrates that USOR can well protect user privacy against internal and external attackers. The simulation results show that it achieves stronger privacy protection than existing schemes.

ENERGY CONSUMPTION IN SENSOR NETWORKS USING CONTINUOUS NEIGHBOR DISCOVERY
G.Senthil Kumar, A.Maria Nancy, SRM University
Abstract In a wireless sensor network, establishing reliable path connectivity and packet exchange takes more time and also needs more power. Two techniques are analysed here to reduce time and restrain power consumption. One technique is Continuous Neighbor Discovery: it finds neighbor nodes and continuously maintains an up-to-date view of the immediate neighborhood. The other technique is the Link Assessment Method, which allows a probabilistic guarantee of collision-free packet exchange. Each sensor uses a simple protocol in a coordinated effort to reduce power consumption without increasing the time required to detect hidden sensors.
SASY USER NAME AND PASSWORD CLOUD
R.Monisha, K.S.Viveka, SCAD College of Engineering and Technology
Abstract In this paper, we discuss user authentication problems and the difficulties of managing user names and passwords. In many cases the lack of standard rules for choosing user names and passwords has made it really challenging to remember login information. We propose a model and a technique for this issue and discuss the implementation and utilization of this service.

AN EFFICIENT DATA AUDITING IN CLOUD COMPUTING
K. Nanthini, T. Saravanan, PSN College of Engineering and Technology
Abstract Cloud storage allows users to store their data and use cloud applications without the need for local hardware and software resources. A cloud storage service poses many security risks against storage correctness. This paper presents a flexible distributed storage integrity auditing mechanism to achieve fast localization of data errors with low complexity. The proposed design allows users to audit their data, and it achieves dynamic data support to ensure the correctness and availability of users' data in the cloud, i.e., it efficiently supports block modification, deletion, and append.
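Per-block fingerprints are one simple way to localize data errors; the sketch below illustrates the idea, not the paper's exact auditing protocol:

```python
import hashlib

def fingerprint_blocks(blocks):
    """Owner-side: keep one SHA-256 digest per stored block."""
    return [hashlib.sha256(b).hexdigest() for b in blocks]

def audit(blocks, fingerprints):
    """Return the indices of blocks whose current digest no longer
    matches, localizing the error instead of merely flagging the file."""
    return [i for i, b in enumerate(blocks)
            if hashlib.sha256(b).hexdigest() != fingerprints[i]]
```

Dynamic data support follows naturally: a legitimate block modification or append just recomputes (or adds) that one digest, while any unauthorized change shows up at its exact block index.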

NATURE AGENT BASED ADAPTIVE ENERGY EFFICIENT MOBILITY PATTERN AWARE ALGORITHM FOR MANET
K.Naresh Kumar Thapa, Dr.T.Pearson, DMI College of Engineering
Abstract Over the last decade, research efforts have been made in MANETs (Mobile Ad Hoc Networks) to develop efficient routing based on energy consumption, time delay and Quality of Service (QoS). But many research papers concentrate on routing with little or no security. Security and routing do not usually go hand in hand, as adding security features to routing leads to a cost factor. In our paper we have suggested, theoretically, providing routing with security as a software package, so that extra security features need not come at extra cost. Thus, in this paper, security and routing go hand in hand. We conclude the paper stating that adding security to a routing protocol does not affect the time delay, energy consumed and other Quality of Service (QoS) metrics.
RESCUE ROBOT FOR LIFE SAVING OPERATION
X.Mary Ajila Xavier, Scad College of Engineering and Technology
Abstract Science and technology have gone in depth into all day-to-day applications like automation, biometrics, bio-medical and life-saving systems and equipment. We would like to develop a rescue robot for saving human lives and belongings during natural and abnormal disasters. The proposed robot is a life-saving machine and may extend the lives of people injured in accident and collision areas. The proposed robot will be designed with extreme care to withstand load, high and low temperatures and chemical environments.

FUZZY NETWORK PROFILING FOR INTRUSION DETECTION
Bharath .B, Sri Manakula Vinayagar Engineering College
Abstract The Fuzzy Intrusion Recognition Engine (FIRE) is an anomaly-based intrusion detection system that uses fuzzy logic to assess whether malicious activity is taking place on a network. It uses simple data mining techniques to process the network input data and help expose metrics that are particularly significant to anomaly detection. These metrics are then evaluated as fuzzy sets. FIRE uses a fuzzy analysis engine to evaluate the fuzzy inputs and trigger alert levels for the security administrator. This paper describes the components in the FIRE architecture and explains their roles. Particular attention is given to explaining the benefits of data mining and how this can improve the meaningfulness of the fuzzy sets. Fuzzy rules are developed for some common intrusion detection scenarios. The results of tests with actual network data and actual malicious attacks are described. The FIRE IDS can detect a wide range of common attack types.
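A fuzzy evaluation of network metrics can be sketched as follows; the membership breakpoints, metric names and rules are invented for illustration, not FIRE's actual rule base:

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal membership: 0 below a, ramps up to 1 on [b, c], 0 above d."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

def alert_level(syn_rate, unique_ports):
    """Tiny two-rule fuzzy engine over per-host traffic metrics.

    Rule 1: IF syn-rate is HIGH AND ports-touched is HIGH -> port scan.
    Rule 2: IF syn-rate is HIGH -> SYN flood.
    AND is min, rule aggregation is max (standard Mamdani choices).
    """
    syn_high = trapezoid(syn_rate, 20, 100, 1e9, 2e9)       # open shoulder
    ports_high = trapezoid(unique_ports, 10, 50, 1e9, 2e9)
    scan = min(syn_high, ports_high)    # rule 1 firing strength
    flood = syn_high                    # rule 2 firing strength
    return max(scan, flood)
```

The graded output is the point of the fuzzy approach: instead of a binary alarm at a hard threshold, the administrator sees an alert level that rises smoothly as the metrics drift away from the host's normal profile.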

A SECURE CLOUD STORAGE SERVICE BASED ON SEPARATE ENCRYPTION AND DECRYPTION
MR.K.MANIVANNAN, S.PRIADARSINI, PSNA COLLEGE OF ENGINEERING AND TECHNOLOGY
Abstract - Enterprises usually store data in internal storage and install firewalls to protect against intruders accessing the data. They also standardize data access procedures to prevent insiders from disclosing information without permission. In cloud computing, the data is stored in storage provided by service providers. Service providers must have a viable way to protect their clients' data, especially to prevent the data from disclosure by unauthorized insiders. Storing the data in encrypted form is a common method of information privacy protection. If a cloud system is responsible for both storage and encryption/decryption of data, the system administrators may simultaneously obtain encrypted data and decryption keys; this allows them to access information without authorization and thus poses a risk to information privacy. This study proposes a business model for cloud computing based on the concept of separating the encryption and decryption service from the storage service. A CRM (Customer Relationship Management) service is described in this paper as an example to illustrate the proposed business model.
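The separation of duties can be sketched with two independent services; the XOR keystream below is a toy stand-in for a real cipher such as AES-GCM, and the class names are illustrative:

```python
import hashlib
import secrets

def keystream_xor(key, data):
    """Toy CTR-style XOR cipher built from SHA-256 (illustration only;
    a real deployment would use an authenticated cipher such as AES-GCM)."""
    out = bytearray()
    for offset in range(0, len(data), 32):
        pad = hashlib.sha256(key + offset.to_bytes(8, "big")).digest()
        out.extend(b ^ p for b, p in zip(data[offset:offset + 32], pad))
    return bytes(out)

class StorageService:
    """Stores opaque ciphertext; never sees keys or plaintext."""
    def __init__(self):
        self._blobs = {}
    def put(self, name, blob):
        self._blobs[name] = blob
    def get(self, name):
        return self._blobs[name]

class CryptoService:
    """Holds keys; never stores data. Compromising either service alone
    reveals nothing usable -- the point of the proposed business model."""
    def __init__(self):
        self._keys = {}
    def encrypt(self, owner, plaintext):
        key = self._keys.setdefault(owner, secrets.token_bytes(32))
        return keystream_xor(key, plaintext)
    def decrypt(self, owner, ciphertext):
        return keystream_xor(self._keys[owner], ciphertext)
```

An administrator of `StorageService` holds only ciphertext, and an administrator of `CryptoService` holds only keys; neither alone can read a CRM record.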

HIGH EFFICIENCY VIDEO CODEC USING CDSA
J.Cinista, Mrs.S.Buvaneswari
Abstract This paper presents an efficient low-power VLSI architecture of an in-loop adaptive bilateral filter with a high-efficiency data access system for supporting multiple video coding standards, including H.264 BP/MP/HP, SVC, MVC, AVS, and VC-1. Advanced standards, such as H.264 MP/HP, SVC, and MVC, adopt Macroblock Adaptive Frame-Field coding to enhance motion estimation, which otherwise suffers poor performance and is limited to a lower data rate and speed. The adaptive bilateral loop filter with an in-loop filter is used to eliminate ringing artifacts. This design challenge has not been discussed in previous works, to our best knowledge. Therefore, we develop a Cross-Diamond Search Algorithm to manipulate motion prediction vectors, which provides a higher compression ratio.
A FLASH TRIE ARCHITECTURE LOOKUP FOR IPv6 PROTOCOL
K.Nithya
Abstract: It is becoming apparent that the next-generation IP route lookup architecture needs to achieve speeds of 100 Gbps and beyond while supporting both IPv4 and IPv6 with fast real-time updates to accommodate ever-growing routing tables. Some of the proposed multibit-trie based schemes, such as Tree Bitmap, have been used in today's high-end routers. However, their large data structure often requires multiple external memory accesses for each route lookup. A pipelining technique is widely used to achieve high-speed lookup, at the cost of using many external memory chips; pipelining also often leads to poor memory load balancing. In this project, a new IP route lookup architecture called Flash Trie is proposed that overcomes the shortcomings of the multibit-trie based approach. It uses a hash-based membership query to limit off-chip memory accesses per lookup to one and to balance memory utilization among the memory modules.
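The hash-based membership query for longest-prefix matching can be sketched as follows; the 32-bit prefixes and dictionary tables are simplifications of the actual Flash Trie structure, not its hardware design:

```python
def build_tables(routes):
    """Group routes into one hash table per prefix length.

    routes: {(prefix_int, length): next_hop}; IPv4-style 32-bit addresses
    here, but the idea is length-agnostic (Flash Trie targets IPv6 too).
    """
    tables = {}
    for (prefix, length), hop in routes.items():
        tables.setdefault(length, {})[prefix] = hop
    return tables

def lookup(tables, addr):
    """Longest-prefix match via hash membership queries, longest first.

    Flash Trie keeps the membership filters on-chip so at most one
    off-chip access is needed; here each probe is just a dict hit.
    """
    for length in sorted(tables, reverse=True):
        prefix = addr >> (32 - length) << (32 - length)
        hop = tables[length].get(prefix)
        if hop is not None:
            return hop
    return None       # no route (a router would fall back to a default route)
```

Because each probe is an exact-match hash lookup rather than a trie walk, the number of memory accesses is bounded by the number of distinct prefix lengths, not the prefix depth.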
A new data structure called Prefix-Compressed Trie is developed that reduces the size of a bitmap. Flash Trie also supports incremental real-time updates.
CHANNEL ESTIMATION ALGORITHMS USING LSE FOR OFDM-IDMA
Kamatchi.M, Saravana Kumar.P, Shri Andal Alagar College of Engineering

Mr.A.Rajan, Shri Andal Alagar College of Engineering

Abstract This project presents a pilot-based algorithm performing LSE (Least Square Error) channel estimation at the channel equalizer, for better equalization at the receiver. LSE pilots can be arranged in block-type and comb-type patterns. In an OFDM signal the bandwidth is divided into many narrow sub-channels which are transmitted in parallel; each sub-channel is typically chosen narrow enough to eliminate the effect of delay spread, and OFDM is combined with IDMA to overcome the effect of ISI. Broadband wireless systems based on orthogonal frequency division multiplexing (OFDM) often require an IFFT/FFT to produce multiple subcarriers. Channel estimation is a crucial process at the receiver side. We propose a new method, LSE, to estimate faded channel signals and reconstruct the received signal with equal offset to the transmitted signal.
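The LSE pilot estimate is simply H_k = Y_k / X_k at each pilot subcarrier, followed by interpolation across the comb for the remaining subcarriers; the sketch below assumes a noiseless channel for clarity:

```python
def ls_channel_estimate(tx_pilots, rx_pilots):
    """Least-squares estimate at pilot subcarriers: H_k = Y_k / X_k.

    tx_pilots / rx_pilots: {subcarrier_index: complex symbol}.
    """
    return {k: rx_pilots[k] / tx_pilots[k] for k in tx_pilots}

def interpolate(h_pilots, n_subcarriers):
    """Linear interpolation between comb-type pilots over all subcarriers."""
    ks = sorted(h_pilots)
    h = []
    for n in range(n_subcarriers):
        if n <= ks[0]:
            h.append(h_pilots[ks[0]])       # extrapolate flat at the edges
        elif n >= ks[-1]:
            h.append(h_pilots[ks[-1]])
        else:
            # find the two pilots bracketing subcarrier n
            j = max(i for i, k in enumerate(ks) if k <= n)
            k0, k1 = ks[j], ks[j + 1]
            t = (n - k0) / (k1 - k0)
            h.append((1 - t) * h_pilots[k0] + t * h_pilots[k1])
    return h
```

The equalizer then divides each received subcarrier by its interpolated channel estimate; with noise present, the LS estimate at the pilots becomes `H_k = Y_k / X_k` plus a noise term, which averaging or MMSE smoothing would reduce.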
