
International Conference on Innovative Trends in Computing and Technology (ICITCT'13), March 2013


JISHNUVIMAL, PG, A.SARAVANAN, MahaBarathi Engineering College

Abstract The Advanced Encryption Standard (AES) is the accepted symmetric cryptographic standard for transferring blocks of data safely. To prevent AES from suffering fault attacks, error detection techniques can be adopted to detect errors during encryption or decryption. In this paper, low-complexity fault detection schemes for a reliable AES architecture are presented. A parity-based mechanism is implemented in place of the lookup-table method for SubBytes, and thereby we propose low-complexity fault detection schemes for AES encryption and decryption. Our simulation results show error coverage of greater than 99 percent for the proposed schemes.
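The parity idea can be sketched as follows. This is an illustrative Python sketch, not the paper's implementation: a toy 16-entry substitution box stands in for the 256-entry AES S-box (the parity scheme is identical), and a precomputed parity bit per entry replaces a second full lookup table. A fault injected into the substitution output flips the parity and is detected.

```python
def parity(byte):
    """Even-parity bit of a value: XOR of all its bits."""
    p = 0
    while byte:
        p ^= byte & 1
        byte >>= 1
    return p

# Toy 16-entry substitution box standing in for the AES S-box.
SBOX = [0x6, 0xB, 0x5, 0x4, 0x2, 0xE, 0x7, 0xA,
        0x9, 0xD, 0xF, 0xC, 0x3, 0x1, 0x0, 0x8]

# Precomputed parity of each S-box output, stored alongside the
# substitution logic instead of a second lookup table.
SBOX_PARITY = [parity(v) for v in SBOX]

def sub_byte_checked(x, fault=0):
    """Substitute x; XOR `fault` in to model a hardware fault.
    Returns (output, error_detected)."""
    y = SBOX[x] ^ fault
    return y, parity(y) != SBOX_PARITY[x]
```

Any odd-weight fault in the substitution output is caught immediately, which is what gives parity schemes their high coverage at low cost.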

S.RENUGADEVI, S.AFSAR SALEEMA, Anna University, Chennai,

Abstract There is too much information available on the web, and users are often not patient enough to scan the long list of results returned by search engines to find relevant information. Web search can be made more useful, effective, and less burdensome by inferring what would be relevant for the current user for a given query, considering the individual user's interests, and placing those results on top so that the user need not scroll through a long list. Collaboration among many users makes it possible to search and recommend results more effectively. For multiple-word queries, an N-gram approach is applied to search within documents, especially when one must work with phrase queries; here the n-gram approach is used for searching and retrieving the best-matched key phrase. For multiple-word queries, we internally obtain a result set for each word in the query, but better matches combined with a good UsersRank are more likely to occur at the first spots. Collaborative information retrieval in the ranking phase provides pages that are more relevant and more preferred by users in the collaboration.
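A minimal sketch of word-level n-gram phrase matching, assuming a simple overlap score (the documents, query, and scoring choice here are illustrative, not the paper's):

```python
def word_ngrams(text, n=2):
    """Set of word-level n-grams (bigrams by default) of a text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def ngram_score(query, doc, n=2):
    """Fraction of the query's n-grams that appear in the document."""
    q, d = word_ngrams(query, n), word_ngrams(doc, n)
    if not q:
        return 0.0
    return len(q & d) / len(q)

docs = ["collaborative web search ranking",
        "weather report for chennai",
        "collaborative search among many users"]
query = "collaborative web search"
ranked = sorted(docs, key=lambda d: ngram_score(query, d), reverse=True)
```

Because bigrams preserve word order, the phrase-preserving document outranks one that merely shares individual words; a UsersRank term could be folded into the sort key in the same way.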
Organized by: Department of Computer Science and Engineering & Information Technology, The Rajaas Engineering College, Vadakangulam


Vickram College of Engineering


Abstract Engineering finds a wide range of applications in every field, and the medical field is no exception. One of the technologies that aids surgeons in performing even the most complicated surgeries successfully is Virtual Reality (VR). Even though virtual reality is employed to carry out operations, the surgeon's attention remains one of the most important parameters; any mistake may lead to a dangerous end. One may therefore think of a technology that reduces the burden on a surgeon by providing more efficient interaction than VR. This has become reality by means of HAPTIC TECHNOLOGY. Haptics is the science of applying tactile sensation to human interaction with computers. In our paper we discuss the basic concepts behind haptics, the haptic devices, and how these devices interact to produce the sense of touch and force-feedback mechanisms. The implementation of these mechanisms by means of haptic rendering and contact detection is also discussed. We mainly focus on the application of haptic technology in surgical simulation and medical training. Further, we explain the storage and retrieval of haptic data while working with haptic devices, and illustrate the necessity of haptic data compression.


INDU R NETHAJI Mahendra Institute of Engineering & Technology

Abstract This research paper implements i-TreeSearch using Top-k Approximate Subtree Matching (TASM): the problem of finding the k best matches of a small query tree within a large document tree, using the canonical tree edit distance as a similarity measure between subtrees.

Evaluating the tree edit distance for large XML trees is difficult. The best known algorithms have cubic runtime and quadratic space complexity, and, thus, do not scale. TASM-postorder is a memory-efficient and scalable TASM algorithm. This paper proves an upper bound for the maximum subtree size for which the tree edit distance needs to be evaluated. The upper bound depends on the query and is independent of the document size and structure. A core problem is to efficiently prune subtrees that are above this size threshold. I develop an algorithm based on the prefix ring buffer that allows us to prune all subtrees above the threshold in a single postorder scan of the document.
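The pruning criterion can be sketched as follows. This is an illustrative Python sketch under simplifying assumptions (trees as `(label, children)` tuples, no prefix ring buffer): a single postorder traversal computes subtree sizes and keeps only subtrees within the size threshold, since larger subtrees cannot be among the k best matches.

```python
def candidate_subtrees(tree, max_size):
    """Postorder scan: collect (label, size) for every subtree whose
    size is within the threshold; larger subtrees are pruned."""
    results = []

    def size(node):
        label, children = node
        s = 1 + sum(size(c) for c in children)  # postorder: children first
        if s <= max_size:
            results.append((label, s))
        return s

    size(tree)
    return results

# Example document tree: a(b, c(d)); with threshold 2 the root is pruned.
tree = ("a", [("b", []), ("c", [("d", [])])])
found = candidate_subtrees(tree, max_size=2)
```

TASM-postorder achieves the same effect memory-efficiently with a prefix ring buffer rather than by materializing the whole tree; this sketch shows only the size-threshold pruning itself.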

SEMANTIC WEB SERVICES - A SURVEY Gayathiri Abstract Semantic Web Services are the technology in which the meaning of information and of web services is defined, making it possible for the web to understand and satisfy people's requests. The idea is to have data on the web defined and linked so that it can be used by machines not just for display, but for automation, integration, and reuse across applications. Semantics is introduced to overcome the limitations of Web services: average WWW searches examine only about 25% of potentially relevant sites and return much unwanted information; information on the web is not suitable for software agents; and the web keeps doubling in size. Semantic Web Services are built on top of Web Services, extended with rich semantic representations along with capabilities for automatic reasoning developed in the field of artificial intelligence. This work attempts to give an overview of the underlying concepts and technologies, along with the categorization, selection, and discovery of services based on semantics.

L.M. GLADIS BEULA, Mrs. N. SARAVANAN, VelTech MultiTech Dr. Rangarajan Dr. Sakunthala Engineering College.


Abstract A number of high-bandwidth networks have been constructed, yet existing high-speed protocols cannot fully utilize their bandwidth because their fixed-size application-level receiving buffers suffer from a buffer bottleneck. To achieve high-speed data transfer as well as easy deployment, UDP-based high-speed protocols running at the application level have recently been proposed and deployed, but they still cannot fully utilize these high-bandwidth networks. In this paper, we analyze the buffer bottleneck problem and propose Rada. By periodically detecting the data arrival rate and consumption rate in the buffer using an exponential moving average scheme, Rada adapts the buffer size dynamically: it increases/decreases the buffer when the data arrival rate is constantly faster/slower than the data consumption rate. The adaptation extent in each increase/decrease operation follows a Linear Aggressive Increase, Conservative Decrease scheme, and memory utilization is based on a weighted mean function.
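The adaptation loop can be sketched as follows. This is an illustrative Python sketch, not Rada itself: the smoothing factor, step sizes, and minimum size are made-up constants, chosen only to show the Linear Aggressive Increase, Conservative Decrease shape.

```python
class AdaptiveBuffer:
    """EMA-smoothed arrival/consumption rates drive buffer resizing:
    grow aggressively when arrivals outpace consumption, shrink
    conservatively otherwise."""

    def __init__(self, size=64, alpha=0.2, grow=16, shrink=4, min_size=16):
        self.size = size
        self.alpha = alpha                      # EMA smoothing factor
        self.grow, self.shrink = grow, shrink   # asymmetric step sizes
        self.min_size = min_size
        self.ema_in = 0.0                       # smoothed arrival rate
        self.ema_out = 0.0                      # smoothed consumption rate

    def update(self, arrived, consumed):
        self.ema_in = self.alpha * arrived + (1 - self.alpha) * self.ema_in
        self.ema_out = self.alpha * consumed + (1 - self.alpha) * self.ema_out
        if self.ema_in > self.ema_out:
            self.size += self.grow              # aggressive linear increase
        elif self.ema_in < self.ema_out:
            self.size = max(self.min_size, self.size - self.shrink)
        return self.size
```

The asymmetry (large growth step, small shrink step) favors avoiding packet loss over reclaiming memory quickly, which matches the stated design intent.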


GREESHMA BANERJI, HEMALATHA B., Mahendra College of Engineering for Women

Abstract Dynamic loading is an important mechanism for software development. It allows an application the flexibility to dynamically link a component and use its exported functionality. In general, an operating system or runtime environment resolves the loading of a specifically named component by searching for its first occurrence in a sequence of directories determined at runtime. A key step in dynamic loading is component resolution, i.e., locating the correct component for use at runtime; operating systems generally provide two resolution methods, specifying either the full path or the filename of the target component. Correct component resolution is critical for reliable and secure software, because dynamic loading can be hijacked by placing an arbitrary file with the specified name in a directory searched before the target component is resolved. It is therefore important to detect and fix these vulnerabilities. This is the first automated technique to detect vulnerable and unsafe dynamic component loadings. We classify two types of unsafe dynamic loadings, resolution failure and resolution hijacking, and develop an effective dynamic program analysis to detect and avoid both types. It can detect more than 1,700 unsafe DLL loadings and discover new serious attack vectors for remote code execution. The detected malicious DLL files are prevented from loading when any file is opened or copied, and the user is then asked whether or not to continue; if the user chooses to continue, the system proceeds with the corresponding opening or copying process, and if the user wants to stop, the stop option can be selected.
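The two unsafe cases can be sketched as follows. This is an illustrative Python sketch, not the paper's analysis: the search path and file set are toy inputs, and the real system instruments actual loader behavior rather than simulating it.

```python
def check_loading(filename, search_path, existing_files):
    """Resolve a filename-only load along `search_path` (first hit wins)
    and report the directories searched before the hit: an attacker who
    can write there can hijack the load (resolution hijacking).  If no
    directory has the file, the load is a resolution failure and every
    searched directory is plantable.
    Returns (resolved_path_or_None, hijackable_dirs)."""
    hijackable = []
    for d in search_path:
        candidate = d.rstrip("/") + "/" + filename
        if candidate in existing_files:
            return candidate, hijackable
        hijackable.append(d)        # searched but empty: plantable
    return None, hijackable
```

Loading by full path instead of by filename makes `hijackable` empty by construction, which is why full-path resolution is the safe default.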



Abstract Modern society is heavily dependent on wireless networks for data transmission, and jamming attacks occur during transmission in the wireless medium. Selective jamming attacks can be launched by performing real-time packet classification at the physical layer. All-Or-Nothing Transformation (AONT) methods introduce a modest communication and computation overhead; in this method a block encryption algorithm is used to hide the messages, but the algorithm does not consider timing limits and parameter lengths. To overcome this problem, a smart code generator algorithm is used. This technique provides a strong security level in the wireless medium.

PREVENTION OF BLACK HOLE ATTACK AND CO-OPERATIVE BLACK HOLE ATTACK IN MANET Jeeva Abstract Advancement in the research field has witnessed rapid development in Mobile Ad-hoc Networks. Their distributed, infrastructure-less nature makes them easy prey to security threats. A black hole is a malicious node that replies to route requests claiming it has a fresh route to the destination, and then drops all the packets it receives. The damage is more serious when such nodes work as a group; this type of attack is called a co-operative black hole attack. In this work, we have designed a routing solution called Trust Based DSR (TBDSR) that enables the Dynamic Source Routing protocol (DSR) to find a secure end-to-end route free of black hole nodes with cooperation from the neighbors. Our solution can also protect the network in the presence of colluding attackers without the need for promiscuous monitoring of the neighbor nodes. The extended defense routing protocol worked efficiently for malicious node detection and removal in the case of a co-operative black hole attack, resulting in increased network performance. Keywords: Black Hole Attack, Cooperative Black Hole Attack, Ad Hoc Networks, DSR.


JEYABHARATHI, P.S.R. Engineering College.

Abstract A wireless ad hoc network is a collection of wireless hosts that can be rapidly deployed as a multi-hop packet radio network without the aid of any established infrastructure or centralized administration. The cost reduction and fast evolution of wireless communication technologies have made them suitable for a wide spectrum of applications, one of which is multicast networking. Multicasting systems aim at providing a platform for applications that can improve safety and efficient group communication. The proposed asynchronous key verification scheme, as part of the protocol, poses a significant reduction in message delay.



Nandha Engineering College,

Abstract Cloud computing has emerged as one of the most influential paradigms in the IT industry. Because this new computing technology requires users to entrust their valuable data to cloud providers, there have been increasing security and privacy concerns about outsourced data. Several schemes employing attribute-based encryption (ABE) have been proposed for access control of outsourced data in cloud computing, but most of them suffer from inflexibility in implementing complex access control policies. Allowing cloud service providers (CSPs), which are not in the same trusted domains as enterprise users, to take care of confidential data may raise potential security and privacy issues. To keep sensitive user data confidential against untrusted CSPs, a natural approach is to apply cryptographic techniques, disclosing decryption keys only to authorized users, while also providing high performance, full delegation, and scalability, so as to best serve the needs of accessing data anytime and anywhere, delegating within enterprises, and supporting a dynamic set of users. HASBE employs multiple value assignments for access expiration time to deal with user revocation more efficiently than existing schemes, and provides fine-grained access control and full delegation. Based on the HASBE model, we finally propose a scalable revocation scheme that delegates most of the computing tasks in revocation to the CSP, to achieve a dynamic set of users efficiently.


METILDA.D, KALAIVANI.I Dr. Sivanthi Aditanar College of Engineering

Abstract Lymphoma is a cancer of the lymphocytes. The proposed approach classifies three types of malignant lymphoma: chronic lymphocytic leukemia, follicular lymphoma, and mantle cell lymphoma. Initially, raw pixels are transformed with a set of transforms into spectral planes; both simple and compound transforms are computed. Raw pixels and spectral planes are then routed to the second stage, where the set of features is computed at the inner level. A single feature vector is formed by fusing all computed features, and the classification mechanism is carried out to classify the malignancies by type.


M.KARTHI S.NACHIYAPPAN Velammal College of Engineering and Technology

Abstract Cloud providers can offer cloud consumers two provisioning plans for computing resources, namely a reservation plan and an on-demand plan. In general, the cost of computing resources provisioned by the reservation plan is cheaper than the on-demand plan. Many kinds of resource provisioning options are available in cloud environments to reduce the total cost and better utilize cloud resources. However, the best advance reservation of resources is difficult to achieve due to the uncertainty of consumers' future demand and providers' resource prices. To address this problem, a probabilistic cloud resource provisioning (PCRP) algorithm is proposed by formulating a probabilistic model. The solution of the PCRP algorithm considers a state-based machine, the probability of utilization, and estimated future demand.

MR. K.A. RAJA, M.E., K. LAVANYA, Ranipettai Engineering College.

Abstract A wireless sensor network (WSN) consists of spatially distributed sensors that monitor environmental conditions and cooperatively collect and pass their data through the network to a main location or sink. Data collection is thus a fundamental function provided by wireless sensor networks. The performance of data collection can be characterized by the rate at which sensing data can be collected and transmitted to the sink node; data collection capacity reflects how fast the sink can collect sensing data from all sensors under interference constraints. In this project, in order to obtain optimal performance for arbitrary sensor networks, we use a simple BFS tree method for data collection and a greedy scheduling algorithm for deriving capacity bounds of data collection under a general graph model, where two nearby nodes may be unable to communicate due to barriers or path fading, and where sensor nodes can be deployed in a Gaussian distribution with any network topology.
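The BFS collection tree can be sketched as follows. This is an illustrative Python sketch under simplifying assumptions (an adjacency-list graph, no interference model): each sensor forwards data along its tree parent toward the sink, and the depth map bounds how many hops each reading travels.

```python
from collections import deque

def bfs_tree(adj, sink):
    """Build the BFS collection tree rooted at the sink.
    adj: {node: [neighbors]}.  Returns (parent, depth) maps; each
    node forwards its sensing data to parent[node]."""
    parent = {sink: None}
    depth = {sink: 0}
    q = deque([sink])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in parent:          # first visit = shortest hop count
                parent[v] = u
                depth[v] = depth[u] + 1
                q.append(v)
    return parent, depth

# Toy 4-node network: sink 0, chain 0-1-3 plus leaf 2.
adj = {0: [1, 2], 1: [0, 3], 2: [0], 3: [1]}
parent, depth = bfs_tree(adj, 0)
```

A greedy scheduler would then assign transmission slots level by level over this tree, skipping pairs of links that interfere; the sketch covers only the tree construction.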



Abstract This paper focuses on the detection of compromised machines in a network that are used for sending spam messages, commonly referred to as spam zombies. The nature of sequentially observing outgoing messages gives rise to a sequential detection problem. In this project, we develop a spam zombie detection system, named SPOT, by monitoring the outgoing messages of a network. SPOT is designed based on a powerful statistical method called the Sequential Probability Ratio Test (SPRT), which can be used to test between two hypotheses (in our case, a machine is compromised versus the machine is not compromised) as the events (in our case, outgoing messages) occur sequentially. Our evaluation studies show that SPOT is an effective and efficient system for automatically detecting compromised machines in a network. The system identifies spam messages and detects the compromised machine based on those messages; after identifying a spam message, SPOT detects the compromised machine and restricts the outgoing messages of that machine. This system can be used in online applications, and with it we aim to reduce the large number of compromised machines in a network.
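The SPRT decision rule can be sketched as follows. This is a generic Wald-style SPRT in Python, not SPOT itself; the spam probabilities and error rates are illustrative parameters, not values from the paper.

```python
import math

def sprt(observations, p0=0.2, p1=0.8, alpha=0.01, beta=0.01):
    """Sequential Probability Ratio Test over a stream of message
    labels (1 = spam, 0 = ham).  p0/p1: spam probability under the
    normal/compromised hypotheses; alpha/beta: tolerated false
    positive/negative rates.  Returns a verdict as soon as the
    log-likelihood ratio crosses a threshold."""
    upper = math.log((1 - beta) / alpha)    # accept H1: compromised
    lower = math.log(beta / (1 - alpha))    # accept H0: normal
    llr = 0.0
    for x in observations:
        if x:
            llr += math.log(p1 / p0)
        else:
            llr += math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "compromised"
        if llr <= lower:
            return "normal"
    return "undecided"                      # need more observations
```

The appeal of SPRT for this setting is that it reaches a decision after very few messages when the evidence is one-sided, while bounding both error probabilities by design.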



Abstract Internet services and applications have become an inextricable part of daily life, enabling communication and the management of personal information from anywhere. To accommodate this increase in application and data complexity, web services have moved to a multi-tiered design wherein the web server runs the application front-end logic and data are outsourced to a database or file server. This is the main reason attackers try to attack the database. Such cyber security attacks can be detected using DoubleGuard, which differs from approaches that correlate alerts from independent IDSs. It uses a container-based, session-separated web server architecture that enhances security performance and provides isolation between the information flows, which are separated in each container session. Virtualization is used to isolate objects and enhance security performance, and lightweight containers can have considerable performance advantages over full virtualization.

SECURE AUTHENTICATION USING BIOMETRIC CRYPTOSYSTEM
MS. N.MADHU SUGANYA, MS.T.MEKALA M.Kumarasamy College of engineering

Abstract Cryptography is a means to protect data during transmission over a wireless network. It is used in information security to protect information from unauthorized or accidental disclosure while the information is in transit (either electronically or physically) and while it is in storage, since the information could otherwise be accessed by unauthorized users for malicious purposes. It is therefore necessary to apply effective encryption/decryption methods to enhance data security. The existing system only limits the total number of users from unknown remote hosts to as low as that from known remote hosts. It uses whitelist values for tracking legitimate users, but the cookie value expires after a certain time period, so attackers may use different browsers, try another machine, or retry after some time; and if a malicious attack occurs, the authenticated user does not know about it. The proposed system uses two algorithms, a Bio-Metric Encryption Algorithm (BEA) and a Minutiae Extraction Algorithm (MEA), and uses multi-biometric features for authentication. The system also dynamically generates a new session key for each transaction, and after each transaction the authenticated user must change their PIN number to improve security. The proposed system thus protects data confidentiality, data integrity, authentication, availability, and access control of information over the network.


MR. M. ISLABUDEEN, M.E., (Ph.D), P. NAGARAJAN, Syed Ammal Engineering College

Abstract The main challenge in the design of minimum-delay routing policies is balancing the trade-off between routing packets along the shortest paths to the destination and distributing traffic according to the maximum backpressure. Combining important aspects of shortest-path and backpressure routing, this paper provides a systematic development of a distributed opportunistic routing policy with congestion diversity (D-ORCD) in wireless ad-hoc networks. D-ORCD uses a measure of draining time to opportunistically identify and route packets along paths with an expected low overall congestion, and is proved to ensure a bounded expected delay for all networks under any admissible traffic. Realistic QualNet simulations for 802.11-based networks demonstrate a significant improvement in average delay over comparable solutions in the literature.


MR.P.PRABU MR.T.GOPALAKRISHNAN Bannari Amman Institute of Technology

Abstract Web search results should be reordered so that they are more relevant to the current user according to his or her profile; this is known as personalization. The profile is created from input given directly by the user in any form (keywords, instructions, etc.) and from the user's browsing patterns, and refers to data that may be maintained at the client or server level. Reordering of results should maintain standards, or degrees of relevance, during retrieval by the user. Hyperlinks are formed from the data collection based on the reordered search results, and the hyperlink data is ranked using the PageRank algorithm and the Apriori algorithm.
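A minimal PageRank sketch, assuming a plain power iteration over a small link graph (the graph and constants are illustrative; the paper's personalization signals would modify these scores, not replace them):

```python
def pagerank(links, d=0.85, iters=50):
    """Power-iteration PageRank over a link graph {page: [outlinks]}.
    d is the damping factor; ranks always sum to 1."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - d) / n for p in pages}   # random-jump share
        for p, outs in links.items():
            if outs:
                share = d * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
            else:                                # dangling page: spread evenly
                for q in pages:
                    new[q] += d * rank[p] / n
        rank = new
    return rank

# "a" is linked by both "b" and "c", so it accumulates the most rank.
ranks = pagerank({"a": ["b"], "b": ["a", "c"], "c": ["a"]})
```

A personalized reordering would then combine these link-based scores with profile-derived weights when sorting the result list.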



Abstract Ontologies have become the de facto modeling tool of choice, employed in a variety of applications and prominently in the Semantic Web. Nevertheless, ontology construction remains a daunting task. Ontological bootstrapping, which aims at automatically generating concepts and their relations in a given domain, is a promising technique for ontology construction. Bootstrapping an ontology based on a set of predefined textual sources, such as Web services, must address the problem of multiple concepts that are largely unrelated. This paper exploits the fact that Web services usually consist of both WSDL and free-text descriptors. The WSDL descriptor is evaluated using two methods, namely Term Frequency/Inverse Document Frequency (TF/IDF) and Web context generation. We propose an ontology bootstrapping process that integrates the results of both methods and validates the concepts using the free-text descriptors, the free-text descriptor offering the more accurate definition of ontologies. We extensively validated our ontology bootstrapping process.
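The TF/IDF evaluation step can be sketched as follows. This is a textbook TF-IDF computation in Python, not the paper's pipeline; the token lists stand in for tokenized WSDL descriptors.

```python
import math

def tf_idf(docs):
    """TF-IDF weights per document.  docs: list of token lists.
    tf = term count / doc length; idf = log(N / doc frequency)."""
    n = len(docs)
    df = {}
    for doc in docs:
        for term in set(doc):
            df[term] = df.get(term, 0) + 1
    weights = []
    for doc in docs:
        tf = {}
        for term in doc:
            tf[term] = tf.get(term, 0) + 1
        weights.append({t: (c / len(doc)) * math.log(n / df[t])
                        for t, c in tf.items()})
    return weights

# Toy "descriptors": terms that appear in fewer documents score higher.
w = tf_idf([["book", "flight"], ["book", "hotel"], ["cancel", "flight"]])
```

Terms with high TF-IDF weight in a descriptor become candidate ontology concepts, which the bootstrapping process then cross-validates against the Web context and free-text descriptors.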


K. RAJA, MR. N. ANANDA KUMAR, Arunai College of Engineering.

Abstract Our paper focuses on the design and implementation of the computerization of the Public Distribution System (ration shops) throughout the state. In the current scenario, all public and private sectors are computerizing their processes to simplify them and reduce errors. The Civil Supplies Corporation is the major public sector body that manages and distributes essential commodities to all citizens; products such as rice, sugar, dhal, and kerosene are distributed using the conventional ration shop system. The conventional system has limitations: because of manual measurements, users cannot get accurate quantities of material, and there is a chance of illegal usage of products. We have therefore proposed the computerization of ration shops, and to enhance security we have introduced a fingerprint check for opening the billing interface, so as to avoid illegal entries without the knowledge of the ration card holder. Users also get accurate quantities of supplies at the correct price. In this automated system, the conventional ration card is replaced by an RFID smart card in which all details about the user are stored, and the card holder's fingerprint is used for authentication. Monitoring the Public Distribution System is one of the big issues in the public sector, so we have eased the monitoring of the whole system, and public complaints are sent directly to the higher authority without any intermediary.


A.RAJASEKAR,G.SUTHAKAR, Jayaraj Annapackiam C.S.I.College of Engineering

Abstract In this work, we present a text-based social network spam detection application, tested in particular on Facebook spam. We developed an application to test the prototype of Facebook spam detection. The features for checking spam are the number of keywords, the average number of words, the text length, and the number of links. The data mining model using the J48 decision tree is created using Weka [1]. The methodology can be extended to include other attributes, and the prototype demonstrates the real use of the Facebook application.
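The feature extraction can be sketched as follows. This is an illustrative Python sketch: the keyword list is made up, and the hand-written thresholds merely stand in for the splits a trained J48 tree would learn from labeled data.

```python
SPAM_KEYWORDS = {"free", "win", "click", "prize"}   # illustrative list

def extract_features(post):
    """Compute the four features the model uses for one post."""
    words = post.lower().split()
    links = sum(1 for w in words if w.startswith("http"))
    keywords = sum(1 for w in words if w.strip("!.,") in SPAM_KEYWORDS)
    return {"keywords": keywords, "num_words": len(words),
            "length": len(post), "links": links}

def is_spam(post):
    """Hand-written stand-in for the learned decision tree:
    threshold splits on the extracted features."""
    f = extract_features(post)
    if f["links"] >= 1 and f["keywords"] >= 1:
        return True
    return f["keywords"] >= 2
```

In the real pipeline these feature vectors would be exported to Weka and the split thresholds learned by J48 rather than fixed by hand.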


RAJASEKARAN.S.,KALIFULLA.Y., MURUGESAN.S Veltech Multitech Dr.Rangarajan Dr.Sakunthala Engineering College,

Abstract A cloud storage system, consisting of a collection of storage servers, provides long-term storage services over the Internet. Storing data in a third party's cloud system causes serious concern over data confidentiality. General encryption schemes protect data confidentiality but also limit the functionality of the storage system, because only a few operations are supported over encrypted data. Constructing a secure storage system that supports multiple functions is challenging when the storage system is distributed and has no central authority. We propose a threshold proxy re-encryption scheme and integrate it with a decentralized erasure code to formulate a secure distributed storage system. The system not only supports secure and robust data storage and retrieval, but also lets a user forward his data in the storage servers to another user without retrieving the data back. The main technical contribution is that the proxy re-encryption scheme supports encoding operations over encrypted messages as well as forwarding operations over encoded and encrypted messages; our method fully integrates encrypting, encoding, and forwarding. We analyze and suggest suitable parameters for the number of copies of a message dispatched to storage servers and the number of storage servers queried by a key server. These parameters allow flexible adjustment between the number of storage servers and robustness.

P.RAJESWARI Anna University, Regional Centre,


Abstract A novel normalized mean-median filter is presented for the removal of salt-and-pepper noise from highly corrupted noisy images. Noisy pixels are replaced with either the computed mean value or the computed median value; the proposed method replaces only the noisy pixels. Experimental results show the superiority of the proposed algorithm compared to state-of-the-art methods such as the standard median filter and the progressive switching median filter, especially when the image is corrupted with high-density impulse noise.
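The noisy-pixel-only replacement can be sketched as follows. This is an illustrative Python sketch showing only the median branch of the proposed mean/median choice, assuming 8-bit images where salt-and-pepper noise appears as exact 0 or 255 values.

```python
def denoise(img):
    """Replace only extreme-valued pixels (0 = pepper, 255 = salt)
    with the median of their non-noisy 3x3 neighbours; clean pixels
    are left untouched, unlike a standard median filter."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for i in range(h):
        for j in range(w):
            if img[i][j] not in (0, 255):
                continue                      # clean pixel: keep as-is
            neigh = [img[x][y]
                     for x in range(max(0, i - 1), min(h, i + 2))
                     for y in range(max(0, j - 1), min(w, j + 2))
                     if img[x][y] not in (0, 255)]
            if neigh:                         # else: no clean neighbour yet
                neigh.sort()
                out[i][j] = neigh[len(neigh) // 2]
    return out
```

Skipping clean pixels is what preserves edges and fine detail at high noise densities, where a standard median filter blurs everything uniformly.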


RAMYASREE.R.S. MRS.A.CYNTHIA Dhanalakshmi Srinivasan College of Engg.and Tech.,

Abstract The prevalence of high-definition (HD) cameras, televisions, Blu-ray players, and DVD recorders means that almost all video content is now captured and recorded digitally, and much of it in HD. MPEG-2, H.264/AVC, and VC-1 are the most popular codecs in use today; they rely on decorrelating transforms, motion estimation, intra prediction, and variable-length entropy coding (VLC) to achieve good picture quality at high compression ratios. Alongside the need for efficient video compression, there is a critical requirement for error resilience, in particular in association with wireless networks, which are characterized by highly dynamic variations in error rate and bandwidth. Compression techniques based on prediction and variable-length coding render an encoded bit stream highly sensitive to channel errors. In this paper, techniques such as pyramid vector quantization (PVQ) are implemented to prevent error propagation in the wireless environment through the use of fixed-length codewords, and the frame performance of the video is observed in the pyramid vector section, which offers greater compression performance across various techniques.



KLN College of Information Technology

Abstract The term cloud computing has become one of the latest buzzwords in the IT industry. Cloud computing is an innovative approach that leverages existing IT infrastructure to optimize compute resources and manage data and computing workloads. It promises to increase the velocity with which applications are deployed, increase innovation, and lower costs, all while increasing business agility, and it supports every facet of IT, including servers, storage, networking, and virtualization.



Abstract Neuron reconstruction and dendritic spine identification on a large data set of microscopy images is essential for understanding the relationship between the morphology and functions of dendritic spines. Dendrites are the tree-like structures of neuronal cells, and spines are small protrusions on the surface of dendrites. Spines have various visual shapes (e.g., mushroom, thin, and stubby) and can appear or disappear over time. Existing neurobiology literature shows that the morphological changes of spines and the dendritic spine structures are highly correlated with their underlying cognitive functions. Accurately and automatically extracting meaningful structural information from a large microscopy image data set is a difficult task; one challenge in spine detection and segmentation is how to automatically separate touching spines. In this paper, touching spines are detected based on various global and local geometric features of the dendrite structure, and a breaking-down and stitching-up algorithm is used to segment them.



G.BASKARAN, M.SARANYA Srinivasan Engineering College,

Abstract Location awareness is highly challenging for wireless sensor networks. When localizing nodes using GPS alone, it is observed that the network is not entirely localized, and it is not possible to identify how many nodes can be located within a network; node localizability testing cannot be achieved. A new scheme is proposed for node localizability, combining Euclidean distance ranging techniques with a polynomial-time localizability-testing algorithm. It can identify the number of nodes that can be located in a connected network. When a node is localizable, it can be uniquely localized, and the corresponding paths can be identified using vertex-disjoint paths. Node localizability provides useful guidelines for network deployment and other location-based services.
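Euclidean-distance ranging reduces to multilateration: with at least three non-collinear anchors at known positions and measured distances, subtracting pairs of circle equations yields a linear system in the unknown node coordinates. A small 2-D sketch (anchor positions and distances below are made-up illustration values, not from the paper):

```python
def trilaterate(anchors, dists):
    """Locate a node in 2-D from three anchors (x, y) and measured
    Euclidean distances, by linearizing the circle equations
    against the first anchor."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = dists
    # Subtracting circle 1 from circle i gives a linear equation:
    # 2(xi - x1) x + 2(yi - y1) y = d1^2 - di^2 + xi^2 - x1^2 + yi^2 - y1^2
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1   # non-zero iff anchors are not collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
```

With anchors (0,0), (4,0), (0,4) and distances to the point (1,2), the solver returns that point; with noisy ranges the same linear system is typically solved in a least-squares sense.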


Abstract A portable device such as a digital camera with a single sensor and Bayer color filter array (CFA) requires demosaicing to reconstruct a full color image. In most digital cameras, Bayer CFA images are captured and demosaicing is generally carried out before compression. Recently, it was found that compression-first schemes outperform the conventional demosaicing-first schemes in terms of output image quality. A Genetic Algorithm based lossless compression scheme for Bayer CFA images is proposed in this paper. Simulation results show that the proposed compression scheme
can achieve a better compression performance than conventional lossless CFA image coding schemes.

HOMOMORPHIC AUTHENTICATION WITH DYNAMIC AUDIT FOR CATCHING THE MODIFICATIONS OF DATA IN MULTI CLOUD STORAGE
V.B.VINITHA Prathyusha Institute of Technology and Management

Abstract A multi cloud is a cloud computing environment in which an organization provides and manages some internal and some external resources. Provable data possession (PDP) is an audit technique for ensuring the integrity of data in storage outsourcing. However, early remote data audit schemes focused on static data, and the fact that users no longer have physical possession of the possibly large volume of outsourced data makes data integrity protection a very challenging task. This project proposes a homomorphic authentication with dynamic audit mechanism in multi clouds to support scalable service and data migration, in which multiple cloud service providers collaboratively store and maintain the clients' data. Security in the cloud is achieved by signing each data block before sending it to the cloud using the Boneh–Lynn–Shacham (BLS) algorithm, which is more secure compared to other algorithms. To ensure the correctness of data, we consider an external auditor, called a third party auditor (TPA), to verify the integrity of the data stored in the cloud on behalf of the cloud user. The audit service construction is based on fragment structure, random sampling, and an index-hash table, supporting provable updates to outsourced data and timely anomaly detection. The security of this scheme is based on a multi-prover zero-knowledge proof system, which satisfies the completeness, knowledge soundness, and zero-knowledge properties. The technique of bilinear aggregate signatures is used to achieve batch auditing, which reduces the computation overhead. Extensive security and performance analysis shows the proposed schemes are provably secure and highly efficient.

A SHORT-WAVE INFRARED NANOINJECTION IMAGER WITH 2500 A/W RESPONSIVITY AND LOW EXCESS NOISE



Abstract We report on a novel nanoinjection-based short-wave infrared imager, which consists of InGaAs/GaAsSb/InAlAs/InP-based nanoinjection detectors with internal gain. The imager is 320×256 pixels with a 30 µm pixel pitch. The test pixels show responsivity values in excess of 2500 A/W, indicating generation of more than 2000 electrons/photon with high quantum efficiency. This amplification is achieved at complementary metal-oxide-semiconductor (CMOS) compatible, subvolt bias. The measured excess noise factor F of the hybridized imager pixels is around 1.5 in the responsivity range 1500 to 2000 A/W. The temperature behavior of the internal dark current of the imager pixels is also studied from 300 to 77 K. The presented results show, for the first time, that the nanoinjection mechanism can be implemented in imagers to provide detector-level internal amplification, while maintaining low noise levels and CMOS compatibility.



Abstract Data hiding involves two sets of data: the cover medium and the embedded data, which is called the message. Early video data hiding approaches applied still-image watermarking techniques, extended to video by hiding the message in each frame independently. This work deals with two approaches for data hiding. The first approach uses the quantization scale of constant bit rate video together with second-order multivariate regression; however, the message payload is restricted to one bit per macroblock. The second approach uses Flexible Macroblock Ordering to allocate macroblocks to slice groups according to the content of the message. In existing work on compressed video, packets may be lost if the channel is unreliable. To improve robustness, this work proposes a block shuffling scheme to isolate erroneous blocks caused by packet loss, and applies data hiding to add additional protection for motion vectors. The existing solutions cause compression overhead, whereas the proposed solution reduces the impact of packet loss during transmission.
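The block-shuffling idea can be illustrated with a stride permutation: spatially neighbouring macroblocks are placed into different packets, so one lost packet corrupts isolated blocks (easy to conceal from their intact neighbours) rather than a contiguous region. A toy sketch, assuming a simple coprime-stride scheme rather than the paper's exact permutation:

```python
from math import gcd

def shuffle_blocks(n_blocks, stride):
    """Map block i to transmission slot (i * stride) % n_blocks.
    With gcd(stride, n_blocks) == 1 this is a bijection, so
    spatial neighbours land `stride` slots apart in the stream."""
    assert gcd(stride, n_blocks) == 1, "stride must be coprime with n_blocks"
    return [(i * stride) % n_blocks for i in range(n_blocks)]
```

A burst loss of consecutive slots then maps back to scattered block positions, each surrounded by correctly received blocks for error concealment.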




SAKTHI LAKSHMI PRIYA C Anna University, Regional centre

Abstract A Vehicular Ad Hoc Network (VANET) is a special category of ad hoc network in which vehicles act as nodes. Due to its fast-moving nature, network connectivity is an important factor because it can greatly affect the performance of VANETs. Percolation theory [7] can be used to analyze the connectivity of VANETs through theoretical deduction and to discover the quantitative relationship among network connectivity, vehicle density, and transmission range. When vehicle density or transmission range is large enough, there is a jump in network connectivity. By knowing the vehicle density, it is possible to calculate the minimum transmission range needed to achieve good network connectivity. As a large transmission range can cause serious collisions in wireless links, there is a tradeoff in choosing a proper transmission range. Proper analysis of the transmission range can be useful in the real-world deployment of VANETs.
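The connectivity "jump" matches the classical threshold behaviour: for n nodes placed uniformly over an area, connectivity holds with high probability once the coverage condition π r² n ≳ ln n + c is met (the Gupta–Kumar form). Under that assumed model, which is not necessarily the exact one used in the paper, the minimum range for a given vehicle density can be sketched as:

```python
import math

def min_tx_range(density, area, c=1.0):
    """Smallest transmission range r (in the same length unit as
    sqrt(area)) satisfying pi * r^2 * n >= ln(n) + c, where
    n = density * area. Larger `c` raises the probability margin
    with which the network is connected."""
    n = density * area
    return math.sqrt((math.log(n) + c) / (math.pi * density))
```

As the abstract notes, the required range shrinks as vehicle density grows, which the formula makes explicit.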


SREEKUMAR K N Mahendra Institute of Engineering & Technology

Abstract A data distributor has given sensitive data to a set of supposedly trusted agents (third parties). Some of the data are leaked and found in an unauthorized place (e.g., on the web or on somebody's laptop). The distributor must assess the likelihood that the leaked data came from one or more agents, as opposed to having been independently gathered by other means. This paper proposes data allocation strategies (across the agents) that improve the probability of identifying leakages. These methods do not rely on alterations of the released data (e.g., watermarks). In some cases, we can also inject realistic but fake data records to further improve our chances of detecting leakage and identifying the guilty party.
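The assessment step can be sketched with a simple guilt model of the kind used in the data-leakage literature: assume each leaked object could have been guessed independently with probability p, and otherwise came uniformly from one of the agents that held it; an agent's guilt probability is then one minus the product of per-object innocence terms. A sketch under those stated assumptions (the exact estimator in the paper may differ):

```python
def guilt_probability(leaked, agent_sets, agent, p=0.1):
    """Pr{agent is guilty}: for each leaked object the agent holds,
    it was guessed with probability p, otherwise leaked by one of
    its holders chosen uniformly. Innocence multiplies across objects."""
    prob_innocent = 1.0
    for obj in leaked:
        holders = [a for a, objs in agent_sets.items() if obj in objs]
        if agent in holders:
            prob_innocent *= 1.0 - (1.0 - p) / len(holders)
    return 1.0 - prob_innocent
```

The model also shows why allocation strategies matter: the fewer agents share each object, the more sharply a leak implicates the holder.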




A.ANTON STENY, MRS.A.MANIAMMAL Kurinji College of Engineering & Technology

Abstract DoS attacks aim to exhaust scarce resources by generating illegitimate requests from one or more hosts, which affects the reliability of the Internet and threatens both routers and hosts. To avoid this, a new concept named Adaptive Selective Verification (ASV) is proposed: a distributed adaptive mechanism, based on selective verification, for thwarting attackers' efforts to deny service to legitimate clients. An empirical evaluation of the ASV protocol is performed with the aim of understanding


K.FATHIMA BUSHRA N. MARTINA P. USHADEVI Dr.Sivanthi Aditanar College of Engineering

Abstract A user stores his personal files in a cloud and retrieves them wherever and whenever he wants. For the sake of protecting the user's data privacy and query privacy, a user should store his personal files in encrypted form in the cloud and then send queries in the form of encrypted keywords. However, a simple encryption scheme may not work well when a user wants to retrieve only files containing certain keywords using a thin client. First, the user needs to encrypt and decrypt files frequently, which consumes too much of the client's CPU capability and memory. Second, the service provider cannot determine which files contain the keywords specified by a user
if the encryption is not searchable, and therefore can only return all the encrypted files. A thin client generally has limited bandwidth, CPU, and memory, so this may not be a feasible solution under the circumstances. In this paper, we investigate the characteristics of cloud computing and propose an efficient privacy-preserving keyword search scheme for cloud computing. It allows a service provider to participate in partial decipherment to reduce the client's computational overhead, and enables the service provider to search the keywords on encrypted files, protecting the user's data privacy and query privacy efficiently. By proof, our scheme is semantically secure.
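The searchable-encryption idea can be illustrated with a deterministic keyword token: the client derives a token from each keyword under a secret key, and the provider matches tokens against an index without ever seeing the keyword. This is a minimal sketch of the general technique, not the paper's exact scheme (which additionally involves partial decipherment by the provider):

```python
import hmac
import hashlib

def keyword_token(secret_key: bytes, keyword: str) -> str:
    """Deterministic search token: same key + keyword -> same token,
    but the provider cannot invert it to recover the keyword."""
    return hmac.new(secret_key, keyword.lower().encode(),
                    hashlib.sha256).hexdigest()

def build_index(secret_key: bytes, doc_keywords: dict) -> dict:
    """Map token -> set of document ids, built client-side
    before the encrypted files are uploaded."""
    index = {}
    for doc_id, words in doc_keywords.items():
        for w in words:
            index.setdefault(keyword_token(secret_key, w), set()).add(doc_id)
    return index
```

To search, the thin client sends only keyword_token(key, query); the provider looks it up in the index and returns just the matching encrypted files instead of all of them.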


A.T.SUMITHA, R. VAISHNAVI Sri Sairam Engineering College

Abstract The ever-growing Internet contains a very large amount of digital information in the form of semi-structured documents, and retrieving data relevant to a user query is a Herculean task. Indeed, documents are often so large that the dataset returned as an answer to a query may be too huge to convey interpretable knowledge. In this work an approach is described based on Tree-based Association Rules (TARs), which provide approximate, intensional information on both the structure and the contents of XML documents, and can be stored in XML format as well. This mined knowledge is later used to provide: (i) a concise idea of both the structure and the content of the XML document and (ii) quick, approximate answers to queries. In this work we focus on the second feature. A prototype system and experimental results demonstrate the effectiveness of the approach.


A.THILAGAVATHY K.VIJAYA KANTH Srinivasan engineering college, Perambalur.



Abstract Artificial Neural Networks derive their processing capabilities from a parallel architecture and are widely used in pattern recognition, system identification, and control problems. A Multilayer Perceptron is an artificial neural network with one or more hidden layers. This paper presents the digital implementation of a multilayer perceptron network on an FPGA (Field Programmable Gate Array) for image recognition. The network was implemented with three types of nonlinear activation function: hardlims, satlins, and tansig. The neural network was implemented using VHDL hardware description language code on an XC3S250E-PQ208 Xilinx FPGA device. The results obtained with Xilinx Foundation 9.2i software are presented and analyzed in terms of device utilization and time delay.

A.ATHIRAJA, Dr.P.VENKATA KRISHNAN , Dr. A. ASKARUNISHA Vickram college of engineering,

Abstract This project diagnoses cancer based on microcalcifications (MCs) in mammography images. A pre-processing technique is applied to the mammogram, which is then converted into a training dataset. A perceptron algorithm is used to classify cells as MC-present or MC-absent; the collection of MC-present cells forms the region of interest (ROI). A Case Based Reasoning (CBR) classifier is then used to classify the MC-present cells into the classes initial, small, medium, high, and very high, to support the decision of whether the patient is affected by cancer. If the patient is affected by cancer, antibiotics or operations are suggested. The decision making uses a fuzzy set approach. This method provides better performance compared to previous diagnosis techniques.



M.AYSHWARIYA, M.ANTONY ROBERT RAJ Alpha College of Engineering

Abstract Character recognition is an important area in the image processing and pattern recognition fields. Handwritten character recognition refers to the process of converting handwritten characters into Unicode characters. A recognition system can be either on-line or off-line. Offline Tamil handwritten character recognition is a difficult problem because of the high variability and ambiguity in the character shapes written by individuals. Many researchers have proposed approaches designed to solve this complex problem, but the problems encountered still include long network training times, long recognition times, and low recognition accuracy. The performance of a character recognition system depends on proper feature extraction and correct classifier selection. This paper proposes an approach for offline recognition of Tamil characters using their structural features. Structural features are based on topological and geometrical properties of the character, such as aspect ratio, cross points, loops, branch points, inflection between two points, and horizontal curves at the top or bottom. These features are fed to a Hidden Markov Model (HMM) classifier for recognizing offline Tamil handwritten characters. A high degree of accuracy has been obtained with the implementation of this approach on a comprehensive database, and the precision of the results demonstrates its suitability for commercial usage. The proposed concept is crafted to enhance computational efficiency and improve recognition accuracy.
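Two of the structural features named above are cheap to compute on a binarized glyph: the aspect ratio of the ink bounding box, and cross (stroke-transition) counts along a scan line. A simplified sketch on a toy 0/1 bitmap (the paper's feature definitions may be more elaborate):

```python
def aspect_ratio(glyph):
    """Width / height of the ink bounding box of a 0/1 bitmap
    (list of equal-length rows)."""
    rows = [r for r, line in enumerate(glyph) if any(line)]
    cols = [c for c in range(len(glyph[0])) if any(line[c] for line in glyph)]
    return (cols[-1] - cols[0] + 1) / (rows[-1] - rows[0] + 1)

def crossings(glyph, row):
    """Number of 0 -> 1 transitions along one horizontal scan line,
    i.e. how many separate strokes the line crosses."""
    line = glyph[row]
    return sum(1 for a, b in zip([0] + line, line) if a == 0 and b == 1)
```

Feature vectors built from such measurements (plus loops, branch points, etc.) form the observation sequences passed to the HMM classifier.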


L. BERWIN RUBIA K. MANIMALA Dr. Sivanthi Aditanar College of Engineering

Abstract Slow Feature Analysis (SFA) extracts slowly varying features from a quickly varying input signal. The SFA framework is introduced to the problem of human action recognition by incorporating supervised information into the original unsupervised SFA learning. Firstly, interest points are detected in local spatial and temporal regions, and local features are described with the SFA method. Each action sequence is represented by the Accumulated Squared Derivative (ASD), which is a statistical distribution of the slow features in an action sequence [1]. Descriptive statistical features are then extracted to reduce the dimension of the ASD feature. Finally, a one-against-all support vector machine (SVM) is trained to classify actions represented by the statistical features.


DIVYA PRIYADHARSHINI M PREETHI S Anna University, Regional Centre

Abstract Until recently, the means of communication have been only voice and text, and voice and SMS services were given top priority by telecom networks. But the Internet has provided many other services, such as electronic file sharing, online gaming, e-commerce, and access to any information by just googling, which appeal to people because these services are cost effective and reduce the burden on the human part. Making these services available on mobile devices has far more benefits and interesting possibilities. However, today's Internet through cables and wireless limits connectivity to a small region: a Local Area Network (LAN) or a Wireless Local Area Network (WLAN) hot spot, respectively. Adding advanced service support to today's voice-dominated telecom mobile networks is not an easy task either. Globally there is a perception that IP is the protocol that will enable new possibilities for the telecom sector in the future. This article discusses the features of 4G, the edge it provides once operational, its impact on India, barriers to the implementation of 4G, and recommendations to overcome these barriers.


R.NARMATHA, C.K.NITHYA, G.RANJITHA, M.KALAIYARASI P.S.R.Rengasamy College of Engineering for women

Abstract A greenhouse is a building in which plants are grown in a closed environment. Greenhouse management involves controlling several greenhouses. The wireless section is located in the indoor environment, where great flexibility is needed, particularly in the production area of the greenhouse. The wired section, instead, is mainly used in the outside area as a control backbone to interconnect the greenhouses with the control room. An integrated wired/wireless solution uses the advantages of both technologies to improve performance. In the wired section, a controller area network (CAN) type network has been chosen on account of its simplicity, robustness, low cost, and good performance. For the wireless part, a ZigBee type network has been chosen. To maintain the optimal conditions of the environment, greenhouse management requires data acquisition using a SCADA (supervisory control and data acquisition) system, which monitors and controls the data in a simple way.


K.JEYASREE, V.RAJALEKSHMI Lord Jegannath College of Engineering & Technology

Abstract This paper proposes automatic detection of facial features in an image, which is an important stage for various facial image manipulation tasks, such as face recognition, facial expression recognition, 3D face modeling, and facial feature tracking. Region detection of facial features like the eyes, pupils, mouth, nose, nostrils, lip corners, and eye corners across different facial images, with neutral region selection and varying illumination, is a challenging task. In this paper, we present different methods for fully automatic region detection of facial features. An object detector is used along with Haar-like cascaded features in order to detect the face, eyes, and nose. Novel techniques using basic concepts of facial geometry are proposed to locate the mouth, nose, and eye positions. Estimating the detection region for features like the eyes, nose, and mouth effectively enhances detection accuracy. An algorithm using the H-plane of the HSV color space is proposed for detecting the eye pupil within the detected eye region. The proposed algorithm is tested on 100 frontal face images with two different facial expressions (neutral face and smiling face).


K.JEYASREE, V.RAJALEKSHMI Lord Jegannath College of Engineering & Technology

Abstract Vehicular Ad Hoc Networks can offer various services to the user. In this paper, a key management scheme based on vector groups is proposed for VANETs to overcome high memory overhead and to reduce the high computational time of the existing system. We propose a vector-based cryptosystem to achieve security in terms of privacy and authentication. Keywords—Key management, privacy, authentication, energy consumption, pairwise key establishment



National College of Engineering

Abstract Energy efficiency is one of the most required features of modern electronic systems designed for high-performance and portable applications. Based on this, a high-speed, low-power full-adder cell is designed with DPL and SR-CPL internal logic to reduce the power-delay product. The full adder is implemented with an alternative internal logic structure based on the multiplexing of the Boolean functions XOR/XNOR and AND/OR, to obtain balanced delays in the SUM and CARRY outputs, respectively, together with pass-transistor powerless/groundless logic styles, in order to reduce power consumption and delay. Post-layout simulations show that the proposed full adders give better energy efficiency: the resultant full-adder cell proves to be more efficient than other logic implementations.



Abstract An Intrusion Detection System (IDS) has emerged as one of the most effective ways of furnishing security to those connected to the network, and the heart of a modern intrusion detection system is a pattern matching algorithm. A network security application needs the ability to perform pattern matching to protect against attacks like viruses and spam. Firewall solutions are not scalable and do not address the difficulty of antivirus scanning. The main goal of this work is to furnish a powerful systematic virus detection hardware solution with minimal memory for network security. Instead of placing the entire pattern set on a chip, a two-phase antivirus processor works by condensing as much of the important filtering information as possible onto the chip. Thus, the proposed system mainly concentrates on reducing the memory gap in on-chip memory.


MR. V. VENKATESA KUMAR, M. MUTHULAKSHMI Anna University Regional Centre,

Abstract A cloud computing user stores data in the storage area provided by service providers. To achieve security in storage, HASBE (Hierarchical Attribute Set Based Encryption) extends CP-ABE with a hierarchical structure of cloud users. The data owner can concurrently distribute encrypted data and decryption keys, allowing users to access files without repeated authorization. When user revocation takes place, the data must be re-encrypted and re-uploaded to the cloud, and this has to be done by the data owner itself, which increases computation and bandwidth costs. The HASBE scheme proposes a new scalable hierarchical attribute access control by introducing a key server, which provides both efficient access control for outsourced data and management of encryption/decryption keys. Keys must be generated for users when revocation takes place, but a generated key may become useless if the user is revoked after key generation, which increases key generation time. The HASBE scheme therefore also proposes a new scheme for balancing key generation and user revocation. It offers an effective way of generating keys, and a highly secure and effective way of accessing data, in a cloud environment.

N.NAVEEN KUMAR MR.P.SAMPATH KUMAR Varuvan Vadivelan institute of technology

Abstract Uncoded systems have been discussed and their energy efficiency calculated in previous works. An energy-efficient virtual multiple-input multiple-output (MIMO) communication architecture based on a turbo coder is proposed for energy-constrained, distributed wireless sensor networks. As sensor nodes are generally battery-powered devices, the critical aspect is how to reduce the energy consumption of nodes so that the network lifetime can be extended to reasonable times. The efficiency of space-time block code (STBC) encoded cooperative transmission is studied. Energy consumption differs for coded and uncoded systems; although STBC is discussed, a channel encoding scheme consumes more power while the system is operating. Energy efficiency is analyzed as a trade-off between the reduced transmission energy consumption and the increased electronic and overhead energy consumption. Simulations are expected to show that, with proper design, cooperative

transmission can enhance energy efficiency and prolong sensor network lifetime. The BER performance is also analyzed under various SNR conditions, and simulation results are included. Since a turbo coder and decoder are used in this coded system, the BER is expected to approach zero at SNR values as low as 3 dB.

CLOUD COMPUTING
S.SANTHOS KUMAR NIRMAL KUMAR N Gnanamani College of Technology

Abstract Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. The concept of cloud computing fills a perpetual need of IT: a way to increase capacity or add capabilities on the fly without investing in new infrastructure, training new personnel, or licensing new software. Cloud computing encompasses any subscription-based or pay-per-use service that, in real time over the Internet, extends IT's existing capabilities. It provides computation, software, data access, and storage services that do not require end-user knowledge of the physical location and configuration of the system that delivers the services. Parallels to this concept can be drawn with the electricity grid, wherein end-users consume power without needing to understand the component devices or infrastructure required to provide the service.

THEFT IDENTIFICATION DURING DATA TRANSFER IN IT SECTOR

Abstract Theft identification during data transfer concerns the situation in which a data distributor has given sensitive data to trusted agents and some of the data is leaked and found in an unauthorized place. For this, the system can use data allocation strategies or can inject "realistic but fake" data records to improve identification of the leaked data and of who leaked it. The fake objects look exactly like the original data, so the agents cannot distinguish them. Much of an organization's data is leaked through e-mails; data leaked through mail can be detected and identified through the fake objects. The E-Random and S-Random algorithms are used to minimize the content overhead as well as to detect the guilty agent. The data leaked from the organization may be sent to third parties in the form of cipher text.



Abstract Resource allocation is a very important factor in increasing the QoS (Quality of Service) of any network carrying various types of real-time traffic. Real-time applications benefit most from QoS adaptation. Several scheduling disciplines are employed at the router to guarantee the QoS of the network. DiffServ (Differentiated Services) is an IP-based QoS support framework that differentiates between different classes of traffic. The function of the core router of the network is to forward packets according to the per-hop behavior associated with the DSCP (Differentiated Services Code Point) value. We therefore propose a Quality of Service Model Scheduling Algorithm (QSMA) and a random walk protocol as an effective scheme to maintain QoS parameters such as packet loss, packet delay, and bandwidth, providing absolute differentiated services for real-time applications.
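Per-hop behavior selection in a core router starts from the DSCP, which occupies the upper six bits of the IPv4 TOS / IPv6 Traffic Class byte (the low two bits carry ECN). A sketch of extracting it and mapping a few well-known code points (the table below lists only standard examples, not the paper's QSMA classes):

```python
# A few well-known DSCP code points (RFC 2474 / RFC 2597 / RFC 3246).
PHB_NAMES = {0: "BE (default)", 46: "EF (expedited)", 10: "AF11", 18: "AF21"}

def dscp_from_tos(tos_byte):
    """DSCP is the 6 most significant bits of the TOS byte."""
    return (tos_byte >> 2) & 0x3F

def phb(tos_byte):
    """Per-hop behavior name for a TOS byte, if it is a known code point."""
    return PHB_NAMES.get(dscp_from_tos(tos_byte), "unassigned")
```

A scheduler such as the proposed QSMA would key its queueing decisions (delay and loss targets, bandwidth shares) off this extracted class.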


Abstract Our project is a method for privacy-preserving gradient descent. Two approaches, a stochastic approach and a least-squares approach, are proposed for securely computing over both horizontally partitioned and vertically partitioned data. In horizontal partitioning the parties hold different records over the same set of attributes, while in vertical partitioning they hold different attributes of the same records. The mining of attributes is confined securely and can be accessed only with a key generated by the DSA algorithm. Linear regression is used to access the data: secure matrix multiplication underlies the computation of linear regression and classification, performing the operations securely and finding the user's data set without accessing private data. This method is used for securely performing gradient descent over vertically partitioned data.
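Stripped of the cryptographic layer, the computation the parties jointly evaluate is an ordinary gradient-descent step for least-squares linear regression; the secure matrix products stand in for the plain ones below. A plaintext reference sketch (the learning rate and data are illustrative):

```python
def gd_step(X, y, w, lr=0.1):
    """One gradient-descent step for least squares:
    w <- w - lr * X^T (Xw - y) / n. In the secure protocol, the
    matrix-vector products are replaced by secure multiplications
    over the partitioned data."""
    n = len(X)
    residual = [sum(xi * wi for xi, wi in zip(row, w)) - yi
                for row, yi in zip(X, y)]
    grad = [sum(X[i][j] * residual[i] for i in range(n)) / n
            for j in range(len(w))]
    return [wj - lr * g for wj, g in zip(w, grad)]
```

In the vertically partitioned case each party holds some columns of X, so the inner products in `residual` and `grad` are exactly the terms that must be computed via secure matrix multiplication.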



Abstract A web application is an application that is accessed over a network such as the Internet. Web applications are increasingly used for critical services, and in order to cope with increasing demand and data complexity they have moved to a multitier design. Thus web applications have become a popular and valuable target for security attacks. These attacks have recently become more diverse, and attackers' attention has shifted from attacking the front-end to exploiting vulnerabilities of web applications in order to corrupt the back-end database system. To penetrate their targets, attackers may exploit well-known service vulnerabilities. To protect multitier web applications, several intrusion detection systems have been proposed. By monitoring both web requests and the subsequent database requests, we are able to ferret out attacks that an independent IDS would not be able to identify. An intrusion detection system (IDS) is used to detect potential violations of database security; in every database, some attributes are considered more sensitive to malicious modifications than others.


A.SIVAGAMI, M.RENUKADEVI P.S.R.Rengasamy College of Engineering For Women

Abstract Radio link fluctuation makes packet transmission difficult in mobile ad hoc networks. To overcome this, we propose a novel channel adaptive routing protocol, which mitigates the effects of channel fading. The proposed protocol selects stable links for route discovery using the average non-fading duration, and a handoff strategy maintains reliable connections. The protocol provides a dual attack on the problem: avoiding unnecessary route discoveries, predicting path failures to trigger handoff, and bringing paths back into play when they are again available, rather than simply discarding them at the first sign of a fade. Keywords—Mobile ad hoc networks, average non-fading duration, routing protocols, channel adaptive routing.


TANIA VERONICA SEBASTIAN J. RAJA Annai Mathammal Sheela Engineering College

Abstract An organization undoubtedly wants to preserve and retain the data stored in its computers, yet this data is necessary for daily work processes. Users within the organization's perimeter (e.g., employees, subcontractors, or partners) perform various actions on this data (e.g., query, report, and search) and may be exposed to sensitive information embodied within the data they access. In an effort to determine the extent of damage that a user could cause to an organization using the information obtained, the concept of a ranking-based security-alert approach for handling sensitive organizational data is introduced. The score measure is tailored for tabular data sets (e.g., result sets of relational database queries) and cannot be applied to non-tabular data such as intellectual property, business plans, etc. By assigning a score that represents the sensitivity level of the data a user is exposed to, the weight can determine the extent of damage to the organization if the data is misused. Using this information, the organization can then take appropriate steps to prevent or minimize the damage.


Abstract Mobile networks have received a great deal of attention during the last few decades due to potential benefits such as large scale, improved flexibility, and reduced costs. Variation in link quality and routing assignment are the major problems in communication networks. This work addresses two problems associated with mobile networks: a method to reduce overhead between nodes, and energy-balanced routing of packets by cooperative opportunistic routing for cluster-based communication. We propose a modified algorithm that uses On-Demand Opportunistic Group Mobility Based Clustering (ODOGMBC) for forming the clusters and predicting cluster mobility with a neighborhood update algorithm. Cluster formation involves election of a mobile node as cluster head; each cluster comprises a cluster head and non-cluster-head nodes that form a cluster dynamically. Each node in the network continuously finds its neighbours by communicating with them, and nodes keep consistent, updated routing information in their route caches through the neighborhood update algorithm. In the routing process, a packet forwarded by the source node is updated by intermediate forwarders if the topology changes. This opportunistic routing scheme provides responsive data transport and effective node management, even in heavily loaded environments. Thus, our proposed routing technique reduces overhead, increases efficiency, and gives better control of path selection.



Abstract Text passwords have been adopted as the primary means of user authentication on websites. Humans are not experts at memorizing them, so they rely on weak passwords. Because these passwords are static, adversaries can launch attacks to steal them, and users suffer from several security drawbacks: phishing, keyloggers and malware. This problem can be overcome by a protocol named oPass, which leverages a user's cellphone and an SMS to thwart password stealing. oPass also largely avoids man-in-the-middle attacks. In case users lose their cellphones, the scheme still works after reissuing the SIM card and long-term password. oPass is an efficient user authentication protocol available at affordable cost.


ATHIRA JAYAN , MALU FATHIMA AFSAR National College Of Engineering

Abstract Because the Internet has been widely applied in various fields, more and more network security issues emerge and catch people's attention. However, adversaries often hide themselves by spoofing their own IP addresses and then launch attacks. For this reason, researchers have proposed many traceback schemes to trace the source of these attacks. Some use only one packet in their packet logging schemes to achieve IP tracking. Others combine packet marking with packet logging and therefore create hybrid IP traceback schemes demanding less storage but requiring a longer search. In this paper, we propose a new hybrid IP traceback scheme with efficient packet logging, aiming to have a fixed storage requirement for each router (under 320 KB, according to CAIDA's Skitter data set) in packet logging without the need to refresh the logged tracking information, and to achieve zero false positive and false negative rates in attack-path reconstruction. In addition, we use a packet's marking field to censor attack traffic on its upstream routers. Lastly, we simulate and analyze our scheme, in comparison with other related research, in the following aspects: storage requirement, computation, and accuracy.

EFFECTIVE ADAPTIVE PREDICTION SCHEME FOR WORKLOADS IN GRID ENVIRONMENT

Abstract It is easier to predict workload when a task is simple, but it is difficult to predict grid performance for a complex task, because heterogeneous resource nodes are involved in a distributed environment. Time-consuming execution of workload on a grid is even harder to predict due to heavy load fluctuations. In this paper we use a polynomial fitting method for CPU workload prediction. Errors that occur while predicting the workload of the grid are denoted as prediction errors. These errors are minimized by using a technique called EBAF (Estimation Based Adaptive Filter). A resource window is generated and mean square error analysis is done, where the error, i.e., the difference between the true value and the predicted value, is calculated. Finally, benchmark techniques are applied to evaluate the performance of the grid.
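The fitting-and-error loop described above can be sketched as follows. This is a minimal illustration using a degree-1 polynomial over a short sliding window; the window size, degree, and load values are illustrative choices, not the paper's exact parameters.

```python
# Sliding-window polynomial (here degree-1) fitting for CPU load prediction.
# The prediction error (true - predicted) is what an adaptive filter like
# EBAF would feed back to correct subsequent predictions.

def fit_line(ys):
    """Least-squares line y = a*x + b over x = 0..len(ys)-1."""
    n = len(ys)
    xs = range(n)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

def predict_next(window):
    """Extrapolate the fitted line one step past the end of the window."""
    a, b = fit_line(window)
    return a * len(window) + b

loads = [0.30, 0.35, 0.40, 0.45]          # observed CPU load samples
predicted = predict_next(loads)            # ~0.50 for this linear trend
true_value = 0.52
prediction_error = true_value - predicted  # fed back to the adaptive filter
print(round(predicted, 2), round(prediction_error, 2))
```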


Abstract Cloud computing is the long-dreamed vision of computing as a utility, where data owners can remotely store their data in the cloud to enjoy on-demand high-quality applications and services from a shared pool of configurable computing resources. This paper focuses on the security of cloud data storage, with an effective and flexible distributed storage verification scheme to ensure the correctness and availability of users' data in the cloud at all times. The proposed design allows the data owner to encrypt the data before it reaches the cloud server, and guarantees simultaneous identification of misbehaving servers under Byzantine failures, malicious data modification attacks, and even server colluding attacks. By implementing an erasure code technique, data can be recovered from the above failures, achieving data availability in the cloud at all times. Storing data in a third party's cloud system causes serious concern over data confidentiality; this project therefore also lets users safely delegate integrity checking tasks to third-party auditors (TPAs). The proposed design further supports secure and efficient dynamic operations on outsourced data, including block modification, deletion, and append, ensuring data security throughout.

AN EFFECTIVE METHOD TO COMPOSE RULES USING RULE ONTOLOGY IN REPEATED RULE ACQUISITION FROM SIMILAR WEB SITES

Abstract The semantic content of web pages is used to extract rules from similar web pages of the same domain. Rule acquisition is used to acquire rules from web pages that contain unstructured text, reusing similar rules from other sites in the same domain rather than extracting rules from each page from scratch. We propose an automatic rule acquisition procedure using the rule ontology RuleToOnto, which represents information about rule components and their structures. The rule acquisition procedure consists of a rule component identification step and a rule composition step. With rule component identification complete, we use a genetic algorithm for the rule composition, and we perform experiments demonstrating that our ontology-based rule acquisition approach works in a real-world application.


SILPA.L., MRS.NAGESWARI., The Rajaas Engineering College

Abstract Multi-field packet classification has evolved from traditional fixed 5-tuple matching to flexible matching with arbitrary combinations of numerous packet header fields. For example, the recently proposed OpenFlow switching requires classifying each packet using up to 12 packet header fields. It has become a great challenge to develop scalable solutions for next-generation packet classification that support higher throughput, larger rule sets and more packet header fields. The general packet classification problem has received a great deal of attention over the last decade: the ability to classify packets into flows based on their packet headers is important for security, virtual private networks and packet filtering applications. In this project we propose a new approach to packet classification based on fuzzy logic decision trees, focusing on the problem of identifying the class to which a packet belongs. We present a fuzzy-decision-tree-based linear multi-pipeline architecture on FPGAs for wire-speed multi-field packet classification. A new kind of fuzzy decision tree, the soft decision tree, is used. This method combines tree growing and pruning, to determine the structure of the soft decision tree, with refitting and backfitting, to improve its generalization capabilities. We consider next-generation packet classification problems where more than 5-tuple packet header fields are classified. Several optimization techniques are proposed to reduce the memory requirement of the state-of-the-art decision-tree-based packet classification algorithm, since when matching multiple fields simultaneously it is difficult to achieve both a high classification rate and modest storage in the worst case. Our soft-decision-tree-based scheme can be considered among the algorithms with the highest throughput and efficiency.
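To make the tree-based classification idea concrete, here is a minimal crisp decision tree over packet header fields. The paper's soft/fuzzy trees replace these hard comparisons with weighted branches; the field names and rules below are made up for illustration.

```python
# A tiny hand-built decision tree over header fields: each internal node
# tests one field, each leaf is a rule label (flow class).

def classify(pkt):
    """Walk the tree; return the label of the matching leaf."""
    if pkt["proto"] == 6:                       # TCP subtree
        if pkt["dst_port"] == 80:
            return "web"
        if pkt["dst_port"] == 22:
            return "ssh"
        return "tcp-other"
    if pkt["proto"] == 17:                      # UDP subtree
        return "dns" if pkt["dst_port"] == 53 else "udp-other"
    return "default"

packets = [
    {"proto": 6,  "dst_port": 80},
    {"proto": 17, "dst_port": 53},
    {"proto": 1,  "dst_port": 0},
]
print([classify(p) for p in packets])   # ['web', 'dns', 'default']
```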


ANANDH.A Saveetha Engineering college

Abstract Cloud storage is online storage where data is stored in virtualized pools of storage hosted by third parties. However, the data storage may not be fully trustworthy, which poses many security challenges for cloud storage. Access control, version control, and public auditing are taken into account to secure the data stored in the cloud. The proposed secured overlay cloud storage system will provide fine-grained access using hierarchy-based access control: access to the cloud storage system is granted based on the user's group type. Version control is the framework which eliminates data redundancy and provides version backup in the cloud storage. On top of the version control design, a layered approach of cryptographic protection is added to enhance data security, and version control is employed in the cloud storage by implementing an appropriate storage mechanism. Finally, public auditing in the cloud storage system is enforced to maintain the activity log and to analyze the data accessed by users; thus data stored in the cloud storage is audited and any kind of modification to the data will be reported to the administrator.
DETECTING SESSION HIJACKS IN WIRELESS NETWORKS

BANU PRIYA.E, MOHAMMAD MALIK MUBEEN.S. National College of Engineering

Abstract Among the variety of threats and risks that wireless LANs face, session hijacking attacks are common and serious ones. When a session hijacking attack occurs, an attacker forces a normal user to terminate its connection to an access point (AP) by first masquerading as the AP's MAC address. The attacker then associates with the AP by masquerading as the user's MAC address and takes over its session. Current techniques for detecting session hijacking attacks are mainly based on spoofable and predictable parameters such as sequence numbers, which can be guessed by attackers. To enhance the reliability of intrusion detection systems, mechanisms that utilize unspoofable PHY-layer characteristics are needed. We show that using a Wavelet Transform (WT), the colored noise with complex Power Spectral Density (PSD) in our case can be approximately whitened. Since a larger Signal-to-Noise Ratio (SNR) increases the detection rate and decreases the false alarm rate, the SNR is maximized by analyzing the signal at specific frequency ranges.
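The wavelet step above can be illustrated with a one-level Haar transform, the simplest wavelet transform; this is only a stand-in for the whitening procedure, and the signal values are illustrative.

```python
# One-level Haar wavelet transform: pairwise averages (approximation) and
# pairwise differences (detail). Detail coefficients of colored noise are
# closer to white, which helps an SNR-based detector.

import math

def haar_step(signal):
    """Split an even-length signal into approximation and detail coefficients."""
    approx = [(signal[i] + signal[i + 1]) / math.sqrt(2)
              for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / math.sqrt(2)
              for i in range(0, len(signal), 2)]
    return approx, detail

sig = [4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0]
approx, detail = haar_step(sig)
print([round(a, 3) for a in approx])
print([round(d, 3) for d in detail])
```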


ARAVINTH.S Mr. RAMALINGAM SAKTHIVELAN N.M.K. Shri Krishna Engineering College

Abstract Wireless sensor networks (WSNs) are used in many areas for critical infrastructure monitoring and information collection. For WSNs, source-location privacy (SLP) service is further complicated by the fact that sensor nodes generally consist of low-cost, low-power radio devices, so computationally intensive cryptographic algorithms (such as public-key cryptosystems) and large-scale broadcasting-based protocols may not be suitable. We propose criteria to quantitatively measure source-location information leakage in routing-based SLP protection schemes for WSNs, and through this model identify the vulnerabilities of existing SLP protection schemes. We then propose a scheme that provides SLP through routing to a randomly selected intermediate node (RSIN) and a network mixing ring (NMR). The security analysis, based on the proposed criteria, shows that the proposed scheme can provide excellent SLP: messages are sent securely, and adversaries can neither identify the source location nor interrupt the messages, because of the secure algorithms. Comprehensive simulation results demonstrate that the proposed scheme is very efficient, achieves a high message delivery ratio, and can be used in many practical applications.

PACKET DATA REDUNDANCY ELIMINATION IN DATA AGGREGATION

Abstract A wireless sensor network consists of a sizable number of sensor nodes and a base station. Many nodes transmit similar information to the base station, so energy is wasted and network life is drained quickly. In this paper we propose an Energy Efficient Heterogeneous Cluster Protocol with a Support Vector Machine (SVM), based on data aggregation. It collects and compresses the data from the various end points, so minimum energy is spent, network lifetime is prolonged, and the number of transmissions is minimized. The performance of the proposed technique is compared with the LEACH protocol. Simulation results show that the proposed mechanism can efficiently remove data redundancy in a wireless sensor network.
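The redundancy-suppression idea at a cluster head can be sketched as follows. This is not the paper's SVM-based scheme, only a toy illustration of the aggregation step; node names and the tolerance are made up.

```python
# Toy data aggregation at a cluster head: a node's reading is forwarded to
# the base station only when it differs enough from the last value reported
# for that node, suppressing redundant transmissions.

def aggregate(readings, tolerance=0.5):
    """Forward (node, value) pairs that differ from the node's last sent value."""
    last_sent = {}
    forwarded = []
    for node, value in readings:
        if node not in last_sent or abs(value - last_sent[node]) > tolerance:
            forwarded.append((node, value))
            last_sent[node] = value
    return forwarded

readings = [("n1", 25.0), ("n2", 25.1), ("n1", 25.2), ("n1", 26.0), ("n2", 27.0)]
print(aggregate(readings))   # ("n1", 25.2) is suppressed as redundant
```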

P.ARUNA, P.RANJAN School of Computing Sciences, Hindustan University.

Abstract Efficient and effective full-text retrieval and ranking in unstructured peer-to-peer networks remains a challenge in the research community, because it is difficult, if not impossible, for unstructured P2P systems to effectively locate items with guaranteed recall, and existing schemes that improve search success rates often rely on replicating a large number of item replicas across the wide-area network, incurring large communication and storage costs. Due to the exact-match problem of DHTs and the federated search problem, such schemes provide poor full-text search capability. This work proposes replication of Bloom filters for efficient and effective data retrieval and ranking in unstructured P2P networks. The needs of users vary, so what may be interesting for one may be completely irrelevant for another; the role of the ranking process is thus crucial: select the pages that are most likely to satisfy the user's needs, and bring them to the top positions. Ranking of data is performed based on term frequency and keywords. By replicating the encoded term sets using BFs and stemmed words instead of raw documents among peers, the communication and storage costs are greatly reduced, while full-text multi-keyword searching is supported.
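A minimal Bloom filter over a document's term set, of the kind that could be replicated among peers instead of raw documents, can be sketched as below; the bit-array size and hash count are illustrative, not the paper's parameters.

```python
# Minimal Bloom filter: k bit positions per term are derived from slices of
# a SHA-256 digest. Membership tests may yield false positives but never
# false negatives, which is why compact BFs can stand in for term sets.

import hashlib

class BloomFilter:
    def __init__(self, m=256, k=3):
        self.m, self.k, self.bits = m, k, 0

    def _positions(self, term):
        digest = hashlib.sha256(term.encode()).digest()
        return [int.from_bytes(digest[4 * i:4 * i + 4], "big") % self.m
                for i in range(self.k)]

    def add(self, term):
        for p in self._positions(term):
            self.bits |= 1 << p

    def might_contain(self, term):
        return all(self.bits & (1 << p) for p in self._positions(term))

bf = BloomFilter()
for term in ["peer", "network", "retrieval"]:
    bf.add(term)
print(bf.might_contain("network"))   # True
print(bf.might_contain("ranking"))   # almost certainly False at this load
```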
BEHAVIOURAL BASED SECURED USER AUTHENTICATION USING IMAGE CAPTCHA

S.DEEPAN, K.SURESH KUMAR Saveetha Engineering College

Abstract These days, web access has become very popular, and with more users come more threats. Web access is controlled by providing secured authentication (username and password). Remembering the password is the main challenge for the user when accessing a webpage or email account. A major problem in security is the fact that internet users have online accounts on many websites, systems, and devices that require them to remember passwords for identification. Because users can only remember a limited number of passwords, many simply forget them. Security questions provide a way to re-access an account, but since remembering the password is the main challenge, it is also difficult for the user to give the appropriate answer. To overcome this problem, a new technique called image CAPTCHA is proposed.


DR. S.SAKTHIVEL D.GOMATHI Anna University of Chennai

Abstract A microarray is an array of gene data; in turn, a gene datum corresponds to a cell. Microarray technology is an important biotechnological means that allows us to record the expression levels of thousands of genes simultaneously within a number of different samples. An important application of microarray gene expression data in functional genomics is to classify samples according to their gene expression profiles. The proposed work applies the mutual information criterion to evaluate a set of attributes and to select an informative subset to be used as input data for microarray classification. In this large set of genes, only a few are effective for performing diagnostics in an optimal way. In order to find this effective group of genes we propose a Supervised Clustering Algorithm (SCA). Existing unsupervised clustering algorithms group genes according to mean and standard deviation measures and do not consider parameters such as mutual information or correlation. The proposed algorithm computes the similarity between attributes; this similarity measure is useful for reducing the redundancy among the attributes. The original gene set is partitioned into subsets or clusters with respect to the similarity measure, and a single gene from each cluster having the highest gene-class relevance value is selected as the representative gene. The proposed supervised attribute clustering algorithm yields biologically significant gene clusters, and its performance compares favourably with existing algorithms both qualitatively and quantitatively.
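The gene-class relevance step above can be sketched with mutual information on discretized expression levels; the cluster is then represented by its most relevant gene. The data, gene names, and two-level discretization below are illustrative only.

```python
# Mutual information I(X;Y) in bits between a gene's discretized expression
# profile and the class labels; the representative gene of a cluster is the
# one with the highest gene-class relevance.

import math
from collections import Counter

def mutual_information(xs, ys):
    """I(X;Y) for two equal-length discrete sequences."""
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    return sum((c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

labels = ["disease", "disease", "healthy", "healthy"]
cluster = {
    "gene_A": ["hi", "hi", "lo", "lo"],   # perfectly tracks the class
    "gene_B": ["hi", "lo", "hi", "lo"],   # independent of the class
}
representative = max(cluster, key=lambda g: mutual_information(cluster[g], labels))
print(representative)   # 'gene_A'
```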
INDIAN LICENSE PLATE RECOGNITION BASED ON OPTICAL CHARACTER RECOGNITION

R.DENNIS, DR.R.K.SELVAKUMAR Cape Institute of Technology

Abstract Indian license plate recognition based on optical character recognition (ILPROCR) plays an important role in numerous applications, and a number of techniques have been proposed. The approach comprises stages of pre-processing, license plate detection, extraction of characters and numbers from the detected plate, license plate segmentation, and character recognition. In the experiments, all types of license plates, captured by camera at different times of day and in different weather conditions, were used. This paper provides a character recognizer for identifying the characters in the license plate.

A.IDA BERYL J.BEMINA Anand Institute of Higher Technology

Abstract This paper presents the analytical characterization of the nonlinear interference that results when passing more than one high-order modulation carrier through the same nonlinear transponder high-power amplifier (HPA). A Volterra filter is proposed which is novel in its implementation of this analytical characterization and its modeling of intersymbol interference (ISI) and adjacent channel interference (ACI). The focus is on adaptive algorithms with pilot-based training, so that the solutions are completely blind to unknown transponder HPA characteristics and can rapidly respond to a varying operating back-off level. Two families of adaptive solutions are provided to compensate for nonlinear ISI and ACI: the first performs adaptive channel inversion and then applies equalization; the second performs adaptive channel identification and then applies cancellation. The effectiveness of the proposed analysis and techniques is demonstrated via extensive simulations for high-order QAM and APSK modulations, including coded performance with selected LDPC codes designed for the DVB-S2 standard. Finally, computational complexity is assessed and the performance impact is quantified when complexity is reduced by decreasing the number of Volterra coefficients.


SELVAMANJU.E, MS.S.AGNES JOSHY, Francis Xavier Engineering College

Abstract: Cloud computing is a large-scale distributed storage system that offers the end user resources and highly scalable services. In cloud services, users' data are usually placed in a remote area and users do not operate on the data directly, so users fear losing their own data. A highly decentralized information accountability framework is therefore used to keep monitoring the users' data in the cloud, providing both logging and auditing mechanisms for users' data and user control. Access to users' data triggers authentication and automated logging through JAR (Java ARchive) programmable capabilities: JAR files automatically log the usage of the users' data by any entity in the cloud. In addition, the approach can handle personally identifiable information and provides a security and reliability analysis. It is essential to provide such an effective mechanism.

I.S.JENZI, P.PRIYANKA, DR.P.ALLI Velammal College of Engineering and Technology

Abstract Cardiovascular disease is a major threat to half of the world's population. The term heart disease covers all the diverse diseases affecting the heart. The healthcare industry generates large amounts of data that are too difficult to analyze by traditional methods, which shows the significance of computer-assisted methods for making correct decisions. The objective of this paper is to develop a heart disease prediction system using data mining techniques. This helps to identify useful patterns of information in the medical data for quality decision making. Association rules and classification techniques like decision trees, naive Bayes and neural networks are used in the literature for disease prediction. This work concentrates on building a classifier model using a decision tree for predicting heart disease. The system is implemented on the .NET platform and the popular data mining tool WEKA is also used. The result obtained from the classifier establishes significant patterns and relationships between the medical factors relating to heart disease.
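At the heart of decision-tree induction is choosing the split attribute with the highest information gain; a minimal sketch is shown below. The records and attribute names are made-up toy data, not the paper's medical data set.

```python
# Information gain = entropy of the class labels minus the expected entropy
# after splitting on an attribute; the best attribute becomes the tree root.

import math
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(records, attr, target="disease"):
    labels = [r[target] for r in records]
    remainder = 0.0
    for value in set(r[attr] for r in records):
        subset = [r[target] for r in records if r[attr] == value]
        remainder += len(subset) / len(records) * entropy(subset)
    return entropy(labels) - remainder

records = [
    {"chest_pain": "yes", "smoker": "yes", "disease": "yes"},
    {"chest_pain": "yes", "smoker": "no",  "disease": "yes"},
    {"chest_pain": "no",  "smoker": "yes", "disease": "no"},
    {"chest_pain": "no",  "smoker": "no",  "disease": "no"},
]
best = max(["chest_pain", "smoker"], key=lambda a: information_gain(records, a))
print(best)   # 'chest_pain' (it perfectly separates the classes here)
```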

S.GEOFFRIN, MRS.J.C KANCHANA, KLN College of Engineering,

Abstract This paper presents a new spectral clustering method called correlation preserving indexing (CPI), which is performed in the correlation similarity measure space. In this framework, the documents are projected into a low-dimensional semantic space in which the correlations between the documents in the local patches are maximized while the correlations between the documents outside these patches are minimized simultaneously. Since the intrinsic geometrical structure of the document space is often embedded in the similarities between the documents, correlation as a similarity measure is more suitable for detecting the intrinsic geometrical structure of the document space than Euclidean distance. Consequently, the proposed CPI method can effectively discover the intrinsic structures embedded in high-dimensional document space. The effectiveness of the new method is demonstrated by extensive experiments conducted on various data sets and by comparison with existing document clustering methods.
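The correlation similarity measure that CPI works in can be illustrated on term-frequency vectors; unlike Euclidean distance, it is invariant to document length. The vectors below are illustrative.

```python
# Correlation similarity between two vectors: cosine of the mean-centered
# vectors. Scaling a document (e.g., doubling its length) leaves the
# correlation unchanged, whereas the Euclidean distance grows.

import math

def correlation(u, v):
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    du = [x - mu for x in u]
    dv = [x - mv for x in v]
    num = sum(a * b for a, b in zip(du, dv))
    den = (math.sqrt(sum(a * a for a in du)) *
           math.sqrt(sum(b * b for b in dv)))
    return num / den

doc1 = [3, 0, 1, 2]      # term counts
doc2 = [6, 0, 2, 4]      # same topic, twice as long
doc3 = [0, 4, 0, 1]      # different topic
print(round(correlation(doc1, doc2), 3))   # 1.0: scaling preserves correlation
print(round(correlation(doc1, doc3), 3))   # negative: dissimilar documents
```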


R.JEGATHEESWARI , R.KANITHA Kalasalingam Institute of Technology

Abstract Automatic Identification Technologies (AIT) have revolutionized the way the world conducts commerce, but many people do not really understand what these technologies do or how AIT is changing our lives. Automatic identification technology comprises numerous technologies such as RFID, OCR, 2D barcodes, magnetic strips, smart cards, voice recognition, and biometrics. Automatic identification holds the promise of collecting data about a movable asset in the physical world with 100 percent accuracy in real time. Traditional methods of monitoring production in enterprises, by humans on site, are unable to meet expectations for efficiency, accuracy and cost as product lifecycles are shortened continuously. Setting up an RFID- and ZigBee-based manufacturing monitoring system is a good approach to improving monitoring efficiency, and thereby management efficiency, in enterprises. Although there are still some problems to be solved for RFID and ZigBee technologies, their unique features still make a monitoring system based on them a promising system in manufacturing enterprises. The architecture of the RFID- and ZigBee-based monitoring system is presented in this paper.
A MACHINE DOCTOR THAT DIAGNOSES OPHTHALMOLOGY PROBLEMS USING NEURAL NETWORKS

A.JENEFA, S.RAJI, Francis Xavier Engineering College

Abstract Ophthalmology is the branch of medicine that deals with the eye and its problems. An expert system contains knowledge about particular diseases. A machine doctor is one where, without any human doctor, a machine can treat ophthalmology problems using an expert system. We are mainly dealing with ophthalmology problems such as myopia, hypermetropia, astigmatism, presbyopia, retinopathy and glaucoma. Our machine doctor provides both advice about diseases and information about them using the expert system and auto-refraction. The machine doctor takes input as queries, data, voice, etc., and the output can be taken either as a printed statement or through electronic devices. The neural network concept with the back-propagation algorithm is used to process the input. The main aim is to give advice to rural people and to make our machine doctor user-friendly and cost-effective.

M.JOTHIMANI Nandha Engineering College

Abstract Cloud computing has emerged as one of the most influential paradigms in the IT industry. Since this new computing technology requires users to entrust their valuable data to cloud providers, there have been increasing security and privacy concerns about outsourced data. Several schemes employing attribute-based encryption (ABE) have been proposed for access control of outsourced data in cloud computing, but most of them suffer from inflexibility in implementing complex access control policies. Allowing cloud service providers (CSPs), which are not in the same trusted domains as enterprise users, to take care of confidential data may raise potential security and privacy issues. To keep sensitive user data confidential against untrusted CSPs, a natural way is to apply cryptographic approaches, disclosing decryption keys only to authorized users, while also providing high performance, full delegation, and scalability, so as to best serve the needs of accessing data anytime and anywhere, delegating within enterprises, and supporting a dynamic set of users. HASBE employs multiple value assignments for access expiration time to deal with user revocation more efficiently than existing schemes, and provides fine-grained access control and full delegation. Finally, based on the HASBE model, we propose a scalable revocation scheme that delegates most of the computing tasks in revocation to the CSP, to handle a dynamic set of users efficiently.

S.BRENIJA STANLEY DR.M.IRULAPPAN Francis Xavier Engineering College

Abstract This paper deals with an opportunistic resource scheduling problem for the relay-based Orthogonal Frequency Division Multiple Access (OFDMA) cellular network, where relay stations (RSs) perform opportunistic network coding with the downlink and uplink sessions of a mobile station (MS). To this end, we consider time-division duplexing (TDD), where each time-slot is divided into three phases according to the type of transmitting nodes, i.e., the base station (BS), MSs, and RSs. Moreover, to improve the flexibility of resource allocation, a dynamic TDD scheme is applied in which the time duration of each phase in each time-slot can be adjusted. For opportunistic network coding, we introduce a novel model for network-coding-aware RSs, with which an opportunistic network coding problem can be reduced to an opportunistic subchannel scheduling problem; scheduling is provided by a fuzzy algorithm. This paper formulates an optimization problem that aims at maximizing the average weighted-sum rate for both downlink and uplink sessions of all MSs, in order to satisfy the quality-of-service (QoS) requirements of each MS. It develops a resource scheduling algorithm that optimally and opportunistically schedules subchannels, transmission power, network coding, reduced power consumption, and the time duration of each phase in each time-slot. Through numerical results, we measure how network coding strategy and dynamic TDD each affect network performance in various network environments.
R.JAYAKUMAR, MRS.R.THENMOZHI, Ganadipathy Tulsi's Jain Engineering College

Abstract The system proposed in this paper aims to reduce switching delay and to increase the number of motors whose torque and efficiency are monitored in an automated industrial environment, in real time, by employing wireless sensor networks (WSNs). An embedded system acquires electrical signals from the motor in noninvasive and invasive manners, and then performs local processing for torque and efficiency estimation. The values calculated by the embedded system are transmitted to a monitoring unit through an IEEE 802.15.4-based WSN, where various motors can be monitored in real time. The VxWorks RTOS reduces the switching delay between two tasks according to their assigned priorities, theoretically to zero, thereby increasing the monitoring time. IEEE 802.15.4 ZigBee operates on the basis of MAC addresses. When the speed and temperature exceed threshold levels, an ARM controller stops the motor and enables a buzzer. VxWorks provides a high-performance, scalable, reliable, and high-throughput monitoring system. Keywords: VxWorks, efficiency estimation, embedded systems, induction motors, torque measurement, wireless sensor networks (WSNs).

TOP-K RESPONSES USING KEYWORD SEARCH OVER RELATIONAL DATABASES THROUGH TUPLE UNITS
MRS.G.JEYASRI, MRS.K.UMAMAHESWARI, University College of Engineering

Abstract Existing keyword search methods on databases usually find Steiner trees composed of connected database tuples as answers. They identify Steiner trees on the fly by discovering structural relationships between database tuples, without considering the fact that structural relationships can be precomputed and indexed. Tuple units have been proposed to improve search efficiency by indexing structural relationships between tuples, and existing methods identify a single tuple unit to answer a keyword query. In many cases, however, multiple tuple units must be combined to answer a keyword query, so these methods produce false negatives. To handle this problem, we study how to integrate multiple related tuple units to effectively answer keyword queries. To achieve high performance, two novel indexes are used: a single-keyword-based structure-aware index and a keyword-pair-based structure-aware index. Structural relationships between different tuple units are incorporated into the indexes, with which the answers of integrated tuple units can be identified effectively. New ranking techniques and algorithms progressively find the top-k answers.
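A toy illustration of the two index kinds named above (all tuple units and keywords are hypothetical; this is not the paper's data structure, only the lookup idea): a single-keyword index maps each keyword to the units containing it, and a keyword-pair index precomputes the intersection so a two-keyword query is a single lookup.

```python
from itertools import combinations

# Toy "tuple units": each unit is the set of keywords covered by the
# database tuples it joins together (hypothetical data).
tuple_units = {
    "u1": {"smith", "database", "sigmod"},
    "u2": {"smith", "network"},
    "u3": {"database", "sigmod", "query"},
}

# Single-keyword index: keyword -> ids of units containing it.
kw_index = {}
for uid, kws in tuple_units.items():
    for k in kws:
        kw_index.setdefault(k, set()).add(uid)

# Keyword-pair index: (k1, k2) -> ids of units containing both.
pair_index = {}
for uid, kws in tuple_units.items():
    for k1, k2 in combinations(sorted(kws), 2):
        pair_index.setdefault((k1, k2), set()).add(uid)

def answer(query):
    """Answer a two-keyword query with one pair lookup instead of
    intersecting two posting lists at query time."""
    k1, k2 = sorted(query)
    return pair_index.get((k1, k2), set())

print(answer(["smith", "database"]))   # {'u1'}
```

The pair index trades space for query time, which is the precompute-and-index point the abstract makes against on-the-fly Steiner tree discovery.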


ANAND.G, ABINAYA.S, KARTHIGA.R, P.S.R.Rengasamy College of Engineering for Women

Abstract Steganography is the art and science of hiding secret data within innocent cover data so that it can be securely transmitted over a network; it hides the very fact that communication takes place. Many different carrier file formats are used, but digital images are popular because of their prevalence on the Internet. Hiding information within an image can be done with various steganographic methods, some more complex than others, and all of them have respective strong and weak points. This paper gives an overview of image steganography, its uses, and its techniques. We propose an edge-adaptive scheme that selects the sharper pixels along the edges for hiding the message; the advantage is that smooth regions are much less affected. The new technique adaptively selects pixels at the edges (sharper regions) for hiding the secret data rather than selecting them randomly using a PRNG. For lower embedding rates, only sharp edge regions are used; when the embedding rate increases, further edge regions can be released adaptively for data hiding by adjusting a few parameters.

A TREE BASED MINING APPROACH FOR DETECTING INTERACTION PATTERNS
LINCY JANET. F, KARTHIKA. N, M. JANAKI MEENA Velammal College of Engineering and Technology

Abstract Human interactions are defined as the social behavior or communicative actions taken by meeting participants in relation to the current topic. The kinds of human interaction include proposing an idea, giving comments, acknowledgement, asking an opinion, and giving a positive or negative opinion. These interactions are essential for predicting a user's role, attitude, and intention towards the topic. This project focuses only on the task-oriented interactions that address task-related aspects. Mining human interactions is important for accessing and understanding meeting content quickly. This project proposes a mining method to extract frequent human interaction patterns. The interaction flow is represented as a tree; hence, tree-based mining algorithms, namely frequent interaction tree pattern mining and frequent interaction subtree pattern mining, are designed to analyze and extract frequent patterns from the constructed trees.
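A much-simplified sketch of the tree-mining idea (hypothetical session data; the paper's algorithms mine whole subtrees, while this sketch mines only single parent-to-child interaction edges): represent each meeting's interaction flow as a tree and keep the patterns whose support across sessions meets a threshold.

```python
from collections import Counter

# Interaction trees from three hypothetical meeting sessions:
# each node is (interaction_type, [children]).
sessions = [
    ("propose", [("comment", []), ("ack", [])]),
    ("propose", [("comment", []), ("neg_opinion", [])]),
    ("ask_opinion", [("pos_opinion", []), ("comment", [])]),
]

def edges(node):
    """Yield parent->child interaction pairs from one tree."""
    label, children = node
    for child in children:
        yield (label, child[0])
        yield from edges(child)

def frequent_patterns(trees, min_support):
    # Count each parent->child pattern once per tree it occurs in.
    counts = Counter()
    for t in trees:
        counts.update(set(edges(t)))
    return {p for p, c in counts.items() if c >= min_support}

print(frequent_patterns(sessions, min_support=2))
# {('propose', 'comment')}: a proposal followed by a comment is frequent
```

Generalizing the counted unit from single edges to embedded subtrees gives the flavor of the frequent interaction subtree pattern mining the abstract describes.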

T.KATHIRAVAN, MR.M.SUDHAKARAN, Ganadipathy Tulsi's Jain Engineering College

Abstract A security management system is described in this project. The wireless security system adopts a two-level structure. The first level consists of some remote controllers and a launcher, which includes a wireless burglar alarm, fault alarm, power-off alarm, self-checking alarm, and some wireless night-patrol points. The second level consists of a wireless receiver and a wireless alarm controller. A gas sensor detects the flammable gas that generally evolves from the oil wells; in case of any detection, an exhaust fan switches on automatically to carry away the particles. If the temperature becomes too high, a cooling fan is triggered to reduce or maintain the temperature in the wells. With the help of current and potential transformers, fluctuations in the pumping section can be found. If the oil level varies from the indicated level, an alert message is given via a voice indicator; humidity is also measured, and a pH sensor is used. All data are transmitted to and monitored on a PC in the control room.

P. LAKSHMIPRIYA, M.KALIDASS Maharaja Engineering College

Abstract The development of wireless and web technologies has allowed mobile users not only to request various kinds of services from their mobile devices at any time and anywhere, but also to use their mobile devices to make business transactions easily, e.g., via a digital wallet. The location of the mobile phone user is an important piece of information during mobile commerce, or m-commerce, transactions. Mining and prediction of users' mobile commerce behaviors, such as their movements and purchase transactions, has been studied in data mining research. Most previous studies adopt a pattern mining approach; however, pattern mining needs more time to mine the frequent patterns in transaction databases as the data size increases. This study proposes a Weighted Sequential Pattern (WSP) and periodical-pattern method for mining and predicting the purchase behavior of mobile users, which reduces the time complexity and mines accurate results for each itemset. A performance study shows that WSP and periodical-pattern mining is efficient and accurate for predicting both large and small frequent patterns, and is about an order of magnitude faster than some recently reported frequent-pattern mining methods.

A JOINT SENTIMENT TOPIC DETECTION FROM TEXT USING SEMI-SUPERVISED TACTIC
KARTHIKA .N , LINCY JANET .F, JANAKI MEENA.M., Velammal College of Engineering and Technology

Abstract Sentiment analysis, or opinion mining, is the process of detecting subjective information in a given text, which may include opinions, attitudes, and feelings. Sentiment analysis also has an important potential role as an enabling technology for other systems. This paper employs two semi-supervised probabilistic approaches, the JST model and the Reverse-JST model, to detect sentiment-bearing topics. The system designed in this paper classifies online reviews with positive and negative labels. In JST, document-level sentiment classification is based on topic detection and topic sentiment analysis: sentiment labels are associated with documents, topics are generated dependent on sentiment distributions, and words are generated conditioned on sentiment-topic pairs. In Reverse-JST, the sentiment label depends on the topics: topics are associated with documents, sentiment labels are associated with topics, and words are associated with both topics and sentiment labels. In LDA, by contrast, topics are associated with documents and words are associated with topic distributions. JST and Reverse-JST are evaluated on four different domains using the Gibbs sampling algorithm and compared with latent Dirichlet allocation. The nature of JST makes it highly portable to other domains, and the topics and topic sentiments detected by JST are observed to be coherent and informative.

A SURVEY ON TRANSACTION MAPPING ALGORITHM FOR MINING FREQUENTLY OCCURRING DATASETS
S.SURIYA, R.M.MEENAKSHI Velammal College of Engineering and Technology

Abstract An algorithm for mining complete frequent itemsets, referred to as the transaction mapping algorithm, is surveyed. In this algorithm, the transaction ids of each itemset are mapped and compressed to continuous transaction intervals in a different space, and the counting of itemsets is performed by intersecting these interval lists in depth-first order along the lexicographic tree. When the compression coefficient becomes smaller than the average number of comparisons for interval intersection at a particular level, the algorithm switches to transaction id intersection.

DIGITAL IMAGE FORGERY DETECTION AND ESTIMATION THROUGH EXPLORING IMAGE MANIPULATIONS
MARY METILDA. J Roever Engineering College

Abstract In this modern age, digital images play a vital role in many application areas; at the same time, image retouching techniques have also advanced, forming a serious threat to the security of digital images. To cope with this problem, the field of digital forensics and investigation has emerged and provided some trust in digital images. The proposed technique for image authentication detects the manipulations performed on digital images. Most image forgeries, such as copy-and-paste forgery, region duplication forgery, and image splicing forgery, involve basic image operations or manipulations; if there is evidence of basic image alterations in a digital image, we can say that the image has been altered. This paper aims at detecting basic image operations such as resampling (rotation, rescaling), contrast enhancement, and histogram equalization, which are often applied in forged images. The available interpolation-related spectral signature method is used for detecting rotation and rescaling and for estimating parameters such as rotation angles and rescale factors. This rotation/rescaling detection method flags some unaltered images as altered when the images are JPEG compressed; we overcome that problem by adding noise to the input images. We also use the existing fingerprint detection technique for detecting contrast enhancement and histogram equalization.
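A simplified illustration of the contrast-enhancement fingerprint idea (a toy scaling map and synthetic pixel data, not the paper's detector): a monotone contrast mapping skips some output intensity levels, leaving empty histogram bins inside the occupied range, which an analyst can count.

```python
def stretch(pixels, gain=1.5):
    # Toy contrast enhancement: scale and clip to the 8-bit range.
    return [min(255, int(p * gain)) for p in pixels]

def empty_bins(pixels):
    # Count unpopulated histogram bins between the darkest and the
    # brightest occupied level; gaps there are a classic fingerprint
    # of a contrast mapping having been applied.
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    lo, hi = min(pixels), max(pixels)
    return sum(1 for v in hist[lo:hi + 1] if v == 0)

original = list(range(0, 100)) * 3   # smooth synthetic histogram, no gaps
enhanced = stretch(original)         # the mapping skips some levels

print(empty_bins(original), empty_bins(enhanced))   # 0 49
```

A real detector must also distinguish these gaps from those caused by JPEG compression, which is exactly the false-positive issue the abstract mentions for the rotation/rescaling case.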

M.MINU SUNITHA MARY, MS.E.SALOME, Holy Cross Engineering College

Abstract Today, security concerns are on the rise in all areas, such as banks, governmental applications, the healthcare industry, military organizations, and educational institutions. Governments are setting standards, passing laws, and forcing organizations and agencies to comply with these standards, with non-compliance being met with wide-ranging consequences. Among the many security issues in these numerous and varying industries, one common weak link is passwords. Most systems today rely on static passwords to verify a user's identity; however, such passwords come with major management and security concerns. Users tend to use easy-to-guess passwords, use the same password in multiple accounts, and write passwords down or store them on their machines. Furthermore, hackers have many techniques for stealing passwords, such as shoulder surfing, snooping, sniffing, and guessing. Several strategies for proper password use have been proposed, but they do not meet companies' security concerns. Two-factor authentication using devices such as tokens and ATM cards has been proposed to solve the password problem and has been shown to be difficult to hack. Two-factor authentication is a mechanism that implements two factors and is therefore considered stronger and more secure than the traditionally implemented one-factor authentication. Withdrawing money from an ATM utilizes two-factor authentication: the user must possess the ATM card, i.e., what you have, and must know a unique personal identification number (PIN), i.e., what you know.

ENERGY EFFICIENT SENSORY DATA COLLECTION AND RECONCILING FROM DAMAGED NETWORK
MS.SRIE VIDHYA JANANI.E., NANDHINI.B Anna University, Regional Centre

Abstract Wireless sensor networks have a wide range of applications. The sink nodes need to communicate effectively with the other sensor nodes; for effective communication, factors such as cluster size, energy, and node lifetime should be considered. While transmitting, the nodes are grouped into clusters with one head per cluster. The clusters nearer to the sink nodes may run out of energy due to continuous utilization, so an intermediate node for communication, called the AGM node, is used. The sensor nodes in each cluster first send their information to their cluster head (chosen on the basis of higher residual energy); the cluster head in turn sends the information to the AGM node, and the AGM transmits it to the respective sensor node, and vice versa. Selection of the AGM among many nodes, and the entire process, is carried out on the basis of the Maneuver algorithm, which has six phases: compact clustering, AGM selection, inter-clustering, load balancing and data distribution, communication and replenishment, and reconciliation. In this process, the cluster head transmits data after eliminating redundancy in it. In case of massive damage, the Reconcile algorithm is used for efficient usage of the available AGM nodes to recover the relapsed network.

FEATURES EXTRACTION AND VERIFICATION OF SIGNATURE IMAGE
A.VAIRAMUTHU, NAVIA JOSEPH, A.RUBIYA, S.RAMYA, P.S.R.Rengasamy College of Engineering for Women

Abstract Communication leads to the development of languages, and writing is an art that varies from person to person. A signature is one of the best ways to identify a person, although the signature of the same individual may vary with time and situation. Signature verification is very important in fields requiring person authentication, such as the military and banking. In order to identify and control forgeries, we use signature verification. Three types of forgery exist: random forgery, simple forgery, and skilled forgery. In this paper we present a suitable and efficient method for offline signature verification with good reliability and accuracy; this method is very useful for identifying forgeries. The signature verification process includes a preprocessing stage, a feature extraction stage, and a signature verification stage. The method is reliable and inexpensive, and even skilled forgeries can be identified using it.

AIRBORNE INTERNET
NITHIYA.A, MANIMEKALAI.V, National Engineering College

Abstract The word on just about every Internet user's lips these days is "broadband." We have so much more data to send and download today, including audio files, video files, and photos, that it is clogging our wimpy modems. Many Internet users are switching to cable modems and digital subscriber lines (DSLs) to increase their bandwidth. There is also a new type of service being developed that will take broadband into the air. Our paper explains some of the drawbacks that exist in satellite Internet and introduces the airborne Internet, called High Altitude Long Operation (HALO), which would use lightweight planes circling overhead to provide data delivery faster than a T1 line for businesses; consumers would get a connection comparable to DSL. The HALO network will serve tens of thousands of subscribers within a super-metropolitan area by offering ubiquitous access throughout the network's signal "footprint". The HALO aircraft will carry the "hub" of a wireless network with a star topology. The initial HALO network is expected to provide a raw bit capacity exceeding 16 Gbps. The concept of basic network connectivity could be used to connect mobile vehicles, including automobiles, trucks, trains, and even aircraft, and network connectivity could be obtained between vehicles and a ground network infrastructure.

BRAIN TUMOR DETECTION AND AUTOMATIC SEGMENTATION
K.MUTHUREGA, A.NIVETHA, H.PADMAPRIYA, DR.K.RAMASAMY, P.S.R.Rengasamy College of Engineering for Women

Abstract Brain tumor detection and segmentation is a complicated task in MRI. The MRI shows the regular and irregular tissues, differentiating the overlapping tissues in the brain. The automatic seed selection method is problematic when there is no tumor growth and yet even a small white region is present; because the edges of the tumor are not sharp, the result is not accurate. This happens only at the initial stage of a tumor. Therefore, texture-based detection is used, and automatic segmentation separates the regular and irregular tissue to obtain the tumor area from the unaffected area. The method used here is the seeded region growing method, implemented in MATLAB.

CLASSIFICATION MODEL FOR EARLY DISEASE DIAGNOSIS USING DATA MINING
P.PRIYANKA,I.S.JENZI, DR.P.ALLI Velammal College of Engineering and Technology

Abstract Cerebrovascular disease threatens human health seriously; it is ranked as the second leading cause of death after ischemic heart disease. Discovering and preventing cerebrovascular disease as early as possible has become critical, as in clinical practice its occurrence is so abrupt and fierce that early and accurate diagnosis and prediction are hard to make beforehand. To overcome this, a cerebrovascular disease predictive model is constructed using classification algorithms. This work obtains data on the patients, including their physical exam results, blood test results, and diagnosis data, with the purpose of constructing an optimum cerebrovascular disease predictive model. Three classification models are constructed using classification techniques: a Bayesian classifier, a decision tree, and a back-propagation neural network. The work provides a pre-processed dataset in which the missing values and unwanted values are removed. The pre-processed data are classified according to the age attribute, and features are extracted; the mean and standard deviation of each attribute are calculated. The attributes matching a threshold value based on attribute importance are extracted employing the SVM algorithm. The extracted attributes are split into T1, T2, and T3 sets, which are used for constructing the classification models. The efficiencies of the models are compared with each other, the model with the best efficiency is taken, and rules are predicted.

WIDE RANGE REPUTATION BASED ANNOUNCEMENT SCHEME FOR VANETS

Abstract Using mobile ad hoc networks in an automotive environment (VANET) opens up a new set of applications, such as passing information about local traffic or road conditions, which can increase traffic safety and improve mobility. One of the main uses is forwarding event-related messages: vehicular ad hoc networks (VANETs) allow vehicles to generate and broadcast messages to inform nearby vehicles about road conditions, such as traffic congestion and accidents. Neighboring vehicles can use this information, which may improve road safety and traffic efficiency; however, messages generated by vehicles may not be reliable. The existing system uses an announcement scheme for VANETs based on a reputation system that allows evaluation of message reliability and improves secure and efficient reputation broadcast in VANETs. Our proposed system extends the current scheme in such a way that a message can be utilized by vehicles in a greater area.

AGENT TRUST FOR EFFECTIVE COMMUNICATION IN INTELLIGENT TRANSPORTATION SYSTEM
S.RAMAPRIYA, S.PADMADEVI Velammal College of Engineering and Technology

Abstract An increasingly large number of cars are being equipped with global positioning system and Wi-Fi devices, enabling vehicle-to-vehicle (V2V) communication with the aim of providing road safety and increased passenger comfort. This technology creates the need for agents that assist users by intelligently processing the received information. Some of these mobile agents try to maximize their car owner's utility by sending out erroneous information; the consequences of acting on erroneous information imply a serious need to establish trust among mobile agents. The main aim of this work is to develop a model of the trustworthiness of agents in other vehicles so as to receive the most effective information. The challenge is to design intelligent agents that enable the sharing of information between vehicles in mobile ad hoc vehicular networks (VANETs). This work develops a multifaceted trust modeling approach that incorporates role-based trust, priority-based trust, experience-based trust, and majority-based trust, and is able to restrict the number of reports received. It includes an algorithm that proposes how to integrate the various dimensions of trust, with experiments validating the benefit of the agent approach and stressing the importance of each of the different facets. The result provides an important methodology for enabling effective V2V communication via intelligent mobile agents.

CRITICAL EVENT DETECTION AND MONITORING USING NOVEL SLEEP SCHEDULING IN WSN
K.RAMYA, MR. V.SENTHIL MURUGAN, Srinivasan Engineering College

Abstract In wireless sensor networks, only a small number of packets have to be transmitted during critical event monitoring, but if a critical event is detected, an alarm packet should be broadcast to the entire network as soon as possible. Broadcasting delay is therefore an important problem in critical event monitoring applications. To prolong the network lifetime, sleep scheduling methods are usually employed in WSNs, which results in a significant broadcasting delay. A novel sleep scheduling method is proposed, based on a level-by-level offset schedule, to achieve a low broadcasting delay in wireless sensor networks (WSNs). Alarm broadcasting proceeds in two phases: first, when a node detects a critical event, it creates an alarm message and quickly transmits it to a center node along a predetermined path with a node-by-node offset schedule; then the center node broadcasts the alarm message to the other nodes along another predetermined path without collision. An on-demand distance vector routing protocol is established in one of the traffic directions for alarm transmission. The proposed system is applicable in military and forest fire applications.
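A back-of-the-envelope sketch of why the level-by-level offset helps (the slot and period values are hypothetical, and the real scheme handles the broadcast phase and collisions as well): if a node at hop level i wakes one slot after level i-1, an alarm rides the wave of wake-ups at one slot per hop, instead of waiting up to a full duty-cycle period at every hop.

```python
def wakeup_offsets(levels, slot_ms):
    """Level-by-level offset schedule (simplified): a node at hop
    level i wakes slot_ms after level i-1, so an alarm traveling
    toward the center node always meets an awake receiver."""
    return {level: level * slot_ms for level in range(levels)}

def alarm_delay(hops, slot_ms, period_ms):
    # With the offset schedule: one slot per hop.
    # Ordinary synchronous sleep scheduling: up to one full
    # duty-cycle period per hop in the worst case.
    return hops * slot_ms, hops * period_ms

print(wakeup_offsets(4, slot_ms=10))               # {0: 0, 1: 10, 2: 20, 3: 30}
print(alarm_delay(8, slot_ms=10, period_ms=1000))  # (80, 8000)
```

With these illustrative numbers, an 8-hop alarm path sees an 80 ms delivery delay under the offset schedule versus an 8-second worst case without it.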


Abstract The peak-to-average power ratio (PAPR) in an OFDM system is reduced by using convolutional partial transmit sequence (C-PTS). C-PTS requires several inverse fast Fourier transform (IFFT) operations, which increase its computational complexity. In our method, the number of IFFT operations is reduced at the cost of slight PAPR losses. Simulations are performed with QPSK-modulated OFDM signals and a Saleh-model power amplifier (PA). The linearity and efficiency of the Saleh-model PA are increased by the effect of digital predistortion (DPD).
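For readers unfamiliar with the metric being reduced, here is a small sketch (not the C-PTS scheme itself) that builds one QPSK-modulated OFDM block with a naive IDFT and computes its PAPR; block length and seed are arbitrary choices.

```python
import cmath
import math
import random

def idft(symbols):
    # Naive IDFT standing in for the IFFT of an OFDM modulator.
    n = len(symbols)
    return [sum(s * cmath.exp(2j * cmath.pi * k * t / n)
                for k, s in enumerate(symbols)) / n
            for t in range(n)]

def papr_db(signal):
    # PAPR: peak instantaneous power over mean power, in dB.
    powers = [abs(x) ** 2 for x in signal]
    return 10 * math.log10(max(powers) / (sum(powers) / len(powers)))

random.seed(1)
qpsk = [1, 1j, -1, -1j]
symbols = [random.choice(qpsk) for _ in range(64)]   # one OFDM block
x = idft(symbols)
print(round(papr_db(x), 2))   # PAPR of this time-domain block in dB
```

PTS-style methods try to lower this number by re-phasing sub-blocks before the IFFT, and a lower PAPR is what lets the Saleh-model amplifier stay in its linear region.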


S.RENUGA DEVI, S.CHIDAMBARAM, V.MANIMARAN, National Engineering College

Abstract Nowadays, as more clients use online banking, online banking systems become more attractive targets, and a banking system must keep client information and product data secure from attackers to maintain clients' trust in, and the confidentiality of, their online banking services (purchasing items, checking account information, etc.). We study how attackers compromise accounts and develop methods to protect them. Towards this purpose, this paper presents a modified model to authenticate clients for online banking transactions by utilizing the Identity-Based mediated RSA (IB-mRSA) technique in conjunction with the one-time ID concept, to increase security in online banking. The introduced system exploits a method for splitting private keys between the client and the Certification Authority (CA) server: the key is split into two parts, one for the SEM (SEcurity Mediator) and one for the client. The SEM uses its key share to partially decrypt client requests, and the client completes the decryption with its own share.
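A toy numeric sketch of the key-splitting idea behind mediated RSA (textbook RSA with tiny demonstration primes and no padding; not the paper's IB-mRSA protocol): the CA splits the private exponent so that neither the client nor the SEM can decrypt alone.

```python
# Toy mRSA key split (textbook RSA, tiny primes -- illustration only).
p, q = 61, 53
n = p * q                     # 3233
phi = (p - 1) * (q - 1)       # 3120
e, d = 17, 2753               # e*d = 1 (mod phi)

# The CA splits the private exponent: d = d_client + d_sem (mod phi).
d_client = 1000
d_sem = (d - d_client) % phi  # 1753

message = 65
cipher = pow(message, e, n)

# The SEM produces a partial decryption; the client completes it:
# cipher^(d_sem) * cipher^(d_client) = cipher^d (mod n).
partial = pow(cipher, d_sem, n)
recovered = (partial * pow(cipher, d_client, n)) % n

print(recovered)   # 65 -- neither half-key alone recovers the message
```

Because the SEM's cooperation is required on every decryption, the CA can instantly revoke a client by instructing the SEM to stop serving it, which is the operational appeal of the mediated approach.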



Abstract: The Eye Mouse is the equivalent of the conventional computer mouse, but it is entirely controlled by eye and nose movements. This offers interesting possibilities for the study of eye movements during drawing, as well as providing a unique device that allows disabled users to operate computers. In this paper we introduce the design of a system for controlling a PC by eye movements. During the last ten years, computers have become common tools of work, and it is nearly impossible to exist without them in everyday life; we are witnessing the revolutionary introduction of computers and information technologies into daily practice. Healthy people use a keyboard, mouse, trackball, or touchpad for controlling a PC. However, these peripherals are usually not suitable for disabled people, who may have problems using them, for example when they suffer from myopathy or cannot move their hands after an injury. Therefore we propose a way to make it easier for disabled people to control a PC.

RESMI.S.P, LINDA PHILIP, Udaya School of Engineering

Abstract Future renewable energy systems will need to interface several energy sources, such as fuel cells and photovoltaic (PV) arrays, with the load, along with battery backup. A three-port converter finds application in such systems since it has the advantages of reduced conversion stages, a high-frequency ac link, a multiwinding transformer on a single core, and centralized control. Some of the applications are in fuel-cell systems, automobiles, and stand-alone self-sufficient residential buildings.


MR.R.SARAVANAN, V.REVATHI, Anna University, Chennai

Abstract Energy consumption is a primary concern in a wireless sensor network. To pursue maximum energy saving at the sensor nodes, a mobile collector could traverse the transmission range of each sensor in the field so that each data packet can be transmitted directly to the mobile collector without any relay; however, this approach leads to significantly increased data gathering latency due to the low moving velocity of the mobile collector. This work studies the tradeoff between energy saving and data gathering latency in mobile data gathering by exploring a balance between the relay hop count of local data aggregation and the length of the mobile collector's tour. It proposes a polling-based mobile gathering approach and formulates it as an optimization problem, named bounded relay hop mobile data gathering. Specifically, a subset of sensors is selected as polling points that buffer locally aggregated data and upload the data to the mobile collector when it arrives. Meanwhile, when sensors are affiliated with these polling points, it is guaranteed that any packet relay is bounded within a given number of hops. Two efficient algorithms are then given for selecting polling points among the sensors.
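A simple greedy sketch of the hop-bound constraint (hypothetical topology; the paper's selection algorithms are more refined than this greedy cover): repeatedly pick an uncovered sensor as a polling point and affiliate every node within d hops of it, so every relay path respects the bound.

```python
from collections import deque

def hops_from(src, adj):
    # BFS hop counts from src over the connectivity graph.
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def pick_polling_points(adj, d):
    """Greedy bounded-relay-hop sketch: each chosen polling point
    covers (affiliates) all sensors within d hops of it."""
    uncovered = set(adj)
    points = []
    for node in sorted(adj):          # deterministic order
        if node in uncovered:
            points.append(node)
            within = {v for v, h in hops_from(node, adj).items() if h <= d}
            uncovered -= within
    return points

# A simple 6-node chain 0-1-2-3-4-5 (hypothetical topology).
chain = {i: [j for j in (i - 1, i + 1) if 0 <= j <= 5] for i in range(6)}
print(pick_polling_points(chain, d=1))   # [0, 2, 4]
```

Fewer polling points shorten the mobile collector's tour (lower latency), while a smaller d bounds relay energy per packet, which is exactly the tradeoff the formulation optimizes.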


Abstract In this paper, modified image segmentation techniques are applied to MRI scan images in order to detect brain tumors. For the segmentation step, we analyze existing approaches and propose a better algorithm for detection of brain tumors. For the subsequent identification step, a probabilistic neural network classifier is used, which classifies tumor tissue from normal tissue.


Abstract: Cloud computing enables highly scalable services to be easily consumed over the Internet on an as-needed basis. Cloud storage is a model of networked online storage where data is stored in virtualized pools of storage, which are generally hosted by third parties: hosting companies operate large data centers, and people who require their data to be hosted buy or lease storage capacity from them. Data robustness is a major requirement for storage systems, and there have been many proposals for storing data over storage servers. One way to provide data robustness is to replicate a message so that each storage server stores a copy of the message. A decentralized erasure code is suitable for use in a distributed storage system. We construct a secure cloud storage system that supports the function of secure data forwarding by using a proxy re-encryption scheme. The encryption scheme supports decentralized erasure codes over encrypted messages and forwarding operations over encrypted and encoded messages. Our system is highly distributed: storage servers independently encode and forward messages, and key servers independently perform partial decryption. We analyze and suggest suitable parameters for the number of copies of a message dispatched to storage servers and the number of storage servers queried by a key server; these parameters allow more flexible adjustment between the number of storage servers and robustness.

M.SARANYA, R. KAVITHA Velammal College of Engineering and Technology

Abstract Visual sensor networks (VSNs) are recent communication infrastructures used to capture and transmit dense visual information from an application context. This paper addresses the most relevant challenges posed by VSNs, namely energy efficiency and security. Energy efficiency is one of the most challenging issues for multimedia communication in VSNs due to the resource constraints and the requirements for high bandwidth and low transmission delay. When the nodes send video data, it consumes more time, owing to the large size of a video file compared with a text file; therefore, the video data is compressed before being sent to the destination. Another important factor during data transfer is security. This paper proposes joint compression and encryption, which are employed to enable faster and secured transmission of video data. The joint compression and encryption algorithms resolve the two major issues, energy efficiency and security, when confidential video data is sent over a visual sensor network.

ADAPTIVE COUNTERMEASURE TO PREVENT DOS ATTACKS USING AN ADAPTIVE SENSITIVE AUTHORIZATION
N.SELVAGANAPATHY G.VINOTHCHAKKARAVARTHY Velammal College of Engineering and Technology

Abstract DoS attacks aim to exhaust scarce resources by generating illegitimate requests from one or many hosts, damaging the system. To avoid this, we propose an Adaptive Selective Verification (ASV) method, a distributed adaptive mechanism for thwarting attackers' efforts to deny service to legitimate clients, based on selective verification with auction-based payment. The network paths of various users are rate-limited: an adaptive bandwidth limit is set, with server state whose size remains small and constant regardless of the attackers' actions, and the limit is dynamically changeable depending on client usage. We perform an empirical evaluation of the ASV protocol with the aim of understanding its performance against attackers in practice, and we enhance the system by adding multiple properties for the clients upon detecting the attack rate.
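A rough arithmetic sketch of the selective-verification intuition (illustrative probabilities only, not the ASV protocol's actual parameters): under flooding the server verifies each incoming request only with probability p, and a legitimate client compensates by retransmitting until at least one copy is very likely to be sampled.

```python
def success_probability(p, copies):
    # Probability that at least one of `copies` retransmissions
    # is among those the server chooses to verify.
    return 1 - (1 - p) ** copies

# As the attack thins server attention (smaller p), a client adapts
# by doubling its retransmissions until success is near-certain.
for p in (0.5, 0.1, 0.01):
    copies = 1
    while success_probability(p, copies) < 0.99:
        copies *= 2    # adaptive bandwidth ramp-up
    print(p, copies, round(success_probability(p, copies), 3))
```

The per-client cost grows only in proportion to the inverse sampling rate, while the server's state stays constant, matching the small-constant-state property the abstract claims.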


P.SHARMILA, J.SHANTHALAKSHMI REVATHY Velammal College of Engineering and Technology

Abstract Clustering high-dimensional data results in overlapping clusters and the loss of some data. This paper extends k-means clustering with a weight function for clustering high-dimensional data. The weight function is determined by a vector space model that converts the high-dimensional data into a vector matrix. The proposed algorithm thus performs fuzzy projective clustering, which is used to find the overlapping boundaries in various subspaces. The objective function finds the relevant dimensions by pruning the irrelevant dimensions for cluster formation. This is illustrated on document clustering, with email documents taken as sample datasets.

FAST PERCEPTUAL VIDEO ENCRYPTION USING RANDOM PERMUTATION ON MODIFIED DCT CO-EFFICIENTS
M.SHENBAGAVALLI, S.RAJAGOPAL, L.JERART JULUS National Engineering College

Abstract Videos are generally of large volume. Video encryption, also known as video scrambling, is one of the powerful techniques for preventing unwanted interception. In this paper a robust perceptual video encryption technique is applied by selecting one out of multiple unitary transforms, according to an encryption key generated by a random-permutation method, at the transformation stage. For encryption, each frame is split into its RGB components, and by altering the phase angle of the encryption key the separated components of each frame are subjected to the unitary transform. The transformed frame contains coefficients with both high-frequency and low-frequency components. In the first stage, the IDCT is applied to the encrypted frames, which are then combined to obtain the encrypted video. In the second stage, the encrypted frames are quantized and encoded; to overcome the drawbacks of Huffman coding, an adaptive arithmetic encoder is used at the coding stage, yielding the encrypted bit stream. In the third stage, decryption is performed to recover the original video. The performance under various parameters is also analyzed. This methodology will be useful for video-based services over networks.
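The core of any permutation-based perceptual encryption is a key-driven, invertible reordering of transform coefficients. The sketch below shows only that idea in miniature, using a seeded pseudo-random shuffle in place of the paper's selection among multiple unitary transforms; the function names and the use of `random.Random` are illustrative assumptions, not the authors' implementation.

```python
import random

def _order(n, key):
    """Same key always yields the same permutation of n positions."""
    order = list(range(n))
    random.Random(key).shuffle(order)
    return order

def permute(coeffs, key):
    """Encryption step: scramble a block of DCT coefficients."""
    return [coeffs[i] for i in _order(len(coeffs), key)]

def unpermute(scrambled, key):
    """Decryption step: rebuild the original coefficient order."""
    order = _order(len(scrambled), key)
    out = [0.0] * len(scrambled)
    for pos, i in enumerate(order):
        out[i] = scrambled[pos]
    return out
```

Because the permutation only reorders values, the coefficient statistics are preserved, which is what keeps this class of scheme fast and format-compliant.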


A.SIVAGAMI, L.MARISELVI, M.RENUKADEVI P.S.R.Rengasamy College of Engineering For Women

Abstract Radio link fluctuation makes packet transmission difficult in mobile ad hoc networks. To overcome this we propose a new channel-adaptive routing protocol that mitigates channel fading. The proposed protocol selects stable links for route discovery using the average non-fading duration (ANFD) technique, while a handoff strategy maintains reliable connections. The protocol thus provides a dual attack on fading: it avoids unnecessary route discoveries by predicting path failures that lead to handoff, and it brings paths back into play when they are again available, rather than simply discarding them at the first sign of a fade.
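The average non-fading duration can be estimated from a sampled link-quality trace: the mean length of the runs during which the link stays above its fade threshold. The sketch below is a minimal stand-in (the boolean-trace representation and function names are assumptions, not the paper's protocol):

```python
def avg_nonfading_duration(trace, dt=1.0):
    """Mean length, in seconds, of consecutive non-faded runs.
    trace: list of booleans, True = link above the fade threshold,
    sampled every dt seconds."""
    runs, cur = [], 0
    for ok in trace:
        if ok:
            cur += 1
        elif cur:
            runs.append(cur)
            cur = 0
    if cur:
        runs.append(cur)
    return dt * sum(runs) / len(runs) if runs else 0.0

def pick_stable_link(traces, dt=1.0):
    """Choose the neighbor whose link has the largest ANFD
    for use in route discovery."""
    return max(traces, key=lambda lk: avg_nonfading_duration(traces[lk], dt))
```

A link that fades rarely but long is still preferred over one that flickers constantly, which is exactly the bias route discovery wants.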


SUDHAGAR.V, MUTHU PATTAN.V Lord Jegannath College of Engineering and Technology

Abstract Remote monitoring systems are growing very rapidly, as are their supporting technologies. Problems that may occur in remote monitoring include the number of objects to be monitored and how quickly, and in what volume, data can be transmitted to the data center to be processed properly. This study proposes using a cloud computing infrastructure as the processing center for remote sensing data. It focuses on sensing environmental conditions and early disaster detection, two issues that have become especially important in big cities with many residents. The study builds a comprehensive conceptual and prototype model, from the remote terminal unit through to the data retrieval method. We also propose using the FTR-HTTP method to guarantee delivery from the remote client to the server.


Abstract Nowadays, scarcity of spectrum is a major issue in the field of wireless communication, so efficient usage of the spectrum is needed. This can be achieved with cognitive radio, whose major challenge is spectrum sensing. A multi-resolution fast filter bank using cyclostationary feature detection is proposed to sense various ranges of spectrum in military radio receivers, overcoming the constraint of fixed spectrum sensing. With cyclostationary feature detection, small sub-bands can also be sensed and used for LAN communications in military applications. Cyclostationary feature detection lets us classify and identify the primary signal as either a Digital Video Broadcasting-Terrestrial (DVB-T) signal or a wireless microphone signal. Knowledge of the primary signal helps the cognitive radio use a fraction of a TV band when only a wireless microphone signal is present in the channel. It can also detect features of the primary signal such as double sideband, data rate, and modulation type.



Abstract Multicasting protocols support group communication in ad hoc networks. Receiver-based protocols are one such class, proposed as a means of allowing communication when nodes do not maintain any state information: receivers contend to become the next-hop router of a packet. For multicast communication, the RB-Multicast protocol is used, which simply embeds a list of the multicast members' addresses in packet headers to enable receivers to decide the best way to forward the multicast traffic. A new retransmission scheme is proposed to enhance the performance of RB-Multicast for receiver-based protocols, using an effective distance-based duty-cycle assignment technique that minimizes the expected energy dissipation for a given node distance to the sink. This method achieves energy efficiency and a high packet delivery ratio even under heavy network traffic, without significantly sacrificing latency or throughput.


S. VALARMATHI Mr. S. SATHISHKUMAR Srinivasan Engineering College

Abstract A Virtual Machine Monitor (VMM) is used to develop a resilient execution environment for a critical application even in the presence of a corrupted OS kernel. The attacker tries to capture the application's content by corrupting the OS while the application is executing. Previously, when the attacker corrupted the OS by injecting code into the system, the application terminated immediately without executing. In the present system, the application executes without interception even in the presence of corruption, providing resilient, authenticated execution of critical applications in an untrusted environment using the VMM. The VMM monitors all activities during execution and acts as an online recovery scheme for identifying such corruption; it repairs the memory corruption and allows the process to continue normal execution. VMM solutions generally fall into two categories: memory authentication, which checks the integrity of an application, and memory duplication, which rectifies the corruption. The system can be applied to military applications, hospitals, and all critical applications.

IMPLEMENTATION OF EFFICIENT LIFTING BASED MULTI LEVEL 2-D DWT
R.VIJAYAMOHANARENGAN Indra Ganesan College of Engineering,

Abstract We present a modular, pipelined architecture for lifting-based multilevel 2-D DWT. A VHDL model was described and synthesized to implement the architecture, which was optimized with an efficient pipelined and parallel design to increase speed and achieve higher hardware utilization. The two-dimensional discrete wavelet transform (2-D DWT) is widely used in many image compression techniques because it can decompose signals into different sub-bands with both time and frequency information, facilitating a high compression ratio. Designing an efficient VLSI architecture to implement the DWT computation for real-time applications is therefore a challenging problem. Owing to its regular and flexible structure, the design can be extended easily to different resolution levels, and its area is independent of the length of the 1-D input sequence. Compared with other known architectures, the proposed design requires the least computing time for the 1-D lifting DWT.

EVALUATION OF DATA TRANSFERRING IN MULTICORE SYSTEM

Abstract Receive-side scaling (RSS) is a NIC technology that provides the benefits of parallel receive processing in multiprocessing environments. However, RSS lacks a critical data-steering mechanism that would automatically steer incoming network data to the same core on which its application thread resides. This absence causes inefficient cache usage if an application thread is not running on the core on which RSS has scheduled the received traffic to be processed, and results in degraded performance. To remedy this RSS limitation, Intel's Ethernet Flow Director technology was introduced. However, our analysis shows that Flow Director can cause significant packet reordering, which has various negative impacts in high-speed networks. We propose a NIC data-steering mechanism, targeted mainly at TCP, that remedies the RSS and Flow Director limitations. We term a NIC with such a data-steering mechanism a Transport-Friendly NIC (A-TFN). Experimental results have proven the effectiveness of A-TFN in accelerating TCP/IP performance.
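The basic RSS idea is that a hash of the flow's 4-tuple indexes into an indirection table of cores, so every packet of one flow lands on the same core. A minimal sketch, with SHA-1 standing in for the Toeplitz hash that real NICs use (the function name and core count are illustrative assumptions):

```python
import hashlib

NUM_CORES = 4

def steer_to_core(src_ip, src_port, dst_ip, dst_port):
    """Hash the TCP 4-tuple to a core index so that all packets of a
    flow are processed on the same core (RSS-style steering).
    Real hardware uses a keyed Toeplitz hash; SHA-1 is a stand-in."""
    key = f"{src_ip}:{src_port}->{dst_ip}:{dst_port}".encode()
    digest = hashlib.sha1(key).digest()
    return digest[0] % NUM_CORES  # simplified indirection-table lookup
```

The limitation the abstract describes is visible here: the hash fixes the core per flow, but nothing ties that core to the one where the consuming application thread actually runs.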


K.VINOTHINI Ms. B. AMUTHA Srinivasan Engineering College

Abstract The current system has limitations in handling reconfigurations of a replica set, and lifetime membership is also difficult. To address this, dynamically changing system membership in a large-scale reliable storage system is maintained by a membership service with automatic reconfiguration, carried out together with dBQS (database Query Service). dBQS is interesting in its own right because its storage algorithms extend existing Byzantine quorum protocols to handle changes in the replica set, and it differs from previous DHTs by providing Byzantine fault tolerance and offering strong semantics. We develop two heuristic algorithms for these problems. Experimental studies show that the heuristic algorithms perform well in reducing communication cost and are close to the optimal solutions.


C.UMAMAHESWARI, S.ROSLIN MARY Anand Institute of Higher Technology.

Abstract One of the major high-level tasks in computer vision is object detection and recognition. The human visual system observes and understands a scene or image by making a series of fixations. Every fixation point lies inside a particular region of arbitrary shape and size in the scene, which can be either an object or just a part of one. Using the fixation point as an identification marker on the object, we segment the object of interest by finding the optimal closed contour around the fixation point in polar space. The proposed segmentation framework combines visual cues in a cue-independent manner. The algorithm is well suited to an active observer capable of fixating at different locations in the scene, yet it applies to a single image, in which the optimal closed contour around a given fixation point is found. The framework establishes a simple feedback loop between the mid-level cues (regions) and the low-level cues (edges), and the segmentation is refined based on this feedback. Our algorithm is parameter-free, computationally efficient, and robust.


ANANDHI.P MS.V.GAYATHRI., Srinivasan Engineering College

Abstract This graduation project aims to present an application capable of replacing the traditional mouse with the human face as a new way to interact with the computer. Facial features (nose tip and eyes) are detected and tracked in real time so that their actions can be used as mouse events. The coordinates and movement of the nose tip in the live video feed are translated into the coordinates and movement of the mouse pointer on the user's screen, and left/right eye blinks fire left/right mouse click events. The only external device the user needs is a webcam that feeds the program with the video stream. In the past few years high technology has become more advanced and less expensive; with the availability of high-speed processors and inexpensive webcams, more and more people have become interested in real-time applications that involve image processing. One of the promising fields in artificial intelligence is the Human-Computer Interface (HCI), which aims to use human features (e.g., face, hands) to interact with the computer. One way to achieve this is to capture the desired feature with a webcam and monitor its actions in order to translate them into events that communicate with the computer. In our work we try to help people with hand disabilities that prevent them from using a mouse by designing an application that uses facial features (nose tip and eyes) to interact with the computer.


T.SIRON ANITA SUSAN N.SURESH Kurinji College of Engineering and Technology

Abstract Vehicular ad hoc networks (VANETs) enable vehicles to communicate with each other, but require efficient and robust routing protocols for their success. We exploit the infrastructure of roadside units (RSUs) to route packets efficiently and reliably in VANETs. Our system operates by using vehicles to carry and forward messages from a source vehicle to a nearby RSU, routing these messages through the RSU network if needed, and finally sending them from an RSU to the destination vehicle. All the RSUs are interconnected. In our system we implement the same vehicle communication in a server-based manner: each RSU acts as a reporter to its particular server, and the server is responsible for maintaining the data in its database. This server-based method makes communication more effective, reduces delay with maximized throughput in the VANET, and can predict the traffic density of a particular area of the network. It helps vehicles such as ambulances, police vehicles, and commercial users at no cost, as well as social-network applications, and communication is faster than in the existing model. We evaluate the performance of our system using the ns-2 simulation platform and compare our scheme to existing solutions. The results prove the feasibility and efficiency of our scheme.



Abstract Web-based collaborations and processes have become essential in today's business and service environments. Such processes typically span interactions between people across globally distributed companies. Web services and SOA are the de facto technologies for implementing compositions of humans and services. The increasing complexity of compositions and the distribution of people and services require adaptive and context-aware interaction models. To support complex interaction scenarios, we introduce a mixed service-oriented system composed of both Human-Provided Services (HPSs) and Software-Based Services (SBSs) interacting to perform joint activities or to solve emerging problems. However, people's competencies evolve over time, requiring approaches for the automated management of actor skills, reputation, and trust. Discovering the right actor in mixed service-oriented systems is challenging due to the scale and temporary nature of collaborations. We present a novel approach addressing the need for flexible involvement of experts and knowledge workers in distributed collaborations. We argue that the automated inference of trust between members is a key factor for successful collaborations. Instead of taking a security perspective on trust, we focus on dynamic trust in collaborative networks. We discuss HPS and an approach for managing user preferences and network structures; HPS allows experts to offer their skills and capabilities as services that can be requested on demand. Our main contributions center on a context-sensitive trust-based algorithm called ExpertHITS, inspired by the concept of hubs and authorities in web-based environments. ExpertHITS takes trust relations and link properties in social networks into account to estimate the reputation of users.
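As background, the hubs-and-authorities idea that ExpertHITS builds on can be stated in a few lines: authority scores flow along incoming links, hub scores along outgoing ones, alternating until they stabilize. The sketch below is the classic unweighted iteration only; ExpertHITS itself additionally weights links by trust relations, which this sketch does not attempt.

```python
def hits(graph, iters=50):
    """Plain hubs-and-authorities iteration on a directed graph
    given as an adjacency dict: node -> list of nodes it links to."""
    nodes = set(graph) | {v for vs in graph.values() for v in vs}
    hub = {n: 1.0 for n in nodes}
    auth = {n: 1.0 for n in nodes}
    for _ in range(iters):
        # authority: sum of hub scores of nodes pointing at you
        auth = {n: sum(hub[u] for u in graph if n in graph[u]) for n in nodes}
        norm = sum(auth.values()) or 1.0
        auth = {n: s / norm for n, s in auth.items()}
        # hub: sum of authority scores of the nodes you point at
        hub = {n: sum(auth[v] for v in graph.get(n, [])) for n in nodes}
        norm = sum(hub.values()) or 1.0
        hub = {n: s / norm for n, s in hub.items()}
    return hub, auth
```

In the expert-discovery setting, a high "authority" corresponds to a member many trusted peers delegate work to.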


P. ARUL SELVAM Mani Institute of Engineering and Technology


There is a predictive modeling framework that integrates a diverse set of data sources from the cyber domain and provides automated support for detecting high-risk behavioral "triggers" to help focus the analyst's attention and inform the analysis. Designed to be domain-independent, the system may be applied to many different threat and warning analysis/sense-making problems. In this paper, we address two important areas for cloud-related insider threats: normal-user behavior analysis and policy integration. Few publicly available data sets characterize normal user behavior in relation to indicators of insider threats, much less indicators related to cloud-based insiders; efforts addressing the challenge of collecting and analyzing normal user behavior should therefore be careful to include attributes useful for cloud-based research as well. Another problem is exploring how organizations can better manage discrepancies among cloud-based security policies. We also plan to explore how such policies could be enforced on semi-trusted and/or untrusted cloud infrastructures.

PACKET CONCEALING METHODS FOR BLOCKING FUSSY JAMMING ATTACKS

A.ARUNADEVI, S.ATHIRAYAN Pandian Saraswathi Yadav Engineering College

Abstract: Privacy is a major requirement in wireless networks. Attacking and misusing such a network could cause destructive consequences, so it is necessary to integrate security to defend against misbehavior. This paper considers the problem of an attacker disrupting an encrypted wireless network through jamming. The open nature of the wireless medium leaves it vulnerable to intentional interference attacks, typically referred to as jamming, which can be used as a launch pad for mounting denial-of-service attacks on wireless networks. Typically, jamming has been addressed under an external threat model. However, adversaries with internal knowledge of protocol specifications and network secrets can launch low-effort jamming attacks that are difficult to detect and counter. In this work, we address the problem of selective jamming attacks in wireless networks, in which the adversary is active only for a short period of time, selectively targeting messages of high importance. To mitigate these attacks, we develop three schemes that prevent real-time packet classification by combining cryptographic primitives with physical-layer attributes. Using these schemes, brute-force attacks against the encryption can be slowed down, and protection against chosen-plaintext and related-message attacks can also be provided.
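One way to prevent real-time packet classification is a commit-then-reveal transmission: the sender puts the ciphertext on the air first and discloses the key only afterwards, so a selective jammer cannot tell what the packet is until it is too late to jam it. The sketch below shows only that ordering idea; the SHA-256 counter-mode keystream is a toy stand-in, not one of the paper's three schemes, and the function names are assumptions.

```python
import hashlib
import itertools
import os

def _keystream(key, n):
    """SHA-256 in counter mode as a toy stream cipher (stand-in only)."""
    out = b""
    for ctr in itertools.count():
        if len(out) >= n:
            return out[:n]
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()

def commit(packet):
    """Sender side: the ciphertext is transmitted first; the random key
    follows only after the whole ciphertext is sent, so the packet
    cannot be classified while it is still jammable."""
    key = os.urandom(16)
    cipher = bytes(a ^ b for a, b in zip(packet, _keystream(key, len(packet))))
    return cipher, key  # sent over the air in this order

def open_commitment(cipher, key):
    """Receiver side: decrypt once the trailing key arrives."""
    return bytes(a ^ b for a, b in zip(cipher, _keystream(key, len(cipher))))
```

A real scheme must additionally bind the key release to physical-layer timing so the jammer cannot stall the reveal; that part is outside this sketch.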



Abstract As a greater number of Web services become available, automatic discovery is recognized as an important task. To promote the automation of service discovery, different semantic languages have been created that allow the functionality of services to be described in a machine-interpretable form using Semantic Web technologies. The problem is that users do not have intimate knowledge of semantic Web service languages and related toolkits. We propose a discovery framework that enables semantic Web service discovery and composition based on an ontology framework. We describe a novel approach for automatic discovery of semantic Web services which employs LSI to match a user request, expressed in a service discovery language, with a semantic Web service description. Additionally, we present an efficient semantic matching technique to compute the semantic distance between ontological concepts, and we implement service composition. Our approach to semantic-based Web service discovery involves semantic-based service categorization and semantic enhancement of the service request. We propose a solution for achieving functional-level service categorization based on an ontology framework, and we use clustering to accurately classify Web services based on functionality. The semantic-based categorization is performed offline at the Universal Description, Discovery and Integration (UDDI) registry. The semantic enhancement of the service request achieves a better matching with relevant services.
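A common, simple way to compute a semantic distance between ontological concepts is the edge count through the lowest common ancestor of an is-a taxonomy. The abstract does not specify the authors' exact metric, so the sketch below is one widely used choice, with an illustrative child-to-parent dict as the taxonomy representation:

```python
def concept_distance(taxonomy, a, b):
    """Edge-count distance between two concepts in an is-a taxonomy,
    given as a child -> parent dict (root maps to None).
    Distance is measured through the lowest common ancestor."""
    def ancestors(c):
        depth_of, d = {}, 0
        while c is not None:
            depth_of[c] = d
            c, d = taxonomy.get(c), d + 1
        return depth_of
    pa, pb = ancestors(a), ancestors(b)
    return min(pa[c] + pb[c] for c in pa if c in pb)
```

Concepts sharing a near ancestor score a small distance; in service matching this lets a request for "car rental" rank a "vehicle rental" service above an unrelated one.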

S. DIVYA MRS.C.AKILA Anna University, Regional Center

Abstract Non-contact biometrics such as face and iris have additional benefits over contact-based biometrics such as fingerprint and hand geometry. However, three important challenges need to be addressed in a non-contact biometric authentication system: the ability to handle unconstrained acquisition, robust and accurate matching, and privacy enhancement without compromising security. In this paper, a novel possibilistic fuzzy matching strategy with invariant properties is proposed, providing a robust and effective matching scheme for two sets of iris feature points. In addition, a nonlinear normalization model is adopted to position the features more accurately before matching, and an effective iris segmentation method is proposed to refine the detected inner and outer boundaries into smooth curves. For feature extraction, Gabor filters are adopted to detect local feature points from the segmented iris image in the Cartesian coordinate system and to generate a rotation-invariant descriptor for each detected point. The proposed matching algorithm then computes a similarity score for the two sets of feature points from a pair of iris images. The approach also enhances privacy and security by providing cancelable iris templates. Results on public datasets show significant benefits of the proposed approach.

ENSURING AUTHENTICATION OF CLOUD INFORMATION USING JAR
FEMILA.D, GOLD BEULAH PATTUROSE. Holy Cross Engineering College

Abstract The information housed on the cloud is often seen as valuable to individuals with malicious intent. There is a lot of personal information and potentially secure data that people store on their computers, and this information is now being transferred to the cloud. This makes it critical to understand the security measures that the cloud provider has in place, and it is equally important to take personal precautions to secure one's data. To provide security, a new, highly decentralized cloud information accountability framework is used to keep track of the actual usage of users' data in the cloud. The programmable capabilities of Java Archives (JARs) are used to ensure that any access to users' data triggers authentication and automated logging. To strengthen users' control, a distributed auditing mechanism is used, operating in two modes: push and pull.


M.GANGADHARAN Mr.R.MUTHU KUMAR National Engineering College

Abstract Cognitive radio (CR) is an emerging technology for accessing the spectrum dynamically that can flexibly and efficiently achieve open spectrum sharing. A CR system is an intelligent wireless communication system that is aware of incoming signals. Cognitive radio ad hoc networks require reliable communication and the exchange of spectrum-management information between neighboring nodes, for which a common control channel (CCC) is usually used. This paper presents a CCC selection protocol, DCP, that is implemented in a distributed way according to the appearance patterns of primary systems and the connectivity among nodes. The proposed system minimizes the possibility of CCC disruption by primary-user activities and maximizes node connectivity, while also reducing the frequency with which control channels are changed and clusters are re-formed.

IMPROVEMENT OF SOURCE AND MESSAGE AUTHENTICATION USING TAM PROTOCOL FOR ADHOC NETWORKS

Abstract Multicast streams are the dominant application traffic pattern in many mission-critical ad hoc networks. The limited computation and communication resources, the large-scale deployment, and the unguaranteed connectivity to trusted authorities make known security solutions for wired and single-hop wireless networks inappropriate for this environment. This paper promotes a novel Tiered Authentication scheme for Multicast traffic (TAM) for large-scale dense ad hoc networks. Nodes are grouped into clusters. Multicast traffic within the same cluster employs one-way chains to authenticate the message source. Cross-cluster multicast traffic includes message authentication codes (MACs) based on a set of keys; each cluster uses a unique subset of keys and looks for its distinct combination of valid MACs in the message to authenticate the source. TAM combines the advantages of the secret-information-asymmetry and time-asymmetry paradigms and exploits network clustering to reduce overhead and ensure scalability. Numerical and analytical results demonstrate the performance advantage of TAM in terms of bandwidth overhead and delivery delay.
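The one-way chain used for intra-cluster authentication is a standard construction: keys are generated by repeated hashing and disclosed in the reverse order, so any receiver holding an already-trusted anchor can verify a newly disclosed key but nobody can forge future keys. A minimal sketch (the function names and SHA-256 choice are illustrative, not TAM's exact parameters):

```python
import hashlib

def make_chain(seed, length):
    """Build a one-way key chain by repeated hashing of a secret seed.
    The last element is published as the trusted anchor; earlier
    elements are disclosed later, one per time interval."""
    chain = [hashlib.sha256(seed).digest()]
    for _ in range(length - 1):
        chain.append(hashlib.sha256(chain[-1]).digest())
    return chain

def verify_key(candidate, anchor, max_hashes):
    """A disclosed key is genuine iff hashing it at most max_hashes
    times reaches the trusted anchor."""
    h = candidate
    for _ in range(max_hashes):
        h = hashlib.sha256(h).digest()
        if h == anchor:
            return True
    return False
```

The asymmetry is purely temporal: everyone can verify a disclosed key, but only the source, who knows the seed, could have produced it in advance.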


L.JENITHA MARYROSELIN MARY Anand Institute of Technology

Abstract With the diverse and rapidly growing volume of web information, it is increasingly critical to organize and utilize that information efficiently and effectively. User-generated information is more freestyle and less structured, which increases the difficulty of mining useful information from these data sources. Recommender systems are used to satisfy the information needs of Web users and improve the user experience in many Web applications. These systems are based on collaborative filtering, a technique that automatically predicts the interest of an active user by collecting rating information from other similar users or items. The web recommendation process faces three challenges: it is not easy to recommend latently semantically relevant results to users; personalization must be taken into account; and it is time-consuming and inefficient to design different recommendation algorithms for different recommendation tasks. An efficient diffusion-graph method is proposed to overcome these challenges; it can operate on both directed and undirected graphs and achieves 30% greater efficiency than the existing system.
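The collaborative-filtering prediction the abstract refers to can be sketched in its simplest user-based form: estimate the active user's rating of an unseen item as the similarity-weighted average of other users' ratings. This is the baseline technique, not the paper's diffusion-graph method, and the cosine-over-common-items choice is an illustrative assumption.

```python
import math

def cosine_sim(a, b):
    """Cosine similarity over the items two users rated in common."""
    common = set(a) & set(b)
    if not common:
        return 0.0
    dot = sum(a[i] * b[i] for i in common)
    na = math.sqrt(sum(a[i] ** 2 for i in common))
    nb = math.sqrt(sum(b[i] ** 2 for i in common))
    return dot / (na * nb)

def predict(ratings, user, item):
    """Similarity-weighted average of the neighbors' ratings of `item`.
    ratings: dict user -> {item: score}."""
    num = den = 0.0
    for other, r in ratings.items():
        if other == user or item not in r:
            continue
        s = cosine_sim(ratings[user], r)
        num += s * r[item]
        den += s
    return num / den if den else None
```

Graph-diffusion methods generalize exactly this step: instead of one hop of similarity, preference mass spreads over a user-item graph for several hops.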



Abstract: Multi-hop routing in wireless sensor networks (WSNs) offers little protection against identity deception through replayed routing information. An adversary can exploit this defect to launch various harmful or even devastating attacks against the routing protocols, including sinkhole, wormhole, and Sybil attacks. The situation is further aggravated by mobile and harsh network conditions. To secure WSNs against adversaries misdirecting the multi-hop routing, we have designed and implemented TARF, a robust trust-aware routing framework for dynamic WSNs. Without tight time synchronization or known geographic information, TARF provides trustworthy and energy-efficient routes. Most importantly, TARF proves effective against harmful attacks built on identity deception; its resilience is verified through extensive evaluation with both simulation and empirical experiments on large-scale WSNs under various scenarios, including mobile and RF-shielding network conditions. Further, we have implemented a low-overhead TARF module in TinyOS; as demonstrated, this implementation can be incorporated into existing routing protocols with the least effort. Based on TARF, we also demonstrate a proof-of-concept mobile target detection application that functions well against an anti-detection mechanism.

MVEE DEFENCE AGAINST CODE INJECTION ATTACKS
JOHN SAMUEL. B BALA MURUGAN. Mohamed Sathak Engineering College

Abstract The growth of interconnected computers increases the number and complexity of attacks, so computer systems need appropriate security mechanisms. Intrusion detection and prevention systems play an important role in detecting and preventing attacks before they can compromise software. A multi-variant execution environment (MVEE) is an intrusion detection and prevention mechanism that concurrently executes several slightly different versions of the same program, called variants (n ≥ 2), each containing the same operational units as the original program. The variants are built to have indistinguishable behavior under normal execution conditions, but if any variant is under attack there are noticeable divergences in their execution behavior. A monitor compares the behavior of the variants at specific synchronization points and raises an alarm when a divergence is detected. Here, a monitoring mechanism that works in user space to supervise the variants is presented.
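The monitor's job can be illustrated at toy scale: run the variants in lockstep, compare their observable behavior at each synchronization point, and raise an alarm on any divergence. A real MVEE synchronizes at system calls; here the variants are modeled as plain functions, an illustrative simplification.

```python
def run_variants(variants, inputs):
    """Drive n variants in lockstep, comparing their outputs at each
    synchronization point; any divergence raises an alarm."""
    for step, x in enumerate(inputs):
        outputs = [v(x) for v in variants]
        if len(set(outputs)) != 1:  # behaviors diverged
            raise RuntimeError(f"divergence at sync point {step}: {outputs}")
    return "all sync points consistent"

# Two "variants": the same logic, built differently.
variant_a = lambda x: x * x
variant_b = lambda x: x ** 2

# A compromised variant that misbehaves on certain inputs.
compromised = lambda x: x * x if x < 10 else 0
```

An attack that relies on layout details (addresses, stack direction) perturbs the variants differently, which is precisely what makes the divergence observable.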

LATHARANI .D MR NAGALINGA RAJAN.A Infant Jesus College of Engineering


The trustworthiness of photographs plays an essential role in many areas, including forensic investigation, criminal investigation, medical imaging, and journalism. But in today's digital age it is very easy to alter image content, and one of the main problems is authenticating an image received in a communication. This paper proposes a robust alignment method that makes use of an image hash component based on the Bag-of-Features paradigm. The forensic hash component is a short signature attached to an image before transmission that acts as side information for analyzing the processing history and trustworthiness of the received image. The estimator is based on a voting procedure, and SIFT and block-based features are used to detect and localize image tampering. Experiments show that the proposed approach obtains a significant margin in terms of registration accuracy, discriminative performance, and tampering detection.


K.MALA, S.CHINNADURAI Srinivasan Engineering College,

Abstract Record linkage is the problem of identifying similar records across different data sources, where the similarity between two records is defined by domain-specific similarity functions over several attributes. De-duplicating one data set or linking several data sets are increasingly important tasks in the data preparation steps of many data mining projects; the aim is to match all records relating to the same entity. Different measures have been used to characterize the quality and complexity of data linkage algorithms, and several new metrics have been proposed. We give an overview of the issues involved in measuring data linkage and de-duplication quality and complexity. A matching tree is used to overcome communication overhead while giving the same matching decision as the conventional linkage technique. We develop new indexing techniques for scalable record linkage and de-duplication within the Febrl framework, as well as investigating learning techniques for efficient and accurate indexing.
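The basic linkage decision the abstract describes combines per-attribute similarity functions into a weighted score and compares it to a threshold. A minimal sketch, using `difflib` ratios as the string-similarity function; the weights and threshold are illustrative assumptions, not values from the paper.

```python
from difflib import SequenceMatcher

def field_sim(a, b):
    """String similarity in [0, 1] for one attribute."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def link(rec1, rec2, weights, threshold=0.8):
    """Weighted average of per-attribute similarities; record pairs
    whose score clears the threshold are declared a match."""
    total = sum(weights.values())
    score = sum(w * field_sim(rec1[f], rec2[f])
                for f, w in weights.items()) / total
    return score >= threshold, round(score, 3)
```

Indexing (blocking) techniques exist precisely because this pairwise comparison is too expensive to run on every record pair of a large data set.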

MR.T.MANIMARAN, MR. S. PACKIYA RAJKUMAR, Pandian Saraswathi Yadav Engineering College



The mainstay of this project is to secure data stored on the cloud from unauthorized users and to provide access control over outsourced data. Cloud computing has emerged as one of the most influential paradigms in the IT industry in recent years. Since this new computing technology requires users to entrust their valuable data to cloud providers, there have been increasing security and privacy concerns about outsourced data. Several schemes have been proposed for access control of outsourced data in cloud computing, most of them employing attribute-based encryption (ABE), but they lack flexibility and scalable access. To overcome this, we introduce an encryption scheme based on a hierarchical structure of users. The scheme provides an efficient key-management mechanism to distribute decryption keys to authorized users, scales when the number of users becomes large, and is flexible enough to revoke the keys of a previously legitimate user. Its main advantage is that the data owner need not always be online. Users without a decryption key cannot view the contents of a file, so files are secured from unauthorized access.
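The hierarchical key-management idea can be illustrated with one-way key derivation down a tree of user groups: a node's key derives every descendant key but reveals nothing about its ancestors. This hash-based sketch is a generic stand-in for the paper's scheme (the function names and SHA-256 construction are assumptions):

```python
import hashlib

def derive_key(parent_key, child_id):
    """One-way derivation: key(child) = H(key(parent) || child_id).
    Whoever holds a node's key can derive all descendant keys,
    but cannot recover the parent's key."""
    return hashlib.sha256(parent_key + child_id.encode()).digest()

def key_for_path(root_key, path):
    """Walk from the root down a path of group ids to a leaf key."""
    k = root_key
    for node in path:
        k = derive_key(k, node)
    return k
```

Because derivation is one-way and local, the owner only has to hand each group head its own key once, which is what lets the owner stay offline afterwards.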

S.MANIPRIYA .B.HEMALATHA Sri Venkateswara College of Engineering

Abstract Filters play an important role in microwave applications, and microstrip filters play various roles in wireless and mobile communication systems. There is increasing demand for new microwave and millimeter-wave systems that meet emerging telecommunication challenges with respect to size, performance and cost. A microstrip stepped-impedance low-pass filter is designed for low cost, low insertion loss and low return loss using a microstrip layout. The filter is a sixth-order Butterworth low-pass design operating at 1.5 GHz on an FR4 substrate with permittivity 4.4, substrate thickness 1.6 mm, and loss tangent 0.02, and is used in GPS applications. Microstrip technology is chosen for its simplicity and ease of fabrication. The design and simulation are performed using the Agilent ADS (Advanced Design System) tool to plot the insertion and return loss of the low-pass filter.
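For a Butterworth response the normalized low-pass prototype element values have a closed form, g_k = 2 sin((2k-1)π/2n), and a stepped-impedance layout maps them onto alternating high- and low-impedance line sections. The sketch below uses the standard textbook approximation; the characteristic impedance values are assumptions for illustration, not taken from the paper.

```python
import math

def butterworth_g(n):
    """Normalized Butterworth low-pass prototype element values g1..gn."""
    return [2 * math.sin((2 * k - 1) * math.pi / (2 * n)) for k in range(1, n + 1)]

g = butterworth_g(6)  # sixth-order, as in the paper

# Stepped-impedance approximation: a short high-impedance section realizes a
# series inductor, a short low-impedance section a shunt capacitor.  Assumes
# the first prototype element is a series inductor; impedances are examples.
Z0, Z_high, Z_low = 50.0, 120.0, 20.0   # ohms
lengths = []                             # electrical lengths in degrees
for k, gk in enumerate(g, start=1):
    if k % 2 == 1:                       # odd elements -> inductors (high-Z line)
        bl = gk * Z0 / Z_high
    else:                                # even elements -> capacitors (low-Z line)
        bl = gk * Z_low / Z0
    lengths.append(math.degrees(bl))
```

The prototype values are symmetric (g1 = g6, g2 = g5, g3 = g4), which is a quick sanity check before scaling to the 1.5 GHz cutoff in ADS.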


RAJALAKSHMI.M SBC Engineering College

Abstract Karyotyping is a common method in cytogenetics. Automatic classification of the chromosomes within microscopic images is the first step in designing an automatic karyotyping system. This is a difficult task, especially when a chromosome is highly curved within the image. The main aim of this paper is to define a new group of features for better representation and classification of chromosomes. The paper proposes classification and analysis of human chromosomes in the following steps: i) image processing utilities and filters are used to remove noise; ii) the filtered image is passed to a segmentation algorithm; iii) the segments then enter two tracks for classifying chromosomes. The first track uses image processing to measure chromosome length, while the second initiates a feed-forward neural network trained by the back-propagation algorithm. Using the feed-forward network and back propagation, the width, position and average intensity of each chromosome are determined. Back propagation achieves high accuracy with minimum training time, which makes it suitable for real-time chromosome classification in the laboratory. In our paper, segmentation is done using image processing, and classification is done using a feed-forward neural network with the back-propagation algorithm.


R.ELAKIYA P.HEMALATHA Anand Institute of Higher Technology

Abstract An automated scheme for information hiding and message prediction is proposed. In existing methods the message prediction rate is low and audio loss occurs while embedding data in video. Two data hiding methods are introduced to improve video quality with a high message prediction rate. To avoid audio loss, the FFMPEG media tool is used to split the audio and video. A frame is taken at random from the video and divided into n chunks; the message is converted into ASCII format, transformed into binary bits, and these bits are stored in each pixel's RGB value. The first method, multivariate regression, hides one bit in each pixel; the second, flexible macroblock ordering, hides three bits in each pixel. Finally, the audio and video are recombined and sent to the receiver. The two methods are compared: multivariate regression provides higher video quality than flexible macroblock ordering with exact message prediction, and both are examined visually against existing methods. The scheme improves efficiency with respect to quality distortion, message prediction, channel bit errors and packet losses; the video quality of the proposed work can be enhanced by up to 90%.
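The bit-embedding step described above (message → ASCII → binary bits → pixel RGB values) can be sketched as plain least-significant-bit replacement. This is an illustrative stand-in, not the paper's multivariate-regression or flexible-macroblock-ordering method; here one bit is written per colour channel.

```python
def embed_message(pixels, message):
    """Hide an ASCII message in the LSBs of a list of (R, G, B) pixels.

    Each character becomes 8 bits; each bit replaces the least-significant
    bit of one colour channel, so each pixel carries up to three bits.
    """
    bits = [int(b) for ch in message for b in format(ord(ch), "08b")]
    channels = [c for px in pixels for c in px]
    assert len(bits) <= len(channels), "message too long for cover image"
    for i, bit in enumerate(bits):
        channels[i] = (channels[i] & ~1) | bit     # clear LSB, then set it
    return [tuple(channels[i:i + 3]) for i in range(0, len(channels), 3)]

def extract_message(pixels, length):
    """Recover a `length`-character message from the channel LSBs."""
    channels = [c for px in pixels for c in px]
    bits = "".join(str(c & 1) for c in channels[:length * 8])
    return "".join(chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8))
```

Since only the LSB of each channel changes, no channel value moves by more than 1, which is why LSB embedding keeps visual distortion low.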

ENABLING CLOUD COMPUTING SECURITY FROM SINGLE TO MULTI CLOUDS K.A MOHAMED RIYAZUDEEN Abstract The use of cloud computing has increased rapidly in many organizations. Cloud computing provides many benefits in terms of low cost and accessibility of data. Ensuring security is a major factor in the cloud computing environment, as users often store sensitive information with cloud storage providers, but these providers may be untrusted. Dealing with single cloud providers is predicted to become less popular with customers due to the risks of service availability failure and the possibility of malicious insiders in the single cloud. A movement towards multi-clouds, in other words interclouds or a cloud-of-clouds, has emerged recently. This paper surveys recent research related to single- and multi-cloud security and addresses possible solutions. It is found that research into the use of multi-cloud providers to maintain security has received less attention from the research community than the use of single clouds. This work aims to promote the use of multi-clouds due to their ability to reduce the security risks that affect cloud computing users.

TOUCH SCREEN TECHNOLOGY K.Revathi,P.Vinotha Kalasalingam Institute of Technology Abstract

This document gives details about touch screen technology: its history, its construction and its usage. The various technologies used to make touch screens are also described. Finally, it covers the operating systems involved.

DESIGN AND IMPLEMENTATION OF BI-QUAD ANTENNA WITH PARABOLIC REFLECTOR FOR ENHANCING THE COVERAGE AREA OF A Wi-Fi ACCESS POINT R.Kanchana., S.saraswathy., R.Ruban Thomas Vel tech multi tech Engg college Abstract "The next decade will be the Wireless Era," said Intel executive Sean Maloney. Today's networks, especially LANs, have changed drastically; people expect not to be bound to the network. In this scenario, wireless LANs (WLANs) offer tangible benefits over traditional wired networking. Wi-Fi (Wireless Fidelity) is a generic term that refers to the IEEE 802.11 communications standard for Wireless Local Area Networks (WLANs). Wi-Fi works in three modes, namely ad hoc, infrastructure and extended. An ad hoc network is a peer-to-peer mode and does not use any intermediary device such as an access point (AP). Infrastructure and extended modes use an AP as the interface between wireless clients; the wireless network is formed by connecting all the wireless clients to the AP. A single access point can support up to 30 users and can function within a range of 100 to 150 feet indoors and up to 300 feet outdoors; the coverage area depends on where the AP is placed. The AP has a traditional omnidirectional antenna. The aim of this project is to increase the coverage area of an AP by replacing the traditional omnidirectional antenna with a bi-quad antenna with a parabolic reflector.

MACHINE LEARNING APPROACH FOR MEDICAL LANGUAGE PROCESSING
SABARI ANANDHA RAJA C, SASIDHARAN G, VIKNESH R, Krishnasamy College of Engineering & Technology

Abstract The Machine Learning (ML) field has gained momentum in almost every domain of research and has recently become a reliable tool in the medical domain. The empirical domain of automatic learning is used in tasks such as medical decision support, medical imaging, protein-protein interaction, extraction of medical knowledge, and overall patient management care. ML is envisioned as a tool by which computer-based systems can be integrated into the healthcare field in order to provide better, more efficient medical care. This paper describes an ML-based methodology for building an application capable of identifying and disseminating healthcare information. It extracts sentences from published medical papers that mention diseases and treatments, and identifies the semantic relations that exist between diseases and treatments. Our evaluation results for these tasks show that the proposed methodology obtains reliable outcomes that could be integrated into an application for the medical care domain. The potential value of this paper lies in the ML settings we propose and in the fact that we outperform previous results on the same data set.

CRYPTOGRAPHY A NETWORK SECURITY MEASURE

Abstract In today's world data transmission means everything; it can decide the future of a person or of a whole nation. In other words, whoever holds the data rules the world. All systems are being computerized, and computers appear in every field of human work. As the need for computers increases, computer crime increases equally. Securing the data in a computer from illegal access should be the most important task for anyone who owns it. Physical security can protect data stored in a computer system, and software is available to protect that data, but it becomes hard to protect data while it is being transferred. Cryptography comes into play here, yet most cryptographic methods are breakable using frequency analysis and other techniques. Because the proposed technique applies encryption at the compiler level, it becomes relatively hard to crack; moreover, unlike many encryption techniques, it encodes the data dynamically and is therefore hard to break.

DETECTION OF OCCUPIED & AVAILABLE SPACES IN THE CAR PARKING SYSTEM USING HAAR-LIKE FEATURES

Abstract This paper describes an approach to monitoring and managing a parking area using a vision-based parking system. With the rapid increase in cars, finding available parking space efficiently, and avoiding traffic congestion in a parking area, is becoming a necessity in car park management. Current car park management depends either on human personnel keeping track of available spaces or on a sensor-based system that monitors the availability of each space or the overall number of available spaces. In both cases, the only information available is the total number of spaces, not their actual locations. In addition, the installation and maintenance cost of a sensor-based system depends on the number of sensors used. This paper presents a vision-based system that detects and indicates the available parking spaces in a car park using Haar-like features. The methods used to detect available spaces are based on coordinates indicating the regions of interest and a car classifier. The initial work reported here achieves an accuracy ranging from 90% to 100% for a four-space car park, indicating that a vision-based car park management system can detect and indicate the available car park spaces.
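A Haar-like feature is simply a difference of rectangle sums, computed in constant time from an integral image. The sketch below shows a vertical two-rectangle feature of the kind a car classifier is built from; the trained cascade itself is omitted.

```python
def integral_image(img):
    """Summed-area table: ii[y][x] = sum of img[0..y-1][0..x-1]."""
    h, w = len(img), len(img[0])
    ii = [[0] * (w + 1) for _ in range(h + 1)]
    for y in range(h):
        row = 0
        for x in range(w):
            row += img[y][x]
            ii[y + 1][x + 1] = ii[y][x + 1] + row
    return ii

def rect_sum(ii, x, y, w, h):
    """Sum of pixels in the w*h rectangle at (x, y), in O(1)."""
    return ii[y + h][x + w] - ii[y][x + w] - ii[y + h][x] + ii[y][x]

def two_rect_feature(ii, x, y, w, h):
    """Vertical two-rectangle Haar-like feature: left half minus right half.
    Large magnitudes indicate a strong light/dark edge in the window."""
    half = w // 2
    return rect_sum(ii, x, y, half, h) - rect_sum(ii, x + half, y, half, h)
```

Because every feature evaluation costs only a handful of table lookups, thousands of such features can be scanned over each region of interest in real time.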

TEXTURE CLASSIFICATION BASED ON NEURAL NETWORK Sheeba Thankachan, S. Nageswari Abstract As a newly developed 2-D extension of the wavelet transform using multiscale and directional filter banks, the contourlet transform can effectively capture the intrinsic geometric structures and smooth contours of a texture image, which are the dominant features for texture classification. In this paper, I propose a neural network classifier, which works similarly to the human neural system, to classify the texture category. The neural network classifier is trained using contourlet features, and the classification result is enhanced using a feed-forward back-propagation neural network. A two-layer feed-forward network, with sigmoid hidden and output neurons, can classify vectors arbitrarily well given enough neurons in its hidden layer. The network is trained with the scaled conjugate gradient back-propagation algorithm.
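A two-layer sigmoid network trained by back-propagation fits in a few lines of code. The sketch below uses plain gradient descent on a toy XOR problem, not the scaled conjugate gradient variant or the contourlet features used in the paper; it only illustrates the forward pass and the delta rule.

```python
import math, random

random.seed(0)
sig = lambda x: 1.0 / (1.0 + math.exp(-x))

# 2-2-1 feed-forward network with sigmoid hidden and output neurons
W1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
b1 = [0.0, 0.0]
W2 = [random.uniform(-1, 1) for _ in range(2)]
b2 = 0.0
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]  # XOR toy set

def forward(x):
    h = [sig(W1[j][0] * x[0] + W1[j][1] * x[1] + b1[j]) for j in range(2)]
    return h, sig(W2[0] * h[0] + W2[1] * h[1] + b2)

def loss():
    return sum((forward(x)[1] - t) ** 2 for x, t in data)

def train(epochs=5000, lr=0.5):
    global b2
    for _ in range(epochs):
        for x, t in data:
            h, y = forward(x)
            dy = (y - t) * y * (1 - y)                 # output delta
            for j in range(2):
                dh = dy * W2[j] * h[j] * (1 - h[j])    # hidden delta (pre-update)
                W2[j] -= lr * dy * h[j]
                for i in range(2):
                    W1[j][i] -= lr * dh * x[i]
                b1[j] -= lr * dh
            b2 -= lr * dy

before = loss()
train()
after = loss()
```

The same loop generalises to wider layers; scaled conjugate gradient replaces the fixed learning rate with a line-search-free second-order step but leaves the gradient computation unchanged.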




This project demonstrates how a PC can be controlled from a remote place using a smartphone over the Internet: the PC's monitor is mirrored on the mobile. It turns the phone into a wireless keyboard and mouse with touchpad, using your own wireless network. The application runs on Android-based mobiles and requires a server application on the computer, plus a device running the Android operating system with some sort of wireless connection between them. By obtaining the PC's IP address and browsing to it on the mobile phone, the PC screen can be accessed on the mobile. The system supports web applications with a database for storing web pages; the mobile application retrieves the required information at certain intervals by connecting to the web server. You can also view your phone's screen on your computer monitor, which is useful for placing Android notifications alongside other notification boxes, using the monitor as caller ID, and taking screenshots and screencasts. Remote keyboard and mouse control is convenient for entering data without pecking at the tiny on-screen keyboard.



Coimbatore Institute of Technology

A mobile ad hoc network (MANET) is a collection of mobile hosts which can communicate with the aid of intermediate mobile hosts, without a fixed infrastructure or centralized administration. Quality of Service (QoS) routing in MANETs has become an important issue due to the emergence of multimedia services. An optimal path is found by considering multiple QoS parameters, namely bandwidth, delay, energy and the number of hops. Simulation is carried out to measure the achieved throughput and delay.
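Multi-constraint QoS routing is NP-hard in general, so a common simplification treats bandwidth as a hard constraint and minimises an additive metric such as delay. The sketch below is an illustrative stand-in for that idea, not the paper's algorithm: links below the required bandwidth are pruned and Dijkstra's algorithm runs on delay.

```python
import heapq

def qos_path(graph, src, dst, min_bw):
    """Minimum-delay path subject to a bandwidth constraint.

    graph[u] = list of (v, delay, bandwidth) edges.  Links whose bandwidth
    is below min_bw are skipped; among the rest, total delay is minimised.
    Returns (total_delay, path) or None if no feasible path exists.
    """
    pq, seen = [(0, src, [src])], set()
    while pq:
        delay, u, path = heapq.heappop(pq)
        if u == dst:
            return delay, path
        if u in seen:
            continue
        seen.add(u)
        for v, d, bw in graph.get(u, []):
            if bw >= min_bw and v not in seen:
                heapq.heappush(pq, (delay + d, v, path + [v]))
    return None
```

Energy and hop count can be folded into the additive cost (e.g. cost = delay + alpha * hops) without changing the search, which is how several QoS metrics are combined into one path choice.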


Sample selection chooses a number of representative samples from a large database so that a learning algorithm has reduced computational cost and improved learning accuracy. Many sample selection algorithms, such as IBL, CNN, and their extensions, have selection mechanisms tied to the class labels of the samples to be selected, so their selection results depend directly on those class labels. These algorithms can condense the data set but cannot reduce labeling cost, and they do not clarify how to use a search query or how to return results from the decision tree. To overcome these drawbacks, this paper constructs a fuzzy decision tree in a cloud environment so that the accurate sample can be found. The cloud storage method takes a single round of communication to guarantee retrieval of the exact result. Uncertain data are stored in cloud storage fully secured with an encryption algorithm, with all data stored as cipher text. In the cloud environment the data are indexed, which makes it easy to find the exact result using a search query.

VEHICULAR NETWORK FOR INTELLIGENT TRAFFIC SYSTEM TO AVOID ACCIDENTS USING HYBRID SCALABILITY PROTOCOL G.Vigneshchakkaravarthy Adhiparasakthi Engineering College Abstract Recent advances in wireless technologies have given rise to the emergence of vehicular ad hoc networks (VANETs): highly mobile wireless networks designed to support vehicular safety, traffic monitoring, and other commercial applications. In VANETs, vehicle mobility causes communication links between vehicles to break frequently. Such link failures require a direct response from the routing protocols, leading to a potentially excessive increase in routing overhead and degradation in network scalability. In such networks, the limited coverage of Wi-Fi and the high mobility of the nodes generate frequent topology changes and network fragmentation, so an efficient routing strategy is crucial to the deployment of VANETs. A series of representative metaheuristic algorithms (particle swarm optimization, differential evolution, genetic algorithm, and simulated annealing) is studied in this paper to automatically find optimal configurations of the routing protocol. VANETs are attracting considerable attention from the research community and the automotive industry for improving the services of Intelligent Transportation Systems (ITS). Traffic data from a limited region of the road map are collected to capture realistic mobility. In this work, the entire region is divided into various smaller routes. The realistic mobility model used here considers the driver's route choice at run time, and also studies the clustering effect caused by traffic lights used at intersections to regulate traffic in different directions. The fundamental challenge of providing live multimedia streaming (LMS) services in VANETs is achieving a stable, high streaming rate (smooth playback) for all interested vehicles while using minimal bandwidth, especially under the highly dynamic topology of VANETs and the lossy nature of vehicular wireless communications. High speed is the defining characteristic of VANETs, leading to frequent breakdowns, interference and so on; the performance of ad hoc routing protocols is therefore key to improving Quality of Service (QoS). In this paper we study various ad hoc routing protocols (reactive, proactive and hybrid), taking into consideration parameters such as speed, altitude and mobility in a real VANET scenario. The AODV and DYMO (reactive), OLSR (proactive) and ZRP (hybrid) protocols are compared for the IEEE 802.11 (MAC) and IEEE 802.11 (DCF) standards using QualNet as the simulation tool. Since IEEE 802.11 covers both the physical and data link layers, the performance of the protocols in these layers helps in making the right protocol selection for high-speed mobility. Varying the VANET parameters shows that in real traffic scenarios the proactive protocol performs more efficiently for IEEE 802.11 (MAC) and IEEE 802.11 (DCF).

DYNAMIC LOAD MONITORING AND RESOURCE ALLOCATION IN CLOUD ENVIRONMENT B.Vignesh Kumar S.Dhanasekaran Kalasalingam University Abstract In this paper, dynamic load monitoring and resource allocation in a cloud environment are managed. Load monitoring is the process of calculating the free physical memory and CPU utilization of each virtual machine, and is performed continuously. Resource allocation is based on the maximum available free physical memory and follows two basic cases. In the first, the physical memory required by the job is less than the available physical memory, so the job is allocated directly. In the second, the required physical memory is greater than the available free physical memory, so the job cannot be allocated directly without degrading performance; instead, the job is split according to the need and the available physical memory, each part is allocated to a node whose free memory covers it, and then the job is processed. The aim is to achieve maximum utilization of cloud resources in an effective manner.
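The two allocation cases above can be sketched as a greedy placement: if the node with the most free memory can hold the job, it takes the whole job; otherwise the job is split across nodes in decreasing order of free memory. The node names and sizes below are illustrative.

```python
def allocate(job_mem, nodes):
    """Place a job of size job_mem across nodes by free physical memory.

    nodes: dict node name -> free memory.  Returns a list of (node, amount).
    Case 1: the largest node fits the whole job -> single placement.
    Case 2: no single node fits -> the job is split across several nodes.
    """
    placement = []
    remaining = job_mem
    # Greedily use the nodes with the most free memory first
    for name, free in sorted(nodes.items(), key=lambda kv: -kv[1]):
        if remaining <= 0:
            break
        take = min(free, remaining)
        if take > 0:
            placement.append((name, take))
            remaining -= take
    if remaining > 0:
        raise RuntimeError("not enough free memory in the cluster")
    return placement
```

Rerunning this after each monitoring interval makes the allocation track the continuously updated free-memory figures.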

GRID COMPUTING S.Vignesh Kumar R.VinothKumar Jayaram college of engineering and technology Abstract A grid computing system is a geographically distributed environment with autonomous domains that share resources amongst themselves. Grid computing presents a new trend in distributed computing and Internet applications: it can construct a virtual single image of heterogeneous resources, provide a uniform application interface, and integrate widespread computational resources into a super, ubiquitous and transparent aggregation. Grid computing is a systems design paradigm which efficiently uses distributed computing resources, usually on the scale of supercomputers. We design, implement and evaluate Mobile OGSI.NET, a distributed software system that permits device collaboration and better resource usage while conforming to the grid computing OGSI specification. Grid computing provides a challenge for visualization system designers: it gives users access to widely distributed networks of computing resources to solve large-scale tasks such as scientific computation. Finally, we describe the middleware challenges implied by the approach and an architecture for grid computing using virtual machines.

AN EFFICIENT MINING FOR HIGH UTILITY PATTERN FROM WEB LOG DATA USING UP CATALOG P.Visnu pria S.Geetha Oxford Engineering College Abstract Discovering useful patterns hidden in a database plays an essential role in several data mining techniques, such as frequent pattern mining, weighted frequent pattern mining and high-utility pattern mining. In frequent pattern mining, the importance of each item is not considered; to address this, Weighted Association Rule Mining (WARM) was proposed, in which the weight of each item is considered, but it still does not satisfy users' requirements. Hence, Utility Pattern (UP) mining was proposed. Mining high-utility itemsets refers to the discovery of itemsets with high utility, such as profit. Existing methods often generate a huge set of potential high-utility itemsets, and their mining performance degrades when the database contains large transactions. Hence, to handle large numbers of transactions, the UP catalog was proposed, and the UP-Growth algorithm is used to prune candidate itemsets effectively. Finally, heuristic rules are framed with respect to the datasets; by adopting this rule-framing strategy, the strength of the itemsets is evaluated. The proposed mechanism reduces tree-construction cost and time and is applicable to large numbers of logs.
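The difference between frequency and utility is easy to see in code: an itemset's utility is quantity × profit summed over the transactions that contain it. The brute-force enumeration below is only for illustration (profits and transactions are invented); UP-Growth exists precisely to avoid this exponential candidate generation.

```python
from itertools import combinations

profit = {"A": 5, "B": 2, "C": 1}          # external utility (profit per unit)
transactions = [                            # each transaction: item -> quantity
    {"A": 1, "B": 2},
    {"B": 4, "C": 3},
    {"A": 2, "C": 1},
]

def utility(itemset, tx):
    """Utility of itemset in one transaction: 0 unless all items are present."""
    if not all(i in tx for i in itemset):
        return 0
    return sum(profit[i] * tx[i] for i in itemset)

def high_utility_itemsets(min_util):
    """Exhaustively enumerate itemsets whose total utility meets min_util."""
    items = sorted(profit)
    result = {}
    for r in range(1, len(items) + 1):
        for iset in combinations(items, r):
            u = sum(utility(iset, tx) for tx in transactions)
            if u >= min_util:
                result[iset] = u
    return result
```

Note that a frequent itemset need not be high-utility and vice versa: item C appears often but earns little, while A earns a lot from few occurrences, which is exactly the gap WARM and UP mining address.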

BLOOD VESSEL SEGMENTATION ON DIGITAL FUNDUS IMAGES G.Priyanka, Mrs.G.Jemilda, Jayaraj Annapackiam CSI College of Engineering Abstract The main objective of this project is to detect and segment the blood vessels in digital fundus images. Diabetic retinopathy, one of the leading causes of visual impairment, is characterized by the development of abnormal new retinal vessels. This project uses a gray-level-based feature method for segmenting the blood vessels from the optic disk. Fifteen feature parameters associated with shape, position, orientation, brightness, contrast and line density are calculated for each candidate segment. Based on these features, each segment is categorized as normal or abnormal using a support vector machine (SVM) classifier. The methodology uses morphological operations to obtain the blood vessels.

COMMUNITY ANOMALY DETECTION SYSTEM IN COLLABORATIVE INFORMATION SYSTEMS Thiraviaselvi.G, Subbu Lakshmi.T.C Francis Xavier Engineering College Abstract Collaborative Information Systems (CIS) integrate and coordinate information from diverse sources and allow groups of users to communicate and cooperate over common tasks. CIS are increasingly relied upon to manage sensitive information, yet current security mechanisms for detecting insider threats are ill-suited to monitoring systems in which users function in dynamic teams. An insider threat is a malicious hacker who is an employee of an institution, or an outsider who poses as an employee by obtaining false credentials, and who causes damage to sensitive information. The Community Anomaly Detection System (CADS), an unsupervised learning framework to detect insider threats based on the access logs of collaborative environments, is introduced. The framework is based on the observation that typical CIS users tend to form community structures based on the subjects accessed (e.g., patient records viewed by healthcare providers). CADS consists of two components: 1) relational pattern extraction, which derives community structures, and 2) anomaly prediction, which uses a statistical model based on nearest-neighbor networks to determine when users have sufficiently deviated from their communities. It is capable of detecting anomalous insiders in systems that use dynamic teams.
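The nearest-neighbour deviation idea can be sketched by representing each user as a binary access vector over subjects and scoring how far a user sits from their closest peers. This is a toy illustration of the principle, not the CADS statistical model itself; the access vectors are invented.

```python
import math

def cosine(u, v):
    """Cosine similarity between two access vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def deviation(user, others, k=2):
    """Anomaly score: 1 minus mean similarity to the k nearest neighbours.

    Each vector marks which subjects (e.g. patient records) a user accessed;
    a user far from every community of co-accessors scores close to 1.
    """
    sims = sorted((cosine(user, o) for o in others), reverse=True)[:k]
    return 1.0 - sum(sims) / len(sims)
```

A user whose accesses overlap a community's gets a low score, while an insider touching an unrelated set of records stands out with a high one, which is the signal CADS thresholds on.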

DLA: DYNAMIC LEARNING ALGORITHM FOR ANOMALY DETECTION IN MOBILE AD HOC NETWORKS (MANET) J.Vinoth Kumar Ms. K. Madheswari SSN College of Engineering Abstract Mobile ad hoc networks are multi-hop networks of independent mobile nodes without any fixed infrastructure. In a MANET it is difficult to identify malicious nodes because the network topology constantly changes due to node movement. As the topology changes over time, simple use of a static baseline profile is not efficient. In this paper, a Dynamic Learning Algorithm (DLA) is proposed to detect anomalies and establish a normal profile. In this anomaly detection scheme, the training data are updated at a set time interval. By comparing a sample packet against the normal baseline profile, an attack is identified and stored in the packet attack database. The NS2 simulator is used for MANET simulation, with scenarios based on routing attacks against the AODV protocol.

IMPROVED UNSUPERVISED SEGMENTATION ALGORITHM FOR TISSUE PATHOLOGY Aiswarya Gopinath Mr.Rajesh T PSN College of Engineering and Technology

Abstract This work on unsupervised segmentation of histopathological tissue images makes two main contributions. First, a new set of high-level texture features is introduced to represent prior knowledge of the spatial organization of the tissue components. Second, it proposes to obtain multiple segmentations by multilevel partitioning of a graph constructed on the tissue objects, which are then combined using an ensemble function. To define objects in these components, the K-means algorithm can be used: it partitions an image into K clusters and assigns each point to the cluster whose center (centroid) is nearest. Another clustering algorithm is Fuzzy C-Means (FCM), also known as soft K-means. In fuzzy clustering, each point has a degree of belonging to each cluster rather than belonging completely to just one, so points on the edge of a cluster belong to it to a lesser degree than points at its center. FCM is thus more suitable for elongated clusters and provides better results than K-means; hence FCM is used as the clustering algorithm in this paper. Multilevel graph partitioning increases the diversity of the individual segmentations and hence improves the final result.
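The "degree of belonging" in FCM comes from a closed-form membership update: the closer a point is to a centre relative to the other centres, the higher its membership, with memberships over all clusters summing to 1. A one-dimensional, single-iteration sketch (in the full algorithm this alternates with a centre update until convergence):

```python
def fcm_memberships(points, centers, m=2.0):
    """One fuzzy c-means membership update.

    u[j][i] is the degree to which point j belongs to cluster i, computed as
    u = 1 / sum_k (d_i / d_k)^(2/(m-1)); each row sums to 1.  m > 1 is the
    fuzzifier: larger m gives softer assignments.
    """
    u = []
    for p in points:
        d = [abs(p - c) or 1e-12 for c in centers]   # 1-D distances, avoid /0
        row = []
        for i in range(len(centers)):
            s = sum((d[i] / d[k]) ** (2.0 / (m - 1.0)) for k in range(len(centers)))
            row.append(1.0 / s)
        u.append(row)
    return u
```

A point midway between two centres receives memberships near 0.5 each, which is exactly the edge-of-cluster behaviour the abstract describes; hard K-means would force such a point entirely into one cluster.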

CLUSTERING BASED DICTIONARY GENERATION FOR DIABETIC RETINOPATHY DETECTION Brindha.N.R, Mr.Sundaraguru.R PSN College of Engineering and Technology Abstract A good eye is an important and significant factor in retaining the independence and quality of life of all living beings. Diabetic retinopathy is a retinopathy caused by complications of diabetes that damages the retina: it affects the back part of the eye, damages the blood vessels of the retina, and causes blurry vision, scarring, cloudiness and increased pressure, which leads to blindness. This work detects diabetic retinopathy (DR) from the lesions in a fundus image, and can do so without pre- or post-processing of the affected red and bright lesion parts of the retina. The method is based on marking points of interest (POI) at lesion locations to build a visual word dictionary; the POI regions help classify whether the fundus image is normal or diabetes-affected. The approach is extended by adding feature information to the visual word dictionary, so it is applicable to different types of retinal lesions, with a specific projection space for each class of interest instead of a common dictionary for all classes. The red and bright lesions are classified using the visual word dictionary with cross-validation and cross-dataset validation to show the efficiency of the approach. For final classification, an SVM is proposed for two-class machine learning classification. The visual word dictionary does not depend on the resolution of the image, and the proposed work shows the ability to detect and classify retina images under different conditions.

CANCER DISCOVERY USING CLUSTER ENSEMBLE APPROACH Dr.S.Ramakrishnan Mrs.L.Meenachi Ms. G.T.Citra Ms.I.Swathi shree Ms. C.Deepika Abstract Cancer discovery is one of the most important clinical applications. In this paper, we use a cluster ensemble approach to diagnose cancer cells, with the aim of improving the accuracy of the clusters that are formed. Initially, prior knowledge about the dataset is represented as pairwise constraints. Multiple datasets are then generated from the given dataset using the random subspace (RSS) technique, and spectral clustering (SC) is applied to them to obtain a set of clustering solutions. Using the pairwise constraints, a confidence factor is calculated for each solution. A consensus matrix is formed from the clustering solutions and confidence factors, and finally the consensus matrix is partitioned to obtain the final clusters. The above method is also applied to the datasets and the accuracy of those approaches has been measured. The experiments show that the cluster ensemble further improves the accuracy of the single clustering algorithm; being more accurate, the method can be applied in various applications.

3-D VOLUME RECONSTRUCTION FOR MEDICAL APPLICATION Manikandan.V Divya.P, Selvam College of Technology Abstract DDoS (Distributed Denial of Service) attacks threaten Internet security nowadays. Such attacks are among the hardest security problems to address because they are simple to implement, difficult to prevent, and very difficult to trace; as a result, no effective and efficient method exists so far to deal with the issue. This work proposes a novel traceback method for DDoS attacks based on entropy variations between normal and DDoS attack traffic, which is fundamentally different from commonly used packet marking techniques. Compared with existing DDoS traceback methods, the proposed strategy has a number of advantages: it is memory non-intensive, efficiently scalable, robust against packet pollution, and independent of attack traffic patterns.
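The entropy-variation signal is straightforward to compute: the Shannon entropy of the flow distribution (for example, per source address) changes sharply when one attack flow starts to dominate the traffic mix. The traffic samples below are invented for illustration; the paper's traceback logic built on top of this signal is not reproduced here.

```python
import math
from collections import Counter

def flow_entropy(packets):
    """Shannon entropy (bits) of the flow distribution of observed packets.

    packets: iterable of flow identifiers (e.g. source addresses).  A balanced
    mix gives high entropy; a flood dominated by one flow drives it down.
    """
    counts = Counter(packets)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

normal = ["a", "a", "b", "b", "c", "c", "d", "d"]   # balanced sources
attack = ["z"] * 28 + ["a", "b", "c", "d"]          # one flow dominates
```

A router flags an attack when the entropy deviates from its normal baseline by more than a threshold, and the traceback then follows the deviation hop by hop towards the attack sources.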

EFFICIENT APPROACH TO PATENT SEARCH PARADIGM T.Krishna Chaitanya D.Hemavathi SRM University Abstract The patent search process has attracted considerable attention recently because of the need to search patent data in large data sets. In other methods, the user takes a try-and-see approach, issuing several queries to check whether the answers are relevant, which is a complex process. Measures such as error correction, topic-based query suggestion and query expansion are used to evaluate interestingness in the search criteria. Building on these techniques, this project proposes a new search method that improves search efficiency. The key to finding relevant answers efficiently in large data sets is partitioning them. In this project, partitions are constructed from the USPTO dataset: patents are first partitioned into small partitions based on their classes and topics; then, for a given query, the highly relevant partitions are found and the query is answered from them. After selecting relevant answers from the different partitions, the top-k answers to the patent-search query are generated.
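The partition-then-rank idea can be sketched with a toy corpus: score each partition by aggregate term overlap with the query, then answer only from the best partition. The class codes, documents and scoring function below are invented for illustration (the paper partitions by USPTO classes and topics and uses richer relevance measures).

```python
# Toy partitions keyed by invented class codes; each entry is (patent id, text)
partitions = {
    "G06F": [("p1", "query optimization in databases"),
             ("p2", "cache memory design")],
    "H04L": [("p3", "packet routing protocol"),
             ("p4", "network encryption scheme")],
}

def score(query, text):
    """Crude relevance: number of query terms appearing in the text."""
    q, t = set(query.lower().split()), set(text.lower().split())
    return len(q & t)

def topk(query, k=2):
    """Rank partitions by total term overlap, then answer the query
    from the most relevant partition only."""
    ranked = sorted(partitions,
                    key=lambda c: -sum(score(query, t) for _, t in partitions[c]))
    best = ranked[0]
    hits = sorted(partitions[best], key=lambda pt: -score(query, pt[1]))
    return [pid for pid, _ in hits[:k]]
```

Scanning only the winning partition is what makes the search cheaper than matching the query against the full collection.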

SLM: SECURE LEADER MODEL FOR INTRUSION DETECTION IN MOBILE AD-HOC NETWORK J.Mervin Ms.K.Madheswari SSN College of Engineering Abstract Mobile Ad hoc Networks (MANETs) have no fixed chokepoints or bottlenecks where Intrusion Detection Systems (IDSs) can be deployed. Hence, nodes are clustered, and each node must run its own IDS and cooperate with others to ensure security. Since mobile nodes are energy-limited, it is inefficient for every node in a cluster to provide IDS service. To balance resource consumption among all nodes and prolong the lifetime of a MANET, the nodes with the most remaining resources should be elected as leaders. The proposed system deals with leader election for intrusion detection in MANETs in the presence of selfish nodes. The two obstacles to achieving this goal are malicious and selfish nodes: since resources are private information, nodes may behave selfishly to increase their own benefit, or maliciously by refusing to provide IDS service to the other nodes in the cluster. The proposed system controls selfish and malicious nodes by providing incentives in the form of reputation.

MULTIKEYWORD SEARCHING IN PEER-TO-PEER NETWORK USING GOSSIP ALGORITHM COMBINED WITH QUERY RATING TECHNIQUES A.Azhakeswari, V.Venkateshwaradevi Oxford Engineering College Abstract A peer-to-peer (P2P) network is a popular tool for sharing information on the web, where information resides on millions of sites in a distributed manner. Existing P2P retrieval mechanisms provide a scalable distributed hash table (DHT) that allows every individual keyword to be mapped to the set of documents/nodes across the network that contain it. Using this single-keyword index, a list of entries for each keyword in a query can be retrieved with existing DHT lookups. For multi-keyword search, the simple solution of merging the results of each keyword search incurs a lot of traffic. A Bloom Filter (BF) is an effective way to reduce this communication cost: instead of sending the full set of documents, a BF can be transferred, thereby reducing traffic. The BF settings can be optimized by increasing the filter size and the number of hash functions used. A gossip algorithm scans the documents for a specified keyword and counts its occurrences across all documents; when the count reaches a threshold value, the entry is automatically stored in the DHT. This paper introduces a query rating technique that counts the number of users issuing the same keyword. Combining query rating with the gossip algorithm gives better performance by reducing the searching time.
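A minimal Bloom filter along the lines described above might look like this; the size, the SHA-256-derived hash family, and the integer-as-bit-array representation are illustrative choices, not the paper's parameters.

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: m bits, h hash positions derived from SHA-256."""
    def __init__(self, m=1024, h=4):
        self.m, self.h = m, h
        self.bits = 0  # a Python int used as an m-bit array

    def _positions(self, item):
        # Derive h independent positions by salting the item with an index.
        for i in range(self.h):
            digest = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.m

    def add(self, item):
        for p in self._positions(item):
            self.bits |= 1 << p

    def __contains__(self, item):
        # May return a false positive, never a false negative.
        return all(self.bits >> p & 1 for p in self._positions(item))
```

A peer answering a keyword lookup would add its matching document identifiers and ship only the filter; the requester intersects filters locally instead of transferring full result sets.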

A NOVEL BASED CLOUD COMPUTING SECURITY Santhakumar.D, Pragash.K, Balamurali.K CK College of Engineering & Technology Abstract Cloud computing places an organization's sensitive data in the control of a third party, introducing a significant level of risk to the privacy and security of the data. This paper focuses on cloud computing, the various cloud deployment models, and the main security risks and issues currently present within the cloud computing industry. It also proposes a collaboration-based security management framework for the cloud computing model.

E-SECURE: PRESERVING SECURITY THROUGH THREE-FACTOR AUTHENTICATION IN DISTRIBUTED SYSTEMS K.USHA K.PRAKASH P.MANOJ M.SURIYAPRAKASH R.V.S College of Engineering and Technology Abstract This paper investigates a systematic and real-time approach for authenticating clients by three factors, namely password, smart card, and biometrics. E-Secure is proposed to upgrade two-factor authentication to three-factor authentication. As part of security within distributed systems, various services and resources are efficiently protected from unauthorized use because of the three authentication factors. It not only significantly improves information assurance at low cost but also protects client privacy in distributed systems. In addition, it maintains several practice-friendly properties of the underlying two-factor authentication.
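A loose sketch of a three-factor check might look like the following; the record layout, the fuzzy biometric match, and the function name are assumptions for illustration, and the real protocol is interactive rather than a single server-side comparison.

```python
import hashlib, hmac

def verify_three_factors(password, card_secret, biometric, record,
                         bio_threshold=0.9):
    """A client passes only if all three factors check out against the
    server-side record (hypothetical layout: pw_hash, card_secret,
    bio_template)."""
    pw_ok = hmac.compare_digest(
        hashlib.sha256(password.encode()).hexdigest(), record["pw_hash"])
    card_ok = hmac.compare_digest(card_secret, record["card_secret"])
    # Biometric match is fuzzy: fraction of matching template features.
    matches = sum(a == b for a, b in zip(biometric, record["bio_template"]))
    bio_ok = matches / len(record["bio_template"]) >= bio_threshold
    return pw_ok and card_ok and bio_ok
```

The point of the third factor is visible here: even a stolen password plus card fails unless the biometric template also matches within the threshold.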

A DYNAMIC DATA REPLICATION IN CLOUD TO INCREASE SYSTEM AVAILABILITY Rajapriya G Vijayakumar D Srinivasagan K G National Engineering College Abstract Data replication is a method to improve the performance of data access in distributed systems. Dynamic replication adapts the replication configuration as user behavior changes over time, to preserve the benefits of replication. To improve system availability, popular data is replicated to multiple suitable locations so that users can access it from a nearby site. Deciding a reasonable number of replicas and the right locations for them has become a challenge in cloud computing. A dynamic data replication strategy suitable for distributed computing environments is presented. It includes: analyzing and modeling the relationship between system availability and the number of replicas; evaluating and identifying popular data and triggering a replication operation when the popularity of the data passes a dynamic threshold; calculating a suitable number of copies to meet a reasonable system byte effective rate requirement and placing replicas among data nodes in a balanced way; and designing the dynamic data replication algorithm in a cloud. The proposed method increases availability and performance, reduces user waiting time, and also reduces the execution time of the system.

AGENT BASED NETWORK SNIFFER INTRUSION DETECTION Mrs. Dhanalakshmi, Mr. A.R.M. Ravi Shankar Arunai Engineering College Abstract Network sniffing is considered a major threat to networks and web applications. Every device connected to an Ethernet network receives all the data passed on the segment. By default the network card processes only data that is addressed to it. However, listening programs turn the network card into a mode of reception of all packets, called promiscuous mode. A sniffer is a program that eavesdrops on all the data moving on the network [1]. In this mode the NIC does not perform its basic task of filtering; it forwards all the packets moving on the network to the system for further processing. A sniffer does not generate any traffic in the network, so it cannot be detected easily. Many sniffers, such as wireshark, Cain & Abel, and ethersniff, are available at no cost on the internet. Many solutions have been proposed for the detection of network sniffing, including antisniff [2], SnifferWall [3], and Sniffer Detector [4], but no solution guarantees full security. In this paper we propose mobile agents as a solution to the problem of sniffer detection. Mobile agents perform a task by migrating to and executing on several hosts connected to the network. For sniffer detection, the network administrator sends special types of mobile agents into the network and collects information from different nodes. After analyzing this information, the network administrator can identify the computer systems running in promiscuous mode.
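As a complement to the agent-based approach, a local check for promiscuous mode on Linux can read the interface flags; the IFF_PROMISC bit value comes from the Linux netdevice interface, and detecting remote sniffers, as the mobile agents above do, requires active probing instead.

```python
IFF_PROMISC = 0x100  # promiscuous-mode bit in Linux interface flags

def is_promiscuous(flags):
    """Return True if the IFF_PROMISC bit is set in an interface's flags."""
    return bool(flags & IFF_PROMISC)

def local_promiscuous_interfaces():
    """List local interfaces currently in promiscuous mode (Linux only)."""
    import glob, os
    result = []
    for path in glob.glob("/sys/class/net/*/flags"):
        with open(path) as f:
            if is_promiscuous(int(f.read().strip(), 16)):
                result.append(os.path.basename(os.path.dirname(path)))
    return result
```

An agent arriving on a host could run such a check and report the result back to the administrator's collection point.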

ANDROID AND WIRELESS BASED ROBOTIC VEHICLE CONTROL D.Santhakumar P.Prakash T.Pradeep Kumar Chander CK College of Engineering & Technology Abstract Our paper discusses controlling a robot by wireless communication; we create an application on a smartphone for controlling the robot. Instructions are given to the smartphone through voice. The basic instructions are forward, backward, left, right, and stop. The instructions are transferred to the robot through the wireless medium, and the robot moves on its wheels upon receiving a command. The robot is built around an Arduino board; the board receives the instruction and performs the task. Simultaneously it captures video and sends it to the Android mobile.


Abstract Watermarking is the process that embeds data called a watermark, a tag, or a label into a multimedia object, such as images, video, or text, for copyright protection. According to human perception, digital watermarks can be either visible or invisible. A visible watermark is a secondary translucent image overlaid on the primary image, which appears visible to a viewer on careful inspection. An invisible watermark is embedded in such a way that the modifications made to the pixel values are perceptually not noticed, and it can be recovered only with an appropriate decoding mechanism. This paper presents a new very large scale integration (VLSI) architecture for implementing two visible digital image-watermarking schemes. The proposed architecture is designed for easy integration into any existing digital camera framework.
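The visible-watermarking operation can be illustrated with a minimal pixel-domain sketch; the per-pixel alpha blend is a common formulation, and the paper's two VLSI schemes may weight pixels differently.

```python
def visible_watermark(host, mark, alpha=0.3):
    """Overlay a translucent watermark on a grayscale host image:
    out = (1 - alpha) * host + alpha * mark, applied per pixel."""
    return [[(1 - alpha) * h + alpha * m for h, m in zip(host_row, mark_row)]
            for host_row, mark_row in zip(host, mark)]
```

A hardware datapath for this is essentially two multipliers and an adder per pixel, which is why the scheme maps well to a camera-pipeline VLSI block.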

T.SIVARAM, DR.A.KAVITHAMANI, DR.V.MANIKANDAN Coimbatore Institute of Technology


Abstract A conventional controller cannot maintain the desired liquid level of a two-tank system under load disturbances and unpredictable environmental conditions. Maintaining the desired level is necessary, as otherwise the tank will overflow, leading to catastrophic losses. This paper compares conventional controllers such as feedback control, feedforward control, and Internal Model Control (IMC) with a model predictive controller. The performance analysis is done using integral error criteria. The controllers are simulated and evaluated using MATLAB/SIMULINK.

REFINING ANOMALISTIC MALICIOUS ATTACKS IN SOVEREIGN SETUP STEPHEN Abstract The Internet is a global system of interconnected computer networks that use the standard Internet protocol suite to serve billions of users worldwide. Recently the Internet has been plagued by malicious activities ranging from spam and phishing to malware, denial-of-service (DoS), man-in-the-middle, and eavesdropping attacks. Much of this activity thrives on armies of compromised hosts, or botnets, which are scattered throughout the Internet. Malicious activity is not necessarily evenly distributed across the Internet: some networks may employ lax security, resulting in large populations of compromised machines, while others may tightly secure their network and have no malicious activity. The proposed scheme therefore concentrates on three frequently occurring malicious attacks, namely botnet, phishing, and DDoS attacks, by detecting them and providing countermeasures. The evaluated results of the proposed methodologies are compared with existing works to prove their efficiency.

A DYNAMIC LOAD BALANCING SCHEME FOR ENERGY EFFICIENT RESOURCE UTILIZATION IN CLOUD COMPUTING

Abstract Cloud computing is an internet-based use of computer applications. It is mainly used to share hardware and software resources over the network rather than on a single remote server. Load balancing techniques are used to distribute the workload across the individual nodes of the system to improve both resource utilization and job response time, while also avoiding situations where some nodes are always busy while others are idle or doing lower-priority work. The load balancing problem can be solved using the load strategy method and by creating multiple instances. Using an optimum scheduling algorithm, tasks are scheduled based on deadline and cost constraints, which gives the minimum turnaround time of the system. The load strategy method uses dynamic scheduling to balance the load and also reduce the execution time. The concept of green computing is also considered to reduce energy consumption. Task consolidation is a method used to increase resource utilization and reduce energy consumption; it can free up resources that would otherwise sit idling yet still draw power.

ANALYSIS OF ION CHANNELS AND BIO-MOLECULES USING NANO-ELECTRONIC MOSFETS T.S.Guruprasath V. Nishanth Jain MNM Jain Engineering College. Abstract This paper considers the use of nano-scale double-gated MOSFETs which, when interfaced with human skin, can help us analyze ion channels and their functions in the working of various bio-molecules and cells. This can be applied in fields such as anti-venom research; the effects of paralysis and epilepsy can be studied in detail and possibly used to cure certain medical cases rather than resorting to measures such as amputation.

AN IMPLEMENTATION OF SEMI-SUPERVISED LEARNING FROM MICROARRAY SAMPLE CLASSIFICATION Mr.A.P. Gopu, K.S.MANOJEE, Selvam College of Technology, Abstract The recent advancement and wide use of high-throughput technology are producing an explosion in using gene expression phenotypes for identification and classification in a variety of diagnostic areas. An important application of gene expression data in functional genomics is to classify samples according to their gene expression profiles. In most gene expression data, the number of training samples is very small compared to the large number of genes involved in the experiments. However, among the large number of genes, only a small fraction is effective for performing a certain task. Furthermore, a small subset of genes is desirable in developing gene-expression-based diagnostic tools for delivering precise, reliable, and interpretable results. In this project, a new supervised attribute clustering algorithm is proposed to find co-regulated clusters of genes whose collective expression is strongly associated with the sample categories or class labels. A new

quantitative measure, based on mutual information, is introduced to compute the similarity between attributes. The proposed supervised attribute clustering method uses this measure to reduce the redundancy among genes. A single gene from each cluster having the highest gene-class relevance value is first selected as the initial representative of that cluster. The representative of each cluster is then modified by averaging the initial representative with other genes of that cluster whose collective expression is strongly associated with the sample categories. Finally, the modified representative of each cluster is selected to constitute the resulting reduced feature set.
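The mutual-information similarity between two discrete attributes can be computed as in this sketch; the paper's measure may be a normalized variant, but the basic quantity is the following.

```python
import math
from collections import Counter

def mutual_information(x, y):
    """I(X;Y) in bits for two equal-length discrete attribute vectors,
    estimated from their empirical joint distribution."""
    n = len(x)
    px, py, pxy = Counter(x), Counter(y), Counter(zip(x, y))
    return sum((c / n) * math.log2((c / n) / ((px[a] / n) * (py[b] / n)))
               for (a, b), c in pxy.items())
```

Two genes whose discretized expression profiles carry the same information about each other score high and would land in the same cluster, which is how the redundancy reduction works.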



Abstract An automatic control system for a laboratory sterilization process is designed using Arduino. Arduino is an open hardware platform that allows fast prototype development. The system is composed of four modules: a gas control module, a CPU module, a flame detection module, and a pressure sensor module. A servomotor is attached to a gas valve. Four control positions are performed by the module; the two main positions are used to open or close the valve, which opens at 90° and closes at 0°. The Servo library allows an Arduino board to control RC (hobby) servo motors, which have integrated gears and a shaft that can be precisely controlled. The DS18B20 digital thermometer provides 9-bit to 12-bit Celsius temperature measurements and communicates over a 1-Wire bus that by definition requires only one data line (and ground) for communication with a central microprocessor. It detects a temperature rise after the Arduino sends a signal to turn on the gas; otherwise the gas is turned off. The pressure sensing module was designed to obtain the internal pressure of the system. In the final product, the pressure sensor has no output signal; only a numeric value is displayed on the LCD screen.

DISTRIBUTED SECURE DATA FORWARDING IN CLOUD STORAGE SYSTEM Amritha.S Mr.S.Saravana Kumar., Srinivasan Engineering College, Abstract

A cloud storage system is used to store large amounts of data on storage servers. A cloud system provides a large number of storage servers, which offer long-term storage service over the Internet. A third-party cloud system does not provide data confidentiality, and constructing a centralized storage system for the cloud makes it easy for hackers to steal data. General encryption schemes protect data confidentiality. In the proposed system, a secure distributed storage system is formulated by integrating a threshold proxy re-encryption scheme with a decentralized erasure code. The distributed storage system not only supports secure and robust data storage and retrieval, but also lets a user forward data from one user to another without retrieving the data back. The main technical contribution is that the proxy re-encryption scheme supports encoding operations over encrypted messages as well as forwarding operations over encoded and encrypted messages. The method fully integrates encrypting, encoding, and forwarding. The proposed system can be applied to military and hospital applications, and to other secret data transmission. COMMUNITY ANOMALY DETECTION SCHEME IN COLLABORATIVE INFORMATION SYSTEMS Anitha.S, Mr.P.Krishna Kumar Pet Engineering College Abstract Collaborative Information Systems (CISs) are deployed within a diverse array of environments that manage sensitive information. Current security mechanisms detect insider threats, but they are not suited to monitor systems in which users function in dynamic teams. This system uses the Community Anomaly Detection System (CADS), which utilizes a relational framework to detect insider threats based on the access logs of collaborative information. It is based on the observation that typical CIS users tend to form community structures based on the subjects accessed. Anomaly prediction can then be used to detect deviation from expected behavior. It also extends CADS into MetaCADS to incorporate the semantics of the subjects accessed by the users.


While many protocols for sensor network security provide confidentiality for the content of messages, contextual information usually remains exposed. Such contextual information can be exploited by an adversary to derive sensitive information such as the locations of monitored objects and data sinks in the field. Attacks on these components can significantly undermine any network application. Existing techniques defend against the leakage of location information from a limited adversary who can only observe network traffic in a small region. However, a stronger adversary, the global eavesdropper, is realistic and can defeat these existing techniques. This paper first formalizes the location privacy issues in sensor networks under this strong adversary model and computes a lower bound on the communication overhead needed for achieving a given level of location privacy. The paper then proposes two techniques to provide location privacy to monitored objects (source-location privacy), periodic collection and source simulation, and two techniques to provide location privacy to data sinks (sink-location privacy), sink simulation and backbone flooding. These techniques provide trade-offs between privacy, communication cost, and latency. Through analysis and simulation, we demonstrate that the proposed techniques are efficient and effective for source and sink-location privacy in sensor networks.

OPTIMAL CSI FEEDBACK WITH WATER FILLING PRECODER IN A MIMO SYSTEM Duffy Asenath.S Francis Xavier Engineering College Abstract In this paper we study the relation between the ergodic capacity and the feedback interval for a Multiple Input Multiple Output (MIMO) system. Based on this relation, an optimal feedback interval is derived for the Rayleigh fading channel. The minimum differential feedback rate is also determined for this system, considering the channel estimation error and channel quantization distortion.


Eby Sam Stewart.L, Mr V.S.Selvakumar, Dr.L.Sujatha Abstract


Frequency reconfigurable antennas have been widely used in most of the latest communication systems. Due to their excellent out-of-band rejection property, they are also used in satellite communications. The most important component in a reconfigurable antenna is the RF switch that provides switching between different antenna patterns. In a typical application like satellite communication, the switch has to operate at high frequencies, at very low temperatures, and under exposure to cosmic rays. The performance of conventional switches is affected by the bias lines and the voltage provided to them. An RF switch with low actuation voltage and low loss is proposed to improve the performance of the reconfigurable antenna. Traditional PIN diode and FET switches exhibit high loss at high frequencies; these devices also consume a large amount of power and are affected by temperature variations.

ENHANCE THE SECURITY TO MITIGATE ONLINE PASSWORD GUESSING L.Janani S.Ilavarasan Saveetha Engineering College Abstract Security has become an important aspect with respect to passwords. Several techniques are already available to provide a secure password. A new Password Guessing Resistant Protocol (PGRP) is introduced to defend against brute force and dictionary attacks. The system to be developed prevents password guessing attacks between a server and a number of clients. Brute force and dictionary attacks on remote login services are widespread and ever increasing. Automated Turing Tests continue to be very effective and easy to deploy. We propose PGRP, a protocol that is easy to deploy and scalable, requiring minimal computational resources in terms of memory, processing time, and disk space. Online password guessing attacks are inevitable and commonly observed against web applications. PGRP limits the total number of login attempts from unknown remote hosts at a reasonable cost of inconvenience to the user. We analyze the performance of PGRP with two real-world datasets and find it more promising than existing proposals.
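The login-throttling idea can be sketched roughly as follows; the thresholds and the known-host rule here are simplified assumptions, not PGRP's exact decision function.

```python
from collections import defaultdict

class LoginGuard:
    """Rough sketch of PGRP's idea: hosts with a past successful login get
    a few free attempts; unknown hosts must solve an ATT (captcha) after
    a single failure."""
    def __init__(self, free_attempts=3):
        self.free_attempts = free_attempts
        self.failures = defaultdict(int)   # failed attempts per host
        self.known_hosts = set()           # hosts with a past successful login

    def requires_att(self, host):
        if host in self.known_hosts:
            return self.failures[host] >= self.free_attempts
        return self.failures[host] >= 1    # strict with unknown hosts

    def record(self, host, success):
        if success:
            self.known_hosts.add(host)
            self.failures[host] = 0
        else:
            self.failures[host] += 1
```

This split keeps the inconvenience low for legitimate users on familiar machines while forcing an automated guesser to solve an ATT on nearly every attempt.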

FREQUENT ITEMSETS MINING ON LARGE UNCERTAIN DATABASES: USING RULE MINING ALGORITHM Jency Varghese K.Soundararajan Vivekanandha College of Engineering for Women Abstract In recent years, due to the wide applications of uncertain data, mining frequent itemsets over uncertain databases has attracted much attention. In uncertain databases, the support of an itemset is a random variable instead of a fixed occurrence count. In sensor monitoring systems and data integration applications, the data manipulated is highly uncertain. The important issue is extracting frequent itemsets from a large uncertain database, interpreted under the Possible World Semantics. An uncertain database contains an exponential number of possible worlds; we observe that the mining process can be modeled as a Poisson binomial distribution. Mining frequent itemsets from a large uncertain database under possible world semantics is a crucial challenge, and an approximation algorithm is established to discover frequent itemsets from such databases efficiently. We propose incremental mining algorithms, which enable probabilistic frequent itemset results to be refreshed, and examine the support for incremental mining and discovery of frequent itemsets. Both tuple and attribute uncertainty are supported. An incremental mining algorithm is proposed to maintain the mining result.
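The Poisson-binomial view of support can be made concrete with a small exact dynamic program; this is a sketch assuming independent per-tuple probabilities, and it is exactly the expensive computation that the paper's approximate algorithm avoids for large databases.

```python
def support_pmf(probabilities):
    """Exact Poisson-binomial PMF of an itemset's support: dp[k] is the
    probability that exactly k of the uncertain tuples contain the itemset."""
    dp = [1.0]
    for p in probabilities:
        new = [0.0] * (len(dp) + 1)
        for k, q in enumerate(dp):
            new[k] += q * (1 - p)      # tuple absent in this possible world
            new[k + 1] += q * p        # tuple present
        dp = new
    return dp

def is_probabilistic_frequent(probabilities, minsup, tau=0.9):
    """An itemset is probabilistically frequent iff P(support >= minsup) >= tau."""
    pmf = support_pmf(probabilities)
    return sum(pmf[minsup:]) >= tau
```

The DP is O(n²) in the number of tuples, which is why approximating the tail probability matters at scale.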

ENERGY EFFICIENT IN IMPROVED EDF FOR MULTI-CORE SYSTEM X.Jude Roy Jeyaseelan., M.Poongothai, Coimbatore Institute of Technology, Coimbatore Abstract Embedded systems have been widely used in portable devices. To meet real-time application demands, the computing capability of an embedded system should be high, so it is very important in embedded system design to minimize energy consumption while still meeting those demands. Dynamic voltage scaling technology enables effective reduction of energy consumption by utilizing slack time to modify the operating voltage and frequency of the processor. Multi-core systems provide better throughput than a single-core processor working at the same clock frequency. The proposed IEDF-DVS (Improved EDF with Dynamic Voltage Scaling) scheduling algorithm can effectively reduce energy consumption in a multi-core environment while ensuring that all tasks meet their deadlines.
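The two ingredients, EDF task selection and utilization-driven frequency scaling, can be sketched separately; this is a single-core simplification with assumed task fields, and IEDF-DVS's multi-core slack reclamation is not modeled here.

```python
def edf_pick(tasks, now):
    """EDF: among released, unfinished tasks pick the earliest deadline.
    Each task: {"release": t, "remaining": work, "deadline": d}."""
    ready = [t for t in tasks if t["release"] <= now and t["remaining"] > 0]
    return min(ready, key=lambda t: t["deadline"]) if ready else None

def dvs_frequency(tasks, f_levels=(0.25, 0.5, 0.75, 1.0)):
    """Pick the lowest normalized frequency whose capacity covers the
    periodic utilization sum(C_i / T_i).
    Each task: {"wcet": C_i, "period": T_i}."""
    u = sum(t["wcet"] / t["period"] for t in tasks)
    return next((f for f in f_levels if u <= f), 1.0)
```

Running at the lowest frequency that still covers the utilization is where the energy saving comes from, since dynamic power drops sharply with voltage and frequency.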

DEEP PACKET INSPECTION WITH BIT-REDUCED DFA FOR CLOUD SYSTEM S. Aravindh U. MuthuPandi@Vignesh Chandy College of Engineering Abstract With the development of cloud computing, its security issues have received more and more attention. There is a great demand for examining the content of data or packets in order to improve cloud security. In this paper, we propose a new pattern-matching algorithm for cloud systems: it first performs inexact matching to filter out non-attack information and then performs exact matching to obtain the final attack information. A preliminary evaluation shows that our algorithm, named Bit-Reduced DFA, is feasible. EFFICIENT RFID STREAM ON STATIC AND DYNAMIC READER USING DATA INFERENCE AND DATA COMPRESSION

B. Mahendiran N. Velmurugan Saveetha Engineering College Abstract RFID technology is used to identify multiple objects simultaneously in many areas such as healthcare, pharmaceuticals, and supply chains. An RFID warehouse presents certain challenges, including incomplete data, lack of location and containment information, and high volumes of data. Because of collisions, some tags cannot be identified, and it is difficult to determine the location of tags as well as the containment relationships between them. Redundant data can lead to excessive space consumption and difficult query processing in the warehouse. Readers are of two types: fixed readers and mobile readers. In this paper, data inference and compression techniques are used to identify tags and the containment relationships among them, reduce redundant data efficiently, and ease query processing in a distributed environment.


Abstract In an attribute-based encryption (ABE) scheme, each user is identified by a set of attributes, and some function of those attributes is used to determine the decryption ability for each ciphertext. Our proposed system differs from the multi-authority ABE scheme in that each authority can work independently, without any cooperation or a central authority. A global identifier (GID) is used to tie all of a user's secret keys together, while corrupted authorities cannot pool a user's attributes by tracing it, and data can be re-encrypted upon user revocation using a key revocation algorithm. FPGA IMPLEMENTATION OF ROBOTIC ARM HAVING THREE LINK MANIPULATOR A.K.Nitin Subramonium D.Nanda Gokul N.Vijaya Kumar Mr G.Saravanan

KPR Institute of Engineering and Technology Abstract Small robots can benefit important applications such as moving objects and beverages, placing chips on boards, welding, and many other industrial tasks. A reconfigurable technique based on a Field Programmable Gate Array (FPGA) is used to implement the robotic arm movement, with the potential for greater functionality, higher performance, and lower power dissipation. The FPGA controller generates the direction and the number of pulses required to rotate through a given angle. Pulses are sent as a square wave generated using Pulse Width Modulation (PWM); the number of pulses determines the angle of rotation, and the frequency of the square wave determines the speed of rotation. The proposed control scheme has been realized on a XILINX SPARTAN-3E FPGA. Real-time operation is demonstrated with DC servo motors attached for testing the motion of the robot arm.
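The pulse-count and frequency relations described above can be illustrated as follows; the per-pulse step angle is an assumed constant, and the actual value depends on the motor and gearing.

```python
def pulses_for_angle(angle_deg, step_deg=1.8):
    """Number of pulses to rotate a given angle when each pulse advances
    the joint by step_deg (1.8 degrees is a common stepper value; the
    paper's DC servo gearing would set its own constant)."""
    return round(angle_deg / step_deg)

def pulse_frequency(speed_deg_per_s, step_deg=1.8):
    """Square-wave frequency (Hz) needed for a target angular speed."""
    return speed_deg_per_s / step_deg
```

The controller thus only needs a direction bit, a pulse counter for the angle, and a programmable divider for the speed.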

PREDICTING EARTHQUAKES THROUGH DATA MINING Ramya.A.P Rizvana.M Anna University of Technology Abstract Data mining consists of an evolving set of techniques that can be used to extract valuable information and knowledge from massive volumes of data. Data mining research and tools have focused on commercial sector applications; only a few data mining research efforts have focused on scientific data. This paper aims at further data mining study of scientific data. It highlights data mining techniques applied to mine surface changes over time (e.g., earthquake ruptures), techniques that help researchers predict changes in the intensity of volcanoes. The paper uses predictive statistical models that can be applied to areas such as seismic activity and the spreading of fire. The basic problem in this class of systems is unobservable dynamics with respect to earthquakes, while the space-time patterns associated with the time, location, and magnitude of sudden events above a force threshold are observable. This paper highlights the recovery of observable space-time earthquake patterns from unobservable dynamics using data mining techniques, pattern recognition, and ensemble forecasting. Thus the paper gives insight into how data mining can be applied to finding the consequences of earthquakes and hence alerting the public.

SECURE COMMUNICATION BY QUANTUM PROTOCOL Ramya.A.P Rizvana.M Anna University of Technology Abstract In the communication scenario, providing secure communication without encryption and without a secret key between users is a bottleneck problem. To achieve this objective, principles of quantum mechanics can be applied: quantum mechanics replaces the mathematical encryption of conventional techniques. Enciphering and deciphering techniques are often utilized between senders and receivers to achieve enhanced secure communication, but they require mathematical computation and reduce speed. To avoid this, the quantum protocol contains an information channel whose existence is undetectable by any currently known technology. Such hidden channels could effectively provide secure communication without an encryption technique. The protection that quantum mechanics offers to keys could extend to the information transmitted during communication itself, thereby eliminating the use of key-based encryption. The email infrastructure has been chosen as the communication medium, providing email-based secure distributed information retrieval. The system is based on a store-and-forward paradigm, utilizing the public email system to facilitate search distribution and collaborative information retrieval. It ensures information security and privacy in the search and exchange of sensitive data in an open network. In this paper, a quantum protocol for secure transmission of data using qubits is presented. The quantum protocol does not require a shared secret key, and secure communication can be achieved without encryption between two parties.

OPTIMAL BIASED SYSTEM TO IMPROVE SNR Renu Roy Francis Xavier Engineering College Abstract This paper deals with the problem of improving SNR. The modulating signal is the output of optical OFDM and is bipolar in general, so it cannot be used for intensity-modulated direct-detected systems, which require a unipolar signal. Biasing followed by clipping simply transforms the bipolar signal into a unipolar one. This work does not focus on eliminating all clipping, but on providing sufficient bias so that clipping is not the dominant noise source in the system.
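The bias-then-clip transform can be sketched directly; this is a plain-list sketch, and in practice the bias would be chosen from the signal's statistics so that the clipped fraction stays small.

```python
def bias_and_clip(samples, bias):
    """Add a DC bias to a bipolar signal and clip negatives to zero,
    producing the unipolar drive needed for intensity modulation."""
    return [max(s + bias, 0.0) for s in samples]

def clipped_fraction(samples, bias):
    """Fraction of samples that the chosen bias still clips; the bias is
    'sufficient' when this is small enough that clipping noise is not
    the dominant impairment."""
    return sum(1 for s in samples if s + bias < 0) / len(samples)
```

Sweeping the bias and watching the clipped fraction against the other noise contributions is one simple way to locate the operating point the abstract describes.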

LICENSE PLATE LOCALIZATION FROM LOCAL IMAGES Abstract Automatic license plate recognition (ALPR) for vehicles is a challenging area of research due to its importance to a wide range of commercial applications. The first and most important stage of any ALPR system is the localization of the license plate within the image captured by a camera. A variety of techniques has already been reported for localization of the license plate and recognition of the license number thereafter, but most of these works seem applicable only in a very controlled environment. In the current work, we have concentrated on localizing license plate regions in true-color still snapshots captured in very realistic situations. The technique is based on a novel multi-stage approach that analyzes vertical edge gradients in contrast-stretched grayscale images, and it successfully localizes the actual license plates in 89.2% of the images.
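The two early stages, contrast stretching and vertical edge gradients, can be sketched on a grayscale image given as a 2D list; this is a simplified model of the multi-stage analysis, and the paper's gradient operator and thresholds may differ.

```python
def contrast_stretch(img, lo=0, hi=255):
    """Linearly stretch grayscale values to span [lo, hi]."""
    flat = [v for row in img for v in row]
    mn, mx = min(flat), max(flat)
    scale = (hi - lo) / (mx - mn) if mx > mn else 0
    return [[lo + (v - mn) * scale for v in row] for row in img]

def vertical_edge_map(img):
    """Absolute horizontal intensity difference: high values mark the
    dense vertical strokes typical of license-plate characters."""
    return [[abs(row[x + 1] - row[x]) for x in range(len(row) - 1)]
            for row in img]
```

Rows with many strong vertical edges close together are the candidate plate regions that the later stages would validate.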


Abstract Cloud computing is the use of computing resources (hardware and software) that are delivered as a service over the network (Internet). It entrusts remote services with users' data, software, and computation. Even though this new emerging technology has many advantages, users lose control of their own data (particularly financial and health data). This paper proposes an enhanced Decentralized Cloud Accountability Framework (DCAF) to keep track of the actual usage of users' data in the cloud. This object-centred framework uses JAR files which enclose the logging mechanism together with users' data and policies. To strengthen the users' control, DCAF is also provided with effective auditing mechanisms. The proposed methodology applies advanced obfuscation techniques to the JAR files and provides additional security to the JRE.

HUMAN ACTION RECOGNITION IN DYNAMIC BACKGROUND USING DYNAMIC PROTOTYPE TREE S.Subalin R.Femila Goldy Anand Institute of Technology Abstract Action recognition has been a popular research topic in the vision community due to its wide applicability to multimedia analysis and video surveillance. Shape and motion are the most important and useful visual cues for human action recognition. Existing systems rely on high-dimensional descriptors for modeling action frames, which for large-scale action retrieval and recognition require tremendous amounts of computation. They also mostly assume simple backgrounds or static cameras and do not explicitly consider the challenging case of dynamic backgrounds. In the proposed system, an efficient prototype-based approach is used for action recognition, performing recognition via prototype matching and look-up table indexing. Actions are modeled by learning their prototypes in a joint shape-motion space via k-means clustering, and frame-to-frame distances are rapidly estimated via fast look-up table indexing. Once an action is recognized, a message is triggered to the administrator. Finally, the recognized actions are converted into frames; frames from the Global Server are retrieved at the same instant, and grayscale conversion is applied to both sets of frames to recognize the human.
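The prototype-matching step can be sketched as nearest-prototype lookup plus a per-clip vote; the feature layout is an assumption, and the paper's look-up table indexing replaces the explicit distance computation shown here.

```python
import math
from collections import Counter

def nearest_prototype(frame_vec, prototypes):
    """Map a joint shape-motion feature vector to the index of its
    nearest action prototype (the per-frame step)."""
    return min(range(len(prototypes)),
               key=lambda i: math.dist(frame_vec, prototypes[i]))

def recognize(frames, prototypes, action_of_prototype):
    """Label a clip by majority vote over its frames' prototype labels."""
    votes = Counter(action_of_prototype[nearest_prototype(f, prototypes)]
                    for f in frames)
    return votes.most_common(1)[0][0]
```

With a small prototype set, the per-frame distance computation can be precomputed into a table, which is what makes the recognition step fast.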

DESIGN OF ENERGY OPTIMIZATION ALGORITHM FOR THE WIRELESS SENSOR NETWORKS
P.YOGAPRIYA, Dhanalakshmi Srinivasan College of Engg. and Tech
Abstract In a power-constrained wireless sensor network, power consumption is an important criterion. By reducing the overall power consumption of a network, the network lifetime can be extended. Among the transmission, receiving, idle and sleep states, the power consumed in the idle state is much higher than in the sleep state, so considerable energy can be saved by putting idle nodes to sleep. Hence we propose to reduce the overall energy consumption to a maximum extent. Initially the energy of the nodes is estimated based on their respective states. Then the idle nodes which do not take part in the transmission are put to sleep, saving an overall power consumption of 10.23. After a detailed study of node state behaviour, a neighbour table is maintained for all the nodes along with the RSSI metric. Based on the RTS/CTS concept, a neighbour node with a higher RSSI value is elected as the forwarding node, and the other nodes which receive the RTS are made to hibernate. Since the RSSI metric is based on distance, and the elected forwarding node is always at roughly the same distance, maximum power drain occurs at the forwarding nodes. To overcome this, a sequential approach based on the maximum RSSI and the maximum residual power of the neighbouring nodes is used for the selection of forwarding nodes. By this sequential approach, every individual node's power is utilized effectively for the entire network.
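The combined max-RSSI / max-residual-power selection rule in the abstract above admits a simple sketch. This is one plausible reading of the paper's sequential approach, not its actual algorithm: the neighbour-table fields, the RSSI margin and the function name are our own assumptions. The idea is to avoid always draining the single closest node by letting residual energy break near-ties in RSSI.

```python
def select_forwarder(neighbors, rssi_margin=3.0):
    """neighbors: list of dicts with 'id', 'rssi' (dBm) and 'residual' (J),
    as they might appear in the neighbour table the paper maintains.
    Among neighbours whose RSSI is within rssi_margin of the best, pick
    the one with the highest residual energy, so the same closest node
    is not elected as forwarder on every round."""
    best_rssi = max(n['rssi'] for n in neighbors)
    candidates = [n for n in neighbors if n['rssi'] >= best_rssi - rssi_margin]
    return max(candidates, key=lambda n: n['residual'])
```

With three neighbours, a node slightly farther away but with far more residual energy wins over the absolute-best-RSSI node, which is exactly the power-balancing behaviour the abstract aims for.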

VLSI DESIGN OF MIMO OFDM FOR THE FUTURE WIRELESS COMMUNICATION
G.Jaya Padmapriya Nallathambi, Jaya Engineering College
Abstract We present the design of an OFDM physical layer that follows the IEEE 802.11a standard. We then devise an efficient pipelined architecture and incorporate it into the MIMO-OFDM physical layer. In our experiments, we compare our pipelined architecture to the baseline MIMO-OFDM physical layer implementation. The baseline MIMO-OFDM system uses the same number of fast Fourier transform (FFT) blocks as antennas. The implementation efficiency of our pipelined architecture, compared with the baseline MIMO-OFDM system, is evaluated using two methods: (1) using just one FFT block, and (2) using a Radix-2 pipelined streaming FFT block versus the Radix-4 FFT block used in the baseline MIMO-OFDM system. Our experiments show that at least 30 percent of the resources in the baseline MIMO-OFDM system can be saved using our proposed architecture, while achieving the same data rate. We also show that this data rate can be doubled with approximately the same resource reduction. Moreover, by exploiting dynamic reconfiguration, our MIMO-OFDM system can adapt to various operating modes. We further propose MIMO-OFDM systems using a single fast Fourier transform block that is shared across modulations. Our experimental results show that the proposed implementation saves at least 60 percent of the hardware resources while achieving the same data rate as known baseline MIMO-OFDM implementations. We also show that as more channels are used, more resources can be saved by using our proposed architecture.

EVOLUTION OF ONTOLOGY BASED ON FREE TEXT DESCRIPTOR FOR WEB SERVICE
S. Chitra, P.Ragavendiran, V. Kajendran, M. Manish Kumar Singh, MVIT
Abstract Ontologies have become the de-facto modeling tool of choice, employed in a variety of applications and prominently in the Semantic Web. Nevertheless, ontology construction remains a daunting task. Ontological bootstrapping, which aims at automatically generating concepts and their relations in a given domain, is a promising technique for ontology construction. Bootstrapping an ontology based on a set of predefined textual sources, such as Web services, must address the problem of multiple concepts that are largely unrelated. This paper exploits the advantage that Web services usually consist of both WSDL and free text descriptors. The WSDL descriptor is evaluated using two methods, namely Term Frequency/Inverse Document Frequency (TF/IDF) and Web context generation. We propose an ontology bootstrapping process that integrates the results of both methods and validates the concepts using the free text descriptors, the free text descriptor offering the more accurate definition of ontologies. We extensively validated our ontology bootstrapping process.
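The TF/IDF evaluation named in the abstract above is a standard weighting scheme, and a minimal sketch may help; the tokenised inputs and function name here are our own illustration, not the paper's code. Each WSDL descriptor is treated as a bag of terms, and a term's weight grows with its frequency in the descriptor and shrinks with the number of descriptors that contain it.

```python
import math
from collections import Counter

def tfidf(docs):
    """docs: list of token lists (e.g., terms extracted from WSDL
    descriptors). Returns one {term: weight} dict per document, using
    tf = count/len(doc) and idf = log(N / document_frequency)."""
    n = len(docs)
    df = Counter(t for d in docs for t in set(d))  # document frequency
    out = []
    for d in docs:
        tf = Counter(d)
        out.append({t: (tf[t] / len(d)) * math.log(n / df[t]) for t in tf})
    return out
```

A term that appears in only one descriptor ("weather") scores higher than one shared by several ("book"), which is what lets TF/IDF surface concept-bearing terms.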

ANOMALY DETECTION IN MULTI-TIER WEB APPLICATIONS WITH ENHANCED KEY GENERATION IN DOUBLE GUARD ARCHITECTURE
P. Sivaranjani, A.Sandhiya Devi, R.Suganya, S.Uthayashangar, Manakula Vinayagar Institute of Technology
Abstract The services provided over the Internet keep increasing with the growing usage of web services and applications to satisfy our day-to-day requirements. Since Internet services are common to all users, the security level provided to the personal details of a single user is not satisfactory. Hence a Double Guard system provides security to both the front end (web server) and the back end (database) using the lightweight virtualization concept. Double Guard ensures security only after the creation of user-dedicated sessions and fails to provide initial-level security. Hence we propose a new idea that provides initial security to web applications using the Double Guard system by means of re-encrypted secret key generation. The results of the system are found to be successful in prominent web applications.

FAST AND ACCURATE MULTITASK SALIENCY DETECTION BY CONSIDERING THE FEATURES OF COLOUR, TEXTURE, ORIENTATION AND LUMINANCE
S.Hambert Solomon raja, PSN College of Engineering and Technology
Abstract We propose a new type of multitask saliency detection which aims at detecting the image regions that represent the scene. This definition differs from previous definitions, whose goal is either to identify fixation points or to detect the dominant object. In accordance with our saliency definition, we present a detection algorithm which is based on four principles observed in the psychological literature. The benefits of the proposed approach are evaluated in two applications where the context of the dominant objects is just as essential as the objects themselves. In image retargeting we demonstrate that using our saliency prevents distortions in the important regions. In summarization we show that our saliency helps to produce compact, appealing, and informative summaries.

CLUSTERED TECHNIQUES OVER BIOLOGICAL SEQUENCE PREDICTED USING BIMAX
S. K. Anantha Priyaa
Abstract Five biclustering algorithms (QUBIC, SAMBA, ISA, FABIA, and BIMAX) have been applied to identify biclusters in gene expression data. The GDS 1620 dataset and a pathway dataset were used to compare the five algorithms under different dimensions. Testing was performed to verify the corresponding biological significance using Gene Ontology (GO) and Protein-Protein Interaction (PPI). Performance and quality were evaluated using two scoring methods: Weighted Enrichment (WE) scoring and PPI scoring. Finally, the BIMAX algorithm was found to be the most efficient among the five biclustering algorithms. In the proposed work, viral disease infection data are clustered using the BIMAX algorithm, and the data are then biclustered in order to avoid sequence redundancy. According to the study, the clustering and biclustering of the data make it easy to analyze viral infections.

A TWIN PRECISION MULTIPLIER BASED MULTI-RESOLUTION FAST FILTER BANK FOR SPECTRUM SENSING IN MILITARY RADIO RECEIVERS
Ancy Michel.M, Oxford Engineering College
Abstract In this paper, we propose a twin-precision multiplier based multi-resolution filter bank (MRFB) for spectrum sensing in military radio receivers. The flexibility in realizing a multiple-sensing-resolution spectrum sensor is achieved by suitably designing the prototype filter and efficiently selecting the varying-resolution sub-bands without hardware re-implementation. Here, to improve the performance, we use a twin-precision technique for the fast filter bank in MR receivers. The twin-precision multiplier is able to adapt to the different requirements of the multi-resolution filter bank. By adapting to the actual multiplication bit width using the twin-precision technique, it is possible to save power, increase speed and double computational throughput. With our proposed twin-precision multiplier scheme for the fast filter bank in MR receivers, the execution time is reduced and efficiency is increased. The proposed filter bank architecture achieves a lower gate count and power reduction over conventional filter banks.


A FULLY AUTOMATIC SEGMENTATION OF LIVER AND HEPATIC TUMOR FROM 3D CT ABDOMINAL IMAGES
Angel.G, Evangeline Angel.S, Hyrun Fathima.I, Jayaraj Annapackiam C.S.I College of Engineering, Nazareth
Abstract An adaptive initialization method was developed to produce fully automatic processing frameworks based on graph-cut and gradient flow active contour algorithms. This method was applied to abdominal Computed Tomography (CT) images for segmentation of liver tissue and hepatic tumors. Twenty-five anonymized datasets were randomly collected from several radiology centres without specific requests on acquisition parameter settings or patient clinical situation as inclusion criteria. The resulting automatic segmentations of liver tissue and tumors were compared to their reference standard delineations performed manually by a specialist. The analyzed datasets presented 52 tumors: the graph-cut algorithm detected 48 tumors while the active contour algorithm detected only 44. In addition, in terms of time performance, less time was required for the graph-cut algorithm than for the active contour one. The implemented initialization method allows fully automatic segmentation, leading to superior overall performance of the graph-cut algorithm in terms of accuracy and processing time. The initialization method presented here proved suitable and reliable for two different segmentation techniques and could be further extended.

PRE-COMPUTATION ARCHITECTURE WITH T-ALGORITHM FOR VITERBI DECODER DESIGN
P.Moorthy, Anju Sasi, Vivekananda College of Engineering for Women
Abstract A popular combination in modern coding systems is the convolutional encoder and the Viterbi decoder. With a proper design, they can jointly provide an acceptable performance with feasible decoding complexity. In this paper, we propose an area-efficient architecture based on pre-computation for Viterbi decoders incorporating the T-algorithm. Through optimization at both the design level and the architecture level, the new architecture greatly shortens the long critical path introduced by the conventional T-algorithm. A general solution to derive the optimal pre-computation steps is also given in the paper. The Viterbi decoder is the dominant module in TCM decoders used in space communications. The design example, using a rate convolutional code used in the TCM system, provided in this work demonstrates more than twice improvement in clock speed with negligible computation overhead and high power reduction, while maintaining decoding performance.

INTERACTIVE IMAGE SEGMENTATION USING DYNAMIC BAYESIAN NETWORK
Anu Antony, Benesh Selva Nesan, A.Prabin, The Rajaas Engineering College
Abstract Segmenting semantically meaningful whole objects from images is a challenging problem, and it becomes especially so without higher-level common sense reasoning. In this project, we present an interactive segmentation framework that integrates image appearance and boundary constraints in a principled way to address this problem. In particular, we assume that small sets of pixels, referred to as seed pixels, are labeled as the object and background. The seed pixels are used to estimate the labels of the unlabeled pixels using Dirichlet process multiple-view learning. The boundary extraction problem is formulated as a Dynamic Bayesian network, and a novel approach to Dirichlet mixtures with state pruning is used to find the optimal boundary in a robust and efficient manner based on the extracted external and internal local costs, thus handling much more inexact user boundary annotations than existing methods.

IMPLEMENTING SCALING ONLINE SOCIAL NETWORKS USING SOCIAL PARTITIONING AND REPLICATION
Anusree V.K, M. Edwin Jayasingh, Infant Jesus College of Engineering
Abstract A distributed network is an important application of networking. A social networking service is an online service that focuses on the building of social networks. Vertical scaling adds resources to a single node in a system, involving the addition of CPUs or memory to a single computer. Such vertical scaling of existing systems also enables more effective use and provides more resources for the hosted set of operating systems. Horizontal scaling adds more nodes to a system, such as adding a new computer to a distributed software application. This model has created an increased demand for shared data storage with very high I/O performance. When a server's CPU is bound, adding more servers does not help serve more requests [1]. Due to the complex nature of OSNs, the existing partitioning techniques do not produce an optimal solution for back-end data scalability. In this paper, we propose a social partitioning and replication model for the middleware: joint partitioning and replication of the underlying community structure provides an optimal solution for back-end data scalability for online social networks and ensures that all the data is local.
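The "all data is local" guarantee described in the partitioning-and-replication abstract above can be sketched as a simple rule over a partitioned friendship graph; the function name and data shapes here are our own illustration in the spirit of partition-and-replication middleware, not the paper's system: whenever a user's friend is mastered on a different server, a replica of that friend must be placed on the user's server so every read stays local.

```python
def replicas_needed(adj, part):
    """adj: {user: set of friends}; part: {user: master server id}.
    Local-semantics rule: every user's friends must have a copy
    (master or replica) on that user's server, so reads never cross
    servers. Returns {user: set of servers needing a replica of user}."""
    reps = {}
    for u, friends in adj.items():
        for f in friends:
            if part[f] != part[u]:          # friend mastered elsewhere
                reps.setdefault(f, set()).add(part[u])
    return reps
```

For a triangle of friends split across two servers, each user ends up replicated on every server that hosts one of their friends, which is the replication overhead such middleware tries to minimise when choosing the partition.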

A STUDY ON UPCOMING TRENDS IN ELDERLY HOME MONITORING SYSTEM
N.Beaulah, AL.Kumarappan, Sri Sai Ram Engineering College
Abstract Wireless sensor networks offer added advantages when combined with the advancements in MEMS technology, and are nowadays widely used in bio-medical applications. We study a WSN-based home monitoring system for elderly activity behaviour. By regular monitoring we can determine the wellness of the elderly. The system can also be used to monitor physiological parameters, such as temperature and heart rate, of a human subject. Using MEMS sensors to detect falls and to measure different vital signs, the person is wirelessly monitored within his own house; this gives privacy to the elderly. The captured signals are wirelessly transmitted to an access point located within the patient's home. This connectivity is based on wireless data transmission at the 2.4-GHz frequency. The access point is also a small box attached to the Internet through a home asynchronous digital subscriber line router. Afterwards, the data are sent to the hospital via the Internet in real time for analysis and/or storage. The programmed system minimizes the number of false messages sent to the care provider and supports the inhabitant through suitable prompts and actions to be performed when there is irregular behaviour in the daily activity.

PERFORMANCE ASSESSMENT OF CLUSTER NODES WITH DYNAMIC PRIORITY SCHEDULING IN PRIVATE CLOUD ENVIRONMENT
Balakannan S.P, Bhavani R, Kalasalingam University
Abstract Cloud computing is an emerging technology that provides its users efficient and reliable resource-sharing mechanisms. Users of the cloud can store, retrieve and process their data and files on an on-demand basis. The use of differentiated cloud services saves users from high computational cost and reduces the burden of data storage. In order to get an effective cloud service, it is crucial to find the performance of the cloud nodes. In the existing technique, the performance of cluster nodes has been evaluated with different cluster configurations deployed in a multi-cloud environment, based on throughput measures. In our proposed work we analyze the performance of the cluster nodes in a cloud environment, evaluating each node in the cluster based on its scalability and response time. We also concentrate on the scheduling of tasks to the cluster nodes, using the Dynamic Priority Scheduling algorithm for scheduling the tasks in the cluster.

FIELD PROGRAMMABLE GATE ARRAY BASED ADAPTIVE NEUROMORPHIC CHIP FOR OLFACTORY SYSTEM
M.Sudalina Devi, Mrs. J.Nalini, PSN College of Engineering and Technology
Abstract An electronic nose system is an artificial neural network system used to detect or classify the odour of a specimen, and it finds wide application in commercial industries. To identify a new sample and then estimate its concentration, the system uses both spike-timing-dependent plasticity learning techniques and the least-squares regression principle. The first is aimed at teaching the system how to discriminate among different gases, while the second uses the least-squares regression approach to predict the concentration of each type of sample. The proposed system suggests a scalable and generic architecture. It aims at reducing the area overhead by incorporating a transposable SRAM array that shares learning circuits which grow with the number of neurons; the system is also trained for usage in the chemical industry by coupling a chemosensor array. All the component subsystems implemented on the neuromorphic chip have been successfully tested on FPGA.


L.ESTHER PONNAMMAL, V.DIVYA MEENAKSHI, R.M.K. Engineering College
Abstract The rapid development of data transfer through the Internet has made it easier to send data accurately and faster to the destination. One of the most important factors of information technology and communication has been the security of information. For security purposes the concept of steganography is used. Steganography is an art of invisible communication. In this paper we propose a new method for strengthening the security of information through a combination of signal processing, cryptography and steganography. A data hiding technique is used which generates an identifier based on chaotic mixing, providing the structure for the generation of the sorted sequence from an original sequence. A concept of toral automorphism is used, whereby a digital image subjected to iterated actions of a matrix A would first encounter complete chaos, i.e., the lattice L(stegoImage) disperses, having its points distributed irregularly, and then these points would come back to their original positions after a specific number of iterations. This is one of the secure methods of data hiding.

BIOMETRIC IDENTIFICATION OF PERIOCULAR REGIONS USING TEXTURE FEATURE
P.Fasca Gilgy Mary, P.Sunitha Kency Paul, J. Dheeba, Noorul Islam Centre For Higher Education
Abstract Biometrics is used in computer science as a form of identification and access control. Biometric identification is the process by which a person can be identified by his characteristics. Automated biometric identification can be done rapidly and uniformly, with minimal training, and provides extremely accurate and secure access to information. In this project work, periocular biometric recognition is used, which is based on the appearance of the region around the eye. Periocular recognition may be useful in applications where it is difficult to obtain a clear picture of an iris for iris biometrics or a complete picture of a face for face biometrics. Acquisition of periocular biometrics does not require high user cooperation and close capture distance, unlike other ocular biometrics. This region usually encompasses the eyelids, eyelashes, eyebrows, and the neighbouring skin area, which carries the information of a face recognition and an iris recognition system. In this work, Local Binary Patterns (LBPs) are used for feature extraction on the periocular images. LBP is a type of feature used for classification in computer vision and a powerful texture descriptor. A modified backpropagation neural network classifier is used for the classification and recognition of an authenticated individual.
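The LBP feature extraction named in the periocular abstract above is a standard operator and can be sketched in a few lines; this is the basic 8-neighbour variant as a generic illustration, not the paper's exact configuration (block partitioning, radius and neighbour count are left out).

```python
def lbp_code(img, y, x):
    """Basic 8-neighbour LBP: threshold each neighbour against the
    centre pixel and pack the comparison bits into one byte."""
    c = img[y][x]
    nb = [img[y-1][x-1], img[y-1][x], img[y-1][x+1], img[y][x+1],
          img[y+1][x+1], img[y+1][x], img[y+1][x-1], img[y][x-1]]
    code = 0
    for bit, v in enumerate(nb):
        if v >= c:
            code |= 1 << bit
    return code

def lbp_histogram(img):
    """256-bin LBP histogram over all interior pixels; histograms like
    this are the texture feature typically fed to a classifier."""
    hist = [0] * 256
    for y in range(1, len(img) - 1):
        for x in range(1, len(img[0]) - 1):
            hist[lbp_code(img, y, x)] += 1
    return hist
```

A flat patch yields code 255 (every neighbour >= centre), while a bright isolated centre yields code 0, showing how the operator encodes local texture rather than absolute intensity.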

AN EFFICIENT REFINING OF CBMIR THROUGH SUPERVISED LEARNING APPROACH
A.Glorin Brittal Rani, Sangeetha Senthilkumar, Oxford Engineering College
Abstract The CBIR technique is becoming increasingly important in the medical field in order to store, manage, and retrieve image data based on user queries. In order to reduce the computational time for SVM training, we introduce unsupervised clustering before supervised classification. Searching is done by matching image features such as texture, shape, or different combinations of them. Texture features play an important role in computer vision, image processing and pattern recognition. We introduce a novel method of using an SVM classifier followed by KNN for CBIR using texture and shape features.

PERFORMANCE ENHANCEMENT OF EZ-SOURCE INVERTER USING INDUCTION MOTOR
N.Gurusakthi, R.Sivaprasad, Sri Sai Ram Engineering College
Abstract Since the Z-Source element is less complex, it is used mainly for buck-boost energy conversion with the help of passive elements. A further advancement of the Z-Source is the Embedded EZ-Source inverter, which can produce the same gain as the Z-Source inverter. The input to the Embedded EZ-Source inverter is obtained from a solar cell, and the ripples in the output voltage of the solar cell are filtered using a Z-Filter. Pure DC is given to the three-phase inverter, followed by conversion to balanced AC. The output of the Embedded EZ-Source inverter is used to control the harmonics present in the load. The entire process is analysed with the help of MATLAB-SIMULINK.

DESIGN OF SOC WIRE BASED ON NOC ARCHITECTURE
Hemalatha.T, Daisy Rani.T, Hindustan University
Abstract The objective is to design and implement SoCWire using a NoC architecture. SoCWire using the NoC architecture has two parts: the SoCWire CODEC and the SoCWire switch. The SoCWire CODEC comprises five major parts, namely receiver FIFO, receiver, state machine, transmitter and transmitter FIFO, and connects a node or host system to a SoCWire network. The SoCWire switch is used for routing data of many CODECs from one node to many other nodes; in this work four such nodes are considered. Along with the SoCWire CODEC, a Hamming code is introduced for single-bit error detection and correction. The SoCWire switch is implemented with an 8-port crossbar switch. The design is synthesized on Xilinx ISE 9.1 using VHDL coding. SoCWire is mainly used for space applications.

DOMOTATION USING HYBRID PROTOCOLS
ABINAYA.K, BALAJI.G, JEGANATHAN.V, Angel College of Engineering and Technology
Abstract Building automation optimizes energy savings and reduces operating costs, lowering the total cost of ownership. It furthermore enhances security, protection and convenience. This paper focuses on the integration of electrical automation devices using hybrid protocols. It facilitates the cohort of new functionalities by connecting individually working electrical systems and circuits into a network. It deals with automation of the building in aspects like heating, ventilation and air conditioning (HVAC), lighting control and health monitoring. The final building automation system thus has different subsystems which are finally taken to an incorporated building supervision system. The main purpose is to provide the end user with an economical, fully centralized system in which home appliances are managed by both wired and wireless networks. The system is able to automate the building with low power expenditure and can be implemented with increased power backup; the backup power can be utilized on a priority basis. Continuance risk can be reduced because of the shared communication between the subsystems.
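The single-bit error detection and correction that the SoCWire abstract above introduces via a Hamming code can be sketched with the classic Hamming(7,4) code; this is a generic software illustration of the coding scheme, not the paper's VHDL implementation. Parity bits sit at positions 1, 2 and 4 (1-indexed), and the recomputed parity syndrome directly names the position of a single flipped bit.

```python
def hamming74_encode(d):
    """d: 4 data bits. Returns the 7-bit Hamming(7,4) codeword
    [p1, p2, d1, p3, d2, d3, d4]."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4   # covers positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4   # covers positions 2,3,6,7
    p3 = d2 ^ d3 ^ d4   # covers positions 4,5,6,7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Recompute the parities; the syndrome is the 1-indexed position
    of a single-bit error (0 means no error). Corrects it and returns
    the 4 data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3
    if syndrome:
        c[syndrome - 1] ^= 1  # flip the erroneous bit back
    return [c[2], c[4], c[5], c[6]]
```

Flipping any one of the seven codeword bits still decodes to the original data, which is exactly the single-bit correction property the CODEC relies on.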

INVISIBILITY TECHNOLOGY REPLACES MAGIC
P.David raj, T.Baskaran, V.R.S College of Engineering and Technology
Abstract This paper describes a kind of active camouflage system named Optical Camouflage. Optical Camouflage uses Retro-reflective Projection Technology, a projection-based augmented-reality system composed of a projector with a small iris and a retro-reflective screen. The object that needs to be made transparent is painted or covered with retro-reflective material. Then a projector projects the background image on it, making the masking object virtually transparent.

NEIGHBORING NODE INTRUSION DETECTION SECURITY FOR MANET ROUTING
Gopalakrishnan.T, Jahir Hussain.V, M.A.M College of Engineering
Abstract Mobile ad hoc networking (MANET) has become an exciting and important technology in recent years because of the rapid propagation of wireless devices. A MANET is prone to both insider and outsider attacks more than wired and infrastructure-based wireless networks. It is highly vulnerable to attacks due to the open medium, dynamically changing network topology, cooperative algorithms, and lack of centralized administration. Intrusion detection systems (IDS) provide inspection and observation capabilities that offer nearby security to a node and help to distinguish the specific trust level of other nodes. We propose a Neighboring Node Intrusion Detection (NNID) security routing mechanism to detect the Black Hole Attack (BHA) over the Ad hoc On-Demand Distance Vector (AODV) MANET routing protocol. In the NNID security routing mechanism, intrusion detection is performed by neighbouring nodes using the node nearest to the attacker node. By performing the NNID security routing mechanism, the security mechanism overhead is decreased.

BRAIN CONTROLLED CAR FOR DISABLED USING ARTIFICIAL INTELLIGENCE
J.Subash, K.S.Sharath kumar, SNS College of Technology
Abstract This paper considers the development of a brain-driven car, which would be of great help to physically disabled people. Since these cars rely only on what the individual is thinking, they do not require any physical movement on the part of the individual. The car integrates signals from a variety of sensors like video, weather monitor, anti-collision etc.; it also has an automatic navigation system in case of emergency. The car works on the asynchronous mechanism of artificial intelligence. It is a great advance of technology which will make the disabled abled. In the 40s and 50s, a number of researchers explored the connection between neurology, information theory, and cybernetics. Some of them built machines that used electronic networks to exhibit rudimentary intelligence, such as W. Grey Walter's turtles and the Johns Hopkins Beast. Many of these researchers gathered for meetings of the Teleological Society at Princeton and the Ratio Club in England. Most researchers hope that their work will eventually be incorporated into a machine with general intelligence (known as strong AI), combining all the skills above and exceeding human abilities at most or all of them. A few believe that anthropomorphic features like artificial consciousness or an artificial brain may be required for such a project.

DEVELOPMENT OF REPLICA FREE REPOSITORIES USING PARTICLE SWARM OPTIMIZATION ALGORITHM
Jeby K Luthiya, C. Umamaheswari, Vivekananda College of Engineering for Women
Abstract

The increasing volume of information available in digital media becomes a challenging problem for data administrators. Usually built on data gathered from different sources, data repositories such as those used by digital libraries and e-commerce brokers present records with disparate schemata and structures. The increased volume also creates redundant data in the database, so a system or method becomes essential to control the redundancy and duplication. In the proposed approach, we devise a method that makes use of the PSO (Particle Swarm Optimization) algorithm to generate the optimal similarity measure for deciding whether data is duplicate or not. The PSO algorithm generates the optimal similarity measure from the training datasets. Once the optimal similarity measure is obtained, the deduplication of the remaining datasets is done with its help.

NONINVASIVE LOAD DELEGATION MODEL IN GRID ENVIRONMENT
B.JEYANTHI, S.SUPRAKASH, Kalasalingam University
Abstract In a Grid environment, the main consideration of a scheduling policy is to reduce the response time and execution time of a job. Some creative applications might need several resources of different types, and it is common for the resource needs of grid applications to go beyond what is available in any of the sites making up a grid. To run such applications, a method known as co-allocation is needed, that is, the simultaneous or coordinated access of single applications to resources of possibly multiple types in multiple locations managed by completely different resource managers. Allocating resources across multiple clusters can reduce the execution time of a job by a reasonable amount. Such multi-cluster systems can give access to larger computational power and to a wide range of resources. In this paper, we propose a decentralized grid system model as a group of clusters. We then introduce decentralized job scheduling algorithms that perform intra-cluster and inter-cluster (grid) job scheduling. We model the grid as a group of clusters, and groups of users submit jobs to the various clusters. In centralized scheduling, the scheduler of every cluster performs the scheduling of submitted jobs. In decentralized scheduling, jobs, although submitted locally, may be migrated to a different cluster so as to reduce the processing time of the jobs.
A NOVEL CLUSTERING METHOD BASED ON HACO AND FAPSO FOR CONTINUOUS DOMAIN P.Kiruthiga, J.Mercy Geraldine, Srinivasan Engineering College Abstract

Data has to be clustered for easy mining of the required content. Data clustering is an essential technique for web applications and organizations. However, the clustering performance has to be optimized to form usable and efficient data clusters. Many optimization methods, among them the FAPSO and HACO techniques, have been suggested to improve the clustering performance of fuzzy c-means clustering. However, these traditional methods suffer from various limitations such as sensitivity to initialization, trapping into local minima and lack of prior knowledge of the optimum parameters of the kernel functions. Considering the performance of the clustering techniques, kernel methods are used in the kernelized fuzzy c-means algorithm for improving the clustering performance of the well-known fuzzy c-means algorithm. This is done by mapping the considered dataset non-linearly into a higher-dimensional space, where the dataset is more likely to be linearly separable. To overcome the drawbacks, a new clustering method based on the recently proposed hybrid ant colony optimization for continuous domains and particle swarm optimization is proposed. The proposed method is applied to a dataset extracted from the MIT-BIH arrhythmia database; four domain features are extracted for each type, and training and test sets are formed. This algorithm can be used in various applications such as web applications and classifying ECG records.

ONTOLOGY EXTRACTION FROM WIKIPEDIA
Mahalekshmi
Abstract Ontology plays an important role in knowledge management and the Semantic Web. In this paper we construct an ontology for the domain of computer science and propose an automatic updating methodology, using the computer science portion of Wikipedia as our source. The proposed approach consists of three phases. In the first phase, the wiki pages are downloaded from Wikipedia using a web robot. In the second phase, the concepts and relations are identified from the extracted wiki pages for constructing the ontology, using the proposed CRI algorithm. In the third phase, any modification in the wiki pages is identified and updated automatically.

ADAPTIVELY PIPELINED PARALLEL LINEAR PHASE MIXED DIGITAL FIR FILTER
Maragathavalli.N, Sriram Engineering College
Abstract Based on fast FIR algorithms, this brief proposes high-throughput, low-latency digital finite impulse response structures. These structures are beneficial for symmetric convolution of odd length in terms of hardware cost. The proposed parallel FIR structures exploit the intrinsic nature of symmetric coefficients as well as asynchronous pipelines, which reduce the latency and make the design area efficient. The main aim is to reduce the multipliers, which require more space; including adders instead of multipliers is advantageous because adders occupy less space. For an N-tap three-parallel FIR filter, the proposed structure can save N/3 multipliers at the cost of additional adders in the preprocessing and postprocessing blocks. The proposed parallel FIR structures can lead to significant hardware savings for symmetric convolution, which is the main process in FIR filter implementation. The proposed structure can be viewed as a filter having synchronous and asynchronous components performing the filter function.
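The multiplier saving that the FIR abstract above attributes to fast FIR algorithms can be illustrated with the two-parallel case; this is a behavioural sketch in Python, not the paper's hardware structure, and it assumes even-length coefficient and input vectors. Splitting the filter and the input into even/odd polyphase components needs only three half-length sub-filter products instead of four (Karatsuba-style), which is where the hardware saving comes from (N/3 multipliers in the paper's three-parallel case).

```python
def conv(a, b):
    """Direct linear convolution (the reference FIR computation)."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

def fast_fir_2par(h, x):
    """2-parallel fast-FIR filtering: split h and x into even/odd
    phases; compute h0*x0, h1*x1, and the cross term via
    (h0+h1)*(x0+x1) - h0*x0 - h1*x1, i.e. 3 sub-filters instead of 4.
    Assumes len(h) and len(x) are even."""
    h0, h1 = h[0::2], h[1::2]
    x0, x1 = x[0::2], x[1::2]
    a = conv(h0, x0)
    b = conv(h1, x1)
    m = conv([p + q for p, q in zip(h0, h1)],
             [p + q for p, q in zip(x0, x1)])
    c = [mi - ai - bi for mi, ai, bi in zip(m, a, b)]  # h0*x1 + h1*x0
    out = [0] * (len(h) + len(x) - 1)
    for i, v in enumerate(a):
        out[2 * i] += v          # even-phase term
    for i, v in enumerate(b):
        out[2 * i + 2] += v      # even-phase term, delayed by z^-2
    for i, v in enumerate(c):
        out[2 * i + 1] += v      # odd-phase (cross) term
    return out
```

The decomposed version reproduces direct convolution exactly while performing roughly 3/4 of the multiplications; the hardware analogue trades those multipliers for the extra pre/post-processing adders the abstract mentions.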

POWER OPTIMIZATION IN MULTITHRESHOLD CMOS Minu Johny, P.Moorthy Vivekananda College of Engineering for Women Abstract Power gating (MTCMOS) has emerged as an increasingly popular technique to reduce leakage power during the standby mode while attaining high speed in the active mode. We propose a new reactivation solution which helps in controlling mode-transition noise and in achieving minimum reactivation times. A triple-phase sleep signal slew rate modulation technique (TPS) has been proposed as an efficient solution to such problems. To achieve the best leakage power saving under equi-noise constraints, a digital sleep signal modulator is presented in UMC 80 nm CMOS technology. Reactivation time, mode-transition energy consumption and leakage power consumption are all reduced, which optimizes the total power consumption in multithreshold CMOS. With this method, the new sleep signal generator consumes a total power of about 0.618 mW. The results obtained indicate that our proposed techniques can achieve a 70.7% reduction in total power.


R.Selva Suganthi, Dr. Sivanthi Aditanar College of Engineering, Tiruchendur Abstract: Mobile computing is beginning to break the chains that tie us to our desks, but many of today's mobile devices can still be a bit awkward to carry around. In the next age of computing, there will be an explosion of computer parts across our bodies, rather than across our desktops. Basically, jewellery adorns the body and has very little practical purpose. The combination of microcomputer devices and increasing computer power has allowed several companies to begin producing fashion jewellery with embedded intelligence, i.e., digital jewellery. Digital jewellery can best be defined as wireless, wearable computers. Even the devices we use are protected by passwords, and it can be frustrating trying to keep up with all of the passwords and keys needed to access any door or computer program. This paper discusses a new Java-based computerized ring that will automatically unlock doors and log on to computers that allow us to communicate by way of e-mail, voicemail and voice communication, and explains how various computerized jewellery (ear-rings, necklaces, rings, bracelets, etc.) will work with mobile embedded intelligence. INTELLIGENT CAR SYSTEM FOR ACCIDENT PREVENTION USING ARM-7 R.Muthu Lakshmi, K.Anitha, Dr. Sivanthi Aditanar College of Engineering Abstract This project is about making cars more intelligent and interactive: they may notify or restrain the user under unacceptable conditions, and they may provide critical information on real-time situations to rescue services, the police, or the owner himself. Driver fatigue resulting from sleep deprivation or sleep disorders is an important factor in the increasing number of accidents on today's roads. In this paper, we describe a real-time online safety prototype that controls the vehicle speed under driver fatigue. The purpose of such a model is to advance a system that detects fatigue symptoms in drivers and controls the speed of the vehicle to avoid accidents. The main components of the system are a number of real-time sensors (gas, eye-blink, alcohol, fuel and impact sensors) and a software interface with GPS and Google Maps APIs for location.



In the first instance this paper aims to provide an overview of the state of the art in activity recognition in smart homes. Smart homes are augmented residential environments equipped with sensors, actuators and devices. Earlier methods used purely data-driven approaches on the sensor data; this paper introduces a knowledge-driven approach to real-time, continuous activity recognition and describes the underlying ontology-based recognition process. We analyze the characteristics of smart homes and Activities of Daily Living (ADL), upon which we build both context and ADL ontologies. For the recognition process we concern ourselves with one type of stochastic signal model, the hidden Markov model. CROP CLASSIFICATION USING MODIS IMAGERY G. ARULSELVI, N.NANDHINI Annamalai University Abstract This paper proposes a remote sensing-based methodology to map the spatio-temporal evolution of the frontiers and the evolution stages of the agricultural frontier in Tamil Nadu. This active agricultural frontier is moving towards the Cuddalore, Perambalur and Nagapattinam areas of the State of Tamil Nadu, where expansion of crops like sugarcane, cashew nut and casuarina has been considered a main driver of deforestation for more than 30 years. Here, deforestation and classification maps (computed on MODIS EVI time series) are used. Geographical concepts of this agricultural frontier assume that its progress is carried out through five stages corresponding to the evolution of three frontiers (the deforestation, economic and intensification frontiers). The final maps highlight the fact that few areas have reached the final intensive stage of the agricultural frontier.
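The hidden Markov model mentioned in the activity-recognition abstract above can be sketched with the standard forward algorithm, which scores how likely a sensor sequence is under the model. The states, sensor symbols and probabilities below are invented for illustration and are not taken from the paper.

```python
import numpy as np

# Hypothetical two-activity model over smart-home sensor events.
states = ["cooking", "sleeping"]
obs_symbols = ["stove_on", "bed_pressure", "no_event"]

pi = np.array([0.5, 0.5])            # initial activity distribution
A = np.array([[0.8, 0.2],            # activity transition probabilities
              [0.1, 0.9]])
B = np.array([[0.7, 0.05, 0.25],     # P(sensor event | activity)
              [0.02, 0.7, 0.28]])

def forward(obs):
    """Forward algorithm: P(observation sequence | model)."""
    alpha = pi * B[:, obs[0]]        # initialise with the first observation
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # propagate, then absorb next observation
    return alpha.sum()

seq = [0, 0, 2]                      # stove_on, stove_on, no_event
p = forward(seq)
```

Running `forward` over candidate activity models and picking the most probable one is the basic recognition step such a system performs.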

DATA MINING OF SOCIAL MEDIA SPECIFIC STRINGS FOR RAPID FORENSIC INVESTIGATION N.Nivaedita, G.Rajeswari, V.Sowmiya Kalaimathi Abstract Instant messengers have become an important means of communication. Millions of people, regardless of age, nationality, gender and computer skills, spend a lot of time using them every day. Thus, social networks have already taken the place of the traditional messaging systems of the past. More and more communications are migrating from public chat rooms and private messengers into online Social Networking Sites (SNSs). As cybercrimes have mushroomed in recent years, more and more digital crime investigations have strong relations to these SNSs. Communications extracted from social networking sites can be extremely valuable and useful to all kinds of investigators, including forensic investigators. We put the spotlight on distinct strings specific to each SNS, and on volatile memory analysis and display of case details, as keys to a successful digital investigation through rapid and improved retrieval performance. BRAIN TUMOR DETECTION AND IDENTIFICATION USING IMAGE PROCESSING AND SOFT COMPUTING G.Athilakshmi Vinothini, P.Nivetha, A.Sahaya Suji Abstract In this paper, modified image segmentation techniques are applied to MRI scan images in order to detect brain tumors. For the segmentation step, we analyzed existing methods and propose a better algorithm for the detection of brain tumors. For the subsequent identification step, we use a probabilistic neural network classifier, which classifies tumor tissue from normal tissue.

CIPHERTEXT-POLICY ATTRIBUTE SET BASED ENCRYPTION FOR SECURE CLOUD COMPUTING Priyanga P.T and Anand T, Madha Engineering College Abstract Cloud computing is the delivery of computing and storage capacity as a service to a community of end recipients. The cloud makes it possible for you to access your information from anywhere at any time, and removes the need for you to be in the same physical location as the hardware that stores your data. To keep the shared data confidential against untrusted cloud service providers, a natural way is to store only encrypted data in the cloud. Several schemes have been employed to provide access control over outsourced data, but most of them suffer from inflexibility in implementing complex access control policies. To provide access control, this work uses a hierarchical attribute-set-based encryption scheme that extends ciphertext-policy attribute-set-based encryption. It consists of a hierarchical structure of system users: a cloud service provider, data owners, data consumers, a number of domain authorities, and a trusted authority. In this system neither data owners nor data consumers are always online; they come online only when necessary, while the cloud service provider, the trusted authority and the domain authorities are always online. The scheme also solves the user revocation problem by assigning multiple values to the same attribute. Users may try to access data files either within or outside the scope of their access privileges, and malicious users may collude with each other to get sensitive files beyond their privileges. Using this ciphertext-policy attribute-set-based encryption we can encrypt files while uploading, and we can also detect and recover the data in case any corruption happens on the cloud. MEDICAL IMAGE ANALYSIS USING MORPHIC CLUSTERING ALGORITHM M. Karthikeyen, R.Rajakumari, G.R.Hemalakshmi, N.B.Prakash, National Engineering College Abstract Medical image analysis is used for understanding the underlying physiological causes of a disease. In medical practice, physicians refer their critical patients and determine the treatment for a particular disease by imaging the affected part of the body. Imaging helps normal patients to pursue further treatment and abnormal patients to identify the nature of the abnormality. Our automated image analysis approach includes registration, segmentation, anatomical parameterization and modeling, tissue classification, shape analysis, and pathology detection in individuals or groups. This paper proposes a clustered morphic algorithm to identify branching points in images. This method changes the representation of an image into something that is more meaningful and easier to analyze for the object of interest.

AN AUTHENTICATED AND SECURE COMMUNICATION FOR DISTRIBUTED CLUSTERS VIA MESSAGE PASSING INTERFACE M.A.Maffina, R.S.RamPriya, Jayamatha Engineering College Abstract In a public network, as the number of interconnected clusters increases, so does the potential threat to security applications running on the clusters. To address this problem, a Message Passing Interface (MPI) is developed that preserves security services in an unsecured network. The proposed work focuses on MPI rather than other protocols because MPI is one of the most popular communication protocols for distributed clusters. Here the AES algorithm is used for encryption/decryption and an interpolation polynomial algorithm is used for key management; these are integrated into Message Passing Interface Chameleon version 2 (MPICH2) with the standard MPI interface, yielding ES-MPICH2. ES-MPICH2 is a new MPI that provides security and authentication for distributed clusters, unified on cryptographic and mathematical concepts. A major goal of ES-MPICH2 is to support a large variety of computation and communication platforms. The proposed system is based on both cryptographic and mathematical concepts, leading to an error-free message passing interface with enhanced security. PERFORMANCE ANALYSIS OF COGNITIVE RADIO WITH SVD AND WATER FILLING TECHNIQUES Remika Ngangbam, R. Anandan, Dhanalakshmi Srinivasan College of Engg. and Tech. Abstract In this paper the performance of secondary users (SUs) is analyzed. Multi-carrier systems are among the best candidates for cognitive radio (CR) networks because of their spectrum shaping and highly adaptive capabilities. Since SUs in this structure use a limited number of sub-carriers, owing to deactivation of the primary users' (PUs) bands, the total capacity of CR networks is limited. Considering different conditions to obtain the maximum total capacity of CR networks, a water-filling algorithm for the Rayleigh fading channel together with singular value decomposition is proposed. Theoretically, it is shown that the proposed algorithm can maximize the total capacity while keeping the interference caused in the PUs' bands within a tolerable range. To reduce the algorithm's complexity, a sub-optimal scheme is also proposed. The simulation results of the new algorithms are compared with previous methods and show the enhancement and efficiency of the proposed algorithms. Furthermore, the simulation results show that the proposed schemes can load more power into the CR users' band in order to achieve higher transmission capacity for a given interference threshold.
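The water-filling allocation referred to above can be sketched as follows. This minimal version maximizes the sum rate over sub-carriers under a single total power budget, solving for the water level by bisection; it omits the per-band interference caps the paper imposes for the PU bands, and the channel gains are illustrative.

```python
import numpy as np

def water_fill(gains, total_power):
    """Classic water-filling: maximise sum log2(1 + g_i * p_i) s.t. sum p_i = P.

    The optimum is p_i = max(0, mu - 1/g_i) for a water level mu chosen so
    that the powers sum to the budget; mu is found here by bisection.
    """
    inv = 1.0 / np.asarray(gains, dtype=float)   # "floor height" per sub-carrier
    lo, hi = 0.0, inv.max() + total_power        # bracket for the water level
    for _ in range(100):
        mu = 0.5 * (lo + hi)
        if np.maximum(0.0, mu - inv).sum() > total_power:
            hi = mu
        else:
            lo = mu
    return np.maximum(0.0, 0.5 * (lo + hi) - inv)

gains = np.array([2.0, 1.0, 0.25])   # illustrative sub-carrier channel gains
p = water_fill(gains, total_power=1.0)
```

Stronger sub-carriers receive more power, and very weak ones (here the third) can be left empty; a PU interference threshold could be added as a per-carrier upper cap on p_i.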

BRAIN CONTROLLED CAR FOR DISABLED USING ARTIFICIAL INTELLIGENCE R.Muthu Lakshmi, N.Dharini, Dr. Sivanthi Aditanar College of Engineering, Tiruchendur Abstract This paper considers the development of a brain-driven car, which would be of great help to physically disabled people. Since these cars rely only on what the individual is thinking, they do not require any physical movement on the part of the individual. The car integrates signals from a variety of sensors such as video, weather monitoring and anti-collision sensors. It also has an automatic navigation system for use in emergencies. The car works on the asynchronous mechanism of artificial intelligence. It is a great advance of technology that will enable the disabled. ABE BASED EXTENDABLE SHARING OF PERSONAL HEALTH RECORD IN CLOUD N.Karthikeyan, Sri SaiRam Engineering College Abstract In medical organizations, patients' personal information is maintained on cloud servers, which can be insecure. This information is generally called personal health records (PHRs). It should not be exposed to third parties, which is what happens now; to avoid this, personal health records are encrypted before being outsourced. Yet issues such as risks of privacy exposure, scalability in key management, flexible access and efficient user revocation have remained the most important challenges toward achieving fine-grained, cryptographically enforced data access control. We implement a patient-centric framework with two different domains, public domains (PUDs) and personal domains (PSDs). To provide fine-grained access control we introduce Attribute-Based Encryption (ABE). Our work is differentiated from previous work by including a multiple-data-owner scenario. It reduces the key management complexity for owners and users.

VIRTUAL REALITY - CONFUSING THE BRAIN TO REDUCE PAIN FOR PATIENTS D.Nandhini, K.Gomathi Abstract The essence of immersive virtual reality (iVR) is the illusion it gives users that they are inside the computer-generated virtual environment. This unusually strong illusion is theorized to contribute to the successful pain reduction observed in burn patients who go into VR during wound care (www.vrpain.com) and to successful VR exposure therapy for phobias and post-traumatic stress disorder (PTSD). The present study demonstrated for the first time that subjects could experience a strong illusion of presence during an fMRI despite the constraints of the fMRI magnet bore (i.e., immobilized head and loud ambient noise). REVERSE SKYLINE QUERY PROCESSING FOR UNCERTAIN DATA G.Saraswathi, Oxford Engineering College Abstract Data uncertainty is an intrinsic attribute of multidimensional datasets, so the reverse skyline query is applied to the multidimensional dataset to trace out the important points in the multidimensional data space. The reverse skyline query is very valuable for geographical information systems, urban planning and military deployment. The proposed system processes the reverse skyline query and uses the skyband method to answer the problem of the resultant query set. An energy-efficient technique drastically reduces the communication cost among the nodes and saves energy during the survey of reverse skyline queries. Finally, the reverse skyline query is applied in multiple networks to extract useful information from substantial uncertain data readings. MAPPING RESOURCE USING NETWORKED CLOUD THROUGH ILS N.Sharanya, C.Selvakumar, V.R.S College of Engineering and Technology Abstract In cloud computing the configurable resources are provided over the internet as a service. The basic goal is to create a fluid pool of virtual resources across computers, servers and data centers that enables users to access stored data and applications on an on-demand basis. Virtual Network Mapping (VNM) plays a central role in building a virtual network (VN). During this mapping process each node of the VN is assigned to a node of the physical network (PN) and each virtual link is assigned to a path or flow in the PN. To deal with the inherent complexity and scalability issues of the resource mapping problem across different administrative domains, several techniques are described in this article. AN EFFICIENT PROTECTION FOR MULTITIER WEB APPLICATION USING DOUBLE GUARD SYSTEM B.Sharmila Devi, S.P. Manikandan, S.M.K. Fomra Institute of Technology Abstract Internet services and applications have become an inextricable part of daily life, enabling communication and the management of personal information from anywhere.
To accommodate this increase in application and data complexity, web services have moved to a multitiered design wherein the web server runs the application front-end logic and data are outsourced to a database or file server. In this paper, we present DoubleGuard, an IDS that models the network behavior of user sessions across both the front-end web server and the back-end database. By monitoring both web and subsequent database requests, we are able to ferret out attacks that an independent IDS would not be able to identify. Furthermore, we quantify the limitations of any multitier IDS in terms of training sessions and functionality coverage. We implemented DoubleGuard using an Apache web server with MySQL and lightweight virtualization. We then collected and processed real-world traffic over a 15-day period of system deployment in both dynamic and static web applications. Finally, using DoubleGuard, we were able to expose a wide range of attacks with 100 percent accuracy while maintaining 0 percent false positives for static web services and 0.6 percent false positives for dynamic web services.
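The cross-tier correlation idea behind DoubleGuard can be sketched as a learned mapping from front-end requests to the back-end query templates they trigger: a database query that no front-end request in the session can explain raises an alert. The endpoints and SQL templates below are hypothetical, and the real system models whole session behavior rather than single lookups.

```python
from collections import defaultdict

class CrossTierModel:
    """Toy DoubleGuard-style model: which SQL templates may each endpoint issue?"""

    def __init__(self):
        self.mapping = defaultdict(set)

    def train(self, endpoint, sql_templates):
        # During a clean training session, record observed request -> query pairs.
        self.mapping[endpoint].update(sql_templates)

    def check(self, endpoint, sql_template):
        """True if the back-end query is consistent with the front-end request."""
        return sql_template in self.mapping.get(endpoint, set())

model = CrossTierModel()
model.train("GET /article", {"SELECT * FROM articles WHERE id=?"})
model.train("POST /login", {"SELECT hash FROM users WHERE name=?"})

# A query matching the learned mapping passes; one outside it is flagged,
# e.g. a credential lookup triggered by a plain article view.
ok = model.check("GET /article", "SELECT * FROM articles WHERE id=?")
alert = not model.check("GET /article", "SELECT hash FROM users WHERE name=?")
```

Because the check correlates the two tiers, an injected query that a web-tier-only or database-tier-only IDS would each consider plausible still stands out as an unmapped pair.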



Abstract The main objective of this paper is to provide automation for a water tank. It includes a water level indicating system and a quality monitoring system: a facility for checking the purity of the water by determining its pH range, and for purifying the water coming from the source with the help of a reverse osmosis process. The level of water is determined with a capacitive level sensor and displayed via an LCD interface. This level indication also acts as a protection mechanism, protecting the motor from dry-run conditions by turning it ON and OFF according to the upper and lower water levels in the tank. The pH content of the water is determined with a pH sensor. The efficiency and performance of water purification can be improved with the help of the reverse osmosis process; this purification system is cost-effective and more efficient. Thus this project was developed from the idea of automating the water tank, which saves time and avoids health problems through continuous quality monitoring of the water in the tank. Mud identification further enhances the advantages of this project. SELFCARE EMERGENCY DETECTION AND MEDICAL ASSISTANCE USING ANT COLONY ALGORITHM Sreema.S.Kumar, C.Jeyanthi, PSN College of Engineering & Technology Abstract


With the pervasiveness of smart phones and the advance of Wireless Body Sensor Networks (WBSNs), mobile healthcare (m-Healthcare), which extends the operation of healthcare providers into a pervasive environment for better health monitoring, has attracted considerable interest recently. However, the flourishing of m-Healthcare still faces many challenges, including medical assistance and adequate emergency assistance. In this paper, we propose an emergency check and quick medical aid provision for patients that would enhance m-Healthcare in rural areas. Using a smart phone, the Personal Health Information (PHI) of registered patients can be intensively collected. In existing systems the medical observer has to stay active the whole time while the system gathers the patient's PHI, checking whether all the biological parameters are in the normal range. In this paper an emergency is detected by monitoring the patients against both the updated and the previously stored data. A swarm-based algorithm (the ant colony algorithm) is also implemented to calculate the shortest path from the hospital to the patient whenever an emergency is detected.
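The ant colony shortest-path step can be sketched as follows: ants build routes probabilistically, biased by pheromone and by edge cost, and pheromone is evaporated and then reinforced in proportion to route quality. The road graph, edge weights (travel times) and parameters below are invented for illustration.

```python
import random

# Illustrative road network from the hospital to a patient; weights are travel times.
graph = {
    "hospital": {"a": 2.0, "b": 2.5},
    "a": {"c": 2.0, "patient": 5.0},
    "b": {"c": 1.0},
    "c": {"patient": 1.5},
    "patient": {},
}

def aco_shortest_path(graph, src, dst, n_ants=30, n_iter=40,
                      alpha=1.0, beta=2.0, rho=0.5, seed=1):
    rng = random.Random(seed)
    tau = {(u, v): 1.0 for u in graph for v in graph[u]}   # pheromone per edge
    best_path, best_cost = None, float("inf")
    for _ in range(n_iter):
        tours = []
        for _ in range(n_ants):
            node, path, cost = src, [src], 0.0
            while node != dst:
                choices = [(v, w) for v, w in graph[node].items() if v not in path]
                if not choices:          # dead end: abandon this ant
                    path = None
                    break
                # Edge attractiveness: pheromone^alpha * (1/cost)^beta.
                weights = [tau[(node, v)] ** alpha * (1.0 / w) ** beta
                           for v, w in choices]
                v, w = rng.choices(choices, weights=weights)[0]
                path.append(v); cost += w; node = v
            if path is not None:
                tours.append((path, cost))
                if cost < best_cost:
                    best_path, best_cost = path, cost
        for key in tau:                  # evaporation
            tau[key] *= (1.0 - rho)
        for path, cost in tours:         # deposit: shorter tours reinforce more
            for u, v in zip(path, path[1:]):
                tau[(u, v)] += 1.0 / cost

    return best_path, best_cost

path, cost = aco_shortest_path(graph, "hospital", "patient")
```

On this tiny graph an exact algorithm such as Dijkstra's would of course suffice; ACO becomes interesting when edge costs change dynamically (traffic), which suits the emergency-dispatch setting described above.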

SECURITY USING KERBEROS FOR STORAGE CLOUD WITH HIGH VOLUME OF DATA FORWARDING K. Subalakshmi, Oxford Engineering College Abstract Cloud computing is a technology that uses the internet and central remote servers to maintain data and applications. A cloud storage system is a collection of storage servers, such as data storage servers and key storage servers: data storage servers maintain the data and files, while key storage servers maintain the secret keys. General encryption protects data confidentiality but restrains the functionality of the storage system. In the existing system the data or files stored in the cloud are not secure, because a third party can enter the cloud system and corrupt them; data confidentiality is therefore weak and data forwarding is not possible. To overcome this problem, Kerberos and an RMI distributor are introduced in the proposed system. Kerberos is a network authentication protocol, and data and files are securely stored in the cloud storage system. The proxy re-encryption scheme supports encrypted messages as well as forwarding operations over encoded and encrypted messages; the method fully integrates encrypting, encoding, and forwarding. GPRS DRIVING WAP ON THE ROAD TO 3G


Abstract Mobile telephony allowed us to talk on the move. The Internet turned raw data into helpful services that people found easy to use. Now these two technologies are converging to create third generation mobile services. In simple terms, third generation (3G) services combine high-speed mobile access with Internet Protocol (IP)-based services. But this doesn't just mean fast mobile connection to the world wide web. Rather, it means whole new ways to communicate, access information, conduct business, learn, and be entertained, liberated from slow, cumbersome equipment and immovable points of access. Mobile computing is being heralded as the new killer app for the Internet. While 3G hasn't arrived yet, 2.5G is here! The technologies at the forefront of the 2.5G push are GPRS (General Packet Radio Service), EDGE (Enhanced Data rates for Global Evolution), WCDMA (Wideband Code Division Multiple Access), and WAP (Wireless Application Protocol). SIGNIFICANT ANALYSIS OF TERMS IN BOOTSTRAPPING ONTOLOGY FOR WEB SERVICES Swathi Abstract Ontology construction is important for semantics-based web services. Ontological bootstrapping, which aims at automatically generating concepts and their relations in a given domain, is a promising technique for ontology construction. Bootstrapping an ontology based on a set of predefined textual sources, such as web services, must address the problem of multiple, largely unrelated concepts. In this paper the ontology bootstrapping process for web services is carried out on WSDL documents. The proposed approach uses the results of significance analysis and web context extraction for the ontology evolution. The significance analysis gives us the importance of every token extracted from the WSDL document. Based on the significance scores, the ontology constructed for web services provides the best service to its users.
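A common way to score the significance of extracted tokens is a TF-IDF-style weight: frequent within one document, rare across the collection. Whether the paper uses exactly this formula is not stated, so the scoring scheme and the toy token lists below (standing in for tokenized WSDL operations) are assumptions.

```python
import math
from collections import Counter

# Invented token lists standing in for three tokenised WSDL documents.
docs = [
    ["get", "weather", "forecast", "city"],
    ["get", "stock", "quote", "symbol"],
    ["get", "city", "population"],
]

def significance(docs):
    """TF-IDF-style significance score per token, per document."""
    n = len(docs)
    df = Counter(t for d in docs for t in set(d))   # document frequency
    scores = []
    for d in docs:
        tf = Counter(d)
        # Smoothed IDF: tokens present in every document score zero.
        scores.append({t: (tf[t] / len(d)) * math.log((1 + n) / (1 + df[t]))
                       for t in tf})
    return scores

scores = significance(docs)
```

Under this weighting a boilerplate token like "get", which appears in every document, scores zero, while domain-bearing tokens like "weather" or "quote" rank highly and become ontology concept candidates.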

ENHANCED PRIVACY ID WITH REVOCATION CAPABILITIES R.Thenmoli, S.Sathyaraj, Oxford Engineering College Abstract


Direct Anonymous Attestation (DAA) is a scheme that enables the remote authentication of a Trusted Platform Module (TPM) while preserving the user's privacy. A TPM can prove to a remote party that it is a valid TPM without revealing its identity and without linkability. In the DAA scheme, a TPM can be revoked only if the DAA private key in the hardware has been extracted and published widely enough that verifiers obtain the corrupted private key; otherwise the TPM cannot be revoked. Furthermore, a TPM cannot be revoked by the issuer if the TPM is found to be compromised after the DAA issuing has occurred. While still providing unlinkability, our scheme provides a method to revoke a TPM even if the TPM private key is unknown. Our EPID scheme is efficient and provably secure in the same security model as DAA, i.e., in the random oracle model under the strong RSA assumption and the decisional Diffie-Hellman assumption. CLOUD COMPUTING R.Uma Mageswari, G.Jeba Glorinthal Abstract Cloud computing is basically an Internet-based network made up of large numbers of servers, mostly based on open standards, modular and inexpensive. Clouds contain vast amounts of information and provide a variety of services to large numbers of people. The benefits of cloud computing include reduced data leakage, decreased evidence acquisition time, the elimination or reduction of service downtime, forensic readiness, and decreased evidence transfer time. The main factor to be discussed is the security of cloud computing, which is a risk factor involved in major computing fields. USING SMS IN MOBILE PHONE FOR HOME APPLIANCES CONTROLLING THROUGH PC PARALLEL PORT INTERFACING C.M.Vasuki, S.Pavithra, Adiyamaan College of Engineering Abstract This paper presents a system for remote control of a PC with a mobile telephone through the main PC ports, serial and parallel: the serial port for transferring data from the mobile phone to the PC, and the parallel port for interfacing the PC with real-time controlling hardware. The system is implemented using SMS (Short Message Service), as supported by all modern mobile phone devices and mobile telecommunication networks. The software for the whole system was designed and implemented with the KORAK Telecom network in Erbil City, a Nokia mobile phone device and an ordinary PC running Windows XP or compatible. The software is divided into two parts: the mobile-to-PC link through the serial port is a general commercial program associated with Nokia mobile devices, and the part that accesses the SMS file and controls all parts of the system was designed using Microsoft Visual C++ ver. 6. The idea is quite new and gives anyone who has a mobile phone and a PC the ability to remotely control major devices in his or her home, office, etc.

REFINEMENT IN COGNITIVE RADIO NETWORK FOR SPECTRUM HANDOFF Venkatesan.D, Sathishkumar.S Abstract Cognitive radio (CR) can significantly improve spectrum efficiency by allowing secondary users to temporarily access the primary users' under-utilized licensed spectrum. Spectrum mobility issues arise when a primary user appears on a channel occupied by secondary users. The secondary users must return the occupied channel because the primary users have preemptive priority to access channels. A spectrum handoff technique can help the interrupted secondary user vacate the occupied licensed channel and find a suitable target channel to resume its unfinished data transmission. A preemptive resume priority (PRP) M/G/1 queuing network model characterizes the spectrum usage behavior of connection-based multiple-channel spectrum handoffs. The proposed model derives a closed-form expression for the extended data delivery time of different proactively designed target channel sequences under various traffic arrival rates and service time distributions. The analytical method analyzes the latency performance of spectrum handoffs based on the target channel sequences specified in the IEEE 802.22 wireless regional area network standards. We also suggest a traffic-adaptive target channel selection principle for spectrum handoffs under different traffic conditions. I-DRIVE - AN INTELLIGENT DRIVING SYSTEM

Srinivasan Engineering College, SNS College of Engineering

Abstract Our project is to create an automated four-wheeler driving system that does not use GPS (Global Positioning System) for position tracking. The aim is to reduce the occurrence of road accidents and make traveling safe. It integrates various technologies and provides a platform for a wide range of applications too. To detect obstacles we use sensors; for sensing other vehicles we use a reader which reads the signals transmitted from those vehicles. We use a roadmap to travel, and the automobile's dynamics to track its own position; this is used as an alternative to GPS. All these data are then combined and plotted on a frame buffer, and we apply algorithms to generate a path from the frame buffer. Using this we can achieve safe and easy travel. iDrive allows the driver and front-seat passenger to control such amenities as the climate (air conditioning and heater), the audio system (radio and CD player), the navigation system and the communication system. Recently, iDrive has been used in BMW cars. AUTHENTICATION ON KEY MANAGEMENT FRAMEWORK WITH HYBRID MULTICASTING NETWORKS Suganthi P, Ramya K, Sree Sowdambika College of Engineering Abstract Ad-hoc networks are dynamically created and maintained by the individual nodes comprising the network. They do not require a pre-existing architecture for communication purposes and do not rely on any type of wired infrastructure; all communication occurs over a wireless medium. The design and management of ad-hoc networks is significantly more challenging than that of contemporary networks. Authenticating the multicast session is important. Several factors should be considered for authentication; the major issues are resource constraints and the wireless links. In addition to being resource-efficient and robust, the security solution must serve large groups of receivers and long multi-hop paths. The authentication must be done without much delay and should be independent of other packets. In existing work, TAM, a Tiered Authentication scheme for Multicast traffic, has been proposed for ad-hoc networks. It exploits network clustering to reduce overhead and improve scalability. Its two-tiered hierarchy combines time and secret-information asymmetry to achieve resource efficiency and scalability. In the proposed system, an asynchronous authentication scheme using shared key management is proposed to resolve the most conflicting security requirements, such as group authentication and conditional privacy. The proposed batch verification scheme, as part of the protocol, yields a significant reduction in message delay, and since we use a shared-key process the storage management requirement is very low.


Abstract Web-based collaborations and processes have become essential in today's business environments. Such processes typically span interactions between people and services across globally distributed companies. Web services and SOA are the de facto technology to implement compositions of humans and services. The increasing complexity of compositions and the distribution of people and services require adaptive and context-aware interaction models. To support complex interaction scenarios, we introduce a mixed service-oriented system composed of both Human-Provided and Software-Based Services (SBSs) interacting to perform joint activities or to solve emerging problems. However, competencies of people evolve over time, thereby requiring approaches for the automated management of actor skills, reputation, and trust. Discovering the right actor in mixed service-oriented systems is challenging due to scale and the temporary nature of collaborations. We present a novel approach addressing the need for flexible involvement of experts and knowledge workers in distributed collaborations. We argue that the automated inference of trust between members is a key factor for successful collaborations. Instead of following a security perspective on trust, we focus on dynamic trust in collaborative networks. We discuss Human-Provided Services (HPSs) and an approach for managing user preferences and network structures. HPS allows experts to offer their skills and capabilities as services that can be requested on demand. Our main contributions center around a context-sensitive trust-based algorithm called ExpertHITS, inspired by the concept of hubs and authorities in web-based environments. ExpertHITS takes trust relations and link properties in social networks into account to estimate the reputation of users.
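The hubs-and-authorities idea behind ExpertHITS can be sketched as an iterative score computation over a trust-weighted graph. The graph, trust weights, and iteration count below are illustrative assumptions, not the paper's exact algorithm.

```python
# Trust-weighted HITS-style iteration: endorsement edges carry a trust
# weight, and authority/hub scores are propagated along them.

def trust_hits(edges, n_iter=50):
    """edges: dict mapping (src, dst) -> trust weight in [0, 1]."""
    nodes = {u for e in edges for u in e}
    hub = {u: 1.0 for u in nodes}
    auth = {u: 1.0 for u in nodes}
    for _ in range(n_iter):
        # Authority score: trust-weighted sum of hub scores pointing in.
        auth = {v: sum(w * hub[u] for (u, x), w in edges.items() if x == v)
                for v in nodes}
        # Hub score: trust-weighted sum of authority scores pointed to.
        hub = {u: sum(w * auth[v] for (x, v), w in edges.items() if x == u)
               for u in nodes}
        # Normalize so scores stay bounded across iterations.
        na = sum(auth.values()) or 1.0
        nh = sum(hub.values()) or 1.0
        auth = {v: s / na for v, s in auth.items()}
        hub = {u: s / nh for u, s in hub.items()}
    return hub, auth

# Toy collaboration network: A and B both endorse expert C with high trust;
# B also endorses D, but with lower trust.
edges = {("A", "C"): 0.9, ("B", "C"): 0.8, ("B", "D"): 0.3}
hub, auth = trust_hits(edges)
assert auth["C"] > auth["D"]   # C accumulates more trust-weighted endorsements
```

The authority score here plays the role of expert reputation: a member endorsed via many high-trust links by good "hubs" ranks higher.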

ADAPTIVE MULTIPLE REGION SEGMENTATION BASED ON OUTDOOR OBJECT DETECTION Anju.J.A Mr.Jenopaul PSN College of Engineering and Technology Abstract The main research objective of this paper is to detect object boundaries in outdoor scene images based solely on some general properties of real-world objects. Here, segmentation and recognition should not be separated but treated as an interleaving procedure. In this project, an adaptive global clustering technique is developed that can capture the non-accidental structural relationships among the constituent parts of structured objects, which usually consist of multiple constituent parts. Background objects such as sky, trees, and ground are also recognized based on color and texture information. This process groups them together accordingly without depending on a priori knowledge of the specific objects. The proposed method outperformed two state-of-the-art image segmentation approaches on two challenging outdoor databases and in various outdoor natural scene environments, improving segmentation quality. The clustering technique also overcomes strong reflection and over-segmentation. The proposed work shows better performance and improved background identification capability.

IMPLEMENTING SCALING ONLINE SOCIAL NETWORKS USING SOCIAL PARTITIONING AND REPLICATION Anusree V.K M. Edwin Jayasingh Infant Jesus College of Engineering Abstract A distributed network is an important application of networking. A social networking service is an online service that focuses on the building of social networks. Vertical scaling adds resources to a single node in a system, typically adding CPUs or memory to a single computer. Such vertical scaling of existing systems also enables the hosted operating system to use more resources effectively. Horizontal scaling adds more nodes to a system, such as adding a new computer to a distributed software application. This model has created an increased demand for shared data storage with very high I/O performance. When a server's CPU is the bottleneck, adding more servers does not help serve more requests [1]. Due to the complex nature of OSNs, existing partitioning techniques do not produce an optimal solution for back-end data scalability. In this paper, a social partitioning and replication model for the middleware performs joint partitioning and replication of the underlying community structure to provide an optimal solution for back-end data scalability in online social networks and to ensure that all data is local.
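The partition-plus-replication idea above can be sketched briefly: after the social graph is split into partitions, any friend whose master copy lives on a different server is replicated locally, so every user's reads stay local. The partition assignment below is a naive illustration, not the paper's middleware.

```python
# Given a friendship graph and a user -> server assignment, compute which
# replica copies each server must host so that all friend data is local.

def replicas_needed(friends, partition):
    """friends: dict user -> set of friends; partition: dict user -> server id.
    Returns, per server, the set of users it must hold replicas of."""
    replicas = {}
    for user, flist in friends.items():
        server = partition[user]
        for f in flist:
            if partition[f] != server:
                # f's master copy lives elsewhere; replicate f on `server`.
                replicas.setdefault(server, set()).add(f)
    return replicas

# u1 and u2 share server 0; u3 lives on server 1 but is u1's friend.
friends = {"u1": {"u2", "u3"}, "u2": {"u1"}, "u3": {"u1"}}
partition = {"u1": 0, "u2": 0, "u3": 1}
result = replicas_needed(friends, partition)
assert result == {0: {"u3"}, 1: {"u1"}}
```

Minimizing the replica sets returned here, over all possible partitions, is exactly the optimization the joint partitioning-and-replication model targets.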

EFFICIENT FORWARDING PROTOCOL TO TOLERATE SELFISH BEHAVIOR IN SOCIAL MOBILE NETWORK Mr.M.Arul Sanka Prof.S.Gokul Pran Ratnavel Subramaniam College of Engineering and Technology Abstract Nodes must agree to use their own energy and bandwidth just to carry other people's messages. One fundamental and natural question, especially in this setting, is why nodes should do so. We present two forwarding protocols for mobile wireless networks of selfish individuals. We assume that all nodes are selfish and show formally that both protocols are strategy-proof, that is, no individual has an interest to deviate. Extensive simulations with real traces show that these protocols introduce an extremely small overhead in terms of delay, while the techniques introduced to enforce faithful behaviour have the positive and quite surprising side effect of improving performance by reducing the number of replicas and the storage requirements per node. We also test these protocols in the presence of a natural variation of the notion of selfishness: nodes that are selfish with outsiders and faithful with people from the same community. Even in this case, the protocols are shown to be very efficient in detecting possible misbehaviour. IMAGE PROCESSING
S.ASWINI K.SUBBAMMAL SCAD College of Engineering And Technology

Abstract Morphological image processing is an important tool in digital image processing, since it can rigorously quantify many aspects of the geometrical structure of an image in a way that agrees with human intuition and perception. Morphological image processing technology is based on geometry; it emphasizes the study of the geometric structure of an image, and relationships between the parts of an image can be found when processing it with morphological theory. Accordingly, we can comprehend the structural character of an image: in the morphological approach, an image is analyzed in terms of some predetermined geometric shape known as a structuring element. Morphological processing is capable of removing noise and clutter, as well as editing an image based on the size and shape of the objects of interest. Morphological image processing is sometimes used in place of linear image processing, because the latter can distort the underlying geometric form of an image, whereas in morphological image processing this information is not lost. In morphological image processing, the original image can be reconstructed by applying the dilation, erosion, opening, and closing operations a finite number of times. The major objective of this paper is to reconstruct this class of finite-length morphological image processing tools in a suitable mathematical structure using the Java language. The operations are applied to binary images. The approach was implemented and successfully tested in forensics: fingerprint enhancement and reduction of noise in fingerprint images.
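The four primitives named above (erosion, dilation, opening, closing) on a binary image can be sketched compactly. The paper's implementation is in Java; this is an illustrative NumPy sketch with a 3x3 square structuring element, not the authors' tool.

```python
# Binary morphology with a 3x3 square structuring element, pure NumPy.
import numpy as np

def erode(img):
    """A pixel survives only if its full 3x3 neighbourhood is foreground."""
    p = np.pad(img, 1, constant_values=0)
    out = np.ones_like(img)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out &= p[1 + dy : 1 + dy + img.shape[0],
                     1 + dx : 1 + dx + img.shape[1]]
    return out

def dilate(img):
    """A pixel is set if any pixel of its 3x3 neighbourhood is foreground."""
    p = np.pad(img, 1, constant_values=0)
    out = np.zeros_like(img)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out |= p[1 + dy : 1 + dy + img.shape[0],
                     1 + dx : 1 + dx + img.shape[1]]
    return out

def opening(img):   # erosion then dilation: removes small specks of noise
    return dilate(erode(img))

def closing(img):   # dilation then erosion: fills small holes and gaps
    return erode(dilate(img))

# A 3x3 blob plus one isolated noise pixel: opening removes the speck
# (as in fingerprint noise reduction) while preserving the blob.
img = np.zeros((7, 7), dtype=np.uint8)
img[2:5, 2:5] = 1      # 3x3 blob
img[0, 6] = 1          # isolated noise pixel
assert opening(img)[0, 6] == 0          # speck removed
assert (opening(img)[2:5, 2:5] == 1).all()   # blob preserved
```

Opening and closing are the noise-removal and gap-filling edits the abstract refers to; both are built purely from erosion and dilation.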

MATHEMATICAL MORPHOLOGY BASED FEATURE EXTRACTION FOR REMOTELY SENSED IMAGE TRAFFIC PATTERN S.Jayarani S.Nivetha P.Mohanadivya S.Pavithra S. Premkumar Narasus Sarathy Institute of Technology Abstract Feature extraction transforms the input data into a set of features; it is a special form of dimensionality reduction. The process of feature extraction using traditional methods is tedious and time consuming. In order to reduce the human effort involved in feature extraction from remotely sensed imagery, semi-automatic and automatic feature extraction algorithms have been developed. GIS databases must be frequently and accurately maintained and updated, so the remotely sensed images must be analyzed and their features extracted on a regular basis for the effectiveness of GIS databases. This project presents a new method for feature extraction from high-resolution remotely sensed images based on binary mathematical morphology operators. The proposed approach involves several advanced morphological operators, among which is an adaptive hit-or-miss transform with varying sizes and shapes of the structuring element used to extract different features. The structuring element is central to feature extraction, because its size and shape determine which types of features are extracted from a single image. In our methodology, rectangular or square shapes are used for building extraction and a line shape is used for road extraction. In earlier days, feature extraction was done by several methods. One is Principal Component Analysis (PCA), which is optimal in the mean-square sense for representation but is not appropriate for classification. The Decision Analysis Feature Extraction (DAFE) algorithm was therefore developed; it has the weakness that it is not directly related to the probability of error in classification.
On the other hand, Decision Boundary Feature Extraction (DBFE) has outperformed some approaches in terms of overall and average accuracies. Our project extracts different types of features from a single remotely sensed image. Experiments on IKONOS and WORLD VIEW satellite images show the effectiveness of the methodology. MULTIPLE KEY GENERATION USING ELLIPTIC CURVE CRYPTOGRAPHY FUSION ALGORITHM FOR BIOMETRIC SOURCE Joju John, Mr T. Rajesh,

Abstract Biometric data are stored in templates, and various techniques are used to protect them against privacy and security threats. Binary vectors derived from biometric samples underlie a large portion of template protection techniques. For the same template protection system, a large variation in key length is observed. We determine the analytical relationship between the classification performance of the fuzzy commitment scheme and the theoretical maximum key size for a Gaussian biometric source, given the number of enrolment and verification samples, the number of feature components, and the biometric source capacity. The analysis shows that the estimated maximum key size and classification performance are interdependent. Both the theoretical analysis and an experimental evaluation showed that feature interdependencies have a large impact on performance and key size estimates.

A NON-LINE-OF-SIGHT PROVISION IN VANET USING MOBILE TOWER TECHNOLOGY A.Joshua Issac S.Sathyaraj Oxford Engineering College, Trichy Abstract Over the past decade, GPS has been used in vehicles, but GPS is starting to show some undesired problems, such as not always being available or not being robust enough for some applications. For this reason, a number of other localization techniques, such as dead reckoning, cellular localization, and image/video localization, have been used in VANETs to overcome GPS limitations. In Vehicular Ad Hoc Networks (VANETs), vehicles communicate with each other, and possibly with roadside infrastructure, to provide a long list of applications varying from transit safety to driver assistance and Internet access through direct communications. Direct communication, however, affects localization services. To overcome this problem, a location verification protocol has been proposed. Dealing with obstacles is a challenge in VANETs, as moving obstacles such as trucks are part of the network and have the same characteristics as a VANET node. The protocol provides VANET position integrity through filtering, together with a collaborative protocol to verify an announced position when direct communication between the questioned node and the verifier is not possible. In addition to verifying a node's location in a multihop cooperative approach, several security measures are included to improve message integrity.

VOICE CALLS OVER WI-FI K.PRIYADHARSHINI G.LAKSHMI SRI SAI RAM ENGINEERING COLLEGE Abstract The use of Wi-Fi enabled cell phones to access the Internet away from the PC is increasing greatly. This paper discusses using Wi-Fi enabled phones as IP phones and their communication within a local wireless LAN. The proposed model is a form of telecommunication that allows data and voice transmissions to be sent across a wide range of interconnected networks. Phones which are Wi-Fi enabled and have the J2ME platform can communicate with each other through the free 2.4 GHz communication channel. Since this channel is free, its security is a concern. To overcome this, the header and payload of the data packets may be encrypted with different encryption techniques. Even so, security is a concern only within the specific network; the communication is completely safe from attacks external to this local network. Each mobile device connects to a WLAN router and identifies itself in the routing table. A user can place calls by sending packets to the router, which then tries to find the destination. The destination must also be connected to the WLAN; if not, the Wi-Fi server can tunnel the calls to the GSM network using a UNC (Unified Mobile Access Network Converter). Since the communication channel is capable of being affected by an outside influence (hacking), it is protected with complex cryptographic techniques, which engender high security. Our proposal allows free calls within the network, with high-quality voice transmission. This model will be a prototype of itinerant devices communicating in the Wi-Fi bandwidth, and will greatly reduce communication costs in large organizations.

PRIVACY PRESERVING ON DEMAND ROUTING USING USOR FOR MANET Mahesh kumar.M, Saravanan. S, Srinivasan Engineering College Abstract Mobile ad hoc networks often support sensitive applications. These applications may require that users' identity, location, and correspondents be kept secret. This is a challenge in a MANET because of the cooperative nature of the network and the broadcast nature of the communication. A number of anonymous routing schemes have been proposed for ad hoc networks to provide better support for privacy protection, but they bring significant computation overhead. However, none of these schemes offers complete unlinkability or unobservability, since data packets and control packets are still linkable and distinguishable in these schemes. USOR is efficient as it uses a novel combination of group signature and ID-based encryption for route discovery. Wormhole attacks cannot be prevented by the USOR mechanism. The proposed system aims at developing an unobservable routing scheme resistant to DoS attacks such as gray hole/black hole attacks to protect network-layer reactive protocols. It discovers malicious nodes during the route discovery process when they propagate fabricated routing information to attract the source node to send data through malformed packets. Security analysis demonstrates that USOR can well protect user privacy against internal and external attackers. The simulation results show that it achieves stronger privacy protection than existing schemes.

ENERGY CONSUMPTION IN SENSOR NETWORK USING CONTINUOUS NEIGHBOR DISCOVERY G.Senthil Kumar, A.Maria Nancy SRM University Abstract In a wireless sensor network, establishing reliable path connectivity and packet exchange takes considerable time and power. Two techniques are analysed here to reduce time and limit power consumption. The first is Continuous Neighbor Discovery, which finds neighbor nodes and continuously maintains a view of the immediate neighbourhood. The second is the Link Assessment Method, which allows a probabilistic guarantee of collision-free packet exchange. Each sensor uses a simple protocol in a coordinated effort to reduce power consumption without increasing the time required to detect hidden sensors.

SASY USER NAME AND PASSWORD CLOUD R.Monisha K.S.Viveka SCAD College of Engineering and Technology Abstract In this paper, we discuss user authentication problems and the difficulties of managing user names and passwords. In many cases the lack of standard rules for choosing user names and passwords has made it really challenging to remember login information. We propose a model and a technique for this issue and discuss the implementation and use of this service.

AN EFFICIENT DATA AUDITING IN CLOUD COMPUTING K. Nanthini T. Saravanan PSN College of Engineering and Technology Abstract Cloud storage allows users to store their data and use cloud applications without the need for local hardware and software resources. Cloud storage services pose many security risks to storage correctness. This paper presents a flexible distributed storage integrity auditing mechanism that achieves fast localization of data errors with low complexity. The proposed design allows users to audit their data and achieves dynamic data support to ensure the correctness and availability of users' data in the cloud; i.e., it efficiently supports block modification, deletion, and append.
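Block-level integrity auditing with fast error localization can be sketched minimally: the user keeps one digest per stored block and later rechecks them to pinpoint exactly which block changed. This is a simplified illustration, not the paper's full distributed auditing protocol.

```python
# Per-block digests allow an audit to localize which block was corrupted.
import hashlib

def fingerprint(blocks):
    """Compute and keep one SHA-256 digest per stored data block."""
    return [hashlib.sha256(b).hexdigest() for b in blocks]

def audit(blocks, digests):
    """Return indices of blocks whose current digest no longer matches."""
    return [i for i, b in enumerate(blocks)
            if hashlib.sha256(b).hexdigest() != digests[i]]

blocks = [b"block-0", b"block-1", b"block-2"]
digests = fingerprint(blocks)          # stored by the user at upload time
blocks[1] = b"tampered"                # server-side corruption
assert audit(blocks, digests) == [1]   # fast localization of the data error
```

Dynamic operations (modify, delete, append) amount to updating or extending the digest list alongside the block list.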

NATURE AGENT BASED ADAPTIVE ENERGY EFFICIENT MOBILITY PATTERN AWARE ALGORITHM FOR MANET K.Naresh Kumar Thapa Dr.T.Pearson DMI College of Engineering Abstract Over the last decade, research efforts have been made in MANETs (Mobile Ad Hoc Networks) to develop efficient routing based on energy consumption, time delay, and Quality of Service (QoS). But many research papers concentrate on routing with little or no security. Security and routing do not usually go hand in hand, as security features, when included with routing, add to the cost factor. In our paper we suggest, theoretically, providing routing with security as a software package, so that extra security features need not come at extra cost. Thus, in this paper, security and routing go hand in hand. We conclude that adding security to a routing protocol does not affect the time delay, the energy consumed, or other Quality of Service (QoS) measures.

RESCUE ROBOT FOR LIFE SAVING OPERATION X.Mary Ajila Xavier Scad College of Engineering and Technology Abstract Science and technology have gone in depth into all day-to-day applications like automation, biometrics, biomedical systems, and life-saving systems and equipment. We would like to develop a rescue robot for saving human lives and belongings during natural and abnormal disasters. The proposed robot is a life-saving machine and may extend the lives of people trapped in accident and collision areas. It will be designed with extreme care to withstand load, high and low temperatures, and chemical environments.

FUZZY NETWORK PROFILING FOR INTRUSION DETECTION Bharath .B Sri Manakula Vinayagar Engineering College Abstract The Fuzzy Intrusion Recognition Engine (FIRE) is an anomaly-based intrusion detection system that uses fuzzy logic to assess whether malicious activity is taking place on a network. It uses simple data mining techniques to process the network input data and helps expose metrics that are particularly significant to anomaly detection. These metrics are then evaluated as fuzzy sets. FIRE uses a fuzzy analysis engine to evaluate the fuzzy inputs and trigger alert levels for the security administrator. This paper describes the components in the FIRE architecture and explains their roles. Particular attention is given to explaining the benefits of data mining and how this can improve the meaningfulness of the fuzzy sets. Fuzzy rules are developed for some common intrusion detection scenarios. The results of tests with actual network data and actual malicious attacks are described. The FIRE IDS can detect a wide range of common attack types.
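The fuzzy-set evaluation step described above can be sketched with one metric. The metric (connection attempts per minute), the set anchors, and the rule threshold are illustrative assumptions, not FIRE's actual feature set or rule base.

```python
# A network metric is mapped into overlapping LOW/MEDIUM/HIGH fuzzy sets,
# and a simple fuzzy rule raises an alert from the HIGH membership degree.

def memberships(rate, low=20.0, high=100.0):
    """Triangular membership degrees for LOW / MEDIUM / HIGH activity."""
    if rate <= low:
        return {"low": 1.0, "medium": 0.0, "high": 0.0}
    if rate >= high:
        return {"low": 0.0, "medium": 0.0, "high": 1.0}
    t = (rate - low) / (high - low)        # position between the anchors
    return {"low": max(0.0, 1 - 2 * t),
            "medium": 1 - abs(2 * t - 1),
            "high": max(0.0, 2 * t - 1)}

def alert_level(rate):
    """Fuzzy rule: alert when the activity is HIGH to degree > 0.5."""
    m = memberships(rate)
    return "ALERT" if m["high"] > 0.5 else "OK"

assert alert_level(10) == "OK"      # normal background traffic
assert alert_level(95) == "ALERT"   # near-flood rate: mostly HIGH
```

The point of the fuzzy formulation is that the boundary between "normal" and "malicious" is gradual rather than a single hard threshold.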


MR.K.MANIVANNAN. S.PRIADARSINI PSNA COLLEGE OF ENGINEERING AND TECHNOLOGY Abstract - Enterprises usually store data in internal storage and install firewalls to protect against intruders accessing the data. They also standardize data access procedures to prevent insiders from disclosing information without permission. In cloud computing, the data are stored in storage provided by service providers. Service providers must have a viable way to protect their clients' data, especially to prevent the data from disclosure by unauthorized insiders. Storing the data in encrypted form is a common method of information privacy protection. If a cloud system is responsible for both the storage and the encryption/decryption of data, the system administrators may simultaneously obtain encrypted data and decryption keys. This allows them to access information without authorization and thus poses a risk to information privacy. This study proposes a business model for cloud computing based on the concept of separating the encryption and decryption service from the storage service. A CRM (Customer Relationship Management) service is described in this paper as an example to illustrate the proposed business model.

HIGH EFFICIENCY VIDEO CODEC USING CDSA J.Cinista Mrs.S.Buvaneswari Abstract This paper presents an efficient low-power VLSI architecture of an in-loop adaptive bilateral filter with a high-efficiency data access system supporting multiple video coding standards, including H.264 BP/MP/HP, SVC, MVC, AVS, and VC-1. Advanced standards, such as H.264 MP/HP, SVC, and MVC, adopt Macroblock Adaptive Frame-Field coding to enhance motion estimation, which otherwise results in poor performance and motion estimation at lower data rates and speeds. The adaptive bilateral loop filter with an in-loop filter is used to eliminate ringing artifacts. This design challenge has not been discussed in previous works, to the best of our knowledge. Therefore, we develop a Cross Diamond Search Algorithm to manipulate motion prediction vectors, providing a higher compression ratio.

A FLASH TRIE ARCHITECTURE LOOKUP FOR IPv6 PROTOCOL K.Nithya Abstract: It is becoming apparent that the next-generation IP route lookup architecture needs to achieve speeds of 100 Gbps and beyond while supporting both IPv4 and IPv6 with fast real-time updates to accommodate ever-growing routing tables. Some of the proposed multibit-trie based schemes, such as Tree Bitmap, have been used in today's high-end routers. However, their large data structures often require multiple external memory accesses for each route lookup. A pipelining technique is widely used to achieve high-speed lookup at the cost of using many external memory chips. Pipelining also often leads to poor memory load balancing. In this project, a new IP route lookup architecture called FlashTrie is proposed that overcomes the shortcomings of the multibit-trie based approach. It uses a hash-based membership query to limit off-chip memory accesses per lookup to one and to balance memory utilization among the memory modules. A new data structure called Prefix-Compressed Trie is developed, which reduces the size of the bitmap. FlashTrie also supports incremental real-time updates.

CHANNEL ESTIMATION ALGORITHMS USING LSE FOR OFDM-IDMA Kamatchi.M Saravana Kumar.P Shri Andal Alagar College of Engineering




Abstract This project presents a pilot-based algorithm performing LSE (Least Square Error) channel estimation at the channel equalizer for better equalization at the receiver. LSE uses block-type and comb-type pilot arrangements. In an OFDM signal, the bandwidth is divided into many narrow sub-channels which are transmitted in parallel; each sub-channel is typically chosen narrow enough to eliminate the effect of delay spread. OFDM is combined with IDMA to overcome the effect of ISI. Broadband wireless systems based on orthogonal frequency division multiplexing (OFDM) often require the IFFT/FFT to produce multiple subcarriers. Channel estimation is an essential process at the receiver side. We propose LSE to estimate faded channel signals and reconstruct a received signal that has equal offset with the transmitted signal.
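A comb-type pilot LSE estimate can be sketched in a few lines: at pilot subcarriers the least-squares solution is simply H = Y/X, and the data subcarriers are filled in by interpolation. The subcarrier count, pilot spacing, and channel model below are toy assumptions for illustration.

```python
# Toy comb-type pilot LSE channel estimation for one OFDM symbol.
import numpy as np

N = 16                                   # subcarriers
pilots = np.arange(0, N, 4)              # comb-type: every 4th subcarrier
X = np.ones(N, dtype=complex)            # known transmitted symbols (all 1s)
h_true = np.exp(-0.2 * np.arange(N)) * np.exp(1j * 0.1 * np.arange(N))
Y = h_true * X                           # noiseless received signal

# LSE at pilot positions: per-subcarrier least-squares solution H = Y / X.
H_p = Y[pilots] / X[pilots]

# Linearly interpolate real and imaginary parts across all N subcarriers.
H_est = np.interp(np.arange(N), pilots, H_p.real) \
        + 1j * np.interp(np.arange(N), pilots, H_p.imag)

# The equalizer then divides by H_est to recover transmitted symbols.
assert np.allclose(H_est[pilots], h_true[pilots])
```

In a block-type arrangement all subcarriers of a dedicated symbol carry pilots, so the division H = Y/X is done on every subcarrier and no frequency-domain interpolation is needed.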

STATELESS MULTICAST PROTOCOL FOR ADHOC NETWORKS B.Aathi Bhuvana, (PG Student)/CSE, S.Surendren, Assistant Professor/CSE, aathi_bhuvi@yahoo.co.in, surendran@gmail.com Tagore Engineering College, Chennai, India. Abstract Multicast routing protocols typically rely on the a priori creation of a multicast tree (or mesh), which requires the individual nodes to maintain state information. In dynamic networks with bursty traffic, where long periods of silence are expected between the bursts of data, this multicast state maintenance adds a large amount of communication, processing, and memory overhead for no benefit to the application. Thus, we have developed a stateless receiver-based multicast (RBMulticast) protocol that simply uses a list of the multicast members' (e.g., sinks') addresses, embedded in packet headers, to enable receivers to decide the best way to forward the multicast traffic. This protocol, called Receiver-Based Multicast, exploits the knowledge of the geographic locations of the nodes to remove the need for costly state maintenance (e.g., tree/mesh/neighbor table maintenance), making it ideally suited for multicasting in dynamic networks. RBMulticast was implemented in the OPNET simulator and tested using a sensor network implementation. Both simulation and experimental results confirm that RBMulticast provides high success rates and low delay without the burden of state maintenance.
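The receiver-based forwarding decision can be sketched as follows: the packet header carries only the member (sink) locations, and each forwarder groups members by which of its neighbours is geographically closest to them, with no tree or neighbour-table state. The coordinates and the closest-neighbour grouping rule are illustrative assumptions, not RBMulticast's exact forwarding metric.

```python
# Group multicast members by the neighbour geographically closest to each.
import math

def split_members(neighbours, members):
    """neighbours/members: dicts id -> (x, y).
    Returns neighbour id -> list of member ids routed through it."""
    groups = {}
    for m, mpos in members.items():
        best = min(neighbours,
                   key=lambda n: math.dist(neighbours[n], mpos))
        groups.setdefault(best, []).append(m)
    return groups

# A forwarder with one neighbour to the east and one to the west splits
# the member list in the packet header accordingly.
neighbours = {"n-east": (1.0, 0.0), "n-west": (-1.0, 0.0)}
members = {"sink1": (5.0, 0.0), "sink2": (-4.0, 1.0)}
assert split_members(neighbours, members) == {"n-east": ["sink1"],
                                              "n-west": ["sink2"]}
```

Each sub-list is then embedded in the header of the copy sent toward that neighbour, so the packet itself carries all the routing state.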

FRAUD DETECTION IN SOCIAL SECURITY AND SOCIAL WELFARE DATA MINING E.Gopalakrishnan (P.G.student)/CSE, P.Selvakumari Assistant Professor/CSE, gk1990@yahoo.com, Pselvichinnasamy@gmail.com, Tagore Engineering College, Chennai, India. Abstract The importance of the social security and social welfare business has been increasingly recognized in more and more countries. It impinges on a large proportion of the population and affects government service policies and people's quality of life. Typical welfare countries, such as Australia and Canada, have accumulated a huge amount of social security and social welfare data. Emerging business issues such as fraudulent outlays, and customer service and performance improvements, challenge existing policies, as well as techniques and systems including data matching and business intelligence reporting systems. The need for a deep understanding of customers and customer-government interactions through advanced data analytics has been increasingly recognized by the community at large. So far, however, no substantial work on the mining of social security and social welfare data has been reported. For the first time in data mining and machine learning, and to the best of our knowledge, this paper draws a comprehensive overall picture and summarizes the corresponding techniques and illustrations to analyze social security/welfare data, namely, social security data mining (SSDM), based on a thorough review of a large number of related references from the past half century. In particular, we introduce an SSDM framework, including business and research issues, social security/welfare services and data, as well as challenges, goals, and tasks in mining social security/welfare data.

MOTIF MINING IN SEQUENTIAL DATA SETS USING FLAME G.L. Beautlin (PG Student)/CSE, S. Murugesan, Assistant Professor/CSE, beautlin.it@gmail.com, muruga13@gmail.com,

Tagore Engineering College, Chennai, India

Abstract In many cases, existing sequential pattern mining algorithms still face tough challenges in both effectiveness and efficiency. A large class of applications, such as health care and biological DNA and protein motif mining, requires efficient mining of approximate patterns that are contiguous. Most of the existing algorithms that can be applied to find such contiguous approximate patterns have drawbacks like poor scalability, lack of guarantees in finding the pattern, and difficulty in adapting to other applications. In this paper, a novel algorithm is introduced, called FLexible and Accurate Motif DEtector (FLAME). FLAME is a fast, scalable suffix-tree-based algorithm that can be used to find frequent patterns under a variety of motif model definitions. It is accurate on both real and synthetic data, and it also addresses extended structured motif extraction, which allows mining frequent combinations of motifs under relaxed constraints.
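The problem FLAME solves, approximate contiguous pattern mining, can be illustrated naively: count every length-k substring, merging occurrences within a small Hamming distance of a seed motif. FLAME itself is suffix-tree based and far more efficient; this brute-force sketch only shows the problem being solved, with illustrative parameters.

```python
# Naive approximate contiguous motif mining over a sequence.

def hamming(a, b):
    """Number of mismatching positions between two equal-length strings."""
    return sum(x != y for x, y in zip(a, b))

def approx_motifs(seq, k, max_dist=1, min_count=2):
    """Return {seed_motif: count of k-mers within max_dist of it},
    keeping only seeds supported by at least min_count occurrences."""
    kmers = [seq[i:i + k] for i in range(len(seq) - k + 1)]
    counts = {}
    for seed in set(kmers):
        c = sum(1 for km in kmers if hamming(seed, km) <= max_dist)
        if c >= min_count:
            counts[seed] = c
    return counts

# "ACGT" occurs twice exactly and once with one mismatch ("ACTT"),
# so under Hamming distance 1 it is supported by three occurrences.
seq = "ACGTAACGTCACTT"
hits = approx_motifs(seq, k=4)
assert hits["ACGT"] == 3
```

The brute-force version is quadratic in the number of k-mers; a suffix-tree traversal like FLAME's avoids re-scanning the sequence for every candidate seed.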

AN APPROACH FOR ANOMALY BASED DETECTION AND PREVENTION OF STATE VIOLATIONS IN WEB APPLICATION VULNERABILITIES G.Sabeena Gnanaselvi, (PG Student)/CSE, DR.C.Rajabhushanam, Professor/CSE, selvi.sabi6@gmail.com, rajcheruk@gmail.com Tagore Engineering College, Chennai, India. Abstract Web services and applications have increased in both popularity and complexity over the past few years. Daily tasks, such as purchasing, banking, travel, and social networking, are all done via the web. Such services typically employ a web server front end that runs the application user interface logic, as well as a back-end server that consists of a database or file server. We present DoubleGuard, a system used to detect attacks in multitier web services. We implemented DoubleGuard using an Apache web server with MySQL and lightweight virtualization. Our approach can create normality models of isolated user sessions that include both the web front-end (HTTP) and back-end (file or SQL) network transactions, keeping the database secure under serious attacks. The SQL injection attack is one of the most common attacks. Internet services and applications have become an inextricable part of daily life, enabling communication and the management of personal information from anywhere. In this paper, we analyse the principles of SQL attacks and study a database protection system placed between the web application and the database, providing protective measures for both ordinary users and administrators. Applying different protective measures for ordinary users and administrators effectively guarantees the security of the database.
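The SQL injection principle mentioned above, and its standard countermeasure, can be shown in a few lines: a string-concatenated query lets a crafted input bypass a password check, while a parameterized query treats the same input as plain data. This uses sqlite3 for a self-contained demo; it is not DoubleGuard's detection mechanism, only the attack class it targets.

```python
# SQL injection via string concatenation vs. a parameterized query.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, pw TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'secret')")

malicious = "' OR '1'='1"

# Vulnerable: the attacker's input becomes part of the SQL text itself.
unsafe = conn.execute(
    "SELECT count(*) FROM users WHERE name = 'alice' AND pw = '%s'"
    % malicious).fetchone()[0]

# Safe: '?' placeholders keep the input as a plain bound value.
safe = conn.execute(
    "SELECT count(*) FROM users WHERE name = ? AND pw = ?",
    ("alice", malicious)).fetchone()[0]

assert unsafe == 1   # injection succeeded against the concatenated query
assert safe == 0     # parameterized query rejects the bogus password
```

An anomaly detector like DoubleGuard works one layer up: the injected query produces a back-end SQL transaction that does not match the normality model learned for that HTTP request.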

RESOURCE PROVISIONING COST OF CLOUD COMPUTING BY ADAPTIVE RESERVATION TECHNIQUES J.Shanthaseelan, PG Student, P.Alagu Manoharan, Assistant Professor shanthaseelan@gmail.com, manosrinivas@gmail.com Tagore Engineering College, Chennai, India. Abstract Cloud computing is service-oriented computing based on pay-per-use technology. Generally, cloud computing service requests are on-demand requests, which may cause problems such as no guarantee of resource availability and high cost during demand peaks, affecting the cost the user pays. Optimization of cost in the resource provisioning technique is used to reduce the cost paid by the user. The reservation plan forecasts the future requirement of services; it gives assurance about resource availability and shields the user from the on-demand price at provisioning time. This technique uses a unique customer id to address provisioning problems such as under-provisioning and over-provisioning, solved by using on-demand request reservation. The cost-optimizing resource provisioning algorithm uses stochastic programming modeling, Benders decomposition, and scenario techniques; by using this technique, a reduced cost for the user is achieved.



N. Kavitha, (PG Student)/CSE, R.M.KumareshBabu, Assistant Professor/CSE, kavitharuk@gmail.com, rmkumareshbabu@gmail.com Tagore Engineering College, Chennai, India. Abstract Online applications are constructed with web interfaces and shell interfaces. Remote login services are used in web applications; web interface and secure shell login (SSH) methods are used for the remote login process. Remote login services are attacked with brute force and dictionary attacks, and password guessing attacks are initiated by botnets. Automated Turing Tests (ATTs) are conducted to identify automated malicious login attempts. The Pinkas and Sander (PS) and van Oorschot and Stubblebine (VS) proposals are used to limit online password guessing attacks based on ATTs. The PS proposal reduces the number of ATTs sent to legitimate users; the VS proposal reduces the security overhead, but with a significant cost to usability. Security, usability, and user interface factors are considered in the remote login process. Login attacks are controlled by the Password Guessing Resistant Protocol (PGRP). PGRP limits the total number of login attempts from unknown remote hosts and enforces ATTs after a few failed login attempts are made from unknown machines. PGRP supports both graphical user interfaces (browser-based logins) and character-based interfaces (SSH logins). The user name and IP address are used to detect legitimate users. The PGRP protocol is improved to provide attack-free remote login services: cookie thefts are handled with the enhanced PGRP protocol, blacklists are used to manage attacker addresses during login verification, and compromised-machine attacks are handled with the user name and IP address associations. The password communications are secured with the RSA algorithm.
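The PGRP policy described above can be sketched in miniature: logins from known username/IP associations proceed without challenge, while an unknown host gets only a small number of free failed attempts before an ATT (e.g., a CAPTCHA) is demanded. The thresholds and data structures are illustrative simplifications of the protocol.

```python
# Minimal sketch of PGRP-style login throttling by username/IP association.

class LoginGuard:
    def __init__(self, free_failures=3):
        self.free_failures = free_failures
        self.known = set()        # (username, ip) pairs that succeeded before
        self.failures = {}        # ip -> count of recent failed attempts

    def requires_att(self, user, ip):
        """Should this attempt be challenged with an ATT before proceeding?"""
        if (user, ip) in self.known:
            return False          # trusted association: never bother the user
        return self.failures.get(ip, 0) >= self.free_failures

    def record(self, user, ip, success):
        if success:
            self.known.add((user, ip))
            self.failures.pop(ip, None)
        else:
            self.failures[ip] = self.failures.get(ip, 0) + 1

g = LoginGuard()
for _ in range(3):
    g.record("alice", "10.0.0.9", success=False)   # botnet guessing
assert g.requires_att("alice", "10.0.0.9")         # challenged after 3 fails
g.record("alice", "192.168.1.2", success=True)
assert not g.requires_att("alice", "192.168.1.2")  # known machine, no ATT
```

This captures PGRP's central trade-off: legitimate users on familiar machines rarely see an ATT, while an automated guesser from unknown hosts is throttled almost immediately.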

SECURE ROUTING PROTOCOL IN MOBILE ADHOC NETWORKS M.Arun kumar, (PG Student)/CSE, G. Kamesh, Assistant Professor/CSE, aravindmohan90@gmail.com, kamesh.govin@gmail.com

Tagore Engineering College, Chennai, India.

Abstract Privacy-preserving routing is crucial for some ad hoc networks that require stronger privacy protection. A number of schemes have been proposed to protect privacy in ad hoc networks. However, none of these schemes offers complete unlinkability or unobservability, since data packets and control packets remain linkable and distinguishable in these schemes. In this paper, we define stronger privacy requirements for privacy-preserving routing in mobile ad hoc networks. We then propose an unobservable secure routing scheme, USOR, to offer complete unlinkability and content unobservability for all types of packets. USOR is efficient, as it uses a novel combination of group signature and ID-based encryption for route discovery. Security analysis demonstrates that USOR can well protect user privacy against both inside and outside attackers. We implement USOR and compare it with AODV and MASK. The simulation results show that USOR not only has satisfactory performance compared to AODV, but also achieves stronger privacy protection than existing schemes like MASK.

CORMAN: A Novel Cooperative Opportunistic Routing Scheme in Mobile Ad Hoc Networks S. Radhika, (PG Student)/CSE, Mrs.N.H.Angela Lincy, Assistant Professor/CSE, radhipattu@gmail.com, lincyclement@gmail.com Tagore Engineering College, Chennai, India Abstract The link quality variation of wireless channels had been a challenging issue in data communications until its recent explicit exploration. The same broadcast transmission may be perceived significantly differently, and usually independently, by receivers at different geographic locations. Furthermore, even the same stationary receiver may experience drastic link quality fluctuation over time. The combination of link-quality variation with the broadcasting nature of wireless channels has revealed a new direction in wireless networking research, namely, cooperative communication. Research on cooperative communication started to attract interest in the community at the physical layer, but more recently its importance and usability have also been realized at the upper layers of the network protocol stack. In this article, we tackle the problem of opportunistic data transfer in mobile ad hoc networks. Our solution is called Cooperative Opportunistic Routing in Mobile Ad hoc Networks (CORMAN). It is a pure network layer scheme that can be built atop off-the-shelf wireless networking equipment. Nodes in the network use a lightweight proactive source routing protocol to determine a list of intermediate nodes that the data packets should follow en route to the destination. When a data packet broadcast by an upstream node happens to be received by a downstream node further along the route, it continues its way from there and thus arrives at the destination node sooner. This is achieved through cooperative data communication at the link and network layers. This work is a powerful extension to the pioneering work of ExOR. We test CORMAN, compare it to AODV, and observe significant performance improvement in varying mobile settings.

PREDICTING THE USEFULNESS AND IMPACT OF REVIEWS FOR THE MOVIE DOMAIN S. Vinoth prasath, (PG Student)/CSE, N.Balaji, Assistant Professor/CSE, vinoth679@gmail.com, balaji1783@gmail.com Tagore Engineering College, Chennai, India. Abstract- Posting online reviews has become an increasingly popular way for people to share with other users their opinions and sentiments toward products and services. Both the sentiments expressed in the reviews and the quality of the reviews have a significant impact on the future sales performance of the products in question. To tackle the problem of mining reviews for predicting product sales performance, we propose Sentiment PLSA (S-PLSA). Based on S-PLSA, we build ARSA, an autoregressive sentiment-aware model for sales prediction, and then seek to further improve prediction accuracy by considering the quality factor, with a focus on predicting the quality of a review in the absence of user-supplied indicators. We present ARSQA, an autoregressive sentiment- and quality-aware model, which utilizes sentiments and quality for predicting product sales performance.

HASBE: A Hierarchical Attribute-Based Solution for Flexible and Scalable Access Control in Cloud Computing

S.Bhuvanapriya Computer Science Engineering, Tagore Engineering College, Anna University, Chennai, India

Abstract Cloud computing has emerged as one of the most influential paradigms in the IT industry in recent years. Since this new computing technology requires users to entrust their valuable data to cloud providers, there have been increasing security and privacy concerns about outsourced data. Several schemes employing attribute-based encryption (ABE) have been proposed for access control of outsourced data in cloud computing; however, most of them suffer from inflexibility in implementing complex access control policies. In order to realize scalable, flexible, and fine-grained access control of outsourced data in cloud computing, in this paper, we propose hierarchical attribute-set-based encryption (HASBE) by extending ciphertext-policy attribute-set-based encryption (ASBE) with a hierarchical structure of users. The proposed scheme not only achieves scalability due to its hierarchical structure, but also inherits the flexibility and fine-grained access control of ASBE in supporting compound attributes. In addition, HASBE employs multiple value assignments for the access expiration time to deal with user revocation more efficiently than existing schemes. We formally prove the security of HASBE based on the Digital Signature Approach (DSA) scheme by Bethencourt et al. and analyze its performance and computational complexity. We implement our scheme and show, with comprehensive experiments, that it is both efficient and flexible in dealing with access control for outsourced data in cloud computing.



T.Preethi, (PG Student)/CSE, Mrs.M.P.Elgiba Mabel, Assistant Professor/CSE, preethithegarajan@gmail.com, elgibamabel@gmail.com Tagore Engineering College, Chennai, India.

Abstract MPEG-4 video streaming is regarded as a de facto standard service for current mobile multimedia video streaming services such as video conferencing, voice over IP, and even state-of-the-art digital multimedia broadcasting technology. There have been many illegal users who have used copyrighted material without getting permission and/or paying for its use; therefore, much attention has been paid to Digital Rights Management (DRM) for digital media. We constructed a protection scheme for the MPEG-4 video file format, which can be applied to video services that exploit the MPEG-4 standard. In this protection scheme, minimal segments of every Video Object Plane in an MPEG-4 video file are encrypted with a symmetric cryptography system such as DES (Data Encryption Standard), so that people who have not received permission and/or paid to use the contents are not able to view them. We applied this scheme to all MPEG-4 VOP (Video Object Plane) types, that is, I-, P-, and B-VOPs. A prototype system was developed to prove the feasibility of this scheme for hand-held devices and for the mobile multimedia service environment. The efficiency of this scheme is reported at the end of this paper.

A NOVEL APPROACH FOR DOCUMENT RETRIEVAL USING HAC BASED ON MVS MEASURE Mr.S.Parthiban Student/CSE, Ms.C.Suchitra Assistant Professor /CSE sparthiban2012@gmail.com, suchitrachinnathambi@gmail.com. Raja College Of Engg. & Tech.

Abstract- Clustering is one of the most interesting and important topics in data mining. The aim of clustering is to find relationships among data objects and classify them into meaningful subgroups. The effectiveness of clustering algorithms depends on the appropriateness of the similarity measure by which the similarity between data items is computed. In this paper, a novel method for measuring similarity between data objects, particularly text documents, is introduced, which uses multi-viewpoint based similarity calculation. With the proposed similarity measure, a hierarchical clustering algorithm is used to form the document groups. From the clustered objects, document retrieval can be done based on a query: the query is preprocessed and then matched with the documents in the clusters, and the clusters are ranked with respect to the query matching result. The cluster most relevant to the query is retrieved with this approach. In this way, a more informative assessment of similarity between the documents can be achieved.

A SCORE BASED TRUSTWORTHY DECLARATION SCHEME FOR VANETS Ms.A.Jenifer Sophia Student/CSE, Ms.P.Suseendra Assistant Professor /CSE pjeniferarul@gmail.com, suseecse@gmail.com Raja College Of Engg. & Tech. Abstract: In Vehicular ad hoc networks (VANETs), vehicles are allowed to generate and broadcast messages about road conditions, such as traffic congestion and accidents, to nearby vehicles. Such messages may improve the road safety and traffic efficiency of neighboring vehicles. However, messages generated by vehicles may not be reliable. We therefore propose a secure algorithm built around a novel announcement system. A neighboring vehicle obtains this information within seconds through the SHA1-RSA algorithm and is made aware of the road conditions. The reputation system evaluates the reliability of a message based on a digital score which is stored on the server.



Nancy Noella R.S Marthandam College of Engineering and Technology, Anna University, Chennai 97, Tamil Nadu, India

Abstract This paper presents a novel Computer-Aided Diagnosis (CAD) technique for the early diagnosis and classification of stages in Alzheimer's disease (AD), based on a Nonnegative Matrix Factorization (NMF) based Decision Tree Classifier (DTC). In general, decision trees represent a disjunction of conjunctions of constraints on the attribute values of instances: each path from the tree root to a leaf corresponds to a conjunction of attribute tests, and the tree itself to a disjunction of these conjunctions. The CAD tool is designed for the study and classification of functional brain images. For this purpose, two different brain image databases are selected: a single photon emission computed tomography (SPECT) database and positron emission tomography (PET) images, both containing data for Alzheimer's disease patients as well as healthy controls as a reference. These databases are analyzed by applying Nonnegative Matrix Factorization for the selection and extraction of the most relevant features. The resulting data sets, which contain a reduced number of features, are used to classify the stages of Alzheimer's disease by means of a decision tree classifier with confidence bounds for the decision.



Aksha.A PG Student, Mrs. Geetha Jenifel Assistant Professor, Department of Computer and Communication Engineering, Department of Information Technology, aksha90a@gmail.com, jenifel.it@gmail.com St. Xavier's Catholic College of Engineering

Abstract- A mobile ad hoc network (MANET) is a collection of dynamic, independent, wireless devices that form a communications network. Depending on their positions and transmission range, every node in a MANET acts as a router and tends to move arbitrarily while staying dynamically connected to form the network. Routing is used for directing communication over large networks, and the optimal path is determined by routing. Authentication is the act of confirming the truth of an attribute of a datum or entity; this involves verifying the identity of a person or software program. The existing system consists of a Tiered Authentication scheme for Multicast traffic (TAM) for large-scale ad hoc networks. TAM exploits network clustering to reduce overhead and ensure scalability. Multicast traffic within a cluster employs a one-way hash function chain to authenticate the message source, while cross-cluster multicast traffic includes message authentication codes (MACs) based on a set of keys. However, using a set of keys brings difficulties such as memory overhead and time overhead. In the proposed system, a unicast authentication scheme is used for authentication in multicast routing, with a single key for authentication. The RSA algorithm is used for encryption, and the key is exchanged using the Diffie-Hellman key exchange algorithm.
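The Diffie-Hellman exchange mentioned above lets two nodes agree on a shared key over an insecure channel. A toy sketch (the prime here is deliberately tiny for illustration; real deployments use 2048-bit or larger groups, or elliptic curves):

```python
# Toy Diffie-Hellman key exchange (illustrative parameters only).
import secrets

p = 4294967291          # a prime modulus (toy size for the sketch)
g = 2                   # public generator

a = secrets.randbelow(p - 2) + 1    # Alice's private exponent
b = secrets.randbelow(p - 2) + 1    # Bob's private exponent

A = pow(g, a, p)        # Alice transmits g^a mod p
B = pow(g, b, p)        # Bob transmits g^b mod p

k_alice = pow(B, a, p)  # (g^b)^a mod p
k_bob   = pow(A, b, p)  # (g^a)^b mod p -- the same shared secret
```

Both sides arrive at the same value `g^(ab) mod p` without ever sending their private exponents, which is what allows the single shared authentication key in the proposed scheme to be established.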



Nimmy.Balakrishnan PG student, Mr.J.Prakash Assistant Professor, Department of Computer Science and Engineering, Marthandam College of Engineering and Technology.

Abstract Diabetic Retinopathy, caused by diabetes, is a high-risk complication which can cause irreversible loss of vision. The two major lesions that show the presence of Diabetic Retinopathy are Diabetic Macular Edema (DME) and Microaneurysms. It is a complication of the eye often leading to reduced capacity of vision. In the initial stages of diabetic retinopathy, patients are generally asymptomatic, but in more advanced stages of the disease patients may experience symptoms that include floaters, distortion, and/or blurred vision. In this paper, detection and severity assessment of DME is done along with detection of Microaneurysms, which together show the presence of Diabetic Retinopathy. A Bayesian classifier is used to assess the presence of DME abnormality, and thresholding is used to assess its severity. Morphological operations are used to separate these lesions from other eye structures.


Raji S.R PG Student, Mr. B.L Radha Krishnan Assistant Professor, Marthandam College of Engineering and Technology Abstract- There has been increasing interest in the study of video-based fire detection algorithms as video-based surveillance systems become widely available for indoor and outdoor monitoring applications. Although many video-based smoke-detection algorithms have been developed and applied in various experimental or real-life applications, a standard method for evaluating their quality has not yet been proposed. In this framework, it is assumed that the compound algorithm consists of several subalgorithms, each of which yields its own decision as a real number centered around zero, representing the confidence level of that particular subalgorithm. In this project, a wavelet support vector machine (WSVM)-based model is used for wildfire detection (WFD). Decision values are linearly combined with weights that are updated online according to an active fusion method based on performing entropic projections onto convex sets describing the subalgorithms. A new wavelet kernel is proposed to improve the generalization ability of the support vector machine (SVM). Moreover, the proposed model utilizes the principle of wavelet analysis to facilitate nonlinear characteristic extraction of the image data. To reduce misclassification due to fog, an efficient fog removal scheme using an adaptive normalization method is employed.

FLAME AND FIRE EDGE DETECTION AND PARAMETER MONITORING WITH AN AUTOADAPTIVE BASED ALGORITHM Godwin Jinu, Sundara Guru gmgjin@gmail.com, sundaraguru11@gmail.com, Department of Electronics and Communication, PSN College of Engineering and Technology.

Abstract-The determination of flame or fire edges is the process of identifying the boundary between the area where there is a thermochemical reaction and the area without. It is a precursor to image-based flame monitoring, early fire detection, fire evaluation, and the determination of flame and fire parameters. Several traditional edge-detection methods have been tested to identify flame edges, but the results achieved have been disappointing. Some research works related to flame and fire edge detection have been reported for different applications; however, those methods do not emphasize the continuity and clarity of the flame and fire edges. A computing algorithm is thus proposed to define flame and fire edges clearly and continuously. The algorithm first detects the coarse and superfluous edges in a flame/fire image, then identifies the edges of the flame/fire and removes the irrelevant artifacts. The auto-adaptive feature of the algorithm ensures that the primary symbolic flame/fire edges are identified for different scenarios. Experimental results for different flame images and video frames proved the effectiveness and robustness of the algorithm.

A CROSS-LAYER IMPLEMENTATION AND EFFICIENCY IMPROVEMENT USING CL-DNA Ariya.I.M aryamohan567@gmail.com Department of Communication Systems, Prathyusha Institute of Technology and Management, Chennai

Abstract- The IEEE 802.15.4 standard is designed to achieve low-power transmissions in low-rate and short-distance wireless personal area networks (WPANs). In IEEE 802.15.4, the CSMA/CA protocol is used, but this protocol cannot avoid hidden-node collisions. The hidden-node collision is known as the hidden device problem (HDP). Due to the hidden device problem, inefficient data transmission and serious power consumption occur in WPANs. In this paper, we propose a cross-layer detection and allocation (CL-DNA) scheme to solve the HDP in IEEE 802.15.4. The CL-DNA algorithm does not need extra control overhead in data transmissions. The proposed CL-DNA algorithm detects the relationships of hidden devices based on the overlapped signals and then allocates the hidden devices into distinct sub-periods for transmission. This improves throughput and reduces power consumption.



A.RishaInigo PG Student of Computer and Communication Dept., Mrs.GeethaJenifel Assistant Professor, rishainigo@gmail.com, jenifel.it@gmail.com St. Xavier's Catholic College of Engineering.

Abstract- A lot of research is going on in the field of privacy protection in mobile ad hoc networks. Because a mobile ad hoc network is an infrastructureless network, privacy is even more important than in wired networks. Most prior work focused on security issues; less attention has been devoted to privacy. None of the existing schemes offers complete privacy without being vulnerable to malicious actions (i.e. sending invalid messages, modifying original messages). The proposed technique is a privacy protection scheme that satisfies all the requirements for providing stronger privacy and protects against DoS attacks. The key idea is to verify the originality of the source node's messages. The scheme is efficient as it uses a combination of group signature, ID-based encryption, and a hashing function for route discovery.



Ancy Placid R. PG Scholar, Mrs. Jemila Rose R., M.Tech. Assistant Professor ancy.placid@gmail.com, jemila.rose@gmail.com St. Xavier's Catholic College of Engineering, Chunkankadai.

Abstract In this paper, we introduce a new technique for detecting object boundaries in ultrasound images, which uses the magnitude and direction information of the image obtained from an edge map and an edge vector model. The performance and robustness of the technique are tested by segmenting objects in ultrasound images. To further evaluate the efficiency of the proposed method, in addition to visual inspection, we evaluate our boundary detection method numerically using the Hausdorff distance and the probability of error in image segmentation.

DENOISING OF ULTRASOUND IMAGE USING IMPROVED OSRAD Annie BlessyBosco J, PG Student, Department of Computer and Communication Engineering, annieblessy081@gmail.com; Mrs. R. Jemila Rose, Assistant Professor, Department of Information Technology, jemila.rose@gmail.com St. Xavier's Catholic College of Engineering. Abstract Ultrasound imaging is the most commonly used imaging system in the medical field. The main problem with this imaging technique is the introduction of speckle noise, which makes the image unclear. The success of an ultrasonic examination depends on the image quality, which is usually degraded by speckle noise. There have been several techniques for effective suppression of the speckle noise present in ultrasound images; the filtering techniques considered include anisotropic diffusion, wavelet denoising, and local statistics. Comparison of the filters is based on objective quality metrics, which quantify the preservation of image edges, overall image distortion, and improvement in image contrast. The computational analysis quantifies the number of operations required for each speckle reduction method, and a speed-accuracy analysis of discretization methods for anisotropic diffusion is included. It is concluded that the optimal method is the OSRAD diffusion filter. The proposed technique is an improved OSRAD filter which, by analysis of the quality metrics, gives a more efficient result than the previous filters.
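To illustrate the anisotropic diffusion family that OSRAD belongs to, here is a minimal Perona-Malik diffusion step, a simpler relative of OSRAD rather than OSRAD itself; the parameter values are our own illustrative choices:

```python
# Minimal Perona-Malik anisotropic diffusion (a generic relative of OSRAD;
# kappa and lam values are illustrative assumptions).
import numpy as np

def diffuse(img, iterations=10, kappa=30.0, lam=0.2):
    """Smooth homogeneous regions while preserving strong edges."""
    u = img.astype(float).copy()
    for _ in range(iterations):
        # finite-difference gradients toward the four neighbours (wraparound)
        n = np.roll(u, -1, axis=0) - u
        s = np.roll(u,  1, axis=0) - u
        e = np.roll(u, -1, axis=1) - u
        w = np.roll(u,  1, axis=1) - u
        # edge-stopping conductance: near zero where gradients are large,
        # so strong edges are not blurred away
        cn, cs = np.exp(-(n / kappa) ** 2), np.exp(-(s / kappa) ** 2)
        ce, cw = np.exp(-(e / kappa) ** 2), np.exp(-(w / kappa) ** 2)
        u += lam * (cn * n + cs * s + ce * e + cw * w)
    return u

# A tiny "image": a flat dark region next to a bright edge.
noisy = np.array([[10, 10, 200], [10, 12, 205], [11, 10, 198]], float)
smoothed = diffuse(noisy, iterations=5)
```

The flat left region is averaged toward a common value, while the bright right column survives almost untouched because its large gradients drive the conductance toward zero.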

A NOVEL CONTEXT BASED REVERSIBLE WATERMARKING Anoja C.M, PG Student, Department of Computer and Communication Engineering, anojacm10@gmail.com; Dr. C. Seldev Christopher, Professor, Department of Computer Science and Engineering, cseldev@gmail.com, St. Xavier's Catholic College of Engineering


Abstract- Over the past few years, a lot of research has been going on in the field of reversible watermarking. Reversible watermarking is a novel category of watermarking schemes: it completely recovers the watermark along with the original image. Initially this technique was used only for authentication; now it is also used to carry highly informative data within the original image. The technique is mostly used for certain important media, such as medical and military images, because these kinds of media do not allow any loss. Previous reversible watermarking techniques include least significant bit embedding, difference expansion, prediction error expansion, integer transform, expansion embedding, reversible data hiding, pattern substitution, and sample projection. In all these techniques the data embedding capacity and image quality are comparatively low. So in this paper a new reversible watermarking technique named context based reversible watermarking is introduced to improve the visual quality of the recovered images and to increase the embedding capacity with less computational complexity and less distortion.


F.Roselet Vincy, PG Scholar, M.E (Computer And Communication), roseletvincy@gmail.com; D.Hevin Rajesh, Assistant Professor, Department of Information Technology, hevin@rediffmail.com St. Xavier's Catholic College of Engineering, Chunkankadai-629807

Abstract A wireless sensor network (WSN) consists of spatially distributed autonomous sensors that monitor physical or environmental conditions, such as temperature, pressure, and sound, and cooperatively pass their data through the network to a main location. A critical need in a WSN is to achieve energy efficiency during routing, as the sensor nodes have scarce energy resources. Clustering helps to achieve energy efficiency by reducing the organizational overhead of the network, which is proportional to the number of nodes. This paper proposes a novel hybrid multipath routing algorithm with an efficient clustering technique. A node is selected as cluster head if it has high surplus energy, better transmission range, and least mobility. A fuzzy logic-based clustering approach with an extension to energy prediction is proposed to prolong the lifetime of WSNs by evenly distributing the workload. In addition to the residual energy, the expected residual energy (ERE) is introduced to act as a fuzzy descriptor during the online cluster-head selection process.
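The three election criteria named above (surplus energy, transmission range, mobility) can be combined into a single score. The weighted-sum form below is our own simplification of a fuzzy inference system, and all names and weights are assumptions for the sketch:

```python
# Hypothetical cluster-head election score combining the three criteria
# from the abstract (weights and names are ours, not the paper's).
def ch_score(residual_energy, tx_range, mobility,
             w_energy=0.5, w_range=0.3, w_mobility=0.2):
    """Inputs are normalised to [0, 1]; lower mobility is better."""
    return (w_energy * residual_energy
            + w_range * tx_range
            + w_mobility * (1.0 - mobility))

def elect_cluster_head(nodes):
    """nodes: dict node_id -> (energy, range, mobility); pick the best score."""
    return max(nodes, key=lambda n: ch_score(*nodes[n]))

nodes = {
    "n1": (0.9, 0.8, 0.1),   # high energy, good range, nearly static
    "n2": (0.4, 0.9, 0.2),   # low energy
    "n3": (0.8, 0.5, 0.9),   # highly mobile: poor candidate
}
head = elect_cluster_head(nodes)
```

A real fuzzy approach would pass each input through membership functions and inference rules rather than fixed weights, but the ranking intent is the same: favour energetic, well-connected, stable nodes.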







CONTRAST ENHANCEMENT G. M. Soji, PG Student, Department of Computer and Communication, soji.sojigm@gmail.com; Mr. R. P. Anto Kumar, Professor, Head of IT Department, anto_friends@yahoo.com St. Xavier's Catholic College of Engineering.

Abstract: Image enhancement techniques are used to enhance low-quality images captured by a digital camera under non-uniform or poor lighting. Histogram equalization is one such technique; it yields good contrast in an image, but brightness cannot be preserved. So a histogram specification technique called the piecewise linear transformation method is used. However, it can encounter problems by shifting the modes' means too far from the original means, and the computational time required for the linear transformation is also very high. To overcome this, a new histogram specification method called piecewise nonlinear maximum entropy, which is a very fast and flexible technique, is proposed to enhance image contrast. This technique yields an image with a brightness level similar to the original one and is also applicable to color images.
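For reference, plain histogram equalization, the baseline technique whose brightness-preservation weakness motivates the paper, can be sketched in a few lines:

```python
# Plain histogram equalization on a flat list of 8-bit intensities
# (pure-Python sketch of the standard textbook formula).
def equalize(pixels, levels=256):
    """pixels: flat list of intensities in [0, levels - 1]."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    # cumulative distribution function over intensity levels
    cdf, total = [], 0
    for h in hist:
        total += h
        cdf.append(total)
    cdf_min = next(c for c in cdf if c > 0)   # first non-zero cdf value
    n = len(pixels)
    # map each level so the output histogram is roughly uniform
    scale = (levels - 1) / max(n - cdf_min, 1)
    table = [round((c - cdf_min) * scale) for c in cdf]
    return [table[p] for p in pixels]

dark = [50, 51, 52, 52, 53, 54]   # low-contrast image: a 5-level spread
out = equalize(dark)              # stretched across the full 0..255 range
```

The output uses the full dynamic range, which is exactly why the mean brightness shifts; the paper's piecewise specification methods are designed to avoid that shift.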


PG Student, Dept. of Computer and Communication Engineering, labina.r@gmail.com; Assistant Professor, Dept. of Information Technology, stalinmail2000@yahoo.com St. Xavier's Catholic College of Engineering, Chunkankadai Abstract Cloud computing mainly offers various services over the internet. It allows users to store data on various unknown servers, and clients may use the resources on a pay-only-for-use basis. Several schemes have been proposed to store data efficiently in cloud data centres. The previous schemes used erasure codes such as Reed-Solomon codes, which have high computational complexity. The aim of this project is to store data efficiently in cloud data centres. Here Tornado codes are used to encode the file, so that encoded blocks are easy to generate and erasures can be repaired much faster. Tornado codes have higher performance and lower computational complexity than Reed-Solomon codes. Because of their linear complexity and comparably simple computation, Tornado codes are very fast, and the principal algorithms for the encoder and decoder are simple.
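The speed advantage comes from Tornado codes being built out of sparse XOR constraints. The simplest possible member of that family, a single XOR parity block, shows the repair mechanism (this toy is not a Tornado code itself, just the XOR building block they are made of):

```python
# One XOR parity block over three data blocks: the elementary constraint
# that Tornado codes wire together in a sparse graph (toy illustration).
def xor_blocks(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

data = [b"clou", b"d-da", b"ta!!"]           # three equal-size data blocks
parity = data[0]
for blk in data[1:]:
    parity = xor_blocks(parity, blk)          # parity = d0 ^ d1 ^ d2

# Suppose block 1 is lost: XOR of everything that survived recovers it,
# since d0 ^ d2 ^ (d0 ^ d1 ^ d2) == d1.
recovered = xor_blocks(xor_blocks(data[0], data[2]), parity)
```

Each repair is a handful of XORs over a few blocks, which is why encoding and decoding are linear-time, in contrast to the finite-field arithmetic of Reed-Solomon codes.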


M.Manoj Malathi, G.Sahaya Stalin Jose


PG Student, manojmalathi3@gmail.com, Professor, stalinmail2000@yahoo.com St. Xavier's Catholic College of Engineering,

Abstract- Scheduling is the job of allocating CPU time to different tasks within an operating system. With the Adaptive Scoring Job Scheduling algorithm presented in this paper, users can submit different types of jobs at the same time, comprising computing-intensive jobs and data-intensive jobs. A computing-intensive job needs a lot of computing power to complete, while a data-intensive job requires the resource to spend a lot of bandwidth transmitting files. Local-update and global-update rules are applied to adjust the status of the resource and of the cloud system. The algorithm is thus more efficient than previous job scheduling algorithms.


R.Benila, PG Student, benilaruphen@gmail.com; Dr.C.Seldev Christopher, Professor, cseldev@gmail.com Department of Computer Science and Engineering, St. Xavier's Catholic College of Engineering

Abstract- This paper proposes a novel adaptive edge sensing algorithm, which senses edges and classifies smooth areas and edge areas to increase the quality of the image at the edges. An extended version of edge sensing detects edges in the horizontal, vertical, and diagonal directions. The parameters used for these algorithms are mean square error, peak signal-to-noise ratio, and color signal-to-noise ratio. In the existing edge sensing algorithm, interpolation may be done not along edges but across edges, which tends to create color artifacts. To address this issue, we propose a novel adaptive edge-sensing model for the green, red, and blue components alike. The idea is to detect, at a missed pixel, whether there is a potential horizontal, vertical, or diagonal edge.


R.Divya, PG Student, divyakarthika1563@gmail.com; Dr.F.Ramesh Dhanaseelan, Professor & Head, message-to-ramesh@yahoo.com, Department of Computer Applications, St. Xavier's Catholic College of Engineering

Abstract- Data mining is a process for discovering information from large databases. Frequent itemset mining plays an essential role in association rule mining, but it takes a lot of time to mine the frequent itemsets. Frequent itemset and association rule mining problems have drawn the interest of many researchers, and hundreds of new algorithms or improvements on existing algorithms have already been proposed to mine frequent itemsets. Mining frequent itemsets is one of the most investigated fields in data mining; it is a fundamental and crucial task for transaction-based analysis. In the proposed quantitative mining, the database is first represented vertically using the Eclat algorithm to reduce the memory requirement, and a trie construction is used to mine the frequent itemsets together with their quantities and improve the support counting. AN ENHANCED DIGITAL SIGNATURE BASED ROUTING PROTOCOL IN WIRELESS SENSOR NETWORK Sobin Soniya S., PG Student, Department of Computer and Communication Engineering, albinsobin2006@gmail.com; Mr. D. Hevin Rajesh, Professor, Department of Information Technology, hevin@rediffmail.com St. Xavier's Catholic College of Engineering

Abstract- This paper addresses the security of routing protocols. Wireless routing protocols suffer from many attacks, such as spoofing or altering route information, sinkhole attacks, wormhole attacks, Sybil attacks, and selective forwarding. Transmitting data in a secure manner is very important in this type of network, and digital signatures are an efficient way to secure data sent over wireless networks. Here, An Enhanced Digital Signature based Routing Protocol in Wireless Sensor Networks (AEDSRP) is proposed to protect the data sent over a wireless network. The existing system uses a 64-bit digital signature algorithm, with which an attacker can easily compromise the messages. In this paper we propose the enhanced digital signature algorithm used by AEDSRP. Properly implemented digital signatures are more difficult to forge than other types of security measures. The system also shows efficiency in packet delivery fraction and energy consumption.

SEGMENTATION AND RECOGNITION OF UNSTRUCTURED BACKGROUND OBJECTS USING PERCEPTUAL ORGANIZATION W.Jency Remila, PG Student, remila.remi2@gmail.com; Mr.R.P.Anto Kumar, Professor, anto_friends@yahoo.com Department of Computer and Communication, St. Xavier's Catholic College of Engineering. Abstract: In this report, a novel image segmentation algorithm for outdoor scenes based on background recognition and perceptual organization is proposed. We develop a perceptual organization model by quantitatively incorporating a list of Gestalt laws. Structurally challenging objects usually consist of multiple constituent parts, so we develop a perceptual organization model that can capture the non-accidental structural relationships among the constituent parts of structured objects and group them together accordingly, without depending on a priori knowledge of the specific objects. The experimental results show that the proposed method outperformed two competing image segmentation approaches and achieved good segmentation quality in various natural scene environments.


DEVELOPING A REPUTATION BASED TRUST MODEL FOR MULTI AGENT SYSTEM Anupa.N.L (email id: catchanupa@gmail.com) Department of Computer Science, T..Institute of Technology, Chennai.

Abstract - Security and privacy issues have become critically important with the fast expansion of multi agent systems. Most network applications such as pervasive computing, grid computing, and P2P networks can be viewed as multi agent systems which are open, anonymous, and dynamic in nature. Such characteristics of multi agent systems introduce vulnerabilities and threats to providing secured communication. One feasible way to minimize the threats is to evaluate the trust and reputation of the interacting agents. Many trust/reputation models have done so, but they fail to properly evaluate trust when malicious agents start to behave in an unpredictable way. Moreover, these models are ineffective in providing quick response to a malicious agent's oscillating behavior. Another aspect of multi agent systems which is becoming critical for sustaining good service quality is the even distribution of workload among service providing agents, an issue most trust/reputation models have not yet addressed. To cope with the strategically altering behavior of malicious agents and to distribute workload as evenly as possible among service providers, we present in this paper a dynamic trust computation model called SecuredTrust. We first analyze the different factors related to evaluating the trust of an agent and then propose a comprehensive quantitative model for measuring such trust. We also propose a novel load-balancing algorithm based on the different factors defined in our model. Simulation results indicate that, compared to other existing models, our model can effectively cope with strategic behavioral change of malicious agents while efficiently distributing workload among the service providing agents under stable conditions.
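The paper's exact trust equations are not reproduced in this abstract; as a minimal sketch of the core idea — making trust slow to earn and quick to lose, so that an agent oscillating between good and bad behavior cannot keep a high score — one might write the following (the update rates are illustrative assumptions, not SecuredTrust's parameters):

```python
def update_trust(trust, outcome, alpha_good=0.05, alpha_bad=0.30):
    """Update a trust score in [0, 1] after one interaction.

    Positive outcomes raise trust slowly; negative outcomes lower it
    quickly, so an agent alternating between cooperation and defection
    cannot keep its trust high (the asymmetry is the key idea).
    """
    if outcome:                       # cooperative interaction
        return trust + alpha_good * (1.0 - trust)
    return trust * (1.0 - alpha_bad)  # defection: multiplicative decay

# An oscillating agent (good, bad, good, bad, ...) loses trust overall.
t = 0.5
for i in range(20):
    t = update_trust(t, outcome=(i % 2 == 0))
```

Because the penalty is multiplicative while the reward is only a fraction of the remaining headroom, the oscillating agent's score drifts downward over time.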

IDENTIFYING MALICIOUS PACKET LOSS IN DISRUPTION TOLERANT NETWORK Gangadevi M (PG Scholar), Vijayaraj A ganga.mathi@gmail.com, vijayaraj@saveetha.ac.in Department of Information Technology, Saveetha Engineering College, Chennai, India

Abstract Disruption tolerant networks (DTNs) are a special type of wireless network that lacks continuous connectivity. Due to the frequent partitioning characteristic of DTNs, multicasting is a considerably different and challenging problem. A node may misbehave by dropping packets; such routing misbehavior can be caused by selfish nodes that are unwilling to spend resources such as power and bandwidth on forwarding packets, and it reduces the packet delivery ratio and wastes system resources. To address this, a distributed scheme is used to detect packet dropping in DTNs. Genuine packet loss is differentiated from malicious packet loss by comparing the buffer level of every node and assigning bandwidth according to the category. For security, data packets are encrypted. Routing misbehavior is reduced by limiting the number of packets forwarded to the misbehaving nodes.










1Department of Computer Science, (email id:rgeethachandra@gmail.com) T..Institute of Technology, Chennai.

Abstract - We consider update scheduling in streaming data warehouses, which combine the features of traditional data warehouses and data stream systems. In our setting, external sources push append-only data streams into the warehouse with a wide range of interarrival times. While traditional data warehouses are typically refreshed during downtimes, streaming warehouses are updated as new data arrive. We model the streaming warehouse update problem as a scheduling problem, where jobs correspond to processes that load new data into tables, and whose objective is to minimize data staleness over time (at time t, if a table has been updated with information up to some earlier time r, its staleness is t minus r). We then propose a scheduling framework that handles the complications encountered by a stream warehouse: view hierarchies and priorities, data consistency, inability to preempt updates, heterogeneity of update jobs caused by different interarrival times and data volumes among different sources, and transient overload. A novel feature of our framework is that scheduling decisions do not depend on properties of update jobs (such as deadlines), but rather on the effect of update jobs on data staleness. Finally, we present a suite of update scheduling algorithms and extensive simulation experiments to map out factors which affect their performance.
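The staleness definition above lends itself to a simple greedy scheduler sketch: at each decision point, load the table whose update removes the most priority-weighted staleness per unit of processing time. The data layout and the benefit rule below are illustrative assumptions, not the paper's algorithms:

```python
def pick_next_update(tables):
    """Pick the table whose update most reduces priority-weighted
    staleness per unit of processing time.

    tables maps name -> (priority, freshest_available, last_loaded,
    load_cost).  A table's staleness at time t is t - last_loaded;
    loading data up to freshest_available removes
    (freshest_available - last_loaded) units of it.
    """
    def benefit(name):
        prio, avail, loaded, cost = tables[name]
        return prio * (avail - loaded) / cost
    return max(tables, key=benefit)

tables = {
    "clicks": (1.0, 100.0, 40.0, 10.0),  # stale, cheap to refresh
    "orders": (2.0, 100.0, 90.0, 5.0),   # high priority, nearly fresh
}
chosen = pick_next_update(tables)
```

Here the low-priority but very stale "clicks" table wins (benefit 6.0 vs 4.0), illustrating why scheduling by staleness effect can differ from scheduling by priority or deadline alone.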

IMPLEMENTING TRUST NEGOTIATIONS IN MULTISESSION TRANSACTIONS R.T.Anu Maheswari PG Student, Department of Computer Science, (email id:maheswarianu@gmail.com) T.J.Institute Of Technology,Chennai.

Abstract - Trust negotiation has been shown to be a successful, policy-driven approach for automated trust establishment through the release of digital credentials. Current real applications require new flexible approaches to trust negotiations, especially in light of the widespread use of mobile devices. In this paper, we present a multisession dependable approach to trust negotiations. The proposed framework supports voluntary and unpredicted interruptions, enabling the negotiating parties to complete the negotiation despite temporary unavailability of resources. Our protocols address issues related to validity, temporary loss of data, and extended unavailability of one of the two negotiators. A peer is able to suspend an ongoing negotiation and resume it with another (authenticated) peer. Negotiation portions and intermediate states can be safely and privately passed among peers, to guarantee the stability needed to continue suspended negotiations. The negotiation protocol can withstand the most significant attacks. As shown by our complexity analysis, the introduction of the suspension and recovery procedures and of mobile negotiations does not significantly increase the complexity of ordinary negotiations. Our protocols require a constant number of messages whose size depends linearly on the portion of the trust negotiation carried out before the suspensions.

MOUSE ACTIVITY BY EYE EXPRESSIONS USING ENSEMBLE METHOD Denis Joycy, N. Vel Murugesh Kumar, denis17joycy@gmail.com, M.E Final Year, murugobi@gmail.com, Assistant Professor, Saveetha Engineering College, Thandalam, Chennai 602105.

Abstract This project aims to present an application capable of replacing the traditional mouse with the human face as a new way to interact with the computer. Facial features (nose tip and eyes) are detected and tracked in real time so their actions can be used as mouse events. The coordinates and movement of the nose tip in the live video feed are translated into the coordinates and movement of the mouse pointer on the user's screen, while left/right eye blinks fire left/right mouse click events. The only external device the user needs is a webcam that feeds the program with the video stream. In the past few years, technology has become more advanced and less expensive; with the availability of high speed processors and inexpensive webcams, more and more people have become interested in real-time applications that involve image processing. In our work we compensate people whose hand disabilities prevent them from using the mouse by designing an application that uses facial features (nose tip and eyes) to interact with the computer. The nose tip was selected as the pointing device because of its location and shape: located in the middle of the face, it is comfortable to use as the feature that moves the mouse pointer and defines its coordinates. Eyes are used to simulate mouse clicks, so the user can fire click events by blinking.

EFFICIENT AND SECURE AUTHENTICATION FOR MULTICAST TRAFFIC IN WIRELESS AD-HOC NETWORKS Jaya Selvi.S.P, Mr.R.Saravanan, jayaselvi10@gmail.com, sararaju@gmail.com Dept of CSE, Saveetha Engineering College, Chennai. Abstract Ad-hoc networks have been deployed largely due to recent advances in wireless communication technologies. These networks may be deployed in adverse or even hostile environments, so their communications should take security issues into account. Network management has the fundamental requirement of authenticating the source and ensuring the integrity of message traffic. This work presents a new scheme, Tiered Authentication for Multicast traffic (TAM), particularly for large scale dense ad-hoc networks. TAM introduces network clustering to ensure scalability and to increase performance, and uses message authentication codes (MACs) based on a set of secret keys. Multicast traffic within a cluster employs a one-way hash function chain in order to authenticate the message source. The performance advantages of TAM are reduced delivery delay and reduced bandwidth overhead.
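A one-way hash chain of the kind used for intra-cluster authentication can be sketched in a few lines: the source commits to the last element of a chain of repeated hashes and later discloses keys in reverse order, so each disclosed key is verifiable against the previously known one. SHA-256 is assumed here; the actual hash function is an implementation choice:

```python
import hashlib

def make_hash_chain(seed: bytes, length: int):
    """Build a one-way hash chain: k_0 = H(k_1), k_1 = H(k_2), ...

    The source publishes k_0 as an anchor and later reveals keys in
    reverse order of generation; receivers verify each revealed key
    by hashing it back to the previously accepted one.
    """
    chain = [seed]
    for _ in range(length):
        chain.append(hashlib.sha256(chain[-1]).digest())
    return chain[::-1]   # chain[0] is the public anchor k_0

chain = make_hash_chain(b"secret-seed", 5)
# Verification: hashing the next revealed key yields the current anchor.
ok = all(hashlib.sha256(chain[i + 1]).digest() == chain[i]
         for i in range(5))
```

The one-way property of the hash means an eavesdropper who sees k_0 cannot forge the yet-unrevealed keys, while any receiver can verify each disclosure cheaply.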

ENERGY EFFICIENT UTILIZATION USING LEARN ALGORITHM Suviga.K (PG Scholar) Dept of Electronics and communication Engineering,

Saveetha Engineering College, Chennai, India. Suvigak17@gmail.com

Abstract A number of energy-aware routing protocols have been proposed to seek energy-efficient routes in multihop wireless networks. Among them, several geographical localized routing protocols help make smarter routing decisions using only local information and reduce the routing overhead. However, none of the proposed localized routing methods can guarantee the energy efficiency of their routes. In this article, we first give a simple localized routing algorithm, called Localized Energy-Aware Restricted Neighbourhood routing (LEARN), which can guarantee the energy efficiency of its route if it finds the route successfully. We derive the critical transmission radius in random networks which guarantees that LEARN routing finds a route for any source and destination pair asymptotically almost surely. By using the LEARN algorithm, the energy efficiency of the source-destination pair can be maintained and attacks occurring in the wireless network can also be reduced. The proposed routing can also be extended to three-dimensional (3D) networks, with its critical transmission radius derived in 3D random networks.


KAMAL.G, Dr.P.VALARMATHIE, PG Scholar, kamalg2628@gmail.com, Professor, valarmathie@saveetha.ac.in, Department of Computer Science and Engineering, Saveetha Engineering College, Thandalam, Chennai 602105.

Abstract In traditional database security research, the database is usually assumed to be trustworthy, with simple encryption. Under this assumption, the goal is to achieve security against external attacks (e.g. from hackers) and possibly also against users trying to obtain information beyond their privileges, for instance by some type of statistical inference. The objective of this paper is to develop a proficient and economical method for data security. The task is implemented using cryptographic encryption with the Advanced Encryption Standard (AES) for data security on personal computers, laptops, databases, the cloud, or any form of storage device. The focus of this work is to authenticate and protect the content of a database from illegal use. Authentication is also provided on the decryption side, which provides multiple levels of security. In this paper we explore a new factor of authentication, that is, something you process. Valid user identification is a demand of the computing society and plays a key role in security. This study explores the human cognitive level of authentication for access control and proposes a new model of multilevel security.


MINU INBA SHANTHINI WATSON BENJAMIN, N. VEL MURUGESH KUMAR, M.E. Final Year, minuben19@gmail.com, Assistant Professor, murugobi@gmail.com Department of Computer Science and Engg, Saveetha Engineering College, Thandalam, Chennai 602105.

Abstract An important step in detecting moving objects in videos is background subtraction, whose main process is foreground detection. However, many algorithms neglect the fact that background images consist of different objects whose conditions may change frequently. In this paper, a hierarchical background model is proposed based on segmenting the background images. It first segments the background images into several regions by the Support Vector Machine. Then, a hierarchical model is built with region models and pixel models. The region model is extracted from the histogram of a specific region, similar to a Gaussian mixture model. The pixel model is described by histograms of oriented gradients of pixels in each region, based on the co-occurrence of image variations. We also propose a Silhouette detection algorithm. Experiments are carried out with a video database to demonstrate the effectiveness of the approach, which is applied to both static and dynamic scenes and compared with some well-known background subtraction methods. According to the experiments, the Silhouette detection method is easy to operate, possesses a high rate of accuracy and a low rate of complexity, and adapts well to different kinds of shadow distribution.

TEST CASE PRIORITIZATION TECHNIQUE TO ASSIST REGRESSION TESTING BASED ON DEPENDENCY DETECTION Mohamed Shameem A Final Semester ME Software Engineering Saveetha Engineering College, Chennai. shameemgood@gmail.com

Abstract-- Regression testing is one of the important activities of software development. When an older version of the software is modified into a newer version, a set of test cases needs to be run and the outputs of the two versions are compared. If the outputs match, the modifications do not affect the remaining part of the software. Rerunning the entire test suite of the previous version increases the cost and time of regression testing; test case prioritization is used to overcome this. Test case prioritization techniques schedule the test cases for regression testing so that test cases with the highest priority are executed first. Several prioritization techniques are available, each with its own limitations. This paper presents a metric for assessing the rate of fault dependency detection. The proposed algorithm identifies faults in earlier stages, and the effectiveness of the prioritized test cases is compared with the non-prioritized ones using APFDD.
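The proposed APFDD metric is a dependency-aware variant; for reference, the standard APFD computation it builds on can be sketched as follows (the test/fault data are illustrative):

```python
def apfd(order, faults_detected_by):
    """Average Percentage of Faults Detected for a test ordering.

    APFD = 1 - (TF_1 + ... + TF_m) / (n * m) + 1 / (2n), where TF_i
    is the 1-based position of the first test detecting fault i,
    n is the number of tests and m the number of faults.
    """
    n, m = len(order), len(faults_detected_by)
    tf_sum = 0
    for fault, detectors in faults_detected_by.items():
        tf_sum += min(order.index(t) + 1
                      for t in order if t in detectors)
    return 1 - tf_sum / (n * m) + 1 / (2 * n)

faults = {"f1": {"t3"}, "f2": {"t1", "t3"}}
good = apfd(["t3", "t1", "t2"], faults)   # t3 finds both faults first
bad = apfd(["t2", "t1", "t3"], faults)    # fault-revealing test last
```

A higher APFD means faults are exposed earlier in the run, which is exactly what a prioritization technique is trying to maximize.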



P.Harikumar, A.Vijayaraj PG Scholar, harikumarsec@gmail.com, Associate Professor, vijayaraj@saveetha.ac.in Department Of Information Technology, Saveetha Engineering College, Thandalam, Chennai 602105, Tamilnadu, India

Abstract Advancements in wireless communication and mobile networking have put multimedia services with wireless streaming at the fingertips of mobile users. To acquire such streaming video content efficiently, we propose an application for mobile phones running the Android operating system that provides the required mechanisms for streaming video sharing. The application is based on peer-to-peer communication between mobile phones, i.e. without the use of video processing servers or network infrastructure, and allows sharing live information captured by mobile phone sensors (e.g., camera, microphone) with persons that might be multiple wireless hops away. The user can also save the streaming video to the SD card and watch it later.


S.Aashiq Banu1, M.Veni Saranya2 1 Final year M.E. (VLSI Design), aashiq.banu@gmail.com 2Assistant Professor, veni.saranya@gmail.com

Oxford Engineering College, Tiruchirapalli, TamilNadu.

Abstract-Data transmission through a channel requires security, so securing the data matters as much as transmitting it. With wireless communications coming to homes and offices, the need for secure data transmission is of utmost importance: information should be sent confidentially over the network without fear of hackers or unauthorized access, which makes security implementation in networks a crucial demand. Symmetric encryption cores provide data protection via the use of a secret key known only to the encryption and decryption ends of the communication path. Secure transmission requires a cryptographic algorithm. This document describes the RC5 encryption algorithm, a fast symmetric block cipher suitable for hardware or software implementations. A novel feature of RC5 is the heavy use of data-dependent rotations. RC5 has a variable word size, a variable number of rounds, and a variable-length secret key, and the encryption and decryption algorithms are exceptionally simple. The requirements of a hardware implementation of such algorithms are low power consumption, allocation of resources, re-configurability, architecture efficiency, and cost efficiency. This paper aims to improve speed, performance, and throughput. It is organized as a brief introduction to the algorithm, the RC5 algorithm, the System-on-Chip architecture, the pipelined architecture, and results.
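As a reference for the algorithm described above, a compact software sketch of RC5-32 (32-bit words, 12 rounds, 16-byte key) is shown below; note the data-dependent rotation amounts taken from the other half of the block. This is an expository Python model, not the paper's hardware implementation:

```python
M = 0xFFFFFFFF  # 32-bit word mask

def rotl(x, s):
    s &= 31
    return ((x << s) | (x >> (32 - s))) & M

def rotr(x, s):
    s &= 31
    return ((x >> s) | (x << (32 - s))) & M

def rc5_key_schedule(key: bytes, rounds: int = 12):
    # Magic constants for the 32-bit word size (RC5-32).
    P, Q = 0xB7E15163, 0x9E3779B9
    c = max(1, (len(key) + 3) // 4)
    L = [int.from_bytes(key[i * 4:i * 4 + 4].ljust(4, b"\0"), "little")
         for i in range(c)]
    t = 2 * rounds + 2
    S = [(P + i * Q) & M for i in range(t)]
    A = B = i = j = 0
    for _ in range(3 * max(t, c)):        # mix the secret key into S
        A = S[i] = rotl((S[i] + A + B) & M, 3)
        B = L[j] = rotl((L[j] + A + B) & M, A + B)
        i, j = (i + 1) % t, (j + 1) % c
    return S

def rc5_encrypt(block, S, rounds=12):
    A, B = (block[0] + S[0]) & M, (block[1] + S[1]) & M
    for i in range(1, rounds + 1):
        A = (rotl(A ^ B, B) + S[2 * i]) & M      # data-dependent rotation
        B = (rotl(B ^ A, A) + S[2 * i + 1]) & M
    return A, B

def rc5_decrypt(block, S, rounds=12):
    A, B = block
    for i in range(rounds, 0, -1):
        B = rotr((B - S[2 * i + 1]) & M, A) ^ A
        A = rotr((A - S[2 * i]) & M, B) ^ B
    return (A - S[0]) & M, (B - S[1]) & M

S = rc5_key_schedule(b"0123456789abcdef")     # 16-byte secret key
ct = rc5_encrypt((0x12345678, 0x9ABCDEF0), S)
pt = rc5_decrypt(ct, S)
```

The rotation amounts depend on intermediate data values, which is the feature that makes RC5 resistant to some classes of attack and is also what a pipelined hardware design must accommodate.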



1J.Libi Sharon, 1 P.G Final Year, 2 Y.Ras Mathew, 2Assistant Professor, 3 A.Nagalinga Rajan, 3Assistant Professor, libisharon88@gmail.com, rasmathew@yahoo.com, nagalingarajan@gmail.com, 1,3Department of Computer Science & Engineering, Infant Jesus College of Engineering, Keelavallanadu, Thoothukudi, Tamilnadu, India; 2Department of Mechanical Engineering, Hindustian College of Engineering & Technology, Coimbatore, Tamilnadu, India.

Abstract - Many attempts have been made to solve the problem of clustering categorical data, with results competitive with conventional algorithms. It is observed that such techniques unfortunately generate a final data partition based on incomplete information: the ensemble information matrix presents only cluster-data point relations, with many entries left unknown. We suggest that this problem degrades the quality of the clustering result, and present a link-based approach which improves the conventional matrix by discovering unknown entries through similarity between clusters in an ensemble. A link-based algorithm is used for the proposed similarity assessment. To obtain the final clustering result, a graph partitioning technique is applied to a weighted bipartite graph formulated from the refined matrix. Results on multiple real data sets suggest that the proposed link-based method almost always outperforms both the conventional clustering algorithms for categorical data and well-known cluster ensemble techniques.
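The link-based similarity measure itself is not detailed in this abstract; as a toy illustration of the refinement idea — filling an unknown object-cluster entry using the similarity between clusters — the sketch below substitutes plain Jaccard similarity for the paper's link-based measure:

```python
def refined_association(ensemble, objects):
    """Fill unknown object-cluster entries with cluster-cluster
    similarity (plain Jaccard here, as a stand-in for the paper's
    link-based measure over the cluster network).

    ensemble: list of clusterings, each a list of clusters (sets of
    object ids).  Returns (object, clustering_idx, cluster_idx) ->
    association weight.
    """
    def jaccard(a, b):
        return len(a & b) / len(a | b) if a | b else 0.0

    assoc = {}
    for m, clustering in enumerate(ensemble):
        for k, cluster in enumerate(clustering):
            for x in objects:
                if x in cluster:
                    assoc[(x, m, k)] = 1.0     # known, crisp entry
                else:
                    # unknown entry: similarity of the most similar
                    # cluster (from any clustering) containing x
                    assoc[(x, m, k)] = max(
                        (jaccard(cluster, other)
                         for cl2 in ensemble
                         for other in cl2 if x in other),
                        default=0.0)
    return assoc

ensemble = [[{1, 2}, {3, 4}], [{1, 2, 3}, {4}]]
A = refined_association(ensemble, objects={1, 2, 3, 4})
```

Entries that the conventional matrix would leave at zero (e.g. object 3 against cluster {1, 2}) now carry graded weights, which is the information the bipartite graph partitioning step exploits.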

PREDICTION OF CODE FAULTS USING NAIVE BAYES AND SVM CLASSIFIER S.Shanmuga priya, R.Rajaramya PG Scholar, Priya.yasha@gmail.com, Assistant Professor, ramyanitha@gmail.com Department Of Information Technology, Saveetha Engineering College, Thandalam, Chennai 602105


Abstract Machine learning classifiers have emerged as a way to predict the existence of a bug in a change made to a source code file. The classifier is first trained to build a model based on software training datasets, and this model is then used to predict faults in the test set. The test set is constructed in the attribute-relation file format, where the attributes are metrics computed from the software code, drawn from categories such as Halstead metrics, complexity metrics, and LOC metrics. The proposed system overcomes the problems of potentially insufficient accuracy for practical use and of reliance on a large number of features, which adversely impacts the accuracy of the approach. This paper proposes a feature selection technique applicable to classification-based fault prediction and applies it to predict faults in software code. The performance of Naive Bayes and Support Vector Machine (SVM) classifiers is characterized: when new unseen bugs are given as input, the algorithms are compared on the basis of accuracy and F-measure.
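Of the two classifiers compared, Naive Bayes is simple enough to sketch from scratch on numeric code metrics. The metrics and labels below are illustrative; a real study would use the datasets and feature selection described above:

```python
import math
from collections import defaultdict

class GaussianNB:
    """Minimal Gaussian Naive Bayes over numeric software metrics
    (e.g. LOC, cyclomatic complexity), predicting faulty vs clean."""

    def fit(self, X, y):
        groups = defaultdict(list)
        for row, label in zip(X, y):
            groups[label].append(row)
        self.stats = {}
        for label, rows in groups.items():
            n = len(rows)
            means = [sum(col) / n for col in zip(*rows)]
            varis = [sum((v - m) ** 2 for v in col) / n + 1e-9
                     for col, m in zip(zip(*rows), means)]
            self.stats[label] = (math.log(n / len(X)), means, varis)
        return self

    def predict(self, row):
        def log_lik(label):
            prior, means, varis = self.stats[label]
            return prior + sum(
                -0.5 * math.log(2 * math.pi * v)
                - (x - m) ** 2 / (2 * v)
                for x, m, v in zip(row, means, varis))
        return max(self.stats, key=log_lik)

# Toy metrics: [LOC, cyclomatic complexity]; 1 = faulty module.
X = [[120, 14], [300, 25], [40, 3], [55, 4]]
y = [1, 1, 0, 0]
model = GaussianNB().fit(X, y)
pred = model.predict([280, 20])
```

Each feature contributes an independent log-likelihood term, which is why redundant or noisy features hurt Naive Bayes and why the feature selection step matters.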



Abstract - In this paper we propose an algorithm called earliest deadline first with virtual deadlines. Embedded systems nowadays depend on many certification requirements, and these requirements complicate the scheduling problem in ways that cannot be handled by normal scheduling theory. The algorithm is mainly intended for a single preemptive processor, where meeting deadlines is an important factor: if any process misses its deadline, the upcoming processes cannot succeed. We therefore compute a modified parameter called the virtual deadline, and then schedule and execute the processes using the virtual deadline instead of the original deadline. This method gives better performance compared with the existing system.
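A minimal sketch of the idea — ordering ready jobs by a virtual deadline rather than the original one — is shown below. The scaling rule used to form the virtual deadline and the run-to-completion simplification are illustrative assumptions, not the paper's exact scheme:

```python
import heapq

def edf_schedule(jobs):
    """Earliest-Deadline-First on one processor, ordering ready jobs
    by a *virtual* deadline (here, the real deadline scaled by a
    per-job factor, one simple way to model criticality levels).

    jobs: list of (name, release, exec_time, deadline, scale).
    Jobs run to completion once picked (a simplification of the
    preemptive setting).  Returns the completion order; raises if
    any real deadline is missed.
    """
    time, ready, order = 0, [], []
    pending = sorted(jobs, key=lambda j: j[1])   # by release time
    while pending or ready:
        while pending and pending[0][1] <= time:
            name, rel, wc, dl, scale = pending.pop(0)
            heapq.heappush(ready, (dl * scale, dl, wc, name))
        if not ready:
            time = pending[0][1]         # idle until the next release
            continue
        vdl, dl, wc, name = heapq.heappop(ready)
        time += wc
        if time > dl:
            raise RuntimeError(f"{name} missed its deadline")
        order.append(name)
    return order

jobs = [("hi-crit", 0, 2, 10, 0.5),     # virtual deadline 5
        ("lo-crit", 0, 3, 8, 1.0)]      # virtual deadline 8
done = edf_schedule(jobs)
```

With plain EDF the lo-crit job (deadline 8) would run first; the shrunken virtual deadline promotes the high-criticality job while both still meet their real deadlines.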

STRUCTURED DATA EXTRACTION FROM THE DEEP WEB USING TAG AND VALUE SIMILARITY Vimala.S M.E.(Software Engineering) Saveetha Engineering College, Chennai, India

Abstract: Online databases, called web databases, comprise the deep web. Pages in the deep web are dynamically generated on receiving a user query submitted through the query interface. Extracting data from these deep web pages automatically is very important in many applications which deal with multiple databases, but extracting structured data from such pages is a challenging problem due to their underlying intricate structures. A large number of techniques have been proposed to address this problem, but all of them have limitations. The proposed work automatically extracts structured data from the deep web by first identifying the data regions in the web page and segmenting them into Query Result Records (QRRs). It then aligns the QRRs into a table such that data values of the corresponding attributes are put into the same column. The proposed system makes use of both tag and value data to make the alignment accurate. It also handles non-contiguous QRRs caused by the presence of auxiliary information and processes nested structures that may exist in the QRRs.


SECURED PAYMENT FOR OUTSOURCED COMPUTATION USING HOMOMORPHIC ENCRYPTION R. Chitra, M.Tech Student, Mr. A. Murugan, Assistant Professor 14chit1990@gmail.com, murugan.a@ktr.srmuniv.ac.in SRM University, Kattankulathur.

Abstract-With the recent advent of cloud computing and volunteer computing initiatives, users can outsource their computations for execution on computers with spare resources. Outsourcing computations in distributed environments raises several security problems, among them dishonesty between outsourcers and workers: the worker is entitled to a probabilistic verification of the payment received before beginning the computation, yet a worker could gain an unfair advantage by recovering the payment before completing the job. The proposed system solves these issues using the Extended Ringer payments method. The outsourcer uses exact secret sharing to compute shares of the payment token. This effectively prevents the worker from performing incomplete computations, allows the worker to verify the payment token before proceeding with the computation, and prevents dishonesty by the outsourcer. The system also encrypts payment information using cryptographic techniques. The proposed solution addresses these concerns while involving the bank only in payment generation.
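The paper's protocol is not reproduced here, but the role of homomorphic encryption in handling payments can be illustrated with a toy Paillier cryptosystem, whose ciphertexts can be multiplied to add the underlying plaintexts without decrypting them. The tiny primes are for demonstration only; real deployments use keys of 2048 bits or more:

```python
import math, random

def paillier_keygen(p=293, q=433):
    """Tiny textbook Paillier keypair (demonstration parameters)."""
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    g = n + 1                          # standard simple generator
    mu = pow((pow(g, lam, n * n) - 1) // n, -1, n)
    return (n, g), (lam, mu)

def encrypt(pub, m):
    n, g = pub
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(pub, priv, c):
    n, _ = pub
    lam, mu = priv
    return ((pow(c, lam, n * n) - 1) // n) * mu % n

pub, priv = paillier_keygen()
c1, c2 = encrypt(pub, 42), encrypt(pub, 58)
# Additive homomorphism: multiplying ciphertexts adds the plaintexts,
# so encrypted payment shares can be aggregated without decryption.
total = decrypt(pub, priv, (c1 * c2) % (pub[0] ** 2))
```

This is the property that lets a party combine encrypted payment values while only the key holder (here, the bank) learns the sum.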









jayasudhainba@gmail.com S.R.M University,Kattankulathur

Abstract - This project proposes an image retrieval technique based on image contents using different wavelets. The need for efficient content-based image retrieval has increased tremendously in many application areas such as biomedicine, military, commerce, education, and web image classification and searching. Content-based image retrieval technology overcomes the defects of traditional text-based image retrieval, such as heavy workload and strong subjectivity, by making full use of image content features (color, texture, shape, etc.), which are analyzed and extracted automatically by computer; using a single feature for image retrieval cannot provide good accuracy and efficiency. The images are stored in the database in RGB format, but due to illumination, especially for outdoor image acquisition, the RGB model gives different values in different environments, which may reduce retrieval performance; the HSV model gives more stability, since the color information of HSV space is distributed separately from the illumination part in different channels. Retrieval performance also differs across wavelet transforms, since they capture image information with different effectiveness, and GLCM, a classic method used to describe texture in a variety of image recognition fields, is useful since it makes use of major elements such as texture information and the histogram.
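The claimed stability of HSV under illumination changes is easy to demonstrate with the standard RGB-to-HSV conversion (available in Python's stdlib `colorsys`): two shades of the same color under different lighting share the same hue and saturation while differing only in value:

```python
import colorsys

def rgb_to_hsv_pixel(r, g, b):
    """Convert one 8-bit RGB pixel to HSV (h in degrees, s and v in
    [0, 1]).  HSV separates chromatic content from illumination,
    which is why HSV histograms are more stable for outdoor image
    retrieval than raw RGB histograms."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return h * 360.0, s, v

# The same red surface under bright and dim lighting: the RGB
# triples differ substantially, but hue and saturation agree.
h1, s1, v1 = rgb_to_hsv_pixel(200, 40, 40)
h2, s2, v2 = rgb_to_hsv_pixel(120, 24, 24)
```

Only the V channel reflects the lighting change, so a retrieval system can weight H and S more heavily to stay invariant to illumination.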


Sasi kumar p M.Tech IT Srm University


Abstract - An active worm is similar to a virus in that it propagates itself over the Internet to affect other computers. Worms affect a large number of computers because of their self-propagating nature, continuously compromising hosts on the Internet by scanning IP addresses. Existing detection schemes are based on the propagation speed and the scan traffic volume. The camouflaging worm (C-Worm) is a new type of active worm which has self-propagating behavior similar to traditional worms but can control its propagation speed by controlling the scan traffic volume, slowing down propagation while detection is underway. We analyze the characteristics of the C-Worm and conduct a comprehensive comparison between its traffic and non-worm traffic. The two types of traffic are barely distinguishable in the time domain but can be distinguished in the frequency domain; thus existing detection schemes fail to detect the C-Worm, and we detect it in the frequency domain instead. A novel spectrum-based scheme is used: the Power Spectral Density (PSD) of the scan traffic volume is computed in the frequency domain, and the Spectral Flatness Measure (SFM) is used to distinguish C-Worm traffic from non-worm traffic. This technique detects both the C-Worm and traditional worms.

INTEGRATION OF MULTIMEDIA MEDICAL DATA INTO DISTRIBUTED M-HEALTH SYSTEMS

C.Rebekhal kanmani, M.Tech Student, Mr. A. Murugan, Assistant Professor chellaakanmani@gmail.com, murugan.a@ktr.srmuniv.ac.in Department of Computer Science and Engineering, SRM University, Kattankulathur.

Abstract- With the advent of 4G and other long-term evolution (LTE) wireless networks, the traditional boundaries of patient record propagation are diminishing as networking technologies extend the reach of hospital infrastructure and provide on-demand mobile access to medical multimedia data. However, due to legacy and proprietary software, storage and decommissioning costs, and the price of centralization and redevelopment, it remains complex, expensive, and often unfeasible for hospitals to deploy their infrastructure for online and mobile use. This paper proposes the SparkMed data integration framework for mobile healthcare (m-Health), which significantly benefits from the enhanced network capabilities of LTE wireless technologies by enabling a wide range of heterogeneous medical software and database systems (such as picture archiving and communication systems, hospital information systems, and reporting systems) to be dynamically integrated into a cloud-like peer-to-peer multimedia data store. Our framework allows medical data applications to share data with mobile hosts over a wireless network (such as Wi-Fi and 3G) by binding to existing software systems and deploying them as m-Health applications. SparkMed integrates techniques from multimedia streaming, rich Internet applications (RIA), and remote procedure call (RPC) frameworks to construct a Self-managing, Pervasive Automated network for Medical Enterprise Data (SparkMed).


V.Rajkumar, C.T Manimegalai Department of Electronics and Communication Engineering SRM University, Chennai, India E-mail: vrajkumarmail@gmail.com , manimegalai.c@ktr.srmuniv.ac.in

Abstract - Wireless body area networks (WBANs) are expected to be a breakthrough technology in healthcare areas such as hospitals and telemedicine. Because the human body has a complex shape consisting of different tissues, the propagation of electromagnetic signals in a WBAN is expected to be very different from that found in other environments. Here we expand the knowledge of the IEEE 802.15.3a UWB channel by taking measurements of parameters in the frequency range from 3-6 GHz and transmitting to a remote monitor at data rates up to 480 Mbps using MB-OFDM, increasing throughput with power efficiency.



rajendran.yasodha73@gmail.com, murugan.a@ktr.srmuniv.ac.in Department of Computer Science and Engineering, SRM University, Kattankulathur.

Abstract- This project mainly focuses on word-based discovery. The need for supporting the classification and semantic annotation of services constitutes an important challenge for service-centric software engineering. Late binding and, in general, service matching approaches require services to be semantically annotated. Such a semantic annotation may in turn need to be made in agreement with a specific ontology, and a service description needs to relate properly with other similar services. This project addresses the issue of web service discovery given non-explicit service description semantics that match a specific service request. The approach to semantic-based web service discovery involves semantic-based service categorization and semantic enhancement of the service request. It proposes a solution for achieving functional-level service categorization based on an ontology framework, and additionally utilizes clustering for accurately classifying the web services based on service functionality. The semantic-based categorization is performed at the Universal Description, Discovery and Integration (UDDI) registry. The semantic enhancement of the service request achieves better matching with relevant services; the request enhancement involves expansion with additional terms (retrieved from the ontology) that are deemed relevant for the requested functionality. An efficient matching of the enhanced service request with the retrieved service descriptions is achieved using Latent Semantic Indexing (LSI).
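Full LSI adds an SVD-based dimensionality reduction; stripped of that step, the underlying request-service matching reduces to cosine similarity between term vectors, which can be sketched as follows (the service descriptions and the enhanced request are illustrative):

```python
import math
from collections import Counter

def cosine(a: Counter, b: Counter):
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

services = {
    "WeatherService": "forecast temperature weather city report",
    "HotelBooking": "hotel room booking reservation city",
}
# Enhanced request: the original query expanded with ontology terms.
request = "weather forecast city temperature"
req_vec = Counter(request.split())
ranked = sorted(
    services,
    key=lambda s: cosine(req_vec, Counter(services[s].split())),
    reverse=True)
```

LSI's contribution on top of this baseline is to project both vectors into a lower-dimensional latent space, so that requests and descriptions match even when they share related rather than identical terms.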



K.Mohan Lakshmi Kumar M.Tech, Mrs P.Akilandeswari Asst. Prof mohanchinna7@gmail.com Dept. Of CSE, SRM University.

Abstract Mobile Ad hoc Networks (MANETs) are highly vulnerable to attacks due to the dynamic nature of their network infrastructure. Among these attacks, routing attacks have received considerable attention, since they can cause the most devastating damage to a MANET. Even though several intrusion response techniques exist to mitigate such critical attacks, existing solutions typically attempt to isolate malicious nodes based on binary or naive fuzzy responses, which can lead to uncertainty in countering routing attacks. In this paper we propose a risk-aware response mechanism to systematically cope with the identified attacks, using the Dempster-Shafer mathematical theory of evidence and introducing a notion of importance factors. Our approach adopts an isolation mechanism in a temporal manner based on the risk value. We perform risk assessment with the extended D-S theory for both attacks and the corresponding countermeasures to make more accurate response decisions, introducing a new concept of recovering the routing table.
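Dempster's rule of combination, the core of the D-S evidence fusion referred to above, can be sketched directly; the two bodies of evidence below are illustrative:

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two basic probability
    assignments over frozenset focal elements.

    m12(A) = sum over B∩C=A of m1(B)*m2(C), normalised by 1-K,
    where K is the mass falling on conflicting (empty) intersections.
    """
    combined, conflict = {}, 0.0
    for b, mb in m1.items():
        for c, mc in m2.items():
            inter = b & c
            if inter:
                combined[inter] = combined.get(inter, 0.0) + mb * mc
            else:
                conflict += mb * mc
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are incompatible")
    return {a: v / (1.0 - conflict) for a, v in combined.items()}

# Two detectors weigh evidence that a node is an attacker or normal.
ATT, NOR = frozenset({"attack"}), frozenset({"normal"})
BOTH = ATT | NOR                      # ignorance: could be either
m1 = {ATT: 0.7, BOTH: 0.3}
m2 = {ATT: 0.6, NOR: 0.1, BOTH: 0.3}
fused = dempster_combine(m1, m2)
```

The ability to assign mass to the whole frame (BOTH) rather than forcing a binary decision is exactly what lets a risk-aware response express uncertainty instead of naively isolating a node.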

HARVESTING IMAGE DATABASES FROM THE WEB Mr. C. Suresh Kumar, Ms. D. Anitha , assistant professor, sureses@gmail.com, anitha.d@ktr.srmuniv.ac.in Dept. of CSE, SRM University, Chennai, India.

Abstract The proposed system of prototype-based re-ranking addresses the problems associated with existing image search re-ranking, such as unreliability. The proposed system links the relevance of the images to their initial rank positions. Then, we employ a number of images from the initial search result as prototypes that serve to visually represent the query and that are subsequently used to construct meta re-rankers. By applying different meta re-rankers to an image from the initial result, re-ranking scores are generated, which are then aggregated using a linear model to produce the final relevance score and the new rank position for an image in the re-ranked search result. Human supervision is introduced to learn the model weights offline, prior to the online re-ranking process. While model learning requires manual labelling of the results for a few queries, the resulting model is query independent and therefore applicable to any other query. The proposed system outperforms the existing supervised and unsupervised re-ranking approaches.

TRUSTED APPLICATION CENTRIC AD HOC NETWORK 1Balavignesh B, 2Murugaanandam S 1,2Dept. of IT, SRM University, Chennai, India. balavignesh66@gmail.com, murugaanandam.s@ktr.srmuniv.ac.in

Abstract: We present the design and implementation of a policy enforcing mechanism based on Satem, a kernel-level trusted execution monitor built on top of the Trusted Platform Module. Under this mechanism, each application or protocol has an associated policy. Two instances of an application running on different nodes may engage in communication only if these nodes enforce the same set of policies for both the application and the underlying protocols used by the application. In this way, nodes can form trusted application-centric networks. Before allowing a node to join such a network, Satem verifies its trustworthiness in enforcing the required set of policies. Furthermore, Satem protects the policies and the software enforcing them from being tampered with; if any of them is compromised, Satem disconnects the node from the network. We demonstrate the correctness of our solution through security analysis, and its low overhead through performance evaluation of two MANET applications.


S.Mohammed Hajiludeen, C.T Manimegalai Department of Telecommunication and Network Engineering SRM University, Chennai, India E-mail: hajiludeen@live.in, manimegalai.c@ktr.srmuniv.ac.in

Abstract A combined approach is presented in which APSK and low-density parity-check (LDPC) codes are used to reduce the complexity and power consumption of pulsed orthogonal frequency-division multiplexing (pulsed-OFDM) ultra-wideband systems. Recent advances in consumer electronics (camcorders, DVD players, wireless USB, etc.) have created a great need for wireless communications at very high data rates over short distances. UWB systems have shown their ability to satisfy such needs by providing data rates of several hundred Mbps. APSK is recognized as a spectrally efficient baseband modulation scheme. M-ary schemes are more bandwidth efficient but more susceptible to noise. MPSK and QAM are bandwidth efficient but not power efficient. M-APSK has optimum distance between constellation points and variation in amplitude, which requires less power and causes minimum interference. In the base paper, LDPC codes were used to achieve higher code rates; since QPSK modulation is used there, a maximum SNR of only 6 dB to 8 dB can be achieved. The goal here is to achieve an SNR of up to 16 dB so that the throughput of the WPAN system improves, while reducing power and complexity by using APSK modulation.
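The claim about inter-point distance can be checked numerically. Below is a small sketch, not from the paper, that builds a 4+12 APSK constellation on two concentric rings and computes its minimum Euclidean distance; the ring radii and phase offsets are illustrative assumptions:

```python
import cmath
import math

def apsk_constellation(ring_sizes, radii, phases):
    """Place points uniformly on concentric rings (e.g. 4+12 for 16-APSK)."""
    pts = []
    for n, r, ph in zip(ring_sizes, radii, phases):
        pts += [r * cmath.exp(1j * (2 * math.pi * k / n + ph)) for k in range(n)]
    return pts

def min_distance(points):
    """Minimum Euclidean distance over all constellation point pairs."""
    return min(abs(a - b) for i, a in enumerate(points) for b in points[i + 1:])
```

For a 16-APSK with a ring-radius ratio around 2.6 (a value used in satellite standards), `min_distance` quantifies noise robustness and can be compared against 16-QAM at the same average power.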









ULTRASOUND IMAGE A.Praveena (M.E), Dr.R.K.Selvakumar (M.Tech, Ph.D.) Department of Computer and Communication Cape Institute Of Technology, Anna University, India praveenaneela@gmail.com, rkselvam@rediffmail.com

Abstract - Accurate detection of object boundaries via active contours is an important approach in the field of medical imaging. Boundary detection involves identification of desired constraints that can be used for segmentation of the image. Most active contours converge toward some desired contour by minimizing a sum of internal (prior) and external (image measurement) energy terms. Such an approach is elegant, but suffers in the presence of noise or complex contours. To overcome the above-mentioned limitation, we propose a new technique for detecting ill-defined boundaries in noisy images using a novel edge-following method. The edge-following technique is based on the vector image model and the edge map. The vector image model provides a more complete description of an image by considering both directions and magnitudes of image edges. The edge map is derived from Laws' texture features and Canny edge detection. The vector image model and the edge map are applied to select the best edges.








Josephin Jeeba J P, Mrs. P Anishya, M.E Dept of Computer and Communication, Dept of Electronics and Communication Engineering Cape Institute of Technology, Levengipuram, Anna University-Chennai, India

(josephinjeebajp@gmail.com), (smilingeyes.nic@gmail.com) Abstract - Mobile security applications generally require the ability to perform powerful pattern matching to protect against attacks such as viruses and spam. Unfortunately, the mobile handset development process has been driven by market demand, focusing on new features and neglecting security. So, it is imperative to study the existing challenges facing the mobile handset threat containment process, and the different techniques and methodologies used to face those challenges and contain mobile handset malware. In this project I have proposed an adaptive prefix-matching-based virus detection method. An adaptive prefix-matching-based virus-detection unit provides high throughput, but also poses challenges for low power and low cost. In this project, an adaptively dividable dual-port CAM (unifying binary and ternary CAMs) is proposed to achieve a high-throughput, low-power, and low-cost virus-detection processor for mobile devices. The proposed method is realized with the dual-port AND-type match-line scheme, which is composed of dual-port dynamic AND gates. The dual-port designs reduce power consumption and increase storage efficiency due to shared storage spaces.

COLOR IMAGE ENHANCEMENT USING REGULARIZER BASED APPROACH L.K.Pavithra, Dr.R.K.Selvakumar Dept of computer and communication Cape Institute of Technology, Levengipuram, Anna University-Chennai, India pavithrra.pavi@gmail.com, rkselvam@rediffmail.com Abstract Color images provide more information for visual perception than gray images. Images acquired from cameras are often degraded by blur, noise, or both simultaneously. Improvement of image quality is in high demand among users of digital imaging systems. The processing to be applied to these images depends on the way of extracting the wanted information in order to obtain better images. PDE-based diffusion methods and a shock filter have been proposed as tools for noise elimination, image enhancement and edge detection. However, PDE models are not efficient for complex image content and sometimes result in inefficient systems. The proposed algorithm uses a nonlocal regularizer approach. The nonlocal regularizer compares not only the color value at a single point but the geometrical configuration in a whole neighborhood (patch), which provides the cleanest and sharpest results without creating false colors.








NONDETERMINISTIC DTN M.Suhasini, Mrs.P.Anishya, M.E Dept of Computer and Communication, Dept of Computer Science and Engineering Cape Institute of Technology, Levengipuram, Anna University-Chennai, India suhasiniromiya@gmail.com Abstract Delay tolerant networks (DTNs) are a special type of wireless mobile network which may lack continuous network connectivity. Due to uncertainty in nodal mobility, DTN routing usually employs multi-copy forwarding schemes. To avoid the cost associated with flooding, much effort has been focused on opportunistic forwarding, which aims to reduce the cost of forwarding while retaining a high performance rate by forwarding messages only to nodes that have high delivery probabilities. This paper aims to provide an optimal opportunistic forwarding protocol which maximizes the expected delivery rate while satisfying a certain constraint on the number of forwardings per message. In our proposed optimal opportunistic forwarding (OOF) protocol, we use an optimal opportunistic forwarding metric derived by modeling each forwarding decision as an optimal stopping rule problem. An exponentially weighted moving average (EWMA) scheme is employed for online updating of the nodal contact probabilities, with its mean proven to converge to the true contact probability.
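The EWMA update mentioned above can be written in a few lines; the sketch below is a generic form with illustrative parameter names, not the paper's exact notation:

```python
def ewma_update(p_est, contacted, alpha=0.2):
    """Exponentially weighted moving average of a nodal contact probability.
    Each observation interval contributes 1 if a contact occurred, else 0."""
    return (1 - alpha) * p_est + alpha * (1.0 if contacted else 0.0)
```

Repeated updates converge in mean to the true contact probability: feeding a long observation stream with contact rate r drives the estimate toward r, which is the convergence property the abstract refers to.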


DESIGN OF A MINIATURIZED SLOT MICROSTRIP PATCH ANTENNA FOR WLAN APPLICATIONS S. Reisha Marybeth, A. Jeyasheeba reishaishu@gmail.com, jeyasheebaa@gmail.com Dept of computer science and engineering Cape Institute of Technology, Levengipuram, Anna University-Chennai, India Abstract With the recent advances in telecommunications, the need for compact antennas has greatly increased. Electronic equipment has rapidly reduced in physical size due to the development of integrated circuits; especially in mobile communications, the demand for smaller antennas is quite strong. However, requirements on antenna performance in such small equipment are becoming increasingly severe, since the antenna performance should not be significantly degraded as the size becomes smaller. The microstrip antenna is one of the most preferable for small equipment, especially when a built-in antenna is required. It has many advantages such as low profile and easy fabrication. However, for low-frequency applications, the microstrip size becomes too large for practical implementation. One of the problems in microstrip antenna technology is the reduction of antenna sizes. The main aim of this project is to design a miniaturized microstrip patch antenna with further development in bandwidth and improvement in gain. Theoretical calculations have been done for the WLAN frequencies 2.4 GHz, 3.6 GHz and 5 GHz. Using the software Ansoft HFSS, the antenna can be designed, fabricated and tested. The compact size and light weight of patch antennas make them perfect for various commercial and industrial applications.

ENERGY EFFICIENT DYNAMIC NODE SELECTION BASED HYBRID ALGORITHM Saranya.A, Frank Little Mary.F. sssaranya10@gmail.com, frank_lissy@gmail.com Dept of Computer and Communication, Dept of Computer Science and Engineering

Cape Institute of Technology, Levengipuram, Anna University-Chennai, India Abstract - In wireless ad hoc networks there are several routing algorithms which utilize topology information to make routing decisions at each node; among them, geographic routing has proved to be efficient and robust in a dynamic environment. A geographic routing protocol maintains neighbor positions for making effective forwarding decisions. Geographic routing has become one of the most suitable routing strategies in wireless mobile ad hoc networks, mainly due to its scalability. The principal approach in geographic routing is greedy forwarding, which fails if the packet encounters a void node (i.e., a node with no neighbor closer to the destination than itself). The proposed GRP updates its routing table after every transmission of data, so that a node can send data along the shortest path in a mobile ad hoc network. Our experimental results show the effectiveness of sending data from the updated table to conserve power and time, obtaining minimum time delay, maximum throughput, and minimum data drop and retransmission attempts.
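The greedy forwarding rule and its void-node failure case can be sketched directly; this is an illustrative 2-D sketch, with coordinates and names not taken from the paper:

```python
import math

def greedy_next_hop(current, neighbors, dest):
    """Forward to the neighbor geographically closest to the destination;
    return None at a void node (no neighbor closer than the current node)."""
    if not neighbors:
        return None
    best = min(neighbors, key=lambda n: math.dist(n, dest))
    if math.dist(best, dest) >= math.dist(current, dest):
        return None  # greedy forwarding fails here: void node
    return best
```

Returning None is exactly the situation the abstract describes, where a recovery strategy (such as the proposed table-based GRP update) must take over.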

MULTIMODAL BIOMETRICS USING SPARSE APPROXIMATED NEAREST POINTS BETWEEN IMAGE SETS Nivy G S, nivy.g.s@gmail.com, Dept of Computer and Communication Cape Institute of Technology, Levengipuram, Anna University-Chennai, India

Abstract Biometrics provides improved security over traditional electronic access control. It makes use of physical or behavioral characteristics for authorizing an individual. Simply put, in order to make biometrics ultra secure and to provide more than average accuracy, we require more than one form of biometric identification; hence the need arises for multimodal biometrics. Ear biometrics makes use of ear images for human recognition [5]. Handscan and palm biometrics use the hand geometry and palm pattern respectively [2]. A combination of these three techniques in a multimodal biometric system greatly reduces the probability of accepting an impostor, and it is a convenient methodology that requires little or no subject cooperation. Yiqun Hu, Ajmal S. Mian and Robyn Owens proposed an efficient and robust algorithm for image set classification [7]. An image set is represented as a triplet: a number of image samples, their mean, and an affine hull model. For calculating the between-set distance, the Sparse Approximated Nearest Point (SANP) distance was introduced. Unlike existing methods in image classification, the dissimilarity of two sets is measured as the distance between their nearest points, which can be sparsely approximated from the image samples of their respective sets. Image classification using a sparse combination of a few image samples results in greater identification accuracy and improved performance compared with existing methods. In this project, a multimodal biometrics system that makes use of ear, palm and hand geometry for individual identification using sparse approximated nearest points between image sets is proposed, and the results shall be compared with other existing multimodal biometrics systems.

INDIAN LICENSE PLATE RECOGNITION BASED ON OPTICAL CHARACTER RECOGNITION R.Dennis, Dr.R.K.Selvakumar dennisanies2005@gmail.com, rkselvam@rediffmail.com Department of Computer and Communication, Department of Computer Science and Engineering Cape Institute of Technology, Levengipuram, Anna University, India Abstract - Indian license plate recognition based on optical character recognition (ILPROCR) plays an important role in numerous applications, and a number of techniques have been proposed. The approach comprises stages of pre-processing, license plate detection, extraction of characters and numbers from the detected plate, license plate segmentation, and character recognition. In the experiments, all types of license plates, captured by camera at different times of day and in different weather conditions, were used. This paper provides a character recognizer for the identification of the characters in the license plate.


AN EFFICIENT COLLABORATIVE WATCHDOG METHOD FOR DETECTING SELFISH NODE OVER MOBILE AD HOC NETWORK Saranya.G.V, Mrs.F.Frank Little Mary, M.E, saranyagv.1989@gmail.com, frank_lissy@yahoo.co.in Department of Computer and Communication, Department of Computer Science and Engineering Cape Institute of Technology, Levengipuram, Anna University, Chennai, India

Abstract - Mobile Ad Hoc Network (MANET) is a collection of mobile nodes (hosts) which communicate with each other via wireless links, either directly or relying on other nodes as routers. Due to the open structure and scarcely available battery-based energy, node misbehaviors may exist, and some selfish nodes will participate in the route discovery and maintenance processes but refuse to forward data packets. A selfish node is one that tries to utilize the network resources for its own profit but is reluctant to spend its own for others. These selfish nodes reduce the overall data accessibility in the network. An optimal filter is designed for the purpose of detection. By using a Wavelet Transform (WT), colored noise with a complex Power Spectral Density (PSD) can be approximately whitened. Since a larger Signal to Noise Ratio (SNR) increases the detection rate and decreases the false alarm rate, the SNR is maximized by analyzing the signal at specific frequency ranges.





Soni L.K, Dr.R.K.Selvakumar sones.lk@gmail.com, Dept of Computer and Communication, Cape Institute of Technology, Levengipuram Anna University-Chennai, India

Abstract: The high variability of sign appearance in uncontrolled environments has made the detection and classification of road signs a challenging task. Road signs provide information to drivers and help them drive more safely, thus regulating their actions. In this paper we propose a method that automatically detects and classifies road signs. There are three main stages in our proposed method: 1) segmentation by clustering the pixels based on color features to find regions of interest; 2) traffic sign detection; 3) road sign recognition.




Sowmya S, Mrs. S Sivakala, M.Tech (aymwossteph@gmail.com), (sivakalakrm@gmail.com) Dept of Computer and Communication, Dept of Computer Science and Engineering Cape Institute of Technology, Levengipuram, Anna University-Chennai, India

Abstract - Scenarios where nodes have limited energy and forward messages of different importance (priorities) are frequent in the context of wireless sensor networks. Recent research shows that significant energy saving can be achieved in mobility-enabled wireless sensor networks (WSNs), where mobile base stations visit sensor nodes and collect data from them via short-range communications. However, a major performance bottleneck of such WSNs is the significantly increased latency in data collection due to the low movement speed of mobile base stations. To address this issue, a rendezvous-based data collection approach with a selective message forwarding scheme is proposed, in which a subset of nodes serves as rendezvous points that buffer and aggregate data originating from sources and transfer it to the base station when it arrives. To reduce the memory overhead of aggregation nodes, selective message forwarding schemes are developed. The schemes depend on parameters such as the available battery at the node, the energy cost of retransmitting a message, and the importance of messages. This approach combines the advantages of controlled mobility and in-network data caching, and can achieve a desirable balance between network energy saving and data collection delay.
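One way to picture a selective forwarding rule combining the three parameters named above (battery, retransmission cost, message importance) is a simple threshold test. This is purely an illustrative sketch, not the scheme from the paper:

```python
def should_forward(importance, battery, tx_cost, threshold=1.0):
    """Forward a message only when its importance justifies the energy cost
    relative to the remaining battery (all units are abstract)."""
    if battery <= tx_cost:
        return False                       # cannot afford the transmission
    return importance * battery / tx_cost >= threshold
```

Under such a rule, important messages are forwarded even on moderately charged nodes, while unimportant ones are dropped as the battery drains, which is the trade-off the abstract describes.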

ROBUST WHITE MATTER LESION SEGMENTATION IN MRI BRAIN IMAGE USING KFCM Vandana.s.p PG student, D.Samson M.E Department Of Computer And Communication, Department Of Computer Science and Engineering Cape Institute of Technology, Levengipuram Anna University Chennai, India Lavansha.sp@gmail.com

Abstract White Matter Lesions (WMLs) are small areas of dead cells found in parts of the brain. In general, it is difficult for medical experts to accurately quantify WMLs due to the decreased contrast between White Matter (WM) and Grey Matter (GM). The aim of this paper is to automatically detect the White Matter Lesions present in the brains of elderly people. First, kernel fuzzy c-means clustering (KFCM) is used to segment normal brain tissues (white matter, grey matter, and cerebrospinal fluid). The lesions in normal white matter are used to sample the WML intensities. The segmentation of WMLs is optimized by partial volume averaging. The proposed method has better performance than standard methods when noise and other artifacts are present.
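The KFCM membership computation can be sketched as below. With a Gaussian kernel K, the kernelized distance to a cluster center is 1 − K(x, v), and memberships follow the standard fuzzy c-means form. This is a 1-D toy version on scalar intensities, not the paper's full MRI pipeline:

```python
import math

def gaussian_kernel(x, v, sigma=1.0):
    """Gaussian (RBF) kernel between a voxel intensity and a cluster center."""
    return math.exp(-((x - v) ** 2) / sigma ** 2)

def kfcm_memberships(xs, centers, m=2.0, sigma=1.0, eps=1e-9):
    """One membership update of kernel fuzzy c-means:
    u_ik is proportional to (1 - K(x_i, v_k)) ** (-1 / (m - 1)),
    with each row normalized to sum to 1."""
    U = []
    for x in xs:
        dist = [max(1.0 - gaussian_kernel(x, v, sigma), eps) for v in centers]
        w = [d ** (-1.0 / (m - 1)) for d in dist]
        total = sum(w)
        U.append([wk / total for wk in w])
    return U
```

A full KFCM run would alternate this membership step with a center update until convergence; intensities near a tissue center receive membership close to 1 for that cluster.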

SCALABLE ACTIVE LEARNING FOR MULTICLASS IMAGE CLASSIFICATION USING SUPER SELF ORGANIZING MAPS BASED 3D OBJECT RECOGNITION Sruthi.S.Nair Dept of computer and communication, Cape Institute of Technology, Levengipuram Anna University-Chennai, India srsruthirose@gmail.com

Abstract Machine learning techniques for computer vision applications like object recognition require a large number of training samples for satisfactory performance. In many cases, more than two dimensions are needed to provide a reasonably useful picture of data, so that visualization remains a problem. Recently Ajay J. Joshi, Fatih Porikli and Nikolaos Papanikolopoulos proposed a new interaction modality for training which requires only binary feedback, using a Value of Information (VOI) algorithm [1] that chooses informative queries, with locality-sensitive hashing [1] to provide a fast approximation to active learning. An approach for recognizing instances of a 3D object in a single camera image and for determining their 3D poses [5], using only geometry information for recognition without texture information, has also been proposed. Self organizing maps do not rely on distributional assumptions and can handle huge data sets with ease. A super self organizing map is a self organizing map with multiple parallel maps, and can handle missing values. A new 3D object recognition method based on super self organizing maps is proposed, and the results will be compared with the Value of Information algorithm with locality-sensitive hashing. The work will be extended to various 3D objects and the experimental results will be analyzed.

A NOVEL CIPHER SECURITY MECHANISM FOR IEEE 802.11i K.Antony Kumar PG Scholar, Department of Computer Science and Engineering, Saveetha Engineering College, Thandalam, Chennai. antonykmr32@gmail.com

Abstract In today's environment, due to the rapid development of the internet, wireless security is an important aspect of communication and message transmission. Wireless security prevents unauthorized access or damage to confidential information. There are several cryptographic techniques, and the current encryption standard for wireless networks recommends using the AES algorithm. In the counter mode of the AES algorithm, 128-bit input data is encrypted with a 128-bit keystream and produces 128-bit encrypted output data before transmission. Moreover, the length of the data is directly proportional to the energy consumption. This problem is addressed by using a novel cipher security mechanism called the High Diffusion (HD) cipher. The High Diffusion cipher uses the AES algorithm along with Counter (CTR) mode and Cipher Block Chaining (CBC). This HD cipher securely expands the given 128-bit counter value to a larger 288-bit keystream. In order to reduce energy loss, the data block size is increased, so the number of encryptions per frame decreases. When the HD cipher is used instead of AES, we observe that the energy efficiency gain due to the HD cipher is significant for larger frame lengths.
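Counter-mode keystream expansion of the kind described can be sketched generically. The snippet below uses SHA-256 as a stand-in pseudorandom function purely for illustration; the actual HD cipher derives its 288-bit keystream from AES, and those details are not reproduced here:

```python
import hashlib

def ctr_keystream(key: bytes, counter: int, nbytes: int) -> bytes:
    """Expand a counter value into `nbytes` of keystream: run successive
    counter blocks through a keyed PRF and concatenate the outputs."""
    out = b""
    while len(out) < nbytes:
        out += hashlib.sha256(key + counter.to_bytes(16, "big")).digest()
        counter += 1
    return out[:nbytes]

def ctr_xor(key: bytes, counter: int, data: bytes) -> bytes:
    """Counter-mode encryption/decryption: XOR the data with the keystream."""
    ks = ctr_keystream(key, counter, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))
```

Because encryption and decryption are the same XOR, applying `ctr_xor` twice with the same key and counter recovers the plaintext; a 288-bit keystream is simply `ctr_keystream(key, ctr, 36)`.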

ADAPTIVE DICTIONARY LEARNING BASED IMAGE SUPER RESOLUTION ENHANCEMENTS Sweena Vijay, Mr.R.K.SelvaKumar, M.Tech., Ph.D., Sweenavijay8@gmail.com, rkselvam@rediffmail.com Dept of computer and communication, Cape Institute of Technology, Levengipuram, Anna University-Chennai, India

Abstract Image super-resolution (SR) reconstruction is the process of generating an image at a higher spatial resolution by using one or more low-resolution (LR) inputs from a scene. By super-resolving an LR image, more robust performance can be achieved in many applications such as computer vision, medical imaging, video surveillance, and entertainment. Obtaining super resolution using high-frequency components is a complex and complicated task. An active-learning-based approach can be used to provide an efficient super-resolution method. The existing approach is a novel generic image prior, the gradient profile prior, which encodes prior knowledge of natural image gradients. In this prior, the image gradients are represented by gradient profiles, which are 1-D profiles of gradient magnitudes perpendicular to image structures. The existing system uses a gradient field transformation to constrain the gradient fields of the high-resolution image and the enhanced image when performing single-image super-resolution and sharpness enhancement. This project focuses on learning the basis set, also called the dictionary, to adapt it to specific data, an approach that has recently proven to be very effective for signal reconstruction and classification in the audio and image processing domains. This project proposes a new online optimization algorithm for dictionary learning, based on stochastic approximations, which scales up gracefully to large data sets with millions of training samples. A proof of convergence is presented, along with experiments on natural images demonstrating that it leads to faster performance and better dictionaries than classical batch algorithms for both small and large datasets. It is common to learn dictionaries adapted to small patches, with training data that may include several million such patches.









MULTIPURPOSE BROWSER SathyaSeelan.R, PG Scholar, Saravanan.R, Associate Professor, r.s.seelan@live.com, Department of Computer Science and Engineering, Saveetha Engineering College, Thandalam, Chennai-602105


Abstract- Phishing is the act of attempting to acquire information such as usernames, passwords, and credit card details by masquerading as a trustworthy entity in an electronic communication. There are several different techniques to control phishing, including legislation and technology created specifically to protect against phishing. This project provides an anti-phishing technique in the form of a browser where a webpage is filtered at three levels: URL filtering, server address filtering, and action content filtering, which may provide more accuracy in identifying phishing pages and blocking them.
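The first of the three levels, URL filtering, can be sketched as below. The blacklist and the subdomain/keyword heuristic are illustrative placeholders, not the project's actual rules:

```python
from urllib.parse import urlparse

def url_filter(url, blacklist, keywords=("login", "verify", "update")):
    """Level-1 check: block known phishing hosts outright and flag URLs
    that pair credential keywords with deeply nested subdomains."""
    parsed = urlparse(url)
    host = parsed.hostname or ""
    if host in blacklist:
        return "blocked"
    if any(k in parsed.path.lower() for k in keywords) and host.count(".") >= 3:
        return "suspicious"
    return "allowed"
```

Pages flagged "suspicious" at this level would then pass through the server-address and action-content filters before a final verdict.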

SOCIETY SURVEILLANCE SYSTEM-A FORUM TO PORTRAY CITY NEEDS S.Niresh Kumar PG Scholar, Department of Computer Science and Engineering, Saveetha Engineering College, Thandalam, Chennai. s.nireshkumar@gmail.com

Abstract The Society Surveillance System (SSS) is used to address and solve environmental problems. It enables users to report the problems they face in their daily activities (e.g., open drainage, flow of sewage water on roads, improper roads, etc.). Many of these problems cannot be identified and solved by the government alone. These problems are addressed in this system by creating a forum. An authorized user can directly post complaints in this forum, which is maintained on a government site. A registered user can post a complaint with an image of the problem. Android mobiles are used to capture the image of the problem, and GPS is used to calculate the current location where the problem actually took place; the user may also include a description of the problem. Any registered or non-registered user may report abusive comments on the posted complaints. If the abuse report turns out to be true, the server sends an abuse description to the appropriate user, who should then remove the complaint within a certain period of time. Otherwise, the problem is addressed by the concerned authority and the work process starts. When the problem is solved, the complaint is automatically deleted after a certain time period.

MOBILE TRANSLATION A MACHINE TRANSLATION SERVICE FOR HANDHELD DEVICES Valarmathy.P, PG Scholar, valarmathy22@gmail.com, Department of Computer Science and Engineering, Saveetha Engineering College, Thandalam, Chennai

Abstract Machine Translation [3] is the use of software to translate text or speech from one natural language to another. At the basic level, this performs simple substitution of words in one natural language for words in another. This alone does not produce a good translation: the software should recognise whole phrases. Current machine translation software allows for customization by domain or profession. This service can be extended to handheld devices including mobiles, Pocket PCs and PDAs. Such devices provide the advantage of instantaneous and non-mediated translation from one natural language to another at less cost than a human translator. The main aim of this paper is to provide such translation on mobile phones.
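The whole-phrase-first behaviour described above can be sketched with a phrase table backed by a word lexicon; the tiny English-French entries below are purely illustrative:

```python
def translate(sentence, phrase_table, lexicon):
    """Try a whole-phrase match first; otherwise fall back to word-for-word
    substitution, leaving unknown words untouched."""
    key = sentence.lower().strip()
    if key in phrase_table:
        return phrase_table[key]
    return " ".join(lexicon.get(w.lower(), w) for w in sentence.split())
```

For example, with "good morning" in the phrase table the idiomatic translation wins, while unseen sentences fall back to per-word substitution, illustrating why phrase recognition matters.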

ADAPTIVE PERFORMANCE FOR THE PREDICTION OF GRID RESOURCES D.Hena Petricia, Student, Dept. of CSE, hena.cse25@gmail.com; J.R.Balakrishnan, Professor, Dept. of CSE, Anand Institute of Technology, Kazhipattur, Chennai, TamilNadu, India

Abstract - Despite its widespread usage, traditional computing exhibits various limitations. These limitations are overcome by grid computing. The core functions of grid computing are resource allocation and job scheduling. These functions are based on adequate information about available resources, so it is very important to ensure that resource status information is acquired on time. In large-scale applications, several heterogeneous resources are distributed across the networks. The adaptive scheduler for computational grids with ATOP grid middleware will overcome these issues. During overload, the Adaptive Partition Scheduler will enforce hard limits on the total run time for the sub-systems within a partition, as dictated by the bandwidth allocated to that partition. In addition, the proposed concept will also support automatic data backup on the local disk. In the expectation of achieving higher performance, our proposed MapReduce and Bees algorithm shows that the monitoring subsystem has an obvious influence on computing nodes.


Mithila Bell. S, Roselin Mary. S Department of Computer Science and Engineering, Anand Institute of Higher Technology, Kazhipattur, India. mithila.bell@gmail.com, jesuroselin@gmail.com

Abstract - The necessity of effective space management and faster retrieval of data in distributed file systems is increasing day by day. Various advancements in distributed file system technologies have led to features such as storing data in databases, storing raw documents on the server, and storing files via FTP. These existing data storage systems are based on the hierarchical directory-tree organization, and they do not meet the scalability and functionality requirements of exponentially growing Exabyte-level file systems with billions of files. The concept of creating a custom repository for storing the data in the file system is proposed. Data in the custom container will be stored after compression. To make data retrieval faster, indexing of data inside the container is implemented. The facilities of restricting data access, providing security, and enforcing access rights on the documents are made available in the container by means of encryption.


S.Karthika, Student, Dept. of CSE, askarthi.cse@gmail.com; D.Arul Devarajam, Assistant Professor, aruldevarajam@gmail.com, Dept. of CSE, Anand Institute of Technology, Kazhipattur, Chennai, TamilNadu, India

Abstract - Wireless sensor networks are widely used for monitoring an area. Sensors are used to detect enemy intrusion and for geofencing of gas and oil pipelines, and are also efficient for data accumulation and localized sensor reprogramming. The problem of authentication and pairwise key establishment in sensor networks with mobile sinks is still not solved in the face of mobile sink replication attacks. The concepts of q-composite key predistribution and a general three-tier security framework for authentication and pairwise key establishment between mobile sinks and sensor nodes are used. The polynomial pool-based key predistribution scheme substantially improves network resilience to mobile sink replication attacks compared to single polynomial pool-based key predistribution. A q-composite scheme with the addition of encryption and decryption using the Twofish algorithm is proposed.

The Twofish algorithm uses precomputed key-dependent S-boxes and a relatively complex key schedule. In addition, the messages communicated between the nodes and mobile sinks will be stored on the server. This key predistribution scheme provides low-cost, secure communication between sensor nodes and mobile sinks.
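The q-composite rule itself is simple to state in code; a minimal sketch with illustrative key-ring contents:

```python
def can_establish_link(ring_a, ring_b, q=3):
    """q-composite key predistribution: a link is allowed only when the two
    key rings share at least q keys; the shared set then seeds the link key."""
    shared = sorted(set(ring_a) & set(ring_b))
    return len(shared) >= q, shared
```

In the original q-composite scheme the link key is derived by hashing all shared keys together, so an attacker who compromises fewer than q of them learns nothing about the link key.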



PG Scholar, Dr.S.GODFREYWINSTER, Professor, ponrajes4174@gmail.com, godfrey@saveetha.ac.in Department of Computer Science and Engineering, Saveetha Engineering College, Thandalam, Chennai 602105.

Abstract Research on mobile commerce has received a lot of interest from both industry and academia. Among the active topic areas is the mining and prediction of users' mobile commerce behaviors, such as their movements and purchase transactions. In this paper, we propose a novel framework, called Mobile Commerce Explorer (MCE), for mining and prediction of mobile users' movements and purchase transactions in the context of mobile commerce. The MCE framework consists of three major components: 1) a Similarity Inference Model (SIM) for measuring the similarities among stores and items, which are the two basic mobile commerce entities considered in this paper; 2) a Personal Mobile Commerce Pattern Mine (PMCP-Mine) algorithm for efficient discovery of mobile users' Personal Mobile Commerce Patterns (PMCPs); and 3) a Mobile Commerce Behavior Predictor (MCBP) for prediction of possible mobile user behaviors.

DYNAMIC DETECTION OF ACTIVE WORMS IN INFECTED HOST Ms. Janani Baskaran, Mr. S. Sanjeeve Kumar Anna University, Regional Centre, Madurai

janani.sarojini89@gmail.com, sanjeevesankar@gmail.com ABSTRACT The Internet has developed to give many benefits to mankind, access to information being one of the most important. Worms cause major security threats to the Internet. Worms are software components that are capable of infecting a computer and then using that computer to infect another computer. The cycle is repeated, and the population of worm-infected computers grows rapidly. Smart worms cause the most serious security threats to the Internet: they spread in an automated fashion and can flood the Internet in a very short time. A new class of smart worms is referred to as the Camouflaging Worm (C-Worm for short). The C-Worm is different from traditional worms in that it intelligently manipulates its scan traffic volume over time. Motivated by these observations, we designed a novel spectrum-based detection scheme to detect the C-Worm. This scheme uses the Power Spectral Density (PSD) distribution of the scan traffic volume and its corresponding Spectral Flatness Measure (SFM) to distinguish C-Worm traffic from background traffic. This scheme effectively detects not only the C-Worm but traditional worms as well. The goal is to prevent, detect and delete smart worms as well as traditional worms.
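The Spectral Flatness Measure used by the detector is the ratio of the geometric mean to the arithmetic mean of the PSD values; a direct sketch:

```python
import math

def spectral_flatness(psd):
    """SFM = geometric mean / arithmetic mean of the power spectral density.
    Near 1 for flat, noise-like spectra; near 0 when power is concentrated
    in a few frequency bins."""
    n = len(psd)
    geometric = math.exp(sum(math.log(p) for p in psd) / n)
    arithmetic = sum(psd) / n
    return geometric / arithmetic
```

Comparing the SFM of the observed scan traffic's PSD against a threshold is then the core of the detection decision: a spectrum with power concentrated in a narrow band yields a markedly lower SFM than noise-like background traffic.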

Performance Analysis of Energy Adaptive Mechanism for Capacity Maximization in Wireless Sensor Network E. Ayyammal, Mr. M. Saravanan, Anand Institute of Higher Technology, Anna University, Chennai Mail id: ayyammals@yahoo.co.in Abstract Cognitive radio technology reduces the wireless resource scarcity problem by adapting frequency spectrum, power and modulation type. Opportunistic spectrum access increases network capability and quality. Equipping wireless sensor networks with cognitive radio capabilities has generated the paradigm of cognitive radio sensor networks, overcoming the challenges posed by the event-driven traffic demands of wireless sensor networks. The aim of this paper is to analyze the power and rate adaptation problem using energy adaptive mechanisms and information correlations for a multi-hop cognitive radio sensor network in an information-theoretic capacity maximization framework. An optimization framework is introduced for a multi-hop cognitive radio sensor network topology that maximizes the information capacity delivered to the sink, combining cognitive radio sensor network characteristics with energy adaptive mechanisms, power-bandwidth control and information correlation utilization. The capacity optimization problem is defined analytically, and practical local schemes are analyzed to establish the superiority of objective functions utilizing information correlation and energy adaptive mechanisms in terms of the maximum information rate at the sink, Rmax. The performance is analyzed by varying the total bandwidth and the energy distribution, and the dependence of Rmax on total bandwidth is observed. Simulation results are given for the relation between the data collected from sensors and the available capacity when relays operate at full power, for varying total bandwidth. Index Terms: Cognitive wireless sensor network, adaptive power control, information capacity.
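The bandwidth dependence of Rmax can be illustrated with a toy model (assumed numbers and an equal bandwidth split, not the paper's optimization framework): each hop carries at most its Shannon capacity, and the end-to-end rate of a multi-hop chain is capped by its weakest hop.

```python
import math

def hop_capacity(bw_hz, power_w, gain, noise_density=1e-9):
    """Shannon capacity of one hop: C = B * log2(1 + P*g / (N0*B))."""
    snr = power_w * gain / (noise_density * bw_hz)
    return bw_hz * math.log2(1 + snr)

def chain_rate(total_bw, hops):
    """Equal bandwidth split across hops; the end-to-end rate (Rmax for
    this single route) is capped by the weakest hop."""
    bw = total_bw / len(hops)
    return min(hop_capacity(bw, p, g) for p, g in hops)

hops = [(0.1, 1e-3), (0.1, 5e-4), (0.1, 2e-3)]   # (tx power W, channel gain), assumed
r1 = chain_rate(1e6, hops)
r2 = chain_rate(2e6, hops)   # doubling the total bandwidth raises Rmax
```

Note the diminishing return: with fixed transmit power, doubling bandwidth lowers the per-hop SNR, so Rmax grows sublinearly in total bandwidth, which is why joint power-bandwidth control matters.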

Mitigation of Flooding attack using defense mechanism in Mobile Adhoc Networks M. Muthumeenakshi#1, Dr. P. Subathra#2, Thiagarajar College of Engineering muthumeenakshime@tce.edu, pscse@tce.edu Abstract---A Mobile Adhoc Network (MANET) is a self-configuring network of mobile routers connected by wireless links, which form the topology on the fly at low cost. Because the participating consumer electronic devices generally operate on limited battery power, such networks are vulnerable to security threats like data flooding attacks. In a data flooding attack, malicious nodes flood the network by sending useless data packets. These useless data packets exhaust the network resources, and hence legitimate users are not able to use the resources for valid communication. Hence we propose a period-based defense mechanism against data flooding attacks, with the aim of enhancing throughput, by generating the flooding attack and identifying the malicious and victim nodes with respect to buffer size in mobile ad hoc networks. Keywords: Mobile Adhoc Network, data flooding attack, throughput.
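A minimal sketch of a period-based detection rule (the threshold rule and names below are assumptions; the abstract does not give the exact mechanism): count data packets per source within one period and flag sources whose count exceeds what the victim's buffer can absorb.

```python
from collections import Counter

def detect_flooders(packets, buffer_size):
    """packets: source-node ids observed in one defense period.
    A source sending more packets than the buffer can hold is flagged."""
    counts = Counter(packets)
    return {src for src, n in counts.items() if n > buffer_size}

period = ["A"] * 3 + ["M"] * 50 + ["B"] * 2   # node M floods in this period
flagged = detect_flooders(period, buffer_size=10)
```

Flagged nodes would then be rate-limited or excluded from routing so legitimate traffic regains the resources.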


Study of Availability Awareness in Task Scheduling Problem

Dr. S. PADMAVATHI, M. PRADEEPA MEENAKSHI* Thiagarajar College of Engineering spmcse@tce.edu, *deepamathesan@gmail.com
Abstract Availability is a significant factor for executing any application in a cluster computing environment. Most of the existing scheduling algorithms were designed to improve the response time of multiclass tasks while ignoring availability constraints. To explore this issue, each computing node is modeled using the node's computing capability and availability. Multiple classes of tasks are characterized by their execution times and availability requirements. A novel scheduling strategy is adopted to effectively improve the response time of the computing resources by considering the availability constraints of the nodes in a cluster computing environment. Experimental results show that the proposed algorithm is better than the existing algorithms in terms of response time. Keywords: Availability, response time, computing environment, multiclass tasks.
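The strategy can be sketched as a simple availability-aware assignment (the names and the earliest-finish rule are illustrative assumptions, not the paper's algorithm): among the nodes whose availability meets a task's requirement, pick the one that would finish the task earliest.

```python
def schedule(tasks, nodes):
    """tasks: (exec_time, required_availability) pairs.
    nodes: (speed, availability) pairs.
    Returns, per task, the chosen node index (None if no node qualifies)."""
    load = [0.0] * len(nodes)          # queued work per node, in time units
    assignment = []
    for exec_time, req_avail in tasks:
        eligible = [i for i, (_, avail) in enumerate(nodes) if avail >= req_avail]
        if not eligible:
            assignment.append(None)    # availability requirement unsatisfiable
            continue
        # earliest finish = current load plus this task's runtime on that node
        best = min(eligible, key=lambda i: load[i] + exec_time / nodes[i][0])
        load[best] += exec_time / nodes[best][0]
        assignment.append(best)
    return assignment

nodes = [(1.0, 0.90), (2.0, 0.99)]            # (speed, availability)
tasks = [(4, 0.95), (2, 0.50), (4, 0.95)]     # (exec time, required availability)
plan = schedule(tasks, nodes)
```

High-availability tasks are forced onto the reliable node, while unconstrained tasks fill in the less available one, which is what keeps response times low.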

Task Assignment on MPSoC to minimize Worst-Case Execution Time

S. Santhana Prabha#1, Dr. P. Chitra#2
Thiagarajar College of Engineering
santhanaprabhasivakumar@gmail.com, pccse@tce.edu

Abstract Real-time applications running on multicore processors require that tasks complete their execution before specific deadlines. It is therefore important to obtain an accurate worst-case execution time (WCET) for those applications, which is challenging due to inter-core interference on shared resources such as shared caches. The main objective of this project is to minimize the overall worst-case execution time across cores by optimal task assignment on Multi-Processor Systems-on-Chip (MPSoC). In multicore processors, the assignment of tasks to individual cores plays an important role in minimizing the overall worst-case execution time. Here, task assignment algorithms are implemented for assigning tasks to each core. The results of the algorithms are compared to find the optimal task assignment, which minimizes the overall execution time across the cores. Keywords: Level 1 cache, Level 2 cache, Task assignment, Worst-Case Execution Time analysis.
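One candidate heuristic such a comparison could include is longest-processing-time-first (a standard makespan heuristic, shown here as an illustration rather than the project's actual algorithm): assign each task, largest WCET first, to the core with the least accumulated WCET, so the overall worst-case execution time is the maximum per-core load.

```python
def lpt_assign(task_wcets, num_cores):
    """Longest-processing-time-first: greedy makespan minimization.
    Returns (task -> core placement, overall WCET = max core load)."""
    loads = [0] * num_cores
    placement = {}
    for tid, wcet in sorted(enumerate(task_wcets), key=lambda x: -x[1]):
        core = loads.index(min(loads))   # least-loaded core so far
        placement[tid] = core
        loads[core] += wcet
    return placement, max(loads)

# Assumed per-task WCETs (already accounting for shared-cache interference)
placement, overall_wcet = lpt_assign([7, 5, 4, 3, 3, 2], 2)
```

For this instance the greedy schedule is actually optimal: the total work is 24, so no 2-core schedule can finish before time 12.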

Attacker Group Recognition Using JEAN Methodology

K. NARASIMHA MALLIKARJUNAN, K. GOPIPRIYADHARSHINI* Thiagarajar College of Engineering arjunkambaraj@tce.edu, gopipriyame@tce.edu
Abstract- Networks are subject to attacks from malicious sources. An attacker follows various sequences of steps to attack a network or system; an attack can be single-stage or multi-stage. The multiple sequences of steps taken by the attacker to attack the victim are known as a multi-stage attack. Once the sequences of the attacker are known, or the behaviour is analyzed, we can secure the network. To secure the network it is essential to find the attacker's behaviour, the members of the attacker group, and their intention. The proposed method is able to find the attack scenario, which is the logical relation between the alerts, the attack intention, the next stage of attack, the next victim, and also the attacker's group. Attacker group identification along with multi-stage attack forecasting is the novelty of this proposed work. Keywords: Network security, Intrusion detection, Attack behaviour, Multi-stage attack, Alert correlation.
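A toy sketch of the correlation step (illustrative only; the abstract does not specify the JEAN method's internals): alerts sharing a source are linked into a per-attacker stage sequence, and attackers whose sequences match are grouped together as one attacker group.

```python
from collections import defaultdict

def correlate(alerts):
    """alerts: (source, attack_stage) pairs in time order.
    Returns a map from stage sequence to the set of sources exhibiting it."""
    sequences = defaultdict(list)
    for src, stage in alerts:
        sequences[src].append(stage)           # build each source's multi-stage chain
    groups = defaultdict(set)
    for src, seq in sequences.items():
        groups[tuple(seq)].add(src)            # identical behaviour => same group
    return dict(groups)

alerts = [("a", "scan"), ("b", "scan"), ("a", "exploit"),
          ("b", "exploit"), ("c", "dos")]
groups = correlate(alerts)
```

Given such groups, forecasting the next stage for one member immediately applies to the whole group.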



Dr. N. Kamaraj, C. Senthilkumar, R. Vigneshwari Thiagarajar College of Engineering nkeee@tce.edu, cskcse@tce.edu, rvigneshwari.cse@gmail.com

Abstract Due to the lack of infrastructure in MANETs, nodes must play the roles of router, server and client, compelling them to cooperate for the correct operation of the network. A mobile ad hoc network can work properly only if the participating nodes cooperate in routing and forwarding. In a MANET a source node must rely on other nodes to forward its packets over multihop routes to the destination. A specific protocol has been proposed for ad hoc networks to achieve proper cooperation among nodes. The required changes to the DSR protocol to support trust are made. By incorporating these changes, a trusted route is formed through the concept of trust, which aims to improve the path length for reliable and feasible packet delivery. This protocol provides mobile nodes with an approach to evaluate the trust degree of other nodes, including direct neighbours and stranger nodes.
Index Terms: Mobile ad hoc network, trusted route, trust degree.
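The trust machinery can be sketched with assumed formulas (the blending weight, the no-history default, and the route-selection rule below are illustrative, not the protocol's actual definitions): direct observation of a neighbour's forwarding behaviour is blended with recommendations for stranger nodes, and the route whose weakest node is most trusted is preferred.

```python
def trust_degree(forwarded, received, recommendation=None, alpha=0.7):
    """Direct trust = fraction of packets the node actually forwarded;
    for stranger nodes, blend in a recommendation from other nodes."""
    direct = forwarded / received if received else 0.5   # 0.5 = no history yet
    if recommendation is None:
        return direct
    return alpha * direct + (1 - alpha) * recommendation

def pick_route(routes, trust):
    """Prefer the route maximizing its minimum node trust."""
    return max(routes, key=lambda r: min(trust[n] for n in r))

trust = {"n1": 0.9, "n2": 0.4, "n3": 0.8, "n4": 0.85}
routes = [["n1", "n2"], ["n3", "n4"]]
chosen = pick_route(routes, trust)
```

The max-min rule reflects that a single untrustworthy forwarder can drop the whole route's traffic, whatever the other nodes' trust.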

T. Manikandan Thiagarajar College of Engineering Madurai Abstract Securing routing functions is very important for successful end-to-end communication, but it is difficult to achieve in mobile ad hoc networks, which pose many challenges, such as improving detection accuracy in the face of the highly dynamic characteristics of such networks and identifying malicious nodes under a totally autonomous structure. In this paper we propose a complete intrusion detection system that intends to meet these challenges: to improve detection accuracy, it relies on collaboration among the nodes neighbouring a suspected node and integrates their information, improving detection accuracy and rejecting false routing updates. It works like a court in real life: any node can be suspected of illegal behaviour, and the judge of the court has the authority to make the final decision by examining the information collected.

Keywords: routing, mobile ad hoc networks, intrusion detection system
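The court-style decision step can be sketched as weighted majority voting (an assumed rule for illustration; the paper's judgement procedure may differ): each neighbour of the suspected node submits a verdict with a confidence, and the judge convicts only if the guilty weight outweighs the innocent weight.

```python
def judge(reports):
    """reports: (verdict, confidence) pairs from neighbouring nodes,
    where verdict True means 'observed malicious behaviour'.
    Returns the judge's final decision."""
    guilty = sum(conf for verdict, conf in reports if verdict)
    innocent = sum(conf for verdict, conf in reports if not verdict)
    return guilty > innocent

votes = [(True, 0.9), (True, 0.6), (False, 0.4)]   # neighbours' observations
verdict = judge(votes)
```

Weighting by confidence lets well-positioned observers count for more, which helps reject false accusations from nodes with poor visibility.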



University College of Engineering, Nagercoil.


Abstract - Intrusion Detection Systems are a rapidly growing area of interest because of the increasing number of security threats. The proposed work describes a novel neural network method based on a Genetic Algorithm (GA) for detecting network intrusions. A Multi Layer Perceptron (MLP) is used for intrusion detection, which is reasonably capable of classifying network connection records. The GA is a randomized search technique often used to find approximate solutions to combinatorial optimization problems. This work provides an optimal intrusion detection mechanism that minimizes the number of features and maximizes the detection rate. Key Terms - Intrusion Detection System (IDS), Multi Layer Perceptron (MLP), Genetic Algorithm (GA), Detection rate, False positive, False negative, Neural network.
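A toy sketch of GA-based feature selection (the fitness function below is a stand-in that rewards a known-informative feature mask; in the actual system it would be the MLP's detection rate minus a penalty on the number of selected features): a population of feature bitmasks evolves by truncation selection, one-point crossover, and bit-flip mutation.

```python
import random

random.seed(0)
NUM_FEATURES = 8
INFORMATIVE = {1, 4, 6}          # assumed ground truth for this toy example

def fitness(mask):
    """Stand-in for 'detection rate minus feature cost'."""
    hits = sum(1 for i in INFORMATIVE if mask[i])
    return hits - 0.1 * sum(mask)

def evolve(pop_size=30, generations=50, p_mut=0.1):
    pop = [[random.randint(0, 1) for _ in range(NUM_FEATURES)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]                  # elitist truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, NUM_FEATURES)
            child = a[:cut] + b[cut:]                   # one-point crossover
            child = [g ^ (random.random() < p_mut) for g in child]  # bit-flip mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()   # mask selecting (near-)informative features only
```

In the real system each fitness evaluation would train and score the MLP on the masked features, so the GA trades detection rate against feature count exactly as the abstract describes.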