
IJCTA – Volume 3, Issue 6, November-December 2012
S.No
Title/Author Name
Page No
21
A Method of Region Considered Segmentation for Automatic Image Registration
-V. BabyVennila, B. Bhuvaneswari, Dr. R.K. Gnanamurthy
Abstract
Automatic image registration is still an open challenge in several fields. Even though many image registration algorithms have been developed over the years, they remain far from broad use in applications. This paper deals with a problem faced in the area of remote sensing, namely image registration. Remote sensing images generally have different gray-level characteristics and multiple features, so simple techniques cannot be applied directly. In this work, a new automatic image registration through region considered segmentation is proposed. The approach performs region considered segmentation on the pair of images to be registered, followed by feature characterization of the extracted objects, control point selection and object matching. It leads to the registration of satellite images (multisensor and multiband) with differences in rotation and translation. The region considered segmentation for automatic image registration achieves sub-pixel accuracy.
Index Terms – Region considered segmentation; image matching; remote sensing; control point selection
1986-1990


  PDF

22
A Method Level Security Metrics Suite for Java Programs
-Sree Ram Kumar T, Dr. K. Alagarsamy
Abstract
One of the biggest challenges faced by software engineers today is the engineering of secure software. Attempts are being made to apply the principles originally proposed for the engineering of “quality” software to security. One such principle relates to the development and use of “metrics”, measures that serve as indicators of how much of “something” software possesses. Security metrics attempt to measure the “amount” of security a piece of software has. In this paper, we propose metrics that apply at the source code level and can serve as a guide for software developers in identifying the most vulnerable parts of the source code. We also demonstrate the validity of the proposed metrics through empirical results.
Keywords: Security Metrics, Security Source Code Metrics
1991-1996

  PDF

23
Parallelization of Bioinformatics Algorithm using Genetic Algorithm on Multicore platform
-P. S. Borkar, A. R. Mahajan
Abstract
Nowadays, multicore processors and GPUs have entered the mainstream of microprocessor development. Multicore processors are widely used across many application domains including general-purpose, embedded, network, digital signal processing and graphics. The performance gained by using a multicore processor depends very much on the software algorithms and their implementation. In particular, the possible gains are limited by the fraction of the software that can be parallelized to run on multiple cores simultaneously. With the popularity of multicore processors, how to use multicore computing resources to accelerate traditionally serial applications has become a common concern. Multicore processor architectures provide high computation power through multiple levels of parallelism. It is a well known fact that genetic algorithms (GAs) are well suited to parallel computers because population members can be evaluated in parallel. Generally, the population can be split into several sub-populations that execute in parallel, speeding up the processing of each generation. In the field of bioinformatics, for the computation of larger data sets as well as for complex algorithms, such recent advances in computer architecture help provide optimal solutions in a short time. In this paper, we present the proposed work on parallelizing a complex bioinformatics algorithm using a parallel genetic algorithm.
Keywords- multicore architecture, parallel genetic algorithm, bioinformatics algorithm.
1997-2000


  PDF
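
As a rough illustration of the sub-population parallelism described in the abstract above, the following Python sketch splits a toy population across worker processes. The string-matching fitness function and population encoding are hypothetical stand-ins; the paper's bioinformatics algorithm itself is not reproduced.

```python
# Sketch: evaluating GA sub-populations in parallel on separate cores.
import random
from multiprocessing import Pool

def fitness(individual):
    # Hypothetical fitness: positions matching a fixed target string.
    target = "ACGTACGT"
    return sum(1 for a, b in zip(individual, target) if a == b)

def evaluate_subpopulation(subpop):
    # Each worker process scores its own slice of the population independently.
    return [(ind, fitness(ind)) for ind in subpop]

def main():
    random.seed(0)
    population = ["".join(random.choice("ACGT") for _ in range(8)) for _ in range(400)]

    # Split the population into 4 sub-populations, one per core.
    chunks = [population[i::4] for i in range(4)]
    with Pool(processes=4) as pool:
        scored_chunks = pool.map(evaluate_subpopulation, chunks)

    scored = [pair for chunk in scored_chunks for pair in chunk]
    print("best individual:", max(scored, key=lambda p: p[1]))

if __name__ == "__main__":
    main()
```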

24
Robust Aggregation in Sensor Network: An Efficient Frequent itemset and Number of occurrence counting
-Kayalvizhi S, Vanitha K
Abstract
Sensor networks are collections of sensor nodes that cooperatively send sensed data to a base station. As sensor nodes are battery driven, efficient utilization of power is essential to keep the network usable for a long duration; hence it is necessary to reduce data traffic inside the sensor network and the amount of data that must be sent to the base station. The aim of the project is to develop scalable aggregation methods to extract useful information from the data the sensors collect. Partitioning a large data set into homogeneous subsets, as the result of horizontal aggregation, is an important task in this system. The Apriori association-rule algorithm implemented in SQL is well suited for this operation. In this project we consider the PIVOT operator, a built-in operator in a commercial DBMS. Since this operator can perform transposition, it can help evaluate horizontal aggregations; with PIVOT, however, the list of distinct values must be provided by the user, whereas in our proposal the output columns are created automatically.
Index Terms- Data aggregation, Wireless Sensor Network (WSN), SPJ, Horizontal aggregation, case, pivot, Frequent itemset.
2001-2005


  PDF
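
As a rough illustration of the horizontal aggregation discussed above, the following sketch uses a pandas pivot rather than the commercial DBMS PIVOT operator the paper targets; the node and metric names are hypothetical sensor-network examples.

```python
# Sketch: turning vertical sensor readings into one row per node (horizontal aggregation).
import pandas as pd

rows = [
    {"node": "n1", "metric": "temp",     "reading": 21.5},
    {"node": "n1", "metric": "humidity", "reading": 60.0},
    {"node": "n2", "metric": "temp",     "reading": 22.1},
    {"node": "n2", "metric": "humidity", "reading": 58.5},
    {"node": "n1", "metric": "temp",     "reading": 21.9},
]
df = pd.DataFrame(rows)

# The distinct 'metric' values become output columns automatically, which is the
# point the abstract makes about not requiring the user to list them.
horizontal = pd.pivot_table(df, values="reading", index="node",
                            columns="metric", aggfunc="mean")
print(horizontal)
```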

25
Self Medical Diagnosis using Artificial Neural Network
-Prof. Mahesh P. Gaikwad
Abstract
A major problem in the medical field is diagnosing diseases. Human beings always make mistakes, and because of these limitations diagnosis raises the broader issue of human expertise. One of the most important problems of medical diagnosis, in general, is the subjectivity of the specialist. It can be noted, particularly in pattern recognition activities, that the experience of the professional is closely related to the final diagnosis. This is because the result does not depend on a systematized solution but on the interpretation of the patient's signals. The basis for a valid diagnosis, a sufficient number of experienced cases, is reached only in the middle of a physician's career and is therefore not yet present at the end of academic training. This is especially true for rare or new diseases, where even experienced physicians are in the same situation as newcomers. Principally, humans do not resemble statistical computers but pattern recognition systems: humans can recognize patterns or objects very easily but fail when probabilities have to be assigned to observations. This paper presents a medical diagnosis system that tells users which disease they may have on the basis of their symptoms.
Keywords: ANN, KMP, SOM algorithm
2006-2013

  PDF
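
As a rough illustration of symptom-based scoring, the following sketch runs a single linear layer with a softmax over a binary symptom vector. The symptoms, diseases and weights are hypothetical and do not reproduce the ANN/SOM system the paper describes.

```python
# Sketch: scoring a symptom vector against disease weight rows.
import numpy as np

symptoms = ["fever", "cough", "headache", "rash"]
diseases = ["flu", "measles"]

# One weight row per disease; each entry reflects how strongly a symptom indicates it.
W = np.array([
    [0.9, 0.8, 0.6, 0.0],   # flu
    [0.7, 0.2, 0.3, 0.9],   # measles
])

def diagnose(symptom_vector):
    # Linear layer followed by softmax to turn scores into rough "confidences".
    scores = W @ np.asarray(symptom_vector, dtype=float)
    exp = np.exp(scores - scores.max())
    probs = exp / exp.sum()
    return dict(zip(diseases, probs.round(3)))

# Patient reporting fever, cough and headache, but no rash.
print(diagnose([1, 1, 1, 0]))
```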

26
Graph Invariants and Graph Isomorphism
-Vijaya Balpande, Dr. Anjali R. Mahajan
Abstract
In graph theory, graph isomorphism is an important issue. Information in a database can be stored in the form of a graph, and a graph represents structural information in an efficient way. The graph isomorphism problem is to determine whether there exists a one-to-one correspondence between the structures of two graphs. The problem arises in many fields such as chemistry, switching theory, information retrieval, social networks, etc. Graph invariants are used to test for isomorphism between two graphs, but a matching invariant is only a necessary condition for isomorphism. Many researchers regard the isomorphism problem as one of the most difficult problems, and whether graph isomorphism is NP-complete remains a hot issue for researchers to study. In this paper we discuss graph invariants and graph isomorphism techniques.
Keywords- Graph Isomorphism, Subgraph Isomorphism, and Graph Invariant
2014-2017

  PDF
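
The following sketch illustrates the point made above that invariants are only a necessary condition: it compares vertex count, edge count and degree sequence, which can rule out isomorphism but never prove it. The adjacency-dictionary representation is a hypothetical choice for the example.

```python
# Sketch: cheap graph invariants as a pre-check before a full isomorphism test.
def invariants(adj):
    # adj: dict mapping vertex -> set of neighbouring vertices
    degrees = sorted(len(nbrs) for nbrs in adj.values())
    n_edges = sum(degrees) // 2
    return (len(adj), n_edges, tuple(degrees))

def possibly_isomorphic(g1, g2):
    # Equal invariants: isomorphism still undecided; unequal: definitely not isomorphic.
    return invariants(g1) == invariants(g2)

# A triangle and a path on three vertices: same vertex count, different invariants.
triangle = {1: {2, 3}, 2: {1, 3}, 3: {1, 2}}
path = {"a": {"b"}, "b": {"a", "c"}, "c": {"b"}}
print(possibly_isomorphic(triangle, path))   # False
```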

27
Decision Tree approach to predict the Gender wise response on the volume of applications in government organizations for recruitment process
-Sunil Kumar Chhillar, Baljit Singh Khehra
Abstract
In this paper, we propose a practical method to predict the gender-wise response, in terms of the volume of applications, to recruitment processes in government organizations using the Microsoft Decision Trees algorithm, a classification and regression algorithm provided by Microsoft SQL Server Analysis Services (SSAS) for predictive modeling of both discrete and continuous attributes. The Microsoft Decision Trees algorithm builds a data-mining model by creating a series of splits, also called nodes, in the tree. The algorithm adds a node to the model every time an input column is found to be significantly correlated with the predictable column. The sample calculation shows that the proposed method is highly reliable and provides a practical approach and tools for the wide use of data mining to predict results from input data, find relationships among data, or cluster data into previously unrecognized yet similar groups. The earlier the predictions are available, the more useful they are to government, providing valuable information with regard to procedure fixation, improved service delivery, reduced costs, redefined administrative processes, etc.
Keywords- SSAS; Lift Chart, Decision tree
2018-2021



  PDF
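
The paper works with the Microsoft Decision Trees algorithm inside SSAS; as a loose stand-in only, the following sketch fits a scikit-learn decision tree on hypothetical application records and prints its splits. The features, values and labels are invented for illustration.

```python
# Sketch: a decision tree predicting applicant gender from hypothetical attributes.
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical records: [age, qualification level, urban (1) / rural (0)] -> gender
X = [[22, 1, 1], [25, 2, 1], [30, 2, 0], [28, 3, 1], [24, 1, 0], [35, 3, 0]]
y = ["F", "F", "M", "F", "M", "M"]

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["age", "qualification", "urban"]))
print(tree.predict([[26, 2, 1]]))
```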

28
A Review on Broadcasting Schemes for Emergency Messages in VANET
-Ami Dalal, Sumitra Menaria
Abstract
VANET stands for Vehicular Ad hoc Network, an upcoming field and a new technology for intelligent transportation systems. It has tremendous potential to reduce traffic-related problems and to provide safety for drivers and passengers. The most important part of VANET is broadcasting messages to vehicles in order to solve problems such as building shadowing, connection holes and intersection problems. VANETs send emergency messages to vehicles, and in most applications they also need to broadcast messages periodically to other vehicles regarding their positions. These broadcast messages make it easy to track vehicles in case of any anomaly. In this paper we study different schemes for broadcasting emergency messages.
Keywords: VANET, broadcasting messages, ITS, emergency messages
2022-2026

  PDF

29
Deal with attacks over cognitive radio networks authentication
-Saeideh Taheri, Ahmad Sharifi, Reza Berangi
Abstract
Given recent developments, we expect wireless networks to become even more widespread soon. One of the issues we then face is a shortage of spectrum. Cognitive radio networks (CRNs) are an appropriate way to overcome this deficiency, and they also increase the productivity and efficiency of spectrum use. The crucial assumption is that a secondary user must not interfere with the primary user; therefore, when the primary user is not using the spectrum, permission to use it is given to an authorized secondary user. So there must be a secure way of performing CRN authentication, and it must be able to deal with attacks on CRN authentication. In this paper we discuss the classification and definition of attacks, dangers and risks in wireless networks and cognitive radio, and finally we propose a new method for preventing attacks in CRNs based on ID encryption, in an encryption scheme based on two encryption keys.
Keywords: Cognitive radio networks, Authentication, Attacks, ID, Key, Encryption
2027-2032

  PDF

30
A New Method for Forecasting Enrollments based on Fuzzy Time Series with Higher Forecast Accuracy Rate
-Preetika Saxena, Santhosh Easo
Abstract
Time-series models have been used to make predictions in weather forecasting, academic enrollments, etc. Song and Chissom introduced the concept of fuzzy time series in 1993 [11]. Over the past 19 years, many fuzzy time series methods have been proposed for forecasting enrollments. These methods mainly focus on three factors, namely the universe of discourse, the partitioning of the discourse and the defuzzification method. They have used either enrollment numbers, differences of enrollments or percentage change as the universe of discourse, and either frequency-density-based partitioning or ratio-based partitioning to partition the discourse. For defuzzification, these methods have used either Chen's method or Jilani's method. The main issue in forecasting is improving accuracy, and the forecasting accuracy rate of the existing methods is not good enough. This paper proposes a method based on fuzzy time series that gives a higher forecasting accuracy rate than the existing methods. The proposed method uses percentage change as the universe of discourse, mean-based partitioning to partition the discourse, and the centroid method for defuzzification. To illustrate the forecasting process, the historical enrollment of the University of Alabama is used.
Keywords- fuzzy set, fuzzy time series, forecasting, forecast error, time variant model, first order model.
2033-2037


  PDF
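
The following heavily simplified sketch shows one forecasting step in the spirit of the method above: percentage changes as the universe of discourse, equal-width intervals over that universe, and interval midpoints (centroids) for defuzzification. The figures and interval count are illustrative, and the paper's mean-based partitioning and rule handling are not reproduced.

```python
# Sketch: one simplified first-order fuzzy-time-series forecasting step.
def percentage_changes(series):
    return [(b - a) / a * 100.0 for a, b in zip(series, series[1:])]

def build_intervals(changes, n_intervals=4):
    lo, hi = min(changes), max(changes)
    width = (hi - lo) / n_intervals
    return [(lo + i * width, lo + (i + 1) * width) for i in range(n_intervals)]

def fuzzify(value, intervals):
    for i, (lo, hi) in enumerate(intervals):
        if lo <= value <= hi:
            return i
    return len(intervals) - 1

def forecast_next(series, intervals):
    last_change = percentage_changes(series)[-1]
    lo, hi = intervals[fuzzify(last_change, intervals)]
    centroid = (lo + hi) / 2.0            # defuzzified percentage change
    return series[-1] * (1.0 + centroid / 100.0)

enrollments = [13055, 13563, 13867, 14696, 15460, 15311]   # illustrative figures
intervals = build_intervals(percentage_changes(enrollments))
print(round(forecast_next(enrollments, intervals)))
```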

31
An Approach to Proactive Risk Classification
-M.S. Rojabanu, Dr. K. Alagarsamy
Abstract
A narrow approach to risk analysis and to understanding the scope of a software project has contributed to significant software failures. Using expanded risk analysis enlarges the project scope considered by software developers. This paper proposes a new model for proactive risk management based on the Apriori algorithm for generating association rules. The model, the possibilities of building such a model, and its outcome are discussed.
Keywords: Proactive risk Management, Classification, Apriori Algorithm.
2038-2045

  PDF
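
As a rough illustration of Apriori-style rule generation over risk data, the following sketch counts item and pair supports in hypothetical project risk records and prints the rules that clear assumed support and confidence thresholds; it is not the paper's model.

```python
# Sketch: frequent itemsets of size two and simple association rules over risk records.
from itertools import combinations
from collections import Counter

# Each record lists risk factors observed together in one past project (hypothetical).
records = [
    {"unclear_requirements", "schedule_slip"},
    {"unclear_requirements", "schedule_slip", "staff_turnover"},
    {"staff_turnover", "budget_overrun"},
    {"unclear_requirements", "schedule_slip", "budget_overrun"},
]
min_support, min_confidence = 0.5, 0.7

item_counts = Counter(item for r in records for item in r)
pair_counts = Counter(pair for r in records for pair in combinations(sorted(r), 2))
n = len(records)

for pair, count in pair_counts.items():
    support = count / n
    if support < min_support:
        continue
    for a, b in (pair, pair[::-1]):
        confidence = count / item_counts[a]
        if confidence >= min_confidence:
            print(f"{a} -> {b}  support={support:.2f} confidence={confidence:.2f}")
```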

32
Securing Anonymizing Networks from Sybil Attacks
-Arunasri Chamarti, P. Rajasekhar
Abstract
In bio-medicine and other health-related fields a need for anonymous access to Grid resources has been identified. Anonymity networks such as Tor provide users with a means to communicate privately over the Internet without disclosing their IP addresses. The success of such networks, however, has been limited by users employing this anonymity for abusive purposes, such as defacing Wikipedia. In such cases, anonymizing networks should, as intended, disable access for misbehaving users while providing anonymous connections to honest users. But the anonymity provided by such networks prevents website administrators from blacklisting individual malicious users' IP addresses, and as a result administrators block all network nodes. To solve this problem, a system called Nymble was introduced that can blacklist malicious users without knowledge of their IP addresses while allowing honest users to connect anonymously. However, an anonymous authentication scheme may be vulnerable to Sybil attacks if users can obtain new credentials after an issued credential is blacklisted. In this paper, we discuss how Sybil attacks can be prevented in anonymizing networks using a central authority whose role is to ensure that each pseudonym belongs to some valid user.
Index Terms- Anonymizing Networks, Privacy, Pseudonymity, Sybil Attacks, Security.
2046-2052

  PDF

33
Fast and Efficient IP Network Recovery using Multiple Routing Configuration
-Kriti Baranwal, Preeti Gupta, Priti Srivastava
Abstract
The Internet plays a vital role in our communications infrastructure, and the slow convergence of routing protocols after a network failure has become a growing problem. To guarantee fast recovery from link and node failures, we propose a new recovery scheme called Multiple Routing Configurations (MRC). Our proposed scheme guarantees recovery in all single-failure states, using a mechanism that handles both link and node failures without knowing the actual cause of the failure. MRC is strictly connectionless and assumes only hop-by-hop forwarding from source to destination. MRC is based on maintaining additional routing information in the routers, and it allows packet forwarding to continue on an alternative output link immediately after the detection of a failure. In this paper we present MRC and examine its performance with respect to load distribution after a failure. We also show how an estimate of the traffic demands in the network can be used to improve the distribution of recovered traffic, and thus reduce the chance of congestion in the network.
Keywords - Availability, computer network reliability, communication system routing, MRC, protection.
2053-2056


  PDF
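
The following sketch illustrates the core MRC idea of precomputed backup configurations on a hypothetical four-node topology: each backup configuration inflates the weight of one link so routes computed in it avoid that link, and on failure forwarding switches to the configuration that isolates the failed component (real MRC also isolates nodes and is more involved than this).

```python
# Sketch: precomputed backup routing configurations, one per link.
import networkx as nx

def build_backup_configs(edges):
    configs = {}
    for failed in edges:
        g = nx.Graph()
        for (u, v), w in edges.items():
            # Inflate the weight of the link this configuration "isolates".
            g.add_edge(u, v, weight=10_000 if (u, v) == failed else w)
        configs[failed] = g
    return configs

edges = {("A", "B"): 1, ("B", "C"): 1, ("A", "D"): 2, ("D", "C"): 2}
configs = build_backup_configs(edges)

normal = nx.Graph()
for (u, v), w in edges.items():
    normal.add_edge(u, v, weight=w)

# Normal path A -> C uses A-B-C; if link (A, B) fails, forward using its backup config.
print(nx.shortest_path(normal, "A", "C", weight="weight"))               # ['A', 'B', 'C']
print(nx.shortest_path(configs[("A", "B")], "A", "C", weight="weight"))  # ['A', 'D', 'C']
```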

34
Cost effective selection of Data center by Proximity-Based Routing Policy for Service Brokering in Cloud Environment
-Devyaniba Chudasama, Naimisha Trivedi, Richa Sinha
Abstract
Cloud services may be offered as public, private or combined. There is a demand for timely, repeatable and controllable methodologies for evaluating algorithms, applications and policies before actual development of cloud products. CloudAnalyst is a tool that helps developers simulate large-scale cloud applications with the purpose of understanding their performance under various deployment configurations. CloudAnalyst deploys different service brokering policies depending on the requirements of the cloud application. The service proximity based routing policy used in the simulator selects a data center in the closest region to route user requests. When several data centers lie in the same region, the policy selects one at random without considering cost-effectiveness or other parameters. We propose an enhanced proximity-based routing policy that selects a cost-effective data center.
Keywords - Cloud Computing, Simulation, CloudSim, CloudAnalyst, Internet Cloudlet, Region, Service Broker
2057-2059


  PDF
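
A minimal sketch of the enhanced broker policy described above: filter data centers to the user's nearest region, then break ties on cost instead of picking at random. The data-center names, regions and prices are hypothetical.

```python
# Sketch: cost-aware data-center selection among same-region candidates.
def select_datacenter(user_region, datacenters, region_distance):
    # Keep only data centers in the region nearest to the user...
    nearest = min(datacenters, key=lambda dc: region_distance[(user_region, dc["region"])])
    candidates = [dc for dc in datacenters
                  if region_distance[(user_region, dc["region"])] ==
                     region_distance[(user_region, nearest["region"])]]
    # ...then pick the cheapest one instead of choosing randomly.
    return min(candidates, key=lambda dc: dc["cost_per_hour"])

datacenters = [
    {"name": "DC1", "region": 0, "cost_per_hour": 0.12},
    {"name": "DC2", "region": 0, "cost_per_hour": 0.09},
    {"name": "DC3", "region": 2, "cost_per_hour": 0.05},
]
region_distance = {(0, 0): 0, (0, 2): 100}
print(select_datacenter(0, datacenters, region_distance)["name"])   # DC2
```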

35
An Automated Neuro Model for Software Effort Estimation and RMMI using Competitive Learning
-M. Senthil Kumar, Dr. B. Chidhambara Rajan
Abstract
This paper addresses a neural model for software effort estimation and risk prediction using a proposed automated neuro tool. The model carries some of the desirable features of neural approaches, such as learning ability, and interoperates well with use case models of software estimation. It can train the system to calculate use case points without using automated use case tools or XML input files. Approach: The paper compares the proposed work with existing work on effort estimation and proposes a neural network training approach for the effort estimation process with good judgments between use case types. Results: Knowledge acquired by the network from the environment through a learning process is given as the input. Conclusion: Furthermore, this method can help provide a path for Risk Mitigation and Monitoring (RMMI) measures and reduce the development cost of software projects.
Keywords: Effort Estimation, Neural Networks, RMMI, XML files.
2060-2065


  PDF
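
As background for the use case points the model above is trained to compute, the following sketch performs a conventional Use Case Points (UCP) calculation with the standard use-case and actor weights; the counts, adjustment factors and productivity assumption are hypothetical, and the paper's neural training is not reproduced.

```python
# Sketch: a worked Use Case Points (UCP) effort estimate.
use_cases = {"simple": 3, "average": 4, "complex": 2}     # UUCW weights: 5, 10, 15
actors = {"simple": 2, "average": 1, "complex": 3}        # UAW weights: 1, 2, 3

uucw = 5 * use_cases["simple"] + 10 * use_cases["average"] + 15 * use_cases["complex"]
uaw = 1 * actors["simple"] + 2 * actors["average"] + 3 * actors["complex"]

tcf, ecf = 0.95, 1.02                                     # technical / environmental factors
ucp = (uucw + uaw) * tcf * ecf
effort_hours = ucp * 20                                   # 20 person-hours per UCP (assumed)

print(f"UUCW={uucw} UAW={uaw} UCP={ucp:.1f} effort={effort_hours:.0f} person-hours")
```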

36
A Novel Technique of Replicate Data Detection for Best Precision in Cloud-Based Computing
-Pritaj Yadav, Prof. Alka Gulati, Prof. Vineet Richharia
Abstract
We propose a highly efficient and scalable duplicate-search technique based on a hash algorithm. Cloud-based computing is an emerging practice that offers significantly more infrastructure and financial flexibility than traditional computing models, while requiring very low computational and memory cost. Larger enterprises may have implemented very strong security approaches that may or may not be equaled by cloud providers, but one should not simply assume that security is a problem: look for the type of security functionality you would expect in an in-house solution. Documents may get mirrored to avoid delays or to provide fault tolerance. Our RDDA algorithm for detecting replicate documents is critical in applications where data is obtained from multiple sources. The removal of replicate documents is necessary not only to reduce run time but also to improve search accuracy. Today, search engine crawlers retrieve billions of unique URLs, of which hundreds of millions are replicates of some form. This function rapidly compares large numbers of files for identical content by computing the SHA-256 hash of each file and detecting replicates. The probability of two non-identical files having the same hash, even in a hypothetical directory containing millions of files, is exceedingly remote. By efficiently presenting only unique documents, user satisfaction is likely to increase.
Keywords- unique documents, detecting replicate, replication, search engine, SHA.
2066-2072


  PDF
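
A minimal sketch of hash-based replicate detection in the spirit of the RDDA approach described above: files are grouped by SHA-256 digest, and any digest shared by more than one file marks a set of byte-identical documents. The directory path is a placeholder.

```python
# Sketch: grouping files by SHA-256 digest to find identical replicates.
import hashlib
from pathlib import Path
from collections import defaultdict

def sha256_of(path, chunk_size=1 << 20):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def find_replicates(root):
    groups = defaultdict(list)
    for path in Path(root).rglob("*"):
        if path.is_file():
            groups[sha256_of(path)].append(path)
    # Any digest with more than one file is a set of byte-identical replicates.
    return {digest: paths for digest, paths in groups.items() if len(paths) > 1}

if __name__ == "__main__":
    for digest, paths in find_replicates("./documents").items():
        print(digest[:12], [str(p) for p in paths])
```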

37
ALIX Route optimization Mechanism in MANET’s For AODV
-M. Karthik, K. Venkateswara Rao
Abstract
As routing is the basis of networking, the concept of networking is interdependent with the routing methodology. Among reactive routing protocols, AODV plays an active role in obtaining routes when needed, and for ad hoc networks such as MANETs it is considered a highly prioritized protocol. In this paper a small but advantageous extension to the AODV routing protocol improves how nodes are connected. The new protocol, AODV plus shrinking, is named ALIX (AODV Link Extension). It evaluates shortcuts in the route and avoids two major problems encountered previously: (1) sub-optimal routes and (2) connection breaks. ALIX overcomes these two problems by performing continuous route optimization under node mobility, while keeping path lengths topologically efficient. ALIX uses one additional control packet to avoid unnecessary hop traversal.
Keywords: Ad-hoc, AODV, ALIX
2073-2076

  PDF

38
Performance Analysis of Shortest Path Energy Efficient Routing Protocol in Wireless Sensor Networks Using NSRS Algorithm
-Indumathi.A, Vinoba.V, Padmavathy.T.V
Abstract
Wireless sensor networking is an up-and-coming technology that shows great promise for various innovative applications, both public and military. Wireless Sensor Networks (WSNs) are ad hoc wireless networks typically consisting of hundreds or even thousands of small low-cost sensor nodes that communicate wirelessly. A sensor node is a small independent unit, often running on batteries, with hardware to sense environmental characteristics, a processor and a radio transceiver. All nodes send their sensed data to a central gateway node. Existing network protocols are not suitable for the WSN setting, since they often require a lot of information exchange; therefore, suitable WSN protocols are required, and their correctness and robustness are essential. However, only few techniques are available to support the design of such protocols. We propose a novel energy-efficient routing protocol using a quadratic assignment technique. Our protocol achieves energy efficiency by finding an optimal path to transfer data from source to destination. In the proposed routing protocol, sensor nodes identify their neighboring nodes and, by first sending a group of identities to the intermediate nodes, a selected path is established up to the first actor node, after which subsequent transmissions are carried out only through that path.
Keywords— Battery power, Energy Cost, Lifetime, Routing, Wireless Sensor Network, Shortest path
2077-2083


  PDF
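
As a rough illustration of energy-aware path selection (not the paper's NSRS/quadratic-assignment procedure), the following sketch treats per-link transmission energy as the edge weight and picks the minimum-energy route with a standard shortest-path computation; the topology and costs are hypothetical.

```python
# Sketch: choosing the minimum-energy route from source to sink.
import networkx as nx

g = nx.Graph()
# Edge weights stand in for the transmission energy needed on each link.
g.add_weighted_edges_from([
    ("source", "n1", 2.0), ("n1", "sink", 2.5),
    ("source", "n2", 1.0), ("n2", "n3", 1.2), ("n3", "sink", 1.1),
])

path = nx.shortest_path(g, "source", "sink", weight="weight")
energy = nx.shortest_path_length(g, "source", "sink", weight="weight")
print(path, f"total energy = {energy}")
```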

 
 
IJCTA © Copyright 2010 | All Rights Reserved.

This work is licensed under a Creative Commons Attribution 2.5 India License.