NCSCN 2015 - National Conference on Simulations of Computing Nexus
"NCSCN 2015 Conference Papers "
A Study on Dependency Optimization using Machine-Learning Approach for Test Case Prioritization[Full-Text ]
Sathya C, Karthika C
The main goal of this paper is test case prioritization, the process of ordering test cases. This ordering increases the rate of fault detection. Test case prioritization improves the fault-fixing process and thus leads to earlier delivery of the software. Because of the functional dependencies between requirements, the assumption that test cases can be executed in any order does not hold. In this paper, we present different techniques that provide information about the various ways of prioritizing test cases using the dependencies between them. The dependencies among test cases are based mainly on the interactions between requirements, or between the various modules and functions of the whole system. Ordering test cases by functional dependencies is likely to detect faults earlier than other fault detection approaches; this is shown through empirical evaluations on six industrial systems. We also propose a new machine learning technique in which the case-based paradigm is combined with Analytical Hierarchy Processing, and which performs better than other techniques proposed to date.
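As a minimal illustration of dependency-aware prioritization (a sketch, not the paper's algorithm), the following orders test cases so that prerequisites run first and ties are broken by a priority weight, such as one an AHP comparison step could supply; the test names, dependencies and weights are hypothetical.

```python
# Illustrative sketch: order test cases so dependency constraints are
# respected, breaking ties with a priority weight (higher runs first).
import heapq

def prioritize(test_cases, depends_on, weight):
    """test_cases: iterable of test ids.
    depends_on: dict mapping a test id to the set of tests it depends on.
    weight: dict mapping a test id to a priority score."""
    remaining = {t: set(depends_on.get(t, ())) for t in test_cases}
    ready = [(-weight.get(t, 0.0), t) for t, deps in remaining.items() if not deps]
    heapq.heapify(ready)
    order = []
    while ready:
        _, t = heapq.heappop(ready)
        order.append(t)
        for u, deps in remaining.items():
            if t in deps:
                deps.remove(t)
                if not deps:
                    heapq.heappush(ready, (-weight.get(u, 0.0), u))
    return order

# Hypothetical example: T3 depends on T1, weights could come from AHP.
print(prioritize(["T1", "T2", "T3"], {"T3": {"T1"}}, {"T1": 0.5, "T2": 0.9, "T3": 0.7}))
# -> ['T2', 'T1', 'T3']
```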
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
A Study on Enhancing Source Code Quality by Using SIG Approach[Full-Text ]
G.Aswini, Mr.D.Gautham Chakravarthy, Dr.S.Gunasekaran
This paper presents a study of three models used to improve the maintainability of a software product. The Software Improvement Group (SIG) model computes a maintainability index, a single number that expresses the maintainability of the system. The Analytical Hierarchy Process model is used to extract the data and metrics and assign relative weights, and a clustering technique is used to cluster the derived ISO/IEC 9126 maintainability values. A two-dimensional model relates maintenance activities to system properties; its activities and facts represent the Factor-Criteria-Metrics approach.
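A minimal sketch of the weighted-aggregation step such a model implies (assumed, not taken from the paper): per-property ratings are combined into a single index using AHP-style relative weights. The property names, ratings and weights below are hypothetical.

```python
# Illustrative sketch: combine per-property maintainability ratings into a
# single index using relative weights, e.g. weights from an AHP comparison.
def maintainability_index(ratings, weights):
    """ratings: dict of system property -> rating on a 1..5 scale.
    weights: dict of system property -> relative weight (ideally summing to 1)."""
    total_weight = sum(weights.values())
    return sum(ratings[p] * weights[p] for p in ratings) / total_weight

ratings = {"volume": 4, "duplication": 3, "unit_complexity": 2, "unit_size": 4}
weights = {"volume": 0.2, "duplication": 0.3, "unit_complexity": 0.3, "unit_size": 0.2}
print(round(maintainability_index(ratings, weights), 2))  # -> 3.1
```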
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
A Survey on Big Data Processing in Large Scale Computing System[Full-Text ]
N.Boopal, S.Gunasekaran, S.Karthiban
Big data refers to data that cannot be handled by traditional systems because its volume is too large. Data analysis works over huge data sets stored on the cluster nodes of a cloud system, and the data can be processed while it is being analysed. MapReduce is the key to accessing the big data environment with good scalability. Server cost becomes a large fraction of the total cost of data analysis. Heterogeneous workloads are a major problem in data centres; hybrid cloud architectures combine HDFS with a parallel database for processing and indexing, and PSO-based schedulers are constructed to distribute parallel data processing workloads across the nodes. In this paper, we introduce a data classification mechanism for task partitioning, a system that efficiently processes complex data analysis tasks by improving the MapReduce runtime framework on large clusters.
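For readers unfamiliar with the MapReduce pattern the survey builds on, here is a minimal in-memory sketch (illustrative only; a real deployment would run over HDFS on a cluster, and the word-count task is just a stand-in analysis).

```python
# Minimal in-memory sketch of the MapReduce pattern: map, shuffle, reduce.
from collections import defaultdict

def map_phase(record):
    # Emit (key, 1) pairs; the "analysis task" here is a simple word count.
    for word in record.split():
        yield word, 1

def reduce_phase(key, values):
    return key, sum(values)

def run_mapreduce(records):
    groups = defaultdict(list)            # shuffle: group intermediate pairs by key
    for record in records:
        for key, value in map_phase(record):
            groups[key].append(value)
    return dict(reduce_phase(k, v) for k, v in groups.items())

print(run_mapreduce(["big data big clusters", "data clusters"]))
# -> {'big': 2, 'data': 2, 'clusters': 2}
```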
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
A Survey on VANET based Secure and Privacy Preserving Navigation[Full-Text ]
S.Kathirvel, D.Gautham Chakravarthy, Dr.S.Gunasekaran
A VANET is composed of vehicles and roadside infrastructure units (RSUs). Vehicles are equipped with wireless communication devices called On-Board Units (OBUs), which enable them to exchange traffic-related information with each other and with RSUs. At the same time, VANETs raise many security and privacy concerns: malicious users can take advantage of a VANET and disturb the whole system. Previous researchers have introduced many techniques and methods in which road information is collected to provide a navigation service to drivers. Based on the driver's destination and current location, the system automatically searches, in a distributed manner, for a route that yields the minimum travelling delay using online information about road conditions. This survey discusses various VANET security techniques and algorithms to detect and prevent malicious users on the roadways.
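A minimal sketch of the delay-minimizing route search described above (not the paper's distributed, privacy-preserving protocol): Dijkstra's algorithm over a road graph whose edge weights are the current delay estimates. The graph and delay values are hypothetical.

```python
# Illustrative sketch: choose the route with minimum travelling delay given
# online per-road-segment delay estimates, using Dijkstra's algorithm.
import heapq

def min_delay_route(delays, source, destination):
    """delays: dict mapping node -> {neighbour: current delay in seconds}."""
    best = {source: 0}
    previous = {}
    queue = [(0, source)]
    while queue:
        d, node = heapq.heappop(queue)
        if node == destination:
            break
        if d > best.get(node, float("inf")):
            continue
        for neighbour, delay in delays.get(node, {}).items():
            nd = d + delay
            if nd < best.get(neighbour, float("inf")):
                best[neighbour] = nd
                previous[neighbour] = node
                heapq.heappush(queue, (nd, neighbour))
    route, node = [destination], destination      # rebuild route back to source
    while node != source:
        node = previous[node]
        route.append(node)
    return list(reversed(route)), best[destination]

roads = {"A": {"B": 30, "C": 10}, "C": {"B": 5}, "B": {"D": 20}}
print(min_delay_route(roads, "A", "D"))  # -> (['A', 'C', 'B', 'D'], 35)
```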
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Study on Improving Efficiency of Software by Detecting and Correcting Code Smells[Full-Text ]
Suganya D, Kathiresan V, Gunasekaran S
Code smells denote poor implementation practices. Their presence makes source code maintenance tedious and increases fault- and change-proneness. Code smells also result from poor design solutions called anti-patterns, and they challenge software engineers to make changes without hindering the software and its evolution. Hence, this survey focuses on various methods and techniques for improving the efficiency and functioning of software. Code smells are defects in the coding or design of software that do not stop it from functioning but gradually slow down its efficiency. They have a serious impact on maintenance: the structural characteristics of the software indicate a code or design problem that makes it hard to evolve and maintain, which triggers refactoring of the code. Code smells are suboptimal design choices that degrade different aspects of code quality and indicate deeper design problems, causing difficulties in the evolution of a software product. Not all of them are equally problematic, and some may not be problematic at all in certain contexts.
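As one concrete illustration of metric-based smell detection (an assumed example, not a technique from the surveyed papers), the sketch below flags the classic "long method" smell using a line-count threshold; the threshold and the analysed file name are hypothetical.

```python
# Illustrative metric-threshold detector for the "long method" code smell.
import ast

LONG_METHOD_LINES = 30   # assumed threshold

def find_long_methods(source_code):
    """Return (name, length) for functions longer than the threshold."""
    smells = []
    tree = ast.parse(source_code)
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef):
            length = node.body[-1].end_lineno - node.lineno + 1
            if length > LONG_METHOD_LINES:
                smells.append((node.name, length))
    return smells

with open("example_module.py") as f:      # hypothetical file under analysis
    for name, length in find_long_methods(f.read()):
        print(f"Long method smell: {name} spans {length} lines")
```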
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Survey on Fault Tolerance and Residual Software Fault of the System by Using Fault Injection[Full-Text ]
Harunya B, Deepa N R, Dr.Gunasekaran S
Fault injection is used to characterize failures and to validate and compare fault-tolerance mechanisms. A systematic and quantitative approach uses fault injection to guide the design and implementation of fault-tolerant systems. Two experimental approaches were used to analyse software faults by fault injection. In the first experiment, a set of faults is injected with a SWIFI tool to evaluate the accuracy of the injected faults. In the second experiment, fault triggers and fault types are used to study the impact of faults on the target system and program results. Representative injected faults are used to reproduce the software faults observed in field data without spending excess time per fault. The MESSALINE tool demonstrates the application of general pin-level fault injection; it has been used to validate interlocking systems for railway control applications and dependable communication systems. The NFTAPE tool can be used to conduct automated fault injection campaigns and provides lightweight fault injectors; triggers, monitors and other fault injector components address this problem. The G-SWFIT approach can be used to remove faults and filter results, thereby improving the fault representativeness of the system.
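A toy software-implemented fault injection (SWIFI) sketch, assumed for illustration and not one of the tools named above: a random bit flip is injected into a data word and a simple checksum plays the role of the fault-tolerance mechanism whose detection coverage is being exercised.

```python
# Minimal SWIFI-style sketch: flip a random bit in a data word before it is
# consumed, then check whether a checksum detects the corruption.
import random

def inject_bit_flip(word, width=32):
    return word ^ (1 << random.randrange(width))

def checksum(words):
    return sum(words) & 0xFFFFFFFF

original = [10, 20, 30, 40]
expected = checksum(original)

corrupted = list(original)
index = random.randrange(len(corrupted))
corrupted[index] = inject_bit_flip(corrupted[index])

detected = checksum(corrupted) != expected
print("fault injected at index", index, "- detected:", detected)
```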
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Survey on Infrequent Itemset Mining Using Frequent Pattern Growth[Full-Text ]
S.Nandhini, Mr.M.Yogesh Prabhu, Dr.S.Gunasekaran
Itemset mining is an active area of research owing to its successful application in various data mining scenarios, such as finding association rules. There are two types of itemset mining: frequent itemset mining and infrequent itemset mining. The research community has focused on the infrequent weighted itemset mining problem. Infrequent weighted itemsets are itemsets whose frequency of occurrence in the analysed data is less than or equal to a maximum threshold. Two algorithms for finding rare itemsets are reviewed, Infrequent Weighted Itemset (IWI) mining and Minimal Infrequent Weighted Itemset (MIWI) mining, both based on the frequent pattern-growth paradigm. Finally, the performance of the algorithms is analysed in terms of execution time.
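A brute-force sketch of the IWI idea (the reviewed algorithms use an FP-growth-style tree rather than enumeration): the weighted support of each candidate itemset is computed and itemsets at or below the maximum threshold are reported. The minimum-weight support definition, the transactions and the threshold are assumptions for the example.

```python
# Illustrative brute-force sketch of infrequent weighted itemset mining.
from itertools import combinations

def iwi_support(itemset, transactions):
    """Weighted support: sum, over supporting transactions, of the minimum
    item weight in that transaction (one common IWI-support variant)."""
    support = 0.0
    for weighted_items in transactions:          # each transaction: {item: weight}
        if set(itemset) <= set(weighted_items):
            support += min(weighted_items[i] for i in itemset)
    return support

def mine_iwi(transactions, max_threshold, max_size=3):
    items = sorted({i for t in transactions for i in t})
    found = []
    for size in range(1, max_size + 1):
        for itemset in combinations(items, size):
            if iwi_support(itemset, transactions) <= max_threshold:
                found.append(itemset)
    return found

transactions = [{"a": 0.9, "b": 0.2}, {"a": 0.8, "c": 0.4}, {"b": 0.3, "c": 0.5}]
print(mine_iwi(transactions, max_threshold=0.5))
# -> [('b',), ('a', 'b'), ('a', 'c'), ('b', 'c'), ('a', 'b', 'c')]
```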
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Survey on Privacy Oriented Web Service Composition[Full-Text ]
Sandhiya R, Joe Dhanith P R, Dr Gunasekaran S
In a web service environment, a privacy entity is used to determine when to release private information, depending on the service providers. A query rewriting approach is used for querying and automatically composing DP services using RDF views. A service-oriented data mashup application integrates data from multiple data providers; this can reveal sensitive information to other data providers while creating flexible, dynamic business process applications. An attack model is introduced to analyse social information from query logs in DaaS, so that user information from the trusted client can be encrypted. Data in its original form contains sensitive information about individuals, which violates privacy in the Privacy-Preserving Data Publishing (PPDP) setting.
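A hypothetical sketch of the kind of release decision such a privacy entity makes: before a composed service receives a data provider's attributes, only those permitted for the invoking purpose are released. The policy, attributes and purposes below are invented for illustration.

```python
# Illustrative sketch: filter a user record against a purpose-based privacy
# policy before it flows to the next service in a composition.
PRIVACY_POLICY = {                 # assumed policy: attribute -> allowed purposes
    "name": {"billing", "shipping"},
    "address": {"shipping"},
    "medical_history": set(),      # never released to composed services
}

def filter_for_release(record, purpose):
    """Return the subset of the record that may flow to a provider
    invoking the service for the given purpose."""
    return {attr: value for attr, value in record.items()
            if purpose in PRIVACY_POLICY.get(attr, set())}

record = {"name": "Alice", "address": "12 Main St", "medical_history": "..."}
print(filter_for_release(record, "shipping"))
# -> {'name': 'Alice', 'address': '12 Main St'}
```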
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------