International Journal of Information Technology and Computer Science (IJITCS)

IJITCS Vol. 6, No. 12, Nov. 2014

Cover page and Table of Contents: PDF (size: 269KB)

Table of Contents

REGULAR PAPERS

A Security Scheme against Wormhole Attack in MAC Layer for Delay Sensitive Wireless Sensor Networks

By Louazani Ahmed, Sekhri Larbi, Kechar Bouabdellah

DOI: https://doi.org/10.5815/ijitcs.2014.12.01, Pub. Date: 8 Nov. 2014

The main objective of this paper is to secure a cross-layer, energy-efficient MAC protocol (CL-MAC) dedicated to delay-sensitive wireless sensor networks (WSN) by detecting and avoiding the wormhole attack. The CL-MAC protocol is the result of our previous research work, in which security aspects were not taken into consideration at the design stage. To formally establish the importance of the proposed scheme, we provide a theoretical study based on Time Petri Nets to analyse important properties related to the devastating effect of the wormhole attack, and of its countermeasure, on CL-MAC operations. We also perform an experimental evaluation through simulation of realistic scenarios to show the performance of the proposal in terms of energy saving, packet loss ratio and latency. The results obtained indicate the usefulness of the formal study when applied in a security context and clearly confirm the good performance of the proposed scheme against the wormhole attack.
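
A minimal sketch of one generic wormhole symptom checked in the literature (not the paper's scheme, which is integrated into CL-MAC and verified with Time Petri Nets): a wormhole tunnels packets between distant nodes, so two claimed one-hop neighbours separated by more than the radio range are suspicious. The positions, range and topology below are illustrative assumptions.

```python
# Generic wormhole plausibility check (illustrative only; the paper's
# Petri-net-verified countermeasure is not modelled here).
import math

RADIO_RANGE = 30.0  # metres (assumed)

positions = {"a": (0.0, 0.0), "b": (10.0, 5.0), "c": (200.0, 150.0)}
neighbour_claims = [("a", "b"), ("a", "c")]  # ("a", "c") is the tunnelled link

def suspected_wormhole(u, v):
    # Neighbours farther apart than the radio range cannot hear each other
    # directly, so the link is likely relayed through a wormhole tunnel.
    return math.dist(positions[u], positions[v]) > RADIO_RANGE

for u, v in neighbour_claims:
    print(u, v, "wormhole suspected:", suspected_wormhole(u, v))
```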

Performance of High-Altitude Platforms Cellular Communications using Hamming-Tapered Concentric Circular Arrays

By Fahad Alraddady

DOI: https://doi.org/10.5815/ijitcs.2014.12.02, Pub. Date: 8 Nov. 2014

In this paper, the performance of cellular communications based on the ambitious technology of High-Altitude Platforms (HAPs) is discussed when using tapered concentric circular arrays. The coverage cell is described and designed with an efficient beamforming technique in which the Hamming window is proposed as a tapering function and applied to uniform concentric circular arrays (UCCA) for sidelobe reduction. Based on a set of mapping curves, the parameters of this tapering window are optimized to obtain the lowest possible sidelobe level, which can be 45 dB below the main lobe. The optimum weights of the Hamming window are found to be a function of the number of elements in the innermost ring and the number of rings in the array. The cell power pattern is also discussed: the out-of-cell radiation is greatly reduced, which in turn reduces the co-channel interference and improves the Carrier-to-Interference Ratio (CIR).
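
As a rough illustration of ring-level tapering (the paper's optimized weights additionally depend on the innermost-ring element count and are derived from the mapping curves, not reproduced here), the sketch below assigns each ring of a UCCA a weight from the descending half of a standard Hamming window.

```python
# Ring weights from the descending half of a Hamming window: the innermost
# ring keeps full weight and the outermost is attenuated to 0.08, trading a
# slightly wider main lobe for lower sidelobes.
import math

def hamming_ring_weights(n_rings):
    if n_rings == 1:
        return [1.0]
    return [0.54 + 0.46 * math.cos(math.pi * m / (n_rings - 1))
            for m in range(n_rings)]

# 10 rings, indexed from innermost (weight 1.0) to outermost (weight 0.08).
print([round(w, 3) for w in hamming_ring_weights(10)])
```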

Polynomial Differential-Based Information-Theoretically Secure Verifiable Secret Sharing

By Qassim Al Mahmoud

DOI: https://doi.org/10.5815/ijitcs.2014.12.03, Pub. Date: 8 Nov. 2014

In Pedersen’s VSS scheme, the secret is embedded in the commitments, and the polynomial used is of degree at most (t-1). In the strong (t, n) VSS scheme based on Pedersen’s, the polynomial used for verification is public; this public verification polynomial is not secure, and the secret remains secure only as long as the dealer cannot solve the discrete logarithm problem. Our proposed scheme satisfies the security requirements of strong t-consistency and addresses the security of the verification polynomial. We show that, in the share verification algorithm, the participants can verify that their shares are consistent and that the dealer is honest (i.e., the dealer cannot succeed in distributing incorrect shares even if he can solve the discrete logarithm problem) before the secret reconstruction algorithm starts. The security strength of the proposed scheme lies in the fact that the shares and all broadcast information convey no information about the secret.
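
For context, a minimal sketch of the Pedersen-style commit-and-verify step that the paper builds on: shares are checked against public commitments E_j = g^{a_j} h^{b_j} (mod p). The toy group parameters and generators below are illustrative assumptions, not the paper's choices.

```python
# Toy Pedersen-style VSS. Real deployments need a large prime and
# generators g, h with log_g(h) unknown; these tiny values are for clarity.
import random

p, q = 2039, 1019          # p = 2q + 1, both prime
g, h = 4, 9                # squares mod p, hence of order q (illustrative)

def poly_eval(coeffs, x):
    return sum(c * pow(x, j, q) for j, c in enumerate(coeffs)) % q

def deal(secret, t, n):
    f = [secret % q] + [random.randrange(q) for _ in range(t - 1)]  # f(0)=secret
    b = [random.randrange(q) for _ in range(t)]                     # blinding poly
    commits = [pow(g, aj, p) * pow(h, bj, p) % p for aj, bj in zip(f, b)]
    shares = [(i, poly_eval(f, i), poly_eval(b, i)) for i in range(1, n + 1)]
    return shares, commits

def verify(i, s_i, b_i, commits):
    # g^s_i * h^b_i must equal the product of E_j raised to i^j.
    lhs = pow(g, s_i, p) * pow(h, b_i, p) % p
    rhs = 1
    for j, E in enumerate(commits):
        rhs = rhs * pow(E, pow(i, j, q), p) % p
    return lhs == rhs

shares, commits = deal(secret=123, t=3, n=5)
assert all(verify(*s, commits) for s in shares)
```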

MAROR: Multi-Level Abstraction of Association Rule Using Ontology and Rule Schema

By Salim Khiat, Hafida Belbachir, Sid Ahmed Rahal

DOI: https://doi.org/10.5815/ijitcs.2014.12.04, Pub. Date: 8 Nov. 2014

Many large organizations have multiple databases distributed over different branches, and the number of such organizations is increasing over time. It is therefore necessary to study data mining over multiple databases. Most multi-database mining (MDBM) algorithms for association rules represent input patterns at a single level of abstraction. However, in many applications of association rules, e.g., industrial discovery, users need to explore a data set at multiple levels of abstraction and from different points of view. Each point of view corresponds to a set of belief (and representational) commitments regarding the domain of interest. Using domain ontologies, we strengthen the integration of user knowledge into the mining and post-processing tasks. Furthermore, an interactive and iterative framework is designed to assist the user throughout the analysis task at different levels. This paper formalizes the problem of mining association rules with ontologies over multiple databases, describes an ontology-driven association rule algorithm that discovers rules at multiple levels of abstraction, and presents preliminary results from the petroleum field to demonstrate the feasibility and applicability of the proposed approach.
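
A toy sketch of the core idea, mining at two abstraction levels by lifting items through an is-a hierarchy; the ontology fragment, transactions and threshold are invented for illustration, and MAROR itself additionally uses rule schemas and user interaction.

```python
# Patterns invisible at the leaf level can become frequent once items are
# abstracted to their parent concepts in the ontology.
from collections import Counter
from itertools import combinations

IS_A = {"diesel": "fuel", "gasoline": "fuel",        # hypothetical fragment
        "drill_bit": "equipment", "pump": "equipment"}

transactions = [
    {"diesel", "drill_bit"}, {"gasoline", "pump"},
    {"diesel", "pump"}, {"gasoline", "drill_bit"},
]

def abstract(tx, level):
    """Level 0 keeps leaf items; level 1 lifts each item to its parent."""
    return frozenset(IS_A[i] if level == 1 else i for i in tx)

def frequent_pairs(txs, min_support):
    counts = Counter(p for tx in txs for p in combinations(sorted(tx), 2))
    return {p: c / len(txs) for p, c in counts.items() if c / len(txs) >= min_support}

print(frequent_pairs([abstract(t, 0) for t in transactions], 0.5))  # {}
print(frequent_pairs([abstract(t, 1) for t in transactions], 0.5))  # {('equipment', 'fuel'): 1.0}
```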

An Integrated Approach to Derive Ontological Structures from Folksonomies

By Zahia Marouf, Sidi Mohamed Benslimane

DOI: https://doi.org/10.5815/ijitcs.2014.12.05, Pub. Date: 8 Nov. 2014

Web 2.0 is an evolution toward a more social, interactive and collaborative web, where the user is at the center of the service in terms of publications and reactions; this transforms the user from the old status of consumer to a new one of producer. Folksonomies are one of the Web 2.0 technologies that permit users to annotate resources on the Web, by allowing them to apply any keyword or tag that they find relevant. Although folksonomies lack a context-independent and inter-subjective definition of meaning, many researchers have proven the existence of an implicit semantics in these unstructured data. In this paper, we propose an improvement of our previous approach to extracting ontological structures from folksonomies. The major contributions of this paper are a Normalized Co-occurrences in Distinct Users (NCDU) similarity measure and a new algorithm to define the context of tags and detect ambiguous ones. We compared our similarity measure to a widely used method for identifying similar tags based on the cosine measure. We also compared the new algorithm with the fuzzy c-means (FCM) clustering algorithm used in our original approach. The evaluation shows promising results and emphasizes the advantage of our approach.
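
The cosine baseline mentioned in the abstract can be sketched as follows; the toy folksonomy is invented, and the NCDU measure itself is defined in the paper, not reproduced here.

```python
# Cosine similarity between tag co-occurrence vectors: two tags are similar
# when they co-occur with the same other tags in user posts.
import math
from collections import defaultdict

# Toy folksonomy: (user, tag, resource) triples.
posts = [("u1", "python", "r1"), ("u1", "code", "r1"),
         ("u2", "python", "r2"), ("u2", "snake", "r2"),
         ("u3", "python", "r3"), ("u3", "code", "r3")]

def cooccurrence(posts):
    """cooc[t1][t2] = number of (user, resource) posts tagged with both."""
    by_post = defaultdict(set)
    cooc = defaultdict(lambda: defaultdict(int))
    for user, tag, res in posts:
        by_post[(user, res)].add(tag)
    for tags in by_post.values():
        for t1 in tags:
            for t2 in tags:
                if t1 != t2:
                    cooc[t1][t2] += 1
    return cooc

def cosine(cooc, t1, t2):
    dims = set(cooc[t1]) | set(cooc[t2])
    dot = sum(cooc[t1][d] * cooc[t2][d] for d in dims)
    n1 = math.sqrt(sum(v * v for v in cooc[t1].values()))
    n2 = math.sqrt(sum(v * v for v in cooc[t2].values()))
    return dot / (n1 * n2) if n1 and n2 else 0.0

cooc = cooccurrence(posts)
print(cosine(cooc, "code", "snake"))  # 1.0: both co-occur only with "python"
```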

Dynamic Vertical Handoff Algorithm Using Media Independent Handover Service for Heterogeneous Network

By Payaswini P., Manjaiah D.H.

DOI: https://doi.org/10.5815/ijitcs.2014.12.06, Pub. Date: 8 Nov. 2014

Media Independent Handover (MIH) is a standard proposed by the IEEE 802.21 working group to facilitate vertical handover between heterogeneous networks. Currently, the implementation of the IEEE 802.21 standard in ns-2, provided by the National Institute of Standards and Technology (NIST), considers only signal strength when determining the destination network. Selecting a destination network using the received signal strength (RSS) as the sole indicator does not meet the needs of all users; for a more accurate choice, the vertical handoff decision should consider several criteria of both the network and the mobile node. In this paper, we propose an improved handoff decision module with additional parameters that takes the state of the mobile node and the network conditions into account during the handoff decision in order to improve performance. The simulation results show that the proposed method achieves better performance in terms of throughput, handoff latency and packet drop than the basic handoff scheme. Moreover, the proposed method reduces unnecessary handoffs by eliminating the ping-pong effect and thus increases overall system performance.
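
A sketch of the kind of multi-criteria scoring the paper argues for, with a hysteresis guard against ping-pong handoffs; the criteria, weights and candidate values are illustrative assumptions, not the paper's actual decision module.

```python
# Multi-criteria handoff score: weighted sum of normalized criteria in
# [0, 1], where cost-type criteria enter inverted (lower is better).
WEIGHTS = {"rss": 0.4, "bandwidth": 0.3, "cost": 0.2, "battery_drain": 0.1}
BENEFITS = {"rss", "bandwidth"}  # higher is better; the rest are costs

def score(net):
    return sum(w * (net[c] if c in BENEFITS else 1.0 - net[c])
               for c, w in WEIGHTS.items())

def should_handoff(current, candidate, hysteresis=0.05):
    """Hand off only when the gain exceeds a margin, avoiding ping-pong."""
    return score(candidate) > score(current) + hysteresis

wlan = {"rss": 0.8, "bandwidth": 0.9, "cost": 0.2, "battery_drain": 0.6}
umts = {"rss": 0.6, "bandwidth": 0.4, "cost": 0.5, "battery_drain": 0.3}
print(should_handoff(current=umts, candidate=wlan))  # True
```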

Measuring Complexity, Development Time and Understandability of a Program: A Cognitive Approach

By Amit Kumar Jakhar, Kumar Rajnish

DOI: https://doi.org/10.5815/ijitcs.2014.12.07, Pub. Date: 8 Nov. 2014

One of the central problems in software engineering is inherent complexity. Since software is the result of human creative activity, cognitive informatics plays an important role in understanding its fundamental characteristics. This paper models one of the fundamental characteristics of software complexity by examining the cognitive weights of basic software control structures. Cognitive weights measure the degree of difficulty, or the relative time and effort required, to comprehend a given piece of software, which satisfies the definition of complexity. Based on this approach, a new measure, New Weighted Method Complexity (NWMC), is developed. Twenty programs were distributed among five postgraduate students; the development time of each program was recorded, and the mean is taken as the actual time needed to develop it. Understandability (UA), the time needed to understand the code, is also measured for all the programs. For comparison, this paper considers the Cognitive Functional Size (CFS) of Jingqiu Shao et al. To validate the new metric, we calculated the correlation of the proposed metric and of CFS with the actual development time, and analysed NWMC against CFS using the Mean Relative Error (MRE) and Standard Deviation (Std.). Finally, the authors found that the proposed measure estimates development time far more accurately than CFS.
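
To illustrate how cognitive weights compose, sequential structures add while nested ones multiply; the weights below are the commonly cited values from the cognitive informatics literature (sequence 1, branch 2, iteration 3), and the exact NWMC formula is the paper's own.

```python
# Cognitive-weight composition over a program's control-structure tree:
# siblings add, nesting multiplies.
COGNITIVE_WEIGHT = {"sequence": 1, "branch": 2, "iteration": 3}

def cognitive_complexity(structures):
    """structures: list of (kind, nested_structures) pairs."""
    total = 0
    for kind, nested in structures:
        w = COGNITIVE_WEIGHT[kind]
        total += w * (cognitive_complexity(nested) if nested else 1)
    return total

# Linear search: a loop containing one branch -> 3 * 2 = 6
print(cognitive_complexity([("iteration", [("branch", [])])]))  # 6
```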

Event-Coverage and Weight based Method for Test Suite Prioritization

By Neha Chaudhary, O.P. Sangwan, Richa Arora

DOI: https://doi.org/10.5815/ijitcs.2014.12.08, Pub. Date: 8 Nov. 2014

Testing Graphical User Interface (GUI) applications poses many challenges due to their event-driven nature and infinite input domain. Testing every possible input combination would require an enormous number of test cases to satisfy the adequacy criteria of GUI testing, and it is not possible to execute every test case within a specified time frame. It is therefore important to assign higher priority to the test cases with higher fault-revealing capability. Various methods for test suite prioritization of GUI-based software are described in the literature, some of them based on interaction coverage and the weight of events. Three weight-based methods have been defined, namely the fault-prone, random and equal weight-based methods, of which the fault-prone method is the most effective. In this paper we propose the Event-Coverage and Weight Based Method (EC-WBM), which prioritizes GUI test cases according to their event coverage and weight value. The weight value is assigned based on unique event coverage and the fault-revealing capability of events, while event coverage is used to evaluate the adequacy of test cases. EC-WBM is evaluated on two applications, Notepad and Calculator. Fault seeding is used to create a number of versions of each application, and the faults are evaluated using APFD (Average Percentage of Faults Detected). The APFD for the prioritized test cases of Notepad is 98%, versus 62% for the non-prioritized test cases.
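
The APFD metric used for this comparison has a standard closed form, APFD = 1 - (TF_1 + ... + TF_m)/(n m) + 1/(2n), where TF_i is the 1-based position of the first test revealing fault i, n is the number of tests and m the number of faults. The fault matrix below is invented for illustration.

```python
# APFD: higher values mean faults are revealed earlier in the test order.
def apfd(order, detects):
    """order: test ids in execution order; detects[t]: faults test t reveals."""
    n = len(order)
    faults = set().union(*detects.values())
    first = {}
    for pos, t in enumerate(order, start=1):
        for f in detects[t]:
            first.setdefault(f, pos)  # record first revealing position
    m = len(faults)
    return 1 - sum(first[f] for f in faults) / (n * m) + 1 / (2 * n)

detects = {"t1": {"f1"}, "t2": {"f1", "f2"}, "t3": set(), "t4": {"f3"}}
print(apfd(["t2", "t4", "t1", "t3"], detects))  # prioritized: ~0.79
print(apfd(["t1", "t3", "t2", "t4"], detects))  # non-prioritized: ~0.46
```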

Calculation of Overvoltage and Estimation of Power Transformer’s Behavior When Activating the Reactors

By Slobodan Bjelic, Zorica Bogicevic

DOI: https://doi.org/10.5815/ijitcs.2014.12.09, Pub. Date: 8 Nov. 2014

The paper illustrates methods and algorithms for calculating the overvoltage at the power transformer terminals when an inductive consumer is connected to the transformer secondary. The characteristic stages are the activation of the first phase, followed by the two other phases. A new specific condition arises during the activation of each phase; it is represented by an alternative electric circuit and a simplified equivalent scheme that is used to calculate and evaluate the overvoltage. For the selected parameters of the transformer and the inductive load, the simulation is performed with the MATLAB software package.
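
Although the paper's per-phase equivalent circuits are simulated in MATLAB, the flavour of such a switching transient can be sketched in a few lines; the series R-L model and parameters below are illustrative assumptions, far simpler than the paper's scheme.

```python
# Switching a series R-L branch onto a sinusoidal source and integrating
# di/dt = (v - R*i)/L with forward Euler. The transient's DC offset depends
# on the switching instant, the phase-by-phase effect the paper analyses.
import math

R, L, Vm, f = 1.0, 0.1, 325.0, 50.0   # illustrative parameters
dt, alpha = 1e-5, 0.0                 # time step, switching angle (rad)

i, t, peak = 0.0, 0.0, 0.0
while t < 0.1:                        # simulate 100 ms after closing
    v = Vm * math.sin(2 * math.pi * f * t + alpha)
    i += dt * (v - R * i) / L         # forward Euler step
    peak = max(peak, abs(i))
    t += dt
print(f"peak inrush current: {peak:.1f} A")
```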

Graphical Representation of Optimal Time for a Step-Stress Accelerated Life Test Design Using Frechet Distribution

By Sana Shahab, Arif-Ul-Islam

DOI: https://doi.org/10.5815/ijitcs.2014.12.10, Pub. Date: 8 Nov. 2014

The article provides a graphical approach to obtaining the optimal time for a simple step-stress accelerated life test under the inverse Weibull (Frechet) distribution. The parameters are estimated by the maximum likelihood method using a log-linear life-stress relationship. Along with this, the asymptotic variance-covariance matrix of the estimators is given, and a comparison between the expected and observed Fisher information matrices is shown. Furthermore, the confidence interval coverage of the estimators is presented to check their precision. The approach is illustrated with a software-based example.
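
As a pointer to tooling, a hedged sketch of fitting the inverse Weibull (Frechet) distribution by maximum likelihood on simulated data; the paper's step-stress design, log-linear relationship and Fisher-information comparison are not reproduced here.

```python
# MLE fit of the inverse Weibull distribution with SciPy on simulated data.
import numpy as np
from scipy.stats import invweibull

rng = np.random.default_rng(0)
true_c = 2.5                                   # shape parameter
data = invweibull.rvs(true_c, size=500, random_state=rng)

# Fix the location at 0 so only shape and scale are estimated.
c_hat, loc_hat, scale_hat = invweibull.fit(data, floc=0)
print(f"shape: {c_hat:.3f}, scale: {scale_hat:.3f}")  # shape near 2.5
```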
