International Journal of Information Technology and Computer Science (IJITCS)

IJITCS Vol. 5, No. 12, Nov. 2013

Cover page and Table of Contents: PDF (size: 262KB)

Table Of Contents

REGULAR PAPERS

An Analytic Approach for Calculating Frame Erasure Rate in Cellular GSM Networks

By Ahmed M. Alaa, Hazem Tawfik

DOI: https://doi.org/10.5815/ijitcs.2013.12.01, Pub. Date: 8 Nov. 2013

The Quality of Service (QoS) of a GSM system is quantified in terms of the Bit Error Rate (BER) and Frame Erasure Rate (FER) observed by the user. Obtaining analytical expressions for BER and FER in a fading channel with multiple cochannel interferers (CCI) is an extremely complex mathematical problem. The complexity arises because several GSM physical layer modules must be involved to obtain an expression for the probability of bit error. In addition, the statistical properties of the faded cochannel interferers are needed to obtain the raw BER of GMSK modulation. Thus, error rate metrics are usually obtained by simulating the GSM physical layer rather than treating the problem analytically. A reliable interface between system- and link-level models can be obtained by evaluating the BER and FER analytically in terms of the Signal-to-Interference Ratio (SIR), instead of using the pre-defined statistical mapping data common in the literature. In this work, bounds on the uplink BER and FER are obtained for the GSM physical layer assuming a CCI-limited system where both the desired and interfering signals are subject to Rayleigh fading. The analysis considers GMSK modulation, convolutional coding and frequency hopping.
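
For orientation, a standard closed-form benchmark (the average BER of coherent BPSK over a flat Rayleigh fading channel, quoted here only as a reference point, not the GMSK-with-CCI case analysed in the paper) relates the average bit error probability to the mean SNR \bar{\gamma}:

\bar{P}_b = \frac{1}{2}\left(1 - \sqrt{\frac{\bar{\gamma}}{1 + \bar{\gamma}}}\right)

The bounds derived in the paper play an analogous role, but with the SIR of a CCI-limited GSM uplink in place of \bar{\gamma} and with GMSK modulation, convolutional coding and frequency hopping taken into account.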

Combining Time Reversal and Fast Marching Method in Wireless Indoor Positioning

By Guoping Chen, Wenshan Wang, Hao Zeng, Chun Guan, Feng He

DOI: https://doi.org/10.5815/ijitcs.2013.12.02, Pub. Date: 8 Nov. 2013

Most current wireless indoor positioning methods cannot accurately obtain the channel model, i.e., the mapping between spatial position and received-signal features. The main obstacle to a precise channel model in an indoor environment is the multipath effect. The time-reversal (TR) wireless indoor positioning method has been shown to effectively reduce the signal fading and time delays caused by multipath. However, these advantages depend on a previously known channel model; without it, the accuracy of the TR method deteriorates seriously. To overcome this shortcoming of the general TR method when the channel model is unknown, we present a positioning method that combines Time Reversal with the Fast Marching Method (TR-FMM). The method locates a target in two stages. In stage one, the channel model of the indoor environment is estimated by FMM and the simultaneous algebraic reconstruction technique (SART), using Time of Flight (TOF) information generated by anchors at fixed spatial positions; the channel impulse response (CIR) required by the TR method is then obtained from the estimated channel model. In stage two, with the obtained CIR, any newly joined mobile target is accurately located by a general TR wireless indoor positioning method. Numerical simulations validate the proposed method. The results show that the positioning deviation is less than 3 cm for a newly joined mobile target at 1 cm scale in a moderately complex indoor configuration, a 30-fold accuracy improvement over the general TR method. The positioning time in stage two is less than 3 minutes on a PC with a 1.6 GHz dual CPU and 2 GB of memory. The proposed method therefore offers high accuracy and low complexity for wireless indoor positioning systems.
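
A minimal numerical sketch of the time-reversal principle used in stage two (hypothetical discrete-time CIRs, not the paper's TR-FMM implementation): the estimated CIR is conjugated and time-reversed to form a prefilter, so that propagation through the true channel produces a sharp focusing peak only at the matching position.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical multipath channel impulse responses (CIRs) for two positions.
h_target = rng.normal(size=32) + 1j * rng.normal(size=32)  # CIR of the intended target
h_other = rng.normal(size=32) + 1j * rng.normal(size=32)   # CIR at some other position

# Time-reversal prefilter: conjugate of the time-reversed estimated CIR.
g = np.conj(h_target[::-1])

# Peak received magnitude after the TR-prefiltered pulse passes through each channel.
focus_target = np.max(np.abs(np.convolve(g, h_target)))
focus_other = np.max(np.abs(np.convolve(g, h_other)))

print(focus_target, focus_other)  # the matching position shows a much sharper focusing peak
```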

Efficient Algorithm for Destabilization of Terrorist Networks

By Nisha Chaurasia, Akhilesh Tiwari

DOI: https://doi.org/10.5815/ijitcs.2013.12.03, Pub. Date: 8 Nov. 2013

The feasibility of Social Network Analysis (SNA) for studying social networks has encouraged law enforcement and security agencies to investigate terrorist networks and their behavior, along with the key players hidden in the web. The study of terrorist networks using the SNA approach and graph theory, in which the network is visualized as a graph, is termed Investigative Data Mining or, more generally, Terrorist Network Mining. The centrality measures defined in SNA have been successfully incorporated into the destabilization of terrorist networks by removing the dominating role(s) from the network. Destabilizing a terrorist group involves uncovering network behavior through a defined hierarchy of algorithms. This paper proposes a new algorithm for destabilizing terrorist networks, intended to replace the existing hierarchy of algorithms. It also suggests using two influence-based centralities, PageRank centrality and Katz centrality, for effectively neutralizing the network.
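
As an illustrative sketch of the two recommended centralities (computed with the NetworkX library on a toy graph, not the paper's data), the highest-ranked node is the candidate role to remove:

```python
import networkx as nx

# Toy covert-network example with hypothetical edges.
G = nx.Graph([("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"), ("D", "E")])

pagerank = nx.pagerank(G, alpha=0.85)    # PageRank centrality
katz = nx.katz_centrality(G, alpha=0.1)  # Katz centrality (alpha kept below 1/lambda_max)

# Candidate node to neutralize: highest combined score.
key_player = max(G.nodes, key=lambda n: pagerank[n] + katz[n])
print(key_player, pagerank, katz)
```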

Hybrid Communication System Based on OFDM

By Farman Ullah, Nadia N Qadri, Muhammad Asif Zakriyya, Aamir Khan

DOI: https://doi.org/10.5815/ijitcs.2013.12.04, Pub. Date: 8 Nov. 2013

A hybrid architecture combining terrestrial and satellite networks based on Orthogonal Frequency Division Multiplexing (OFDM) is employed here. In the hybrid architecture, users can access services through both the terrestrial and the satellite networks: users located in urban areas are served by the existing terrestrial mobile networks, while those located in rural areas are served through the satellite networks.
The technique used to achieve this objective is Pre-FFT adaptive beamforming, also called time-domain beamforming. When data is received at the satellite end, Pre-FFT adaptive beamforming extracts the desired user's data from the interfering user by applying complex weights to the received symbol. The weights for the next symbol are then updated by the Least Mean Square (LMS) algorithm and applied to it. This process is repeated until all the desired user data has been extracted.
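
A minimal sketch of the complex LMS weight update described above (a generic narrowband beamforming example with hypothetical array data, not the paper's satellite configuration):

```python
import numpy as np

rng = np.random.default_rng(1)
n_antennas, n_symbols, mu = 4, 500, 0.01  # mu: LMS step size

# Hypothetical snapshots: desired user plus one interferer plus noise at each antenna.
steer_d = np.exp(1j * np.pi * np.arange(n_antennas) * np.sin(np.deg2rad(20)))
steer_i = np.exp(1j * np.pi * np.arange(n_antennas) * np.sin(np.deg2rad(-40)))
d = np.sign(rng.standard_normal(n_symbols))  # known training symbols of the desired user
i = np.sign(rng.standard_normal(n_symbols))  # interfering user's symbols
X = (np.outer(d, steer_d) + np.outer(i, steer_i)
     + 0.1 * (rng.standard_normal((n_symbols, n_antennas))
              + 1j * rng.standard_normal((n_symbols, n_antennas))))

w = np.zeros(n_antennas, dtype=complex)      # complex beamforming weights
for k in range(n_symbols):
    y = np.vdot(w, X[k])             # beamformer output for symbol k (w^H x_k)
    e = d[k] - y                     # error against the desired symbol
    w = w + mu * np.conj(e) * X[k]   # LMS update, applied to the next symbol

print(np.abs(np.vdot(w, steer_d)), np.abs(np.vdot(w, steer_i)))  # gain toward desired vs. interferer
```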

An Analytical Study of Power Line Effect on UTP Cable using Lumped Circuit Components

By Mitamoni Sarma, Shikhar Kr. Sarma

DOI: https://doi.org/10.5815/ijitcs.2013.12.05, Pub. Date: 8 Nov. 2013

The paper defines the term electrical noise and its types. Electromagnetic Interference (EMI), one type of electrical noise, is also defined, and general techniques used for controlling EMI are described. Networking cables are affected by EMI from nearby power cables, and data transmission through Unshielded Twisted Pair (UTP) cable, the cable most affected by EMI, may be degraded as a result. Today, UTP is the most popular networking cable, supporting 10G Ethernet. The most common effective methods for reducing the EMI effect on UTP cable, physical separation and shielding, are described. EMI is caused by coupling mechanisms between the source of interference and the receptor; the two types of coupling are capacitive coupling and inductive coupling. The paper analyses and models the two couplings using lumped circuit components and electric circuit analysis, treating the power cable as the source of interference and the networking cable as the receptor circuit of the EMI.
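
For reference, a commonly used lumped-element expression for the capacitively coupled noise voltage (a generic textbook form; the symbols below are assumed for illustration rather than taken from the paper) is

V_N = \frac{j\omega C_{12} R}{1 + j\omega (C_{12} + C_{2G}) R}\, V_1 \;\approx\; j\omega C_{12} R\, V_1 \quad \text{for } \omega \ll \frac{1}{R (C_{12} + C_{2G})},

where V_1 is the power-line voltage, C_{12} the coupling capacitance between the power conductor and the UTP conductor, C_{2G} the capacitance of the receptor conductor to ground, and R the receptor circuit's resistance to ground.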

Improving Situational Awareness for Precursory Data Classification using Attribute Rough Set Reduction Approach

By Pushan Kumar Dutta, O. P. Mishra, M. K. Naskar

DOI: https://doi.org/10.5815/ijitcs.2013.12.06, Pub. Date: 8 Nov. 2013

Modeling the distribution of a large number of earthquake events, with frequent tremors detected prior to a main shock, presents unique challenges in building a robust classifier tool when rapid responses are needed to reach victims. We have designed a relational database for running a geophysical modeling application, connecting the records of all clusters of foreshock events from 1998-2010 in a complete seismicity catalog for the Himalayan basin (Nath et al., 2010). This paper develops a reduced rough set analysis method and implements this novel structure and reasoning process for foreshock cluster forecasting. In this study, we developed a reusable information technology infrastructure, called Efficient Machine Readable for Emergency Text Selection (EMRETS). First, the association and importance of precursory information with respect to earthquake rupture analysis are found through attribute reduction based on rough set analysis. Second, finding the importance of attributes through information entropy is a novel approach for the high-dimensional, complex polynomial problems predominant in geophysical research and prospecting. Third, we discuss the reducible indiscernibility matrix and decision rule generation for a particular set of geographical coordinates, leading to the spatial discovery of future earthquakes preceded by foreshocks. This paper proposes a framework for extracting, classifying, analyzing, and presenting semi-structured catalog data sources through feature representation and selection.
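
A small sketch of the entropy-based attribute ranking step (with hypothetical discretized attributes and a decision label, not the EMRETS catalog itself): the information gain of each condition attribute with respect to the decision attribute measures its importance.

```python
import numpy as np
import pandas as pd

def entropy(series):
    """Shannon entropy of a discrete-valued column."""
    p = series.value_counts(normalize=True).to_numpy()
    return -np.sum(p * np.log2(p))

def information_gain(df, attribute, decision):
    """Reduction in decision entropy obtained by splitting on one attribute."""
    h_cond = sum(len(g) / len(df) * entropy(g[decision])
                 for _, g in df.groupby(attribute))
    return entropy(df[decision]) - h_cond

# Hypothetical precursory attributes and a foreshock-cluster decision label.
df = pd.DataFrame({
    "b_value":   ["low", "low", "high", "high", "low", "high"],
    "depth_bin": ["shallow", "deep", "shallow", "deep", "deep", "shallow"],
    "cluster":   ["yes", "no", "yes", "yes", "no", "yes"],
})

ranking = {a: information_gain(df, a, "cluster") for a in ["b_value", "depth_bin"]}
print(sorted(ranking.items(), key=lambda kv: kv[1], reverse=True))
```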

Controlling Citizen’s Cyber Viewing Using Enhanced Internet Content Filters

By Shafii Muhammad Abdulhamid, Fasilat Folagbayo Ibrahim

DOI: https://doi.org/10.5815/ijitcs.2013.12.07, Pub. Date: 8 Nov. 2013

Information passing through the internet is generally unrestricted and uncontrolled, and a good web content filter acts very much like a sieve. This paper looks at how citizens' internet viewing can be controlled using content filters to prevent access to illegal sites and malicious content in Nigeria. Primary data were obtained by administering 100 questionnaires and were analyzed with the Statistical Package for the Social Sciences (SPSS). The results of the study show that 66.4% of the respondents agreed that the internet is being abused and that the abuse can be controlled by the use of content filters. PHP, MySQL and Apache were used to design a content filter program. It is recommended that much still needs to be done by public organizations, academic institutions, the government and its agencies, especially the Economic and Financial Crimes Commission (EFCC) in Nigeria, to control internet abuse by minors and criminals.

Modeling Truncated Loss Data of Operational Risk in E-banking

By Maryam Pirouz, Maziar Salahi

DOI: https://doi.org/10.5815/ijitcs.2013.12.08, Pub. Date: 8 Nov. 2013

Operational risk is an important risk component for financial institutions, especially in E-banking. The large amounts of capital assigned to reducing this risk are evidence of its importance. One of the most important inputs for modeling operational risk and estimating the capital charge is a bank's loss data collection. However, sometimes, for reasons such as cost reduction, banks store only the losses above certain thresholds in their databases. To achieve an accurate capital charge, this threshold should be taken into account. This paper focuses on modeling truncated loss data above a given threshold. We discuss several statistical methods for modeling truncated data and suggest the most suitable one for truncated loss data. We have tested the suggested model on several samples of operational loss data. Our results indicate that our approach can increase the accuracy of estimating the operational risk capital charge in E-banking.
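
A hedged sketch of one way to fit a severity distribution to left-truncated losses (a lognormal fitted by conditional maximum likelihood on hypothetical data; the paper compares several statistical methods, so this is only one illustrative choice):

```python
import numpy as np
from scipy import stats, optimize

threshold = 10_000.0  # reporting threshold: only losses at or above it are recorded
rng = np.random.default_rng(0)
full = rng.lognormal(mean=9.0, sigma=1.2, size=5_000)  # hypothetical complete loss experience
losses = full[full >= threshold]                        # what the bank actually stored

def neg_loglik(params):
    mu, sigma = params
    if sigma <= 0:
        return np.inf
    dist = stats.lognorm(s=sigma, scale=np.exp(mu))
    # Conditional log-density of each loss given that it exceeds the threshold.
    return -np.sum(dist.logpdf(losses) - dist.logsf(threshold))

res = optimize.minimize(neg_loglik, x0=[np.log(np.median(losses)), 1.0], method="Nelder-Mead")
print(res.x)  # estimated (mu, sigma) of the untruncated lognormal
```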

A Cluster Based Job Scheduling Algorithm for Grid Computing

By Reza Fotohi, Mehdi Effatparvar

DOI: https://doi.org/10.5815/ijitcs.2013.12.09, Pub. Date: 8 Nov. 2013

Grid computing enables sharing, selection and aggregation of computing resources for solving complex and large-scale scientific problems. The resources making up a grid need to be managed to provide a good quality of service. Grid scheduling is a vital component of a Computational Grid infrastructure.
This paper presents a dynamic cluster-based job scheduling algorithm for efficient execution of user jobs. It also includes a comparative performance analysis of the proposed job scheduling algorithm against other well-known job scheduling algorithms, considering parameters such as average waiting time, average turnaround time, average response time and average total completion time.
The results show that the proposed scheduling algorithm (CHS) achieves the best average waiting time, average turnaround time, average response time and average total completion time compared to the other job scheduling approaches.
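
For reference, the comparison metrics used above can be computed for any schedule; below is a toy first-come-first-served example with hypothetical jobs (not the CHS algorithm itself):

```python
# Hypothetical jobs as (arrival_time, burst_time), run here in first-come-first-served order.
jobs = sorted([(0, 5), (1, 3), (2, 8), (3, 6)])

t, waiting, turnaround, completion = 0, [], [], []
for arrival, burst in jobs:
    t = max(t, arrival)             # CPU may idle until the job arrives
    waiting.append(t - arrival)     # waiting time = start - arrival (equals response time here)
    t += burst
    turnaround.append(t - arrival)  # turnaround time = completion - arrival
    completion.append(t)            # total completion time of the job

n = len(jobs)
print("avg waiting:", sum(waiting) / n,
      "avg turnaround:", sum(turnaround) / n,
      "avg completion:", sum(completion) / n)
```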

Goal Structured Requirement Engineering and Traceability Model for Data Warehouses

By Vinay Kumar, Reema Thareja

DOI: https://doi.org/10.5815/ijitcs.2013.12.10, Pub. Date: 8 Nov. 2013

Data warehouses are decision support systems that are specifically designed for business managers and executives for reporting and business analysis. A data warehouse is a database that stores enterprise-wide data from which useful information can be deduced. Business organizations can achieve a great level of competitive advantage by analyzing their historical data and learning from it. However, the data warehouse concept is still maturing as a technology. In order to effectively design and implement a data warehouse for an organization, its goals need to be understood and its requirements must be analyzed from the perspective of the identified goals. In this paper we present a goal-structured model for requirements engineering that also enables its users to manage traceability between the goals, decisions, business strategy and the corresponding business model.

Implementation of Palm Print Biometric Identification System Using Ordinal Measures

By V.K. Narendira Kumar, B. Srinivasan

DOI: https://doi.org/10.5815/ijitcs.2013.12.11, Pub. Date: 8 Nov. 2013

Personal identification is one of the most important requirements in all e-commerce and criminal detection applications. In this framework, a novel palm print representation method, namely orthogonal line ordinal features, is proposed. The palm print registration, feature extraction, verification and recognition modules are designed to manage the palm prints, and the palm print database module is designed to store the palm prints and personal details. The feature extraction module extracts the ordinal measurements from the palm prints. The verification module verifies a palm print against a personal identification record. The recognition module identifies the person associated with a palm print image. The proposed palm print recognition scheme uses intensity and brightness to compute the ordinal measures, which are estimated for the 4 x 4 regions of the palm print images.
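
A minimal sketch of forming an ordinal code from orthogonal line-averaged intensities (hypothetical filter length, block size and random test images, not the paper's exact parameters): only the sign of the comparison between the two orientations is kept per region, and codes are matched by Hamming distance.

```python
import numpy as np
from scipy.ndimage import uniform_filter1d

def ordinal_code(palm, block=16):
    """Binary ordinal features: compare horizontal vs. vertical line-averaged intensity per block."""
    horiz = uniform_filter1d(palm.astype(float), size=9, axis=1)  # line-like averaging, horizontal
    vert = uniform_filter1d(palm.astype(float), size=9, axis=0)   # line-like averaging, vertical
    h, w = palm.shape
    bits = []
    for i in range(0, h - block + 1, block):
        for j in range(0, w - block + 1, block):
            # Ordinal measure: keep only the sign of the comparison in each region.
            bits.append(horiz[i:i + block, j:j + block].mean()
                        > vert[i:i + block, j:j + block].mean())
    return np.array(bits, dtype=np.uint8)

palm_a = np.random.default_rng(0).integers(0, 256, size=(64, 64))  # 4 x 4 regions of 16 x 16 pixels
palm_b = np.random.default_rng(1).integers(0, 256, size=(64, 64))
code_a, code_b = ordinal_code(palm_a), ordinal_code(palm_b)
print("Hamming distance:", np.count_nonzero(code_a != code_b))  # matching score for verification
```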

Design Parallel Linear PD Compensation by Fuzzy Sliding Compensator for Continuum Robot

By Amin Jalali, Farzin Piltan, Mohammadreza Hashemzadeh, Fatemeh BibakVaravi, Hossein Hashemzadeh

DOI: https://doi.org/10.5815/ijitcs.2013.12.12, Pub. Date: 8 Nov. 2013

In this paper, a linear proportional-derivative (PD) controller is designed for a highly nonlinear and uncertain system using a robust factorization approach. To evaluate the linear PD methodology, two other methodologies are introduced: sliding mode control and fuzzy logic. This research aims to design a new methodology to regulate the position of a continuum robot manipulator. The PD method is a linear methodology that can be applied to highly nonlinear systems (e.g., the continuum robot manipulator). To compensate this method, a new parallel fuzzy sliding mode controller (PD.FSMC) is used. This compensator can estimate most of the nonlinear terms of the dynamic parameters to achieve the best performance. The asymptotic stability of fuzzy PD control with first-order sliding mode compensation in the parallel structure is proven. For the parallel structure, finite-time convergence with a super-twisting second-order sliding mode is guaranteed.
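
For orientation, a generic form of a PD law with a parallel sliding-mode compensation term (symbols assumed here for illustration, not taken from the paper's derivation) is

\tau = K_p e + K_d \dot{e} + K_s \,\mathrm{sat}\!\left(\frac{S}{\phi}\right), \qquad S = \dot{e} + \lambda e,

where e is the position error, S the sliding surface, and \mathrm{sat}(\cdot) the boundary-layer saturation that replaces the discontinuous sign function to reduce chattering; in a fuzzy sliding-mode compensator, the compensation term is supplied or tuned online by fuzzy rules rather than by fixed gains.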
