International Journal of Information Technology and Computer Science (IJITCS)

IJITCS Vol. 5, No. 10, Sep. 2013

Cover page and Table of Contents: PDF (size: 200KB)

Table of Contents

REGULAR PAPERS

Measurement Based Admission Control Methods in IP Networks

By Erik Chromy, Tomas Behul

DOI: https://doi.org/10.5815/ijitcs.2013.10.01, Pub. Date: 8 Sep. 2013

Trends in telecommunications show that customers demand ever more bandwidth. If telecommunication operators want to be successful, they must invest heavily in their infrastructure and must ensure the required quality of service, so they should devote attention to development in this area. The article deals with quality of service in IP networks. Quality of service problems can be solved through admission control methods based on measurements. These admission control methods regulate the incoming traffic load: a new flow is accepted only if the required quality of service can be ensured for it without breaching the quality of service of already accepted flows. The article describes the simulations and presents simulation results for Voice over IP, constant bit rate and video sources. The simulations were carried out in the Network Simulator 2 environment and evaluated on the basis of parameters such as estimated bandwidth, utilization and loss rate.
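
The acceptance test at the heart of such measurement-based admission control can be pictured in a few lines. The following minimal Python sketch assumes a measured-sum style test, in which a new flow is admitted only if the measured load plus the flow's declared rate stays below a target fraction of the link capacity; the function and parameter names are illustrative, not taken from the paper.

# Minimal measured-sum admission control sketch (illustrative, not the paper's code).

def admit_flow(measured_load_bps, declared_rate_bps, capacity_bps, target_utilization=0.9):
    """Admit a new flow only if the measured load plus its declared rate
    stays below the target fraction of link capacity."""
    return measured_load_bps + declared_rate_bps <= target_utilization * capacity_bps

# Example: a 2 Mbit/s video flow on a 100 Mbit/s link.
print(admit_flow(85e6, 2e6, 100e6))   # True:  87 Mbit/s <= 90 Mbit/s
print(admit_flow(89e6, 2e6, 100e6))   # False: 91 Mbit/s >  90 Mbit/s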

Performance Analysis of Classification Methods and Alternative Linear Programming Integrated with Fuzzy Delphi Feature Selection

By Bahram Izadi, Bahram Ranjbarian, Saeedeh Ketabi, Faria Nassiri-Mofakham

DOI: https://doi.org/10.5815/ijitcs.2013.10.02, Pub. Date: 8 Sep. 2013

Among the various statistical and data mining discriminant analysis methods proposed so far for group classification, linear programming discriminant analysis has recently attracted researchers' interest. This study evaluates multi-group discriminant linear programming (MDLP) for classification problems against well-known methods such as neural networks and support vector machines. MDLP is less complex than other methods and does not suffer from local optima. However, classification sometimes becomes infeasible due to insufficient data, as in the case of the small and medium-sized Internet Service Provider (ISP) market considered in this research. This study proposes a fuzzy Delphi method to select and gather the required data. The results show that the performance of MDLP is better than that of the other methods with respect to correct classification, at least for small and medium-sized datasets.
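
For readers unfamiliar with LP-based discriminant analysis, the sketch below solves the classic two-group minimize-the-sum-of-deviations formulation (in the style of Freed and Glover) with SciPy. It is a simplified stand-in for the paper's multi-group MDLP model, and all names and the toy data are illustrative.

# Two-group LP discriminant (minimize the sum of deviations), a simplified
# stand-in for the paper's multi-group MDLP model.
import numpy as np
from scipy.optimize import linprog

def lp_discriminant(X1, X2):
    """Find weights w and cutoff c so that w.x <= c-1 for group 1 and
    w.x >= c+1 for group 2, minimizing the total constraint violation."""
    d = X1.shape[1]
    n1, n2 = len(X1), len(X2)
    n = n1 + n2
    # Decision vector: [w (d entries), c, deviations (n entries)]
    obj = np.concatenate([np.zeros(d + 1), np.ones(n)])
    A = np.zeros((n, d + 1 + n))
    b = -np.ones(n)
    A[:n1, :d] = X1          # group 1:  w.x - c - dev <= -1
    A[:n1, d] = -1.0
    A[n1:, :d] = -X2         # group 2: -w.x + c - dev <= -1
    A[n1:, d] = 1.0
    A[:, d + 1:] = -np.eye(n)
    bounds = [(None, None)] * (d + 1) + [(0, None)] * n
    res = linprog(obj, A_ub=A, b_ub=b, bounds=bounds, method="highs")
    return res.x[:d], res.x[d]

# Toy example: two linearly separable clusters.
rng = np.random.default_rng(0)
X1 = rng.normal(0.0, 1.0, (20, 2))
X2 = rng.normal(4.0, 1.0, (20, 2))
w, c = lp_discriminant(X1, X2)
print("group-2 points on the wrong side:", np.sum(X2 @ w < c))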

Performance Evaluation of AODV, LHC-AODV, OLSR, UL-OLSR, DSDV Routing Protocols

By Reza Fotohi, Shahram Jamali, Fateme Sarkohaki

DOI: https://doi.org/10.5815/ijitcs.2013.10.03, Pub. Date: 8 Sep. 2013

Mobile ad hoc networks are wireless networks that use no infrastructure: there are no routers, switches or other fixed elements to support the network structure, and the nodes are mobile. Routing, i.e. selecting the paths along which to send network traffic, is a particularly challenging task in MANETs. In this paper, a performance analysis is carried out on the Ad-hoc On-demand Distance Vector (AODV), Limited Hop Count AODV (LHC-AODV), Optimized Link State Routing (OLSR), Unnecessary Loop OLSR (UL-OLSR) and Destination Sequenced Distance Vector (DSDV) protocols using the NS2 simulator. Delay, throughput and packet delivery ratio are the three common metrics used to compare the performance of these protocols.
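
The three comparison metrics are simple to derive from a simulation trace. The following hypothetical Python sketch computes them from per-packet send and receive records; the record format is an assumption, not the paper's trace format.

# Deriving delay, throughput and packet delivery ratio from trace records;
# the (packet id -> (time, size)) record format is an assumption.

def routing_metrics(sent, received, sim_time_s):
    """sent/received: dicts mapping packet id -> (time_s, size_bytes)."""
    delivered = [pid for pid in sent if pid in received]
    pdr = len(delivered) / len(sent) if sent else 0.0
    avg_delay = (sum(received[p][0] - sent[p][0] for p in delivered) /
                 len(delivered)) if delivered else 0.0
    throughput_bps = sum(received[p][1] * 8 for p in delivered) / sim_time_s
    return pdr, avg_delay, throughput_bps

sent = {1: (0.10, 512), 2: (0.20, 512), 3: (0.30, 512)}
received = {1: (0.15, 512), 3: (0.42, 512)}   # packet 2 was dropped
print(routing_metrics(sent, received, sim_time_s=1.0))
# (0.666..., 0.085, 8192.0)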

Optimizing QoS for Multimedia Services in Next Generation Network Based on ACO Algorithm

By Dac-Nhuong Le

DOI: https://doi.org/10.5815/ijitcs.2013.10.04, Pub. Date: 8 Sep. 2013

In the Next Generation Network (NGN), the backbone of the overall network architecture will be an IP network supporting different access network technologies and traffic types. NGN will provide advanced services, such as Quality of Service (QoS) guarantees, to users and their applications. Factors affecting QoS in NGN include speech encoders, delay, jitter, packet loss and echo. The negotiation and dynamic adaptation of QoS is currently considered one of the key features of the NGN concept. In this paper, I propose a novel Ant Colony Optimization algorithm to solve a model of optimal QoS for multimedia services in the NGN. Simulation results show that the new approach achieves near-optimal solutions. Comparison of the experimental results with recent research shows that the proposed algorithm performs better and can meet the demand for optimal QoS for multimedia services in NGN.
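
The ant colony machinery behind such an optimizer can be summarized briefly. Below is a minimal, hypothetical ACO sketch that searches for a low-cost path through a weighted graph, where the edge weights stand in for a QoS cost; the parameters and cost model are illustrative rather than those of the paper.

# Minimal ant colony optimization over paths in a weighted graph
# (illustrative parameters, not the paper's model).
import random

def aco_best_path(graph, src, dst, n_ants=20, n_iters=50,
                  evaporation=0.5, alpha=1.0, beta=2.0):
    """graph: dict node -> {neighbor: qos_cost}. Returns the best path found."""
    pheromone = {(u, v): 1.0 for u in graph for v in graph[u]}
    best_path, best_cost = None, float("inf")
    for _ in range(n_iters):
        paths = []
        for _ in range(n_ants):
            path, node, visited = [src], src, {src}
            while node != dst:
                choices = [v for v in graph[node] if v not in visited]
                if not choices:
                    break                       # dead end: abandon this ant
                weights = [pheromone[(node, v)] ** alpha *
                           (1.0 / graph[node][v]) ** beta for v in choices]
                node = random.choices(choices, weights=weights)[0]
                path.append(node)
                visited.add(node)
            if path[-1] == dst:
                cost = sum(graph[u][v] for u, v in zip(path, path[1:]))
                paths.append((path, cost))
                if cost < best_cost:
                    best_path, best_cost = path, cost
        for key in pheromone:                   # evaporate, then deposit
            pheromone[key] *= 1.0 - evaporation
        for path, cost in paths:
            for edge in zip(path, path[1:]):
                pheromone[edge] += 1.0 / cost
    return best_path, best_cost

g = {"A": {"B": 2, "C": 5}, "B": {"C": 1, "D": 4}, "C": {"D": 1}, "D": {}}
print(aco_best_path(g, "A", "D"))   # expect (['A', 'B', 'C', 'D'], 4)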

A Compression & Encryption Algorithm on DNA Sequences Using Dynamic Look up Table and Modified Huffman Techniques

By Syed Mahamud Hossein, S. Roy

DOI: https://doi.org/10.5815/ijitcs.2013.10.05, Pub. Date: 8 Sep. 2013

Storing, transmitting and securing DNA sequences are well-known research challenges, and the problem has been magnified by the increasing discovery and availability of DNA sequences. We present a DNA sequence compression and encryption algorithm based on a Dynamic Look Up Table (DLUT) and a modified Huffman technique. The DLUT consists of 4^3 (64) sub-strings, each three bases long, and each sub-string is individually coded by a single ASCII character from 33 ('!') to 96 ('`') and vice versa. Encoding depends on an encryption key chosen by the user from the four bases {a, t, g, c}, and decoding requires the decryption key provided by the encoder, so decoding demands authenticated input. The sub-strings are combined in a DLUT-based pre-coding routine. The algorithm is tested on the reverse, the complement and the reverse complement of DNA sequences, and also on artificial DNA sequences of equivalent length. Speed of encryption and level of security are two important measurements for evaluating any encryption system. With the proliferation of ubiquitous computing systems in which digital content is accessible through resource-constrained biological databases, security is a very important concern, and much research has been devoted to finding an encryption system that can run effectively in such biological databases. Information security, i.e. protecting data from unauthorized users, is the most challenging question, and the proposed method may protect the data from hackers. It provides three-tier security: tier one is the ASCII code, tier two is the nucleotide (a, t, g or c) chosen by the user, and tier three is the change of label or of node position in the Huffman tree. Compression of genome sequences will also help to increase the efficiency of their use. The greatest advantages of this algorithm are fast execution, small memory occupation and easy implementation. Since the program implementing the technique was originally written in the C language (Windows XP platform, TC compiler), it can run on other microcomputers with small changes depending on the platform and compiler used. Execution is quite fast, with all operations carried out in fractions of a second depending on the required task and the sequence length. The technique can approach an effective compression ratio of 1.98 bits/base or even lower. When a user searches for any sequence of an organism, an encrypted compressed sequence file can be sent from the data source to the user and then decrypted and decompressed at the client end, resulting in reduced transmission time over the Internet. The result is a compression-and-encryption algorithm that provides a moderately high compression and encryption rate with minimal decryption and decompression time.
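
The look-up-table stage is easy to picture in code. Here is a minimal Python sketch of the pre-coding idea, mapping the 4^3 = 64 three-base sub-strings to the ASCII characters 33 ('!') through 96 ('`'); the key-dependent Huffman stage of the paper is omitted and the triplet ordering is an assumption.

# DLUT pre-coding sketch: map each 3-base substring to one ASCII character
# in 33..96. The key-dependent Huffman stage is omitted, and the triplet
# ordering below is an assumption.
from itertools import product

BASES = "atgc"
TRIPLETS = ["".join(t) for t in product(BASES, repeat=3)]    # 64 entries
ENCODE = {t: chr(33 + i) for i, t in enumerate(TRIPLETS)}    # '!' .. '`'
DECODE = {v: k for k, v in ENCODE.items()}

def dlut_encode(seq):
    """Replace each complete 3-base block with one ASCII character;
    a trailing 1-2 base remainder is kept as-is."""
    cut = len(seq) - len(seq) % 3
    return "".join(ENCODE[seq[i:i+3]] for i in range(0, cut, 3)) + seq[cut:]

def dlut_decode(coded, remainder_len):
    cut = len(coded) - remainder_len
    return "".join(DECODE[ch] for ch in coded[:cut]) + coded[cut:]

seq = "atgcgtacag"                     # 10 bases: 3 triplets + 1 leftover
coded = dlut_encode(seq)
print(coded, len(coded))               # 4 characters instead of 10
print(dlut_decode(coded, len(seq) % 3) == seq)   # True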

Ontology Based Information Retrieval in Semantic Web: A Survey

By Vishal Jain, Mayank Singh

DOI: https://doi.org/10.5815/ijitcs.2013.10.06, Pub. Date: 8 Sep. 2013

In the present age of computers, there are various resources for gathering information related to a given query, such as radio stations, television and the Internet. Among them, the Internet is considered the major source of information about a given domain. When a user wants to find some information, he or she enters a query and results are produced via hyperlinks to various documents available on the web. However, the retrieved information may or may not be relevant; this irrelevance is caused by the huge collection of documents available on the web. Traditional search engines are based on keyword searching, which is unable to transform raw data into a knowledgeable representation, and it is a cumbersome task to extract relevant information from such a large collection of web documents. These shortcomings have brought the concepts of the Semantic Web (SW) and Ontology into existence. The Semantic Web is a well-defined portal that helps in extracting relevant information using many Information Retrieval (IR) techniques, whereas current IR techniques are not advanced enough to exploit the semantic knowledge within documents and give precise results. The terms Information Retrieval, Semantic Web and Ontology are used differently, but they are interconnected: IR technology and web-based indexing contribute to the existence of the Semantic Web, and the use of ontologies also contributes to building this new generation of the web. With the help of ontologies, web content can be marked up as Semantic Web Documents (SWDs). Ontology is considered the backbone of a software system; it improves the understanding between the concepts used in the Semantic Web. So there is a need to build ontologies using a well-defined methodology, and the process of developing an ontology is called Ontology Development.

Impact of Throughput in Enhancing the Efficiency of Cognitive Radio Ad Hoc Network - A Study

By V. Jayaraj, J. Jegathesh Amalraj, L. Nagarajan

DOI: https://doi.org/10.5815/ijitcs.2013.10.07, Pub. Date: 8 Sep. 2013

Cognitive Radio Ad Hoc Networks (CRAHNs) constitute a viable solution to the current inefficiency of spectrum allocation and allow the deployment of highly reconfigurable and self-organizing wireless networks. Cognitive Radio (CR) devices are envisaged to utilize the spectrum in an opportunistic way by dynamically accessing different licensed portions of the spectrum. However, the phenomena of channel fading and of primary and secondary interference in cognitive radio networks do not guarantee that application demands will be met continuously over time. The limited available spectrum and the inadequate use of spectrum resources necessitate a new communication paradigm that exploits the existing wireless spectrum opportunistically. This paper discusses currently existing mechanisms for providing better efficiency by utilizing cognitive network intelligence, so that the frequencies in use are utilized to the maximum extent without interference, and compares the techniques used for enhancing throughput in CR.

A Modified Parallel Heuristic Graph Matching Approach for Solving Task Assignment Problem in Distributed Processor System

By R. Mohan, N. P. Gopalan

DOI: https://doi.org/10.5815/ijitcs.2013.10.08, Pub. Date: 8 Sep. 2013

Task assignment is one of the most fundamental combinatorial optimization problems. Solving the task assignment problem is very important for many real-time and computational scenarios in which many small tasks need to be executed by multiple processors simultaneously. In this paper, a heuristic and parallel algorithm for the task assignment problem is proposed. Results obtained for certain cases are presented and compared with the optimal solutions obtained by existing algorithms, and it is observed that the proposed algorithm is much faster and more efficient. The paper also demonstrates how the proposed algorithm can be extended to multiple distributed processors.
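
As a point of reference for what a task assignment heuristic does, the sketch below greedily assigns each task to the processor that currently finishes earliest, a classic makespan heuristic. It is a generic illustration, not the paper's parallel graph matching algorithm.

# Greedy "earliest-finishing processor" heuristic for task assignment,
# a generic illustration rather than the paper's graph matching method.
import heapq

def greedy_assign(task_costs, n_processors):
    """Assign each task to the currently least-loaded processor.
    Returns (assignment list, makespan)."""
    heap = [(0.0, p) for p in range(n_processors)]   # (load, processor)
    heapq.heapify(heap)
    assignment = [None] * len(task_costs)
    # Placing the longest tasks first tends to give a better makespan.
    for i in sorted(range(len(task_costs)), key=lambda i: -task_costs[i]):
        load, p = heapq.heappop(heap)
        assignment[i] = p
        heapq.heappush(heap, (load + task_costs[i], p))
    return assignment, max(load for load, _ in heap)

tasks = [7, 5, 4, 3, 3, 2]
print(greedy_assign(tasks, 2))   # two processors, makespan 12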

Web Crawler Based on Mobile Agent and Java Aglets

By Md. Abu Kausar, V. S. Dhaka, Sanjeev Kumar Singh

DOI: https://doi.org/10.5815/ijitcs.2013.10.09, Pub. Date: 8 Sep. 2013

With the huge growth of the Internet, many web pages are available online. Search engines use web crawlers to collect these web pages from the World Wide Web for storage and indexing. A web crawler is basically a program that finds information on the World Wide Web in a systematic and automated manner. This network load can be further reduced by using mobile agents.
The proposed approach uses mobile agents to crawl the pages. A mobile agent is not bound to the system in which it starts execution; it has the unique ability to transfer itself from one system in a network to another. The main advantage of a web crawler based on mobile agents is that the analysis part of the crawling process is done locally at the data source rather than at the remote side, which drastically reduces network load and traffic and can improve the performance and efficiency of the whole crawling process.
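
The saving described above can be illustrated by contrasting what crosses the network. In the hypothetical standard-library Python sketch below, the page is parsed where it lives and only a compact summary is transmitted, which is the kind of local analysis a mobile agent performs; this is not the paper's Java Aglets implementation.

# What a mobile agent would do at the data source: parse the page locally
# and transmit only the compact analysis result instead of the full page.
# Hypothetical stdlib-only sketch, not the paper's Aglets implementation.
from html.parser import HTMLParser

class LinkAndTextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links, self.words = [], []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links += [v for k, v in attrs if k == "href"]

    def handle_data(self, data):
        self.words += data.split()

def agent_analyze(html):
    """Runs where the page lives; only the summary crosses the network."""
    p = LinkAndTextExtractor()
    p.feed(html)
    return {"links": p.links, "top_words": p.words[:10], "size": len(html)}

page = ('<html><body><h1>News</h1>'
        '<a href="/a">A</a><a href="/b">B</a></body></html>')
print(agent_analyze(page))   # for real pages, this summary is far smaller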

Secluding Efficient Geographic Multicast Protocol against Multicast Attacks

By A. Amuthan, R. Kaviarasan, S. Parthiban

DOI: https://doi.org/10.5815/ijitcs.2013.10.10, Pub. Date: 8 Sep. 2013

A Mobile Ad-hoc Network (MANET) is composed of mobile nodes without any infrastructure. The network nodes in MANETs act not only as ordinary network nodes but also as routers for other peer devices. The dynamic topology, the lack of a fixed infrastructure and the wireless nature make MANETs susceptible to security attacks. In addition, due to the inherent, severe constraints in power, storage and computational resources of MANET nodes, incorporating sound defense mechanisms against such attacks is non-trivial. Interest in MANET research has therefore been growing over the last few years, and security is a big issue since these networks are infrastructure-less and autonomous. The main objective of this paper is to address some basic security concerns in the EGMP protocol, a multicast protocol found to be vulnerable to attacks such as blackhole, wormhole and flooding attacks. The proposed technique uses certificates to prevent these attacks and to find the malicious node. The attacks are simulated using NS2 version 2.28 and the proposed proactive technique is implemented. Metrics such as packet delivery ratio, control overhead, total overhead and end-to-end delay are used to show that the proposed solution is secure and robust.
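
The certificate idea reduces to letting receivers verify that a multicast control message really originated from an authorized node. The sketch below illustrates that verification step with Ed25519 signatures from the third-party Python 'cryptography' package; it is a hypothetical stand-in for the paper's certificate scheme, and the message formats and names are illustrative.

# Verifying that a multicast control message comes from an authorized node,
# the core of a certificate-based defense. Hypothetical sketch, not the
# paper's scheme; requires the third-party 'cryptography' package.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Setup: a trusted authority has bound this public key to node "A"
# (that binding is what the certificate would attest).
node_a_key = Ed25519PrivateKey.generate()
node_a_pub = node_a_key.public_key()

def send_control_msg(payload: bytes):
    return payload, node_a_key.sign(payload)      # node A signs JOIN/LEAVE etc.

def accept_control_msg(payload: bytes, signature: bytes) -> bool:
    try:
        node_a_pub.verify(signature, payload)     # raises if forged/tampered
        return True
    except InvalidSignature:
        return False                              # drop: possible malicious node

msg, sig = send_control_msg(b"JOIN group=224.0.1.1 node=A")
print(accept_control_msg(msg, sig))                                # True
print(accept_control_msg(b"JOIN group=224.0.1.1 node=EVIL", sig))  # False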

On Approximate Equivalences of Multigranular Rough Sets and Approximate Reasoning

By B. K. Tripathy, Anirban Mitra

DOI: https://doi.org/10.5815/ijitcs.2013.10.11, Pub. Date: 8 Sep. 2013

The notion of rough sets introduced by Pawlak has been a successful model for capturing impreciseness in data and has numerous applications; it has since been extended in several ways. The basic rough set introduced by Pawlak is a single granulation model from the granular computing point of view, and it has recently been extended to two types of multigranular rough set models. Pawlak and Novotny introduced the notions of rough set equalities, called approximate equalities. These notions of equality use user knowledge to decide the equality of sets and hence generate approximate reasoning. However, as shown by Tripathy et al., even these notions have limited applicability for incorporating user knowledge, so they introduced the notion of rough equivalence. The notion of rough equalities in the multigranulation context has already been introduced and studied. In this article, we introduce the concepts of multigranular rough equivalences and establish their properties, including the replacement properties obtained by interchanging the bottom equivalences with the top equivalences. We provide a real-life example for both types of multigranulation, compare the rough multigranular equalities with the rough multigranular equivalences, and illustrate the interpretation of the rough equivalences through the example.
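
As background, the two multigranular rough set models referred to above are conventionally defined as follows (after Qian et al.), where R and S are equivalence relations on a universe U and [x]_R, [x]_S are the corresponding equivalence classes; the optimistic model is written R+S and the pessimistic model R*S. This recap is standard material, not a result of the paper.

\underline{R+S}(X) = \{\, x \in U : [x]_R \subseteq X \ \text{or}\ [x]_S \subseteq X \,\}, \qquad \overline{R+S}(X) = \big(\underline{R+S}(X^c)\big)^c

\underline{R \ast S}(X) = \{\, x \in U : [x]_R \subseteq X \ \text{and}\ [x]_S \subseteq X \,\}, \qquad \overline{R \ast S}(X) = \big(\underline{R \ast S}(X^c)\big)^c

The bottom and top equivalences mentioned in the abstract compare sets through their lower and upper approximations, respectively.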

A New Hybrid Grey Neural Network Based on Grey Verhulst Model and BP Neural Network for Time Series Forecasting

By Deqiang Zhou

DOI: https://doi.org/10.5815/ijitcs.2013.10.12, Pub. Date: 8 Sep. 2013

The advantages and disadvantages of the BP neural network and the grey Verhulst model for time series prediction are analyzed, and a new forecasting model is proposed for time series that grow in an S-shaped curve or whose growth is saturated. From the data fitting viewpoint, the new model, named the grey Verhulst neural network, is built from the grey Verhulst model and a BP neural network. First, the Verhulst model is mapped to a BP neural network and the correspondence between the grey Verhulst model parameters and the BP network weights is established. Then the BP neural network is trained by means of the BP algorithm; when the network converges, the optimized weights can be extracted and the optimized grey Verhulst neural network model obtained. The experimental results show that the new model is effective, with the advantages of high precision, few required samples and simple calculation, and that it makes full use of the similarities and complementarities between the grey system model and the BP neural network to overcome the disadvantages of applying the grey model and the neural network separately. It is concluded that the grey Verhulst neural network is a feasible and effective modeling method for time series that increase along an S-shaped curve.
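
As background, the grey Verhulst model mentioned above is conventionally defined as follows, where x^{(0)} is the raw series, x^{(1)} its accumulated generating sequence and z^{(1)} the mean sequence of x^{(1)}; this standard formulation is stated here for context rather than taken from the paper.

x^{(1)}(k) = \sum_{i=1}^{k} x^{(0)}(i), \qquad z^{(1)}(k) = \tfrac{1}{2}\left(x^{(1)}(k) + x^{(1)}(k-1)\right)

x^{(0)}(k) + a\,z^{(1)}(k) = b\,\left(z^{(1)}(k)\right)^2, \qquad \frac{dx^{(1)}}{dt} + a\,x^{(1)} = b\,\left(x^{(1)}\right)^2

\hat{x}^{(1)}(k+1) = \frac{a\,x^{(1)}(0)}{b\,x^{(1)}(0) + \left(a - b\,x^{(1)}(0)\right)e^{ak}}

The parameters a and b estimated by least squares are exactly the quantities the paper maps onto BP network weights.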
