Abstract

A Novel Method For Face Segmentation And Face Features Extraction Based On Voronoi Diagram
Paper Number:   01
Abstract:
Recognition of human faces from among many still images is a research field of fast-increasing interest. This paper presents a novel face detection algorithm for gray intensity images. At first, face location and extraction have to be performed in order to get an approximate, if not exact, representation of a given face in an image. Many approaches have been introduced; some deal with colored images, others with binary or gray-scaled ones, and some are constrained to one face per image while others can process multiple instances of faces within an image. Some of these approaches gave good results, however with a high computational price and burden to pay. Thus, a fast and accurate algorithm is still a field to be explored. Our proposed approach is based on the Voronoi Diagram (VD), a well-known technique in computational geometry. We use it to form a decision whereby intensity ranges are generated based on the vertices of the external boundary of the Delaunay triangulation. By performing this, our image yields segmented regions. A greedy search algorithm then looks for a face candidate, focusing the action around ellipse-like regions. VD is now used in many fields, but researchers mainly focus on its use in skeletonization and in generating Euclidean distances; here comes our contribution, namely in exploiting the Voronoi diagram and its dual (the Delaunay triangulation) to generate features that can be used as patterns for face recognition.
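
As a rough, generic illustration of the geometric machinery the abstract relies on (not the authors' algorithm), the following Python sketch builds a Delaunay triangulation and a Voronoi diagram over a handful of arbitrary 2-D points with SciPy and reads off the vertices of the triangulation's external boundary (its convex hull); all point coordinates are invented.

    # Sketch: Delaunay triangulation and Voronoi diagram over arbitrary 2-D points
    # (illustrative only; the sample points below are made up).
    import numpy as np
    from scipy.spatial import Delaunay, Voronoi

    points = np.array([[10, 10], [50, 12], [30, 40], [70, 35], [45, 80], [15, 60]], dtype=float)

    tri = Delaunay(points)          # Delaunay triangulation (dual of the Voronoi diagram)
    vor = Voronoi(points)           # Voronoi diagram of the same points

    # Vertices on the external boundary of the triangulation (its convex hull).
    hull_vertex_ids = np.unique(tri.convex_hull)
    boundary_points = points[hull_vertex_ids]

    print("triangles:", tri.simplices)
    print("boundary vertices:", boundary_points)
    print("Voronoi vertices:", vor.vertices)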


Print Protection Using A Diffused Background Watermark
Paper Number:    03
Abstract:
In this paper a new approach to digital watermarking in printed documents for copyright protection is described.  The process is defined by using analytical techniques and concepts borrowed from Cryptography.  During watermarking the background is generated using convolution of the watermark image (logo) with a source of noise in a process termed 'diffusion'.  The cover image (text) is inserted into the foreground of the document using a simple additive process termed 'confusion'.  The watermark is subsequently recovered by removing the foreground and then correlating with the original noise source.  It is demonstrated that this method is robust to a wide variety of attacks including geometric attacks, drawing, crumpling and print/scan attack.  Experiments also show that the method is relatively insensitive to lossy compression, making it well suited to scanned electronic documents. The details of this method as well as the experimental results are shown.
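
A minimal numerical sketch of the diffusion/confusion idea described above, embedding by convolving a logo with a noise field and recovering by correlating with the same noise, might look as follows with NumPy/SciPy; the array sizes, the embedding weight and the idealized foreground removal are illustrative assumptions, not the authors' parameters.

    # Sketch of diffusion watermarking: background = logo convolved with noise ('diffusion'),
    # document = cover + alpha * background ('confusion'), recovery by correlating with the noise.
    import numpy as np
    from scipy.signal import fftconvolve

    rng = np.random.default_rng(0)
    logo  = (rng.random((32, 32)) > 0.8).astype(float)   # stand-in binary logo
    cover = rng.random((128, 128))                        # stand-in cover (text) image
    noise = rng.standard_normal((128, 128))               # shared secret noise source

    background = fftconvolve(noise, logo, mode="same")    # 'diffusion' of the logo
    background /= np.abs(background).max()
    document = cover + 0.1 * background                   # 'confusion': additive foreground

    # Recovery: remove (an estimate of) the foreground, then correlate with the noise source.
    residual = document - cover                                         # idealized foreground removal
    recovered = fftconvolve(residual, noise[::-1, ::-1], mode="same")   # correlation via flipped kernel
    print("peak of recovered watermark energy:", recovered.max())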


Dense Disparity Map: A Multiresolution Approach
Paper Number:   04
Abstract:
Stereo is an important area in computer vision. This paper presents an efficient stereo matching algorithm based on a correlation technique which produces a dense disparity map by using a pyramid structure. The basic correlation algorithm consists of computing the correlation scores for every point in the image by taking a fixed window in the first image and a shifting window in the second image. The second window is moved in the second image along the epipolar line. For each point, the correlation score is computed using some criteria, and two pixels are said to be matched when a maximum score is found. The most classic area-based methods use cross-correlation with a fixed window size, but this does not give satisfying results. In order to decrease the number of invalid points and to increase the density of the sparse disparity map, we propose in this paper a hierarchical algorithm where the window size is fixed in order to perform the matching at several levels of resolution. Both synthetic and real image tests have been performed, and good results have been obtained.
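
For concreteness, the basic fixed-window correlation step can be sketched as below for rectified images (matching along the same scanline); the window size, disparity range and the zero-mean normalized cross-correlation score are common choices and not necessarily the paper's exact criteria.

    # Sketch of fixed-window correlation matching along the epipolar (scan) line.
    import numpy as np

    def match_pixel(left, right, y, x, half_win=3, max_disp=16):
        """Return the disparity whose right-image window best matches the left window."""
        win_l = left[y - half_win:y + half_win + 1, x - half_win:x + half_win + 1]
        best_d, best_score = 0, -np.inf
        for d in range(max_disp + 1):
            xr = x - d
            if xr - half_win < 0:
                break
            win_r = right[y - half_win:y + half_win + 1, xr - half_win:xr + half_win + 1]
            # zero-mean normalized cross-correlation as the matching score
            a = win_l - win_l.mean()
            b = win_r - win_r.mean()
            denom = np.sqrt((a * a).sum() * (b * b).sum()) + 1e-9
            score = (a * b).sum() / denom
            if score > best_score:
                best_d, best_score = d, score
        return best_d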


A Cell Selection Policy For An Input Buffered Packet Switch
Paper Number:   06
Abstract:
Many architectures of Internet routers, ATM and Ethernet switches have been proposed and analyzed in the literature. Theoretically reliable and valid solutions have been developed but are not feasible in practice for commercial and technological reasons, and few papers develop the implementation and simulation aspects. The objective of this paper is the design of a packet switch with minimum cost and minimum hardware complexity. A Benes network as the interconnection network, combined with a simple cell selection policy implemented in hardware, has been adopted to reduce the hardware cost and the scheduling algorithm complexity. We propose and simulate an input-queuing architecture using a hardware description language. Performance in terms of throughput and cell loss is evaluated. The measures confirm that the performance of our switch is quite similar to that of other switches, but with less cost and less complexity.


A Realistic Plants' Modeling And Rendering With Lindenmayer Systems And Constructive Solid Geometry
Paper Number:    10
Abstract:
Nowadays the modeling of natural phenomena like mountains, snow, plants, and others is a field in constant progress. The research is mainly developed in the modeling and the visualization of the objects. The objectives of the researchers are the photorealism of the image, the execution time, and the introduction of more and more functionalities into the models. In this paper, we present a new simulator of plant development that combines the biological modeling of plants with L-systems and schematic modeling based on Constructive Solid Geometry (CSG). The rendering was realized with Pixar's RenderMan.


Alsat-1 Images and Zooming Enhancement Using Super Resolution Method
Paper Number:   12
Abstract:
In this paper we investigate how information contained in multiple, overlapping images of the same scene from Alsat-1, the first Algerian microsatellite, may be combined to produce images of superior quality. In each view, light emitted from the scene is projected onto the sensor array of the Alsat-1 imager, producing an irradiance pattern which is temporally integrated, spatially sampled and quantized to generate an image. In addition, the image is often blurred due to de-focus and/or motion. We attempt to reverse these degradations in order to estimate a high-resolution representation of the intensities in the scene. The process of reconstructing a high-resolution image from several different images covering the same region of the world is called static super resolution. In this paper we also address the problem of producing an enlarged picture from a given digital image (zooming). This problem arises frequently whenever a user wishes to zoom in to get a better view of a given picture.


An Arabic Character Recognition System in Using General Auto-Associative Memory Model
Paper Number:   13
Abstract:
This paper proposes a method for the recognition of typewritten Arabic characters. Recognition is performed using a general auto-associative memory (GAM) based approach, with an improved chain code and invariant moment shape descriptors as the main features. The role of the GAM artificial neural network used as a classifier is examined and investigated, and its performance is evaluated when trained and tested with printed Arabic characters using structural features. The feature vector used to train the GAM recognition model consists of the chain code (direction and length), the number of holes and/or dots with their positions, and the moments of the segmented character. By using the new GAM concept called the supporting function, the model shows a storage capacity suitable for storing all Arabic characters. A simple experiment was conducted to show the effectiveness of the Sample Learning Algorithm (SLA) in the recognition of noisy Arabic characters. The developed system has been tested using a set of data and the results obtained have proved its efficiency and robustness.


A New Approach For Time Series Forecasting Based On Artificial Neural Network
Paper Number:   14
Abstract:
Traditional feedforward neural networks are static structures which simply map input to output. Motivated by biological considerations, a method for predicting new artificial network weighting functions is devised. Using these predicted weighting functions, time series forecasts are derived. In the application investigated, forecasting a country's population, this method outperformed all standard neural network prediction methods.


Arabic Phoneme Recognizer Based On Neural Networks
Paper Number:    15
Abstract:
An Arabic phoneme recognizer has been designed and implemented using the categories of Standard Arabic phonemes. Accordingly, six sets of Arabic phonemes are defined, and one dedicated neural network is designed for each set. The proposed network is a linear hierarchical network which consists of two subnetworks. The first subnetwork is based on the Kohonen Self-Organizing Map; the second subnetwork is based on Learning Vector Quantization.



A New Chaining Blocks Mode For Ciphering
Paper Number:    17
Abstract:
Electronic data and information exchange (including private and personal information, online commercial transactions, etc.) is the service most exposed to harmful attacks, hence the importance of the security of this service. Security covers confidentiality, integrity, authentication and non-repudiation concerns. Cryptography is known to be a basic tool for Internet security. In this paper we are interested in using cryptographic techniques to secure electronic exchanges and protect them from being attacked during their transmission on the Net. We particularly emphasize enciphering, and our solution offers the user three security levels: low, medium and high. Moreover, the most important contribution of this paper is that we strengthen the enciphering techniques used by introducing two new block chaining modes that allow us to combine two different enciphering algorithms at the same time to encrypt the message.


Automatic Arabic Phoneme Segmentation
Paper Number:    18
Abstract:
In this paper we propose a new Arabic phoneme segmentation algorithm using a time-domain analysis method. It basically depends on rules of the Arabic phonological system and consists of three rules, which are used to begin the process of segmentation in Arabic. These rules enable one to segment almost any utterance in Arabic correctly and easily.


A Layered Connector Model for A Complex Software System Design
Paper Number:    21
Abstract:
Despite the important role they play in the definition of software architecture, connectors have not received sufficient consideration across various research works in the field of software architecture. The concept of connector is one of the most important aspects that need to be well understood and clarified. Often, architects put all their skills and focus all their mental effort into defining the most important parts of the system, called components, and link them in a way that satisfies the functionality and requirements of the desired system. The refining process, which takes the system from its high-level abstraction state to an implementation state, is often applied to components; in fact, an architect is often not concerned with connector refinement. At each step of the refinement process, architects exploit an interconnection technology to link components rather than designing a new one. In such a way, connector implementation is often scattered among various components, yielding a non-reusable communication infrastructure. In this paper, we try to understand more accurately the concept of connector, which may first be considered a self-sufficient entity like components, and then we present various concepts needed to define the basis for a connector model that may be embedded as part of an ADL or any specification tool. Finally, we propose the architecture of a connector model.


Synchronous Machine Steady-State Stability Assessment Using An Artificial Neural Network
Paper Number:    24
Abstract:
This paper proposes a neural network approach to steady-state stability assessment for a single-machine system. The perceptron can form two decision regions separated by a hyperplane. The developed neural network is trained in the space of training patterns; this space can be divided into a stable region and an unstable one by using the training data. The boundary between both regions is described by a set of hyperplanes in the pattern space. These hyperplanes, which can be used to represent the connection weights between neurons, are known as weight hyperplanes. It is found that the proposed learning method converges much faster than the back-propagation with momentum method.


Computer simulation for the atmosphere effect on the image quality of the thermal camera
Paper Number:   25
Abstract:
Thermal radiation is attenuated in its passage through terrestrial atmospheres mainly by the processes of absorption and scattering. Knowledge of the atmospheric transmission and attenuation of spectral radiation at wavelengths between 2 and 15 µm at the earth's surface is required for many purposes. Therefore, a computer simulation demonstrating the effects of the atmosphere, aerosol and turbulence on the output image quality of a thermal camera has been developed. The simulation program calculates the detection range at which the system is able to detect an object, as a function of the signal-to-noise ratio (S/N). The results show that the aerosol effect is much more dominant than turbulence in the transmission of IR radiation in the atmosphere. At a given SNR, the range for a clear atmosphere is larger than that for a dense atmosphere.


Image Compression Using Genetic Algorithm
Paper Number:   31
Abstract:
The principal objective of this research is to adopt the Genetic Algorithm (GA), first by studying it and then by examining the facilities it provides. The candidate field for applying the facilities of the genetic algorithm is image compression, because researchers have taken great interest in this field in recent years. This research uses the facilities of the genetic algorithm to enhance the performance of one of the popular compression methods; the Vector Quantization (VQ) method is selected in this work. After studying this method, a new algorithm for combining the GA with it was constructed, and the programs required for testing this algorithm were written. A good enhancement was recorded in the performance of the VQ method when combined with the GA. The proposed algorithm was tested by applying it to some image data files, and some fidelity measures were calculated to evaluate its performance. All programs were written using Visual C++ (version 6.0) and Visual BASIC (version 6.0) and were executed on a Pentium III (500 MHz) personal computer.
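
As a hedged illustration of how a GA can be wrapped around VQ codebook design (not the paper's exact algorithm), one can treat each chromosome as a candidate codebook and use the average quantization distortion as the fitness, roughly as sketched below; the population size, generation count and mutation strength are arbitrary.

    # Sketch: a tiny GA that evolves VQ codebooks, with mean squared distortion as fitness.
    import numpy as np

    rng = np.random.default_rng(1)
    vectors = rng.random((500, 16))          # training vectors (e.g. 4x4 image blocks)
    K, POP, GENS = 8, 20, 30                 # codebook size, population size, generations

    def distortion(codebook):
        d = ((vectors[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
        return d.min(axis=1).mean()          # average distortion to nearest codeword

    population = [rng.random((K, 16)) for _ in range(POP)]
    for _ in range(GENS):
        population.sort(key=distortion)      # lower distortion = fitter
        parents = population[:POP // 2]
        children = []
        for _ in range(POP - len(parents)):
            a, b = rng.choice(len(parents), 2, replace=False)
            cut = rng.integers(1, K)
            child = np.vstack([parents[a][:cut], parents[b][cut:]])      # crossover
            child += 0.02 * rng.standard_normal(child.shape)             # mutation
            children.append(child)
        population = parents + children

    best = min(population, key=distortion)
    print("best codebook distortion:", distortion(best))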


Oil Height Determination From Capacitance Tomography Measurements Using Neural Network
Paper Number:    40
Abstract:
This paper presents a 'direct' method for gas-oil interface level determination using an artificial neural network approach based on Electrical Capacitance Tomography (ECT) measurements. 'Direct' here means that the gas-oil interface levels are obtained directly from the ECT measurements without recourse to image reconstruction. The preliminary work models a separation tank that is filled with gas and oil. An ECT system attached around the tank is used to obtain ECT measurements. Sets of ECT measurements together with their corresponding oil heights are fed into a Multi-Layer Perceptron (MLP) neural network for training. After being trained, the MLP is tested on sets of independent ECT measurements. The results show that 'direct' gas-oil interface level measurement from ECT data is feasible with the use of a neural network system.
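
A minimal sketch of the training and testing loop described above, using scikit-learn's MLPRegressor on synthetic stand-in data, is given below; the 66-element measurement vector (a 12-electrode sensor would give 12*11/2 = 66 electrode pairs), the fake forward model and the network size are all assumptions.

    # Sketch: map ECT capacitance measurement vectors to an interface height with an MLP.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    n_meas = 66                                   # e.g. 12-electrode sensor: 12*11/2 pairs (assumed)
    heights = rng.uniform(0.0, 1.0, size=400)     # normalized oil heights (synthetic)
    # Fake forward model: measurements loosely depend on height, plus noise.
    X = heights[:, None] * rng.random((1, n_meas)) + 0.01 * rng.standard_normal((400, n_meas))

    model = MLPRegressor(hidden_layer_sizes=(30,), max_iter=2000, random_state=0)
    model.fit(X[:300], heights[:300])             # train on one part of the data
    print("test R^2:", model.score(X[300:], heights[300:]))   # evaluate on held-out data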


On the use of Multi Layer Neural Networks in Automatic Information Retrieval
Paper Number:    41
Abstract:
A new idea is developed here, which is a particular approach to the use of neural networks in information retrieval (storing and searching large bibliographic databases). The idea is theoretical and based on the manipulation of statistical information derived from document contents in order to build their hidden-layer representations. During the learning process, the document is entered into the network and processed sentence after sentence. Each sentence forms a tree of connections through multiple layers to the corresponding document in the output layer. Each processed document adds vocabulary and connections to the network and thus forms a sub-network representation of its content. During the search process, words included in the query activate the corresponding units in the first layer, which in turn activate other units in the upper layers until they fire one or more output documents in the last layer. Back-propagation might be used in cases where irrelevant documents are fired.


Client/Server Connection Using Java Language
Paper Number:    42
Abstract:
This paper presents the idea that the motivation for the client/server paradigm arises from the problem of rendezvous. The client/server model solves the rendezvous problem by asserting that, in any pair of communicating applications, one side must start execution and wait for the other side to contact it. In this article we present the natural communication, the communication types, and the application interface through which a server and client contact each other. We implemented it using the Java programming language.
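
The rendezvous rule stated above (one side must start first and wait, the other initiates contact) can be made concrete with a minimal socket sketch; the paper itself implements this in Java, while the sketch below uses Python's standard socket API, and the loopback address, port and message are arbitrary.

    # Sketch: the client/server rendezvous -- the server starts first and waits, the client then connects.
    import socket
    import threading
    import time

    HOST, PORT = "127.0.0.1", 50007          # arbitrary loopback address and port

    def server():
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.bind((HOST, PORT))
            s.listen()                       # passive side: wait to be contacted
            conn, _ = s.accept()
            with conn:
                conn.sendall(b"hello from server")

    threading.Thread(target=server, daemon=True).start()
    time.sleep(0.2)                          # give the server time to start listening

    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as c:
        c.connect((HOST, PORT))              # active side: initiates the rendezvous
        print(c.recv(1024))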


Improvement of Land Cover Change Detection Accuracy by Fuzzy Aggregation Operators
Paper Number:    52
Abstract:
Change detection is a principal task in monitoring and wide-area surveillance. Usually, it involves the comparison of remotely sensed images of the same area acquired on different dates. The data are handled either by comparative analysis of independently produced classifications, or by an alternative approach based on simultaneous analysis using multitemporal classification. Since the difference in performance between these two approaches is well known, we propose the combination of their outputs to achieve the best possible performance. In this paper, both approaches utilize the same fuzzy classifier while the combination is carried out using various fuzzy aggregation operators. Experiments on bitemporal SPOT data showed that the combination can significantly improve the change detection accuracy when an appropriate fuzzy aggregation operator is used.
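
To make the combination step concrete, the sketch below aggregates two fuzzy "change" membership maps, one per approach, with a few standard operators (minimum, maximum, algebraic product, arithmetic mean); the membership values are invented and the paper's own operator choice may differ.

    # Sketch: combining two fuzzy "change" membership maps with standard aggregation operators.
    import numpy as np

    mu_post = np.array([0.2, 0.7, 0.9, 0.4])   # comparative-analysis memberships (made up)
    mu_multi = np.array([0.3, 0.8, 0.6, 0.1])  # multitemporal-classification memberships (made up)

    aggregations = {
        "min (t-norm)":      np.minimum(mu_post, mu_multi),
        "max (t-conorm)":    np.maximum(mu_post, mu_multi),
        "algebraic product": mu_post * mu_multi,
        "arithmetic mean":   (mu_post + mu_multi) / 2.0,
    }
    for name, mu in aggregations.items():
        print(f"{name:18s} -> change map {(mu > 0.5).astype(int)}")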



Time Series Forecasting with Fuzzy Genetic Algorithms
Paper Number:    53
Abstract:
In this paper we describe a time series forecasting method based on fuzzy systems. In particular, time series exhibiting non-linear or chaotic behaviors are selected for forecasting. A range of methodologies based on a set of fuzzy predictors permits the prediction of the same time series, but with different fuzzy partitions. Each fuzzy predictor is developed by means of a real-coded genetic algorithm. The combination of the fuzzy predictors reduces the prediction error. The prediction method is tested using Mackey-Glass time series data.



MCLA: A Multi-label Classification Learning Algorithm
Paper Number:    56
Abstract:
Association rule mining and classification are important tasks in data mining. Using association rules has proved to be an effective and accurate approach to classification. In this paper, a new classification technique based on association rules is presented. It uses a method of quickly finding frequent itemsets. The proposed technique produces multi-label rules and adjusts the class-label ranking for some of the rules to reflect changes that occur after evaluating other higher-ranked rules. The results obtained after experimenting with several different datasets showed that the presented method is an accurate and effective classification technique. Furthermore, the classifiers generated are highly competitive with those produced by decision trees and other associative classification techniques with regard to classification accuracy.



Acoustic Characterization Of Heart Valves Activity By Using The Smoothed-Pseudo Wigner-Ville Distribution
Paper Number:    58
Abstract:
The phonocardiogram (PCG) signal, which is the temporal tracing of the heart sounds, is recorded using a computer-based acquisition system that we have developed. The software part of the acquisition system, written in MATLAB, drives a personal computer to allow collection of heart sound samples in 16-bit format. The acquisition is carried out through a microphone at different sampling frequencies. The time-frequency analysis of the PCG signal is carried out by means of transformed Wigner-Ville Distributions (WVDs). The Smoothed-Pseudo Wigner-Ville Distribution (SPWVD) is selected over the other transformed approaches of the WVD as yielding an adequate Time-Frequency Representation (TFR). The SPWVD is applied to PCG signals of a normal subject acquired from the aortic, tricuspid, mitral and pulmonic auscultation areas over the chest wall. The nature of normal heart sounds is thus characterized in the systole and diastole phases, and the spectral activity of each heart valve, namely the mitral, tricuspid, aortic and pulmonic valves, is characterized in the time-frequency domain. A discussion in relation to the auscultation area of each PCG signal is then presented, allowing acoustic characterization of each myocardial orifice.


Retrieving Adaptable cases into an associative memory
Paper Number:   60
Abstract:
Case-based reasoning is a reasoning method that resolves new problems by reusing past episodes of resolution called cases. During the reuse process, the solution of an old case may be adapted to satisfy the constraints of the new problem. The success of the adaptation step depends on the quality of the remembered case: is it easy to adapt? However, the recall process is generally guided by similarity, and the most similar cases are not usually the easiest ones to adapt. It would be more accurate to guide retrieval by an adaptability criterion. The main idea of this paper is to perform retrieval guided by adaptability in an associative memory called Case Retrieval Nets (CRN). This memory model was initially conceived for retrieving cases by similarity with high performance. We aim to benefit from this performance to support the adaptability criterion, which leads to two advantages: looking ahead to the adaptation stage without incurring the full cost of adaptation, and propagating this knowledge rapidly throughout the CRN.


A new approach for synchronization in a multimedia scenario
Paper Number:    62
Abstract:
The objective of this work is to contribute to the domain of editing and presentation of multimedia scenarios, and particularly to the temporal aspects of scenarios, by considering a p-temporal Petri net for the specification of multimedia scenarios. Our objective is to answer real user (end users, authors) requirements: the user should have the possibility to specify temporal synchronization and interactions, and the possibility to present scenarios. To achieve these objectives, we define a representation model based on the following concepts: the temporal and causal relations between multimedia data, the user interactions, the representation of unknown durations, and the formal specification of scenarios based on p-temporal Petri nets, which are useful to present and simulate the scenarios.


Towards consistency analysis of UML dynamic diagrams
Paper Number:   63
Abstract:
This paper presents a methodology for verifying consistency between the state and sequence diagrams of a UML model. Our approach consists first of translating UML state diagrams into Interval Timed Petri Nets (ITPN), whose time intervals model the generation and dispatching delays of events. The method also deals with composite states and most pseudo-states, including history ones. Then sequence diagrams are mapped into dispatched-event paths adorned with time tags. The defined semantic consistency requires these paths to be subsequences of the reachability graph of the whole Petri net.


Neural Network In Corner Detection Of Chain Code Series
Paper Number:    65
Abstract:
This paper presents a neural network classifier for corner detection in chain code series. The classifier directly uses the chain code, derived using the Freeman chain code, as the training, testing and validation set. The steps of developing the neural network classifier are included in this paper. Comparison results between the neural network classifier and other computational corner detection methods are presented to show the reliability of the proposed classifier. The paper ends with a discussion of the implementation of the proposed neural network for corner detection in chain code series. Experimental results have shown that the proposed network has good robustness and detection performance, which makes this method a great choice for machine vision.
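
For readers unfamiliar with the representation, a Freeman chain code simply lists, for each step along a contour, the direction (0-7) to the next boundary pixel; a small hand-rolled extraction sketch follows, using one common 8-direction ordering that is not necessarily the paper's convention.

    # Sketch: Freeman 8-direction chain code of a contour given as an ordered pixel list.
    import numpy as np

    # Directions 0..7, counter-clockwise starting at East (one common convention).
    DIRS = [(0, 1), (-1, 1), (-1, 0), (-1, -1), (0, -1), (1, -1), (1, 0), (1, 1)]

    def chain_code(boundary_pixels):
        """boundary_pixels: ordered list of (row, col) points along a closed contour."""
        code = []
        for (r0, c0), (r1, c1) in zip(boundary_pixels, boundary_pixels[1:]):
            code.append(DIRS.index((r1 - r0, c1 - c0)))
        return code

    # Example: a tiny square traversed clockwise in image coordinates.
    square = [(0, 0), (0, 1), (1, 1), (1, 0), (0, 0)]
    print(chain_code(square))   # [0, 6, 4, 2]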


Metrics For Object Oriented Design (MOOD) To Assess Java Programs
Paper Number:    68
Abstract:
MOOD metrics are well-known metrics used to measure some characteristics of object-oriented programs. In this research, a system based on the MOOD metrics has been developed to evaluate and grade Java programs. The interval of each MOOD metric has been adapted, based on experimental results, to fit the evaluation of Java programs, and a weight factor has been introduced to reflect the importance of each characteristic. The system has been tested with many different programs that vary in their complexity and functionality. The metrics have also been used to evaluate and grade student programs at the University of Jordan. For all cases, the system shows good results.


Novel Computer Network Approach for Electricity Tariff Management Systems
Paper Number:    72
Abstract:
Conserving electrical energy has recently become a distinctive element of our everyday life. Electrical energy consumers in the residential and commercial sectors are particularly sensitive to price increases. Often, policies for consumption reduction relying entirely on voluntary response by consumers do not produce much effect. By contrast, policies that offer financial incentives to consumers via tariff reduction are highly successful. The purpose of the present paper is to outline a novel computer network approach for electricity tariff management systems. The present system is given the acronym ODETNET: a software driver for an electricity tariff management system based on computer networking techniques. The ODETNET methodology is based on using the power grid network itself for computer networking. The ODETNET prototype automatically scans readings from kilowatt-hour (kWh) meters. These automated meter readings are input directly to a network of computers, instead of the normal manual situation where a human observer visually notes the mechanical registers of the kWh meter. The innovative contribution of ODETNET results from the fact that the interconnection of computers in ODETNET is via the electrical power grid network itself. ODETNET prototype experiments were successfully implemented with the full cooperation of the Electricity Ministry. Both hardware and software aspects of the ODETNET system prototype are discussed.


Forest Fire Prediction: The GIS Simulation Model
Paper Number:    74
Abstract:
The mastery of forest fire-fighting strategies necessarily requires deep knowledge of the forest fire phenomenon. One of the efficient means of apprehending forest fires is to have a tool capable of informing us about the behavior of a fire before its appearance, according to the given climatic conditions. In this context, simulation remains an effective tool for predicting fire behavior. It permits determining, with a relative degree of confidence, the zones susceptible to being ravaged by fire during a given period. The objective of this work is the study of the mechanisms of forest fire progression through the elaboration of an automatic tool capable of suitably modeling a forest fire, its parameters, its propagation and its behavior in a given region. Throughout this study, the considerable (perhaps indispensable) contribution of Geographical Information Systems (GIS), in combination with simulation techniques, to the apprehension of the "forest fires" problem is made conspicuous [DAG 94]. This resides in the power of GIS to model any phenomenon presenting a geographical character. The interest that such a study presents for the operators in charge of managing a forest fire (fire departments, forest services, local authorities, etc.) is twofold. It permits defining, in the long term, a homogeneous and coherent forest fire prevention policy, while the system permits verifying the adequacy of existing fire-fighting facilities and infrastructure against the reality of a disaster [DAG 97]. It also allows determining, in the event of the outbreak of a fire, the means to deploy for coordinating intervention teams and an efficient strategy against the fire's progression.


Speech Denoising using Wavelet Thresholding
Paper Number:    80
Abstract:
During the past decade, the Wavelet Transform (WT) has been applied to various research areas in statistical signal processing; its applications include speech and image denoising, compression, detection and pattern recognition. In this paper we analyze the performance of two thresholding methods for denoising speech signals in the wavelet domain. The first method uses an automatic soft thresholding criterion for the removal of background noise from the speech signal, and the second method removes the noise using automatic hard thresholding. Our practical analysis results are based on the application of wavelet-based hard and soft thresholding to the denoising of speech signals corrupted by background and scaled white noise. More specifically, we show that with wavelet-based techniques our signals can achieve a higher signal-to-noise ratio (SNR), a better peak signal-to-noise ratio (PSNR), and a reduced normalized root mean square error (NRMSE), while the retained signal energy is 99.9885% and the resulting denoised signals are generally much smoother.
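
The soft/hard thresholding procedure can be sketched with PyWavelets as below; the wavelet, decomposition level, noise estimate and universal threshold shown here are common textbook choices, not necessarily the automatic criteria used in the paper.

    # Sketch: wavelet denoising of a 1-D signal by thresholding its detail coefficients.
    import numpy as np
    import pywt

    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 2048)
    clean = np.sin(2 * np.pi * 12 * t) * np.exp(-3 * t)     # stand-in "speech" signal
    noisy = clean + 0.2 * rng.standard_normal(t.size)

    coeffs = pywt.wavedec(noisy, "db8", level=5)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745          # noise estimate from the finest scale
    thr = sigma * np.sqrt(2 * np.log(noisy.size))           # universal threshold

    # Use mode="soft" or mode="hard" for the two thresholding variants compared in the paper.
    denoised_coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    denoised = pywt.waverec(denoised_coeffs, "db8")[: noisy.size]

    snr = 10 * np.log10(np.sum(clean ** 2) / np.sum((denoised - clean) ** 2))
    print(f"output SNR: {snr:.1f} dB")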


TCP Contextualization And Chaos Control In Wired Cum Wireless Environment
Paper Number:    82
Abstract:
Technological advances in mobile devices have created a new network environment, namely the wired cum wireless environment. Such networks have imposed great challenges for network support. Firstly, while wireless networks provide mobility and convenience to mobile users, the quality of service (QoS) of today's wireless networks still lags behind that of wired networks, thus degrading the overall network performance. Secondly, the transmission control protocol (TCP), which is used extensively in wired networks, has extended its usage to wireless networks. Though TCP is optimized for performance on wired networks, its misinterpretation of wireless errors as network congestion significantly degrades network performance. In order to overcome these predicaments we propose a revolutionary control system that will complement TCP in stabilizing the network and thus increase network performance. Based on the Cynefin framework, we contextualize TCP control management and focus on control management at the border of the complex and chaos domains, that is, at the edge of chaos. We apply bifurcation control by delaying the occurrence of bifurcations in the system. This in return will stabilize the behavioral pattern of TCP traffic flow in the wired cum wireless network environment.


The IP ability to disable the DDoS accessibility
Paper Number:   85
Abstract:
The most popular methods used in the Distributed Denial of Service (DDoS) attack and their defensive mechanisms (such as IP traceback techniques) are discussed in this paper. A new solution is suggested to prevent flooding Web servers. Readers will find a step by step explanation of the suggested solution to protect sites from DDoS attacks.


Construction Method of Audio Signal for Secret Communication (CASIHO)
Paper Number:    88
Abstract:
Although hiding messages has become more difficult, methods for detecting and analyzing information hiding have become increasingly sophisticated for high-resolution digital images and digital audio carriers, so the search for new methods that support information hiding techniques is the only way to make the analyzer's task as hard as possible. This paper describes a new approach to constructing hidden messages in digital audio. The approach uses a linear transform to rebuild and change the type of the secret information, giving it new statistical features; the result is then embedded into the audio carrier using low-bit encoding.
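
A bare-bones illustration of low-bit (LSB) encoding into 16-bit audio samples is sketched below; the paper's actual contribution, the linear transform applied to the secret data beforehand, is not reproduced here, and the payload and sample values are arbitrary.

    # Sketch: embed/extract a byte string in the least significant bits of 16-bit audio samples.
    import numpy as np

    def embed(samples, payload):
        bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
        out = samples.copy()
        out[: bits.size] = (out[: bits.size] & ~1) | bits      # overwrite one LSB per sample
        return out

    def extract(samples, n_bytes):
        bits = (samples[: n_bytes * 8] & 1).astype(np.uint8)
        return np.packbits(bits).tobytes()

    audio = np.random.default_rng(0).integers(-2**15, 2**15, 4000).astype(np.int16)
    stego = embed(audio, b"secret")
    print(extract(stego, 6))   # b'secret'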


Analysis of Density Effect in Probabilistic Flooding in Mobile Ad Hoc Networks
Paper Number:    89
Abstract:
Broadcasting is a fundamental and effective data dissemination mechanism with several applications, such as route discovery, address resolution, and many other network services. While data broadcasting has many advantages, it induces some difficulties known as broadcast storm problems, which are natural consequences of redundant retransmission, collision, and contention. Data broadcasting has traditionally been based on the flooding protocol, which floods the network with a high number of packet rebroadcasts until the desired routes are discovered [6]. Density is the number of network nodes per unit area for a given transmission range. This paper investigates the effect of density at different speeds on the behaviour of probabilistic flooding in mobile ad hoc networks (MANETs). This investigation is conducted through extensive simulation on networks of 25 to 100 nodes. The obtained results reveal that density at different speeds has a critical impact on the levels of rebroadcast and reachability achieved by probabilistic flooding.
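
The core of probabilistic flooding is simple: on first receipt of a broadcast packet, a node rebroadcasts it with some fixed probability p rather than always. A toy simulation sketch on a random geometric graph is given below; the node count, radio range and p are made-up parameters, not the simulation settings of the paper.

    # Sketch: probabilistic flooding on a random geometric graph; measures reachability.
    import random

    def simulate(n=50, radius=0.25, p=0.6, seed=0):
        rng = random.Random(seed)
        pos = [(rng.random(), rng.random()) for _ in range(n)]
        neigh = [[j for j in range(n) if j != i
                  and (pos[i][0] - pos[j][0])**2 + (pos[i][1] - pos[j][1])**2 <= radius**2]
                 for i in range(n)]
        reached, frontier, rebroadcasts = {0}, [0], 0
        while frontier:
            node = frontier.pop()
            if node == 0 or rng.random() < p:     # source always sends; others with probability p
                rebroadcasts += 1
                for j in neigh[node]:
                    if j not in reached:
                        reached.add(j)
                        frontier.append(j)
        return len(reached) / n, rebroadcasts

    print("reachability, rebroadcasts:", simulate())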


Lane Following for Marked Road
Paper Number:    90
Abstract:
A low-cost lane-boundary following algorithm is described in this paper. The algorithm is intended for painted roads with low curvature. The basic idea of our approach is that complete processing of each image can be avoided by using knowledge of the lane marking positions in previous images. Marking detection is obtained using the Radon transform, which exploits the brightness of the markings relative to the road surface. Experimental tests proved the robustness of this approach even in the presence of shadows. The originality of our approach compared to those using the Hough transform is that it does not require any thresholding step or edge detection operator. Consequently, the proposed method is proved to be much faster.
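
For orientation, a detector of the general kind described (bright markings found as peaks in Radon/sinogram space rather than via edge detection and Hough voting) could be sketched with scikit-image as follows; the synthetic image and the single-peak picking are purely illustrative.

    # Sketch: find a bright line's orientation and offset as the peak of the Radon transform.
    import numpy as np
    from skimage.transform import radon

    img = np.zeros((200, 200))
    img[:, 95:100] = 1.0                                 # synthetic bright, near-vertical marking

    theta = np.arange(0.0, 180.0, 1.0)
    sinogram = radon(img, theta=theta, circle=False)     # one column per projection angle

    offset_idx, angle_idx = np.unravel_index(np.argmax(sinogram), sinogram.shape)
    print("marking angle (deg):", theta[angle_idx])
    print("projection offset (pixels from sinogram origin):", offset_idx)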


Optimized Trellis Coded Vector Quantization For Encoding Memoryless Sources
Paper Number:   91
Abstract:
Trellis coded quantization, both scalar and vector, improves upon traditional trellis encoded systems by labelling the trellis branches with entire subsets rather than with individual reproduction levels. Trellis Coded Vector Quantization (TCVQ) was introduced as an effective low-complexity source coding technique which achieves rate-distortion performance superior to many conventional optimal quantizers. In this paper, we describe the design of a TCVQ encoding system and its improvement by an optimization procedure developed for TCVQ codebook design. We show that, for a fixed rate, TCVQ encoding with an optimized codebook yields better performance than the non-optimized encoder. We also show the superiority of the TCVQ scheme over conventional optimal source coding techniques.


An Object Oriented Library Of Parallel Graph Algorithms
Paper Number:    92
Abstract:
Parallel computing is about 20 years old; since then, it has been used to solve complex problems and high-performance applications. Distributed systems consisting of powerful workstations and high-speed interconnection networks are an economical alternative to special-purpose supercomputers. Networks of workstations comprised of independent, general-purpose computers are available everywhere and have much unused computing power to be exploited. Utilizing these resources for parallel computing is becoming attractive with the existence of many developed tools and systems that make these workstations appear as a single virtual machine. Heterogeneity, high-latency communication, fault tolerance and dynamic load balancing are issues that should be addressed in these systems to exploit parallelism. Therefore, developing parallel programs in such systems requires expertise not only in parallel programming issues but also in distributed computing. Dynamic distributed parallelism is achieved by developing portable libraries that provide support for these issues and liberate users from the complexities of distributed systems and parallel programming; thus the programmer has only to learn the interface to the library instead of learning parallel programming. We have chosen to do this by implementing a library of parallel graph algorithms. Doing so addresses both software design and development issues, and hardware and performance issues such as the allocation of computations to workstations, communication latency, the number of workstations in the system, the granularity of computation, and load balancing. We show the efficiency of the library by comparing the execution time required for the same problem by a sequential program and by a program using our library. We have also taken a second approach and solved large instances of problems which could not be solved before on a single computer for reasons of speed or memory constraints.


Towards a Multi-representation Ontology-Based Information Systems Mediation
Paper Number:    97
Abstract:
Recent research shows that ontologies are a prominent tool for the semantic interoperability of information sources. Unfortunately, semantic interoperability is not easy to achieve, as related knowledge is most likely to be described in different terms, using different assumptions and different data structures. In this paper we provide an interoperability solution based on semantic mediation that permits the definition of a system for sharing, retrieving and transparently accessing heterogeneous information sources. The proposed approach is based on the utilization of a multi-representation ontology specified in a description logics language extended by a stamping mechanism. This solution provides a structured and shared vocabulary of several domains, serving as support to a semantic mediator charged with localizing the sources necessary to answer a request, analyzing this request for distributed extraction of data, and recomposing the results into an answer for the user.


Concealment of Information in Digital Image
Paper Number:    98
Abstract:
Transmission of secret information provides an interesting problem space for investigating information hiding. In this paper we present a new hiding system, 'Single and Double Hiding', to embed a text file in a digital image which is then hidden in a cover image, or to hide the text file in the cover image only. The cover image can be in one of several image file formats, such as BMP, JPEG, GIF, and GIF animation. The proposed system uses cryptography, compression, and image filtering to overcome the weaknesses of known methods. The system uses three newly invented methods for hiding: hiding in edge points, direct hiding, and hiding with reference images. After the hiding process, the proposed system was subjected to various kinds of attack to prove its strength. Finally, a comparison between the stego and cover images is presented using histograms and quantitative measures, and the images resulting from hiding did not draw any suspicion.


Towards a Methodology for Modelling Combinatorial Search Problems with Constraints
Paper Number:    99
Abstract:
Constraint programming (CP) is a programming paradigm especially suited for solving complex combinatorial problems. In this paper, we address the issue of methodologies for developing constraint-based programs. Modeling with constraint-based systems has been indirectly tackled through the improvement of constraint expressiveness. However, the assumption that constraints are declarative enough to allow a natural approach to modeling proves itself to be unrealistic. The basic idea of the proposed constraint-based programming methodology is the identification of problem/specification families and of efficient corresponding programs. We show that the synthesis of constraint programs from specifications can be automated. In order to enable (or facilitate) automated synthesis, inputs ought to be formal, though it would then be more adequate to say that the inputs are programs and that synthesis is compilation.


Using Fuzzy AODV Routing Protocol in Ad-hoc Networks
Paper Number:   102
Abstract:
The Ad hoc On-Demand Distance Vector (AODV) routing protocol has been, and continues to be, a very active and fruitful research area since it was introduced for ad hoc wireless networking. However, AODV uses static values for its timeout parameters, like the Active Route Timeout (ART), which specifies the time that a route can stay active in the routing table. Timeouts may be determined more accurately and dynamically via measurement, instead of using a statically configured value. To accomplish this, we use a fuzzy logic system to obtain dynamic values for ART depending on the situation of the transmitting node. The analysis shows that the proposed design method is quite efficient and superior to the conventional design method with respect to data packet delivery ratio, routing overhead and average end-to-end delay.
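
A toy sketch of the kind of fuzzy inference involved is given below: two hypothetical inputs (node speed and a link-stability score) are mapped to an ART value by a small Mamdani-style rule base with centroid defuzzification. The membership functions, rules and ranges are illustrative assumptions, not the paper's design.

    # Sketch: tiny fuzzy system mapping node speed and link stability to an ART value (seconds).
    import numpy as np

    def tri(x, a, b, c):
        """Triangular membership function with corners a, b, c."""
        return np.maximum(np.minimum((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

    def art_seconds(speed_mps, stability):
        # Input fuzzification (ranges are assumptions).
        slow, fast = tri(speed_mps, 0, 0, 10), tri(speed_mps, 5, 20, 20)
        weak, strong = tri(stability, 0, 0, 0.6), tri(stability, 0.4, 1, 1)
        # Rules: slow and stable -> long ART; fast or weak link -> short ART.
        w_long = min(slow, strong)
        w_short = max(fast, weak)
        # Centroid defuzzification over two output singletons (short = 1 s, long = 6 s).
        centroids = np.array([1.0, 6.0])
        weights = np.array([w_short, w_long]) + 1e-9
        return float((weights * centroids).sum() / weights.sum())

    print(art_seconds(speed_mps=2.0, stability=0.9))    # slow, stable -> longer timeout
    print(art_seconds(speed_mps=18.0, stability=0.3))   # fast, unstable -> shorter timeout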


A New Word And Diacritic Segmentation Technique For Arabic Handwriting
Paper Number:    103
Abstract:
Failing to segment a handwritten text correctly will definitely result in poor recognition, no matter how well the following and preceding stages are designed; a considerable share of recognition errors can be attributed to the segmentation stage. In this paper a new algorithm for segmenting a handwritten Arabic text line into words and diacritics is proposed. In our approach we rely on a fill-in function, stroke thickness, and text-line height to segment the words/subwords and diacritics of Arabic handwriting, and we have developed a simple procedure for stroke thickness estimation. In addition to segmentation, our algorithm performs partial recognition, as it can recognize the Arabic character Alif and some diacritics during the segmentation process. A number of experiments were conducted and high accuracy was obtained.


Artificial Neural Network Model for Prediction Solar Radiation Data: Application for Sizing Stand-alone Photovoltaic Power System
Paper Number:    104
Abstract:
The prediction of daily global solar radiation data is very important for many solar applications; possible applications can be found in meteorology, renewable energy and solar energy conversion. In this paper, we investigate the use of Radial Basis Function (RBF) networks to find a model for daily global solar radiation data from sunshine duration and air temperature. This methodology is considered suitable for time series prediction. Using a database of daily sunshine duration, air temperature and global solar radiation data corresponding to a Typical Reference Year (TRY), an RBF model has been trained on 300 known data points from the TRY; in this way, the network was trained to accept and even handle a number of unusual cases. Known data were subsequently used to investigate the accuracy of prediction. The unknown validation data set produced very accurate estimates, with a mean relative error (MRE) not exceeding 1.5% between the actual and predicted data, and a correlation coefficient of 98.9% for the validation data set. These results indicate that the proposed model can successfully be used for prediction and modeling of daily global solar radiation data from sunshine duration and air temperature. An application to the sizing of a stand-alone PV system is presented in this paper in order to show the importance of this modeling.


Texture Discrimination Using OCON and Multilayer Neural Networks
Paper Number:    105
Abstract:
Texture analysis is an important and useful area of study in machine vision and image processing. Texture analysis has been utilized in a variety of image processing fields such as medical image processing, remote sensing, and document processing. One immediate application of texture analysis in image processing is the recognition of an image using texture properties (i.e. texture classification). This work aims to investigate the use of OCON (One-Class-One-Net) neural networks for the classification of natural textured images (from the Brodatz album). In addition, it attempts to draw a conclusion about the best (recommended) feature set that could be used to discriminate texture images and improve the overall performance. To do this, four different sets of features (statistical, statistical with radial/angular, and spectral) are used to train OCON-based neural networks (Kohonen and PCA) for texture classification. Further, the recommended feature set is used in training the neural networks presented in this work to classify two composite-texture images (one at a time). The networks' performance is then tested on the entire composite-texture image and their classification abilities are compared visually. During this work, a weight initialization strategy for the Kohonen network is developed to improve the convergence time and overcome the isolation problem. This weight initialization technique leads to better clustering.


PalmPrint Authentication System (PPAS)
Paper Number:    114
Abstract:
Palmprint authentication is an important complement of biometric authentication. This paper explores a method for palmprint verification based on the three principal lines found in the palm of the hand. The principal lines of a palmprint have a remarkable advantage for authentication because they possess two important properties, uniqueness and stability; thanks to these two characteristics it becomes easier to distinguish one person from another based on each person's features. It is important to extract these features from the principal lines of a palm using a feature extraction method; moments are the method used here to extract them. Based on these features, a palmprint authentication system known as PPAS was designed, implemented, and tested.
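
As an aside on the feature extraction step, moment descriptors of a binary principal-line image can be computed with OpenCV as sketched here; Hu's seven invariant moments are used as one common choice, since the abstract does not specify which moments the system employs, and the drawn line is a stand-in.

    # Sketch: moment features of a binary principal-line image using OpenCV's Hu moments.
    import cv2
    import numpy as np

    line_img = np.zeros((128, 128), dtype=np.uint8)
    cv2.line(line_img, (10, 100), (110, 20), 255, 3)       # stand-in principal line

    m = cv2.moments(line_img, True)                        # treat the image as binary
    hu = cv2.HuMoments(m).flatten()
    # Log-scale the Hu moments, as is commonly done, to compress their dynamic range.
    features = -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)
    print(features)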


Distributed Computing with Bluetooth
Paper Number:    115
Abstract:
This paper analyzes the capability of a Bluetooth network for distributed computing between its devices by thoroughly analyzing the Bluetooth Specification, and by researching and comparing the various systems that Bluetooth runs on. The platform and OS independence claims are also checked.


A development Process of Enterprise Portal based on the Web Services
Paper Number:    118
Abstract:
The concept of the enterprise portal (EP) has exploded since the success of public portals like Yahoo! and MSN. The portal provides easy access to internal and external applications using a browser-based interface. But before the development of a portal, it is necessary that the enterprise provide a homogeneous environment offering interoperability between the portal and the existing applications. In this context, Web services are the best solution to expose the various functionalities of the different applications; their key advantage is interoperability. This paper presents a development process for an enterprise portal based on the Web services architecture. This process describes how to develop Web services by reusing the existing applications and how to build a portal that integrates and aggregates these services and other external ones from other portals.


Enhanced Multi-Signature Scheme
Paper Number:    120
Abstract:
An Enhanced Multi-Signature Scheme (EMSS) is suggested in this paper for group-user applications. The algorithm of this method employs a modified RSA technique, which represents the signature for classified data as a separate value appended at the end together with the enciphered message. It is characterized by a short, fixed-length signature for a message regardless of its length, which comes from repeated signing of the message segments. A serial number is included in the initial stage for both message encryption and signing, in order to increase the security and prevent the birthday attack. A major feature of this method is ordering the moduli of all users, aiming to solve the re-blocking problem.


Full-wave Analysis of sensitivities of microstrip open-end  discontinuity using the ameliorated moment method
Paper Number:    121
Abstract:
In this paper, we carry out a dynamic analysis of the sensitivities of planar microstrip structures using the ameliorated moment method. The shape sensitivities are obtained by using a new potential integral equation for the derivative of the surface currents with respect to a geometrical parameter of the microstrip. This new integral equation is solved together with the original integral equation using the ameliorated moment method. From the geometrical derivative of the surface currents, geometrical derivatives of the S parameters are obtained.


A Swarm Algorithm for Medical Image Registration Based on the Maximization of Mutual Information
Paper Number:    122
Abstract:
In this paper, we propose a particle swarm optimization algorithm to solve the problem of medical image registration. This algorithm combines the robustness of entropy-based measures and the search power of particle swarm optimization. The main idea is to find the best non-linear transformation that superimposes two medical images by maximizing the mutual information value through particle swarm optimization. We show that this algorithm, besides its simplicity, provides a robust and efficient way to rigidly register medical images in various situations. This includes, for instance, multimodal registration of CT and MR images as well as scanner and MR images.
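
The mutual information objective at the heart of the method can be sketched as below from a joint histogram of the two images; the PSO search itself is omitted and a brute-force search over small integer shifts stands in for it, purely to show how the score drives registration.

    # Sketch: mutual information between two images from their joint histogram,
    # used here to score candidate integer translations (PSO itself is omitted).
    import numpy as np

    def mutual_information(a, b, bins=32):
        joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
        pxy = joint / joint.sum()
        px, py = pxy.sum(axis=1, keepdims=True), pxy.sum(axis=0, keepdims=True)
        nz = pxy > 0
        return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

    rng = np.random.default_rng(0)
    fixed = rng.random((64, 64))
    moving = np.roll(fixed, shift=(3, -2), axis=(0, 1))        # known misalignment

    best = max(((dy, dx) for dy in range(-5, 6) for dx in range(-5, 6)),
               key=lambda s: mutual_information(fixed, np.roll(moving, shift=s, axis=(0, 1))))
    print("estimated shift to re-align:", best)    # expect roughly (-3, 2)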


The Intranet and its Impact on Decision-Making at the Enterprise Level
Paper Number:    128
Abstract:
The principal objectives of introducing a computerized information system based on the new information and communication technologies (NTIC), through the Internet and the intranet, are: improving the quality of problem handling, assessing service activities, controlling administrative management, containing the continuous increase of management costs, and facilitating decision-making. Managing the enormous mass of information with conventional means (better organization, good administration, and qualified and sufficient staff) leads to difficult and complex management: no serious control of management and people, no transparency in activities, and poor administrative management. The envisaged solutions aim to optimize the rational use of the enormous material, human and financial resources of the enterprise in order to satisfy our objectives, and to introduce the most modern information processing techniques through the establishment of a master plan for the enterprise. The contributions of computer tools based on the new information and communication technologies (NTIC) in the field of management are: better organization of activities, considerable time savings, minimal expense, better exploitation of information in real time, and improved methods and means of data storage, data processing and efficient decision-making. Within the framework of this communication, we address aspects of the enterprise intranet in the context of decision-making.


Genetic Algorithms Application to the Standard Arabic Vocalic Recognition
Paper Number:    130
Abstract:
The aim of this work is to apply Genetic Algorithms to the automatic recognition of Standard Arabic vowels. For that, we have analyzed a corpus composed of several sentences, recorded by a single speaker, using the Linear Predictive Coding method. This results in a set of discriminant parameter vectors, which serves as a reference for the genetic modeling of the parameterization. The adaptation, or fitness, function of the genetic model has been defined. The Genetic Algorithms, which seek to maximize the inverse of the fitness function, allow us to obtain interesting results.


Arabic Connected Digits Recognition Using The Probabilistic Approach Of Hidden Markov Models
Paper Number:    138
Abstract:
Although a great deal of effort has gone into studying large-vocabulary speech recognition problems, there remain a number of interesting problems which do not require the complexity of these large systems. One such problem is connected digit recognition, which has many practical applications. Connected digit recognition is also interesting for another reason, namely that it is one in which whole-word training patterns are applicable as the basic speech recognition unit; thus we can bring all the fundamental speech recognition technology associated with whole-word recognition to bear on this problem. Hidden Markov Models (HMMs) are a probabilistic approach widely used in most of the world's speech recognition systems, from small to very large vocabularies. Our paper focuses on how to apply these probabilistic models to a connected Arabic digit (0-9) recognition task using moderately sized databases. The results obtained are satisfactory and encourage using our system in more sophisticated tasks.


A New Direction - Forecasting Cryptographic Keys
Paper Number:    141
Abstract:
The Internet is the most hazardous environment for data security. Anybody with sufficient skill and resources may have the chance to unveil somebody else's secret messages, no matter how strong a security system is used. With the great increase in the number of applications present on the Internet and the increased number of cyber-customers, it is reasonable to predict that a security attack on messages is likely to occur within a certain time, compromising information or encryption keys. This paper presents a new direction in security that enables Internet users to forecast an attack against a secret key. The technique suggests an algorithm that includes vital and extremely important factors: in order to perform this task, the lifetime of the used key, the frequency of key usage, the number of messages encrypted with the same key and the message routing can be considered. The forecasting algorithm can be equipped with an alarming tool to safeguard the users. This technique will be useful for any enterprise in order to forecast security attacks and to change the key before it can be compromised, so they can budget to defend against the attacks. This procedure leaves an attacker with insufficient information and can be considered a useful tool for approaching perfect security according to Shannon's criteria.
back to Accepted Papers Page


Efficient Grid Location Update Scheme for Mobile Ad Hoc Networks
Paper Number:    146
Abstract:
Location update schemes are vital for the success of position-based routing in mobile ad hoc networks, and efficient location update schemes have been a focus of research into the scalability of ad hoc networks. This has led to the development of a few location update schemes, of which the Grid Location Service (GLS) is one of the most commonly cited in the literature. In this paper, we propose an efficient grid location update scheme that outperforms GLS. We use the concept of rings of squares for location updates and a selective query mechanism for destination queries. Our simulation results show that the new grid scheme achieves a higher percentage of successful queries with a smaller location update cost for updating the location servers. The new grid scheme also performs location updates faster than GLS.
back to Accepted Papers Page


Intelligent Agents and Smart Businesses
Paper Number:    149
Abstract:
Examining the potential of agents in an electronic environment in which companies can conduct their activities entirely, we propose two conceptual models describing the know-how of using agents in strategic decisions. The first model presents a big picture of the use of intelligent agents in a competitive environment where all the competitors struggle virtually. The second prototypes a typical competitor from an internal perspective, in more detail. This work is a beginning and a stepping stone towards constructing a smart business medium.
back to Accepted Papers Page


A Model for Management of Transactions in the Mobile Databases
Paper Number:    154
Abstract:
In a mobile computing environment, it is important to develop transaction processing systems that accommodate the limitations of mobile computing, such as frequent disconnections, low power, limited communication bandwidth and reduced storage capacity. In this paper we propose a mobile transaction processing model whose treatment distinguishes between Location Dependent Data (LDD) and Non-Location Dependent Data (NLDD).
back to Accepted Papers Page


Meta-middleware: A Solution Allowing Interoperability between Middlewares
Paper Number:    156
Abstract:
Middlewares such as CORBA, RMI, DotNet or even DCOM allow the various clients of a company to interoperate with its server, but their heterogeneity calls into question the interoperability of the servers of different enterprises (for instance in a B2B context). In order to solve this problem, we propose a software layer called meta-middleware, which contains the protocol personalities of the existing middlewares. Its role consists of translating a request into the format of the target middleware. In addition to this meta-middleware layer, it is necessary to extend the existing middlewares with new components. The latter intercept requests sent outside the company and extract information such as the object reference, the parameters and the name of the invoked method, in order to transmit them to the meta-middleware.
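To make the architecture concrete, the sketch below shows a hypothetical meta-middleware object holding one "protocol personality" per target middleware and translating an intercepted request (object reference, method name, parameters) into the target format. The class and field names, and the JSON-RPC-like target format, are illustrative assumptions rather than the paper's design.

```python
from dataclasses import dataclass
from typing import Any, Dict, List

@dataclass
class InterceptedRequest:
    """Information the extended middleware components are said to extract:
    object reference, method name and parameters (names are hypothetical)."""
    object_ref: str
    method: str
    params: List[Any]

class MiddlewarePersonality:
    """One protocol personality of the meta-middleware (sketch only)."""
    def encode(self, request: InterceptedRequest) -> Dict[str, Any]:
        raise NotImplementedError

class JsonRpcPersonality(MiddlewarePersonality):
    # Purely illustrative target format standing in for a real middleware.
    def encode(self, request: InterceptedRequest) -> Dict[str, Any]:
        return {"jsonrpc": "2.0", "method": request.method,
                "params": request.params, "id": request.object_ref}

class MetaMiddleware:
    def __init__(self):
        self.personalities: Dict[str, MiddlewarePersonality] = {}

    def register(self, name: str, personality: MiddlewarePersonality):
        self.personalities[name] = personality

    def translate(self, request: InterceptedRequest, target: str):
        """Translate an intercepted request into the target middleware format."""
        return self.personalities[target].encode(request)
```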
back to Accepted Papers Page


Explanatory dialogical agents for qualitative simulation
Paper Number:    158
Abstract:
Our proposal in this paper is to explain a qualitative simulation algorithm. Five active software components, considered as agents, collaborate to produce this explanation. Communication between them is described using an Agent Communication Language (ACL). The explanatory text is elaborated cooperatively with the user. User interaction with the explanatory agents is task-oriented; this is called dialogical interaction. The explanatory agents carrying out this interaction are called dialogical agents.
back to Accepted Papers Page


Injecting Bit Flips in VHDL Models: A Comprehensive Approach and Preliminary Results
Paper Number:    159
Abstract:
Fault injection in VHDL models is a well-established approach to studying the consequences of transient faults on the behavior of integrated circuits. This paper describes the details of a strategy adopted to reproduce, as closely as possible, real faults (bit flips) that may occur in the atmosphere due to neutron radiation. Benchmark programs were used to obtain simulated error rates on an 8-bit microcontroller, the Intel 8051. The obtained results were compared with other fault injection experiments.

back to Accepted Papers Page


An Ontology Development Process for the Semantic Web
Paper Number:    162
Abstract:
The idea of the Semantic Web is to move towards a Web where the semantics of data are understood both by machines and by humans. For this reason, it is necessary to solve the problems involved in describing and processing the meaning of web documents. One of the solutions currently proposed for this problem of document semantics representation consists of exploiting ontologies and proposing new languages to represent and reason on these ontologies. The first languages proposed were RDF and RDFS. These have a very limited expressive capacity, which recently led to the development of the DAML+OIL language; it brings to Web languages the equivalent of a description logic. The use of ontologies in this context requires good design and good definition. Currently, there is no complete process starting from raw data and ending with an ontology represented in the DAML+OIL language. This article builds on results from Artificial Intelligence and knowledge engineering in order to propose a development process for ontologies defined in the DAML+OIL language.
back to Accepted Papers Page


 An Ontological Approach for Building User Profiles
Paper Number:    163
Abstract:
We describe in this paper how ontologies may be used to model different user categories. We first look at what aspects need to be described for the purpose of user model design in the context of a research laboratory. Then we look at how these aspects are related to each other. In order to be generic and reusable in different application domains, the user ontology is formalized with Description Logic and implemented using Semantic Web technology. Specific characteristics of the users, such as physical data, social background, computer experience level, type of activity, personal characteristics, preferences, goals, interests and so forth, are integrated into this ontology.
back to Accepted Papers Page


Objective Evaluation of Multi-Codebook Vector Quantization of LSF Parameters in CELP Coding
Paper Number:    166
Abstract:
Vector quantization (VQ) is widely used in speech coding. Our goal is to code the line spectral frequencies (LSF) with the lowest possible complexity while preserving the expected speech quality. Many codebook transformation methods for VQ have been studied to date. Among them, split VQ and multi-stage VQ (SVQ and MSVQ) each present a disadvantage, recently corrected for SVQ by the multi-codebook method, which gives multi-codebook split vector quantization (MC-SVQ). The subject of our study is the optimal application of this technique to MSVQ, giving multi-codebook multi-stage vector quantization (MC-MSVQ), at an acceptable complexity and without loss of information or quality.
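To make the terminology concrete, the sketch below shows plain split vector quantization of a 10-dimensional LSF vector: the vector is split into sub-vectors and each is quantized against its own codebook by nearest-neighbour search. The split sizes and codebooks are illustrative assumptions; the multi-codebook variants discussed in the paper build on this basic operation.

```python
import numpy as np

def split_vq_encode(lsf, codebooks, split_sizes=(3, 3, 4)):
    """Quantize an LSF vector with split VQ.

    lsf:        1-D array, e.g. 10 line spectral frequencies.
    codebooks:  one (K_i, split_sizes[i]) array per sub-vector.
    Returns the list of selected codeword indices, one per sub-vector.
    """
    indices, start = [], 0
    for size, codebook in zip(split_sizes, codebooks):
        sub = lsf[start:start + size]
        # Nearest neighbour in Euclidean distance within this codebook.
        dists = np.sum((codebook - sub) ** 2, axis=1)
        indices.append(int(np.argmin(dists)))
        start += size
    return indices

def split_vq_decode(indices, codebooks):
    """Reconstruct the LSF vector from the selected codewords."""
    return np.concatenate([cb[i] for i, cb in zip(indices, codebooks)])
```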
back to Accepted Papers Page


Relationships between attributes to integrate heterogeneous data sources
Paper Number:    168
Abstract:
In this paper we present our approach to building a data warehouse from heterogeneous data sources. Nowadays, information systems are constructed using both internal data within an organization and some of the numerous data sources available on the web. Thus, the challenge is to integrate all these kinds of data, which may be heterogeneous. Our work presents a methodology for constructing a set of views (a data warehouse) based on the extraction of inter-schema relationships between sources. We define three types of relationships between attributes: synonymy, inclusion and disjunction. To illustrate these relationships we consider two relational and/or object-relational databases. All the stages proposed in this work are implemented in a functional prototype.
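As a naive, instance-based illustration of the three relationship types (not the paper's schema-level method), the sketch below compares the value sets of two attributes and labels them as synonymous, inclusive or disjoint.

```python
def attribute_relationship(values_a, values_b):
    """Classify the relationship between two attributes from their value
    sets: 'synonymy', 'inclusion' or 'disjunction'.  Purely instance-based
    illustration; real schema integration would also use names, types and
    constraints."""
    a, b = set(values_a), set(values_b)
    if a == b:
        return "synonymy"
    if a <= b or b <= a:
        return "inclusion"
    if a.isdisjoint(b):
        return "disjunction"
    return "overlap"  # partial overlap: none of the three relationships

# Example: a hypothetical 'country' attribute in two sources.
print(attribute_relationship({"DZ", "FR"}, {"DZ", "FR", "TN"}))  # inclusion
```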
back to Accepted Papers Page


XML-Based Query Processing in a Data Integration Architecture
Paper Number:    170
Abstract:
In this paper we present XML-based query processing for the data integration problem. Our approach is based on two main concepts: ontology and mediator. By providing local ontologies, which describe the local sources, we attempt to resolve semantic conflicts. A global ontology permits the integration of the local ones using the global concept dictionary and the global mapping tree. The second concept is used to perform the tasks necessary for integration, such as the transformation of queries, the conversion of local results, and their merging into a global result. In our work, the queries and the results are represented using XML, which provides a strong level of representation and ensures coherence in the system. In the implementation phase, we chose the distributed object paradigm and web services technology; the latter improves the interoperability of the different mediators, especially in the web environment.
back to Accepted Papers Page


Trinocular Image Matching by Hopfield network
Paper Number:    173
Abstract:
Computer vision has become a very important research field in recent years. Image matching is one of the computer vision problems known for its difficulty; its results are used for depth analysis or 3D reconstruction. Matching approaches generally fall into two categories: feature- or edge-based methods and intensity- or region-based methods. We propose a matching method that uses Hopfield neural networks on image triplets and proceeds in two stages: the first is based on region primitives and the second on frontier points. In the first stage, based on regions, we use region characteristics, namely surface, gray level, elongation and gravity center constraints. Each neuron represents a potential matching of three regions from the three different images. The relational characteristics of regions are uniqueness and adjacency/order constraints, which represent connections between neurons. In the second stage, we use point primitives extracted from region frontiers to perform the matching after polygonal approximation of the region frontiers. Point characteristics are correlation similarity in the point's neighborhood window, the epipolar constraint and gray level. Relational point characteristics are orientation and order constraints. The matched frontier points can then be used to calculate the 3D coordinates of each point. Experimental results of our method are presented.
back to Accepted Papers Page


A New Adaptive Zone Filter To Estimate RTT In MANET
Paper Number:    176
Abstract:
Guaranteeing Quality of Service (QoS) in Mobile Ad hoc NETworks (MANETs) is necessary to generalize their deployment [1]. Indeed, the performance of a wireless network is related to many parameters: the load of the network, the number of nodes, the state of the radio link, etc. It is very difficult to quantify all these parameters exactly, so we propose to use the Round-Trip Time (RTT) as an estimator of the network exchange capacity and its available bandwidth. A fine analysis of this estimator based on the theory of EWMA filters is proposed: the Adaptive Zone filter. The purpose of the presented work is to improve the performance of existing algorithms by using a more progressive behaviour than the traditional flip-flop filter. The performance of the Adaptive Zone filter is analysed through simulation under OPNET. The prospects for using the RTT estimator are finally clarified.
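For reference, an EWMA-based RTT estimator has the form below; the zone-dependent switch between a stable and an agile gain is only a hedged sketch of the adaptive-zone idea, with the two gains and the deviation multiplier chosen arbitrarily rather than taken from the paper.

```python
class AdaptiveZoneRTT:
    """Sketch of an EWMA RTT estimator with a zone-dependent gain.
    The gains and the zone half-width are illustrative assumptions."""

    def __init__(self, alpha_stable=0.9, alpha_agile=0.5, k=2.0):
        self.alpha_stable = alpha_stable  # heavy smoothing inside the zone
        self.alpha_agile = alpha_agile    # fast tracking outside the zone
        self.k = k                        # zone half-width in deviations
        self.rtt = None
        self.dev = 0.0

    def update(self, sample):
        if self.rtt is None:
            self.rtt = sample
            return self.rtt
        # Zone = [rtt - k*dev, rtt + k*dev]; pick the gain accordingly.
        in_zone = abs(sample - self.rtt) <= self.k * self.dev
        alpha = self.alpha_stable if in_zone else self.alpha_agile
        self.dev = alpha * self.dev + (1 - alpha) * abs(sample - self.rtt)
        self.rtt = alpha * self.rtt + (1 - alpha) * sample
        return self.rtt
```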
back to Accepted Papers Page


Modeling an Educational Adaptive Dynamic Hypermedia System with a Multi-Agent System
Paper Number:    179
Abstract:
For many years, hypermedia has been a research area in the field of computer-aided teaching systems, thanks to its simplicity and its use on the Web, which has become the best technology for distributing information. Three kinds of systems have successively appeared: classical hypermedia, adaptive hypermedia and dynamic adaptive hypermedia. This paper presents the modeling of an educational adaptive dynamic hypermedia system by a multi-agent system. The system dynamically generates the hypermedia pages of the course, and these pages adapt to the learner model. It uses teaching activities to present several views of the same concept and to specify the tasks defining the structure best suited to understanding the concepts to be taught. The system consists of five agents: the learner model agent, which stores and manipulates the learner model; the pedagogical adaptation agent, which incorporates the domain model and the model of pedagogical activities defined by the teacher, specifies the pedagogical structure of the course and the concepts to be studied before others by communicating with the learner model agent to consult the learner model, and then asks the filtering agent to select the documents suited to the learner model; the filtering agent, which applies filters to the multimedia document database and sends the addresses of the filtered documents to the hypermedia page generator agent; the generator agent, which generates the page and sends it to the interface agent; and the interface agent, which displays it to the learner. The interface agent acts as a gateway between the learner and the pedagogical adaptation agent, the learner model agent and the filtering agent.
back to Accepted Papers Page


Extraction of Road Networks from Satellite Images
Paper Number:    180
Abstract:
The goal of this study is to define a model for extracting the road network from satellite images. The aim is to build a model which takes into account both the radiometric aspect and the spatial aspect of the road. We show that a road cannot be characterized satisfactorily by a set of methods based only on radiometric characteristics, such as morphological operators. On the other hand, a method based on both radiometric and spatial characteristics does characterize a road satisfactorily and makes it possible to study the extraction problem in general. This road network extraction method is based on an approach suggested by D. Geman and B. Jedynak [1] and improved by us. In this work, we improved the following steps: detection of homogeneous road elements, detection of linear road components, and optimization of a cost function at the resolution of large pixels (areas) to find the best binary configuration representing the road network. The experimental study on various types of satellite images, in particular urban and semi-urban images, shows that this method characterizes road networks satisfactorily; its major disadvantage is the arbitrary selection of the parameters used, which remain difficult to compute in a single way for all the images to be processed.
back to Accepted Papers Page


Further Investigations Of Accuracy-Based Fitness Using A Simple Learning Classifier System
Paper Number:    189
Abstract:
Since the introduction of XCS, most Learning Classifier Systems research has involved the use of accuracy-based fitness, where rule fitness is based on a rule's ability to predict the expected payoff from its use. Whilst XCS has been shown to be extremely effective in a number of domains, its complexity can make it difficult to establish clear reasons for its behaviour. In this paper, we investigate, drawing on XCS, aspects of the steady-state genetic algorithm within YCS, previously introduced based on XCS's framework, with the aim of further explaining the effects of operators within accuracy-based fitness. Using the multiplexer problem and drawing on XCS, results confirm that increased separation in the fitness function aids performance and that increasing the rate of mutation and/or the degree of crossover has a marked effect on specificity, though to a lesser extent for the crossover operator. Also, the use of fitness bias in the replacement scheme has been shown to increase specificity and significantly improve performance, particularly for lower reproductive pressure. Finally, deletion was found to interact with the GA subsumption process, resulting in better rule compression than either deletion or GA subsumption alone.
back to Accepted Papers Page


Modeling Dynamic Destination Target Tracking
Paper Number:    190
Abstract:
In this paper we present a model of dynamic destination target tracking in a public transit network, aiming at minimizing the travel time associated with a required origin-destination trip. The public transit network is viewed as a set of mode lines in which vehicles serve passengers at time points. The proposed strategy accounts for real-time information such as the number of passengers waiting at stations, the forecast arrival times of vehicles at stations, and the number of passengers boarding vehicles. Delays may model passenger boarding and alighting durations or modal transfer times from one line to another. Experimental results are given in the paper.
back to Accepted Papers Page


Visually Guided Object Grasping Without Calibration of the Stereoscopic System
Paper Number:    197
Abstract:
In this paper we propose a new method for visual servoing of an autonomous robot manipulator based on projective geometry. The originality of this work resides in the fact that we avoid the calibration step, which is tedious and unstable (measurement errors, approximated model). The proposed method is based only on two-dimensional data extracted from the images. To determine the trajectory of the robot, i.e. its translational and rotational movements, we use 3D projective reconstruction. The experimental results validate the proposed approach for several positions of the point to be reached.
back to Accepted Papers Page


VEMMA: A Virtual Electronic Market Place based on Mobile Agents
Paper Number:    199
Abstract:
Several architectures have been proposed in the literature for modeling interactions between agents. Within the framework of this article, we describe three architectures based on multi-agent design. In these systems, buyer and seller agents interact in an environment comparable to an electronic market in order to sell and buy goods. We then propose an electronic commerce architecture based on intelligent and mobile agents, called VEMMA. We present the formal model of this architecture as well as an implementation using the Java language and RMI technology.
back to Accepted Papers Page


Text-Dependent Speaker Identification Methods and Experiments on an Arabic Database
Paper Number:    201
Abstract:
In this paper we apply two traditional speaker recognition algorithms to an Arabic database: one using a deterministic model based on dynamic time warping (DTW), the other using a probabilistic model based on the Gaussian probability distribution. In both methods, we have studied the effect of the order of the LPCC coefficients on classifier performance. In particular, we have attempted to design a system able to recognize all speakers based on multi-reference DTW. In the second method, we have also studied the effect of the variance on the identification rate. Our experiments are evaluated on an Arabic database of twenty speakers, and a 100% identification rate is reached.
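For readers unfamiliar with the deterministic branch, a minimal DTW distance between two feature sequences (e.g. frames of LPCC coefficients) can be written as below; identification then picks the speaker whose closest reference, in the multi-reference setting, yields the smallest distance. Function names and the Euclidean local cost are illustrative choices.

```python
import numpy as np

def dtw_distance(seq_a, seq_b):
    """Classic dynamic time warping between two (T, d) feature sequences,
    using a Euclidean local cost."""
    n, m = len(seq_a), len(seq_b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(seq_a[i - 1] - seq_b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def identify(test_seq, references):
    """references: dict speaker -> list of reference sequences (multi-reference)."""
    return min(references,
               key=lambda spk: min(dtw_distance(test_seq, ref)
                                   for ref in references[spk]))
```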
back to Accepted Papers Page


Satellite-Derived Surface Radiation Budget over the Algerian area. Estimation of surface albedo
Paper Number:    206
Abstract:
Surface albedo is the main input variable for algorithms that solve the surface energy balance. Albedo is here referred to as the ratio of reflected solar radiation to the total incoming solar radiation for wavelengths of 0.3 µm to 1.05 µm (visible band). Absorption and scattering in the atmosphere cause the albedo, as observed by the satellite in the visible band (planetary albedo), to differ from the actual surface albedo. Absorption in the atmosphere is mainly due to water vapour, while scattering results from the presence of air molecules (e.g. N2, O2) and aerosols in the atmosphere. After calibration of the visible images, the planetary albedo is obtained; an atmospheric correction procedure then converts planetary albedo to surface albedo.
back to Accepted Papers Page


Task-to-Processor Assignment in Heterogeneous Network of Processors (Cluster) Systems
Paper Number:    209
Abstract:
The Task-to-Processor Assignment (TPA) problem has recently gained a great deal of attention with the ever-increasing use of distributed and cluster systems as the underlying architecture for executing parallel tasks. The challenge is to assign the subtasks of a job such that a minimum number of processors is used while minimizing the finish time and the communication overhead. In this paper, we propose an algorithm that schedules the subtasks (processes) on a heterogeneous distributed system, taking into consideration processor speeds, processor distances and adjacencies, communication path latencies and task weights at the same time. Considering all those elements, the proposed algorithm is successful in obtaining a schedule with a minimum finish time. The algorithm is validated using an example and an exhaustive-search simulation program.
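The paper's algorithm is not reproduced here; the sketch below only illustrates the objective with a simple greedy heuristic that places each subtask on the processor giving the earliest finish time, given processor speeds and task weights (communication latencies and adjacency are left out for brevity, and the largest-task-first ordering is an assumption).

```python
def greedy_assign(task_weights, processor_speeds):
    """Assign subtasks to heterogeneous processors, greedily minimising
    the finish time.  Illustrative heuristic only.

    task_weights:      list of computational weights, one per subtask.
    processor_speeds:  list of relative speeds, one per processor.
    Returns (assignment, makespan) where assignment[i] is the processor
    chosen for subtask i.
    """
    finish = [0.0] * len(processor_speeds)   # current finish time per processor
    placed = {}
    # Place the heaviest subtasks first (largest-processing-time order).
    order = sorted(range(len(task_weights)), key=lambda i: -task_weights[i])
    for i in order:
        # Earliest finish time if subtask i were executed on processor p.
        best = min(range(len(processor_speeds)),
                   key=lambda p: finish[p] + task_weights[i] / processor_speeds[p])
        finish[best] += task_weights[i] / processor_speeds[best]
        placed[i] = best
    assignment = [placed[i] for i in range(len(task_weights))]
    return assignment, max(finish)
```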
back to Accepted Papers Page


[Arabic-language title; garbled in the source]
Paper Number:    211
Abstract:
This study can be summed up as follows: it embodies recent outcomes of the basic theory of neural networks and applies them to a photovoltaic water pumping system with a centrifugal pump. The error back-propagation method for artificial neural networks is applied, and relying on this intelligence provides an ideal addition to the photovoltaic pumping system. The artificial neural network, with two inputs and one hidden layer of four hidden neurons, is trained on a suitable number of examples. This gives it strong robustness, good efficiency, a simple architecture, and an easy realization.
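A minimal version of a 2-input, 4-hidden-neuron network trained by error back-propagation could look like the following; the sigmoid activation, learning rate and single output are assumptions, since the abstract does not specify them.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_mlp(X, y, hidden=4, lr=0.1, epochs=5000, seed=0):
    """Tiny 2-input network with one hidden layer of 4 neurons and one
    output, trained with plain back-propagation of the squared error.
    X has shape (N, 2), y has shape (N, 1)."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0, 0.5, (X.shape[1], hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, (hidden, 1));          b2 = np.zeros(1)
    for _ in range(epochs):
        # Forward pass.
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)
        # Backward pass (gradients of the squared error).
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)
    return W1, b1, W2, b2
```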
back to Accepted Papers Page


Fuzzy Identification of Gluconic Acid to Glucose Fermentation Parameters
Paper Number:    213
Abstract:
A construction of Takagi-Sugeno (TS) fuzzy models using clustering is presented. The input, a sequence of empirical data, is partitioned into a certain number of clusters. The important properties of the modelling method are global model accuracy and local model interpretability; in our modelling we use both global and local learning in order to improve interpretability and obtain a good global approximation. The model representation is based on the interpolation of a number of local models, where each local model has a limited range of validity. The method is illustrated using simulated data from the fermentation of glucose to gluconic acid by the microorganism Pseudomonas ovalis in a well-stirred batch reactor.
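To recall the model form (the paper's clustering and learning steps are not reproduced), a TS model interpolates local affine models by membership-weighted averaging, roughly as below; the Gaussian membership functions and parameter layout are assumptions.

```python
import numpy as np

def ts_predict(x, centers, widths, local_params):
    """Evaluate a Takagi-Sugeno model at input vector x.

    centers, widths:  (n_rules, n_inputs) Gaussian membership parameters
                      (one fuzzy set per input per rule; assumed form).
    local_params:     (n_rules, n_inputs + 1) coefficients [a_i, b_i] of
                      the local affine models y_i = a_i . x + b_i.
    Output is the membership-weighted average of the local model outputs.
    """
    # Rule firing strength: product of per-input Gaussian memberships.
    mu = np.prod(np.exp(-((x - centers) ** 2) / (2 * widths ** 2)), axis=1)
    # Local affine consequents.
    y_local = local_params[:, :-1] @ x + local_params[:, -1]
    return float(np.sum(mu * y_local) / (np.sum(mu) + 1e-12))
```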
back to Accepted Papers Page


A Methodological Test Integrating GIS and Remote Sensing for the Survey of Marine Hydrocarbon Pollution: the Case of the Arzew Harbours (Western Algeria)
Paper Number:    214
Abstract:
Marine pollution is probably one of the most troubling aspects of the deterioration of the natural environment. This pollution is a major risk that could seriously compromise the biosphere-geosphere balance. Oil is the principal pollutant, and the Mediterranean Sea accounts for 18% of worldwide oil pollution. The harbours of Arzew handle heavy naval traffic and are bound to an industrial zone that is particularly at risk with regard to pollution. Building an approach that uses remote sensing data together with the interpretation of chemical data from water samples, and integrating this knowledge into a Geographical Information System (GIS), constitutes the objective assigned to our work.
back to Accepted Papers Page


iSWare: An Inter Organizations Web-Based Urban Planning Coordinator
Paper Number:    215
Abstract:
This paper describes an urban planning coordination tool called iSWare. iSWare stands for inter-Site groupWare; it is the result of an analytical study made on a group of independent organizations that use GIS in their urban projects. iSWare consists of an open, component-based system. It is an online exchange coordinator playing the role of a general inter-organization committee (board). It provides easy Internet support using groupware technologies. iSWare opts for a specific coordination protocol by applying the theory of competition and data distribution.
back to Accepted Papers Page


Tools of Assistance for the Design of Image Processing Applications
Paper Number:    216
Abstract:
This research is part of a framework for the interactive and incremental design of image processing and analysis applications. Several tracks are explored; we propose a system for assisting the design of image processing (IP) applications using an approach based on case-based reasoning (CBR), which is still rarely used in the field of IP. This type of reasoning is nevertheless well suited to poorly formalized fields such as IP, since it allows the representation of exceptions, the use of missing information, and the resolution of a complex problem by combining solutions to simpler problems. In this article, the concept of a "case in IP" is defined. Our objective is to present a different vision of the case-based reasoning process through the multi-agent paradigm. Previous work on the use of case-based reasoning presented the case as a passive entity waiting to be retrieved in order to offer decision support or problem resolution. Here, we use active cases organized in networks; in other words, agents in a society which solve problems while possibly adapting themselves.
back to Accepted Papers Page


Managing a Prefetching Schedule in a SMIL Presentation
Paper Number:    219
Abstract:
The consistency of a SMIL presentation cannot be guaranteed when the required synchronized media cannot be delivered in real time to the client player. This is generally due to bandwidth fluctuations or to poor use of the bandwidth. In this paper we present an approach for providing a consistent presentation of SMIL 1.0 documents by scheduling the insertion of prefetching commands into a SMIL 1.0 document. Contrary to existing methods, the process manages client resources such as bandwidth and memory, which avoids client player overflow and crashes.
back to Accepted Papers Page


Geographical Information System for Urban Spatial Analysis
Paper Number:    220
Abstract:
Much research in automated cartography is carried out to solve problems related to social and urban human needs. In this paper, an information system for urban spatial analysis is presented. The proposed system outlines a method for urban spatial representation, modelling and reasoning that provides valuable intelligent aid for sorting out different spatial problems such as land use planning, environmental assessment and demo-socio-economic analysis. Thus, the proposed system would be a major contribution to solving difficulties frequently encountered by users in urban agencies.
back to Accepted Papers Page


A Parallel Algorithm for Edge Detection Based on Infinite Impulse Response filter (IIR filter)
Paper Number:    221
Abstract:
In this paper, we present a parallel algorithm for edge detection based on an Infinite Impulse Response (IIR) filter. In particular, the Infinite-size Symmetric Exponential Filter (ISEF), which is an optimal IIR filter and a computationally efficient smoothing filter, is studied. The proposed algorithm efficiently exploits all aspects of potential parallelism (spatial parallelism, temporal parallelism and systolism) inherent in the considered edge detection algorithms. The designed concurrent algorithm is expressed as a collection of concurrent processes communicating and synchronizing in an efficient way in order to speed up the low-level operations.
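ISEF-type smoothing is commonly realised as a pair of first-order recursive (IIR) passes; the 1-D sketch below shows that recursive form only, not the parallel decomposition studied in the paper. The boundary handling, the value of the smoothing constant and the gain normalisation are assumptions.

```python
import numpy as np

def isef_smooth_1d(signal, a=0.3):
    """Symmetric exponential smoothing of a 1-D signal via two first-order
    recursive passes (a causal and an anti-causal one)."""
    x = np.asarray(signal, dtype=float)
    causal = np.empty_like(x)
    anti = np.empty_like(x)
    # Causal (left-to-right) pass.
    causal[0] = a * x[0]
    for n in range(1, len(x)):
        causal[n] = a * x[n] + (1 - a) * causal[n - 1]
    # Anti-causal (right-to-left) pass.
    anti[-1] = a * x[-1]
    for n in range(len(x) - 2, -1, -1):
        anti[n] = a * x[n] + (1 - a) * anti[n + 1]
    # Combine; subtract the doubly counted centre tap and normalise the gain.
    return (causal + anti - a * x) / (2 - a)
```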
back to Accepted Papers Page


Evolutionary Algorithms Based Neural Networks Training for Phonetic Classification
Paper Number:    222
Abstract:
In this paper, we propose a biologically inspired method based on evolutionary algorithms (EA). There are three main aims: first, to explain the general mechanism of evolutionary algorithms; second, to present the manipulation of neural networks by evolutionary algorithms, in other words how to determine the population and the fitness function for each individual; finally, to describe the similarities and differences between genetic algorithms (GA) and evolution strategies (ES). This paradigm is applied to a real-world problem, speech recognition on the TIMIT database, and illustrates the performance of evolution strategies compared to genetic algorithms.
back to Accepted Papers Page


Infrared (IR) Wireless Remote Control of the iPod Digital Music Player
Paper Number:    224
Abstract:
In this paper, we propose the use of a multi-threading real-time operating system (RTOS) to control an infrared (IR) wireless receiver for use with the iPod digital music player. Included with the iPod is a wired remote control, but its usefulness is questionable at best. A wireless remote control would be very useful when the iPod is docked and connected to a home entertainment system. It would be beneficial to have access to the nearly 4,000 songs on the iPod's 15GB hard disk through a high-quality home entertainment system, all available at your fingertips via a wireless remote control.
back to Accepted Papers Page


A First Step Towards Adding Context to Web Services Standards - CWSDL: A Context-Based Web Services Description Language
Paper Number:    226
Abstract:
Our long-term research objective is to add context to Web services in order to help with better service discovery and selection in pervasive environments. As a first step towards this goal, we propose to add context to the current Web services industry standards, namely WSDL and UDDI. In this paper we present CWSDL (Context-Based Web Services Description Language). We believe that one of the best ways of enhancing WSDL with context-aware features is to add context to WSDL following the standard schema, instead of creating a completely new language. It is worth noting that the work presented in this paper is preliminary and thus needs further enhancements in the future.
back to Accepted Papers Page


Development of a Vocal Calculator Based on an Arabic Speech Recognition System
Paper Number:    233
Abstract:
Vocal technologies and speech processing currently represent an increasingly significant potential in human-machine interfaces. The goal is to develop a system able to translate or decode the acoustic signal of a word emitted by a speaker. In this article we present the basic technologies involved in the design of a vocal calculator based on a real-time speech recognition system for isolated Arabic words: acoustic analysis by signal processing, the use of a training corpus, and statistical modelling with continuous Hidden Markov Models. Several experiments were carried out to choose the optimal parameters of the system. Good results are obtained with a real-time implementation.
back to Accepted Papers Page


Fuzzy Automatic Classification without Prior Knowledge Applied to Brain Image Segmentation
Paper Number:    235
Abstract:
The objective of this paper is to propose a blind segmentation method able to localize all relevant objects in medical images by using a fuzzy classification based on the mean shift algorithm. To achieve this, we build a cartography of attributes from the characterization of the images. Object localization is realized by searching for the modes of a point sample distribution through the mean shift procedure. In order to obtain an automatic segmentation of the image, the approach is combined with a fuzzy classification based on the fuzzy c-means (FCM) algorithm. The fuzzy component allows imprecision related to the extraction of information needed for region classification to be taken into account. The results obtained by our approach are very encouraging and show accurate segmentation compared with other, supervised techniques.
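The fuzzy c-means step referred to above follows the standard alternating updates of memberships and centroids; a compact version is sketched below (the number of clusters, the fuzziness exponent m and the convergence tolerance are assumptions).

```python
import numpy as np

def fuzzy_c_means(X, c=3, m=2.0, max_iter=100, tol=1e-5, seed=0):
    """Standard FCM on an (N, d) attribute matrix X.
    Returns (centroids, membership matrix U of shape (N, c))."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)          # rows sum to 1
    for _ in range(max_iter):
        Um = U ** m
        centroids = (Um.T @ X) / Um.sum(axis=0)[:, None]
        # Distances of every point to every centroid.
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2) + 1e-12
        # Membership update: u_ik = 1 / sum_j (d_ik / d_ij)^(2/(m-1)).
        new_U = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1)),
                             axis=2)
        if np.max(np.abs(new_U - U)) < tol:
            U = new_U
            break
        U = new_U
    return centroids, U
```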
back to Accepted Papers Page


Design and Implementation of a Voronoi Diagrams Generator using Java
Paper Number:    239
Abstract:
Voronoi Diagrams (VDs) and Delaunay Triangulations (DTs) are rediscovered mathematical concepts from the 19th century. These concepts are of great significance in many fields of science, including biology, medicine, telecommunication networks, imagery, geography and countless more. This paper describes the design and implementation phases of a system used to generate 2D VDs and DTs. The paper begins by highlighting these momentous concepts, introducing their pioneers and overviewing their functions, related terminology and properties. The incremental algorithm was selected to generate the VDs and DTs, and Java was used to develop the system; the method is briefly described. The intention is to further develop the system to integrate it with other applications, such as generating the VD of a loaded image. The system can also be used to teach knowledge-seekers about the use of these structures in real-world applications.
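The paper describes a Java implementation of the incremental algorithm; for readers who simply want to reproduce the two structures, an off-the-shelf route (not the authors' system) is scipy's computational-geometry wrappers, as in the short example below.

```python
import numpy as np
from scipy.spatial import Voronoi, Delaunay, voronoi_plot_2d, delaunay_plot_2d
import matplotlib.pyplot as plt

# A few random 2-D sites; any point set can be substituted.
points = np.random.default_rng(1).random((20, 2))

vor = Voronoi(points)      # Voronoi diagram of the sites
tri = Delaunay(points)     # its dual Delaunay triangulation

print("Voronoi vertices:", len(vor.vertices))
print("Delaunay triangles:", len(tri.simplices))

# Quick visual check of both structures.
voronoi_plot_2d(vor)
delaunay_plot_2d(tri)
plt.show()
```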
back to Accepted Papers Page


The Generalized Service Level Agreement Model and its application to structuring WLANs into Wireless Management Communities
Paper Number:    242
Abstract:
In this work, a service-driven model for structuring WLANs into overlay networks of interacting Wireless Management Communities is proposed. We define the GSLA, a generalized model for the specification of Service Level Agreements (SLAs). The GSLA information model supports multi-party service relationships through a role-based mechanism. It is intended to capture the complex nature of service interactivity in the broader range of SLA modeling of all sorts of IT business relationships. The model accommodates both granularity and modularity of behavioral specifications by having each party play a role within a service relationship. We then apply the GSLA to structuring WLANs into Wireless Management Communities (WMCs). A WMC is composed of a set of parties and is governed by a charter named the WMC-SLA. A WMC constitutes the basic unit of management upon which any form of service interaction between parties belonging to the wireless community will be installed. The models we propose are intended as a step towards SLA-driven management within pervasive service environments.
back to Accepted Papers Page