International Journal of Applied Mathematics and Informatics

E-ISSN: 2074-1278
Volume 10, 2016


Notice: As of 2014 and for the forthcoming years, the publication frequency/periodicity of NAUN Journals follows the 'continuously updated' model: instead of being separated into issues, new papers are added on a continuous basis, allowing a more regular flow and shorter publication times. Papers appear in reverse chronological order, so the most recent one is at the top.





Title of the Paper: A Geometric Heuristic for Uncapacitated Vehicle Routing Problem

 

Authors: Nodari Vakhania, Jose Alberto Hernandez, Federico Alonso-Pecina, Crispin Zavala

Pages: 119-123

Abstract: We propose a two-phase construction heuristic for the classical Euclidean (uncapacitated) vehicle routing problem, in which k distinct minimum-cost vehicle tours are to be formed for n given customer locations. In the first phase we construct a polygon in the two-dimensional Euclidean plane that encloses all the given points (the customer locations and the depot). The second phase combines the clustering and routing stages, which are performed in an alternating fashion: if the current clustering does not yield k distinct tours, it is modified and a new routing attempt is made, until k distinct tours are formed.
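A minimal structural sketch of such a cluster-then-route loop is given below. It only illustrates the two-phase idea: the girding polygon is taken as an off-the-shelf convex hull, and the clustering (angular sweep) and routing (nearest neighbour) helpers are simple placeholders, not the authors' actual procedures.

import numpy as np
from scipy.spatial import ConvexHull

def sweep_clusters(points, depot, k):
    # Placeholder clustering: split customers into k groups by polar angle
    # around the depot (not the paper's clustering rule).
    angles = np.arctan2(points[:, 1] - depot[1], points[:, 0] - depot[0])
    return np.array_split(np.argsort(angles), k)

def nearest_neighbour_tour(points, depot, cluster):
    # Placeholder routing: nearest-neighbour tour over one cluster.
    remaining, tour, current = list(cluster), [], np.asarray(depot, float)
    while remaining:
        nxt = min(remaining, key=lambda i: np.linalg.norm(points[i] - current))
        tour.append(nxt)
        current = points[nxt]
        remaining.remove(nxt)
    return tour

def two_phase_vrp(points, depot, k):
    hull = ConvexHull(np.vstack([points, depot]))        # phase 1: girding polygon
    tours = [nearest_neighbour_tour(points, depot, c)    # phase 2: cluster + route
             for c in sweep_clusters(points, depot, k)]
    return hull, tours

pts = np.random.rand(30, 2)
hull, tours = two_phase_vrp(pts, depot=(0.5, 0.5), k=4)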


Title of the Paper: Triangulation Positioning System - Network and Topology Issues

 

Authors: Marios Sfendourakis, Ioannis Sarantopoulos, Rajagopal Nilavalan, Emmanuel Antonidakis, Ioannis Barbounakis

Pages: 110-118

Abstract: This paper presents ongoing work on localization and positioning through a triangulation procedure. Issues of scalability and topology are examined, and areas that are problematic and need further analysis and implementation in the network are analyzed. In a Fixed Stations Network, as presented in [1], the triangulation problem becomes highly complicated when there is a large number of sensors and transmitters. Sensor bearings and data readings have to be checked on a case-by-case basis. The combination and processing of a vast amount of data requires filtering and handling of the various cases, while simultaneous data processing in various stages can provide accurate results.
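For reference, the core geometric computation behind bearing-based triangulation is the intersection of two rays from stations with known positions. The sketch below covers only this minimal two-station case and says nothing about the network-scale filtering and case handling discussed in the paper.

import numpy as np

def triangulate(p1, theta1, p2, theta2):
    # Intersect the rays p1 + t1*d1 and p2 + t2*d2, where theta1, theta2 are
    # bearings measured as angles from the x-axis; assumes non-parallel rays.
    d1 = np.array([np.cos(theta1), np.sin(theta1)])
    d2 = np.array([np.cos(theta2), np.sin(theta2)])
    rhs = np.asarray(p2, float) - np.asarray(p1, float)
    t = np.linalg.solve(np.column_stack([d1, -d2]), rhs)
    return np.asarray(p1, float) + t[0] * d1

# Example: a target at (3, 4) observed from stations at (0, 0) and (6, 0).
print(triangulate((0, 0), np.arctan2(4, 3), (6, 0), np.arctan2(4, -3)))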


Title of the Paper: Distributed Backstepping Control with Actuator Delay for Active Suspension System

 

Authors: Aziz Sezgin, Yuksel Hacioglu, Nurkan Yagiz

Pages: 105-109

Abstract: The actuator time-delay problem for a linear active suspension system is examined in this study using the theory of backstepping control design. Time delay may arise in active suspension systems because of transport phenomena, information processing, sensors or mechanical causes. Designing the controller without taking the actuator time delay into account may degrade the performance of the controller or even destabilize the closed-loop control system. The aim is to improve the ride comfort of passengers without degrading road holding. Therefore, a backstepping controller was designed that takes the actuator time delay into account by combining a first-order hyperbolic partial differential equation (PDE) with the linear suspension system. The numerical results confirm the success of the controller.
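As background, a standard cascade representation of an input delay (an assumption here; the paper's specific formulation is not reproduced) writes an actuator delay D as a first-order hyperbolic transport PDE feeding the plant with state X and commanded input U:

\dot{X}(t) = A X(t) + B\,u(0,t), \qquad u_t(x,t) = u_x(x,t), \quad x \in (0,D), \qquad u(D,t) = U(t),

whose solution is u(x,t) = U(t + x - D), so the plant receives the delayed signal u(0,t) = U(t - D); backstepping is then applied to the coupled ODE-PDE system.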


Title of the Paper: Frequency Transform Basis with Even Symmetry Elements

 

Authors: A. Shoberg, S. Sai, K. Shoberg

Pages: 101-104

Abstract: A modification of the frequency transform is proposed in which the transform is calculated with different basis functions. The approach is considered for the discrete cosine transform as the simplest example. We compare the results of the transform for forward and reverse data index orders and present the differences. In the proposed algorithm the input data set is divided into two equal parts and the transform is performed independently in both directions from the center. This eliminates the changes of amplitude value and sign. The approach also applies to two-dimensional signals and simplifies the calculation for different applications.
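The sensitivity to index order that motivates such a modification can be seen with the ordinary DCT-II: reversing the input flips the sign of every odd-indexed coefficient. The sketch below illustrates this effect and the split-at-center idea on a toy signal; it is not the authors' modified basis.

import numpy as np
from scipy.fft import dct

x = np.arange(16, dtype=float)            # toy input signal
forward = dct(x, norm='ortho')
reverse = dct(x[::-1], norm='ortho')      # same data, reverse index order
print(np.allclose(forward[1::2], -reverse[1::2]))   # True: odd coefficients flip sign
print(np.allclose(forward[0::2],  reverse[0::2]))   # True: even coefficients unchanged

# Split-at-center idea: each half is transformed independently, ordered
# outward from the center, so both halves see the same orientation.
left  = dct(x[:8][::-1], norm='ortho')
right = dct(x[8:], norm='ortho')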


Title of the Paper: Computational Method for Reliability Analysis of Complex Systems Based on the One Class Markov Models

 

Authors: Igor Kabashkin

Pages: 98-100

Abstract: Markov analysis is a powerful modelling and analysis technique with strong applications in time-based reliability and availability analysis. The reliability behavior of a system is represented by a state-transition diagram, which consists of a set of discrete states that the system can be in and defines the rates at which transitions between those states take place. Markov models are comprehensive representations of possible chains of events, i.e. transitions within systems, which in the case of reliability and availability analysis correspond to sequences of failures and repairs. The paper describes a specific computational approach to the reliability analysis of complex systems whose behavior is described by a Markov-chain finite-state transition diagram containing two non-intersecting sets of states of arbitrary configuration, between which transitions are possible only through a single intermediate state. The method for calculating the stationary state probabilities of the original system decomposes it into two separate subsystems and computes the stationary probabilities of the original model from the known stationary probabilities of the subsystems using the proposed transitional equations.
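As a baseline for what such an analysis computes, the stationary probabilities of a small continuous-time Markov chain can be obtained directly from its generator matrix. The sketch below shows only this direct computation on a toy three-state chain, not the paper's decomposition through an intermediate state.

import numpy as np

# Toy 3-state generator matrix Q (rows sum to zero); the states could
# stand for e.g. "operational", "degraded" and "failed".
Q = np.array([[-0.5,  0.5,  0.0],
              [ 0.2, -0.7,  0.5],
              [ 0.0,  0.3, -0.3]])

# Stationary distribution: solve pi Q = 0 together with sum(pi) = 1.
A = np.vstack([Q.T, np.ones(Q.shape[0])])
b = np.append(np.zeros(Q.shape[0]), 1.0)
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pi)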


Title of the Paper: Application of SIR Epidemiological Model: New Trends

 

Authors: Helena Sofia Rodrigues

Pages: 92-97

Abstract: The simplest epidemiological model, composed of the mutually exclusive compartments SIR (susceptible-infected-recovered), is presented to describe a reality. From health concerns to situations related to marketing, informatics or even sociology, several fields are using this epidemiological model as a first approach to better understand a situation. In this paper the basic transmission model is analyzed, as well as simple tools that allow us to extract a great deal of information about possible solutions. A set of applications, traditional and new ones, is described to show the importance of this model.
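In its standard normalized form the SIR model reads

\frac{dS}{dt} = -\beta S I, \qquad \frac{dI}{dt} = \beta S I - \gamma I, \qquad \frac{dR}{dt} = \gamma I,

where S, I and R are the susceptible, infected and recovered fractions of the population, \beta is the transmission rate and \gamma the recovery rate. The basic reproduction number is R_0 = \beta/\gamma, and the infected fraction grows only while R_0 S > 1.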


Title of the Paper: Adaptive Coverage Control with Power-Aware Control Laws and Exponential Forgetting

 

Authors: Mert Turanli, Hakan Temeltas

Pages: 86-91

Abstract: In this paper we extend the coverage control problem by using adaptive coordination with exponential forgetting and power-aware control laws. Centroidal Voronoi Tessellations enable the nonholonomic mobile nodes to position themselves sub-optimally according to a time-varying density function. A Lyapunov stability analysis of the adaptive and decentralized approach is presented. Synchronization among the mobile nodes is achieved by using a linear consensus protocol, and repulsive forces prevent the nodes from colliding. Simulation results show that by using power-aware control laws the energy consumption of the nodes can be reduced.
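The basic centroidal-Voronoi coverage step that such controllers build on can be sketched as a density-weighted Lloyd iteration on a discretized workspace, as below; the adaptation, exponential forgetting, consensus and power-aware terms of the paper are omitted.

import numpy as np

def coverage_step(nodes, grid, density, gain=0.5):
    # Assign every grid cell to its nearest node (discrete Voronoi regions)
    # and move each node toward the density-weighted centroid of its region.
    d = np.linalg.norm(grid[:, None, :] - nodes[None, :, :], axis=2)
    owner = d.argmin(axis=1)
    new_nodes = nodes.copy()
    for i in range(len(nodes)):
        w = density[owner == i]
        if w.sum() > 0:
            centroid = (grid[owner == i] * w[:, None]).sum(0) / w.sum()
            new_nodes[i] += gain * (centroid - nodes[i])
    return new_nodes

xs, ys = np.meshgrid(np.linspace(0, 1, 50), np.linspace(0, 1, 50))
grid = np.column_stack([xs.ravel(), ys.ravel()])
density = np.exp(-10 * ((grid[:, 0] - 0.7) ** 2 + (grid[:, 1] - 0.7) ** 2))
nodes = np.random.rand(5, 2)
for _ in range(30):
    nodes = coverage_step(nodes, grid, density)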


Title of the Paper: Planning Data Structure for Space Partitioning Algorithms

 

Authors: Gábor Fábián, Lajos Gergó

Pages: 77-85

Abstract: In this paper we give a new data structure that can be used efficiently by space partitioning. We consider a recently introduced approximation schema which is a generalization of space partitioning algorithms. The input of the schema is a triangular mesh; in a special case it can also be a set of points in space. The approximation schema is an iterative process. The first approximation of the input mesh is its bounding box, or an arbitrary convex polyhedron containing the mesh. The process gives an atomic decomposition of the initial polyhedron. In every iterative step an atom is chosen and divided by a plane into two disjoint atoms, so in the n-th step the decomposition consists of n+1 atoms. We give a sufficient condition for the convergence of the method, minimizing a volume-based error metric and leaving the atoms that are irrelevant to the approximation. It can be shown that the convergence depends only on the strategy of choosing and dividing. We define a data structure for the atomic decomposition in which the vertices and faces of atoms are stored in global lists to minimize redundancy. We also give a short survey of geometric operations and properties of polyhedra, and discuss the dividing operation of the process.
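A much simplified sketch of such an atomic decomposition is given below: the atoms reference a single global point array (mirroring the paper's global vertex and face lists), and the division step is reduced to an axis-aligned plane split of a bounding box, i.e. a k-d-style special case rather than the paper's general convex-polyhedron division.

import numpy as np

class Decomposition:
    def __init__(self, points):
        self.points = np.asarray(points, dtype=float)   # single global point array
        lo, hi = self.points.min(0), self.points.max(0)
        # an "atom" = (box_lo, box_hi, indices of the points it contains)
        self.atoms = [(lo, hi, np.arange(len(self.points)))]

    def split(self, atom_id, axis, value):
        # Divide one atom by the plane x[axis] = value into two disjoint atoms.
        lo, hi, idx = self.atoms.pop(atom_id)
        left_hi, right_lo = hi.copy(), lo.copy()
        left_hi[axis] = right_lo[axis] = value
        mask = self.points[idx, axis] <= value
        self.atoms.append((lo, left_hi, idx[mask]))
        self.atoms.append((right_lo, hi, idx[~mask]))

pts = np.random.rand(100, 3)
dec = Decomposition(pts)
dec.split(0, axis=0, value=0.5)     # after n splits the decomposition has n+1 atoms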


Title of the Paper: Bioinformatics Evolutionary Tree Algorithms Reveal the History of the Cretan Script Family

 

Authors: Peter Z. Revesz

Pages: 67-76

Abstract: This paper shows that Crete is the likely origin of a family of related scripts that includes the Cretan Hieroglyph, Linear A, Linear B and Cypriot syllabaries and the Greek, Phoenician, Old Hungarian, South Arabic and Tifinagh alphabets. The paper develops a novel similarity measure between pairs of script symbols. The similarity measure is used as an aid to develop a comparison table of the nine scripts. The paper presents a method to translate comparison tables into DNA encodings, thereby enabling the use of bioinformatics algorithms that construct hypothetical evolutionary trees. Applying the method to the nine scripts yields a script evolutionary tree with two main branches. The first branch is composed of Cretan Hieroglyph, Cypriot, Linear A, Linear B, Old Hungarian and Tifinagh, while the second branch is composed of Greek, Phoenician and South Arabic. It is also considered how Proto-Sinaitic and Ugaritic may belong to this script family.
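The shape of that pipeline (comparison table, encoding, distance matrix, tree construction) can be illustrated with a toy example. The encodings below are invented placeholders, not the paper's data, and scipy's average-linkage clustering stands in for the bioinformatics tree-building algorithms.

import numpy as np
from scipy.cluster.hierarchy import linkage
from scipy.spatial.distance import squareform

scripts = {                       # hypothetical encodings, for illustration only
    "Linear A":   "ACGTACGT",
    "Linear B":   "ACGTACGA",
    "Cypriot":    "ACGAACGA",
    "Greek":      "TGCATGCA",
    "Phoenician": "TGCATGCT",
}
names = list(scripts)

def hamming(a, b):
    # Normalized Hamming distance between two equal-length encodings.
    return sum(x != y for x, y in zip(a, b)) / len(a)

D = np.array([[hamming(scripts[a], scripts[b]) for b in names] for a in names])
tree = linkage(squareform(D), method="average")   # UPGMA-style hierarchical tree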


Title of the Paper: A Pseudo Quantum Watermarking Algorithm in M-band Wavelet Domain

 

Authors: Tong Liu, Xuan Xu, Xiaodi Wang

Pages: 61-66

Abstract: Computational methods derived from digital signal processing are playing a significant role in the security and copyright protection of audio, video, and visual arts. In light of quantum computing, the corresponding algorithms are becoming a new research direction in today's high-technology world. The nature of the quantum computer guarantees the security of quantum data, so a safe and effective quantum watermarking algorithm is in demand. Quantum watermarking is the technique of embedding an invisible quantum signal into quantum multimedia data for copyright protection. Unlike most traditional algorithms, we propose new algorithms that apply a quantum or a pseudo quantum watermark in the M-band wavelet domain. Assured by the Heisenberg uncertainty principle and the quantum no-cloning theorem, the security of the quantum watermark can reach a very high standard. In other words, these watermarking algorithms can defeat nearly all attackers, whether they use a classical or a quantum computer.


Title of the Paper: Cloud and Automated Computations in Modern Personalized Medicine - AirPROM Project Perspective

 

Authors: Michal Kierzynka, Marcin Adamski, Andreas Fritz, Dmitriy Galka, Ian Jones, Dieter Maier, Oleksii Shtankevych, Andrew Wells

Pages: 52-60

Abstract: Personalized medicine may be defined in general as the customization of the medical approach to an individual patient, involving diagnosis, treatment and other medical procedures. The EU-funded AirPROM project is a prime example of joint cooperation aiming to develop more personalized treatment in the area of respiratory medicine. In particular, the project partners develop models and software tools that help to predict the progression of asthma and COPD (chronic obstructive pulmonary disease), as well as the response to treatment, for individual patients. However, such development would not be possible without computer science, its methods and technologies. The large amount of data produced for each patient, e.g. lung and airway models, together with complex simulations and effective sharing of data and results, are just some of the challenges that need to be addressed. Therefore, a lot of effort has been spent on integrating the specific software tools used in the project with a cloud-based infrastructure that allows scalable storage and computing. The computational automation achieved in the project translates into more time that may be spent on direct patient care. The work presented in this paper shows that personalized medicine is not only a matter for the future but a reality that is already happening.


Title of the Paper: Bringing informatics concepts to students through e-activities and contests

 

Authors: Javier Bilbao, Eugenio Bravo, Concepción Varela, Olatz García, Miguel Rodríguez, Purificación González

Pages: 44-51

Abstract: The very fast evolution of the Internet is stimulating changes in all sectors and economies globally. Information and Communication Technologies (ICT) provide us with enormous opportunities in any subject, such as learning and teaching, at all levels of education, in a ubiquitous web-based format, and in ways that were inconceivable a few decades ago. However, the effective application of ICT in education continues to be a challenge worldwide. In several countries, different initiatives in the educational systems are being developed. In Europe, the main novelty of this development is a set of eight basic competences to be acquired by all pre-university students, but ICT is mainly being used to sustain or support existing pedagogical approaches rather than to transform teaching and learning. The development of training and learning activities in virtual contexts must be considered a key element of educational planning. E-activities can be a very good tool for improving abilities and skills in different subjects, and contests are a way to introduce e-activities into the pre-university educational system. The International Contest on Informatics and Computer Fluency, called Bebras, is an example of bringing informatics concepts to students in an informal way.


Title of the Paper: Precision Agricultural and Game Damage Analysis Application for Unmanned Aerial Vehicles

 

Authors: András Molnár, Dániel Stojcsics, István Lovas

Pages: 38-43

Abstract: In modern agriculture game damage is a real problem. The affected areas are usually surveyed from the ground or with human-piloted airplanes. With unmanned aerial vehicles (UAVs) the survey of a 1 km2 area can easily be made in a single flight, while the operating costs are significantly lower. During the flight (depending on the camera and the UAV airspeed) around 1-2000 pictures are taken. The photogrammetric reconstruction and the orthomosaic analysis need significant computational power. The paper presents the scientific basis of the analysis and possible applications of UAVs in modern agriculture, including the assessment of game damage caused by rodents and deer in corn and sunflower fields, using aerial time series.


Title of the Paper: Intelligent System for Disasters Management Using Boolean Delay Equations Models

 

Authors: Theodora Dumitrescu, Razvan Popescu, Daniel Merezeanu, Radu Dobrescu

Pages: 29-37

Abstract: The paper proposes an architecture for an Intelligent System for Disaster Management, envisioned as a Multi-Agent System. By including a Model Integration component to form a hybrid system, it aims to offer support for as wide a range of decisions as possible. A final section addresses the dynamical modeling of disasters (defined as extreme events), focusing on Boolean Delay Equation (BDE) models and their application in a case study on seismic phenomena.
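For reference, a generic Boolean Delay Equation system couples Boolean variables through delayed arguments, typically written as

x_i(t) = f_i\bigl(x_1(t-\tau_{i1}), \dots, x_n(t-\tau_{in})\bigr), \qquad x_i(t) \in \{0,1\}, \quad \tau_{ij} > 0,

where the f_i are Boolean connectives and the delays \tau_{ij} introduce memory, so that simple logical rules can generate rich time behavior; the specific seismic BDE model of the case study is not reproduced here.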


Title of the Paper: A Symbolic Algorithm for the Computation of Periodic Orbits in Non–Linear Differential Systems

 

Authors: Juan F. Navarro

Pages: 19-28

Abstract: The Poincaré–Lindstedt method of perturbation theory is used to compute periodic solutions of perturbed differential equations through a nearby periodic orbit of the unperturbed problem. The adaptation of this technique to systems of first-order differential equations could produce meaningful advances in the qualitative analysis of many dynamical systems. In this paper we present a new symbolic algorithm, as well as a new symbolic computation tool, to calculate periodic solutions of systems of first-order differential equations. The algorithm is based on an optimized adaptation of the Poincaré–Lindstedt technique to differential systems and is applied to compute a periodic solution of a Lotka–Volterra system.
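In its standard form (recalled here as background; the paper's adaptation to first-order systems is more elaborate), the Poincaré–Lindstedt method rescales time as \tau = \omega t and expands both the solution and the frequency in powers of the small parameter \varepsilon,

x(\tau) = x_0(\tau) + \varepsilon\, x_1(\tau) + \varepsilon^2 x_2(\tau) + \cdots, \qquad \omega = \omega_0 + \varepsilon\, \omega_1 + \varepsilon^2 \omega_2 + \cdots,

choosing each \omega_k so that secular (resonant) terms cancel at every order and all the x_k remain periodic.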


Title of the Paper: Automatic Correction of OCR Results Using Similarity Detection for Words and Fonts

 

Authors: Costin A. Boiangiu, Mihai Zaharescu, Oana Ferche, Andrei Danescu

Pages: 10-18

Abstract: This paper presents a novel approach to optimizing the results of a traditional OCR system by performing a prior analysis of the input image document. The purpose is the interpretation of highly deteriorated, low-resolution images. The idea behind this approach is to use text redundancy in order to estimate unclear areas. This is done at the image level, by replacing degraded areas with other regions likely to contain the same information. The replacement is based on two mechanisms: selecting similar regions based on surrounding font statistics, and constructing lists of deteriorated and clear words and choosing the best possible replacements.


Title of the Paper: Modified Feature Descriptor Based Approach to Recognize Surgically Altered Human Faces

 

Authors: Steven Lawrence Fernandes, G. Josemin Bala

Pages: 1-9

Abstract: Plastic surgery is cited as one of the major challenges for face recognition. Plastic surgery significantly alters facial regions by modifying the appearance, facial features and textures, both locally and globally. These alterations greatly affect the performance of algorithms on plastic surgery databases, which contain both locally and globally altered face images. The study clearly depicts how commonly used algorithms suffer in identifying faces across the modifications made by plastic surgery procedures. The closeness between the pre-surgical and post-surgical faces is found by extracting nose, eye and lip features. In this paper an attempt is made to develop a novel technique to recognize surgically altered images using a Modified Feature Descriptor (MFD) technique and Adaptive Feature Matching (AFM). MFD consists of two steps: firstly, an encoding scheme is devised that compresses high-dimensional dense features into a compact representation by maximizing the intra-user correlation; secondly, AFM is developed for effective classification. The proposed system is compared with two state-of-the-art techniques for recognizing surgically altered human faces: firstly, a Local Binary Pattern and Local Derivative Pattern based approach, and secondly, a Local Binary Pattern and Principal Component Analysis based approach. The proposed system is validated using the IIIT-Delhi Plastic Surgery Face Database.