International Journal of Mathematics and Computers in Simulation

 
E-ISSN: 1998-0159
Volume 6, 2012


 


Issue 1, Volume 6, 2012


Title of the Paper: RRTs Review and Statistical Analysis

Authors: Ahmad Abbadi, Radomil Matousek

Pages: 1-8

Abstract: Path planning is one of the important issues in robotics. Many approaches deal with this issue; one of them is the RRT (Rapidly Exploring Random Tree). This method is not optimal, but it reduces the time needed to obtain a solution. The algorithm is based on probabilistic sampling, and its result is a tortuous path containing many useless points. In this paper we introduce some variants of RRTs and a method for reducing the degree of tortuosity, making the path shorter and omitting useless points. Also, because of the RRT's randomness, we run statistical tests on many variations of RRT in order to decide which variations perform best.
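The path-shortening idea mentioned above (omitting useless waypoints from a tortuous RRT path) is commonly realized by random shortcutting. The Python sketch below is a generic illustration, not the authors' exact method; `collision_free(p, q)` is a hypothetical helper that tests whether the straight segment between two waypoints is obstacle-free.

```python
import random

def shortcut_path(path, collision_free, iterations=200):
    """Randomly try to connect non-adjacent waypoints directly and keep the
    shortcut whenever the straight segment between them is obstacle-free."""
    path = list(path)
    for _ in range(iterations):
        if len(path) < 3:
            break
        i, j = sorted(random.sample(range(len(path)), 2))
        if j - i < 2:
            continue                         # adjacent waypoints, nothing to skip
        if collision_free(path[i], path[j]):
            path = path[:i + 1] + path[j:]   # drop the intermediate waypoints
    return path
```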


Title of the Paper: An Idea for Finding the Shortest Driving Time Using Genetic Algorithm Based Routing Approach on Mobile Devices

Authors: Umit Atila, Ismail Rakip Karas, Cevdet Gologlu, Beyza Yaman, Ilhami Muharrem Orak

Pages: 9-16

Abstract: People's orientation toward mobile devices all over the world has made route guidance systems that assist drivers in traffic widespread in daily life. For effective routing, these systems should take into account the factors affecting traffic flow, such as the allowable velocity limits of the roads and traffic density. The computational cost of the system depends on the number of nodes in the road network and on these factors. For road networks with an excessive number of nodes, finding exact routes in real time with well-known deterministic methods such as Dijkstra's algorithm may not be practical on mobile devices with limited memory capacity and processing speed. In this paper, a Genetic Algorithm (GA) approach applied to a route guidance system for finding the shortest driving time is proposed. A new gene search approach for the crossover operation, named "first-match-genes", is introduced. A mobile application for the traffic network of Ankara is presented, and the performance of the genetic algorithm is tested on networks with 10, 50, 250 and 1000 nodes.
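The abstract names a "first-match-genes" crossover without spelling it out; the Python sketch below is one plausible reading (cross two route chromosomes at the first node, after the start, that appears in both parents) and is a hypothetical illustration rather than the authors' exact operator.

```python
def first_match_crossover(parent_a, parent_b):
    """Cross two route chromosomes (node lists sharing start and goal) at the
    first gene, after the start node, that also appears in the other parent."""
    for i, node in enumerate(parent_a[1:-1], start=1):
        if node in parent_b[1:-1]:
            j = parent_b.index(node)
            return parent_a[:i] + parent_b[j:], parent_b[:j] + parent_a[i:]
    return list(parent_a), list(parent_b)   # no common gene: keep parents

# Example with two routes from node 1 to node 9
child1, child2 = first_match_crossover([1, 4, 7, 9], [1, 3, 7, 8, 9])
```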


Title of the Paper: Measuring IT Governance Performance: A Research Study on CobiT- Based Regulation Framework Usage

Authors: Mario Spremic

Pages: 17-25

Abstract: After explaining the Information Technology (IT) governance concept and the external and national regulation, in this paper we investigate whether the prescribed regulatory requirements and regular information system (IS) audits affect IT governance initiatives and foster strategic business/IT alignment. The external and especially the national IT governance regulation framework in the Republic of Croatia is explained in further detail. We constructed the research model around IT governance components and conducted the research through a series of comprehensive, in-depth interviews with responsible employees. On a sample of selected small Croatian banks, the organizational position and the role of IT in the business have been investigated, while a specific research interest was to get a clear view of the maturity level of IT usage. We hope that such an approach can be useful when trying to answer the posed research question: can a national IT governance regulatory framework help to start measuring IT governance maturity, and are such initiatives helpful in aligning IT and business?


Title of the Paper: A New Standard Uptake Values (SUV) Calculation based on Pixel Intensity Values

Authors: Somphob Soongsathitanon, Pawitra Masa-Ah, Malulee Tuntawiroon

Pages: 26-33

Abstract: Positron Emission Tomography (PET) is one of the major cancer imaging tools for both diagnosis and staging. A prognostically significant parameter for PET scans is the Standard Uptake Value (SUV). The SUV can be used as a tool to supplement the visual interpretation by the physician. However, the SUV cannot be calculated without the vendor's application software. In order to calculate the SUV without the vendor's application software, a new standard uptake value (SUV) calculation scheme based on image intensity has been introduced. This new scheme is tested using 108 slices of DICOM files obtained from 11 patients (8 men, 3 women). A comparative study between this scheme and the GE Xeleris workstation has been done, and the results showed that the correlation between the two systems is statistically significant at a 99% confidence interval. The average accuracy is 85% for the report at a 95% confidence interval. So this scheme can be used as an alternative tool to calculate the SUV, and it can be installed on any computer.
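For reference, the standard body-weight SUV definition that such schemes ultimately compute is

```latex
\mathrm{SUV} = \frac{C_{\mathrm{img}}(t)}{D_{\mathrm{inj}} / W}
```

where C_img(t) is the decay-corrected activity concentration in the region of interest (e.g. in kBq/mL), D_inj the injected dose and W the patient's body weight. The paper's contribution lies in recovering C_img(t) from DICOM pixel intensities without the vendor software; that conversion step is not reproduced here.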


Title of the Paper: Rectification Column - Mathematical Modeled and Computer Simulated Real System

Authors: S. Hubalovsky

Pages: 34-45

Abstract: One of the most important methods in current scientific and technological research is the modeling and simulation of a real experiment as well as of a real experimental device. System approach, modeling and simulation are disciplines with their own theory and research methodology. The paper focuses on the theory of the process of modeling and simulation, visualization, and model validation and verification of a real experiment and experimental device. The multidisciplinary nature of the approach is pointed out as well. The creation of a static and dynamic mathematical model of the real experimental device – a seven-stage rectification column – is shown step by step. The mathematical model is supplemented by a simulation model written in Visual Basic for Excel. Visualization is a part of the simulation model. Validation of the mathematical model as well as verification of the simulation model is also shown in the paper.


Title of the Paper: Mixing of Two Different Temperature Water Flows as Feedback Controlled System Mathematically Modeled and Simulated in MS Excel Spreadsheet

Authors: S. Hubalovsky

Pages: 46-57

Abstract: One of the most important methods in current scientific and technological research is the modeling and simulation of a real experiment as well as of feedback-regulated systems. System approach, modeling and simulation are disciplines with their own theory and research methodology. The paper focuses on the theory of the process of modeling, simulation and visualization of a feedback-controlled system. The multidisciplinary nature of the approach is pointed out as well. The creation of a static, dynamic and feedback-controlled mathematical model is shown step by step. The mathematical model is supplemented by a simulation model realized in an MS Excel spreadsheet. Visualization of the simulation model is realized in an MS Excel XY chart.
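A minimal static relation underlying such a mixing model, assuming equal specific heats and negligible heat losses, is the energy balance

```latex
T_{\mathrm{mix}} = \frac{\dot{m}_1 T_1 + \dot{m}_2 T_2}{\dot{m}_1 + \dot{m}_2}
```

where \dot{m}_1, \dot{m}_2 are the mass flow rates and T_1, T_2 the temperatures of the two streams. The paper's dynamic, feedback-controlled model builds on relations of this kind; the controller and the dynamics themselves are not shown here.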


Title of the Paper: The Averaging Model of a Six-Pulse Diode Rectifier Feeding Paralleled Buck Converters

Authors: T. Sopaprim, K-N. Areerak, K-L. Areerak

Pages: 58-65

Abstract: Power converter models are time-varying in nature because of their switching behavior. This paper presents averaging methods, namely the DQ and generalized state-space averaging modeling methods, that are used to eliminate the switching actions and achieve time-invariant models. The DQ modeling method is used to analyze the dynamic model of a three-phase rectifier including the transmission line on the AC side, while the generalized state-space averaging method is applied to derive the dynamic model of a buck converter. Intensive time-domain simulations via well-known software packages with the exact topology models are used to validate the proposed models. The simulation results show that the proposed mathematical models provide high accuracy in both transient and steady-state responses. The reported models require much less simulation time than the full topology models of commercial software packages. Therefore, the averaging model is suitable for system design via searching algorithms in which repeated calculation is needed during the searching process.


Title of the Paper: Using NARX Model with Wavelet Network to Inferring the Polished Rod Position

Authors: Emanuel Cajueiro, Ricardo Kalid, Leizer Schnitman

Pages: 66-73

Abstract: Although several studies have been conducted on the sucker-rod pumping system, even today the acquisition of the polished rod position is carried out by using position transducers. In this paper, we present experimental results showing that the dynamic position of the polished rod can be inferred from the torque current of a three-phase induction motor (which is given by a sensorless vector AC drive) using a nonlinear autoregressive model with exogenous input (NARX) with a wavelet network (wavenet). The results obtained in the validation stage show that, on the basis of the experimental set used in this work, the best estimated model is suitable to represent the dynamic behavior of the polished rod of the pumping unit.
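In general terms, a NARX model of the kind used here predicts the current output from lagged outputs and lagged inputs,

```latex
\hat{y}(k) = f\bigl(y(k-1),\ldots,y(k-n_a),\; u(k-1),\ldots,u(k-n_b)\bigr),
```

with the nonlinear map f realized by a wavelet network (wavenet). The lag orders n_a, n_b and the network structure chosen in the paper are not reproduced here.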


Title of the Paper: Influence of Raster Data Quality on Spatial Modelling and Analyses

Authors: Jitka Komarkova, Pavel Sedlak, Martin Jedlicka, Lucie Horakova, Petr Sramek, Jana Hejlova

Pages: 74-82

Abstract: Good decision making is partially based on a good quality of the data and/or information provided by information systems. Without input data at an appropriate level of quality, information systems cannot provide quality information. Therefore, many standards and data quality models have been developed. Later on, spatially oriented decision making has become more important, so attention was focused on spatial data too. Quality evaluation of spatial data requires special standards because of the special properties of spatial data. The contribution is mainly focused on raster data models. At first, a brief description of spatial data quality evaluation is provided. Then, a set of quality characteristics and parameters for raster data within the framework of existing ISO standards is proposed. Finally, the proposed set is used to evaluate two example data sets, and several practical examples connected to raster data quality and its influence on spatial modelling are described.


Title of the Paper: A Software for Calculation of Optimum Conditions for Cotton, Viscose, Polyester and Wool Based Yarn Bobbins in a Hot-Air Bobbin Dryer

Authors: H. Kuşçu, K. Kahveci, U. Akyol, A. Cihan

Pages: 83-90

Abstract: In this study, a software application has been developed to predict the optimum drying conditions of viscose, wool, polyester and cotton based yarn bobbins for drying in a pressurized hot-air dryer. For this purpose, firstly, a suitable drying model has been identified to describe the drying behavior of the bobbins, using the experimentally observed drying behavior. After that, additional regression analyses have been made to take into account the effect of the drying parameters on drying. Then, the software was developed using the Visual Basic programming language. With the aid of this software, optimum drying conditions with respect to drying time and energy consumption can be obtained for cotton, viscose, polyester and wool based yarn bobbins.


Title of the Paper: Optimization of Digital Image Watermarking Scheme Using Artificial Intelligent Technique

Authors: P. Kumsawat, K. Pasitwilitham, K. Attakitmongcol, A. Srikaew

Pages: 91-98

Abstract: In this paper, a robust image watermarking scheme for copyright protection of electron microscope images is proposed. The watermark insertion and watermark extraction are based on the quantization index modulation technique and do not need the original image in the watermark extraction process. We have developed an optimization technique using genetic algorithms to search for optimal quantization steps in order to improve the quality of the watermarked image and the robustness of the watermark. In addition, we construct a prediction model based on image moments and a back-propagation neural network to correct an attacked image geometrically before the watermark extraction process begins. Experimental results demonstrate that the proposed algorithm achieves good perceptual invisibility and security, and it is also robust against various image processing attacks.
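As background, a textbook binary quantization index modulation (dither modulation) rule embeds a bit m in {0, 1} into a coefficient x with quantization step Δ as

```latex
x' = \Delta \,\operatorname{round}\!\left(\frac{x - d_m}{\Delta}\right) + d_m,
\qquad d_0 = 0,\; d_1 = \tfrac{\Delta}{2},
```

and the detector decides for the bit whose dithered lattice lies closer to the received coefficient. This generic rule is given only for orientation; the paper's scheme additionally optimizes the quantization steps with a genetic algorithm and adds a neural-network-based geometric correction.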


Title of the Paper: Planning Algorithm and its Modifications for Tactical Decision Support Systems

Authors: Petr Stodola, Jan Mazal

Pages: 99-106

Abstract: This paper is divided into two main parts. The first part deals with a planning algorithm used in tactical decision support systems, which has been developed at the University of Defence in Brno. It presents improved versions of the original algorithm, demonstrated on the search for an optimal path for a ground autonomous robot in a general environment. The article shows two different approaches to improving the algorithm, along with their basic principles. The possibilities of the improvement are analyzed on two particular examples, and the results of the new versions are compared with the original algorithm. In the second part, the article presents the issue of tactical decision support systems. The state of development of these systems is presented, along with an example of their utilization.


Title of the Paper: Modeling and Computer Simulation of Real Process – Solution of Mastermind Board Game

Authors: Stepan Hubalovsky

Pages: 107-118

Abstract: One of the most important methods in current scientific and technological research, as well as in research on strategy algorithms and programming, is the modeling and computer simulation of real systems and real processes. System approach, modeling and simulation are disciplines with their own theory and research methodology. The paper focuses on the theory of modeling, simulation, visualization, and model validation and verification of a feedback-controlled real process – the solution of the Mastermind board game. The multidisciplinary nature of the approach is pointed out as well. The process of system identification, mathematical analysis and the solution strategy is shown step by step. The conceptual model of the process is realized as a process chart. The conceptual model is supplemented by a simulation model realized in Visual Basic for Excel. Visualization is a part of the simulation model. Validation of the conceptual model as well as verification of the simulation model is also shown in the paper.
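The core of any Mastermind simulation is the feedback rule: black pegs for guesses correct in both colour and position, white pegs for colours that are present but misplaced. The compact Python version below is only a generic illustration of that rule, not the paper's Visual Basic implementation.

```python
from collections import Counter

def mastermind_feedback(secret, guess):
    """Return (black, white): exact matches and colour-only matches."""
    black = sum(s == g for s, g in zip(secret, guess))
    common = sum((Counter(secret) & Counter(guess)).values())
    return black, common - black

# Example: secret RGBY, guess RYGB -> 1 black peg, 3 white pegs
print(mastermind_feedback("RGBY", "RYGB"))
```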


Title of the Paper: Robust Autocorrelation Testing in Multiple Linear Regression

Authors: Lim Hock Ann, Habshah Midi

Pages: 119-126

Abstract: It is essential to detect the autocorrelation problem because it ruins important properties of the Ordinary Least Squares (OLS) estimates. The Breusch-Godfrey test is the most commonly used method for autocorrelation detection. However, not many statistics practitioners are aware that this test is easily affected by high leverage points. In this paper, we propose a new robust Breusch-Godfrey test which is resistant to high leverage points. The results of the study signify that the robustified Breusch-Godfrey test is very powerful in detecting the autocorrelation problem both with and without the presence of high leverage points.
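For orientation, the classical (non-robust) Breusch-Godfrey test regresses the OLS residuals on the original regressors plus p lagged residuals and uses n·R² of that auxiliary regression as an asymptotically chi-square statistic with p degrees of freedom. The NumPy sketch below implements this standard version only; the robustified, high-leverage-resistant test proposed in the paper modifies the procedure and is not shown.

```python
import numpy as np

def breusch_godfrey_lm(y, X, p=1):
    """Classical Breusch-Godfrey LM statistic for AR(p) serial correlation
    in the OLS residuals of y on X (X should include a constant column)."""
    n = len(y)
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    e = y - X @ beta                                   # OLS residuals
    lagged = np.column_stack([np.r_[np.zeros(k), e[:-k]] for k in range(1, p + 1)])
    Z = np.column_stack([X, lagged])                   # auxiliary regressors
    gamma = np.linalg.lstsq(Z, e, rcond=None)[0]
    resid_aux = e - Z @ gamma
    r2 = 1.0 - (resid_aux @ resid_aux) / (e @ e)       # R^2 of auxiliary regression
    return n * r2                                      # ~ chi-square(p) under H0
```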


Title of the Paper: Algoritmization of the Information Concept of the Complex Logistic System

Authors: Robert Bucki, Bronislav Chramcov, Roman Jašek

Pages: 127-135

Abstract: The paper highlights the vital problem of the information-based mathematical modelling of a logistic system. The complex system itself consists of identical parallel manufacturing subsystems in which there is a manufacturing route arranged in a series of stands. Each stand is equipped with a machine with a dedicated tool. There are inter-operation buffer stores between subsequent production stands. After getting worn out, certain tools require regeneration. Used tools from identical production stands share the same regeneration plant. Irreplaceable tools need to be exchanged for new ones. A replaceable tool can be regenerated a certain number of times. The production process is optimized by means of the stated criteria respecting defined bounds. There is a set of control approaches of which the most effective one is to be chosen in order to either maximize the production output, minimize the lost flow capacity or, finally, minimize the total tool replacement time. The logistic system is controlled by a determined heuristic algorithm. Sub-line heuristic algorithms are also given. Equations of state illustrate the flow of charge material and the changes of the order vector elements. Manufacturing strategies allow us to decide which approach will be implemented. Moreover, optimization issues are discussed by means of introducing the multi-stage process model.


Title of the Paper: Characterization of Electronic Circuit Elements by Exclusive and Corrective Artificial Neural Networks

Authors: Ladislav Pospisil, Josef Dobes, Abhimanyu Yadav

Pages: 136-143

Abstract: At present, there are many novel electronic circuit elements for which nonlinear models for CAD are necessary, especially microwave ones. However, in the PSpice family of programs, only a class of several classic types of the MESFET model is available for the microwave area. In this paper, a novel, reliable way is suggested for modeling various electronic structures by exclusive neural networks, or by corrective neural networks working attached to a modified analytic model. The accuracy of the proposed modification of the analytic model is assessed by extracting the model parameters of a GaAs MESFET, an AlGaAs/InGaAs/GaAs pHEMT, and GaAs microwave varactors. First, a precise approximation of the pHEMT output characteristics is carried out by means of both exclusive and corrective artificial neural networks; second, an approximation of the capacitance (C-V) function of a SACM InGaAs/InP avalanche photodiode is performed by the exclusive neural network. Further, the Pt−TiO2−x−Pt memristor characteristic with an extraordinary (but typical) hysteresis is approximated by a set of cooperative artificial neural networks, because a single network is unable to characterize this special element. Last, a sequence of systematic experiments is performed, which shows that the optimal structure of the network can be found relatively easily and that it should not be too complicated.


Title of the Paper: Simulation of Molecular Ring Emission Spectra: Localization of Exciton States and Dynamics

Authors: David Zapletal, Pavel Herman

Pages: 144-152

Abstract: A computer simulation of steady-state fluorescence spectra of a ring molecular system is presented in this paper. The cyclic antenna unit LH2 of the bacterial photosystem from the purple bacterium Rhodopseudomonas acidophila can be modeled by such a system. Three different models of uncorrelated static disorder are taken into account in our simulations: Gaussian disorder in the local excitation energies, Gaussian disorder in the nearest-neighbour transfer integrals and Gaussian disorder in the radial positions of the molecules in the ring. Dynamic disorder, i.e. interaction with a bath, is also included in the Markovian approximation. The cumulant-expansion method of Mukamel et al. is used for the calculation of the spectral responses of the system with exciton-phonon coupling. The peak position of single-ring spectra and the localization of exciton states depend on the realization of static disorder and are also influenced by dynamic disorder. We also discuss different types of exciton dynamics that are coupled to the above-mentioned effects and compare the results obtained with dynamic disorder taken into account to those obtained without it.


Title of the Paper: Does the Higher Order Mean the Better Internal Delay Rational Approximation?

Authors: Libor Pekař, Eva Kurečková

Pages: 153-160

Abstract: The aim of this contribution is to test by simulations whether a higher-order rational approximation for the exponential elements in linear time-invariant time-delay systems (LTI-TDS) automatically means a better (i.e. more accurate) finite-dimensional approximating model. The presented approximations are applied to the Laplace transfer function model in the form of fractions of so-called quasipolynomials, and the methods are chosen so that they are easy to handle. Namely, the Padé approximation, shift operator approximations – the Laguerre and Kautz shifts – and a Fourier analysis based method are introduced and benchmarked. The work is motivated, inter alia, by the fact that direct controller design for LTI-TDS based on such models is mostly rather intricate and there are no theoretical results for internal delays. Moreover, the authors intend to use the results for the rationalization of so-called anisochronic controllers during their discretization. The quality of the approximation is measured by the well-known H2 and H∞ norms instead of exact analytic calculations, since this is sufficient for practical engineering problems. Some simulation examples for anisochronic controllers, obtained by means of a developed program testing interface in the Matlab-Simulink environment, are presented as well.
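The best known of the benchmarked methods, the Padé approximation, replaces an internal delay term e^{-τs} by a rational function, for example

```latex
e^{-\tau s} \approx \frac{1 - \frac{\tau s}{2}}{1 + \frac{\tau s}{2}}
\quad\text{(first order)},
\qquad
e^{-\tau s} \approx \frac{1 - \frac{\tau s}{2} + \frac{(\tau s)^2}{12}}{1 + \frac{\tau s}{2} + \frac{(\tau s)^2}{12}}
\quad\text{(second order)}.
```

The Laguerre and Kautz shift operators and the Fourier-based method studied in the paper lead to different rational structures and are not reproduced here.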


Title of the Paper: Various Approaches to Solving an Industrially Motivated Control Problem: Software Implementation and Simulation

Authors: Radek Matušů, Roman Prokop

Pages: 161-168

Abstract: The main aim of this paper is to present various approaches to solving an industrially motivated control problem, especially from the viewpoint of the implementation of control algorithms in the Matlab and Pascal environments. The motivation and basic conditions of the application are based on a real technical assignment from a manufacturer of aluminium-based rolled products and packaging materials. The primary part of the work deals with selected digital self-tuning controllers, where the applied methods comprise a polynomial approach to discrete-time control design and the recursive least-squares identification algorithm LDDIF. Subsequently, two alternative approaches are analyzed, namely control using a continuous-time regulator with fixed parameters and the use of the delta approach in self-tuning control.
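As a generic illustration of the recursive identification step inside a self-tuning controller, the sketch below implements standard recursive least squares with exponential forgetting. The LDDIF algorithm used in the paper is a numerically refined (factorized, directional-forgetting) variant, so this is not the authors' exact routine.

```python
import numpy as np

def rls_update(theta, P, phi, y, lam=0.98):
    """One step of recursive least squares with exponential forgetting.
    theta: parameter estimate (n,), P: covariance matrix (n, n),
    phi: regressor vector (n,), y: new output sample, lam: forgetting factor."""
    e = y - phi @ theta                       # a priori prediction error
    K = (P @ phi) / (lam + phi @ P @ phi)     # gain vector
    theta = theta + K * e                     # parameter update
    P = (P - np.outer(K, phi @ P)) / lam      # covariance update
    return theta, P
```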


Title of the Paper: Prediction of Heat Consumption Parameters in Distribution Network

Authors: Lubomir Vasek, Viliam Dolinay, Erik Kral

Pages: 169-176

Abstract: This article analyses methods typically used for controlling the processes in a system for the distribution of heat energy in an urban agglomeration (SHDC – System of Heat Distribution and Consumption). The key problem in this control mechanism is the transport delay of the heat transfer medium. Therefore the control mechanism must operate in a predictive mode. The two control parameters, or control variables, are the temperature of the heat carrier and its flow rate. Their time behavior must be predicted for efficient control of the whole heat energy supply. There are many methods used for this prediction; they are briefly described, classified and analyzed in this paper. The methods developed by the authors, which combine procedures for the mathematical analysis of historical production data with procedures for modeling the physical features of the SHDC, are described in more detail. Simulation models are used for the modeling. Finally, the results of practical experiments achieved with the described methods in a concrete, real heat distribution system are presented.


Title of the Paper: DoS Attacks Targeting SIP Server and Improvements of Robustness

Authors: M. Voznak, J. Safarik

Pages: 177-184

Abstract: The paper describes the vulnerability of SIP servers to DoS attacks and methods for server protection. For each attack, the paper describes its impact on a SIP server, an evaluation of the threat and the way in which it is executed. The attacks are described in detail, and a security precaution to prevent each of them is proposed. The proposed protection solution is based on a specific topology of intrusion protection system components, consisting of a combination of the Snort, SnortSam and Iptables applications; the solution was verified in experiments. The contribution of this paper includes a comparison of the efficiency of the DoS attacks, which were tested both without any protection and then with the Snort and SnortSam applications implemented as proposed in our solution.


Title of the Paper: Implementation and Performance of an Object-Oriented Software System for Cuckoo Search Algorithm

Authors: Nebojsa Bacanin

Pages: 185-193

Abstract: Evolutionary computation (EC) algorithms have been successfully applied to hard optimization problems. In this very active research area, one of the newest EC algorithms is the cuckoo search (CS) metaheuristic for unconstrained optimization problems, which was developed by Yang and Deb in MATLAB. This paper presents our software implementation of the CS algorithm, which we call CSApp. CSApp is an object-oriented system which is fast, robust, scalable and resistant to errors. A user-friendly graphical user interface (GUI) enables simple adjustment of the algorithm's control parameters. The system was successfully tested on standard benchmark functions for unconstrained problems with various numbers of parameters. The CSApp software, as well as experimental results, is presented in this paper.
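For readers unfamiliar with the metaheuristic, the Python sketch below gives a compact version of the Yang-Deb cuckoo search scheme: new solutions are generated by Lévy-flight steps and a fraction pa of the worst nests is abandoned and rebuilt each iteration. It is a minimal illustration under these generic assumptions, not the object-oriented CSApp implementation described in the paper.

```python
import numpy as np
from math import gamma, pi, sin

def levy_step(dim, beta=1.5):
    """Symmetric Levy-stable step via Mantegna's algorithm."""
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = np.random.normal(0.0, sigma, dim)
    v = np.random.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)

def cuckoo_search(f, lo, hi, n_nests=25, pa=0.25, iters=500, alpha=0.01):
    """Minimize f over the box [lo, hi] (both 1-D arrays of equal length)."""
    dim = len(lo)
    nests = np.random.uniform(lo, hi, (n_nests, dim))
    fit = np.apply_along_axis(f, 1, nests)
    for _ in range(iters):
        best = nests[fit.argmin()]
        for i in range(n_nests):                       # Levy-flight moves
            cand = np.clip(nests[i] + alpha * levy_step(dim) * (nests[i] - best), lo, hi)
            fc = f(cand)
            if fc < fit[i]:
                nests[i], fit[i] = cand, fc
        n_worst = max(1, int(pa * n_nests))            # abandon worst nests
        worst = fit.argsort()[-n_worst:]
        nests[worst] = np.random.uniform(lo, hi, (n_worst, dim))
        fit[worst] = np.apply_along_axis(f, 1, nests[worst])
    return nests[fit.argmin()], fit.min()
```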


Title of the Paper: Hybridizing Artificial Bee Colony (ABC) Algorithm with Differential Evolution for Large Scale Optimization Problems

Authors: Nadezda Stanarevic

Pages: 194-202

Abstract: Artificial bee colony (ABC) and differential evolution (DE) are two metaheuristics used for hard optimization problems. In this paper, a novel method called DEM-ABC is proposed to improve the exploitation process in the ABC algorithm. The method combines differential evolution mutation strategies with the original ABC algorithm to improve its convergence and performance. The proposed approach was tested on a set of well-known large-scale unconstrained benchmark problems. Comparisons show that DEM-ABC outperforms, or performs similarly to, the original ABC algorithm in terms of the quality of the resulting solutions.
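To make the hybridization idea concrete, the sketch below generates a candidate with the classical DE/rand/1 mutation, which can be used in place of ABC's single-dimension perturbation when an employed or onlooker bee produces a new food source. This is an illustrative combination under those assumptions, not necessarily the exact DEM-ABC operator of the paper.

```python
import numpy as np

def de_rand_1_candidate(pop, i, F=0.5):
    """DE/rand/1 mutant for food source i: v = x_r1 + F * (x_r2 - x_r3),
    with r1, r2, r3 mutually distinct and different from i."""
    others = [k for k in range(len(pop)) if k != i]
    r1, r2, r3 = np.random.choice(others, 3, replace=False)
    return pop[r1] + F * (pop[r2] - pop[r3])
```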


Title of the Paper: Assessing User Acceptance toward Blog Technology Using the UTAUT Model

Authors: Bens Pardamean, Mario Susanto

Pages: 203-212

Abstract: Blogs are among the many commonly used technologies for education and learning. They are also both conversational technologies and constructivist learning tools. Their interactive, collaborative, user-friendly, and instant archival features have transformed blogs into effective tools for enhancing case-based teaching methods in the asynchronous nature of the online environment. This study investigated the student populace's acceptance of blog technology through the Unified Theory of Acceptance and Use of Technology (UTAUT) framework. UTAUT integrates eight theories from social psychology and sociology in order to examine the effects of major factors on behavioral intention and the actual use of blogs to learn e-business course materials and discuss topics. The results showed that both social influence and performance expectancy had a significant relationship with behavioral intention, whereas effort expectancy did not. In this study, behavioral intention did not have a significant relationship with the actual usage level of blogs as a learning tool.


Title of the Paper: Calculation and Visualisation of Radar Protection Zone

Authors: Jan Hovad, Jitka Komarkova, Pavel Sedlak, Martin Tulacka

Pages: 213-221

Abstract: Radar is a very important device, e.g. for flight control. If the airport is located close to a settlement, it can influence the people living there. Identification of the radar protection zone and its intersection with surrounding buildings are useful spatial analyses which can support decision making both in public administration and in people's personal lives. Proper visualization of the analysis results is a very important step which helps people to understand the results. A possible way to model the radar protection zone and the surrounding buildings and to identify their intersection is described in the article. Strong attention is paid to the final visualisation of the obtained results. ArcGIS Desktop, ArcScene, Google SketchUp and 3ds Max were used as software tools.


Title of the Paper: About Fibonacci Numbers and Functions

Authors: Alina Bărbulescu

Pages: 222-229

Abstract: Fibonacci numbers and functions are topics of major interest in mathematics, due to the importance of their applications in many sciences. In the first part of this article we present some congruences involving Fibonacci and Lucas numbers. In the second part we discuss the dimensions of Fibonacci functions defined on different closed intervals, starting with the evaluation of the box dimension of such a function defined on [0, 1].


Title of the Paper: Mathematical Foundations and Principles in Practice of Computer Aided Design Simulation

Authors: J. Sedivy, S. Hubalovsky

Pages: 230-237

Abstract: Today, programs for technical modeling are used in almost every field and industry. We see them in technical industries as well as in our everyday life, and even in areas where we would never expect to see them. Such software generally takes over part of the creative and imaginative work of the designer, and in many cases it helps to determine collision situations and points during the creation of new products. As these programs have recently become less expensive and may therefore be obtained more easily, computer graphic design is also being used in practical training, during regular classes as well as for the preparation of study materials.


Issue 2, Volume 6, 2012


Title of the Paper: Bond Graph Modeling In Simscape

Authors: Pršić Dragan, Nedić Novak, Dubonjić Ljubiša, Djordjević Vladimir

Pages: 239-247

Abstract: Modeling is a complex process realized through several levels of abstraction. Since each level has its own ontological primitives, the problem of model transformation from one level to another appears. In order to decrease discontinuities in the development process, this paper discusses a bond graph model library implemented in Simscape. Simscape is a software tool intended for the modeling and simulation of physical systems in the Simulink environment. Thanks to this library, it is possible to use the physical network and bond graph approaches in modeling, within the same model, on two different levels. In other words, a unified notation is used for describing both structure and behavior. Besides the library of basic bond graph elements, an example of a model of a component used as an interface between a bond graph and other Simscape domains is also given. The application of the Simscape bond graph library is illustrated through an example of a hydraulic system model. The model combines standard Simscape and bond graph blocks.


Title of the Paper: Three-Dimensional Heat Conduction in Multi-Connected Domains using GFEM with Hexaedrals of 27 Nodes

Authors: Estaner Claro Romão, Jairo Aparecido Martins, Luiz Felipe Mendes de Moura

Pages: 248-256

Abstract: A numerical solution for the temperature profile in three-dimensional heat conduction inside a multi-connected geometry is presented. The spatial discretization has been done by the Galerkin Finite Element Method (GFEM). Four applications are presented to demonstrate the efficiency of the proposed method. Of these, the first two use a doubly connected domain and the others a multi-connected domain; the first and the third are used to validate the results in their respective domains against the analytical solution. To analyze the results, the norms of the errors and their graphs are studied.


Title of the Paper: Adaptive TVD-RK Discontinuous Galerkin Algorithms for Shallow Water Equations

Authors: Thida Pongsanguansin, Khamron Mekchay, Montri Maleewong

Pages: 257-273

Abstract: An adaptive Discontinuous Galerkin (DG) method for solving one-dimensional conservation equations is presented. In this paper we consider the advection equation, the Burgers' equation, and the shallow water equations. To improve the accuracy of this method, two types of adaptive technique are employed: the adaptive polynomial order (p-adaptive) and the adaptive mesh (h-adaptive). Troubled cells that need to be refined are detected by two types of indicators, namely error and gradient indicators. The presented schemes improve the accuracy of the solution during the time-integration process. For smooth solutions the accuracy can be improved by the adaptive polynomial criterion, while the accuracy for moving shocks, rarefactions and high-gradient solutions can be improved by the adaptive mesh scheme. High-gradient areas in the computational domain can be detected efficiently by both presented indicators.
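For reference, the one-dimensional shallow water equations treated by such DG schemes can be written in conservative form as

```latex
\frac{\partial h}{\partial t} + \frac{\partial (hu)}{\partial x} = 0,
\qquad
\frac{\partial (hu)}{\partial t} + \frac{\partial}{\partial x}\!\left(hu^{2} + \tfrac{1}{2} g h^{2}\right) = -\,g h \frac{\partial b}{\partial x},
```

where h is the water depth, u the velocity, g the gravitational acceleration and b the bottom topography (the right-hand side vanishes over a flat bottom). The DG discretization and the limiters used in the paper are not reproduced here.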


Title of the Paper: Adaptive Tabu Search for Traveling Salesman Problems

Authors: S. Suwannarongsri, D. Puangdownreong

Pages: 274-281

Abstract: One of the most intensively studied problems in computational mathematics and combinatorial optimization is the traveling salesman problem (TSP). The TSP is classified and considered as belonging to the class of NP-complete combinatorial optimization problems. In the literature, many algorithms and approaches have been proposed to solve the TSP. However, no current algorithm can provide the exact optimal solution of the TSP in polynomial time. This article proposes the application of adaptive tabu search (ATS), one of the most powerful AI search techniques, to solve TSP problems. The ATS is tested against ten benchmark real-world TSP problems. Results obtained by the ATS are compared with those obtained by genetic algorithms (GA) and tabu search (TS). As a result, the ATS, TS, and GA can all provide very satisfactory solutions for all TSP problems; among them, the ATS outperforms the other algorithms.
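As a baseline illustration of the tabu search component only (not the adaptive ATS, which adds further mechanisms), the Python sketch below runs a plain tabu search over a sampled 2-opt neighbourhood; `dist` is assumed to be a full symmetric distance matrix.

```python
import random

def tour_length(tour, dist):
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def tabu_search_tsp(dist, iters=2000, tenure=15, samples=100):
    """Plain tabu search with 2-opt moves; recently used moves are forbidden
    for `tenure` iterations unless they improve the best tour (aspiration)."""
    n = len(dist)
    tour = list(range(n))
    random.shuffle(tour)
    best, best_len = tour[:], tour_length(tour, dist)
    tabu = {}
    for it in range(iters):
        move, move_len, move_key = None, float("inf"), None
        for _ in range(samples):                     # sample candidate 2-opt moves
            i, j = sorted(random.sample(range(n), 2))
            if j - i < 2:
                continue
            cand = tour[:i] + tour[i:j][::-1] + tour[j:]
            cl = tour_length(cand, dist)
            if (tabu.get((i, j), 0) <= it or cl < best_len) and cl < move_len:
                move, move_len, move_key = cand, cl, (i, j)
        if move is None:
            continue
        tour = move
        tabu[move_key] = it + tenure                 # forbid reusing this move for a while
        if move_len < best_len:
            best, best_len = tour[:], move_len
    return best, best_len
```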


Title of the Paper: Performance Comparison of Different Over-Sampling Rates of Decision Trees for the Class of Higher Error Rate in the Liver Data Set

Authors: Hyontai Sug

Pages: 282-289

Abstract: Because of their comprehensibility, decision trees are good tools for data mining of data sets in the medical domain. Liver disorder disease is a disease in such a domain, so decision trees can be very useful data mining tools to diagnose the disease. But the accuracy of decision trees may be limited due to insufficient data in the domain. In order to generate more accurate decision trees for the disease, this paper suggests a method based on over-sampling of the data instances that are in the class with the higher error rate, so that we can compensate for the insufficiency. Experiments were done with two representative decision tree algorithms, C4.5 and CART, and very good results in accuracy, up to 81.16%, were found.
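A minimal sketch of the over-sampling idea is shown below using scikit-learn's CART-style tree (the paper itself uses C4.5 and CART implementations); the instances of the class with the higher error rate are duplicated with replacement at a chosen over-sampling rate before training.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.utils import resample

def train_with_oversampling(X, y, hard_class, rate=2.0, random_state=0):
    """Duplicate (with replacement) the instances of `hard_class` by the given
    over-sampling rate, then fit a CART-style decision tree on the augmented set."""
    mask = (y == hard_class)
    X_extra, y_extra = resample(X[mask], y[mask],
                                n_samples=int(rate * mask.sum()),
                                replace=True, random_state=random_state)
    X_aug = np.vstack([X, X_extra])
    y_aug = np.concatenate([y, y_extra])
    return DecisionTreeClassifier(random_state=random_state).fit(X_aug, y_aug)
```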


Title of the Paper: Solution of Technical Projects using Computer Virtual Prototypes

Authors: Karel Dvorak, Josef Sedivy

Pages: 290-297

Abstract: Tools for design, simulation and engineering technologies used in industrial practice have a strong potential for visualization and simulation. Their systematic use, especially in project-based education, is the subject of research aimed at increasing knowledge of engineering issues and skills in working with applied information technologies. The development of the creativity of technicians and a faster adaptability of graduates in their professional practice are expected. This text presents a concept for understanding engineering methods supported by the deployment of these tools, relevant materials on the issue, and the conclusions of the investigations. The research question and hypotheses are introduced in the paper.


Title of the Paper: Information Control of the Autonomous Production System

Authors: Bronislav Chramcov, Robert Bucki

Pages: 298-305

Abstract: The paper highlights the problem of the mathematical modelling of an autonomous production system in which there are production stands equipped with exactly the same machines arranged in a series. Each machine can perform a number of operations by means of a specified tool predefined from the set of tools. However, only one operation can be performed in each work stand at one time. The required specification is shown in order to model the expected system. The presented system requires a strategy on the basis of which the whole logistic process is run. Moreover, there is the need to implement an adequate criterion to obtain the expected results. The heuristic approach determines the order vector element to be realized. A sample case study is analyzed.


Issue 3, Volume 6, 2012


Title of the Paper: Introduction to the Absolute Zero Velocity Machine

Authors: Claude Ziad Bayeh

Pages: 307-314

Abstract: The "Absolute zero velocity machine" is a new instrument invented by the author; it is introduced to detect the velocity of celestial objects that contain this instrument, for example rockets, satellites, planets... This instrument measures objects that contain it while they are running at very high velocity, for example 20,000 km/h. At very high velocity the existing instruments cannot detect the exact velocity of objects, because the speed becomes comparable to the speed of light, for example 0.1c; this cannot be detected with existing systems and circuits because the velocity is relative with respect to the observer, which means the internally measured velocity will not be the same as the externally measured velocity. So the best solution for this problem is to fabricate an instrument that measures the speed of objects by using light as the measuring device. Every velocity is then compared to the speed of light, and in this manner we can measure the exact velocity of the objects. The concept of this instrument is to send a beam of light from a laser head that is reflected by a mirror, returns, and hits sensors placed behind the laser head. The displacement of the light from the laser head gives the exact velocity of the object with respect to the internal and external observers, using some formulae developed in this paper. The internal velocity of the object with respect to the internal observer is not equal to zero when the object is moving in space. If the object is at rest, then the indicated velocity is equal to zero with respect to the internal observer. In this paper the author also proposes advanced postulates in place of Einstein's postulates. The new postulates can resolve big existing problems, for example the existence of some particles that travel faster than the speed of light.


Title of the Paper: SIP End to End Performance Metrics

Authors: Miroslav Voznak, Jan Rozhon

Pages: 315-323

Abstract: The paper deals with a SIP performance testing methodology. The main contribution to the field of performance testing of SIP infrastructure consists in the possibility to perform standardized stress tests with the developed SIP TesterApp without a deeper knowledge of SIP communication. The developed tool exploits several open-source applications such as jQuery, Python, JSON and the cornerstone SIP generator SIPp; the result is highly modifiable and the application is able to carry out benchmarking in accordance with RFC 6076. The main advantages are high performance (the tool has been tested up to tens of thousands of simultaneous connections) and scalability.


Title of the Paper: Information Control of Allocation Tasks in the Synthetic Manufacturing Environment

Authors: R. Bucki, P. Suchánek, D. Vymětal

Pages: 324-332

Abstract: The paper focuses on the problem of the mathematical modeling of a highly complex manufacturing system imitating real production systems. However, for the sake of simplicity, modeling is carried out in the proposed synthetic manufacturing environment. The authors concentrate on introducing extended specification details leading to the creation of a proper functional model of the potential manufacturing system. The most characteristic feature highlighted here is the way of modeling its flexibility, which allows the modeled system to adjust to the required configuration of production stations resulting from customers' demand, e.g. the number of stations in the manufacturing line, the tools implemented in each production station and the sequence of passing ordered elements to be realized in other available manufacturing plants with identical production possibilities. Orders are accepted on the basis of the order matrix distinguishing customers and their demands. Heuristic algorithms choose the production plant and, subsequently, the orders which are to be realized to meet the stated criterion. The operating principle forms the basis for creating the simulator of the modeled manufacturing system. However, there are production strategies which decide about the exact moment of beginning the realization of the order matrix. A sample case study justifies the approach presented in the paper.


Title of the Paper: Applying Genetic Algorithm and Fourier Series to WQI of Tha Chin River in Thailand

Authors: Somkid Amornsamankul, Busayamas Pimpunchat, Sartja Duangchai-Yoosook, Wannapong Triampo

Pages: 333-340

Abstract: Water pollution is the main problem affecting communities in Thailand. The Pollution Control Department in Thailand indexes water quality using eight parameters. In this paper, the main purpose is to reduce the eight parameters of the water quality index (WQI) to four parameters, which are dissolved oxygen (DO), total solids (TS), biological oxygen demand (BOD), and suspended solids (SS). Factor analysis, correlation analysis and Fourier series are used to simulate the data. The data obtained from the Tha Chin river during 2002-2007 are used in this model. The genetic algorithm is applied to find the weight of each parameter. The result shows that the modified WQI using four parameters provides the same result as that obtained from the model using eight parameters.


Title of the Paper: Comparison of Markov Chain Monte Carlo and Generalized Least Square Methods on a Model of Glucose / Insulin Dynamics with GLP1-DPP4 Interaction

Authors: Sutharot Lueabunchong, Yongwimon Lenbury, Simona Panunzi, Alice Matone

Pages: 341-350

Abstract: In this paper, the performances of Markov Chain Monte Carlo (MCMC) method and Generalized Least Square (GLS) method are compared when they are used to estimate the parameters in a nonlinear differential model of glucose/insulin metabolism with GLP1-DPP4 interaction. The model is used to generate the data that consists of the time-concentration measurements of plasma glucose and of insulin, which are important in Diabetes Mellitus (DM) treatment. We show the results from three different runs to obtain parameter estimations by both MCMC and GLS. The true values (TV), point estimates (PM), standard deviation (SD) and 95% credible intervals (CI) of population parameters based on the two methods are presented. Our results suggest that MCMC is better able to estimate the parameters based upon smaller bias and standard deviation. Although MCMC requires more calculation time than GLS, it offers a more appropriate method, in our opinion, for nonlinear model parameter estimations without knowledge of the distribution of the data and when heterogeneity of variance is evident.


Title of the Paper: Stability Analysis and Analytical Solution of a Nonlinear Model for Controlled Drug Release: Travelling Wave Fronts

Authors: Chontita Rattanakul, Yongwimon Lenbury

Pages: 351-359

Abstract: In this paper, the process of drug dissolution and release from a planar matrix is investigated based on two coupled nonlinear partial differential equations proposed by Goran Frenning in 2003. In that modelling of the process, drug adsorption was disregarded, concentration-independent diffusion coefficients were assumed, perfect sink conditions were used, and the geometry was specialized to a planar one. Even so, the resulting model for the concentration profile of the mobile, or diffusing, drug is rather complex and has previously been investigated only numerically, with only approximate solutions being possible. In this paper it is shown that an analytical solution can be obtained exactly in the form of a travelling wave front. We describe the method for finding the analytical solutions using the travelling wave coordinate when the wave is assumed to be moving at constant speed. The model system of partial differential equations is transformed into two coupled ordinary differential equations, which are analysed in terms of the stability of their steady state. Analytical solutions are derived in three possible cases, giving travelling wave solutions. We then discuss a comparison between the exact solutions obtained here and the "analytical short-time approximation" as well as the curves obtained from the modified Higuchi formula reported by Frenning in 2003.


Issue 4, Volume 6, 2012


Title of the Paper: Open Source E-learning Anxiety, Self-Efficacy and Acceptance – A Partial Least Square Approach

Authors: Norshidah Mohamed, Nor Shahriza Abdul Karim

Pages: 361-368

Abstract: Open source electronic learning (e-learning) has given rise to a new way of learning for students. Implementation of open source e-learning provides quick benefits to educational administrators and educators but may present issues to students as users. Students may feel challenged to accept e-learning immediately as a new mode of learning, may experience computer application anxiety and may not have the confidence to use it as quickly as desired. The paper reports the implementation of Claroline, an open source e-learning system, at a public institution of higher learning in Malaysia. The research aims to establish, among postgraduate business students, (1) the acceptance of open source e-learning, that is, in terms of intention to use, perceived usefulness and perceived ease of use, (2) the relationship between computer application anxiety and e-learning acceptance, and (3) the relationship between self-efficacy and e-learning acceptance. Students were encouraged to use the functions available in Claroline, although their use was not mandatory. A survey questionnaire was used as the instrument to collect data about e-learning acceptance. Partial least squares was used for data analysis. There is evidence to suggest open source e-learning acceptance through a significant relationship between students' perceived ease of use and intention to use. Contrary to past findings, there was no link between computer application anxiety and e-learning acceptance. There was, however, a positive and significant link between self-efficacy and intention to use. The implications of the research are discussed herein.


Title of the Paper: Heatmap Generation Techniques used for GeoWeb Application User-Interface Evaluation

Authors: Oldřich Horák, Martin Novák, Vojtěch Zákoutský

Pages: 369-377

Abstract: This article describes techniques for heatmap generation and their implementation. The specific issues of the implementation in a web-based environment are discussed, and the special features of their use in a GeoWeb application are described. The differences arising from the limited features of the web-application architecture, a possible solution using web-browser scripting capabilities, and the impact on GeoWeb application user-interface evaluation are discussed. In the conclusion, future directions of the research are described and briefly explained.


Title of the Paper: REPTree and M5P for Measuring Fiscal Policy Influences on the Romanian Capital Market during 2003-2010

Authors: Mihaela Göndör, Vasile Paul Bresfelean

Pages: 378-386

Abstract: The present paper is an extension of our latest studies [44], in which we analyze the importance of fiscal policy influences on the Romanian capital market, among other factors, during 2003-2010. By employing data mining techniques in our research, such as regression and model trees, which outline the average daily trading on the Bucharest Stock Exchange (BVB), we assert that fiscal policy is a major factor influencing capital markets, its influence being found in the behavior of all factors mentioned as important for capital market strength, such as interest rates, inflation rates and exchange rates. Although most authors consider the influence of fiscal policy on the capital market to be undersized, we affirm that fiscal policy can be a successful instrument for alleviating business cycles.


Title of the Paper: Parameter-Driven Rapid Virtual Prototyping of Flexible Manufacturing System

Authors: Kwan Hee Han, Sung Moon Bae, Sang Hyun Choi, Geon Lee

Pages: 387-396

Abstract: Most enterprises are struggling to change their existing business processes into agile, product- and customer-oriented structures to survive in the competitive and global business environment. In order to sustain competitiveness, manufacturing organizations should provide sufficient flexibility to produce a variety of products on the same system. FMS (Flexible Manufacturing System) is regarded as one of the most efficient methods for reducing or eliminating today's problems in manufacturing industries. In order to cope with the current dynamic changes of manufacturing systems, it is essential to design and verify the layout of an FMS rapidly and easily during the design stage. Supervisory control patterns for material flow also need to be categorized for later reuse in control programs. It is also necessary that existing 3D layout components for simulation-based verification should be reusable for other FMS layout verification tasks to shorten the design time of an FMS. The purpose of this paper is to propose a tool for rapid parametric layout determination and construction of a 3D discrete-event simulation model, and a categorization of the control patterns of material flow within an FMS. To obtain a parameter-driven solution, the FMS is modularized by the 'station' concept and the resources within the FMS are standardized in this paper. This approach can serve as a rapid prototyping tool for layout configuration and control program preparation for FMS design engineers, as well as a communication tool for managers and other personnel involved.


Title of the Paper: Distribution of Number of Roots of Random Polynomial Equations in Small Intervals

Authors: E. Shmerling

Pages: 397-404

Abstract: The problem of finding the probability distribution of the number of zeros in some real interval of a random polynomial whose coefficients have a given continuous joint density function is considered. A new simulation algorithm for solving this problem is presented. The effectiveness of the presented algorithm for the case where the real interval is small is proved.
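A brute-force Monte Carlo baseline for the quantity studied here (the distribution of the number of real roots falling in a given interval), shown only to make the problem concrete and not as the paper's specialized algorithm, can be written as:

```python
import numpy as np

def root_count_distribution(sample_coeffs, a, b, trials=20_000, tol=1e-9):
    """Estimate P(number of real roots in [a, b] = k) for a random polynomial
    whose coefficient vector (highest degree first) is drawn by `sample_coeffs`."""
    counts = {}
    for _ in range(trials):
        roots = np.roots(sample_coeffs())
        real = roots[np.abs(roots.imag) < tol].real
        k = int(np.sum((real >= a) & (real <= b)))
        counts[k] = counts.get(k, 0) + 1
    return {k: v / trials for k, v in sorted(counts.items())}

# Example: cubic with i.i.d. standard normal coefficients, small interval [-0.1, 0.1]
probs = root_count_distribution(lambda: np.random.normal(size=4), -0.1, 0.1)
```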


Title of the Paper: Vensim PLE to Create Models for Water Management

Authors: Rui M. S. Pereira, Naim Haie, Gaspar J. Machado

Pages: 405-412

Abstract: This paper intends to show how easy it can be to build a prototype that helps water resources managers to make their decisions not only based on politics or economics, but also with a scientific tool that helps them build different weather scenarios. First we present a very simple mathematical model that has all the potential to evolve from version to version. Its implementation is easy, using a simulation platform called Vensim PLE. The philosophy that Vensim PLE follows to build simulation models is very interesting and simple. It is based on three main entities: container variables, auxiliary variables and fluxes. It is a pictorial language, and therefore it is quite easy to follow models; the details are hidden when you define these entities represented in a figure. We used a freeware version for students. An example showing a possible model of what happens in the vicinity of an urban area is presented, demonstrating that it is possible to use Vensim PLE to build rather complex simulation models. Provided we have good data, we are able to create different scenarios with literally a click of the mouse on our PC. This paper can help water managers to understand how to implement their own models using free software that is easy to use and produces clear graphs without much effort.


Issue 5, Volume 6, 2012


Title of the Paper: Analysis of Air Velocity Distribution in a Laboratory Batch-Type Tray Air Dryer by Computational Fluid Dynamics

Authors: D. A. Tzempelikos, A. P. Vouros, A. V. Bardakas, A. E. Filios, D. P. Margaris

Pages: 413-421

Abstract: Batch dryers are some of the most widespread equipment used for fruit dehydration. Nevertheless, the optimization of the air distribution inside the drying chamber of a batch dryer remains a very important point, due to its strong effect on drying efficiency as well as on the uniformity of the moisture content of the dried products. A new laboratory-scale batch-type tray air (BTA) dryer was designed, constructed and evaluated for the drying of several horticultural and agricultural products. The airflow field inside the dryer was studied through a commercial computational fluid dynamics (CFD) package. A three-dimensional model of the laboratory BTA dryer was created, and the steady-state incompressible Reynolds-Averaged Navier-Stokes equations that formulate the flow problem were solved, incorporating the standard and RNG k-ε turbulence models. In the simulation, the tray used inside the BTA drying chamber was modeled as a thin porous medium of finite thickness. The simulations for testing the chamber were conducted at an average velocity of 2.9 m/s at ambient temperature. The CFD models were evaluated by comparing the airflow patterns and velocity distributions to the measured data. Numerical simulations and measurements showed that the new laboratory-scale BTA dryer is able to produce a sufficiently uniform air distribution throughout the testing chamber of the dryer.


Title of the Paper: Software Development for Memorandum Report System in Monitoring Students

Authors: Anon Sukstrienwong

Pages: 422-429

Abstract: Nowadays, most teachers encounter problems in school related to the behavior of students. As behavior is involved in the development of skills and the ability to learn, many computer systems have been implemented to address this problem and ensure that students behave well and study efficiently. However, in Thailand, there are few computer and information systems implemented for school attendance. Therefore, a memorandum report system is developed as a case study to help teachers monitor student behavior in school. This system is capable of recording daily school attendance and the behavior of each student that may harm learning skills. The system, called the Memorandum Report System (MRS), was deployed at Bangban School as a case study. It has revealed some important information for monitoring students in school. The software evaluation of the satisfaction with and efficiency of the system indicated that our system works efficiently and supports all of the school's requirements.


Title of the Paper: A Delay-Differential Equation Model of Bone Remodeling Process: Effects of Estrogen Supplements

Authors: Wannapa Panitsupakamon, Chontita Rattanakul

Pages: 430-438

Abstract: We modify a mathematical model proposed for describing the bone resorption and bone formation processes, based on the effect of calcitonin, in order to investigate the effect of time delay on the bone remodeling process. The model is then analyzed by using the Hopf bifurcation theorem. Conditions on the system parameters are then derived so that a periodic solution can be assured. A computer simulation is carried out in order to support our theoretical predictions. Moreover, the effects of administering estrogen supplements in different manners are also investigated numerically.


Title of the Paper: Simulation Methods for the Heat Distribution Systems

Authors: L. Vasek, V. Dolinay

Pages: 439-446

Abstract: This study deals with the issues of heat consumption in an urban agglomeration and the problems linked with the modeling and simulation of systems providing heat supply. In this article we call such a system an SHDC (System of Heat Distribution and Consumption). The key problem in the control mechanism of such systems is the transport delay of the heat transfer medium. Therefore the control mechanism must operate in a predictive mode. The two control parameters, or control variables, are the temperature of the heat carrier and its flow rate. Their time behavior must be predicted for efficient control of the whole heat energy supply. The main focus is placed on the description of the proposed and implemented computer model of the heat distribution system in the selected agglomeration. This model is proposed as a discrete simulation model. Simulation is one of the methods that can be effectively used for analyzing the properties of large and complex dynamic systems such as municipal heating networks – the distribution and heat consumption system. The model is implemented in the form of a computer application and tested on real operational data.


Title of the Paper: Chromium VI Issue in Leather Waste – A Technology for the Processing of Used Leather Goods and Potential of Raman Spectroscopy in Chromium Traces Detection

Authors: Karel Kolomaznik, Michaela Barinova, Hana Vaskova

Pages: 447-455

Abstract: In this paper, the authors deal with the problem of chromium present in various products and materials, and the health and environmental risks that this chromium can represent if uncontrolled. Special emphasis is given to the various chrome tanned wastes generated by the leather industry. The agent that makes this waste potentially hazardous is hexavalent chromium. Its compounds can have negative effects on human health and some chromium VI salts are considered carcinogens. The authors present the risks of spontaneous oxidation of trivalent chromium to its hexavalent form under various conditions, as well as an analytical method for the detection of small concentrations of both forms of chromium using Raman spectroscopy. Raman spectra of hexavalent chromium and the simultaneous detection of the two valences of chromium obtained from leather samples are presented. Another important issue addressed in this paper is the technology for processing various kinds of chrome-tanned waste. From the technological point of view, there are several ways of handling primary leather waste, but no satisfactory technology has been developed so far for the secondary waste (manipulation waste, e.g. leather scraps, and used leather products). An innovative hybrid technology for processing the secondary waste, tested in industrial conditions, is presented; its application to used leather goods is discussed, as well as possibilities for commercial utilization of the products generated by the technology.


Title of the Paper: Potential of Tannery Fleshings in Biodiesel Production and Mathematical Modeling of the Fleshing Pre-Treatment

Authors: Karel Kolomaznik, Jiri Pecha, Michaela Barinova, Lubomir Sanek, Tomas Furst, Dagmar Janacova

Pages: 456-464

Abstract: Waste fat produced by tanneries during the processes in which raw hide is transformed into leather represents a very important raw material for biodiesel production. However, the acid number of this feedstock is usually well above the limit of 1 mg KOH/g, which makes it unsuitable for direct processing into biodiesel via alkali-catalyzed transesterification. An effective and economically viable pre-treatment method is needed to overcome this disadvantage while maintaining the advantage of the low price of the feedstock. In this paper, we present a pre-treatment method consisting of the refining (melting) and deacidification steps. The pre-treatment processes are part of a complex technology for biodiesel production from low quality fats and oils. Mathematical models of these processes have been elaborated for the optimization of the pre-treatment procedure, and a software application has been developed for the simulation of the deacidification step. The suitability and economic parameters of the pre-treatment technology for feedstocks with various free fatty acid contents are discussed. Finally, the potential of waste tannery fat in biodiesel production is briefly evaluated from the economic and raw-material-base points of view.
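For context, the acid number quoted above is the standard titration quantity (a textbook definition, not a result of the paper):
\[
\mathrm{AN} \;=\; \frac{56.1 \cdot c_{\mathrm{KOH}} \cdot V_{\mathrm{KOH}}}{m_{\mathrm{sample}}}
\quad \left[\frac{\text{mg KOH}}{\text{g}}\right],
\]
where \( V_{\mathrm{KOH}} \) is the titration volume in mL, \( c_{\mathrm{KOH}} \) the titrant concentration in mol/L, \( m_{\mathrm{sample}} \) the fat mass in grams, and 56.1 g/mol the molar mass of KOH. An acid number of 1 mg KOH/g corresponds to roughly 0.5 % free fatty acids expressed as oleic acid, which is why more acidic feedstocks promote soap formation in alkali-catalyzed transesterification.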


Title of the Paper: Simulation and Experiments on the Secondary Heat Distribution Network System

Authors: V. Dolinay, L. Vasek

Pages: 465-472

Abstract: This paper presents a simulation model and results obtained during practical experiments on the secondary network of a district heating system. It covers systems for urban locations, typically housing estates or groups of family houses, in which heat is distributed between the heat exchanger and the individual door stations. The experiment was based on similar-day identification and the subsequent prediction of recommendations for system control. The predicted values were applied in the real control process, the results obtained later were compared with the predicted courses, and the causes of the differences were analyzed. To separate deviations caused by imperfections of the model and methods from those caused by inaccurate weather forecasts, the simulations were repeated with an ideal course, i.e. the weather forecast was replaced by the measured course.
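The similar-day step can be sketched as a nearest-neighbour lookup over past days; the matching criteria and feature set below are illustrative assumptions, not the authors' exact method.

```python
import math

def predict_course(forecast_features, history):
    """Similar-day prediction sketch (illustrative only).

    history: list of (features, measured_course) pairs for past days, e.g.
    features = [mean outdoor temperature, weekday flag]. The course of the
    most similar past day is returned as the prediction for the coming day.
    """
    def distance(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    best = min(history, key=lambda day: distance(day[0], forecast_features))
    return best[1]

# Example with hypothetical data: forecast of -2 degC on a weekday.
history = [([-3.0, 1.0], [410, 395, 380]), ([8.0, 0.0], [210, 190, 185])]
print(predict_course([-2.0, 1.0], history))  # -> course of the colder, similar day
```

Replacing `forecast_features` with the measured weather of the target day reproduces the "ideal course" experiment mentioned in the abstract, isolating model error from forecast error.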


Title of the Paper: A Knowledge Mining Component for Computational Health Informatics

Authors: Nittaya Kerdprasop, Kittisak Kerdprasop

Pages: 473-482

Abstract: Computational health informatics is an emerging field of research focusing on devising novel computational techniques to facilitate healthcare and a variety of medical applications. Healthcare organizations today regularly generate huge amounts of data in electronic form stored in databases. These data are a valuable resource for the automatic discovery of useful knowledge, known as knowledge mining or data mining, to gain insight and to support patient-care decisions. During the past decades there has been an increasing interest in devising database and learning technologies to automatically induce knowledge from clinical and health data using imperative and object-oriented programming styles. In this paper, we propose a different scheme using declarative programming, implemented with functional and logic-based languages, which we argue to be more appropriate for knowledge-intensive tasks. Easy transfer of the induced knowledge into knowledge base content is demonstrated in this paper to confirm the appropriateness of the high-level declarative scheme. Our system includes three major knowledge mining tasks: data classification, association analysis, and clustering. We demonstrate knowledge deployment through trigger creation and the automatic generation of a knowledge base for the medical decision support system. These knowledge deployment examples illustrate the advantages of the high-level logic and functional scheme.
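The deployment idea above is that induced rules live as declarative knowledge rather than hard-coded procedures. The paper realizes this in functional and logic-based languages; the short Python sketch below only mirrors the idea (rules kept as data and applied uniformly), with hypothetical attribute names.

```python
# Illustrative sketch only: induced classification rules stored as data.
# Attribute names, values and labels are hypothetical, not from the paper.
RULES = [
    ({"blood_pressure": "high", "age_group": "senior"}, "high_risk"),
    ({"blood_pressure": "normal"}, "low_risk"),
]

def classify(patient, rules=RULES):
    """Return the label of the first rule whose conditions all match the record."""
    for conditions, label in rules:
        if all(patient.get(attr) == value for attr, value in conditions.items()):
            return label
    return "unknown"

print(classify({"blood_pressure": "high", "age_group": "senior"}))  # -> high_risk
```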


Title of the Paper: Declarative Parallelized Techniques for K-Means Data Clustering

Authors: Kittisak Kerdprasop, Surasith Taokok, Nittaya Kerdprasop

Pages: 483-495

Abstract: The k-means clustering algorithm is an unsupervised learning method for non-hierarchically assigning data points to groups. The k-means algorithm performs the data assignment and central point calculation steps iteratively until data points no longer move from one group to another. On clustering large datasets, the k-means method spends most of its execution time on computing distances between all data points and the existing central points. The distance computation of one data point is independent of the others; therefore, data parallelism can be achieved in this case, and it is the main focus of this paper. We propose parallel methods, including an approximation scheme, for k-means clustering. We then demonstrate the implementation of parallelism through the message passing model using a concurrent functional language, Erlang, and also through multi-threading using Prolog. Both Erlang and Prolog are declarative languages that efficiently support rapid prototyping. The experimental results of both parallelized implementations show a speedup in computation. The clustering results of the approximated parallel method are impressive in terms of its fast execution time.
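The data parallelism described above can be sketched as follows; the paper's implementations use Erlang message passing and Prolog multi-threading, so this Python multiprocessing version is only an illustration of the same idea applied to the assignment step.

```python
from multiprocessing import Pool

def nearest(args):
    """Assign one chunk of points to the index of its nearest centre."""
    points, centres = args
    def d2(p, c):
        return sum((pi - ci) ** 2 for pi, ci in zip(p, c))
    return [min(range(len(centres)), key=lambda j: d2(p, centres[j])) for p in points]

def parallel_assign(points, centres, workers=4):
    """Distance computations are independent per point, so chunks are scored in parallel."""
    chunk = (len(points) + workers - 1) // workers
    chunks = [points[i:i + chunk] for i in range(0, len(points), chunk)]
    with Pool(workers) as pool:
        parts = pool.map(nearest, [(c, centres) for c in chunks])
    return [label for part in parts for label in part]

if __name__ == "__main__":
    pts = [(0.0, 0.1), (0.2, 0.0), (5.0, 5.1), (5.2, 4.9)]
    print(parallel_assign(pts, centres=[(0.0, 0.0), (5.0, 5.0)], workers=2))  # -> [0, 0, 1, 1]
```

The centre-update step then aggregates the per-chunk sums, which is where an approximation scheme (e.g. sampling the points) can trade accuracy for speed.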


Title of the Paper: Comparison of Single-phase and Two-phase Flow Dynamics in the HLTP for Microalgae Culture

Authors: Ujjwal Kumar Deb, Kittisak Chayantrakom, Yongwimon Lenbury

Pages: 496-503

Abstract: This paper aims to show the dynamic behavior of a microalgae suspension in a Horizontal Loop Tubular Photobioreactor (HLTP). Two models, a single-phase flow model and a two-phase flow model, have been proposed, taking into account the light irradiance. The governing equations describing the single-phase flow are the continuity equation and the Navier-Stokes equation. The viscosity of the microalgae suspension is a function of the microalgae cell concentration, which varies in time. Using the governing equations of the single-phase flow together with the Cahn-Hilliard mass conservation equation, we can describe the dynamic behavior of the two-phase flow. The results obtained from both models are compared and are found to differ significantly. In the two-phase flow model, the mass transfer rate and the shear rate are higher than those obtained from the single-phase model.
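A generic diffuse-interface (Navier-Stokes/Cahn-Hilliard) formulation of the kind named in the abstract is given below for orientation; the authors' exact constitutive terms may differ.
\[
\nabla \cdot \mathbf{u} = 0, \qquad
\rho\!\left(\frac{\partial \mathbf{u}}{\partial t} + \mathbf{u}\cdot\nabla\mathbf{u}\right)
= -\nabla p + \nabla\cdot\!\left[\mu(c)\left(\nabla\mathbf{u} + \nabla\mathbf{u}^{\mathsf T}\right)\right] + \mathbf{F},
\]
\[
\frac{\partial c}{\partial t} + \mathbf{u}\cdot\nabla c = \nabla\cdot\bigl(M\,\nabla \phi\bigr), \qquad
\phi = f'(c) - \kappa\,\nabla^2 c,
\]
where \( c \) is the cell concentration (phase variable), \( \mu(c) \) the concentration-dependent viscosity, \( M \) the mobility, \( \phi \) the chemical potential and \( f(c) \) a double-well free energy; dropping the Cahn-Hilliard pair recovers the single-phase model.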


Title of the Paper: SEIQR Disease Transmission on GA-Network

Authors: W. Jumpen, S. Orankitjaroen, B. Wiwatanapathaphee, P. Boonkrong

Pages: 504-511

Abstract: This paper aims to present a local network model of Susceptible-Exposed-Infectious-Quarantined-Recovered (SEIQR) disease transmission taking into account the community structure of the population. The population structure is generated by a genetic algorithm based on the network modularity concept, in order to capture the heterogeneity of the population. The basic reproductive number of the model is derived and used to predict the epidemiological situation. In the numerical simulations, disease transmission within and across communities is considered. The results show that this approach is able to capture the essential features of epidemic spreading in a human community.
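The paper's model runs on a GA-generated contact network; the simplified mean-field compartmental form below only illustrates the SEIQR transitions, not the network dynamics themselves:
\[
\dot S = -\beta \frac{S I}{N}, \quad
\dot E = \beta \frac{S I}{N} - \sigma E, \quad
\dot I = \sigma E - (\gamma + q) I, \quad
\dot Q = q I - \gamma_Q Q, \quad
\dot R = \gamma I + \gamma_Q Q,
\]
with transmission rate \( \beta \), incubation rate \( \sigma \), quarantine rate \( q \) and recovery rates \( \gamma, \gamma_Q \). For this simplified form the basic reproductive number is \( R_0 = \beta/(\gamma + q) \), the threshold above which an outbreak can spread; the network version derived in the paper additionally reflects the community structure.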


Issue 6, Volume 6, 2012


Title of the Paper: Learning the Value of a Function from Inaccurate Data with Different Error Tolerance of Data Error

Authors: K. Khompurngson, B. Novaprateep, D. Poltem

Pages: 513-520

Abstract: The aim of a learning problem is to identify the best predictor from given data. Specifically, the well-known hypercircle inequality was applied to kernel-based machine learning when data is known exactly. In our previous work, this led us to extend it to the circumstance in which data is known only within error. In this paper, we continue the study of this subject by improving the hypothesis of the nonlinear optimization problem which is used to obtain the best predictor. In addition, we apply our results to the special problem of learning the value of a function from inaccurate data with different error tolerances of the data error.
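For orientation, the exact-data hypercircle inequality referred to above can be stated as follows (a standard formulation; the paper's contribution concerns its extension to inaccurate data). In a Hilbert space, suppose \( \|x\| \le \rho \) and \( \langle x, x_i\rangle = d_i \) for \( i = 1,\dots,n \), and let \( c \) be the minimal-norm element consistent with the data. Then for every \( z \) in the hypercircle \( \{ z : \|z\| \le \rho,\ \langle z, x_i\rangle = d_i \} \),
\[
\bigl|\langle z, x_0\rangle - \langle c, x_0\rangle\bigr|
\;\le\;
\operatorname{dist}\bigl(x_0, \operatorname{span}\{x_1,\dots,x_n\}\bigr)\,\sqrt{\rho^2 - \|c\|^2},
\]
so \( \langle c, x_0\rangle \) is the best possible estimate of the unknown functional value from the given information.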


Title of the Paper: Function Representation Using Hypercircle Inequality for Data Error

Authors: D. Poltem, K. Khompurngson, B. Novaprateep

Pages: 521-528

Abstract: In this paper, the study of the hypercircle inequality for data error (Hide) is briefly reviewed. Within the framework of Hide, the midpoint algorithm is provided to find a function representation from inaccurate data. We give a new result for a function representation that has the form of the representer theorem. We illustrate some important facts for practical computation and study the problem of learning the value of a function for a learning kernel. We demonstrate the potential of this framework by comparing our result to the regularization method, which is the standard method for learning the value of a function. The presented example compares the performance of the methods when the optimal values of the regularization parameters are used.
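The representer-theorem form mentioned above is, generically (this is the classical statement, not the paper's new result),
\[
f(x) \;=\; \sum_{i=1}^{n} c_i\, K(x, x_i),
\]
i.e. the estimated function is a finite kernel expansion over the data points, with the coefficients \( c_i \) determined from the (inaccurate) data; broadly speaking, the midpoint algorithm in the Hide setting selects the midpoint of the interval of function values consistent with the data and the error bound.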


Title of the Paper: Methods for Airport Terminal Passenger Flow Simulation

Authors: Gabor Kovacs, Istvan Harmati, Balint Kiss, Gabor Vamos, Peter Maraczy

Pages: 529-541

Abstract: Increased air traffic has also caused a major rise in passenger flow at airport terminals. In order to provide efficient and comfortable service at airports, passenger flow has to be improved, which has to be based on the analysis of simulation results. This paper presents an evaluation of two methods for simulating the passenger flow of an airport terminal. The terminal is decomposed into several zones, referred to as cells, each having its own behavior. Passenger flow between these cells is defined as a directed graph. The paper presents a difference equation based store-and-forward model and a Petri net based model for the simulation of passenger flow. The principles of passenger flow modeling with the two methods are presented, and detailed descriptions of typical cell models are given for both approaches. The methods are evaluated on the simulation of a small-scale example. Based on the results, a comparison of the two methods is given and a conclusion is drawn.
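A generic store-and-forward balance of the type such models build on (the cell-specific service models in the paper refine it) is
\[
x_i(k+1) \;=\; x_i(k) \;+\; T\!\left[\sum_{j} q_{ji}(k) \;-\; \sum_{j} q_{ij}(k)\right],
\]
where \( x_i(k) \) is the number of passengers stored in cell \( i \) at step \( k \), \( T \) the sampling period, and \( q_{ij}(k) \) the flow from cell \( i \) to cell \( j \) along an edge of the directed graph, limited by the downstream cell's service rate and remaining capacity.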


Title of the Paper: Polymer Fluidity Influenced by the Percentage of Filler

Authors: M. Stanek, D. Manas, M. Manas, K. Kyas, V. Senkerik, A. Skrobak, J. Navratil

Pages: 542-549

Abstract: Delivery of polymer melt into the mold cavity is the most important stage of the injection molding process. This paper shows the influence of cavity surface roughness and technological parameters on the flow length of polymers in the mold cavity. Application of the measurement results may have a significant influence on the production of the shaping parts of injection molds, especially by changing the processes used so far and substituting them with less costly production processes, which might increase the competitiveness of tool producers and shorten the time between the product plan and its implementation.


Title of the Paper: Comparison of Different Rapid Prototyping Methods

Authors: M. Stanek, D. Manas, M. Manas, J. Navratil, K. Kyas, V. Senkerik, A. Skrobak

Pages: 550-557

Abstract: Rapid prototyping technologies for the easy production of prototypes, parts and tools are new methods which are developing remarkably quickly. Successful product development means developing a product of high quality, at the lowest cost, in the shortest time, and at a reasonable price. The development of a part and its introduction to the market is a time-consuming process; since "time is money", savings are greatest when the time to market is minimized as much as possible. The main objective of this article is to give a basic introduction to this issue and to compare two different methods commonly used for the production of prototype parts, especially in terms of cost, time consumption, and the final mechanical properties of the produced model.


Title of the Paper: Gate Location and Cooling System Optimization

Authors: V. Senkerik, M. Stanek, M. Manas, D. Manas, A. Skrobak, J. Navratil

Pages: 558-565

Abstract: There are ever higher quality and productivity requirements on plastic products, mainly in the automotive industry. This research paper deals with the construction solution of an injection mold for a specific product in the automotive industry. It examines four designed gate positions and materials with different filler content and filler shape, and their influence on deformation. The differences in deformation between the individual versions are rather significant. Different layouts of drilled cooling channels and their influence on deformation are also compared. The analysis results show that these parameters influence mainly the production and also the deformation. Suitable use of these parameters can improve quality, i.e. lower the product deformation.


Title of the Paper: Microhardness of HDPE Influenced by Beta Irradiation

Authors: M. Ovsik, D. Manas, M. Manas, M. Stanek, K. Kyas, M. Bednarik, A. Mizera

Pages: 566-574

Abstract: Hard surface layers of polymer materials, especially HDPE, can be formed by chemical or physical processes. One of the physical methods of modifying the surface layer is radiation cross-linking. The radiation doses used were 0, 33, 66, 99, 132, 165 and 199 kGy for HDPE. The individual radiation doses caused structural and micromechanical changes which have a significant effect on the final properties of the tested HDPE. The radiation doses cause changes in the surface layer which make the values of some material parameters rise. The improvement of the micromechanical properties was measured by an instrumented microhardness test.


Title of the Paper: Microhardness of PA6 Influenced by Beta Low Irradiation Doses

Authors: M. Ovsik, D. Manas, M. Manas, M. Stanek, S. Sanda, K. Kyas, M. Reznicek

Pages: 575-583

Abstract: This experimental study deals with the effect of modifying the surface layer by irradiation cross-linking on the mechanical properties of PA6, tested using the instrumented microhardness test. The surface layer of a PA6 specimen made by injection molding was modified by irradiation cross-linking using beta irradiation, which significantly influences the mechanical properties of the surface layer. Compared to the heat and chemical-heat treatment of metal materials (e.g. hardening, nitridation, case hardening), cross-linking in polymers affects the surface in micro layers. These mechanical changes of the surface layer are observed in the instrumented microhardness test. Our research confirms that the surface layer of irradiated PA6 has properties comparable to those of highly efficient polymers. The subject of this research is the influence of the irradiation dose on the changes in the mechanical properties of PA6.


Title of the Paper: Properties of HDPE after Radiation Cross-Linking

Authors: A. Mizera, M. Manas, Z. Holik, D. Manas, M. Stanek, J. Cerny, M. Bednarik, M. Ovsik

Pages: 584-591

Abstract: Radiation processing involves the use of natural or manmade sources of high energy radiation on an industrial scale. The principle of radiation processing is the ability of high energy radiation to produce reactive cations, anions and free radicals in materials. The industrial applications of the radiation processing of plastics and composites include polymerization, cross-linking, degradation and grafting. Radiation processing mainly involves the use of either electron beams from electron accelerators or gamma radiation from Cobalt-60 sources. The high density polyethylene (HDPE) tested showed significant changes in temperature stability and mechanical properties after irradiation. From this point of view, new applications could also be seen in areas with service temperatures higher than its former melting point. The comparison of the temperature stability and mechanical properties of irradiated and non-irradiated HDPE is presented in this paper.


Title of the Paper: Properties of Selected Polymers after Radiation Cross-Linking

Authors: A. Mizera, M. Manas, Z. Holik, D. Manas, M. Stanek, J. Cerny, M. Bednarik, M. Ovsik

Pages: 592-599

Abstract: Radiation processing involves the use of natural or manmade sources of high energy radiation on an industrial scale. The principle of radiation processing is the ability of high energy radiation to produce reactive cations, anions and free radicals in materials. The industrial applications of the radiation processing of plastics and composites include polymerization, cross-linking, degradation and grafting. Radiation processing mainly involves the use of either electron beams from electron accelerators or gamma radiation from Cobalt-60 sources. The thermoplastic elastomer (TPE-E), low density polyethylene (LDPE) and polyamide 6 (PA6) tested showed significant changes in temperature stability and mechanical properties after irradiation. From this point of view, new applications could also be seen in areas with service temperatures higher than their former melting points. The comparison of the temperature stability and mechanical properties of irradiated and non-irradiated TPE-E, LDPE and PA6 is presented in this paper.


Title of the Paper: Measuring of Temperature and Pressure in Injection Mold

Authors: K. Kyas, J. Cerny, M. Stanek, M. Manas, D. Manas, V. Senkerik, A. Skrobak

Pages: 600-607

Abstract: Injection molding is one of the most widespread technologies in polymer processing, and it has many advantages in the rubber industry too. This article compares results from temperature/pressure sensors in the real process with flow analyses in computational software. The data obtained should be helpful and advantageous for the polymer industry, especially the rubber industry: the production cycle of a product can be shortened with the right process settings. Using sensors and flow analysis can be a good way to set the process correctly.