International Journal of Computers and Communications

E-ISSN: 2074-1294
Volume 6, 2012



Issue 1, Volume 6, 2012 


Title of the Paper: Tele-market Modeling of Fuzzy Consumer Behavior

Authors: R. Basha, J. Ameen

Pages: 1-8

Abstract: Like every other aspect of human life on our planet, the way we live has been greatly influenced by the advancement of internet technology over the past decades, even changing the socio-political structure of nations, as can clearly be seen in the Middle East and North African countries. In specific terms, it has influenced the way we conduct marketing, extending it beyond traditional boundaries in terms of both business actions and consumer behavior. With ease of access and the vast amount of information available on the internet, consumers are spoilt for choice. These changes have increased the challenges that both consumers and business people face manifold. Survival on both sides requires more information in order to build new models that can respond to these challenges and support the decision-making process, making it more efficient. This paper attempts to assess the fuzzy actions of buying behavior from a multinational viewpoint, using data collected on consumer risk assessment when attempting to buy a product on the internet. The survey covered a sample of 270 male and female participants of different nationalities with different levels of income and education. In a hierarchical modeling attempt, logistic regression is used to identify factors that are significant in the purchasing action, leading to a sensitivity of 0.94 and a specificity of 0.81. The significant components from the first-stage model are used as inputs to AnswerTree, specifically a Classification and Regression Tree (CRT) model, to formulate what-if decision scenarios that help decision makers improve their targeting and selling options. The sensitivity and specificity of the latter approach were 0.93 and 0.83 respectively, and it led to a set of decision rules, each with a probability of success ranging from 0.67 to 1. None of the factors of gender, geographic location or income was significant in the process.
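
For illustration, the sensitivity and specificity reported above are standard confusion-matrix quantities. The sketch below shows the calculation with hypothetical counts chosen only so that they reproduce the first-stage figures; it is not the authors' data or code.

```python
# Illustrative only: sensitivity/specificity from confusion-matrix counts.
# The counts below are hypothetical, not taken from the paper.
def sensitivity_specificity(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)   # true-positive rate
    specificity = tn / (tn + fp)   # true-negative rate
    return sensitivity, specificity

print(sensitivity_specificity(tp=94, fn=6, tn=81, fp=19))  # -> (0.94, 0.81)
```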


Title of the Paper: Will Cloud Computing Change Standards in IT Service Management?

Authors: Marc Jansen

Pages: 9-16

Abstract: One of the latest hypes in IT is the well-known Cloud Computing paradigm, which emerged in recent years as a paradigm for the dynamic use of computational power, memory and other computational resources. With respect to hypes, the author strongly believes that the Cloud Computing paradigm has the potential to survive the hype and become a common technology for the provision of IT-based services. It will therefore be necessary to deploy Cloud Computing based infrastructures in a professional, stable and reliable way. This leads to the idea that the Cloud Computing paradigm needs to be considered from an IT Service Management perspective, since cloud-based infrastructures have to be managed differently from a conventional infrastructure. Based on the IT Infrastructure Library (ITIL) as the de facto standard for IT Service Management, this paper discusses whether this de facto standard is also able to manage Cloud Computing based infrastructures, how the corresponding processes might change, and whether ITIL supports a division of labor between the customer and the service provider of a Cloud Computing based infrastructure.


Title of the Paper: Designing Systems for Control and Verifying the Authenticity of Products Using RFID Technology

Authors: Eleonora Tudora, Adriana Alexandru, Marilena Ianculescu

Pages: 17-25

Abstract: In contrast to the typical use of Radio Frequency IDentification (RFID) technology today in warehouse management and supply chain applications, the focus of this paper is an overview of the structure of RFID systems. It also presents a solution based on the application of RFID for brand authentication, traceability and tracking, implemented as a production management system whose use is extended to traders.


Title of the Paper: Extending Kosovo Civil Registry String Searching Algorithm

Authors: Blerim Rexha, Valon Raça, Agni Dika

Pages: 26-34

Abstract: Offering e-Government services to citizens is linked primarily to civil registry data. Searching for a citizen’s data in the civil registry is a common service carried out by string search algorithms using unique keywords such as the citizen’s name and surname. The similar pronunciation of some Albanian consonants challenges searches on citizens’ data whose names are pronounced alike despite different spelling. This paper presents a novel approach for extending a string searching algorithm based on Albanian names in the Kosovo Civil Registry. The paper compares the results of Levenshtein distance, American Soundex and the extended Soundex algorithm in a database of 271,000 citizens of the Prishtina municipality. The extended algorithm accommodates basic pronunciation rules of the Albanian language, and its accuracy and efficiency are better than those of Levenshtein distance and American Soundex.
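
For illustration, the sketch below shows the two baseline measures the paper compares (edit distance and a phonetic key). The Albanian letter groups used in the key are illustrative assumptions, not the authors' exact pronunciation rules.

```python
# Minimal sketch: Levenshtein distance plus a Soundex-like phonetic key.
def levenshtein(a: str, b: str) -> int:
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[j - 1] + 1, prev[j - 1] + (ca != cb)))
        prev = cur
    return prev[-1]

def phonetic_key(name: str) -> str:
    # Collapse similarly pronounced letters/digraphs to one symbol (illustrative rules only).
    rules = [("xh", "gj"), ("ç", "q"), ("th", "dh"), ("rr", "r"), ("ll", "l")]
    key = name.lower()
    for src, dst in rules:
        key = key.replace(src, dst)
    return key

# Names spelled differently but pronounced alike get the same key.
print(levenshtein("Çlirim", "Qlirim"))                    # 1
print(phonetic_key("Çlirim") == phonetic_key("Qlirim"))   # True
```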


Title of the Paper: Steganographic Software: Analysis and Implementation

Authors: Akram M. Zeki, Adamu A. Ibrahim, Azizah A. Manaf

Pages: 35-42

Abstract: Steganography is the method of hiding data in such a way that no one except the sender and the intended recipient suspects the existence of the hidden data. The goal is always to conceal the very existence of the secret data embedded in innocent-looking data, in such a way that it is undetectable and robust, and the cover data can accommodate a high capacity of secret data. Unfortunately, these goals are not commonly met by most techniques. This paper studies different steganographic techniques and undertakes an experiment using five steganographic software packages in order to explore their capabilities. A benchmarking tool is used to identify different performance aspects of the steganographic techniques and software, such as visual quality, performance indices, memory requirements and the maximum capacity of each software package under study. Experimental results show that all the software under study performs above the optimal level, although some differences in features and capabilities were observed.


Title of the Paper: Connections among CRM, Cloud Computing and Trading Income of Selected Companies

Authors: Jan Němeček, Lucie Vaňková

Pages: 43-50

Abstract: This article focuses on the period 2007-2010, the time before and during the economic crisis. It examines how selected companies doing business in the Czech Republic use information technology and the Customer Relationship Management (CRM) business strategy, and how this affects their trading income. The article also answers the question of whether companies use complete CRM solutions, purchased and installed as software, or whether they use a CRM system only as part of services provided through Cloud Computing technology. It further looks for connections between implementing CRM and the trading income of companies. Apparently, also because the economic crisis is ending, CRM is beginning to be used more in the Czech Republic. The main goal of implementing CRM is to help a company increase the quality of its relationship and communication with customers. The article describes the types of Cloud Computing and the most common definitions of CRM. An analysis of the use of CRM and Cloud Computing in the selected companies, grouped by number of employees, is presented, and the trading incomes of the companies grouped by number of employees are then compared. The article ends with a summary of the analysis results and assumptions and the identified contributions of CRM and Cloud Computing.


Title of the Paper: Teachers’ Professional Development in Free Software for Education in Taiwan

Authors: Jui-Chen Yu, Hung-Jen Yang, Lung-Hsing Kuo, Hsieh-Hua Yang

Pages: 51-59

Abstract: The purpose of this study was to identify the status of promoting teachers' use of free software as an educational resource. In this information age, using computer software to support learning has become a reality. On campus, teachers lead our students to learn, and teachers' knowledge of applying computer software in education initiates and guides the next generation's use of information technology. There is a need to understand how the education system helps teachers learn up-to-date information about educational software. An investigation method was applied in this study. The population consisted of 7,540 courses offered from 2002 to July 2011, of which 343 courses were sampled to reach a 95% confidence level with a 5% confidence interval. Statistical tests were applied to the investigation results, and conclusions on the research problems were drawn from them. An increasing frequency of promotional courses is identified, and the life cycle of courses offered for learning about free software is also described.


Title of the Paper: Predicting the Next Page that Will be Visited by a Web Surfer Using Page Rank Algorithm

Authors: D. Ciobanu, C. E. Dinuca

Pages: 60-67

Abstract: Predicting the next page to be visited by a web user with increasing accuracy has many important applications, such as caching and prefetching web pages to improve navigation speed, or building recommendation systems that help users find what they are looking for on the site more quickly. We have created a Java program, using the NetBeans IDE, that calculates the probability of visiting pages using the PageRank algorithm and link counting. For exemplification we used the NASA log file available online at http://ita.ee.lbl.gov/html/contrib/NASA-HTTP.html and a log file from a commercial web site, http://www.nice-layouts.com. We first applied the program to the entire data set of sessions and obtained the probabilities of visiting the pages. We then applied the program only to the subset of sessions that contain the current page. For the data obtained from the NASA log files, prediction improved from 19.75% to 32.5%. For the data obtained from the log files of the commercial site, the improvement was smaller, from 74.66% to 77.77%. In the conclusions we explain these differences in improvement between the two cases.
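
For illustration, the sketch below shows rank-based next-page prediction of the kind described: build a link-count transition graph from the sessions that contain the current page, run a plain PageRank iteration, and recommend the highest-ranked successor. It is a minimal Python sketch under these assumptions, not the authors' Java implementation.

```python
from collections import defaultdict

def pagerank(graph, d=0.85, iters=50):
    """Plain weighted PageRank over a graph {src: {dst: link_count}}."""
    nodes = set(graph) | {v for outs in graph.values() for v in outs}
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iters):
        new = {n: (1 - d) / len(nodes) for n in nodes}
        for src, outs in graph.items():
            if outs:
                share = d * rank[src] / sum(outs.values())
                for dst, cnt in outs.items():
                    new[dst] += share * cnt
        rank = new
    return rank

def predict_next(sessions, current_page):
    # Restrict to sessions containing the current page, as in the second pass above.
    graph = defaultdict(lambda: defaultdict(int))
    for s in sessions:
        if current_page not in s:
            continue
        for a, b in zip(s, s[1:]):
            graph[a][b] += 1
    ranks = pagerank(graph)
    successors = graph.get(current_page, {})
    return max(successors, key=lambda p: ranks.get(p, 0.0), default=None)

sessions = [["/home", "/docs", "/faq"], ["/home", "/faq"], ["/docs", "/faq"]]
print(predict_next(sessions, "/home"))  # -> "/faq"
```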


Title of the Paper: An Application for Clickstream Analysis

Authors: C. E. Dinucă

Pages: 68-75

Abstract: In the Internet age, enormous amounts of data are stored every day. Nowadays, using data mining techniques to extract knowledge from web log files has become a necessity. The behavior of Internet users can be found in the log files stored on Internet servers. Web log analysis can help businesses built around a Web site by learning user behavior and applying this knowledge, for example to direct users to pages that other users with similar behavior have visited. The extraction of useful information from these data has proved very useful for optimizing Web sites, promotional marketing campaigns, and so on. In this paper I focus on finding associations as a data mining technique to extract potentially useful knowledge from web usage data. I implemented, in the Java programming language using the NetBeans IDE, a program for identifying page associations from sessions. For exemplification, I used the log files from a commercial web site.
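
For illustration, the sketch below computes the kind of pairwise page association (support and confidence over sessions) described above. It is a minimal Python sketch under assumed session data, not the author's Java/NetBeans program.

```python
from itertools import combinations
from collections import Counter

def page_associations(sessions, min_support=0.3):
    """Return (page A, page B, support, confidence A->B) for frequent page pairs."""
    n = len(sessions)
    page_count, pair_count = Counter(), Counter()
    for s in sessions:
        pages = set(s)
        page_count.update(pages)
        pair_count.update(frozenset(p) for p in combinations(sorted(pages), 2))
    rules = []
    for pair, cnt in pair_count.items():
        support = cnt / n
        if support < min_support:
            continue
        a, b = tuple(pair)
        rules.append((a, b, support, cnt / page_count[a]))
    return rules

sessions = [["/home", "/products", "/cart"], ["/home", "/products"], ["/home", "/faq"]]
for rule in page_associations(sessions):
    print(rule)
```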


Title of the Paper: Interleaving Commands: a Threat to the Interoperability of Smartcard Based Security Applications

Authors: Maurizio Talamo, Maulahikmah Galinium, Christian H. Schunck, Franco Arcieri

Pages: 76-83

Abstract: Although smartcards are widely used, secure smartcard interoperability has remained a significant challenge. Usually each manufacturer provides a closed environment for its smartcard-based applications, including the microchip, associated firmware and application software. While the security of this “package” can be tested and certified, for example based on the Common Criteria, secure and convenient interoperability with other smartcards and smartcard applications is not guaranteed. Ideally one would have a middleware that can support various smartcards and smartcard applications. In our ongoing research we study this scenario with the goal of developing a way to certify secure smartcard interoperability in such an environment. Here we discuss and experimentally demonstrate one critical security problem: if several smartcards are connected via a middleware, it is possible that a smartcard of type S receives commands that were supposed to be executed on a different smartcard of type S’. Such “external commands” can interleave with the commands that were supposed to be executed on S. We demonstrate this problem experimentally with a Common Criteria certified digital signature process on two commercially available smartcards. Importantly, in some of these cases the digital signature processes terminate without generating an error message or warning to the user.


Title of the Paper: Improving Authentication and Transparency of e-Voting System – Kosovo Case

Authors: Blerim Rexha, Vehbi Neziri, Ramadan Dervishi

Pages: 84-91

Abstract: Authentication and privacy are central issues for the acceptance of any e-Voting system in particular and for the growth of e-Services in general. This paper aims (i) to analyze the appropriate architecture and propose a new, efficient architecture for an electronic voting system in Kosovo, and (ii) to analyze the threat vectors and their avoidance in such a system. The novelty of the implemented solution lies in the use of a dynamic queue list generated from voters' arrivals and identification at the polling station. The proposed architecture enables citizens to cast their vote at any polling station, in contrast to paper voting, where each citizen is tied to a predefined polling station. The national election commission configures the smart card, as part of the electronic voting infrastructure, to allow decryption of a number of records matching the number of voters in the final country-wide voting list. The communication between polling stations and the central server is encrypted with the server's public key stored in a digital certificate, and every cast vote is digitally signed with the ballot box's private key. The developed model is used to compare the costs and efficiency of e-Voting against the traditional paper-based voting system in Kosovo.


Issue 2, Volume 6, 2012


Title of the Paper: Results of the Implementation of IP Multimedia Subsystem in one Telecom Operator for the ITIL Incident Management and Problem Management Process

Authors: A. Tanovic, I. Androulidakis, F. Orucevic

Pages: 93-106

Abstract: This paper describes the implementation and use of the IP Multimedia Subsystem (IMS) in one Telecom operator in Bosnia and Herzegovina. The first part of the paper describes the design, implementation and testing of the IP Multimedia Subsystem in the Telecom operator. The second part describes the new organizational structure of the Telecom operator after the new IMS system was released into production. Measurements describing the implementation of the IMS system in the Telecom operator are carried out for two ITIL processes: Incident Management and Problem Management, which are integrated into the Service Desk function. Gap analysis is selected as the technique for these measurements. The final results show that Incident Management is implemented with 80% of the recommendations and Problem Management with 76% of the recommendations. These results show that improvement of the IMS system is needed and desirable.


Title of the Paper: Differences in Results of Measurement between ITIL 2007 and ITIL 2011 Model for the IMS System

Authors: A. Tanovic, I. Androulidakis, F. Orucevic

Pages: 107-118

Abstract: The aim of this paper is to present the differences between the 2007 and 2011 ITIL models. The IP Multimedia Subsystem is chosen as the test architecture for the implementation of these two models. The paper contains two different measurements. The first measurement considers only two parameters: the time needed for the implementation and the number of employees needed for the implementation; the results show that the 2011 model is 12% better than the 2007 model. The second measurement considers all Key Performance Indicators for 15 ITIL processes: Financial Management, Service Portfolio Management, Service Level Management, Capacity Management, Availability Management, IT Service Continuity Management, Information Security Management, Supplier Management, Change Management, Service Asset and Configuration Management, Release and Deployment Management, Service Validation and Testing, Incident Management, Problem Management and the Continual Service Improvement process. The results of this measurement show that the 2011 model is 10% better than the 2007 model.


Title of the Paper: Psyche Mining with PsycheTagger – A Computational Linguistics Approach to Text Mining

Authors: Ahsan Nabi Khan, Liaquat Majeed Sheikh, Summaira Sarfraz

Pages: 119-127

Abstract: The human elements of personality at work behind the creation of a write-up play an important part in determining the final dominant mood of a text. This article is a detailed description of formal research in Text Mining using purpose-built Computational Intelligence tools, PsycheMap and PsycheTagger. PsycheMap is created to classify documents based on emotive content, while PsycheTagger is the first semantic emotive statistical tagger for the English language. Working along the lines of statistical part-of-speech taggers, this tool is adapted to perform efficiently and accurately on emotive content. The tagger self-ranks its choices with a probabilistic score, calculated using the Viterbi algorithm run on a Hidden Markov Model of the psyche categories. The results of the classification and tagging exercise are critically evaluated on the Likert scale. These results strongly justify the validity and demonstrate the high accuracy of tagging with the probabilistic parser. Moreover, the six-step mining implementation provides a linguistic approach to modeling semi-structured semantic datasets for the classification and labeling of any set of meaningful conceptual classes in an English-language corpus.
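
For illustration, the sketch below runs the Viterbi algorithm over a hand-made HMM of hypothetical "psyche" categories. The states, vocabulary and probabilities are illustrative assumptions, not PsycheTagger's trained model.

```python
def viterbi(words, states, start_p, trans_p, emit_p):
    """Return the most probable state sequence for a word sequence."""
    V = [{s: (start_p[s] * emit_p[s].get(words[0], 1e-9), None) for s in states}]
    for w in words[1:]:
        V.append({
            s: max(
                ((V[-1][ps][0] * trans_p[ps][s] * emit_p[s].get(w, 1e-9), ps) for ps in states),
                key=lambda t: t[0],
            )
            for s in states
        })
    # Backtrack from the best final state.
    path, state = [], max(V[-1], key=lambda s: V[-1][s][0])
    for layer in reversed(V):
        path.append(state)
        state = layer[state][1]
    return list(reversed(path))

states = ["joy", "anger"]
start_p = {"joy": 0.6, "anger": 0.4}
trans_p = {"joy": {"joy": 0.7, "anger": 0.3}, "anger": {"joy": 0.4, "anger": 0.6}}
emit_p = {"joy": {"bright": 0.5, "day": 0.3}, "anger": {"storm": 0.5, "day": 0.2}}
print(viterbi(["bright", "day"], states, start_p, trans_p, emit_p))  # ['joy', 'joy']
```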


Title of the Paper: Bayesian-Based Instance Weighting For More Noise-Tolerant Instance-Based Learners

Authors: Khalil El Hindi, Bayan Abu Shawar

Pages: 128-139

Abstract: Instance-based learners are simple yet effective learners. They classify a new instance based on the k most similar instances, which makes them sensitive to noise in training data sets. Obtaining good classification accuracy may therefore require cleaning the data sets using labor-intensive or computationally expensive data cleaning procedures. In this work, we present some Bayesian-based instance weighting techniques to make such learners more tolerant to noise. The basic idea is that typical or classical instances should be given more weight or voting power than less typical or noisy instances. We present three techniques to determine instance weights that are based on the conditional probability of an instance belonging to its actual class rather than to another class. Our empirical results using the kNN algorithm show that all the presented techniques are effective in making kNN more tolerant to noise. These results suggest that the techniques can be used with instance-based learners instead of more expensive data cleaning procedures.
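
For illustration, the sketch below implements the underlying idea: each training instance is weighted by an estimate of the probability that it belongs to its own labelled class (here a simple naive-Bayes posterior, an assumption, not the authors' exact formulas), and the weights then scale its vote in kNN.

```python
import math
from collections import Counter, defaultdict

def bayes_weights(X, y):
    """Weight each instance by a naive-Bayes estimate of P(its own class | its attributes)."""
    n = len(y)
    prior = Counter(y)
    cond = defaultdict(Counter)               # (attr_index, class) -> Counter of attribute values
    for row, label in zip(X, y):
        for j, v in enumerate(row):
            cond[(j, label)][v] += 1
    weights = []
    for row, label in zip(X, y):
        scores = {}
        for c in prior:
            log_p = math.log(prior[c] / n)
            for j, v in enumerate(row):       # Laplace-smoothed conditional probabilities
                log_p += math.log((cond[(j, c)][v] + 1) / (prior[c] + len(cond[(j, c)])))
            scores[c] = log_p
        z = sum(math.exp(s) for s in scores.values())
        weights.append(math.exp(scores[label]) / z)   # posterior of the instance's own class
    return weights

def weighted_knn(X, y, w, query, k=3):
    """kNN on categorical data; each neighbour votes with its instance weight."""
    dist = [(sum(a != b for a, b in zip(row, query)), yi, wi) for row, yi, wi in zip(X, y, w)]
    votes = Counter()
    for _, yi, wi in sorted(dist)[:k]:
        votes[yi] += wi
    return votes.most_common(1)[0][0]

X = [["sunny", "hot"], ["sunny", "mild"], ["rain", "mild"], ["rain", "hot"]]
y = ["no", "no", "yes", "no"]
w = bayes_weights(X, y)
print(weighted_knn(X, y, w, ["rain", "hot"], k=3))
```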


Title of the Paper: A Realtime Filtering Method of Positioning Data with Moving Window Mechanism

Authors: Ha Yoon Song, Han-Gyoo Kim

Pages: 137-148

Abstract: Nowadays, advanced mobile devices can obtain their current position with the help of positioning systems such as GPS, GLONASS, Galileo, and so on. In addition, cellular network positioning based on base station locations and crowd-sourced Wi-Fi positioning approaches are available. However, positioning data sets usually contain erroneous data for various reasons, mainly due to environmental issues as well as inherent systematic issues. While doing research involving positioning data sets, the authors encountered quite a large number of erroneous positioning data points using Apple iPhone and Samsung Galaxy devices and thus needed to filter evident errors. In this paper, we suggest a relatively simple but efficient filtering method based on a statistical approach. From the user's mobile positioning data, in the form of < latitude; longitude; time > obtained by mobile devices, we can calculate the user's speed and acceleration. Using the idea of a sliding window (moving window), we can calculate statistical parameters from the speed and acceleration of the user's position data, so that filtering can be done with controllable parameters. We expect that the simplicity of our algorithm allows it to be applied on portable mobile devices with low computational power. As a possible enhancement of our method, we will focus on the construction of a more precise window for better filtering. A backtracking interpolation was added to replace erroneous data with proper estimates, giving a more precise estimate of the moving window. In the conclusion and future research section, we propose this filtering algorithm with interpolation as a basis for future investigation.
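
For illustration, the sketch below follows the idea described above: derive speed from successive &lt; latitude; longitude; time &gt; fixes, keep moving-window statistics of the speed, and drop fixes whose speed deviates too far from the window mean. The window size and threshold rule are assumptions, not the authors' parameters.

```python
import math
from collections import deque

def haversine_m(p, q):
    """Great-circle distance in metres between two (lat, lon, ...) fixes."""
    R = 6371000.0
    lat1, lon1, lat2, lon2 = map(math.radians, (p[0], p[1], q[0], q[1]))
    a = math.sin((lat2 - lat1) / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def filter_track(fixes, window=10, k=3.0):
    """fixes: list of (lat, lon, unix_time); returns the fixes judged plausible."""
    kept, speeds = [fixes[0]], deque(maxlen=window)
    for cur in fixes[1:]:
        prev = kept[-1]
        dt = cur[2] - prev[2]
        if dt <= 0:
            continue
        v = haversine_m(prev, cur) / dt
        if len(speeds) >= 3:
            mean = sum(speeds) / len(speeds)
            std = (sum((s - mean) ** 2 for s in speeds) / len(speeds)) ** 0.5
            if v > mean + k * std + 1.0:   # 1 m/s slack so a constant-speed window is not over-strict
                continue                   # evidently erroneous fix: drop it
        speeds.append(v)
        kept.append(cur)
    return kept

track = [(47.0, 8.0, 0), (47.0001, 8.0, 10), (47.0002, 8.0, 20),
         (47.0003, 8.0, 30), (47.5, 8.0, 40), (47.0004, 8.0, 50)]
print(len(filter_track(track)))  # 5: the impossible jump to latitude 47.5 is discarded
```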


Issue 3, Volume 6, 2012


Title of the Paper: Simulation Tools in Wireless Sensor Networks: Ant Colony Optimization of a Local Routing Algorithm

Authors: Nuria Gomez Blas, Luis Fernando de Mingo Lopez, Alberto Arteta Albert

Pages: 149-156

Abstract: A Wireless Sensor Network (WSN) consists of spatially distributed autonomous sensors that monitor physical or environmental conditions, such as temperature, sound, vibration, pressure, motion or pollutants, and cooperatively pass their data through the network. Network management of such a sensor network is a very big challenge. The fast-changing, ad hoc nature of the network also prevents the use of a centralized solution that decides the best route for packets while minimizing parameters such as congestion and load. Moreover, no single node can take on the job of centralized manager, due to the limited energy and processing capabilities of mobile nodes. This has resulted in the need for a distributed approach that requires only limited processing and power from the individual nodes but works towards the concerted goal of routing and network management. This paper proposes a routing algorithm based on ant colonies. Local routing, instead of storing the whole network graph, is more suitable for keeping track of the information going to a destination node. A testing environment has been established for a future simulation.
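
For illustration, the sketch below shows the local, per-node probabilistic next-hop rule and pheromone update that ant-colony routing typically uses; the pheromone and heuristic values are made up, and this is not the authors' simulation code.

```python
import random

def choose_next_hop(neighbors, pheromone, heuristic, alpha=1.0, beta=2.0):
    """Pick a neighbor with probability proportional to tau^alpha * eta^beta."""
    weights = [pheromone[n] ** alpha * heuristic[n] ** beta for n in neighbors]
    total = sum(weights)
    return random.choices(neighbors, weights=[w / total for w in weights])[0]

def reinforce(pheromone, path, deposit=1.0, evaporation=0.1):
    """Evaporate all pheromone, then reinforce the hops of a successful path."""
    for n in pheromone:
        pheromone[n] *= (1.0 - evaporation)
    for n in path:
        pheromone[n] += deposit

neighbors = ["B", "C", "D"]
pheromone = {"B": 1.0, "C": 0.5, "D": 0.2}
heuristic = {"B": 0.8, "C": 0.9, "D": 0.3}   # e.g. inverse distance or residual energy
hop = choose_next_hop(neighbors, pheromone, heuristic)
reinforce(pheromone, path=[hop])
print(hop, pheromone)
```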


Title of the Paper: Re-Cluster Node on Unequal Clustering Routing Protocol Wireless Sensor Networks for Improving Energy Efficient

Authors: Nurhayati

Pages: 157-166

Abstract: Clustering in wireless sensor networks (WSNs) is an important technique for increasing the lifetime of a wireless sensor network. Organizing wireless sensor networks into clusters enables the efficient utilization of the limited energy resources of the deployed sensor nodes. However, the problem of unbalanced energy consumption exists, and it depends primarily on the role and location of a node in the network. If the network is organized into heterogeneous clusters, it is important to ensure that the energy dissipation of the cluster head nodes is balanced. It should also be ensured that the hot spot problem does not occur, because in multi-hop routing nodes are burdened by relaying data to the base station. Since routing can prolong network lifetime, a Re-cluster Node on Unequal Clustering Routing Protocol for Wireless Sensor Networks for Improving Energy Efficiency is proposed. This routing adopts hierarchical and multi-hop communication before clustering is processed, which balances energy dissipation among the cluster head nodes and thus increases network lifetime. The distinguishing feature of this routing algorithm is its use of a node energy value (VN) equation, a formula for determining the highest-energy node. By choosing the node with the highest energy as the Cluster Head Leader Node and re-clustering within each Cluster Head (CH), the data is finally sent to the Cluster Head Leader Node (CHLN) and then to the Base Station (BS). After running the simulation and analyzing the results of the routing protocol, it was found that it prolongs network lifetime compared to the existing BCDCP, HEED and UCR routing protocols.


Title of the Paper: Secure and Reliable Communications for SCADA Systems

Authors: Jari Ahokas, Tewodros Guday, Teemu Lyytinen, Jyri Rajamäki

Pages: 167-174

Abstract: Uninterrupted electric power distribution is vital for modern society. Secure data transfer between the control center and power stations is critical for controlling and protecting power distribution. Supervisory Control and Data Acquisition (SCADA) systems are used for controlling the power stations. SCADA systems have traditionally used limited proprietary communication networks to transfer only control signals between centralized control systems and power stations. To improve the security and reliability of electrical power distribution, video surveillance is required at power stations and distribution centers. The telecommunication networks currently used for SCADA systems do not provide the capacity required for real-time video streaming, and a standard Internet connection does not offer the reliability and security required for SCADA communications. The Multi-Agency Cooperation In Cross-border Operations (MACICO) project aims to produce a new way of combining multiple telecommunication channels, such as TETRA, satellite and 2G/3G/4G networks, to create a single, redundant, secure and faster data transfer path for SCADA and video surveillance systems. The project relies on the Distributed Systems intercommunication Protocol (DSiP), which allows all kinds of telecommunication resources to be combined into a single, uniform and maintainable system. A project utilizing these new data transfer technologies is starting in Finland, which will demonstrate the usability and reliability of this new communication method.


Title of the Paper: Cloud Computing with SOA Approach as Part of the Disaster Recovery and Response in Finland

Authors: Jouni Lehto, Jyri Rajamäki, Paresh Rathod

Pages: 175-182

Abstract: The Rescue Services in Finland have a significant problem communicating with the other authorities who participate in the rescue process. The greatest challenge is the lack of shared programs, applications or other e-services that they can use to communicate with each other. Cloud computing combined with Service-Oriented Architecture (SOA) might be the answer to this problem. Several solutions and guidelines are available. This research paper explores which cloud computing deployment model and cloud service model could be suitable to address the problem along with a service-oriented approach. A further study was also conducted on cloud services provided by the Public Authority Network (VIRVE) in Finland. The cloud approach is compared with the System of Systems approach of the SPIDER (Security System for Public Institutions in Disastrous Emergency scenaRios) project, which emphasises enabling interoperable information sharing between public institutions for efficient disaster recovery and response. The paper presents a conceptual view of the usability of cloud and service-oriented computing in disaster recovery and response services in Finland.


Issue 4, Volume 6, 2012


Title of the Paper: Scanned Document Image Segmentation Using Back-Propagation Artificial Neural Network Based Technique

Authors: Nidhal Kamel Taha El-Omari, Ahmed H. Omari, Omran Fadeel Al-Badarneh, Hussein Abdel-Jaber

Pages: 183-190

Abstract: Document images are composed of graphics, pictures, text, and background with varying numbers of colors. Based on the detected number of colors contained in a document image, a new approach for document image segmentation and classification using an Artificial Neural Network (ANN) technique is proposed. The ANN is designed to learn how to recognize interesting color patterns from labeled document images; unlabeled document images are then classified according to these color patterns. This approach aims to segment the original image content into four consistent and homogeneous regions: picture, graphics, text, and background. In order to achieve better compression ratios, every component is compressed separately using the most appropriate compression technique.


Title of the Paper: Speaker Verification over MIMO-OFDM Systems based on Artificial Intelligence Techniques

Authors: Omar R. Daoud, Qadri J. Hamarsheh, Wael H. Al-Sawalmeh

Pages: 191-201

Abstract: In this work, an enhancement of previously published work on the use of automatic speaker verification (ASV) techniques in Beyond Third Generation (B3G) cellular systems is proposed. The new proposition addresses the effect of the Peak-to-Average Power Ratio (PAPR), a vital problem found in Orthogonal Frequency Division Multiplexing (OFDM) techniques, where a powerful combination of two main technologies, Multiple-Input Multiple-Output (MIMO) and OFDM, has been developed to meet rapidly increasing user demands such as ubiquitous transmission and new multimedia applications and wireless services. The work is divided into three main areas: first, reducing the ASV complexity by selecting the weights of the text-independent speakers based on a Self-Organizing Map (WSOM) Neural Network (NNT); second, using eigenvalue/eigenvector feature extraction as a pre-processing step to enhance orthogonality; and finally, proposing a new algorithm to combat the effect of PAPR in MIMO-OFDM systems. In this algorithm, wavelet techniques are used to denoise the OFDM symbols affected by high PAPR values. After that, based on an adaptive threshold method, the local maxima and minima are determined and replaced by the average of themselves and their surrounding neighbours; this is the Denoise OFDM and Replace PAPR (DORP) algorithm. The system performance is investigated using both a numerical method and MATLAB simulation, and a comparison with our previously published work is made to check the validity of the proposition. Although the achieved results show that the proposed work attains lower PAPR values, additional complexity is added to the transceiver structure. Moreover, compared with conventional systems, the bit error rate (BER) performance is improved for the same bandwidth occupancy. Our simulation results show that, depending on the system type, around 28% extra reduction in PAPR over current values in the literature can be achieved. Two different investigation and verification techniques are used in this work: a Gaussian mixture model based method (GMMWPE) and a K-Means clustering based method (KMWPE). Promising verification results are shown, with a verification rate of around 91% and good noise immunity.
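
For illustration, the sketch below shows only the peak-replacement step described for DORP: samples of a time-domain OFDM symbol whose magnitude exceeds an adaptive threshold are replaced by the average of themselves and their neighbours. The threshold rule and window size are illustrative assumptions; the wavelet denoising stage is omitted.

```python
import numpy as np

def clip_peaks(symbol, k=2.0, half_window=2):
    """Replace high-magnitude samples with a local average (illustrative DORP-style step)."""
    mag = np.abs(symbol)
    threshold = mag.mean() + k * mag.std()        # adaptive threshold derived from the symbol itself
    out = symbol.copy()
    for i in np.flatnonzero(mag > threshold):
        lo, hi = max(0, i - half_window), min(len(symbol), i + half_window + 1)
        out[i] = symbol[lo:hi].mean()             # average of the peak and its neighbours
    return out

rng = np.random.default_rng(0)
ofdm_symbol = rng.standard_normal(64) + 1j * rng.standard_normal(64)
ofdm_symbol[10] *= 8                              # inject one artificial high-PAPR sample
papr = lambda x: np.max(np.abs(x) ** 2) / np.mean(np.abs(x) ** 2)
print(papr(ofdm_symbol), papr(clip_peaks(ofdm_symbol)))  # PAPR before and after clipping
```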


Title of the Paper: Multi Standard Accreditation as Expert System Tool in Jordan University Hospitals

Authors: A. M. Abu-Arida, I. Salah, M. Alshraideh, F. Hayajneh

Pages: 202-217

Abstract: Accreditation is a process of evaluating business activities against a set of pre-determined standards. Hospitals and health care centers seek international and local accreditations to win the confidence of patients and gain a competitive edge in health services. They make great efforts to achieve an international accreditation certificate despite the many difficulties and pitfalls awaiting them along the way. Trial and error lead to a long path to success, meaning escalating costs and delays in gaining the many benefits of the certificate. The proposed expert system aims to provide medical professionals and organizations' administrative staff with the expertise needed to deal with the complicated information subtleties they face on a day-to-day basis in complying with standards, so that this esteemed accreditation can be achieved in a systematic and coherent manner. The methodology is distinguished from other systems by the flexibility of the expert system in selecting a specific standard (local or international), following up fault points, and analyzing results. This flexibility makes the settings of the evaluation process adaptable to the selected standard, and the standard itself can easily be changed as needed. Hence it is suitable both for direct clients (hospitals) and for indirect evaluator organizations. The proposed system is built in multiple phases; in the first phase, the HCAC standard is considered as a sample for the proposed system. PowerDesigner was used to design the entity relationships of the proposed system database, and Oracle Database, Developer 6i, Report Builder and Graphics were used to implement the proposed expert system. All these tools were utilized under the Microsoft Windows OS.


Title of the Paper: Total System Cost Analysis of Different Classes of Master-Slave Super-Hypercube Message Passing Architectures

Authors: Hamid Abachi

Pages: 218-227

Abstract: The need for high processing power, as well as advances in semiconductor technology, has resulted in the rapid development of High-Performance Message Passing Architectures (HPMPAs). This is evidenced by the excessive data processing requirements of today's advanced research topics, which are continuously on the rise; these areas include global warming, weather forecasting, and simulation of the performance, safety and reliability of nuclear weapons, to name a few. This is one of the main reasons why many researchers attempt to develop new architectures and, through modeling and simulation, try to justify improvements in areas such as performance, speed, fault tolerance and cost, which are the determining factors that identify the suitability or otherwise of a system for a given application. With this motivation in mind, the author has introduced a new architecture as part of the HPMPAs, coined the Master-Slave Multi-Super-Hypercube X-Tree architecture. For this architecture, the total system cost, obtained through mathematical modeling and simulation, is compared with the corresponding parameters of existing High-Performance Computing (HPC) systems. The result highlights the merits and demerits of the proposed architecture from a scientific research point of view.


Title of the Paper: An ARDL Analysis on the Internet Usage Rate and the Wealth of Malaysians

Authors: Fennee Chong, Venus Khim-Sen Liew

Pages: 228-235

Abstract: The objective of this study is to examine the relationship between gross national income per capita and the internet usage rate per 100 people in an emerging economy, Malaysia, in both the short run and the long run. The Autoregressive Distributed Lag (ARDL) modeling approach was used to analyze the thirteen-year time series collected for these two variables. Empirical findings from the econometric analysis show a significant long-run and short-run relationship between the two variables. Hence, this study concludes that enhancing internet usage across the public and private sectors is a pertinent strategy for Malaysia in achieving a higher wealth status for its people.
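
For illustration, the general ARDL(p, q) specification underlying this kind of analysis is written below, with y the income variable and x the internet usage rate; the lag orders and coefficients are generic, not the paper's estimates.

```latex
% General ARDL(p, q) form (illustrative, not the paper's estimated equation):
% y_t : gross national income per capita,  x_t : internet users per 100 people
\[
  y_t = \alpha_0 + \sum_{i=1}^{p} \phi_i\, y_{t-i}
        + \sum_{j=0}^{q} \beta_j\, x_{t-j} + \varepsilon_t
\]
```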


Title of the Paper: Real-Time Environmental Changes and Medical Risks Monitoring in a Context-Aware System, using Distributed Mobile Applications

Authors: Sveatoslav Vizitiu, Lazar Sidor

Pages: 236-244

Abstract: Personal Health Applications (PHA) are tools and services in the medical informatics domain which use information technologies to help individuals create their own personal health information. This paper introduces a personal health application as a distributed mobile application communicating with a context-aware system meant to collect environmental data, using sensors connected to an FPGA board. The physiological risk factor measurements are stored in a database running on a web server and then distributed to all the registered client applications running on mobile devices. The client applications receive updates from the server with the newly available information gathered from the context-aware system, showing the user the possible health risks identified. While a high-level overview of the entire context-aware system designed as a physiological risk factor measurement station is presented, this paper focuses on the data processing layer and on the software implementation of the client application running on mobile devices.


Title of the Paper: Social Media Risks from Forensic Point of View

Authors: Zsolt Nagy

Pages: 245-253

Abstract: In the age of Facebook, children are better computer and mobile phone users than their parents; we do not know any teenagers who do not have a social media profile, an email or an instant messenger account. It has become part of their everyday life; however, they do not care, and often do not even know, much about the other side of this social life. The Internet gives freedom to everyone, but in this great freedom we forget to teach our children and ourselves how to handle and protect sensitive information properly. In this article we focus on the risk of cyber crime against a single user who is not sufficiently careful to protect his or her information, and we show how a forensic expert, or even a cyber criminal, can use internet activity reconstruction tools. We undertook this research using a real criminal investigation example, found out what kind of information is collected and stored about a user by a client computer, and give some useful advice on how to protect ourselves against cyber criminals.


Title of the Paper: Natural Language based Human Computer Interaction: A Necessity for Mobile Devices

Authors: Emdad Khan

Pages: 254-264

Abstract: Human Computer Interaction (HCI) with any computing device is becoming more important with the rapid growth in the use of such devices worldwide. Computing devices range from computers (large, medium or small) to small devices such as mobile phones. Since mobile devices are ubiquitous and many computing functions are moving to mobile devices, the need for very good HCI is becoming even more important for such devices. While existing mechanisms are good for high-end mobile devices (e.g. smart phones and PDAs), they mainly serve technically literate people. Non-technical, semi-literate and illiterate people have great difficulty using existing interfaces, mainly because of the complexity of learning and the literacy required. Besides, many people cannot afford such high-end mobile devices. The User Interface (UI) for low-end and medium-end mobile devices is even more difficult, as such devices do not have some of the nice features that high-end devices have (e.g. touch screen, scrolling, larger screen/keypad). However, the complexity of learning is relatively lower, as such devices mainly have simple features and lack complex ones such as accessing the Internet and interacting with Internet applications. Clearly, to provide the key benefits of this Information Age to many more people, including the large Base of the Pyramid (BOP) population, low-end and medium-end mobile devices should have all the key features that high-end mobile devices, tablets and computers have, and those features should have a very good UI. Two technologies are essential to achieve these goals: a Natural Language Interface and an Intelligent Agent. In this paper, we propose a Natural Language Understanding (NLU) based Intelligent Agent (IA) that overcomes the existing problems by automating the key tasks while allowing a natural user interface using the user's voice (or typing). Our solution makes it much easier for non-technical, semi-literate and illiterate people, and also for technical people, to use mobile devices effectively, since the key tasks on a high-end mobile device now handled by manual scrolling and rendering by the user's eyes and brain are automated through the IA. We argue that such automation is, in fact, becoming a necessity when one wants to use various complex online applications, including e-Services, on low-end and medium-end mobile devices. We also argue that our proposed approach makes the UI much simpler, easier and more effective for all other devices, including high-end mobile devices, tablets and computers.


Title of the Paper: NFC Secured Online Transaction in Mobile Computing Business

Authors: Teddy Mantoro, Media A. Ayu, Goenawan Brotosaputro, Nur F. Ain, Noorzalina Ghazali

Pages: 265-272

Abstract: The Internet has grown and expanded at a rapid pace and has taken an important place in many aspects of our lives, including online payment for transactions. The success of NFC technology across a broad range of applications depends on its large-scale adoption by enterprises and consumers. Unfortunately, NFC security is still a major concern for business. This study proposes a secure mobile transaction model for any transaction using NFC technology. As proof of concept, a top-up printing system is developed, and an ACR 100 reader with an ACOS3 SIM card is used for interaction with the contactless interface. For security, the MD5 algorithm is implemented to process the system authorization. As a result of employing NFC technology, users no longer need to wait in a long queue: they just “touch” or “wave” at the nearest reader to top up.
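
For illustration, the sketch below shows an MD5-based authorization check of the kind the abstract mentions (hash comparison only). The message layout, field names and shared secret are illustrative assumptions, not the system's actual protocol.

```python
import hashlib

def authorize(card_id: str, challenge: str, received_digest: str, shared_secret: str) -> bool:
    """Recompute the MD5 digest of the authorization fields and compare with the received one."""
    expected = hashlib.md5(f"{card_id}:{challenge}:{shared_secret}".encode()).hexdigest()
    return expected == received_digest

secret = "top-up-secret"
digest = hashlib.md5("card42:nonce123:top-up-secret".encode()).hexdigest()
print(authorize("card42", "nonce123", digest, secret))  # True
```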