International Journal of Computers

ISSN: 1998-4308
Volume 6, 2012



Issue 1, Volume 6, 2012

Title of the Paper: Incremental Algorithms for Optimal Flows in Networks

Authors: Laura A. Ciupala

Pages: 1-8

Abstract: In this paper, we use incremental algorithms to save computational time when solving different network flow problems. We focus on two important network flow problems: the maximum flow problem and the minimum cost flow problem. Incremental algorithms are appropriate when we have a network in which an optimal flow (in our case, either a maximum flow or a minimum cost flow) has already been established, but the network must be modified by inserting a new arc or by deleting an existing arc. An incremental algorithm starts with an optimal flow in the initial network and determines an optimal flow in the modified network. First, we present incremental algorithms for the maximum flow problem. These algorithms were developed by S. Kumar and P. Gupta in 2003. They described algorithms for determining maximum flows in a network obtained from a given network, in which a maximum flow is already known, by inserting a new arc or deleting an existing arc. Finally, we describe our incremental algorithms for the minimum cost flow problem. Consider a network in which a minimum cost flow has already been established. We describe and solve the problem of establishing a minimum cost flow in this network after inserting a new arc and after deleting an existing arc. We focus on these problems because they arise in practice.
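
The core idea behind such an incremental update, namely that an existing maximum flow stays feasible after an arc insertion and only needs a few extra augmentations in the residual network, can be sketched as follows. This is an illustrative adjacency-matrix sketch of the general technique, not Kumar and Gupta's actual algorithm; the function names are ours.

```python
from collections import deque

def bfs_augment(cap, flow, s, t):
    """Find one augmenting path in the residual network by BFS;
    push its bottleneck flow and return it (0 if no path exists)."""
    n = len(cap)
    parent = [-1] * n
    parent[s] = s
    q = deque([s])
    while q:
        u = q.popleft()
        for v in range(n):
            if parent[v] == -1 and cap[u][v] - flow[u][v] > 0:
                parent[v] = u
                q.append(v)
    if parent[t] == -1:
        return 0
    # bottleneck residual capacity along the found path
    b, v = float('inf'), t
    while v != s:
        u = parent[v]
        b = min(b, cap[u][v] - flow[u][v])
        v = u
    # push b units along the path (negative reverse flow = residual capacity)
    v = t
    while v != s:
        u = parent[v]
        flow[u][v] += b
        flow[v][u] -= b
        v = u
    return b

def incremental_insert_arc(cap, flow, s, t, u, v, c):
    """Insert arc (u, v) with capacity c into a network whose maximum
    flow is already known, then restore maximality by augmenting in
    the residual network instead of recomputing from scratch."""
    cap[u][v] += c
    while bfs_augment(cap, flow, s, t):
        pass
    return flow
```

Because the old flow remains feasible, the incremental step only pays for the (usually few) augmenting paths created by the new arc, rather than for a full recomputation.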

Title of the Paper: High Performance Hardware Operators for Data Level Parallelism Exploration

Authors: Libo Huang, Zhiying Wang, Nong Xiao

Pages: 9-18

Abstract: Many microprocessor vendors have incorporated high-performance operators in a single instruction multiple data (SIMD) fashion into their processors to meet the high performance demands of increasing multimedia workloads. This paper presents recent work on the hardware implementation of these operators for data-level parallelism (DLP) exploration. Two general architectural techniques for designing operators with SIMD support are first described: a low-precision-based scheme and a high-precision-based scheme. Then new designs for integer operators as well as floating-point operators are provided to achieve the best tradeoff between cost and performance. To verify the correctness and effectiveness of these methods, a multimedia coprocessor augmented with SIMD operators is designed. The implemented chip successfully demonstrates that the proposed operators achieve a good tradeoff between cost and performance.
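
The partitioned ("subword") arithmetic that such SIMD operators implement in hardware can be illustrated in software with the classic carry-blocking trick, here performing four independent 8-bit additions inside one 32-bit word. This is an illustrative sketch of the general technique, not the paper's hardware design.

```python
def simd_add_8x4(a, b):
    """Four parallel 8-bit lane additions packed into one 32-bit word.
    The lane MSBs are masked off first so a carry out of bit 6 cannot
    propagate across a lane boundary; the MSBs are then patched by XOR
    (lane carry-out is discarded, i.e. each lane wraps modulo 256)."""
    H = 0x80808080              # most significant bit of each 8-bit lane
    low = (a & ~H) + (b & ~H)   # lane-local sums of the low 7 bits
    return (low ^ ((a ^ b) & H)) & 0xFFFFFFFF
```

In hardware the same effect is obtained by gating the carry chain of a single wide adder at the lane boundaries, which is why one adder can serve both full-width and SIMD operation modes.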

Title of the Paper: Performance Comparison of SVM and kNN in Automatic Classification of Human Gait Patterns

Authors: L. R. Sudha, R. Bhavani

Pages: 19-28

Abstract: Information fusion offers a promising solution to the development of a high-performance classification system. In this paper, multiple gait components such as spatial, temporal and wavelet features are fused to enhance the classification rate. Initially, background modeling is done from a video sequence and the foreground moving objects in the individual frames are segmented using the background subtraction algorithm. Then gait-representing features are extracted for training and testing the multi-class k-Nearest Neighbor (kNN) and multi-class support vector machine (SVM) models. We have successfully achieved our objective with only two gait cycles, and our experimental results demonstrate that the classification ability of SVM is better than that of kNN. The proposed system is evaluated using side-view videos of the NLPR database.
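
The majority-vote rule at the heart of the kNN classifier compared here can be sketched in a few lines. This is a generic illustration on toy feature vectors, not the authors' gait features or parameters.

```python
from collections import Counter
import math

def knn_predict(train, labels, x, k=3):
    """Classify feature vector x by majority vote among its k nearest
    training samples under the Euclidean distance."""
    nearest = sorted(range(len(train)),
                     key=lambda i: math.dist(train[i], x))[:k]
    votes = Counter(labels[i] for i in nearest)
    return votes.most_common(1)[0][0]
```

An SVM, by contrast, fits an explicit decision boundary at training time, which is one reason it can generalize better than the purely memory-based kNN when only a couple of gait cycles are available.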

Title of the Paper: Nonlocal Flexural Wave Propagation in an Embedded Graphene

Authors: S. Narendar, S. Gopalakrishnan

Pages: 29-36

Abstract: This paper presents the strong nonlocal scale effect on the terahertz flexural wave dispersion characteristics of a monolayer graphene sheet embedded in an elastic medium. The graphene is modeled as an isotropic plate one atom thick. Chemical bonds are assumed to be formed between the graphene sheet and the elastic medium. The polymer matrix is described by a Pasternak foundation model. The elastic foundation is approximated as a series of closely spaced, mutually independent, vertical linear elastic springs, where the foundation modulus is assumed equivalent to the stiffness of the springs. The nonlocal governing equation of motion is derived and wave propagation analysis is performed using spectral analysis. The present analysis shows that the flexural wave dispersion in graphene obtained by local and nonlocal elasticity theories is quite different. From this analysis we show that the elastic matrix strongly affects the flexural wave mode and rapidly increases the frequency band gap of the flexural wave. The nonlocal elasticity calculation shows that the wavenumber escapes to infinity at a certain frequency, and the corresponding wave velocity tends to zero at that frequency, indicating localization and stationary behavior. This behavior is captured in the spectrum and dispersion curves. It has been shown that the cut-off frequency of the flexural wave depends not only on the axial wavenumber but also on the nonlocal scaling parameter. The effect of the y-directional wavenumber and the nonlocal scaling parameter on the cut-off frequency is also captured in the present work.

Title of the Paper: Ext4 File System in Linux Environment: Features and Performance Analysis

Authors: Borislav Djordjevic, Valentina Timcenko

Pages: 37-45

Abstract: This paper considers the characteristics and behavior of the modern 64-bit ext4 file system under the Linux operating system, kernel version 2.6. It also provides a performance comparison of the ext4 file system with the earlier ext3 and ext2 file systems. The work involves a mathematical analysis of file system access times with and without the journaling option. Performance is measured using the Postmark benchmarking software, which simulates the workload of an Internet mail server. We have defined three types of workloads, generally dominated by relatively small objects. Test results have shown the superiority of the modern ext4 file system over its predecessors, the ext2 and ext3 file systems. Benchmark results are interpreted based on a mathematical model of file system access times.
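
A Postmark run of the kind described is driven by a short script of interactive commands defining the file population and transaction mix; a small-file workload might look like the following (illustrative values, not the exact workloads used in the paper):

```
set location /mnt/ext4test
set number 10000
set size 500 10000
set transactions 20000
run
quit
```

Postmark then reports aggregate throughput (files created, read, appended and deleted per second), which is what makes cross-file-system comparisons such as ext2 vs. ext3 vs. ext4 straightforward.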

Title of the Paper: Impacts of the Using of Advanced Technologies for Management

Authors: Jan Nemecek, Katerina Cebisova, Jan Hribik

Pages: 46-53

Abstract: This article describes the use of Advanced Manufacturing Technologies in companies doing business in the Czech Republic. The article examines the economic indicators of Net Profit, Sales, Equity, Assets, Added Value per Employee and Profit per Employee, which have been subjected to detailed examination over the period 2007-2010. The study was supported by research conducted on a sample of 131 companies. The collected data were explored mainly by correlation analysis. The aim of this article is to identify the relationships and dependence between economic indicators and the number of Advanced Manufacturing Technologies implemented in the companies. In addition, it was intended to study the level of contribution these technologies can bring to a company. Hypotheses about connections between the use of Advanced Technologies and Added Value per Employee and Profit per Employee were also tested. A low to moderate dependence has been established between the use of Advanced Technologies and the economic results of a company. Thus, Advanced Manufacturing Technologies can to some extent contribute to better economic results, but they also represent a great burden on the company's budget. Therefore, the use of these technologies should be properly considered and planned by management.

Title of the Paper: Easy Database Management in C++ Applications by Using DatabaseExplorer Tool

Authors: Peter Janku, Michal Bliznak

Pages: 54-62

Abstract: One of the most important tasks in an application development process is database management and data manipulation. Nearly every programming toolkit includes tools, constructs or components for manipulating database content and structure. DatabaseExplorer is an open-source cross-platform software tool for the C++ programming language and the wxWidgets toolkit which makes database structure management and data manipulation easy. It provides a simple and intuitive GUI for handling multiple different database engines and stored data. The application can be used as a stand-alone tool or as a plugin for the well-known CodeLite IDE. This paper describes the application, reveals its internal structure and the technologies used, and provides a simple use case scenario.

Title of the Paper: The Age-Distributions of Teachers between Prepared without and with Computer Literacy in Taiwan

Authors: Jui-Chen Yu, Lung-Hsing Kuo, Hung-Jen Yang, Hsueh-Chih Lin

Pages: 63-72

Abstract: The purpose of this study was to analyze the age distribution of secondary school teachers in Taiwan. Teachers' professionalism is fundamental to education, and teachers play an important role in the educational environment. As time elapses, teachers age; consequently, in-service teachers retire and must be replaced continuously. In this paper, secondary school is taken to include junior high school, senior high school and senior vocational school. A metadata analysis method was applied in this study to explore the age distribution of in-service teachers in secondary school; the research data were based on the yearbook of teacher education statistics published by the Ministry of Education, Taiwan, R.O.C. The research population is nationwide secondary school in-service teachers, numbering 94,168 in the year 2009. The research data were divided into two parts: first registered specialty in secondary education and first registered specialty in senior vocational education. Regarding teachers' computer literacy preparation, pre-service teachers have taken computer-related courses as part of their teacher education from 1990 to the present. The proportions of teachers that had taken computer courses are discussed. For the statistical analysis, normal distribution tests were used to analyze the data, including skewness, kurtosis, arithmetic mean, etc., to determine whether the age distribution is close to a normal distribution curve, is non-continuous, has notches, or is unbalanced. The age distribution of in-service teachers in secondary school is presented in the conclusions of the research.

Title of the Paper: New Technique towards Operator Independent Kidney Ultrasound Scanning

Authors: Wan M. Hafizah, Nurul A. Tahir, Eko Supriyanto, Adeela Arooj, Syed M. Nooh

Pages: 73-82

Abstract: Ultrasound imaging has been widely used for primary screening of the kidney, as it is non-invasive and affordable. Ultrasound can be used to measure the size and appearance of the kidneys and to detect tumors, congenital anomalies, swelling and blockage of urine flow. However, this scanning procedure is time-consuming because the ultrasound image is full of speckle noise. Thus, it is hard to detect the boundary of the kidney in the ultrasound image, even for trained sonographers. In addition, human error might occur during the interpretation of the ultrasound image by an untrained sonographer, especially when taking measurements. Therefore, in order to reduce dependence on the sonographer's expertise, image processing can be applied to automatically detect the centroid of the human kidney. The software, developed using MATLAB, consists of speckle noise reduction, a Gaussian filter, a texture filter and morphological operators, which were used for image segmentation in order to extract important features. The median filter was chosen as the speckle noise reduction technique, as it is faster and detects the kidney centroid better than the Wiener filter, the wavelet filter and the speckle reducing anisotropic diffusion (SRAD) filter. The software achieves up to 96.43% accuracy in detecting the centroid. The detected centroid can be implemented in existing ultrasound machines as a segmentation tool to reduce human error and time.

Title of the Paper: A New Approach for Computing Housing Tax Rates

Authors: Alaa M. Rial, Hazem M. El-Bakry, Gamal H. El-Adl

Pages: 83-92

Abstract: In this paper a new service for e-government is presented. A new technique for computing housing tax rates is introduced as an essential part of the framework of e-government. The proposed approach relies on integrating a geographic information system (GIS) and the global positioning system (GPS) to achieve an efficient decision support system (DSS) for the accurate calculation of housing tax rates. Furthermore, the design of a DSS framework for e-government is described. This approach is applied for the Egyptian Ministry of Finance in the form of an e-service. The aim is to construct a complete e-government system that facilitates e-services for its partners such as citizens, businesses and employees, as well as the government itself. Our proposed e-service clarifies the importance of the geographical criteria that affect the values of housing tax rates. By using this new e-service, the mission of the housing tax rates committee becomes easier, as the calculation of housing tax rates is done automatically. As a result, the time required to manually check and investigate all buildings and housing units in the country is reduced. The presented approach can be applied to computing any other types of taxes that depend on geographical position.

Title of the Paper: Probabilistic Predictive Monitoring with CHEERUP

Authors: Silvano Mussi

Pages: 93-102

Abstract: The paper presents CHEERUP: a general software environment for building, using and administering application-oriented probabilistic predictive monitoring systems (called "portals" in the paper). Such specific "portals" are used to monitor populations of subjects and obtain, for single subjects, probabilistic predictions about the occurrence of given undesired/desired events. Probabilistic predictive monitoring is a powerful tool for supporting decisions. It allows suitable measures to be taken in advance, measures aimed at preventing/favoring the occurrence of the undesired/desired event the application is centered on.

Issue 2, Volume 6, 2012

Title of the Paper: Multiresolution Surface Representation using Combinatorial Maps

Authors: M.-H. Mousa, M.-K. Hussein

Pages: 103-110

Abstract: Multiresolution surfaces are traditionally represented by data structures based on quadtrees, which are derived directly from the subdivision operations. However, these data structures suffer from a number of drawbacks. First of all, they are restricted to triangular or quadrilateral grids and must be developed specifically for each mesh type separately. Moreover, the time complexity of adjacency queries is not optimal. In this paper, we present a data structure for representing multiresolution meshes. This data structure extends the n-dimensional generalized maps. Given a 3D mesh, the proposed data structure is defined as a hierarchy of nested 2-dimensional maps, i.e. each level of detail of the mesh is defined as a separate 2-dimensional map. This inherits the generality and effectiveness of the original n-map data structure. Additionally, based on the multiresolution 2-map, we define the dual multiresolution representation, which allows the topological description of the traditional mesh operations. The multiresolution 2-map enables the representation of arbitrary meshes and supports the adaptivity and efficiency of the adjacency operators. We apply the proposed framework to the representation of progressive meshes.

Title of the Paper: A Colored Petri Net for the France-Paris Metro

Authors: A. Spiteri Staines

Pages: 111-118

Abstract: This work explains how colored Petri nets (CPNs) can be useful for modeling a real-life complex networked transport system. Petri nets are expressive graphical formalisms that are useful for modeling discrete system behavior. Petri nets are well documented and have an extensive body of well-researched areas and interesting applications. Colored Petri nets (CPNs) are very useful extensions of traditional Petri nets, increasing their expressivity and modeling power. An underground metro system is a good case of complexity. The real Paris Metro system in France, which forms part of the RER, is considered and analyzed, with part of it being modeled as a colored Petri net (CPN). The CPN used is as close as possible to the real-life system and is fully executable. Different scenarios and results are obtainable from this, apart from its usefulness for validation and verification. This paper thus assesses the suitability of colored Petri nets for modeling real-world, highly complex transport systems. The findings are briefly discussed.
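
The token-game semantics underlying such a model can be sketched generically: a transition is enabled when its input places hold enough tokens, and firing consumes tokens from input places and produces them in output places. This is a plain place/transition sketch of our own; a CPN additionally attaches data values ("colors", e.g. train identities) to the tokens.

```python
def enabled(marking, pre):
    """A transition is enabled if every input place holds enough tokens."""
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    """Fire a transition: consume tokens from the input places (pre) and
    produce tokens in the output places (post), returning the new marking."""
    if not enabled(marking, pre):
        raise ValueError("transition not enabled")
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m
```

In a metro model, places might represent platforms and track segments, transitions the departures and arrivals, and the colors would distinguish individual trains or lines.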

Title of the Paper: Developing a New Java Algorithm for Playing Backgammon

Authors: Manuela Panoiu, Caius Panoiu, Ionel Muscalagiu, Anca Iordan, Raluca Rob

Pages: 119-126

Abstract: A computer game is a very convenient form of recreation. Many algorithms have been implemented to simulate most classical games, and the complexity of the algorithms used in implementing games demands ever-increasing computer performance. The application presented in this paper is able to play backgammon. The software allows a game between two players and also a game between one player and the computer. A software module allows monitoring games over the network. All software programs were implemented in the Java language.

Title of the Paper: Generic Architecture for Knowledge Base Learning Control System

Authors: Aboubekeur Hamdi-Cherif

Pages: 127-136

Abstract: In order to be fully self-controlled, a system needs built-in knowledge, obtained by refining information as a further processing of data. These three components, i.e. data, information and knowledge, map to different hierarchical levels in machine intelligence, with knowledge representing the most complex form. In order to account for the diversified situations and issues encountered in control settings embracing both engineering and managerial issues, a generic architecture for a knowledge base learning control system (KBLCS) is proposed. Relying on the main technologies available to us, including Web, data mining and cloud computing technologies, we describe the proposed architecture for numeric/symbolic data processing, which is capable of addressing issues related not only to knowledge but also to its meaning, as used in characterizing the imprecision and incompleteness of the controlled plant, especially in decision support system (DSS) settings. Some task trade-offs within the control process are also considered.

Title of the Paper: Managing Architectural Design Decisions Documentation and Evolution

Authors: Meiru Che, Dewayne E. Perry

Pages: 137-148

Abstract: Software architecture can be considered a set of architectural design decisions (ADDs). Capturing and representing ADDs during the architecting process is necessary for reducing architectural knowledge evaporation. Moreover, managing the evolution of ADDs helps to maintain consistency between the requirements and the deployed system. In this paper, we create the Triple View Model (TVM) as a general architecture framework for documenting ADDs. The TVM clarifies the notion of ADDs in three different views and covers the key features of the architecting process. Based on the TVM, we propose a scenario-based methodology (SceMethod) to manage the documentation and evolution of ADDs. Furthermore, we develop a UML metamodel that incorporates evolution-centered characteristics to manage ADD evolution. We conduct a case study on an industrial project to validate the applicability and effectiveness of the TVM, the SceMethod and the UML metamodel. The results show that they provide complete documentation of ADDs for creating a system architecture, and support architecture evolution well under changing requirements.

Issue 3, Volume 6, 2012

Title of the Paper: Application of Fuzzy Rough Temporal Approach in Patient Data Management (FRT-PDM)

Authors: Aqil Burney, Zain Abbas, Nadeem Mahmood, Qamar Ul Arifeen

Pages: 149-157

Abstract: The management of fuzzy and vague information has long been a research problem for computer scientists, particularly in artificial intelligence and in relational and temporal databases. Fuzzy set theory has been widely used to address this problem. Rough set theory is a newer approach to dealing with uncertainty. After many years of rivalry between the two theories, many researchers have started working towards a hybrid theory. In this paper we discuss the fundamental concepts of fuzzy and rough set theories as well as their application in a temporal database model. We also present a conceptual model of fuzzy rough temporal data processing, along with a case study.
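
The rough set machinery such hybrid models build on reduces to lower and upper approximations of a target set with respect to an indiscernibility partition; the gap between the two approximations is exactly the "rough" (uncertain) region. A minimal generic sketch, not the paper's FRT-PDM model:

```python
def rough_approximation(partition, target):
    """Lower/upper approximation of a target set with respect to an
    equivalence partition (indiscernibility classes), per rough set theory.
    Lower: blocks certainly inside the target; upper: blocks that may be."""
    lower, upper = set(), set()
    for block in partition:
        if block <= target:        # block entirely inside the target
            lower |= block
        if block & target:         # block overlaps the target
            upper |= block
    return lower, upper
```

Elements in the upper but not the lower approximation form the boundary region, which is where fuzzy membership degrees (the other half of the hybrid) naturally come into play.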

Title of the Paper: Production-Ready Source Code Round-Trip Engineering

Authors: Michal Bližňák, Tomáš Dulík, Roman Jašek

Pages: 158-169

Abstract: Automated source code generation is often present in modern CASE and IDE tools. Unfortunately, the generated code often covers only basic application functionality and structure. This paper shows the principles and algorithms used in CodeDesigner RAD, an open-source cross-platform CASE tool developed at Tomas Bata University that is suitable for production-ready source code generation and reverse engineering. It allows users to generate complete C/C++ applications from a formal visual description, or to create UML diagrams from existing source code.

Title of the Paper: Applying Prototyping Method on Web-based Course Development

Authors: Jui-Chen Yu, Lung-Hsing Kuo, Hsieh-Hua Yang, Wen-Chen Hu, Hung-Jen Yang

Pages: 170-178

Abstract: The purpose of this study was to design a web-based in-service course for teachers on creating teaching material for Emerging Technology Learning, offered by the High-Scope Project. To cope with the new content brought by fast-advancing technology, the education system should provide ways of integrating new information about emerging technology into the curriculum, preparing students with up-to-date knowledge. There is a need for a course in which teachers review how to create teaching materials on emerging technology. A fast prototyping method was adopted for creating the course. To enable reuse of on-line course components, the SCORM packaging technique was applied to create courseware from PowerPoint files as SCORM packages. The Ministry of Education Digital Materials Evaluation Standard was used to verify the course. The standard requirements were met according to two evaluators' responses, and the reliability reached the 0.90 level. An on-line course was established on the Moodle platform and is presented in the conclusion.

Title of the Paper: A Novel Approach for Designing an E-Learning Pattern Language

Authors: Maysoon Aldhekhail, Mohammed Alawairdhi, Alaa Eldeen Ahmed, Azeddine Chikh

Pages: 179-186

Abstract: Recently, e-learning has become a very important topic, and researchers are focusing on improving it using both technology and pedagogical domain theories. Design patterns in e-learning are descriptions of good practice in e-learning. They provide a solution to a learning problem in such a way that designers can use the solution a million times over without ever doing it the same way twice. Experts from different disciplines are expected to use these patterns for different objectives related to their community. These objectives may require using several patterns together, which consequently form a pattern language that can help in solving a group of interrelated problems. In this paper we introduce a new mechanism for building a learning design pattern language for designers who use IMS-LD as a standard specification. All existing works consider the designer the main actor, and so they use a bottom-up approach to build the language. We add experts as another actor, and so we use a top-down approach to build the pattern language and a bottom-up approach to build the patterns. Results show that our approach yields more stable results.

Title of the Paper: Dictionary Based Compression for Images

Authors: Bruno Carpentieri

Pages: 187-195

Abstract: Lempel-Ziv methods were originally introduced to compress one-dimensional data (text, object code, etc.), but recently they have been successfully used in image compression. Constantinescu and Storer in [6] introduced a single-pass vector quantization algorithm that, with no training or previous knowledge of the digital data, was able to achieve better compression results than the JPEG standard while also having important computational advantages. We review some of our recent work on LZ-based, single-pass, adaptive algorithms for the compression of digital images, taking into account the theoretical optimality of this approach, and we experimentally analyze the behavior of this algorithm with respect to the local dictionary size and with respect to the compression of bilevel images.
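
The dictionary-based adaptation these algorithms rely on can be illustrated with a textbook LZW coder over a byte string: phrases seen once are added to the dictionary and reused thereafter. This is a one-dimensional sketch of the general Lempel-Ziv idea; the paper's algorithms extend it to two-dimensional image blocks.

```python
def lzw_compress(data):
    """Dictionary-based (LZW) compression of a byte string into code indices."""
    dictionary = {bytes([i]): i for i in range(256)}
    w, out = b'', []
    for byte in data:
        wc = w + bytes([byte])
        if wc in dictionary:
            w = wc                      # extend the current phrase
        else:
            out.append(dictionary[w])   # emit code for the known prefix
            dictionary[wc] = len(dictionary)  # learn the new phrase
            w = bytes([byte])
    if w:
        out.append(dictionary[w])
    return out

def lzw_decompress(codes):
    """Inverse of lzw_compress; rebuilds the dictionary on the fly."""
    dictionary = {i: bytes([i]) for i in range(256)}
    w = dictionary[codes[0]]
    out = [w]
    for code in codes[1:]:
        # the one special case: the code may refer to the phrase being built
        entry = dictionary[code] if code in dictionary else w + w[:1]
        out.append(entry)
        dictionary[len(dictionary)] = w + entry[:1]
        w = entry
    return b''.join(out)
```

The "local dictionary size" studied in the paper corresponds to bounding how large this phrase dictionary may grow before old entries are evicted.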

Issue 4, Volume 6, 2012

Title of the Paper: A Model for Determining How the Public Interprets Print Advertisements, by Means of a Smart Questionnaire

Authors: Ciprian-Viorel Pop, Diana-Aderina Moisuc, Nela Steliac, Anca-Petruta Nan

Pages: 197-205

Abstract: The efficiency of an advertisement resides in the meanings it conveys to the audience. Just like before launching a new product, preparing a new advertisement involves several rounds of testing the meanings that it transmits to the public. The validity of the meaning assessment tool is directly dependent on the interpretation theory that it is founded on. Along the lines of Relevance Theory, we believe that people filter from advertisements only what is relevant to them, i.e. whatever brings cognitive gains to them. Starting from the hypothesis that an individual’s interpretation of an advertisement is traceable in the cognitive changes that the advertisement determines in the individual, we devised an expert-system aided print advertising evaluation tool which assesses the existence of cognitive effects after viewing the advertisement. Although the interface looks like a questionnaire, its functions exceed by far those of classic questionnaires, as it is designed to assign numeric values to answers, add them up selectively, provide intermediate and final scores, as well as display final reports about advertising interpretations, cognitive effects, and assumptions that might have been changed by the advertisement. The paper presents the step-by-step construction of the expert system on an Exsys Corvid platform, and discusses the results of the first survey founded on the “smart” questionnaire.

Title of the Paper: Strategic and Tactical Success Factors in ERP System Implementation

Authors: K. Curko, D. Stepanic, M. Varga

Pages: 206-214

Abstract: In order to successfully implement an ERP system it is necessary to properly balance critical success factors. By researching what the critical success factors in ERP implementation are, why they are critical, and to what extent they are relevant to users, consultants and suppliers, this paper seeks to identify the critical success factors in ERP implementation and to understand the impact of each factor on the success of an ERP system introduction. This paper lists ten critical success factors (CSFs) from two points of view, strategic and tactical: top management support; a business plan and vision; a change management program; project management; implementation strategies; the project team; business process modeling and minimal customization; monitoring and performance evaluation; software development, testing and troubleshooting; and legacy systems.

Title of the Paper: XML-RPC vs. SOAP vs. REST Web Services in Java – Uniform using WSWrapper

Authors: Adina Ploscar

Pages: 215-223

Abstract: Every day more businesses migrate towards the web, offering on-line support for their clients. This support, delivered through web pages, must be linked with the companies' data servers. The solution for this is web services. With the advent of various web services, the problem of cross-library and cross-language compatibility also appears. The solution to this could be a wrapper which hides the implementation details of the various distributions. We call this wrapper WSWrapper. The scope of WSWrapper is to provide a unified interface by wrapping some existing and widely used libraries for XML-RPC, SOAP and REST web services. WSWrapper offers solutions in the four languages most used at the moment: Java, PHP, C#, and Python. In this paper, we consider WSWrapper from the Java point of view. We present the unified interface of WSWrapper for all three models, together with examples of a web service and a web service client.
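
The idea of one wrapper interface across transports can be sketched in Python with the standard library's XML-RPC client. This is a hypothetical mini-interface of our own for illustration, not WSWrapper's actual API.

```python
from abc import ABC, abstractmethod
import xmlrpc.client

class WebServiceClient(ABC):
    """One interface for all transports; concrete subclasses hide whether
    a call travels over XML-RPC, SOAP or REST."""
    @abstractmethod
    def connect(self, url): ...

    @abstractmethod
    def call(self, method, *args): ...

class XmlRpcClient(WebServiceClient):
    """XML-RPC backend; SOAP and REST backends would expose the same calls."""
    def connect(self, url):
        # ServerProxy is lazy: no network traffic until the first call
        self.proxy = xmlrpc.client.ServerProxy(url)

    def call(self, method, *args):
        return getattr(self.proxy, method)(*args)
```

Client code written against `WebServiceClient` is then unaffected by swapping the transport, which is precisely the cross-library compatibility problem the wrapper addresses.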

Title of the Paper: Analysis of the Fractal Structures for the Information Encrypting Process

Authors: Ivo Motyl, Roman Jasek, Pavel Varacha

Pages: 224-231

Abstract: This article is focused on the analysis of fractal structures for the purpose of encrypting information. For this purpose, principles of fractal orbits on fractal structures were used. The algorithm exploits the wide range of fractal sets and the speed of their generation. The system is based on polynomial fractal sets, specifically on the Mandelbrot set. The research also used the Bird of Prey fractal, Julia sets, the 4th Degree Multibrot, the Burning Ship and Water Plane fractals.

Title of the Paper: Advanced User Authentication Process Based on the Principles of 4th Degree Multibrot Fractal Structure

Authors: Ivo Motyl, Roman Jasek, Pavel Varacha

Pages: 232-239

Abstract: This article is focused on the authentication of users inside and outside information systems, for which hash functions are widely used. The proposed process is based on elements of fractal geometry. The algorithm exploits the wide range of fractal sets and the speed of their generation. The system is based on polynomial fractal sets, specifically on the 4th Degree Multibrot, and meets all the conditions for the construction of hash functions.
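
The general flavor of an escape-time-based hash can be illustrated with the degree-4 Multibrot iteration z -> z^4 + c. The construction below is a toy of our own, not the authors' scheme; a real authentication hash must additionally satisfy the cryptographic conditions the abstract refers to.

```python
def multibrot_escape(c, degree=4, max_iter=64):
    """Escape time of c under z -> z**degree + c (4th Degree Multibrot)."""
    z = 0j
    for n in range(max_iter):
        z = z ** degree + c
        if abs(z) > 2.0:
            return n
    return max_iter

def fractal_hash(data, digest_len=8):
    """Toy hash: derive a point c from each input byte and its position,
    then mix the Multibrot escape times into a fixed-length digest."""
    digest = [0] * digest_len
    for i, b in enumerate(data):
        c = complex((b - 128) / 128.0,
                    ((i * 37 + b) % 256 - 128) / 128.0)
        digest[i % digest_len] = (digest[i % digest_len] * 31
                                  + multibrot_escape(c)) % 256
    return bytes(digest)
```

The appeal of the approach is that the escape-time map is cheap to evaluate yet highly sensitive to the input-dependent choice of c near the fractal boundary.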

Title of the Paper: Open-Source Security Solution for False Antiviruses Removing

Authors: C. Pop, A. Naaji, M. Popescu

Pages: 240-247

Abstract: One of the goals of existing strategies in companies and organizations regarding the security of information systems is to protect computer networks against attacks caused by viruses. For the detection and, if necessary, removal of computer viruses, specialized companies are increasingly concerned with the development of advanced antivirus products. However, although efforts have been made in this regard, in recent years a new threat has appeared, related to false antivirus programs, for which there are still insufficient protection tools. The purpose of our software is to improve existing products in dealing with false antiviruses or false alerts, which may pose serious threats to computer safety. Once a computer is infected, these programs block access to the system, overwhelming the antivirus software. The application is based on the observation that the same false antivirus creates the same files; the file is encrypted and sent again "into the wild" as a 0-day malware file. We developed an open-source program that assists the existing security solution in cleaning the computer of false antiviruses or false alerts and provides an alternative for the user, combining commands from the terminal (command prompt) with a console application, which means it can be run directly from the console.

Title of the Paper: Secured Knowledge Sharing and Creation Using RBAC Model for e-Village

Authors: Wendi Usino, Teddy Mantoro, Media A. Ayu, Nazir Harjanto

Pages: 248-256

Abstract: This paper introduces a Role Based Access Control (RBAC) model for e-Village, for creating and sharing knowledge in a secure way. This includes secure information about marketing, to help the products become innovative and highly valuable. As a case study, the Pasir Region, East Kalimantan, includes 100 families who originally moved from Jepara in order to enhance their living standard, as encouraged by the Indonesian Government. Jepara is a very famous area for craft products and carved furniture. The problem is that these families had to change their livelihood from crafters to farmers, losing the use of their original skills. The paper also discusses and analyses secure knowledge sharing and knowledge creation using e-Village, not only to support market development but also to preserve the village culture, especially crafting and carving furniture. This study shows that the RBAC model of e-Village contributes significantly to the quality of knowledge for people in a remote area such as the Pasir Region village.
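
The RBAC check at the core of such a model is simple: permissions attach to roles, users acquire roles, and an access request succeeds only if some role of the user carries the permission. A generic sketch with hypothetical role and permission names, not the paper's exact policy:

```python
class RBAC:
    """Minimal role-based access control: user -> roles -> permissions."""
    def __init__(self):
        self.role_perms = {}   # role name -> set of permissions
        self.user_roles = {}   # user name -> set of roles

    def add_role(self, role, permissions):
        self.role_perms[role] = set(permissions)

    def assign(self, user, role):
        self.user_roles.setdefault(user, set()).add(role)

    def check(self, user, permission):
        # granted if any of the user's roles carries the permission
        return any(permission in self.role_perms.get(r, ())
                   for r in self.user_roles.get(user, ()))
```

Because policy lives in the role definitions rather than in per-user grants, a village portal can, for example, let crafters contribute and share knowledge while visitors only read it, and change that policy in one place.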