ISSN: 1998-4308
Year 2009
All papers of the journal were peer reviewed by two
independent reviewers. Acceptance was granted when both
reviewers' recommendations were positive.
Paper
Title, Authors, Abstract (Issue 1, Volume 3, 2009) |
Pages |
The Role of Avatars
with Facial Expressions to Communicate Customer
Knowledge
Mutlaq B. Alotaibi, Dimitrios I. Rigas
Abstract:
This paper describes a comparative evaluation study conducted to examine the impact of incorporating avatars with facial expressions into Electronic Customer Knowledge Management Systems (E-CKMS) on the usability of E-CKMS and on users' attitudes and knowledge. Although the implementation of E-CKMS encounters several challenges, such as lack of trust and information overload, few empirical studies have examined the role of audio-visual metaphors. Therefore, an empirical investigation was carried out by implementing an avatar-enhanced multimodal E-CKMS (ACKMS) and comparing it with a text-and-graphics E-CKMS (VCKMS) and another multimodal E-CKMS (MCKMS) that utilises speech, earcons and auditory icons. The three experimental systems were evaluated by three independent groups of twenty users each (n=60), who performed eight common tasks of increasing complexity designed around three different styles of Customer Knowledge Management (CKM). Results and analysis revealed that ACKMS outperformed MCKMS and VCKMS with regard to users' attitudes and knowledge.
|
1-10 |
Implementation of a
USN-based Disaster Prevention System in Korea
Dae-Hyun Ryu, Ho-Jun Na, Seung-Hoon Nam
Abstract:
The rapid economic rise of Korea has also led to the rapid development of infrastructure. As this infrastructure becomes more complex, it is becoming increasingly challenging to monitor appropriately. As a result, there have been several occurrences of preventable disasters and, if nothing is done, the frequency of such disasters is likely to increase. With wireless communication, sensor networks, and standards such as ZigBee becoming mainstream, it is now possible to implement disaster prevention systems for many applications. This paper proposes and designs an efficient ubiquitous sensor network (USN)-based disaster prevention system that monitors gas lines for leaks. Using our system, it is possible to monitor and control relevant facilities in real time. This
near-immediate reaction time will allow for the
evacuation of affected people and rapid emergency
response in the event of a leak, thereby saving
lives and preventing a disaster from occurring. This
system may be a key component of new government
policy that holds the safety of citizens in the
highest regard.
|
11-19 |
Eigenface-Gabor
Algorithm for Features Extraction in Face
Recognition
Gualberto Aguilar-Torres, Karina Toscano-Medina,
Gabriel Sanchez-Perez, Mariko Nakano-Miyatake,
Hector Perez-Meana
Abstract:
This paper provides a study of face recognition algorithms; several methods are used to extract a face-image feature vector that presents small inter-person variation. This feature vector is fed to a multilayer perceptron to carry out the face recognition or identity verification task. The proposed system consists of a combination of Gabor filters and Eigenfaces to obtain the feature vector. Evaluation results show that the proposed system provides robustness against changes in illumination, wardrobe, facial expression, scale, and position inside the captured image, as well as inclination, noise contamination and filtering. The proposed scheme also provides some tolerance to changes in the age of the person under analysis. Evaluation results using the proposed scheme in identification and verification configurations are given and compared with other feature extraction methods to show the desirable features of the proposed algorithm.
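A hedged sketch of one plausible reading of the Gabor-plus-Eigenfaces pipeline the abstract describes (the function names, kernel parameters, and FFT-based convolution are illustrative assumptions, not the authors' code):

```python
# Sketch: Gabor filtering followed by an eigenface (PCA) projection.
import numpy as np

def gabor_kernel(size=15, sigma=3.0, theta=0.0, lam=6.0):
    """Real part of a Gabor kernel with orientation theta."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(xr**2 + yr**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / lam)

def gabor_features(img, thetas=(0, np.pi/4, np.pi/2, 3*np.pi/4)):
    """Concatenate Gabor responses at several orientations (circular FFT convolution)."""
    img = np.asarray(img, dtype=float)
    feats = []
    for t in thetas:
        k = gabor_kernel(theta=t)
        pad = np.zeros_like(img)
        pad[:k.shape[0], :k.shape[1]] = k
        resp = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(pad)))
        feats.append(resp.ravel())
    return np.concatenate(feats)

def fit_eigenfaces(X, n_components=20):
    """PCA over stacked feature vectors (one row per training face)."""
    mean = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, vt[:n_components]

def project(x, mean, basis):
    # Compact feature vector that would be fed to the multilayer perceptron.
    return basis @ (x - mean)
```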
|
20-30 |
License Plate
Recognition Using a Novel Fuzzy Multilayer Neural
Network
Osslan Osiris Vergara Villegas, Daniel Gonzalez
Balderrama, Humberto de Jesus Ochoa Dominguez,
Vianey Guadalupe Cruz Sanchez
Abstract:
In this paper we present a proposal to solve the problem of license plate recognition using a three-layer fuzzy neural network. In the first stage, the plate is detected inside the digital image using rectangular perimeter detection and pattern matching. After that, the characters are extracted from the plate by means of horizontal and vertical projections. Finally, a fuzzy neural network is used to recognize the license plate. The tests were made in an uncontrolled parking-lot environment using Mexican and American plates. The results show that the system is robust compared to the systems reported in the literature.
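As an illustration of the projection-based character extraction step, a minimal sketch (the thresholds and names are assumptions, not the paper's implementation):

```python
# Segment characters from a binarized plate using row/column projections.
import numpy as np

def segment_characters(plate_bin):
    """plate_bin: 2D 0/1 array, 1 = character pixel. Returns per-character crops."""
    # Horizontal projection: locate the text band (rows containing ink).
    rows = plate_bin.sum(axis=1)
    band = np.flatnonzero(rows > 0.05 * rows.max())
    band_img = plate_bin[band[0]:band[-1] + 1]
    # Vertical projection: split the band at near-empty columns.
    cols = band_img.sum(axis=0)
    ink = cols > 0.05 * cols.max()
    chars, start = [], None
    for i, on in enumerate(ink):
        if on and start is None:
            start = i
        elif not on and start is not None:
            chars.append(band_img[:, start:i])
            start = None
    if start is not None:
        chars.append(band_img[:, start:])
    return chars
```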
|
31-40 |
Buffer Caching
Algorithms for Storage Class RAMs
Junseok Park, Hyunkyoung Choi, Hyokyung Bahn,
Kern Koh
Abstract:
Due to recent advances in semiconductor
technologies, storage class RAMs (SCRAMs) such as
FRAM and PRAM are emerging rapidly. Since SCRAMs are
nonvolatile and byte-accessible, there are attempts
to use these SCRAMs as part of nonvolatile buffer
caches. A nonvolatile buffer cache provides improved
consistency of file systems by absorbing write I/Os
as well as improved performance. In this paper, we
discuss the optimality of cache replacement
algorithms in nonvolatile buffer caches and present
a new algorithm called NBM (Nonvolatile-RAM-aware
Buffer cache Management). NBM has three salient
features. First, it separately exploits read and
write histories of block references, and thus it
estimates future references of each operation more
precisely. Second, NBM guarantees the complete
consistency of write I/Os since all dirty data are
cached in nonvolatile buffer caches. Third, metadata
lists are maintained separately from cached blocks.
This allows more efficient management of volatile
and nonvolatile buffer caches based on read and
write histories, respectively. Trace-driven
simulations show that NBM improves the I/O
performance of file systems significantly compared
to the NVLRU algorithm, a modified version of LRU that holds dirty blocks in nonvolatile buffer caches.
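As a toy illustration of the first feature, keeping separate read and write reference histories so each operation type is estimated from its own recency ordering (a hedged sketch, not the authors' NBM policy):

```python
from collections import OrderedDict

class SplitHistoryCache:
    """Toy cache with separate LRU histories for reads and writes."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.read_hist = OrderedDict()   # block -> None, in read-recency order
        self.write_hist = OrderedDict()  # dirty blocks, in write-recency order

    def access(self, block, is_write):
        # Move the block to the most-recently-used end of the right history.
        self.read_hist.pop(block, None)
        self.write_hist.pop(block, None)
        hist = self.write_hist if is_write else self.read_hist
        hist[block] = None
        while len(self.read_hist) + len(self.write_hist) > self.capacity:
            # Evict the LRU victim of the longer history; a real policy would
            # also weigh the cost of flushing dirty blocks to storage.
            victim = max((self.read_hist, self.write_hist), key=len)
            victim.popitem(last=False)
```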
|
41-52 |
Web Page Information Architecture Formalization Method and an Example
Yorinori Kishimoto
Abstract:
This paper proposes a method of formalizing Web page information architecture by regular expressions in order to check its structure. The method classifies the structural elements of a Web page into attributes on the basis of ideas from Web information architecture, and expresses them as two types of equations based on the F-shaped reading pattern. The method can verify the structure of the information architecture of a Web page. As a result, the method was able to identify missing and redundant structural elements in Web page information architecture.
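A minimal illustration of the general idea, checking a page's structure string against a regular expression (the attribute symbols and the pattern below are invented for this example, not the paper's equations):

```python
import re

# Hypothetical attribute symbols: H = header, N = navigation,
# C = content block, F = footer.
page_structure = "HNCCCF"

# One hypothetical well-formedness rule: a header, optional navigation,
# at least one content block, then a footer.
pattern = re.compile(r"^HN?C+F$")
print(bool(pattern.match(page_structure)))   # True: structure is well formed
```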
|
53-62 |
Increasing Level of
Correctness in Correlation with McCabe Complexity
Nicolae-Iulian Enescu, Dan Mancas, Ecaterina-Irina
Manole, Stefan Udristoiu
Abstract:
The scope of our research is finding a correlation
between the correctness indicator and the McCabe
complexity indicator for software programs. For
this, the correctness and McCabe complexity
indicators will be calculated for a simple program,
written in the C programming language. The computations will be made for each program version obtained by correcting the different error types found in the testing process. It will be observed that there is a close correlation between correctness and McCabe complexity, in the sense that an increase in the correctness level is accompanied by a significant increase in the complexity level.
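For reference, the McCabe indicator used in such studies is the standard cyclomatic complexity of the program's control-flow graph (a well-known definition, not taken from this paper):

```latex
M = E - N + 2P
```

where E is the number of edges and N the number of nodes of the control-flow graph, and P is the number of connected components (e.g. procedures); for a single routine this reduces to the number of decision points plus one.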
|
63-74 |
Adaptation of
Satellite Navigation for Pedestrians with Electronic
Compass
Krzysztof Tokarz, Michal Dzik
Abstract:
Despite constantly improving medical techniques, it is still impossible to cure many severe vision defects, which creates great demand for new techniques that could help visually impaired persons get through the chores of everyday life. GPS navigation is among the most valuable of these technologies, but currently available consumer GPS receivers do not offer good accuracy. At the Silesian University of Technology, research was carried out on improving the functionality and accuracy of navigation for blind persons. A GPS navigation device has been developed, additionally equipped with an electronic compass and an accelerometer to improve the accuracy of azimuth determination.
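The abstract does not give the fusion formula; a common tilt-compensation scheme that such a device might use (an assumption, not necessarily the authors' exact method) derives pitch and roll from the accelerometer and rotates the magnetometer reading into the horizontal plane:

```python
import math

def azimuth(ax, ay, az, mx, my, mz):
    """Accelerometer (ax..az) and magnetometer (mx..mz) in the device frame.
    Sign conventions vary between sensors; these are illustrative."""
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)
    # Project the magnetic vector onto the horizontal plane.
    xh = (mx * math.cos(pitch)
          + my * math.sin(roll) * math.sin(pitch)
          + mz * math.cos(roll) * math.sin(pitch))
    yh = my * math.cos(roll) - mz * math.sin(roll)
    return math.degrees(math.atan2(-yh, xh)) % 360.0
```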
|
75-84 |
Word Co-occurrence
Matrix and Context Dependent Class in LSA based
Language Model for Speech Recognition
Welly Naptali, Masatoshi Tsuchiya, Seiichi
Nakagawa
Abstract:
A data sparseness problem in language modeling often occurs in many language models (LMs). This problem is caused by the insufficiency of training data, which in turn gives infrequent words unreliable probabilities. Mapping words into classes gives infrequent words more reliable probabilities, because they can rely on other, more frequent words in the same class. In this research, we investigate a class LM based on latent semantic analysis (LSA). A word-document matrix is commonly used to represent a collection of text (corpus) in the LSA framework. This matrix records how many times a word occurs in a certain document; in other words, it ignores the word order in the sentence. We propose several word co-occurrence matrices that keep the word order. By applying LSA to these matrices, words in the vocabulary are projected into a continuous vector space according to their position in the sentences. To exploit these matrices, we also define a context dependent class (CDC) LM. Unlike a traditional class LM, the CDC LM distinguishes classes according to their context in the sentences. Experiments on the Wall Street Journal (WSJ) corpus show that the word co-occurrence matrix works 3.62%-12.72% better than the word-document matrix. Furthermore, the CDC improves the performance and achieves better perplexity than the traditional class LM based on LSA. When the model is linearly interpolated with the word-based trigram, it gives relative perplexity improvements of about 2.01% for the trigram model and 9.47% for the four-gram model against a standard word-based trigram LM.
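A hedged sketch of the general idea, window-based co-occurrence counts followed by a truncated SVD (the window size, left-context weighting, and names are assumptions; the paper's matrices preserve word order in their own way):

```python
import numpy as np

def cooccurrence_matrix(sentences, vocab, window=2):
    idx = {w: i for i, w in enumerate(vocab)}
    M = np.zeros((len(vocab), len(vocab)))
    for sent in sentences:
        for i, w in enumerate(sent):
            for j in range(max(0, i - window), i):   # only left context,
                M[idx[w], idx[sent[j]]] += 1         # so order is retained
    return M

def lsa_project(M, k=2):
    # Truncated SVD: each vocabulary word gets a continuous vector.
    U, S, _ = np.linalg.svd(M, full_matrices=False)
    return U[:, :k] * S[:k]

sentences = [["the", "cat", "sat"], ["the", "dog", "sat"]]
vocab = ["the", "cat", "dog", "sat"]
vectors = lsa_project(cooccurrence_matrix(sentences, vocab))
```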
|
85-95 |
Design and Development
of a Qualitative Simulator for Learning Organic
Reactions
Y. C. Alicia Tang, S. M. Zain, N. A. Rahman
Abstract:
Our work features an ontology-supported framework for developing a qualitative simulator that explains the behaviors of a selected sample of organic chemistry reactions. The design of the simulator uses Qualitative Reasoning (QR), and in particular Qualitative Process Theory (QPT), for constructing qualitative models and simulating basic steps in the chemical reactions, such as creating and deleting bonds. The qualitative simulator allows learners to access notions of how the behavior of chemical systems evolves in time. Students would benefit from it by improving their reasoning skills and enhancing their understanding of organic processes. The role of each functional component of the qualitative simulator is first introduced. Then, we move on to discuss the qualitative modeling and simulation design for reproducing the chemical behaviors of organic reactions. Finally, a discussion of the simulation results and the explanation generation capability is presented.
|
96-103 |
Validation Methods of
Suspicious Network Flows for Unknown Attack
Detection
Ikkyun Kim, Daewon Kim, Yangseo Choi, Koohong
Kang, Jintae Oh, Jongsoo Jang
Abstract:
Detection methods based on abnormal traffic behavior have a somewhat high false rate, and the accuracy of their signature generation is relatively low. Moreover, they are not well suited to detecting exploits and generating signatures for them. In this paper, we present the ZASMIN (Zero-day Attack Signature Management Infrastructure) system, which was developed for novel network attack detection. The system provides early warning at the moment attacks start to spread on the network, and blocks the spread of cyber attacks by automatically generating a signature that can be used by network security appliances such as an IPS. The system adopts various technologies for unknown network attack detection: suspicious traffic monitoring, attack validation, polymorphic worm recognition, and signature generation. In particular, the validation functions in ZASMIN have to be able to cover 1) polymorphism, that is, encrypted attack code at the penetration and operation steps, 2) executables, that is, binary functions at each step, and 3) malicious strings. We also introduce two concepts for validating the preprocessing of suspicious traffic: attack-based validation and signature-based validation. These validation functions can reduce the false rate of unknown attack detection. To check the feasibility of the validation functions in ZASMIN, we installed it in a real honeynet environment and analyzed the results of unknown attack detection. Even though the short analysis period is not long enough to detect a wide variety of unknown attacks, we confirmed that ZASMIN can detect some attacks without any well-known signature.
|
104-114 |
Multi-Hash based
Pattern Matching Mechanism for High-Performance
Intrusion Detection
Byoungkoo Kim, Seungyong Yoon, Jintae Oh
Abstract:
Many Network-based Intrusion Detection Systems (NIDSs) have been developed to date to respond to network attacks. As network technology presses forward, Gigabit Ethernet has become the de facto standard for large network installations, so software solutions for developing high-speed NIDSs are increasingly impractical. It thus appears well motivated to investigate hardware-based solutions. Although several solutions have been proposed recently, finding an efficient one is considered a difficult problem due to resource limitations, such as small memory size, as well as the growing link speed. We therefore propose an FPGA-based intrusion detection technique to detect and respond to variant attacks on high-speed links. It was designed to fully exploit hardware parallelism to achieve real-time packet inspection and to require only a small memory for storing signatures. The technique is part of our recently developed system, called ATPS (Adaptive Threat Prevention System). Above all, the proposed system has a novel content filtering technique called Table-driven Bottom-up Tree (TBT) for exact string matching. However, as the number of signatures to be compared grows rapidly, an improved mechanism is required. In this paper, we present a memory-efficient multi-hash based TBT technique. Simulation-based performance evaluations showed that the proposed technique used less than 20% of the on-chip SRAM required by the one-hash based TBT technique. Finally, experimental results show that our system delivers consistent performance across traffic levels, independent of the number of signatures applied.
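The TBT structure itself is the authors' design and is not reproduced here; purely as a generic illustration of why multiple hash functions save memory, a Bloom-style prefilter for signature fragments might look like this (all names and parameters are assumptions):

```python
import hashlib

class MultiHashFilter:
    """Set-membership prefilter: several hashes share one small bit array."""
    def __init__(self, nbits=1 << 16, nhashes=3):
        self.nbits, self.nhashes = nbits, nhashes
        self.bits = bytearray(nbits // 8)

    def _positions(self, fragment: bytes):
        for i in range(self.nhashes):
            h = hashlib.sha256(fragment + bytes([i])).digest()
            yield int.from_bytes(h[:4], "big") % self.nbits

    def add(self, fragment: bytes):
        for p in self._positions(fragment):
            self.bits[p // 8] |= 1 << (p % 8)

    def maybe_contains(self, fragment: bytes) -> bool:
        # False means "definitely absent"; True still requires exact matching.
        return all(self.bits[p // 8] >> (p % 8) & 1 for p in self._positions(fragment))
```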
|
115-124 |
A Proposed Model for
Individualized Learning through Mobile Technologies
Farhan Obisat, Ezz Hattab
Abstract:
Mobile Learning (mLearning) describes a new trend in learning that uses innovations like wireless communication, personal digital assistants, digital content from traditional textbooks, and other sources to provide a dynamic learning environment. By connecting people and information worldwide, the Internet has had a major impact on traditional education. Currently, students can easily access online courses at any time, anywhere in the world. Since the Internet has been adopted by students, traditional pedagogical models are no longer appropriate, and new pedagogical models are required. Such models should be student-centric, based on the individual student's learning expectations, styles, interests and abilities. In this paper, we first discuss these four dimensions and then introduce an individualized learning model that takes them into account. It covers 1) student learning styles, 2) student learning interests and 3) student devices, captured in personal profiles. The main objective is to help understand the behaviors of students and to materialize the concept of personalization.
|
125-132 |
A Fast Geometric
Rectification of Remote Sensing Imagery Based on
Feature Ground Control Point Database
Jian Yang, Zhongming Zhao
Abstract:
Building on the traditional design of ground control point databases, this paper develops a fast automatic correction method for satellite remote sensing imagery based on a feature ground control point database, which introduces local feature points as an effective supplement. The method aims to achieve automatic matching between feature ground control points and the original images that need geometric correction, and to improve the rectification process using the random sample consensus (RANSAC) algorithm. In this method, the authors realize the automatic extraction of feature ground control points, ensuring fast and precise geometric correction of a high volume of satellite remote sensing images by means of the feature ground control point database algorithm.
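As a sketch of the RANSAC step the abstract mentions, a generic loop fitting a transform to control-point matches (the affine model and all parameter values are assumptions, not the paper's configuration):

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares affine mapping src -> dst (Nx2 arrays, N >= 3)."""
    A = np.hstack([src, np.ones((len(src), 1))])
    params, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return params                      # 3x2 parameter matrix

def ransac_affine(src, dst, iters=500, tol=2.0, rng=np.random.default_rng(0)):
    best, best_inliers = None, 0
    for _ in range(iters):
        pick = rng.choice(len(src), size=3, replace=False)
        model = fit_affine(src[pick], dst[pick])
        pred = np.hstack([src, np.ones((len(src), 1))]) @ model
        inliers = np.linalg.norm(pred - dst, axis=1) < tol
        if inliers.sum() > best_inliers:
            best_inliers = inliers.sum()
            best = fit_affine(src[inliers], dst[inliers])  # refit on inliers
    return best
```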
|
133-142 |
Hierarchical
Denormalizing: A Possibility to Optimize the Data
Warehouse Design
Morteza Zaker, Somnuk Phon-Amnuaisuk, Su-Cheng
Haw
Abstract:
Two of the most common processes in the database design community are data normalization and denormalization, which play pivotal roles in the underlying performance. Today, data warehouse queries comprise a group of aggregation and join operations. As a result, the normalization process does not seem to be an adequate option, since several relations must be combined to answer queries that involve aggregation. Further, the denormalization process entails a wealth of administrative tasks, including documenting the structure of the denormalization assessments, data validation, and data migration scheduling, among others. It is the objective of the present paper to investigate the possibility that, under certain circumstances, the above-mentioned concerns do not provide justifiable reasons to ignore the benefits of denormalization. To date, denormalization techniques have been applied in several database designs, one of which is hierarchical denormalization. The findings provide empirical data showing that query response time is remarkably reduced once the schema is deployed with hierarchical denormalization on a large dataset with multi-billion records. It is thus recommended that hierarchical denormalization be considered a preferable method for improving query processing performance.
|
143-150 |
A Study on Industrial
Customers Loyalty to Application Service Providers –
The Case of Logistics Information Services
Cheng-Kiang Farn, Li Ting Huang
Abstract:
The growth of application service providers (ASPs) has been phenomenal in the application service industry worldwide. ASPs usually provide services composed of modularized, standard components, so customers can easily switch to another supplier based on a comparison of cost and benefit. Keeping a long-term relationship with industrial customers is therefore becoming an imperative strategy for ASPs in order to pursue more predictable sources of revenue and successive income streams. Yet the cost-effectiveness of ASPs is not a sufficient condition for ensuring business success. Cultivating relationship management, whose core concept is to enhance the loyalty of existing customers, has gradually become a critical issue for ASPs. This study investigates economic and psychological factors simultaneously and compares the subtle differences between their influences on customer loyalty. Empirical results from a questionnaire survey lead to several findings. Psychological factors are of relatively greater importance to loyalty formation than economic factors. Service quality both directly and indirectly affects customer loyalty. Trust affects loyalty mediated by affective commitment. The switching barrier affects continuance commitment, while positively moderating the relationship between service quality and loyalty; this finding is contrary to the literature. Moreover, the influences of affective and continuance commitment differ by business type. The findings reveal that psychological factors are also important to loyalty formation in a B2B environment, and firms could pay more attention to commitment. Implications and limitations are discussed.
|
151-160 |
Digital Steganalysis:
Computational Intelligence Approach
Roshidi Din, Azman Samsudin
Abstract:
In this paper, we present a consolidated view of digital media steganalysis from the perspective of computational intelligence. In our analysis, digital media steganalysis is divided into three domains: image steganalysis, audio steganalysis, and video steganalysis. Three major computational intelligence methods have also been identified in these steganalysis domains: Bayesian methods, neural networks, and genetic algorithms. Each of these methods has its own pros and cons.
|
161-170 |
An Empirical Analysis
of Relationship Commitment and Trust in Virtual
Programmer Community
Yu-Ren Yen
Abstract:
Virtual Communities (VCs) have become a forum for programmers seeking knowledge to resolve problems and communicate with each other. The Internet makes it relatively easy for participants to switch from one VC to another that provides similar content or services. However, many VCs have failed due to the reluctance of members to continue their participation. In volatile cyberspaces, VCs without specific domain knowledge may face challenges such as large populations, unstable memberships, and imperfect information and memory, which also affect knowledge flows among members. From the members' perspective, the most important aspects of VCs are increased satisfaction and engaged behavioral intention to use them, but satisfaction does not always predict continued usage. This study proposes a conceptual model based on commitment-trust theory (CTT) and investigates continuance intention in VCs. It seeks to theorize the antecedents and consequences of relationship commitment in VCs and to identify how CTT can be adapted to a knowledge sharing environment. The members of Programmer Club, a representative professional community in Taiwan, were chosen to participate in the survey, and 488 usable responses were collected over three months. Structural Equation Modeling (SEM) was used to test the model; the findings show that relationship commitment and trust are the strongest predictors of members' continuance intention. Implications are proposed in the final section.
|
171-180 |
Why Focal Firms Share
Information? A Relational Perspective
Chia-Chen Wang, Chun-Der Chen, Yu-Fen Chen,
Cheng-Kiang Farn
Abstract:
Supply chain management has become an important
issue for Taiwan’s manufacturing industry due to
escalating global competition. Virtual vertical
integration is an important issue in supply chain
management. Because organizations only have limited
resources, they pursue long-term partnership with
specific transaction partners. They share
information to improve visibility, speed responses
to markets, and reduce costs from information
distortion or information asymmetry. This study
empirically explores the factors affecting
inter-organizational information sharing from the
perspective of focal firms. One thousand questionnaires were administered to the top 1,000 manufacturing companies in Taiwan, yielding 139 valid responses. The results show that a partner's power, trust, and relation-specific asset investments positively affect inter-organizational information sharing. On
the other hand, the partner’s power does not
significantly affect the organization’s
relation-specific investments. This study further
investigates the moderating role of information
technology competence and trust. The result
indicates that when an organization has lower
information technology competence, the relationship
between the partner’s power and relation-specific
investments is significant. In addition, when the
focal firm has lower trust in the customer, there is
significant relationship between relation-specific
investments and information sharing. Implications
and discussion are then provided.
|
181-190 |
DEA-RTA: A Dynamic
Encryption Algorithm for the Real-Time Applications
Ahmad H. Omari, Basil M. Al-Kasasbeh, Rafa E. Al-Qutaish,
Mohammad I. Muhairat
Abstract:
The Internet and its applications demand a high level of Quality of Service (QoS), and most Internet applications seek to minimize packet delay, especially Real-Time Applications (RTAs). QoS is considered a major issue on the Internet: RTA services like IP telephony and XoIP have become successful businesses worldwide, where disrupted call distribution may result in large financial losses, and for this reason researchers have put effort into building applications that can deal with different levels of QoS. In addition to basic QoS, some customers ask for confidentiality to be preserved, which makes things more complicated and may result in higher delay. Delay is a very complex issue, especially in RTAs, and it consists of many types of delay, such as packetization delay (sampling, coder-decoder (codec), compression and encryption) and end-to-end delay (processing, queuing, serialization and propagation delays). Our research tries to achieve better encryption delay at the user machine's CPU level while maintaining confidentiality. The proposed algorithm is a new symmetric encryption technique that allows users to choose a new key for each single packet if they wish. The encryption key is flexible in length, the plain text is flexible in size, the encryption process is very simple, the transposition table is simple too, and the shifted transposition table is easy to initiate yet complex to regenerate. These properties result in better encryption delay while maintaining confidentiality; the algorithm is reported to be 10 to 15 times faster than the AES algorithm.
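The published DEA-RTA specification is not reproduced in the abstract; the toy below only illustrates why keyed transposition with a per-packet key is computationally cheap (all names are illustrative, and a bare transposition like this is not secure on its own):

```python
import os

def _perm(key: bytes, n: int):
    """Permutation of positions 0..n-1 derived from the key bytes."""
    return sorted(range(n), key=lambda i: (key[i % len(key)], i))

def encrypt_packet(plain: bytes, key: bytes) -> bytes:
    p = _perm(key, len(plain))
    return bytes(plain[i] for i in p)

def decrypt_packet(cipher: bytes, key: bytes) -> bytes:
    p = _perm(key, len(cipher))
    out = bytearray(len(cipher))
    for dst, src in enumerate(p):
        out[src] = cipher[dst]          # undo the permutation
    return bytes(out)

key = os.urandom(8)                     # a fresh key per packet, as the paper allows
pkt = b"real-time payload"
assert decrypt_packet(encrypt_packet(pkt, key), key) == pkt
```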
|
191-199 |
Paper
Title, Authors, Abstract (Issue 2, Volume 3, 2009) |
Pages |
The Integration of
Seafood Traceability System for Shrimp Value Chain
Systems
Echo Huang, Juh-Cheng Yang
Abstract:
The impact of information systems on productivity is
wide ranging and potentially affects all other
activities of a company. This trend extends beyond
high-tech companies in Taiwan. The aquacultural industry is finding it can substantially increase productivity and reduce costs by moving process tracking, management and reverse tracking functions online. This paper presents a new radio frequency identification (RFID) and quick response (QR) code-based system for managing in-house seafood cultivation, inspection, distribution and retailing, and its impact on productivity and costs. The value chain system was tested in a white shrimp cultivation factory which provides live-shrimp offerings, and the model demonstrated efficiency, effectiveness, and better customer service. Traceability is primarily a standard that guarantees food safety for consumers; it is a system that improves risk management and raises production efficiency and industry standards. Integrating RFID with QR code techniques combines physical logistics with information processing, records all product information throughout the life cycle in detail, increases information visibility, and changes the traditional farming supply chain's mode of operation, effectively raising the added value of the whole supply chain system.
|
201-210 |
Integration of
Variance Analysis and Multi Attribute Methods of
Decision in Application of Optimal Factor
Combination Choice in One Experiment
Dragan Randjelovic
Abstract:
Beside one affirmation their hypothesis scientists
make experiments to choose the optimal from
available possibilities in one experiment. It is
known that on the results of one experiment make
influnce treatments and other uncontrolled factors
called experimental error which must be smaller and
for this reason scientists make different
statistical plans. Mathematical apparatus for
experiment organization are possible to search on
the basis of total random distribution, random block
distribution and special organized block
distribution while they can most effectively
represent complex most often multifactor even
multivariate experiments. It is very dificult to
make analysis of results and especially determine
the optimal factor combination choice in these
experiments with respondable apparatus of multiple
regression analysis, or canonical analysis in
multivarite case, and at any rate with help of
variance analysis. Because of that for optimal
factor combination choice in one experiment authors
propose procedure based on integration of analysis
of variance and multi attribute methods of decision.
In the end of this paper are given three examples on
which are demonstrated proposed procedure.
|
211-221 |
A Compact Colored
Petri Net Model for Fault Diagnosis and Recovery in
Embedded and Control Systems
A. Spiteri Staines
Abstract:
This paper describes the modeling and use of a
reduced Colored Petri net for fault diagnosis and
recovery in embedded and control systems. The
reduced or compact Colored Petri net modeling
approach can be extended to other classes of real
time systems, real time hardware, etc. A reduced
colored Petri net is a compact form of a Colored
Petri net having complex token types based on sets
or complex sets containing the structured
information for error handling. The approach
presented here will reduce the size of the Colored
Petri net because information is put in the token
instead of having many additional places and
transitions as is typically done. This approach is
illustrated with a comprehensive example of a
computerized fuel control system for a combustion
turbine. The Colored Petri net is an executable
model. It is analyzed structurally and results are
shown and interpreted.
|
222-229 |
A Fuzzy Model to
Evaluate the Motivation to Quality Programs
Denise Dumke de Medeiros
Abstract:
This article emphasizes motivation and competence as
basic factors needed to optimize human action with
regard to quality. To evaluate the motivation to
quality, a model with objective characteristics is
proposed, using Herzberg’s Two Factor Theory, and
Fuzzy Set Theory. As this is a difficult area to measure, the model proposes an objective methodology that makes it possible to identify the motivational strategies that make employees more receptive to the reality of the enterprise. This can help managers choose the best model to motivate employees. An application of the methodology is also presented.
|
230-237 |
Email Threads: A
Comparative Evaluation of Textual, Graphical and
Multimodal Approaches
Saad Alharbi, Dimitrios Rigas
Abstract:
Email threads have been implemented in a large number of studies in order to improve the efficiency of email clients. Nevertheless, contextual information about messages in the threads was somewhat neglected by most of these studies. This paper describes an empirical study carried out to investigate to what extent such information can be incorporated into email threads. Furthermore, this study aimed to investigate various ways of communicating this type of information. Therefore, three email thread approaches presenting various types of information in different ways were developed: a textual approach, which presented related messages with chronological and contextual information in the main view of the email client; a graphical approach, which presented related messages with chronological, relationship and contextual information in a temporal view; and a multimodal approach, where threads were presented in a similar way to the graphical approach but with some contextual information communicated aurally (i.e. with non-speech sound). These approaches were tested comparatively with three independent groups of users. The results were analysed based on effectiveness (i.e. task completion rate and identification of thread information) and efficiency (i.e. task completion time and error rate). The results indicated that the multimodal thread approach was more effective and efficient than the textual approach. The results also highlighted that the large amount of graphically presented information in the graphical approach negatively affected its effectiveness when compared to the textual approach, especially with complex email threads. However, communicating message information through two channels (i.e. the visual and auditory channels) in the multimodal approach helped to reduce the graphical overload and hence significantly improved usability when compared to the graphical approach.
|
238-250 |
Using Assembler
Encoding to Solve Predator-Prey Problem
Tomasz Praczyk
Abstract:
The paper presents a neuro-evolutionary method
called Assembler Encoding. The method was tested in
the predator-prey problem. To compare Assembler
Encoding with another neuro-evolutionary method, in
the experiments, a co-evolutionary version of simple
connectivity matrix was also applied.
|
251-259 |
Solving Traveling
Salesman Problem on Cluster Compute Nodes
Izzatdin A. Aziz, Nazleeni Haron, Mazlina Mehat,
Low Tan Jung, Aisyah Nabilah Mustapa, Emelia Akashah
P.Akhir
Abstract:
In this paper, we present a parallel implementation of a solution for the Traveling Salesman Problem (TSP). The TSP is the problem of finding the shortest route that visits each point in a given set exactly once. Initially, a sequential algorithm is built from scratch and written in the C language. The sequential algorithm is then converted into a parallel algorithm by integrating it with the Message Passing Interface (MPI) libraries so that it can be executed on a cluster computer. Our main aim in creating the parallel algorithm is to accelerate the solution of the TSP. Experimental results obtained on a Beowulf cluster are presented to demonstrate the viability of our work as well as the efficiency of the parallel algorithm.
|
260-269 |
Emotional Agents in
Computer Games
Khalil Shihab
Abstract:
In this paper, we consider emotion as a factor in the decision-making process; the actions taken by an agent are represented by a model, called the "emotional model", created with a specific focus on computer game development. It is designed to explore people's behavior in certain circumstances while under specified emotional states. Special attention was given to the thought processes and actions displayed in hypothetical scenarios. We characterized the thoughts and actions associated with each scenario and emotional state. Each particular action or step taken in the thought process was given a percentage value directly proportional to the answers given by the test population. Finally, we developed an experimental game program for the evaluation of our emotional decision-making model. The aim of the evaluation was to find out how real-life agents react in certain situations and what processes the human mind runs through when thinking about and acting upon those situations.
|
270-277 |
Paper
Title, Authors, Abstract (Issue 3, Volume 3, 2009) |
Pages |
A Contribution to the
Application of Autonomous Control in Manufacturing
B. Scholz-Reiter, St. Sowade, D. Rippel, M.
Teucke, M. Ozsahin, T. Hildebrandt
Abstract:
The apparel industry is a prime example of manual manufacturing. Problems in manufacturing control are caused by the manual handling of garments, which influences the availability and correctness of information. This poor information quality leads to problems along the supply chain from production to disposition. Automated data management based on radio frequency identification technology is proposed to solve these problems. Autonomous control can be established on top of it to increase the system's robustness and flexibility and to enable smaller manufacturing batch sizes. Although autonomous control is easily applicable in highly automated systems, its application in manual processes is generally difficult. Three different system architectures are discussed, diverse technical approaches are analyzed, and a decision is made for one approach, based on radio frequency identification and manufacturing batches, that suits the apparel scenario well.
|
279-291 |
An Action Decision
Model for Emotions based on Transactional Analysis
H. Fujita, N. Sawai, J. Hakura, M. Kurematsu
Abstract:
Human-computer interaction based on emotional modeling is investigated and reported in this paper. Human personality is analyzed based on egogram analysis, and accordingly a human "SELF" emotional model is created. As one part, we have created a computerized model which reflects a human user (in this paper, a model of Miyazawa Kenji) embedded as a computer model, and through it an emotional interaction between that model and the real human user is established. The interaction scenarios and reasoning are based on transactional analysis. We have implemented the system and examined it empirically, as an experiment in a public space, for revision and evaluation.
|
292-300 |
Generic Interactive
Natural Language Interface to Databases (GINLIDB)
Faraj A. El-Mouadib, Zakaria S. Zubi, Ahmed A.
Almagrous, Irdess S. El-Feghi
Abstract:
To override the complexity of SQL and to facilitate the manipulation of data in databases for ordinary users (not SQL professionals), many researchers have turned to natural language instead of SQL. The idea of using natural language instead of SQL has prompted the development of a new type of processing method called Natural Language Interface to Database systems (NLIDB). NLIDB systems are actually a branch of a more comprehensive field called Natural Language Processing (NLP). In general, the main objective of NLP research is to create an easy and friendly environment for interacting with computers, in the sense that computer usage does not require any programming language skills to access the data; only natural language (e.g. English) is required. Many systems have been developed to apply NLP concepts in a variety of domains, for example the LUNAR system [19] and the LADDER system [8]. One drawback of previous systems is that the grammar must be tailor-made for each given database. Another drawback is that many NLP systems cover only a small domain of English-language questions. In this paper we present the design and implementation of a natural language interface to a database system, called Generic Interactive Natural Language Interface to Databases (GINLIDB). It is designed using UML and developed using Visual Basic.NET 2005. Our system is generic in nature, given the appropriate database and knowledge base; this feature distinguishes our system.
|
301-310 |
Estimation Model of
Labor Time at the Information Security Audit and
Standardization of Audit Work by Probabilistic Risk
Assessment
Naoki Satoh, Hiromitsu Kumamoto
Abstract:
Based on the factors that are available at the initial phase of the audit task, this paper proposes a labor time estimation method for the information security audit in the form of a formula, by statistically analyzing data from 20 past cases. Initially, audit mode, operation mode, penetration degree, and company size are considered to be the factors that could influence the labor time, and thus a "quantitative analysis I" is conducted with these factors. However, the results were not sufficiently positive. Consequently, by dividing the audit mode into regular and emergency audits and by using company size as the factor, a labor time estimation formula was established by means of regression analysis. It is found that emergency audits take more labor time in information security audits than regular audits. We investigate this factor by probabilistic risk assessment.
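The abstract describes, but does not reproduce, the estimation formula. A generic sketch of a regression of the stated form (the coefficients and exact functional form are assumptions, to be read from the paper itself):

```latex
\hat{T} = \beta_0 + \beta_1 \, x_{\text{size}} + \beta_2 \, x_{\text{emergency}}
```

where \hat{T} is the estimated labor time, x_size is the company size, and x_emergency is an indicator equal to 1 for an emergency audit and 0 for a regular audit.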
|
311-320 |
Secret Image Recovery
based on Search Order Coding
Wei-Kai Su, Lee Shu-Teng Chen, Shang-Kuan Chen,
Ja-Chen Lin
Abstract:
In this paper, we propose an image recovery method
based on search order coding (SOC). By using SOC
technique, we can generate a SOC image for an input
secret image. If the secret image is damaged, by
referring to its SOC image, the damaged image can be
repaired to a better one. In addition to the
proposed basic version of the SOC recovery
technique, we also modify it to an advanced one. The
advanced version provides a more flexible method
that repairs the damaged image in two different ways
according to the availability of the mapping table.
The secret image can still be recovered even when it
is seriously damaged. Besides, the proposed SOC
recovery technique can be applied to not only the
gray values of the secret image but also the VQ
indices. Experiments show that the recovery ability
of the VQ indices is better than that of pixel
values. Moreover, the SOC image alone reveals
nothing about the secret image. Therefore, the SOC
image is safer than directly duplicating the secret
image.
|
321-328 |
Efficient Biometric
Watermark Embedding by Flipping on Binary Text
Documents
Chi-Man Pun, Ioi-Tun Lam
Abstract:
With respect to issues of privacy, security and the legal significance of a document, some sort of security protection should be applied to a document to ensure its genuineness and integrity. In this paper, a signatory's encrypted digital biometric fingerprint, in binary format, is embedded into a binary text document by flipping, one of the spatial-domain methods. During the embedding process, the document is adaptively partitioned into blocks of a fixed pixel size according to the number of bits in the watermark message, and each watermark bit is embedded into a block by flipping. Based on the odd or even number of pixels in each block of the embedded document, the fingerprint watermark message is extracted and then decrypted. Experimental results from our prototype system show that the proposed method successfully embeds and extracts a fingerprint watermark message in a document, whether it is written in ideographic or alphabetic characters.
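A minimal sketch of parity-based embedding by flipping, consistent with the block-parity extraction the abstract describes (the block size, parity convention, and choice of flipped pixel are assumptions; the paper partitions adaptively and encrypts the payload):

```python
import numpy as np

def embed_bits(doc, bits, block=8):
    """doc: 2D 0/1 int array (1 = black). Embed one bit per block x block tile."""
    doc = doc.copy()
    tiles = [(r, c) for r in range(0, doc.shape[0] - block + 1, block)
                    for c in range(0, doc.shape[1] - block + 1, block)]
    for bit, (r, c) in zip(bits, tiles):
        tile = doc[r:r + block, c:c + block]
        if tile.sum() % 2 != bit:      # parity of black pixels encodes the bit
            tile[0, 0] ^= 1            # flip one pixel to fix the parity
    return doc

def extract_bits(doc, nbits, block=8):
    tiles = [(r, c) for r in range(0, doc.shape[0] - block + 1, block)
                    for c in range(0, doc.shape[1] - block + 1, block)]
    return [int(doc[r:r + block, c:c + block].sum() % 2)
            for _, (r, c) in zip(range(nbits), tiles)]
```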
|
329-336 |
Analysis of
Information Security Problem by Probabilistic Risk
Assessment
Naoki Satoh, Hiromitsu Kumamoto
Abstract:
The information security risk assessment is
investigated from perspectives of most advanced
probabilistic risk assessment (PRA) for nuclear
power plants. Accident scenario enumeration by
initiating events, mitigation systems and event
trees are first described and demonstrated. Assets,
confidentiality, integrity, availability, threats,
vulnerabilities, impacts, likelihoods, and
safeguards are reformulated by the PRA. Two
illustrative examples are given: network access
attacker and physical access attacker. Defenseless
time spans and their frequencies are introduced to
cope with non-rare initiating events of information
security problems. A common event tree structure may
apply to a variety of security problems, thus
facilitating the risk assessment.
|
337-347 |
Paper
Title, Authors, Abstract (Issue 4, Volume 3, 2009) |
Pages |
Customer Satisfaction in
Shipping Companies under Artificial Intelligence and
Multicriteria Decision Analysis
Nikolaos Loukeris
Abstract:
Strategic planning is formed by considering customer satisfaction in order to maximise market share. In shipping companies, identifying satisfaction among clients is very difficult, so predicting satisfaction provides valuable information. Previous research in the field used techniques of multicriteria analysis, data mining and analytical-synthetic preference models. This paper aims to determine the most effective method for predicting satisfaction among techniques of data mining, rough sets, neural networks and multicriteria decision analysis.
|
349-356 |
Improvement of Rule Based
Morphological Analysis and POS Tagging in Tamil
Language via Projection and Induction Techniques
M. Selvam, A. M. Natarajan
Abstract:
Morphological analysis and part-of-speech (POS) tagging are essential for natural language processing tasks such as the generation of treebanks, the training of parsing models, and parsing. The rule-based approach is applicable to languages which have a well-defined set of rules that accommodates most words with inflectional and derivational morphology. Rule-based morphological analysis and POS tagging are very difficult and cannot accommodate all combinations through rules, due to inflections and exceptions, especially in languages like Tamil. Statistical methods are very important, but they in turn need a large volume of electronic corpora and automated tools, which are very rare for Tamil. Since English is very rich in all these aspects, POS tags can be projected to Tamil through alignment and projection techniques. A rule-based morphological analyzer and POS tagger can be built from the well-defined morphological rules of Tamil. They can be further improved by root words induced from English to Tamil through a sequence of processes (alignment, lemmatization and induction) with the help of sentence-aligned corpora such as Bible corpora, TV news and newspaper articles, since finding the root of inflected words is very difficult and leads to ambiguity. In our experiments, a rule-based morphological analyzer and POS tagger were built with 85.56% accuracy. POS-tagged sentences in Tamil were obtained for the Bible corpus through alignment and projection techniques, and categorical information was obtained. Root words were induced from English to Tamil through alignment, lemmatization and induction processes. A further 7% improvement was made in the rule-based morphological analyzer and POS tagger using the categorical information and root words obtained from POS projection and morphological induction, respectively, via the sentence-aligned corpora.
|
357-367 |
Viewpoint of Probabilistic Risk
Assessment in Information Security Audit
Naoki Satoh, Hiromitsu Kumamoto
Abstract:
After the information security audit, the auditor
commonly points out the importance of information
assets, the vulnerability of the audited information
system, and the need for countermeasures. On such an occasion, the auditees often ask the auditor for the
quantitative assessment of the risk so that they can
take specific measures. Nevertheless, in reality,
the auditor can hardly meet this requirement because
they do not have any appropriate methods to assess
the risk quantitatively and systematically.
Therefore, this paper proposes an approach that
makes it possible to identify the scenarios of
information security accidents systematically, to
assess the risk of the occurrence of the scenario
quantitatively, and to point out the importance of
taking countermeasures by incorporating
Probabilistic Risk Assessment in information
security audit. For the concrete description and
explanation of this approach, this paper takes the
case of the audit of password management as an
example. By enumerating the possible scenarios that
indicate how initiating events, the vulnerability of
mitigation systems, and the failures of operations
can allow illegal access to the information assets, this paper shows that it is possible to assess the security risks by the pair of defenseless time span and its occurrence frequency for each scenario. Finally, since the parameters necessary for risk quantification, such as the occurrence frequency of password theft, the probability of theft detection, and the probability of taking countermeasures after the theft, carry uncertainty, the uncertainty of the occurrence of each scenario itself is assessed by propagating the incompleteness of knowledge of these parameters using random numbers.
|
368-376 |
Copyrighted Material, www.naun.org
NAUN