ISSN: 1998-4308
Year 2010
All papers of the journal were peer reviewed by two
independent reviewers. Acceptance was granted when both
reviewers' recommendations were positive.
Paper Title, Authors, Abstract (Issue 1, Volume 4, 2010) | Pages |
3D Battlefield Modeling and
Simulation of War Games
Baki Koyuncu, Erkan Bostanci
Abstract:
In this study, real-time 3D virtual simulation software for visualizing military battlefields was developed. The software, named Sandbox, uses elevation data stored in DEM format for the battlefield of interest to create the terrain on which military units are placed. Different types of military units can be added, and they are displayed both with recent military symbology (NATO APP-6A) and as 3D models of the real units. The software uses translation animation for position updates. Because data transmission between different platforms was a design consideration, the software makes extensive use of XML-based data. A database is used for long-term storage of received reports, and web services are used to transmit reports to and receive reports from remote field observers and to change the state of the software.
|
1-8 |
Fault Tolerance by Replication
of Distributed Database in P2P System using Agent
Approach
May Mar Oo, The' The' Soe, Aye Thida
Abstract:
The term replication in a distributed database refers to the operation of copying and maintaining database objects in more than one location. There are three types of replication in distributed systems: snapshot, transactional, and merge. A partial failure may occur when one component in such a system fails; this failure may affect the proper operation of other components while leaving yet other components totally unaffected. An important goal in the design of such systems is to construct them so that they can automatically recover from partial failures without seriously affecting overall performance, continuing to operate acceptably while repairs are being made. The technologies, architectures, and methodologies traditionally used to develop distributed applications exhibit a variety of limitations and drawbacks when applied to large-scale distributed settings (e.g., the Internet). In particular, they fail to provide the desired degree of configurability, scalability, and customizability. To address these issues, researchers are investigating a variety of innovative approaches. The most promising and intriguing are those based on the ability to move code across the nodes of a network; exploiting this notion of mobile code, an agent toolkit can be chosen as the platform for replication. This paper therefore introduces the topic of agent replication for distributed databases and examines the issues associated with using agent replication in a multi-agent system, as well as the main issues of agent communication, read/write consistency, and state synchronization.
|
9-18 |
Image Segmentation Using
Discrete Cosine Texture Feature
Chi-Man Pun, Hong-Min Zhu
Abstract:
In this paper we propose a computationally efficient approach to image segmentation based on texture analysis, in which a 2D discrete cosine transform (DCT) is used to extract texture features in each image block. We first split the input image into MxN blocks and calculate the distances between neighbouring blocks using a set of the largest energy signatures from the DCT of each block. We then merge the blocks with the smallest distances to form larger regions, repeating the process until the desired number of regions is reached. Experimental results show that our proposed method outperforms an existing image segmentation method, especially in terms of efficiency.
|
19-26 |
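The block-DCT feature step described in the abstract above can be sketched compactly. The following is an illustrative sketch only, not the authors' implementation: it assumes 8x8 blocks and keeps the K largest-magnitude DCT coefficients as each block's energy signature (both are assumed parameters, not values taken from the paper). Merging would then repeatedly fuse the pair of adjacent regions with the smallest signature distance until the desired region count is reached.

```python
# Illustrative sketch of block-DCT texture signatures (not the paper's code).
import numpy as np
from scipy.fft import dctn

def block_signatures(image, block=8, k=4):
    """Split a grayscale image into blocks; keep the k largest DCT
    energy signatures per block (block and k are assumed values)."""
    rows, cols = image.shape[0] // block, image.shape[1] // block
    sig = np.zeros((rows, cols, k))
    for r in range(rows):
        for c in range(cols):
            patch = image[r*block:(r+1)*block, c*block:(c+1)*block]
            coeffs = np.abs(dctn(patch, norm='ortho'))
            sig[r, c] = np.sort(coeffs.ravel())[-k:]   # k largest energies
    return sig

def block_distance(sig, a, b):
    """Distance between the signatures of two (adjacent) blocks a, b."""
    return np.linalg.norm(sig[a] - sig[b])
```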
A Triple Graph Grammar Mapping
of UML 2 Activities into Petri Nets
A. Spiteri Staines
Abstract:
Model-to-Model mapping has several advantages over
relational mapping. In model-to-model mapping an
active correspondence is kept between two pairs of
models. This is facilitated if graphical models are
used. UML 2 activities are based on Petri-net-like semantics, and substantial literature exists explaining their conversion into Petri nets. This paper explains how UML 2 activities can be formally mapped into Petri nets or Petri net semantics from a theoretical, practical and operational point of view, building on previous work on Triple Graph Grammars (TGGs).
UML activity constructs are classified and
identified. This is useful for creating a basic set
of TGG rules. Generic TGG rules are identified and
created. The rules are mainly intended for forward
transformation. An example is given illustrating the
conversion process. The concepts presented can be
elaborated further and even extended to other visual
models or notations.
|
27-35 |
Using Edutainment in E-Learning
Application: An Empirical Study
Dimitrios Rigas, Khaled Ayad
Abstract:
Arguments by philosophers and psychologists in the areas of effective learning and HCI have demonstrated that humour or entertainment is one of many important factors that help improve learning. Accordingly, students' performance increases in learning environments that incorporate amusement features. This work investigates the role of edutainment, using avatars as a tool to represent entertainment attributes in an e-learning framework. The empirical investigation aimed at measuring the usability of two experimental interfaces: a typical e-learning system and a multimodal e-learning system. The usability of these two environments was analysed by one dependent group of users. The results presented here confirm that the edutainment interface, as a learning medium, persuaded users more than the typical version.
|
36-43 |
Paper Title, Authors, Abstract (Issue 2, Volume 4, 2010) | Pages |
Mathematical Validation of
Object-Oriented Class Cohesion Metrics
Jehad Al Dallal
Abstract:
Class cohesion is an object-oriented software
quality attribute and refers to the extent to which
the members of a class are related. Software
developers use class cohesion measures to assess the
quality of their products and to guide the
restructuring of poorly designed classes. Several
class cohesion metrics are proposed in the
literature, and a few of them are mathematically
validated against the necessary properties of class
cohesion. Metrics that violate class cohesion
properties are not well defined, and their utility
as indicators of the relatedness of class members is
questionable. The purpose of this paper is to
mathematically validate sixteen class cohesion
metrics using class cohesion properties. Results
show that metrics differ considerably in satisfying
the cohesion properties; some of them satisfy all
properties, while others satisfy none.
|
45-52 |
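The paper validates sixteen published cohesion metrics against formal properties; the abstract does not enumerate them. As one concrete example of the kind of metric such validation targets, here is a small sketch of Chidamber and Kemerer's LCOM, which counts method pairs sharing no attributes (an illustrative choice, not necessarily among the paper's sixteen):

```python
# Sketch of the Chidamber-Kemerer LCOM cohesion metric (illustrative).
from itertools import combinations

def lcom(method_attrs):
    """method_attrs maps each method name to the set of class attributes
    it uses. LCOM = max(P - Q, 0): P = method pairs with disjoint
    attribute sets, Q = pairs sharing at least one attribute."""
    p = q = 0
    for m1, m2 in combinations(method_attrs.values(), 2):
        if m1 & m2:
            q += 1
        else:
            p += 1
    return max(p - q, 0)

print(lcom({"a": {"x"}, "b": {"y"}, "c": {"x", "y"}}))  # 0: cohesive enough
print(lcom({"a": {"x"}, "b": {"y"}, "c": {"z"}}))       # 3: low cohesion
```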
Comparing User Satisfaction and
Customisation for Variable Size Personalised Menus
Khalid Al-Omar, Dimitrios Rigas
Abstract:
This paper reports a comparative empirical
investigation of the effects of content size on user
satisfaction and customisation of five different
personalised menu types: adaptable, adaptive split,
adaptive/adaptable highlighted, adaptive/adaptable
minimised and mixed-initiative menus. Two
independent experiments were conducted, on small menus (17 items) and large menus (29 items) respectively, with 60 subjects in total (30 each for the small and large menus), tested empirically in four independent groups (15 subjects each). Results show that in
small menus, the minimised condition was preferred
overall, followed by the adaptable and highlighted
types. By contrast, in large menus, the
mixed-initiative condition was the most strongly
preferred, followed by the minimised approach.
|
53-60 |
Text-Driven Avatars based on
Artificial Neural Networks and Fuzzy Logic
Mario Malcangi
Abstract:
We discuss a new approach for driving avatars using
synthetic speech generated from pure text. Lip and
face muscles are controlled by the information
embedded in the utterance and its related
expressiveness. Rule-based, text-to-speech synthesis
is used to generate phonetic and expression
transcriptions of the text to be uttered by the
avatar. Two artificial neural networks, one for text-to-phone transcription and the other for phone-to-viseme mapping, have been trained on phonetic transcription data. Two fuzzy-logic engines
were tuned for smoothed control of lip and face
movement. Simulations have been run to test
neural-fuzzy controls using a parametric speech
synthesizer to generate voices and a face
synthesizer to generate facial movement.
Experimental results show that soft computing
affords a good solution for the smoothed control of
avatars during the expressive utterance of text.
|
61-69 |
Comparing Effectiveness and
Efficiency between Multimodal and Textual
Note-Taking Interfaces
Mohamed Sallam, Dimitrios Rigas
Abstract:
This paper describes an experimental study conducted
to investigate the use of multimodal metaphors in
the interface of e-learning applications. This
investigation involved two different interface
versions of the experimental e-learning tool. In the
first interface version (the textual interface), three modalities were used to deliver information about note-taking: text, graphics, and images.
second version of the interface application
(multimodal interface) offered a combination of
multimodal metaphors such as recorded speech, video,
and avatar with simple facial expressions to
communicate the same information. The aim of the
experiment was to measure and compare the level of
usability of textual and multimodal interfaces. The
usability parameters, which are efficiency, effectiveness, and user satisfaction, were
considered in the experiment. The results obtained
from this investigation have shown that the
multimodal e-learning interface increased the level
of usability as users took significantly less time
to complete the tasks, performed successfully in a
higher number of tasks, and were more satisfied than
when using the textual interface. These input modalities could be used to improve the attractiveness of note-taking, which in turn will be reflected in increased users' motivation and interest in the learning material presented.
|
70-77 |
Paper Title, Authors, Abstract (Issue 3, Volume 4, 2010) | Pages |
Fast Algorithms for Preemptive
Scheduling of Jobs with Release Times on a Single
Processor to Minimize the Number of Late Jobs
Nodari Vakhania
Abstract:
We have n jobs with release times and due-dates to
be scheduled preemptively on a single-machine that
can handle at most one job at a time. Our objective
is to minimize the number of late jobs, ones
completed after their due-dates. This problem is
known to be solvable in time O(n^3 log n). Here we present two polynomial-time algorithms with superior running times. The first algorithm optimally solves, in time O(n^2), the special case of the problem in which job processing times and due dates are tied so that for each pair of jobs i, j with d_i > d_j we have p_i ≥ p_j. This particular setting has real-life applications. The second algorithm runs in time O(n log n) and works for the general version of the problem. As we show, there are strong special cases in which this algorithm finds an optimal solution.
|
79-87 |
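The abstract does not reproduce the paper's O(n^2) or O(n log n) algorithms. Purely to make the problem setting concrete (jobs with release times and due dates, one machine, preemption, late jobs counted), here is a hedged sketch of a preemptive earliest-due-date simulation; it is not the authors' algorithm, and EDF is not optimal for this objective in general:

```python
# Preemptive EDF simulation that counts late jobs (illustration only;
# not the paper's algorithm, and not optimal for minimizing late jobs).
import heapq

def count_late_edf(jobs):
    """jobs: list of (release, processing, due) triples."""
    jobs = sorted(jobs)                       # by release time
    ready, t, i, late, n = [], 0, 0, 0, len(jobs)
    while i < n or ready:
        if not ready:
            t = max(t, jobs[i][0])            # idle until next release
        while i < n and jobs[i][0] <= t:
            r, p, d = jobs[i]
            heapq.heappush(ready, (d, p))     # ready jobs keyed by due date
            i += 1
        d, p = heapq.heappop(ready)           # run the earliest-due job
        run = p if i == n else min(p, jobs[i][0] - t)
        t += run
        if run < p:
            heapq.heappush(ready, (d, p - run))   # preempted remainder
        elif t > d:
            late += 1                         # finished after its due date
    return late

print(count_late_edf([(0, 3, 2), (1, 1, 2)]))  # -> 1
```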
The Influence of Game in
E-Learning: An Empirical Study
Dimitrios Rigas, Khaled Ayad
Abstract:
A human-computer interface is an attempt to mimic human-human communication. In human-human communication, especially in learning, students interact emotionally either with each other or with their instructor in a way that minimizes, to some extent, the formality of the learning environment. In web-based learning these emotions are usually not present within many types of e-learning environments. Researchers, on the other hand, have argued that humour strengthens students' performance in a learning environment combined with amusement features; this applies mostly online, where users sit in front of unadulterated educational screens. In this paper, we empirically investigate the role of edutainment, applying avatars as a tool to represent entertainment attributes in an e-learning framework. The empirical investigation aimed at measuring the usability of four experimental game-based interfaces, each integrated with a different combination of multimodal features drawn from text, earcons, speech, and avatars. These four game-based learning interfaces were introduced in four phases: the first consisted of text and speech only (TS); the second, text and earcons only (TE); the third integrated text, speech, and earcons (TSE); and the fourth combined text, speech, earcons, and an avatar (TSEA). These combinations of multimodal metaphors with e-learning systems were examined to determine the preferable multimodal grouping for entertaining users and enhancing their performance. The effectiveness and efficiency of the four environments were analyzed using independent groups of users. The outcomes showed a higher improvement rate in the performance of students who learnt with the game interface integrated with the avatar than with the other versions.
|
88-96 |
A Testing Theory for Real-Time
Systems
Stefan D. Bruda, Chun Dai
Abstract:
We develop a testing theory for real-time systems.
We keep the usual notion of success or failure
(based on finite runs) but we also provide a
mechanism of determining the success or failure of
infinite runs, using a formalism similar to the
acceptance in Büchi automata. We present two refinement timed preorders similar to De Nicola and Hennessy's may and must testing. We then provide alternative, behavioural and language-based
characterizations for these relations to show that
the new preorders are extensions of the traditional
preorders. Finally we focus on test generation,
showing how tests can be automatically generated out
of a timed variant of linear-time logic formulae
(namely, TPTL), so that a process must pass the
generated test if and only if the process satisfies
the given temporal logic formula. Besides the obvious
use of such an algorithm (to generate tests), our
result also establishes a correspondence between
timed must testing and timed temporal logic.
|
97-106 |
Paper Title, Authors, Abstract (Issue 4, Volume 4, 2010) | Pages |
Assessment and Evaluation of
Different Data Fusion Techniques
A. K. Helmy, A. H. Nasr, Gh. S. El-Taweel
Abstract:
Data fusion is a formal framework for combining and
utilizing data originating from different sources.
It aims at obtaining information of greater quality
depending upon the application. There are many data
fusion techniques that can be used to produce
high-resolution multispectral images from a
high-resolution panchromatic (PAN) image and
low-resolution multispectral (MS) images, including
but not limited to, modified
Intensity–hue–saturation, Brovey transform,
Principal component analysis, Multiplicative
transform, Wavelet resolution merge, High-pass
filtering, and Ehlers fusion. One of the major
problems associated with a data fusion technique is
how to assess the quality of the fused (spatially
enhanced) MS images. This paper presents a
comprehensive analysis and evaluation of the most
commonly used data fusion techniques. The
performance of each data fusion method is
qualitatively and quantitatively analyzed. Then, the
methods are ranked according to the conclusions of
the visual analysis and the results from quality
budgets. An experiment based on Quickbird images
shows that there are inconsistencies between the different performance measures used to evaluate data fusion techniques.
|
107-115 |
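Of the methods listed in the abstract above, the Brovey transform is simple enough to sketch: each fused band is the multispectral band rescaled by the ratio of the panchromatic image to the per-pixel multispectral intensity. A minimal version, assuming the MS bands have already been resampled to the PAN grid:

```python
# Minimal Brovey-transform pan-sharpening sketch (assumes co-registered,
# PAN-resolution MS bands).
import numpy as np

def brovey_fusion(ms, pan, eps=1e-6):
    """ms: (bands, H, W) multispectral array; pan: (H, W) panchromatic.
    Fused band i = MS_i * PAN / sum_j MS_j, computed per pixel."""
    intensity = ms.sum(axis=0) + eps      # per-pixel MS intensity
    return ms * (pan / intensity)         # ratio broadcast over bands
```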
Coupling Metrics for Business
Process Modeling
Wiem Khlif, Nahla Zaaboub, Hanene Ben-Abdallah
Abstract:
Modeling business processes is vital when improving
or automating existing business processes,
documenting processes properly or comparing business
processes. In addition, it is necessary to evaluate
the quality of a business process model through a
set of quality metrics. One specific category of
such metrics is coupling which measures the
functional and informational dependencies between
the tasks/processes in a business process model. Our
contribution in this paper consists in adapting
object oriented software coupling metrics for
business process models. This adaptation is based on
correspondences we establish between concepts of the
Business Process Modeling Notation and object
oriented concepts. The new adapted coupling metrics
offer more information about the dependencies among
processes and their tasks in terms of data and
control. They can be used, for instance, to evaluate
the transferability effects of errors occurring in a
particular task/process. Finally, we validate
theoretically the proposed metrics.
|
116-123 |
Minimum Flow in Monotone
Parametric Bipartite Networks
Eleonor Ciurea, Mircea Parpalea
Abstract:
The algorithm presented in this paper solves the
minimum flow problem for a special parametric
bipartite network. The algorithm does not work
directly in the original network but in the
parametric residual network and finds a particular
state of the residual network from which the minimum
flow and the maximum cut for any of the parameter
values are obtained. The approach implements a
round-robin algorithm looping over a list of nodes
until an entire pass ends without any change of the
flow.
|
124-135 |
Homeostasis and Artificial
Neuron
Martin Ruzek, Tomas Brandejsky
Abstract:
Homeostasis is the property of a system that regulates its internal environment in order to maintain a stable condition. This property is typical of biological systems and therefore also of neural cells. This article presents one possible use of the idea of homeostasis in the field of artificial neural networks. The proposed neuron is a homeostat whose state of equilibrium is the situation in which the level of acceptance of its output reaches its maximum. The neuron operates on two kinds of information: its input signal (as in any artificial neuron) and the input weights of the other neurons that receive its output. This idea is inspired by the fact that a biological neuron can know which part of its output energy is accepted by other neurons. Several learning methods are presented. The main feature of the proposed neuron is its independence in learning; no teacher or higher structure is needed, as is required, for example, by the back-propagation algorithm. Several qualities of the homeostatic neuron, such as stability, speed of learning, and independence, are discussed. The results of a first test are presented.
|
136-144 |
An Adaptive Modeling Approach
in Collaborative Data and Process-Aware Management
Systems
Ion Lungu, Andrei Mihalache
Abstract:
Informational systems are used to reflect the
business they are supposed to assist. This is the
reason why each informational system needs a
representation of the business objects that are
involved in the processes and also the business
rules that are applied to the business objects.
Every object is simpler than we think and also more complex than we can imagine; objects should be represented as simple as they are, no simpler and no more complicated. Whenever people need to communicate or record information, in any context, it is very useful to create a model. Once the model is implemented in a business application, however, most such software platforms are too inflexible to keep pace with the business processes they support, which take place in a changing business context.
paper introduces an adaptive approach for enterprise
data and process flow modeling in informational
systems.
|
145-152 |
Development of a Visualization
Tool for XML Documents
Khalil Shihab, Doreen Ying Ying Sim
Abstract:
We present the development of a prototype system
called Angur, which is designed and built for
visualization of XML documents. There two main
motivations of this work: firstly is to allow the
users to explore and manipulate XML documents and
secondly is to display the search results
graphically, in two or three dimensions, grouped by
topic or category. This prototype employs modern
interactive visualization techniques to provide a
visual presentation of a set of XML documents. The
motivation and evaluation of several design features, such as keyword-to-concept mapping, explicit clustering, the use of 3-D vs. 2-D, and the relationship of visualization to logical structure, are described.
|
153-160 |
Concurrent Differential
Evolution Based on MapReduce
Kiyoharu Tagawa, Takashi Ishimizu
Abstract:
Multi-core processors, which have more than one Central Processing Unit (CPU), have been widely introduced into personal computers. In order to utilize the additional cores, or CPUs, to execute various costly application programs, concurrent implementations of such programs have attracted attention. MapReduce is a concurrent programming model, and an associated implementation, for processing and generating large data sets. This paper extends a plenary presentation at a WSEAS conference and presents further progress on a concurrent implementation of Differential Evolution (DE) based on MapReduce. In particular, through numerical experiments conducted on a wide range of benchmark problems, the speedup of DE due to the use of multiple cores is demonstrated. Furthermore, the merits of the proposed concurrent implementation of DE are examined with respect to four criteria, namely efficiency, simplicity, portability, and scalability.
|
161-168 |
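The abstract gives no implementation details, so the following is only a sketch of the general idea: one DE/rand/1/bin generation whose trial-vector evaluations are farmed out to worker processes with a map, with the greedy selection acting as the reduce step. The sphere objective, population size, and the control parameters F and CR are illustrative assumptions, not values from the paper.

```python
# Sketch: Differential Evolution with concurrent ("map") fitness evaluation.
import numpy as np
from multiprocessing import Pool

def sphere(x):                      # assumed benchmark objective
    return float(np.sum(x * x))

def de_step(pop, fit, pool, rng, F=0.5, CR=0.9):
    n, d = pop.shape
    trials = np.empty_like(pop)
    for i in range(n):
        others = [j for j in range(n) if j != i]
        a, b, c = pop[rng.choice(others, 3, replace=False)]
        mask = rng.random(d) < CR
        mask[rng.integers(d)] = True            # force one mutated gene
        trials[i] = np.where(mask, a + F * (b - c), pop[i])
    trial_fit = np.array(pool.map(sphere, list(trials)))  # map phase
    better = trial_fit < fit                               # reduce/select
    pop[better], fit[better] = trials[better], trial_fit[better]
    return pop, fit

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pop = rng.uniform(-5, 5, (40, 10))
    with Pool() as pool:
        fit = np.array(pool.map(sphere, list(pop)))
        for _ in range(100):
            pop, fit = de_step(pop, fit, pool, rng)
    print(fit.min())
```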
Embedding Conditional Knowledge
Bases into Question Answering Systems and Java
Implementation
Nicolae Tandareanu, Mihaela Colhon, Cristina Zamfir
Abstract:
A conditional schema is a graph-based structure
which is able to represent conditional knowledge.
This structure was introduced in [11]. The inference
mechanism corresponding to the conditional schema
representations was developed in [12]. In this paper
we propose a question answering system that can
represent and process conditional knowledge using
these mechanisms. In order to accomplish this task
we refine the concept of conditional schema by
introducing the concepts of XML-conditional
structure generated by a conditional schema and
XML-conditional knowledge base for such a structure.
We describe the architecture of a question answering system that uses these structures. An implementation on the Java platform is briefly described.
|
169-176 |
Legal Protection in the Field
of Information Technology in the EU
A. Ciurea
Abstract:
Information and communication technologies represent an essential side of the economy and of European society. Their evolution has had complex consequences in the legal field as well, because the access to and use of information technology have created new rights and obligations for the beneficiaries of this technical progress. This paper aims at presenting the most important legal consequences that arise from activities carried out through computer systems, according to EU and Romanian regulations.
|
177-184 |
Apples & Oranges? Comparing
Unconventional Computers
Ed Blakey
Abstract:
Complexity theorists routinely compare—via the
preordering induced by asymptotic notation—the
efficiency of computers so as to ascertain which
offers the most efficient solution to a given
problem. Tacit in this statement, however, is that
the computers conform to a standard computational
model: that is, they are Turing machines,
random-access machines or similar. However, whereas
meaningful comparison between these conventional
computers is well understood and correctly
practised, that of non-standard machines (such as
quantum, chemical and optical computers) is rarely
even attempted and, where it is, is often attempted
under the typically false assumption that the
conventional-computing approach to comparison is
adequate in the unconventional-computing case. We
discuss in the present paper a
computational-model-independent approach to the
comparison of computers’ complexity (and define the
corresponding complexity classes). Notably, the
approach allows meaningful comparison between an
unconventional computer and an existing,
digital-computer benchmark that solves the same
problem.
|
185-192 |
Clustering of EEG Data using
Maximum Entropy Method and LVQ
Yuji Mizuno, Hiroshi Mabuchi, Goutam Chakraborty,
Masafumi Matsuhara
Abstract:
The study of extracting electroencephalogram (EEG)
data as a source of significant information has
recently gained attention. However, since EEG data
are complex, it is difficult to extract them as a
source of intended, significant information. In
order to effectively extract EEG data, this paper
employs the maximum entropy method (MEM) for
frequency analyses and investigates an alpha
frequency band and beta frequency band in which
features are more apparent. At this time, both the
alpha and beta frequency bands are divided further
into several sub-bands so as to extract detailed EEG
data where the loss of data is small. In addition,
learning vector quantization (LVQ) is used for
clustering the EEG data with features extracted. In
this paper, we will demonstrate the effectiveness of
the proposed method by applying it to the EEG data
of one subject and two subjects and comparing the
results with other related studies. By applying the
proposed method further to the EEG data of three
subjects, and comparing the results with a related
study, the effectiveness of the proposed method will
be determined.
|
193-200 |
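As a hedged illustration of the LVQ stage mentioned in the abstract above (the MEM frequency analysis and the paper's exact feature layout are not reproduced here), a minimal LVQ1 update rule over assumed sub-band power features might look like this:

```python
# Minimal LVQ1 sketch: prototypes move toward same-class samples and
# away from other-class samples (features are assumed, e.g. sub-band power).
import numpy as np

def lvq1_train(X, y, prototypes, proto_labels, lr=0.05, epochs=20):
    """X: (n, d) feature vectors; y: (n,) integer labels;
    prototypes: (k, d) initial codebook; proto_labels: (k,) labels."""
    P = prototypes.copy()
    for _ in range(epochs):
        for x, label in zip(X, y):
            k = np.argmin(np.linalg.norm(P - x, axis=1))  # nearest prototype
            step = lr if proto_labels[k] == label else -lr
            P[k] += step * (x - P[k])
    return P
```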
Using Sequence DNA Chips Data
to Mining and Diagnosing Cancer Patients
Zakaria Suliman Zubi, Marim Aboajela Emsaed
Abstract:
Deoxyribonucleic acid (DNA) micro-arrays present a powerful means of observing thousands of gene expression levels at the same time. They consist of high-dimensional datasets, which challenge conventional clustering methods. The data's high dimensionality calls for Self-Organizing Maps (SOMs) to cluster DNA micro-array data. DNA micro-array datasets are stored in huge biological databases for several purposes [1]. The proposed methods are based on the idea of selecting a gene subset to distinguish all classes; since this is more effective for solving a multi-class problem, we propose a genetic programming (GP) based approach to analyze multi-class micro-array datasets. The biological dataset is derived from multiple biological databases by a procedure for extracting datasets called DNA-Aggregator. We design a biological aggregator, which aggregates various datasets via a DNA micro-array community-developed ontology, based upon the concept of the semantic Web, for integrating and exchanging biological data. Our aggregator is composed of modules that retrieve the data from various biological databases. It will also enable queries by other applications to recognize the genes. The genes are categorized in groups based on a classification method which collects similar expression patterns. A clustering method such as k-means is required to discover the groups of similar objects in the biological database and to characterize the underlying data distribution.
|
201-214 |
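The abstract names k-means as the clustering step. A minimal, self-contained sketch of that step over a gene expression matrix follows (the DNA-Aggregator, ontology layer, and GP classifier are beyond a short example):

```python
# Plain k-means over gene expression profiles (illustrative sketch).
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    """X: (n_genes, n_conditions) expression matrix; rows with similar
    expression patterns are grouped into the same cluster."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = np.argmin(dists, axis=1)       # nearest-center assignment
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):           # converged
            break
        centers = new
    return labels, centers
```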
Supporting Requirements
Engineering with Different Petri Net Classes
A. Spiteri Staines
Abstract:
This paper considers how Petri net main classes or
categories can be used to support systems and
software requirements engineering processes. In
general, Petri nets are classifiable into four main categories: i) elementary nets, ii) normal Petri nets, iii) higher-order nets, and iv) timed Petri nets, or Petri nets with time. Apart from some
major fundamental differences, each category has a
specific use for systems engineering and software
engineering and thus they can clearly help with
requirements engineering issues. In this work the
main differences between these categories are
briefly explained. It is also shown how these Petri net classes can be made to fit into a semi-structured approach. This is very useful for the analysis and
design of a whole range of system types. A simple
case study of a vending machine is used for
illustrating this work.
|
215-222 |
An Intelligent Web-based
GRA/Cointegration analysis for Systematic Risk
Shu Ling Lin, Shun Jyh Wu
Abstract:
A new intelligent web-based grey relational analysis (GRA)/cointegration analysis is proposed in this paper to examine the effects of cross-border bank M&As on systematic risk for banks in the Americas, Asia, Europe, Africa, and the Middle East. The potential diversification gains that arise from geographic or cross-border diversification are studied using a database that includes deals and bank stock return information for 114 cross-border M&As during 1998-2005. Cointegration analysis is first used to obtain the relationship between financial variables, and web-based GRA is then applied to establish the ranking and clustering of all acquirer events. The findings have important regulatory policy implications in that potential diversification gains are obtained in the home country. Consequently, regulators in home countries may be less concerned with a rise in systematic risk following cross-border M&As, and need not impose barriers to restrict cross-border M&A activity. Grey relational analysis is shown to be well suited to the clustering and ranking of cross-border M&A events. This study suggests that the proposed intelligent web-based GRA/cointegration analysis is effective and robust.
|
223-234 |
Intelligent Web-Based Fuzzy and
Grey Models for Hourly Wind Speed Forecast
Shun Jyh Wu, Shu Ling Lin
Abstract:
An intelligent web-based fuzzy model is developed
for the forecast of hourly wind speed. The hourly wind speeds at three meteorological stations in Taiwan are forecast and compared. The selected
sites of Taiwan meteorological stations are Lan-Yu,
Tung-Chi-Tao, and Wuci, whose wind speeds are the
highest among 25 areas during the period of
1971-2000. An intelligent time series model and
GM(1,1) are developed and used to forecast the
randomly distributed wind speed data. Hourly records
of wind speed are first used to establish
intelligent fuzzy linguistic functions, and then
fuzzy relational matrix is developed to form the
time series relationship. Effects of interval number
are studied. For the same order of the intelligent
fuzzy model, the model with higher interval number
provides better prediction of the hourly wind speed
with lower RMSE. On the other hand, GM(1,1) gives higher RMSE for all three sites. The present
results demonstrate the benefits and the robustness
of the intelligent model.
|
235-242 |
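The fuzzy time-series model in the abstract above is compared against GM(1,1), whose standard construction is compact enough to sketch: accumulate the series (AGO), fit the grey parameters a and b by least squares, then difference the fitted curve back (IAGO). The input series below is illustrative, not the stations' wind data:

```python
# Standard GM(1,1) grey-model forecast sketch (illustrative input).
import numpy as np

def gm11_forecast(x, steps=1):
    """Fit GM(1,1) on a short positive series x; forecast `steps` ahead."""
    x = np.asarray(x, dtype=float)
    x1 = np.cumsum(x)                                   # AGO
    z = 0.5 * (x1[1:] + x1[:-1])                        # background values
    B = np.column_stack([-z, np.ones(len(z))])
    a, b = np.linalg.lstsq(B, x[1:], rcond=None)[0]     # grey parameters
    ks = np.arange(len(x) + steps)
    x1_hat = (x[0] - b / a) * np.exp(-a * ks) + b / a   # fitted AGO curve
    x_hat = np.empty(len(ks))
    x_hat[0], x_hat[1:] = x[0], np.diff(x1_hat)         # IAGO
    return x_hat[len(x):]

print(gm11_forecast([4.2, 4.8, 5.1, 5.6], steps=2))
```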
Obtaining Thin Layers of ZnO
with Magnetron Sputtering Method
Chitanu Elena, Ionita Gheorghe
Abstract:
This paper presents research results on obtaining ZnO thin layers using a physical vapor deposition method, namely magnetron sputtering. Two types of ZnO targets were used: sintered and non-sintered. Layer deposition was carried out by the magnetron sputtering method in an argon atmosphere. Rigorous characterization of the deposited layers was performed by SEM and HRTEM electron microscopy analysis.
|
243-250 |
Copyrighted Material, www.naun.org