id | submitter | authors | title | comments | journal-ref | doi | report-no | categories | license | abstract | versions | update_date | authors_parsed |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0910.1026 | Pierrick Tranouez | Eric Daud\'e (IDEES), Pierrick Tranouez (LITIS), Patrice Langlois
(IDEES) | A multiagent urban traffic simulation. Part II: dealing with the
extraordinary | null | ICCSA 2009, France (2009) | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | In Probabilistic Risk Management, risk is characterized by two quantities:
the magnitude (or severity) of the adverse consequences that can potentially
result from the given activity or action, and by the likelihood of occurrence
of the given adverse consequences. But a risk seldom exists in isolation: chains
of consequences must be examined, as the outcome of one risk can increase the
likelihood of other risks. Systemic theory must complement classic PRM. Indeed
these chains are composed of many different elements, all of which may have a
critical importance at many different levels. Furthermore, when urban
catastrophes are envisioned, space and time constraints are key determinants of
the workings and dynamics of these chains of catastrophes: models must include
a correct spatial topology of the studied risk. Finally, literature insists on
the importance small events can have on the risk on a greater scale: urban
risk management models belong to self-organized criticality theory. We chose
multiagent systems to incorporate this property in our model: the behavior of
an agent can transform the dynamics of important groups of them.
| [
{
"version": "v1",
"created": "Tue, 6 Oct 2009 14:41:57 GMT"
}
] | 1,254,873,600,000 | [
[
"Daudé",
"Eric",
"",
"IDEES"
],
[
"Tranouez",
"Pierrick",
"",
"LITIS"
],
[
"Langlois",
"Patrice",
"",
"IDEES"
]
] |
0910.1238 | Yves Deville | Quang Dung Pham, Yves Deville, Pascal Van Hentenryck | A Local Search Modeling for Constrained Optimum Paths Problems (Extended
Abstract) | null | EPTCS 5, 2009, pp. 5-11 | 10.4204/EPTCS.5.1 | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | Constrained Optimum Path (COP) problems appear in many real-life
applications, especially on communication networks. Some of these problems have
been considered and solved by specific techniques which are usually difficult
to extend. In this paper, we introduce a novel local search modeling for
solving some COPs by local search. The modeling features compositionality,
modularity, and reuse, and strengthens the benefits of Constraint-Based Local
Search. We also apply the modeling to the edge-disjoint paths problem (EDP). We
show that side constraints can easily be added in the model. Computational
results show the significance of the approach.
| [
{
"version": "v1",
"created": "Wed, 7 Oct 2009 12:36:40 GMT"
}
] | 1,254,960,000,000 | [
[
"Pham",
"Quang Dung",
""
],
[
"Deville",
"Yves",
""
],
[
"Van Hentenryck",
"Pascal",
""
]
] |
0910.1239 | Yves Deville | Farshid Hassani Bijarbooneh, Pierre Flener, Justin Pearson | Dynamic Demand-Capacity Balancing for Air Traffic Management Using
Constraint-Based Local Search: First Results | null | EPTCS 5, 2009, pp. 27-40 | 10.4204/EPTCS.5.3 | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | Using constraint-based local search, we effectively model and efficiently
solve the problem of balancing the traffic demands on portions of the European
airspace while ensuring that their capacity constraints are satisfied. The
traffic demand of a portion of airspace is the hourly number of flights planned
to enter it, and its capacity is the upper bound on this number under which
air-traffic controllers can work. Currently, the only form of demand-capacity
balancing we allow is ground holding, that is, the changing of the take-off
times of not yet airborne flights. Experiments with projected European flight
plans of the year 2030 show that this first form of demand-capacity
balancing is already feasible without incurring too much total delay and that it can
lead to a significantly better demand-capacity balance.
| [
{
"version": "v1",
"created": "Wed, 7 Oct 2009 12:50:34 GMT"
}
] | 1,254,960,000,000 | [
[
"Bijarbooneh",
"Farshid Hassani",
""
],
[
"Flener",
"Pierre",
""
],
[
"Pearson",
"Justin",
""
]
] |
0910.1244 | Yves Deville | David Pereira, In\^es Lynce, Steven Prestwich | On Improving Local Search for Unsatisfiability | null | EPTCS 5, 2009, pp. 41-53 | 10.4204/EPTCS.5.4 | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | Stochastic local search (SLS) has been an active field of research in the
last few years, with new techniques and procedures being developed at an
astonishing rate. SLS has been traditionally associated with satisfiability
solving, that is, finding a solution for a given problem instance, as its
intrinsic nature does not address unsatisfiable problems. Unsatisfiable
instances were therefore commonly solved using backtrack search solvers. For
this reason, in the late 90s Selman, Kautz and McAllester proposed a challenge
to use local search instead to prove unsatisfiability. More recently, two SLS
solvers - Ranger and Gunsat - have been developed, which are able to prove
unsatisfiability despite being SLS solvers. In this paper, we first compare
Ranger with Gunsat and then propose to improve Ranger's performance using some of
Gunsat's techniques, namely unit propagation look-ahead and extended
resolution.
| [
{
"version": "v1",
"created": "Wed, 7 Oct 2009 16:08:44 GMT"
}
] | 1,254,960,000,000 | [
[
"Pereira",
"David",
""
],
[
"Lynce",
"Inês",
""
],
[
"Prestwich",
"Steven",
""
]
] |
0910.1247 | Yves Deville | Gilles Audenard, Jean-Marie Lagniez, Bertrand Mazure, Lakhdar Sa\"is | Integrating Conflict Driven Clause Learning to Local Search | null | EPTCS 5, 2009, pp. 55-68 | 10.4204/EPTCS.5.5 | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | This article introduces SatHyS (SAT HYbrid Solver), a novel hybrid approach
for propositional satisfiability. It combines local search with the conflict
driven clause learning (CDCL) scheme. Each time the local search part reaches a
local minimum, the CDCL part is launched. For SAT problems it behaves like a
tabu list, whereas for UNSAT ones, the CDCL part tries to focus on a minimum
unsatisfiable sub-formula (MUS). Experimental results show good performance on
many classes of SAT instances from the last SAT competitions.
| [
{
"version": "v1",
"created": "Wed, 7 Oct 2009 16:06:29 GMT"
}
] | 1,254,960,000,000 | [
[
"Audenard",
"Gilles",
""
],
[
"Lagniez",
"Jean-Marie",
""
],
[
"Mazure",
"Bertrand",
""
],
[
"Saïs",
"Lakhdar",
""
]
] |
0910.1253 | Yves Deville | Fang He, Rong Qu | A Constraint-directed Local Search Approach to Nurse Rostering Problems | null | EPTCS 5, 2009, pp. 69-80 | 10.4204/EPTCS.5.6 | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | In this paper, we investigate the hybridization of constraint programming and
local search techniques within a large neighbourhood search scheme for solving
highly constrained nurse rostering problems. As identified by the research, a
crucial part of the large neighbourhood search is the selection of the fragment
(neighbourhood, i.e. the set of variables) to be relaxed and re-optimized
iteratively. The success of the large neighbourhood search depends on the
adequacy of this identified neighbourhood with regard to the problematic part
of the solution assignment and the choice of the neighbourhood size. We
investigate three strategies to choose the fragment of different sizes within
the large neighbourhood search scheme. The first two strategies are tailored
to the problem properties. The third strategy is more general, using
the information of the cost from the soft constraint violations and their
propagation as the indicator to choose the variables added into the fragment.
The three strategies are analyzed and compared on a benchmark nurse rostering
problem. Promising results demonstrate the possibility of future work in the
hybrid approach.
| [
{
"version": "v1",
"created": "Wed, 7 Oct 2009 13:17:36 GMT"
}
] | 1,254,960,000,000 | [
[
"He",
"Fang",
""
],
[
"Qu",
"Rong",
""
]
] |
0910.1255 | Yves Deville | Marie Pelleau, Pascal Van Hentenryck, Charlotte Truchet | Sonet Network Design Problems | null | EPTCS 5, 2009, pp. 81-95 | 10.4204/EPTCS.5.7 | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | This paper presents a new method and a constraint-based objective function to
solve two problems related to the design of optical telecommunication networks,
namely the Synchronous Optical Network Ring Assignment Problem (SRAP) and the
Intra-ring Synchronous Optical Network Design Problem (IDP). These network
topology problems can be represented as a graph partitioning with capacity
constraints as shown in previous works. We present here a new objective
function and a new local search algorithm to solve these problems. Experiments
conducted in Comet allow us to compare our method to previous ones and show
that we obtain better results.
| [
{
"version": "v1",
"created": "Wed, 7 Oct 2009 13:22:22 GMT"
}
] | 1,254,960,000,000 | [
[
"Pelleau",
"Marie",
""
],
[
"Van Hentenryck",
"Pascal",
""
],
[
"Truchet",
"Charlotte",
""
]
] |
0910.1264 | Yves Deville | Salvator Abreu, Daniel Diaz, Philippe Codognet | Parallel local search for solving Constraint Problems on the Cell
Broadband Engine (Preliminary Results) | null | EPTCS 5, 2009, pp. 97-111 | 10.4204/EPTCS.5.8 | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | We explore the use of the Cell Broadband Engine (Cell/BE for short) for
combinatorial optimization applications: we present a parallel version of a
constraint-based local search algorithm that has been implemented on a
multiprocessor BladeCenter machine with twin Cell/BE processors (total of 16
SPUs per blade). This algorithm was chosen because it fits very well the
Cell/BE architecture and requires neither shared memory nor communication
between processors, while retaining a compact memory footprint. We study the
performance on several large optimization benchmarks and show that this
achieves mostly linear speedups, and sometimes even super-linear ones. This is
possible because the parallel implementation might explore simultaneously
different parts of the search space and therefore converge faster towards the
best sub-space and thus towards a solution. Besides getting speedups, the
resulting times exhibit a much smaller variance, which benefits applications
where a timely reply is critical.
| [
{
"version": "v1",
"created": "Wed, 7 Oct 2009 13:44:11 GMT"
}
] | 1,254,960,000,000 | [
[
"Abreu",
"Salvator",
""
],
[
"Diaz",
"Daniel",
""
],
[
"Codognet",
"Philippe",
""
]
] |
0910.1266 | Yves Deville | Jun He, Pierre Flener, Justin Pearson | Toward an automaton Constraint for Local Search | null | EPTCS 5, 2009, pp. 13-25 | 10.4204/EPTCS.5.2 | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | We explore the idea of using finite automata to implement new constraints for
local search (this is already a successful technique in constraint-based global
search). We show how it is possible to maintain incrementally the violations of
a constraint and its decision variables from an automaton that describes a
ground checker for that constraint. We establish the practicality of our
approach on real-life personnel rostering problems, and show that it is
competitive with the approach of [Pralong, 2007].
| [
{
"version": "v1",
"created": "Wed, 7 Oct 2009 13:49:26 GMT"
}
] | 1,254,960,000,000 | [
[
"He",
"Jun",
""
],
[
"Flener",
"Pierre",
""
],
[
"Pearson",
"Justin",
""
]
] |
0910.1404 | EPTCS | Yves Deville, Christine Solnon | Proceedings 6th International Workshop on Local Search Techniques in
Constraint Satisfaction | null | EPTCS 5, 2009 | 10.4204/EPTCS.5 | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | LSCS has been a satellite workshop of the international conference on Principles
and Practice of Constraint Programming (CP) since 2004. It is devoted to local
search techniques in constraint satisfaction, and focuses on all aspects of
local search techniques, including: design and implementation of new
algorithms, hybrid stochastic-systematic search, reactive search optimization,
adaptive search, modeling for local search, global constraints, flexibility and
robustness, learning methods, and specific applications.
| [
{
"version": "v1",
"created": "Thu, 8 Oct 2009 06:27:26 GMT"
}
] | 1,255,046,400,000 | [
[
"Deville",
"Yves",
""
],
[
"Solnon",
"Christine",
""
]
] |
0910.1433 | Jean Dezert | Albena Tchamova (IPP BAS), Jean Dezert (ONERA), Florentin Smarandache
(UNM) | Tracking object's type changes with fuzzy based fusion rule | null | First International Conference on Modelling and Development of
Intelligent Systems, Sibiu : Romania (2009) | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | In this paper the behavior of three combinational rules for
temporal/sequential attribute data fusion for target type estimation is
analyzed. The comparative analysis is based on: Dempster's fusion rule proposed
in Dempster-Shafer Theory; Proportional Conflict Redistribution rule no. 5
(PCR5), proposed in Dezert-Smarandache Theory and one alternative class fusion
rule, connecting the combination rules for information fusion with particular
fuzzy operators, focusing on the t-norm based Conjunctive rule as an analog of
the ordinary conjunctive rule and t-conorm based Disjunctive rule as an analog
of the ordinary disjunctive rule. The way different t-conorm and t-norm
functions within the TCN fusion rule influence target type estimation
performance is studied.
| [
{
"version": "v1",
"created": "Thu, 8 Oct 2009 07:53:27 GMT"
}
] | 1,255,046,400,000 | [
[
"Tchamova",
"Albena",
"",
"IPP BAS"
],
[
"Dezert",
"Jean",
"",
"ONERA"
],
[
"Smarandache",
"Florentin",
"",
"UNM"
]
] |
0910.2217 | Tshilidzi Marwala | Linda Mthembu, Tshilidzi Marwala, Michael I. Friswell, Sondipon
Adhikari | Finite element model selection using Particle Swarm Optimization | Accepted for the Proceedings of the International Modal Analysis
Conference 2010 | null | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | This paper proposes the application of particle swarm optimization (PSO) to
the problem of finite element model (FEM) selection. This problem arises when a
choice of the best model for a system has to be made from a set of competing
models, each developed a priori from engineering judgment. PSO is a
population-based stochastic search algorithm inspired by the behaviour of
biological entities in nature when they are foraging for resources. Each
potentially correct model is represented as a particle that exhibits both
individualistic and group behaviour. Each particle moves within the model
search space looking for the best solution by updating the parameter values
that define it. The most important step in the particle swarm algorithm is the
method of representing models which should take into account the number,
location and variables of parameters to be updated. One example structural
system is used to show the applicability of PSO in finding an optimal FEM. An
optimal model is defined as the model that has the least number of updated
parameters and has the smallest parameter variable variation from the mean
material properties. Two different objective functions are used to compare the
performance of the PSO algorithm.
| [
{
"version": "v1",
"created": "Mon, 12 Oct 2009 19:10:58 GMT"
}
] | 1,255,392,000,000 | [
[
"Mthembu",
"Linda",
""
],
[
"Marwala",
"Tshilidzi",
""
],
[
"Friswell",
"Michael I.",
""
],
[
"Adhikari",
"Sondipon",
""
]
] |
0910.3485 | Yongzhi Cao | Yongzhi Cao and Guoqing Chen | A Fuzzy Petri Nets Model for Computing With Words | double columns 14 pages, 8 figures | IEEE Trans. Fuzzy Syst., vol. 18, no. 3, pp. 486-499, 2010 | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | Motivated by Zadeh's paradigm of computing with words rather than numbers,
several formal models of computing with words have recently been proposed.
These models are based on automata and thus are not well-suited for concurrent
computing. In this paper, we incorporate the well-known model of concurrent
computing, Petri nets, together with fuzzy set theory and thereby establish a
concurrency model of computing with words--fuzzy Petri nets for computing with
words (FPNCWs). The new feature of such fuzzy Petri nets is that the labels of
transitions are some special words modeled by fuzzy sets. By employing the
methodology of fuzzy reasoning, we give a faithful extension of an FPNCW which
makes it possible to compute with more words. The language expressiveness of
the two formal models of computing with words, fuzzy automata for computing
with words and FPNCWs, is compared as well. A few small examples are provided
to illustrate the theoretical development.
| [
{
"version": "v1",
"created": "Mon, 19 Oct 2009 09:09:43 GMT"
}
] | 1,317,686,400,000 | [
[
"Cao",
"Yongzhi",
""
],
[
"Chen",
"Guoqing",
""
]
] |
0911.2405 | Karim Mahboub | Karim Mahboub, Evelyne Cl\'ement, Cyrille Bertelle, V\'eronique Jay | Emotion: Appraisal-coping model for the "Cascades" problem | 6 pages | null | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | Modelling emotion has become a challenge nowadays. Therefore, several models
have been produced in order to express human emotional activity. However, only
a few of them are currently able to express the close relationship existing
between emotion and cognition. An appraisal-coping model is presented here,
with the aim to simulate the emotional impact caused by the evaluation of a
particular situation (appraisal), along with the consequent cognitive reaction
intended to face the situation (coping). This model is applied to the
"Cascades" problem, a small arithmetical exercise designed for ten-year-old
pupils. The goal is to create a model corresponding to a child's behaviour when
solving the problem using his own strategies.
| [
{
"version": "v1",
"created": "Thu, 12 Nov 2009 15:03:22 GMT"
}
] | 1,258,070,400,000 | [
[
"Mahboub",
"Karim",
""
],
[
"Clément",
"Evelyne",
""
],
[
"Bertelle",
"Cyrille",
""
],
[
"Jay",
"Véronique",
""
]
] |
0911.2501 | Karim Mahboub | Karim Mahboub (LITIS), Cyrille Bertelle (LITIS), V\'eronique Jay
(LITIS), Evelyne Cl\'ement | Emotion : mod\`ele d'appraisal-coping pour le probl\`eme des Cascades | null | null | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | Modeling emotion has become a challenge nowadays. Therefore, several models
have been produced in order to express human emotional activity. However, only
a few of them are currently able to express the close relationship existing
between emotion and cognition. An appraisal-coping model is presented here,
with the aim to simulate the emotional impact caused by the evaluation of a
particular situation (appraisal), along with the consequent cognitive reaction
intended to face the situation (coping). This model is applied to the
"Cascades" problem, a small arithmetical exercise designed for ten-year-old
pupils. The goal is to create a model corresponding to a child's behavior when
solving the problem using his own strategies.
| [
{
"version": "v1",
"created": "Thu, 12 Nov 2009 23:08:43 GMT"
}
] | 1,258,329,600,000 | [
[
"Mahboub",
"Karim",
"",
"LITIS"
],
[
"Bertelle",
"Cyrille",
"",
"LITIS"
],
[
"Jay",
"Véronique",
"",
"LITIS"
],
[
"Clément",
"Evelyne",
""
]
] |
0911.5394 | Ping Zhu | Ping Zhu | Covering rough sets based on neighborhoods: An approach without using
neighborhoods | 13 pages; to appear in International Journal of Approximate Reasoning | International Journal of Approximate Reasoning, 52(3): 461-472,
2011 | null | null | cs.AI | http://creativecommons.org/licenses/by-nc-sa/3.0/ | Rough set theory, a mathematical tool to deal with inexact or uncertain
knowledge in information systems, originally described the indiscernibility
of elements by equivalence relations. Covering rough sets are a natural
extension of classical rough sets by relaxing the partitions arising from
equivalence relations to coverings. Recently, some topological concepts such as
neighborhood have been applied to covering rough sets. In this paper, we
further investigate the covering rough sets based on neighborhoods by
approximation operations. We show that the upper approximation based on
neighborhoods can be defined equivalently without using neighborhoods. To
analyze the coverings themselves, we introduce unary and composition operations
on coverings. A notion of homomorphism is provided to relate two covering
approximation spaces. We also examine the properties of approximations
preserved by the operations and homomorphisms, respectively.
| [
{
"version": "v1",
"created": "Sat, 28 Nov 2009 11:04:06 GMT"
},
{
"version": "v2",
"created": "Fri, 10 Dec 2010 06:34:05 GMT"
}
] | 1,426,204,800,000 | [
[
"Zhu",
"Ping",
""
]
] |
0911.5395 | Ping Zhu | Ping Zhu | An axiomatic approach to the roughness measure of rough sets | to appear in the Fundamenta Informaticae | Fundamenta Informaticae, 109(4): 463-480, 2011 | null | null | cs.AI | http://creativecommons.org/licenses/by-nc-sa/3.0/ | In Pawlak's rough set theory, a set is approximated by a pair of lower and
upper approximations. To measure numerically the roughness of an approximation,
Pawlak introduced a quantitative measure of roughness by using the ratio of the
cardinalities of the lower and upper approximations. Although the roughness
measure is effective, it has the drawback of not being strictly monotonic with
respect to the standard ordering on partitions. Recently, some improvements
have been made by taking into account the granularity of partitions. In this
paper, we approach the roughness measure in an axiomatic way. After
axiomatically defining roughness measure and partition measure, we provide a
unified construction of roughness measure, called strong Pawlak roughness
measure, and then explore the properties of this measure. We show that the
improved roughness measures in the literature are special instances of our
strong Pawlak roughness measure and introduce three more strong Pawlak
roughness measures as well. The advantage of our axiomatic approach is that
some properties of a roughness measure follow immediately as soon as the
measure satisfies the relevant axiomatic definition.
| [
{
"version": "v1",
"created": "Sat, 28 Nov 2009 11:07:59 GMT"
},
{
"version": "v2",
"created": "Tue, 25 May 2010 12:04:06 GMT"
}
] | 1,426,204,800,000 | [
[
"Zhu",
"Ping",
""
]
] |
0912.0132 | Fadi Badra | Fadi Badra (INRIA Lorraine - LORIA), Am\'elie Cordier (LIRIS), Jean
Lieber (INRIA Lorraine - LORIA) | Opportunistic Adaptation Knowledge Discovery | null | 8th International Conference on Case-Based Reasoning, ICCBR 2009,
Seattle : United States (2009) | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | Adaptation has long been considered as the Achilles' heel of case-based
reasoning since it requires some domain-specific knowledge that is difficult to
acquire. In this paper, two strategies are combined in order to reduce the
knowledge engineering cost induced by the adaptation knowledge (CA) acquisition
task: CA is learned from the case base by the means of knowledge discovery
techniques, and the CA acquisition sessions are opportunistically triggered,
i.e., at problem-solving time.
| [
{
"version": "v1",
"created": "Tue, 1 Dec 2009 12:08:47 GMT"
}
] | 1,259,712,000,000 | [
[
"Badra",
"Fadi",
"",
"INRIA Lorraine - LORIA"
],
[
"Cordier",
"Amélie",
"",
"LIRIS"
],
[
"Lieber",
"Jean",
"",
"INRIA Lorraine - LORIA"
]
] |
0912.3228 | Vadim Bulitko | Valeriy K. Bulitko and Vadim Bulitko | On Backtracking in Real-time Heuristic Search | null | null | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | Real-time heuristic search algorithms are suitable for situated agents that
need to make their decisions in constant time. Since the original work by Korf
nearly two decades ago, numerous extensions have been suggested. One of the
most intriguing extensions is the idea of backtracking wherein the agent
decides to return to a previously visited state as opposed to moving forward
greedily. This idea has been empirically shown to have a significant impact on
various performance measures. The studies have been carried out in particular
empirical testbeds with specific real-time search algorithms that use
backtracking. Consequently, the extent to which the trends observed are
characteristic of backtracking in general is unclear. In this paper, we present
the first entirely theoretical study of backtracking in real-time heuristic
search. In particular, we present upper bounds on the solution cost exponential
and linear in a parameter regulating the amount of backtracking. The results
hold for a wide class of real-time heuristic search algorithms that includes
many existing algorithms as a small subclass.
| [
{
"version": "v1",
"created": "Wed, 16 Dec 2009 18:59:29 GMT"
}
] | 1,261,008,000,000 | [
[
"Bulitko",
"Valeriy K.",
""
],
[
"Bulitko",
"Vadim",
""
]
] |
0912.3309 | Afshin Rostamizadeh | Corinna Cortes, Mehryar Mohri and Afshin Rostamizadeh | New Generalization Bounds for Learning Kernels | null | null | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | This paper presents several novel generalization bounds for the problem of
learning kernels based on the analysis of the Rademacher complexity of the
corresponding hypothesis sets. Our bound for learning kernels with a convex
combination of p base kernels has only a log(p) dependency on the number of
kernels, p, which is considerably more favorable than the previous best bound
given for the same problem. We also give a novel bound for learning with a
linear combination of p base kernels with an L_2 regularization whose
dependency on p is only in p^{1/4}.
| [
{
"version": "v1",
"created": "Thu, 17 Dec 2009 02:29:41 GMT"
}
] | 1,261,094,400,000 | [
[
"Cortes",
"Corinna",
""
],
[
"Mohri",
"Mehryar",
""
],
[
"Rostamizadeh",
"Afshin",
""
]
] |
0912.4584 | Brijnesh Jain | Brijnesh Jain and Klaus Obermayer | A Necessary and Sufficient Condition for Graph Matching Being Equivalent
to the Maximum Weight Clique Problem | 19 pages | null | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | This paper formulates a necessary and sufficient condition for a generic
graph matching problem to be equivalent to the maximum vertex and edge weight
clique problem in a derived association graph. The consequences of this result
are threefold: first, the condition is general enough to cover a broad range of
practical graph matching problems; second, a proof to establish equivalence
between graph matching and clique search reduces to showing that a given graph
matching problem satisfies the proposed condition; and third, the result sets
the scene for generic continuous solutions for a broad range of graph matching
problems. To illustrate the mathematical framework, we apply it to a number of
graph matching problems, including the problem of determining the graph edit
distance.
| [
{
"version": "v1",
"created": "Wed, 23 Dec 2009 08:40:51 GMT"
}
] | 1,261,612,800,000 | [
[
"Jain",
"Brijnesh",
""
],
[
"Obermayer",
"Klaus",
""
]
] |
0912.4598 | Brijnesh Jain | Brijnesh J. Jain and Klaus Obermayer | Elkan's k-Means for Graphs | 21 pages; submitted to MLJ | null | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | This paper extends k-means algorithms from the Euclidean domain to the domain
of graphs. To recompute the centroids, we apply subgradient methods for solving
the optimization-based formulation of the sample mean of graphs. To accelerate
the k-means algorithm for graphs without trading computational time against
solution quality, we avoid unnecessary graph distance calculations by
exploiting the triangle inequality of the underlying distance metric following
Elkan's k-means algorithm proposed in \cite{Elkan03}. In experiments we show
that the accelerated k-means algorithm is faster than the standard k-means
algorithm for graphs provided there is a cluster structure in the data.
| [
{
"version": "v1",
"created": "Wed, 23 Dec 2009 10:30:11 GMT"
}
] | 1,261,612,800,000 | [
[
"Jain",
"Brijnesh J.",
""
],
[
"Obermayer",
"Klaus",
""
]
] |
0912.4879 | Alain Bonardi | Alain Bonardi (STMS), Francis Rousseaux (STMS, CRESTIC) | Similarit\'e en intension vs en extension : \`a la crois\'ee de
l'informatique et du th\'e\^atre | null | Revue d'Intelligence Artificielle (2005) 281-288 | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | Traditional staging is based on a formal approach to similarity leaning on
dramaturgical ontologies and instantiation variations. Inspired by interactive
data mining, which suggests different approaches, we give an overview of
computer science and theater research using computers as partners of the
actor to escape the a priori specification of roles.
| [
{
"version": "v1",
"created": "Thu, 24 Dec 2009 15:28:15 GMT"
}
] | 1,261,699,200,000 | [
[
"Bonardi",
"Alain",
"",
"STMS"
],
[
"Rousseaux",
"Francis",
"",
"STMS, CRESTIC"
]
] |
0912.5511 | Hans Tompits | James Delgrande, Torsten Schaub, Hans Tompits and Stefan Woltran | A general approach to belief change in answer set programming | 44 pages | null | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | We address the problem of belief change in (nonmonotonic) logic programming
under answer set semantics. Unlike previous approaches to belief change in
logic programming, our formal techniques are analogous to those of
distance-based belief revision in propositional logic. In developing our
results, we build upon the model theory of logic programs furnished by SE
models. Since SE models provide a formal, monotonic characterisation of logic
programs, we can adapt techniques from the area of belief revision to belief
change in logic programs. We introduce methods for revising and merging logic
programs, respectively. For the former, we study both subset-based revision as
well as cardinality-based revision, and we show that they satisfy the majority
of the AGM postulates for revision. For merging, we consider operators
following arbitration merging and IC merging, respectively. We also present
encodings for computing the revision as well as the merging of logic programs
within the same logic programming framework, giving rise to a direct
implementation of our approach in terms of off-the-shelf answer set solvers.
These encodings reflect in turn the fact that our change operators do not
increase the complexity of the base formalism.
| [
{
"version": "v1",
"created": "Wed, 30 Dec 2009 18:33:43 GMT"
}
] | 1,262,217,600,000 | [
[
"Delgrande",
"James",
""
],
[
"Schaub",
"Torsten",
""
],
[
"Tompits",
"Hans",
""
],
[
"Woltran",
"Stefan",
""
]
] |
0912.5533 | Reinhard Moratz | Reinhard Moratz, Dominik L\"ucke, Till Mossakowski | Oriented Straight Line Segment Algebra: Qualitative Spatial Reasoning
about Oriented Objects | null | null | null | null | cs.AI | http://creativecommons.org/licenses/publicdomain/ | Nearly 15 years ago, a set of qualitative spatial relations between oriented
straight line segments (dipoles) was suggested by Schlieder. This work received
substantial interest amongst the qualitative spatial reasoning community.
However, it turned out to be difficult to establish a sound constraint calculus
based on these relations. In this paper, we present the results of a new
investigation into dipole constraint calculi which uses algebraic methods to
derive sound results on the composition of relations and other properties of
dipole calculi. Our results are based on a condensed semantics of the dipole
relations.
In contrast to the points that are normally used, dipoles are extended and
have an intrinsic direction. Both features are important properties of natural
objects. This allows for a straightforward representation of prototypical
reasoning tasks for spatial agents. As an example, we show how to generate
survey knowledge from local observations in a street network. The example
illustrates the fast constraint-based reasoning capabilities of the dipole
calculus. We integrate our results into two reasoning tools which are publicly
available.
| [
{
"version": "v1",
"created": "Wed, 30 Dec 2009 20:38:12 GMT"
}
] | 1,262,217,600,000 | [
[
"Moratz",
"Reinhard",
""
],
[
"Lücke",
"Dominik",
""
],
[
"Mossakowski",
"Till",
""
]
] |
1001.0063 | Enrico Nardelli | Alessandro Epasto and Enrico Nardelli | On a Model for Integrated Information | null | null | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | In this paper we give a thorough presentation of a model proposed by Tononi
et al. for modeling \emph{integrated information}, i.e. how much information is
generated in a system transitioning from one state to the next one by the
causal interaction of its parts and \emph{above and beyond} the information
given by the sum of its parts. We also provide a more general formulation of
such a model, independent of the time chosen for the analysis and of the
uniformity of the probability distribution at the initial time instant.
Finally, we prove that integrated information is null for disconnected systems.
| [
{
"version": "v1",
"created": "Thu, 31 Dec 2009 01:44:12 GMT"
}
] | 1,262,649,600,000 | [
[
"Epasto",
"Alessandro",
""
],
[
"Nardelli",
"Enrico",
""
]
] |
1001.0921 | Brijnesh Jain | Brijnesh J. Jain and Klaus Obermayer | Graph Quantization | 24 pages; submitted to CVIU | null | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | Vector quantization (VQ) is a lossy data compression technique from signal
processing, which is restricted to feature vectors and therefore inapplicable
for combinatorial structures. This contribution presents a theoretical
foundation of graph quantization (GQ) that extends VQ to the domain of
attributed graphs. We present the necessary Lloyd-Max conditions for optimality
of a graph quantizer and consistency results for optimal GQ design based on
empirical distortion measures and stochastic optimization. These results
statistically justify existing clustering algorithms in the domain of graphs.
The proposed approach provides a template of how to link structural pattern
recognition methods other than GQ to statistical pattern recognition.
| [
{
"version": "v1",
"created": "Wed, 6 Jan 2010 15:46:03 GMT"
}
] | 1,262,822,400,000 | [
[
"Jain",
"Brijnesh J.",
""
],
[
"Obermayer",
"Klaus",
""
]
] |
1001.1257 | Franco Bagnoli | Graziano Barnabei, Franco Bagnoli, Ciro Conversano, Elena Lensi | Decisional Processes with Boolean Neural Network: the Emergence of
Mental Schemes | 11 pages, 7 figures | null | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | Human decisional processes result from the employment of selected quantities
of relevant information, generally synthesized from environmental incoming data
and stored memories. Their main goal is the production of an appropriate and
adaptive response to a cognitive or behavioral task. Different strategies of
response production can be adopted, among which are haphazard trials, the
formation of mental schemes, and heuristics. In this paper, we propose a
Boolean neural network model that incorporates these strategies by resorting to
global optimization strategies during the learning session. The model also
characterizes the passage from an unstructured/chaotic attractor neural network typical
of data-driven processes to a faster one, forward-only and representative of
schema-driven processes. Moreover, a simplified version of the Iowa Gambling
Task (IGT) is introduced in order to test the model. Our results match
experimental data and point out some relevant knowledge coming from the
psychological domain.
| [
{
"version": "v1",
"created": "Fri, 8 Jan 2010 12:34:04 GMT"
}
] | 1,473,292,800,000 | [
[
"Barnabei",
"Graziano",
""
],
[
"Bagnoli",
"Franco",
""
],
[
"Conversano",
"Ciro",
""
],
[
"Lensi",
"Elena",
""
]
] |
1001.1836 | Rdv Ijcsis | Mofreh Hogo, Khaled Fouad, Fouad Mousa | Web-Based Expert System for Civil Service Regulations: RCSES | 10 pages IEEE format, International Journal of Computer Science and
Information Security, IJCSIS December 2009, ISSN 1947 5500,
http://sites.google.com/site/ijcsis/ | International Journal of Computer Science and Information
Security, IJCSIS, Vol. 6, No. 3, pp. 007-016, December 2009, USA | null | ISSN 1947 5500 | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | Internet and expert systems have offered new ways of sharing and distributing
knowledge, but there is a lack of research in the area of web-based expert
systems. This paper introduces the development of a web-based expert system for
the regulations of civil service in the Kingdom of Saudi Arabia, named RCSES.
It is the first system of its kind (an application of civil service
regulations), and the first developed using a web-based approach. The
proposed system considers 17 regulations of the civil service system. The
different phases of developing the RCSES system are presented, as knowledge
acquiring and selection, ontology and knowledge representations using XML
format. XML Rule-based knowledge sources and the inference mechanisms were
implemented using ASP.net technique. An interactive tool for entering the
ontology and knowledge base, and the inferencing was built. It gives the
ability to use, modify, update, and extend the existing knowledge base in an
easy way. The knowledge was validated by experts in the domain of civil service
regulations, and the proposed RCSES was tested, verified, and validated by
different technical users and the development staff. The RCSES system is
compared with other related web-based expert systems; the comparison
demonstrated its quality, usability, and high performance.
| [
{
"version": "v1",
"created": "Tue, 12 Jan 2010 10:07:44 GMT"
}
] | 1,263,340,800,000 | [
[
"Hogo",
"Mofreh",
""
],
[
"Fouad",
"Khaled",
""
],
[
"Mousa",
"Fouad",
""
]
] |
1001.2277 | Rdv Ijcsis | I. Elamvazuthi, T. Ganesan, P. Vasant, J. F. Webb | Application of a Fuzzy Programming Technique to Production Planning in
the Textile Industry | 6 pages IEEE format, International Journal of Computer Science and
Information Security, IJCSIS December 2009, ISSN 1947 5500,
http://sites.google.com/site/ijcsis/ | International Journal of Computer Science and Information
Security, IJCSIS, Vol. 6, No. 3, pp. 238-243, December 2009, USA | null | Volume 6, No. 3, ISSN 1947 5500 | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | Many engineering optimization problems can be considered as linear
programming problems where all or some of the parameters involved are
linguistic in nature. These can only be quantified using fuzzy sets. The aim of
this paper is to solve a fuzzy linear programming problem in which the
parameters involved are fuzzy quantities with logistic membership functions. To
explore the applicability of the method a numerical example is considered to
determine the monthly production planning quotas and profit of a home textile
group.
| [
{
"version": "v1",
"created": "Wed, 13 Jan 2010 19:22:26 GMT"
}
] | 1,263,427,200,000 | [
[
"Elamvazuthi",
"I.",
""
],
[
"Ganesan",
"T.",
""
],
[
"Vasant",
"P.",
""
],
[
"Webb",
"J. F.",
""
]
] |
1001.2279 | Rdv Ijcsis | I. Elamvazuthi, P. Vasant, J. F. Webb | The Application of Mamdani Fuzzy Model for Auto Zoom Function of a
Digital Camera | 6 pages IEEE format, International Journal of Computer Science and
Information Security, IJCSIS December 2009, ISSN 1947 5500,
http://sites.google.com/site/ijcsis/ | International Journal of Computer Science and Information
Security, IJCSIS, Vol. 6, No. 3, pp. 244-249, December 2009, USA | null | Volume 6, No. 3, ISSN 1947 5500 | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | Mamdani Fuzzy Model is an important technique in Computational Intelligence
(CI) study. This paper presents an implementation of a supervised learning
method based on membership function training in the context of Mamdani fuzzy
models. Specifically, auto zoom function of a digital camera is modelled using
Mamdani technique. The performance of the control method is verified through a
series of simulations, and numerical results are provided as illustrations.
| [
{
"version": "v1",
"created": "Wed, 13 Jan 2010 19:25:09 GMT"
}
] | 1,263,427,200,000 | [
[
"Elamvazuthi",
"I.",
""
],
[
"Vasant",
"P.",
""
],
[
"Webb",
"J. F.",
""
]
] |
1002.0102 | Florentin Smarandache | Florentin Smarandache | $\alpha$-Discounting Multi-Criteria Decision Making ($\alpha$-D MCDM) | 62 pages | Proceedings of Fusion 2010 International Conference, Edinburgh,
Scotland, 26-29 July, 2010 | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | In this book we introduce a new procedure called \alpha-Discounting Method
for Multi-Criteria Decision Making (\alpha-D MCDM), which is an alternative
to and extension of Saaty's Analytical Hierarchy Process (AHP). It works for any
number of preferences that can be transformed into a system of homogeneous
linear equations. A degree of consistency (and implicitly a degree of
inconsistency) of a decision-making problem is defined. \alpha-D MCDM is
afterwards generalized to a set of preferences that can be transformed into a
system of linear and or non-linear homogeneous and or non-homogeneous equations
and or inequalities. The general idea of \alpha-D MCDM is to assign non-null
positive parameters \alpha_1, \alpha_2, and so on \alpha_p to the coefficients
in the right-hand side of each preference that diminish or increase them in
order to transform the above linear homogeneous system of equations which has
only the null-solution, into a system having a particular non-null solution.
After finding the general solution of this system, the principles used to
assign particular values to all parameters \alpha form the second important part
of \alpha-D, yet to be investigated more deeply in the future. In the current book
we propose the Fairness Principle, i.e. each coefficient should be discounted
with the same percentage (we think this is fair: not making any favoritism or
unfairness to any coefficient), but the reader can propose other principles.
For consistent decision-making problems with pairwise comparisons,
\alpha-Discounting Method together with the Fairness Principle give the same
result as AHP. But for weak inconsistent decision-making problems,
\alpha-Discounting together with the Fairness Principle give a different result
from AHP. Many consistent, weak inconsistent, and strong inconsistent examples
are given in this book.
| [
{
"version": "v1",
"created": "Sun, 31 Jan 2010 02:38:07 GMT"
},
{
"version": "v2",
"created": "Wed, 3 Feb 2010 03:44:43 GMT"
},
{
"version": "v3",
"created": "Wed, 10 Feb 2010 18:58:01 GMT"
},
{
"version": "v4",
"created": "Fri, 2 Dec 2011 20:08:40 GMT"
},
{
"version": "v5",
"created": "Fri, 2 Oct 2015 19:09:38 GMT"
}
] | 1,444,003,200,000 | [
[
"Smarandache",
"Florentin",
""
]
] |
1002.0136 | Lars Kotthoff | Lars Kotthoff | Dominion -- A constraint solver generator | null | null | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | This paper proposes a design for a system to generate constraint solvers that
are specialised for specific problem models. It describes the design in detail
and gives preliminary experimental results showing the feasibility and
effectiveness of the approach.
| [
{
"version": "v1",
"created": "Sun, 31 Jan 2010 15:46:56 GMT"
}
] | 1,265,068,800,000 | [
[
"Kotthoff",
"Lars",
""
]
] |
1002.0177 | Chinmayananda Padhy Mr | C.N. Padhy, R.R. Panda | Logical Evaluation of Consciousness: For Incorporating Consciousness
into Machine Architecture | null | null | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | Machine Consciousness is the study of consciousness from a biological,
philosophical, mathematical, and physical perspective, and the design of a model
that can fit into a programmable system architecture. The prime objective of the study
is to make the system architecture behave consciously like a biological model
does. The present work has developed a feasible definition of consciousness that
characterizes consciousness with four parameters, i.e., parasitic, symbiotic,
self-referral, and reproduction. The present work has also developed a biologically
inspired consciousness architecture that has following layers: quantum layer,
cellular layer, organ layer and behavioral layer and traced the characteristics
of consciousness at each layer. Finally, the work has estimated physical and
algorithmic architecture to devise a system that can behave consciously.
| [
{
"version": "v1",
"created": "Mon, 1 Feb 2010 04:07:34 GMT"
}
] | 1,265,068,800,000 | [
[
"Padhy",
"C. N.",
""
],
[
"Panda",
"R. R.",
""
]
] |
1002.0449 | Ping Zhu | Ping Zhu and Qiaoyan Wen | Some improved results on communication between information systems | 12 pages | Information Sciences, 180(18): 3521-3531, 2010 | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | To study the communication between information systems, Wang et al. [C. Wang,
C. Wu, D. Chen, Q. Hu, and C. Wu, Communicating between information systems,
Information Sciences 178 (2008) 3228-3239] proposed two concepts of type-1 and
type-2 consistent functions. Some properties of such functions and induced
relation mappings have been investigated there. In this paper, we provide an
improvement of the aforementioned work by disclosing the symmetric relationship
between type-1 and type-2 consistent functions. We present more properties of
consistent functions and induced relation mappings and improve upon several
deficient assertions in the original work. In particular, we unify and extend
type-1 and type-2 consistent functions into the so-called
neighborhood-consistent functions. This provides a convenient means for
studying the communication between information systems based on various
neighborhoods.
| [
{
"version": "v1",
"created": "Tue, 2 Feb 2010 10:54:30 GMT"
},
{
"version": "v2",
"created": "Thu, 8 Jul 2010 09:34:19 GMT"
}
] | 1,426,204,800,000 | [
[
"Zhu",
"Ping",
""
],
[
"Wen",
"Qiaoyan",
""
]
] |
1002.0908 | Ping Zhu | Ping Zhu and Qiaoyan Wen | Homomorphisms between fuzzy information systems revisited | 10 pages | Applied Mathematics Letters, 24(9): 1548-1553, 2011 | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | Recently, Wang et al. discussed the properties of fuzzy information systems
under homomorphisms in the paper [C. Wang, D. Chen, L. Zhu, Homomorphisms
between fuzzy information systems, Applied Mathematics Letters 22 (2009)
1045-1050], where homomorphisms are based upon the concepts of consistent
functions and fuzzy relation mappings. In this paper, we classify consistent
functions as predecessor-consistent and successor-consistent, and then proceed
to present more properties of consistent functions. In addition, we improve
some characterizations of fuzzy relation mappings provided by Wang et al.
| [
{
"version": "v1",
"created": "Thu, 4 Feb 2010 07:09:46 GMT"
}
] | 1,320,710,400,000 | [
[
"Zhu",
"Ping",
""
],
[
"Wen",
"Qiaoyan",
""
]
] |
1002.1157 | Vishal Goyal | K. Soorya Prakash, S. S. Mohamed Nazirudeen, M. Joseph Malvin Raj | Establishment of Relationships between Material Design and Product
Design Domains by Hybrid FEM-ANN Technique | International Journal of Computer Science Issues, IJCSI, Vol. 7,
Issue 1, No. 1, January 2010,
http://ijcsi.org/articles/Establishment-of-Relationships-between-Material-Design-and-Product-Design-Domains-by-Hybrid-FEM-ANN-Technique.php | International Journal of Computer Science Issues, IJCSI, Vol. 7,
Issue 1, No. 1, January 2010,
http://ijcsi.org/articles/Establishment-of-Relationships-between-Material-Design-and-Product-Design-Domains-by-Hybrid-FEM-ANN-Technique.php | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | In this paper, research on AI based modeling technique to optimize
the development of new alloys with required improvements in properties and
chemical composition over existing alloys, as per the functional requirements of
the product. The current research work applies AI in a novel way to predictions
establishing associations between the material and product domains. Advanced
computational simulation techniques such as CFD and FEA are used to validate
product dynamics against experimental investigations. Accordingly, the current
research is focused on establishing
relationships between material design and product design domains. The input to
feed forward back propagation prediction network model constitutes of material
design features. Parameters relevant to product design strategies are furnished
as target outputs. The outcomes of the ANN show a good sign of correlation
between the material and product design domains. The study opens a new path for
illustrating material factors at the time of new product development.
| [
{
"version": "v1",
"created": "Fri, 5 Feb 2010 09:02:54 GMT"
}
] | 1,265,587,200,000 | [
[
"Prakash",
"K. Soorya",
""
],
[
"Nazirudeen",
"S. S. Mohamed",
""
],
[
"Raj",
"M. Joseph Malvin",
""
]
] |
1002.2202 | Rdv Ijcsis | Ramesh Kumar Gopala Pillai, Dr. Ramakanth Kumar .P | Modeling of Human Criminal Behavior using Probabilistic Networks | IEEE format, International Journal of Computer Science and
Information Security, IJCSIS January 2010, ISSN 1947 5500,
http://sites.google.com/site/ijcsis/ | International Journal of Computer Science and Information
Security, IJCSIS, Vol. 7, No. 1, pp. 216-219, January 2010, USA | null | Journal of Computer Science, ISSN 19475500 | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | Currently, a criminal profile (CP) is obtained from investigators' or forensic
psychologists' interpretation, linking crime scene characteristics and an
offender's behavior to his or her characteristics and psychological profile.
This paper seeks an efficient and systematic discovery of nonobvious and
valuable patterns between variables from a large database of solved cases via a
probabilistic network (PN) modeling approach. The PN structure can be used to
extract behavioral patterns and to gain insight into what factors influence
these behaviors. Thus, when a new case is being investigated and the profile
variables are unknown because the offender has yet to be identified, the
observed crime scene variables are used to infer the unknown variables based on
their connections in the structure and the corresponding numerical
(probabilistic) weights. The objective is to produce a more systematic and
empirical approach to profiling, and to use the resulting PN model as a
decision tool.
| [
{
"version": "v1",
"created": "Wed, 10 Feb 2010 20:21:52 GMT"
}
] | 1,265,846,400,000 | [
[
"Pillai",
"Ramesh Kumar Gopala",
""
],
[
"P",
"Dr. Ramakanth Kumar .",
""
]
] |
1002.2897 | Raphael Chenouard | Raphael Chenouard (LINA), Laurent Granvilliers (LINA), Ricardo Soto
(LINA) | Model-Driven Constraint Programming | null | International Conference on Principles and Practice of Declarative
Programming, Valence : Spain (2008) | 10.1145/1389449.1389479 | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | Constraint programming can definitely be seen as a model-driven paradigm. The
users write programs for modeling problems. These programs are mapped to
executable models to calculate the solutions. This paper focuses on efficient
model management (definition and transformation). From this point of view, we
propose to revisit the design of constraint-programming systems. A model-driven
architecture is introduced to map solving-independent constraint models to
solving-dependent decision models. Several important questions are examined,
such as the need for a visual high-level modeling language, and the quality of
metamodeling techniques to implement the transformations. A main result is the
s-COMMA platform, which efficiently implements the chain from modeling to
solving constraint problems.
| [
{
"version": "v1",
"created": "Mon, 15 Feb 2010 15:47:29 GMT"
}
] | 1,266,278,400,000 | [
[
"Chenouard",
"Raphael",
"",
"LINA"
],
[
"Granvilliers",
"Laurent",
"",
"LINA"
],
[
"Soto",
"Ricardo",
"",
"LINA"
]
] |
1002.3023 | Raphael Chenouard | Raphael Chenouard (LINA), Laurent Granvilliers (LINA), Ricardo Soto
(LINA) | Rewriting Constraint Models with Metamodels | null | The eight symposium on abstraction, reformulation, and
approximation, Lake Arrowhead : United States (2009) | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | An important challenge in constraint programming is to rewrite constraint
models into executable programs calculating the solutions. This phase of
constraint processing may require translations between constraint programming
languages, transformations of constraint representations, model
optimizations, and tuning of solving strategies. In this paper, we introduce a
pivot metamodel describing the common features of constraint models including
different kinds of constraints, statements like conditionals and loops, and
other first-class elements like object classes and predicates. This metamodel
is general enough to cope with the constructions of many languages, from
object-oriented modeling languages to logic languages, but it is independent
from them. The rewriting operations manipulate metamodel instances apart from
languages. As a consequence, the rewriting operations apply whatever languages
are selected and they are able to manage model semantic information. A bridge
is created between the metamodel space and languages using parsing techniques.
Tools from the software engineering world can be useful to implement this
framework.
| [
{
"version": "v1",
"created": "Tue, 16 Feb 2010 07:26:48 GMT"
}
] | 1,266,364,800,000 | [
[
"Chenouard",
"Raphael",
"",
"LINA"
],
[
"Granvilliers",
"Laurent",
"",
"LINA"
],
[
"Soto",
"Ricardo",
"",
"LINA"
]
] |
1002.3078 | Raphael Chenouard | Raphael Chenouard (LINA), Laurent Granvilliers (LINA), Ricardo Soto
(LINA) | Using ATL to define advanced and flexible constraint model
transformations | null | MtATL2009, Nantes : France (2009) | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | Transforming constraint models is an important task in recent constraint
programming systems. User-understandable models are defined during the modeling
phase, but rewriting or tuning them is mandatory to get solving-efficient
models. We propose a new architecture allowing one to define bridges between any
(modeling or solver) languages and to implement model optimizations. This
architecture follows a model-driven approach where the constraint modeling
process is seen as a set of model transformations. Among others, an interesting
feature is the definition of transformations as concept-oriented rules, i.e.
based on types of model elements where the types are organized into a hierarchy
called a metamodel.
| [
{
"version": "v1",
"created": "Tue, 16 Feb 2010 13:09:07 GMT"
}
] | 1,266,364,800,000 | [
[
"Chenouard",
"Raphael",
"",
"LINA"
],
[
"Granvilliers",
"Laurent",
"",
"LINA"
],
[
"Soto",
"Ricardo",
"",
"LINA"
]
] |
1002.4522 | Vitaly Schetinin | L. Jakaite, V. Schetinin, and C. Maple | Feature Importance in Bayesian Assessment of Newborn Brain Maturity from
EEG | Proceedings of the 9th WSEAS International Conference on Artificial
Intelligence, Knowledge Engineering and Data Bases (AIKED), University of
Cambridge, UK, 2010, edited by L. A. Zadeh et al, pp 191 - 195 | Proceedings of the 9th WSEAS International Conference on
Artificial Intelligence, Knowledge Engineering and Data Bases (AIKED),
University of Cambridge, UK, 2010, edited by L. A. Zadeh et al, pp 191 - 195 | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | The methodology of Bayesian Model Averaging (BMA) is applied for assessment
of newborn brain maturity from sleep EEG. In theory this methodology provides
the most accurate assessments of uncertainty in decisions. However, the
existing BMA techniques have been shown providing biased assessments in the
absence of some prior information enabling to explore model parameter space in
details within a reasonable time. The lack in details leads to disproportional
sampling from the posterior distribution. In case of the EEG assessment of
brain maturity, BMA results can be biased because of the absence of information
about EEG feature importance. In this paper we explore how the posterior
information about EEG features can be used in order to reduce a negative impact
of disproportional sampling on BMA performance. We use EEG data recorded from
sleeping newborns to test the efficiency of the proposed BMA technique.
| [
{
"version": "v1",
"created": "Wed, 24 Feb 2010 11:11:52 GMT"
}
] | 1,267,056,000,000 | [
[
"Jakaite",
"L.",
""
],
[
"Schetinin",
"V.",
""
],
[
"Maple",
"C.",
""
]
] |
1003.0590 | Eduard Babkin | Sami Al-Maqtari (LITIS), Habib Abdulrab (LITIS), Eduard Babkin (LITIS) | A new model for solution of complex distributed constrained problems | null | Computer Systems and Applications, ACS/IEEE International
Conference on 0 (2009) 660-667 | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | In this paper we describe an original computational model for solving
different types of Distributed Constraint Satisfaction Problems (DCSP). The
proposed model is called Controller-Agents for Constraints Solving (CACS). This
model is intended to be used in DCSP solving, a field that emerged from the
integration of two paradigms of different nature: Multi-Agent Systems (MAS) and
the Constraint Satisfaction Problem paradigm (CSP), in which all constraints are
treated in a central manner as a black box. This model allows grouping
constraints to form a subset that will be treated together as a local problem
inside the controller. This model also allows handling non-binary constraints
easily and directly, so that no translation of constraints into binary ones is
needed. This paper presents the implementation outline of a prototype DCSP
solver, its usage methodology, and an overview of the CACS
application for timetabling problems.
| [
{
"version": "v1",
"created": "Tue, 2 Mar 2010 13:40:43 GMT"
}
] | 1,267,574,400,000 | [
[
"Al-Maqtari",
"Sami",
"",
"LITIS"
],
[
"Abdulrab",
"Habib",
"",
"LITIS"
],
[
"Babkin",
"Eduard",
"",
"LITIS"
]
] |
1003.0746 | Raphael Chenouard | Raphael Chenouard (LINA), Fr\'ed\'eric Jouault (INRIA - EMN) | Automatically Discovering Hidden Transformation Chaining Constraints | null | ACM/IEEE 12th International Conference on Model Driven Engineering
Languages and Systems, Denver : United States (2009) | 10.1007/978-3-642-04425-0_8 | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | Model transformations operate on models conforming to precisely defined
metamodels. Consequently, it often seems relatively easy to chain them: the
output of a transformation may be given as input to a second one if metamodels
match. However, this simple rule has some obvious limitations. For instance, a
transformation may only use a subset of a metamodel. Therefore, chaining
transformations appropriately requires more information. We present here an
approach that automatically discovers more detailed information about actual
chaining constraints by statically analyzing transformations. The objective is
to provide developers who decide to chain transformations with more data on
which to base their choices. This approach has been successfully applied to the
case of a library of endogenous transformations. They all have the same source
and target metamodel but have some hidden chaining constraints. In such a case,
the simple metamodel matching rule given above does not provide any useful
information.
| [
{
"version": "v1",
"created": "Wed, 3 Mar 2010 08:04:45 GMT"
}
] | 1,267,660,800,000 | [
[
"Chenouard",
"Raphael",
"",
"LINA"
],
[
"Jouault",
"Frédéric",
"",
"INRIA - EMN"
]
] |
1003.1493 | Rdv Ijcsis | Mariana Maceiras Cabrera, Ernesto Ocampo Edye | Integration of Rule Based Expert Systems and Case Based Reasoning in an
Acute Bacterial Meningitis Clinical Decision Support System | Pages IEEE format, International Journal of Computer Science and
Information Security, IJCSIS, Vol. 7 No. 2, February 2010, USA. ISSN 1947
5500, http://sites.google.com/site/ijcsis/ | null | null | null | cs.AI | http://creativecommons.org/licenses/by-nc-sa/3.0/ | This article presents the results of the research carried out on the
development of a medical diagnostic system applied to Acute Bacterial
Meningitis, using the Case Based Reasoning methodology. The research was
focused on the implementation of the adaptation stage, through the integration of
Case Based Reasoning and Rule Based Expert Systems. In this adaptation stage we
use a higher-level CBR component that stores and allows reusing change experiences,
combined with a classic rule-based inference engine. In order to take into
account the most evident clinical situation, a pre-diagnosis stage is
implemented using a rule engine that, given an evident situation, emits the
corresponding diagnosis and avoids the complete process.
| [
{
"version": "v1",
"created": "Sun, 7 Mar 2010 17:09:49 GMT"
}
] | 1,272,412,800,000 | [
[
"Cabrera",
"Mariana Maceiras",
""
],
[
"Edye",
"Ernesto Ocampo",
""
]
] |
1003.1504 | Rdv Ijcsis | Saba Bashir, Farhan Hassan Khan, M.Younus Javed, Aihab Khan, Malik
Sikandar Hayat Khiyal | Indexer Based Dynamic Web Services Discovery | Pages IEEE format, International Journal of Computer Science and
Information Security, IJCSIS, Vol. 7 No. 2, February 2010, USA. ISSN 1947
5500, http://sites.google.com/site/ijcsis/ | null | null | null | cs.AI | http://creativecommons.org/licenses/by-nc-sa/3.0/ | Recent advancement in web services plays an important role in business to
business and business to consumer interaction. Discovery mechanism is not only
used to find a suitable service but also provides collaboration between service
providers and consumers by using standard protocols. A static web service
discovery mechanism is not only time consuming but requires continuous human
interaction. This paper proposes an efficient dynamic web services discovery
mechanism that can locate relevant and updated web services from service
registries and repositories with timestamp based on indexing value and
categorization for faster and efficient discovery of service. The proposed
prototype focuses on quality of service issues and introduces concept of local
cache, categorization of services, indexing mechanism, CSP (Constraint
Satisfaction Problem) solver, aging and usage of translator. Performance of
proposed framework is evaluated by implementing the algorithm and correctness
of our method is shown. The results of proposed framework shows greater
performance and accuracy in dynamic discovery mechanism of web services
resolving the existing issues of flexibility, scalability, based on quality of
service, and discovers updated and most relevant services with ease of usage.
| [
{
"version": "v1",
"created": "Sun, 7 Mar 2010 18:04:28 GMT"
}
] | 1,272,412,800,000 | [
[
"Bashir",
"Saba",
""
],
[
"Khan",
"Farhan Hassan",
""
],
[
"Javed",
"M. Younus",
""
],
[
"Khan",
"Aihab",
""
],
[
"Khiyal",
"Malik Sikandar Hayat",
""
]
] |
1003.1588 | Umberto Straccia | Fernando Bobillo and Felix Bou and Umberto Straccia | On the Failure of the Finite Model Property in some Fuzzy Description
Logics | null | null | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | Fuzzy Description Logics (DLs) are a family of logics which allow the
representation of (and the reasoning with) structured knowledge affected by
vagueness. Although most of the not very expressive crisp DLs, such as ALC,
enjoy the Finite Model Property (FMP), this is not the case once we move into
the fuzzy case. In this paper we show that if we allow arbitrary knowledge
bases, then the fuzzy DLs ALC under Lukasiewicz and Product fuzzy logics do not
verify the FMP even if we restrict to witnessed models; in other words, finite
satisfiability and witnessed satisfiability are different for arbitrary
knowledge bases. The aim of this paper is to point out the failure of FMP
because it affects several algorithms published in the literature for reasoning
under fuzzy ALC.
| [
{
"version": "v1",
"created": "Mon, 8 Mar 2010 10:18:12 GMT"
}
] | 1,268,092,800,000 | [
[
"Bobillo",
"Fernando",
""
],
[
"Bou",
"Felix",
""
],
[
"Straccia",
"Umberto",
""
]
] |
1003.1658 | Zolt\'an K\'asa | Agnes Achs | A multivalued knowledge-base model | null | Acta Univ. Sapientiae, Informatica, 2,1(2010) 51-79 | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | The basic aim of our study is to give a possible model for handling uncertain
information. This model is worked out in the framework of DATALOG. First, the
concept of fuzzy Datalog will be summarized, then its extensions to
intuitionistic- and interval-valued fuzzy logic are given and the concept of
bipolar fuzzy Datalog is introduced. Based on these ideas the concept of
multivalued knowledge-base will be defined as a quadruple of any background
knowledge; a deduction mechanism; a connecting algorithm, and a function set of
the program, which help us to determine the uncertainty levels of the results.
Finally, a possible evaluation strategy is given.
| [
{
"version": "v1",
"created": "Mon, 8 Mar 2010 16:00:28 GMT"
},
{
"version": "v2",
"created": "Tue, 9 Mar 2010 13:08:57 GMT"
},
{
"version": "v3",
"created": "Sun, 21 Mar 2010 19:27:10 GMT"
}
] | 1,270,684,800,000 | [
[
"Achs",
"Agnes",
""
]
] |
1003.2641 | Frederic Dambreville | Fr\'ed\'eric Dambreville | Release ZERO.0.1 of package RefereeToolbox | null | null | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | RefereeToolbox is a java package implementing combination operators for
fusing evidences. It is downloadable from:
http://refereefunction.fredericdambreville.com/releases RefereeToolbox is based
on an interpretation of the fusion rules by means of Referee Functions. This
approach implies a dissociation between the definition of the combination and
its actual implementation, which is common to all referee-based combinations.
As a result, RefereeToolbox is designed with the aim to be generic and
evolutive.
| [
{
"version": "v1",
"created": "Fri, 12 Mar 2010 21:25:10 GMT"
}
] | 1,268,697,600,000 | [
[
"Dambreville",
"Frédéric",
""
]
] |
1003.5173 | Charles Robert | Charles A. B. Robert (LORIA) | LEXSYS: Architecture and Implication for Intelligent Agent systems | null | null | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | LEXSYS, (Legume Expert System) was a project conceived at IITA (International
Institute of Tropical Agriculture) Ibadan Nigeria. It was initiated by the
COMBS (Collaborative Group on Maize-Based Systems Research in the 1990. It was
meant for a general framework for characterizing on-farm testing for technology
design for sustainable cereal-based cropping system. LEXSYS is not a true
expert system as the name would imply, but simply a user-friendly information
system. This work is an attempt to give a formal representation of the existing
system and then present areas where intelligent agent can be applied.
| [
{
"version": "v1",
"created": "Fri, 26 Mar 2010 16:01:52 GMT"
}
] | 1,269,820,800,000 | [
[
"Robert",
"Charles A. B.",
"",
"LORIA"
]
] |
1003.5305 | David Tolpin | David Tolpin and Solomon Eyal Shimony | Rational Value of Information Estimation for Measurement Selection | 7 pages, 2 figures, presented at URPDM2010; plots fixed | null | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | Computing value of information (VOI) is a crucial task in various aspects of
decision-making under uncertainty, such as in meta-reasoning for search; in
selecting measurements to make, prior to choosing a course of action; and in
managing the exploration vs. exploitation tradeoff. Since such applications
typically require numerous VOI computations during a single run, it is
essential that VOI be computed efficiently. We examine the issue of anytime
estimation of VOI, as frequently it suffices to get a crude estimate of the
VOI, thus saving considerable computational resources. As a case study, we
examine VOI estimation in the measurement selection problem. Empirical
evaluation of the proposed scheme in this domain shows that computational
resources can indeed be significantly reduced, at little cost in expected
rewards achieved in the overall decision problem.
| [
{
"version": "v1",
"created": "Sat, 27 Mar 2010 14:56:16 GMT"
},
{
"version": "v2",
"created": "Fri, 16 Apr 2010 08:52:06 GMT"
}
] | 1,426,204,800,000 | [
[
"Tolpin",
"David",
""
],
[
"Shimony",
"Solomon Eyal",
""
]
] |
1003.5899 | Agnieszka Patyk | Agnieszka Patyk | Geometric Algebra Model of Distributed Representations | 30 pages, 19 figures | null | 10.1007/978-1-84996-108-0_19 | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | Formalism based on GA is an alternative to distributed representation models
developed so far --- Smolensky's tensor product, Holographic Reduced
Representations (HRR) and Binary Spatter Code (BSC). Convolutions are replaced
by geometric products, interpretable in terms of geometry which seems to be the
most natural language for visualization of higher concepts. This paper recalls
the main ideas behind the GA model and investigates recognition test results
using both inner product and a clipped version of matrix representation. The
influence of accidental blade equality on recognition is also studied. Finally,
the efficiency of the GA model is compared to that of previously developed
models.
| [
{
"version": "v1",
"created": "Tue, 30 Mar 2010 19:03:43 GMT"
}
] | 1,431,907,200,000 | [
[
"Patyk",
"Agnieszka",
""
]
] |
1004.1540 | Jean Dezert | Florentin Smarandache (UNM), Jean Dezert (ONERA) | Importance of Sources using the Repeated Fusion Method and the
Proportional Conflict Redistribution Rules #5 and #6 | null | null | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | We present in this paper some examples of how to compute by hand the PCR5
fusion rule for three sources, so the reader will better understand its
mechanism. We also take into consideration the importance of sources, which is
different from the classical discounting of sources.
| [
{
"version": "v1",
"created": "Fri, 9 Apr 2010 12:38:17 GMT"
}
] | 1,272,758,400,000 | [
[
"Smarandache",
"Florentin",
"",
"UNM"
],
[
"Dezert",
"Jean",
"",
"ONERA"
]
] |
1004.1772 | Rdv Ijcsis | Uraiwan Inyaem, Choochart Haruechaiyasak, Phayung Meesad, Dat Tran | Terrorism Event Classification Using Fuzzy Inference Systems | IEEE Publication format, ISSN 1947 5500,
http://sites.google.com/site/ijcsis/ | IJCSIS, Vol. 7 No. 3, March 2010, 247-256 | null | null | cs.AI | http://creativecommons.org/licenses/by-nc-sa/3.0/ | Terrorism has led to many problems in Thai societies, not only property
damage but also civilian casualties. Predicting terrorism activities in advance
can help prepare and manage risk from sabotage by these activities. This paper
proposes a framework focusing on event classification in terrorism domain using
fuzzy inference systems (FISs). Each FIS is a decision-making model combining
fuzzy logic and approximate reasoning. It is generated in five main parts: the
input interface, the fuzzification interface, knowledge base unit, decision
making unit and output defuzzification interface. Adaptive neuro-fuzzy
inference system (ANFIS) is a FIS model adapted by combining the fuzzy logic
and neural network. The ANFIS utilizes automatic identification of fuzzy logic
rules and adjustment of membership function (MF). Moreover, neural network can
directly learn from data set to construct fuzzy logic rules and MF implemented
in various applications. FIS settings are evaluated based on two comparisons.
The first evaluation is the comparison between unstructured and structured
events using the same FIS setting. The second comparison is the model settings
between FIS and ANFIS for classifying structured events. The data set consists
of news articles related to terrorism events in three southern provinces of
Thailand. The experimental results show that the classification performance of
the FIS resulting from structured events achieves satisfactory accuracy and is
better than that for the unstructured events. In addition, the classification of
structured events using ANFIS gives higher performance than the events using
only FIS in the prediction of terrorism events.
| [
{
"version": "v1",
"created": "Sun, 11 Apr 2010 08:12:31 GMT"
}
] | 1,271,116,800,000 | [
[
"Inyaem",
"Uraiwan",
""
],
[
"Haruechaiyasak",
"Choochart",
""
],
[
"Meesad",
"Phayung",
""
],
[
"Tran",
"Dat",
""
]
] |
1004.1794 | Rdv Ijcsis | T.Krishna Kishore, T.Sasi Vardhan, N.Lakshmi Narayana | Probabilistic Semantic Web Mining Using Artificial Neural Analysis | IEEE Publication format, ISSN 1947 5500,
http://sites.google.com/site/ijcsis/ | IJCSIS, Vol. 7 No. 3, March 2010, 294-304 | null | null | cs.AI | http://creativecommons.org/licenses/by-nc-sa/3.0/ | Most of the web user's requirements are search or navigation time and getting
correctly matched results. These constraints can be satisfied with some
additional modules attached to the existing search engines and web servers.
This paper proposes a powerful architecture for search engines with the
title of Probabilistic Semantic Web Mining, named after the methods used. With
the increase of larger and larger collection of various data resources on the
World Wide Web (WWW), Web Mining has become one of the most important
requirements for the web users. Web servers will store various formats of data
including text, image, audio, video etc., but servers can not identify the
contents of the data. These search techniques can be improved by adding some
special techniques including semantic web mining and probabilistic analysis to
get more accurate results. Semantic web mining technique can provide meaningful
search of data resources by eliminating useless information with mining
process. In this technique web servers will maintain Meta information of each
and every data resources available in that particular web server. This will
help the search engine to retrieve information that is relevant to the user-given
input string. This paper proposes the idea of combining these two techniques,
Semantic web mining and Probabilistic analysis for efficient and accurate
search results of web mining. SPF can be calculated by considering both
semantic accuracy and syntactic accuracy of data with the input string. This
will be the deciding factor for producing results.
| [
{
"version": "v1",
"created": "Sun, 11 Apr 2010 11:19:32 GMT"
}
] | 1,271,116,800,000 | [
[
"Kishore",
"T. Krishna",
""
],
[
"Vardhan",
"T. Sasi",
""
],
[
"Narayana",
"N. Lakshmi",
""
]
] |
1004.2008 | Ameet Talwalkar | Ameet Talwalkar and Afshin Rostamizadeh | Matrix Coherence and the Nystrom Method | null | null | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | The Nystrom method is an efficient technique to speed up large-scale learning
applications by generating low-rank approximations. Crucial to the performance
of this technique is the assumption that a matrix can be well approximated by
working exclusively with a subset of its columns. In this work we relate this
assumption to the concept of matrix coherence and connect matrix coherence to
the performance of the Nystrom method. Making use of related work in the
compressed sensing and the matrix completion literature, we derive novel
coherence-based bounds for the Nystrom method in the low-rank setting. We then
present empirical results that corroborate these theoretical bounds. Finally,
we present more general empirical results for the full-rank setting that
convincingly demonstrate the ability of matrix coherence to measure the degree
to which information can be extracted from a subset of columns.
| [
{
"version": "v1",
"created": "Mon, 12 Apr 2010 17:09:16 GMT"
}
] | 1,271,116,800,000 | [
[
"Talwalkar",
"Ameet",
""
],
[
"Rostamizadeh",
"Afshin",
""
]
] |
1004.2624 | Toby Walsh | Marijn Heule and Toby Walsh | Symmetry within Solutions | AAAI 2010, Proceedings of Twenty-Fourth AAAI Conference on Artificial
Intelligence | null | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | We define the concept of an internal symmetry. This is a symmetry within a
solution of a constraint satisfaction problem. We compare this to solution
symmetry, which is a mapping between different solutions of the same problem.
We argue that we may be able to exploit both types of symmetry when finding
solutions. We illustrate the potential of exploiting internal symmetries on two
benchmark domains: Van der Waerden numbers and graceful graphs. By identifying
internal symmetries we are able to extend the state of the art in both cases.
| [
{
"version": "v1",
"created": "Thu, 15 Apr 2010 13:15:06 GMT"
}
] | 1,271,376,000,000 | [
[
"Heule",
"Marijn",
""
],
[
"Walsh",
"Toby",
""
]
] |
1004.2626 | Toby Walsh | Christian Bessiere and George Katsirelos and Nina Narodytska and
Claude-Guy Quimper and Toby Walsh | Propagating Conjunctions of AllDifferent Constraints | AAAI 2010, Proceedings of the Twenty-Fourth AAAI Conference on
Artificial Intelligence | null | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | We study propagation algorithms for the conjunction of two AllDifferent
constraints. Solutions of an AllDifferent constraint can be seen as perfect
matchings on the variable/value bipartite graph. Therefore, we investigate the
problem of finding simultaneous bipartite matchings. We present an extension of
the famous Hall theorem which characterizes when simultaneous bipartite
matchings exist. Unfortunately, finding such matchings is NP-hard in general.
However, we prove a surprising result that finding a simultaneous matching on a
convex bipartite graph takes just polynomial time. Based on this theoretical
result, we provide the first polynomial time bound consistency algorithm for
the conjunction of two AllDifferent constraints. We identify a pathological
problem on which this propagator is exponentially faster compared to existing
propagators. Our experiments show that this new propagator can offer
significant benefits over existing methods.
| [
{
"version": "v1",
"created": "Thu, 15 Apr 2010 13:37:49 GMT"
}
] | 1,271,376,000,000 | [
[
"Bessiere",
"Christian",
""
],
[
"Katsirelos",
"George",
""
],
[
"Narodytska",
"Nina",
""
],
[
"Quimper",
"Claude-Guy",
""
],
[
"Walsh",
"Toby",
""
]
] |
1004.3260 | Vishal Goyal | Rosmayati Mohemad, Abdul Razak Hamdan, Zulaiha Ali Othman, Noor
Maizura Mohamad Noor | Decision Support Systems (DSS) in Construction Tendering Processes | International Journal of Computer Science Issues online at
http://ijcsi.org/articles/Decision-Support-Systems-DSS-in-Construction-Tendering-Processes.php | IJCSI, Volume 7, Issue 2, March 2010 | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | The successful execution of a construction project is heavily impacted by
making the right decision during tendering processes. Managing tender
procedures is very complex and uncertain involving coordination of many tasks
and individuals with different priorities and objectives. Biased and inconsistent
decisions are inevitable if the decision-making process depends totally on
intuition, subjective judgement or emotion. To make decisions transparent and
tendering competition healthy, there is a need for a flexible guidance tool
for decision support. The aim of this paper is to give a review of current
practices of Decision Support Systems (DSS) technology in construction
tendering processes. Current practices of general tendering processes as
applied to the most countries in different regions such as United States,
Europe, Middle East and Asia are comprehensively discussed. Applications of
Web-based tendering processes are also summarised in terms of their properties.
Besides that, a summary of Decision Support System (DSS) components is included
in the next section. Furthermore, prior research on the implementation of DSS
approaches in tendering processes is discussed in detail. Current issues
arising from both paper-based and Web-based tendering processes are outlined.
Finally, a conclusion is included at the end of this paper.
| [
{
"version": "v1",
"created": "Mon, 19 Apr 2010 17:56:06 GMT"
}
] | 1,271,721,600,000 | [
[
"Mohemad",
"Rosmayati",
""
],
[
"Hamdan",
"Abdul Razak",
""
],
[
"Othman",
"Zulaiha Ali",
""
],
[
"Noor",
"Noor Maizura Mohamad",
""
]
] |
1004.4342 | Martin Slota | Martin Slota and Jo\~ao Leite | Towards Closed World Reasoning in Dynamic Open Worlds (Extended Version) | 40 pages; an extended version of the article published in Theory and
Practice of Logic Programming, 10 (4-6): 547 - 564, July. Copyright 2010
Cambridge University Press | Theory and Practice of Logic Programming, 10(4-6), 547-564, 2010 | 10.1017/S147106841000027X | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | The need for integration of ontologies with nonmonotonic rules has been
gaining importance in a number of areas, such as the Semantic Web. A number of
researchers addressed this problem by proposing a unified semantics for hybrid
knowledge bases composed of both an ontology (expressed in a fragment of
first-order logic) and nonmonotonic rules. These semantics have matured over
the years, but only provide solutions for the static case when knowledge does
not need to evolve. In this paper we take a first step towards addressing the
dynamics of hybrid knowledge bases. We focus on knowledge updates and,
considering the state of the art of belief update, ontology update and rule
update, we show that current solutions are only partial and difficult to
combine. Then we extend the existing work on ABox updates with rules, provide a
semantics for such evolving hybrid knowledge bases and study its basic
properties. To the best of our knowledge, this is the first time that an update
operator is proposed for hybrid knowledge bases.
| [
{
"version": "v1",
"created": "Sun, 25 Apr 2010 10:51:35 GMT"
},
{
"version": "v2",
"created": "Fri, 23 Jul 2010 00:31:53 GMT"
}
] | 1,311,724,800,000 | [
[
"Slota",
"Martin",
""
],
[
"Leite",
"João",
""
]
] |
1004.4734 | Martin Josef Geiger | Martin Josef Geiger | On the comparison of plans: Proposition of an instability measure for
dynamic machine scheduling | null | Proceedings of the 25th Mini EURO Conference on Uncertainty and
Robustness in Planning and Decision Making, April 15-17, 2010, Coimbra,
Portugal. ISBN 978-989-95055-3-7. | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | On the basis of an analysis of previous research, we present a generalized
approach for measuring the difference of plans with an exemplary application to
machine scheduling. Our work is motivated by the need for such measures, which
are used in dynamic scheduling and planning situations. In this context,
quantitative approaches are needed for the assessment of the robustness and
stability of schedules. Obviously, any `robustness' or `stability' of plans has
to be defined w.r.t. the particular situation and the requirements of the
human decision maker. Besides the proposition of an instability measure, we
therefore discuss possibilities of obtaining meaningful information from the
decision maker for the implementation of the introduced approach.
| [
{
"version": "v1",
"created": "Tue, 27 Apr 2010 08:13:51 GMT"
}
] | 1,426,550,400,000 | [
[
"Geiger",
"Martin Josef",
""
]
] |
1004.4801 | Yves Moinard | Philippe Besnard (INRIA - IRISA, IRIT), Marie-Odile Cordier (INRIA -
IRISA), Yves Moinard (INRIA - IRISA) | Ontology-based inference for causal explanation | null | Integrated Computer-Aided Engineering 15, 4 (2008) 351-367 | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | We define an inference system to capture explanations based on causal
statements, using an ontology in the form of an IS-A hierarchy. We first
introduce a simple logical language which makes it possible to express that a
fact causes another fact and that a fact explains another fact. We present a
set of formal inference patterns from causal statements to explanation
statements. We introduce an elementary ontology which gives greater
expressiveness to the system while staying close to propositional reasoning. We
provide an inference system that captures the patterns discussed, firstly in a
purely propositional framework, then in a datalog (limited predicate)
framework.
| [
{
"version": "v1",
"created": "Tue, 27 Apr 2010 13:42:49 GMT"
}
] | 1,272,758,400,000 | [
[
"Besnard",
"Philippe",
"",
"INRIA - IRISA, IRIT"
],
[
"Cordier",
"Marie-Odile",
"",
"INRIA -\n IRISA"
],
[
"Moinard",
"Yves",
"",
"INRIA - IRISA"
]
] |
1005.0089 | Lars Kotthoff | Tom Kelsey and Lars Kotthoff | The Exact Closest String Problem as a Constraint Satisfaction Problem | null | null | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | We report (to our knowledge) the first evaluation of Constraint Satisfaction
as a computational framework for solving closest string problems. We show that
careful consideration of symbol occurrences can provide search heuristics that
yield several orders of magnitude speedup at and above the optimal distance.
We also report (to our knowledge) the first analysis and evaluation -- using
any technique -- of the computational difficulties involved in the
identification of all closest strings for a given input set. We describe
algorithms for web-scale distributed solution of closest string problems, both
purely based on AI backtrack search and also hybrid numeric-AI methods.
| [
{
"version": "v1",
"created": "Sat, 1 May 2010 16:00:59 GMT"
}
] | 1,272,931,200,000 | [
[
"Kelsey",
"Tom",
""
],
[
"Kotthoff",
"Lars",
""
]
] |
1005.0104 | Rahul Gupta | Rahul Gupta, Sunita Sarawagi | Joint Structured Models for Extraction from Overlapping Sources | null | null | 10.1145/1935826.1935868 | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | We consider the problem of jointly training structured models for extraction
from sources whose instances enjoy partial overlap. This has important
applications like user-driven ad-hoc information extraction on the web. Such
applications present new challenges in terms of the number of sources and their
arbitrary pattern of overlap not seen by earlier collective training schemes
applied on two sources. We present an agreement-based learning framework and
alternatives within it to trade-off tractability, robustness to noise, and
extent of agreement. We provide a principled scheme to discover low-noise
agreement sets in unlabeled data across the sources. Through extensive
experiments over 58 real datasets, we establish that our method of additively
rewarding agreement over maximal segments of text provides the best trade-offs,
and also scores over alternatives such as collective inference, staged
training, and multi-view learning.
| [
{
"version": "v1",
"created": "Sat, 1 May 2010 20:55:23 GMT"
}
] | 1,499,385,600,000 | [
[
"Gupta",
"Rahul",
""
],
[
"Sarawagi",
"Sunita",
""
]
] |
1005.0605 | Vladimir Gavrikov L | Vladimir L. Gavrikov, Rem G. Khlebopros | An approach to visualize the course of solving of a research task in
humans | 20 pages, 9 figures | null | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | A technique to study the dynamics of solving of a research task is suggested.
The research task was based on specially developed software Right-Wrong
Responder (RWR), with the participants having to reveal the response logic of
the program. The participants interacted with the program in the form of a
semi-binary dialogue, which implies the feedback responses of only two kinds -
"right" or "wrong". The technique has been applied to a small pilot group of
volunteer participants. Some of them have successfully solved the task
(solvers) and some have not (non-solvers). In the beginning of the work, the
solvers made more wrong moves than non-solvers, and they made fewer wrong moves
closer to the finish of the work. A phase portrait of the work both in solvers
and non-solvers showed definite cycles that may correspond to sequences of
partially true hypotheses that may be formulated by the participants during the
solving of the task.
| [
{
"version": "v1",
"created": "Mon, 26 Apr 2010 11:00:24 GMT"
}
] | 1,273,017,600,000 | [
[
"Gavrikov",
"Vladimir L.",
""
],
[
"Khlebopros",
"Rem G.",
""
]
] |
1005.0608 | Kurt Ammon | Kurt Ammon | Informal Concepts in Machines | null | null | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | This paper constructively proves the existence of an effective procedure
generating a computable (total) function that is not contained in any given
effectively enumerable set of such functions. The proof implies the existence
of machines that process informal concepts such as computable (total) functions
beyond the limits of any given Turing machine or formal system, that is, these
machines can, in a certain sense, "compute" function values beyond these
limits. We call these machines creative. We argue that any "intelligent"
machine should be capable of processing informal concepts such as computable
(total) functions, that is, it should be creative. Finally, we introduce
hypotheses on creative machines which were developed on the basis of
theoretical investigations and experiments with computer programs. The
hypotheses say that machine intelligence is the execution of a self-developing
procedure starting from any universal programming language and any input.
| [
{
"version": "v1",
"created": "Tue, 4 May 2010 19:00:37 GMT"
}
] | 1,273,017,600,000 | [
[
"Ammon",
"Kurt",
""
]
] |
1005.0896 | Jean Dezert | Jean-Marc Tacnet (UR ETGR), Mireille Batton-Hubert (ENSM-SE), Jean
Dezert (ONERA) | A two-step fusion process for multi-criteria decision applied to natural
hazards in mountains | null | Workshop on the Theory of Belief Functions, April 1- 2, 2010
Brest, France, Brest : France (2010) | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | Mountain river torrents and snow avalanches generate human and material
damages with dramatic consequences. Knowledge about natural phenomena is
often lacking and expertise is required for decision and risk management
purposes using multi-disciplinary quantitative or qualitative approaches.
Expertise is considered as a decision process based on imperfect information
coming from more or less reliable and conflicting sources. A methodology mixing
the Analytic Hierarchy Process (AHP), a multi-criteria aid-decision method, and
information fusion using Belief Function Theory is described. Fuzzy Sets and
Possibilities theories allow to transform quantitative and qualitative criteria
into a common frame of discernment for decision in Dempster-Shafer Theory
(DST) and Dezert-Smarandache Theory (DSmT) contexts. Main issues consist in basic
belief assignments elicitation, conflict identification and management, fusion
rule choices, results validation, but also in specific needs to distinguish
between importance and reliability, and uncertainty in the fusion
process.
| [
{
"version": "v1",
"created": "Thu, 6 May 2010 06:32:59 GMT"
}
] | 1,295,481,600,000 | [
[
"Tacnet",
"Jean-Marc",
"",
"UR ETGR"
],
[
"Batton-Hubert",
"Mireille",
"",
"ENSM-SE"
],
[
"Dezert",
"Jean",
"",
"ONERA"
]
] |
1005.0917 | Christoph Schwarzweller | Agnieszka Rowinska-Schwarzweller and Christoph Schwarzweller | On Building a Knowledge Base for Stability Theory | To appear in The 9th International Conference on Mathematical
Knowledge Management: MKM 2010 | Lecture Notes in Computer Science, 2010, Volume 6167, Intelligent
Computer Mathematics, Pages 427-439 | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | A lot of mathematical knowledge has been formalized and stored in
repositories by now: different mathematical theorems and theories have been
taken into consideration and included in mathematical repositories.
Applications more distant from pure mathematics, however --- though based on
these theories --- often need more detailed knowledge about the underlying
theories. In this paper we present an example Mizar formalization from the area
of electrical engineering focusing on stability theory which is based on
complex analysis. We discuss what kind of special knowledge is necessary here
and which amount of this knowledge is included in existing repositories.
| [
{
"version": "v1",
"created": "Thu, 6 May 2010 07:59:52 GMT"
}
] | 1,285,113,600,000 | [
[
"Rowinska-Schwarzweller",
"Agnieszka",
""
],
[
"Schwarzweller",
"Christoph",
""
]
] |
1005.1518 | Liane Gabora | Liane Gabora | Recognizability of Individual Creative Style Within and Across Domains:
Preliminary Studies | 6 pages, submitted to Annual Meeting of the Cognitive Science
Society. August 11-14, 2010, Portland, Oregon | In S. Ohlsson & R. Catrambone (Eds.), Proceedings of the 32nd
Annual Meeting of the Cognitive Science Society (pp. 2350-2355). Austin, TX:
Cognitive Science Society. (2010) | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | It is hypothesized that creativity arises from the self-mending capacity of
an internal model of the world, or worldview. The uniquely honed worldview of a
creative individual results in a distinctive style that is recognizable within
and across domains. It is further hypothesized that creativity is domain-general
in the sense that there exist multiple avenues by which the distinctiveness of
one's worldview can be expressed. These hypotheses were tested using art
students and creative writing students. Art students guessed significantly
above chance both which painting was done by which of five famous artists, and
which artwork was done by which of their peers. Similarly, creative writing
students guessed significantly above chance both which passage was written by
which of five famous writers, and which passage was written by which of their
peers. These findings support the hypothesis that creative style is
recognizable. Moreover, creative writing students guessed significantly above
chance which of their peers produced particular works of art, supporting the
hypothesis that creative style is recognizable not just within but across
domains.
| [
{
"version": "v1",
"created": "Mon, 10 May 2010 12:30:36 GMT"
},
{
"version": "v2",
"created": "Sun, 30 Jun 2019 00:41:40 GMT"
},
{
"version": "v3",
"created": "Fri, 5 Jul 2019 20:58:02 GMT"
},
{
"version": "v4",
"created": "Tue, 9 Jul 2019 19:56:00 GMT"
}
] | 1,562,803,200,000 | [
[
"Gabora",
"Liane",
""
]
] |
1005.1860 | Gavin Taylor | Marek Petrik, Gavin Taylor, Ron Parr, Shlomo Zilberstein | Feature Selection Using Regularization in Approximate Linear Programs
for Markov Decision Processes | Technical report corresponding to the ICML2010 submission of the same
name | null | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | Approximate dynamic programming has been used successfully in a large variety
of domains, but it relies on a small set of provided approximation features to
calculate solutions reliably. Large and rich sets of features can cause
existing algorithms to overfit because of a limited number of samples. We
address this shortcoming using $L_1$ regularization in approximate linear
programming. Because the proposed method can automatically select the
appropriate richness of features, its performance does not degrade with an
increasing number of features. These results rely on new and stronger sampling
bounds for regularized approximate linear programs. We also propose a
computationally efficient homotopy method. The empirical evaluation of the
approach shows that the proposed method performs well on simple MDPs and
standard benchmark problems.
| [
{
"version": "v1",
"created": "Tue, 11 May 2010 15:24:36 GMT"
},
{
"version": "v2",
"created": "Thu, 20 May 2010 14:19:17 GMT"
}
] | 1,426,550,400,000 | [
[
"Petrik",
"Marek",
""
],
[
"Taylor",
"Gavin",
""
],
[
"Parr",
"Ron",
""
],
[
"Zilberstein",
"Shlomo",
""
]
] |
1005.2815 | Marc Schoenauer | Miguel Nicolau (INRIA Saclay - Ile de France, LRI), Marc Schoenauer
(INRIA Saclay - Ile de France, LRI), W. Banzhaf | Evolving Genes to Balance a Pole | null | EUropean Conference on Genetic Programming, Istanbul : Turkey
(2010) | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | We discuss how to use a Genetic Regulatory Network as an evolutionary
representation to solve a typical GP reinforcement problem: pole balancing.
The network is a modified version of an Artificial Regulatory Network proposed
a few years ago, and the task could be solved only by finding a proper way of
connecting inputs and outputs to the network. We show that the representation
is able to generalize well over the problem domain, and discuss the performance
of different models of this kind.
| [
{
"version": "v1",
"created": "Mon, 17 May 2010 06:44:44 GMT"
}
] | 1,274,313,600,000 | [
[
"Nicolau",
"Miguel",
"",
"INRIA Saclay - Ile de France, LRI"
],
[
"Schoenauer",
"Marc",
"",
"INRIA Saclay - Ile de France, LRI"
],
[
"Banzhaf",
"W.",
""
]
] |
1005.3502 | Lars Kotthoff | Lars Kotthoff and Ian Gent and Ian Miguel | Using machine learning to make constraint solver implementation
decisions | null | null | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | Programs to solve so-called constraint problems are complex pieces of
software which require many design decisions to be made more or less
arbitrarily by the implementer. These decisions affect the performance of the
finished solver significantly. Once a design decision has been made, it cannot
easily be reversed, although a different decision may be more appropriate for a
particular problem.
We investigate using machine learning to make these decisions automatically
depending on the problem to solve with the alldifferent constraint as an
example. Our system is capable of making non-trivial, multi-level decisions
that improve over always making a default choice.
| [
{
"version": "v1",
"created": "Wed, 19 May 2010 17:53:43 GMT"
}
] | 1,274,313,600,000 | [
[
"Kotthoff",
"Lars",
""
],
[
"Gent",
"Ian",
""
],
[
"Miguel",
"Ian",
""
]
] |
1005.4025 | William Jackson | Siddharths Sankar Biswas | A Soft Computing Model for Physicians' Decision Process | http://www.journalofcomputing.org | Journal of Computing, Volume 2, Issue 5, May 2010 | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | In this paper the author presents a kind of Soft Computing Technique, mainly
an application of the fuzzy set theory of Prof. Zadeh [16], to a problem in
medical expert systems. The chosen problem is the design of a physician's
decision model that can take crisp as well as fuzzy data as input, unlike
traditional models. The author presents a mathematical model based on fuzzy set
theory for physician-aided evaluation of a complete representation of
information emanating from the initial interview, including the patient's past
history, present symptoms, signs observed upon physical examination, and
results of clinical and diagnostic tests.
| [
{
"version": "v1",
"created": "Fri, 21 May 2010 17:40:38 GMT"
}
] | 1,274,659,200,000 | [
[
"Biswas",
"Siddharths Sankar",
""
]
] |
1005.4159 | Andrew Lin | Andrew Lin | The Complexity of Manipulating $k$-Approval Elections | null | null | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | An important problem in computational social choice theory is the complexity
of undesirable behavior among agents, such as control, manipulation, and
bribery in election systems. These kinds of voting strategies are often
tempting at the individual level but disastrous for the agents as a whole.
Creating election systems where the determination of such strategies is
difficult is thus an important goal.
An interesting set of elections is that of scoring protocols. Previous work
in this area has demonstrated the complexity of misuse in cases involving a
fixed number of candidates, and of specific election systems on an unbounded
number of candidates, such as Borda. In contrast, we take the first step in
generalizing the results of computational complexity of election misuse to
cases of infinitely many scoring protocols on an unbounded number of
candidates. Interesting families of systems include $k$-approval and $k$-veto
elections, in which voters distinguish $k$ candidates from the candidate set.
Our main result is to partition the problems of these families based on their
complexity. We do so by showing they are polynomial-time computable, NP-hard,
or polynomial-time equivalent to another problem of interest. We also
demonstrate a surprising connection between manipulation in election systems
and some graph theory problems.
| [
{
"version": "v1",
"created": "Sun, 23 May 2010 00:04:11 GMT"
},
{
"version": "v2",
"created": "Thu, 27 May 2010 14:14:44 GMT"
},
{
"version": "v3",
"created": "Thu, 19 Apr 2012 05:02:18 GMT"
}
] | 1,334,880,000,000 | [
[
"Lin",
"Andrew",
""
]
] |
1005.4272 | Chriss Romy | G. Arutchelvan, S. K. Srivatsa, R. Jagannathan | Inaccuracy Minimization by Partitioning Fuzzy Data Sets - Validation of
Analytical Methodology | IEEE Publication format, International Journal of Computer Science
and Information Security, IJCSIS, Vol. 8 No. 1, April 2010, USA. ISSN 1947
5500, http://sites.google.com/site/ijcsis/ | null | null | null | cs.AI | http://creativecommons.org/licenses/by-nc-sa/3.0/ | In the last two decades, a number of methods have been proposed for
forecasting based on fuzzy time series. Most of the existing fuzzy time series
methods are presented for forecasting car road accidents. However, the
forecasting accuracy rates of the existing methods are not good enough. In this
paper, we compare our proposed new method of fuzzy time series forecasting with
existing methods. Our method is based on means-based partitioning of the
historical data of car road accidents. The proposed method belongs to the
kth-order and time-variant methods, and achieves a better forecasting accuracy
rate for car road accidents than the existing methods.
| [
{
"version": "v1",
"created": "Mon, 24 May 2010 07:50:55 GMT"
}
] | 1,274,745,600,000 | [
[
"Arutchelvan",
"G.",
""
],
[
"Srivatsa",
"S. K.",
""
],
[
"Jagannathan",
"R.",
""
]
] |
1005.4496 | Secretary Aircc Journal | Dewan Md. Farid(1), Nouria Harbi(1), and Mohammad Zahidur Rahman(2),
((1)University Lumiere Lyon 2 - France, (2)Jahangirnagar University,
Bangladesh) | Combining Naive Bayes and Decision Tree for Adaptive Intrusion Detection | 14 Pages, IJNSA | International Journal of Network Security & Its Applications 2.2
(2010) 12-25 | 10.5121/ijnsa.2010.2202 | null | cs.AI | http://creativecommons.org/licenses/by-nc-sa/3.0/ | In this paper, a new learning algorithm for adaptive network intrusion
detection using a naive Bayesian classifier and a decision tree is presented.
It performs balanced detection, keeps false positives at an acceptable level
for different types of network attacks, and eliminates redundant attributes as
well as contradictory examples from the training data that make the detection
model complex. The proposed algorithm also addresses some difficulties of data
mining, such as handling continuous attributes, dealing with missing attribute
values, and reducing noise in training data. Due to the large volumes of
security audit data as well as the complex and dynamic properties of intrusion
behaviours, several data mining-based intrusion detection techniques have been
applied to network-based traffic data and host-based data in the last decades.
However, various issues remain to be examined in current intrusion detection
systems (IDS). We tested the performance of our proposed algorithm against
existing learning algorithms on the KDD99 benchmark intrusion detection
dataset. The experimental results show that the proposed algorithm achieves
high detection rates (DR) and significantly reduces false positives (FP) for
different types of network intrusions using limited computational resources.
| [
{
"version": "v1",
"created": "Tue, 25 May 2010 07:47:00 GMT"
}
] | 1,279,152,000,000 | [
[
"Farid",
"Dewan Md.",
""
],
[
"Harbi",
"Nouria",
""
],
[
"Rahman",
"Mohammad Zahidur",
""
]
] |
1005.4592 | Josef Urban | Josef Urban and Geoff Sutcliffe | Automated Reasoning and Presentation Support for Formalizing Mathematics
in Mizar | To appear in 10th International Conference on. Artificial
Intelligence and Symbolic Computation AISC 2010 | Intelligent Computer Mathematics 2010, LNCS 6167, pp. 132-146 | 10.1007/978-3-642-14128-7_12 | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | This paper presents a combination of several automated reasoning and proof
presentation tools with the Mizar system for formalization of mathematics. The
combination forms an online service called MizAR, similar to the SystemOnTPTP
service for first-order automated reasoning. The main differences to
SystemOnTPTP are the use of the Mizar language that is oriented towards human
mathematicians (rather than the pure first-order logic used in SystemOnTPTP),
and setting the service in the context of the large Mizar Mathematical Library
of previous theorems, definitions, and proofs (rather than the isolated
problems that are solved in SystemOnTPTP). These differences pose new challenges and
new opportunities for automated reasoning and for proof presentation tools.
This paper describes the overall structure of MizAR, and presents the automated
reasoning systems and proof presentation tools that are combined to make MizAR
a useful mathematical service.
| [
{
"version": "v1",
"created": "Tue, 25 May 2010 14:49:03 GMT"
}
] | 1,311,724,800,000 | [
[
"Urban",
"Josef",
""
],
[
"Sutcliffe",
"Geoff",
""
]
] |
1005.4963 | Anon Plangprasopchok | Anon Plangprasopchok, Kristina Lerman, Lise Getoor | Integrating Structured Metadata with Relational Affinity Propagation | 6 Pages, To appear at AAAI Workshop on Statistical Relational AI | null | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | Structured and semi-structured data describing entities, taxonomies and
ontologies appears in many domains. There is a huge interest in integrating
structured information from multiple sources; however integrating structured
data to infer complex common structures is a difficult task because the
integration must aggregate similar structures while avoiding structural
inconsistencies that may appear when the data is combined. In this work, we
study the integration of structured social metadata: shallow personal
hierarchies specified by many individual users on the Social Web, and focus on
inferring a collection of integrated, consistent taxonomies. We frame this task
as an optimization problem with structural constraints. We propose a new
inference algorithm, which we refer to as Relational Affinity Propagation (RAP)
that extends affinity propagation (Frey and Dueck 2007) by introducing
structural constraints. We validate the approach on a real-world social media
dataset, collected from the photo-sharing website Flickr. Our empirical results
show that our proposed approach is able to construct deeper and denser
structures compared to an approach using only the standard affinity propagation
algorithm.
| [
{
"version": "v1",
"created": "Wed, 26 May 2010 23:13:05 GMT"
}
] | 1,275,004,800,000 | [
[
"Plangprasopchok",
"Anon",
""
],
[
"Lerman",
"Kristina",
""
],
[
"Getoor",
"Lise",
""
]
] |
1005.4989 | Evgeny Chutchev | Evgeny Chutchev | A Formalization of the Turing Test | 10 pages | null | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | The paper offers a mathematical formalization of the Turing test. This
formalization makes it possible to establish the conditions under which some
Turing machine will pass the Turing test and the conditions under which every
Turing machine (or every Turing machine of the special class) will fail the
Turing test.
| [
{
"version": "v1",
"created": "Thu, 27 May 2010 05:59:56 GMT"
}
] | 1,275,004,800,000 | [
[
"Chutchev",
"Evgeny",
""
]
] |
1005.5114 | Anon Plangprasopchok | Anon Plangprasopchok, Kristina Lerman, Lise Getoor | Growing a Tree in the Forest: Constructing Folksonomies by Integrating
Structured Metadata | 10 pages, To appear in the Proceedings of ACM SIGKDD Conference on
Knowledge Discovery and Data Mining(KDD) 2010 | null | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | Many social Web sites allow users to annotate the content with descriptive
metadata, such as tags, and more recently to organize content hierarchically.
These types of structured metadata provide valuable evidence for learning how a
community organizes knowledge. For instance, we can aggregate many personal
hierarchies into a common taxonomy, also known as a folksonomy, that will aid
users in visualizing and browsing social content, and also to help them in
organizing their own content. However, learning from social metadata presents
several challenges, since it is sparse, shallow, ambiguous, noisy, and
inconsistent. We describe an approach to folksonomy learning based on
relational clustering, which exploits structured metadata contained in personal
hierarchies. Our approach clusters similar hierarchies using their structure
and tag statistics, then incrementally weaves them into a deeper, bushier tree.
We study folksonomy learning using social metadata extracted from the
photo-sharing site Flickr, and demonstrate that the proposed approach addresses
the challenges. Moreover, compared to previous work, the approach produces
larger, more accurate folksonomies, and in addition, scales better.
| [
{
"version": "v1",
"created": "Thu, 27 May 2010 16:46:04 GMT"
}
] | 1,275,004,800,000 | [
[
"Plangprasopchok",
"Anon",
""
],
[
"Lerman",
"Kristina",
""
],
[
"Getoor",
"Lise",
""
]
] |
1005.5270 | Toby Walsh | George Katsirelos and Toby Walsh | Symmetries of Symmetry Breaking Constraints | To appear in Proceedings of the 19th European Conference on
Artificial Intelligence (ECAI 2010). Revises workshop paper that appears at
SymCon 2009 | null | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | Symmetry is an important feature of many constraint programs. We show that
any problem symmetry acting on a set of symmetry breaking constraints can be
used to break symmetry. Different symmetries pick out different solutions in
each symmetry class. This simple but powerful idea can be used in a number of
different ways. We describe one application within model restarts, a search
technique designed to reduce the conflict between symmetry breaking and the
branching heuristic. In model restarts, we restart search periodically with a
random symmetry of the symmetry breaking constraints. Experimental results show
that this symmetry breaking technique is effective in practice on some standard
benchmark problems.
| [
{
"version": "v1",
"created": "Fri, 28 May 2010 11:22:29 GMT"
}
] | 1,275,264,000,000 | [
[
"Katsirelos",
"George",
""
],
[
"Walsh",
"Toby",
""
]
] |
1006.0274 | Nan Li | Nan Li, William Cushing, Subbarao Kambhampati, Sungwook Yoon | Learning Probabilistic Hierarchical Task Networks to Capture User
Preferences | 30 pages | null | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | We propose automatically learning probabilistic Hierarchical Task Networks
(pHTNs) in order to capture a user's preferences on plans, by observing only
the user's behavior. HTNs are a common choice of representation for a variety
of purposes in planning, including work on learning in planning. Our
contributions are (a) learning structure and (b) representing preferences. In
contrast, prior work employing HTNs considers learning method preconditions
(instead of structure) and representing domain physics or search control
knowledge (rather than preferences). Initially we will assume that the observed
distribution of plans is an accurate representation of user preference, and
then generalize to the situation where feasibility constraints frequently
prevent the execution of preferred plans. In order to learn a distribution on
plans we adapt an Expectation-Maximization (EM) technique from the discipline
of (probabilistic) grammar induction, taking the perspective of task reductions
as productions in a context-free grammar over primitive actions. To account for
the difference between the distributions of possible and preferred plans we
subsequently modify this core EM technique, in short, by rescaling its input.
| [
{
"version": "v1",
"created": "Wed, 2 Jun 2010 01:33:11 GMT"
}
] | 1,275,523,200,000 | [
[
"Li",
"Nan",
""
],
[
"Cushing",
"William",
""
],
[
"Kambhampati",
"Subbarao",
""
],
[
"Yoon",
"Sungwook",
""
]
] |
1006.0385 | Dr. Paul J. Werbos | Paul J. Werbos | Brain-Like Stochastic Search: A Research Challenge and Funding
Opportunity | Plenary talk at IEEE Conference on Evolutionary Computing 1999,
extended in 2010 with new appendix | null | null | null | cs.AI | http://creativecommons.org/licenses/publicdomain/ | Brain-Like Stochastic Search (BLiSS) refers to this task: given a family of
utility functions U(u,A), where u is a vector of parameters or task
descriptors, maximize or minimize U with respect to u, using networks (Option
Nets) which input A and learn to generate good options u stochastically. This
paper discusses why this is crucial to brain-like intelligence (an area funded
by NSF) and to many applications, and discusses various possibilities for
network design and training. The appendix discusses recent research, relations
to work on stochastic optimization in operations research, and relations to
engineering-based approaches to understanding neocortex.
| [
{
"version": "v1",
"created": "Tue, 1 Jun 2010 18:16:10 GMT"
}
] | 1,275,523,200,000 | [
[
"Werbos",
"Paul J.",
""
]
] |
1006.0991 | Noam Shazeer | Georges Harik and Noam Shazeer | Variational Program Inference | null | null | null | HSL-000001 | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | We introduce a framework for representing a variety of interesting problems
as inference over the execution of probabilistic model programs. We represent a
"solution" to such a problem as a guide program which runs alongside the model
program and influences the model program's random choices, leading the model
program to sample from a different distribution than from its priors. Ideally
the guide program influences the model program to sample from the posteriors
given the evidence. We show how the KL- divergence between the true posterior
distribution and the distribution induced by the guided model program can be
efficiently estimated (up to an additive constant) by sampling multiple
executions of the guided model program. In addition, we show how to use the
guide program as a proposal distribution in importance sampling to
statistically prove lower bounds on the probability of the evidence and on the
probability of a hypothesis and the evidence. We can use the quotient of these
two bounds as an estimate of the conditional probability of the hypothesis
given the evidence. We thus turn the inference problem into a heuristic search
for better guide programs.
| [
{
"version": "v1",
"created": "Fri, 4 Jun 2010 20:55:04 GMT"
}
] | 1,275,955,200,000 | [
[
"Harik",
"Georges",
""
],
[
"Shazeer",
"Noam",
""
]
] |
1006.1080 | Marko A. Rodriguez | Marko A. Rodriguez, Alberto Pepe, Joshua Shinavier | The Dilated Triple | null | Emergent Web Intelligence: Advanced Semantic Technologies,
Advanced Information and Knowledge Processing series, pages 3-16,
ISBN:978-1-84996-076-2, Springer-Verlag, June 2010 | null | LA-UR-08-03927 | cs.AI | http://creativecommons.org/licenses/publicdomain/ | The basic unit of meaning on the Semantic Web is the RDF statement, or
triple, which combines a distinct subject, predicate and object to make a
definite assertion about the world. A set of triples constitutes a graph, to
which they give a collective meaning. It is upon this simple foundation that
the rich, complex knowledge structures of the Semantic Web are built. Yet the
very expressiveness of RDF, by inviting comparison with real-world knowledge,
highlights a fundamental shortcoming, in that RDF is limited to statements of
absolute fact, independent of the context in which a statement is asserted.
This is in stark contrast with the thoroughly context-sensitive nature of human
thought. The model presented here provides a particularly simple means of
contextualizing an RDF triple by associating it with related statements in the
same graph. This approach, in combination with a notion of graph similarity, is
sufficient to select only those statements from an RDF graph which are
subjectively most relevant to the context of the requesting process.
| [
{
"version": "v1",
"created": "Sun, 6 Jun 2010 05:16:55 GMT"
}
] | 1,288,569,600,000 | [
[
"Rodriguez",
"Marko A.",
""
],
[
"Pepe",
"Alberto",
""
],
[
"Shinavier",
"Joshua",
""
]
] |
1006.1190 | Harco Leslie Hendric Spits Warnars | Spits Warnars H.L.H | Game Information System | 14 pages | International Journal of Computer Science and Information
Technology 2.3 (2010) 135-148 | 10.5121/ijcsit.2010.2310 | null | cs.AI | http://creativecommons.org/licenses/by-nc-sa/3.0/ | In this Information system age many organizations consider information system
as their weapon to compete, gain competitive advantage, or provide the best
services (for non-profit organizations). A Game Information System, which
combines an information system with a game, is a breakthrough for achieving
organizational performance. A Game Information System runs the information
system as a game, showing how a game can be implemented to run an information
system. A game is not only for fun and entertainment; the challenge is to
combine fun and entertainment with an information system: to run the
information system with entertainment, and to deliver the entertainment with
the information system, all at once. A Game Information System can be
implemented in as many sectors as the information system itself, but from a
different point of view: a view of a game in which people can be joyful and
happy and carry out their transactions as fun things.
| [
{
"version": "v1",
"created": "Mon, 7 Jun 2010 07:22:30 GMT"
},
{
"version": "v2",
"created": "Thu, 10 Jun 2010 02:44:03 GMT"
}
] | 1,279,152,000,000 | [
[
"H",
"Spits Warnars H. L.",
""
]
] |
1006.1701 | Harco Leslie Hendric Spits Warnars | Spits Warnars | Virtual information system on working area | 6 pages, 3 figures | Indonesian Students' International Scientific Meeting, (Temu
Ilmiah Internasional Mahasiswa Indonesia, TIIMI), London, United Kingdom, 5-7
December 2008 | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | In order to get strategic positioning for competition in business
organizations, an information system must be ahead in this information age,
where information is one of the weapons to win the competition and, in the
right hands, becomes the right bullet. An information system supported by
information technology is not enough if it is only on the internet or
implemented with internet technology. The growth of information technology as a
tool for helping people and making things easy to use must be accompanied by a
desire for fun and happiness when people interact with the information
technology itself. Basically, humans like to play; since childhood, humans have
been playing, free and happy, and when humans grow up they cannot play as much
as they did in childhood. We have to develop information systems that do not
merely present the information system itself but help humans explore their
natural instinct for playing, fun, and happiness when they interact with the
information system. A virtual information system is a way to bring a playing
and fun atmosphere to the working area.
| [
{
"version": "v1",
"created": "Wed, 9 Jun 2010 04:08:07 GMT"
}
] | 1,276,128,000,000 | [
[
"Warnars",
"Spits",
""
]
] |
1006.1703 | Harco Leslie Hendric Spits Warnars | Spits Warnars | Indonesian Earthquake Decision Support System | 8 pages, 7 figures | The 5th International Conference on Information & Communication
Technology and Systems (ICTS) 2009, Informatics Department, Institute of
Technology Sepuluh Nopember (ITS), Surabaya, Indonesia, 3-4 August 2009 | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | Earthquake DSS is an information technology environment which can be used by
government to make earthquake mitigation decisions sharper, faster, and better.
An earthquake DSS can be delivered as e-government, not only for the government
itself but also to guarantee each citizen's right to education, training, and
information about earthquakes and how to cope with them. Knowledge can be
managed for future use, and can be mined, by saving and maintaining all the
data and information about earthquakes and earthquake mitigation in Indonesia.
Using web technology will enhance global access and ease of use. A data
warehouse, as an un-normalized database for multidimensional analysis, will
speed up query processing and increase the variety of reports. Links with other
disaster DSSs in one national disaster DSS, and with other government and
international information systems, will enhance the knowledge and sharpen the
reports.
| [
{
"version": "v1",
"created": "Wed, 9 Jun 2010 04:36:14 GMT"
}
] | 1,276,128,000,000 | [
[
"Warnars",
"Spits",
""
]
] |
1006.2204 | Nan Rong | Joseph Y. Halpern, Nan Rong, Ashutosh Saxena | MDPs with Unawareness | 11 pages | null | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | Markov decision processes (MDPs) are widely used for modeling decision-making
problems in robotics, automated control, and economics. Traditional MDPs assume
that the decision maker (DM) knows all states and actions. However, this may
not be true in many situations of interest. We define a new framework, MDPs
with unawareness (MDPUs), to deal with the possibility that a DM may not be
aware of all possible actions. We provide a complete characterization of when a
DM can learn to play near-optimally in an MDPU, and give an algorithm that
learns to play near-optimally when it is possible to do so, as efficiently as
possible. In particular, we characterize when a near-optimal solution can be
found in polynomial time.
| [
{
"version": "v1",
"created": "Fri, 11 Jun 2010 06:18:27 GMT"
}
] | 1,276,473,600,000 | [
[
"Halpern",
"Joseph Y.",
""
],
[
"Rong",
"Nan",
""
],
[
"Saxena",
"Ashutosh",
""
]
] |
1006.2743 | Marek Petrik | Marek Petrik and Shlomo Zilberstein | Global Optimization for Value Function Approximation | null | null | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | Existing value function approximation methods have been successfully used in
many applications, but they often lack useful a priori error bounds. We propose
a new approximate bilinear programming formulation of value function
approximation, which employs global optimization. The formulation provides
strong a priori guarantees on both robust and expected policy loss by
minimizing specific norms of the Bellman residual. Solving a bilinear program
optimally is NP-hard, but this is unavoidable because the Bellman-residual
minimization itself is NP-hard. We describe and analyze both optimal and
approximate algorithms for solving bilinear programs. The analysis shows that
this algorithm offers a convergent generalization of approximate policy
iteration. We also briefly analyze the behavior of bilinear programming
algorithms under incomplete samples. Finally, we demonstrate that the proposed
approach can consistently minimize the Bellman residual on simple benchmark
problems.
| [
{
"version": "v1",
"created": "Mon, 14 Jun 2010 15:38:41 GMT"
}
] | 1,276,560,000,000 | [
[
"Petrik",
"Marek",
""
],
[
"Zilberstein",
"Shlomo",
""
]
] |
1006.3021 | Michael Fink | Michael Fink | A General Framework for Equivalences in Answer-Set Programming by
Countermodels in the Logic of Here-and-There | 32 pages; to appear in Theory and Practice of Logic Programming
(TPLP) | null | null | INFSYS RR-1843-09-05 | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | Different notions of equivalence, such as the prominent notions of strong and
uniform equivalence, have been studied in Answer-Set Programming, mainly for
the purpose of identifying programs that can serve as substitutes without
altering the semantics, for instance in program optimization. Such semantic
comparisons are usually characterized by various selections of models in the
logic of Here-and-There (HT). For uniform equivalence however, correct
characterizations in terms of HT-models can only be obtained for finite
theories, respectively programs. In this article, we show that a selection of
countermodels in HT captures uniform equivalence also for infinite theories.
This result is turned into coherent characterizations of the different notions
of equivalence by countermodels, as well as by a mixture of HT-models and
countermodels (so-called equivalence interpretations). Moreover, we generalize
the notion of relativized hyperequivalence for programs to
propositional theories, and apply the same methodology in order to obtain a
semantic characterization which is amenable to infinite settings. This allows
for a lifting of the results to first-order theories under a very general
semantics given in terms of a quantified version of HT. We thus obtain a
general framework for the study of various notions of equivalence for theories
under answer-set semantics. Moreover, we prove an expedient property that
allows for a simplified treatment of extended signatures, and provide further
results for non-ground logic programs. In particular, uniform equivalence
coincides under open and ordinary answer-set semantics, and for finite
non-ground programs under these semantics, also the usual characterization of
uniform equivalence in terms of maximal and total HT-models of the grounding is
correct, even for infinite domains, when corresponding ground programs are
infinite.
| [
{
"version": "v1",
"created": "Tue, 15 Jun 2010 16:04:54 GMT"
}
] | 1,276,646,400,000 | [
[
"Fink",
"Michael",
""
]
] |
1006.4544 | William Jackson | Mir Anamul Hasan, Khaja Md. Sher-E-Alam and Ahsan Raja Chowdhury | Human Disease Diagnosis Using a Fuzzy Expert System | IEEE Publication Format,
https://sites.google.com/site/journalofcomputing/ | Journal of Computing, Vol. 2, No. 6, June 2010, NY, USA, ISSN
2151-9617 | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | Human disease diagnosis is a complicated process and requires high level of
expertise. Any attempt of developing a web-based expert system dealing with
human disease diagnosis has to overcome various difficulties. This paper
describes a project work aiming to develop a web-based fuzzy expert system for
diagnosing human diseases. Nowadays, fuzzy systems are being used successfully
in an increasing number of application areas; they use linguistic rules to
describe systems. This research project focuses on the research and development
of a web-based clinical tool designed to improve the quality of the exchange of
health information between health care professionals and patients.
Practitioners can also use this web-based tool to corroborate diagnosis. The
proposed system is tested on various scenarios in order to evaluate its
performance. In all cases, the proposed system exhibits satisfactory results.
| [
{
"version": "v1",
"created": "Wed, 23 Jun 2010 15:02:19 GMT"
}
] | 1,277,337,600,000 | [
[
"Hasan",
"Mir Anamul",
""
],
[
"Sher-E-Alam",
"Khaja Md.",
""
],
[
"Chowdhury",
"Ahsan Raja",
""
]
] |
1006.4551 | William Jackson | Supriya Raheja and Smita Rajpal | Vagueness of Linguistic variable | IEEE Publication Format,
https://sites.google.com/site/journalofcomputing/ | Journal of Computing, Vol. 2, No. 6, June 2010, NY, USA, ISSN
2151-9617 | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | Artificial intelligence is the area of computer science that focuses on
creating machines able to engage in behaviors that humans consider intelligent.
The ability to create intelligent machines has intrigued humans since ancient
times, and today, with the advent of the computer and fifty years of research
into various programming techniques, the dream of smart machines is becoming a
reality. Researchers are creating systems that can mimic human thought,
understand speech, beat the best human chess player, and perform countless
other feats never before possible. The human ability to estimate information is
shown most clearly in the use of natural languages. When using the words of a
natural language to evaluate qualitative attributes, a person introduces
uncertainty, in the form of vagueness, into his estimations. Vague sets, vague
judgments and vague conclusions arise wherever and whenever a reasoning subject
exists and is interested in something. The theory of vague sets arose as an
answer to the vagueness of the language that a reasoning subject speaks. The
language of a reasoning subject is generated by vague events, which are created
by reason and operated by the mind. The theory of vague sets represents an
attempt to find an approximation of vague groupings that is more convenient
than the classical theory of sets in situations where natural language plays a
significant role. Such a theory has been offered by Gau and Buehrer. In this
paper we describe how the vagueness of linguistic variables can be handled by
using vague set theory. The paper is mainly intended for one of the directions
of eventology (the theory of random vague events), which arose within the
limits of probability theory and which pursues the unique purpose of describing
a movement of reason eventologically.
| [
{
"version": "v1",
"created": "Wed, 23 Jun 2010 15:09:04 GMT"
}
] | 1,277,337,600,000 | [
[
"Raheja",
"Supriya",
""
],
[
"Rajpal",
"Smita",
""
]
] |
1006.4561 | William Jackson | Amjad Farooq, Syed Ahsan and Abad Shah | An Efficient Technique for Similarity Identification between Ontologies | IEEE Publication Format,
https://sites.google.com/site/journalofcomputing/ | Journal of Computing, Vol. 2, No. 6, June 2010, NY, USA, ISSN
2151-9617 | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | Ontologies usually suffer from the semantic heterogeneity when simultaneously
used in information sharing, merging, integrating and querying processes.
Therefore, the similarity identification between ontologies being used becomes
a mandatory task for all these processes to handle the problem of semantic
heterogeneity. In this paper, we propose an efficient technique for similarity
measurement between two ontologies. The proposed technique identifies all
candidate pairs of similar concepts without omitting any similar pair. The
proposed technique can be used in different types of operations on ontologies
such as merging, mapping and aligning. An analysis of its results shows a
reasonable improvement in completeness, correctness and overall quality.
| [
{
"version": "v1",
"created": "Wed, 23 Jun 2010 15:17:01 GMT"
}
] | 1,277,337,600,000 | [
[
"Farooq",
"Amjad",
""
],
[
"Ahsan",
"Syed",
""
],
[
"Shah",
"Abad",
""
]
] |
1006.4563 | William Jackson | Mohammad Mustafa Taye | The State of the Art: Ontology Web-Based Languages: XML Based | IEEE Publication Format,
https://sites.google.com/site/journalofcomputing/ | Journal of Computing, Vol. 2, No. 6, June 2010, NY, USA, ISSN
2151-9617 | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | Many formal languages have been proposed to express or represent Ontologies,
including RDF, RDFS, DAML+OIL and OWL. Most of these languages are based on XML
syntax, but with various terminologies and expressiveness. Therefore, choosing
a language for building an Ontology is a key step. The choice of a language to
represent an Ontology depends mainly on what the Ontology will represent or be
used for. The language should offer a range of quality-support features such as
ease of use, expressive power, compatibility, sharing and versioning, and
internationalisation, because different kinds of knowledge-based applications
need different language features. The main objective of these languages is to
add semantics to the existing information on the web. The aim of this paper is
to provide a good knowledge and understanding of these existing languages and
of how they can be used.
| [
{
"version": "v1",
"created": "Wed, 23 Jun 2010 15:18:33 GMT"
}
] | 1,277,337,600,000 | [
[
"Taye",
"Mohammad Mustafa",
""
]
] |
1006.4567 | William Jackson | Mohammad Mustafa Taye | Understanding Semantic Web and Ontologies: Theory and Applications | IEEE Publication Format,
https://sites.google.com/site/journalofcomputing/ | Journal of Computing, Vol. 2, No. 6, June 2010, NY, USA, ISSN
2151-9617 | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | Semantic Web is actually an extension of the current one in that it
represents information more meaningfully for humans and computers alike. It
enables the description of contents and services in machine-readable form, and
enables annotating, discovering, publishing, advertising and composing services
to be automated. It was developed based on Ontology, which is considered as the
backbone of the Semantic Web. In other words, the current Web is transformed
from being machine-readable to machine-understandable. In fact, Ontology is a
key technique with which to annotate semantics and provide a common,
comprehensible foundation for resources on the Semantic Web. Moreover, Ontology
can provide a common vocabulary, a grammar for publishing data, and can supply
a semantic description of data which can be used to preserve the Ontologies and
keep them ready for inference. This paper provides basic concepts of web
services and the Semantic Web, defines the structure and the main applications
of ontology, and explains many relevant terms in order to provide a basic
understanding of ontologies.
| [
{
"version": "v1",
"created": "Wed, 23 Jun 2010 15:20:02 GMT"
}
] | 1,277,337,600,000 | [
[
"Taye",
"Mohammad Mustafa",
""
]
] |
1006.5041 | Yoshinobu Kawahara | Yoshinobu Kawahara, Kenneth Bollen, Shohei Shimizu and Takashi Washio | GroupLiNGAM: Linear non-Gaussian acyclic models for sets of variables | null | null | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | Finding the structure of a graphical model has received much attention
in many fields. Recently, it has been reported that the non-Gaussianity of data
enables us to identify the structure of a directed acyclic graph without any
prior knowledge of the structure. In this paper, we propose a novel
non-Gaussianity-based algorithm for a more general type of model: chain graphs.
The algorithm finds an ordering of disjoint subsets of variables by
iteratively evaluating the independence between a variable subset and the
residuals when the remaining variables are regressed on it. However, its
computational cost grows exponentially with the number of variables.
Therefore, we further discuss an efficient approximate approach for applying
the algorithm to large graphs. We illustrate the algorithm with
artificial and real-world datasets.
| [
{
"version": "v1",
"created": "Thu, 24 Jun 2010 13:09:36 GMT"
}
] | 1,277,683,200,000 | [
[
"Kawahara",
"Yoshinobu",
""
],
[
"Bollen",
"Kenneth",
""
],
[
"Shimizu",
"Shohei",
""
],
[
"Washio",
"Takashi",
""
]
] |
1006.5511 | Athar Kharal | Athar Kharal | Soft Approximations and uni-int Decision Making | This paper has been withdrawn by the author due to further expansion
of this work. Work is also submitted to a peer reviewed journal and is
expected to be published very soon | null | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | Notions of core, support and inversion of a soft set have been defined and
studied. Soft approximations are soft sets developed through core and support,
and are used for granulating the soft space. The membership structure of a soft
set has been probed into, and many interesting properties are presented. The mathematical
apparatus developed so far in this paper yields a detailed analysis of two
works viz. [N. Cagman, S. Enginoglu, Soft set theory and uni-int decision
making, European Jr. of Operational Research (article in press, available
online 12 May 2010)] and [N. Cagman, S. Enginoglu, Soft matrix theory and its
decision making, Computers and Mathematics with Applications 59 (2010) 3308 -
3314.]. We prove (Theorem 8.1) that the uni-int method of Cagman is equivalent
to a core-support expression which is computationally far less expensive than
uni-int. This also highlights some shortcomings in Cagman's uni-int method and
thus motivates us to improve the method. We first suggest an improvement in
uni-int method and then present a new conjecture to solve the optimum choice
problem given by Cagman and Enginoglu. Our Example 8.6 presents a case where
the optimum choice is intuitively clear, yet both uni-int methods (Cagman's and
our improved one) give a wrong answer, while the new conjecture solves the problem
correctly.
| [
{
"version": "v1",
"created": "Tue, 29 Jun 2010 06:58:35 GMT"
},
{
"version": "v2",
"created": "Mon, 5 Jul 2010 02:01:34 GMT"
}
] | 1,480,032,000,000 | [
[
"Kharal",
"Athar",
""
]
] |
1006.5657 | Alessandra Mileo | A. Mileo, D. Merico, R. Bisiani | Reasoning Support for Risk Prediction and Prevention in Independent
Living | 36 pages, 5 figures, 10 tables. To appear in Theory and Practice of
Logic Programming (TPLP) | null | null | null | cs.AI | http://arxiv.org/licenses/nonexclusive-distrib/1.0/ | In recent years there has been growing interest in solutions for the delivery
of clinical care for the elderly, due to the large increase in the aging
population. Monitoring a patient in his home environment is necessary to ensure
continuity of care in home settings, but, to be useful, this activity must not
be too invasive for patients and a burden for caregivers. We prototyped a
system called SINDI (Secure and INDependent lIving), focused on i) collecting a
limited amount of data about the person and the environment through Wireless
Sensor Networks (WSN), and ii) inferring from these data enough information to
support caregivers in understanding patients' well being and in predicting
possible evolutions of their health. Our hierarchical logic-based model of
health combines data from different sources, sensor data, tests results,
common-sense knowledge and patient's clinical profile at the lower level, and
correlation rules between health conditions across upper levels. The logical
formalization and the reasoning process are based on Answer Set Programming.
The expressive power of this logic programming paradigm makes it possible to
reason about health evolution even when the available information is incomplete
and potentially incoherent, while declarativity simplifies rules specification
by caregivers and allows automatic encoding of knowledge. This paper describes
how these issues have been targeted in the application scenario of the SINDI
system.
| [
{
"version": "v1",
"created": "Tue, 29 Jun 2010 15:49:54 GMT"
}
] | 1,277,856,000,000 | [
[
"Mileo",
"A.",
""
],
[
"Merico",
"D.",
""
],
[
"Bisiani",
"R.",
""
]
] |
1007.0412 | Dimple Juneja Dr. | Ujwalla Gawande, Mukesh Zaveri, Avichal Kapur | Improving Iris Recognition Accuracy By Score Based Fusion Method | http://ijict.org/index.php/ijoat/article/view/improving-iris-recognition | International Journal of Advancements in Technology, Vol 1, No 1
(2010) | null | null | cs.AI | http://creativecommons.org/licenses/by/3.0/ | Iris recognition technology, used to identify individuals by photographing
the iris of their eye, has become popular in security applications because of
its ease of use, accuracy, and safety in controlling access to high-security
areas. Fusion of multiple algorithms for biometric verification performance
improvement has received considerable attention. The proposed method combines
zero-crossing 1-D wavelet, Euler number, and genetic-algorithm-based approaches
for feature extraction. The outputs from these three algorithms are normalized
and their scores are fused to decide whether the user is genuine or an
imposter. These new strategies are discussed in this paper, in order to compute
a multimodal combined score.
| [
{
"version": "v1",
"created": "Thu, 1 Jul 2010 09:24:37 GMT"
}
] | 1,278,288,000,000 | [
[
"Gawande",
"Ujwalla",
""
],
[
"Zaveri",
"Mukesh",
""
],
[
"Kapur",
"Avichal",
""
]
] |