This thesis presents Collabode, a web-based integrated development environment for Java. With real-time collaborative editing, multiple programmers can use Collabode to edit the same source code at the same time. Collabode introduces error-mediated integration, where multiple editors see the text of one another's changes while being isolated from errors and in-progress work, and error-free changes are integrated automatically. Three models of collaborative programming are presented and evaluated using Collabode. Classroom programming brings zero-setup web-based programming to computer science students working in a classroom or lab. Test-driven pair programming combines two existing software development strategies to create a model with clear roles and explicit tool support. And micro-outsourcing enables one programmer to easily request and integrate very small contributions from many distributed assistants, demonstrating how a system for highly-collaborative programming enables a development model infeasible with current tools.

To show that highly-collaborative programming, using real-time collaborative editing of source code, is practical, useful, and enables new models of software development, this thesis presents a series of user studies. A study with pairs of both student and professional programmers shows that error-mediated integration allows them to work productively in parallel. In a semester-long deployment of Collabode, students in an MIT software engineering course used the system for classroom programming. In a lab study of a Collabode prototype, professional programmers used test-driven pair programming. Finally, a study involving both in-lab participants and contractors hired online demonstrated how micro-outsourcing allowed participants to approach programming in a new way, one enabled by collaborative editing, automatic error-mediated integration, and a web-based environment requiring no local setup.
['Robert C. Miller', 'Max Goldman']
Software development with real-time collaborative editing
561,140
Deploying radio-capable gateways at the sea surface can mitigate the limitations of acoustic communications in underwater sensor networks (UWSN). Finding the best placement for gateways is formulated as an ILP problem. The choice of gateway candidate locations can affect both the feasibility and the quality of the solution. In this paper, we show the limitation of using a regular mesh of candidate locations, and present a novel algorithm for defining candidate locations that both reduces the complexity of the optimization problem and enhances the feasibility and quality of the solution.
['Saleh Ibrahim', 'R. Ammar', 'Jun-Hong Cui']
Geometry-assisted gateway deployment for underwater sensor networks
281,448
Sequential consistency is a multiprocessor memory model of both practical and theoretical importance. Designing and implementing a memory system that efficiently provides a given memory model is a challenging and error-prone task, so automated verification support would be invaluable. Unfortunately, the general problem of deciding whether a finite-state protocol implements sequential consistency is undecidable. In this paper, we identify a restricted class of protocols for which verifying sequential consistency is decidable. The class includes all published sequentially consistent protocols that are known to us, and we argue why the class is likely to include all real sequentially consistent protocols. In principle, our method can be applied in a completely automated fashion for verification of all implemented protocols.
['Anne Condon', 'Alan J. Hu']
Automatable verification of sequential consistency
545,933
Optimizing PolyACO Training with GPU-Based Parallelization
['Torry Tufteland', 'Guro Ødesneltvedt', 'Morten Goodwin']
Optimizing PolyACO Training with GPU-Based Parallelization
871,876
Agreeing to Disagree: Leveraging Consensus and Divergence in Bayesian Belief Aggregation.
['Kshanti A. Greene', 'George F. Luger']
Agreeing to Disagree: Leveraging Consensus and Divergence in Bayesian Belief Aggregation.
731,369
We present the concepts of α-well-posedness for parametric noncooperative games and for optimization problems with constraints defined by parametric Nash equilibria. We investigate some classes of functions that ensure these types of well-posedness and the connections with α-well-posedness for variational inequalities and optimization problems with variational inequality constraints.
['M. Beatrice Lignola', 'Jacqueline Morgan']
α-Well-posedness for Nash Equilibria and For Optimization Problems with Nash Equilibrium Constraints
140,184
In order to be able to measure the impact of future government proposals and policies on a large scale, policy makers need artificial environments that enable mass participation and user interaction in contexts that simulate real-life societies. These environments will be used as "sandboxes" where potential policies will be put into force and users' reactions will be captured and analyzed in order to estimate the effectiveness of the policy in real societies. Virtual spaces, including 3D virtual worlds and social networks, are ideal for this purpose: due to their popularity as well as their similarity to the real world, they can be seen as a simulation of society, a micro-society. The +Spaces project aims to provide the technology enabling governments to use Virtual Spaces as testing environments where prospective policies can be applied, as well as the necessary tools for capturing, processing and analyzing public reactions to them. This paper presents an SOA-based architecture that resides on top of Virtual Spaces and allows users to deploy applications that enable policy simulation, to monitor and evaluate users' reactions to the new policy, and to present the results to the policy maker in a meaningful way.
['Magdalini Kardara', 'Omri Fuchs', 'Fotis Aisopos', 'Athanasios Papaoikonomou', 'Konstantinos Tserpes', 'Theodora A. Varvarigou']
A Service Oriented Architecture Enabling Policy Simulation in Virtual Spaces
301,433
IP address lookup is one of the most important functionalities in router design. To meet the requirements of high-speed routers consisting of line cards with 40 Gbps transfer rates, researchers usually take lookup/update speed, storage requirements, and scalability into consideration when designing a high-performance forwarding engine. As a result, hardware-based solutions are often used to develop high-speed routers nowadays. In this paper, we develop an FPGA-based pipelined forwarding engine that focuses on reducing the update overhead. The proposed scheme partitions the routing table into several disjoint groups. Prefixes that reside in the same group are stored interleaved across several memory modules to enable parallel comparison at the comparison stage. With the pipeline enabled, the throughput of the design can reach the speed of OC-768. The update overhead is also reduced.
['Yeim-Kuan Chang', 'Yen-Cheng Liu', 'Fang-Chen Kuo']
A Pipelined IP Forwarding Engine with Fast Update
387,526
It is well-known that crisp RDF is not suitable to represent vague information. Fuzzy RDF variants are emerging to overcome these limitations. In this work we provide, under a very general semantics, a deductive system for a salient fragment of fuzzy RDF. We then also show how we may compute the top-k answers to the union of conjunctive queries in which answers may be scored by means of a scoring function.
['Umberto Straccia']
A Minimal Deductive System for General Fuzzy RDF
458,224
In this paper we present a model approximation technique based on N-step-ahead affine representations obtained via Monte-Carlo integrations. The approach enables simultaneous linearization and model order reduction of nonlinear systems in the original state space thus allowing the application of linear MPC algorithms to nonlinear systems. The methodology is detailed through its application to benchmark model examples.
['Romain S.C. Lambert', 'Pedro Rivotti', 'Efstratios N. Pistikopoulos']
A Monte-Carlo based model approximation technique for linear model predictive control of nonlinear systems
264,334
We present here the first version of an automatic tool for the synthesis of dataparts with fault detection or tolerance characteristics. This work is to be combined with solutions already proposed for controllers, in order to provide a complete control-dominated synthesis flow allowing the synthesis of control/data architectures with fault detection or tolerance capabilities.
['X. Wendling', 'H. Chauvet', 'Lionel Revéret', 'Raphaël Rochet', 'Régis Leveugle']
Automatic and optimized synthesis of dataparts with fault detection or tolerance capabilities
471,523
The objective of this paper is to derive some limit theorems of fuzzy random variables under the extension principle associated with continuous Archimedean triangular norms (t-norms). First of all, some convergence theorems for the sum of fuzzy random variables in chance measure and expected value are proved respectively based on the arithmetics of continuous Archimedean triangular norms. Then, a law of large numbers for fuzzy random variables is established by using the obtained convergence theorems. The results of the derived law of large numbers can degenerate to the strong laws of large numbers for random variables and fuzzy variables, respectively.
['Shuming Wang', 'Junzo Watada']
T-norm-based limit theorems for fuzzy random variables
239,381
Accurate classification of traffic flows is highly beneficial for network management and security monitoring. Nowadays, many researchers have proposed machine learning techniques (i.e., decision tree, SVM, BayesNet and Naïve Bayes) for traffic classification. However, none of these classification techniques can achieve the highest accuracy for all traffic classification tasks. Recently, more and more researchers tried to combine multiple classifiers to obtain better performance. In this paper, we propose a weighted combination technique for traffic classification. The weighted combination approach first takes advantage of the confidence values inferred by each individual classifier; then assigns weight for each classifier according to its prediction accuracy on a validation traffic dataset. Experimental results on two different traffic traces demonstrate that our new weighted multi-classification framework is able to obtain satisfactory results.
['Jinghua Yan', 'Xiaochun Yun', 'Zhigang Wu', 'Hao Luo', 'Shuzhuang Zhang']
A novel weighted combination technique for traffic classification
939,733
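The weighted-combination idea in the abstract above, assigning each base classifier a weight equal to its accuracy on a validation set and then summing weighted class confidences, can be sketched in plain Python. This is a minimal illustration, not the paper's implementation: the traffic labels, toy data, and function names here are hypothetical stand-ins for the paper's decision tree, SVM, BayesNet and Naïve Bayes classifiers.

```python
def validation_weights(predictions, truth):
    """Weight each classifier by its accuracy on a validation set."""
    return [sum(p == t for p, t in zip(preds, truth)) / len(truth)
            for preds in predictions]

def weighted_combine(confidences, weights):
    """Combine per-classifier class-confidence dicts with validation weights."""
    scores = {}
    for conf, w in zip(confidences, weights):
        for label, p in conf.items():
            scores[label] = scores.get(label, 0.0) + w * p
    # Predict the class with the highest weighted confidence.
    return max(scores, key=scores.get)

# Toy example: two classifiers, three validation flows.
val_preds = [["web", "p2p", "web"], ["web", "web", "web"]]
val_truth = ["web", "p2p", "web"]
w = validation_weights(val_preds, val_truth)   # [1.0, 2/3]

# Per-class confidences from each classifier for one test flow.
confs = [{"web": 0.6, "p2p": 0.4}, {"web": 0.3, "p2p": 0.7}]
print(weighted_combine(confs, w))
```

The perfectly accurate classifier dominates the combination, but the weaker classifier's high confidence can still tip the prediction, which is exactly the behavior a confidence-plus-accuracy weighting is meant to allow.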
Sorting has tremendous usage in applications that handle massive amounts of data. Existing techniques accelerate sorting using multiprocessors or GPGPUs, where a data set is partitioned into disjoint subsets to allow multiple sorting threads to work in parallel. Hardware sorters implemented in FPGAs have the potential of providing high-speed and low-energy solutions, but the partition algorithms used in software systems are so data dependent that they cannot be easily adopted. The speed of most current sequential sorters is still limited to about one number per cycle. Recently, a new hardware merge sorter broke this speed limit by merging a large number of sorted sequences at a speed proportional to the number of sequences. This paper significantly improves its area and speed scalability by allowing stalls and a variable sorting rate. A 32-port parallel merge tree that merges 32 sequences is implemented in a Virtex-7 FPGA. It merges sequences at an average rate of 31.05 numbers/cycle and reduces the total sorting time by 160 times compared with traditional sequential sorters.
['Wei Song', 'Dirk Koch', 'Mikel Luján', 'Jim D. Garside']
Parallel Hardware Merge Sorter
865,957
This paper describes a methodology for performing system level signal and power integrity analyses of SiP-based systems. The paper briefly outlines some new modeling and simulation techniques that have been developed to enable the proposed methodology. Some results based on the application of this methodology on test systems are also presented.
['Rohan Mandrekar', 'Krishna Bharath', 'Krishna Srinivasan', 'Ege Engin', 'Madhavan Swaminathan']
System level signal and power integrity analysis methodology for system-in-package applications
223,638
Generating realistic and responsive traffic that reflects different network conditions is a challenging problem associated with performing valid experiments in network testbeds. In this work, we present Caliper, a highly precise traffic generation tool built on NetThreads, a flexible platform that we have created for developing packet processing applications on FPGA-based devices and the NetFPGA in particular. We will demonstrate the effect of ad-hoc inter-departure times on a commodity NIC compared to precisely timed inter-departures with Caliper. Both NetThreads and Caliper are available as free software to download.
['Monia Ghobadi', 'Martin Labrecque', 'Geoffrey Salmon', 'Kaveh Aasaraai', 'Soheil Hassas Yeganeh', 'Yashar Ganjali', 'J. Gregory Steffan']
Caliper: a tool to generate precise and closed-loop traffic
238,747
External reliability in data-center networking is today commonly reached via forms of provider multihoming, so as to guarantee higher service availability rates. In parallel, Cloud users also resort to multihoming via different device access interfaces (Wi-Fi, 3G, wired). Both practices add path diversity between Cloud users and servers, which legacy communication protocols cannot exploit. To fill this void, we present a holistic multipath communication architecture for Cloud access and inter-Cloud communications, and defend its possible implementation using three promising recent protocols functionally acting at three different communication layers: MPTCP, LISP and TRILL.
['Matthieu Coudron', 'Stefano Secci', 'Guido Maier', 'Guy Pujolle', 'Achille Pattavina']
Boosting Cloud Communications through a Crosslayer Multipath Protocol Architecture
919,047
In this paper we discuss how three types of fuzzy partitions can be used to describe the results of three types of cluster structures. Standard fuzzy partitions are suitable for centroid based clusters, and I-fuzzy partitions for clusters represented by segments or lines (e.g., c-varieties). In this paper, we introduce hesitant fuzzy partitions. They are suitable for clusters defined by sets of centroids. Because of that, we show that they are useful for hierarchical clustering. We also establish the relationship between hesitant fuzzy partitions and I-fuzzy partitions.
['Vicenç Torra', 'Laya Aliahmadipour', 'Anders Dahlbom']
Fuzzy, I-fuzzy, and H-fuzzy partitions to describe clusters
946,433
Programming language design and implementation remain among the challenges of computer science. Programmers use a variety of languages in their daily work, and new languages appear frequently. With formal methods for programming language description, a language designer has the chance to automatically generate a compiler or an interpreter. Unfortunately, compiler generators nowadays use linear textual specifications, which are less suitable than visual presentations. In this paper, language development in a visual manner is described.
['Robet Krusec', 'Mitja Lenic', 'Marjan Mernik', 'Viljem Zumer']
Language development in a visual manner
324,438
We propose the innovative architecture of Superfluidity, a Horizon 2020 project, co-funded by the European Union. Superfluidity targets 5G networks, by addressing key network operator challenges with a multi-pronged approach, based on the concept of a flexible, highly adaptive, superfluid network. Superfluidity supports rapid service deployment and migration in a heterogeneous network environment, regardless of the underlying hardware. The overall proposal offers advanced capabilities in terms of service deployment and interoperability, while guaranteeing high-performance levels end-to-end. Copyright © 2016 John Wiley & Sons, Ltd.
['Giuseppe Bianchi', 'Erez Biton', 'Nicola Blefari-Melazzi', 'Isabel Borges', 'Luca Chiaraviglio', 'Pedro de la Cruz Ramos', 'Philip Eardley', 'Francisco Fontes', 'Michael J. McGrath', 'Lionel Natarianni', 'Dragos Niculescu', 'Carlos Parada', 'Matei Popovici', 'Vincenzo Riccobene', 'Stefano Salsano', 'Bessem Sayadi', 'John Thomson', 'Christos Tselios', 'George Tsolis']
Superfluidity: a flexible functional architecture for 5G networks
991,963
Plants can have positive effects on each other in numerous ways, including protection from harsh environmental conditions. This phenomenon, known as facilitation, occurs in water-stressed environments when shade from larger shrubs protects smaller annuals from harsh sun, enabling them to exist on scarce water. The topic of this paper is a model of this phenomenon that allows search algorithms to find residential landscape designs that incorporate facilitation to conserve water. This model is grounded in botany; it captures the growth requirements of real plant species in a fitness function, but also includes a penalty term in that function that encourages facilitative interactions with other plants on the landscape. To evaluate the effectiveness of this approach, two search strategies, simulated annealing and agent-based search, were applied to models of different collections of simulated plant types and landscapes with different light distributions. These two search strategies produced landscape designs with different spatial distributions of the larger plants. All designs exhibited facilitation and lower water use than designs where facilitation was not included.
['Rhonda Hoenigman', 'Elizabeth Bradley', 'Nichole N. Barger']
Water conservation through facilitation on residential landscapes
687,959
The conservative properties of stiffness matrices under the nonconservative congruence mapping between the joint and Cartesian spaces are investigated with a simulation of two fingers manipulating an object. The properties of both constant and configuration-dependent stiffness matrices are presented by integrating the work done when manipulating along a closed path with no self-intersection. A stiffness matrix is conservative if the force resulting from the stiffness matrix is conservative, and the work done by such a force along a closed path is zero, i.e., independent of the path. Both theoretical derivation and numerical simulation show that a stiffness matrix in the ℝ^{3×3} Cartesian space, or in a joint space with n generalized coordinates, will be conservative if it is symmetric and satisfies the exact differential criterion. Simulation of two fingers manipulating an object is implemented using OpenGL with both Cartesian-based and joint-based stiffness control schemes. The results show that the congruence transformation generally results in a nonconservative stiffness matrix, except for a special group of configuration-dependent solutions.
['Shih-Feng Cheng', 'Imin Kao']
Simulation of conservative properties of stiffness matrices in congruence transformation
273,680
This paper deals with the problem of using data mining models in a real-world situation where the user cannot provide all the inputs with which the predictive model was built. A learning system framework, the Query Based Learning System (QBLS), is developed for improving the performance of predictive models in practice, where not all inputs are available for querying the system. An automatic feature selection algorithm called Query Based Feature Selection (QBFS) is developed for selecting features to obtain a balance between a relatively minimal subset of features and relatively maximal classification accuracy. The performance of the QBLS system and the QBFS algorithm is successfully demonstrated with a real-world application.
['Esther Ge', 'Richi Nayak', 'Yue Xu', 'Yuefeng Li']
A user driven data mining process model and learning system
76,140
Multiple Target Tracking for Mobile Robots Using the JPDAF Algorithm.
['Aliakbar A. Gorji', 'Mohammad Bagher Menhaj', 'Saeed Shiry']
Multiple Target Tracking for Mobile Robots Using the JPDAF Algorithm.
766,903
Classroom interaction is an important part of instruction. Drawing on findings from psychology, artificial intelligence, and cognitive neuroscience, this article analyzes a typical classroom interaction that occurred in a Discrete Mathematics classroom from the perspective of cognitive process and simulates that process with the ACT-R tool. The simulation can depict the execution trace of the cognitive program and demonstrate memory utilization, including long-term declarative memory and long-term procedural memory. The results should help teachers understand the cognitive process more deeply and design instruction more effectively.
['Xuefeng Wei', 'Li Li', 'Guangzuo Cui']
Research in Classroom Interaction: Insight from Cognitive Process
123,596
Open Justice in Latin America: Judiciary Websites Under Scrutiny.
['Rodrigo Sandoval-Almazán', 'David Valle Cruz']
Open Justice in Latin America: Judiciary Websites Under Scrutiny.
978,286
We provide simple but surprisingly useful direct product theorems for proving lower bounds on online algorithms with a limited amount of advice about the future. Intuitively, our direct product theorems say that if b bits of advice are needed to ensure a cost of at most t for some problem, then r*b bits of advice are needed to ensure a total cost of at most r*t when solving r independent instances of the problem. Using our direct product theorems, we are able to translate decades of research on randomized online algorithms to the advice complexity model. Doing so improves significantly on the previous best advice complexity lower bounds for many online problems, or provides the first known lower bounds. For example, we show that:

- A paging algorithm needs Omega(n) bits of advice to achieve a competitive ratio better than H_k = Omega(log k), where k is the cache size. Previously, it was only known that Omega(n) bits of advice were necessary to achieve a constant competitive ratio smaller than 5/4.

- Every O(n^{1-epsilon})-competitive vertex coloring algorithm must use Omega(n log n) bits of advice. Previously, it was only known that Omega(n log n) bits of advice were necessary to be optimal.

For certain online problems, including the MTS, k-server, metric matching, paging, list update, and dynamic binary search tree problem, we prove that randomization and sublinear advice are equally powerful (if the underlying metric space or node set is finite). This means that several long-standing open questions regarding randomized online algorithms can be equivalently stated as questions regarding online algorithms with sublinear advice. For example, we show that there exists a deterministic O(log k)-competitive k-server algorithm with sublinear advice if and only if there exists a randomized O(log k)-competitive k-server algorithm without advice.
Technically, our main direct product theorem is obtained by extending an information theoretical lower bound technique due to Emek, Fraigniaud, Korman, and Rosen [ICALP'09].
['Jesper W. Mikkelsen']
Randomization Can Be as Helpful as a Glimpse of the Future in Online Computation
555,333
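The direct product claim sketched in the abstract above can be stated a little more explicitly. The following is an informal paraphrase of that statement, not the paper's exact theorem:

```latex
% Informal paraphrase of the direct product statement from the abstract.
\textbf{Direct product (informal).}
If every online algorithm for a problem $P$ requires at least $b$ bits of
advice to guarantee cost at most $t$, then every online algorithm solving
$r$ independent instances of $P$ requires at least $r \cdot b$ bits of
advice to guarantee total cost at most $r \cdot t$.
```

In other words, advice requirements compose additively across independent repetitions: an adversary can force the algorithm to pay the single-instance price on each of the r instances unless it receives fresh advice for each.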
EI 2016: IS&T International Symposium on Electronic Imaging, Feb 14-18, 2016, San Francisco, CA, USA
['H. Takehara', 'Yuta Nakashima', 'Tomokazu Sato', 'Naokazu Yokoya']
3D shape template generation from RGB-D images capturing a moving and deforming object
990,232
3D Tracking of Human Hands in Interaction with Unknown Objects.
['Paschalis Panteleris', 'Nikolaos Kyriazis', 'Antonis A. Argyros']
3D Tracking of Human Hands in Interaction with Unknown Objects.
704,928
When analyzing and modeling dynamical systems in the frequency domain, the effects of nonlinearities need to be taken into account. This paper contributes to the analysis of the effects of nonlinearities in the frequency domain by supplying new analytical tools and results that allow spectral analysis of the output of a class of nonlinear systems. A mapping from the parameters defining the nonlinear and LTI dynamics to the output spectrum is derived, which allows analytic description and analysis of the corresponding higher order sinusoidal input describing functions. The theoretical results are illustrated by examples that show both the use and efficiency of the proposed algorithms.
['Dj David Rijlaarsdam', 'Pwjm Pieter Nuij', 'J. Schoukens', 'M Maarten Steinbuch']
Brief paper: Spectral analysis of block structured nonlinear systems and higher order sinusoidal input describing functions
135,341
We investigate the representation of n-tone rows as paths on an n-dimensional hypercube graph with vertices labeled in the power set of the aggregate. These paths run from the vertex labeled by the null set to the one labeled by the full set, passing through vertices whose labels gradually accumulate members of the aggregate. Row relations are then given as hypercube symmetries. Such a model is more sensitive to the musical process of chromatic completion than those that deal more exclusively with n-tone rows and their relations as permutations of an underlying set. Our results lead to a graph-theoretical representation of the duality inherent in the pitch-class/order-number isomorphism of serial theory.
['Robert W. Peck']
A Hypercube-Graph Model for n-Tone Rows and Relations
117,152
Hiding data in binary images can facilitate authentication of important documents in the digital domain, which generally requires a high embedding payload. Recently, a steganography framework known as the wet paper coding has been employed in binary image watermarking to achieve high embedding payload. In this paper, we introduce a new concept of super-pixels, and study how to incorporate them in the framework of wet paper coding to further improve the embedding payload in binary images. Using binary text documents as an example, we demonstrate the effectiveness of the proposed super-pixel technique.
['Hongmei Gou', 'Min Wu']
Improving Embedding Payload in Binary Images with "Super-Pixels"
278,888
In many applications, the manipulations require only part of the degrees of freedom (DOFs) of the end-effector, or some DOFs are more important than the rest. We name these applications prioritized manipulations. The end-effector's DOFs are divided into those which are critical and must be controlled as precisely as possible, and those which have loose specifications, so their tracking performance can be traded off to achieve other needs. In this paper, for the class of general constrained rigid multibody systems (including passive joints and multiple closed kinematic loops), we derive a formulation for partitioning the task space into major and secondary task directions, and finding the velocity and static force mappings that precisely accomplish the major task and optimize some secondary goals such as reliability enhancement, obstacle and singularity avoidance, fault tolerance, or joint limit avoidance. The major task and secondary goals need to be specified in terms of velocities/forces. In addition, a framework is developed to handle two kinds of common actuator failures, torque failure and position failure, by reconfiguring the differential kinematics and static force models. The techniques are tested on a 6-DOF parallel robot. Experimental results illustrate that the approach is practical and yields good performance.
['Yixin Chen', 'John E. McInroy', 'Yong Yi']
Optimal, fault-tolerant mappings to achieve secondary goals without compromising primary performance
140,538
A Duplication Task Scheduling Algorithm in Cloud Environments
['Min Ruan', 'Yun Li', 'Yinjuan Zhang']
A Duplication Task Scheduling Algorithm in Cloud Environments
966,254
We provide a triple of diagonal Latin squares of order 10 that is the closest to being a triple of mutually orthogonal diagonal Latin squares found so far. It was obtained by constructing all orthogonal mates for diagonal Latin squares generated according to a specific scheme. We also show that a triple of mutually orthogonal diagonal Latin squares of order 10 cannot be constructed based on diagonal Latin squares from specific families.
['Oleg Zaikin', 'A. B. Zhuravlev', 'Stepan Kochemazov', 'Eduard Vatutin']
On the Construction of Triples of Diagonal Latin Squares of Order 10
915,916
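The objects in the abstract above have crisp finite checks: a Latin square of order n has each symbol exactly once per row and column, a diagonal Latin square additionally has both main diagonals as permutations, and two squares are orthogonal when the ordered pairs obtained by superimposing them are all distinct. A minimal sketch of these checks, using a small hypothetical order-4 example rather than the paper's order-10 case:

```python
def is_latin(sq):
    """Each symbol 0..n-1 appears once per row and once per column."""
    n = len(sq)
    want = set(range(n))
    return (all(set(row) == want for row in sq) and
            all({sq[i][j] for i in range(n)} == want for j in range(n)))

def is_diagonal_latin(sq):
    """Latin, and both main diagonals are also permutations of 0..n-1."""
    n = len(sq)
    want = set(range(n))
    return (is_latin(sq) and
            {sq[i][i] for i in range(n)} == want and
            {sq[i][n - 1 - i] for i in range(n)} == want)

def are_orthogonal(a, b):
    """Superimposed ordered pairs (a[i][j], b[i][j]) are all distinct."""
    n = len(a)
    pairs = {(a[i][j], b[i][j]) for i in range(n) for j in range(n)}
    return len(pairs) == n * n

# A hypothetical pair of orthogonal diagonal Latin squares of order 4.
A = [[0, 1, 2, 3], [2, 3, 0, 1], [3, 2, 1, 0], [1, 0, 3, 2]]
B = [[0, 1, 2, 3], [3, 2, 1, 0], [1, 0, 3, 2], [2, 3, 0, 1]]
print(is_diagonal_latin(A), is_diagonal_latin(B), are_orthogonal(A, B))
```

A triple, as sought in the paper, would require three such squares pairwise orthogonal; the abstract's point is that for order 10 no complete triple is known, only near-misses.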
A distributed grid computing environment like the European datagrid (EDG), which manages enormous numbers of computing resources (in terms of processing power, disk space, memory usage, etc.) and astronomical amounts of scientific data, requires mechanisms for balanced access by many thousands of simultaneous users. The datagrid accounting system (DGAS) that we present is designed to support an economy-based approach to regulating the distribution of resources among authorized grid users. We describe two possible models for resource pricing and a utility function for economic brokering.
['Rosario M. Piro', 'Andrea Guarise', 'Albert Werbrouck']
An economy-based accounting infrastructure for the datagrid
400,740
Pardon the title, but I wanted to both grab your attention and get straight to the point (which naturally brings my own ethics in choosing the title into question). To clarify: I am referring to textbooks and courses on computer ethics, not to the field of ethics as taught to philosophy majors. I am also speaking only for myself, not for anyone else.
['Peter Froelich']
Ethics considered harmful
51,898
Developers have to constantly improve their apps by fixing critical bugs and implementing the most desired features in order to gain shares in the continuously growing and competitive market of mobile apps. A precious source of information for planning such activities is the reviews left by users on the app store. However, in order to exploit such information, developers need to analyze the reviews manually. This is not feasible if, as frequently happens, the app receives hundreds of reviews per day. In this paper we introduce CLAP (Crowd Listener for releAse Planning), a thorough solution to (i) categorize user reviews based on the information they carry (e.g., bug reporting), (ii) cluster together related reviews (e.g., all reviews reporting the same bug), and (iii) automatically prioritize the clusters of reviews to be implemented when planning the subsequent app release. We evaluated all the steps behind CLAP, showing its high accuracy in categorizing and clustering reviews and the meaningfulness of the recommended prioritizations. Also, given the availability of CLAP as a working tool, we assessed its practical applicability in industrial environments.
['Lorenzo Villarroel', 'Gabriele Bavota', 'Barbara Russo', 'Rocco Oliveto', 'Massimiliano Di Penta']
Release planning of mobile apps based on user reviews
730,541
The paper describes the research carried out into the process of program comprehension during software maintenance within the EUREKA project REM (Reverse Engineering and Maintenance). Tools to aid maintenance programmers to achieve and document an overall interpretation of the system being maintained, as well as a deep understanding of the fine details of the source code, are presented. The cognition model assumed exploits both the top-down and the bottom-up approaches: program comprehension is intended as an iterative process of guessing, constructing hypotheses, and verifying them. This process is supported by providing maintenance programmers with a flexible system for querying source code and testing hypotheses against the evidence in the code. Several facilities generate new documents at the design and specification level, thus allowing maintenance programmers to record the knowledge gained for future use.
['Gerardo Canfora', 'L. Mancini', 'Maria Tortorella']
A workbench for program comprehension during software maintenance
471,444
We consider a one-to-two Gaussian broadcasting problem where the transmitter observes a memoryless bi-variate Gaussian source and each receiver wishes to estimate one of the source components. The transmitter describes the source pair by means of an average-power-constrained signal and each receiver observes this signal corrupted by a different additive white Gaussian noise. From its respective observation, Receiver 1 wishes to estimate the first source component and Receiver 2 wishes to estimate the second. We seek to characterize the pairs of expected squared-error distortions that are simultaneously achievable at the two receivers. Our result is that below a certain SNR-threshold an "uncoded scheme" that sends a linear combination of the source components is optimal. We present a lower bound on this threshold in terms of the source correlation and the distortion at the receiver with weaker channel noise.
['Shraga I. Bross', 'Amos Lapidoth', 'Stephan Tinguely']
Broadcasting correlated Gaussians
195,138
The multiple hypothesis tracking (MHT) approach has been proven to be successful in multiple target tracking applications, however, its computational complexity remains a major hurdle to its practical implementation. This paper presents an efficient MHT implementation, referred to as “GRASP-MHT”, which integrates a greedy randomized adaptive search procedure (GRASP) within a track-oriented MHT framework. The hypothesis generating problem arising in the MHT framework is formulated as a maximum weighted independent set problem, and a GRASP module is designed to generate multiple high-quality hypotheses. An extensive simulation study was carried out to compare the performance of the proposed GRASP-MHT against several well-known multitarget tracking algorithms, and multiple metrics were considered in order to make the performance evaluation more comprehensive. Experimental results indicate that, by efficiently generating and fusing multiple high-quality global hypotheses in the data association process, GRASP-MHT is able to achieve better overall tracking performance than other algorithms, especially in a closely-spaced multitarget scenario.
['Xiaoyi Ren', 'Zhipei Huang', 'Shuyan Sun', 'Dongyan Liu', 'Jiankang Wu']
An Efficient MHT Implementation Using GRASP
290,970
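The GRASP-MHT abstract above formulates hypothesis generation as a maximum weighted independent set (MWIS) problem solved by a greedy randomized adaptive search procedure. As a hedged illustration of the general GRASP construction idea only (not the authors' implementation: the function name, the restricted-candidate-list rule, and the omission of a local-search phase are our own simplifications), a toy MWIS sketch might look like:

```python
import random

def grasp_mwis(weights, edges, iters=50, alpha=0.3, seed=1):
    """Toy GRASP for maximum weighted independent set.

    weights: dict node -> weight; edges: iterable of node pairs.
    Repeatedly builds a solution greedily from a restricted candidate
    list (RCL) and keeps the best one found.
    """
    adj = {v: set() for v in weights}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    rng = random.Random(seed)
    best, best_w = set(), 0.0
    for _ in range(iters):
        cand, sol = set(weights), set()
        while cand:
            ws = [weights[v] for v in cand]
            # RCL keeps candidates whose weight is within alpha of the best
            thresh = max(ws) - alpha * (max(ws) - min(ws))
            rcl = [v for v in cand if weights[v] >= thresh]
            v = rng.choice(rcl)
            sol.add(v)
            cand -= {v} | adj[v]  # keep independence: drop v and neighbors
        w = sum(weights[v] for v in sol)
        if w > best_w:
            best, best_w = sol, w
    return best, best_w
```

On a path a-b-c with weights 1, 3, 1, the independent set {b} (weight 3) beats {a, c} (weight 2), and the sketch finds it.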
We propose and demonstrate a new paradigm for active vision research that draws upon recent advances in the fields of artificial life and computer graphics. A software alternative to the prevailing hardware vision mindset, animat vision prescribes artificial animals, or animats, situated in physics-based virtual worlds as autonomous virtual robots possessing active perception systems. To be operative in its world, an animat must autonomously control its eyes and muscle-actuated body, applying computer vision algorithms to continuously analyze the retinal image streams acquired by its eyes in order to locomote purposefully through its world. We describe an initial animat vision implementation within lifelike artificial fishes inhabiting a physics-based, virtual marine world. Emulating the appearance, motion, and behavior of real fishes in their natural habitats, these animats are capable of spatially nonuniform retinal imaging, foveation, retinal image stabilization, color object recognition, and perceptually-guided navigation. These capabilities allow them to pursue moving targets such as fellow artificial fishes. Animat vision offers a fertile approach to the development, implementation, and evaluation of computational theories that profess sensorimotor competence for animal or robotic situated agents.
['Demetri Terzopoulos', 'Tamer F. Rabie']
Animat vision: Active vision in artificial animals
516,957
A central activity in Språkbanken, an R&D unit at the University of Gothenburg, is the systematic construction of a research infrastructure based on interoperability and widely accepted standards for metadata and data. The two main components of this infrastructure deal with text corpora and with lexical resources. For modularity and flexibility, both components have a backend, or server-side part, accessed through an API made up of a set of well-defined web services. This means that there can be any number of different user interfaces to these components, corresponding, e.g., to different research needs. Here, we will demonstrate the standard corpus and lexicon search interfaces, designed primarily for linguistic searches: Korp and Karp.
['Malin Ahlberg', 'Lars Borin', 'Markus Forsberg', 'Martin Hammarstedt', 'Leif-Jöran Olsson', 'Olof Olsson', 'Johan Roxendal', 'Jonatan Uppström']
Korp and Karp - a bestiary of language resources: the research infrastructure of Språkbanken
562,876
Multi-relational PageRank for Tree Structure Sense Ranking
['Roberto Interdonato', 'Andrea Tagarelli']
Multi-relational PageRank for Tree Structure Sense Ranking
629,812
In this paper, we propose a new agent-based flexible videoconference system (AVCS) obtained by modifying the videoconference manager (VCM) agent of a conventional flexible videoconference system (FVCS). The proposed AVCS copes more flexibly with changes in working conditions during videoconferencing than the conventional FVCS, because an automatic parameter tuning algorithm is embedded in the VCM to dynamically adapt QoS (Quality of Service) parameters so that the current working condition meets the desired working condition of the user, which can change over time during videoconferencing. In the experimental section, we design a new structure of the VCM with the automatic parameter tuning module, embed it into the prototype of the FVCS, and implement the new AVCS. We also show experimentally that the proposed AVCS outperforms the existing FVCS.
['Sung-Doke Lee', 'Sanggil Kang', 'Dongsoo Han']
Agent-based flexible videoconference system with automatic QoS parameter tuning
531,230
Market Strategy Choices Made by Company Using Reinforcement Learning
['Dawid Chodura', 'Paweł Dominik', 'Jarosław Koźlak']
Market Strategy Choices Made by Company Using Reinforcement Learning
650,519
This paper reports our experiments on the concept detection task of TRECVID 2007. In these experiments, we have addressed two approaches: selecting and fusing features, and a kernel-based learning method. As for the former, we investigate the following issues: (i) which features are more appropriate for the concept detection task? (ii) can the fusion of features help to improve the final detection performance? and (iii) how does the correlation between training and testing sets affect the final performance? As for the latter, a combination of the global alignment (GA) kernel and the penalized logistic regression machine (PLRM) is studied. The experimental results on TRECVID 2007 have shown that the former approach, which fuses simple features such as color moments, local binary patterns and edge orientation histograms, can achieve high performance. Furthermore, the correlation between the training and testing sets also plays an important role in the generalization of concept detectors.
['Duy-Dinh Le', 'Shin’ichi Satoh', 'Tomoko Matsui']
NII-ISM, Japan at TRECVID 2007: High Level Feature Extraction
250,857
In federated and pervasive networks, trust management has become a cornerstone for information security and privacy. Although people have recognized the importance of privacy and security for their personal information, they remain uncertain when they have to define and enforce their own access control rules or have to handle indirect information. Indirect information and subjective judgment are the major sources of uncertainty in federated trust management. This paper introduces fuzzy logic into the definition and evaluation of trust, and then provides a formal representation of fuzzy rules. It also offers a set of derivation rules for analyzing and reasoning among fuzzy rules in order to enforce these rules with a certain level of uncertainty. Application of this model to a healthcare environment with pervasive computing devices across trust domains provides a new method to handle uncertainty in trust management for federated and pervasive networks.
['Zhengping Wu', 'Alfred C. Weaver']
Application of Fuzzy Logic in Federated Trust Management for Pervasive Computing
505,057
Semantic matchmaking - i.e., the task of finding matching (Web) services based on semantic information - has been a prominent field of research lately, and a wide range of supporting tools both for research and practice have been published. However, no suitable solution for the visualization of matchmaking results exists so far. In this paper, we present the Matchmaking Visualizer, an application for the visual representation and analysis of semantic matchmaking results. It allows for comparing matchmaking approaches for semantic Web services in a fine-grained manner and thus complements existing evaluation suites that are based on rather coarse-grained information retrieval metrics.
['Ulrich Lampe', 'Melanie Siebenhaar', 'Stefan Schulte', 'Ralf Steinmetz']
A graphical evaluation tool for semantic web service matchmaking
616,605
Recently, the remarkable capacity potential of multiple-input multiple-output (MIMO) wireless communication systems was unveiled. The predicted enormous capacity gain of MIMO is nonetheless significantly limited by cochannel interference (CCI) in realistic cellular environments. The previously proposed advanced receiver technique improves the system performance at the cost of increased receiver complexity, and the achieved system capacity is still significantly away from the interference-free capacity upper bound, especially in environments with strong CCI. In this paper, base station cooperative processing is explored to address the CCI mitigation problem in downlink multicell multiuser MIMO networks, and is shown to dramatically increase the capacity with strong CCI. Both information-theoretic dirty paper coding approach and several more practical joint transmission schemes are studied with pooled and practical per-base power constraints, respectively. Besides the CCI mitigation potential, other advantages of cooperative processing including the power gain, channel rank/conditioning advantage, and macrodiversity protection are also addressed. The potential of our proposed joint transmission schemes is verified with both heuristic and realistic cellular MIMO settings.
['Hongyuan Zhang', 'Huaiyu Dai']
Cochannel interference mitigation and cooperative processing in downlink multicell multiuser MIMO networks
219,175
A simple set of representatives for the congruence classes of (2×2) matrices over an arbitrary field is determined. The result is then specialized to various particular fields, including F_q.
['Graham D. Williams']
Congruence of (2 × 2) matrices
50,241
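The record above concerns congruence classes of 2×2 matrices over an arbitrary field. As a small, hedged illustration of what a congruence class is computationally (a brute-force orbit enumeration over the toy field F_2 only; our own sketch, not the paper's method or its general classification), one can partition the sixteen 2×2 matrices over F_2 into orbits under the congruence action A ↦ PᵀAP:

```python
from itertools import product

def mul(A, B):
    # 2x2 matrix product with entries reduced mod 2 (i.e. over F_2)
    return tuple(tuple(sum(A[i][k] * B[k][j] for k in range(2)) % 2
                       for j in range(2)) for i in range(2))

def transpose(A):
    return tuple(tuple(A[j][i] for j in range(2)) for i in range(2))

# all 2x2 matrices over F_2, and the invertible ones (the group GL(2, F_2))
mats = [((a, b), (c, d)) for a, b, c, d in product(range(2), repeat=4)]
gl2 = [M for M in mats if (M[0][0] * M[1][1] - M[0][1] * M[1][0]) % 2 == 1]

def congruence_orbits():
    # GL(2, F_2) acts by A -> P^T A P; the orbits are the congruence classes
    seen, orbits = set(), []
    for A in mats:
        if A in seen:
            continue
        orbit = {mul(mul(transpose(P), A), P) for P in gl2}
        seen |= orbit
        orbits.append(orbit)
    return orbits
```

Since congruence is a group action, the orbits partition all 16 matrices, and |GL(2, F_2)| = (2²−1)(2²−2) = 6.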
Highly-dynamic wireless environments present unique challenges to end-to-end communication networks, caused by the time-varying connectivity of high-velocity nodes combined with the unreliability of the wireless communication channel. Such conditions are found in a variety of networks, including those used for tactical communications and aeronautical telemetry. Addressing these challenges requires the design of new protocols and mechanisms specific to this environment. We present a new domain-specific architecture and protocol suite, including cross-layer optimizations between the physical, MAC, network, and transport layers. This provides selectable reliability for multiple applications within highly mobile tactical airborne networks. Our contributions for this environment include the transmission control protocol (TCP)-friendly transport protocol, AeroTP; the IP-compatible network layer, AeroNP; and the geolocation aware routing protocol AeroRP. Through simulations we show significant performance improvement over the traditional TCP/IP/MANET protocol stack.
['Justin P. Rohrer', 'Abdul Jabbar', 'Egemen K. Çetinkaya', 'Erik Perrins', 'James P. G. Sterbenz']
Highly-Dynamic Cross-Layered Aeronautical Network Architecture
522,310
Challenged by rising populations, modern cities are affected by the increasing effects of transportation such as congestion and greenhouse gas emissions. Collaborative route planning addresses this challenge by consolidating goods and optimizing transport routes to the customers within the urban area. This paper presents a modeling approach for collaborative route planning in supply chain simulation. A practical example for a collaborative planning implementation is presented using a discrete event simulation model. By comparing the delivery concept with and without collaboration on an exemplary supply chain, the potentials of collaborative planning in relation to the reduction of total transport distance are evaluated.
['Markus Rabe', 'Astrid Klueter', 'Uwe Clausen', 'Moritz Poeting']
An approach for modeling collaborative route planning in supply chain simulation
976,142
We present a new optimization procedure which is particularly suited for the solution of second-order kernel methods such as Kernel-PCA. Common to these methods is that there is a cost function to be optimized, under a positive definite quadratic constraint, which bounds the solution. For example, in Kernel-PCA the constraint provides unit-length and orthogonal (in feature space) principal components. The cost function is often quadratic, which allows the problem to be solved as a generalized eigenvalue problem. However, in contrast to Support Vector Machines, which employ box constraints, quadratic constraints usually do not lead to sparse solutions. Here we give up the structure of the generalized eigenvalue problem in favor of a non-quadratic regularization term added to the cost function, which enforces sparse solutions. To optimize this more `complicated' cost function, we introduce a modified conjugate gradient descent method. Starting from an admissible point, all iterations are carried out inside the subspace of admissible solutions, which is defined by the hyper-ellipsoidal constraint surface.
['Roland Vollgraf', 'Klaus Obermayer']
Sparse Optimization for Second Order Kernel Methods
151,333
We consider an M^X/G/1 queueing system with N-policy. The server is turned off as soon as the system empties. When the queue length reaches or exceeds a predetermined value N (threshold), the server is turned on and begins to serve the customers. We place our emphasis on understanding the operational characteristics of the queueing system. One of our findings is that the system size is the sum of two independent random variables: one has the PGF of the stationary system size of the M^X/G/1 queueing system without N-policy, and the other has the probability generating function ∑_{j=0}^{N-1} π_j z^j / ∑_{j=0}^{N-1} π_j, in which π_j is the probability that the system state stays at j before reaching or exceeding N during an idle period. Using this interpretation of the system size distribution, we determine the optimal threshold N under a linear cost structure.
['Ho Woo Lee', 'Soon Seok Lee', 'Kyung-Chul Chae']
Operating characteristics of M^X/G/1 queue with N-policy
730,591
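The N-policy described above (server off when the system empties, on again once N customers accumulate) can be illustrated with a tiny deterministic event-driven sketch. This is a hedged toy model under simplifying assumptions we chose ourselves (single arrivals rather than batches, a fixed service time, and hypothetical names like `simulate_n_policy`), not the paper's analytical M^X/G/1 treatment:

```python
import heapq

def simulate_n_policy(arrivals, service, N):
    """Deterministic toy single-server queue under the N-policy.

    The server switches off when the system empties and switches back
    on once N customers have accumulated. Returns the server turn-on
    times and the number of customers served.
    """
    queue, on = 0, False
    turn_ons, done = [], 0
    events = [(a, 'arr') for a in arrivals]
    heapq.heapify(events)
    while events:
        t, kind = heapq.heappop(events)
        if kind == 'arr':
            queue += 1
            if not on and queue >= N:  # threshold reached: start a busy period
                on = True
                turn_ons.append(t)
                heapq.heappush(events, (t + service, 'dep'))
        else:  # departure: serve the next customer, or switch off if empty
            queue -= 1
            done += 1
            if queue > 0:
                heapq.heappush(events, (t + service, 'dep'))
            else:
                on = False
    return turn_ons, done
```

With arrivals at times 1, 2, 3, 10, 11, 12, threshold N = 3 and service time 0.5, the server turns on at times 3 and 12, matching the idle-period accumulation the abstract analyzes.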
Next-generation cellular networks are expected to be assisted by femtocaches (FCs), which collectively store the most popular files for the clients. Given any arbitrary non-fragmented placement of such files, a strict no-latency constraint, and clients' prior knowledge, new file download requests could be efficiently handled by both the FCs and the macrocell base station (MBS) using opportunistic network coding (ONC). In this paper, we aim to find the best allocation of coded file downloads to the FCs so as to minimize the MBS involvement in this download process. We first formulate this optimization problem over an ONC graph, and show that it is NP-hard. We then propose a greedy approach that maximizes the number of files downloaded by the FCs, with the goal to reduce the download share of the MBS. This allocation is performed using a dual conflict ONC graph to avoid conflicts among the FC downloads. Simulations show that our proposed scheme almost achieves the optimal performance and significantly saves on the MBS bandwidth.
['Yousef N. Shnaiwer', 'Sameh Sorour', 'Neda Aboutorab', 'Parastoo Sadeghi', 'Tareq Y. Al-Naffouri']
Network-Coded Content Delivery in Femtocaching-Assisted Cellular Networks
658,236
Being able to predict events and occurrences which may arise from a current situation is a desirable capability of an intelligent agent. In this paper, we show that a high-level scene interpretation system, implemented as part of a comprehensive robotic system developed in the XXX project, can also be used for prediction. This way, the robot can foresee possible developments of the environment and the effect they may have on its activities. As a guiding example, we consider a robot acting as a waiter in a restaurant and the task of predicting possible occurrences and courses of action, e.g. when serving a coffee to a guest. Our approach requires that the robot possesses conceptual knowledge about occurrences in the restaurant and its own activities, represented in the standardized ontology language OWL and augmented by constraints using SWRL. Conceptual knowledge may be acquired by conceptualizing experiences collected in the robot's memory. Predictions are generated by a model-construction process which seeks to explain evidence as parts of such conceptual knowledge, this way generating possible future developments. The experimental results show, among other things, the prediction of possible obstacle situations and their effect on the robot actions and estimated execution times.
['Jos Lehmann', 'Bernd Neumann', 'Wilfried Bohlken', 'Lothar Hotz']
A Robot Waiter that Predicts Events by High-level Scene Interpretation
760,922
Building on probabilistic models for interval-valued variables, parametric classification rules, based on Normal or Skew-Normal distributions, are derived for interval data. The performance of such rules is then compared with distance-based methods previously investigated. The results show that Gaussian parametric approaches outperform Skew-Normal parametric and distance-based ones in most conditions analyzed. In particular, with heteroscedastic data a quadratic Gaussian rule always performs best. Moreover, restricted cases of the variance-covariance matrix lead to parsimonious rules which for small training samples in heteroscedastic problems can outperform unrestricted quadratic rules, even in some cases where the model assumed by these rules is not true. These restrictions take into account the particular nature of interval data, where observations are defined by both MidPoints and Ranges, which may or may not be correlated. Under homoscedastic conditions linear Gaussian rules are often the best rules, but distance-based methods may perform better in very specific conditions.
['A. Pedro Duarte Silva', 'Paula Brito']
Discriminant Analysis of Interval Data: An Assessment of Parametric and Distance-Based Approaches
557,271
The k-means algorithm is a well-known method for partitioning n points that lie in the d-dimensional space into k clusters. Its main features are simplicity and speed in practice. Theoretically, however, the best known upper bound on its running time (i.e. O(n^{kd})) is, in general, exponential in the number of points (when kd = Ω(n log n)). Recently, Arthur and Vassilvitskii [2] showed a super-polynomial worst-case analysis, improving the best known lower bound from Ω(n) to 2^{Ω(√n)} with a construction in d = Ω(√n) dimensions. In [2] they also conjectured the existence of super-polynomial lower bounds for any d ≥ 2. Our contribution is twofold: we prove this conjecture and we improve the lower bound, by presenting a simple construction in the plane that leads to the exponential lower bound 2^{Ω(n)}.
['Andrea Vattani']
k-means requires exponentially many iterations even in the plane
150,477
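The record above bounds the worst-case iteration count of the k-means (Lloyd's) algorithm. For readers unfamiliar with the iteration being counted, here is a hedged minimal sketch of Lloyd's two alternating steps (our own simplified version with deterministic first-k initialization and pure-Python distances; not the paper's lower-bound constructions):

```python
def kmeans(points, k, max_iters=100):
    """Lloyd's algorithm: alternate nearest-center assignment and
    centroid update until no center moves (or max_iters is hit)."""
    centers = list(points[:k])  # deterministic first-k initialization (toy choice)
    clusters = [[] for _ in range(k)]
    for _ in range(max_iters):
        # assignment step: each point joins the cluster of its nearest center
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda i: sum((a - b) ** 2
                                            for a, b in zip(p, centers[i])))
            clusters[nearest].append(p)
        # update step: move each center to the mean of its cluster
        new_centers = [
            tuple(sum(xs) / len(xs) for xs in zip(*cl)) if cl else centers[i]
            for i, cl in enumerate(clusters)
        ]
        if new_centers == centers:  # fixed point: the run terminates
            break
        centers = new_centers
    return centers, clusters
```

Each pass of the loop is one "iteration" in the sense of the abstract; the paper's result is that the number of such passes can be exponential in n even for d = 2.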
This paper describes the general service recommendation process matched to the telecommunication service delivery characteristics of the Ubiquitous Consumer Wireless World (UCWW). The goal is to provide consumers with the 'best' service instances that match their dynamic, contextualized and personalized requirements and expectations, thereby aligning their usage of mobile services to the always best connected and best served (ABC&S) paradigm. A four-tiered architectural configuration of the UCWW service recommendation framework is proposed along with a suitable service recommendation model. Specific and generic smart-city application examples are outlined. Other relevant social impacts of the proposed approach are highlighted in the conclusion of the paper.
['Haiyang Zhang', 'Ivan Ganchev', 'Nikola S. Nikolov', "Mairtin O'Droma"]
A service recommendation model for the Ubiquitous Consumer Wireless World
947,955
An attempt is made to identify the limitations of current information system development methods, to introduce the requirements for future methods, and to propose a method whose long-term goal is to meet these requirements. The goal is to avoid the invention of a completely new method to meet the requirements of object orientation, prototyping, and knowledge modeling. The ability of the results of specific research efforts in the area of conventional information system methods to be extended in order to meet these requirements is analyzed. The basic step-by-step idea of the THM-M (temporal-hierarchic model M) has been considered for the incremental approach, where most of the steps have been redefined and extended.
['Ulrich Schiel', 'Ivan Mistrik']
Using object-oriented analysis and design for integrated systems
144,451
XQuery is the de facto standard XML query language, and it is important to have efficient query evaluation techniques available for it. A core operation in the evaluation of XQuery is the finding of matches for specified tree patterns, and there has been much work towards algorithms for finding such matches efficiently. Multiple XPath expressions can be evaluated by computing one or more tree pattern matches. However, relatively little has been done on efficient evaluation of XQuery queries as a whole. In this paper, we argue that there is much more to XQuery evaluation than a tree pattern match. We propose a structure called generalized tree pattern (GTP) for concise representation of a whole XQuery expression. Evaluating the query reduces to finding matches for its GTP. Using this idea we develop efficient evaluation plans for XQuery expressions, possibly involving join, quantifiers, grouping, aggregation, and nesting. XML data often conforms to a schema. We show that using relevant constraints from the schema, one can optimize queries significantly, and give algorithms for automatically inferring GTP simplifications given a schema. Finally, we show, through a detailed set of experiments using the TIMBER XML database system, that plans via GTPs (with or without schema knowledge) significantly outperform plans based on navigation and straightforward plans obtained directly from the query.
['Zhimin Chen', 'H. V. Jagadish', 'Laks V. S. Lakshmanan', 'Stelios Paparizos']
From tree patterns to generalized tree patterns: on efficient evaluation of XQuery
82,718
We investigate the probabilistic convergence behavior of minimum mean square error (MMSE) turbo equalization in space-time (ST) block-coded multiple-input multiple-output (MIMO) systems with finite block lengths. For this purpose, the extrinsic information transfer characteristics (EXIT)-band chart technique, which was originally proposed for analyzing turbo decoding of parallel concatenated codes is applied. The impact on the convergence behavior and performance due to the use of ST block coding is analyzed through the EXIT-band characteristics. The convergence behaviors of the exact and approximate implementations of MMSE equalization are compared. We show that in the hybrid equalization schemes, the EXIT-band chart analysis can be used to determine the number of iterations that should be performed using the exact implementation. Our results clearly demonstrate the usefulness of the EXIT-band chart technique as a simple tool to analyze the convergence behavior of turbo equalization in systems with finite block lengths.
['K. C. B. Wavegedara', 'Vijay K. Bhargava']
Convergence analysis of turbo equalization in ST block-coded MIMO systems
357,585
The replacement of pointers with graph grammar productions is discussed. Such a replacement provides a substantial improvement in the programming model used, makes better use of current high-resolution screen technology than a strictly text-based language, and provides improved support for parallel processing due to characteristics of the graph grammar formulation used. The background of this project, and the relationship to visual languages, is described. The use of graph grammars in programming and the graph grammar programming languages are described. The editing environment being developed for programming in graph grammars is presented. Compiler development for the system is described.
['Joseph J. Pfeiffer']
Using graph grammars for data structure manipulation
501,179
For low-rank Frobenius-norm approximations of matrices with non-negative entries, it is shown that the Lagrange dual is computable by semi-definite programming. Under certain assumptions the duality gap is zero. Even when the duality gap is non-zero, several new insights are provided.
['Christian Grussler', 'Anders Rantzer']
On optimal low-rank approximation of non-negative matrices
656,192
In this paper we introduce the concept of Identity Tokens in multi-domain service environments and show how it is used to bridge between authentication/authorization and users' privacy. This token provides, together with further authentication and authorization techniques, a high level of privacy without anonymity.
['David J. Lutz', 'R. del Campo']
Bridging the Gap between Privacy and Security in Multi-Domain Federations with Identity Tokens
166,249
This paper examines the efficiency of spatial and frequency dimensions in serving multiple users in the downlink of a small cell wireless network with randomly deployed access points. For this purpose, the stochastic geometry framework is incorporated, taking into account the user distribution within each cell and the effect of sharing the available system resources to multiple users. An analysis of performance in terms of signal-to-interference-ratio and achieved user rate is provided that holds under the class of non-cooperative multiple access schemes. In order to obtain concrete results, two simple instances of multiple access schemes are considered. It is shown that performance depends critically on both the availability of frequency and/or spatial dimensions as well as the way they are employed. In particular, increasing the number of available frequency dimensions alone is beneficial for users experiencing large interference, whereas increasing spatial dimensions without employing frequency dimensions degrades performance. However, best performance is achieved when both dimensions are combined in serving the users.
['Stelios Stefanatos', 'Angeliki Alexiou']
Exploiting frequency and spatial dimensions in small cell wireless networks
18,139
This paper considers an underlay access strategy for coexisting wireless networks where the secondary system utilizes the primary spectrum to serve its users. We focus on the practical cases where there is uncertainty in the estimation of channel state information (CSI). Here the throughput performance of each system is limited by the interference imposed by the other, resulting in conflicting objectives. We first analyze the fundamental tradeoff between the tolerance interference level at the primary system and the total achievable throughput of the secondary users. We then introduce a beamforming design problem as a multiobjective optimization to minimize the interference imposed on each of the primary users while maximizing the intended signal received at every secondary user, taking into account the CSI uncertainty. We then map the proposed optimization problem to a robust counterpart under the maximum CSI estimation error. The robust counterpart is then transformed into a standard convex semi-definite programming. Simulation results confirm the effectiveness of the proposed scheme against various levels of CSI estimation error. We further show that in the proposed approach, the trade-off in the two systems modelled by Pareto frontier can be engineered by adjusting system parameters. For instance, the simulations show that at the primary system interference thresholds of -10 dBm (-5 dBm) by increasing number of antennas from 4 to 12, the secondary system throughput is increased by 3.3 bits/s/channel-use (5.3 bits/s/channel-use).
['Tuan Anh Le', 'Keivan Navaie', 'Quoc-Tuan Vien', 'Huan Xuan Nguyen']
Beamforming in Coexisting Wireless Systems with Uncertain Channel State Information
877,172
This paper addresses the problem of certifying the performance of a precision flexure-based mechanism design with respect to the given constraints. Due to the stringent requirements associated with the applications of flexure-based precision mechanisms, it is necessary to be able to evaluate and certify the performance at the design stage, taking into account the possible sources of error, such as fabrication tolerances and modeling inaccuracies in flexure joints. An interval-based method is proposed to certify whether various constraints are satisfied for all points within a required workspace. This paper presents the interval-based methodology and its implementation on a planar 3RRR parallel flexure-based manipulator.
['Denny Oetomo', 'David Daney', 'Bijan Shirinzadeh', 'Jean-Pierre Merlet']
Certified workspace analysis of 3RRR planar parallel flexure mechanism
157,586
Stochastic modeling and the theory of queues, by R. W. Wolff, Prentice Hall, Englewood Cliffs, NJ, 1989, 556 pp
['Stephen G. Strickland']
Stochastic modeling and the theory of queues, by R. W. Wolff, Prentice Hall, Englewood Cliffs, NJ, 1989, 556 pp
255,465
In this paper, we investigate how to establish relationships between semantic concepts based on large-scale real-world click data from a commercial image search engine, which is a challenging topic because the click data suffers from noise such as typos, the same concept being expressed by different queries, etc. We first define five specific relationships between concepts. We then extract concept relationship features in the textual and visual domains to train the concept relationship models. The relationship of each pair of concepts is thus classified into one of the five specific relationships. We study the efficacy of the conceptual relationships by applying them to augment imperfect image tags, i.e., to improve their representative power. We further employ a sophisticated hashing approach to transform the augmented image tags into binary codes, which are subsequently used for a content-based image retrieval task. Experimental results on the NUS-WIDE dataset demonstrate the superiority of our proposed approach as compared to state-of-the-art methods.
['Richang Hong', 'Yang Yang', 'Meng Wang', 'Xian-Sheng Hua']
Learning Visual Semantic Relationships for Efficient Visual Retrieval
649,728
The current SystemC modelling language lacks a standard framework that supports the modelling of wireless communication systems. This research investigates how wireless features can be incorporated into the existing SystemC design methodology. The components to be investigated in order to achieve this target are divided into three parts: developing a system-level model of a digital wireless communication channel, creating a small library of dedicated elements at system level, and concluding with a case study on a flocking behaviour system to validate the wireless extension methodology. In previous works, all these parts were successfully modelled and implemented. In this paper, the integration of communication modelling is introduced into design modelling during the early stages of system development. We use a flocking behaviour system to show how the stability of the system and the convergence point are measured and optimised in terms of system construction, using some important concepts of graph theory.
['Ibrahim Aref', 'Nuredin Ahmed', 'F. Rodriguez-Salazar', 'K. Elgaid']
Measuring and Optimising Convergence and Stability in Terms of System Construction in SystemC
151,465
Precise Interrupt Scheduling in Abstract RTOS Models in SystemC.
['Henning Zabel', 'Wolfgang Müller']
Precise Interrupt Scheduling in Abstract RTOS Models in SystemC.
782,687
An Application of Quantum Finite Automata to Interactive Proof Systems.
['Harumichi Nishimura', 'Tomoyuki Yamakami']
An Application of Quantum Finite Automata to Interactive Proof Systems.
760,086
LOBOS (Linux Os Boots OS) is a system call that allows a running Linux kernel to boot a new kernel, without leaving 32-bit protected mode and, in particular, without using the BIOS in any way. This capability in turn allows Linux to be used as a network bootstrap program and even as a BIOS, both of which we are working on now. In this paper we discuss how LOBOS works, how we use it, and how LOBOS makes Linux usable as a BIOS, replacing the proprietary PC BIOSes we have today. LOBOS has been used by two other groups as a reference implementation for their Linux-boots-Linux system calls. One of these other implementations, bootimg, may become a part of the 2.4 kernel.
['Ronald G. Minnich']
LOBOS: (linux OS boots OS) booting a kernel in 32-bit mode
685,204
This paper presents a decentralized relay selection protocol for a dense wireless network and describes channel feedback strategies that improve its performance. The proposed selection protocol supports hybrid automatic repeat request transmission where relays forward parity information to the destination in the event of a decoding error. Channel feedback is employed for refining the relay selection process and to select an appropriate transmission mode in a proposed adaptive modulation transmission framework. An approximation of the throughput of the proposed adaptive modulation strategy is presented, and the dependence of the throughput on system parameters such as the relay contention probability and the adaptive modulation switching point is illustrated via maximization of this approximation. Simulations show that the throughput of the proposed selection strategy is comparable with that yielded by a centralized selection approach that relies on geographic information.
['Caleb K. Lo', 'Robert W. Heath', 'Sriram Vishwanath']
The Impact of Channel Feedback on Opportunistic Relay Selection for Hybrid-ARQ in Wireless Networks
508,059
Most everyday electrical and electromechanical objects emit small amounts of electromagnetic (EM) noise during regular operation. When a user makes physical contact with such an object, this EM signal propagates through the user, owing to the conductivity of the human body. By modifying a small, low-cost, software-defined radio, we can detect and classify these signals in real-time, enabling robust on-touch object detection. Unlike prior work, our approach requires no instrumentation of objects or the environment; our sensor is self-contained and can be worn unobtrusively on the body. We call our technique EM-Sense and built a proof-of-concept smartwatch implementation. Our studies show that discrimination between dozens of objects is feasible, independent of wearer, time and local environment.
['Gierad Laput', 'Chouchang Yang', 'Robert Xiao', 'Alanson P. Sample', 'Chris Harrison']
EM-Sense: Touch Recognition of Uninstrumented, Electrical and Electromechanical Objects
264,715
The fuzzy rule interpolation (FRI) and fuzzy signature methodologies were successfully adapted for expressing building condition. In this paper we extend this concept to estimating the life quality of the built environment, especially of the residential segment. The applied methodology is based on hierarchical FRI as a straightforward implementation of the fuzzy signature concept. The paper also introduces some application details of a case study on residential houses located in a historic district of Budapest, Hungary.
['Gergely I. Molnárka', 'László T. Kóczy', 'Szilveszter Kovács']
Evaluating the life quality of the built environment by FRI method
550,955
Sharing electronic medical record on the WWW using InterCare architecture and smart cards.
['Mikko Rotonen', 'Pekka Ruotsalainen', 'Atte Kaskihalme', 'Matti Aarnio']
Sharing electronic medical record on the WWW using InterCare architecture and smart cards.
824,581
This paper considers the problem of communicating correlated information from multiple source nodes over a network of noiseless channels to multiple destination nodes, where each destination node wants to recover all sources. The problem involves a joint consideration of distributed compression and network information relaying. Although the optimal rate region has been theoretically characterized, it was not clear how to design practical communication schemes with low complexity. This work provides a partial solution to this problem by proposing a low-complexity scheme for the special case with two sources whose correlation is characterized by a binary symmetric channel. Our scheme is based on a careful combination of linear syndrome-based Slepian-Wolf coding and random linear mixing (network coding). It is in general suboptimal; however, its low complexity and robustness to network dynamics make it suitable for practical implementation.
['Yunnan Wu', 'Vladimir Stankovic', 'Zixiang Xiong', 'Sun-Yuan Kung']
On Practical Design for Joint Distributed Source and Network Coding
499,229
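The syndrome-based Slepian-Wolf step described in this record can be sketched as follows; the (7,4) Hamming parity-check matrix and the brute-force minimum-weight decoder below are illustrative choices, not the paper's construction, and the random-linear-mixing (network coding) stage is omitted.

```python
import numpy as np
from itertools import product

# Parity-check matrix of the (7,4) Hamming code (illustrative choice).
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]], dtype=int)

def syndrome(x):
    """Linear syndrome of a binary vector x under H (mod 2)."""
    return H @ x % 2

def sw_decode(s, y):
    """Recover x from its syndrome s and correlated side information y.
    The BSC correlation means x and y differ by a low-weight pattern e,
    so we search for the minimum-weight e with H e = s + H y (mod 2)."""
    target = (s + syndrome(y)) % 2
    best = None
    for bits in product([0, 1], repeat=H.shape[1]):
        e = np.array(bits)
        if np.array_equal(syndrome(e), target):
            if best is None or e.sum() < best.sum():
                best = e
    return (y + best) % 2

# Encoder transmits only the 3-bit syndrome of the 7-bit source x;
# the decoder combines it with side information y differing in one bit.
x = np.array([1, 0, 1, 1, 0, 0, 1])
y = x.copy(); y[3] ^= 1
x_hat = sw_decode(syndrome(x), y)
```

Because the Hamming code corrects single errors, a one-bit discrepancy between source and side information is always resolved exactly, which is the compression mechanism (3 transmitted bits instead of 7) that the network-coding stage then mixes.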
A perceptually motivated objective measure for evaluating speech quality is presented. The measure, computed from the original and coded versions of an utterance, exhibits a statistically monotonic relationship with the mean opinion score, a widely used criterion for speech coder assessment. For each 10-ms segment of an utterance, a weighted spectral vector is computed via 15 critical-band filters for telephone-bandwidth speech. The overall distortion, called Bark spectral distortion (BSD), is the average squared Euclidean distance between the spectral vectors of the original and coded utterances. The BSD takes into account auditory frequency warping, critical-band integration, amplitude sensitivity variations with frequency, and subjective loudness.
['Shihua Wang', 'A. Sekey', 'Allen Gersho']
An objective measure for predicting subjective quality of speech coders
454,167
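The BSD computation described in this record can be sketched as follows; the filterbank here is a simplified stand-in (uniform bands rather than true Bark-scale critical-band filters, and no loudness weighting), so treat it as an illustration of the distance computation only.

```python
import numpy as np

def band_spectra(signal, sr=8000, frame_ms=10, n_bands=15):
    """Split a signal into 10-ms frames and return one spectral vector
    per frame, aggregated into n_bands coarse bands.
    NOTE: uniform bands stand in for true Bark critical-band filters."""
    frame_len = int(sr * frame_ms / 1000)
    n_frames = len(signal) // frame_len
    vectors = []
    for i in range(n_frames):
        frame = signal[i * frame_len:(i + 1) * frame_len]
        mag = np.abs(np.fft.rfft(frame))
        # aggregate FFT bins into coarse bands (illustrative only)
        bands = np.array_split(mag, n_bands)
        vectors.append(np.array([b.sum() for b in bands]))
    return np.array(vectors)

def bark_spectral_distortion(original, coded, sr=8000):
    """Average squared Euclidean distance between per-frame spectral
    vectors of the original and coded utterances."""
    a = band_spectra(original, sr)
    b = band_spectra(coded, sr)
    n = min(len(a), len(b))
    return np.mean(np.sum((a[:n] - b[:n]) ** 2, axis=1))
```

An unchanged utterance yields zero distortion, and any coding distortion strictly increases the measure, which is what makes it usable as an objective quality score.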
This paper presents an approach to optical blur estimation in images based on measuring the spread of edges. Two measures based on this approach are proposed. The first defines a model of an amplified Gaussian with background for the edge profile. The second is an approximation that does not require extracting profiles at edges. Neither measure depends on the content or illumination of the image, making them suitable for dynamic videos. The behaviour of the proposed measures is finally presented in the context of dynamic videos from a surveillance camera embedded in a transportation vehicle.
['Sebastien Harasse', 'Laurent Bonnaud', 'Michel Desvignes']
Content and Illumination Invariant Blur Measures for Realtime Video Processing
418,192
Highlights: innovative contribution to aggregated search; innovative contribution to the definition of interfaces for visualizing search results; graph-based visualization of multimedia search results; multimodal access to media objects; user-based evaluation. Nowadays, together with the increasing spread of content available online, users' information needs have become more complex. To fulfill them, users strongly rely on Web search engines, but traditional ways of presenting search results are often unsatisfactory. In fact, Web pages carry information in multiple media formats, such as text, audio, image and video objects. Vertical search engines and medium-specific search services do not provide users with an integrated view of search results. Furthermore, multiple media objects are in most cases highly semantically interlinked, but the connections between them are not sufficiently exploited to support further exploration of the retrieved objects. To address these issues, in this paper we propose a graph-based approach aimed at providing users with the possibility to dynamically visualize and explore a search result space built over a repository of multimedia documents containing interconnected media objects. To do this, we represent the search result space via a graph-based data model, where both the retrieved multimedia documents and connected relevant media objects are considered. Media objects are connected via different kinds of similarity relationships, which depend on the low-level features and metadata used to access them. The approach and the associated visualization and exploration interface have been implemented and tested on a publicly available dataset, and they have been evaluated by means of a usability test.
['Umer Rashid', 'Marco Viviani', 'Gabriella Pasi']
A graph-based approach for visualizing and exploring a multimedia search result space
840,010
Along with the increasingly intense demand for super-high-resolution images, synthetic aperture radar (SAR) faces severe technical challenges, such as the sampling, storage and transmission of massive data, as well as high hardware complexity. Compressive sensing (CS) theory, which exploits signal sparsity, can achieve accurate image reconstruction from far fewer measurements than the Nyquist-Shannon sampling theorem typically requires. In this paper, the detailed CS SAR image reconstruction model and process are presented in comparison with the back-projection algorithm (BPA). Meanwhile, a pseudorandom 2-D subsampling measurement matrix is redesigned by considering the antenna-pattern weighting and the linear frequency modulation (LFM) signal. The validity and performance of the CS technique using this matrix are demonstrated by sparse-scene simulation results with multi-point targets.
['Tengfei Li', 'Qingjun Zhang']
Compressive sensing SAR image reconstruction based on a pseudorandom 2-D subsampling measurement matrix
931,118
With the steady price reduction of 3D visualization devices such as head-mounted displays (HMDs) and, more recently, 3D TVs, we can now foresee the dissemination of applications that were previously possible only within very expensive virtual reality environments. However, interaction within virtual 3D environments requires more natural modes than those provided by the ubiquitous mouse and keyboard. In this paper we introduce a novel low-cost 3D hand-gesture-based interaction system. We have developed a real-time stereo computer vision system and a hybrid interface that combines natural and symbolic gestures for navigation and manipulation of 3D objects in virtual environments. Results from a pilot study reveal that the hybrid interface offers great power and flexibility without a significant increase in the complexity of the user interaction.
['Silvia Ghirotti', 'Carlos Hitoshi Morimoto']
A 3D Hand-Gesture-Based Interaction System for Virtual Environments
609,889
This paper presents a testing and simulated fault injection framework for time-triggered safety-critical embedded systems. Our approach facilitates the validation of fault-tolerance mechanisms by performing non-intrusive simulated fault injection (SFI) on models of the system at different stages of development, from the platform-independent model (PIM) to the platform-specific model (PSM). SFI exercises the intended fault-tolerance mechanisms by injecting faults into a simulated model of the system. The main benefit of this work is that it enables early detection of design flaws in fault-tolerant systems, which reduces the possibility of late discovery of design pitfalls that might require an expensive redesign of the system. We examine the feasibility of the proposed approach in a case study, where SFI is used to assess the fault-tolerance mechanisms designed into a simplified railway signalling system.
['Iban Ayestaran', 'Carlos Fernando Nicolas', 'Jon Perez', 'Asier Larrucea', 'Peter P. Puschner']
A Simulated Fault Injection Framework for Time-Triggered Safety-Critical Embedded Systems
380,943
Editorial: A Message from the New Editor-in-Chief
['Jeff Kramer']
Editorial: A Message from the New Editor-in-Chief
389,116
Robot-assisted surgical procedures require generating paths for surgical instruments during a planning step and then automatically following the computed trajectories as closely as possible. Most existing procedures use external localization devices to control the robot. However, these devices are not adequate for minimally invasive interventions such as laparoscopic surgery, because this kind of surgery involves large lever effects that make high accuracy unreachable. In this paper, we propose a path-following method based on the use of the endoscopic camera and instruments with markers. It mainly relies on an image-based visual servoing scheme and on the estimation of the 3D position of the incision point in the abdominal wall. We show that this method can lead to precise tracking despite errors in the position of the incision point. The proposed method is finally demonstrated in vitro in an automatic suturing intervention.
['Florent Nageotte', 'Philippe Zanne', 'Christophe Doignon', 'Michel de Mathelin']
Visual Servoing-Based Endoscopic Path Following for Robot-Assisted Laparoscopic Surgery
509,047
Cost-sensitive online classification was recently proposed to directly optimize, in an online fashion, two well-known cost-sensitive measures: (i) maximization of the weighted sum of sensitivity and specificity, and (ii) minimization of the weighted misclassification cost. However, existing learning algorithms utilize only first-order information of the data stream. This is insufficient, as recent studies have shown that incorporating second-order information can yield significant improvements in the prediction model. Hence, we propose a novel cost-sensitive online classification algorithm with adaptive regularization. We theoretically analyze the proposed algorithm and empirically validate its effectiveness with extensive experiments. We also demonstrate the application of the proposed technique to several online anomaly detection tasks, showing that it could be an effective tool for tackling cost-sensitive online classification in various application domains.
['Peilin Zhao', 'Furen Zhuang', 'Min Wu', 'Xiao-Li Li', 'Steven C. H. Hoi']
Cost-Sensitive Online Classification with Adaptive Regularization and Its Applications
607,271
Modeling and Implementation Approach to Evaluate the Intrusion Detection System
['Mohammed Saber', 'Sara Chadli', 'Mohamed Emharraf', 'Ilhame El Farissi']
Modeling and Implementation Approach to Evaluate the Intrusion Detection System
758,013
Intent-based networking is a major component that will transform the manner in which SDN/NFV-enabled future network infrastructures are operated. In particular, intent-based networking is expected to play a major role in the multi-technological and software-defined 5G systems development roadmap. In this paper, we present the design and prototype implementation of an intent-based mobile backhauling interface for 5G networks. Finally, we report on the empirical evaluation of the proposed intent-based interface over a small enterprise WLAN. We also release the entire software stack under a permissive license for academic use.
['Tejas Subramanya', 'Roberto Riggio', 'Tinku Rasheed']
Intent-based mobile backhauling for 5G networks
983,884
Using Partial Tablebases in Breakthrough
['Andrew Isaac', 'Richard J. Lorentz']
Using Partial Tablebases in Breakthrough
954,832
An important property of software repositories is their level of cross-project redundancy. For instance, much has been done to assess how much code cloning happens across software corpora. In this paper we study a much less targeted type of replication: interface redundancy (IR). IR refers to the level of repetition of whole method interfaces (return type, method name, and parameter types) across a code corpus. This type of redundancy is important because if two non-trivial methods share the same interface, it is very likely that they implement analogous functions, even though their code, structure, or vocabulary might differ. A certain level of IR is a requirement for approaches that rely on the recurrence of interfaces to fulfill a given task (e.g., interface-driven code search, IDCS). In this paper we report on an experiment to measure IR in a large-scale Java repository. Our target corpus contains more than 380,000 methods from 99 Java projects extracted randomly from an open-source repository. Results are promising, as they show that the chance of an interface from a non-trivial method repeating itself across a large repository is around 25% (i.e., approximately 1/4 of such interfaces are redundant). Also, more than 80% of the target projects contained IR (with the average percentage of redundant interfaces for these projects being above 30%). As additional analyses, we investigated the distribution of the different types of redundant interfaces (e.g., intra- vs. inter-project), characterized the redundant interfaces and showed that such knowledge can help improve IDCS, and provided evidence that only a very small part of IR refers to method cloning (around 0.002%).
['Adriano Carvalho de Paula', 'Eduardo Martins Guerra', 'Cristina Videira Lopes', 'Hitesh Sajnani', 'Otávio Augusto Lazzarini Lemos']
An Exploratory Study of Interface Redundancy in Code Repositories
958,066
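The IR measurement in this record reduces to counting repeated (return type, name, parameter types) triples across a corpus; a minimal sketch over a small hypothetical list of method signatures (the example methods are invented for illustration, not taken from the paper's corpus):

```python
from collections import Counter

def interface_redundancy(signatures):
    """Fraction of method interfaces that occur more than once.
    Each signature is a (return_type, name, param_types) triple."""
    counts = Counter(signatures)
    # every occurrence of a triple seen 2+ times counts as redundant
    redundant = sum(c for c in counts.values() if c > 1)
    return redundant / len(signatures) if signatures else 0.0

# hypothetical corpus of method interfaces
methods = [
    ("int", "compare", ("String", "String")),
    ("int", "compare", ("String", "String")),   # redundant interface
    ("void", "close", ()),
    ("String", "toString", ()),
]
```

On this toy corpus the measure is 0.5, since half of the signatures belong to a repeated triple; the paper reports roughly 25% over its 380,000-method corpus.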
Using structural geometry, Whiteley (1991) showed that a line drawing is a correct projection of a spherical polyhedron if and only if it has a cross-section compatible with it. We extend the class of drawings to which this test applies, including those of polyhedral disks. Our proof is constructive, showing how to derive all spatial interpretations; it relies on elementary synthetic geometric arguments, and, as a by-product, it yields a simpler and shorter proof of Whiteley's result. Moreover, important properties of line drawings are visually derived as corollaries: realizability is independent of the adopted projection, it is an invariant projective property, and for trihedral drawings it can be checked with a pencil and an unmarked ruler alone.
['Lluís Ros', 'Federico Thomas']
Shape-from-image via cross-sections
432,227
Clinical Virtual Reality.
['Albert “Skip” Rizzo', 'Belinda Lange', 'Sebastian Koenig']
Clinical Virtual Reality.
772,533
Online/Offline Identity-Based Signcryption Revisited.
['Joseph K. Liu', 'Joonsang Baek', 'Jianying Zhou']
Online/Offline Identity-Based Signcryption Revisited.
798,308
The effect of sampling and quantization on frequency estimation for a single sinusoid is investigated. Asymptotic Cramér-Rao bounds (CRBs) for 1-bit quantization and for non-ideal filters are derived, which are simpler to calculate than the exact CRB while remaining relatively accurate. It is further investigated how many bits should be used in quantization to avoid the problems of 1-bit quantization; it turns out that 3-4 bits are enough. Finally, oversampled 1-bit quantization is investigated: it is determined how much the signal should be oversampled, and in addition Sigma-Delta (ΣΔ) modulators are investigated.
['Peter Händel', 'Anders Host-Madsen']
On frequency estimation from oversampled quantized observations
310,505
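To illustrate why frequency remains estimable after 1-bit quantization, as studied in this record, here is a small sketch: quantize a sinusoid to its sign and locate the periodogram peak. This is a plain FFT peak search, not the paper's CRB analysis, and the sample rate, frequency, and noise level are arbitrary illustrative values.

```python
import numpy as np

def estimate_freq_1bit(x, fs):
    """Estimate the frequency of a sinusoid from its 1-bit (sign)
    quantization via the periodogram peak of the sign sequence."""
    q = np.sign(x)                      # 1-bit quantizer
    spec = np.abs(np.fft.rfft(q))
    spec[0] = 0.0                       # ignore the DC bin
    k = np.argmax(spec)
    return k * fs / len(q)

# noisy sinusoid at 123 Hz, sampled at 1 kHz (illustrative parameters)
fs, f0, n = 1000.0, 123.0, 4096
t = np.arange(n) / fs
x = np.sin(2 * np.pi * f0 * t) + 0.1 * np.random.default_rng(1).standard_normal(n)
```

Hard limiting turns the sinusoid into a square-wave-like sequence whose fundamental still dominates the spectrum (harmonics decay as 1/k), so the peak bin stays at the true frequency up to the FFT resolution fs/n.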
In this paper, a multilinear approach based on image texture for face recognition is presented. First, we extract the texture features of the facial images using the local binary pattern (LBP) algorithm. Then, we apply the higher-order orthogonal iteration (HOOI) algorithm, from the algebra of higher-order tensors, to obtain a compact and effective representation of the facial images based on the texture features. Our representation yields improved face recognition rates relative to standard eigenfaces and tensorfaces, especially when the facial images exhibit a variety of viewpoints and illuminations. To evaluate the validity of our approach, a series of experiments is performed on the CMU PIE facial database.
['Huchuan Lu', 'Hao Chen', 'Yen-Wei Chen']
Multilinear analysis based on image texture for face recognition
41,547
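The LBP texture step mentioned in this record can be sketched as follows: each interior pixel is encoded by thresholding its 3x3 neighbourhood against the centre (this is the basic 8-neighbour LBP; the HOOI tensor stage is not shown).

```python
import numpy as np

def lbp(image):
    """Basic 3x3 local binary pattern: threshold the 8 neighbours of
    each interior pixel against the centre and pack the results into
    an 8-bit code. Returns an (h-2, w-2) array of codes."""
    h, w = image.shape
    out = np.zeros((h - 2, w - 2), dtype=np.uint8)
    # neighbour offsets, clockwise from the top-left corner
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            c = image[i, j]
            code = 0
            for bit, (di, dj) in enumerate(offsets):
                if image[i + di, j + dj] >= c:
                    code |= 1 << bit
            out[i - 1, j - 1] = code
    return out
```

A flat region maps to the all-ones code 255 (every neighbour is >= the centre), while edges and corners produce distinct codes; histograms of these codes are the texture features fed into the tensor decomposition.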
Recent advances in Web and information technologies have resulted in many e-learning resources. There is an emerging requirement to manage and reuse relevant resources together to achieve on-demand e-learning on the Web. Ontologies have become a key technology for enabling semantic-driven resource management. We argue that to meet the requirements of semantic-based resource management for Web-based e-learning, one should go beyond using domain ontologies statically. In this paper, we provide a semantic mapping mechanism to integrate e-learning databases by using ontology semantics. Heterogeneous e-learning databases can be integrated under a mediated ontology. Taking into account the locality of resource reuse, we propose to represent context-specific portions of the whole ontology as sub-ontologies. We present a sub-ontology-based approach for resource reuse using an evolutionary algorithm. We also conduct simulation experiments to evaluate the approach in a traditional Chinese medicine e-learning scenario and obtain promising results.
['Zhaohui Wu', 'Yuxin Mao', 'Huajun Chen']
Subontology-Based Resource Management for Web-Based e-Learning
505,933
A solution to the path-following problem for underactuated autonomous vehicles in the presence of possibly large parametric modeling uncertainty is proposed. For a general class of vehicles moving in 2D space, we propose a path-following control law based on multi-variable sliding mode that yields global boundedness, convergence of the position tracking error to a small neighborhood, and robustness to parametric modeling uncertainty. An error-integration element is added into the “tanh” function of the traditional sliding-mode control. We illustrate our results in the context of vehicle control applications in which an underwater vehicle moves along desired paths in 2D space. Simulations show that the control objectives are accomplished.
['Daxiong Ji', 'Jian Liu', 'Hongyu Zhao', 'Yiqun Wang']
Path Following of Autonomous Vehicle in 2D Space Using Multivariable Sliding Mode Control
399,304