query (string, lengths 8–1.13k) | pos (string, lengths 13–1.09k) | neg (string, lengths 11–1.07k) | query_lang (string, 32 classes) | __index_level_0__ (int64, 41–1.05M)
---|---|---|---|---|
After meeting Chinese people, what is your opinion of them?
|
I'm impressed that Chinese people are usually polite and hard-working. One of my friends who is Chinese is excellent in her studies and gets along with her friends well. I quite like Chinese people.
|
I hope this doesn't offend anyone, but how do you know they're not just acting, pretending and talking gibberish?
|
eng_Latn
| 135,027 |
The anatomy of Bloom: Harold Bloom and the study of influence and anxiety
|
Here at last is a comprehensive introduction to the career of America's leading intellectual. "The Anatomy of Bloom" surveys Harold Bloom's life as a literary critic, exploring all of his books in chronological order to reveal that his work, and especially his classic "The Anxiety of Influence," is best understood as an expression of reprobate American Protestantism, yet one haunted by a Jewish fascination with the Holocaust. Heys traces Bloom's intellectual development from his formative years as a poor second-generation immigrant in the Bronx to his later eminence as an international literary phenomenon. He argues that the career path of Bloom, the quintessential living embodiment of the American dream, deconstructs the very foundations of American Protestantism.
|
Through this article the author presents an experimental collage about the experiment of writing and its relation to the analogies of kinship—an enactment of folk phenomenology.
|
eng_Latn
| 135,057 |
Summary on Modern Adaptation of All Men Are Brothers
|
The paper examines fictions and plays based on All Men Are Brothers written between 1917 and 1949. The author outlines a biography of the related works and categorizes them into three dramatic themes: chivalry, humanity, and ideology. The paper also explores the specific historical and cultural background, the psychology of creative adaptation, readers' expectations, presentation, and thematic analysis. Through a discussion of the adaptive works' artistic successes and failures, the author attempts to sort out the possibilities and limits of creative efforts based on the classic. The essay ends with a case study of "Lin Chong Dash at Night", a scene adapted in three different plays.
|
Preface 1. Changing Times 2. On the Way to a Different Place 3. This Land Is Their Land 4. Their Name Is "Woman" 5. A Rightful Coming of Age 6. Not Without Struggle Afterword Notes Church Documents Cited Index
|
yue_Hant
| 135,064 |
very careful, wonderful study
|
I recently used this text in a university class; my students and I value this work a great deal. It's great to see a medievalist criticize Aries appropriately, without throwing out the baby with the bathwater. This is essential reading for anyone interested in the history of childhood.
|
This is emotive of foggy London, seedy places and shadow people. Le Carre is the master. I cannot add to the intimacy with the world of seedy espionage he can evoke. It's a wonder we survived the Cold War!
|
eng_Latn
| 135,354 |
A data stream-based evaluation framework for traffic information systems
|
Traffic information systems based on mobile, in-car sensor technology are a challenge for data management systems, as a huge amount of data has to be processed in real time. Data mining methods must be adapted to cope with the challenges of handling streaming data. Although several data stream mining methods have been proposed, an evaluation of such methods in the context of traffic applications is still missing. In this paper, we present an evaluation framework for data stream mining for traffic applications. We use traffic simulation software to emulate the generation of traffic data by mobile probes. The framework is evaluated in a first case study, namely queue-end detection. We show first results of the evaluation of a data stream mining method, varying multiple parameters of the traffic simulation. The goal of our work is to identify parameter settings for which data stream mining methods produce useful results for the traffic application at hand.
|
With the rise of social media like Twitter and distribution platforms like app stores, users have various ways to express their opinions about software products. Popular software vendors get user feedback thousandfold per day. Research has shown that such feedback contains valuable information for software development teams. However, a manual analysis of user feedback is cumbersome and hard to manage. We present OpenReq Analytics, a software requirements intelligence service, that collects, processes, analyzes, and visualizes user feedback.
|
eng_Latn
| 135,434 |
Virtually Shared Memory Architectures for Scalable Universal Parallel Computers
|
Recent results in theoretical computer science confirm that highly parallel, general-purpose shared-memory computers can in principle be built. These results are established by studying emulations of an idealised shared-memory parallel machine model, the Parallel Random Access Machine or PRAM, on realistic distributed-memory parallel systems. Within this context, this paper reports on the basic approaches to provide the common-memory abstraction in a distributed-memory machine (Virtually Shared Memory or VSM).
|
Understanding the composition of Internet traffic has many applications nowadays, mainly tracking bandwidth-consuming applications, QoS-based traffic engineering, and lawful interception of illegal traffic. Although many classification methods such as Support Vector Machines (SVM) have demonstrated their accuracy, not enough attention has been paid to the practical implementation of lightweight classifiers. In this paper, we consider the design of a real-time SVM classifier at multi-Gbps rates to allow online detection of categories of applications. Our solution is based on the design of a hardware-accelerated SVM classifier on an FPGA board.
|
eng_Latn
| 135,443 |
Comparison of multi-user detection algorithm in TD-SCDMA system
|
A nonlinear multi-user detection algorithm, the multi-path interference cancellation (MPIC) algorithm, was developed to suppress the inter-symbol interference caused by multi-path fading and to increase the throughput of the TD-SCDMA system. The principles of MPIC and of linear multi-user detection algorithms were described. Based on the simulation and complexity comparison, the conclusion was drawn that MPIC is better than joint detection.
|
The need to search for complex and recurring patterns in database sequences is shared by many applications. In this paper, we discuss how to express and support efficiently sophisticated sequential pattern queries in databases. Thus, we first introduce SQL-TS, an extension of SQL, to express these patterns, and then we study how to optimize search queries for this language. We take the optimal text search algorithm of Knuth, Morris and Pratt, and generalize it to handle complex queries on sequences. Our algorithm exploits the inter-dependencies between the elements of a sequential pattern to minimize repeated passes over the same data. Experimental results on typical sequence queries, such as double bottom queries, confirm that substantial speedups are achieved by our new optimization techniques.
|
eng_Latn
| 135,445 |
Proceedings of the 10th Central and Eastern European Software Engineering Conference in Russia
|
Organized since 2005, CEE-SECR is the key annual software event in Central and Eastern Europe. It is regularly attended by 800 participants from the local industry. Thanks to online and media coverage, the total reach is over 1 million people from the CEE region. The conference was initially positioned as a Russian event; however, it attracted speakers and attendees from many other countries, so in 2009 the conference was repositioned as a CEE event.
|
The data acquisition system for the CMS experiment at the Large Hadron Collider (LHC) will require a large and high-performance event building network. Several architectures and switch technologies are currently being evaluated. This paper describes demonstrators which have been set up to study a small-scale event builder based on PCs emulating high-performance sources and sinks connected via Ethernet or Myrinet switches. Results from ongoing studies, including measurements on throughput and scaling, are presented.
|
eng_Latn
| 135,454 |
Long term simulation of the Israel power system dynamic response: a case study
|
Long term simulation of the dynamic response of power systems has received much attention for several years. However, there are still numerous difficulties involved in reconstructing widespread disturbances. The authors describe the experience gained at the Israel Electric Corporation (IEC) using long term simulation for the reconstruction of a system wide contingency for a period of 15 minutes.
|
This paper designs and implements a multi-point disaster tolerance system for database services. The system monitors data changes in the primary database in real time, and the monitored changes are replayed on several remote standby databases in real time, ensuring data consistency between the standby databases and the primary database. The system detects failures of the primary database; when the primary database fails, the database service is automatically migrated to a standby database within a few seconds, ensuring service continuity and improving the disaster tolerance capacity of the database service.
|
eng_Latn
| 135,461 |
Temporal versus Spatial Observability in Model-Based Diagnosis
|
Accurate fault diagnosis is a crucial success factor in achieving system dependability. The unambiguity of a diagnosis is critically dependent on the number of observations available for the inference process. Observability, therefore, significantly determines diagnostic quality. In this paper we introduce the notion of a temporal and a spatial dimension to observability in Model-Based Diagnosis, and study their impact on diagnostic quality. We use uncertainty, measured as the expected entropy of a diagnosis after observation, as the quality metric. Empirical results confirm that for both dimensions an increase in observability always leads to a reduction of expected diagnostic entropy. However, the temporal and spatial reduction factors differ in that they decrease and increase, respectively, with increasing observability. The results from this study are useful for making practical trade-offs between additional sensor placements and longer measurement horizons.
|
Introduction. Section One: Ending Business Surprises. Chapter One: Turning Business Disasters Into Opportunities. Chapter Two: Identifying and Justifying the Right Real Time Information. Section Two: Real Time in the Real World. Chapter Three: Surprise Event: Missing the Warning. Chapter Four: Suspected Event: Reporting Too Late. Chapter Five: Surmounted Events: Getting it Right. Section Three: From Real Time Opportunity Detection to the Real Time Enterprise. Chapter Six: Deploying Real Time Opportunity Detection Across the Enterprise. Chapter Seven: Solving the Challenges of Deploying Real Time Opportunity. Chapter Eight: The Future in a Real Time World. Conclusion: The Time is Now.
|
eng_Latn
| 135,468 |
Scalable Processing of Context Information with COSMOS
|
Ubiquitous computing environments are characterised by a large number of heterogeneous devices that generate a huge amount of context data. These data are used to adapt applications to changing execution contexts. However, legacy frameworks fail to process context information in a scalable and efficient manner. In this paper, we propose to organise the classical functionalities of a context manager into a three-step cycle of data collection, interpretation, and situation identification. We propose the COSMOS framework, which is based on the concepts of context nodes and context management policies translated into software components in a software architecture. This paper presents COSMOS and evaluates its efficiency through the example of composing context information to implement a caching/offloading adaptation situation.
|
The data acquisition system for the CMS experiment at the Large Hadron Collider (LHC) will require a large and high-performance event building network. Several architectures and switch technologies are currently being evaluated. This paper describes demonstrators which have been set up to study a small-scale event builder based on PCs emulating high-performance sources and sinks connected via Ethernet or Myrinet switches. Results from ongoing studies, including measurements on throughput and scaling, are presented.
|
kor_Hang
| 135,472 |
Method and apparatus for contents management
|
A content management method and apparatus for storing and deleting content, backing up content, and restoring the backup content are provided. The method includes receiving the content from a first server and storing the content, and generating and storing content information including information about a location from which the content can be downloaded.
|
The data acquisition system for the CMS experiment at the Large Hadron Collider (LHC) will require a large and high-performance event building network. Several architectures and switch technologies are currently being evaluated. This paper describes demonstrators which have been set up to study a small-scale event builder based on PCs emulating high-performance sources and sinks connected via Ethernet or Myrinet switches. Results from ongoing studies, including measurements on throughput and scaling, are presented.
|
eng_Latn
| 135,527 |
A CONSTANT TIME ALGORITHM FOR REDUNDANCY ELIMINATION IN TASK GRAPHS ON PROCESSOR ARRAYS WITH RECONFIGURABLE BUS SYSTEMS
|
The task or precedence graph formalism is a practical tool to study algorithm parallelization. Redundancy in such task graphs gives rise to numerous avoidable inter-task dependencies, which invariably complicates the process of parallelization. In this paper we present an O(1) time algorithm for the elimination of redundancy in such graphs on Processor Arrays with Reconfigurable Bus Systems using O(n^4) processors. The previous parallel algorithm available in the literature for redundancy elimination in task graphs takes O(n^2) time using O(n) processors.
|
To combat persistent sex offenders, recent laws call for the use of GPS technology to monitor their movements. In [1], the adoption of a Spatio-Temporal DataBase (S-T DB) was proposed for archiving this type of complex data, besides traditional data (i.e., data about their homes, (pending) crimes, and sensitive areas). In the present paper we discuss investigative strategies that can be easily implemented on top of such a database by taking advantage of the history of the movements of sex offenders. As the implementation platform, the SECONDO DB Management System (DBMS) is adopted, since it supports a data type for moving objects as well as a rich set of spatio-temporal operators to query such complex data [2].
|
yue_Hant
| 135,530 |
Fault Diagnosis of the Twin-lift Hydraulic Hoist System Based on Fault Tree Analysis
|
Based on Fault Tree Analysis (FTA), the fault tree of the twin-lift hydraulic hoist system for a gate is established by analyzing the system failure forms, the system structure, and the logical relationship between the parts and the system, taking the faults as the top event. Qualitative and quantitative analyses are carried out by computing the minimum cut sets and the importance. The results show that the fault is mainly caused by the wear of hydraulic components and the jamming of the valve spool. The study can provide guidance for routine maintenance and repair.
|
We propose a framework for examining trust in the storage stack based on different levels of trustworthiness present across different channels of information flow. We focus on corruption in one of the channels, the data channel and as a case study, we apply type-aware corruption techniques to examine Windows NTFS behavior when on-disk pointers are corrupted. We find that NTFS does not verify on-disk pointers thoroughly before using them and that even established error handling techniques like replication are often used ineffectively. Our study indicates the need to more carefully examine how trust is managed within modern file systems.
|
eng_Latn
| 135,531 |
A data synchronization system based on distributed SQL and stream copy
|
To meet different real-time requirements in data synchronization, a data synchronization system is proposed, which is based on distributed SQL and query optimization to achieve synchronous data distribution, and on log mining and stream copy to achieve asynchronous distribution. The paper gives an application example of data sharing and data synchronization in government sectors. Application results show that the integrated use of synchronous and asynchronous data distribution provides a good solution for data synchronization.
|
We present here a study of a scheduler which cooperates with the queueing system TORQUE and is tailored to the needs of a HEP-dominated large Grid site with around 10,000 job slots. Triggered by severe scaling problems of MAUI, a scheduler referred to as MYSCHED was developed and put into operation. We discuss conceptual aspects as well as experiences after almost two years of running.
|
eng_Latn
| 135,536 |
Parallel changes in large scale software development: an observational case study
|
An essential characteristic of large scale software development is parallel development by teams of developers. How this parallel development is structured and supported has a profound effect on both the quality and timeliness of the product. We conduct an observational case study in which we collect and analyze the change and configuration management history of a legacy system to delineate the boundaries of, and to understand the nature of, the problems encountered in parallel development. The results of our studies are: 1) the degree of parallelism is very high, higher than considered by tool builders; 2) there are multiple levels of parallelism, and the data for some important aspects are uniform and consistent for all levels; and 3) the tails of the distributions are long, indicating that the tail, rather than the mean, must receive serious attention in providing solutions for these problems.
|
Experimentation Data Process: Lockheed Martin experimentation at the Center for Innovation covers constructive simulations and human-in-the-loop simulation. Two main issues: data extraction/storage and data manipulation/reduction. Early experimentation (2006 processes): post-run extraction, manual reduction/consolidation. Current experimentation (2007 processes): real-time and post-run extraction, Hyperion Intelligence for data reduction.
|
eng_Latn
| 135,538 |
Robust consensus of multi-agent systems with diverse input delays and asymmetric interconnection perturbations
|
The consensus problem of second-order multi-agent systems with diverse input delays is investigated. Based on the frequency-domain analysis, decentralized consensus conditions are obtained for the multi-agent system with symmetric coupling weights. Then, the robustness of the symmetric system with asymmetric perturbation is studied. A bound of the largest singular value of the perturbation matrix is obtained as the robust consensus condition. Simulation examples illustrate the design procedure of consensus protocols and validate the correctness of the results.
|
We study in this vision paper the problem of integrating several web data sources under uncertainty and dependencies. We present a concrete application with web sources about objects in the maritime domain where uncertainties and dependencies are omnipresent. Uncertainties are mainly caused by imprecise information trackers and imperfect human knowledge. Dependencies come from the recurrent copying relationships occurring among the sources. We answer the issue of data integration in such a setting by reformulating it as the merge of several uncertain versions of the same global XML document. As an initial result, we put forward a probabilistic XML data integration model by getting some intuitions from the versioning model with uncertain data we proposed in [5]. We explain how this model can be used for materializing the integration outcome.
|
eng_Latn
| 135,539 |
A FAD for data intensive applications
|
FAD is a strongly typed database programming language designed for uniformly manipulating transient and persistent data on Bubba, a parallel database system developed at MCC. The paper provides an overall description of FAD, and discusses the design rationale behind a number of its distinguishing features. Comparisons with other database programming languages are provided.
|
We have modeled the AntSim case study for the GraBats 2008 tool contest with the Fujaba tool. It turned out that for this problem the moving of single ants is the most frequent operation. The execution time for this operation dominates the overall execution time. This paper will report how we addressed the move ant problem using dedicated Fujaba features and which performance we have achieved.
|
eng_Latn
| 135,546 |
What happens when datanode fails?
|
What happens when a DataNode fails?
|
What is the best website (video tutorials) to study distributed systems?
|
eng_Latn
| 135,550 |
DataSeries: an efficient, flexible data format for structured serial data
|
Crash data collection: a Windows case study
|
Safe railway crossing system based on Zigbee communication
|
eng_Latn
| 135,623 |
A service migration case study: migrating the Condor schedd
|
The design and implementation of Zap: a system for migrating computing environments
|
Globus: a Metacomputing Infrastructure Toolkit
|
eng_Latn
| 135,630 |
Comdb2: Bloomberg's Highly Available Relational Database System
|
Centiman: elastic, high performance optimistic concurrency control by watermarking
|
Signalized intersection delay estimation: case study comparison of TRANSYT-7F, Synchro and HCS
|
kor_Hang
| 135,639 |
Failure Analysis of Jobs in Compute Clouds: A Google Cluster Case Study
|
Workload characterization on a production Hadoop cluster: A case study on Taobao
|
HCI Research as Problem-Solving
|
kor_Hang
| 135,644 |
Management of interdependencies in collaborative software development: a field study
|
How a good software practice thwarts collaboration: the multiple roles of APIs in software development
|
The history of the mainstream rejection of interdependent preferences
|
eng_Latn
| 135,757 |
Measuring the Impact of Different Metrics on Software Quality: a Case Study in the Open Source Domain
|
Identifying security bug reports via text mining: An industrial case study
|
A novel metric of software quality: structural availability
|
eng_Latn
| 135,758 |
Can collaborative tagging improve user feedback? a case study
|
How Software Developers Use Tagging to Support Reminding and Refinding
|
Path collective variables without paths
|
eng_Latn
| 135,765 |
Granular computing is gradually changing from a label to a new field of study. The driving forces, the major schools of thought, and the future research directions on granular computing are examined. A triarchic theory of granular computing is outlined. Granular computing is viewed as an interdisciplinary study of human-inspired computing, characterized by structured thinking, structured problem solving, and structured information processing.
|
The year 2007 marks the 10th anniversary of the introduction of granular computing research. We have experienced the emergence and growth of granular computing research over the past ten years. It is essential to explore and review the progress made in the field of granular computing. We use two popular databases, ISI's Web of Science and the IEEE Digital Library, to conduct our research. We study the current status, trends, and future directions of granular computing and identify prolific authors, high-impact authors, and the most influential papers of the past decade.
|
Blunt trauma abdomen rarely leads to gastrointestinal injury in children and isolated gastric rupture is even rarer presentation. We are reporting a case of isolated gastric rupture after fall from height in a three year old male child.
|
eng_Latn
| 135,770 |
Incorporating UCD into the software development lifecycle: a case study
|
This case study addresses how we applied user centered design (UCD) to the software development lifecycle for the new City of Austin Utilities Online Customer Care website. The case study focuses on the use of personas, prototypes, and user testing, discusses what worked well, and provides lessons learned.
|
In this paper, we develop a dynamic programming algorithm for the scenario-tree-based stochastic uncapacitated lot-sizing problem with random lead times. Our algorithm runs in O(N^2) time, where N is the input size of the scenario tree, and improves the recently developed algorithm that runs in O(N^3) time.
|
eng_Latn
| 135,776 |
Backfill Techniques: techniques used in cumulative case studies to collect information needed if the study is to be usable for aggregation; these techniques include, for example, obtaining missing information from the authors on how the instances studied were identified and on the bases for instance selection.
|
Backfill techniques can be used to gather information on a case study so that it can be analysed in conjunction with other studies.
|
Case studies performed by different researches using different techniques cannot be compared or combined to reach useful conclusions.
|
eng_Latn
| 135,783 |
Incorporating nonparametric statistics into Delphi studies in library and information science
|
Exploring the barriers and challenges of information and communication technology use in distributed research today: A ranking-type Delphi study
|
Some perspectives on nonparametric statistical process control
|
eng_Latn
| 135,787 |
The growing importance of collaboration in research, and the still underdeveloped state of the art of research on collaboration, have encouraged scientists from 16 countries to establish a global interdisciplinary research network under the title “Collaboration in Science and in Technology” (COLLNET), with Berlin as its virtual centre, set up on January 1st, 2000. The network comprises prominent scientists who at present work mostly in the field of quantitative science studies. The intention is to co-operate on both theoretical and applied aspects.
|
A chronically weak area in research papers, reports, and reviews is the complete identification of the background documents that formed the building blocks for these papers. A method for systematically determining these seminal references is presented. Citation-Assisted Background (CAB) is based on the assumption that seminal documents tend to be highly cited. CAB is presently being applied to three application studies, and the results so far are much superior to those used by the first author for background development in any other study. An example of the application of CAB to the field of Nonlinear Dynamics is outlined. While CAB is a highly systematic approach for identifying seminal references, it is not a substitute for the judgement of the researchers, and serves as a supplement.
|
This publication contains reprint articles for which IEEE does not hold copyright. Full text is not available on IEEE Xplore for these articles.
|
eng_Latn
| 135,788 |
In order to understand how collaboration between people from different disciplines takes place, research is being undertaken in the area of art and technology. The paper describes two studies of collaboration between artists and technologists drawn from the COSTART (COmputer SupporT for ARTists) project, an artist-in-residency programme that provided a platform for studying the creative process. The paper describes how the research was carried out and, in particular, how the data analysis was conducted using a coding scheme developed specifically for this context. Finally, the preliminary findings are discussed and future work is proposed.
|
Introduction. The Book and the Software. Making a Start. First Coding. Making Data. Working with Data. Shaping Your Project. 'Emerging' Theory? Ordering Concepts. Moving Faster. Getting There. Preparing Data.
|
Berzelius failed to make use of Faraday's electrochemical laws in his laborious determination of equivalent weights.
|
eng_Latn
| 135,789 |
Unique problems of dementia in the younger patient.
|
While dementia is often thought of as a problem unique to the elderly patient, nearly one in ten patients with dementia is younger than 65. The etiologies of dementia in this population are varied, including a genetically inherited form of Alzheimer's disease, as well as dementia related to other problems such as Parkinson's disease, Down syndrome, and cerebrovascular disease. Health care practitioners may have difficulty diagnosing early onset dementia because the diagnostic tools and the disease manifestation differ from those of the elderly patient. In addition, treatment of early-onset dementia can also pose unique challenges related to the speed of progression of the disease, depression, and behavioral disturbances, which often plague younger patients with dementia.
|
It is not uncommon in software projects to incorporate new developers at an advanced stage of project execution. These "newcomers" face various difficulties and challenges in finding their place in the project, which prevent them from quickly starting to contribute to its progress. This article reports the results of an exploratory-descriptive study aimed at identifying the difficulties faced by new team members when they join a project late, as well as the actions often adopted to mitigate these difficulties. The study reveals that scarce or non-existent documentation and the need to learn the product under construction are the main difficulties, while the assignment of a reference person and the provision of training are the main actions organizations usually take to mitigate those problems.
|
eng_Latn
| 135,791 |
Exploring the Future of Development Learning: The Open Learning Campus.
|
The Open Learning Campus is changing the landscape for development learning around the world. By incorporating innovative ways of sharing knowledge across development professionals, partners, and clients, OLC provides learners a real opportunity to seamlessly and efficiently learn and grow, thereby increasing motivation and retention. This study explores the Open Learning Campus, its knowledge sharing tools and systems, as well as its impact within and outside the World Bank Group.
|
Experimentation Data Process: Lockheed Martin experimentation at the Center for Innovation covers constructive simulations and human-in-the-loop simulation. Two main issues: data extraction/storage and data manipulation/reduction. Early experimentation (2006 processes): post-run extraction, manual reduction/consolidation. Current experimentation (2007 processes): real-time and post-run extraction, Hyperion Intelligence for data reduction.
|
eng_Latn
| 135,796 |
Scientists study how HIV hides in body
|
The AIDS virus has hideouts deep in the immune system that today's drugs can't reach. Now scientists have finally discovered how HIV builds one of those fortresses, and they're exploring whether a drug already used to fight a parasite in developing countries just might hold a key to break in.
|
Autonomy's Introspect e-discovery software now finds data stored in virtual environments.
|
eng_Latn
| 135,805 |
ScienceWISE: A Web-based Interactive Semantic Platform for Scientific Collaboration
|
Formal models for expert finding in enterprise corpora
|
A study of smoothing methods for language models applied to Ad Hoc information retrieval
|
eng_Latn
| 135,816 |
Verification and validation of bioinformatics software without a gold standard: a case study of BWA and Bowtie
|
Reproducible Research in Computational Science
|
Simultaneous mosaicing and tracking with an event camera
|
eng_Latn
| 135,828 |
Predictive analytics for banking user data using AWS Machine Learning cloud service
|
A Data-Driven Approach to Predict the Success of Bank Telemarketing
|
Understanding the motivations, participation, and performance of open-source software developers: A longitudinal study of the apache projects
|
eng_Latn
| 135,847 |
Retrieval from software libraries for bug localization: a comparative study of generic and composite text models
|
Cluster-based retrieval using language models
|
Liquid Biopsy in Liquid Tumors
|
eng_Latn
| 135,854 |
Mapping world scientific collaboration: Authors, institutions, and countries
|
Investigating different types of research collaboration and citation impact: a case study of Harvard University’s publications
|
Do types of collaboration change citation? Collaboration and citation patterns of South African science publications
|
eng_Latn
| 135,870 |
Can FOSS projects benefit from integrating Kanban: a case study
|
The social structure of free and open source software development
|
Exploring N-gram Character Presentation in Bidirectional RNN-CRF for Chinese Clinical Named Entity Recognition
|
eng_Latn
| 135,873 |
A survey of the use of crowdsourcing in software engineering
|
A human study of patch maintainability
|
Annotating Named Entities in Twitter Data with Crowdsourcing
|
eng_Latn
| 135,883 |
A pragmatic proposal for linking theory and data in the social sciences.
|
DENDRAL: a case study of the first expert system for scientific hypothesis formation.
|
Cooperative Scheduling for Coexisting Body Area Networks
|
eng_Latn
| 135,888 |
An empirical study of an informal knowledge repository in a medium-sized software consulting company
|
Knowledge management in software engineering
|
Dropout Inference in Bayesian Neural Networks with Alpha-divergences
|
eng_Latn
| 135,896 |
Refactorings of Design Defects using Relational Concept Analysis
|
Software engineers often need to identify and correct design defects, i.e., recurring design problems that hinder development and maintenance by making programs harder to comprehend and/or evolve. While detection of design defects is an actively researched area, their correction (mainly a manual and time-consuming activity) is yet to be extensively investigated for automation. In this paper, we propose an automated approach for suggesting defect-correcting refactorings using relational concept analysis (RCA). The added value of RCA consists in exploiting the links between formal objects, which abound in a software re-engineering context. We validated our approach on instances of the Blob design defect taken from four different open-source programs.
|
Abstract: The present study shows by example the potential amount of information available in a set of observations of targets where there are known relations between these targets. Known relations between objects significantly reduce the set of possible explanations behind a set of observations. The application here is classification of military targets. Cost functions and interaction with decision makers meaningfully extend the feasibility of the present approach to treating many observations and the possible targets behind them.
|
eng_Latn
| 135,912 |
Measurement of knowledge management maturity level within organizations
|
Purpose – The purpose of this paper is to develop a model for measuring the knowledge management maturity level in organizations. Design/methodology/approach – This paper defines and extracts the factors and indicators that affect knowledge management and proposes a schema for prioritizing and weighting each factor and variable. The paper further surveys and evaluates existing knowledge management models and presents a knowledge management maturity model with defined factors and variables. Findings – Defining and extracting 8 factors and 42 variables that affect knowledge management and subsequently developing a knowledge management maturity model. The model of this study is practical and helps to determine an organization's maturity position in knowledge management by defining the existing status of its factors and variables; the prioritization of factors and variables enables the organization to optimize its profile. Research limitations/implications – For increasing generalization of mod...
|
With the rise of social media like Twitter and distribution platforms like app stores, users have various ways to express their opinions about software products. Popular software vendors get user feedback thousandfold per day. Research has shown that such feedback contains valuable information for software development teams. However, a manual analysis of user feedback is cumbersome and hard to manage. We present OpenReq Analytics, a software requirements intelligence service, that collects, processes, analyzes, and visualizes user feedback.
|
eng_Latn
| 135,914 |
How can usability contribute to user experience?: a study in the domain of e-commerce
|
Interaction design: beyond human-computer interaction
|
Tree2Tree Neural Translation Model for Learning Source Code Changes
|
eng_Latn
| 135,938 |
Multi-Factor Duplicate Question Detection in Stack Overflow
|
Categorizing bugs with social networks: a case study on four open source software communities
|
Creative learning environments in education – a systematic literature review.
|
eng_Latn
| 135,939 |
A study on the software requirements elicitation issues - its causes and effects
|
Elicitation technique selection: how do experts do it?
|
Learning styles and performance in the introductory programming sequence
|
eng_Latn
| 135,946 |
User preferences of software documentation genres
|
Usage and usefulness of technical software documentation: An industrial case study
|
DCU: Aspect-based Polarity Classification for SemEval Task 4
|
eng_Latn
| 135,949 |
An empirical study on how expert knowledge affects bug reports
|
Retrieval from software libraries for bug localization: a comparative study of generic and composite text models
|
Word Embeddings for the Construction Domain
|
eng_Latn
| 135,950 |
Research synthesis in software engineering: A tertiary study
|
Preliminary Guidelines for Empirical Research in Software Engineering
|
A review of studies on expert estimation of software development effort
|
eng_Latn
| 135,952 |
The GNOME project: a case study of open source, global software development
|
Cave or Community? An Empirical Examination of 100 Mature Open Source Projects
|
Hematopoietic Cell Transplant and Use of Massage for Improved Symptom Management: Results from a Pilot Randomized Control Trial
|
eng_Latn
| 135,958 |
Evaluating usage and quality of technical software documentation: an empirical study
|
A study of the documentation essential to software maintenance
|
Picture: A probabilistic programming language for scene perception
|
eng_Latn
| 135,961 |
An empirical study on the impact of static typing on software maintainability
|
All syntax errors are not equal
|
A comprehensive assessment of the structural similarity index
|
eng_Latn
| 135,969 |
A systematic review of research on open source software in commercial software product development
|
Guidelines for conducting and reporting case study research in software engineering
|
Using cluster analysis for market segmentation - typical misconceptions, established methodological weaknesses and some recommendations for improvement
|
eng_Latn
| 135,970 |
Understanding the Factors that Impact the Popularity of GitHub Repositories
|
An exploratory study of the pull-based software development model.
|
High performance gradient driver for magnetic resonance imaging system
|
eng_Latn
| 135,978 |
A study on the software requirements elicitation issues - its causes and effects
|
Elicitation technique selection: how do experts do it?
|
Influence maximization: near-optimal time complexity meets practical efficiency
|
eng_Latn
| 135,981 |
What is Wrong with Topic Modeling? (and How to Fix it Using Search-based Software Engineering)
|
Predicting defect-prone software modules using support vector machines
|
A study of effective regression testing in practice
|
eng_Latn
| 136,000 |
Which process metrics can significantly improve defect prediction models? An empirical study
|
Using Software Dependencies and Churn Metrics to Predict Field Failures: An Empirical Case Study
|
Successful treatment of intractable vulvitis circumscripta plasmacellularis via combination therapy with topical tacrolimus and tetracycline
|
eng_Latn
| 136,007 |
Managing technical debt: An industrial case study
|
Agile software development: it's about feedback and change.
|
Review. Machine learning techniques for traffic sign detection
|
eng_Latn
| 136,008 |
Systematic literature studies: database searches vs. backward snowballing
|
Identifying relevant studies in software engineering
|
What makes a helpful online review? a study of customer reviews on amazon.com
|
eng_Latn
| 136,013 |
Assessing programming language impact on development and maintenance: a study on c and c++
|
Two case studies of open source software development: Apache and Mozilla
|
Determinants of malaria infection in Dembia district, Northwest Ethiopia: a case-control study
|
eng_Latn
| 136,017 |
Using ISO/IEC 12207 to analyze open source software development processes: an e-learning case study
|
The situational factors that affect the software development process: Towards a comprehensive reference framework
|
Integrated pathology and radiology learning for a musculoskeletal system module: an example of interdisciplinary integrated form
|
eng_Latn
| 136,019 |
Interprocedural Semantic Change-Impact Analysis using Equivalence Relations
|
The impact of code review coverage and code review participation on software quality: a case study of the qt, VTK, and ITK projects
|
Self-automated parking lots for autonomous vehicles based on vehicular ad hoc networking
|
eng_Latn
| 136,023 |
Research synthesis in software engineering: A tertiary study
|
Strength of evidence in systematic reviews in software engineering
|
Cortical Activation by Yamamoto New Scalp Acupuncture in the Treatment of Patients with a Stroke: A Sham-Controlled Study Using Functional Mri
|
eng_Latn
| 136,029 |
Where do developers log? an empirical study on logging practices in industry
|
Structured comparative analysis of systems logs to diagnose performance problems
|
A Conversational Model of art therapy
|
eng_Latn
| 136,035 |
Automated classification of software issue reports using machine learning techniques: an empirical study
|
The MSR Cookbook: Mining a decade of research
|
Heads, Hox and the phylogenetic position of trilobites
|
eng_Latn
| 136,039 |
On the value of user preferences in search-based software engineering: a case study in software product lines
|
Pareto efficient multi-objective test case selection
|
Artefacts in magnetic resonance imaging caused by dental material
|
eng_Latn
| 136,043 |
User preferences of software documentation genres
|
Usage and usefulness of technical software documentation: An industrial case study
|
Is Working Capital Management Value-Enhancing? Evidence from Firm Performance and Investments
|
eng_Latn
| 136,049 |
Defect prediction on a legacy industrial software: a case study on software with few defects
|
A systematic review of machine learning techniques for software fault prediction
|
A METRICS SUITE FOR OBJECT ORIENTED DESIGN
|
eng_Latn
| 136,059 |
A study on the software requirements elicitation issues - its causes and effects
|
Elicitation technique selection: how do experts do it?
|
Fake Review Detection : Classification and Analysis of Real and Pseudo Reviews
|
eng_Latn
| 136,063 |
An empirical study on dependence clusters for effort-aware fault-proneness prediction
|
A Validation of Object-Oriented Design Metrics as Quality Indicators
|
Contextual Code Completion Using Machine Learning
|
eng_Latn
| 136,072 |
An ethnographic study of copy and paste programming practices in OOPL
|
Clone detection using abstract syntax trees
|
Collaborative Fashion Recommendation: A Functional Tensor Factorization Approach
|
eng_Latn
| 136,075 |
An empirical study on dependence clusters for effort-aware fault-proneness prediction
|
Are Slice-Based Cohesion Metrics Actually Useful in Effort-Aware Post-Release Fault-Proneness Prediction? An Empirical Study
|
Designing a Measurement Method for the Portability Non-functional Requirement
|
eng_Latn
| 136,094 |
Achieving quality in open-source software.
|
A case study of open source software development: the Apache server
|
Evaluating Web Search with a Bejeweled Player Model
|
eng_Latn
| 136,095 |
Computational Higher Type Theory IV: Inductive Types
|
Cubical Type Theory: a constructive interpretation of the univalence axiom
|
Toward understanding the causes of unanswered questions in software information sites: a case study of stack overflow
|
kor_Hang
| 136,101 |
Retrospecting on work and productivity: a study on self-monitoring software developers' work.
|
What makes a great software engineer?
|
A low cost wireless sensor node for building monitoring
|
eng_Latn
| 136,123 |
Recently, several approaches have been introduced for incorporating the information from multiple cameras to increase the robustness of tracking. This allows to handle problems of mutually occluding objects - a reasonable scenario for many tasks such as visual surveillance or sports analysis. However, these methods often ignore problems such as inaccurate geometric constraints and violated geometric assumptions, requiring complex methods to resolve the resulting errors. In this paper, we introduce a new multiple camera tracking approach that inherently avoids these problems. We build on the ideas of generalized Hough voting and extend it to the multiple camera domain. This offers the following advantages: we reduce the amount of data in voting and are robust to projection errors. Moreover, we show that using additional geometric information can help to train more specific classifiers drastically improving the tracking performance. We confirm these findings by comparing our approach to existing (multi-camera) tracking methods.
|
To increase the robustness of detection in intelligent video surveillance systems, homography has been widely used to fuse foreground regions projected from multiple camera views to a reference view. However, the intersections of non-corresponding foreground regions can cause phantoms. This paper proposes an algorithm based on geometry and colour cues to cope with this problem, in which the homography between different camera views and the Mahalanobis distance between the colour distributions of every two associated foreground regions are considered. The integration of these two matching algorithms improves the robustness of the pedestrian and phantom classification. Experiments on real-world video sequences have shown the robustness of this algorithm.
|
Monitoring a ground source heat pump can provide important insights into its working, but to study the behaviour of the borehole heat exchanger (BHE) we require monitored data for the whole period ...
|
eng_Latn
| 136,142 |
Unattended Packages Recognition Based on HMM
|
As the study of behavior understanding deepens, the detection and analysis of unattended packages is receiving increasing attention from computer vision researchers. It aims at detecting, tracking, and identifying people and, more generally, at understanding human behaviors from image sequences involving humans. Driven by the application requirements of intelligent video surveillance, and in order to achieve intelligent unattended-package recognition, we implement the HMM (Hidden Markov Models) method for unattended-package recognition, propose an HMM-based recognition scheme, carry out image preprocessing, train the HMM parameters, perform identification on a test sequence, and study the recognition capability. At the end of this survey, some discussions on future directions in motion analysis are also provided.
|
This paper addresses the problem of the considerably decreased performance of spread spectrum audio watermarking in the presence of a time scale modification attack. An adaptive receiver is described, which can precisely estimate the quantization step needed to overcome the time scale modification attack. The proposed scheme obtains significantly lower bit error rates in the presence of DA/AD conversion and time scale modification, reducing the bit error rate by up to an order of magnitude.
|
eng_Latn
| 136,167 |
Sensors fusion for head tracking using Particle filter in a context of falls detection
|
In the context of ageing societies, assessing risk factors and detecting falls for the elderly is becoming a crucial issue. In this paper, we propose an iterative head tracking method based on particle filtering using the fusion of low cost thermal and depth sensors for home environments whilst preserving privacy. The iteration process begins by segmenting the head in the depth image to calculate the depth coefficients and the thermal coefficients used for updating the particle weights. The method was tested on several sequences, with or without depth-thermal fusion: results show its robustness and accuracy, and also demonstrate that fusion improves tracking, namely when fast motion occurs (in case of a fall for instance) or when segmentation is erroneous.
|
We study the effects of gluon radiation on top production and decay processes at an $e^+e^-$ collider. The matrix elements are computed without any approximations, using spinor techniques. We use a Monte Carlo event generator which takes into account the infrared singularity due to soft gluons and differences in kinematics associated with radiation in the production versus decay process. The calculation is illustrated for several strategies of top mass reconstruction.
|
eng_Latn
| 136,184 |
The 3dSOBS+ algorithm for moving object detection
|
Real-time foreground-background segmentation using codebook model.
|
Comparative study of background subtraction algorithms
|
eng_Latn
| 136,207 |
Background Modeling Using Adaptive Cluster Density Estimation for Automatic Human Detection
|
Detection is an inherent part of every advanced automatic tracking system. In this work we focus on automatic detection of humans by enhanced background subtraction. Background subtraction (BS) refers to the process of segmenting moving regions from video sensor data and is usually performed at pixel level. In its standard form this technique involves building a model of the background and extracting regions of the foreground. In this paper, we propose a cluster-based BS technique using a mixture of Gaussians. An adaptive mechanism is developed that allows automated learning of the model parameters. The efficiency of the designed technique is demonstrated in comparison with a pixel-based BS.
|
Summary form only given, as follows. A study is presented of precision constraints imposed by a hybrid chip architecture with analog neurons and digital backpropagation calculations. Conversions between the analog and digital domains and weight storage restrictions impose precision limits on both analog and digital calculations. It is shown through simulations that a learning system of this nature can be implemented in spite of limited resolution in the analog circuits and using fixed-point arithmetic to implement the backpropagation algorithm.
|
kor_Hang
| 136,314 |
Attribute-based vehicle search in crowded surveillance videos
|
Robust real-time object detection
|
A study of website content in webometrics ranking of world university by using similar web tool
|
eng_Latn
| 136,360 |
A Novel Video Dataset for Change Detection Benchmarking
|
Dense Estimation and Object-Based Segmentation of the Optical Flow with Robust Techniques
|
A comparative study of texture measures with classification based on featured distributions
|
eng_Latn
| 136,361 |
On the purity of training and testing data for learning: The case of pedestrian detection
|
Machine learning algorithms: a study on noise sensitivity.
|
Video eCommerce: Towards Online Video Advertising
|
eng_Latn
| 136,403 |
Category independent object discovery via background modeling
|
Object Detection with Discriminatively Trained Part-Based Models
|
Parametric cost estimation based on activity-based costing: A case study for design and development of rotational parts
|
eng_Latn
| 136,479 |
Education for Growth: Why and For Whom?
|
Determinants of economic growth: a cross-country empirical study
|
Confidence-Based Data Association and Discriminative Deep Appearance Learning for Robust Online Multi-Object Tracking
|
eng_Latn
| 136,550 |
A comparative study of the temperature dependence of lasing wavelength of conventional edge emitting stripe laser and vertical cavity surface emitting laser
|
Simple experimental verification of the relation between the band-gap energy and the energy of photons emitted by LEDs
|
Completely Stale Transmitter Channel State Information is Still Very Useful
|
eng_Latn
| 136,625 |
Highly reliable 60°C 50-mW operation of 650-nm band window-mirror laser diodes
|
High-power GaInP QW laser diodes with a window-mirror-structure lasing at a wavelength of around 650 nm have been fabricated. The maximum light output power over 150 mW has been realized without optical mirror damage. In addition, the laser shows the fundamental-mode-operation at 50 mW and the dynamic characteristics sufficient for recordable digital versatile disc (DVD) applications. The lasers have been operating for 2000 h under the condition of CW, 50 mW, and 60°C, for the first time.
|
We study the coupling of plasmons and Dyakonov surface waves propagating at the interfaces between isotropic-birefringent-metal layered structures. Efficient coupling is shown to occur with a proper choice of the crystal birefringence, the refractive index of the isotropic medium, and the light propagation direction relative to the crystal optical axis. In the case of low-loss metals, coupling efficiencies as high as 90% are predicted to be possible.
|
eng_Latn
| 136,647 |
VCSEL structures and applications
|
Vertical-Cavity Surface-Emitting Lasers (VCSELs) for sensing and communication applications in the 1.3-2.3μm wavelength range are presented. The devices feature low thresholds, electronically tunable single-mode emission and modulation bandwidths exceeding 10Gb/s.
|
This paper takes the secondary and higher vocational education in Shijiazhuang as a case study and analyzes problems such as inefficient administration and the independence of schools. Countermeasures are therefore suggested for multilateral coordination among government, society, and schools, with the purpose of constructing a modern vocational education system.
|
eng_Latn
| 136,654 |
Phase transitions in isolated molecules
|
Abstract It is shown that an isolated polyatomic molecule can undergo a structural phase transition when optically pumped.
|
General trends in In surface segregation during MOVPE of InGaN-based heterostructures are considered in terms of a rate-equation model. A parametric study of segregation effects is carried out with reference to a single GaN/InGaN/GaN quantum well. The theoretical predictions are compared with available observations.
|
eng_Latn
| 136,683 |
Laser-based 2D and 3D nanomanufacturing for plasmonic applications
|
Growing interest in the field of surface plasmon polaritons (SPPs) comes from a rapid advance of nanostructuring technologies. The application of the two-photon polymerisation technique for the fabrication of dielectric and metallic SPP structures, which can be used for localisation, guiding, and manipulation of SPP waves on a subwavelength scale, is studied. This technology is based on non-linear absorption of near-infrared femtosecond laser pulses. Excitation, propagation, and interaction of SPP waves with nanostructures are controlled and studied by leakage radiation imaging. It is demonstrated that the created nanostructures are very efficient for the excitation and focusing of SPPs on the metal film. Examples of passive and active SPP components are presented and discussed.
|
We have performed a set of prototypical quantum calculations of the electronic structure of a none-atom helium plasma over wide ranges of temperature and density. These calculations reveal the presence of very tightly bound quasimolecular states in high-density plasmas, even at temperatures high enough to ionize fully the component atoms. The results of the study suggest that there are four regimes in a plasma, dependent upon the relationship of the electron wavelength to the interionic spacing: low-density atomic, intermediate-density screened atomic, high-density quasimolecular, and very-high-density homogeneous.
|
eng_Latn
| 136,688 |
Fabrication of nanopatterned germanium surface by laser-induced etching: AFM, Raman and PL studies
|
Abstract Fabrication of the nanopatterned germanium (Ge) surface is done by laser-induced etching. Atomic force microscopy is utilized here to study the surface and sizes of Ge nanoparticles. Raman and photoluminescence (PL) spectroscopy have been used to characterize their vibrational and light emission properties. Wavelength-dependent Raman investigations of these nanopatterned Ge surface reveal spatial distribution of sizes of nanoparticles. Nanopatterned Ge structures (etched for 60 min) emit a broad PL band having two maxima at ∼2.1 and 2.35 eV.
|
We demonstrate GaSb overgrowth over tungsten patterns and that selective area epitaxy is achievable in the W/GaSb system. By controlling the facet growth at low temperatures, it is possible to embed a metal grating in a thin layer.
|
eng_Latn
| 136,690 |
On hierarchical brain tumor segmentation in MRI using fully convolutional neural networks: A preliminary study
|
Rectifier Nonlinearities Improve Neural Network Acoustic Models
|
Broadband Substrate Integrated Waveguide 4$\,\times\,$4 Nolen Matrix Based on Coupler Delay Compensation
|
eng_Latn
| 136,747 |
Skin lesion segmentation: U-Nets versus clustering
|
Automated Melanoma Recognition in Dermoscopy Images via Very Deep Residual Networks
|
Biomechanical analysis of the Universal 2 implant in total wrist arthroplasty: a finite element study.
|
kor_Hang
| 136,775 |
Comparing feature-based classifiers and convolutional neural networks to detect arrhythmia from short segments of ECG
|
Cardiologist-Level Arrhythmia Detection with Convolutional Neural Networks
|
Subtalar instability: a biomechanical cadaver study
|
eng_Latn
| 136,781 |
Predicting Alzheimer's disease: a neuroimaging study with 3D convolutional neural networks
|
Practical recommendations for gradient-based training of deep architectures
|
Density-based clustering and radial basis function modeling to generate credit card fraud scores
|
eng_Latn
| 136,793 |
Shifting the Baseline: Single Modality Performance on Visual Navigation&QA
|
Deep Residual Learning for Image Recognition
|
Split bolus technique in polytrauma: a prospective study on scan protocols for trauma analysis
|
eng_Latn
| 136,826 |
Deep Semantic Architecture with discriminative feature visualization for neuroimage analysis
|
Deep learning for neuroimaging: a validation study
|
Are advanced three-dimensional imaging studies always needed to measure the coronal knee alignment of the lower extremity?
|
eng_Latn
| 136,829 |